Mean Payoff Optimization for Systems of Periodic Service and Maintenance
Authors: David Klaška, Antonín Kučera, Vít Musil, Vojtěch Řehák
IJCAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The algorithms of (3) and (4) are evaluated on instances of increasing size. We use planar grids with randomly positioned nodes to avoid bias towards simple instances. The algorithms process relatively large instances and produce high-quality schedules. |
| Researcher Affiliation | Academia | Masaryk University, Brno, Czech Republic david.klaska@mail.muni.cz, {tony, musil, rehak}@fi.muni.cz |
| Pseudocode | Yes | Algorithm 1: Strategy optimization; Algorithm 2: Sampling a periodic schedule from σ; Algorithm 3: Evaluation of a periodic schedule |
| Open Source Code | Yes | The code and experiment setup are available at gitlab.fi.muni.cz/formela/2023-ijcai-periodic-maintenance. |
| Open Datasets | No | For each k, we construct a service specification Sk consisting of one distinguished compulsory node (depot), k nodes modeling machines with long maintenance time and high payoff, and 3k nodes representing machines with short maintenance time and lower payoff. The paper does not provide concrete access information for a publicly available dataset. |
| Dataset Splits | No | The paper evaluates its algorithms on constructed instances and describes the number of runs and steps (e.g., "For every k, we run Alg. 1 twelve times, each with 50 optimization steps.") but does not describe training, validation, or test dataset splits in the typical sense for machine learning. |
| Hardware Specification | No | The paper does not provide specific hardware details used for running its experiments. |
| Software Dependencies | No | We implemented the computation in the PyTorch library [Paszke et al., 2019], where the gradients are calculated automatically. The optimization loop (Algorithm 1) is implemented in the PyTorch framework [Paszke et al., 2019] with its automatic differentiation and the Adam optimizer [Kingma and Ba, 2015]. However, specific version numbers for these software components are not provided. (A minimal sketch of such a loop appears after this table.) |
| Experiment Setup | Yes | For every k, we run Alg. 1 twelve times, each with 50 optimization steps. We used s = 10^5, ℓ = 300, and the depot as an initial vertex. (An evaluation sketch using these parameters appears after this table.) |
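
The optimization loop described above (PyTorch automatic differentiation with the Adam optimizer, twelve runs of 50 steps each) can be sketched as follows. This is a minimal illustration under the reported settings, not the authors' implementation: the strategy parametrization `theta` and the objective `schedule_loss` are hypothetical placeholders, since the paper's actual differentiable mean-payoff objective is not reproduced here.

```python
import torch

def schedule_loss(theta: torch.Tensor) -> torch.Tensor:
    # Hypothetical placeholder for the differentiable objective
    # (e.g., the negative expected mean payoff of schedules sampled
    # from the strategy parametrized by theta).
    weights = theta.softmax(dim=-1)
    return -(weights * torch.arange(theta.numel(), dtype=torch.float)).sum()

NUM_RUNS = 12    # "we run Alg. 1 twelve times"
NUM_STEPS = 50   # "each with 50 optimization steps"

best = float("inf")
for run in range(NUM_RUNS):
    # Randomly initialized strategy parameters; the shape is illustrative.
    theta = torch.randn(16, requires_grad=True)
    opt = torch.optim.Adam([theta])  # Adam [Kingma and Ba, 2015]
    for step in range(NUM_STEPS):
        opt.zero_grad()
        loss = schedule_loss(theta)
        loss.backward()              # gradients computed automatically
        opt.step()
    best = min(best, loss.item())

print(f"best objective over {NUM_RUNS} runs: {best:.4f}")
```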
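
Similarly, the evaluation parameters s = 10^5 and ℓ = 300 suggest that a candidate periodic schedule is scored by its average payoff over a finite horizon. The sketch below assumes a schedule is a finite node sequence repeated cyclically and that each node yields an immediate payoff; the paper's actual payoff semantics (maintenance durations, payoff accumulation between visits) are not modeled here.

```python
from itertools import cycle, islice

def mean_payoff(schedule, payoff, horizon=300):
    # Hedged sketch: average per-step payoff of a periodic schedule
    # over `horizon` steps (the paper uses ℓ = 300). `schedule` is a
    # finite node sequence repeated cyclically; `payoff` maps a node
    # to a hypothetical immediate payoff.
    steps = islice(cycle(schedule), horizon)
    return sum(payoff(v) for v in steps) / horizon

# Usage with toy data: the depot pays nothing, machines pay 1 or 3.
toy_schedule = ["depot", "m1", "m2", "m1", "m3"]
toy_payoff = {"depot": 0.0, "m1": 1.0, "m2": 3.0, "m3": 1.0}.__getitem__
print(mean_payoff(toy_schedule, toy_payoff))
```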