SMAC: Sequential Model-based Algorithm Configuration

SMAC (sequential model-based algorithm configuration) is a versatile tool for optimizing algorithm parameters (or the parameters of some other process we can run automatically, or a function we can evaluate, such as a simulation). SMAC has helped us speed up both local search and tree search algorithms by orders of magnitude on certain instance distributions. Recently, we have also found it to be very effective for the hyperparameter optimization of machine learning algorithms, scaling better than other methods to high-dimensional and discrete input spaces. Finally, the predictive models SMAC is based on can also capture and exploit important information about the model domain, such as which input variables are most important. We hope you find SMAC similarly useful. Ultimately, we hope that it helps algorithm designers focus on tasks that are more scientifically valuable than parameter tuning.
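The idea behind SMAC can be sketched as a sequential model-based optimization (SMBO) loop: evaluate a few configurations, fit a surrogate model of cost as a function of the parameters, then repeatedly let the model propose promising configurations while occasionally interleaving random ones. The sketch below is a minimal stdlib-only illustration of that loop, not SMAC's actual API; the function `target`, the inverse-distance surrogate (SMAC itself uses a random-forest model), and all constants are hypothetical.

```python
import random

def target(x):
    # Hypothetical expensive function standing in for an algorithm run;
    # we minimize a runtime-like cost with its optimum near x = 0.3.
    return (x - 0.3) ** 2

def surrogate_predict(history, x):
    # Toy surrogate: inverse-distance-weighted average of observed costs.
    # (SMAC uses random forests here; this stand-in keeps the sketch short.)
    num = den = 0.0
    for xi, yi in history:
        w = 1.0 / (abs(x - xi) + 1e-9)
        num += w * yi
        den += w
    return num / den

def smbo(iterations=30, candidates=100, seed=1):
    rng = random.Random(seed)
    # Initial design: a few random configurations, evaluated on the target.
    history = [(x, target(x)) for x in (rng.random() for _ in range(3))]
    for _ in range(iterations):
        pool = [rng.random() for _ in range(candidates)]
        if rng.random() < 0.2:
            # Interleave random configurations to keep exploring,
            # as SMAC does to retain theoretical guarantees.
            x = rng.choice(pool)
        else:
            # Exploit: pick the candidate the surrogate predicts is cheapest.
            x = min(pool, key=lambda c: surrogate_predict(history, c))
        history.append((x, target(x)))
    return min(history, key=lambda p: p[1])

best_x, best_y = smbo()
print(best_x, best_y)
```

Real uses replace `target` with an actual algorithm run (or simulation) and the toy surrogate with a model that handles high-dimensional, mixed discrete/continuous parameter spaces.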

References in zbMATH (referenced in 79 articles)

Showing results 1 to 20 of 79.
Sorted by year (citations)


  1. de Souza, Marcelo; Ritt, Marcus; López-Ibáñez, Manuel: Capping methods for the automatic configuration of optimization algorithms (2022)
  2. Hall, George T.; Oliveto, Pietro S.; Sudholt, Dirk: On the impact of the performance metric on efficient algorithm configuration (2022)
  3. Latour, Anna L. D.; Babaki, Behrouz; Fokkinga, Daniël; Anastacio, Marie; Hoos, Holger H.; Nijssen, Siegfried: Exact stochastic constraint optimisation with applications in network analysis (2022)
  4. Toscano-Palmerin, Saul; Frazier, Peter I.: Bayesian optimization with expensive integrands (2022)
  5. Ansótegui, Carlos; Ojeda, Jesús; Pacheco, Antonio; Pon, Josep; Salvia, Josep M.; Torres, Eduard: OptiLog: a framework for SAT-based systems (2021)
  6. Ansótegui, Carlos; Pon, Josep; Sellmann, Meinolf; Tierney, Kevin: PyDGGA: distributed GGA for automatic configuration (2021)
  7. Bakirov, Rashid; Fay, Damien; Gabrys, Bogdan: Automated adaptation strategies for stream learning (2021)
  8. Bertsimas, Dimitris; Stellato, Bartolomeo: The voice of optimization (2021)
  9. Blanchard, Antoine; Sapsis, Themistoklis: Output-weighted optimal sampling for Bayesian experimental design and uncertainty quantification (2021)
  10. Corazza, Marco; di Tollo, Giacomo; Fasano, Giovanni; Pesenti, Raffaele: A novel hybrid PSO-based metaheuristic for costly portfolio selection problems (2021)
  11. de Oliveira, Sabrina M.; Bezerra, Leonardo C. T.; Stützle, Thomas; Dorigo, Marco; Wanner, Elizabeth F.; de Souza, Sérgio R.: A computational study on ant colony optimization for the traveling salesman problem with dynamic demands (2021)
  12. Grosnit, Antoine; Cowen-Rivers, Alexander I.; Tutunov, Rasul; Griffiths, Ryan-Rhys; Wang, Jun; Bou-Ammar, Haitham: Are we forgetting about compositional optimisers in Bayesian optimisation? (2021)
  13. Jomaa, Hadi S.; Schmidt-Thieme, Lars; Grabocka, Josif: Dataset2Vec: learning dataset meta-features (2021)
  14. Kim, Jungtaek; McCourt, Michael; You, Tackgeun; Kim, Saehoon; Choi, Seungjin: Bayesian optimization with approximate set kernels (2021)
  15. Mischek, Florian; Musliu, Nysret: A local search framework for industrial test laboratory scheduling (2021)
  16. Nof, Yair; Strichman, Ofer: Real-time solving of computationally hard problems using optimal algorithm portfolios (2021)
  17. Shi, Junjie; Bian, Jiang; Richter, Jakob; Chen, Kuan-Hsun; Rahnenführer, Jörg; Xiong, Haoyi; Chen, Jian-Jia: MODES: model-based optimization on distributed embedded systems (2021)
  18. Sironi, Chiara F.; Winands, Mark H. M.: Analysis of the impact of randomization of search-control parameters in Monte-Carlo tree search (2021)
  19. Vallati, Mauro; Chrpa, Lukáš; McCluskey, Thomas Leo; Hutter, Frank: On the importance of domain model configuration for automated planning engines (2021)
  20. Yang, Zebin; Zhang, Aijun: Hyperparameter optimization via sequential uniform designs (2021)


Further publications can be found at: