SMAC

SMAC (Sequential Model-based Algorithm Configuration) is a versatile tool for optimizing algorithm parameters (or the parameters of some other process we can run automatically, or a function we can evaluate, such as a simulation). SMAC has helped us speed up both local search and tree search algorithms by orders of magnitude on certain instance distributions. Recently, we have also found it to be very effective for the hyperparameter optimization of machine learning algorithms, scaling better to high dimensions and discrete input dimensions than other algorithms. Finally, the predictive models SMAC is based on can also capture and exploit important information about the model domain, such as which input variables are most important. We hope you find SMAC similarly useful. Ultimately, we hope that it helps algorithm designers focus on tasks that are more scientifically valuable than parameter tuning.
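
SMAC implements sequential model-based (Bayesian) optimization: it fits a random-forest surrogate model to the configurations evaluated so far and uses an acquisition function such as expected improvement to pick the next configuration to run. The following is a minimal, self-contained sketch of that loop for a toy one-dimensional objective; it uses scikit-learn's random forest rather than SMAC's own model, and the objective function, search bounds, and evaluation budget are illustrative assumptions, not part of SMAC's API.

    import numpy as np
    from scipy.stats import norm
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    def objective(x):
        # Toy "algorithm performance" to minimize; stands in for a real target run.
        return np.sin(3.0 * x) + 0.1 * (x - 1.0) ** 2

    def expected_improvement(mu, sigma, best):
        # EI for minimization; guard against zero predictive uncertainty.
        sigma = np.maximum(sigma, 1e-9)
        z = (best - mu) / sigma
        return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Initial design: a few random configurations in [-2, 4].
    X = rng.uniform(-2.0, 4.0, size=(5, 1))
    y = np.array([objective(x[0]) for x in X])

    for _ in range(25):
        # Fit a random-forest surrogate (the model family SMAC relies on).
        model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
        # Sample candidate configurations; use the spread across trees as uncertainty.
        cand = rng.uniform(-2.0, 4.0, size=(500, 1))
        per_tree = np.stack([t.predict(cand) for t in model.estimators_])
        mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0)
        # Evaluate the most promising candidate and add it to the history.
        nxt = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X = np.vstack([X, nxt])
        y = np.append(y, objective(nxt[0]))

    print("best configuration:", X[np.argmin(y)], "value:", y.min())

SMAC itself goes beyond this sketch: it also handles categorical and conditional parameters, races configurations across problem instances, and uses an intensification mechanism to decide how many runs each configuration receives.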


References in zbMATH (referenced in 61 articles)

Showing results 1 to 20 of 61, sorted by year (citations).


  1. Blanchard, Antoine; Sapsis, Themistoklis: Output-weighted optimal sampling for Bayesian experimental design and uncertainty quantification (2021)
  2. Nof, Yair; Strichman, Ofer: Real-time solving of computationally hard problems using optimal algorithm portfolios (2021)
  3. Zöller, Marc-André; Huber, Marco F.: Benchmark and survey of automated machine learning frameworks (2021)
  4. Ahmed, Mohamed Osama; Vaswani, Sharan; Schmidt, Mark: Combining Bayesian optimization and Lipschitz optimization (2020)
  5. Baioletti, Marco; Di Bari, Gabriele; Milani, Alfredo; Santucci, Valentino: An experimental comparison of algebraic crossover operators for permutation problems (2020)
  6. Bayless, Sam; Kodirov, Nodir; Iqbal, Syed M.; Beschastnikh, Ivan; Hoos, Holger H.; Hu, Alan J.: Scalable constraint-based virtual data center allocation (2020)
  7. Binois, Mickaël; Ginsbourger, David; Roustant, Olivier: On the choice of the low-dimensional domain for global optimization via random embeddings (2020)
  8. Kandasamy, Kirthevasan; Vysyaraju, Karun Raju; Neiswanger, Willie; Paria, Biswajit; Collins, Christopher R.; Schneider, Jeff; Poczos, Barnabas; Xing, Eric P.: Tuning hyperparameters without grad students: scalable and robust Bayesian optimisation with Dragonfly (2020)
  9. Kletzander, Lucas; Musliu, Nysret: Solving the general employee scheduling problem (2020)
  10. Moriconi, Riccardo; Deisenroth, Marc Peter; Sesh Kumar, K. S.: High-dimensional Bayesian optimization using low-dimensional feature spaces (2020)
  11. Moriconi, Riccardo; Kumar, K. S. Sesh; Deisenroth, Marc Peter: High-dimensional Bayesian optimization with projections using quantile Gaussian processes (2020)
  12. Ribeiro, Rita P.; Moniz, Nuno: Imbalanced regression and extreme value prediction (2020)
  13. Toutouh, Jamal; Rossit, Diego; Nesmachnow, Sergio: Soft computing methods for multiobjective location of garbage accumulation points in smart cities (2020)
  14. Zhan, Dawei; Xing, Huanlai: Expected improvement for expensive optimization: a review (2020)
  15. Banbara, Mutsunori; Inoue, Katsumi; Kaufmann, Benjamin; Okimoto, Tenda; Schaub, Torsten; Soh, Takehide; Tamura, Naoyuki; Wanko, Philipp: teaspoon: solving the curriculum-based course timetabling problems with answer set programming (2019)
  16. Oh, ChangYong; Gavves, Efstratios; Welling, Max: BOCK: Bayesian optimization with cylindrical kernels (2019) arXiv
  17. Eggensperger, Katharina; Lindauer, Marius; Hutter, Frank: Pitfalls and best practices in algorithm configuration (2019)
  18. Franzin, Alberto; Stützle, Thomas: Revisiting simulated annealing: a component-based analysis (2019)
  19. Lindauer, Marius; van Rijn, Jan N.; Kotthoff, Lars: The algorithm selection competitions 2015 and 2017 (2019)
  20. Liu, Jianfeng; Ploskas, Nikolaos; Sahinidis, Nikolaos V.: Tuning BARON using derivative-free optimization algorithms (2019)



Further publications can be found at: http://www.cs.ubc.ca/labs/beta/Projects/SMAC/#papers/