Benchmarks

This is a repository of optimization benchmark results where users can upload their own results and compare them against those of existing state-of-the-art algorithms.

Step 1. Add your results

To upload your own results, go to “Add new results” under the “Benchmarks” menu. You will first be asked to select the benchmark for which you want to upload results, and then to download a template file in which to place them (the mean error with respect to the optimal value of each problem). To discourage fake results, authors are required to provide the source code of the algorithm, a PDF of the paper in which the results were published, or both. New results must be validated by the admins of this site, so be patient if it takes a few hours for your results to appear online.
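As a reference, the sketch below shows one way the per-problem values could be computed from raw run data before filling in the template. The output file name, problem names, optima, and run values are illustrative assumptions only; the downloaded template defines the actual format the site expects.

    # Hypothetical sketch: compute the mean error per problem from raw run results.
    # The optima, run data, and output layout below are illustrative assumptions;
    # consult the downloaded template for the format actually expected by the site.
    import csv
    import statistics

    # Assumed known optimal value of each benchmark problem (example numbers).
    optima = {"f1": 0.0, "f2": 0.0, "f3": 0.0}

    # Assumed raw results: best fitness reached in each independent run.
    runs = {
        "f1": [1.2e-8, 3.4e-8, 2.1e-8],
        "f2": [4.5e2, 5.1e2, 4.9e2],
        "f3": [7.7e-1, 8.2e-1, 7.9e-1],
    }

    with open("template.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["problem", "mean_error"])
        for problem, best_values in runs.items():
            # Mean absolute error with respect to the optimum, over all runs.
            errors = [abs(v - optima[problem]) for v in best_values]
            writer.writerow([problem, statistics.mean(errors)])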

Step 2. Compare algorithms

Once your results are online, or if you simply want to compare existing algorithms, go to “Compare algorithms” under the “Benchmarks” menu and select the benchmark(s), algorithm(s), and dimension(s) for which the comparison should be conducted.
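To give a rough idea of what such a comparison involves, the following sketch ranks two algorithms by their mean error on each problem and averages the ranks. The algorithm names and error values are made up for illustration; the site carries out the actual comparison, with its own methodology, for you.

    # Hypothetical sketch of one way algorithms can be compared:
    # rank them on each problem by mean error, then average the ranks.
    # Algorithm names and error values are illustrative assumptions only.
    mean_errors = {
        "AlgorithmA": {"f1": 1.0e-8, "f2": 4.8e2, "f3": 7.9e-1},
        "AlgorithmB": {"f1": 5.0e-9, "f2": 5.3e2, "f3": 8.4e-1},
    }

    problems = ["f1", "f2", "f3"]
    avg_rank = {name: 0.0 for name in mean_errors}
    for problem in problems:
        # Order the algorithms by mean error on this problem (lower is better).
        ordered = sorted(mean_errors, key=lambda name: mean_errors[name][problem])
        for rank, name in enumerate(ordered, start=1):
            avg_rank[name] += rank / len(problems)

    for name, rank in sorted(avg_rank.items(), key=lambda item: item[1]):
        print(f"{name}: average rank = {rank:.2f}")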

Was this service useful?

If you find this service useful, consider citing the paper in which this repository was presented:

LaTorre, A., Muelas, S., & Peña, J. M. (2014). A comprehensive comparison of large scale global optimizers. Information Sciences (in press). doi:10.1016/j.ins.2014.09.031

Finally, if you would like to contribute corrections or suggestions, or to request that a new benchmark be added to the repository, please do not hesitate to contact any of the authors.