
Commit 10a0711

Authored by spline2hg, pre-commit-ci[bot], and janosg
Add optimizers from PySwarms (#639)
* Add global-best PSO
* fix parameters
* expose verbose
* use CONVERGENCE_FTOL_REL
* Add local-best PSO
* refactor: simplify return
* Add general PSO
* docs: improve docstrings of global_best
* improve local-best pso docstrings
* improve general pso docstrings
* fix: particle center scaling
* feat: enable parallel eval and history in PySwarms
* docs: add PySwarms optimizers to algorithms.md
* refactor: topology – swap string flags for dataclass configs
* docs: clarify boundary & velocity strategy choices in docstrings
* docs: add PSO citations to algorithm docstrings
* refactor: drop unused GeneralPSOOptions dataclass
* docs: clarify topology parameter usage
* format docstrings
* feat: expose initial_population and oh_strategy in PySwarms optimizers
* refactor: simplify bounds handling
* refactor: infer actual iterations & evals
* add pyswarms to mypy ignore list
* fix: mark PySwarms optimizers as stochastic via seed parameter
* fix: add missing comma to mypy ignore list
* extend mypy ignores to pyswarms.backend.topology
* refactor: use dataclasses for PSO hyper-params
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* refactor: rename initial_population to initial_positions and add PyTree support
* refactor: use common internal function for all PySwarms optimizers
* refactor: move ring topology params into RingTopology dataclass
* refactor: use local random-state and fix arrangement
* fix: rename VonNeumannTopology.range to range_param
* test: add tests for PySwarms helper functions
* refactor: use common interface
* docs: display inherited params in pyswarms optimizers
* fix: update tests
* update optimizer param default values
* fix: remove STOPPING_MAXITER_GLOBAL, set maxiter to 1000
* warn: add warning for seed
* minor fixes after review

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Janos Gabler <[email protected]>
1 parent 9976689 commit 10a0711
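The commit message above names the user-facing options that were exposed, for example `n_particles` and a `seed` parameter that marks the optimizers as stochastic. A rough, non-authoritative sketch of configuring one of the new algorithm objects; everything except `n_particles` (shown in the docs below) and `seed` (named in the commit message) should be treated as an assumption:

```python
# Hedged sketch only: configure one of the new PySwarms algorithms in optimagic.
# `n_particles` appears in the added docs; `seed` is named in the commit message
# ("mark PySwarms optimizers as stochastic via seed parameter").
import optimagic as om

algo = om.algos.pyswarms_global_best(n_particles=50, seed=123)
# The configured algorithm is then passed to om.minimize(..., algorithm=algo),
# as shown in the documentation added by this commit.
```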

File tree

15 files changed: +1169 -2 lines changed


.tools/envs/testenv-linux.yml

Lines changed: 1 addition & 0 deletions
@@ -39,6 +39,7 @@ dependencies:
   - kaleido>=1.0 # dev, tests
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - pandas-stubs # dev, tests
   - types-cffi # dev, tests
   - types-openpyxl # dev, tests

.tools/envs/testenv-nevergrad.yml

Lines changed: 2 additions & 1 deletion
@@ -36,12 +36,13 @@ dependencies:
   - kaleido>=1.0 # dev, tests
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - pandas-stubs # dev, tests
   - types-cffi # dev, tests
   - types-openpyxl # dev, tests
   - types-jinja2 # dev, tests
   - sqlalchemy-stubs # dev, tests
-  - sphinxcontrib-mermaid # dev, tests, docs
   - bayesian_optimization==1.4.0
   - nevergrad
+  - sphinxcontrib-mermaid # dev, tests, docs
   - -e ../../

.tools/envs/testenv-numpy.yml

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ dependencies:
   - kaleido>=1.0 # dev, tests
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - types-cffi # dev, tests
   - types-openpyxl # dev, tests
   - types-jinja2 # dev, tests

.tools/envs/testenv-others.yml

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ dependencies:
   - kaleido>=1.0 # dev, tests
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - pandas-stubs # dev, tests
   - types-cffi # dev, tests
   - types-openpyxl # dev, tests

.tools/envs/testenv-pandas.yml

Lines changed: 1 addition & 0 deletions
@@ -37,6 +37,7 @@ dependencies:
   - kaleido>=1.0 # dev, tests
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - types-cffi # dev, tests
   - types-openpyxl # dev, tests
   - types-jinja2 # dev, tests

.tools/envs/testenv-plotly.yml

Lines changed: 2 additions & 1 deletion
@@ -36,11 +36,12 @@ dependencies:
   - fides==0.7.4 # dev, tests
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - pandas-stubs # dev, tests
   - types-cffi # dev, tests
   - types-openpyxl # dev, tests
   - types-jinja2 # dev, tests
   - sqlalchemy-stubs # dev, tests
-  - sphinxcontrib-mermaid # dev, tests, docs
   - kaleido<0.3
+  - sphinxcontrib-mermaid # dev, tests, docs
   - -e ../../

docs/source/algorithms.md

Lines changed: 101 additions & 0 deletions
@@ -4885,6 +4885,107 @@ We wrap the pygad optimizer. To use it you need to have
 .. autoclass:: optimagic.optimizers.pygad_optimizer.Pygad
 ```
 
+## PySwarms Optimizers
+
+optimagic supports the following continuous algorithms from the
+[PySwarms](https://pyswarms.readthedocs.io/en/latest/) library: GlobalBestPSO,
+LocalBestPSO, and GeneralOptimizerPSO. To use these optimizers, you need to have
+[the pyswarms package](https://github.com/ljvmiranda921/pyswarms) installed
+(`pip install pyswarms`).
+
+```{eval-rst}
+.. dropdown:: pyswarms_global_best
+
+  **How to use this algorithm:**
+
+  .. code-block::
+
+    import optimagic as om
+    om.minimize(
+        ...,
+        algorithm=om.algos.pyswarms_global_best(n_particles=50, ...)
+    )
+
+  or
+
+  .. code-block::
+
+    om.minimize(
+        ...,
+        algorithm="pyswarms_global_best",
+        algo_options={"n_particles": 50, ...}
+    )
+
+  **Description and available options:**
+
+  .. autoclass:: optimagic.optimizers.pyswarms_optimizers.PySwarmsGlobalBestPSO
+    :members:
+    :inherited-members: Algorithm, object
+
+```
+
+```{eval-rst}
+.. dropdown:: pyswarms_local_best
+
+  **How to use this algorithm:**
+
+  .. code-block::
+
+    import optimagic as om
+    om.minimize(
+        ...,
+        algorithm=om.algos.pyswarms_local_best(n_particles=50, k_neighbors=3, ...)
+    )
+
+  or
+
+  .. code-block::
+
+    om.minimize(
+        ...,
+        algorithm="pyswarms_local_best",
+        algo_options={"n_particles": 50, "k_neighbors": 3, ...}
+    )
+
+  **Description and available options:**
+
+  .. autoclass:: optimagic.optimizers.pyswarms_optimizers.PySwarmsLocalBestPSO
+    :members:
+    :inherited-members: Algorithm, object
+
+```
+
+```{eval-rst}
+.. dropdown:: pyswarms_general
+
+  **How to use this algorithm:**
+
+  .. code-block::
+
+    import optimagic as om
+    om.minimize(
+        ...,
+        algorithm=om.algos.pyswarms_general(n_particles=50, topology_type="star", ...)
+    )
+
+  or
+
+  .. code-block::
+
+    om.minimize(
+        ...,
+        algorithm="pyswarms_general",
+        algo_options={"n_particles": 50, "topology_type": "star", ...}
+    )
+
+  **Description and available options:**
+
+  .. autoclass:: optimagic.optimizers.pyswarms_optimizers.PySwarmsGeneralPSO
+    :members:
+    :inherited-members: Algorithm, object
+
+```
+
 ## References
 
 ```{eval-rst}
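To make the documented snippets concrete, here is a minimal end-to-end sketch. It assumes optimagic's public `om.minimize` and `om.Bounds` interface; the objective and bound values are purely illustrative, and only `n_particles` is taken from the documented options above.

```python
# Minimal usage sketch (assumptions: toy objective and illustrative bounds;
# only `n_particles` comes from the documented options above).
import numpy as np
import optimagic as om


def sphere(x):
    # Simple convex test function; any scalar-valued objective works.
    return np.sum(x**2)


res = om.minimize(
    fun=sphere,
    params=np.array([2.0, -1.5, 0.5]),
    # Particle-swarm methods typically sample the initial swarm inside a
    # bounded box, so finite lower and upper bounds are supplied here.
    bounds=om.Bounds(lower=np.full(3, -5.0), upper=np.full(3, 5.0)),
    algorithm=om.algos.pyswarms_global_best(n_particles=50),
)
print(res.params, res.fun)
```

`pyswarms_local_best` and `pyswarms_general` are used the same way, with their extra options (`k_neighbors`, `topology_type`) passed as shown in the dropdowns above.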

docs/source/refs.bib

Lines changed: 36 additions & 0 deletions
@@ -1077,4 +1077,40 @@ @article{gad2023pygad
   publisher={Springer}
 }
 
+@INPROCEEDINGS{EberhartKennedy1995,
+  author = {Eberhart, R. and Kennedy, J.},
+  booktitle = {MHS'95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science},
+  title = {A new optimizer using particle swarm theory},
+  year = {1995},
+  pages = {39-43},
+  keywords = {Particle swarm optimization;Genetic algorithms;Testing;Acceleration;Particle tracking;Optimization methods;Artificial neural networks;Evolutionary computation;Performance evaluation;Statistics},
+  doi = {10.1109/MHS.1995.494215}
+}
+
+@INPROCEEDINGS{Lane2008SpatialPSO,
+  author={Lane, James and Engelbrecht, Andries and Gain, James},
+  booktitle={2008 IEEE Swarm Intelligence Symposium},
+  title={Particle swarm optimization with spatially meaningful neighbours},
+  year={2008},
+  volume={},
+  number={},
+  pages={1-8},
+  keywords={Particle swarm optimization;Topology;Birds;Convergence;Computer science;USA Councils;Cities and towns;Africa;Cultural differences;Data structures;Delaunay Triangulation;Neighbour Topology;Particle Swarm Optimization;Heuristics},
+  doi={10.1109/SIS.2008.4668281}
+}
+
+@article{Ni2013,
+  author = {Ni, Qingjian and Deng, Jianming},
+  title = {A New Logistic Dynamic Particle Swarm Optimization Algorithm Based on Random Topology},
+  journal = {The Scientific World Journal},
+  volume = {2013},
+  number = {1},
+  pages = {409167},
+  doi = {https://doi.org/10.1155/2013/409167},
+  url = {https://onlinelibrary.wiley.com/doi/abs/10.1155/2013/409167},
+  eprint = {https://onlinelibrary.wiley.com/doi/pdf/10.1155/2013/409167},
+  abstract = {Population topology of particle swarm optimization (PSO) will directly affect the dissemination of optimal information during the evolutionary process and will have a significant impact on the performance of PSO. Classic static population topologies are usually used in PSO, such as fully connected topology, ring topology, star topology, and square topology. In this paper, the performance of PSO with the proposed random topologies is analyzed, and the relationship between population topology and the performance of PSO is also explored from the perspective of graph theory characteristics in population topologies. Further, in a relatively new PSO variant which named logistic dynamic particle optimization, an extensive simulation study is presented to discuss the effectiveness of the random topology and the design strategies of population topology. Finally, the experimental data are analyzed and discussed. And about the design and use of population topology on PSO, some useful conclusions are proposed which can provide a basis for further discussion and research.},
+  year = {2013}
+}
+
 @Comment{jabref-meta: databaseType:bibtex;}

environment.yml

Lines changed: 1 addition & 0 deletions
@@ -51,6 +51,7 @@ dependencies:
   - pre-commit>=4 # dev
   - bayes_optim # dev, tests
   - gradient_free_optimizers # dev, tests
+  - pyswarms # dev, tests
   - -e . # dev
   # type stubs
   - pandas-stubs # dev, tests

pyproject.toml

Lines changed: 2 additions & 0 deletions
@@ -384,6 +384,8 @@ module = [
     "iminuit",
     "nevergrad",
     "pygad",
+    "pyswarms",
+    "pyswarms.backend.topology",
     "yaml",
     "gradient_free_optimizers",
     "gradient_free_optimizers.optimizers.base_optimizer",
