4 changes: 4 additions & 0 deletions README.md
@@ -65,6 +65,10 @@ nlp = ADNLPModel(x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2, [-1.2; 1.0])
stats = lbfgs(nlp) # or trunk, tron, R2
```

## Documentation

Click on the badge [![](https://img.shields.io/badge/docs-stable-3f51b5.svg)](https://jso.dev/JSOSolvers.jl/stable) to access the documentation.
Member Author


@giovannifereoli It was already added here; can you think of a better place?
Preview: https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/blob/add-joss-comments-docs/README.md


## How to cite

If you use JSOSolvers.jl in your work, please cite using the format given in [CITATION.cff](CITATION.cff).
1 change: 1 addition & 0 deletions docs/make.jl
@@ -11,6 +11,7 @@ makedocs(
sitename = "JSOSolvers.jl",
pages = [
"index.md",
"examples.md",
"solvers.md",
"benchmark.md",
"floating-point-systems.md",
4 changes: 4 additions & 0 deletions docs/src/examples.md
@@ -0,0 +1,4 @@
# Examples

Beyond this repository's documentation, you can also find a list of tutorials on [JuliaSmoothOptimizers Tutorials](https://jso.dev/tutorials) by selecting the tag `JSOSolvers.jl`. For instance, the tutorial [Introduction to JSOSolvers](https://jso.dev/tutorials/introduction-to-jsosolvers/#title) is a good starting point.

36 changes: 31 additions & 5 deletions docs/src/index.md
@@ -1,6 +1,36 @@
# [JSOSolvers.jl documentation](@id Home)

This package provides a few optimization solvers curated by the [JuliaSmoothOptimizers](https://jso.dev) organization.
`JSOSolvers.jl` is a collection of Julia optimization solvers for nonlinear, potentially nonconvex, continuous optimization problems that are unconstrained or bound-constrained:

```math
\begin{aligned}
\min\; & f(x) \\
\text{s.t.}\; & \ell \leq x \leq u
\end{aligned}
```
where $f:\mathbb{R}^n \rightarrow \mathbb{R}$ is a continuously differentiable function, with $\ell \in \left(\mathbb{R} \cup \{-\infty\} \right)^n$, and $u \in \left(\mathbb{R} \cup \{+\infty\} \right)^n$.
The algorithms implemented here are iterative methods that aim to compute a stationary point of the problem above using first- and, if possible, second-order derivatives.
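
The bound-constrained problem above can be set up and solved in a few lines. This is a minimal sketch, assuming `ADNLPModels.jl` is installed alongside `JSOSolvers.jl`; the objective, starting point, and bounds are illustrative choices:

```julia
using JSOSolvers, ADNLPModels

# Rosenbrock objective with box constraints 0 ≤ x ≤ 2 on each variable
f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
nlp = ADNLPModel(f, [0.5; 0.5], zeros(2), 2 * ones(2))

# tron is the bound-constrained solver in this package
stats = tron(nlp)
println(stats.status)
```

When the problem has no bounds, `lbfgs`, `trunk`, or `R2` can be applied to the same model type.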

This package provides optimization solvers curated by the [JuliaSmoothOptimizers](https://jso.dev) organization.
Solvers in `JSOSolvers.jl` take as input an `AbstractNLPModel`, the general model API defined in `NLPModels.jl`: a flexible data type that evaluates the objective and constraints and their derivatives, and provides any other information a solver might request from a model.

The solvers in `JSOSolvers.jl` adopt a matrix-free approach: standard optimization methods are implemented without forming derivative matrices explicitly.
This strategy enables the solution of large-scale problems even when function and gradient evaluations are expensive. Target applications include large-scale unconstrained and bound-constrained problems such as parameter estimation in inverse problems, design optimization in engineering, and regularized machine-learning models, as well as the subproblems arising in penalty algorithms.
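
The matrix-free idea can be seen directly through the `NLPModels.jl` API: `hprod` returns a Hessian-vector product without ever assembling the Hessian. A short sketch, assuming `ADNLPModels.jl` is installed and using an illustrative objective:

```julia
using NLPModels, ADNLPModels

nlp = ADNLPModel(x -> x[1]^4 + x[2]^2, [1.0; 1.0])
x = nlp.meta.x0
v = [1.0; 0.0]

g  = grad(nlp, x)      # gradient ∇f(x)
Hv = hprod(nlp, x, v)  # product ∇²f(x) * v; no matrix is formed
```

Solvers such as `trunk` and `tron` rely on such operator products inside their inner Krylov iterations, which is what keeps their memory footprint linear in the problem dimension.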

## Installation

`JSOSolvers` is a registered package. To install it, open the Julia REPL (i.e., run the `julia` binary), type `]` to enter package mode, and add `JSOSolvers` as follows:

```julia
pkg> add JSOSolvers
```
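
Equivalently, the package can be installed programmatically through the standard `Pkg` API, which is convenient in scripts:

```julia
using Pkg
Pkg.add("JSOSolvers")
```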

## Bug reports and discussions

If you think you found a bug, feel free to open an [issue](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/issues).
Focused suggestions and feature requests can also be opened as issues. Please start an issue or a discussion on the topic before opening a pull request.

If you want to ask a question not suited for a bug report, feel free to start a discussion [here](https://github.com/JuliaSmoothOptimizers/Organization/discussions). This forum is for general discussion about this repository and the [JuliaSmoothOptimizers](https://github.com/JuliaSmoothOptimizers) organization, so questions about any of our packages are welcome.

## Basic usage

@@ -23,7 +53,3 @@ where `nlp` is an AbstractNLPModel or some specialization, such as an `AbstractN
- `stats` is a `SolverTools.GenericExecutionStats` with the output of the solver.

See the full list of [Solvers](@ref).

## Tutorials

Beyond this repository's documentation, you can also find a list of tutorials on [JuliaSmoothOptimizers Tutorials](https://jso.dev/tutorials) by selecting the tag `JSOSolvers.jl`.