Optimization that reads like Python.
📚 Documentation · 🚀 Quickstart · 💡 Examples
**With Optyx**

```python
from optyx import Variable, Problem

x = Variable("x", lb=0)
y = Variable("y", lb=0)

solution = (
    Problem()
    .minimize(x**2 + y**2)
    .subject_to(x + y >= 1)
    .solve()
)
# x=0.5, y=0.5
```

**With SciPy**

```python
from scipy.optimize import minimize
import numpy as np

def objective(v):
    return v[0]**2 + v[1]**2

def gradient(v):  # manual!
    return np.array([2*v[0], 2*v[1]])

result = minimize(
    objective, x0=[1, 1], jac=gradient,
    method='SLSQP',
    bounds=[(0, None), (0, None)],
    constraints={'type': 'ineq',
                 'fun': lambda v: v[0]+v[1]-1},
)
```
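Both snippets land on the same optimum. Running the SciPy side end to end confirms the x=0.5, y=0.5 result quoted above:

```python
from scipy.optimize import minimize
import numpy as np

# Same problem as above: minimize x^2 + y^2 subject to x + y >= 1, x, y >= 0
res = minimize(
    lambda v: v[0]**2 + v[1]**2,
    x0=[1.0, 1.0],
    jac=lambda v: np.array([2*v[0], 2*v[1]]),
    method="SLSQP",
    bounds=[(0, None), (0, None)],
    constraints={"type": "ineq", "fun": lambda v: v[0] + v[1] - 1},
)
print(res.x)  # ≈ [0.5, 0.5]
```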
Your optimization code should read like your math. With Optyx, `x + y >= 1` is exactly that, not a lambda buried in a constraint dictionary.
Python has excellent optimization libraries. SciPy provides algorithms. CVXPY handles convex problems. Pyomo scales to industrial applications.
Optyx takes a different path: radical simplicity.
- Write problems as you think them: `x**2 + y**2`, not `lambda v: v[0]**2 + v[1]**2`
- Never compute gradients by hand: symbolic autodiff handles derivatives
- Skip solver configuration: sensible defaults, automatic solver selection
Optyx is young and opinionated. It's not a replacement for specialized tools:
| Need | Use Instead |
|---|---|
| Large-scale MILP with custom branching | Pyomo, OR-Tools, Gurobi |
| Convex guarantees | CVXPY |
| Maximum performance | Raw solver APIs |
Optyx does support MILP (via HiGHS), sparse LPs with 100k+ variables, and solver callbacks—but if you need industrial-grade MIP with cutting planes, a dedicated solver is the right choice.
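For a sense of the backend-level interface the sparse-LP path targets: in raw SciPy, HiGHS is reached through `linprog`, which accepts a sparse constraint matrix directly. This is a sketch of the underlying call, not Optyx API:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.sparse import csr_matrix

# Minimize x1 + 2*x2 + 3*x3
# subject to x1 + x2 >= 1 and x2 + x3 >= 1, with all x >= 0 (linprog's default bounds).
c = np.array([1.0, 2.0, 3.0])
A_ub = csr_matrix([[-1.0, -1.0,  0.0],   # -(x1 + x2) <= -1
                   [ 0.0, -1.0, -1.0]])  # -(x2 + x3) <= -1
b_ub = np.array([-1.0, -1.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
print(res.x, res.fun)  # optimum: x2 = 1, objective ≈ 2.0
```

The sparse format matters at scale: for a 100k-variable LP a dense constraint matrix would not fit in memory, while a CSR matrix stores only the nonzeros.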
```shell
pip install optyx
```

Requires Python 3.12+, NumPy ≥2.0, SciPy ≥1.7.
**Quickstart**

```python
from optyx import Variable, Problem

x = Variable("x", lb=0)
y = Variable("y", lb=0)

solution = (
    Problem()
    .minimize(x**2 + y**2)
    .subject_to(x + y >= 1)
    .solve()
)
# x=0.5, y=0.5, objective=0.5
```

**Portfolio optimization**

```python
from optyx import Variable, Problem

# Asset weights
tech = Variable("tech", lb=0, ub=1)
energy = Variable("energy", lb=0, ub=1)
finance = Variable("finance", lb=0, ub=1)

# Expected returns and risk (simplified)
returns = 0.12*tech + 0.08*energy + 0.10*finance
risk = tech**2 + energy**2 + finance**2  # variance proxy

solution = (
    Problem()
    .minimize(risk)
    .subject_to(returns >= 0.09)                  # minimum return
    .subject_to((tech + energy + finance).eq(1))  # fully invested
    .solve()
)
```

**Symbolic gradients**

```python
from optyx import Variable
from optyx.core.autodiff import gradient

x = Variable("x")
f = x**3 + 2*x**2 - 5*x + 3

df = gradient(f, x)             # Symbolic: 3x² + 4x - 5
print(df.evaluate({"x": 2.0}))  # 15.0
```

**Mixed-integer programming**

```python
from optyx import BinaryVariable, Problem

# Decision: pick an item (1) or leave it out (0)
x1 = BinaryVariable("x1")
x2 = BinaryVariable("x2")
x3 = BinaryVariable("x3")
x4 = BinaryVariable("x4")
x5 = BinaryVariable("x5")

value = 10*x1 + 20*x2 + 15*x3 + 25*x4 + 30*x5
weight = 5*x1 + 10*x2 + 8*x3 + 12*x4 + 15*x5

solution = (
    Problem()
    .maximize(value)
    .subject_to(weight <= 30)
    .solve()
)
# Automatically routes to a MILP solver
```

| Feature | Description |
|---|---|
| Natural syntax | `x + y >= 1` instead of constraint dictionaries |
| Automatic gradients | Symbolic differentiation—no manual derivatives |
| Smart solver selection | HiGHS for LP/MILP, SLSQP/BFGS for NLP |
| Mixed-integer programming | BinaryVariable, IntegerVariable, automatic MILP routing |
| Vector & matrix variables | VectorVariable, MatrixVariable, VariableDict for scalable models |
| Sparse LP support | `subject_to(A @ x <= b)` with `as_matrix(..., storage="auto")` |
| Solver callbacks | Monitor progress, enforce time limits, early termination |
| LP format export | Problem.write("model.lp") for interop with other solvers |
| Solution serialization | to_json() / from_json() for logging and auditing |
| Fast re-solve | Cached compilation + warm starts, up to 900x speedup |
| Debuggable | Inspect expression trees, understand your model |
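As a quick sanity check of the symbolic result in the autodiff example above, the same derivative can be approximated with a plain central difference. This is pure Python and assumes nothing about the Optyx API:

```python
# Same function as the autodiff example: f(x) = x^3 + 2x^2 - 5x + 3
def f(x):
    return x**3 + 2*x**2 - 5*x + 3

x0, h = 2.0, 1e-6
fd = (f(x0 + h) - f(x0 - h)) / (2 * h)  # central difference approximation of f'(x0)
print(fd)  # ≈ 15.0, matching the symbolic 3x² + 4x - 5 evaluated at x = 2
```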
See the documentation for the full API reference, tutorials, and real-world examples.
Optyx is actively evolving:
- MIQP / MINLP support — Quadratic and nonlinear MIP via native HiGHS or Gurobi
- MPS format I/O — Import and export MPS files for solver interop
- More solvers — IPOPT integration for large-scale NLP
- Better debugging — Infeasibility diagnostics and model inspection
See the roadmap for details.
```shell
git clone https://github.com/optyx-dev/optyx.git
cd optyx
uv sync
uv run pytest
```

Contributions welcome! See our contributing guide.
MIT