An autonomous ML framework that uses meta-learning and genetic algorithms to evolve neural architectures in real time, with automatic feature engineering and zero-downtime deployments
- Overview
- Features
- Architecture
- Quick Start
- Installation
- Usage
- Configuration
- Examples
- Development
- Contributing
NeuralForge is an autonomous machine learning framework written in Haskell that combines:
- 🧬 Genetic Algorithms: Evolutionary architecture search for optimal neural network designs
- 🧠 Meta-Learning: Transfer knowledge from past experiments to accelerate new ones
- ⚡ Auto Feature Engineering: Automatic creation of polynomial features and interactions
- 🔄 Zero-Downtime Deployment: Blue-green deployment with automatic rollback
- 🎨 Real-time Evolution: Continuously improve model architecture during training
- ✅ Automatic Neural Architecture Search (NAS) using genetic algorithms
- ✅ Meta-learning with knowledge base of past experiments
- ✅ Automatic feature engineering (polynomial, interactions, scaling)
- ✅ Multiple activation functions (ReLU, Tanh, Sigmoid, ELU, Leaky ReLU)
- ✅ Dynamic hyperparameter optimization
- ✅ Zero-downtime deployments with health checks
- ✅ REST API for training, prediction, and evolution
- ✅ CLI utility with colorful progress displays
- ✅ Model versioning and checkpoint management
- ✅ TensorBoard integration for visualization
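The automatic feature engineering step expands each raw feature vector with polynomial terms and pairwise interaction products before training. A minimal Haskell sketch of that idea (illustrative only; the real transformation in pipeline.hs also covers scaling and feature selection and may differ in detail):

```haskell
-- Illustrative sketch of degree-2 polynomial and interaction features;
-- the real pipeline.hs also performs scaling and feature selection.
import Data.List (tails)

-- | Squared terms (polynomial degree 2).
polynomialFeatures :: [Double] -> [Double]
polynomialFeatures xs = [x * x | x <- xs]

-- | Products of every unordered pair of distinct features.
interactionFeatures :: [Double] -> [Double]
interactionFeatures xs = [x * y | (x : rest) <- tails xs, y <- rest]

-- | Original features, then polynomial terms, then interactions.
engineerFeatures :: [Double] -> [Double]
engineerFeatures xs = xs ++ polynomialFeatures xs ++ interactionFeatures xs

main :: IO ()
main = print (engineerFeatures [1.0, 2.0, 3.0])
-- [1.0,2.0,3.0,1.0,4.0,9.0,2.0,3.0,6.0]
```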
- 🧬 Population-based search with configurable size
- 🎯 Tournament selection for parent selection
- 🔀 Crossover and mutation operators
- 🏆 Elitism to preserve best individuals
- 📊 Fitness evaluation on validation data
- 📈 Real-time statistics and progress tracking
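The operators listed above act on architectures encoded as genomes of layer widths. The sketch below shows roughly what tournament selection, one-point crossover, mutation, and elitism look like for such genomes; it is an assumption-laden illustration, not the actual genetics.hs code:

```haskell
-- Illustrative sketch of the operators listed above, applied to
-- architectures encoded as lists of layer widths. Names and details
-- are assumptions, not the actual genetics.hs implementation.
import Data.List (maximumBy, sortBy)
import Data.Ord (Down (..), comparing)
import System.Random (randomRIO)

type Genome  = [Int]    -- hidden layer widths, e.g. [128, 64, 32]
type Fitness = Double   -- validation score

-- | Tournament selection: sample k individuals, keep the fittest
-- (assumes a non-empty population and k >= 1).
tournament :: Int -> [(Genome, Fitness)] -> IO Genome
tournament k pop = do
  picks <- mapM (const (pick pop)) [1 .. k]
  pure (fst (maximumBy (comparing snd) picks))
  where
    pick xs = (xs !!) <$> randomRIO (0, length xs - 1)

-- | One-point crossover between two parent architectures.
crossover :: Genome -> Genome -> IO Genome
crossover a b = do
  cut <- randomRIO (1, min (length a) (length b) - 1)
  pure (take cut a ++ drop cut b)

-- | Mutation: with probability `rate`, resize a layer within the
-- 32..512 neuron range used by the search space.
mutate :: Double -> Genome -> IO Genome
mutate rate = mapM $ \w -> do
  roll <- randomRIO (0, 1) :: IO Double
  if roll < rate then randomRIO (32, 512) else pure w

-- | Elitism: carry the n fittest genomes over unchanged.
elites :: Int -> [(Genome, Fitness)] -> [Genome]
elites n = map fst . take n . sortBy (comparing (Down . snd))

main :: IO ()
main = do
  child <- crossover [128, 64, 32] [256, 128, 64, 32]
  mutate 0.1 child >>= print
```

Population size, tournament size, and the mutation/crossover rates correspond to the evolution settings in config.yaml shown later in this README.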
NeuralForge/
├── main.hs       # Entry point and CLI router
├── pipeline.hs   # Training and evolution pipeline
├── genetics.hs   # Genetic algorithm implementation
├── api_server.hs # REST API server
├── cli.hs        # CLI utilities and display
├── config.yaml   # AutoML configuration
└── testdata.csv  # Sample dataset
      ┌─────────────┐
      │   main.hs   │ ← Entry Point
      └──────┬──────┘
             │
         ┌───┴────┐
         │        │
         ▼        ▼
      ┌──────┐ ┌─────────┐
      │ CLI  │ │   API   │
      └──┬───┘ └────┬────┘
         │          │
         └────┬─────┘
              │
              ▼
         ┌──────────┐
         │ Pipeline │ ← Training & Evolution
         └─────┬────┘
               │
               ▼
         ┌──────────┐
         │ Genetics │ ← GA Engine
         └──────────┘
- GHC 8.10+ or Stack
- cabal-install 3.0+
# Clone the repository
git clone https://github.com/SnakeEye-sudo/NeuralForge.git
cd NeuralForge
# Install dependencies
cabal update
cabal install --only-dependencies
# Build the project
cabal build
# Or using Stack
stack build

# Train with default configuration
cabal run neuralforge -- train config.yaml
# Or run evolution for 50 generations
cabal run neuralforge -- evolve config.yaml 50

# Basic training
neuralforge train config.yaml
# With custom epochs (via config)
# Edit config.yaml and set epochs: 200
neuralforge train config.yaml

# Evolve for 100 generations
neuralforge evolve config.yaml 100
# The best architecture will be displayed
# Generation 100
# Best Fitness: 0.9723
# Best Architecture: [128, 256, 128, 64, 32]

# Run predictions on new data
neuralforge predict models/best_model.bin testdata.csv
# Output:
# Predictions:
# 0.8234
# 0.1456
# 0.9123
# ...

# Start on default port (8080)
neuralforge serve
# Start on custom port
neuralforge serve 3000

# Export trained model
neuralforge export models/model.bin exports/production_model.onnx

# View system status
neuralforge status
# ✓ NeuralForge System Status
# - Core Engine: Running
# - Evolution Engine: Ready
# - API Server: Available
# - GPU Acceleration: Enabled

curl http://localhost:8080/api/health

# Response:
{
  "status": "ok",
  "message": "NeuralForge API is running"
}

curl -X POST http://localhost:8080/api/train \
-H "Content-Type: application/json" \
-d '{
"configPath": "config.yaml",
"epochs": 100
}'
# Response:
{
"status": "success",
"message": "Training started",
"data": "Training with 100 epochs"
}curl -X POST http://localhost:8080/api/evolve \
-H "Content-Type: application/json" \
-d '{
"config": "config.yaml",
"generations": 50
}'
# Response:
{
"status": "success",
"message": "Evolution started",
"data": "Running 50 generations"
}curl -X POST http://localhost:8080/api/predict \
-H "Content-Type: application/json" \
-d '{
"modelPath": "models/best_model.bin",
"dataPath": "testdata.csv"
}'
# Response:
{
"status": "success",
"message": "Predictions generated",
"data": "Model: models/best_model.bin"
}Edit config.yaml to customize NeuralForge behavior:
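Every endpoint above replies with the same status/message/data envelope. A hedged sketch of how that envelope could be modelled with aeson (the type and field names here are assumptions, not the actual api_server.hs definitions):

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Illustrative response envelope; names are assumptions, not the
-- actual api_server.hs types.
import Data.Aeson (ToJSON (..), encode, object, (.=))
import qualified Data.ByteString.Lazy.Char8 as BL

data ApiResponse = ApiResponse
  { status  :: String
  , message :: String
  , payload :: String   -- rendered as the "data" field below
  }

instance ToJSON ApiResponse where
  toJSON r = object
    [ "status"  .= status r
    , "message" .= message r
    , "data"    .= payload r
    ]

main :: IO ()
main = BL.putStrLn $
  encode (ApiResponse "success" "Training started" "Training with 100 epochs")
```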
Edit config.yaml to customize NeuralForge behavior:

training:
  dataPath: "./testdata.csv"
  outputPath: "./models/neuralforge_model.bin"
  epochs: 100
  validationSplit: 0.2
  earlyStoppingPatience: 10

architecture:
  layers: [128, 64, 32]
  activation: "relu"
  learningRate: 0.001
  batchSize: 32
  dropout: 0.3
  optimizer: "adam"

autoFeatureEngineering:
  enabled: true
  polynomialDegree: 2
  interactions: true
  scaling: "standard"
  featureSelection:
    enabled: true
    method: "mutual_info"
    threshold: 0.05

evolution:
  populationSize: 50
  generations: 100
  tournamentSize: 5
  mutationRate: 0.1
  crossoverRate: 0.8
  elitism: 5

searchSpace:
  layers:
    min: 2
    max: 5
  neuronsPerLayer:
    min: 32
    max: 512
  activations:
    - relu
    - tanh
    - sigmoid
    - elu
    - leaky_relu
  learningRate:
    min: 0.0001
    max: 0.1
    log_scale: true
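For reference, the training block of this file could be decoded in Haskell with the yaml package roughly as follows (a sketch assuming generic FromJSON instances; the real loader in the pipeline may be structured differently):

```haskell
{-# LANGUAGE DeriveGeneric #-}

-- Illustrative only: decode the `training` block of config.yaml using
-- generic FromJSON instances; the real loader may differ.
import Data.Yaml (FromJSON, decodeFileThrow)
import GHC.Generics (Generic)

data TrainingConfig = TrainingConfig
  { dataPath              :: FilePath
  , outputPath            :: FilePath
  , epochs                :: Int
  , validationSplit       :: Double
  , earlyStoppingPatience :: Int
  } deriving (Show, Generic)

instance FromJSON TrainingConfig

newtype Config = Config { training :: TrainingConfig }
  deriving (Show, Generic)

instance FromJSON Config

main :: IO ()
main = do
  cfg <- decodeFileThrow "config.yaml"
  print (epochs (training cfg))   -- prints 100 with the settings above
```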
# 1. Prepare your data in CSV format (like testdata.csv)
# 2. Configure training parameters
# 3. Run training
neuralforge train config.yaml

# 4. Monitor progress with colorful CLI output
# [████████████████████████████████████████] 100% | Epoch: 100/100 | Loss: 0.0234

# Let NeuralForge find the best architecture
neuralforge evolve config.yaml 50
# Watch as it evolves:
# Generation 1
# Best Fitness: 0.7234
# Best Architecture: [256, 128, 64]
#
# Generation 50
# Best Fitness: 0.9723
# Best Architecture: [384, 256, 192, 96, 48]

import requests
# Start evolution via API
response = requests.post('http://localhost:8080/api/evolve', json={
    'config': 'config.yaml',
    'generations': 100
})
print(response.json())
# {'status': 'success', 'message': 'Evolution started', ...}

# Using Cabal
cabal clean
cabal configure
cabal build
# Using Stack
stack clean
stack build

# Run test suite
cabal test
# Or with Stack
stack test

- main.hs: Entry point, command-line argument parsing
- pipeline.hs: Training pipeline, evolution orchestration
- genetics.hs: GA operators (selection, crossover, mutation)
- api_server.hs: WAI/Warp-based REST API
- cli.hs: Terminal UI utilities with ANSI colors
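The command routing performed by main.hs can be pictured roughly as below; the handler names are placeholders for illustration, not the real exported functions:

```haskell
-- Illustrative sketch of the command routing in main.hs; the handler
-- names below are placeholders, not the real exported functions.
import System.Environment (getArgs)
import System.Exit (exitFailure)

main :: IO ()
main = do
  args <- getArgs
  case args of
    ["train", cfg]          -> runTraining cfg
    ["evolve", cfg, gens]   -> runEvolution cfg (read gens)
    ["predict", model, csv] -> runPrediction model csv
    ["serve"]               -> runServer 8080
    ["serve", port]         -> runServer (read port)
    _                       -> do
      putStrLn "usage: neuralforge (train|evolve|predict|serve) ..."
      exitFailure

-- Placeholder handlers so the sketch compiles on its own.
runTraining :: FilePath -> IO ()
runTraining cfg = putStrLn ("training with " ++ cfg)

runEvolution :: FilePath -> Int -> IO ()
runEvolution cfg gens = putStrLn ("evolving for " ++ show gens ++ " generations from " ++ cfg)

runPrediction :: FilePath -> FilePath -> IO ()
runPrediction model csv = putStrLn ("predicting " ++ csv ++ " with " ++ model)

runServer :: Int -> IO ()
runServer port = putStrLn ("serving on port " ++ show port)
```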
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
This project is licensed under the MIT License.
- Inspired by AutoML research and neural architecture search
- Built with Haskell's powerful type system and functional paradigm
- Leveraging genetic algorithms for optimization
Er. Sangam Krishna
GitHub: @SnakeEye-sudo
Email: [email protected]
⭐ Star this repository if you find it useful!