Logistic Regression Demo
AIO2025: Module 06.
About this Logistic Regression Demo
Interactive demonstration of Logistic Regression using NumPy and gradient descent. Learn binary classification with sigmoid activation, binary cross-entropy loss, and an adjustable prediction threshold. Visualize training metrics and experiment with different threshold values.
How to Use: Select binary classification data → Configure the target (must have 2 classes) → Set training parameters → Adjust the threshold → Enter feature values → Run training!
Start with sample datasets or upload your own CSV/Excel files.
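For reference, a minimal sketch of this data-loading step with pandas; the file name `data.csv` and the target column name `label` are placeholder assumptions for illustration, not names used by the demo:

```python
import pandas as pd

# Hypothetical CSV path and target column name (adjust to your own data).
df = pd.read_csv("data.csv")
target_col = "label"

# Logistic regression here is binary: the target must contain exactly 2 classes.
classes = df[target_col].unique()
assert len(classes) == 2, f"Expected 2 classes, found {len(classes)}: {classes}"

# Features as a float matrix, labels mapped to 0/1.
X = df.drop(columns=[target_col]).to_numpy(dtype=float)
y = (df[target_col] == classes[1]).astype(int).to_numpy()
print(X.shape, y.shape)
```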
Sample Datasets
Target Column
Data Preview (First 5 Rows)
Logistic Regression Parameters
Current Learning Rate: 0.01
Current Batch Size: Full Batch
Data Split Configuration
Train/Validation split (range 0.6 to 0.9)
Prediction Threshold Configuration
Current Threshold: 0.50 (range 0 to 1)
Logistic Regression Results & Visualization
**Logistic Regression Results**
Training details will appear here, showing model performance, learned parameters, and predictions at the current threshold.
Logistic Regression Guide:
Training Metrics:
- Loss (BCE): Binary cross-entropy loss decreases as the model learns. Lower loss indicates a better fit.
- Accuracy: Classification accuracy improves during training. Monitor both training and validation accuracy (see the sketch after this list).
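To make these two metrics concrete, here is a small NumPy sketch of how BCE loss and thresholded accuracy can be computed; variable names are illustrative, and the demo's internal code may differ:

```python
import numpy as np

def bce_loss(y_true, y_prob, eps=1e-12):
    """Binary cross-entropy averaged over samples; eps avoids log(0)."""
    y_prob = np.clip(y_prob, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob))

def accuracy(y_true, y_prob, threshold=0.5):
    """Fraction of samples whose thresholded prediction matches the label."""
    y_pred = (y_prob >= threshold).astype(int)
    return np.mean(y_pred == y_true)

# Made-up labels and predicted probabilities, purely for illustration.
y_true = np.array([0, 1, 1, 0])
y_prob = np.array([0.2, 0.8, 0.6, 0.4])
print(bce_loss(y_true, y_prob), accuracy(y_true, y_prob))
```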
Training Parameters:
- Epochs: Number of complete passes through the training data. More epochs allow more learning, but watch for overfitting.
- Learning Rate: Step size for gradient descent. Recommended: 0.001 to 0.01. Values that are too high may cause instability.
- Batch Size: Number of samples processed before each parameter update. Use powers of 2 (1, 2, 4, 8, ...) or Full Batch. Smaller batches give faster but noisier updates; larger batches are more stable.
- Train/Validation Split: Proportion of data used for training vs. validation. Default is an 80/20 split. (A training-loop sketch using these parameters follows this list.)
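The following is a condensed sketch of how these parameters might fit together in a NumPy gradient-descent loop. It is an assumed implementation for illustration only, not the demo's actual code:

```python
import numpy as np

def sigmoid(z):
    """Map linear scores to probabilities in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, epochs=100, lr=0.01, batch_size=None, val_ratio=0.2, seed=0):
    """Mini-batch gradient descent for logistic regression with BCE loss."""
    rng = np.random.default_rng(seed)

    # Train/validation split (val_ratio=0.2 corresponds to the default 80/20 split).
    idx = rng.permutation(len(y))
    n_val = int(len(y) * val_ratio)
    val_idx, train_idx = idx[:n_val], idx[n_val:]
    Xtr, ytr, Xva, yva = X[train_idx], y[train_idx], X[val_idx], y[val_idx]

    w, b = np.zeros(X.shape[1]), 0.0
    bs = len(ytr) if batch_size is None else batch_size  # None means Full Batch

    for _ in range(epochs):
        order = rng.permutation(len(ytr))
        for start in range(0, len(ytr), bs):
            batch = order[start:start + bs]
            p = sigmoid(Xtr[batch] @ w + b)   # predicted probabilities
            err = p - ytr[batch]              # gradient of mean BCE w.r.t. the logit
            w -= lr * (Xtr[batch].T @ err) / len(batch)
            b -= lr * err.mean()

    val_acc = np.mean((sigmoid(Xva @ w + b) >= 0.5) == yva)
    return w, b, val_acc
```

Passing `batch_size=None` here plays the role of the Full Batch option: every update uses all training samples, which is the most stable setting.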
Threshold Parameter:
- Threshold: Probability cutoff for binary classification. If predicted probability ≥ threshold → class 1, else → class 0.
- Default: 0.5 (balanced)
- Lower threshold (e.g., 0.3): More predictions of class 1 → higher recall, lower precision
- Higher threshold (e.g., 0.7): Fewer predictions of class 1 → higher precision, lower recall
- Experiment: Adjust the threshold to see how predictions and accuracy change (see the sketch after this list)!
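As a concrete illustration of this trade-off, the sketch below thresholds a set of invented probabilities at 0.3, 0.5, and 0.7 and reports precision and recall; the numbers are made up for illustration:

```python
import numpy as np

# Hypothetical true labels and predicted probabilities.
y_true = np.array([0, 0, 1, 1, 1, 1, 1, 0])
y_prob = np.array([0.10, 0.35, 0.55, 0.62, 0.48, 0.70, 0.90, 0.20])

for threshold in (0.3, 0.5, 0.7):
    y_pred = (y_prob >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    print(f"threshold={threshold}: precision={precision:.2f}, recall={recall:.2f}")
```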
Algorithm Details:
- Sigmoid Activation: Maps the linear output to a probability in the (0, 1) range
- Binary Cross-Entropy Loss: Loss function suited to binary classification tasks
- Feature Normalization: Automatic standardization (zero mean, unit variance) for stable training (see the sketch after this list)
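A sketch of the normalization step, assuming per-feature standardization with statistics computed on the training split; the demo's exact implementation may differ:

```python
import numpy as np

def standardize_fit(X_train, eps=1e-8):
    """Compute per-feature mean and standard deviation on the training split."""
    mean = X_train.mean(axis=0)
    std = X_train.std(axis=0) + eps  # eps guards against constant features
    return mean, std

def standardize_apply(X, mean, std):
    """Apply training statistics so train, validation, and new inputs share one scale."""
    return (X - mean) / std

# Tiny made-up feature matrix for illustration.
X_train = np.array([[1.0, 200.0], [2.0, 220.0], [3.0, 260.0]])
mean, std = standardize_fit(X_train)
print(standardize_apply(X_train, mean, std))
```

The same training-set mean and standard deviation should be reused when scaling validation data and any new feature values entered for prediction.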
Tips:
- Start with default parameters (100 epochs, learning rate 0.01, threshold 0.5)
- Monitor validation metrics to detect overfitting
- Adjust threshold based on your classification goals (precision vs recall)
- Use batch size = Full Batch for the most stable training
Created by VLAI from AI VIET NAM