Meta-Stacking
Ensemble learning where base models' out-of-fold predictions feed a meta-learner, extracting signal that no single model could capture alone.
8 min read

Data Scientist · Deep Learning · LLMs · Computer Vision
ML experiments, implementations & research notes
// in-depth technical breakdowns & experiments
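The featured meta-stacking recipe can be sketched end to end in a few lines of NumPy. Everything below is illustrative, not the article's code: the toy regression data, the two least-squares "base models" on different feature views, and the linear meta-learner are stand-ins for real learners.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the target mixes both features, so neither single-view
# base model can capture it alone.
X = rng.normal(size=(200, 2))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

def fit_lstsq(F, y):
    """Least-squares fit with a bias column; returns the weight vector."""
    A = np.column_stack([F, np.ones(len(F))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def predict_lstsq(w, F):
    return np.column_stack([F, np.ones(len(F))]) @ w

# Two base models, each seeing a different view of the features.
views = [X[:, :1], X[:, 1:] ** 2]

# Out-of-fold predictions: each sample's meta-feature comes from a model
# that never saw that sample during training.
idx = np.arange(len(X))
folds = np.array_split(idx, 5)
oof = np.zeros((len(X), len(views)))
for j, V in enumerate(views):
    for f in folds:
        train = np.setdiff1d(idx, f)
        w = fit_lstsq(V[train], y[train])
        oof[f, j] = predict_lstsq(w, V[f])

# Meta-learner: a linear blend of the base models' OOF predictions.
w_meta = fit_lstsq(oof, y)
blend = predict_lstsq(w_meta, oof)

mse_meta = np.mean((blend - y) ** 2)
mse_base = min(np.mean((oof[:, j] - y) ** 2) for j in range(len(views)))
```

The key discipline is that every meta-feature is an out-of-fold prediction, so the meta-learner blends honest generalization behavior rather than each base model's in-sample fit.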
A sanity check for your data pipeline: train a classifier to distinguish train from test and catch distribution drift before it burns you.
6 min read

A complete pipeline: rolling-window feature engineering on climate data, wired into an RNN for next-day temperature prediction.
10 min read

Feature engineering meets gradient boosting on the Palmer Penguins dataset: polynomial expansion plus CatBoost's native categorical handling, reaching 97.1% test accuracy.
7 min read

Building K-Means from scratch in pure NumPy and finding the optimal K via the elbow method, with no scikit-learn clustering.
8 min read

Teaching an agent to balance a pole with reinforcement learning and PyTorch: experience replay, epsilon-greedy exploration, and a mean reward of 226.2.
10 min read

Learning a structured latent space for image generation with PyTorch: the reparameterization trick, KL annealing, and smooth digit generation.
9 min read

Building Gini-based tree splitting in pure NumPy on the Iris dataset. No black box: every prediction follows a traceable path of if-else rules.
7 min read

Dimensionality reduction via SVD in pure NumPy: 10D down to 2D, validated against sklearn to 1e-6, with a scree plot, loadings heatmap, and reconstruction-error analysis.
8 min read

Density-based clustering in pure NumPy, no k required: core points, border points, and noise detection, tested on three Gaussian clusters with random scatter.
7 min read

From plain gradient descent to Adaptive Moment Estimation: momentum, RMSProp, and bias correction, then a benchmark pitting Adam against SGD and Momentum on XOR.
12 min read

Training a neural network with no autograd whatsoever: forward pass, cross-entropy loss, a full backward pass via the chain rule, and weight updates written out in plain NumPy.
9 min read
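The adversarial-validation idea from the pipeline sanity-check write-up above can be sketched in pure NumPy. The data, the size of the injected shift, and the tiny gradient-descent logistic regression are all illustrative assumptions; any classifier with a probabilistic score works.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical train/test split with a deliberate mean shift in one feature.
X_train = rng.normal(size=(300, 3))
X_test = rng.normal(size=(300, 3))
X_test[:, 0] += 1.5                      # the drifted feature

# Label each row's origin: 0 = train, 1 = test.
X = np.vstack([X_train, X_test])
origin = np.r_[np.zeros(300), np.ones(300)]

# Tiny logistic regression trained by gradient descent (classifier stand-in).
A = np.column_stack([X, np.ones(len(X))])
w = np.zeros(A.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-A @ w))
    w -= 0.1 * A.T @ (p - origin) / len(X)

scores = A @ w

def auc(scores, labels):
    """Rank-based AUC: probability a random test row outscores a train row."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# AUC near 0.5 means train and test look alike; well above 0.5 flags drift.
drift_auc = auc(scores, origin)
```

If the classifier cannot beat coin-flipping, the split is clean; here the injected shift makes the two sets easy to tell apart.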
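The rolling-window feature engineering from the climate-RNN piece above can be sketched with NumPy's stride tricks. The synthetic temperature series and the four engineered features are illustrative assumptions; the article's actual data and feature set may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily temperature series standing in for real climate data.
t = np.arange(120)
temps = 15 + 8 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, len(t))

window = 7  # one week of history per training sample

# Each row holds `window` consecutive days; the target is the following day.
windows = np.lib.stride_tricks.sliding_window_view(temps, window)[:-1]
y = temps[window:]

# Simple per-window features: last value, mean, spread, and in-window trend.
feats = np.column_stack([
    windows[:, -1],
    windows.mean(axis=1),
    windows.std(axis=1),
    windows[:, -1] - windows[:, 0],
])
```

In the article these windows would feed an RNN; the point here is the alignment, where sample i covers days i..i+6 and predicts day i+7, with no leakage from the future.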
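The from-scratch K-Means plus elbow method described above can be sketched as follows; the toy three-blob data and the restart count are illustrative choices, not the article's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
# Three well-separated Gaussian blobs.
X = np.vstack([rng.normal(c, 0.4, size=(60, 2)) for c in [(0, 0), (4, 0), (2, 4)]])

def kmeans(X, k, iters=50, restarts=5):
    """Lloyd's algorithm with random restarts; returns (labels, inertia)."""
    best_inertia, best_labels = np.inf, None
    for seed in range(restarts):
        r = np.random.default_rng(seed)
        centers = X[r.choice(len(X), k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(X[:, None] - centers[None], axis=2)  # (n, k)
            labels = d.argmin(axis=1)
            new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        inertia = ((X - centers[labels]) ** 2).sum()
        if inertia < best_inertia:
            best_inertia, best_labels = inertia, labels
    return best_labels, best_inertia

# Elbow method: inertia drops sharply until k hits the true cluster
# count (3 here), then flattens.
inertias = [kmeans(X, k)[1] for k in range(1, 7)]
```

Plotting `inertias` against k would show the characteristic elbow at k = 3.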
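The Gini-based splitting behind the decision-tree article above reduces to a small exhaustive search. The six-point toy dataset below is an illustrative stand-in for Iris; only the split criterion itself is the point.

```python
import numpy as np

def gini(y):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / len(y)
    return 1.0 - (p ** 2).sum()

def best_split(X, y):
    """Search every (feature, threshold) pair; minimize the children's
    weighted Gini impurity."""
    best = (None, None, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all():                 # right child would be empty
                continue
            score = (left.sum() * gini(y[left])
                     + (~left).sum() * gini(y[~left])) / len(y)
            if score < best[2]:
                best = (j, t, score)
    return best

# Toy data: perfectly separable on feature 0 at threshold 0.4.
X = np.array([[0.1, 5.0], [0.2, 1.0], [0.4, 3.0],
              [0.7, 2.0], [0.8, 4.0], [0.9, 0.0]])
y = np.array([0, 0, 0, 1, 1, 1])
feat, thr, score = best_split(X, y)
```

A full tree just applies this search recursively to each child, which is what makes every prediction a traceable chain of if-else comparisons.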
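The PCA-via-SVD pipeline from the dimensionality-reduction piece above can be sketched directly; the synthetic 10-D data with two dominant latent directions is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

# 10-D data whose variance is concentrated in 2 latent directions.
latent = rng.normal(size=(200, 2)) * np.array([3.0, 1.5])
mix = rng.normal(size=(2, 10))
X = latent @ mix + 0.05 * rng.normal(size=(200, 10))

# PCA via SVD: center, decompose, project onto top-k right singular vectors.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T                        # 2-D scores
explained = S ** 2 / (len(X) - 1)        # per-component variance (scree plot data)
ratio = explained[:k].sum() / explained.sum()

# Reconstruction from 2 components recovers X almost exactly here,
# since only the small noise components are discarded.
X_rec = Z @ Vt[:k] + X.mean(axis=0)
err = np.abs(X - X_rec).max()
```

The rows of `Vt[:k]` are the loadings, and `explained` is exactly what a scree plot would show.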
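The DBSCAN mechanics described above (core points, border points, noise, no k) fit in a short NumPy function. The three-blob-plus-scatter data and the `eps`/`min_pts` values are illustrative choices.

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN: returns cluster ids 0..k-1, with -1 marking noise."""
    n = len(X)
    dist = np.linalg.norm(X[:, None] - X[None], axis=2)
    neighbors = [np.flatnonzero(row <= eps) for row in dist]
    core = np.array([len(nb) >= min_pts for nb in neighbors])  # dense interiors
    labels = np.full(n, -1)
    cid = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue
        labels[i] = cid
        stack = [i]
        while stack:                 # flood-fill through density-connected cores
            j = stack.pop()
            if not core[j]:
                continue             # border points join but never extend a cluster
            for nb in neighbors[j]:
                if labels[nb] == -1:
                    labels[nb] = cid
                    stack.append(nb)
        cid += 1
    return labels

rng = np.random.default_rng(5)
blobs = [rng.normal(c, 0.1, size=(40, 2)) for c in [(0, 0), (2, 0), (1, 2)]]
scatter = rng.uniform(-1, 3, size=(10, 2))   # background scatter
X = np.vstack(blobs + [scatter])
labels = dbscan(X, eps=0.3, min_pts=5)
n_clusters = labels.max() + 1
```

Note what never appears: a cluster count. The number of clusters falls out of the density structure, and the lone scatter points stay labeled -1.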
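The Adam mechanics named above (momentum, RMSProp-style scaling, bias correction) can be sketched as a single update rule. The quadratic objective below is an illustrative stand-in for the article's XOR benchmark.

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: momentum plus RMSProp scaling, with bias correction."""
    m = b1 * m + (1 - b1) * g            # first moment (momentum)
    v = b2 * v + (1 - b2) * g ** 2       # second moment (RMSProp)
    m_hat = m / (1 - b1 ** t)            # bias correction: moments start at zero
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize f(w) = ||w - target||^2 as a stand-in objective.
target = np.array([3.0, -2.0])
w = np.zeros(2)
m, v = np.zeros(2), np.zeros(2)
for t in range(1, 501):
    g = 2.0 * (w - target)               # analytic gradient of the quadratic
    w, m, v = adam_step(w, g, m, v, t)
dist = np.linalg.norm(w - target)        # should end close to the target
```

Dropping the `v` terms recovers plain momentum, and dropping `m` recovers RMSProp, which is exactly the progression the article walks through.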
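The no-autograd training loop described above can be sketched on XOR with every gradient written out by hand. The two-layer tanh architecture, width, and learning rate are illustrative assumptions; the article's network may differ.

```python
import numpy as np

rng = np.random.default_rng(6)

# XOR: four points a linear model cannot separate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
Y = np.eye(2)[y]                         # one-hot targets

W1 = rng.normal(0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1.0, (8, 2)); b2 = np.zeros(2)
lr, losses = 0.5, []

for _ in range(3000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    p = e / e.sum(axis=1, keepdims=True)                    # softmax
    losses.append(-np.log(p[np.arange(len(X)), y]).mean())  # cross-entropy
    # Backward pass: softmax + cross-entropy collapses to (p - Y).
    d_logits = (p - Y) / len(X)
    dW2 = h.T @ d_logits; db2 = d_logits.sum(axis=0)
    d_h = (d_logits @ W2.T) * (1 - h ** 2)                  # chain rule, tanh'
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)
    # Plain gradient-descent updates.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = (p.argmax(axis=1) == y).mean()
```

Each backward line is the transpose of the corresponding forward line, which is the whole of backpropagation once autograd is stripped away.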