Economist & Data Scientist | PyTorch-first · Bayesian inference · Reproducible, paper-faithful ML builds
I'm an economist and data scientist focused on modern deep learning and Bayesian macro/time series. I build reproducible, paper-faithful ML systems and apply them to policy-relevant problems.
- Reproducible PyTorch research builds: diffusion models, Vision Transformers, and GPT-style Transformers.
- Bayesian macro & time series: DSGE / SBVAR / SGDLM with posterior inference (IRFs/FEVD) + robustness checks.
- Econometrics × Deep Learning: representation learning with uncertainty quantification and identification for policy-relevant interpretation.
- Research tooling mindset: clean, modular code, configs, and reproducible scripts (a minimal sketch of the pattern follows this list).
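
As a small illustration of that tooling mindset, here is a minimal, self-contained sketch of the config-plus-seed pattern. The config values, model, and file names are placeholders for illustration, not code from any of the repos below.

```python
# Minimal sketch of a reproducible training script: one config object,
# fixed seeds, and the checkpoint saved together with the config that
# produced it. Names and values here are placeholders.
import random

import numpy as np
import torch

CONFIG = {"seed": 0, "lr": 3e-4, "epochs": 10}  # in practice loaded from a YAML/JSON file


def set_seed(seed: int) -> None:
    """Seed Python, NumPy, and PyTorch so a run can be repeated exactly."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)


def main() -> None:
    set_seed(CONFIG["seed"])
    model = torch.nn.Linear(4, 1)                      # stand-in for a real model
    opt = torch.optim.Adam(model.parameters(), lr=CONFIG["lr"])
    x, y = torch.randn(64, 4), torch.randn(64, 1)      # stand-in data
    for _ in range(CONFIG["epochs"]):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    # Artifacts carry the config, so every result points back to its settings.
    torch.save({"config": CONFIG, "state_dict": model.state_dict()}, "run_artifact.pt")


if __name__ == "__main__":
    main()
```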
Selected projects, spanning Bayesian macro/time series and modern deep learning (generative, vision, LLMs): reproducible pipelines, clean implementations, and paper-style reporting.
| Project | One-liner | Evidence |
|---|---|---|
| PyDSGEforge | Full-Python DSGE workflow: state-space, solution, and inference. | Reproducible examples + end-to-end scripts (solve → filter → estimate). |
| Bayesian SGDLM | Bayesian dynamic networks for high-dimensional time series. | Posterior simulation + sparse dependency learning demos. |
| DDPM Diffusion | Diffusion models with clean training + sampling. | Training + DDIM sampling scripts, denoising strips, checkpoints. |
| Multiscale ViTs | Unified benchmark of modern ViT families. | Shared pipeline, results table, ablations-ready structure. |
| implementing-gpt | From-scratch GPT training stack. | Tokenizer + training/eval scripts, reproducible configs. |
| Bayesian Structural VAR | SBVAR with posterior IRFs/FEVD and identification (see the sketch below the table). | IRF/FEVD from posterior draws + stored results. |
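
To give a flavor of the SBVAR workflow above, here is a hedged sketch of mapping posterior VAR draws to impulse responses under a recursive (Cholesky) identification. The function and variable names are illustrative placeholders, not the repository's API.

```python
# Sketch: impulse responses from posterior VAR draws (Cholesky identification
# assumed for illustration; names are placeholders, not the repo's API).
import numpy as np


def irf_from_draw(A: np.ndarray, sigma: np.ndarray, horizons: int) -> np.ndarray:
    """IRFs for one posterior draw.

    A      : (p, n, n) stacked VAR lag matrices A_1..A_p
    sigma  : (n, n) residual covariance; impact matrix = its Cholesky factor
    returns: (horizons + 1, n, n) response of each variable to each shock
    """
    p, n, _ = A.shape
    impact = np.linalg.cholesky(sigma)              # recursive identification
    psi = [np.eye(n)]                               # MA coefficients, psi_0 = I
    for h in range(1, horizons + 1):
        psi.append(sum(A[j] @ psi[h - 1 - j] for j in range(min(h, p))))
    return np.stack([p_h @ impact for p_h in psi])


# Usage: apply to each posterior draw, then take quantiles for credible bands.
rng = np.random.default_rng(0)
draws = [(0.5 * np.eye(3)[None] + 0.02 * rng.standard_normal((1, 3, 3)), np.eye(3))
         for _ in range(200)]                       # placeholder posterior draws
irfs = np.stack([irf_from_draw(A, S, horizons=12) for A, S in draws])
lo, med, hi = np.quantile(irfs, [0.16, 0.5, 0.84], axis=0)   # posterior IRF bands
```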
“Building reproducible ML + Bayesian tooling for scientific inference.”


