Computer Science Interests
Research-wise, I'm interested in quantum computing and deep learning. For the former, I follow current efforts to quantize existing machine learning algorithms; for the latter, see my research experience and projects below.
Experience:
Software Engineer Feb. 2023 →
Gearbox Software (Full-Time)
- Writing tools for game designers in C++20 and C# inside the behemoth codebase that is Unreal Engine 5.
Software Engineer Jan. 2021 → Feb. 2022
Leidos (Full-Time)
- Wrote C++11 for physics simulators on Linux as part of a small team working for the Department of Defense. Used GNU Autotools, Conan, and RPMs for building, packaging, and delivering code, respectively.
- Sped up the simulator by porting a signal-processing library from Python to C++11 (SciPy, FFTW, Jenkins).
- Fixed deadlocks and implemented features in distributed messaging brokers between a physics engine and end users. Written in Java and C++14 with Google Protobuf and ZeroMQ (broker pattern sketched after this list).
- Migrated a monitoring GUI from Python to TypeScript and React as part of our Kubernetes adoption.
- Productionized a research paper’s model of towed sensor arrays (ocean acoustics) by converting a physicist’s MATLAB code to C++14 with CUDA.
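A minimal sketch of the broker pattern from the messaging bullet above, written here in Python with pyzmq rather than the Java/C++14 of the real system; the addresses and socket roles are illustrative assumptions, and the Protobuf payloads are treated as opaque bytes that the proxy simply forwards:

    import zmq

    def run_broker(frontend_addr="tcp://*:5559", backend_addr="tcp://*:5560"):
        """Forward publisher traffic (e.g. physics-engine state updates) to subscribers."""
        ctx = zmq.Context.instance()

        # Publishers (the physics-engine side) connect here.
        frontend = ctx.socket(zmq.XSUB)
        frontend.bind(frontend_addr)

        # Subscribers (end-user tools) connect here.
        backend = ctx.socket(zmq.XPUB)
        backend.bind(backend_addr)

        # zmq.proxy blocks, shuttling messages one way and subscription frames the other.
        zmq.proxy(frontend, backend)

    if __name__ == "__main__":
        run_broker()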
Researcher May 2020 → Sep. 2021
University of Maryland, College Park (Part-Time)
- Investigated deep learning models for time-series under Professors Soheil Feizi and Hector Bravo (now of Genentech), and PhD student Aya Ismail.
- Trained neural networks (DeepAR, LSTMs, Transformers, TCNs) on SLURM-scheduled university servers and presented results at meetings. Written in Python with PyTorch (a representative training loop is sketched after this list).
- Extended recent work on provable guarantees for neural-network convergence by exploring time-series data augmentation via spherical-harmonics expansion.
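The research code is not reproduced here; the following is a minimal PyTorch sketch of the kind of training loop used for the time-series models above. The one-layer LSTM forecaster, the toy sine-wave data, and the hyperparameters are purely illustrative assumptions:

    import torch
    from torch import nn

    class LSTMForecaster(nn.Module):
        """One-step-ahead forecaster: encode a window, predict the next value."""
        def __init__(self, hidden_size=64):
            super().__init__()
            self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)

        def forward(self, x):             # x: (batch, time, 1)
            out, _ = self.lstm(x)
            return self.head(out[:, -1])  # (batch, 1)

    # Toy data: sliding windows over a noisy sine wave.
    t = torch.linspace(0, 100, 5000)
    series = torch.sin(t) + 0.1 * torch.randn_like(t)
    windows = series.unfold(0, 33, 1)                   # (N, 33)
    x = windows[:, :-1].unsqueeze(-1).contiguous()      # inputs  (N, 32, 1)
    y = windows[:, -1:].contiguous()                    # targets (N, 1)

    model = LSTMForecaster()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for epoch in range(5):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
        print(f"epoch {epoch}: mse={loss.item():.4f}")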
Computer Skills:
Proficient: Java, C++20
Working Knowledge: C, Python, R, TypeScript, MATLAB
Technologies: git, PyTorch, Conan, Autotools, Boost, SQL, SLURM
Personal Projects:
PyTorch Implementation of DeepAR Jan. 2021
- Implemented Amazon Research’s DeepAR neural network in PyTorch using their 2020 paper.
- Trained this autoregressive recurrent network for regression tasks on time-series.
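A minimal sketch of the mechanism the project implements from the DeepAR paper: an autoregressive LSTM whose per-step outputs parameterize a Gaussian over the next observation, trained by negative log-likelihood. The layer sizes and toy batch below are assumptions for illustration, not the project's actual code:

    import torch
    import torch.nn.functional as F
    from torch import nn

    class DeepARCore(nn.Module):
        """LSTM whose per-step output gives a Gaussian mean and positive scale."""
        def __init__(self, input_size=1, hidden_size=40):
            super().__init__()
            self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
            self.mu = nn.Linear(hidden_size, 1)
            self.pre_sigma = nn.Linear(hidden_size, 1)

        def forward(self, x):                            # x: (batch, time, 1)
            h, _ = self.lstm(x)
            mu = self.mu(h).squeeze(-1)                  # (batch, time)
            sigma = F.softplus(self.pre_sigma(h)).squeeze(-1) + 1e-6
            return mu, sigma

    def gaussian_nll(mu, sigma, target):
        """Negative log-likelihood of observations under the predicted Gaussians."""
        return -torch.distributions.Normal(mu, sigma).log_prob(target).mean()

    # One teacher-forced training step on a toy batch: lagged values predict the next ones.
    series = torch.randn(8, 25)                          # 8 toy series, 25 steps each
    x, target = series[:, :-1].unsqueeze(-1), series[:, 1:]
    model = DeepARCore()
    mu, sigma = model(x)
    loss = gaussian_nll(mu, sigma, target)
    loss.backward()
    print(f"NLL: {loss.item():.3f}")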
Synthetic Time-series Generator Jun. 2020
- Generated 75,000 synthetic datasets in CSV format for an ongoing research paper on recurrent neural networks
- Wrote framework supporting the creation of AR, ARMA, ARMAX, and non-linear stochastic processes (sketched after this project)
- Written in C++17 with Catch2, Boost Math, and Boost Multiprecision
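The framework itself is C++17; as a compact illustration of what it produces, here is a Python sketch of the ARMA recursion and the CSV output, with the coefficients, series length, and file name chosen purely for the example:

    import csv
    import random

    def simulate_arma(n, phi, theta, sigma=1.0, seed=0):
        """Simulate x_t = sum_i phi[i]*x_{t-1-i} + e_t + sum_j theta[j]*e_{t-1-j}."""
        rng = random.Random(seed)
        p, q = len(phi), len(theta)
        x, e = [], []
        for t in range(n):
            e_t = rng.gauss(0.0, sigma)
            ar = sum(phi[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
            ma = sum(theta[j] * e[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
            e.append(e_t)
            x.append(ar + ma + e_t)
        return x

    # Write one dataset; the real framework generated 75,000 of these with varied processes.
    series = simulate_arma(1000, phi=[0.6, -0.2], theta=[0.3])
    with open("arma_2_1.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "value"])
        writer.writerows(enumerate(series))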
Survey on Quantum Perceptrons May 2020
- Surveyed recent work quantizing the perceptron training algorithm for polynomial speedup
- Reviewed necessary game-theoretic and quantum concepts (Dürr-Høyer, amplitude amplification, etc.)
- Assumes basic knowledge of quantum computing, but none of machine learning
Multi-factor Stock Valuation Model May 2019
- Constructed multi-factor valuation model for stocks in accordance with Arbitrage Pricing Theory (fit-and-value step sketched after this project)
- Achieved annual return of 32.3% in test year 2017
- Trained model from May 2014 to December 2016 using MSCI Barra USE3 factors
- Written in R
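The model itself is written in R against MSCI Barra USE3 data; as a rough, language-agnostic illustration of the generic APT fit-and-value step (regress returns on factor returns to estimate loadings, then value a stock by its loadings times the factor premia), here is a NumPy sketch with entirely synthetic factors and returns:

    import numpy as np

    def fit_factor_model(stock_returns, factor_returns):
        """Estimate APT-style loadings by least squares: r_t = alpha + B @ f_t + eps_t."""
        T = factor_returns.shape[0]
        X = np.column_stack([np.ones(T), factor_returns])   # intercept + factor returns
        coef, *_ = np.linalg.lstsq(X, stock_returns, rcond=None)
        return coef[0], coef[1:]                             # alpha, betas

    def expected_return(betas, factor_premia):
        """APT valuation: expected excess return = loadings dotted with factor premia."""
        return betas @ factor_premia

    # Toy example: 3 made-up factors over 36 monthly observations.
    rng = np.random.default_rng(0)
    f = rng.normal(0.0, 0.02, size=(36, 3))                  # synthetic factor returns
    r = f @ np.array([1.1, 0.4, -0.3]) + rng.normal(0, 0.01, 36)
    alpha, betas = fit_factor_model(r, f)
    print("betas:", betas.round(2),
          "expected return:", float(expected_return(betas, f.mean(axis=0))))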