College Football Predictor

New for 2017 is a project that attempts to "one-up" the College Football Playoff committee.  The code correctly predicts the 2016 Top 4 teams and takes into account overall offensive and defensive performance, weighted by strength of schedule.

The code is written in Python and scrapes data from sports-reference.com and the CPI Ratings.  It uses total offense, total defense, turnover margin, total first downs, and penalties as inputs, and applies a stochastic gradient descent regressor, as implemented in scikit-learn, to rank all 130 teams.
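
For the curious, here is a minimal sketch of the general idea, using scikit-learn's SGDRegressor.  The feature columns, team rows, and target values below are illustrative placeholders, not the actual data scraped from sports-reference.com or the CPI Ratings.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One row per team: offensive yards/game, defensive yards/game allowed,
# turnover margin, first downs/game, penalty yards/game, strength of schedule.
# These numbers are made up for illustration.
X = np.array([
    [520.0, 310.0,  1.2, 26.0, 45.0, 0.85],
    [430.0, 380.0, -0.3, 22.0, 60.0, 0.55],
    [480.0, 340.0,  0.6, 24.0, 52.0, 0.70],
])
# Target: a composite quality score (e.g. a schedule-weighted win fraction).
y = np.array([0.92, 0.58, 0.75])

# Standardize the features, then fit a linear model by stochastic gradient descent.
model = make_pipeline(StandardScaler(), SGDRegressor(max_iter=2000, tol=1e-4))
model.fit(X, y)

# Rank teams by predicted score, best first.
ranking = np.argsort(model.predict(X))[::-1]
print(ranking)
```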

The code is on GitHub and currently considers UCF, Alabama, Georgia, and Clemson the best four teams in the country, but this is subject to change!

Click here to see the rankings as of December 21, 2017!

Notre Dame Stadium on Sept. 9, 2017


College Basketball Predictor

This project began as a benchmark to test my ability to fill out an NCAA basketball tournament bracket.  So far, "RoboKurt" is 2-2 against his human counterpart in bracket predictive power.

The code uses input from Jeff Sagarin's rankings, though friends tell me I should switch to the KenPom rankings as an initial input.  I then throw dice based on weighted criteria tuned to match the historical number of upsets in each bracket round.

The code is written in C++ and uses CERN ROOT for visual data output.  The full source can be found on GitHub.
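
Although the simulator itself lives in C++, the "weighted dice" idea is easy to sketch in a few lines of Python.  The per-round upset probabilities below are placeholders, not the historically tuned values the real code uses.

```python
import random

# Hypothetical chance that the lower-rated team wins, by bracket round (1 = first round).
UPSET_PROB = {1: 0.20, 2: 0.15, 3: 0.12, 4: 0.10, 5: 0.08, 6: 0.05}

def pick_winner(team_a, team_b, rating, rnd):
    """Advance the higher-rated team unless the weighted dice roll an upset."""
    favorite, underdog = sorted((team_a, team_b), key=rating.get, reverse=True)
    return underdog if random.random() < UPSET_PROB[rnd] else favorite

# Example with two made-up Sagarin-style ratings in the first round.
ratings = {"Team A": 91.3, "Team B": 84.7}
print(pick_winner("Team A", "Team B", ratings, rnd=1))
```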


Traffic Grid Optimization Project

After sitting behind endless sets of red lights on Chicago city streets, I wondered whether it would be possible to use machine learning techniques to optimize stoplight timing in a standard city grid.

The code attempts to optimize a simple 2x3 city grid with stoplights at each 4-way intersection.  The GUI is written using simple VPython blocks, and the neural network in the background is currently being rewritten to use Google's TensorFlow interface.
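
As a rough illustration of what "optimizing the grid" means here, the sketch below parameterizes each intersection by a green-phase split and a phase offset and searches for settings that minimize a made-up delay proxy.  The real project replaces the cost function with an actual traffic simulation and the random search with a neural network.

```python
import random

N_INTERSECTIONS = 6   # the 2x3 grid
CYCLE_LENGTH = 60.0   # seconds per signal cycle (assumed)

def random_timing():
    # (north-south green fraction, phase offset in seconds) for each intersection.
    return [(random.uniform(0.3, 0.7), random.uniform(0.0, CYCLE_LENGTH))
            for _ in range(N_INTERSECTIONS)]

def delay_proxy(timing):
    # Placeholder cost: penalize uneven green splits and badly staggered offsets.
    split_cost = sum((split - 0.5) ** 2 for split, _ in timing)
    offsets = sorted(off for _, off in timing)
    stagger_cost = sum((b - a - CYCLE_LENGTH / N_INTERSECTIONS) ** 2
                       for a, b in zip(offsets, offsets[1:]))
    return split_cost + stagger_cost

# Crude random search over timing plans; a stand-in for the learned policy.
best = min((random_timing() for _ in range(5000)), key=delay_proxy)
print(delay_proxy(best))
```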


b-quark Jet Identification in Heavy-Ion Collisions

In this project, completed as part of my research in heavy-ion physics, I used boosted decision trees to separate signals from a large (~95%) background.  Particle "jets" are highly energetic sprays of matter seen in particle collision experiments.  It is of interest in my research field to separate the different types of these jets, as they ought to interact in different ways with the hot and dense "soup" of subatomic particles also created in such collisions.

Boosted decision trees were used to optimize selections on roughly 30 different jet kinematic quantities, extracting the desired jet type ("b-jets") with roughly 50-60% purity, a factor of 14 improvement over a random selection.  The TMVA package, part of the CERN ROOT data analysis framework, was used for the boosted decision tree implementation.
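
For a flavor of what a TMVA setup looks like, here is a rough sketch using PyROOT and the newer DataLoader interface.  The input files, tree and variable names, and BDT options are illustrative only; the actual analysis used its own ~30 variables and tuning.

```python
import ROOT

# Hypothetical input files holding signal (b-jet) and background (light-jet) trees.
sig_file = ROOT.TFile.Open("bjets.root")
bkg_file = ROOT.TFile.Open("lightjets.root")
sig_tree = sig_file.Get("jetTree")
bkg_tree = bkg_file.Get("jetTree")

out_file = ROOT.TFile("tmva_bjet.root", "RECREATE")
factory = ROOT.TMVA.Factory("bJetID", out_file, "AnalysisType=Classification")
loader = ROOT.TMVA.DataLoader("dataset")

# A few example discriminating variables (secondary-vertex and track based).
for var in ("jetPt", "svMass", "svFlightDistSig", "nTracksSV"):
    loader.AddVariable(var, "F")

loader.AddSignalTree(sig_tree, 1.0)
loader.AddBackgroundTree(bkg_tree, 1.0)
loader.PrepareTrainingAndTestTree(ROOT.TCut(""), "SplitMode=Random:!V")

# Book an AdaBoost BDT, then train, test, and evaluate it.
factory.BookMethod(loader, ROOT.TMVA.Types.kBDT, "BDT",
                   "NTrees=400:MaxDepth=3:BoostType=AdaBoost:nCuts=20")
factory.TrainAllMethods()
factory.TestAllMethods()
factory.EvaluateAllMethods()
out_file.Close()
```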

The code is available on GitHub.