I'm a Machine Learning developer and researcher focusing on graph-structured learning with deep learning and genetic algorithms. I often work with chemical and biological assay data in scientific informatics.
I'm an occasional writer ✍ and a frequent reader 📖. I'm an avid piano player 🎹 and an abysmal frisbee tosser 🥏. I'm an enthused listener of any music genre 💿 and an obsessive collector of Hi-res audio for a single music genre 📀.
In my free time, I tend to and moderate the (small but growing) Geometric Deep Learning subreddit.
Most recently I've been building several generic graph neural network pipelines for rapid research and development.
I'm currently a Machine Learning Developer intern at Kebotix.
For the latest updates on what I'm doing, my setup, or just to learn a bit more about me, visit my blog 📝.
Relation Therapeutics - Research Intern
Sept 2019 - May 2020 - London, UK
ML for Chem/Bio-informatics - Personal Projects
Oct 2018 - June 2019 - Toronto, Canada
Canadian Imperial Bank of Commerce (CIBC) - Front-end Developer
June 2018 - Sept 2018 - Toronto, Canada
Jake P. Taylor-King, Cristian Regep, Jyothish Soman, Flawnson Tong, Catalina Cangea, Charlie Roberts
DDD allows for the fitting of continuous-time Markov chains over these basis functions and, as a result, maps continuously between distributions. The number of parameters in DDD scales with the square of the number of basis functions; we reformulate the problem and restrict the method to compactly supported basis functions, which leads to the inference of sparse matrices only -- hence reducing the number of parameters.
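As a rough illustration of why compact support yields sparsity (a sketch of mine with made-up hat basis functions, not the paper's actual basis or implementation): when two basis functions have disjoint supports, their overlap integral vanishes, so only entries between neighbouring functions survive and the matrix to infer grows as O(n) instead of O(n²).

```python
import numpy as np

def hat_basis(centers, width):
    """Compactly supported (triangular 'hat') basis functions."""
    def phi(i, x):
        return np.maximum(0.0, 1.0 - np.abs(x - centers[i]) / width)
    return phi

# n hat functions evenly spaced on [0, 1], each supported only
# on an interval of width 2*width around its center.
n = 10
centers = np.linspace(0.0, 1.0, n)
phi = hat_basis(centers, width=centers[1] - centers[0])

# Overlap matrix: entry (i, j) is nonzero only when the supports
# of phi_i and phi_j intersect (here: i and j are neighbours).
x = np.linspace(0.0, 1.0, 1000)
dx = x[1] - x[0]
G = np.array([[np.sum(phi(i, x) * phi(j, x)) * dx for j in range(n)]
              for i in range(n)])

nonzero = np.count_nonzero(G > 1e-12)
# A dense parameterization would need n*n = 100 entries; compact
# support leaves only a tridiagonal band of 3n - 2 = 28 nonzeros.
print(nonzero)
```

The same argument carries over to any family of compactly supported basis functions: the bandwidth of the resulting matrix is set by how many neighbours each function overlaps, not by the total number of functions.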