About Me


My first name is Thomas, but most people call me by my middle name – Andy. I’m currently a final-year PhD student supervised by Max Welling at the University of Amsterdam, planning to graduate in the summer of 2023, and actively looking for next steps! My PhD focuses on developing deep probabilistic generative models which are meaningfully structured with respect to observed real-world transformations. Such structure permits both improved generalization in previously unobserved settings and reduced sample complexity on natural tasks, thereby addressing two of the fundamental limitations of modern deep neural networks. The approaches I have taken to develop these structured representation learning algorithms are directly motivated by observations from neuroscience, such as topographic organization and cortical traveling waves, and further reinforced by ideas from machine learning and cognitive theory such as equivariance, optimal transport, and intuitive physics. In the long term, the goal of my research is to understand the abstract mechanisms underlying the apparent sample efficiency and generalizability of natural intelligence, and then to integrate these into artificially intelligent systems. In the short term, I hope to answer the questions of how transformations and invariances are learned and encoded in the brain, what inductive biases underlie our natural notions of intuitive physics, and how the 2-dimensional structure of the cortical surface shapes learning.

Find my full C.V. here

Find my publication list on Google Scholar

Education (click to expand)

Ph.D. Machine Learning (2018 - expected 2023) University of Amsterdam Supervisor: Max Welling
Thesis: Structured Representation Learning with Probabilistic Generative Models
M.S. Computer Science (2015 - 2017) University of California San Diego Supervisor: Garrison Cottrell
Thesis: Comparison and Fine-grained Analysis of Sequence Encoders for Natural Language Processing
B.S. Computer Science w/ Honors (2011 - 2015) California Institute of Technology Supervisor: Yasser Abu-Mostafa

Experience

Apple Machine Learning Research (Summer 2022)
  • Developed “Homomorphic Self-Supervised Learning”, a framework which subsumes data augmentation in self-supervised learning through structured equivariant representations.
  • Published a NeurIPS 2022 Self-Supervised Learning Workshop paper based on this work; the full AISTATS paper is still under review.
  • Additional collaborative work is under submission at ICML 2023.
Intel Nervana AI Lab (2016 - 2018)
  • Deep Learning Data Scientist (Sept. 2017 - Sept. 2018)
  • Algorithms Engineer Intern (June 2016 - June 2017)
Data Science for Social Good (Summer fellow 2015)
Lyve Minds Inc. (Analytics Engineering Intern Summer 2014)
  • Developed a supervised learning algorithm for automatic editing and summarization of user-generated handheld video based on predicted level of interest.
California Institute of Technology (Undergraduate Researcher 2012)

Teaching

Master’s Thesis Supervision

As Teaching Assistant

  • Leren (Bachelor’s Machine Learning)
  • Machine Learning 2 (Second Year Master’s)
  • Data Visualization (D3.js)

Personal

Outside of work, I enjoy cooking (@TheOtherThomasKeller), running, and playing with my gymnastics rings. I was also an organizing member of the Inclusive AI group at the UvA, whose goal is to reduce harmful bias (both algorithmic and human) in the field of machine learning. Please feel free to email me if you have any questions!

Twitter