About Me

I am a robotics and AI researcher, currently leading Embodied AI at Hello Robot. In the past, I’ve worked at NVIDIA Research and at FAIR, the fundamental AI research arm of Meta. I’m interested in how we can enable robots to work alongside humans to perform complex, multi-step tasks, using a combination of learning and planning.

Chris with a Hello Robot Stretch

Me with my Stretch, ready to pick and place some objects.

I got my PhD in Computer Science in 2018 from the Johns Hopkins University in Baltimore, Maryland, focusing on using learning to create powerful task and motion planning capabilities for robots operating in human environments.

From 2018 to 2022, I was with NVIDIA at their Seattle robotics lab. More recently, I have been working on approaches that tie together language, perception, and action in order to make robots into robust, versatile assistants for a variety of applications. Other areas of interest include human-robot interaction.

In 2022, I joined the Embodied AI team at FAIR Labs. My work has looked at how we can make robots into useful, general-purpose mobile manipulators in homes. In particular, I pushed a challenge we called open-vocabulary mobile manipulation, or OVMM, which holds that robots should be able to pick and place any object in any environment. We ran a NeurIPS competition to encourage people to build reproducible robots that can perform this OVMM task. I am continuing in this direction at Hello Robot.

You can also find a list of my papers on Google Scholar, or email me at chris.paxton.cs (at) gmail.com. I’m very active on Twitter/X, where I post about my research and other interesting things in robotics and AI. I also have accounts on LinkedIn, Bluesky, and Mastodon, though I update those much less often. I post longer-form writing on It Can Think, my Substack blog.

Papers

Patents

Invited Talks

Work Experience

  • Leading Embodied AI at Hello Robot (2024-present)
  • AI Research Scientist, FAIR Labs (2022-2024)
  • Senior Robotics Research Scientist, NVIDIA (2020-2022)
  • Robotics Research Scientist, NVIDIA (2019-2020)
  • Postdoc at NVIDIA, in their Seattle Robotics Lab (2018-2019)
  • PhD student at Johns Hopkins University (2012-2018): representing tasks for collaborative robots
  • Research Engineer Co-op at Zoox (2016-2017): planning for autonomous vehicles
  • Lockheed Martin (Summer 2012): security with mobile devices
  • US Army Research Lab (2010-2012): pedestrian detection
  • Johns Hopkins Applied Physics Lab: development of a zero gravity robot prototype

Fun Stuff

  • Infinite Quiz Machine - Take a bunch of personality quizzes procedurally generated by AI! Learn what kind of crustacean you are, what kind of door you are, and more!
  • Boat vs Kraken Game - Use the arrow keys to move the boat and avoid the kraken. Collect the coins for points. An experiment in using AI to generate a game.

You can also check out It Can Think, my Substack blog, where I post about AI, robotics, and other interesting things.

Education

In Spring 2018, I successfully defended my PhD in Computer Science at the Johns Hopkins University in Baltimore, Maryland. My PhD thesis, titled Creating Task Plans for Collaborative Robots, covers both our CoSTAR system and the novel algorithms we developed to create robots that can use expert knowledge to plan. I did my research in the Computational Interaction and Robotics Lab with Greg Hager.

I did my undergraduate work at the University of Maryland, College Park, where I earned a BS in Computer Science with a minor in Neuroscience and graduated with University honors as part of their Gemstone program for young researchers.

During Spring 2016, I led the JHU team’s successful entry into the KUKA Innovation Award competition with our updated CoSTAR system. CoSTAR integrates a user interface, perception, and abstract planning to create robust task plans. We have since used CoSTAR on a wide variety of problems.

Finalists at KUKA College Gersthofen
KUKA award finalists at KUKA College Gersthofen in December 2015.

Teaching Experience

  • Taught EN.500.111 “HEART: Making Robots our Friends: Technologies for Human-Robot Collaboration” in Fall 2015.
  • Teaching assistant for EN.600.436/636 “Algorithms for Sensor Based Robotics” in Spring 2015.

Find me on Twitter, Github, or Mastodon!