I got my PhD in Computer Science at the Johns Hopkins University in Baltimore, Maryland, focusing on using learning to create powerful task and motion planning capabilities for robots operating in human environments.
Since graduating, I have been a postdoc at NVIDIA in their new Seattle robotics lab. I work on approaches that tie together language, perception, and action to make robots robust, versatile assistants for a variety of applications.
In particular, I am interested in ways we can allow non-expert users to give robots the knowledge they need to plan and adapt to new, unseen situations. Effectively, I want to make robots into true co-workers, complete with the “common sense” needed to handle the parts of a task that are difficult for humans.
My work focuses on integrating symbolic task planning with learning from expert demonstrations. The goal is to give robots “common sense” and let them intelligently interact with their environments. The most recent incarnation of this project is our new work on creating “robots with imagination” – letting robots learn the consequences their actions will have on the world, then picture and plan based on these imagined outcomes.
During Spring 2016, I led the JHU team’s successful entry into the KUKA Innovation Award competition with our updated CoSTAR system. CoSTAR integrates a user interface, perception, and abstract planning to create robust task plans. We have since used CoSTAR on a wide variety of problems.
In Spring 2018, I successfully defended my PhD in Computer Science at the Johns Hopkins University in Baltimore, Maryland. My thesis, “Creating Task Plans for Collaborative Robots,” covers both our CoSTAR system and the novel algorithms we developed to build robots that can use expert knowledge to plan.
I did my undergraduate work at the University of Maryland, College Park, where I earned a BS in Computer Science with a minor in Neuroscience and graduated with University honors as part of the Gemstone program for young researchers.
- Postdoc at NVIDIA, in their new Seattle Robotics Lab
- PhD student at Johns Hopkins University 2012-2018
- Co-op at Zoox (2016-2017): planning for autonomous vehicles
- Lockheed Martin (Summer 2012): security with mobile devices
- US Army Research Lab (2010-2012): pedestrian detection
- Johns Hopkins Applied Physics Lab: development of a zero gravity robot prototype
- Teaching assistant for EN.600.436/636: “Algorithms for Sensor Based Robotics”, Spring 2015
- Taught EN.500.111 “HEART: Making Robots our Friends: Technologies for Human-Robot Collaboration” in Fall 2015.
News and Links
- My NVIDIA Research profile
- Video Feature on the Seattle Robotics Lab
- NVIDIA Opens Robotics Research Lab in Seattle
- Computational Interaction and Robotics Lab
- The finalists of the Innovation Award 2016 have been selected
- KUKA Innovation Award 2016: Flexible Manufacturing Challenge
- WSE Team Takes Home 2016 KUKA Innovation Award
- Johns Hopkins team’s robot takes top prize at industrial technology trade fair in Germany
- Blog post I wrote for the Maryland Department of Commerce on our Innovation Award win
Giving credit where credit is due, my current theme is based on Flex, chosen because it’s clean and minimal.
Others used and referenced: