Stretch AI, an Open-Source Embodied AI Stack for Mobile Manipulation

09 Dec 2024

I’ve been working on a set of tools to make it easy to test embodied AI systems in homes: Stretch AI, an open-source toolkit for language-guided autonomy, exploration, navigation, and learning from demonstration.

The goal is to allow researchers and developers to quickly build and deploy AI-enabled home robot applications.

Stretch AI is designed so that you can easily get started and try it out on your robot. It supports multiple LLMs, from open-weights models like Qwen to proprietary ones like OpenAI’s GPT-4o.
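For a sense of what a pluggable LLM backend can look like, here is a minimal, self-contained sketch. The `LLMConfig` dataclass and `get_llm_client` helper are hypothetical illustrations, not the actual Stretch AI API; only the underlying OpenAI and Hugging Face calls are standard.

```python
# Hypothetical sketch of a pluggable LLM backend; the config and helper
# names are illustrative, not the real Stretch AI API.
from dataclasses import dataclass


@dataclass
class LLMConfig:
    backend: str  # e.g. "qwen" (open weights) or "openai" (hosted)
    model: str    # e.g. "Qwen/Qwen2.5-7B-Instruct" or "gpt-4o"


def get_llm_client(config: LLMConfig):
    """Return a chat-completion callable for the chosen backend."""
    if config.backend == "openai":
        from openai import OpenAI
        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        def chat(prompt: str) -> str:
            response = client.chat.completions.create(
                model=config.model,
                messages=[{"role": "user", "content": prompt}],
            )
            return response.choices[0].message.content

        return chat

    if config.backend == "qwen":
        from transformers import pipeline
        pipe = pipeline("text-generation", model=config.model)

        def chat(prompt: str) -> str:
            out = pipe([{"role": "user", "content": prompt}],
                       max_new_tokens=256)
            # Chat-format input returns the conversation with the
            # assistant's reply appended as the last message.
            return out[0]["generated_text"][-1]["content"]

        return chat

    raise ValueError(f"unknown backend: {config.backend}")
```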

You can even give the robot voice commands, telling it to perform complex, multi-step tasks.

In addition, we have our own fork of the Hugging Face LeRobot library, which lets you perform learning from demonstration. You can use Dex Teleop to collect demonstrations, train a policy in your own environment, and then deploy that policy on the robot.
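To make that workflow concrete, here is a toy, self-contained sketch of the collect-train-deploy loop. Every class and function in it is an illustrative stand-in, not the Stretch AI or LeRobot API, and the “policy” merely replays demonstrated actions rather than training anything.

```python
# Toy sketch of the learn-from-demonstration loop: collect teleop episodes,
# "train" a policy, then run it closed-loop. All names are stand-ins.
import random
from typing import Dict, List, Tuple


class DemoDataset:
    """Stores (observation, action) pairs from teleoperated episodes."""
    def __init__(self) -> None:
        self.pairs: List[Tuple[Dict, List[float]]] = []

    def record(self, observation: Dict, action: List[float]) -> None:
        self.pairs.append((observation, action))


class Policy:
    """Stand-in for a trained visuomotor policy (e.g. Diffusion Policy)."""
    def __init__(self, dataset: DemoDataset) -> None:
        self.dataset = dataset  # a real policy would be trained here

    def predict(self, observation: Dict) -> List[float]:
        # A real policy denoises an action chunk conditioned on images
        # and proprioception; here we replay a random demonstrated action.
        return random.choice(self.dataset.pairs)[1]


# 1. Collect demonstrations (Dex Teleop would supply these in practice).
dataset = DemoDataset()
for _ in range(10):
    dataset.record({"image": None, "joints": [0.0] * 7}, [0.1] * 7)

# 2. "Train" a policy on the demonstrations.
policy = Policy(dataset)

# 3. Deploy: observe, predict, execute, repeat.
for _ in range(5):
    action = policy.predict({"image": None, "joints": [0.0] * 7})
    print("executing action:", action)
```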

These trained skills can then be chained sequentially to build more complex behaviors. Tell the robot to open the cupboard and take out a bottle, and it can sequence your trained Diffusion Policy skills to accomplish the task, driven by any of the supported large language models, including OpenAI’s GPT-4o and Qwen.
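Here is a rough, self-contained sketch of what LLM-driven skill chaining can look like: the LLM turns an instruction into a JSON list of skill calls, which are then executed in order. The skill registry, prompt, and `fake_llm` stand-in are all hypothetical, not the actual Stretch AI planner.

```python
# Hypothetical sketch of LLM-driven skill chaining; not Stretch AI's planner.
import json

SKILLS = {
    "navigate_to": lambda target: print(f"navigating to {target}"),
    "open_cupboard": lambda: print("running open-cupboard policy"),
    "pick": lambda obj: print(f"running pick policy on {obj}"),
}


def plan(instruction: str, llm) -> list:
    """Ask the LLM to translate an instruction into a skill sequence."""
    prompt = (
        "You control a mobile manipulator. Available skills: "
        "navigate_to(target), open_cupboard(), pick(object). "
        "Respond only with a JSON list of steps, e.g. "
        '[{"skill": "navigate_to", "args": ["cupboard"]}]. '
        "Instruction: " + instruction
    )
    return json.loads(llm(prompt))


def execute(steps: list) -> None:
    for step in steps:
        SKILLS[step["skill"]](*step.get("args", []))


def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call (e.g. GPT-4o or Qwen).
    return json.dumps([
        {"skill": "navigate_to", "args": ["cupboard"]},
        {"skill": "open_cupboard", "args": []},
        {"skill": "pick", "args": ["bottle"]},
    ])


execute(plan("open the cupboard and take out a bottle", fake_llm))
```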

We also support cutting-edge research like DynaMem, which gives the robot a dynamic spatio-semantic memory so it can handle environments that change over time.
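As a rough illustration of the idea (and definitely not the real DynaMem code), here is a toy dynamic memory: voxels the robot currently sees are added or refreshed, and remembered voxels that should be visible but are gone get forgotten, which is what lets the map track moved or removed objects.

```python
# Toy dynamic spatial memory in the spirit of DynaMem; purely illustrative.
from typing import Dict, List, Set, Tuple

Voxel = Tuple[int, int, int]


class DynamicMemory:
    def __init__(self) -> None:
        self.voxels: Dict[Voxel, str] = {}  # voxel -> semantic label

    def update(self, observed: Dict[Voxel, str], visible: Set[Voxel]) -> None:
        # Add or refresh everything the robot currently sees.
        self.voxels.update(observed)
        # Forget voxels inside the current field of view that are no
        # longer occupied -- this handles moved or removed objects.
        for voxel in list(self.voxels):
            if voxel in visible and voxel not in observed:
                del self.voxels[voxel]

    def query(self, label: str) -> List[Voxel]:
        """Return remembered voxels matching a semantic label."""
        return [v for v, l in self.voxels.items() if l == label]


memory = DynamicMemory()
memory.update({(1, 2, 0): "bottle"}, visible={(1, 2, 0)})
memory.update({}, visible={(1, 2, 0)})  # the bottle has been moved away
print(memory.query("bottle"))           # -> []
```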

On a personal note, this is something I’ve wanted to build since I left NVIDIA ~2.5 years ago. I think it’s so important to have good, open, easy-to-use tools for AI research on mobile robots, and I’m looking forward to seeing what people can build with it!

I also wrote up a blog post with a few more details, if you would like to learn more.
