Robotouille


Logo

A customizable multi-task cooking environment!

Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Contributing
  5. Built With
  6. License
  7. Contact
  8. Acknowledgments

About The Project

Demos: a robot making a lettuce tomato burger in a procedurally generated kitchen, a lettuce burger in a custom-made kitchen, and a cheese burger in a custom-made kitchen.

Robots will be involved in every part of our lives in the near future, so we need to teach them how to perform complex tasks. Humans break complex tasks like making hamburgers into smaller subtasks like cutting lettuce and cooking patties. We can teach robots to do the same by showing them how to perform easier subtasks and then combining those subtasks to perform harder tasks. We created Robotouille to stress test this idea through an easily customizable cooking environment where the task possibilities are endless!

Check out our paper, Demo2Code: From Summarizing Demonstrations to Synthesizing Code via Extended Chain-of-Thought, to learn how we used Robotouille to teach robots to perform tasks that humans demonstrate to them using Large Language Models (LLMs).

(back to top)

Getting Started

It is super easy to get started: try out an existing environment or create your own!

Setup

  1. Create and activate your virtual environment
    python3 -m venv <venv-name>
    source <venv-name>/bin/activate
  2. Install Robotouille and its dependencies
    pip install -e .
  3. Run Robotouille!
    python main.py
    or import the simulator into your own code by adding
    from robotouille import simulator
    
    simulator("original")

(back to top)

Usage

Use Existing Environments

To play an existing environment, you can choose from the JSON files under environments/env_generator/examples/. For example, to play the high_level_lettuce_burger environment, simply run

python main.py --environment_name high_level_lettuce_burger

You can interact with the environment using the keyboard and mouse:

  • Click to move the robot to stations and pick up or place down objects. You can also stack and unstack objects by clicking.
  • 'e' can be used to cut objects at cutting boards or cook patties at stoves.
  • 'space' can be used to stay in place (e.g. while waiting for a patty to cook).

If you would like to procedurally generate an environment based on a JSON file, run one of the following commands:

python main.py --environment_name high_level_lettuce_burger --seed 42
python main.py --environment_name high_level_lettuce_burger --seed 42 --noisy_randomization

Refer to the README.md under environments/env_generator for details on procedural generation.

Create your own Environment!

To create your own environment, add another example JSON to environments/env_generator/examples/. Follow the README.md under environments/env_generator for details on how to customize the environment JSON. If you would like to modify the transitions of the environment entirely, refer to robotouille.pddl under environments. Customization through the PDDL currently has limited support for non-Markovian actions (cut / cook) and for rendering new objects / actions, but we plan to add more support in the future. Please contact [email protected] for more details if interested.
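
Once your JSON is in place, launching it should work the same way as the bundled examples. The snippet below is a sketch using a hypothetical file name (my_custom_env.json); it assumes the environment name passed to simulator() or --environment_name simply matches your file name without the extension.

    from robotouille import simulator

    # "my_custom_env" is a placeholder for the JSON you added under
    # environments/env_generator/examples/ (my_custom_env.json).
    simulator("my_custom_env")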

(back to top)

Contributing

We appreciate all contributions to Robotouille. Bug fixes are always welcome, but if you want to implement a new feature, we recommend opening an issue with the Feature Request label or reaching out to us first.

(back to top)

Built With

We build atop PDDLGym, which converts a PDDL domain and problem file into a Gym environment. We render and take keyboard input using PyGame, building on the tutorial for making custom Gym environments.
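
For readers unfamiliar with PDDLGym, the snippet below shows its basic pattern of exposing a registered PDDL domain/problem pair as a Gym environment. It uses PDDLGym's own Sokoban example rather than anything Robotouille-specific, and the exact call signatures may vary with your PDDLGym and Gym versions.

    import pddlgym

    # Each registered PDDL domain/problem pair becomes a Gym environment.
    env = pddlgym.make("PDDLEnvSokoban-v0")
    obs, debug_info = env.reset()

    # Actions are ground operators; sampling is conditioned on the observation.
    action = env.action_space.sample(obs)
    obs, reward, done, debug_info = env.step(action)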

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Gonzalo Gonzalez - [email protected]

Project Link: https://github.com/portal-cornell/robotouille

(back to top)

Acknowledgments

We thank Nicole Thean (@nicolethean) for her help with creating the assets that bring Robotouille to life!

(back to top)