```bash
git submodule update --init --recursive
conda create -y -n egozero python=3.10
conda activate egozero
bash setup.sh
```

Add the following environment variables to your `~/.bashrc` or `~/.zshrc`, with your institution's username and password:
```bash
export ARIA_MPS_UNAME="your_uname"
export ARIA_MPS_PASSW="your_passw"
```
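For example, to persist the credentials and load them into your current shell:

```bash
# Append the exports to ~/.bashrc and reload the shell config
cat >> ~/.bashrc <<'EOF'
export ARIA_MPS_UNAME="your_uname"
export ARIA_MPS_PASSW="your_passw"
EOF
source ~/.bashrc
```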
Verify that your dependencies have been installed correctly:

```bash
aria-doctor
```

Pair the glasses to your computer via USB:
```bash
aria auth pair
```

To record offline so that the complete `.vrs` file can be submitted to the MPS server for data postprocessing, you must first install the Aria mobile app and Aria Studio. Then:
- Connect to the glasses on the Aria mobile app
- Create a new recording session
- Transfer the `.vrs` file onto your computer
Submit the video for data processing on the MPS server and reorganize the output folder. Job submission may take anywhere from 5 to 30 minutes. For example, if your `.vrs` file is `pick_bread_1.vrs`, you would run:
```bash
bash scripts/submit_mps.sh pick_bread_1
```

In our experiments, we collect all of our data with the right hand and reset the task with the left hand. You may swap this, but note that our preprocessing script segments individual demonstrations based on the absence of the specified hand.
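If you recorded several demonstrations for a task, you can submit them back to back. The loop below is a sketch that assumes your `.vrs` files follow the `pick_bread_<n>.vrs` naming used above:

```bash
# Submit each recording to MPS in turn; per the example above,
# submit_mps.sh takes the filename without its .vrs extension
for vrs in pick_bread_*.vrs; do
    bash scripts/submit_mps.sh "${vrs%.vrs}"
done
```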
Copy the collected data to this repo on a machine where you will run preprocessing. Ideally, this machine has GPU compute.
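For example, assuming a hypothetical remote hostname and repo path, and the MPS output folder used in the preprocessing example below:

```bash
# Hypothetical host and destination -- copy the MPS output folder
# to the machine that will run preprocessing
scp -r mps_pick_bread_1_vrs/ user@gpu-box:/path/to/egozero/
```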
Label the expert points on your demonstration by opening `label_points.ipynb` with a Python kernel. Modify the paths in the first cell and run the entire notebook. Label points on the displayed image by clicking them, then click the Save Points button. Run full preprocessing with:
```bash
python preprocess.py --mps_sample_path mps_pick_bread_1_vrs/ --is_right_hand --prompts "a bread slice." "a plate."
```

- Create a new config yaml for your new task at `point_policy/cfgs/suite/task/franka_env/` and customize the `num_object_points`, `root_dir`, and `prompts` fields. See `point_policy/cfgs/suite/task/franka_env/pick_bread.yaml` for reference, and the sketch after this list.
- Modify `scripts/train.sh` to point to your new dataset and task config. Set the `data_dirs` and `experiment` variables.
- Train the model with `bash scripts/train.sh`. See `point_policy/cfgs/config.yaml` and `point_policy/cfgs/suite/aria.yaml` for hydra flags that can be set from the command line.
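As a concrete illustration, the sketch below writes a task config containing the three documented fields. The filename, point count, paths, and prompt strings are placeholder values; mirror `pick_bread.yaml` for the real schema:

```bash
# Hypothetical task config -- only num_object_points, root_dir, and
# prompts are documented above; every value here is a placeholder
cat > point_policy/cfgs/suite/task/franka_env/my_task.yaml <<'EOF'
num_object_points: 8               # total object points labeled in label_points.ipynb
root_dir: /path/to/processed/data  # output directory of preprocess.py
prompts:                           # same object prompts passed to preprocess.py
  - "a bread slice."
  - "a plate."
EOF
```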
First go through the Franka-Teach section to make sure the hardware is running correctly.
- Modify `scripts/eval.sh` to point to your new dataset, task config, and checkpoint weights (these should be saved in `point_policy/exp_local`).
- Run inference with `bash scripts/eval.sh`. See `point_policy/cfgs/config.yaml` and `point_policy/cfgs/suite/aria.yaml` for hydra flags that can be set from the command line.
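Hydra flags are appended as `key=value` overrides. The invocation below only illustrates the pattern: the key names are hypothetical, and it assumes the script forwards extra arguments to the underlying hydra entry point:

```bash
# Pattern only: hydra-style key=value overrides appended to the command.
# These keys are NOT verified against this repo's configs --
# check point_policy/cfgs/config.yaml for the actual fields.
bash scripts/eval.sh suite=aria experiment=pick_bread
```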
To stream the iPhone and get RGBD for robot rollouts, run:

```bash
python scripts/stream_iphone.py
```

To run the robot, see the Franka-Teach repository.
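Putting the pieces together, a rollout typically uses two terminals, using only the commands described above:

```bash
# Terminal 1: stream RGBD frames from the iPhone
python scripts/stream_iphone.py

# Terminal 2: run policy inference on the robot (with Franka-Teach running)
bash scripts/eval.sh
```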
