Get a Grip: Reconstructing Hand-Object Stable Grasps in Egocentric Videos

[Project Page] [arXiv]

The EPIC-Grasps annotation

The core annotation of the EPIC-Grasps dataset is the (start, end) frame range of each labelled grasp, stored in the file code_epichor/image_sets/epichor_round3_2447valid_nonempty.csv. The important fields in the file are:

  • vid: the video ID
  • st: the start frame of the grasp
  • et: the end frame of the grasp
  • cat: the object category
  • handside: either "left hand" or "right hand"

Additionally:

  • fmt: the prefix of the output files, e.g. 'bottle/P01_14_left_hand_57890_57947_*'
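
These CSVs can be inspected directly with pandas. A minimal sketch (pandas is an assumption here, not a stated dependency; column names are taken from the list above):

import pandas as pd

# Load the EPIC-Grasps annotation; columns per the list above.
df = pd.read_csv('code_epichor/image_sets/epichor_round3_2447valid_nonempty.csv')

# Example query: all left-hand grasps of bottles.
bottles = df[(df['cat'] == 'bottle') & (df['handside'] == 'left hand')]
print(len(bottles), 'left-hand bottle grasps')
print(bottles[['vid', 'st', 'et', 'fmt']].head())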

Stable grasps on ARCTIC and HOI4D

The automatically extracted stable grasps for ARCTIC and HOI4D are in the files code_arctic/image_sets/stable_grasps_v3_frag_valid_min20.csv and code_hoi4d/image_sets/stable_grasps_0.3_0.01.csv, respectively.

Installation

conda create --name getagrip-env python=3.8
conda activate getagrip-env
pip install -r requirements.txt
sh scripts_sh/install_third_party.sh

This repo has been tested on:

  • Ubuntu 22.04, GTX 1080Ti, CUDA 12.2, python 3.8.13, torch 1.8.1+cu102
  • Ubuntu 22.04, RTX 4090, CUDA 12.2, python 3.10.13, torch 2.0.0+cu118
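
After installation, a quick sanity check (not part of the repo) confirms that the installed torch build can see the GPU:

import torch

# Print the torch build and check CUDA visibility, matching the
# tested configurations listed above.
print('torch', torch.__version__, '| CUDA available:', torch.cuda.is_available())
if torch.cuda.is_available():
    print('device:', torch.cuda.get_device_name(0))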

Setup MANO

Source: https://github.com/JudyYe/ihoi/blob/main/docs/install.md

  • Download the MANO models (neutral models: MANO_LEFT.pkl and MANO_RIGHT.pkl):
      - Download "Models & Code" from the original MANO website. You need to register to download the MANO data.
      - Put the models/MANO_LEFT.pkl and models/MANO_RIGHT.pkl files in ./externals/mano/
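
A small check (illustrative only) to confirm the MANO files ended up where the code expects them:

from pathlib import Path

# Paths per the setup step above.
mano_dir = Path('externals/mano')
for name in ('MANO_LEFT.pkl', 'MANO_RIGHT.pkl'):
    path = mano_dir / name
    print(path, 'ok' if path.is_file() else 'MISSING')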

Download data

Link: Google Drive
Place the epicgrasps_storage/ directory in the root of the repository, i.e. at the same level as this README. The download includes the EPIC-Kitchens images and VISOR masks, in case you don't already have them. To save space, only the method input, i.e. 30 sampled frames per sequence, is included in the link. This is the minimal data needed to reproduce the results in the paper.
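
To confirm the placement, run the following from the repository root (a sketch; the exact contents of epicgrasps_storage/ are not documented here):

from pathlib import Path

# epicgrasps_storage/ must sit at the repo root, next to this README.
storage = Path('epicgrasps_storage')
assert storage.is_dir(), 'epicgrasps_storage/ not found at the repository root'
for entry in sorted(storage.iterdir()):
    print(entry)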

Run

To run a single sequence, e.g. fmt=bottle/P01_14_left_hand_57890_57947 (line 189 in the annotation csv):

python temporal/run_fit_mvho.py \
    --config-dir=config/epichor \
    --config-name=mvho_hamer_xxxx \
    hydra.run.dir=outputs/demo_out \
    homan.version=lowdim \
    optim_mv.num_inits_parallel=5 \
    debug_locate=P01_14_left_hand_57890_57947

See example_outputs/demo_out/ for the expected outputs of the above command.

To run all sequences, omit debug_locate:

python temporal/run_fit_mvho.py \
    --config-dir=config/epichor \
    --config-name=mvho_hamer_xxxx \
    hydra.run.dir=outputs/demo_out \
    homan.version=lowdim \
    optim_mv.num_inits_parallel=5 

Inspecting the results

The quantitative results for each sequence are saved to a *_metrics.csv file, e.g. P01_14_left_hand_57890_57947_metrics.csv. Each csv contains num_init_poses rows. The fields oious and avg_sca correspond to the IoU and SCA columns in Table 3 of the paper; these two fields are a good indicator of the quality of the result.
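
For example, to pick the best of the parallel initialisations by these two metrics (a sketch; the output path is assumed from hydra.run.dir in the commands above):

import pandas as pd

# One row per initial pose (num_init_poses rows in total).
metrics = pd.read_csv('outputs/demo_out/P01_14_left_hand_57890_57947_metrics.csv')

# Higher oious (IoU) and avg_sca (SCA) indicate a better fit.
best = metrics.sort_values(['oious', 'avg_sca'], ascending=False).iloc[0]
print(best[['oious', 'avg_sca']])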

Notes

  • FrankMocap-related code won't work, since FrankMocap is not installed.
  • The provided input data contains HaMeR results, but the procedure for running HaMeR is not included in this repo (yet). Please refer to HaMeR if you use this code.

Acknowledgements

Much of this repository is based on HOMan and IHOI.
