---
tags:
- robotics
- grasping
- simulation
- nvidia
task_categories:
- other
- robotics
license: cc-by-4.0
---

# GraspGen: Scaling Sim2Real Grasping
GraspGen is a large-scale simulated grasp dataset for multiple robot embodiments and grippers.

<img src="assets/cover.png" width="1000" height="250" title="readme1"> 


We release over 57 million grasps, computed for a subset of 8515 objects from the [Objaverse XL](https://objaverse.allenai.org/) (LVIS) dataset. The grasps cover three grippers: the Franka Panda, the Robotiq 2F-140 industrial gripper, and a single-contact suction gripper (30 mm radius). 

<img src="assets/montage2.png" width="1000" height="500" title="readme2"> 

## Dataset Format
The dataset is released in the [WebDataset](https://github.com/webdataset/webdataset) format. The folder structure of the dataset is as follows:
```
grasp_data/
	franka/shard_{0-7}.tar
	robotiq2f140/shard_{0-7}.tar
	suction/shard_{0-7}.tar
splits/
	franka/{train/valid}_scenes.json
	robotiq2f140/{train/valid}_scenes.json
	suction/{train/valid}_scenes.json
```
We release train/test splits along with the grasp dataset. The splits are made randomly at the object-instance level.
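The split files can be read directly with Python's `json` module. A minimal sketch, assuming each `*_scenes.json` file holds a JSON list of object identifiers (inspect the file if your copy is structured differently):
```python
# Minimal sketch: inspect a train split (assumes a JSON list of object IDs).
import json

with open("splits/franka/train_scenes.json") as f:
    train_objects = json.load(f)

print(f"{len(train_objects)} training objects, e.g. {train_objects[0]}")
```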

Each JSON file in a shard contains the following data as a Python dictionary. Note that `num_grasps=2000` per object.
```
'object'/
	'scale' # This is the scale of the asset, float
'grasps'/
	'object_in_gripper' # boolean mask indicating grasp success, [num_grasps X 1]
	'transforms' # Pose of the gripper in homogeneous matrices, [num_grasps X 4 X 4]
```
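As a rough guide to consuming the shards, here is a minimal sketch using the [webdataset](https://github.com/webdataset/webdataset) Python package. It assumes each sample stores the dictionary above as a `.json` entry; adjust the key if your shards use a different extension:
```python
# Minimal sketch: iterate one Franka shard and read the per-object grasp data.
# Assumes each sample exposes the dictionary above under the "json" key.
import json

import numpy as np
import webdataset as wds

dataset = wds.WebDataset("grasp_data/franka/shard_0.tar")

for sample in dataset:
    record = json.loads(sample["json"])                           # 'object' and 'grasps'
    scale = record["object"]["scale"]                             # float asset scale
    transforms = np.asarray(record["grasps"]["transforms"])       # [num_grasps, 4, 4]
    success = np.asarray(record["grasps"]["object_in_gripper"])   # [num_grasps, 1]
    print(sample["__key__"], scale, transforms.shape, success.mean())
    break
```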

The coordinate frame conventions for the three grippers are shown below:
<img src="assets/grippers.png" width="450" height="220" title="readme3"> 
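Each pose in `transforms` places the gripper frame (as drawn above) in the object's frame. Below is a small sketch of pulling the position and approach direction out of one pose; the choice of +z as the approach axis is an assumption here and should be checked against the figure for each gripper:
```python
# Minimal sketch: decompose one 4x4 gripper pose (continuing from the loading
# sketch above, which defines `record`). The +z approach axis is an assumption.
import numpy as np

grasp = np.asarray(record["grasps"]["transforms"][0])  # one (4, 4) homogeneous pose
rotation, position = grasp[:3, :3], grasp[:3, 3]       # object-frame orientation / origin
approach = rotation[:, 2]                              # assumed approach direction (+z)
print(position, approach)
```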

## Visualizing the dataset

We provide some minimal, standalone scripts for visualizing this dataset. See the header of [visualize_dataset.py](scripts/visualize_dataset.py) for installation instructions.

Before running any of the visualization scripts, remember to start meshcat-server in a separate terminal:
```shell
meshcat-server
```

To visualize a single object from the dataset, alongside its grasps:
```shell
cd scripts/ && python visualize_dataset.py --dataset_path /path/to/dataset --object_uuid {object_uuid} --object_file /path/to/mesh --gripper_name {choose from: franka, suction, robotiq2f140}
```

To sequentially visualize a list of objects along with their grasps:
```shell
cd scripts/ && python visualize_dataset.py --dataset_path /path/to/dataset --uuid_list {path to a splits.json file} --uuid_object_paths_file {path to a json file mapping uuid to absolute path of meshes} --gripper_name {choose from: franka, suction, robotiq2f140}
```

## Objaverse dataset
Please download the Objaverse XL (LVIS) objects separately. See the helper script [download_objaverse.py](scripts/download_objaverse.py) for instructions and usage.
Note that running this script auto-generates a file mapping each `UUID` to its asset mesh path, which you can pass as the `uuid_object_paths_file` input to the `visualize_dataset.py` script.
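If you prefer to fetch the meshes yourself, the sketch below uses the `objaverse` Python package to download a few LVIS objects and write a UUID-to-path mapping in the spirit of what `download_objaverse.py` produces; the exact format of the auto-generated file may differ, so treat this only as an illustration:
```python
# Minimal sketch: download a handful of Objaverse (LVIS) meshes and record a
# UUID -> mesh-path mapping. Illustrative only; the repo's download_objaverse.py
# is the supported path and its output format may differ.
import json

import objaverse

lvis = objaverse.load_lvis_annotations()                      # {category: [uuid, ...]}
uuids = [uid for uids in lvis.values() for uid in uids][:10]  # small sample
paths = objaverse.load_objects(uids=uuids)                    # {uuid: local mesh path}

with open("uuid_object_paths.json", "w") as f:
    json.dump(paths, f, indent=2)
```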

## License
Copyright © 2025, NVIDIA Corporation & affiliates. All rights reserved.

Both the dataset and the visualization code are released under a CC-BY 4.0 [License](LICENSE_DATASET).

For business inquiries, please submit the [NVIDIA Research Licensing](https://www.nvidia.com/en-us/research/inquiries/) form.

## Contact

Please reach out to [Adithya Murali](http://adithyamurali.com) (admurali@nvidia.com) and [Clemens Eppner](https://clemense.github.io/) (ceppner@nvidia.com) for further enquiries.