Manipulation with Underactuated Hands

Underactuated hands with compliant fingers offer an appealing grasping solution due to their adaptability to objects of uncertain size and shape. They enable stable and robust grasps without tactile sensing or prior planning, and with open-loop control. Furthermore, the open-source underactuated robotic hands we rely on in our work are easy to fabricate and modify. However, due to uncertainties in the manufacturing process, the hands differ in size, weight, friction, and inertia.

Consequently, precise analytical models for such hands are hard to derive and usually unavailable. As described in our previous publications, data-based modeling enables reasonably accurate predictions and can be used for motion planning and closed-loop control. However, such an approach in the context of adaptive hands is relatively novel and has not been extensively explored. We therefore contribute a set of task benchmarks for standardized performance evaluation and further development, along with a benchmarking platform based on both simulation and real data.
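To make the data-based idea concrete, here is a minimal sketch of learning a transition model f(state, action) → next state from recorded transitions. The array shapes, the toy data, and the choice of regressor are illustrative assumptions, not the project's actual pipeline; the real data format is documented on the GitHub page.

```python
# A minimal sketch of data-based transition modeling, assuming transitions
# are available as (state, action, next_state) arrays. All names, shapes,
# and the toy dynamics below are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder data standing in for real hand transitions:
# state = hand/object features, action = actuator command.
states = rng.normal(size=(1000, 4))
actions = rng.normal(size=(1000, 2))
next_states = states + 0.1 * np.hstack([actions, actions])  # toy dynamics

# Learn f(s, a) -> s' from the transition data.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
model.fit(np.hstack([states, actions]), next_states)

# The learned model can then serve as the predictor inside a motion
# planner or closed-loop controller.
pred = model.predict(np.hstack([states[:1], actions[:1]]))
```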

What is RUM?

The Rutgers Underactuated-hand Manipulation (RUM) dataset is a large set of real data collected from several underactuated hands (the Model-T42 and the Model-O) across various types of objects. The data comprises approximately 300,000 transition points for each object. Furthermore, each object comes with 10 long validation paths that substantially cover the hand's workspace and employ frequent changes of action.
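The validation paths lend themselves to open-loop rollout evaluation: propagate a learned model along a recorded action sequence and compare against the recorded states. The sketch below assumes the same array shapes and scikit-learn-style model interface as the previous example; it is not the dataset's documented format.

```python
# A hedged sketch of scoring a learned model on one validation path by
# open-loop rollout. Shapes and the model interface are assumptions.
import numpy as np

def rollout_error(model, path_states, path_actions):
    """Propagate the model along a recorded action sequence, starting
    from the recorded initial state, and compare predictions against
    the recorded states. path_states has one more row than path_actions."""
    pred = path_states[0]
    errors = []
    for t, action in enumerate(path_actions):
        pred = model.predict(np.hstack([pred, action])[None, :])[0]
        errors.append(np.linalg.norm(pred - path_states[t + 1]))
    return float(np.mean(errors))
```

Because the recorded paths change actions frequently and cover much of the workspace, such rollouts are a demanding long-horizon test for any learned model.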

RUM Dataset

Here we provide links to download the dataset separated by hand model and data category.

Click here to download the entire dataset (6 GB zip).

Click here to download just the Model-O and Model-T42 files (600 MB zip).

Model-O

Action Sequences: Data 

Model-T42

Transition Data: Discrete, Continuous

Failure Data: Discrete, Continuous

Objects

Mesh Files: Data

Instructions for accessing the data files, source code to control the hand and independently collect data, and details of the dataset structure can all be found on the project GitHub page.