---
license: cc-by-4.0
task_categories:
- robotics
language:
- en
pretty_name: LBX Robotics Datasets
tags:
- robotics
- manipulation
- bimanual
- teleoperation
- xarm
- mcap
- foxglove
size_categories:
- n<1K
viewer: false
---

# LBX Robotics Datasets

A preview of Labelbox robotics data offerings.

## Bimanual xArm Manipulation

Bimanual teleoperated manipulation recordings collected on two dual-arm xArm stations. Sessions cover a broad set of household and table-top tasks and are shared as raw [MCAP](https://mcap.dev/) files, so the full sensor and control fidelity is preserved end-to-end.

### Robots

Two bimanual stations are included:

| Station | Arms | URDF |
|---------|------|------|
| **Bimanual xArm6** | 2× UFACTORY xArm6 (6 DoF each) with parallel grippers | `urdf/dual_xarm6.urdf` |
| **Bimanual xArm7** | 2× UFACTORY xArm7 (7 DoF each) with parallel grippers | `urdf/dual_xarm7.urdf` |

Each station carries **three ZED 2i cameras**: one on the left wrist, one on the right wrist, and one overhead looking down at the workspace (the overhead camera's left and right stereo views are exposed as separate topics). Visual STL meshes for both arms, the parallel gripper, the end-tool plate, and the ZED 2i mount are provided so you can render the robot in a 3D viewer.

### Data format

All recordings are [MCAP](https://mcap.dev/) files with protobuf-encoded messages. They open directly in [Foxglove Studio](https://foxglove.dev/) for visual inspection, or programmatically via the [`mcap`](https://pypi.org/project/mcap/) Python package. Cameras are recorded at roughly 30 fps; robot proprioception and control signals are recorded at higher rates (see Topics below).
### Folder layout

```
Labelbox/robotics-datasets
├── README.md
└── data/
    └── bimanual/
        ├── bimanual_foxglove_layout.json   # Foxglove layout for these recordings
        ├── urdf/                           # robot models + STL meshes
        │   ├── dual_xarm6.urdf
        │   ├── dual_xarm7.urdf
        │   └── meshes/
        │       ├── xarm6/visual/*.stl
        │       ├── xarm7/visual/*.stl
        │       ├── gripper/xarm/*.stl
        │       ├── end_tool/collision/*.stl
        │       ├── zed2i.stl
        │       └── zed_camera_mount_xarm.stl
        ├── xarm6/
        │   └── <task>/
        │       └── recording_NNNN.mcap
        └── xarm7/
            └── <task>/
                └── recording_NNNN.mcap
```

### Tasks
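Given a local download of the layout above, per-task recording counts can be recovered with the standard library alone. A minimal `pathlib` sketch, assuming each station directory contains one folder per task (the path passed in is a placeholder):

```python
from collections import Counter
from pathlib import Path


def recordings_per_task(station_dir):
    """Count recording_*.mcap files under each task folder of one station."""
    counts = Counter()
    # Matches e.g. <station_dir>/fold_tshirt/recording_0001.mcap
    for mcap_path in Path(station_dir).glob("*/recording_*.mcap"):
        counts[mcap_path.parent.name] += 1
    return counts
```

Calling `recordings_per_task("data/bimanual/xarm6")` on a full download should reproduce the file counts in the task tables below.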
**xArm6 station**

| Task | Files |
|------|-------|
| `fold_tshirt` | 13 |
| `scrub_plate_with_sponge` | 24 |
| `place_clothing_into_basket` | 27 |
| `place_pot_on_stove_and_add_food_to_pot` | 2 |
| `remove_towel_from_drawer` | 2 |
| `watering_house_plant` | 2 |
| `put_trash_into_trash_bin` | 2 |
| `remove_hangers_from_rack` | 2 |
| `remove_hats_and_dirty_laundry` | 2 |

**xArm7 station**

| Task | Files |
|------|-------|
| `remove_lunch_box_from_backpack` | 30 |
| `separate_condiments_baby_food_and_spices` | 12 |
| `set_up_dinner_table` | 27 |
| `place_objects_into_drawer` | 7 |
| `place_batteries_into_container` | 2 |
| `place_toothbrushes_into_holder` | 2 |
### Topics

The topics below are present in every recording, with approximate publish rates from a representative session (rates vary slightly across recordings).

| Topic | Approx. rate | Description |
|-------|--------------|-------------|
| `/left-arm-proprio`, `/right-arm-proprio` | ~245 Hz | Joint state of each arm |
| `/left-eef-proprio`, `/right-eef-proprio` | ~245 Hz | End-effector state (pose, gripper) |
| `/left-arm-leader`, `/right-arm-leader` | ~73 Hz | VR controller state from the teleop leader |
| `/left-arm-ik-solution`, `/right-arm-ik-solution` | varies with motion | Inverse-kinematics solver output |
| `/left-arm-tf`, `/right-arm-tf` | ~19 Hz | Per-arm transforms |
| `/left-wrist-camera/image-raw` | ~30 Hz | Left wrist ZED 2i |
| `/right-wrist-camera/image-raw` | ~30 Hz | Right wrist ZED 2i |
| `/top-left-camera/image-raw` | ~30 Hz | Overhead ZED 2i, left stereo view |
| `/top-right-camera/image-raw` | ~30 Hz | Overhead ZED 2i, right stereo view |
| `/{...}-camera/camera-info` | once per file | Camera calibration |
| `/tf-static` | once per file | Static transforms |
| `/subtask-annotation` | sparse | Semantic step labels within a session |

All messages use protobuf encoding. Robot-specific schemas are namespaced under `lbx_robotics.msg.*` (e.g. `ArmState`, `EndEffectorState`, `IKSolution`, `VRControllerState`, `SubtaskAnnotation`). Camera and transform schemas use the standard `foxglove.*` types.

### Embedded session metadata

Every MCAP file contains a `session-metadata` record with the following fields:

| Field | Description |
|-------|-------------|
| `task-id` | Canonical task name (matches the folder name, e.g. `fold_tshirt`) |
| `task-instruction` | Human-readable instruction shown to the operator |
| `operator-id` | Opaque UUID identifying the teleop operator (no PII) |
| `station-id` | Identifier of the physical station the data was collected on |
| `session-uuid` | Unique session identifier |
| `start-time-unix`, `end-time-unix` | Session start and end times (seconds since epoch) |
| `urdf` | The full URDF used during the session, embedded inline as XML |

### Viewing in Foxglove

A Foxglove layout is provided at `data/bimanual/bimanual_foxglove_layout.json`. It pre-configures a 3D panel with the robot URDF, four image panels for the cameras, and raw-message panels for the proprioception and teleop streams.

1. Download `data/bimanual/urdf/` and `data/bimanual/bimanual_foxglove_layout.json` to your machine.
2. Open any `.mcap` recording in [Foxglove Studio](https://foxglove.dev/).
3. **View → Import layout from file…** and select `bimanual_foxglove_layout.json`.
4. In the 3D panel, find the **URDF** layer and set its `filePath` to your local copy of `dual_xarm6.urdf` (for xArm6 recordings) or `dual_xarm7.urdf` (for xArm7 recordings). The layout ships with this field intentionally blank.

## License

Released under the [**Creative Commons Attribution 4.0 International (CC-BY-4.0)**](https://creativecommons.org/licenses/by/4.0/) license. You are free to share and adapt the material for any purpose, including commercially, with appropriate credit.

Suggested attribution:

> Labelbox Robotics. *LBX Robotics Datasets — Bimanual xArm Manipulation.* Hugging Face, 2026. https://huggingface.co/datasets/Labelbox/robotics-datasets

## Contact

Maintained by the **Labelbox Robotics** team. For questions, collaborations, or feedback, use the dataset's [Discussions tab](https://huggingface.co/datasets/Labelbox/robotics-datasets/discussions) on Hugging Face.