Panoptic Mapping Data

This page contains the accompanying data for the Panoptic Mapping project. The code to process the data and build panoptic maps will be released at https://github.com/ethz-asl/panoptic_mapping. The paper preprint is available on arXiv:

Schmid, Lukas, et al. “Panoptic Multi-TSDFs: a Flexible Representation for Online Multi-resolution Volumetric Mapping and Long-term Dynamic Scene Consistency.” arXiv preprint arXiv:2109.10165 (2021).

The Flat Dataset

The flat dataset consists of synthetic images, rendered using Unreal Engine 4, of two trajectories in an indoor environment that was subject to change. Structural ground truth point clouds and panoptic annotations are included.

(Preview images: Run 1, Run 2, Changes.)

Data Layout

The following files are included (short descriptions follow the `#`). Additional processing scripts and data-to-ROS players are available on GitHub.

Data:

  • run<RunNo>
    • <FrameNo>_color.png # Color image of the sequence.
    • <FrameNo>_depth.tiff # Depth image of the sequence [m].
    • <FrameNo>_segmentation.png # Ground truth panoptic labels, each pixel contains a label ID.
    • <FrameNo>_predicted.png # Detectron2 predicted panoptic labels, each pixel contains a label ID.
    • <FrameNo>_labels.json # Dictionary of accompanying information for each Detectron2 prediction.
    • <FrameNo>_pose.txt # Pose of the sensor in the odom frame.
    • timestamps.csv # Timestamps of each frame as recorded in the simulation.
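Given the per-frame naming scheme above, one run can be indexed by globbing the color images and deriving the sibling files for each frame. The helper below is a hypothetical sketch (the function names and the example frame ID are not part of the dataset); only the file suffixes come from the layout described above.

```python
import os
from glob import glob

# File suffixes as listed in the data layout above.
SUFFIXES = {
    "color": "_color.png",                # RGB image
    "depth": "_depth.tiff",               # depth image [m]
    "segmentation": "_segmentation.png",  # ground-truth panoptic label IDs
    "predicted": "_predicted.png",        # Detectron2 panoptic label IDs
    "labels": "_labels.json",             # per-prediction metadata
    "pose": "_pose.txt",                  # sensor pose in the odom frame
}

def list_frame_ids(run_dir):
    """Return the sorted frame IDs found in run_dir, based on the color images."""
    paths = glob(os.path.join(run_dir, "*_color.png"))
    return sorted(os.path.basename(p)[: -len("_color.png")] for p in paths)

def frame_paths(run_dir, frame_id):
    """Map each data type to its file path for one frame."""
    return {k: os.path.join(run_dir, frame_id + s) for k, s in SUFFIXES.items()}
```

For example, `frame_paths("run1", "000001")["depth"]` yields `run1/000001_depth.tiff` under the assumed zero-padded frame numbering.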

Ground Truth:

  • flat_<RunNo>_gt_10000.ply # Structural ground truth point cloud.
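A common way to score a reconstructed map against such a ground-truth point cloud is a point-to-point RMSE. The sketch below is illustrative only (the function name is hypothetical, and clouds are plain lists of tuples; a real evaluation would load the PLY files with a point-cloud library and use a spatial index instead of brute-force search).

```python
import math

def rmse_to_reference(points, reference):
    """RMSE of each point's distance to its nearest reference point.

    points, reference: iterables of (x, y, z) tuples.
    """
    total = 0.0
    for p in points:
        # Brute-force nearest-neighbor distance; fine for small clouds.
        nearest = min(math.dist(p, q) for q in reference)
        total += nearest ** 2
    return math.sqrt(total / len(points))
```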

Utility Files:

  • groundtruth_labels.csv # Labels of the panoptic ground truth IDs (use with the mapper).
  • detectron_labels.csv # Labels of the panoptic detectron IDs (use with the mapper).
  • changes.txt # Explanations of the changed objects with their ground truth label names.
  • intrinsics.txt # Camera intrinsics of the images [pixels].
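With the pinhole intrinsics and the metric depth images, each pixel can be back-projected to a 3D point in the camera frame. The snippet below assumes intrinsics.txt provides focal lengths and principal point (fx, fy, cx, cy) in pixels; the exact file format is not specified here, so the parsing is left out and the function name is hypothetical.

```python
def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth depth_m [m] to a 3D point
    (x, y, z) in the camera frame, using pinhole intrinsics in pixels."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```

For instance, a pixel at the principal point maps to (0, 0, depth), straight along the optical axis.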

Log Files (you can safely ignore these):

  • airsim.yaml # Configuration of the simulator used to create the data.
  • infrared_corrections.csv # Infrared corrections when running the simulator.
  • waypoints<RunNo>.yaml # Waypoints traveled to when generating the data.

RIO Demo Data

To demonstrate the mapper on real data, two sequences of the RIO dataset [1] were evaluated.

[1] Wald, Johanna, et al. “RIO: 3D object instance re-localization in changing indoor environments.” Proceedings of the IEEE/CVF International Conference on Computer Vision. 2019.

The additional files required to run these experiments will be provided here later.

Download
