Why Calibrate?

Correct sensor calibration is essential to a successful robot deployment.

- Fleets of robots all behave similarly. No ‘gold units’, no ‘duds’
- Robot maintenance is easier, faster, and less expensive
- ML systems generalize better with calibrated sensor data

Calibration makes sensor data more usable

- Accuracy: Calibration adjusts sensor outputs to more closely match the real world, reducing errors and improving accuracy
- Precision: Calibration ensures similar sensors generate similar results, improving consistency and simplifying maintenance
- Sensor fusion: Calibration enables different sensor modalities to perceive the world in the same way, producing coherent and correctly aligned data that can be fused together (as sketched below)
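
As a rough illustration of the sensor-fusion point above, here is a minimal Python sketch that transforms lidar points into a camera's coordinate frame with a calibrated rigid-body extrinsic. The transform values and frame names are hypothetical, not the output of any particular calibration:

    import numpy as np

    # Hypothetical calibrated extrinsic: lidar frame -> camera frame (4x4 homogeneous transform).
    # A real calibration supplies the rotation R and translation t; these values are made up.
    T_camera_lidar = np.eye(4)
    T_camera_lidar[:3, :3] = np.array([[0.0, -1.0,  0.0],
                                       [0.0,  0.0, -1.0],
                                       [1.0,  0.0,  0.0]])   # rotation part
    T_camera_lidar[:3, 3] = np.array([0.10, -0.02, 0.05])    # translation part, metres

    def lidar_to_camera(points_lidar: np.ndarray) -> np.ndarray:
        """Transform Nx3 lidar points into the camera frame using the calibrated extrinsic."""
        homogeneous = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
        return (T_camera_lidar @ homogeneous.T).T[:, :3]

    # Example: lidar returns expressed in the camera frame, ready to associate with image data.
    print(lidar_to_camera(np.array([[5.0, 1.0, 0.2], [10.0, -2.0, 0.5]])))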

Calibration Anywhere

MSA’s automatic sensor calibration solution works with any number, combination, and layout of perception sensors in any unstructured environment. Calibration Anywhere generates sensor intrinsics, extrinsics, and time offsets for all perception sensors in one pass.

✓ Calibrate anywhere - no checkerboards, no targets, no special environment
✓ Typically takes less than 10 minutes
✓ No engineers or technicians are required
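
To make the time offsets mentioned above concrete, here is a minimal sketch of applying per-sensor timestamp corrections so measurements land on a common clock before fusion. The sensor names and offset values are hypothetical:

    # Hypothetical per-sensor time offsets (seconds), as a calibration might report them.
    # A positive offset means the sensor's clock reads ahead of the robot's reference clock.
    TIME_OFFSETS = {
        "front_camera": 0.0123,
        "lidar_top": -0.0045,
        "imu": 0.0008,
    }

    def corrected_timestamp(sensor: str, raw_stamp: float) -> float:
        """Shift a raw sensor timestamp onto the common reference clock."""
        return raw_stamp - TIME_OFFSETS.get(sensor, 0.0)

    # Example: align a camera frame and a lidar sweep captured at nominally the same time.
    print(corrected_timestamp("front_camera", 1700000000.500))
    print(corrected_timestamp("lidar_top", 1700000000.500))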

Supported Sensors

Any number of perception sensors with almost any layout and configuration:

- Cameras
  - RGB & Thermal
  - Global and rolling shutter
  - Stereo and ToF depth
- Lidars
  - 2D and 3D
- Radars
- IMUs
- GPS/GNSS units
- Wheel encoders

Calibrated Outputs

Generated in one pass, typically in less than 10 minutes:

- Extrinsics relative to the base_link coordinate frame (including an NVIDIA Isaac Perceptor-compatible URDF)
  - 6DoF extrinsics for cameras, 3D lidars, radars, and IMUs
  - 3DoF extrinsics for 2D lidars
  - 3D position of GPS/GNSS units
- Camera lens intrinsics
  - Supporting fisheye, equidistant, ftheta3, rational polynomial, and plumb bob models
  - Including readout time for rolling shutter cameras
- Time offsets (timestamp corrections)
  - For cameras, lidars, radars, IMUs, wheel encoders, and GPS/GNSS units
- Ground detection
  - Ground plane relative to cameras, 3D lidars, and base_link
- Axle track, wheel radius, and corrective gain factors for speeds or ticks, if encoders or wheel speeds are available (see the sketch below)
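
As an example of how the wheel parameters above might be consumed downstream, the following sketch computes differential-drive body velocity from encoder tick rates using a calibrated wheel radius, axle track, and per-wheel corrective gains. All names and numbers are illustrative assumptions, not values from a real calibration:

    import math

    # Hypothetical calibrated wheel parameters.
    WHEEL_RADIUS = 0.162        # metres
    AXLE_TRACK = 0.545          # metres, distance between left and right wheels
    TICKS_PER_REV = 4096        # encoder resolution (assumed, not a calibrated quantity)
    GAIN_LEFT, GAIN_RIGHT = 1.013, 0.991   # corrective gain factors for ticks

    def body_velocity(ticks_per_s_left: float, ticks_per_s_right: float) -> tuple[float, float]:
        """Return (linear m/s, angular rad/s) for a differential-drive base."""
        v_left = GAIN_LEFT * ticks_per_s_left / TICKS_PER_REV * 2 * math.pi * WHEEL_RADIUS
        v_right = GAIN_RIGHT * ticks_per_s_right / TICKS_PER_REV * 2 * math.pi * WHEEL_RADIUS
        linear = (v_left + v_right) / 2.0
        angular = (v_right - v_left) / AXLE_TRACK
        return linear, angular

    # Example: slightly mismatched tick rates produce a small turn.
    print(body_velocity(2000.0, 2050.0))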

What is a camera calibration?

✓ Cameras map points in the world into pixels
✓ Calibration tells us how the mapping works
✓ 3D coordinates can be mapped to a pixel location
✓ Pixel coordinates are used to find a ray in 3D space, which can be used to measure the world (sketched below)
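
Both mappings above can be sketched with a simple pinhole model (ignoring lens distortion, which a real calibration also models). The intrinsic values below are hypothetical:

    import numpy as np

    # Hypothetical pinhole intrinsics: focal lengths (fx, fy) and principal point (cx, cy) in pixels.
    K = np.array([[800.0,   0.0, 640.0],
                  [  0.0, 800.0, 360.0],
                  [  0.0,   0.0,   1.0]])

    def project(point_cam: np.ndarray) -> np.ndarray:
        """Map a 3D point in the camera frame (metres) to a pixel location (u, v)."""
        uvw = K @ point_cam
        return uvw[:2] / uvw[2]

    def back_project(pixel: np.ndarray) -> np.ndarray:
        """Map a pixel (u, v) to a unit ray in the camera frame; depth along the ray is unknown."""
        ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
        return ray / np.linalg.norm(ray)

    p = np.array([0.5, -0.2, 4.0])      # a point 4 m in front of the camera
    uv = project(p)
    print(uv)                           # pixel coordinates
    print(back_project(uv))             # same direction as p, normalised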