Getting Started - How to Use the Calibrator App

You’ve installed your new sensor and made sure everything works, hardware-wise. The last thing you need to do before scanning is calibrate. Using Calibrator to align your sensor’s depth feed with the iPad’s color feed can feel far from intuitive (though we are working to improve this experience every day).

This guide is intended to be used alongside this tutorial video as a complete walkthrough of a successful calibration (which, we promise, can be done!).


Why is calibration important?

Both the original Structure Sensor and the Structure Sensor Mark II/Pro work in tandem with your iPad’s color camera. The iPad’s camera captures the high-definition color and texture of the objects being scanned; Structure Sensor captures depth.

The problem is that the two cameras sit in two different places, and if they are run without calibration, the color from one object may shift onto another. For example, say we were to scan a couch.

If we scanned the couch without calibrating the sensor properly, the scan might result in something like this:

See how the blanket bleeds into the pillow? That’s not right.

So we need calibration to tell the sensor and software where the sensor sits in space relative to the iPad’s camera. When properly calibrated, the scan looks something like this:

As we can see here, there is a clear separation between the blanket and the pillow, and the color lies naturally where it should.

The sensor stores its calibration values locally and associates them with the iPad that was used to calibrate it. Once you calibrate properly, you should not need to do so again for quite a while, if ever.

The sensor holds two states of calibration: with and without the Wide Vision Lens.
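
If you’re curious about what those calibration values actually encode: conceptually, they describe the small rigid offset (and rotation) between the depth camera and the color camera, so the software can look up the right color pixel for every depth point it measures. The sketch below is only a simplified illustration of that idea, with made-up numbers and a basic pinhole camera model; it is not the Structure SDK or Calibrator’s actual code.

```swift
import simd

// Hypothetical pinhole intrinsics for the iPad's color camera (made-up numbers).
let fx: Float = 1_000, fy: Float = 1_000       // focal lengths, in pixels
let cx: Float = 960,   cy: Float = 540         // principal point, in pixels

// Hypothetical extrinsics: where the depth camera sits relative to the color
// camera. Here we pretend the cameras are parallel and offset by 40 mm.
let rotation = matrix_identity_float3x3
let translation = SIMD3<Float>(0.04, 0.0, 0.0) // meters, illustrative only

/// Reprojects a 3D point seen by the depth camera into color-image pixel coordinates.
func colorPixel(forDepthPoint p: SIMD3<Float>) -> SIMD2<Float> {
    let q = rotation * p + translation         // express the point in the color camera's frame
    return SIMD2<Float>(fx * q.x / q.z + cx,   // standard pinhole projection
                        fy * q.y / q.z + cy)
}

// A depth point 1 m straight ahead of the sensor samples its color near x = 1000, y = 540.
print(colorPixel(forDepthPoint: SIMD3<Float>(0, 0, 1)))

// If the stored offset is wrong by 20 mm, the color is sampled about
// fx * 0.02 / 1.0 = 20 pixels away, which is the kind of misalignment that
// makes the blanket's color bleed onto the pillow.
```

When the stored offset matches the real hardware, every depth point picks up its color from the right spot, which is why the properly calibrated couch scan above has clean edges.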

Back to Top


Step 1: The "Pollock Calibration"

The "Pollock Calibration" method has become the Structure standard for saving robust calibrations to the sensor. It is quick and reliable, regardless of light availability or environment. When possible, we recommend calibrating with this method.

  1. Download the Indoor Calibration Target.
  2. During the "How to calibrate" stage, bring the iPad close to the target, to the point where you can see it in both the IR and visible frames: close enough that the image is almost (but not quite) clipped. See the figure below for an example.
  3. Tap "Got it!" and hold still. It should take a few seconds.

  4. Proceed to the refinement stage (the alignment should already be close, if not perfect), complete it, and save the calibration.


Alternate Step 1: Outdoor Calibration Mode

Both calibration states (with and without the Wide Vision Lens) require bracket calibration. This is the software alignment between the iPad’s camera and Structure Sensor.

Indoor and Outdoor Calibration modes are fundamentally the same; what changes is the gain and exposure settings of the sensor. Typically we suggest performing Outdoor Calibration, but if you live in a particularly cloudy area or are having trouble, we provide Indoor Calibration as an alternative.

Calibration Instructions

  1. To begin bracket calibration, tap "Start Calibration".
  2. Depending on your iOS device, you may be taken to a screen that discusses your bracket type. For all supported iPads, we include the XYZ extrinsic translations by default: these are the distances between the sensor and the iPad’s camera in height, width, and depth, measured in millimeters (see the sketch after these steps for one way to picture them). If you have a custom case, you must take these measurements yourself (if you need help making them, please check out this page). Assuming you have a bracket, tap your bracket type.
  3. You will be taken to a screen that says "Sunlight Required".

     Fill up the sun meter by pointing the sensor outside or through an open window. If there is not enough IR available, you will see this toggle appear after five seconds:

     At this point, you can either tap the toggle to adjust the exposure and gain and begin Indoor Calibration, or keep pointing your sensor outside to try to fill the meter.

  4. Once the meter is full, you will see a split screen of color and IR. This is where bracket calibration occurs.

     Your goal here is to find a complex scene with enough contrast that it shows up clearly in both feeds. In the picture below, we use the bare trees against the sky:

     As you can see, the trees show up fairly clearly in the IR feed as well as the color feed. This is an example of a scene with good complexity.

  5. Once you find a complex enough scene, you don’t need to move a huge amount between frames; just move a little! This way you don’t have to keep finding new complex scenes.

     After two or three good frames, the app will switch to the final stage: refinement.
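
About those XYZ extrinsic translations from step 2: they are simply three offsets, in millimeters, between the Structure Sensor and the iPad’s camera. If you ever need to measure them for a custom case, it can help to picture them like this (a purely hypothetical sketch with made-up numbers, not a format Calibrator reads):

```swift
// Hypothetical representation of a bracket's extrinsic translation: the offset
// from the iPad's color camera to the Structure Sensor along the device's
// width (x), height (y), and thickness (z), measured in millimeters.
struct BracketOffsets {
    var xMillimeters: Double
    var yMillimeters: Double
    var zMillimeters: Double

    /// The same offsets in meters, the unit most camera math works in.
    var meters: (x: Double, y: Double, z: Double) {
        (xMillimeters / 1_000, yMillimeters / 1_000, zMillimeters / 1_000)
    }
}

// Made-up measurements for an imaginary custom case: the sensor sits 45 mm to one
// side of the camera, 8 mm below it, and 12 mm further from the screen.
let customCase = BracketOffsets(xMillimeters: 45, yMillimeters: -8, zMillimeters: 12)
print(customCase.meters)   // (x: 0.045, y: -0.008, z: 0.012)
```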

Back to Top


Alternate Step 1: Indoor Calibration Mode

If you are calibrating at night, or in a part of the world that doesn’t see a lot of clear skies, you may not have enough ambient IR to complete Outdoor Calibration. Not to worry! If you stay on the "Sunlight Required" screen for five seconds, a toggle will appear in the lower left corner to enable Indoor Mode.

This adjusts the exposure and gain settings of the sensor, making it more sensitive to IR light.

The rules of calibration remain the same as in Outdoor Calibration Mode: you need to make sure you are scanning a complex enough scene. What constitutes a complex indoor scene? One with a lot of contrast in both the IR stream and the color stream. An easy example is a MacBook keyboard:

This works because the black keys contrast clearly with the light gray chassis.

You might imagine a plant would be a complex enough scene. Not so!

You might also imagine a geometrically complex scene would be complex enough. Again, not necessarily!

In this picture, we see a geometrically complex scene, but there are very few high-contrast areas in the IR stream.

While not perfect, the above scene was enough to complete bracket calibration successfully. The high contrast between dark and light gave the software enough common features between the two feeds, and it was able to do the rest.
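
If it helps to have a rough mental model for "enough contrast", think of how spread out the brightness values in a frame are: a flat wall or a uniformly bright plant has very little spread, while dark keys on a light chassis have a lot. The snippet below only illustrates that intuition; it is not how Calibrator actually evaluates a scene.

```swift
// A rough proxy for scene contrast: the standard deviation of brightness values
// in a frame (0 = black, 1 = white). Flat scenes score low; scenes with strong
// dark/light structure, like a keyboard, score high.
func roughContrast(of luminance: [Double]) -> Double {
    guard !luminance.isEmpty else { return 0 }
    let mean = luminance.reduce(0, +) / Double(luminance.count)
    let variance = luminance
        .map { ($0 - mean) * ($0 - mean) }
        .reduce(0, +) / Double(luminance.count)
    return variance.squareRoot()
}

// Made-up brightness samples.
let blankWall = [0.52, 0.50, 0.51, 0.49, 0.50, 0.51]   // looks flat in IR and color
let keyboard  = [0.05, 0.90, 0.10, 0.85, 0.08, 0.88]   // dark keys on a light chassis

print(roughContrast(of: blankWall))   // ~0.01: too little structure to lock onto
print(roughContrast(of: keyboard))    // ~0.40: plenty of contrast
```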

Back to Top


Step 2: Refinement

If you’ve made it this far, congratulations! You’ve made it through the hardest part.

Refinement is the last part of calibration, where you manually fine-tune the alignment the software has computed. You do this by dragging the depth feed’s color overlay until it matches the physical scene.

Please note! You do not need to perform refinement while looking outside! In fact, you might have better luck doing so indoors. We do our best to help you get the hang of it by working through this tutorial:

Once you think you’ve got it, find a scene with some very clearly defined edges, like the one here:

On iPads, you will drag the scene right and left. On iPhones, you will drag the scene up and down.
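
If you’re wondering why a small drag along a single axis is all it takes: the sensor sits offset from the camera mainly along one axis of the device (which is why the drag direction differs between iPad and iPhone), and a small change in that assumed offset slides the overlay by a fairly predictable number of pixels. Here is a purely back-of-the-envelope sketch of that relationship, with a made-up focal length; it is not Calibrator’s actual refinement math.

```swift
// Roughly how far (in pixels) the color overlay slides when the assumed offset
// between the two cameras changes by deltaMillimeters, for objects about
// distanceMeters away. Simple pinhole approximation with a made-up focal length.
func overlayShiftPixels(deltaMillimeters: Double,
                        distanceMeters: Double,
                        focalLengthPixels: Double = 1_000) -> Double {
    (deltaMillimeters / 1_000) / distanceMeters * focalLengthPixels
}

// A 2 mm error barely moves the overlay on far-away objects but is easy to see on
// nearby, sharp-edged ones, which is why clearly defined edges help so much here.
print(overlayShiftPixels(deltaMillimeters: 2, distanceMeters: 2.0))   // 1.0 pixel
print(overlayShiftPixels(deltaMillimeters: 2, distanceMeters: 0.5))   // 4.0 pixels
```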

Once the color (not the spaces where there is no color!) matches the physical scene, tap "Save Calibration".

The calibration state will be saved to the sensor.

Remember, there are two calibration states that need to be saved: with and without the Wide Vision Lens, so be sure to calibrate in both configurations.

Once you’re fully calibrated, you are good to go.

Happy scanning!
