EDIT: I forgot to check Profile.xlsx... Always check Profile.xlsx.
For the calibration of 3D sensor values, the old PDF for the FIT SDK states that the orientation matrix can contain values between -√3 and √3. This has since been removed, and it now only states that it "can support values from + and –".
In the SDK example, the orientation_matrix values are specified as +/-1 (or 0) for the corresponding axis. For the VIRB Ultra 30, however, they are so far +/-65535 (or 0) for all sensors. The values are stored as sint32, but they look suspiciously like a maxed-out uint16 with a sign?
Is it the calibration_factor and calibration_divisor that adjust this? Or am I completely misunderstanding? I have an implementation that seems to work with the SDK example (if orientation_matrix is assumed to be in the +/-1 range), but I need some input on whether the VIRB's orientation_matrix values need further adjustment before use.
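For reference, this is roughly what my implementation does with one raw sample (a minimal sketch, assuming the usual (raw - level_shift - offset_cal) * calibration_factor / calibration_divisor formula followed by the orientation_matrix rotation; apply_calibration is just my own helper, not SDK code):

```python
# Minimal sketch of my calibration step; assumes orientation_matrix is already
# in the +/-1 range, which is exactly the part I am unsure about for the VIRB.

def apply_calibration(raw_xyz, cal):
    """Apply a three_d_sensor_calibration message to one raw [x, y, z] sample."""
    factor = cal["calibration_factor"]
    divisor = cal["calibration_divisor"]
    level_shift = cal["level_shift"]
    offset = cal["offset_cal"]          # per-axis offsets [x, y, z]
    m = cal["orientation_matrix"]       # row-major 3x3, assumed +/-1 range

    # 1) remove the level shift and the per-axis offset from the raw counts
    shifted = [raw_xyz[i] - level_shift - offset[i] for i in range(3)]

    # 2) scale the counts to physical units (e.g. deg/s for the gyro)
    scaled = [v * factor / divisor for v in shifted]

    # 3) rotate from sensor axes into device axes with the orientation matrix
    return [
        m[0] * scaled[0] + m[1] * scaled[1] + m[2] * scaled[2],
        m[3] * scaled[0] + m[4] * scaled[1] + m[5] * scaled[2],
        m[6] * scaled[0] + m[7] * scaled[1] + m[8] * scaled[2],
    ]


# Example using the gyro calibration message below, taking orientation_matrix as +/-1:
cal = {
    "calibration_factor": 5,
    "calibration_divisor": 82,
    "level_shift": 32768,
    "offset_cal": [-23, 19, -12],
    "orientation_matrix": [0, -1, 0, 0, 0, -1, -1, 0, 0],
}
print(apply_calibration([32768, 32768, 32768], cal))
```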
Here's an example of a calibration message for the gyroscope:
Global ID: 167 | Message type: three_d_sensor_calibration | Header: 5/0b00000101
253 timestamp UINT32([6841])
1 calibration_factor UINT32([5])
2 calibration_divisor UINT32([82])
3 level_shift UINT32([32768])
4 offset_cal SINT32([-23, 19, -12])
5 orientation_matrix SINT32([0, -65535, 0, 0, 0, -65535, -65535, 0, 0])
0 sensor_type ENUM([1])
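Quick sanity check on those values: if orientation_matrix carries a scale of 65535 in Profile.xlsx (my reading of the profile, so treat it as an assumption), dividing the stored sint32 values by that scale gives back exactly the +/-1 (or 0) entries the SDK example uses:

```python
SCALE = 65535  # assumed scale of the orientation_matrix field per Profile.xlsx

raw_matrix = [0, -65535, 0, 0, 0, -65535, -65535, 0, 0]  # from the gyro message above
matrix = [v / SCALE for v in raw_matrix]
print(matrix)
# [0.0, -1.0, 0.0, 0.0, 0.0, -1.0, -1.0, 0.0, 0.0]
```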
EDIT: If I adjust orientation_matrix for the magnetometer data (208) and then convert the calibrated data to degrees (i.e. some raw-ish heading value), the end result is the same, since the change is linear and applies to all values equally; the relative proportions do not change. Intermediate values change, of course.
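To show why that is: the heading only depends on the ratio of the horizontal magnetometer components, so a common linear factor cancels out in atan2 (a toy example with made-up values; heading_deg is just a hypothetical helper):

```python
import math

def heading_deg(mx, my):
    # Heading from the horizontal magnetometer components; only the ratio
    # my/mx matters, so any common scale factor cancels out.
    return math.degrees(math.atan2(my, mx)) % 360.0

mx, my = 123.0, -456.0                        # made-up raw-ish values
print(heading_deg(mx, my))
print(heading_deg(mx / 65535, my / 65535))    # prints the same heading: the scale cancels
```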