Support & FAQ

FAQ

  • Basics of Motion Application Development

    • Why would I want to use a gyroscope if I already use an accelerometer and magnetometer?

      Both accelerometers and magnetometers are noisy sensors, which means they exhibit a slow response rate (due to the noise filtering they require) in fast-moving systems. Accelerometer measurements are also corrupted by motion. By adding a gyroscope, the MotionFusion solution has a faster response rate and tracks small, quick movements much more accurately. Gyroscope MotionFusion also makes it possible to sense linear acceleration separately from gravity.
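
      For illustration, here is a minimal complementary-filter sketch showing how a gyroscope's fast, low-noise rate signal can be blended with a slower, noisier accelerometer tilt estimate. This is not the MotionFusion algorithm; the sample rate, blend factor, and function names are assumptions.

        /* Minimal complementary filter: illustrative only, not the MotionFusion
         * algorithm. Blends fast gyroscope integration with a slow, noisy
         * accelerometer tilt reference. Names and constants are assumptions. */
        #include <math.h>

        #define DT     0.01f   /* assumed 100 Hz sample period, in seconds */
        #define ALPHA  0.98f   /* assumed blend factor: trust the gyro short-term */

        /* pitch_deg: previous filter output; gyro_rate_dps: pitch rate from the
         * gyro in deg/s; ax, ay, az: accelerometer output in g. */
        float update_pitch(float pitch_deg, float gyro_rate_dps,
                           float ax, float ay, float az)
        {
            /* Tilt from the gravity direction: noisy, and corrupted by linear motion. */
            float accel_pitch_deg = atan2f(-ax, sqrtf(ay * ay + az * az)) * 57.2957795f;

            /* Gyro integration tracks quick movements but drifts over time. */
            float gyro_pitch_deg = pitch_deg + gyro_rate_dps * DT;

            /* Blend: the gyro dominates at high frequency, the accel corrects the drift. */
            return ALPHA * gyro_pitch_deg + (1.0f - ALPHA) * accel_pitch_deg;
        }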

    • What is heading drift?

      Because 6-axis MotionFusion has no way to directly sense the earth’s magnetic field, it estimates heading by integrating the gyroscope outputs. The gyroscope output always contains some small amount of noise and may have small biases. Since gyroscope integration keeps a running sum of the gyroscope outputs, and those outputs include small errors, the sum slowly accumulates an error over time. 6-axis MotionFusion can correct only part of this error using information from the accelerometer.
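
      As a rough illustration of how a small residual bias accumulates, the sketch below integrates a constant gyro bias while the device is perfectly still; the bias value and sample rate are arbitrary assumptions.

        /* Heading drift illustration: integrating a gyro signal that carries a
         * small constant bias. Numbers are arbitrary assumptions. */
        #include <stdio.h>

        int main(void)
        {
            const float dt = 0.01f;        /* assumed 100 Hz sample rate */
            const float bias_dps = 0.05f;  /* assumed residual gyro bias, deg/s */
            float heading_deg = 0.0f;
            long i;

            /* The true rate is zero, but the summed bias still accumulates
             * into the heading estimate. */
            for (i = 0; i < 100L * 60L * 10L; i++)   /* 10 minutes of samples */
                heading_deg += bias_dps * dt;

            printf("Heading error after 10 minutes: %.1f degrees\n", heading_deg);
            return 0;
        }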

    • What are Euler angles? How do I use Euler angles?

      Euler angles are a representation of an angular frame, as are quaternions and rotation matrices.

      Euler angles are a set of three angles, corresponding to pitch, roll, and yaw.

      Euler angles may be constructed according to many conventions. The axes and ordering of pitch, roll, and yaw vary by convention. The Embedded MotionApps Platform makes Euler angles available in the three most common conventions with MLGetEulerAnglesX, MLGetEulerAnglesY, and MLGetEulerAnglesZ. See the Embedded MotionApps Platform Functional Specification for specific details on the definition of these conventions.

      Euler angle rotations are not commutative. For any two sets of Euler angles A and B, a rotation A followed by B (A + B) is not necessarily equal to a rotation B followed by A (B + A). Because of this algebraic property, Euler angles are often not very useful for signal processing applications.

      Euler angles suffer from a pair of singularity points at which only two of the three angles are significant. The derivative of the Euler angles is discontinuous at these points. This condition is often called gimbal lock.
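
      The sketch below converts a unit quaternion to one common Euler convention (yaw-pitch-roll about Z, Y, X). It is illustrative only and is not necessarily the convention used by MLGetEulerAnglesX, MLGetEulerAnglesY, or MLGetEulerAnglesZ; note the clamping near the pitch = ±90° singularity (gimbal lock).

        /* Illustrative quaternion-to-Euler conversion, Z-Y-X (yaw/pitch/roll)
         * convention. See the Functional Specification for the conventions
         * actually used by the MLGetEulerAngles* functions. */
        #include <math.h>

        void quat_to_euler_zyx(const float q[4],   /* q = {w, x, y, z}, unit length */
                               float *roll, float *pitch, float *yaw)
        {
            float w = q[0], x = q[1], y = q[2], z = q[3];
            float s;

            *roll = atan2f(2.0f * (w * x + y * z), 1.0f - 2.0f * (x * x + y * y));

            /* Clamp to avoid NaN from rounding near the pitch = ±90° singularity. */
            s = 2.0f * (w * y - z * x);
            if (s > 1.0f)  s = 1.0f;
            if (s < -1.0f) s = -1.0f;
            *pitch = asinf(s);

            *yaw = atan2f(2.0f * (w * z + x * y), 1.0f - 2.0f * (y * y + z * z));
        }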

    • What is a rotation matrix? How do I use a rotation matrix?

      A rotation matrix is a representation of an angular frame, as are quaternions and Euler angles.

      A rotation matrix has 9 elements, arranged in three rows and three columns. The sum of the squares of the elements in each row and in each column is 1. Therefore, the maximum magnitude of any element of a rotation matrix is 1.

      A rotation matrix can be easily converted to a quaternion or Euler angles.

      Rotation matrices are most often used to transform a three-dimensional coordinate from one angular frame to another.

      http://en.wikipedia.org/wiki/Rotation_matrix
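
      For example, a sketch of rotating a vector from one frame to another with a 3x3 rotation matrix is shown below; the row-major storage order is an assumption.

        /* Rotate a 3-D vector v into another frame: out = R * v.
         * R is assumed to be stored row-major. */
        void rotate_vector(const float R[9], const float v[3], float out[3])
        {
            out[0] = R[0] * v[0] + R[1] * v[1] + R[2] * v[2];
            out[1] = R[3] * v[0] + R[4] * v[1] + R[5] * v[2];
            out[2] = R[6] * v[0] + R[7] * v[1] + R[8] * v[2];
        }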

    • What is a quaternion? How do I use a quaternion?

      A quaternion is a number with a real and three imaginary components, q = a + i b + j c + k d

      A quaternion represents a rotational frame in a form related to the angle theta and axis v of rotation, where v is a three-dimensional unit (length 1) vector, and q = cos(theta / 2) + i * vx * sin(theta / 2) + j * vy * sin(theta / 2) + k * vz * sin(theta / 2)

      You can easily convert a quaternion to Euler angles or a rotation matrix.

      http://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation

      The DMP calculates the MotionFusion solution as a quaternion because it is the most efficient representation for mathematical applications.
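
      As a minimal sketch of the formula above, the function below builds a quaternion from a rotation angle and a unit-length axis; the {w, x, y, z} element order is an assumption.

        /* Build a quaternion q = {w, x, y, z} from a rotation of angle_rad
         * radians about the unit axis v, per
         * q = cos(theta/2) + sin(theta/2) * (i*vx + j*vy + k*vz). */
        #include <math.h>

        void quat_from_axis_angle(const float v[3], float angle_rad, float q[4])
        {
            float half = 0.5f * angle_rad;
            float s = sinf(half);

            q[0] = cosf(half);   /* real part */
            q[1] = s * v[0];     /* i component */
            q[2] = s * v[1];     /* j component */
            q[3] = s * v[2];     /* k component */
        }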

    • What is a bias tracker?

      All motion sensors have an output bias point which corresponds to a zero measurement; in the case of a gyroscope, this is the output when the device is at rest. The bias point changes by small amounts with time and with temperature. A bias tracker estimates the bias point so that small changes in bias do not result in sensor fusion inaccuracies.
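
      Conceptually, a very simple bias tracker might average the gyro output over a window while the device is known to be at rest and subtract that estimate from later readings. The sketch below is an illustration only, not the tracker used by the MotionFusion software; the window length is an assumption.

        /* Conceptual bias tracker: average raw samples over a no-motion window
         * and subtract the resulting estimate from later samples. Illustration
         * only; the window length is an assumption. */
        #define BIAS_WINDOW 256

        typedef struct {
            float sum;    /* accumulator for the current window */
            int   count;  /* samples collected so far */
            float bias;   /* latest zero-rate estimate */
        } bias_tracker_t;

        /* Feed raw samples only while a no-motion condition is detected. */
        void bias_tracker_update(bias_tracker_t *bt, float raw)
        {
            bt->sum += raw;
            if (++bt->count == BIAS_WINDOW) {
                bt->bias = bt->sum / BIAS_WINDOW;   /* new zero-rate estimate */
                bt->sum = 0.0f;
                bt->count = 0;
            }
        }

        /* Apply the current bias estimate to a raw sample. */
        float bias_tracker_apply(const bias_tracker_t *bt, float raw)
        {
            return raw - bt->bias;
        }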

    • What is an angular frame?

      An angular frame is a representation of an object’s orientation in 3D space. It is commonly described using Euler angles or a quaternion. Please see http://en.wikipedia.org/wiki/Euler_angles and http://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation.

      An angular frame does not describe the 3D location of an object, only its rotation relative to a reference. The reference rotation for a 6-axis angular frame is flat and level—that is, the vertical axis of the object is aligned with up/down. The reference rotation for a 9-axis angular frame is flat, level, and north: the vertical axis of the object is aligned up/down, and the forward axis of the object is aligned north/south.

  • Embedded MotionApps Platform

    • Can MEMS mics be soldered to a flex PCB?

      Yes, a flex PCB can be used with MEMS microphones. All of our microphones are actually available on flex PCB evaluation boards so that they can be easily placed in your own system for testing. Handling considerations in the datasheets and application note AN-1068 should be followed during assembly.

    • Can a digital microphone’s PDM output be directly connected to an I²S input?

      A PDM output microphone, such as the ADMP621, cannot directly interface to an I²S port. These microphones should be connected to the PDM input on a codec or DSP. The ADMP441 MEMS microphone has an I²S output that can be connected directly to a processor or codec’s I²S input port.

    • When I bring up AVR Studio 5 with Embedded MotionApps, it shows three build targets. Which one should I select?

      The Embedded MotionApps Platform AVR Studio 5 project has three build targets, specific to different InvenSense Motion Processors. These are:

      Debug-3k : This option supports the IMU-3000 device with the KXTF9 auxiliary accelerometer, on the Inertial2 sensor board.
      Debug-MantisB1 : This option supports MPU-6050 Revision B1 hardware.
      MantisA2 : This option supports MPU-6050 Revision A2 hardware. See “How does the Embedded MotionApps Platform determine gyro bias?” and “How does the Embedded MotionApps Platform provide gyro temperature compensation?”
      You must select the build target appropriate for your hardware before building.

    • Where do I find instructions for downloading AVR Studio 5?

      Click on the link below to go to the Atmel AVR Studio 5 page. Scroll halfway down the page and click the “Register” link under the “Software” heading. This takes you to a registration page; once registration is complete, you will be able to download the installer.

      http://www.atmel.com/dyn/products/tools_card.asp?tool_id=17212&source=avr_5_studio_overview

    • Does Embedded MotionApps Platform use my system resources to do sensor fusion?

      Embedded MotionApps 2.0, as well as the Motion Driver, uses the Digital Motion Processor (DMP™) located on the sensor device to perform 6-axis MotionFusion. The Embedded MotionApps Platform reads the MotionFusion results from the DMP through the FIFO buffer.
      For 9-axis operation in the MotionFit SDK, the microcontroller is required to integrate compass data and run some calibration routines.
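
      As an illustration of consuming those results, the DMP's quaternion words are commonly 32-bit signed fixed-point values in q30 format (1.0 is represented as 1 << 30). The sketch below converts such words to floating point; the q30 scaling and the read_dmp_fifo_quaternion() helper are assumptions, not the platform's actual API.

        /* Convert a DMP quaternion read from the FIFO to floating point.
         * Assumption: the four quaternion words are 32-bit signed q30 fixed
         * point. read_dmp_fifo_quaternion() is a hypothetical helper standing
         * in for the platform's FIFO access routine. */
        #include <stdint.h>

        extern int read_dmp_fifo_quaternion(int32_t quat_q30[4]);  /* hypothetical */

        int get_quaternion(float q[4])
        {
            int32_t raw[4];
            int i;

            if (read_dmp_fifo_quaternion(raw) != 0)
                return -1;                 /* no new FIFO packet available */

            for (i = 0; i < 4; i++)
                q[i] = (float)raw[i] / (float)(1L << 30);   /* q30 -> float */

            return 0;
        }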

    • Can I use the Embedded MotionApps Platform software on an operating system?

      The Embedded MotionApps Platform software is not intended for use on an operating system, but it is possible. For use on Android or Linux operating systems, the MotionApps Platform Library is recommended.

    • With which platforms can I use the Embedded MotionApps Platform?

      The Embedded MotionApps Platform may be ported to any platform which can provide the facilities required by the Motion Library System Layer (MLSL). In summary, the Embedded MotionApps Platform can be used on any system which has access to an IMU/MPU device over an I2C serial bus, can provide an interface to sleep for a number of milliseconds, and can provide a millisecond resolution system clock. For more details, see the Embedded MotionApps Platform User Guide.
      The MotionFit SDK is currently tied to the MSP430 MCU from Texas Instruments. Source code is available for modification, but the sensor fusion is locked to this MCU.
      We recommend Embedded Motion Driver 5.1 for an MCU-agnostic approach; it is a very lightweight and portable solution that generates 6-axis sensor fusion.
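
      To make the system-layer requirements above concrete, a port typically supplies hooks along the lines sketched below. These names and signatures are hypothetical placeholders, not the actual MLSL prototypes; consult the Embedded MotionApps Platform User Guide for the real interface.

        /* Hypothetical sketch of the hooks a port must supply. Placeholder
         * names and signatures only; not the actual MLSL prototypes. */
        #include <stdint.h>

        /* I2C access to the IMU/MPU device. */
        int platform_i2c_write(uint8_t slave_addr, uint8_t reg,
                               uint8_t len, const uint8_t *data);
        int platform_i2c_read(uint8_t slave_addr, uint8_t reg,
                              uint8_t len, uint8_t *data);

        /* Sleep for a number of milliseconds. */
        void platform_delay_ms(uint32_t ms);

        /* Millisecond-resolution system clock. */
        uint32_t platform_get_ms(void);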

    • What does the hardware do and what does the BSP do? Does the hardware, without any additional MCU, produce the quaternion 6-DOF fusion output? Is there any hardware that can combine a compass into the attitude results?

      The motion algorithms have two major components: sensor fusion and calibration. Sensor fusion runs on the DMP (in hardware), while calibration runs on the micro. The hardware sensors (gyro and accel) detect motion, while the BSP manages the I2C, timer, and non-volatile memory interfaces. The hardware does perform 6-axis (6-DOF) sensor fusion through the dedicated DMP computation engine. The next release of the Embedded MPL software will enable 9-axis sensor fusion that runs in a fully self-contained fashion on the DMP. Currently the DMP does not support 9-axis sensor fusion due to degraded performance; we are planning to eventually push full 9-axis operation to the DMP.
      10-axis (altitude) sensor fusion can be performed on the MCU by taking the 9-axis sensor fusion results from the hardware and combining them with altitude information from a pressure sensor.

    • How complicated is the calibration routine? Are expensive calibration fixtures or equipment required?

      The calibration algorithms supported in software include no-motion calibration: whenever the DMP detects a no-motion event on the gyro, it immediately triggers a calibration process. There is also temperature compensation of the biases, which runs in software on the micro. These calibration routines run in real time on the hardware and the micro and do not require any external equipment.

    • Is the bias correction calculated while the device is being accelerated or when it is static?

      The bias correction is calculated when the device is in a no-motion state. The compensation of bias values for temperature variations is performed continuously.
