Attitude Heading Reference System: An intuitive engineer's perspective
Welcome to a blog on the development of an Attitude and Heading Reference System (AHRS) fusion algorithm using a Linear Kalman Filter (LKF) and an IMU (accelerometer, gyroscope, and magnetometer). This blog focuses purely on implementing the algorithm in MATLAB using a Kalman Filter, not on explaining the basics of AHRS and the KF. I have not explained the complete method, but I have attached the resources through which I developed this fusion algorithm.
To learn about the KF: https://www.kalmanfilter.net/default.aspx (I would recommend going through the complete documentation if you have no idea about the KF).
To learn about what an AHRS is: https://en.wikipedia.org/wiki/Attitude_and_heading_reference_system
An AHRS is a fusion algorithm that fuses accelerometer, gyroscope, and magnetometer data to give us roll, pitch, and yaw.
I will give you an overview with this image: roll, pitch, and yaw (from an automotive perspective)!!!
(Image courtesy: https://www.racecar-engineering.com/tech-explained/racecar-vehicle-dynamics-explained/attachment/racecar-vehicle-dyanmics-roll-pitch-yaw/)
Euler / DCM / Quaternion??
This fusion algorithm was developed for an automotive application. We use the Euler angle representation instead of quaternions or the Direction Cosine Matrix (DCM) because gimbal lock is near impossible in a four-wheeled car (unless it rolls or topples over). Also, implementing it with quaternions involves non-linear system dynamics, resulting in a non-linear fusion algorithm such as the Extended Kalman Filter, which is difficult for a beginner like me, so I chose to go with an LKF and the Euler representation.
I recommend watching this video on quaternions:
https://www.youtube.com/watch?v=3BR8tK-LuB0&ab_channel=Numberphile
(Image courtesy: https://www.coursera.org/learn/state-estimation-localization-self-driving-cars/lecture/ccy3B/lesson-1-3d-geometry-and-reference-frames)
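To make the Euler-angle choice concrete, here is a small sketch (in Python, though the blog's implementation is in MATLAB) of the standard way to get roll and pitch from a static accelerometer reading and a tilt-compensated yaw from the magnetometer. The exact signs depend on your sensor's axis conventions (more on that below), so treat this as an illustration, not the blog's exact code.

```python
import math

def accel_to_roll_pitch(ax, ay, az):
    """Roll and pitch (rad) from a static accelerometer reading.
    Assumes an MPU9250-style right-handed axis convention; signs
    may need flipping for your phone's sensor orientation."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def mag_to_yaw(mx, my, mz, roll, pitch):
    """Tilt-compensated yaw (rad) from a magnetometer reading,
    using roll/pitch to rotate the field into the horizontal plane."""
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-my_h, mx_h)
```

These three angles form the measurement vector Z[k] that the Kalman filter fuses with the gyroscope later in this post.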
Before we start building the KF algorithm, let's go over and understand how to use mobile sensor data for this project!!
https://www.youtube.com/watch?v=LCk9gV0ooxI&ab_channel=MATLAB
Just watch this video to understand how mobile sensors are used for algorithm development. (For those who don't have a paid option, I will publish one or two (or more if needed) .mat files containing sensor data.)
Note: the smartphone I used for this project is a Moto G52.
AHRS Fusion Algorithm:
https://www.youtube.com/watch?v=0rlvvYgmTvI&t=266s&ab_channel=MATLAB
I learned the fusion algorithm for the AHRS from the above two links. Without understanding it, we cannot develop the AHRS. Be patient and watch them multiple times to get a good grasp of it.
So now I think we are good to go and can understand our algorithm.
Linear Kalman Filter for AHRS:
Z[k] is the fusion of the accelerometer and magnetometer readings. Please watch the YouTube link described under the fusion algorithm section to understand it.
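To show the shape of the filter itself, here is a minimal sketch of one predict/update step, written in Python (the blog's code is MATLAB). It assumes a common 2-state model per Euler angle, [angle, gyro bias], with the raw gyro rate as the control input and the accel/mag-derived angle as the measurement Z[k]; the actual state layout in the blog's files may differ.

```python
import numpy as np

def kf_step(x, P, gyro_rate, z_meas, dt, Q, R):
    """One Linear Kalman Filter step for a single Euler angle.
    x: 2x1 state [angle; gyro_bias], P: 2x2 covariance,
    gyro_rate: raw gyro reading (rad/s), z_meas: angle from accel/mag,
    Q: 2x2 process noise, R: scalar measurement noise."""
    F = np.array([[1.0, -dt],
                  [0.0, 1.0]])   # angle integrates (rate - bias); bias is constant
    B = np.array([[dt], [0.0]])  # gyro rate drives the angle
    H = np.array([[1.0, 0.0]])   # we measure only the angle

    # Predict
    x = F @ x + B * gyro_rate
    P = F @ P @ F.T + Q

    # Update with the accel/mag measurement
    y = z_meas - (H @ x)         # innovation
    S = H @ P @ H.T + R          # innovation covariance (1x1)
    K = P @ H.T / S              # Kalman gain (2x1)
    x = x + K * y
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Running this in a loop over the logged sensor samples, once per angle, is essentially what the MATLAB script does.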
Things to watch for!!!
- It took me several iterations to properly assign the direction of the positive/negative axes and rotations. You need a good understanding of how the sensor sits in your smartphone to manipulate its positive/negative directions and X, Y, and Z axes. Matching the X, Y, and Z axes of the individual sensors was a real headache. (Initially I didn't have a technical reason for how to do this; I got there through multiple iterations, but later I understood, and it is explained below.)
- Tuning the Q and R matrices is crucial to getting the algorithm to work efficiently.
- Sensor bias needs to be calibrated. Although the fusion algorithm estimates sensor bias, an additional external bias had to be added/subtracted to get accurate results. (I was not sure why this happens.)
- Use an appropriate step time for proper results.
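On the Q/R tuning point: a practical starting value for R is the variance of each sensor channel logged while the phone sits perfectly still. A small Python sketch (the blog's code is MATLAB, so this is just the idea):

```python
import numpy as np

def measurement_variance(samples):
    """Estimate sensor noise variance from a static recording.
    'samples' is a 1-D array of one sensor channel logged with the
    phone held still; the result can seed the matching diagonal
    entry of R. Q is then typically set much smaller and tuned."""
    samples = np.asarray(samples, dtype=float)
    return float(np.var(samples, ddof=1))  # unbiased sample variance
```

Seeding R this way and then tuning Q by trial gives a much faster path to a stable filter than guessing both from scratch.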
Matching sensor axes:
Since I followed the video from Brian Douglas, who developed his AHRS for the MPU9250, I kept the MPU9250 axes as the reference and changed my smartphone sensor axes to match the MPU9250.
(Image courtesy: https://www.14core.com/wp-content/uploads/2016/10/MPU9250-AXIS-ORIEANTATION-Illustration-Diagram-01.jpg)
My smartphone sensor axes:
The accelerometer and gyroscope axes of the MPU9250 and my smartphone are the same, but the magnetometer axes are different. To match the MPU9250 magnetometer, I interchanged my X and Y axes and put a negative sign on the Z axis.
Line 20:
mag = [MagneticField.Y(i) -MagneticField.X(i) -MagneticField.Z(i)]; % modified
% [magX magY magZ] actual case
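For clarity, the same remap expressed as a standalone function (Python here; the blog's line above is MATLAB):

```python
def remap_mag_to_mpu9250(mag_xyz):
    """Mirror of the MATLAB line above: take the phone's magnetometer
    reading [magX, magY, magZ] and return it in the MPU9250 frame
    as [magY, -magX, -magZ]."""
    mx, my, mz = mag_xyz
    return [my, -mx, -mz]
```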
Sensor Results:
Here, KF represents the Kalman filter fusion algorithm that we developed, and Device represents the algorithm used by the device manufacturer/software developer.
Future Scope:
· Here, we have seen how to implement an AHRS using a KF with smartphone sensors. You can take this into Simulink and do hardware-in-the-loop simulations with the MPU9250. Very few changes to this algorithm are needed, and you are good to go. (Keywords: persistent, MATLAB Function, Arduino hardware for Simulink)
· We can use this algorithm in the development of an Inertial Navigation System (INS) that uses GPS as well to estimate true North, East, and Down (NED) position and velocity (6 state variables).
END!!
The files for this project are available on my MathWorks community page:
I'm thankful to Brian Douglas and TKJ Electronics, from whom I was able to develop this project. Also, to get a good understanding of inertial navigation systems and basic and advanced Kalman filters, I recommend the course State Estimation and Localization on Coursera (link below). If you would like to contribute to this work, adjust your Q and R for better output, run your code under static conditions, and comment the variance along with your Q and R. Better Q and R means a better KF.
Hope to see you in another blog...
https://www.youtube.com/channel/UCq0imsn84ShAe9PBOFnoIrg
https://blog.tkjelectronics.dk/
https://www.coursera.org/learn/state-estimation-localization-self-driving-cars/home/welcome