A significant portion of the code coordinates communication between the main computer, the sensors, the cameras, and the thrusters. The main computer, an Intel NUC, and an Arduino Mega communicate over a serial interface. Hardware constraints require the pressure sensor and the thrusters to be connected to the Arduino Mega, which reads their data and forwards it to the NUC. All other sensors and cameras are connected directly to the NUC. At the start of the main program on the NUC, separate threads are created, each gathering data from a different device or sending thruster commands to the Mega.
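The per-device threading scheme can be sketched as follows. This is a minimal illustration, not the actual driver code: the device names, read functions, and polling period are assumptions, and the real threads would read from serial and camera interfaces rather than the stand-in lambdas used here.

```python
import threading
import queue
import time

# Each reader thread pushes (device_name, reading) tuples onto a shared
# queue that the main control loop drains.
sensor_queue = queue.Queue()

def read_device(name, read_fn, stop_event, period=0.01):
    """Poll one device in its own thread until asked to stop."""
    while not stop_event.is_set():
        sensor_queue.put((name, read_fn()))
        time.sleep(period)

# Stand-ins for the real drivers (e.g. the serial link to the Mega, the AHRS).
fake_readers = {
    "pressure": lambda: 101.3,             # kPa, via the Arduino Mega link
    "ahrs":     lambda: (0.0, 1.5, 90.0),  # roll, pitch, yaw in degrees
}

stop = threading.Event()
threads = [
    threading.Thread(target=read_device, args=(name, fn, stop), daemon=True)
    for name, fn in fake_readers.items()
]
for t in threads:
    t.start()

time.sleep(0.05)   # let the threads gather a few readings
stop.set()
for t in threads:
    t.join()

# The main loop would consume readings like this and act on each one.
readings = []
while not sensor_queue.empty():
    readings.append(sensor_queue.get())
```

A queue decouples the reader threads from the control loop, so a slow device cannot stall the others.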
Once the data reaches the main program, each sensor's readings are evaluated to determine whether the thrusters' behavior needs to change. For example, if the orientation data from the AHRS shows that the AUV is tilted too far in one direction and may be about to flip upside down, the side thrusters are driven to correct back to the expected orientation. Likewise, the pressure sensor is used to detect the depth of the AUV in the water; if the AUV is about to accidentally breach the surface, the up/down thrusters are immediately turned on to force the AUV back down. The data from the thrusters themselves are the most critical, because they reveal whether a thruster is about to overheat, in which case all thrusters are shut off completely.
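The priority of these checks can be expressed as a small decision function. The thresholds and the returned action names below are illustrative assumptions, not the vehicle's actual values; the point is the ordering, with the overheat cutoff taking precedence over everything else.

```python
# Illustrative safety logic derived from the rules above.
MAX_ROLL_DEG = 30.0          # beyond this, the AUV risks flipping
MIN_DEPTH_M = 0.3            # shallower than this risks breaching the surface
MAX_THRUSTER_TEMP_C = 70.0   # above this, a thruster may overheat

def safety_action(roll_deg, depth_m, thruster_temps_c):
    """Return the corrective action the control loop should take."""
    if any(t > MAX_THRUSTER_TEMP_C for t in thruster_temps_c):
        return "all_off"     # overheating overrides everything
    if depth_m < MIN_DEPTH_M:
        return "dive"        # up/down thrusters force the AUV back down
    if abs(roll_deg) > MAX_ROLL_DEG:
        return "level"       # side thrusters correct the orientation
    return "none"
```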
Using the data gathered from the cameras and the current position of the AUV in space, the AUV calculates the most direct path to the object of interest in the camera frame. This path is then reduced to a unit vector pointing in the direction the AUV needs to travel, based on the distance to the object calculated through image processing. The AUV then uses a control and feedback loop to align its heading with the correct path, turning the thrusters on and off to control its surge, heave, and sway.
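Reducing the path to a unit vector is a simple normalization. This sketch assumes a 3D Cartesian frame whose axes correspond to surge, sway, and heave; the coordinate values are illustrative.

```python
import math

def direction_to_target(auv_pos, target_pos):
    """Return the unit vector from the AUV toward the object of interest."""
    delta = [t - a for t, a in zip(target_pos, auv_pos)]
    dist = math.sqrt(sum(c * c for c in delta))
    if dist == 0.0:
        return (0.0, 0.0, 0.0)   # already at the target
    return tuple(c / dist for c in delta)

# Example: an object 3 m ahead and 4 m to the side (distance 5 m)
u = direction_to_target((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))
# u is (0.6, 0.8, 0.0)
```

The control loop then drives the surge, sway, and heave thrusters in proportion to the corresponding components of this vector.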
The task of tracking moving objects against a dynamic background is complex due to changes in the orientation of the object, partial and complete occlusion, varying lighting conditions, camera motion, and noise in the camera feed. Our AUV accomplishes object tracking with Kalman filters. The Kalman filter provides a recursive solution that predicts the state of the object from past values and noisy measurement data, and its predictions become close to accurate after a few iterations of its prediction and update steps.
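A minimal version of this predict/update cycle, for a constant-velocity model of the object's image position, could look like the following. The noise magnitudes and the constant-velocity assumption are illustrative choices, not the parameters used on the vehicle.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],    # state: [x, y, vx, vy] in pixels
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],     # the camera measures position only
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)            # process noise (illustrative)
R = 1.0 * np.eye(2)             # measurement noise (illustrative)

x = np.zeros(4)                 # initial state estimate
P = 10.0 * np.eye(4)            # initial uncertainty

def kalman_step(x, P, z):
    """One prediction + update cycle given a measurement z = (x_px, y_px)."""
    # Prediction step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Feed measurements of an object drifting one pixel per frame on each axis;
# after a few iterations the velocity estimate converges toward (1, 1).
for t in range(1, 20):
    x, P = kalman_step(x, P, np.array([float(t), float(t)]))
```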
In our algorithm, the Kalman filter also compensates for the motion of the camera. This can be done either by augmenting the state variable with the image-frame position and velocity relative to the object, or by adding the velocity of the camera to the kinematic equations used to predict the future location of the object. In this way, we also use the Kalman filter for image stabilization.
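The second approach, folding the camera's own velocity into the prediction equations, can be sketched as a known control input to the predict step along one image axis. The camera velocity value and the one-axis simplification are illustrative assumptions.

```python
import numpy as np

dt = 1.0
F = np.array([[1, dt],
              [0, 1]], dtype=float)   # state: [x, vx] along one image axis
B = np.array([-dt, 0.0])              # camera motion shifts apparent position

def predict_with_camera(x, cam_vx):
    """Predict the object's next image position, correcting for camera motion."""
    return F @ x + B * cam_vx

# An object at pixel 100 moving +2 px/frame, while the camera pans +5 px/frame:
x = np.array([100.0, 2.0])
x_pred = predict_with_camera(x, 5.0)  # apparent position: 100 + 2 - 5 = 97
```

Subtracting the camera's motion from the predicted position keeps the filter's innovation small when the scene shifts, which is what stabilizes the track.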