Remotely Operated Underwater Vehicle

About the Project


There has been increased interest in autonomous underwater vehicles (AUVs) in a number of fields, including marine biology, mapping of the ocean floor, and inspection of pipelines and other marine infrastructure. Building an AUV is, however, a considerably harder problem than building a remotely operated underwater vehicle (ROV), and requires a variety of sensors and smart software. The ultimate goal of this series of projects is to convert an ROV into an AUV.


The ROV is an underwater vehicle based on the BlueROV kit from Blue Robotics. It is equipped with six thrusters for movement in multiple degrees of freedom. Most of the computations on the ROV are made on a Raspberry Pi (RPi), while an HKPilot Mega handles the collection of sensor data. The ROV is also connected to an external computer via an Ethernet cable.

Several sensors collect data from the environment, which the sensor fusion then processes to estimate position, velocity, orientation and angular velocity. The sensor setup consists of three sonars, an inertial measurement unit (IMU), a magnetometer and a pressure sensor. There is also a camera used to track objects, and a newly added leakage sensor.



Sensor Fusion

  • Reintroduced an EKF for state estimation and derived a new motion model.
  • Sensor fusion developed in MATLAB Simulink and implemented on the RPi using code generation.
  • Analytical Jacobian matrices derived, making the computations 90 times faster than numerical differentiation.
  • Identified a model of the magnetic field disturbances induced by the ROV's motors.
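To illustrate the EKF and the use of analytical Jacobians, here is a minimal sketch. The state, motion model and noise values are simplified placeholders (a planar unicycle-style model with a position measurement), not the project's actual 6-DOF ROV model:

```python
import numpy as np

# Hypothetical simplified state: x = [px, py, heading, speed].

def f(x, dt):
    # Nonlinear motion model (constant heading and speed over one step).
    px, py, th, v = x
    return np.array([px + dt * v * np.cos(th),
                     py + dt * v * np.sin(th),
                     th, v])

def F_jac(x, dt):
    # Analytical Jacobian of f -- much cheaper than numerical differentiation.
    _, _, th, v = x
    return np.array([[1, 0, -dt * v * np.sin(th), dt * np.cos(th)],
                     [0, 1,  dt * v * np.cos(th), dt * np.sin(th)],
                     [0, 0, 1, 0],
                     [0, 0, 0, 1]])

# Linear measurement of position (e.g. fused from the sonars).
H = np.array([[1.0, 0, 0, 0],
              [0, 1.0, 0, 0]])

def ekf_step(x, P, z, dt, Q, R):
    # Predict with the nonlinear model, linearised via the analytical Jacobian.
    F = F_jac(x, dt)
    x_pred = f(x, dt)
    P_pred = F @ P @ F.T + Q
    # Update with the measurement z.
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

On the real system this loop runs on the RPi, with the model and Jacobians generated from Simulink rather than hand-written.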

Model and Control

  • Identification of the mass moment of inertia for the model.
  • LQ optimal control implemented.
  • Controller can operate in multiple degrees of freedom simultaneously.
  • Camera vision tested in the pool.
  • Gain matrices precalculated over a multidimensional state-space grid to reduce on-line computations.
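The precalculated-gain idea can be sketched as a simple table lookup. The grid, gains and scheduling variable below are placeholders (a single heading grid with dummy gain matrices), not the project's actual multidimensional grid:

```python
import numpy as np

# Offline: solve an LQ problem at each grid point and store the gain K.
# Online: pick the nearest grid point -- no Riccati solves on the RPi.
grid = np.linspace(-np.pi, np.pi, 9)                      # e.g. heading grid
gains = {i: np.eye(2) * (1.0 + 0.1 * i)                   # placeholder gains
         for i in range(len(grid))}

def control(x_err, heading):
    # Nearest-neighbour lookup of the precomputed LQ gain.
    i = int(np.argmin(np.abs(grid - heading)))
    K = gains[i]
    return -K @ x_err                                     # u = -K x
```

Interpolating between neighbouring grid points instead of taking the nearest one would give smoother control at little extra cost.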

Camera Vision

  • The vision module's speed improved by up to 20 times in some scenarios, by using a lower image resolution and by processing only a subsection of the image around the last detected object.
  • The thresholding parameters can now be updated in real time: the user selects the ball in the image and the parameters are determined automatically instead of being tuned manually. This has also made the thresholding more robust to varying lighting conditions.
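Both speedups can be sketched with plain NumPy (the real module presumably works on camera frames; the window size, the margin `k` and the helper names below are illustrative assumptions):

```python
import numpy as np

def roi(img, center, half=40):
    # Crop a window around the last detection so only a small
    # subsection of the image is processed each frame.
    r, c = center
    r0, r1 = max(r - half, 0), min(r + half, img.shape[0])
    c0, c1 = max(c - half, 0), min(c + half, img.shape[1])
    return img[r0:r1, c0:c1], (r0, c0)

def bounds_from_patch(patch, k=2.5):
    # Derive per-channel threshold bounds from pixels the user
    # selected as "ball", instead of tuning the bounds by hand.
    flat = patch.reshape(-1, patch.shape[-1])
    mu, sd = flat.mean(axis=0), flat.std(axis=0)
    return mu - k * sd, mu + k * sd

def mask(img, lo, hi):
    # Pixels whose every channel lies inside the learned bounds.
    return np.all((img >= lo) & (img <= hi), axis=-1)
```

The offsets returned by `roi` map detections in the window back to full-image coordinates, so the window can follow the object between frames.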

Project Group

Emil Frid
Project Manager
Alexander Smith
Alfred Fredriksson
Fredrik Nilsson
Joel Wilander
Joakim Wallin
Pontus Hållberg