# Project BEAVER: A Budget-Effective Aquatic Vehicle for Environmental Research

Abstract

Within the past few centuries, human population growth and increasing urbanisation have led to the deterioration of many freshwater and marine habitats. To conserve aquatic ecosystems, it is essential to better understand how these habitats respond to such pressures. The purpose of this project was to construct an Unmanned Untethered Vehicle (UUV) for measuring factors affecting water quality. The vehicle was built from cost-effective polyvinyl chloride (PVC) pipe and programmed to be controlled wirelessly via a graphical user interface. After construction and calibration, the robot was field-tested in local water bodies. On user command, it successfully captured pictures and video of the environment it explored, took water samples, and measured the pH, turbidity, and temperature of the water. The robot was field-tested at Blue Springs, Fore Lake, De Leon Springs, and the Tomoka Basin and has further potential applications in investigating how water quality affects species diversity in an ecosystem.

Introduction

Aquatic ecosystems are the most abundant ecosystems on Earth.[1] Wetlands, rivers, lakes, and coastal estuaries are all aquatic ecosystems—critical elements of Earth’s dynamic processes and essential to human economies and health. The stability of these delicate systems depends on our knowledge of human impacts on the environment, but our ability to explore and research these factors is currently limited.

Utilising human divers to collect water quality data and conduct species surveys in some aquatic environments is restricted by depth and time constraints and can be hazardous due to decompression sickness, nitrogen narcosis, environmental dangers, etc.[2] Using a robot to perform this job does not pose such safety risks. Remotely Operated Vehicles (ROVs) are tethered underwater mobile devices and are among the most effective tools for aquatic data collection due to their high manoeuvrability.[3] However, they are limited by the length of the tether, which can become tangled or caught in shallow, complex environments.[3] To counteract this limitation, the goal of this project was to build a cost-effective, untethered aquatic robot for collecting environmental data.

The robot, named Ridley as a reference to Kemp’s Ridley sea turtle, can explore and research aquatic ecosystems. It is controlled wirelessly via a graphical user interface and can record visual data of the environment it is exploring based on user command. Since wireless communication is severely limited underwater[4], Ridley can dive but must surface to receive signal commands, much like a sea turtle comes up for air.[5] The robot is capable of taking temperature, turbidity, and pH measurements and can take water samples for further lab analysis. This function is advantageous for analysing how the water quality in an ecosystem affects its residents.

Background Information

Unmanned submersible vehicles fall into a few categories. The simplest submersibles are those that are towed behind a ship and act as platforms for cameras and various sensors. The other submersible systems are Remotely Operated Vehicles (ROVs), Autonomous Underwater Vehicles (AUVs), and Unmanned Untethered Vehicles (UUVs).

An ROV is essentially a tethered underwater robot that allows the vehicle’s operator to remain in a comfortable environment while the ROV works in the hazardous environment below.[6] ROVs vary widely in price and functionality. Deep-sea research ROVs are accessible only to organisations with considerable financial resources, but the last decade has seen the advent of “low cost” ROVs that range anywhere between $400 and $40,000.[7] The most popular recreational drones, such as the Trident[8] and BlueROV2[9], cost only a few thousand dollars depending on the configuration.

AUVs are programmable robotic vehicles that, depending on their design, can drift, drive, or glide through the ocean without real-time control by human operators.[10] An AUV must contain its own power source and control itself while accomplishing a predefined task. Some AUVs communicate with operators periodically or continuously through satellite signals or underwater acoustic beacons to permit some level of control.[10] Others, such as gliders, require no human assistance while travelling. Underwater gliders achieve this by using small changes in buoyancy to profile vertically and move horizontally on wings.[11]

UUVs are vehicles that can operate underwater without a tether or human occupant.[10] The vehicle engineered in this research project falls under this broader category. While UUVs do require some level of human communication, they can navigate hazardous environments without the limitations of a tether. UUVs which employ real-time control are only found in high-level research applications because they require acoustic communication. Underwater acoustic channels are generally recognised as one of the most challenging communication media in use today. Acoustic propagation is best supported at low frequencies, and the bandwidth available for communication is minimal.[12] Because of these limitations, the robot in this project was designed to submerge for periods of time, and then resurface to receive further commands. Therefore, it cannot receive signals while underwater but does not require acoustic communication to function.

Engineering Goals

The robot must be able to be controlled wirelessly via a graphical user interface (GUI). The robot must also be able to take pictures and record video of the environment it is exploring based on user command. The robot should be able to measure water quality and collect water samples for further lab analysis. The total cost of the finished vehicle will be kept under $2000 so that it remains cost-effective.

Method

Mechanics

The body of the robot (Figure 1) was designed and built using 4″ PVC pipes, and the joints were secured and waterproofed with PVC primer and cement. A thruster is mounted on each side of the main compartment on a pipe that is detachable from the main body, and two more vertical thrusters are mounted to an open joint in each of the detachable side pipes. The inner chamber was waterproofed with rubber plugs in each of the four chamber openings. Each rubber plug has a central hole with a connector bolt through which the thruster and sensor cables are looped. The connector bolts were filled, and the cables were secured, with marine epoxy. Inside the main compartment, a circular sliding tube holds the electronics: the Arduino microprocessor, the stripboard, the electronic speed controllers (ESCs), and the battery. The water sampler, the latest addition to the robot, is held by a PVC pipe attached to the front of the robot. The servo that controls the sampler was fastened to one of the removable screw caps on top of the main compartment; the other removable screw cap holds the wireless antenna in place.

Electronics

All the electronics were attached to the sliding compartment (Figure 2) of the robot interior with Velcro, and the wires were secured with electrical tape and zip ties. The signal wires of the ESCs, sensors, and servo were connected to header pins soldered onto the stripboard. Jumper wires were also soldered onto the stripboard and routed to the respective signal pins on the Arduino microprocessor.
Each of the electrical components was powered by a Lithium Polymer (LiPo) battery fastened to the sliding tray. The robot was controlled using XBee wireless connectivity modules[13] with cloverleaf antennas attached. The XBee on the robot was soldered onto the bottom of the Arduino microprocessor, and the other XBee connected to the driver station laptop with a USB dongle.

Camera, Sensors and Water Sampler

One GoPro camera mount was attached to the back end of the robot. When the GoPro Session camera was placed in the mount, only the lens was in the water, which allowed the camera to be controlled by the GoPro app on a smartphone. The user could thus view the live video stream and take pictures wirelessly. Initially, a second camera was suspended from the robot and submerged in the water for better recording quality, but this became unnecessary once the diving functionality was added.

Ridley has three sensors, which measure pH, turbidity, and temperature. The sensors were wired through the rubber plugs and waterproofed with marine epoxy. When the respective function is called, the GUI displays the pH, turbidity, and temperature of the environment the robot is exploring.

In addition to measuring water quality, Ridley can take water samples. The water sampler (Figure 3) is a novel design composed of a syringe encapsulated in a PVC pipe. A rubber band attached to the end of the syringe puts tension on the plunger, but a small piece of metal placed in the PVC pipe blocks the plunger from being pulled out. The water sampler was fastened to the front of the robot, and the metal piece was attached to a servo via a string. When the servo rotates, the blockage is removed, and the syringe draws in water.

Code

The code to control the thrusters, sensors, and water sampler was written in Arduino. Much of the thruster code was adapted from open sources such as Blue Robotics and OpenROV.
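As an illustration, the core of this kind of thruster control can be sketched as a mapping from a signed speed level to an ESC pulse width. This is not the project’s actual code; it assumes the common hobby-ESC convention of 1100–1900 µs pulses centred on a 1500 µs neutral, and the function name and speed-level scale are hypothetical.

```cpp
#include <algorithm>

// Map a signed speed level in [-100, 100] to an ESC pulse width in
// microseconds. 1500 us is neutral (thruster stopped); 1100 us and
// 1900 us are full reverse and full forward on a typical
// bidirectional hobby ESC.
int speedLevelToPulseUs(int level) {
    level = std::max(-100, std::min(100, level)); // clamp out-of-range input
    return 1500 + level * 4;                      // 4 us per speed unit
}
```

On an Arduino, a value like this would typically be handed to the ESC with the Servo library’s writeMicroseconds(); the mapping is separated out here so it can be checked in isolation.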
Ridley was programmed to operate at varying levels of speed, test water quality, take water samples, and dive based on user input. Because wireless signals cannot travel through water, the robot cannot be controlled while submerged. The diving function drives the robot down for a set period of time and then stops the thrusters so that the robot automatically resurfaces due to its positive buoyancy. The code for the GUI was written in Java. It is a simple interface in which each number corresponds to a command; the user inputs numbers to control the robot’s sensors and actuators.

Field Testing

The robot was first tested multiple times in a swimming pool to calibrate the thrusters. Ridley was then field-tested in various aquatic environments: Blue Springs, Fore Lake, and De Leon Springs. The data collected were water temperature and video feed for species surveys. Videos from the GoPro cameras were downloaded to a computer and analysed. Fish and other aquatic organisms were identified to the lowest possible taxon, enumerated, and recorded each time they entered the field of view. Temperature and runtime were also recorded. Each data collection took place mid-afternoon, and the total run time for each location was approximately 45 minutes. Later, when the pH sensor, turbidity sensor, and water sampler were added, the robot’s water quality features were tested in the Tomoka River Basin (Figure 4).

Results

Table 1 displays the quantity and status of observed species from the video captured by Ridley at Blue Springs State Park. Striped mullet were by far the most frequently observed animal species, with 77 observations.

Location 1: Blue Springs
Runtime: 46 minutes
Mean Water Temperature: 23.15 °C

Table 1: Species list and survey results for Blue Springs State Park

Figures 5-7 are static video frames taken from the Blue Springs data collection video. Figure 5 depicts a small group of striped mullet.
Figure 6 depicts Florida spotted gar, and Figure 7 shows a group of West Indian manatees, including a mother and her calf.

Location 2: Fore Lake in Ocala National Forest

Table 2 displays the quantity and status of observed species from the video captured by Ridley at Fore Lake. Juvenile bluegill were the only animal species observed during this field test.

Runtime: 40 minutes
Mean Water Temperature: 22.42 °C

Table 2: Species list and survey results for Fore Lake

Location 3: De Leon Springs

Table 3 displays the quantity and status of observed species from the video captured by Ridley at De Leon Springs.

Runtime: 44 minutes
Mean Water Temperature: 22.25 °C

Table 3: Species list and survey results for De Leon Springs State Park

Figure 8 depicts a clump of hydrilla at De Leon Springs. Hydrilla is classified as an invasive species in Florida.

Location 4: Tomoka Basin in Sanchez Park

No species were observed at the Tomoka Basin. Table 4 displays the water quality measurements taken by Ridley at the Tomoka Basin, including turbidity, pH, and temperature.

Runtime: 48 minutes

Table 4: Tomoka Basin water quality measurements

*All turbidity and pH values were calculated in real time within the Arduino code according to the following conversions: turbidity = 100 – (voltage/vClear)(100)[14], where vClear is the voltage of the probe in distilled water; pH = (3.5)(voltage) + offset[15], where offset is the deviation in pH of the probe in a calibration solution with a pH of 7.

Discussion

Overall, the robot met the engineering goals and performed effectively in data collection. The vehicle was controlled wirelessly via a graphical user interface, and the live video stream allowed the user to take pictures of the environment the robot was exploring on command. The diving functionality was successful: Ridley was able to dive for a set period of time, and its positive buoyancy allowed it to resurface once the thrusters stopped moving.
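The turbidity and pH conversions quoted above can be written out as two small functions. This is a sketch rather than the deployed Arduino code; the function names are illustrative, while vClear and offset are the calibration constants already defined.

```cpp
// Turbidity relative to a distilled-water reference:
//   turbidity = 100 - (voltage / vClear) * 100
// where vClear is the probe voltage measured in distilled water.
double turbidityPercent(double voltage, double vClear) {
    return 100.0 - (voltage / vClear) * 100.0;
}

// Linear pH conversion: pH = 3.5 * voltage + offset, where offset is
// the probe's deviation from pH 7 in a calibration solution.
double phFromVoltage(double voltage, double offset) {
    return 3.5 * voltage + offset;
}
```

Written this way, the conversions can be sanity-checked against their two calibration points: distilled water should read 0% turbidity, and a perfectly calibrated probe at 2 V should read pH 7.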
The total cost of the robot was $1472, which is 26.4% under the $2000 budget and comparatively less expensive than ROVs of similar or lesser capabilities such as the BlueROV2. It was successfully field-tested at Blue Springs State Park, Fore Lake in Ocala National Forest, De Leon Springs State Park, and Sanchez Park.

Blue Springs was high in biodiversity (Table 1), and the crystal-clear water made species readily discernible. The first two attempts at data collection were unproductive due to battery malfunctions, difficulty with the live video stream, and leaking; these issues were addressed, and the third attempt was successful. The video playback showed various organisms such as striped mullet (Figure 5), Florida spotted gar (Figure 6), channel catfish, and manatees, including a mother and calf (Figure 7). Striped mullet were by far the most numerous species identified in the video playback.

At De Leon Springs, species were also easily identifiable because of the high visibility (Table 3). The only fish recorded on video was a largemouth bass, but there were innumerable amphipods visible within the vegetation. The video captured submerged aquatic plants such as naiads and hydrilla (Figure 8), an invasive species. The data collection at Fore Lake (Table 2) was also successful, though the area had a much lower biodiversity than De Leon Springs and Blue Springs. The only fish identified were young bluegills. While the video feed was clear at Blue Springs, De Leon Springs, and Fore Lake, the Tomoka Basin did not yield the same results due to low visibility conditions. However, the robot was able to measure water quality successfully (Table 4). The water sampler also worked as expected and collected approximately 35 ml of water. Most aquatic ecosystems, especially those with healthy, diverse, and productive fish and macroinvertebrate communities, have a water pH between approximately 6.5 and 8.5.[16] The average pH of 7.63 at the Tomoka Basin indicates that the water is within the ideal range for aquatic life. Temperature and turbidity vary according to location, time of day, and season, so long-term data would be necessary to assess the effect of these factors on water quality and species diversity.

Going forward, some modifications could improve the capabilities of the robot. The live preview and user control of the front GoPro camera were unreliable: the connection dropped whenever the robot travelled farther than approximately 10 metres from the user or dived underwater. A waterproof camera integrated into the electrical circuit and programmed into the GUI would increase the flexibility and reliability of the visual feed. Furthermore, the cameras do not function well in low-visibility environments, as was made especially evident in the murky waters of the Tomoka Basin. Visibility could be improved by adding a waterproof light-emitting diode (LED) light near the camera. The project could be developed further by expanding the robot’s autonomous capabilities: currently the vehicle dives for a few seconds and automatically resurfaces, but it could be programmed to measure factors affecting water quality and take water samples at specific depths. There is also potential for the addition of other water quality sensors, such as dissolved oxygen, oxidation-reduction potential, and salinity.

Conclusion

While it certainly had limitations, the robot met the budget requirement and was successfully field-tested at Blue Springs, Fore Lake, De Leon Springs, and the Tomoka Basin. There is potential for improvement regarding lighting, live video stream, and thruster efficiency, which will make the vehicle more suitable for data collection in harsher environments such as marine reefs. With its current limitations, the robot performs most effectively in environments with high visibility and high biodiversity such as freshwater springs. It can efficiently measure water quality, and the water sampler can also be utilised if further lab analysis is necessary. The water quality measurements in combination with the video data have promising applications in examining how water quality affects the species diversity and distribution in an area.

References

1. “Different Types of Ecosystems”, Ecosystem, accessed December 31, 2018, http://www.ecosystem.org/types-of-ecosystems.
2. Daniel McQueen, G. Kent, and A. Murrison, “Self-reported long-term effects of diving and decompression illness in recreational scuba divers”, British Journal of Sports Medicine 28, 2 (July 1994): 101-104, http://doi.org/10.1136/bjsm.28.2.101.
3. “Advantages and Disadvantages of Remote Sampling”, UK Marine SACs Project, accessed December 31, 2018, http://www.ukmarinesac.org.uk/communities/seapens/sp6_1_1.htm.
4. Xi Zhang, Jun-Hong Cui, Santanu Das, Mario Gerla and Mandar Chitre, “Underwater wireless communications and networks: theory and application: Part 1 [Guest Editorial]”, IEEE Communications Magazine 53, 11 (November 2015): 40-41, http://doi.org/10.1109/MCOM.2015.7321969.
5. “How Long Can Sea Turtles Hold Their Breath?”, Olive Ridley Project, accessed December 31, 2018, https://oliveridleyproject.org/ufaqs/how-long-can-sea-turtles-hold-their-breath.
6. “Remotely Operated Vehicles”, Monterey Bay Aquarium Research Institute, accessed December 31, 2018, https://www.mbari.org/at-sea/vehicles/remotely-operated-vehicles/.
7. Jonathan Teague, Michael J. Allen, and Tom B. Scott, “The potential of low-cost ROV for use in deep-sea mineral, ore prospecting and monitoring”, Ocean Engineering 147, (January 1, 2018): 333-339, http://doi.org/10.1016/j.oceaneng.2017.10.046.
8. “Trident Underwater Drone”, OpenROV Underwater Drones, accessed December 22, 2017, https://www.openrov.com/products/trident/.
9. “BlueROV2”, Blue Robotics, accessed December 22, 2017, https://www.bluerobotics.com/store/rov/bluerov2/.
10. D. Richard Blidberg, “The development of autonomous underwater vehicles (AUV); a brief summary”, IEEE ICRA, 2001, http://wpressutexas.net/cs378h/images/d/de/ICRA_01paper.pdf.
11. Daniel L. Rudnick, Russ E. Davis, Charles C. Eriksen, David M. Fratantoni and Mary Jane Perry, “Underwater Gliders for Ocean Research”, Marine Technology Society Journal 38, 2 (Summer 2004): 73-84, https://doi.org/10.4031/002533204787522703.
12. Milica Stojanovic and James Preisig, “Underwater acoustic communication channels: Propagation models and statistical characterization”, IEEE Communications Magazine 47, 1 (January 2009): 84-89, https://doi.org/10.1109/mcom.2009.4752682.
13. “Digi XBee Ecosystem”, Digi International Inc., accessed December 31, 2018, https://www.digi.com/xbee.
14. “Turbidity Sensor SKU: SEN0189”, DFRobot, last modified May 25, 2017, https://www.dfrobot.com/wiki/index.php/Turbidity_sensor_SKU:_SEN0189.