Human and Robotic Fish Interaction Controlled Using Hand Gesture Image Processing

This paper concerns the control of robotic fish movement in an aquarium via human hand gestures detected by image sensors attached to the aquarium. In this study, sensors actively mediate the interaction between humans and robotic fish. Image and radio-frequency sensors are used to identify the position and color of the robotic fish. Recently, we have studied human interactive control based on hand gesture recognition. Image sensors send the input signals of hand gestures, obtained from real-time video images processed using tracking control algorithms such as the color mark, stop zone, and lead-lag tracking algorithms, to the robotic fish. The movement of the robotic fish is controlled with both hands: the left hand selects the robotic fish to be controlled and the right hand controls its movement. Hand gesture recognition consists of hand feature segmentation and gesture recognition from the hand features. Our results show that interactive human control using hand gestures successfully controls the movement of robotic fish.


Introduction
In the fourth industrial revolution, computers have affected every aspect of life and various human activities, and developing human-computer interaction (HCI) to the next level will favorably impact those activities. Many researchers have developed robotic fishes. For example, Robot Tuna I was developed by the Massachusetts Institute of Technology (MIT) to improve autonomous underwater vehicles. (1) Robot Pike of MIT attracts the interest of researchers because of its variable acceleration. (2) The manta ray robot of Evologics, Germany, can be applied to environmental monitoring, deep-sea exploration, sensitive ecological research, the offshore industry, and marine security. (3) Robotic koi, used to control the oxygen level in water, are advantageous in maintaining the health of fish. (4) Researchers at the University of Essex developed robotic fish that can swim autonomously, similarly to real fish. (5) PPF-04 was developed by Koichi Hirata of the National Maritime Research Institute (NMRI) to improve the movements and performance of marine vehicles such as boats, ships, submarines, and underwater craft. (6) This robotic fish can swim either by body or fin movement or by median and paired fin (MPF) propulsion. (7) Nowadays, robotic fishes are becoming increasingly popular as a tool for realizing automatic operation. (8) The control of robotic fish with hand gestures can be realized by utilizing image sensors to analyze video frames: the background in the video frames is removed; contours, convex points, defect points, and a convex hull are extracted from the frames; and the convex hull is used to identify hand gestures. Gestures are an easy way to interact with computers. The human hand is a highly deformable articulated object with many degrees of freedom, and its different postures and movements can express information for various purposes.
In our research, we also focused on the design of the robotic fish for a subaqueous system based on the structure of the domi (Pagrus major, red seabream), which lives in the ocean off South Korea. Robotic domi fishes were exhibited at the 2012 International YEOSU Expo, where their performance was excellent. These robotic fishes were made by Science Robotic Company (SRC) and Artificial Intelligence Robot (AIRO) in South Korea.
Chen et al. (9) proposed a recognition system with a hidden Markov model (HMM) training mechanism for recognizing hand gestures against a stationary background. However, most previous studies have been limited to hand gesture recognition alone, without implementing it in a control system. To our knowledge, our method of controlling a robotic fish with hand gestures is the first of its kind.
In this work, we developed a hand gesture recognition system for controlling the movement of robotic fish via hand gestures. Human-robot interaction (HRI) is an emerging field of study dedicated to understanding, designing, and evaluating robotic systems for use by humans, and it requires communication between robots and humans. We adopted two image processing technologies, implemented with the OpenCV (Open Source Computer Vision) library, for identifying the color of the robotic fish and human gestures. A user can control any robotic fish in the aquarium using hand gestures: the left hand selects a specific fish and the right hand controls its movement. Two image sensors are used, one to capture human hand gestures and the other to capture the positions of the robotic fish in the aquarium. The user can select the tracking control algorithm, namely, the color mark, stop zone, or lead-lag tracking control algorithm.
In this study, sensors are the major tools of active communication between a human and a robot: image sensors detect the movements of the robotic fish and the human hand gestures, and the radio-frequency (RF) module receives signals from the human side and transmits data to the robot. Users control the movement of the robotic fish with both hands, where the left hand selects a specific robotic fish and the right hand controls its movement. Real-time video images of hand gestures are processed into data that are transmitted to the robotic fish. Our results show that our interactive human control system, which combines hand gesture processing with tracking control algorithms and image sensors, successfully controls the movement of robotic fish.

Design of Human-Robotic Fish Interactive Control System
Human gestures include movements of the body, face, and hands; hand and arm gestures alone convey roughly 90% of gestural communication. (10) Communication and interaction are broadly divided into two general categories: remote and proximate. Our system first detects and tracks the human hand and fingertips in a video sequence. Then, the posture of the hand and the positions of the fingers are computed, and the Euclidean distances between fingertip positions in consecutive frames are used to determine hand movement. This processing relies on hand segmentation, contour extraction, a convex hull, and convexity defects.
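As a simple illustration of the fingertip distance computation, the following Python sketch pairs hypothetical fingertip coordinates from two consecutive frames and returns their Euclidean displacements; the coordinates and the pairing-by-index scheme are assumptions for illustration, not the system's actual tracking code.

```python
import math

# Illustrative sketch only: given fingertip coordinates detected in two
# consecutive frames, the Euclidean distance per fingertip indicates how far
# the hand moved between frames. The coordinates below are hypothetical.
def fingertip_motion(prev_tips, curr_tips):
    """Pair fingertips by index and return per-finger displacement in pixels."""
    return [math.hypot(cx - px, cy - py)
            for (px, py), (cx, cy) in zip(prev_tips, curr_tips)]

prev_tips = [(120, 80), (160, 60)]   # frame t-1
curr_tips = [(128, 86), (170, 64)]   # frame t
print(fingertip_motion(prev_tips, curr_tips))  # [10.0, 10.77...]
```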
The proposed human-robotic fish interactive control system shown in Fig. 1 consists of the robotic fish, image sensors (cameras), aquarium, RF transmitter (RF-TX), RF receiver (RF-RX), and human hand gestures. In this work, sensors play the important role of receiving signals of human hand movement and transmitting those signals to the robotic fish so that they move in accordance with the human hand gesture. Figure 1(a) shows the human-robotic fish interaction, which was explained in our previous paper entitled "Human Interaction Control of the Robotic Fish Using Detecting Object Algorithms". (11) That paper explained how human hand gestures are used to control the robotic fish in the aquarium; the main concept of those algorithms was that all the robotic fishes approach a human standing in front of the aquarium. Figure 1(b) illustrates the concept of the proposed human-robotic fish interactive control via human hand gestures. Two main tasks are investigated in the current work: the control of fish movement and the selection of different algorithms. Case (1): Users control the movement of the robotic fish by gesturing with the left and right hands, where the left hand selects a specific fish and the right hand controls the movement of the fish, that is, left, right, up, down, or forward movement, in the aquarium. Case (2): Through specific hand-finger gestures, users select among the color mark, stop zone, and lead-lag tracking control algorithms. For example, to select the color mark tracking control algorithm, the user raises one finger of either the left or right hand; here, the right hand is used to control the robotic fish in the aquarium. To select the stop zone tracking control algorithm, the user raises two fingers, and to select the lead-lag tracking control algorithm, three fingers.
Each robotic fish body is designed with an optimized inner area to mimic the structure of a real fish. The dynamic forces on the robot are determined from its instantaneous swimming movement in the aquarium. The forces acting on the robotic fish are the thrust force in the forward direction along the x-axis, water resistance (drag) in the reverse direction along the x-axis, and gravity in the vertical direction along the z-axis, as shown in Fig. 2(a).
The motion of the robotic fish in the aquatic environment, including the buoyancy force acting on it, is expressed in the fixed coordinate system [x̂, ŷ, ẑ], and the joint motion is modeled as θ_i(t) = a_i sin(2πft + p_i), where θ_i is the joint angle, a_i is the amplitude, f is the frequency, t is time, and p_i is the phase difference between heaving and pitching movements. The swimming frequency of the robotic fish in the water can then be determined using the Lighthill analysis. To alter the movement of the robotic fish, for example, for up and down swimming, the robot is designed to shift mass back and forth relative to its center of gravity, similar to the gravity principle of biomimetic fish. (12) The proposed system is controlled by human hand-finger gestures.
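To make the reconstructed joint-angle equation concrete, the following Python sketch evaluates θ_i(t) = a_i sin(2πft + p_i) for two hypothetical tail joints; the amplitude, frequency, and phase values are illustrative, not the robot's tuned parameters.

```python
import math

# A worked sketch of the joint-angle model theta_i(t) = a_i*sin(2*pi*f*t + p_i);
# the amplitudes, frequency, and phase difference below are illustrative
# values, not the tuned parameters of the robotic fish.
def joint_angle(a_i, f, t, p_i):
    return a_i * math.sin(2 * math.pi * f * t + p_i)

# Two tail joints driven at 1.5 Hz with a pi/4 phase lag between them.
for t in (0.0, 0.25, 0.5):
    theta1 = joint_angle(a_i=0.4, f=1.5, t=t, p_i=0.0)
    theta2 = joint_angle(a_i=0.6, f=1.5, t=t, p_i=math.pi / 4)
    print(f"t={t:.2f} s  theta1={theta1:+.3f} rad  theta2={theta2:+.3f} rad")
```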

Hand segmentation
The detection of the hand gesture in this project is based on the segmentation process. Segmentation is the process of grouping points that belong to the same object into segments; the main idea is to extract a set of points from an image. (12) The steps of the segmentation process are explained below, and a code sketch follows the list.
(a) Capture of the target object and recording of frames: The initial step is to capture the target object and record frames to detect its background. To accomplish this, the background should be white to avoid noise and achieve optimal extraction of the target object. (13)
(b) Extraction of the region of interest (ROI): The following classical method is used to extract the target object from the captured video. The relevant frames are converted to grayscale, and Gaussian blur is applied to remove noise. Background subtraction uses a reference frame from which each new frame is subtracted. The result is thresholded to a binary (black and white) image that highlights the regions of nonstatic objects. (14)
(c) Contour extraction: A contour map comprises contour lines that show the steepness of slopes and the valleys and hills. The slope or gradient function is always perpendicular to the contour lines; when the lines are close together, the magnitude of the gradient is large. Contours are straight lines or curves representing the intersection of one or more horizontal planes with a real or hypothetical surface. (15)
(d) Drawing of contours: A contour line is drawn around the white blob of the hand obtained by thresholding the input image. More than one blob may form in the image owing to noise in the background, so contour lines are also drawn on the smaller white blobs. Blobs due to noise are assumed to be small; thus, the largest contour is deemed to be that of the hand and is used for further processing. The contour is stored as a set of points whose convex hull is C(S) = {Σ_j λ_j p_j | λ_j ≥ 0, Σ_j λ_j = 1}, where C(S) is the set of all convex combinations of the points, p_j denotes each point, and λ_j is the coefficient of each point (obtained from its position coordinates).
(e) Convexity defects: The convexity and defect points together define the convex hull that joins the points used to draw the green outline of the hand. When the convex hull is drawn around the contour of the hand, the set of contour points fits within the hull. The minimum number of points is used to form the hull while maintaining the convexity property. From the hull and its convexity defects, the fingers can be distinguished and used to track the hand. When the hand moves, the robotic fish follows the hand and fingers.
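The following Python sketch outlines steps (b)-(e) with OpenCV. It assumes OpenCV 4 (where findContours returns two values), placeholder image files standing in for the reference background and a live frame, and an assumed defect-depth threshold; it illustrates the pipeline rather than reproducing the system's actual code.

```python
import cv2

# A minimal sketch of steps (b)-(e); file names are placeholders for the
# reference background frame and a frame grabbed from the camera.
background = cv2.imread("background.jpg")
frame = cv2.imread("frame.jpg")

# (b) Grayscale conversion, Gaussian blur, background subtraction, threshold.
gray_bg = cv2.GaussianBlur(cv2.cvtColor(background, cv2.COLOR_BGR2GRAY), (7, 7), 0)
gray_fr = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (7, 7), 0)
diff = cv2.absdiff(gray_bg, gray_fr)
_, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

# (c)-(d) Extract contours and keep the largest blob, assumed to be the hand.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    hand = max(contours, key=cv2.contourArea)

    # (e) Convex hull (as indices) and convexity defects; deep defect points
    # lie between extended fingers, so counting them estimates raised fingers.
    hull_idx = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull_idx)

    deep = 0
    if defects is not None:
        for start, end, far, depth in defects[:, 0]:
            if depth > 10000:  # depth is in 1/256-pixel units; threshold assumed
                deep += 1
    print("estimated raised fingers:", deep + 1 if deep else 0)
```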

Human interactive control using hand gesture recognition
In general, the color of an object depends on the characteristics of the perceiving eye and brain; physically, the color of an object is perceived from the light reflected from its surface. In the aquarium, differently colored robotic fish mimic actual fish. The setup for human interactive control with hand gestures consists of a camera, the OpenCV library, and color detection and position data. Here, we used the OpenCV library along with the Python programming language. When the camera is interfaced with the Raspberry Pi through the OpenCV library, it can identify the color of a robotic fish on the basis of the hue, saturation, and value (HSV) color space and obtain the position data of the robotic fish in the aquarium.
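As a minimal illustration of this HSV-based detection, the Python sketch below thresholds one camera frame in HSV space and takes the centroid of the resulting mask as the fish position; the camera index and HSV bounds are assumed values, not the thresholds calibrated in the experiment.

```python
import cv2
import numpy as np

# A minimal sketch of HSV-based color detection, assuming a camera attached
# to the Raspberry Pi; the bounds below are illustrative values for a red
# fish marker.
cap = cv2.VideoCapture(0)
ret, frame = cap.read()
cap.release()

if ret:
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 120, 70])
    upper = np.array([10, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)

    # The centroid of the mask, via image moments, gives the fish's (x, y)
    # position in pixel coordinates.
    m = cv2.moments(mask)
    if m["m00"] > 0:
        x = int(m["m10"] / m["m00"])
        y = int(m["m01"] / m["m00"])
        print("detected fish position:", (x, y))
```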
The properties of the object detection algorithms (16)(17)(18)(19)(20)(21) are shown in Table 1. Among them, the color segment algorithm is best suited to noise reduction and object detection in the current system. With our proposed system, we are approaching a new application area of human interactive control. Figure 3 shows the hand gesture recognition scheme, which is based on the number of fingers opened or closed. The flowchart of human interactive control is given in Fig. 4. The working of the hand gestures and the execution of the algorithms are explained in detail in the subsequent sections.
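A minimal sketch of the gesture-to-algorithm switching in Fig. 4 follows; the handler names and the dictionary dispatch are placeholders assumed for illustration, not the system's actual routines.

```python
# Hypothetical dispatch: the number of raised fingers selects the tracking
# control algorithm, mirroring the flowchart in Fig. 4.
def run_color_mark_tracking():
    print("color mark tracking active")

def run_stop_zone_tracking():
    print("stop zone tracking active")

def run_lead_lag_tracking():
    print("lead-lag tracking active")

ALGORITHMS = {
    1: run_color_mark_tracking,  # one finger raised
    2: run_stop_zone_tracking,   # two fingers raised
    3: run_lead_lag_tracking,    # three fingers raised
}

def dispatch(finger_count):
    handler = ALGORITHMS.get(finger_count)
    if handler is not None:
        handler()

dispatch(1)  # switches to color mark tracking
```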
All robotic fishes swim in the aquarium. When a color mark is placed, the robotic fish within the color mark zone stops. When the user deactivates the color mark, that robotic fish swims together with the other robotic fishes as before. The robotic fishes are controlled via signals from the RF modem. (22)(23)(24)(25)(26)(27)
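The RF command path could be sketched as follows, assuming a pyserial-driven RF-TX module on the Raspberry Pi; the port, baud rate, and one-byte packet framing are assumptions, since the modem protocol is not specified here.

```python
import serial  # pyserial, assumed to drive the RF-TX module over UART

# Hypothetical sketch of sending a movement command through the RF modem.
COMMANDS = {"left": b"L", "right": b"R", "up": b"U", "down": b"D", "forward": b"F"}

def send_command(fish_id, direction):
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as rf_tx:
        # Frame: start byte, fish identifier, direction code (all assumed).
        rf_tx.write(b"\x02" + bytes([fish_id]) + COMMANDS[direction])

send_command(fish_id=1, direction="forward")
```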

Proposed experimental system of HRI
The proposed experimental system consists of an aquarium, robotic fishes, a Raspberry Pi, cameras, an RF transmitter (RF-TX), the object detection algorithm (color segment algorithm), and the human interactive control algorithm for hand-finger gestures, as shown in Fig. 5.

Control of robotic fish using hand-gesture recognition with RF modem
The results of robotic fish control in the aquarium are shown in Figs. 6-9. When the user raises one finger, the algorithm switches automatically to color mark tracking control. The results show that the green, blue, and yellow robotic fishes all stop at the color mark, whose position coordinate is (361, 401).

Results of human interactive hand control with hand gesture recognition
The results shown in Fig. 9 indicate that the robotic fishes stop at the color mark and resume autonomous swimming when the mark is removed. The results of the color mark tracking analysis are shown in Fig. 7: color marks were placed at two positions in the aquarium, and when the user removed them, the robotic fishes resumed autonomous swimming.
When the user raised two fingers, the algorithm switched automatically to the stop zone tracking control algorithm, and all the robotic fishes stopped near the position (501, 242) in the aquarium; this target position was set in the algorithm. The blue, red, and yellow robotic fishes had position coordinates of (561, 294), (545, 317), and (526, 373), respectively. When the robotic fishes leave this zone, they resume autonomous swimming.
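The stop zone rule can be illustrated with a simple point-in-rectangle test; the zone extent around (501, 242) is an assumed size, while the fish positions are those reported above.

```python
# Illustrative check of the stop zone rule: a fish stops while its detected
# position lies inside the zone and resumes swimming once it crosses the
# boundary. The zone rectangle around (501, 242) is an assumed size.
STOP_ZONE = (441, 182, 601, 402)  # (x_min, y_min, x_max, y_max), assumed

def in_stop_zone(pos):
    x, y = pos
    x_min, y_min, x_max, y_max = STOP_ZONE
    return x_min <= x <= x_max and y_min <= y <= y_max

# Positions reported in the experiment for the blue, red, and yellow fishes.
for name, pos in [("blue", (561, 294)), ("red", (545, 317)), ("yellow", (526, 373))]:
    print(name, "stop" if in_stop_zone(pos) else "swim")
```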
The stop zone tracking analysis is shown in Fig. 9. One stop zone is designated in the aquarium; when the robotic fishes reach this stop zone, they stop and hold their position until they cross the stop zone boundary, after which they resume swimming autonomously. When the user raised three fingers, the algorithm switched automatically to the lead-lag tracking control algorithm (Fig. 8), and two robotic fishes in the aquarium stopped swimming. This algorithm is based on the distance between two robotic fishes; for instance, the lead red robotic fish is at position (518, 378) and the lagging yellow robotic fish is at position (549, 251). If the distance between the two robotic fishes is greater than 150 mm, the lead robotic fish waits for the lagging robotic fish to reach it, after which they swim together. This process continues until the algorithm is stopped.
The results of the lead-lag tracking control analysis of two robotic fishes are shown in Fig. 9. Robotic fish 1 is the red robotic fish and robotic fish 2 is the yellow robotic fish. The position of robotic fish 1, (x₁, y₁), is (518, 378) and that of robotic fish 2, (x₂, y₂), is (549, 251), as shown in Fig. 9(a). Using these position coordinates, we determined the distance between the two robotic fishes, d = √((x₂ − x₁)² + (y₂ − y₁)²), and analyzed the performance of the lead-lag tracking distance control. The performance test of this algorithm yielded satisfactory results.
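Under an assumed pixel-to-millimeter calibration, the lead-lag rule reduces to a distance threshold test, sketched below with the positions from Fig. 9(a); the 150 mm threshold comes from the text, while the calibration factor does not.

```python
import math

# Sketch of the lead-lag distance rule under an assumed calibration.
LEAD_LAG_THRESHOLD_MM = 150
MM_PER_PIXEL = 1.0  # assumed pixel-to-millimeter calibration factor

def lead_lag_command(lead_pos, lag_pos):
    """Return 'wait' for the lead fish if the lag fish is too far behind."""
    dx = lead_pos[0] - lag_pos[0]
    dy = lead_pos[1] - lag_pos[1]
    distance_mm = math.hypot(dx, dy) * MM_PER_PIXEL
    return "wait" if distance_mm > LEAD_LAG_THRESHOLD_MM else "swim"

# Positions from Fig. 9(a): red lead fish (518, 378), yellow lag fish (549, 251).
print(lead_lag_command((518, 378), (549, 251)))  # distance ≈ 130.7 -> 'swim'
```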

Conclusions
We presented a new method of controlling robotic fish using hand gestures. This method is based on data obtained by sensors; hence, sensors are an important component of the system. Image sensors are used to control the movement of the robotic fish with two hands, and the RF module is used for wireless communication. In the experiment, the user employed hand gestures to select the tracking control algorithm from among the color mark, stop zone, and lead-lag tracking control algorithms. Hand segmentation and gesture recognition from the hand features were performed to generate control signals. In this work, a Raspberry Pi was used instead of a personal computer. The performance tests of these algorithms yielded satisfactory results. In the future, research to realize human interactive control of 3D hologram fish will be carried out.