Robot learning using DDQN and NEAT
Just for fun, I wanted to build a robot that learns by experience. I designed and built a small robot that learns to control a laser beam using Deep Reinforcement Learning, Neuroevolution, and Computer Vision. The 2-DOF robot learned to point its laser beam at a target located at the center of two marks. It received visual input from a smartphone camera and identified the positions of the beam and the marks in real time using computer vision. The high-level processing runs on a computer, while an Arduino acts as the robot's low-level controller. For communication between these devices, I used my PyDuino Bridge library, which is freely available to the community.
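The exact vision pipeline is not detailed above; as a minimal sketch of one common approach to locating a bright laser dot in a camera frame (thresholding the brightest pixels and taking their centroid), assuming a grayscale NumPy frame and a hypothetical `find_beam` helper:

```python
import numpy as np

def find_beam(frame, thresh=250):
    """Return the (row, col) centroid of pixels brighter than `thresh`,
    or None if no beam-like spot is visible in the frame."""
    ys, xs = np.nonzero(frame >= thresh)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic 100x100 frame with a 3x3 bright spot centered at (40, 60)
frame = np.zeros((100, 100), dtype=np.uint8)
frame[39:42, 59:62] = 255
print(find_beam(frame))  # (40.0, 60.0)
```

The same centroid idea extends to the color marks by first thresholding in a color space such as HSV instead of raw intensity.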
The algorithms tested were Double Deep Q-Learning (DDQN) and NeuroEvolution of Augmenting Topologies (NEAT). Better performance was obtained with DDQN, whose results are shown below.
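The key idea that distinguishes DDQN from plain DQN is that the online network selects the next action while the target network evaluates it, which reduces Q-value overestimation. A minimal sketch of that target computation (array shapes and names are illustrative, not the project's actual code):

```python
import numpy as np

def ddqn_targets(q_online_next, q_target_next, rewards, dones, gamma=0.99):
    """Double DQN target: the online net picks the greedy next action,
    the target net supplies its value estimate for that action."""
    best_actions = np.argmax(q_online_next, axis=1)          # action selection
    next_q = q_target_next[np.arange(len(best_actions)), best_actions]  # evaluation
    return rewards + gamma * next_q * (1.0 - dones)

# Tiny batch of 2 transitions with 2 discrete actions
q_on = np.array([[1.0, 2.0], [3.0, 0.0]])
q_tg = np.array([[0.5, 0.8], [0.2, 0.9]])
targets = ddqn_targets(q_on, q_tg, rewards=np.array([1.0, 0.0]),
                       dones=np.array([0.0, 1.0]))
print(targets)  # [1.792 0.   ]
```

The second transition is terminal, so its target is just the reward; the first mixes the reward with the target net's value for the online net's chosen action.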
Library for transparent bi-directional communication between Python and Arduino. Available in the official Arduino Library Manager and on the Python Package Index (installable with the pip install pyduinobridge command).
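I cannot reproduce the library's exact API here, but as an illustration of the kind of work such a bridge performs, here is a hypothetical encoder/decoder for the delimited text frames a Python host might exchange with an Arduino over serial (the `encode_msg`/`decode_msg` names and frame layout are assumptions, not PyDuino Bridge's actual interface):

```python
def encode_msg(header, values, sep=','):
    """Pack a header string and numeric values into one
    newline-terminated frame, e.g. 'servo,90,45.5\n'."""
    return header + sep + sep.join(str(v) for v in values) + '\n'

def decode_msg(line, sep=','):
    """Split a received frame back into its header and numeric payload."""
    parts = line.strip().split(sep)
    return parts[0], [float(p) for p in parts[1:]]

# Round trip: what Python sends is what the parser recovers
header, vals = decode_msg(encode_msg('servo', [90, 45.5]))
print(header, vals)  # servo [90.0, 45.5]
```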
Keras implementation of the ARP units. To allow high-level use of these perceptrons, the ARP library is available on my GitHub repository.
Intelligent spider robot for detecting antipersonnel metallic mines in uneven terrain
For my bachelor thesis project, I designed a spider robot that can move through irregular terrain while carrying a mine sensor. The design included the robot's mechanical structure, electronics, and control. For joint control, I proposed an algorithm that makes the robot learn how to walk, using genetic algorithms and ensemble learning. To test the algorithm, I built a small robot with the same geometry and DOF as the one designed to carry the mine sensor.
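The thesis's actual learning algorithm is not spelled out here; as a minimal sketch of the general genetic-algorithm loop used for gait learning (genome = a vector of joint-trajectory parameters, fitness = how well the resulting gait performs; the toy fitness below is an assumption for illustration):

```python
import random

def evolve_gait(fitness, n_params=8, pop_size=20, gens=40, seed=0):
    """Tiny genetic algorithm: keep the fitter half of the population
    each generation and refill it with mutated copies of the survivors."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)          # selection
        survivors = pop[: pop_size // 2]
        children = [[g + rng.gauss(0, 0.1) for g in parent]  # mutation
                    for parent in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Toy fitness: reward gaits close to an (assumed) ideal parameter vector;
# on the real robot this would be replaced by measured walking performance.
ideal = [0.5] * 8
fit = lambda g: -sum((a - b) ** 2 for a, b in zip(g, ideal))
best = evolve_gait(fit)
```

On hardware, the fitness evaluation is the expensive step (each genome must be executed as a walk), which is why small populations and simple mutation-based GAs like this sketch are common choices.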