Robot learning using DDQN and NEAT
Just for fun, I wanted to build a robot that learns by experience. I designed and built a small robot that learns to control a laser beam using Deep Reinforcement Learning, Neuroevolution, and Computer Vision. The 2-DOF robot learned to point its laser beam at a target located at the center of two marks. It received visual input from a smartphone camera and identified the positions of the beam and the marks in real time using computer vision. The high-level processing runs on a computer, while an Arduino acts as the robot's low-level controller. For communication between these devices, I used my PyDuino Bridge library, which is freely available to the community.
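The vision step boils down to locating bright or marked spots in each camera frame. As a simplified sketch (not the project's actual pipeline, which I have not published here), the laser dot can be treated as the brightest pixel in a grayscale frame; the function name and thresholds are my own illustrative choices:

```python
import numpy as np

def find_laser_spot(frame):
    """Return the (x, y) pixel of the brightest point in a grayscale frame.

    A simplified stand-in for the computer-vision step that tracks the
    beam; a real pipeline would first blur and threshold the image to
    reject glare and the target marks.
    """
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    return int(x), int(y)

# Synthetic 30x40 frame with one bright pixel standing in for the dot.
frame = np.zeros((30, 40))
frame[10, 25] = 255
print(find_laser_spot(frame))  # → (25, 10)
```

In practice the same idea extends to the two marks by searching within color-thresholded masks instead of raw intensity.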
The algorithms tested were Double Deep Q-Network (DDQN) and NeuroEvolution of Augmenting Topologies (NEAT). DDQN achieved the better performance; its results are shown below.
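The core idea that distinguishes DDQN from plain DQN is how the bootstrap target is built: the online network selects the greedy next action and the target network evaluates it, which reduces over-estimation of Q-values. A minimal sketch of that target computation, using NumPy arrays as stand-ins for the two networks' outputs (the function name and batch layout are my own, not from this project's code):

```python
import numpy as np

def ddqn_targets(rewards, dones, q_online_next, q_target_next, gamma=0.99):
    """Double DQN bootstrap targets for a batch of transitions.

    q_online_next / q_target_next: (batch, n_actions) Q-value estimates
    for the next states from the online and target networks.
    """
    best_actions = np.argmax(q_online_next, axis=1)   # online net selects
    batch = np.arange(len(best_actions))
    q_eval = q_target_next[batch, best_actions]       # target net evaluates
    return rewards + gamma * q_eval * (1.0 - dones)   # no bootstrap at terminals

rewards = np.array([1.0, 0.0])
dones = np.array([0.0, 1.0])  # second transition ends the episode
q_online_next = np.array([[1.0, 2.0], [3.0, 0.0]])
q_target_next = np.array([[0.5, 0.7], [0.9, 0.1]])
print(ddqn_targets(rewards, dones, q_online_next, q_target_next))
# → [1.693 0.   ]
```

These targets are then regressed against the online network's Q-values for the actions actually taken, e.g. with a Keras `model.fit` call.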
Library for transparent, bi-directional communication between Python and Arduino. Available through the official Arduino Library Manager and on the Python Package Index (install with the pip install pyduinobridge command).
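Bridges like this typically frame data as simple delimited lines sent over the serial port. The sketch below is not PyDuino Bridge's actual API; it is a generic illustration of that kind of line-based framing, with function names and message format that are my own:

```python
def encode_message(header, values):
    """Frame a header plus numeric payload as one newline-terminated line,
    a simple protocol often used over Arduino serial links."""
    payload = ",".join(str(v) for v in values)
    return f"<{header}:{payload}>\n"

def decode_message(line):
    """Inverse of encode_message: recover the header and the payload floats."""
    body = line.strip().lstrip("<").rstrip(">")
    header, payload = body.split(":")
    return header, [float(v) for v in payload.split(",")]

msg = encode_message("servo", [90, 45])
print(repr(msg))              # → '<servo:90,45>\n'
print(decode_message(msg))    # → ('servo', [90.0, 45.0])
```

On the Arduino side, a matching parser reads until the newline and splits on the same delimiters, so both ends agree on one tiny protocol.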
Keras implementation of the ARP units. To allow high-level use of these perceptrons, the ARP library is available on my GitHub repository.
Intelligent spider robot for detecting anti-personnel metallic landmines in uneven terrain
For my bachelor's thesis project, I designed a spider robot that can move across irregular terrain while carrying a landmine sensor. The design covered the robot's mechanical structure, electronics, and control. I proposed a novel joint-control algorithm that lets the robot learn how to walk, combining genetic algorithms and ensemble learning. To test the gait-learning algorithm, I built a scaled prototype with the same geometry and DOF as the robot designed to carry the landmine sensor.
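The genetic-algorithm part of gait learning can be sketched in a few lines: a population of gait parameter vectors (e.g. joint amplitudes and phases) is evaluated, the best half is recombined, and children are mutated. This toy version uses a made-up quadratic fitness in place of a measured walking distance, and omits the ensemble-learning component entirely; every name here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(genome):
    # Toy stand-in for measured walking distance: the closer the gait
    # parameters are to a hypothetical optimum, the higher the score.
    target = np.array([0.8, 0.2, 0.5, 0.9])
    return -np.sum((genome - target) ** 2)

def evolve(pop_size=30, genes=4, generations=40, mut_sigma=0.1):
    pop = rng.uniform(0, 1, size=(pop_size, genes))
    for _ in range(generations):
        scores = np.array([fitness(g) for g in pop])
        elite = pop[np.argsort(scores)[-pop_size // 2:]]  # truncation selection
        # One-point crossover between randomly paired elite parents.
        parents = elite[rng.integers(0, len(elite), size=(pop_size, 2))]
        cut = rng.integers(1, genes, size=pop_size)
        mask = np.arange(genes) < cut[:, None]
        children = np.where(mask, parents[:, 0], parents[:, 1])
        pop = children + rng.normal(0, mut_sigma, children.shape)  # mutation
    scores = np.array([fitness(g) for g in pop])
    return pop[np.argmax(scores)], scores.max()

best, score = evolve()
print(best, score)  # best genome ends up close to the target vector
```

On real hardware, `fitness` would instead run one gait trial on the prototype and return how far it walked, which is what makes the evaluation loop slow and the choice of population size important.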