Droid Bot - Floor Bot With Navigation & Image Recognition
DroidBot is a medium-sized bot meant for the tabletop or floor environment. The bot is designed to use vision and image recognition along with other sensors: a lidar, a time-of-flight 'radar', and edge sensors.
If you are looking for DroidBot2, the 2021 update that runs a Pi 4 on Ubuntu 20.04 Server with ROS 2 Foxy and an Esp32 micro-ROS motor controller, see the DroidBot2 page.
A key new hardware subsystem on this bot is my new drive control unit, based on the Mark-Toys Esp32 board seen on EspressoBot and on the IoT page of this site.
The new Esp32 motor controller subsystem is built around our Esp32 Dev board, which in turn drives an off-the-shelf H-bridge style motor controller, so you can pick whatever power level your motors require. For DroidBot the H-bridge controller is the ArduMoto mid-sized driver.
The Esp32 motor controller software accepts serial packet inputs from our base_control ROS node running on the Raspberry Pi 3.
Because the Esp32 runs a BLE GATT server, the Raspberry Pi 3 can even be removed entirely: the motor controller accepts commands over its onboard Bluetooth (BLE) GATT server, and with our EspBot Android app the bot can be driven in manual mode by a human operator.
In mid-2018 a pair of edge sensors was added, along with a 'time-of-flight' (ToF) distance sensor mounted on a servo, allowing the bot to sense objects in front of it. The edge sensors and the ToF radar connect to the Esp32 board, which sweeps the servo and scans the ToF sensor. A ROS node on the Raspberry Pi queries the edge-sensor and ToF results. A demo at an HBRC meeting can be seen in this live demo.
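The exact serial protocol between the Pi and the Esp32 is not spelled out here, so the following is only a sketch of what the query side might look like. Both the reply format `S,<left>,<right>,<tof_mm>` and the `QS` query command are hypothetical stand-ins, not the real protocol:

```python
def parse_sensor_reply(line):
    """Parse a hypothetical sensor status reply from the Esp32.

    Assumed format (illustrative only): "S,<left>,<right>,<tof_mm>"
    where <left>/<right> are '1' when that edge detector has tripped
    and <tof_mm> is the latest time-of-flight range in millimeters.
    """
    fields = line.strip().split(",")
    if len(fields) != 4 or fields[0] != "S":
        raise ValueError("unexpected reply: %r" % line)
    return {
        "left_edge": fields[1] == "1",
        "right_edge": fields[2] == "1",
        "tof_mm": int(fields[3]),
    }

# On the Pi this would sit behind a pyserial port, roughly:
#   port.write(b"QS\n")                      # hypothetical query command
#   status = parse_sensor_reply(port.readline().decode())
```

A ROS node would then republish those fields on topics for the rest of the system to consume.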
The code from EspressoBot has been enhanced for use here by adding a serial input that accepts packets from the host (RasPi 3 or another system) to query or set parameters in the motor controller. The most valuable command sets both the right and left motor speeds at once; on the RasPi 3 side we map the ROS 'Twist' message, the ROS standard for specifying a required drive movement, onto that command.
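Mapping a Twist onto left and right motor speeds is the standard differential-drive relation. A minimal sketch, assuming a hypothetical wheel separation (the real base_control node's variable names and geometry will differ):

```python
def twist_to_wheel_speeds(linear_x, angular_z, wheel_separation_m=0.3):
    """Convert a ROS Twist (m/s forward, rad/s turn rate) into
    left/right wheel speeds in m/s for a differential-drive base.

    wheel_separation_m is a placeholder value; use the robot's
    measured wheel-to-wheel distance.
    """
    left = linear_x - angular_z * wheel_separation_m / 2.0
    right = linear_x + angular_z * wheel_separation_m / 2.0
    return left, right
```

Driving straight (angular_z = 0) gives both wheels the same speed; a pure spin (linear_x = 0) drives the wheels equally in opposite directions. The resulting pair is what gets packed into the single set-both-motors serial command.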
The Esp32 dev board used for the motor control system can be found on this webpage.
This bot runs the motors from a 7.2-volt LiPo cell and powers the Pi and Esp32 from a 5-volt cell-phone-charger style supply. It is generally a good idea to keep the supplies separate.
DroidBot started doing mapping and navigation around my house in late 2019. In this mode the lidar is used: it outputs 360 distance points all around the robot, which are published to the /scan ROS topic.
The existing gmapping and navigation ROS software is then told about DroidBot and how the lidar is mounted on it. Mapping happens by driving the robot around; the lidar distances are combined with wheel odometry to form a map, shown below.
Droid Bot - Floor Navigation
Once a map has been generated as seen above and saved, the software can be run in a navigation-only mode. In this mode you can visualize the robot in the RViz tool, as in the picture above, but you can also specify the location and angle you want the robot to reach, and the robot will drive there and rotate to the given 'pose'.
You can also have a script repeatedly tell the robot to go from one 'pose' to another, where each location is called a 'waypoint'. This allows the robot to travel anywhere in the map while avoiding walls, using the map plus the local sensors that may see the cat step in front, or whatever. I need to prepare more content on this page to better show how that works, maybe even a video someday.
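A waypoint script mostly just feeds goal poses, one at a time, to the ROS navigation stack (move_base) and waits for each to complete. The only math it needs is converting each waypoint's heading into the quaternion a pose message expects. A sketch with made-up waypoint coordinates:

```python
import math

# Hypothetical waypoints in map coordinates: (x_m, y_m, heading_rad)
WAYPOINTS = [
    (1.0, 0.0, 0.0),
    (1.0, 1.0, math.pi / 2),
    (0.0, 1.0, math.pi),
]

def yaw_to_quaternion(yaw):
    """A floor robot rotates only about Z, so the orientation
    quaternion reduces to (0, 0, sin(yaw/2), cos(yaw/2))."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# With rospy/actionlib, each tuple would become a MoveBaseGoal:
# target_pose.pose.position.x/y from the tuple, .orientation from
# yaw_to_quaternion(), sent to the "move_base" action server and
# waited on before the next waypoint is sent.
```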
A small 128x64 pixel OLED display shows the motor parameters for real-time viewing.
The top line shows the actual speeds; below that are the expected speeds.
Encoder counts and other motor parameters are also shown.
At the bottom, the right and left edge detectors each light a dot on their side when the table edge is detected.
OLED Motor Display