Robo Madge-Ellen: A Mid-Sized Bot With Indoor Navigation


In January of 2015 I repurposed the bot as an indoor navigation bot when I got hold of a Lidar unit (a 360-degree laser distance device) that allows detection of nearby objects, including walls. In the picture below we see Madge-Ellen with the spinning Lidar unit on top (sporting a classic time tunnel pattern). At a robot meeting in late January 2016 she was shown following the nearest object as well as turning her head (with cute eyes) to track a person. It seemed a nice indoor use for this mid-sized robot.
Robo Madge-Ellen
(Click the picture and go 17 min 30 sec in for a nice video of this bot)


Also in late 2015 I converted the bot to differential-drive motors that are geared down for slower operation. The motors have the added advantage of magnetic encoders and are now driven by a RoboClaw motor controller for precise wheel control, with the ability to travel precise distances or rotate through precise angles.
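
To give a sense of what the encoders buy you: with a known wheel diameter and track width, the tick counts convert directly into distance traveled and angle rotated. Here is a minimal sketch of that arithmetic, where every constant is an assumed value for illustration rather than Madge-Ellen's actual numbers:

    // Sketch of differential-drive odometry math (all constants are
    // assumptions for illustration, not Madge-Ellen's actual values).
    #include <cmath>
    #include <cstdio>

    int main() {
        const double ticksPerRev   = 4480.0;  // encoder ticks per wheel revolution (assumed)
        const double wheelDiameter = 0.120;   // meters (assumed)
        const double trackWidth    = 0.300;   // distance between the wheels, meters (assumed)
        const double metersPerTick = M_PI * wheelDiameter / ticksPerRev;

        // Encoder deltas read from the motor controller since the last update
        long leftTicks  = 2240;
        long rightTicks = 2000;

        double leftDist  = leftTicks  * metersPerTick;
        double rightDist = rightTicks * metersPerTick;

        // Distance traveled is the average of the two wheels;
        // rotation is their difference divided by the track width
        double distance = (leftDist + rightDist) / 2.0;
        double dTheta   = (rightDist - leftDist) / trackWidth;  // radians, CCW positive

        std::printf("moved %.3f m, rotated %.1f deg\n", distance, dTheta * 180.0 / M_PI);
        return 0;
    }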

A brief introduction of Madge-Ellen to the Home Brew Robotics Club can be seen at 17 min 30 sec into this video, and I hope to soon link to the demo of the version shown below from the Jan 2016 meeting.

This bot also has GPS and a compass and understands high-level commands like face a given heading or face a given waypoint. The bot has a WiFi access point and Bluetooth joystick control, as well as a webcam that feeds OpenCV visual recognition routines, which at this time can identify an orange traffic cone.
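
As a rough illustration of the "face a given waypoint" command, the math boils down to computing the great-circle bearing from the GPS fix to the target and comparing it against the compass heading. This is just a sketch with made-up coordinates, not the bot's actual code:

    // Minimal sketch of the "face a given waypoint" math: compute the compass
    // bearing from the bot's GPS fix to a target, then the turn needed.
    // Coordinates and values here are illustrative assumptions.
    #include <cmath>
    #include <cstdio>

    // Great-circle initial bearing from (lat1,lon1) to (lat2,lon2), degrees 0-360.
    double bearingDeg(double lat1, double lon1, double lat2, double lon2) {
        const double d2r = M_PI / 180.0;
        lat1 *= d2r; lon1 *= d2r; lat2 *= d2r; lon2 *= d2r;
        double dLon = lon2 - lon1;
        double y = std::sin(dLon) * std::cos(lat2);
        double x = std::cos(lat1) * std::sin(lat2) -
                   std::sin(lat1) * std::cos(lat2) * std::cos(dLon);
        return std::fmod(std::atan2(y, x) / d2r + 360.0, 360.0);
    }

    int main() {
        double target = bearingDeg(37.3861, -122.0839, 37.3870, -122.0820);
        double compassHeading = 90.0;  // current heading from the compass (assumed)
        // Wrap the error into -180..+180 so the bot takes the shorter turn
        double turn = std::fmod(target - compassHeading + 540.0, 360.0) - 180.0;
        std::printf("bearing %.1f deg, turn %+.1f deg\n", target, turn);
        return 0;
    }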

Interesting Subsystems Used on Robo Madge-Ellen

Proximity Sensor Module

An 8-sensor proximity subsystem that uses an Arduino Nano to set up and scan the sensors is used.

This is a custom sketch that initializes and monitors the eight 1-meter time-of-flight (TOF) sensors, where each sensor is an STMicroelectronics VL53L0X, seen as the small black rectangle on each board in the picture.

Eight boards are placed on a Mark-Toys custom 3D-printed hub, with the Arduino on the bottom. This unit is mounted on the bot so that the cones of detection miss the posts on the corners of the clear platform.
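
The usual trick for putting eight VL53L0X sensors on one I2C bus is to hold them all in reset through their XSHUT pins, then wake them one at a time and assign each a unique address. Below is a minimal sketch of that trick using the Pololu VL53L0X Arduino library with assumed pin wiring; the actual Mark-Toys sketch is linked below.

    // Illustrative Arduino sketch (not the linked Mark-Toys code) for putting
    // 8 VL53L0X sensors on one I2C bus: hold every sensor in reset via its
    // XSHUT pin, then wake them one at a time and give each a unique address.
    // Uses the Pololu VL53L0X library.
    #include <Wire.h>
    #include <VL53L0X.h>

    const int NUM_SENSORS = 8;
    const int xshutPins[NUM_SENSORS] = {2, 3, 4, 5, 6, 7, 8, 9};  // assumed wiring
    VL53L0X sensors[NUM_SENSORS];

    void setup() {
        Serial.begin(115200);
        Wire.begin();

        // Hold all sensors in reset so they all boot at the default 0x29 address
        for (int i = 0; i < NUM_SENSORS; i++) {
            pinMode(xshutPins[i], OUTPUT);
            digitalWrite(xshutPins[i], LOW);
        }
        delay(10);

        // Wake sensors one by one, assigning 0x30, 0x31, ... to each
        for (int i = 0; i < NUM_SENSORS; i++) {
            digitalWrite(xshutPins[i], HIGH);
            delay(10);
            sensors[i].init();
            sensors[i].setTimeout(100);
            sensors[i].setAddress(0x30 + i);
            sensors[i].startContinuous();
        }
    }

    void loop() {
        // Scan all 8 sensors and report range in millimeters
        for (int i = 0; i < NUM_SENSORS; i++) {
            Serial.print(sensors[i].readRangeContinuousMillimeters());
            Serial.print(i < NUM_SENSORS - 1 ? ',' : '\n');
        }
        delay(100);
    }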

Arduino Nano source code is on the mjstn GitHub HERE

RoboClaw Used with base_control Node

A RoboClaw 7-amp dual motor controller with wheel-encoder support is a key element in the movement of the bot.

Software done by Mark-World includes modifications to a USB-based driver to support the RoboClaw packet-based serial mode (a one-byte checksum protocol).
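
For the curious, the legacy packet framing is simple: each packet is the controller address, a command byte, any data bytes, and a checksum equal to the low 7 bits of the byte sum. Here is a small sketch of just the framing (values per the published RoboClaw manual; the actual serial write is omitted):

    // Minimal sketch of the legacy RoboClaw packet-serial framing with its
    // one-byte checksum (newer firmware uses a CRC16 instead). Address and
    // command values follow the published RoboClaw manual.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Build a packet: [address, command, data..., checksum]
    // where checksum = (sum of all preceding bytes) & 0x7F.
    std::vector<uint8_t> makePacket(uint8_t address, uint8_t command,
                                    const std::vector<uint8_t>& data) {
        std::vector<uint8_t> pkt{address, command};
        pkt.insert(pkt.end(), data.begin(), data.end());
        unsigned sum = 0;
        for (uint8_t b : pkt) sum += b;
        pkt.push_back(sum & 0x7F);
        return pkt;
    }

    int main() {
        // Command 0 = Drive Forward Motor 1, speed 0..127 (half speed here)
        auto pkt = makePacket(0x80, 0, {64});
        for (uint8_t b : pkt) std::printf("0x%02X ", b);
        std::printf("\n");  // expected: 0x80 0x00 0x40 0x40
        return 0;
    }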

The Mark-World base_control ROS node then uses the serial packet driver instead of USB, which can be a source of grief for many robot developers. The base_control node offers a friendly interface to the main bot ROS control node.

The Neato Lidar Laser TOF 2D range finder Spins on Top

A popular 2D navigation subsystem is the black Neato Lidar unit sitting on top, which spins around and offers 360 distance readings so the bot can map and navigate a room.
This is a key feature and a lot of fun as well!
The Neato Lidar uses a getSurreal controller and the xv_11_laser_driver ROS node.
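
To show the shape of the data: the driver publishes sensor_msgs/LaserScan messages, and a few lines of roscpp are enough to pick out the nearest object's range and bearing, much like the follow-the-nearest-object demo. This sketch assumes the usual default topic name of "scan" and is illustrative rather than Madge-Ellen's actual node:

    // Sketch of a roscpp node that consumes the Lidar's LaserScan and finds
    // the bearing of the nearest object. Topic name and details are assumed.
    #include <ros/ros.h>
    #include <sensor_msgs/LaserScan.h>
    #include <cmath>
    #include <limits>

    void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan) {
        float bestRange = std::numeric_limits<float>::infinity();
        float bestAngle = 0.0f;
        for (size_t i = 0; i < scan->ranges.size(); ++i) {
            float r = scan->ranges[i];
            // Skip readings outside the sensor's valid window
            if (r < scan->range_min || r > scan->range_max) continue;
            if (r < bestRange) {
                bestRange = r;
                bestAngle = scan->angle_min + i * scan->angle_increment;
            }
        }
        if (std::isfinite(bestRange))
            ROS_INFO("nearest object: %.2f m at %.0f deg",
                     bestRange, bestAngle * 180.0 / M_PI);
    }

    int main(int argc, char** argv) {
        ros::init(argc, argv, "nearest_object");
        ros::NodeHandle nh;
        ros::Subscriber sub = nh.subscribe("scan", 1, scanCallback);
        ros::spin();
        return 0;
    }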

Display Board Uses Mark-World ROS Node for Serial LCDs

A serial interface to the multi-line LCD display uses a ROS node that allows the bot's main node to write out status and other messages to be read by humans.
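
A minimal sketch of what the main node's side of that could look like, assuming a hypothetical "lcd_display" topic carrying plain strings (the actual Mark-World node's interface may differ):

    // Illustrative roscpp publisher that pushes a status line toward the LCD
    // node. The "lcd_display" topic name and string message type are assumed.
    #include <ros/ros.h>
    #include <std_msgs/String.h>

    int main(int argc, char** argv) {
        ros::init(argc, argv, "status_writer");
        ros::NodeHandle nh;
        ros::Publisher pub = nh.advertise<std_msgs::String>("lcd_display", 1);

        ros::Rate rate(1);  // refresh the display once per second
        while (ros::ok()) {
            std_msgs::String msg;
            msg.data = "Battery OK  Nav: idle";
            pub.publish(msg);
            rate.sleep();
        }
        return 0;
    }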

Webcam With Pan Tilt Using Mark-Toys Servo Control Subsystem

A webcam that is wired into ROS allows this bot to 'see' and can be used in the future as was done on FiddlerBot (seen on these pages). The green board under the cam controls the servo pan-tilt.
Orange road cone detection using OpenCV is all that is implemented at this time.
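
A rough sketch of how such a detector can work in OpenCV: threshold the frame in HSV for orange, then keep the largest blob. The HSV bounds below are ballpark assumptions, not the tuned values used on the bot:

    // Illustrative OpenCV pipeline for spotting an orange traffic cone:
    // threshold in HSV, then take the largest orange blob.
    #include <opencv2/opencv.hpp>
    #include <vector>

    int main() {
        cv::VideoCapture cap(0);  // the webcam
        cv::Mat frame, hsv, mask;
        while (cap.read(frame)) {
            cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
            // Orange in OpenCV's 0-179 hue range (assumed thresholds)
            cv::inRange(hsv, cv::Scalar(5, 120, 120), cv::Scalar(20, 255, 255), mask);

            std::vector<std::vector<cv::Point>> contours;
            cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

            // Keep the largest blob; a real detector would also check cone-like shape
            double bestArea = 0; int best = -1;
            for (size_t i = 0; i < contours.size(); ++i) {
                double a = cv::contourArea(contours[i]);
                if (a > bestArea) { bestArea = a; best = (int)i; }
            }
            if (best >= 0 && bestArea > 500) {
                cv::Rect box = cv::boundingRect(contours[best]);
                cv::rectangle(frame, box, cv::Scalar(0, 255, 0), 2);
            }
            cv::imshow("cone", frame);
            if (cv::waitKey(1) == 27) break;  // Esc to quit
        }
        return 0;
    }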


Navigation Visualization Tool

A nice screenshot shows a tool called RViz bringing many ROS constructs together in one picture.

This tool runs live, so you can see the bot move about and its perception of detected objects change.

The bot itself is described by a URDF-format model, and RViz shows it to scale wherever it happens to be in the room.

Robo Madge-Ellen's Lidar allows the bot to know where the walls are; you can see them in this picture as the multi-colored dots at the edges.

Next, the navigation software builds a 'map', shown in this tool as the gray area with a black outline.

Then the proximity subsystem shows the cones of detection for its 8 sensors; note that the cones end about where the walls are, and each cone is drawn to the sensor's specified detection-cone width.

Robo Madge-Ellen URDF Model

The model in RViz is written in a special text-based language and is then rendered in RViz using the robot's location and orientation, or pose, at any point in time. A closeup of just the URDF model is to the right; notice the clear plate is also present.
 