Stirling: Hack a robotic vacuum
3rd November, 2018
This is Stirling, a healthy, happy robotic vacuum cleaner. Unfortunately, Stirling needs a new brain.
You see, Stirling is stupid. Very, very stupid. It roams the house, bumping into walls and furniture, vacuuming random sections of every room and never finishing any of them.
This is why it needs a nice Raspberry Pi-based lobotomy to stop it being so stupid. I know, it's better to have a bottle in front of me than a frontal lobotomy, but that doesn't apply to vacuum cleaners, and I already have a bottle in front of me.
My main aim for this project is a roaming security guard that can be called and controlled by my home-automation system. This can extend the reach of the home-automation system to rooms that don't have sensors and cameras.
I generally like to use broken stuff in my projects, and it's a bit sad to wreck a perfectly good appliance.
I originally planned to have the Raspberry Pi spoof the remote control so it could ride Stirling like a horse. This would have left Stirling's insides in place and utilized its various existing smarts and safety features, particularly its self-docking charging ability. Unfortunately this was not practical. The remote control is more a herding device than a real remote: it can veer the robot left or right, but it resumes its own mind-boggling agenda as soon as you release the button.
Therefore a full brain-transplant is the only option.
Hardware and smarts
This is a sturdy, well-built unit. I reckon it would cost hundreds of dollars to buy a ready-made robotic platform like Stirling.
It has a 14.4V NiCad battery system with both a direct charging cable and a charging dock.
This runs for well over an hour with the vacuum motors running, so I can't wait to see how long it will last with no vacuum running. After replacing its brain it will temporarily lose its ability to drive itself to the charge dock, but it should be fun programming that ability from scratch.
This is the old demented brain, the main board. I can't see any motor controllers for the drive motors; they must be on the underside of the board. There is a transistor on the left of the board (marked in red) that runs the sweeping brush on the bottom of the vacuum. The motor controller for the vacuum motor and vacuum brush is on a separate board.
The charging/switch/power board (pictured below) connects to the main board at the top-left of the image above.
For now I am keeping the main board and the charging/switch/power board for the sole purpose of managing the battery.
I assume this will provide over-charging protection as well as low-voltage cutouts. Once I've set up my own Arduino-based power and charging system I can ditch the main board.
The vacuum functionality has to go. I'd love to keep it, but it takes up too much space. I want to keep the finished unit as low as possible so it can drive under chairs and other low objects. I want the sensor mast to fold down into the space where the vacuum unit was any time Stirling needs to drive under something. I'll probably mount the Raspberry Pi and Arduino in there, too.
The drive units are fantastic. They are self-contained units with gearboxes and drop-sensors built in. I suspect there may also be rotary encoders built in.
They have a 10-pin plug that connects to the main board. For initial testing I'm just going to take over the two main motor wires from each plug, and I'll focus on the encoders later.
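As a rough sketch of what the Pi-side takeover might look like, here's a small Python helper that turns a signed speed into a direction flag and a PWM duty cycle. The function name and the assumption of H-bridge-style control are mine, not anything from Stirling's board:

```python
def motor_command(speed):
    """Convert a signed speed (-1.0 to 1.0) into an H-bridge direction
    flag and a PWM duty-cycle percentage for one drive motor.

    This assumes a simple driver with one direction pin and one PWM
    pin per motor - a guess until the drive units are reverse-engineered.
    """
    speed = max(-1.0, min(1.0, speed))  # clamp out-of-range commands
    forward = speed >= 0
    duty = abs(speed) * 100.0
    return forward, duty
```

On the real robot the returned values would feed something like `RPi.GPIO`'s `PWM.ChangeDutyCycle()` and a direction pin write, but keeping the maths in a pure function makes it easy to test off the robot.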
Stirling is loaded with sensors. The bumper on the front acts as a cutout switch, and also contains four infrared sensors. There are also rear-facing and downward-facing infrared sensors, and a receiver on top to detect the dock.
Amazingly, the sensors in the bumper are all labeled. I nearly fell over when I noticed that. I'd been expecting a microscopic series of connections, in which case it would have been easier to scrap the sensors and replace them with cheap retail units. As it is I think I can work with these.
The first letter for each refers to the sensor location, i.e. right, left or middle. The second letter is A for anode, C for cathode, S for signal, V for VIN and G for GND. I can't see any built-in analog-to-digital circuitry, so I'm guessing the signal will be analog, meaning the signal wire can connect directly to the analog pins on the Arduino.
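If the signal really is a plain analog voltage, the conversion maths is simple. Here's a Python sketch (useful for prototyping the logic before porting it to the Arduino), assuming the Arduino's usual 10-bit ADC and a 5V reference; the threshold value is a placeholder to be found by experiment:

```python
def adc_to_volts(reading, vref=5.0, resolution=1023):
    """Convert a 10-bit analogRead value (0-1023) to a voltage,
    assuming a 5V reference - the Arduino Uno default."""
    return reading * vref / resolution

def obstacle_near(reading, threshold=600):
    """Crude obstacle check: True when the IR sensor reading crosses
    a threshold. 600 is a made-up value to be calibrated on the robot."""
    return reading >= threshold
```

Whether a high or low reading means "close" depends on the sensor wiring, which is exactly what the A/C/S/V/G labels should help work out.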
The other downward and rear-facing sensors are not labeled, but I hope to figure them out.
I'll also be adding my own sensors.
Infrared proximity sensors have a very short range, so I'll be adding several ultrasonic sensors to measure distances out to three or four metres. One of these will be mounted on the rotating mast from my previous robot.
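Ultrasonic rangers work by timing an echo: the sensor pings, you measure how long the sound takes to come back, and halve the round trip. A minimal sketch of that conversion (the timing itself would come from the sensor's echo pin on the Pi or Arduino):

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def echo_to_metres(round_trip_seconds):
    """Convert an ultrasonic echo round-trip time into a one-way
    distance in metres. The sound travels out and back, so halve it."""
    return round_trip_seconds * SPEED_OF_SOUND / 2.0
```

A 20-millisecond round trip works out to about 3.4 metres, right at the top of the three-to-four-metre range I'm after.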
The mast can accurately swivel 360 degrees using a small stepper motor.
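Steering the mast to a bearing is just a matter of converting degrees into step counts. A sketch, assuming a geared hobby stepper with 2048 steps per revolution (typical of a 28BYJ-48; the actual motor may differ):

```python
def degrees_to_steps(degrees, steps_per_rev=2048):
    """Number of stepper steps to rotate the sensor mast by the given
    angle. 2048 steps/rev is an assumption for a geared hobby stepper."""
    return round(degrees * steps_per_rev / 360.0)
```

So a 90-degree sweep would be 512 steps, and accumulated rounding error can be zeroed out each time the mast passes a known home position.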
There will also be one of my regular home-automation sensor arrays, with temperature, light, smoke and gas sensors. Considering Stirling is the most likely thing to actually set the house on fire I think a smoke detector is a good idea.
The smart phone mounted on Stirling will provide movement feedback from its accelerometer, detect uneven ground with its gyroscope, and take pictures on demand. It would be nice to utilize the phone's GPS, too, but that wouldn't work well indoors, and Stirling is an indoor robot.
The new brains
Now we come to Stirling's new brain, or in this case two brains.
First there's the Raspberry Pi. If you are new to the Raspberry Pi, it is a tiny single-board computer that runs a full Linux operating system and features general-purpose input/output (GPIO) pins that can control motors, switch relays and read sensors. We use these for many projects in the Cave.
As well as the Pi there will also be a small Arduino microcontroller that, along with the sensors, forms the robot's nervous system. Arduinos feature analog-to-digital converters that can read signals from the analog sensors and monitor the battery.
I'll be using a Raspberry Pi Zero W model to save both power and space. Because Stirling will work in conjunction with the more powerful Raspberry Pi 3 that runs my home automation, it can offload any intensive computing to the Pi 3. Later on, if I add image or voice recognition, or any other CPU- or RAM-intensive features, I'll still keep the lightweight Pi Zero in Stirling and just upgrade my home-automation computer to a full desktop to handle the extra computation. With image recognition, for example, Stirling only needs to take the image and send it to the other computer, which will then do all of the recognition tasks.
Throughout this article I've insulted Stirling's current brain many times, but here's the thing. When I first boot up Stirling with its new brain it's actually going to be even more stupid than before.
It will be like the old Stirling after a bottle of vodka.
This is where the various modules of the AAIMI Project come into play. The AAIMI platform has a wide focus, but it is first and foremost an automated machine-interface to sense and control physical things.
For testing the initial hardware we'll use a phone-based remote control interface from AAIMI that works by tilting your phone forward, left, right, etc. This program works with any Raspberry Pi controlled vehicle with differential steering. We'll have it available for download soon.
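The core of any differential-steering remote is mixing a forward command and a turn command into left and right motor speeds. Here's a sketch of that mixing logic; the function name is mine, and the phone's tilt values would be normalized into the -1.0 to 1.0 inputs:

```python
def mix_drive(forward, turn):
    """Mix a forward speed and a turn rate (each -1.0 to 1.0) into
    (left, right) motor speeds for differential steering."""
    left = forward + turn
    right = forward - turn
    # Scale both down together if either would exceed full speed,
    # so the turn ratio is preserved.
    biggest = max(1.0, abs(left), abs(right))
    return left / biggest, right / biggest
```

Tilting the phone straight forward drives both wheels equally; tilting it to one side speeds up one wheel and slows the other, pivoting the robot.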
We'll use the remote-control program to calibrate the motor encoders against the gyroscope, accelerometer and compass so Stirling can accurately turn to any direction. We'll also check the sensors and all the safety functions they control, such as cutting the motors if the wheel-drop sensors trigger or the short-range infrared sensors notice a sudden obstacle.
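Turning to a compass heading needs one small but easy-to-fumble piece of maths: the shortest signed angle between where you are pointing and where you want to point, with correct wrap-around at north. A sketch:

```python
def heading_error(current, target):
    """Smallest signed angle in degrees to turn from the current
    compass heading to the target heading. Positive means turn
    clockwise; the result is always in the range -180 to 180."""
    return (target - current + 180.0) % 360.0 - 180.0
```

So from 350 degrees to 10 degrees the answer is a 20-degree clockwise turn, not a 340-degree loop; that error value is also what a simple proportional turn controller would feed on.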
After we've worked out our basic movement functions we'll ditch manual mode and begin teaching Stirling to learn for itself. By this stage Stirling will be about as smart as a box of hammers.
The first lesson will have Stirling map the entire house. I've already built a lot of that capability into my previous robot so I should just need to spend a few days refining that code.
After that we need to program it to use the map, which I suspect will involve a little math.
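One simple approach, if the map ends up as a grid of free and blocked cells, is breadth-first search: it finds a shortest route between any two cells with no tuning at all. This is a sketch of the idea rather than Stirling's actual navigation code:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid (0 = free,
    1 = obstacle). Returns the shortest list of (row, col) cells
    from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []          # walk the chain of parents backwards
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for step in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and step not in came_from):
                came_from[step] = cell
                queue.append(step)
    return None
```

Room names could then just be a dictionary mapping "kitchen" to a goal cell on the grid, which keeps the navigation maths completely separate from the house layout.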
I want Stirling to be able to navigate reliably to any specific area of the house. There are lots of cheats we could use here, like line-following techniques, but they wouldn't teach Stirling anything about its own environment. This way you don't need to reprogram anything to move the robot to a new building: it can just create a new map, and we supply the room names.
We'll also be refining the communications between Stirling and my AAIMI Home Automation system, because that is the system that will give the orders.
Once Stirling is following orders and patrolling the Cave, I think I'll look at voice and image recognition. I can't help thinking I'll need to go to the crossroads at midnight for this one, and bring the big G in.
Google are leaps and bounds ahead when it comes to voice and image parsing. Starting from scratch I could spend months building a system far inferior to their publicly-available APIs.
Just about everything the AAIMI Project does, however, is about making programs that don't require nosey third parties, so it pains me to consider the Google option. It is the sole reason no AAIMI programs have used voice or image recognition in the past.
If anyone can suggest an open-source image-recognition option that does all computation locally, that would be great.
In the next article I'll go into more details about the electronics. I'll cover our efforts to reverse-engineer some of Stirling's existing circuitry, and the new electronics we are adding.