Anth's Computer Cave

Stirling: Hack a robotic vacuum

3rd November, 2018

This is Stirling, a healthy, happy robotic vacuum cleaner. Unfortunately, Stirling needs a new brain.

A Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

You see, Stirling is stupid. Very, very stupid. It roams the house, bumping into walls and furniture, vacuuming random sections of every room and finishing none of them.

This is why it needs a nice Raspberry Pi-based lobotomy to stop it being so stupid. I know, it's better to have a bottle in front of me than a frontal lobotomy, but that doesn't apply to vacuum cleaners, and I already have a bottle in front of me.

A Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

My main aim for this project is a roaming security guard that can be called and controlled by my home-automation system. This can extend the reach of the home-automation system to rooms that don't have sensors and cameras.

I generally like to use broken stuff in my projects, and it's a bit sad to wreck a perfectly good appliance.

I originally planned to have the Raspberry Pi spoof the remote control so it could ride Stirling like a horse. This would have left Stirling's insides in place and utilized its various existing smarts and safety features, particularly its self-docking charging ability. Unfortunately this was not practical. The remote control is more a herding device than a real remote. It can veer the robot left or right, but it resumes its own mind-boggling agenda as soon as you release the button.

Therefore a full brain-transplant is the only option.

Hardware and smarts

This is a sturdy, well-built unit. I reckon it would cost hundreds of dollars to buy a ready-made robotic platform like Stirling.


The battery from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

It has a 14.4V NiCad battery system with both a direct charging cable and a charging dock. This runs for well over an hour with the vacuum motors running, so I can't wait to see how long it will last with no vacuum running. After removing its brain it will temporarily lose its ability to drive itself to the charge dock, but it should be fun programming that ability from scratch.


The charging circuitry from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

I am keeping the charging/switch/power board (pictured above) because I'm hoping it contains all of the inbound charging circuitry to convert voltages and route charging from either the plug or the docking points. Without the main board I will need to create my own charging-cutout circuit and a low-voltage alert system, but that will be easy enough.
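Once the main board is gone, the low-voltage alert side of this can live in software. Here's a minimal sketch of the idea, assuming hypothetical thresholds for a 14.4V NiCd pack; the real values would need checking against the battery's actual discharge curve:

```python
# Hypothetical thresholds for a 14.4 V nominal NiCd pack.
# Confirm these against the real cells before trusting them.
LOW_VOLTS = 12.0      # raise an alert below this
CUTOFF_VOLTS = 11.0   # stop the drive motors below this

def battery_state(volts):
    """Classify a battery voltage reading as 'ok', 'low' or 'cutoff'."""
    if volts <= CUTOFF_VOLTS:
        return "cutoff"
    if volts <= LOW_VOLTS:
        return "low"
    return "ok"
```

In practice the voltage reading would come from a voltage divider feeding one of the Arduino's analog pins, with the Arduino passing the result to the Pi.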


The vacuum unit from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

The vacuum functionality has to go. I'd love to keep it, but it takes up too much space. I want to keep the finished unit as low as possible so it can drive under chairs and other low objects. I want the sensor mast to fold down into the space where the vacuum unit was any time Stirling needs to drive under something. I'll probably mount the Raspberry Pi and Arduino in there, too.


The drive motors from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

The drive units are fantastic. They are self-contained units with gearboxes and drop-sensors built in. I suspect they also have built-in motor controllers, because there don't seem to be any on the main board. Judging by the number of connections (10 pins), I suspect there may also be rotary encoders built in. I'm thinking four wires for the motor controller (two power and two signal), two wires for the drop-sensor and four wires for a rotary encoder. I could be way off, though; I'll need to dismantle one to find out.
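If the encoders are really in there, they'll be the basis of Stirling's odometry. The conversion from encoder ticks to distance travelled is simple; both constants below are pure guesses until I dismantle a drive unit and measure the wheels:

```python
import math

# Both values are assumptions for illustration only.
TICKS_PER_REV = 20       # encoder ticks per wheel revolution (unknown)
WHEEL_DIAMETER_MM = 65   # wheel diameter (unmeasured)

def ticks_to_mm(ticks):
    """Convert encoder ticks to millimetres of wheel travel."""
    return ticks / TICKS_PER_REV * math.pi * WHEEL_DIAMETER_MM
```

With a figure like that per wheel, the Pi can estimate distance and, by comparing the left and right counts, heading changes as well.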

I'm hoping I can buy the 10-pin plugs to fit the motor units, but I haven't found any yet. I believe they would be called 5*2 edge-connectors.


The main board from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

This is the old demented brain, the main board. It is now a frisbee. As I mentioned earlier, I can't see any motor controllers for the drive motors. This makes me hope that the controllers are built into the motor units. There is a MOSFET on the left of the board (marked in red), but that runs the sweeping brush on the bottom of the vacuum. The motor controller for the vacuum motor and vacuum brush is on a separate board.

The three diodes marked in the maroon rectangle connect to the power and charging board I mentioned earlier. This helped me determine the probable pinouts for that board. I'll cover this in more detail in the next article.

The display goes along with the main board, but I don't need a display. I will have a phone mounted on the unit for the gyroscope/accelerometer and camera, so that will also function as the display. The phone will communicate with the web server running on the Raspberry Pi, so it can show notifications and play sounds, etc.


The sensor array from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

Stirling is loaded with sensors. The bumper on the front acts as a cutout switch, and also contains four infrared sensors. There are also rear-facing and downward-facing infrared sensors, and a receiver on top to detect the dock.

Amazingly, the sensors in the bumper are all labeled. I nearly fell over when I noticed that. I'd been expecting a microscopic series of connections, in which case it would have been easier to scrap the sensors and replace them with cheap retail units. As it is, I think I can work with these.

The sensor array from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

The first letter of each label refers to the sensor location, i.e. right, left or middle. The second letter is A for anode, C for cathode, S for signal, V for VIN and G for GND. I can't see any built-in analog-to-digital circuitry, so I'm guessing the signal will be analog, meaning the signal wire can connect directly to the analog pins on the Arduino.
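If the signals are indeed analog, the Arduino's 10-bit ADC will report each one as a raw value from 0 to 1023. Converting that back to a voltage on the Pi side is one line; this sketch assumes the Arduino's default 5V reference:

```python
def adc_to_volts(raw, vref=5.0, bits=10):
    """Convert a raw Arduino ADC reading (0..1023 for 10 bits) to volts."""
    return raw / (2 ** bits - 1) * vref
```

The same helper works unchanged if a board with a 12-bit ADC or a 3.3V reference turns up later.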

The other downward and rear-facing sensors are not labeled, but I hope to figure them out.

I'll also be adding my own sensors.

A rotating sensor mast. Picture: Anthony Hartup.

Infrared proximity sensors have a very short range, so I'll be adding several ultrasonic sensors to measure distances out to three or four metres. One of these will be mounted on the rotating mast from my previous robot.
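Ultrasonic sensors like the common HC-SR04 work by timing an echo, so turning the round-trip time into a distance is simple. A quick sketch, assuming roughly room-temperature air:

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 °C

def echo_to_metres(echo_seconds):
    """Convert a round-trip ultrasonic echo time to a one-way distance."""
    return echo_seconds * SPEED_OF_SOUND / 2
```

A 20-millisecond echo, for example, corresponds to a bit under three and a half metres, which is right in the range I'm after.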

The mast can accurately swivel through 360 degrees using a small stepper motor.

There will also be one of my regular home-automation sensor arrays, with temperature, light, smoke and gas sensors. Considering Stirling is the most likely thing to actually set the house on fire, I think a smoke detector is a good idea.

The smart phone mounted on Stirling will provide movement feedback from its accelerometer, detect uneven ground with its gyroscope, and take pictures on demand. It would be nice to utilize the phone's GPS, too, but that wouldn't work well indoors, and Stirling is an indoor robot.

The new brains

Now we come to Stirling's new brain, or in this case, two brains.

First there's the Raspberry Pi. If you are new to the Raspberry Pi, it's a tiny single-board computer that can run a full Linux operating system and features input/output (GPIO) pins that can control motors, switch relays and read sensors. We use these for many projects in the Cave.

As well as the Pi there will be a small Arduino micro-controller that, along with the sensors, forms the robot's nervous system. Arduinos feature analog-to-digital converters that can read signals from the analog sensors and monitor the battery.

I'll be using a Raspberry Pi Zero W model to save both power and space. Because Stirling will work in conjunction with the more powerful Raspberry Pi 3 that runs my home automation, it can offload any intensive computing to the Pi 3. Later on, if I add image or voice recognition, or any other CPU- or RAM-intensive features, I'll still keep the lightweight Pi Zero in Stirling, and just upgrade my home-automation computer to a full desktop to handle the extra computation. With image recognition, for example, Stirling only needs to take the image and send it to the other computer, which will then do all of the recognition tasks.

Software

Throughout this article I've insulted Stirling's current brain many times, but here's the thing. When I first boot up Stirling with its new brain it's actually going to be even more stupid than before.

It will be like the old Stirling after a bottle of vodka.

This is where the various modules of the AAIMI Project come into play. The AAIMI platform has a wide focus, but it is first and foremost an automated machine-interface to sense and control physical things.

The basics

For testing the initial hardware we'll use a phone-based remote-control interface from AAIMI that works by tilting your phone forward, left, right, etc. This program works with any Raspberry Pi-controlled vehicle with differential steering. We'll have it available for download soon.
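The core of any tilt-to-differential-steering scheme is mapping the phone's pitch and roll onto left and right wheel speeds. This sketch shows the general idea only; the 45-degree full-tilt range and the speed scale are assumptions, not the actual AAIMI values:

```python
def tilt_to_wheels(pitch, roll, max_speed=100):
    """Map phone tilt (degrees) to (left, right) wheel speeds.

    Pitch drives forward/reverse, roll steers; full deflection is
    assumed to be 45 degrees. Outputs are clamped to +/- max_speed.
    """
    clamp = lambda x: max(-1.0, min(1.0, x))
    forward = clamp(pitch / 45.0)
    turn = clamp(roll / 45.0)
    left = clamp(forward + turn) * max_speed
    right = clamp(forward - turn) * max_speed
    return left, right
```

Tilting fully forward drives both wheels at full speed; tilting sideways with no pitch spins the wheels in opposite directions, turning the robot on the spot.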

We'll use the remote-control program to calibrate the motor encoders against the gyroscope, accelerometer and compass so Stirling can accurately turn to any direction. We'll also check the sensors and all the safety functions they control, such as the motor cutout that fires when the wheel-drop sensors trigger or the short-range infrared sensors notice a sudden obstacle.
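The encoder/gyro calibration essentially boils down to working out how many encoder ticks correspond to one degree of rotation, then using that ratio for future turns. A rough sketch with hypothetical sample data:

```python
def ticks_per_degree(samples):
    """Average ticks-per-degree from (encoder_ticks, gyro_degrees) pairs
    recorded during test turns."""
    return sum(ticks / degrees for ticks, degrees in samples) / len(samples)

def ticks_for_turn(angle_degrees, tpd):
    """Encoder ticks needed to turn through a given angle."""
    return round(angle_degrees * tpd)
```

During calibration the robot would perform a few measured turns, record each (ticks, gyro degrees) pair, and keep the averaged ratio for dead-reckoning turns later.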

After we've worked out our basic movement functions we'll ditch manual mode and begin teaching Stirling to learn for itself. By this stage Stirling will be about as smart as a box of hammers.

The first lesson will have Stirling map the entire house. I've already built a lot of that capability into my previous robot so I should just need to spend a few days refining that code.

After that we need to program it to use the map, which I suspect will involve a little math.
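One simple way to use a finished map is to treat it as an occupancy grid and find routes with a breadth-first search. This is a minimal sketch of that approach; the grid format is my assumption here, not the actual AAIMI map structure:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free, 1 = wall).

    Returns the list of (row, col) cells from start to goal, or None
    if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None
```

Named rooms would then just be labels attached to grid regions, so "go to the kitchen" becomes a path search to any cell in that region.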

I want Stirling to be able to navigate reliably to any specific area of the house. There are plenty of cheats we could use here, like line-following techniques, but they wouldn't teach Stirling to learn about its own environment. This way you don't need to reprogram anything to move the robot to a new building: it can create a new map, and we just supply the room names.

We'll also be refining the communications between Stirling and my AAIMI Home Automation system, because that is the system that will give the orders.

Recognition

Once Stirling is following orders and patrolling the Cave, I think I'll look at voice and image recognition. I can't help thinking I'll need to go to the crossroads at midnight for this one, and bring the big G in.

Google are leaps and bounds ahead when it comes to voice and image parsing. Starting from scratch I could spend months building a system far inferior to their publicly-available APIs.

Just about everything the AAIMI Project does, however, is about making programs that don't require nosey third parties, so it pains me to consider the Google option. It is the sole reason no AAIMI programs have used voice or image recognition in the past.

If anyone can suggest an open-source image-recognition option that does all computation locally, that would be great.

In the next article I'll go into more details about the electronics. I'll cover our efforts to reverse-engineer some of Stirling's existing circuitry, and the new electronics we are adding.

Cheers

Anth

_____________________________________________

