Anth's Computer Cave

Stirling: Hack a robotic vacuum

3rd November, 2018

Update 30th January, 2019

This is Stirling, a healthy, happy robotic vacuum cleaner. Unfortunately, Stirling needs a new brain.

A Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

You see, Stirling is stupid. Very, very stupid. It roams the house, bumping into walls and furniture, vacuuming random sections of every room and never finishing a single one.

This is why it needs a nice Raspberry Pi-based lobotomy to stop it being so stupid. I know, it's better to have a bottle in front of me than a frontal lobotomy, but that doesn't apply to vacuum cleaners, and I already have a bottle in front of me.

As well as giving it a new brain, it's getting a new head, too.

A Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

Spoiler! This is what Stirling looks like now.

My main aim for this project is a roaming security guard that can be called and controlled by my home-automation system. This can extend the reach of the home-automation system to rooms that don't have sensors and cameras.

I generally like to use broken stuff in my projects, and it's a bit sad to wreck a perfectly good appliance.

I originally planned to have the Raspberry Pi spoof the remote control so it could ride Stirling like a horse. This would have left Stirling's insides in place and utilized its various existing smarts and safety features, particularly its self-docking charging ability. Unfortunately this was not practical. The remote control is more a herding device than a real remote. It can veer the robot left or right, but the robot resumes its own mind-boggling agenda as soon as you release the button.

Therefore a full brain-transplant is the only option.

Hardware and smarts

This is a sturdy, well-built unit. I reckon it would cost hundreds of dollars to buy a ready-made robotic platform like Stirling.

It has a 14.4V NiCad battery system with both a direct charging cable and a charging dock.

The battery from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

This runs for well over an hour with the vacuum motors running, so I can't wait to see how long it will last with just the wheel motors running. After replacing its brain it will temporarily lose its ability to drive itself to the charge dock, but it should be fun programming that ability from scratch.

The main board from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

This is the old demented brain, the main board. I can't see any motor controllers for the drive motors; they must be on the underside of the board. There is a transistor on the left of the board (marked in red) that runs the sweeping brush on the bottom of the vacuum. The motor controller for the vacuum motor and vacuum brush is on a separate board.

The charging/switch/power board (pictured below) connects to the main board at the top-left of the image above.

The charging circuitry from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

For now I am keeping the main board and the charging/switch/power board for the sole purpose of managing the battery.

I assume this will provide over-charging protection as well as low-voltage cutouts. Once I've set up my own Arduino-based power and charging system I can ditch the main board.
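
Once that Arduino power system exists, the monitoring logic can be simple. Here's a minimal sketch of the voltage-reading side, assuming a hypothetical 4:1 voltage divider feeding the Arduino's 10-bit ADC and a 12 V low-voltage cutoff (roughly 1.0 V per cell for a 12-cell NiCad pack); the names and thresholds are placeholders, not Stirling's actual circuitry:

```python
# Hypothetical battery monitor. Divider ratio and cutoff are assumptions
# to be checked against the real pack, not measured values from Stirling.

ADC_MAX = 1023        # Arduino 10-bit ADC full-scale count
ADC_VREF = 5.0        # ADC reference voltage
DIVIDER_RATIO = 4.0   # e.g. a 30k/10k divider scales 14.4 V into the 0-5 V range

def battery_voltage(adc_reading):
    """Convert a raw ADC count back into the pack voltage before the divider."""
    return (adc_reading / ADC_MAX) * ADC_VREF * DIVIDER_RATIO

def needs_charge(adc_reading, cutoff=12.0):
    """Flag a low pack: 12.0 V is roughly 1.0 V/cell for a 12-cell NiCad."""
    return battery_voltage(adc_reading) < cutoff
```

The same reading, logged over time, should also show the charge curve flattening out, which is a crude but workable end-of-charge signal.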

The vacuum unit from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

The vacuum functionality has to go for now, because it is controlled by the main board. Once we've finished the new power system and ditched the main board, we'll add the vacuum unit back as a removable accessory.

The drive motors from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

The drive units are fantastic. They are self-contained units with gearboxes and drop-sensors built in. I suspect there may also be rotary encoders built in.

They have a 10-pin plug that connects to the main board. For initial testing I'm just going to take over the two main motor wires from each plug, and I'll focus on the encoders later.
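
Once the new brain owns those two motor wires on each side, driving Stirling becomes a standard differential-drive problem. Here's a minimal mixing function, a sketch of my own rather than anything from Stirling's firmware, that turns a forward speed and a turn rate into left/right motor commands:

```python
def mix_drive(linear, angular):
    """Mix a forward speed and a turn rate (both -1.0..1.0) into
    (left, right) motor commands, clamped to the -1.0..1.0 range.
    Positive angular turns the robot to the right."""
    left = max(-1.0, min(1.0, linear + angular))
    right = max(-1.0, min(1.0, linear - angular))
    return left, right
```

Those two numbers would then become PWM duty cycles on whatever motor driver ends up between the Pi and the drive units.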


The sensor array from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

Stirling is loaded with sensors. The bumper on the front acts as a cutout switch, and also contains four infrared sensors. There are also rear-facing and downward-facing infrared sensors, and a receiver on top to detect the dock.

Amazingly, the sensors in the bumper are all labelled. I nearly fell over when I noticed that. I'd been expecting a microscopic series of connections, in which case it would have been easier to scrap the sensors and replace them with cheap retail units. As it is, I think I can work with these.

The sensor array from a Stirling robotic vacuum cleaner. Picture: Anthony Hartup.

The first letter of each label refers to the sensor location, i.e. right, left or middle. The second letter is A for anode, C for cathode, S for signal, V for VIN and G for GND. I can't see any built-in analog-to-digital circuitry, so I'm guessing the signal will be analog, meaning the signal wire can connect directly to the analog pins on the Arduino.
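
If that guess is right, reading them is just a matter of scaling ADC counts to volts and picking a threshold. A small sketch, with an entirely made-up 2.5 V threshold that would need calibrating against the real sensors:

```python
ADC_MAX = 1023   # Arduino 10-bit ADC full-scale count
ADC_VREF = 5.0   # ADC reference voltage

def adc_to_volts(reading):
    """Scale a raw ADC count to a voltage."""
    return (reading / ADC_MAX) * ADC_VREF

def obstacle_close(reading, threshold_volts=2.5):
    """Reflective IR sensors typically read higher with a nearby obstacle.
    The 2.5 V threshold is a placeholder, not a calibrated value."""
    return adc_to_volts(reading) > threshold_volts
```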

The other downward and rear-facing sensors are not labelled, but I hope to figure them out.

I'll also be adding my own sensors.

A rotating sensor mast. Picture: Anthony Hartup.

Infrared proximity sensors have a very short range, so I'll be adding several ultrasonic sensors to measure distances out to three or four metres. One of these will be mounted on the rotating mast from my previous robot.
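
Converting an ultrasonic echo into a distance only takes the speed of sound and the fact that the ping travels out and back. A quick sketch:

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def echo_to_distance(echo_seconds):
    """Convert an echo round-trip time into a one-way distance in metres.
    The pulse travels out and back, so halve the round-trip time."""
    return (echo_seconds * SPEED_OF_SOUND) / 2.0
```

A 20-millisecond round trip, for example, works out to about 3.4 metres, right at the top of the useful range for these sensors.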

The mast can swivel a full 360 degrees with good accuracy using a small stepper motor.
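
Translating a target bearing into stepper pulses is straightforward. Here's a sketch that finds the shortest rotation to a target angle; the 2048 steps-per-revolution figure is an assumption (typical of cheap geared hobby steppers like the 28BYJ-48), not a measurement of the mast's actual motor:

```python
STEPS_PER_REV = 2048  # assumed figure for a geared hobby stepper, not measured

def steps_to_angle(target_deg, current_deg):
    """Return the signed step count for the shortest rotation from
    current_deg to target_deg. Positive means clockwise."""
    # Wrap the difference into the -180..180 range so the mast never
    # takes the long way around.
    diff = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    return round(diff * STEPS_PER_REV / 360.0)
```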

There will also be one of my regular home-automation sensor arrays, with temperature, light, smoke and gas sensors. Considering Stirling is the most likely thing to actually set the house on fire I think a smoke detector is a good idea.

An accelerometer/gyroscope/compass will provide movement feedback and detect when the robot is on uneven ground.
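
The uneven-ground check can come straight from the accelerometer: when the robot is level, gravity points down its z-axis, so the tilt is the angle between the measured acceleration vector and that axis. A sketch, with a placeholder 10-degree limit:

```python
import math

def tilt_degrees(ax, ay, az):
    """Angle in degrees between the measured gravity vector and the
    robot's z-axis (straight down when the robot is level)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return 0.0
    # Clamp before acos to guard against floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

def on_uneven_ground(ax, ay, az, limit_deg=10.0):
    """The 10-degree limit is a placeholder to tune on the real robot."""
    return tilt_degrees(ax, ay, az) > limit_deg
```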

A smart phone mounted on Stirling will act as the camera to take video and images on demand. The phone's screen will act as the robot's display. It would be nice to utilize the phone's GPS, too, but that wouldn't work well indoors, and Stirling is an indoor robot.

The new brains

Now we come to Stirling's new brain, or in this case two brains.

First there's the Raspberry Pi. If you are new to the Raspberry Pi, it is a tiny single-board computer that can run a full Linux operating system and features general-purpose input/output (GPIO) pins that can control motors, switch relays and read sensors. We use these for many projects in the Cave.

As well as the Pi there will also be a small Arduino micro-controller that, along with the sensors, forms the robot's nervous system. Arduinos feature analog-to-digital converters that can read signals from the analog sensors and monitor the battery.
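
The Pi and Arduino will need to talk, most likely over serial. Here's a sketch of the Pi-side parsing for a hypothetical comma-separated packet format; the field names are placeholders, not an existing AAIMI protocol:

```python
def parse_packet(line):
    """Parse a hypothetical serial line like 'RS:512,LS:498,BAT:781'
    into a dict of integer sensor readings. Malformed fields are skipped."""
    readings = {}
    for field in line.strip().split(","):
        key, sep, value = field.partition(":")
        if sep and key and value.isdigit():
            readings[key] = int(value)
    return readings
```

In practice the Arduino would print one such line per loop, and the Pi would read it with something like pyserial's readline before handing the dict to the navigation code.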

I'll be using a Raspberry Pi Zero W model to save both power and space. Because Stirling will work in conjunction with the more powerful Raspberry Pi 3 that runs my home automation, it can offload any intensive computing to the Pi 3. Later on, if I add image or voice recognition, or any other CPU- or RAM-intensive features, I'll still keep the lightweight Pi Zero in Stirling and just upgrade my home-automation computer to a full desktop to handle the extra computation. With image recognition, for example, Stirling only needs to take the image and send it to the other computer, which will then do all of the recognition tasks.


Throughout this article I've insulted Stirling's current brain many times, but here's the thing. When I first boot up Stirling with its new brain it's actually going to be even more stupid than before.

It will be like the old Stirling after a bottle of vodka.

This is where the various modules of the AAIMI Project come into play. The AAIMI platform has a wide focus, but it is first and foremost an automated machine-interface to sense and control physical things.

You can check out our current programs on the Stirling software page.

Future goals


Once Stirling is following orders and patrolling the Cave, I think I'll look at voice and image recognition. I can't help thinking I'll need to go to the crossroads at midnight for this one, and bring the big G in.

Google are leaps and bounds ahead when it comes to voice and image parsing. Starting from scratch I could spend months building a system far inferior to their publicly-available APIs.

Just about everything the AAIMI Project does, however, is about making programs that don't require nosey third parties, so it pains me to consider the Google option. It is the sole reason no AAIMI programs have used voice or image recognition in the past.

If anyone can suggest an open-source image-recognition option that does all computation locally, that would be great.

In the next article I'll go into more details about the electronics. I'll cover our efforts to reverse-engineer some of Stirling's existing circuitry, and the new electronics we are adding.





Leave a comment on this article
