I tend to have lots of silly ideas, and Mrs. Roboto is no exception. The idea emerged while thinking about how to take advantage of Facebook’s Spark AR to connect computer vision technology to the physical world. Fortunately, I’d been introduced to the NeoPixel add-on for the BBC micro:bit a few weeks earlier by my mentoring companion Neill, which seemed like an obvious fit for building a low-cost electronic robot.
Here I’ll describe how to build one yourself – note that it’s a pretty rudimentary system, so it could be improved in many ways. Additionally, once you have the basics connected, there are plenty of variations and additions that could be applied on both the input and output side – for example, responding to hand movement instead of facial expressions, introducing sound effects, or adding movement via a remote-control car. For more sophistication on the output side on a similarly low budget, using a Raspberry Pi or Arduino instead of the micro:bits should work.
What You’ll Need
- Computer with ngrok and Node.js installed
- Spark AR studio software
- Phone with Facebook/Spark AR Player installed
- Micro-B USB cable to connect to the micro:bits
- 3 x BBC micro:bits (£13 each)
- 2 x Adafruit battery holders (£2 each)
- 2 x NeoPixel Kitronik Zip Halos (£13.80 each)
- 1 x micro:pixel 4×8 WS2812B board (£17.78)
How It Works
- The user’s face is tracked using the front camera on a phone running a specific Spark AR effect.
- The Spark AR effect is programmed to detect certain gestures (e.g. eyebrows being raised, pursed lips in a kiss gesture) and send an HTTP request to a specific endpoint.
- The endpoint is a dynamic domain provisioned using ngrok (e.g. https://abcd1234.ngrok.io), which tunnels requests to a specific local machine running the ngrok client.
- The local machine hosts a Node.js server using Express to receive the requests and pass instructions on to the master micro:bit, which is connected via USB using the serial data port.
- The master micro:bit forwards the instructions to the 2 client micro:bits using the radio module and updates its own connected micro:pixel via the NeoPixel module.
- The client micro:bits receive the instructions and update their Zip Halos via the NeoPixel module.
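The relay described above can be sketched in a few lines: the server translates each detected gesture into a short, newline-terminated command string, which is written to the master micro:bit over serial and re-broadcast by radio. Here’s a minimal sketch in Node.js – the gesture names and the COLOUR:R,G,B framing are illustrative assumptions, not the repo’s actual protocol:

```javascript
// Sketch: translate a detected gesture into a newline-terminated serial
// command. The command names and framing are illustrative assumptions;
// the real protocol lives in the repo's index.js and micro:bit files.
const GESTURE_COMMANDS = {
  eyebrowsRaised: "COLOUR:255,0,0",
  kiss: "COLOUR:255,0,128",
};

function toSerialCommand(gesture) {
  const cmd = GESTURE_COMMANDS[gesture];
  if (!cmd) throw new Error(`Unknown gesture: ${gesture}`);
  // micro:bit serial reads are line-based, so terminate with a newline.
  return cmd + "\n";
}
```

In the real server the returned string would be written to the micro:bit’s serial port (e.g. with the node-serialport package), and the master micro:bit would then repeat it over radio to the clients.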
Putting It Together
Download the source files from the Mrs. Roboto GitHub repository.
- Connect 2 of the micro:bits to the Zip Halos.
- Connect one of these micro:bits to your computer using the USB cable and load the microbit_client file from the microbit_files directory onto it by dragging the file onto the micro:bit’s drive, which will flash the device with the code.
- Repeat for the second micro:bit connected to the Zip Halo.
- Connect the other micro:bit to the micro:pixel unit.
- Connect this micro:bit to your computer and flash it with the master file from the microbit_files directory in the same way.
- Download and install Node.js and ngrok on your local machine.
- Ensure you either have Node.js available globally, or move the server folder to a location where node can run.
- Run npm install in the server folder to install dependencies.
- Run node index.js -p 3000 to start the server, listening on port 3000.
- Run ./ngrok http 3000 to generate a subdomain on ngrok.io that will redirect traffic to your local machine.
- Copy the full https domain (in the form https://xxxxxxxx.ngrok.io) from the console.
- Open the Mrs_Roboto.arproj file in the spark_ar folder in Spark AR Studio, and paste the address above into the Whitelisted Domains box under the Edit Project > Capabilities section. Note: this must be an https domain.
- Edit the script.js file and update the server domain defined near the top of the file with the same address.
- Either deploy the effect to your phone using the Spark AR player over USB, or export and upload the effect via the Spark AR hub.
- Open the effect on your phone. If you’ve set up everything correctly, raising your eyebrows using the front camera should change the colour of the lights on the LEDs.
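The script.js edit described above amounts to pointing the effect at your tunnel. Assuming the file keeps the server address in a single constant (the names below are hypothetical, not the repo’s actual identifiers), it would look something like this:

```javascript
// Hypothetical shape of the domain setting in script.js: one base-URL
// constant that every request is built from. Replace the subdomain with
// your own – ngrok issues a new one each time it restarts.
const SERVER_URL = "https://abcd1234.ngrok.io";

// Each gesture then maps to an endpoint on that server:
function endpointFor(gesture) {
  return `${SERVER_URL}/${gesture}`;
}

console.log(endpointFor("eyebrows")); // https://abcd1234.ngrok.io/eyebrows
```

Because the subdomain changes on every ngrok restart, both this constant and the Whitelisted Domains entry need updating together each time.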
If the above doesn’t work, debug by verifying that each step of the chain is working:
- Check that facial gestures made in your phone’s front camera while running the effect are being detected and are triggering actions in Spark AR. To do this, use the Spark AR Player running the effect while the phone is connected to your computer via the USB cable. Ensure the Spark AR console is showing the phone’s debug output, and verify that messages are printed correlating to the actions you’re taking. Add more debug statements in script.js if you need further detail. If these messages aren’t displaying, check that the gestures are triggering the logic you expect in the Patch Editor.
- If gestures in Spark AR are correctly triggering actions, ensure the whitelisted domain and the domain set in script.js both match the current ngrok domain.
- Check that requests are reaching your local Node server by printing a debug statement to the console when an endpoint is triggered. You can confirm the endpoint itself works by calling it manually on the ngrok domain.
- Verify that the Node script can communicate with the master micro:bit over USB by sending a specific message triggered by keyboard input (e.g. pressing the x key). Confirm it’s being received by writing a minimal micro:bit program that shows an LED pattern when that message arrives.
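For that last step, it helps to know what the client side should be parsing. Assuming a simple COLOUR:R,G,B message format (an assumption – check the repo’s micro:bit files for the real one), the parsing you’d mirror in your minimal test program looks like:

```javascript
// Sketch of parsing an assumed "COLOUR:R,G,B" instruction into RGB
// values – the kind of handling the client micro:bits would do on each
// radio message before updating their NeoPixels.
function parseColourCommand(msg) {
  const match = /^COLOUR:(\d{1,3}),(\d{1,3}),(\d{1,3})$/.exec(msg.trim());
  if (!match) return null; // not a colour instruction
  return match.slice(1).map(Number);
}

console.log(parseColourCommand("COLOUR:255,0,0")); // [ 255, 0, 0 ]
```

A parser that returns null for anything unrecognised makes debugging easier, since stray serial noise won’t change the LEDs.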
Taking It Further
- Applying more nuanced effects, for example gradually brightening the mouth LEDs depending on how open the mouth is.
- Using motors to adjust the head orientation depending on the orientation of the face.
- Triggering speech samples, or allowing the user to enter text which would be converted to a voice sample on the fly.
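The first idea above – brightness that tracks mouth openness – is mostly a scaling problem, since Spark AR’s face tracking exposes mouth openness as a scalar signal. A sketch of the output side (the base colour and the clamping are assumptions):

```javascript
// Scale a base RGB colour by mouth openness in [0, 1]: fully closed is
// off, fully open is the full colour. The openness value would come from
// the face tracker; everything else here is illustrative.
function scaleColour([r, g, b], openness) {
  const k = Math.min(Math.max(openness, 0), 1); // clamp to [0, 1]
  return [Math.round(r * k), Math.round(g * k), Math.round(b * k)];
}

console.log(scaleColour([255, 0, 128], 0.5)); // [ 128, 0, 64 ]
```

The scaled values would then be sent down the same serial/radio chain as the on/off colour commands.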
Thanks go to Neill Bogie, Bram van de Ven and Priyanka Parmar for their help in building Mrs. Roboto.