Domo Arigato, Mrs. Roboto

By Laurie Ainley

I tend to have lots of silly ideas, and Mrs. Roboto is no exception. The idea emerged while thinking about how to take advantage of Facebook’s Spark AR to connect computer vision technology to the physical world. Fortunately, I’d been introduced to the NeoPixel add-on for the BBC micro:bit a few weeks earlier by my mentoring companion Neill, which seemed like an obvious fit for building a low-cost electronic robot.

Here I’ll describe how to build one yourself – note that it’s a pretty rudimentary system, so it could be improved in many ways. Additionally, once you have the basics connected, there are plenty of variations and additions that could be applied to both the input and the output, for example responding to hand movement instead of facial expressions, introducing sound effects or perhaps adding movement via a remote control car. For more sophistication on the output side on a similarly low budget, using a Raspberry Pi or Arduino instead of the micro:bits should work.

What you’ll need

  1. Computer with ngrok and Node.js installed
  2. Spark AR studio software
  3. Phone with Facebook/Spark AR Player installed
  4. Micro USB cable to connect to the micro:bits
  5. 3 x BBC micro:bits (£13 each)
  6. 2 x Adafruit battery holders (£2 each)
  7. 2 x NeoPixel Kitronik Zip Halos (£13.80 each)
  8. 1 x micro:pixel 4×8 WS2812B board (£17.78)

How it works

  1. The user’s face is tracked using the front camera on a phone running a specific Spark AR effect.
  2. The Spark AR effect is programmed to detect certain gestures (e.g. eyebrows being raised, pursed lips in a kiss gesture) and send an HTTP request to a specific endpoint.
  3. The endpoint is a dynamic domain provisioned using ngrok (e.g. https://abcd1234.ngrok.io) which tunnels a request to a specific local machine running the ngrok client.
  4. The local machine hosts a Node.js server using Express to receive the requests and pass the instructions on to the master micro:bit, connected via USB, over the serial data port (a minimal sketch of this server follows the list).
  5. The master micro:bit forwards the instructions to the 2 client micro:bits using the radio module and updates its own connected micro:pixel via the NeoPixel module.
  6. The client micro:bits receive the instructions and update their Zip Halos via the NeoPixel module.
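
The core of steps 4 and 5 is a small Express app that listens for the requests coming from Spark AR and writes a command string to the master micro:bit over USB serial. The repository’s server folder contains the real implementation; the sketch below is only a simplified illustration, assuming the serialport npm package (v10+ API), and the route name, command strings, serial path and baud rate are placeholders rather than the project’s exact values.

    // Minimal sketch of the bridge server: forward gesture events from Spark AR
    // to the master micro:bit over USB serial. Requires the "express" and
    // "serialport" npm packages.
    const express = require('express');
    const { SerialPort } = require('serialport');

    const app = express();

    // Serial path and baud rate are illustrative; the micro:bit typically
    // appears as /dev/ttyACM0 on Linux or a COM port on Windows, at 115200 baud.
    const microbit = new SerialPort({ path: '/dev/ttyACM0', baudRate: 115200 });

    // One GET route per gesture keeps the Spark AR side trivial, e.g.
    // https://xxxxxxxx.ngrok.io/gesture/eyebrows
    app.get('/gesture/:name', (req, res) => {
      // Newline-terminate the command so the micro:bit can read it with a
      // newline delimiter.
      microbit.write(req.params.name + '\n');
      res.sendStatus(200);
    });

    // The real server takes its port via the -p flag (see step 9 below);
    // this sketch just reads an environment variable for simplicity.
    const port = process.env.PORT || 3000;
    app.listen(port, () => console.log('Listening on port ' + port));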

Instructions

Download the source files from the Mrs. Roboto GitHub repository.

  1. Connect 2 of the micro:bits to the Zip Halos.
  2. Connect one of these micro:bits to your computer using the USB cable and use it to load the microbit_client file from the microbit_files directory onto the micro:bit by dragging the file onto the drive; this flashes the device with the code.
  3. Repeat for the second micro:bit connected to the Zip Halo.
    Note: You can write your own code to control the micro:bits using your favourite micro:bit editor. You can also copy and paste the code in the respective .js files into the JavaScript tab of the MakeCode Editor and customise it there; a minimal client example is sketched after this list.
  4. Connect the other micro:bit to the micro:pixel unit.
  5. Connect this micro:bit to your computer and flash it with the microbit_server file.
  6. Download and install Node.js and ngrok on your local machine.
  7. Ensure you either have Node.js available globally, or move the server folder to a location where node can run.
  8. Run npm install in the server folder to install dependencies.
  9. Run node index.js -p 80 to run the server, listening on port 80.
  10. Run ./ngrok http 80 (the same port the server is listening on) to generate a subdomain on ngrok.io that will redirect traffic to your local machine.
  11. Copy the full https domain (in the form https://xxxxxxxx.ngrok.io) from the console.
  12. Open the Mrs_Roboto.arproj file in the spark_ar folder in Spark AR Studio, and paste the address above into the Whitelisted Domains box under Edit Project > Capabilities.
    Note: This must be an https domain.
  13. Edit the script.js file and update line 8 with the same domain.
  14. Either deploy the effect to your phone using the Spark AR Player over USB, or export and upload the effect via the Spark AR Hub.
  15. Open the effect on your phone. If you’ve set everything up correctly, raising your eyebrows in view of the front camera should change the colour of the LEDs.
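
For reference, a client program along the lines of microbit_client can be written in a few lines of MakeCode JavaScript. The snippet below is an illustrative sketch rather than the repository’s exact code: it assumes the Zip Halo’s 24 ZIP LEDs are driven from pin P0, that all three micro:bits use radio group 1, and that the master broadcasts plain command strings such as "eyebrows" and "kiss".

    // Illustrative client sketch (MakeCode JavaScript, NeoPixel extension required).
    // Assumptions: Zip Halo on pin P0 (24 LEDs), radio group 1, gesture names
    // broadcast as plain strings by the master micro:bit.
    let halo = neopixel.create(DigitalPin.P0, 24, NeoPixelMode.RGB)
    radio.setGroup(1)

    radio.onReceivedString(function (receivedString) {
        if (receivedString == "eyebrows") {
            halo.showColor(neopixel.colors(NeoPixelColors.Blue))
        } else if (receivedString == "kiss") {
            halo.showColor(neopixel.colors(NeoPixelColors.Red))
        } else {
            // Unknown command: turn the lights off.
            halo.clear()
            halo.show()
        }
    })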

Troubleshooting

If the above doesn’t work, debug by verifying that each step of the chain is working, e.g.

  1. Check that facial gestures made in your phone’s front camera while the effect is running are being detected and are triggering actions in Spark AR. Do this with the Spark AR Player running the effect while the phone is connected to your computer via the USB cable. Ensure the Spark AR console is showing the phone’s debug output, and verify that messages are printed corresponding to the actions you’re taking. Add more debug statements in script.js if you need further detail. If these messages aren’t displaying, ensure the gestures are triggering the logic in the Patch Editor as you expect.
  2. If gestures in Spark AR are correctly triggering actions, ensure the whitelisted domain and the domain set in script.js (step 13 above) both point to the correct ngrok domain.
  3. Check that requests are being received by your local Node server by printing a debug statement to the console when an endpoint is triggered. You can check this is working by manually calling the equivalent endpoint on the ngrok domain (e.g. with curl or a browser).
  4. Verify that the Node script is able to communicate with the master micro:bit over USB by sending a specific message triggered by keyboard input (e.g. pressing the x key). Verify it’s being received by flashing the micro:bit with a very simple program that shows an LED pattern when that message arrives (see the sketch below).
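
A throwaway test program for that last check can be just a few lines of MakeCode JavaScript. The sketch below is illustrative only, assuming the Node script sends newline-terminated messages and that the test message is the string "x".

    // Illustrative serial test (MakeCode JavaScript): show a tick when an "x"
    // arrives from the computer over USB serial.
    serial.onDataReceived(serial.delimiters(Delimiters.NewLine), function () {
        let message = serial.readUntil(serial.delimiters(Delimiters.NewLine))
        if (message.indexOf("x") >= 0) {
            basic.showIcon(IconNames.Yes)
        }
    })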

Potential enhancements

  • Applying more nuanced effects, for example gradually brightening the robot’s mouth depending on how open the user’s mouth is.
  • Using motors to adjust the head orientation depending on the orientation of the face.
  • Triggering speech samples, or allowing the user to enter text which would be converted to a voice sample on the fly.

If you have any questions or end up making one yourself, please feel free to send them on to me @LaurieAinley or at laurie@ainley.org.

Thanks go to Neill Bogie, Bram van de Ven and Priyanka Parmar for their help in building Mrs. Roboto.
