Few robots are more recognisable than WALL·E; his cute appearance and distinctive personality make him instantly endearing to anyone who sees him! In this project, I designed a WALL·E replica with the aim of making each of the robot’s joints moveable by hand or with servo motors.

Loosely based on the dimensions and design of ChaosCoreTech’s Wall-E replica, this version was designed from scratch in Solidworks and allows 7 of the joints to be actuated, including the arms, neck, head and eyes. The robot design has the following features:

  • Each eye can be raised and lowered independently with servo motors.
  • There is room in each eye to add a small camera.
  • The head can look left and right using a servo motor.
  • The neck is actuated at two joints, allowing the head to look up/down and to be raised/lowered.
  • Each arm has a motor at the shoulder to move it up/down.
  • The arms consist of pressure-fit joints, hands, and fingers, which can be manually posed.
  • The tank treads (skid steering) are fully 3D printed and can be powered using two 12V DC geared motors.

This is an ambitious project, aimed at people who want to build a fully animatronic WALL·E robot with servo controlled joints. It took me about 3 months to design and assemble the robot, with more than a month spent on just 3D printing all of the parts. In total, there are 310 parts (although 210 of those are very small and make up the tank treads).

1. List of 3D Printed Parts

[a] Original Design

The robot comprises 310 individual parts, so this is definitely not an easy project for people who don’t have much experience with 3D printing! Personally, I spent more than a month printing all the parts, with the printer running almost every day. The largest components (the main body parts) took up to 14 hours of print time each, while the smaller parts took 5-6 hours. If you are interested in making your own robot, I have uploaded the 3D files for all the components on Thingiverse.

If you want to jump straight into 3D printing, the *.STL files are available here:

If you want to modify the files in CAD, I’ve also exported the files in the *.STEP format:

Finally, a PDF file containing a list of all of the components required can be found here:

Note: Several people have come across an issue where the gearbox of the 100RPM motor was 2mm too long to fit into the wheel-frame parts. I have now (11th June 2020) updated the files on Thingiverse to solve this issue. If you downloaded the files before this date, please make sure you get the new version of these files.

[b] Remixes by other Makers

Since I first released my original Wall-E design, a variety of other makers have built their own robots and made tweaks and improvements to the design. Before you start printing my design, I would encourage you to check out a few of these remixes to see if you want to use them as well:

  1. Track/drive Improvements
    1. Improved tracks with tensioning system by Guenter (recommended)
    2. Tensioning system with additional room for bearings by dmdiego
    3. Improved drive pinion gear by dmdiego (recommended)
    4. Tracks with paperclip pins (compatible with TPU) by ZDC
  2. Head/neck Improvements
    1. Improved head & neck, using bolts instead of glue by rene1960
    2. Replacement for paperclip linkage arms by ZDC (recommended)
    3. Motorised eyebrows by DaddieMon – note this might prevent a camera from fitting into the eyes
  3. Movie Accuracy Improvements
    1. More realistic hands by Wujek_k
    2. More accurate recording buttons by Xyphinon
  4. Additional Parts
    1. Interior electronics mounting box by christinloehner
    2. Cooler Box and modification for smaller print areas by alvarolg
    3. Integrated hook for a cooler box by milollo
    4. Magnetic door holders by ZDC

I haven’t had a chance to print any of these remixes myself, so if you come across any issues with the remix designs you will need to contact the authors of those parts directly. Click here to view all available remixes on Thingiverse.

2. List of other Components

A variety of other hardware is used to fasten the 3D printed parts together and bring the robot to life. A list of the hardware and electronic parts that I used is shown below. To make WALL·E look more realistic, I took apart some old binoculars and used the lenses as the eyes. I think that the shine and reflections on the glass add a lot of soul to the robot, and make him look even cuter.

Note: Links are for reference only, and are not where I bought my parts. Please shop around to find the best supplier near you! The DC geared motors can also be bought with additional encoders, allowing you to have better control of the robot’s movement speed. However, if you want to add encoders you will need to modify my Arduino code in order to support them; an intermediate level of programming knowledge is required!

While it is possible to use a WiFi/Bluetooth-connected Arduino micro-controller to control the robot, I decided to use a Raspberry Pi instead. Since the Raspberry Pi is essentially a small computer, it can be used to play sound effects, stream the video from a USB camera, and host a web interface through which the robot can be controlled.

3. 3D Design and Printing

I designed all the components in Solidworks, using images and other 3D models as reference. The main aim in the design process was to split the robot into small enough pieces so that they would fit into the 3D printer, and also to integrate all the motors and electronic components. I tried to make the robot as small as possible, while still leaving enough room for the motors.

4. Painting

After 3D printing each of the parts, I spent a lot of time sanding the parts to remove all of the print lines and give them a smooth finish. Two coats of filler-primer were then applied, with more sanding done between each of the coats. Using a primer is important, as it helps the paint to stick to the plastic and not rub off as easily. It is also useful as it makes imperfections and bumps on the part more obvious, showing where further sanding needs to be done.

Each of the parts was then individually painted with lacquer spray paints. I only used yellow, white, light grey, dark grey, black, and red spray paints to paint the whole robot. By splattering light layers of black and red paint onto the parts that were painted grey, it was possible to add texture and make them look a lot more like real metal.

Finally, after fully assembling the robot, I used black and brown acrylic paints to weather the robot. This involves applying the paint liberally onto all the surfaces, and roughly wiping away most of it with a towel. The paint that isn’t wiped away stays in the corners and crevices of the parts, making the overall replica look older and more realistic.

5. Assembly

The video below shows how to assemble the robot. Overall, the assembly is not too difficult, but it is important to put the parts together in the right order. While a couple of small parts need to be glued together, most parts are fastened together using bolts. This makes assembly and disassembly easy if any parts need to be fixed or replaced. The trickiest part was probably the wiring: figuring out how to connect the motors in the eyes of the robot to the controller in the body.

IMPORTANT: Before attaching the servo motors, you need to make sure that the angle of the motor is correct. Since the servos can only rotate 180 degrees, you won’t be able to control the joint correctly if they are attached when positioned at the wrong angle. Diagrams showing the correct angles of each of the servo motors are shown below. To attach the servo, first rotate the output shaft clockwise as far as it will go; this gives you the min/max position of the servo. Then attach the servo horn onto the output shaft at the correct angle, as shown in the diagrams. Minor variances in the positioning of the servo horn will be corrected in the servo calibration step in section 8[c].

In the 3D printed design of the robot, I have left a gap where Wall-E’s “Solar Charge Panel” indicator should go. I purposefully left the gap so that I could add some lights or a screen there later which would show the actual battery level of the robot. To provisionally fill the space (as seen in my images of the robot), I printed out a picture of the panel on some gloss photo paper and taped it into the space. Here is a PDF of the panel I used; it is already at the correct size, just make sure when sending it to the printer to turn off scaling (print at “actual size”):

6. Wiring and Electronics

The wiring diagram is shown below, illustrating how each of the electronic components is connected in the robot. The USB port of the Arduino Uno is then connected to the USB port of the Raspberry Pi. If the 12V to 5V DC buck converter is capable of delivering up to 5 amps, then the Raspberry Pi can be powered directly from the converter. Otherwise, it should be connected to a separate 5V battery.

7. Programming

The programming of the robot can be split into two main parts: the code for the Arduino micro-controller, and the web-server on the Raspberry Pi. I’ve uploaded all my code onto GitHub; the link is shown below.

The Arduino controls all of the motors within the robot, determining how they should move. In the code I added a velocity controller, so that the servo motors don’t suddenly jump into life at full speed, but instead start and stop gently.
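The actual controller lives in the C++ Arduino sketch, but the ramping idea can be sketched in a few lines of Python. This is an illustration of the concept only, not code from the repository; the function name and numbers are made up for the example:

```python
def ramp_towards(current_vel, target_vel, max_accel, dt):
    """Limit how fast the velocity may change each update, so a servo
    eases in and out of a move instead of jumping to full speed."""
    change = target_vel - current_vel
    max_change = max_accel * dt          # largest allowed change this step
    if change > max_change:
        change = max_change
    elif change < -max_change:
        change = -max_change
    return current_vel + change

# Starting from rest with a target of 100 deg/s and a 50 deg/s^2 limit,
# the velocity builds up in steps of 5 deg/s per 0.1s update:
v = 0.0
for _ in range(3):
    v = ramp_towards(v, 100.0, max_accel=50.0, dt=0.1)
print(v)  # 15.0
```

Calling this once per control loop iteration produces the gentle start/stop behaviour described above.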

The Raspberry Pi is connected to the Arduino via a USB cable, and can send user commands to the Arduino to make the robot move in a specific way. The Pi also supports a USB webcam and a speaker, and can play sound effects. The code is written in Python, and uses ‘Flask’ to generate a web-server. Any computer on the same local network can then access the page and remotely control the robot.
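As a rough Python sketch of that serial link (not the actual code from the repository; the device name `/dev/ttyUSB0` is an assumption and varies between systems), single-character commands could be sent with pyserial like this:

```python
# Named actions mapped to the single-character commands the sketch understands
COMMANDS = {'forward': 'w', 'left': 'a', 'back': 's', 'right': 'd', 'stop': 'q'}

def command_byte(action):
    """Translate a named action into the byte sent over the serial port."""
    return COMMANDS[action].encode('ascii')

def send(action, port='/dev/ttyUSB0'):
    """Open the port at the sketch's 115200 baud and send one command."""
    import serial  # pyserial: sudo pip3 install pyserial
    with serial.Serial(port, 115200, timeout=1) as conn:
        conn.write(command_byte(action))

# Hypothetical usage (requires the Arduino to be plugged in):
# send('forward')
# send('stop')
```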

8. Arduino Installation Guide

[a] Basic Installation

  1. Ensure that the wiring of the electronics matches the circuit diagram.
  2. Download/clone the folder “wall-e” from the GitHub repository.
    1. To get all the files from the repository, click on the green Code button on the top-right of the page.
    2. Click on Download ZIP. Once the files have downloaded, extract the zip folder.
  3. To upload the code to the Arduino, you need to download the Arduino IDE from the official website.
  4. Open wall-e.ino in the Arduino IDE; the files MotorController.hpp and Queue.hpp should automatically open on separate tabs of the IDE as well.
  5. Install the Adafruit_PWMServoDriver.h library
    1. Go to Sketch -> Include Library -> Manage Libraries…
    2. Search for Adafruit PWM Servo Driver.
    3. Install the latest version of the library.
  6. Connect the computer to the micro-controller with a USB cable. Ensure that the correct Board and Port are selected in the Tools menu.
  7. Upload the sketch to the micro-controller.

[b] Testing the Main Program

  1. Once the sketch has been uploaded to the Arduino, power on the 12V battery while the micro-controller is still connected to the computer.
  2. Open the Serial Monitor (button in top-right of Arduino IDE). Set the baud rate to 115200.
  3. To control the movement of the robot, send the characters ‘w’, ‘a’, ‘s’ or ‘d’ to move forward, left, back or right respectively. Send ‘q’ to stop all movement.
  4. To move the head, send the characters ‘j’, ‘l’, ‘i’ or ‘k’ to tilt the head left or right and the eyes upwards or downwards. At this stage, the servos may try to move further than they should and may look uncoordinated. This will be solved by performing the servo motor calibration steps below.

[c] Servo Motor Calibration

  1. Download/clone the folder “wall-e_calibration” from the GitHub repository.
  2. Open wall-e_calibration.ino in the Arduino IDE.
  3. Upload the sketch to the micro-controller, and open the serial monitor and set the baud rate to 115200.
  4. The sketch is used to calibrate the maximum and minimum PWM pulse lengths required to move each servo motor across its desired range of motion. The standard LOW and HIGH positions of each of the servos are shown in the images below.
  5. When starting the sketch and opening the serial monitor, a message should appear after 2-3 seconds, saying that it is ready to calibrate the LOW position of the first servo motor (the head rotation).
  6. Send the characters ‘a’ and ‘d’ to move the motor backwards and forwards by -10 and +10. For finer control, use the characters ‘z’ and ‘c’ to move the motor by -1 and +1.
  7. Once the motor is in the correct position (as shown in the images below), send the character ‘n’ to proceed to the next calibration step. It will move on to the HIGH position of the same servo, after which the process will repeat for each of the 7 servos in the robot.
  8. When all joints are calibrated, the sketch will output an array containing the calibration values to the serial monitor.
  9. Copy the array, and paste it into lines 144 to 150 of the program wall-e.ino. The array should look similar to this:
int preset[][2] =  {{410,120},  // head rotation
                    {532,178},  // neck top
                    {120,310},  // neck bottom
                    {465,271},  // eye right
                    {278,479},  // eye left
                    {340,135},  // arm left
                    {150,360}}; // arm right
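To see how these numbers are used, a joint position between 0 (LOW) and 100 (HIGH) is linearly interpolated between the two calibrated pulse lengths. The real mapping happens in the Arduino sketch; this Python sketch just illustrates the arithmetic, using the example calibration numbers above:

```python
# Example (LOW, HIGH) pulse lengths per joint; note that LOW may be the
# larger number, depending on which way round the servo is mounted.
preset = [(410, 120),   # head rotation
          (532, 178),   # neck top
          (120, 310)]   # neck bottom (remaining joints omitted)

def position_to_pulse(joint, percent):
    """Interpolate a 0-100 position into a PWM pulse length."""
    low, high = preset[joint]
    return round(low + (high - low) * percent / 100)

print(position_to_pulse(0, 0))    # 410 (the calibrated LOW pulse)
print(position_to_pulse(0, 100))  # 120 (the calibrated HIGH pulse)
print(position_to_pulse(0, 50))   # 265 (mid travel)
```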

[d] Battery Level Detection (optional)

When using batteries to power the robot, it is important to keep track of how much power is left. Some batteries may break if they are over-discharged, and the SD card of the Raspberry Pi may become corrupted if not enough power is delivered.

  1. To use the battery level detection feature on the Arduino, connect the following resistors and wiring as shown in the image below. The resistors (potential divider) reduce the 12V voltage down to a value below 5V, which is safe for the Arduino to measure using its analogue pins. The recommended resistor values are R1 = 100kΩ and R2 = 47kΩ.
  2. Uncomment line 54 in the main Arduino sketch wall-e.ino.
  3. If you are using different resistor values, change the value of the potential divider gain factor on line 54 of the sketch, according to the formula: POT_DIV = R2 / (R1 + R2).
  4. The program should now automatically check the battery level every 10 seconds, and this level will be shown on the Raspberry Pi web-interface in the “Status” section.
Diagram of the battery level detection circuitry
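As a quick sanity check of the divider arithmetic (Python is used here purely as a calculator; the actual measurement happens in the Arduino sketch):

```python
R1 = 100_000   # ohms (top resistor, connected to +12V)
R2 = 47_000    # ohms (bottom resistor, connected to GND)

# Fraction of the battery voltage that appears across R2:
POT_DIV = R2 / (R1 + R2)        # roughly 0.32

# A 12V battery therefore presents only about 3.84V to the analogue pin,
# safely below the Arduino's 5V limit:
pin_voltage = 12.0 * POT_DIV

def battery_voltage(adc_reading):
    """Reverse the division: the 10-bit ADC maps 0-5V onto 0-1023."""
    return adc_reading * 5.0 / 1023 / POT_DIV
```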

[e] OLED Display (optional) (Contributed by: hpkevertje)

It is possible to integrate a small 1.3 inch OLED display which will show the battery level of the robot on the front battery indicator panel. This feature requires the battery level detection circuit in the previous section to be enabled, and the screen will update every time the battery level is calculated. This function uses the u8g2 display library in page mode; on the Arduino UNO you may get a warning that the memory usage is high, but this warning can be ignored.

  1. To use the OLED display feature on the Arduino, connect an I2C OLED display to the I2C bus on the servo motor module (see diagram).
  2. Install the U8g2 library in the Arduino library manager:
    1. Go to Sketch -> Include Library -> Manage Libraries…
    2. Search for U8g2. The library publisher is “oliver”.
    3. Install the latest version of the library.
  3. Uncomment line 74 #define OLED in the main Arduino sketch wall-e.ino.
  4. If you are using a different display that is supported by the library, you can change the constructor on line 78 as documented on the library reference page. The default is for an SH1106_128X64_NONAME display.
Diagram showing the wiring of the OLED display

[f] Adding your own Servo Animations (optional)

My code comes with two animations which replicate scenes from the movie; the eye movement Wall-E does when booting-up, and a sequence of motions as Wall-E inquisitively looks around. From code version 2.7 and onwards, I’ve now made it easier to add your own servo motor animations so that you can make your Wall-E do other movements…

  1. Open up the file animations.ino, which is located in the same folder as the main Arduino sketch.
  2. Each animation command consists of the positions you want each of the servo motors to move to, and the amount of time the animation should wait until moving on to the next instruction.
  3. You can add a new animation by inserting an extra case section into the switch statement. You should slot your extra code into the space above the default section.
  4. The time needs to be a number in milliseconds (for example, 3.5 seconds = 3500)
  5. The servo motor position commands need to be an integer number between 0 to 100, where 0 = LOW and 100 = HIGH servo position as calibrated in the wall-e_calibration.ino sketch.
  6. If you want to disable a motor for a specific move, you can use -1.
  7. For example:
case 3:
        // --- Title of your new motion sequence ---
        //          time,head,necT,necB,eyeR,eyeL,armL,armR
        queue.push({  12,  48,  40,   0,  35,  45,  60,  59});
        queue.push({1500,  48,  40,  20, 100,   0,  80,  80});
        // Add as many additional movements here as you need to complete the animation
        // queue.push({time, head rotation, neck top, neck bottom, eye right, eye left, arm left, arm right});
        break;
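One detail worth working out: the total length of an animation is simply the sum of its time column, and the web-interface needs this number in seconds. A small Python sketch of the calculation, using the two example movements above:

```python
# Each row: (time_ms, head, neckTop, neckBottom, eyeR, eyeL, armL, armR);
# these are the two example movements from the snippet above.
animation = [
    (  12, 48, 40,  0,  35, 45, 60, 59),
    (1500, 48, 40, 20, 100,  0, 80, 80),
]

def animation_length_seconds(moves):
    """Sum the time column (milliseconds) and convert to seconds."""
    return sum(move[0] for move in moves) / 1000

print(animation_length_seconds(animation))  # 1.512
```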

You can then add the animation to the Raspberry Pi web-interface with the following steps:

  1. Edit the HTML file on the Raspberry Pi using the command: nano ~/walle-replica/web_interface/templates/index.html
  2. On lines 245 to 247 you can see the buttons relating to the 3 default animations. To add your own animations, simply copy/paste an extra line to the bottom of the list, changing the following items:
    • Insert the number of the CASE statement relating to the animation, for example file-name="3"
    • Insert the length of the entire animation in seconds, for example file-length="21.5"
    • Update the “on-click” parameter with the same numbers: on-click="anime(3,21.5)"
    • Add the name of the animation, for example Dance Sequence
  3. This would give you a line looking like the code below. Press CTRL+O to save and CTRL+X to exit the editor.

<a href="#" class="list-group-item list-group-item-action" file-name="3" file-length="21.5" onclick="anime(3,21.5)">Dance Sequence <i class="entry-time">&nbsp; | &nbsp; 21.5s</i></a>

9. Raspberry Pi Web Server

[a] Basic Installation

  1. Set up the Raspberry Pi to run the latest version of Raspberry Pi OS (Raspbian) – Full. The setup instructions can be found on the Raspberry Pi website.
  2. Open the command line terminal on the Raspberry Pi.
  3. Ensure that the package list has been updated (this may take some time): sudo apt update
  4. Install Flask – this is a Python framework used to create webservers:
    • Ensure that pip is installed: sudo apt install python3-pip
    • Install Flask and its dependencies: sudo pip3 install flask
  5. (Optional) The Full version of Raspbian includes these packages by default, but if you are using a different OS (for example the Lite version), you will need to run these commands:
    sudo apt install git libsdl1.2 libsdl-mixer1.2
    sudo pip3 install pygame pyserial
  6. Clone repository into the home directory of the Raspberry Pi:
    cd ~
    git clone https://github.com/chillibasket/walle-replica.git 
  7. Set the web server password:
    • Open app.py: nano ~/walle-replica/web_interface/app.py
    • On line 26 of app.py where is says put_password_here, insert the password you want to use for the web interface.
  8. (Optional) Change the default audio directory and location of the script used to start/stop the video stream.
    1. If you followed the steps above exactly, there is no need to do this. However, if you want to move the web-interface files to a different directory on the Raspberry Pi, you will need to change the location where the program will look for the audio files.
    2. On line 29 of app.py, type the directory where the audio files are located. Ensure that the directory location ends with a forward slash: /.
    3. On line 28 of app.py, the location of the script used to start and stop the video camera stream can be modified.
  9. Connect to the Arduino/micro-controller:
    1. Plug the Arduino/micro-controller into the USB port of the Raspberry Pi.
    2. If you would like the serial port used by the Arduino to be selected by default in the web-interface, you can set a preferred serial port device in the code. Go to line 27 of app.py and replace the text ARDUINO with the name of your device. The name must match the one which appears in the drop-down menu in the “Settings” tab of the web-interface.
    3. To make the interface automatically connect to the Arduino when it starts up, you can change line 31 to autoStartArduino = True
    4. Press CTRL + O to save and CTRL + X to exit the nano editor.

[b] Using the Web Server

  1. To determine the current IP address of the Raspberry Pi on your network, type the command: hostname -I
  2. To start the server: python3 ~/walle-replica/web_interface/app.py
  3. To access the web interface, open a browser on any computer/device on the same network and type in the IP address of the Raspberry Pi, followed by :5000. For example
  4. To stop the server press: CTRL + C
  5. To start controlling the robot, you first need to start serial communication with the Arduino. To do this, go to the Settings tab of the web-interface, select the correct serial port from the drop-down list and press on the Reconnect button.

[c] Adding a Camera Stream (optional)

Note: I have received reports that this camera stream library no longer supports the newest version of the Raspberry Pi OS. I am open to suggestions/contributions of an alternative system that can be used on newer devices.

  1. If you are using the Official Raspberry Pi camera, you will first need to enable the camera in sudo raspi-config. In the config screen which appears, navigate to “Interface Options” > “Camera” > “Enable”.
  2. Install mjpg-streamer – this is used to stream the video to the webserver. The steps below are based on the instructions described here for the CNC JS project. To install the required libraries, run all of the following commands:
    # Update & Install Tools
    sudo apt-get update -y
    sudo apt-get upgrade -y
    sudo apt-get install build-essential git imagemagick libv4l-dev libjpeg-dev cmake -y
    # Clone Repo in /tmp
    cd /tmp
    git clone https://github.com/jacksonliam/mjpg-streamer.git
    cd mjpg-streamer/mjpg-streamer-experimental
    # Make
    make
    sudo make install
  3. Create a new script which will be used to start and stop the camera stream: nano ~/mjpg-streamer.sh
  4. Paste the following code into the script file; you can modify the frame rate, quality, and resolution settings to suit the camera you are using:
    #!/bin/bash
    # chmod +x mjpg-streamer.sh
    # Crontab: @reboot /home/pi/mjpg-streamer.sh start

    VIDEO_DEV="/dev/video0"  # Camera device to stream from
    FRAME_RATE="15"          # Frames per second
    QUALITY="80"             # JPEG compression quality (1-100)
    RESOLUTION="1280x720"    # Check the modes your camera supports with: v4l2-ctl --list-formats-ext
    PORT="8080"              # Port on which the stream is served
    YUV="false"              # Set to "true" if the camera only outputs the YUV format

    MJPG_STREAMER_BIN="/usr/local/bin/mjpg_streamer"
    MJPG_STREAMER_WWW="/usr/local/share/mjpg-streamer/www"
    MJPG_STREAMER_LOG_FILE="${0%.*}.log"
    RUNNING_CHECK_INTERVAL="2" # how often to check that the server is running (in seconds)
    HANGING_CHECK_INTERVAL="3" # how often to check that the server is not hanging (in seconds)

    # The frame rate can also be limited with "--every_frame 2"; for all options
    # see: mjpg_streamer --input "input_uvc.so --help"
    INPUT_OPTIONS="-r ${RESOLUTION} -d ${VIDEO_DEV} -f ${FRAME_RATE} -q ${QUALITY} -pl 60hz"
    OUTPUT_OPTIONS="-p ${PORT} -w ${MJPG_STREAMER_WWW}"

    if [ "${YUV}" == "true" ]; then
        INPUT_OPTIONS+=" -y"
    fi

    own_pid=$$

    # ==========================================================

    function running() {
        if ps aux | grep ${MJPG_STREAMER_BIN} | grep ${VIDEO_DEV} >/dev/null 2>&1; then
            return 0
        else
            return 1
        fi
    }

    function start() {
        if running; then
            echo "[$VIDEO_DEV] already started"
            return 1
        fi
        export LD_LIBRARY_PATH="$(dirname $MJPG_STREAMER_BIN):."
        echo "Starting: [$VIDEO_DEV] ${MJPG_STREAMER_BIN} -i \"input_uvc.so ${INPUT_OPTIONS}\" -o \"output_http.so ${OUTPUT_OPTIONS}\""
        ${MJPG_STREAMER_BIN} -i "input_uvc.so ${INPUT_OPTIONS}" -o "output_http.so ${OUTPUT_OPTIONS}" >> ${MJPG_STREAMER_LOG_FILE} 2>&1 &
        sleep 1
        if running; then
            if [ "$1" != "nocheck" ]; then
                check_running > /dev/null 2>&1 & # start the running checking task
                check_hanging > /dev/null 2>&1 & # start the hanging checking task
            fi
            echo "[$VIDEO_DEV] started"
            return 0
        else
            echo "[$VIDEO_DEV] failed to start"
            return 1
        fi
    }

    function stop() {
        if ! running; then
            echo "[$VIDEO_DEV] not running"
            return 1
        fi
        if [ "$1" != "nocheck" ]; then
            # stop the script running check task
            ps aux | grep $0 | grep start | tr -s ' ' | cut -d ' ' -f 2 | grep -v ${own_pid} | xargs -r kill
            sleep 0.5
        fi
        # stop the server
        ps aux | grep ${MJPG_STREAMER_BIN} | grep ${VIDEO_DEV} | tr -s ' ' | cut -d ' ' -f 2 | grep -v ${own_pid} | xargs -r kill
        echo "[$VIDEO_DEV] stopped"
        return 0
    }

    function check_running() {
        echo "[$VIDEO_DEV] starting running check task" >> ${MJPG_STREAMER_LOG_FILE}
        while true; do
            sleep ${RUNNING_CHECK_INTERVAL}
            if ! running; then
                echo "[$VIDEO_DEV] server stopped, starting" >> ${MJPG_STREAMER_LOG_FILE}
                start nocheck
            fi
        done
    }

    function check_hanging() {
        echo "[$VIDEO_DEV] starting hanging check task" >> ${MJPG_STREAMER_LOG_FILE}
        while true; do
            sleep ${HANGING_CHECK_INTERVAL}
            # treat the "error grabbing frames" case
            if tail -n2 ${MJPG_STREAMER_LOG_FILE} | grep -i "error grabbing frames" > /dev/null; then
                echo "[$VIDEO_DEV] server is hanging, killing" >> ${MJPG_STREAMER_LOG_FILE}
                stop nocheck
            fi
        done
    }

    function help() {
        echo "Usage: $0 [start|stop|restart|status]"
        return 0
    }

    if [ "$1" == "start" ]; then
        start && exit 0 || exit 1
    elif [ "$1" == "stop" ]; then
        stop && exit 0 || exit 1
    elif [ "$1" == "restart" ]; then
        stop && sleep 1
        start && exit 0 || exit 1
    elif [ "$1" == "status" ]; then
        if running; then
            echo "[$VIDEO_DEV] running"
            exit 0
        else
            echo "[$VIDEO_DEV] stopped"
            exit 1
        fi
    else
        help
    fi
  5. Press CTRL + O to save and CTRL + X to exit the nano editor.
  6. Make sure that the manager script you created has the correct name and is in the correct directory: /home/pi/mjpg-streamer.sh. If you want to save the script in a different location, you need to update line 22 of app.py.
  7. To make the script executable by the web-server, run this command in the terminal: chmod +x /home/pi/mjpg-streamer.sh
  8. If you want the camera to automatically startup when you open the web-interface you can change line 32 of app.py to autoStartCamera = True

[d] Automatically start Server on Boot (optional, but recommended)

  1. Create a .service file which is used to start the web interface: nano ~/walle.service
  2. Paste the following text into the file:
    [Unit]
    Description=Start Wall-E Web Interface
    After=network.target

    [Service]
    WorkingDirectory=/home/pi/walle-replica/web_interface
    ExecStart=/usr/bin/python3 app.py
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target
  3. Press CTRL + O to save and CTRL + X to exit the nano editor.
  4. Copy this file into the startup directory using the command: sudo cp ~/walle.service /etc/systemd/system/walle.service
  5. To enable auto-start, use the following command: sudo systemctl enable walle.service
  6. The web interface should now automatically start when the Raspberry Pi is turned on. You can also manually start and stop the service using the commands: sudo systemctl start walle.service and sudo systemctl stop walle.service

[e] Adding new Sounds (optional)

  1. By default the Raspberry Pi should automatically select whether to output audio to the HDMI port or the headphone jack. However, you can ensure that it always uses the headphone jack with the following command: amixer cset numid=3 1
  2. Make sure that all the sound files you want to use are of type *.ogg. Most music/sound editors should be able to convert the sound file to this format.
  3. Change the file name so that it has the following format: [group name]_[file name]_[length in milliseconds].ogg. For example: voice_eva_1200.ogg. In the web-interface, the audio files will be grouped using the “group name” and sorted alphabetically.
  4. Upload the sound file to Raspberry Pi in the following folder: ~/walle-replica/web_interface/static/sounds/
  5. All the files should appear in the web interface when you reload the page. If the files do not appear, you may need to change the privileges required to access the folder: sudo chmod -R 755 ~/walle-replica/web_interface/static/sounds
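To illustrate the naming convention, the filename can be taken apart by splitting on the underscores. This is a simplified Python sketch of the scheme, not the actual parsing code from the web-interface:

```python
def parse_sound_name(filename):
    """Split '[group name]_[file name]_[length in ms].ogg' into its parts."""
    stem, ext = filename.rsplit('.', 1)
    if ext != 'ogg':
        raise ValueError('sound files must be *.ogg')
    group, name, length_ms = stem.split('_')
    return group, name, int(length_ms)

print(parse_sound_name('voice_eva_1200.ogg'))  # ('voice', 'eva', 1200)
```

Note that this scheme only works if the group and file name parts contain no underscores of their own.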

[f] Set up Raspberry Pi as a WiFi Hotspot (optional)

If you would like to control the robot outdoors or at conventions, there may not be any safe WiFi networks you can connect to. To overcome this issue and eliminate the need for any external networking equipment, the Raspberry Pi can broadcast its own WiFi network. You can then connect the computer/phone/tablet you are using to control the robot directly to this network.

To set up the WiFi hotspot, we will use the RaspAP project which takes care of all the configuration and tools to get the system working. The following instructions are based on their quick installation guide:

  1. Update Raspbian, the kernel and firmware (and then reboot):
    sudo apt-get update
    sudo apt-get dist-upgrade
    sudo reboot
  2. Ensure that you have set the correct WiFi country in raspi-config’s Localisation Options: sudo raspi-config
  3. Run the quick installer: curl -sL https://install.raspap.com | bash
    • For the first few yes/no prompts which will appear during the install, type “y” (yes) to accept all of the recommended settings. The final two prompts (Ad Blocking and the next one) are not required so you can type “n” (no) for those.
  4. Reboot the Raspberry Pi again to implement the changes: sudo reboot
  5. Now the Raspberry Pi should be broadcasting a WiFi network with the following details:
    • SSID (wifi name): raspi-webgui
    • Password: ChangeMe
  6. After connecting to the WiFi network from your computer, phone or tablet, the Wall-E web-interface can be opened by typing this address into your browser:
  7. (Recommended) To change the WiFi name and password, go to the WiFi configuration webpage at: The default username is admin and the password is secret.
    • Click on “Hotspot” in the left sidebar. In the “Basic” tab you can change the WiFi network name, while the WiFi password can be changed in the “Security” tab.
    • To change the admin password for the interface used to manage the WiFi settings, click on the “Admin” icon in the top-right of the interface.

Recent Updates/Edits:

  • 8th August 2020 – to reflect latest updates in the Arduino Code
  • 18th September 2020 – Updated RPi setup instructions to use python3
  • 28th October 2020 – Removed “serial” library from install command since it is not needed and conflicts with “pyserial”
  • 2nd November 2020 – Added section with links to other remixes
  • 16th November 2020 – Instructions of how to add animations to the web-interface
  • 19th December 2020 – Changed instructions of how to set up a WiFi hotspot
  • 29th May 2021 – Added oLED display instructions
  • 22nd May 2022 – Updated instructions to include all code required for setting up the camera stream
If you have any questions or comments, please leave a reply below: