


Robert Jackson Final Project: Robot Rock

Page history last edited by Robert Jackson 7 years, 2 months ago

Robert Jackson

EE 47 Final Project Documentation


First Idea: May 6, 2013

The first idea was to build a robot able to drive through heavy sand, dust, and dirt while carrying loads of up to 100 kg. The whole robot is a multi-quarter project, but what I want to do for EE 47 is build the system that allows a user to control and interface with the robot through a computer and a set of controls. The user, with a computer, will “talk” to a motor controller that drives 4 motors, 1 for each wheel.

Robot User Interface Overview Mark I


This project would not have made any sound of its own, except for the large amount of noise the motors make while running, since they are 2.5 inches in diameter. However, I felt this was a reasonable project to undertake for this class because it involves many of the concepts we have been talking about (interfaces, electronics, thinking about challenges, specific applications, etc.).



1) Ordering/Receiving all the parts: MIRACLE #0.5

The parts will take at least a week to arrive if I order them today (which I plan to do). However, some may take longer, and I don’t want to fall behind schedule.


2) Interfacing with the Motor Speed Controller: MIRACLE #1


3) Adding the Motors into the robot frame: MIRACLE #2



Revisions to First Idea:

After discussion with Professor Ju, I learned that my original idea, outlined above, to have just a robot wasn’t enough: the music had to be involved in some way. Thus, I searched through my ideas and diagrams for a seamless way to incorporate music into my robotics interest.


My inspiration came suddenly at a fraternity party one night: a couple of people rolled up to the party with a grocery cart that had been modified to carry some speakers, a car battery, and a small bar. I really liked the idea of the mobile speaker set, since I spend a lot of time outside, on the fields and such, where there are no outlets and small portable speakers aren’t loud enough to hear unless you are very close to them. However, I noticed that the guys who had brought the cart had to push a very heavy object up the hill to get to the fraternity, and I wondered whether I could have something like that which would drive itself. The combination of my robot and my MP3 player flashed into my mind.


I considered several ways of driving the robot. The most obvious was vision tracking, but this is a difficult CS problem. Another option was a Radio Frequency (RF) transmitter/receiver that I would carry with me and that the robot would follow. A third, more mechanical solution was a string attached to the robot through sensitive movement sensors, which would steer the robot in the direction of the string’s pull. I diagrammed these ideas to think about their pros and cons:

Pros and Cons of each kind of User-Control for the Robot


I did some research into each idea. Since I wanted the robot to be fully autonomous, the string idea was out. Online research showed that RF or GPS tracking at this scale is massively difficult, so I dropped that too. This left vision tracking, for which I created a Verplank Diagram to catalogue the process:

Verplank Diagram for the Robot-Human Interaction


After using the Verplank Diagram as a focusing exercise, this is the updated design state diagram I came up with:


Robot/Mp3 User Interface, Mark II.


This would allow me to have a mobile robot platform that would carry my music and travel around with me, following me by using its vision sensor (IR, ultrasonic, or a camera) to track my movement and then interpret it as a set of movement commands. This left me with 2 systems to complete: the MP3 music-playing system and the robotic system.




I elected to build a no-frills, simple MP3 player because the robot portion of the project would take too much time to allow for a deep dive into the MP3 design. Also, since the robot would require a lot of tinkering and moving things around, enclosing the MP3 player in some kind of housing for this project would just make it difficult to work with.


Key Challenges:

  1. Making music play: I first had to get the simplest version of the Lab 6 MP3 player working

  2. Add user control: buttons, on/off, LCD for song names, etc

  3. Speakers: Since this was to be an outdoor robot for parties and such, it needed to be loud. How to make really loud speakers with so little power from the Arduino?

  4. Attach it to the robot



To address Key Challenges #1-2, I built the Lab 6 MP3 player and added an interrupt driven play/pause button for the user interface. The PBWorks Lab 6 Documentation can be found here: http://pressplay.pbworks.com/w/page/25885347/Lab%206

For the MP3 code, scroll to the bottom of this page to find “Appendix A: MP3 Code”.

MP3 setup on a breadboard


The speakers in Key Challenge #3 were a little trickier. There was no way the Arduino could output enough power to run the kind of speakers I wanted, since the goal of this project was for the robot to pound out music in an outdoor or party environment, where loudness was very important. Thus, I decided the speakers would have to run from a separate power source with its own volume control, rather than from the Arduino. I found a set of desktop speakers that ran from a 12V power source and put some batteries together in series to power them. I could not power the speakers from the same 12V battery that powers the robot, since the two have drastically different current requirements.

Speaker State Diagram


To attach the MP3 player to the robot, I needed to first make sure all the robotics parts were in order. Thus, I left off completing Challenge #4 until after the robot had been completed.




The second system that needed to be built was the robot. This process could be broken down into two parts as well: the chassis and the control systems. The chassis needed to house all the electronics, motors, batteries, etc. Luckily, since I had been working on the robot since the beginning of the class, the chassis parts were already assembled. The chassis is rectangular and made of 80/20 aluminum extrusion, brackets, screws, and plastic board. The full parts list can be found here: https://docs.google.com/spreadsheet/ccc?key=0At0_SrnMnX9gdHFjN3h4TkhQVDFhZUtNYjItSzJUV1E&usp=sharing.


Chassis Key Challenges:

  1. What size was appropriate? It had to carry a lot of weight, but be driveable

  2. Materials

    1. Cost: High cost because of size and strength needs

    2. Difficult to create, since they needed to be machined

    3. All had to be ordered

  3. Configuration: Had to be modular for testing, and had to be able to hold batteries, motors, speakers, and MP3 setup.

  4. Robust: Robot might hit terrain, and couldn’t break with small vibrations


I had designed the chassis with these challenges in mind already, since after this class I am going to adapt the robot to mine dirt autonomously, like a bulldozer. Thus, Challenges #1, #2, and #4 were pretty much taken care of by a chassis designed to be robust, large, and strong.

The 80/20 Aluminum Chassis for the robot. Note the central bridge to hold the battery and controls.


Since there was extra space in the middle, to solve Challenge #3 I mounted the speakers on the central bridge as well, facing outward and forward to get maximum sound coverage. The control board was mounted on top of the battery.


Drive and Control Systems Key Challenges:

  1. Large Power Requirement

  2. Heavy: Lead-Acid batteries, 4 large motors, 4 gearboxes

  3. Accuracy: since it was to follow a human, it needs to have accurate movement

  4. Safety

    1. No exposed wires or metal to avoid shock from the large battery

    2. Could not hit other people while driving


The Drive and Control System was the harder part of making the robot run, since I built it during the Final Project period. As can be seen from the parts list linked above and the schematic below, I used a 12V battery, 4 Talon Motor Controllers, 4 CIM 2.5” Motors, 4 AndyMark Toughbox Gearboxes, and an Arduino Mega 2560 as the central core of the system. The IR Sensor sends a value to analog input pin 0, which the code translates into a PWM output sent through PWM Pin 2. The code for this can be found in Appendix B: Motor Control Code.

Control Schematic for the Robot. The Laptop is only to modify and upload code to the Arduino, and isn't necessary to run the programs on the Arduino.


The 12V motorcycle battery guaranteed enough current: at stall, the motors can draw 130A each, which is far more than a conventional, small 12V battery could supply.


For accuracy, I found that the IR sensors in the lab were not good enough. However, I did not have time to order new ones, so the robot works well at close range but has trouble identifying targets farther than 6 feet away. In future iterations, I will add better sensors. In terms of safety, the robot could definitely use some work. A lot of metal connections are exposed, which could deliver a shock if shorted against the large battery. I also programmed the robot not to run into things (stopping when the IR sensor value is too small), but it doesn’t have sensors on all sides, so it could sideswipe someone as it passes without the sensor picking them up. In general, though, it works fairly well.

Robot in action. In this photo, only two of the motors are being driven (notice the lack of motor in the upper right), but this is sufficient for testing. The speakers are mounted in the middle, and the control board is on top of the battery. On the left side, the two black/silver boxes are the motor controllers, and the wood and duct tape piece holds the barely visible IR sensor on the front.



Final Presentation and Thoughts


Great Success! See the robot in action:


With music, but hand controlled (no IR Sensor)


Without music, but following a person using the IR sensor:


This project was a fun way to bring together my interest in robotics with the EE47 curriculum to create something that I not only enjoyed building for the class, but also plan to keep improving and adding features to. Although there are many improvements I could make before I am ready to walk the robot around campus, I feel that I reached a reasonable point in the process before the end of the class.


Thanks to Michael Schoof, Mason Black, Andre Sushko, and Kyana Van Houten for helping me put the chassis together. Also, I would like to extend a special thanks to the Stanford Robotics Club, who funded the materials for the chassis and control system of the robot. Check us out, we’ve got lots of great projects like this one!


If you have questions about my project or want to learn more, feel free to send me an email at rgj@stanford.edu.

-Robert Jackson




Appendix A: MP3 Code



/*
 * example sketch to play audio file(s) in a directory, using the VS1053 library
 * for playback and the arduino sd library to read files from a microsd card.
 * pins are setup to work well for Arduino Micro.
 *
 * originally based on frank zhao's player: http://frank.circleofcurrent.com/
 * utilities adapted from previous versions of the functions by matthew seal.
 *
 * (c) 2011, 2012 david sirkin sirkin@cdr.stanford.edu
 *                akil srinivasan akils@stanford.edu
 *
 * Modified by Robert Jackson
 * June 2013
 * EE47 Final Project
 */



#include <SPI.h>
#include <SD.h>
#include <EEPROM.h>
#include <VS1053.h>

#include <Adafruit_GFX.h>
#include <Adafruit_PCD8544.h>

#define max_title_len 60
#define max_name_len  13
#define max_num_songs 40

#define read_buffer  256
#define mp3_vol      150

#define sd_cs         17

VS1053 myVS(A0, A1, A2, -1);
Adafruit_PCD8544 display = Adafruit_PCD8544(7, 6, 5, -1, 4);

File sd_file;

unsigned char num_songs = 0, current_song = 0;

char fn[max_name_len];
char title[max_title_len + 1];

enum state { DIR_PLAY, MP3_PLAY, PAUSED };
state current_state = DIR_PLAY;


void setup() {
  // ... display, sd card, and song-index initialization ...

  display.print("Barebones Mp3!");

  // halt here with a message if no songs were found on the card
  while (num_songs == 0) {
    display.println("No songs on");
    // ...
  }

  // play/pause button wired to external interrupt 1
  attachInterrupt(1, pauseAndPlay, LOW);
}





void dir_play() {
  if (sd_file) {
    mp3_play();
  }
  else {
    // since 'sd_file' isn't open, the recently playing song must have ended.
    // increment the index, and open the next song, unless it's the last song
    // in the directory. in that case, just set the state to PAUSED.
    if (current_song < (num_songs - 1)) {
      current_song++;
      sd_file_open();  // Lab 6 helper that opens 'current_song' from the card
    }
    else {
      current_state = PAUSED;
    }
  }
}





void mp3_play() {
  unsigned char bytes[read_buffer]; // buffer to read and send to the decoder
  unsigned int bytes_to_read;       // number of bytes read from microsd card

  // first fill the 'bytes' buffer with (up to) 'read_buffer' count of bytes.
  // that happens through the 'sd_file.read()' call, which returns the actual
  // number of bytes that were read (which can be fewer than 'read_buffer' if
  // at the end of the file). then send the retrieved bytes out to be played.

  // 'sd_file.read()' manages the index pointer into the file and knows where
  // to start reading the next batch of bytes. 'Mp3.play()' manages the index
  // pointer into the 'bytes' buffer and knows how to send it to the decoder.

  bytes_to_read = sd_file.read(bytes, read_buffer);
  myVS.playChunk(bytes, bytes_to_read);

  // 'bytes_to_read' is only smaller than 'read_buffer' when the song's over.

  if (bytes_to_read < read_buffer) {
    sd_file.close();

    // if we've been in the MP3_PLAY state, then we want to pause the player.
    if (current_state == MP3_PLAY) {
      current_state = PAUSED;
    }
  }
}





void pauseAndPlay() {
  static unsigned long last_interrupt_time = 0;
  unsigned long interrupt_time = millis();

  // If interrupts come faster than 200ms, assume it's a bounce and ignore
  if (interrupt_time - last_interrupt_time > 200) {
    current_state = (current_state == PAUSED) ? DIR_PLAY : PAUSED;
  }

  last_interrupt_time = interrupt_time;
}



void loop() {
  switch (current_state) {

    case DIR_PLAY:
      dir_play();
      break;

    case MP3_PLAY:
      mp3_play();
      break;

    case PAUSED:
      break;
  }
}






Appendix B: Motor Control Code



/*
 * Controlling a servo position using a potentiometer (variable resistor)
 * by Michal Rinott <http://people.interaction-ivrea.it/m.rinott>
 *
 * Modified by Robert Jackson
 * June 2013
 * EE47 Final Project
 *
 * This code reads in a value from an IR Distance Sensor, and maps that to a motor speed value.
 * Because I didn't have enough time to machine a steering mechanism for the robot,
 *   it only moves forward and backward based on how close an obstacle is to the front of the robot.
 * If the robot sees nothing in front, it stops moving until it senses something again.
 */



#include <Servo.h>

const int numReadings = 10;   // number of readings to average for smoothing

Servo motorFrontRight;        // servo objects to command the Talon motor controllers
Servo motorFrontLeft;

int IRpin = A0;  // analog pin connected to the IR distance sensor
int val;         // variable to hold the value read from the analog pin

int readings[numReadings];    // circular buffer of recent sensor readings
int index = 0;                // index of the current reading
int total = 0;                // running total of the buffer
int average = 0;              // smoothed sensor value


void setup() {
  motorFrontRight.attach(2);  // attaches the right-side controller to PWM pin 2
  motorFrontLeft.attach(3);   // left-side controller (pin assumed; not shown on this page)

  // pre-fill the smoothing buffer with a known value
  for (int i = 0; i < numReadings; i++) {
    readings[i] = 25;
    total += readings[i];
  }
}






void loop() {
  // running average: drop the oldest reading and add the newest
  total = total - readings[index];
  readings[index] = analogRead(IRpin);
  total = total + readings[index];
  index = index + 1;

  if (index >= numReadings)
    index = 0;

  average = total/numReadings;

  if (average > 50) {
    move(average);            // something is in range: drive based on distance
  }
  else {
    // nothing in front of the robot: hold neutral so it stops
    // (90 is the standard servo neutral; the exact value is not shown on this page)
    motorFrontRight.write(90);
    motorFrontLeft.write(90);
  }
}









void move(int val) {
  // scale the IR reading (50-550) to a servo command for each side
  int tempVal = map(val, 50, 550, 76, 110);
  motorFrontRight.write(tempVal);

  // the left side faces the other way, so its output range is mirrored
  tempVal = map(val, 50, 550, 115, 80);
  motorFrontLeft.write(tempVal);
}




