Robotic Arm with Voice Control - Part 3

Here is part 3 for the robot arm with voice control tutorial. Click here for the first part or here for the second part.

Now that the output has been filtered and tested for confidence, the sentence can be processed to move the arm:

def process_sentence(sentence_text):
    # strip the tags and the 'sentence1: ' prefix from the Julius output
    str1 = remove_tags(sentence_text)
    str2 = str1.replace('sentence1: ', '')
    joints = ['ELBOW', 'SHOULDER', 'WRIST', 'GRIP']
    transcribed = str2.split()
    if transcribed[0] in joints:
        movearm(transcribed[0], transcribed[1])
    elif transcribed[0] == "LIGHT":
        ctrl_light(transcribed[1])
    elif transcribed[0] == "BASE":
        rotatebase(transcribed[1])
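To see the parsing step in isolation, here is a minimal sketch. It assumes Julius emits result lines like 'sentence1: <s> ELBOW UP </s>', and the remove_tags helper shown here is a hypothetical stand-in that just strips the <s>/</s> markers:

```python
import re

def remove_tags(text):
    # hypothetical helper: strip the <s> ... </s> sentence markers Julius adds
    return re.sub(r'</?s>', '', text)

raw = 'sentence1: <s> ELBOW UP </s>'
cleaned = remove_tags(raw).replace('sentence1: ', '')
transcribed = cleaned.split()
print(transcribed)  # the joint name and direction, ready for dispatch
```

After this, transcribed[0] is the target ('ELBOW') and transcribed[1] the direction ('UP'), which is exactly what the dispatch function above expects.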

This function simply dispatches each command to the appropriate handler. It could probably be written more elegantly, but it works for now. To use the pyusb module, we need to initialise the device with its set_configuration() method. This is the complete main() function:

if __name__ == "__main__":
    dev = usb.core.find(idVendor=0x1267, idProduct=0x0000)
    #exit if device not found
    if dev is None:
        raise ValueError('Can\'t find robot arm!')
    #this arm should just have one configuration, so use the default
    dev.set_configuration()
    child = pexpect.spawn('julius -input mic -C julian.jconf')
    while True:
        try:
            #wait for Julius to prompt, then grab the recognised sentence
            child.expect('please speak')
            child.expect('sentence1: .*\r\n')
            process_sentence(child.after)
        except KeyboardInterrupt:
            child.close()
            break

After getting the USB connection sorted, we can send commands to the robot arm. Each joint occupies a two-bit field (half a nibble) in the command byte: we set it to 1 or 2 for the two directions, then move it into position with a bitwise left shift (shifting left by n bits is the same as multiplying by 2^n). We then use pyusb to send the command:

def movearm(joint, direction):
    if (direction == "UP") or (direction == "CLOSE"):
        cval = 1
    elif (direction == "DOWN") or (direction == "OPEN"):
        cval = 2
    if joint == "SHOULDER":
        cbyte = (cval << 6)
    elif joint == "ELBOW":
        cbyte = (cval << 4)
    elif joint == "WRIST":
        cbyte = (cval << 2)
    elif joint == "GRIP":
        cbyte = cval
    command = (cbyte, 0, 0)
    #control transfer parameters taken from notbrainsurgery's post:
    dev.ctrl_transfer(0x40, 6, 0x100, 0, command, 1000)
    #let the motor run briefly (needs 'import time' at the top of the script)
    time.sleep(1)
    #stop the arm
    dev.ctrl_transfer(0x40, 6, 0x100, 0, (0, 0, 0), 1000)
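To make the bit layout concrete, here is a small sketch of the first command byte, using the same joint-to-shift mapping as the function above (the helper name is my own):

```python
# each joint occupies a two-bit field in the first command byte
SHIFTS = {'SHOULDER': 6, 'ELBOW': 4, 'WRIST': 2, 'GRIP': 0}

def command_byte(joint, direction):
    # 1 for UP/CLOSE, 2 for DOWN/OPEN, shifted into the joint's field
    cval = 1 if direction in ('UP', 'CLOSE') else 2
    return cval << SHIFTS[joint]

print(hex(command_byte('SHOULDER', 'UP')))   # 1 << 6 -> 0x40
print(hex(command_byte('ELBOW', 'DOWN')))    # 2 << 4 -> 0x20
print(hex(command_byte('GRIP', 'CLOSE')))    # 1 << 0 -> 0x1
```

Because each joint's field is independent, the bytes could in principle be OR'd together to drive several joints at once.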

And a similar function for the base:

def rotatebase(direction):
    if (direction == "RIGHT"):
        cval = 1
    elif (direction == "LEFT"):
        cval = 2
    command = (0, cval, 0)
    dev.ctrl_transfer(0x40, 6, 0x100, 0, command, 1000)
    #let the motor run briefly before stopping
    time.sleep(1)
    #stop the arm
    dev.ctrl_transfer(0x40, 6, 0x100, 0, (0, 0, 0), 1000)

And finally, for the LED:

def ctrl_light(light_comm):
    if (light_comm == "ON"):
        cval = 1
    elif (light_comm == "OFF"):
        cval = 0
    command = (0, 0, cval )
    dev.ctrl_transfer(0x40, 6, 0x100, 0, command, 1000)

Finishing up

Now save this Python script in the 'voxforge/auto' folder, and run:

sudo python

The script needs to be run with root privileges because we need raw hardware access to the USB subsystem. And as I said earlier, all of the requirements are also available on Windows and OS X, but some tweaks will be needed (for example, pexpect is not available on Windows, so an alternative method should be used there). Since everything is freely available, there are plenty of future possibilities for modification and tweaking - so happy hacking!
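On Linux, an alternative to running as root is a udev rule that grants ordinary users write access to the arm. A sketch, assuming a typical Debian-style system (the vendor/product IDs come from the usb.core.find() call above; the file name is arbitrary):

```
# /etc/udev/rules.d/99-robot-arm.rules
SUBSYSTEM=="usb", ATTRS{idVendor}=="1267", ATTRS{idProduct}=="0000", MODE="0666"
```

After adding the rule, run 'sudo udevadm control --reload-rules' and replug the arm; the script should then work without sudo.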

Finally, you can download the python script from: Download

What Next?

Well, as you can see in the video, it's not very convenient controlling the arm with repeated commands. So I've come up with a wish list that I'll try to implement progressively:

  1. more autonomy
    • some kind of sensor for proprioception and/or vision (Kinect?) (OpenTLD?)
    • inverse kinematics (ikfast from OpenRave)
  2. faster movement, more lifting weight (motor/gearbox improvements)
  3. operation from wall socket AC adapter (just some soldering which I'll get around to...someday)
  4. Swordfighting. ;-)