me-bot: Gadgeteer Avatar Robot

Guest post by Paul C Mineau.

I was inspired by a TED talk given by Cynthia Breazeal, filmed in December 2010, entitled The Rise of Personal Robots. She describes the ‘me-bot’ at about 5 minutes in, and later describes the ‘Grandma-bot’. I’m working on building a me-bot using Gadgeteer, and this is the first in a series of articles on the project. The first basic version should be completed in about 4 weeks, which will be around July 1st, 2012. I hope to get others to follow along, to collaborate, to give feedback, and to try different scenarios. After all, this project is pretty cool.

Version one of this project is being designed for grandparents and parents who want to play with their grandchildren and children remotely. Children don’t get excited about talking with Grandma on the phone, but they will get excited about spending time with Grandma-bot.

I spent a couple of hours on the project, and so far I have a basic robot with an iPhone ‘head’ that takes commands over XBee to look left, center, or right. I have a single arm that moves down, out, and up. The body of the robot was built using something similar to an Erector Set from the German company eitech; the kit I have cost 24 bucks and is called “Construction”.
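The command handling on the robot side is simple. Here’s a minimal sketch of the idea, not the actual project code (which isn’t checked in yet): it assumes the XBee’s DIN/DOUT lines are wired to a UART exposed as “COM1” at 9600 baud, and a hypothetical one-letter command protocol I made up for illustration.

    // Minimal sketch, NOT the actual project code. Assumes the XBee is
    // wired to a UART exposed as "COM1" at 9600 baud, and a hypothetical
    // one-letter protocol: L/C/R for the head, U/O/D for the arm.
    using System.IO.Ports;
    using System.Threading;

    public class CommandListener
    {
        public static void Run()
        {
            SerialPort xbee = new SerialPort("COM1", 9600);
            xbee.Open();

            byte[] buffer = new byte[1];
            while (true)
            {
                // Read one command byte at a time from the XBee link.
                if (xbee.Read(buffer, 0, 1) == 1)
                {
                    switch ((char)buffer[0])
                    {
                        case 'L': /* head servo: look left   */ break;
                        case 'C': /* head servo: look center */ break;
                        case 'R': /* head servo: look right  */ break;
                        case 'U': /* arm servo: up            */ break;
                        case 'O': /* arm servo: out           */ break;
                        case 'D': /* arm servo: down          */ break;
                    }
                }
                Thread.Sleep(10);
            }
        }
    }

The servo moves stubbed out in the switch are sketched further down, where I describe the Extender module wiring.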

In the photo here you can see the remote user on the iPhone. We’re using Skype for the video call and the Interactive Console from the XBeeClient project to issue the commands. We also use Kinect to listen for the voice commands look left, center, and right, and arm up, down, and out. Future versions will use skeletal tracking and will be able to drive around.
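For the voice commands, the Kinect SDK’s speech support is a natural fit on the PC side. Below is a hedged sketch, not the project’s actual code, of how Kinect speech recognition could feed the XBee link: it assumes the Kinect SDK v1 with the Microsoft.Speech runtime, an XBee USB adapter showing up as “COM4”, and the same hypothetical one-letter protocol as above.

    // Hedged sketch, not the project's actual code. Assumes Kinect SDK v1,
    // the Microsoft.Speech runtime, and an XBee USB adapter on "COM4".
    // The one-letter command values are a hypothetical protocol.
    using System;
    using System.IO.Ports;
    using Microsoft.Kinect;
    using Microsoft.Speech.AudioFormat;
    using Microsoft.Speech.Recognition;

    class VoiceControl
    {
        static void Main()
        {
            KinectSensor sensor = KinectSensor.KinectSensors[0];
            sensor.Start();

            SerialPort xbee = new SerialPort("COM4", 9600);
            xbee.Open();

            RecognizerInfo ri = GetKinectRecognizer();
            var sre = new SpeechRecognitionEngine(ri.Id);

            // Map each spoken phrase to the command byte sent over XBee.
            var commands = new Choices();
            commands.Add(new SemanticResultValue("look left", "L"));
            commands.Add(new SemanticResultValue("look center", "C"));
            commands.Add(new SemanticResultValue("look right", "R"));
            commands.Add(new SemanticResultValue("arm up", "U"));
            commands.Add(new SemanticResultValue("arm out", "O"));
            commands.Add(new SemanticResultValue("arm down", "D"));

            var gb = new GrammarBuilder { Culture = ri.Culture };
            gb.Append(commands);
            sre.LoadGrammar(new Grammar(gb));

            sre.SpeechRecognized += (s, e) =>
            {
                // Ignore low-confidence recognitions.
                if (e.Result.Confidence > 0.7)
                    xbee.Write((string)e.Result.Semantics.Value);
            };

            // Feed the Kinect microphone array into the recognizer.
            sre.SetInputToAudioStream(sensor.AudioSource.Start(),
                new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));
            sre.RecognizeAsync(RecognizeMode.Multiple);
            Console.ReadLine();
        }

        // Find the speech recognizer that ships with the Kinect SDK.
        static RecognizerInfo GetKinectRecognizer()
        {
            foreach (RecognizerInfo ri in SpeechRecognitionEngine.InstalledRecognizers())
            {
                string value;
                ri.AdditionalInfo.TryGetValue("Kinect", out value);
                if ("True".Equals(value, StringComparison.OrdinalIgnoreCase))
                    return ri;
            }
            return null;
        }
    }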

Currently, if the arm is up or out, the head will ram into the arm as it looks all the way right, so there’s some design work left. Quite a bit.

The first version of this project will be simple, and the goals are as follows.

Project Goals:

  • Remotely (over the internet) control a small robot
  • Remote video and audio from the bot. This will likely be done using an iPhone or Android phone. My first version uses an iPhone, as it’s so easy.
  • The first version can move its arms around, but can’t pick up or move objects reliably (but will be able to in vNext).
  • The me-bot will have a video screen showing the remote user. This will be done using either an iPhone or Android phone with a front-facing camera, or perhaps entirely using the touch screen for Gadgeteer from GHI Electronics.
  • Use Kinect to move the arms around
  • Use voice commands to interact with various devices using XBee, for example, turning lights on/off
  • In vNext, as the remote user looks around the environment, she’ll be able to see extra metadata about objects that are wired up with XBee; for example, looking at a plant you can see the moisture level.
  • Control toys using the Infrared module. Hex bugs are a good starting point. Here’s a video showing Arduino controlling Hex bugs.
  • The me-bot can roll around in vNext and will be wireless, but for this version it will be stationary and tethered with an Ethernet cable.

What I have so far is simply two servo motors hooked up to a GHI Extender module. I just followed Mike Dodaro’s post on controlling a camera using servos. I used Lego pieces and screws to connect the two servos at first, but they turned out to be way too flimsy. Then I went out and bought some copper and a drill press, but accidentally found out that eitech kits work great, and the holes line up perfectly with the mounting holes on my servos! The servos I have are the Parallax standard servo and a Futaba S3004. In future articles I’ll give more details on the servos and how everything is wired up.
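In the meantime, here’s roughly what the servo side looks like in Gadgeteer, as a minimal sketch rather than my actual code: it assumes the Extender module sits on a PWM-capable “P” socket, and the pin choices and pulse widths are generic values you’d tune for your particular servos.

    // Minimal sketch of servo control via the Extender module, assuming a
    // PWM-capable "P" socket. Pin choices and pulse widths are assumptions;
    // tune the pulses for your particular servos.
    using GT = Gadgeteer;
    using GTI = Gadgeteer.Interfaces;

    public partial class Program
    {
        GTI.PWMOutput headServo;
        GTI.PWMOutput armServo;

        void ProgramStarted()
        {
            // Pins 7-9 of a "P" socket carry PWM in Gadgeteer.
            headServo = extender.SetupPWMOutput(GT.Socket.Pin.Seven);
            armServo  = extender.SetupPWMOutput(GT.Socket.Pin.Eight);
            LookCenter();
        }

        // Standard analog servos expect a pulse every 20 ms (20,000,000 ns);
        // roughly 1.0 ms = one end of travel, 1.5 ms = center, 2.0 ms = the
        // other end.
        void SetServo(GTI.PWMOutput servo, uint pulseNs)
        {
            servo.SetPulse(20 * 1000 * 1000, pulseNs);
        }

        void LookLeft()   { SetServo(headServo, 1000 * 1000); }
        void LookCenter() { SetServo(headServo, 1500 * 1000); }
        void LookRight()  { SetServo(headServo, 2000 * 1000); }
    }

The arm servo works the same way with its own three pulse widths for down, out, and up.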

The code isn’t even checked in yet but will be available soon.

It’s hard to believe that it’s 2012 and we don’t have the me-bot yet! I should be able to walk / roll around my house when I’m away, look around, play with the cat, and interact with my devices. If I get stuck on a project and need someone’s help, I should be able to tell Kinect to call Mike Dodaro and, if he’s available, drive a me-bot around his desktop and check out how his breadboard is wired up. 2012 isn’t over yet! I hope you were inspired by the TED talk video, and inspired to build a me-bot of your very own.

  1. #1 by wintonlin on June 4, 2012 - 5:58 PM

    Thanks Paul:
    I will keep learning from your posts.
    Will you share your code in the future?
    I don’t really understand the roles of the iPhone and Android phone.
    Do you use the iPhone to connect to and control the XBee, or is the iPhone used to provide the video to the remote user?
    Who will control the robot, the remote user or the client user?
    Could you give a figure showing how all the components work together?
    Thanks.

    I am looking forward to your next post.

    • #2 by paulmineau on June 4, 2012 - 7:40 PM

      Yes, the code will be on CodePlex at GadgeteerCookbook.Codeplex.com soon. I want to break it down in the style of the Arduino Cookbook, which doesn’t have big end-to-end projects but instead shows how to do the various components. But I might just publish the end-to-end me-bot project there.

      The iPhone is used for video/audio: to show the remote user’s face, and to let the remote user see from the robot’s view. The remote user controls the robot; right now this is over XBee (not very remote!) but will be over the internet soon. In about 2-3 weeks I’ll add GHI’s Wi-Fi module. I will also attempt to get rid of the phone and stream the video/audio to and from the Spider. A downside to the phone is that you need someone to answer the call and place it in the robot!
