A Robot on Every Desk

Last week, I led a robot workshop in New York City. A huge thank you to Sauce Labs for sponsoring and Crowdtap for hosting the event at their office. Thank you specifically to Ashley Wilson at Sauce Labs and Arjun Anand at Crowdtap for setting everything up. In particular, I owe a massive thank you to my lovely wife, Kelly, for taking the day off work the day before the workshop to help me put all the kits together!

So, the workshop was glorious. Crazy dancing robots everywhere.

Two days later, Tapsterbot got TechCrunched. Several of the workshop attendees work at R/GA and are now using Tapsterbot to test apps for Nike. Nike! Robots get you paid!

Last year, I wrote a piece for Wired explaining my belief that as mobile apps get more sophisticated, robots are going to be an important part of the future of test automation:
A robot tester is the best of both worlds – it’s a real-world test on a real device – and because it’s a robot, it also performs the task precisely and quickly.
Bill Gates used to talk about “A computer on every desk”. My dream for the Tapsterbot project is “A robot on every desk.” I’m excited to see R/GA is starting to make that dream a reality!

There are many improvements coming to Tapsterbot in the future, for example, adding a second arm so it can “double-tap”, pinch, and zoom, and integrating more tightly with Appium to automate mobile app testing end-to-end. But along the way, the robot is learning lots of new tricks.

Later in the day, Jonathan Lipps (also from Sauce Labs) gave an excellent overview of Appium and of how Tapsterbot and Appium will work more closely together in the future.

I’m looking forward to doing more workshops. I’m spending more and more of my time now on robots, and figuring out how to mass produce more Bitbeam.

If you’ll be in the San Francisco Bay Area on May 18 & 19, come check out Tapsterbot and Bitbeam at Maker Faire Bay Area. (I’ll be hanging out at the Gridbeam booth again.)

Until then, grab the Tapsterbot source code (and all the CNC-able/3D-printable Bitbeam parts) and start hacking the future of test automation!

9 Responses to A Robot on Every Desk

  1. Lorin says:

    Can the Tapsterbot use its servos to capture motion? It would be a much more natural way to create/recreate movements.

  2. hugs says:

    Lorin, you can’t currently capture motion with the servos. However, this is what Jay Graves (twitter: @jaywgraves) wrote me about the subject a while back:

    “””
    You can mod a hobby servo to output its position.
    google for ‘servo with feedback’ and/or ‘servo position recording’

    Rewrite the program to have a ‘training’ mode where you position the actuator by hand and then hit a button to remember that position by reading the angle of the 3 servos. Switch back to ‘play’ mode & trigger the position you saved. That will work for the ‘clicks’ in Tic-Tac-Toe.

    You could do something similar by dragging the actuator along a path (Angry Birds example), but you would have to figure out how often to sample the servo angles.
    It’s possible that ‘start’ and ‘end’ points could be defined, in the hope that the servo angles for the start x/y/z coordinates can be linearly interpolated to those for the end x/y/z, but I don’t know if the math will work out.
    “””
    (Source: Comment section of this video: “Building a Robot that Can Play Angry Birds on a Smartphone” PyCon 2012 http://youtu.be/NkUTLRZBWLM )
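    Jay’s training/playback idea can be sketched in a few lines of Python. This is a hypothetical sketch, not part of the Tapsterbot codebase: the angle triples below are made-up example values, and in a real build you would read them from feedback-modded servos and send them back out over your serial/servo interface.

```python
# Sketch of a "training" / "play" mode for recorded servo positions.
# In a real setup, the (a1, a2, a3) triples would come from reading the
# feedback-modded servos by hand-positioning the actuator; here they are
# hard-coded example values.

def interpolate(start, end, steps):
    """Linearly interpolate between two (a1, a2, a3) servo-angle triples."""
    path = []
    for i in range(steps + 1):
        t = i / steps
        path.append(tuple(s + (e - s) * t for s, e in zip(start, end)))
    return path

class Trainer:
    def __init__(self):
        self.positions = {}  # name -> (angle1, angle2, angle3)

    def record(self, name, angles):
        """'Training' mode: store the current servo angles under a name."""
        self.positions[name] = angles

    def play(self, name):
        """'Play' mode: return the stored angles to send back to the servos."""
        return self.positions[name]

# Example: record two corners of a Tic-Tac-Toe board, then sweep between them.
trainer = Trainer()
trainer.record("top-left", (30.0, 45.0, 60.0))
trainer.record("bottom-right", (90.0, 75.0, 40.0))
path = interpolate(trainer.play("top-left"), trainer.play("bottom-right"), 4)
```

    As Jay notes, the open question is whether linear interpolation in servo-angle space tracks a straight line in x/y/z for a delta robot; for short moves it may be close enough, but long drags would need finer sampling.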

  3. Roman says:

    What about detailed assembly instructions for the Tapsterbot?

  4. schmurgon says:

    Has anyone attempted to hook up a tapsterbot to a PIXY (http://www.kickstarter.com/projects/254449872/pixy-cmucam5-a-fast-easy-to-use-vision-sensor) or something similar? I’m printing my tapsterbot next weekend – can’t wait!

    • hugs says:

      @schmurgon, I haven’t heard of anyone integrating with a PIXY yet, exactly. However, I am experimenting with a Raspberry Pi, the Raspberry Pi camera module, and OpenCV for a similar effect. The problem with the RPi camera is that it’s fixed focus (actually, glued!) for focusing on things far away. I’m specifically working on “corrective lenses” and a 3D-printable enclosure for the RPi camera so it can focus clearly on a phone from 10-20 cm away. (That’ll be a fun post if/when I get it done!) Also, I do have a working computer vision demo using a frame grabber (VGA2USB from Epiphan), but it’s expensive and not open source. I’m really looking forward to cheap(ish) web camera + OpenCV support coming to Tapsterbot. If you get anything working, please let me know!
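      For the corrective lens, a back-of-the-envelope calculation tells you what strength to look for. Assuming the thin-lens approximation and a camera glued at infinity focus, an add-on close-up lens of power P diopters shifts the focus to objects at roughly 1/P metres (this is a sketch of the general optics rule of thumb, not a measured spec for the RPi camera):

```python
# Back-of-the-envelope close-up lens calculation (thin-lens approximation).
# A camera fixed at infinity focus plus an add-on lens of power P diopters
# will focus on objects at roughly 1/P metres.

def diopters_for_distance(distance_m):
    """Power (diopters) of the close-up lens needed to focus at distance_m."""
    return 1.0 / distance_m

for d in (0.10, 0.15, 0.20):
    print(f"{d * 100:.0f} cm -> +{diopters_for_distance(d):.1f} diopter lens")
```

      So the 10-20 cm working range above corresponds to roughly a +5 to +10 diopter lens, the sort of range covered by cheap close-up filter sets or strong reading-glasses lenses.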

      Lastly, have fun with printing your Tapsterbot. One small pro-tip… Be sure to slow down your printer when making the Bitbeam beams. I’ve found that a slower print speed leads to a big improvement in quality. Oh, and please post pics of your build! 🙂
