iGoBot - a Go game playing robot ... using a Raspberry Pi, OpenCV and GNU Go

 

The primary facts of iGoBot

Design the base and the mechanics

Shortening of the Ikea "Lack" table legs:

Mounting the X and Y axes at the base:

Training the image recognition

iGoBot uses OpenCV and Haar cascades to detect the Go stones placed on the board.

I had to take several hundred individual photos of black and white Go stones for reference.

Some of the black stone training images:

Some of the white stone training images:

After training, the system reliably detects black and white Go stones.

Here is an example of white stone detection:

The recognition on the Raspberry Pi in Python:

A first test for the "stones on board" recognition and translation into coordinates:
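The translation step can be sketched like this; the board size, pixel origin and grid spacing below are illustrative assumptions, not iGoBot's real calibration values:

```python
# Sketch: map detected stone centres (pixels) to Go board intersections.
# All constants here are assumed example calibration values.
BOARD_SIZE = 13               # lines per side (assumption)
ORIGIN_X, ORIGIN_Y = 40, 30   # pixel position of the top-left intersection (assumption)
SPACING = 24                  # pixel distance between neighbouring lines (assumption)

def pixel_to_board(px, py):
    """Round a detected stone centre to the nearest intersection (col, row)."""
    col = round((px - ORIGIN_X) / SPACING)
    row = round((py - ORIGIN_Y) / SPACING)
    if 0 <= col < BOARD_SIZE and 0 <= row < BOARD_SIZE:
        return col, row
    return None  # detection landed outside the grid
```

Rounding to the nearest intersection makes the mapping robust against stones that are not placed perfectly centred.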

The stone dispenser

The stone dispenser is driven by a servo. The two primary parts are 3D printed. The CAD files can be found here.
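A hedged sketch of driving the dispenser servo via the Raspberry Pi's PWM; the pin number, angles and pulse widths below are assumptions (a typical hobby servo expects 1.0-2.0 ms pulses at 50 Hz):

```python
# Sketch: convert a servo angle to a PWM duty cycle.
# Pulse-width range (1.0-2.0 ms) is the usual hobby-servo convention.

def angle_to_duty_cycle(angle, freq_hz=50):
    """Convert a servo angle (0-180 degrees) to a PWM duty cycle in percent."""
    pulse_ms = 1.0 + (angle / 180.0) * 1.0   # 1.0 ms .. 2.0 ms
    period_ms = 1000.0 / freq_hz             # 20 ms at 50 Hz
    return pulse_ms / period_ms * 100.0

# On the Pi this would be used roughly like (pin 18 is an assumption):
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(18, GPIO.OUT)
#   pwm = GPIO.PWM(18, 50)
#   pwm.start(angle_to_duty_cycle(0))              # dispenser closed
#   pwm.ChangeDutyCycle(angle_to_duty_cycle(90))   # release one stone
```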

The electronics

The first arrangement of the electronic components

A first, very wild test of the wiring

The illuminated button for interaction with the player

The way iGoBot plays:

 

 

Roobert V2 - first impressions

Here are some first impressions of the second version of Roobert - a home robot project.

You can download all CAD files and source code on GitHub.

Head and arms of Roobert V2 assembled:

The new face frame with more sensor spaces:

Backside of the new head:

Side of the new head:

The new arm with 5 servo axes (instead of 3 on Roobert V1):

Building a home robot: Part 7 - the front RGB LED display

(see all parts of "building a home robot")

A Raspberry Pi touchscreen is used to show Roobert's face, so it can't also show status information such as the battery state or the "feelings" of its Python finite state machine.

Fortunately, the body front was still missing, so it seemed a good place to mount an additional optical output.

I tried several small LCD and OLED displays, but none of them convinced me.

In the end I used an 8x8 Neopixel array, a 24-Neopixel ring and a big button lit by a single Neopixel.

At first the 8x8 array was too bright to read as a single image. After attaching a 3D-printed diffuser cover, it showed nice square pixels.

The Python code can read a GIF file and display it on the 8x8 pixel display. When idle, Roobert shows a beating-heart GIF.
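The core of driving such a matrix is mapping frame pixels to strip indices. As a sketch, assuming the common zig-zag (serpentine) wiring of 8x8 Neopixel matrices, where every other row runs right-to-left:

```python
# Sketch: map an 8x8 image frame onto a serpentine-wired Neopixel strip.
# The zig-zag layout is an assumption about the matrix wiring.
WIDTH, HEIGHT = 8, 8

def xy_to_index(x, y):
    """Return the strip index of pixel (x, y) on a serpentine 8x8 matrix."""
    if y % 2 == 0:
        return y * WIDTH + x             # even rows run left-to-right
    return y * WIDTH + (WIDTH - 1 - x)   # odd rows run right-to-left

def frame_to_strip(frame):
    """frame: 2D list of (r, g, b) tuples; returns a flat list for the strip."""
    strip = [(0, 0, 0)] * (WIDTH * HEIGHT)
    for y in range(HEIGHT):
        for x in range(WIDTH):
            strip[xy_to_index(x, y)] = frame[y][x]
    return strip
```

GIF frames could be loaded with Pillow (`Image.open(...).resize((8, 8))`) and fed through `frame_to_strip` one frame at a time to animate the beating heart.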

The outer Neopixel ring shows the battery state while driving around, and the button's Neopixel lights up when it is a good moment to press it.

(see all parts of "building a home robot")

Building a home robot: Part 6 - the 3d room sensor

(see all parts of "building a home robot")

The built-in Roomba distance sensors can't prevent damage while driving around, because Roobert is larger than the original vacuum cleaner. My first idea was to use an old Microsoft Kinect sensor, and it worked very well, even from Python.

But the battery drained very quickly on the first drives, so I needed a solution with much lower power consumption.

For this I used an ultrasonic distance sensor and two mini servos.

The servos move the sensor along the x and y axes, like a two-dimensional radar system.

The detection speed is slower than with the Kinect and depends on the chosen resolution: it reaches about 2 FPS with 4x3 measuring points.
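The sweep itself is simple; here is a sketch with the servo and sensor drivers abstracted as callables, since the real pin assignments and driver code are not shown here:

```python
# Sketch of the 2D ultrasonic "radar" sweep. set_pan/set_tilt move the two
# mini servos (0.0 .. 1.0 of their range); measure() reads one distance.
import time

def scan(set_pan, set_tilt, measure, cols=4, rows=3, settle_s=0.0):
    """Sweep the sensor over cols x rows points; return a row-major
    list of distance readings (one depth 'frame')."""
    frame = []
    for row in range(rows):
        set_tilt(row / max(rows - 1, 1))
        for col in range(cols):
            set_pan(col / max(cols - 1, 1))
            time.sleep(settle_s)   # let the servo settle before measuring
            frame.append(measure())
    return frame
```

The frame time grows linearly with the number of points, which is why a 30x20 scan takes so much longer than the 4x3 default.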

Just for fun I tried a resolution of 30x20 points. A frame then takes 10 seconds, but I was impressed how well you can "see" the shapes of obstacles.

(see all parts of "building a home robot")