Ball Balancing

I know it’s been a while since my last post, but 2 kids and a new job take their toll πŸ™‚

Most importantly, for these past few months, I’ve been thinking of a way to say “thank you” to my former colleagues who somehow managed to offer me the absolute best leaving present: a Meccano kit and a prepaid Maplin card (the UK equivalent of the now defunct RadioShack, for my U.S. readers)!

Ex-fellow markiteers, this post is for you: thank you for your present and, more importantly, for having been such nice colleagues for almost 9 years!

This is, I hope, the first phase in “upgrading” one of my son’s toys, a wooden maze, so that it can solve itself.

It deals with 2 main steps:

  1. recognise and locate the steel ball, using OpenCV running from a Python script on a Raspberry Pi
  2. control the glass plate to keep the ball in the same position, using an Arduino and 2 RC servos

And here’s the test rig. The “plan” is to have it stand alone, but for now the trusty IKEA shelf does a pretty good job :)

Whole Installation


The mechanical part

was satisfyingly simple to put together; you’ll recognise an assortment of Lego Technic and Meccano bits:

Servo Mount Pieces


These, combined with the fact that the original knobs of the toy were exactly the same size as the servo’s white nylon plate, give us nice and sturdy fixings for the actuators:

Mounted Servo


The electronics

are straightforward.

A Raspberry Pi with a NoIR camera (that’s the one I had lying around, a normal one would have probably been better), running OpenCV through a Python script, (just barely) manages to recognise the steel ball and its position, and then sends this info to the Arduino using the SPI bus.

I initially wanted to use I2C, as I’m more familiar with it, or maybe the TTL serial ports. However, for once, I did some planning beforehand, as I knew speed was vital if I was to achieve smooth balancing…

I was planning on sending ~4 bytes of data to the Arduino, basically the current position (X, Y) of the ball and the target one, so the timings would look like this (I’m sure the below isn’t exact, as it doesn’t take into account all the handshaking, how many clock cycles are required for each bit, etc., but it’s still a good indication):

  • TTL Serial runs at 115.2Kbps, and with start/stop bits that’s 10 bits per byte, so ~0.35ms
  • I2C fast mode runs at 400KHz, at ~9 bits per byte that’s ~0.09ms
  • SPI on an Arduino can apparently run up to half the clock speed, so in my case 4MHz, i.e. ~0.008ms
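As a sanity check, ignoring handshaking and addressing, the raw wire time is simply bits divided by bitrate; here’s a quick back-of-the-envelope sketch (the helper name is mine):

```python
# Rough wire-time estimate for one message, ignoring protocol overhead
def transfer_time_us(num_bytes, bits_per_byte, bitrate_hz):
    return num_bytes * bits_per_byte * 1e6 / bitrate_hz

# UART: 8 data bits + start + stop bits = 10 bits on the wire per byte
print(transfer_time_us(4, 10, 115200))   # ~347 us
# I2C: 8 data bits + ACK = 9 bits per byte (addressing not counted)
print(transfer_time_us(4, 9, 400000))    # 90 us
# SPI: a plain 8 bits per byte, no per-byte overhead
print(transfer_time_us(4, 8, 4000000))   # 8 us
```

Whatever the exact overheads, SPI comes out fastest by a wide margin.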

Given that I measured the Python loop (doing image recognition and all the other stuff) at 60ms, I wanted the communication to be negligible, so went with SPI.

We’ll see that actually, far from 4MHz, I can only have it run at up to 500KHz (currently using 200KHz for reliability, and I still get 1 corrupted transmission every ~100).

It’s true that I send both a marker for the end of the message and a checksum, so the protocol is slightly more complex than just piping raw data through the line.
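The framing itself is easy to sketch on its own: the payload bytes, then a mod-256 checksum (nudged down if it happens to collide with the marker), then the end-of-message marker. This mirrors the send() logic in the MySpi class further down:

```python
END_MSG = 254  # end-of-message marker; payload bytes must stay below this value

def frame(values):
    # mod-256 checksum, kept distinct from the end marker
    checksum = sum(values) % 256
    if checksum == END_MSG:
        checksum = END_MSG - 1
    return values + [checksum, END_MSG]

print(frame([120, 85]))  # [120, 85, 205, 254]
```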

# MAIN Python script to be executed

import io
import picamera
import cv
import cv2
import numpy as np
import time
import myspi

# Create a window
cv2.namedWindow("Camera")

def getBallContour(imgGrey):
	ret, imWhiteThres = cv2.threshold(imgGrey, 220, 255, cv2.THRESH_BINARY)
	contours, hierarchy = cv2.findContours(imWhiteThres, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
	return contours[0] if (len(contours) == 1) else None


mySpi = myspi.MySpi()

# Create an in-memory stream
stream = io.BytesIO()
with picamera.PiCamera() as camera:
	camera.resolution = (240, 240)
	camera.framerate = 30
	
	# capture_continuous() is faster because it does not re-initialize an encoder prior to each capture
	# (difference is small for raw formats like 'rgb' that don't have an encoder, but big for 'jpeg' for example)
	# use_video_port=True is essential for speed
	# do the RESIZING here, rather than using OpenCV, as this will use RPi's GPU and hence much faster
	# 'png' format blocks the camera
	for unused in camera.capture_continuous(stream, 'jpeg', use_video_port=True):
		stream.seek(0)
		# Construct a numpy array from the stream
		data = np.fromstring(stream.getvalue(), dtype=np.uint8)
		# Reset the stream so the next capture overwrites this frame
		stream.seek(0)
		stream.truncate()
		
		# "Decode" the image from the array and convert to grayscale
		# CV2 works with arrays directly, no need to transform it into Mat
		image = cv2.imdecode(data, cv2.CV_LOAD_IMAGE_GRAYSCALE)
		
		ballContour = getBallContour(image)

		if ballContour is not None:
			cv2.drawContours(image, [ballContour], -1, (0, 0, 255), 4)
			x, y, w, h = cv2.boundingRect(ballContour)
			print("Sending " + str(x) + " / " + str(y))
			mySpi.send([x, y])
		


		# UNCOMMENT TO SEE THE IMAGES cv2.imshow("Camera", image)
		# HighGUI crap... give it time to update the display
		# UNCOMMENT TO SEE THE IMAGES cv2.waitKey(10)

	camera.close()
cv2.destroyAllWindows()

I’ve factored out all the error checking and slightly dodgy logic that tries to get an answer from the slave in the same “transaction” into a separate class:

# STAND ALONE UTILITY CLASS

import spidev

class MySpi:
	END_MSG = 254

	def __init__(self):
		self.spi = spidev.SpiDev()
		self.spi.open(0, 0)
		self.spi.max_speed_hz = 200000
		self.spi.cshigh = False
		self.retries = 0

	def send(self, values):
		initialValues = list(values)
		# the checksum has to be a byte, but also different from END_MSG
		checksum = sum(initialValues) % 256
		if(checksum == MySpi.END_MSG):
			checksum = MySpi.END_MSG - 1
		values.append(checksum)
		values.append(MySpi.END_MSG)
		self.spi.xfer2(values)

		# checksum verification.
		# Need to make another transaction to get the last value from the previous one
		# Need to send at least 1 value
		remoteChecksum = self.spi.xfer2([MySpi.END_MSG])[0]

		# if transmission problems, send again
		if(checksum != remoteChecksum):
			if(self.retries < 3):
				self.retries += 1
				print("+++++Sending again........." + str(checksum) + " VS " + str(remoteChecksum))
				self.send(initialValues)
			else:
				self.retries = 0
				print("Have already tried 3 times. STOP sending.")
		else:
			self.retries = 0

And funnily enough, for now the Arduino code is almost more complex than the Python one, though this is bound to change as I need to add more complex logic on the Raspberry Pi side to deal with following the path.

Another problem with SPI was that there’s no “official” library for using the Arduino as a slave.
So I’ve taken Nick Gammon’s interrupt-based example, added some error checking to it, and after many long hours managed to make it work reliably.

I would really have done much better to stick with I2C, as I’ve lost almost an entire day on all the SPI issues…

Beyond that, using servos was straightforward, and for the first time I’ve used a PID library rather than hand-crafting my own code; to be honest, I was impressed at how much time it saved me. Thank you Brett!

#include <SPI.h>
#include <PID_v1.h>
#include <Servo.h>


#define BUFF_SIZE 100
volatile byte buff[BUFF_SIZE];
volatile byte pos = 0;
volatile boolean process = false;

#define SERVO_MAX_DELTA 400
#define SERVO_MID_POS 1500
Servo servoX;
Servo servoY;

#define PID_OUTPUT_LIMIT 1000
double setPointX, inputX, outputX;
PID pidX(&inputX, &outputX, &setPointX, 7, 0, 0.1, REVERSE);
double setPointY, inputY, outputY;
PID pidY(&inputY, &outputY, &setPointY, 7, 0, 0.1, DIRECT);



void setup() {
  // For DEBUG only
  Serial.begin(115200);
  
  // SPI 
  SPI.setClockDivider(SPI_CLOCK_DIV2);
  
  // Arduino in slave mode -> slave out
  pinMode(MISO, OUTPUT);

  // turn SPI on in SLAVE mode
  SPCR |= _BV(SPE);
  
  // turn interrupts on, see SPI_STC_vect
  SPI.attachInterrupt(); //SPCR |= _BV(SPIE);
  
  // SERVOs
  servoX.attach(2);
  servoY.attach(3);
  servoX.writeMicroseconds(SERVO_MID_POS);
  servoY.writeMicroseconds(SERVO_MID_POS);
  
  
  // PIDs
  // wait for some data and use the 1st valid position as the setPoints
  while(!process);
  setPointX = buff[0];
  setPointY = buff[1];
  inputX = setPointX;
  inputY = setPointY;
  printInput();
  
  // Configure
  pidX.SetOutputLimits(-PID_OUTPUT_LIMIT, PID_OUTPUT_LIMIT);
  pidY.SetOutputLimits(-PID_OUTPUT_LIMIT, PID_OUTPUT_LIMIT);
  // 1ms. Low enough so it actually evaluates when we call Compute()
  pidX.SetSampleTime(1);
  pidY.SetSampleTime(1);
  // Start PIDs
  pidX.SetMode(AUTOMATIC);
  pidY.SetMode(AUTOMATIC);
}


void loop() {
  if(process) {
    process = false;
    
    // 1. get new input data from SPI
    inputX = buff[0];
    inputY = buff[1];
    printInput();
    
    // 2. update the 2 PIDs
    pidX.Compute();
    pidY.Compute();
    printOutput();
    
    // 3. use the calculated outputs to drive the servos
    servoX.writeMicroseconds(SERVO_MID_POS + map(outputX, -PID_OUTPUT_LIMIT, PID_OUTPUT_LIMIT, -SERVO_MAX_DELTA, SERVO_MAX_DELTA));
    servoY.writeMicroseconds(SERVO_MID_POS + map(outputY, -PID_OUTPUT_LIMIT, PID_OUTPUT_LIMIT, -SERVO_MAX_DELTA, SERVO_MAX_DELTA));
  }
}




// SPI interrupt routine see SPI.attachInterrupt()
ISR(SPI_STC_vect) {
  byte b = SPDR;  // get byte from SPI Data Register
  buff[pos++] = b;
    
  // 254 means end of the command. The numbers in the command should never take this value
  if(b == 254) {
    if(pos > 2) {
      byte receivedChecksum = buff[pos - 2];
      byte checksum = 0;
      for(byte i = 0; i < pos - 2; i++) {
        checksum += buff[i];
      }
      // the checksum has to be a byte, but also different from 254
      if(checksum == 254) {
        checksum = 253;
      }
    
      // only "publish" the data if it's correct
      if(checksum == receivedChecksum) {
        process = true;
      }
      
      SPDR = checksum;
    } else {
      SPDR = 251;
    }

    pos = 0;
  }
}


void printInput() {
  Serial.print("Input: "); Serial.print(inputX);Serial.print(" / ");Serial.print(inputY);
  Serial.print(" ## SetPoint: "); Serial.print(setPointX);Serial.print(" / ");Serial.println(setPointY);
}

void printOutput() {
  Serial.print("Output: "); Serial.print(outputX);Serial.print(" / ");Serial.println(outputY);
}
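As a quick check of the servo mapping in loop(), the same integer map() arithmetic can be reproduced in Python (arduino_map here is a hypothetical stand-in for Arduino’s built-in map()):

```python
SERVO_MID_POS = 1500
SERVO_MAX_DELTA = 400
PID_OUTPUT_LIMIT = 1000

def arduino_map(x, in_min, in_max, out_min, out_max):
    # Same formula as Arduino's map(); Python's // matches C division here
    # because (x - in_min) is never negative over this input range
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def pulse(output):
    # PID output in [-1000, 1000] becomes a pulse width around the mid position
    return SERVO_MID_POS + arduino_map(output, -PID_OUTPUT_LIMIT, PID_OUTPUT_LIMIT,
                                       -SERVO_MAX_DELTA, SERVO_MAX_DELTA)

print(pulse(-1000), pulse(0), pulse(1000))  # 1100 1500 1900
```

So the full PID swing maps to 1100–1900µs, safely inside the usual 1000–2000µs servo range.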


Here are a couple of other videos showing some intermediate testing: balancing on 1 axis only, and checking that the servo mounting contraption actually works…


As usual, I hope you enjoyed my ramblings, and stay tuned if you want to see the finished project!

5 Responses to Ball Balancing

  1. Pingback: Ball Balancing – V2 | Robotics / Electronics / Physical Computing

  2. I suppose the final aim was to solve the maze. Did you get it through?
    Were you planning to use machine vision so that it can solve any maze, or just pre-configure the program to solve one specific maze?
    Cheers!

    • trandi says:

      yep… and still is πŸ™‚
      Use simple machine vision to identify the black line which is the trajectory to follow (this works quite well), then the black holes and then the ball itself.
      However, I found it lacked the necessary precision, so went ahead and improved the mechanics, more precisely the bits connecting the servos to the plate.
      Then realised the Arduino standard Servo library was not great and made the servos “jittery”… so went and wrote low level code to use interrupts, see https://trandi.wordpress.com/2015/10/25/arduino-when-the-servo-library-is-not-good-enough/
      Then realised that I actually needed more pins/speed so migrated everything to a Stellaris Launchpad LM4F120 (arm proc at @50MHz or something)
      But then realised that was still not precise/fast enough … currently I think the main problem is detecting the ball itself, it’s actually quite tricky to make it reliable and it depends on the light…

      Anyhow, if only I had more time to spend on it πŸ™‚

  3. Pingback: Arduino, when the Servo library is not good enough… | Robotics / Electronics / Physical Computing

  4. Pingback: Double Servo Tester with Display | Robotics / Electronics / Physical Computing
