23 Jun

Gardenbot 2014

Every year, I start a new Gardenbot project, and it rarely gets any further than a wish list. This year is different. I have a fully-functional robot that is battery-powered and can be controlled remotely via WiFi.

IMAG0838

Getting this far has not been easy, so I’ll write up what I can remember so you can do the same (or so I can do it again next year after I forget!).

The biggest problem was the computer itself. I’m using a Raspberry Pi, but powering it was tricky.

The Pi takes a 5V input, but I couldn’t find any ready-made 5V batteries, and didn’t want to use battery packs as I wanted to easily recharge individual batteries.

In the past, my experience with batteries wired in series was that one battery would discharge fully way before the others, leaving an apparently dead pack. To avoid that, I’m using li-ion batteries scavenged from phones, each with a capacity of at least 2800mAh. I link them in parallel and “boost” the voltage using some regulators.

There are currently two voltage boosters in the system: the first powers the Pi, and the second powers the USB hub. You can’t run the hub’s devices directly off the Pi, as the Pi itself draws about 700mA and there isn’t enough current left on its USB ports to power anything useful. So anything external, such as the WiFi adapter and the camera, has to hang off a powered hub.
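
As a rough back-of-the-envelope check (my numbers, not measurements): a single 2800mAh phone cell at 3.7V holds about 10Wh, and a Pi drawing 700mA at 5V is 3.5W, so one cell on its own would last only around three hours – before you even account for losses in the boost converters, or for the hub, WiFi and motors. Hence several cells in parallel.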

To save space, I stuck the voltage boosters for the USB hub and the Pi inside the Pi case, as you can see in this photo. They’re the rectangular circuits with the large capacitors on them.

IMAG0839

The capacitors are there to help stop fluctuations in power supply as various bits and pieces are turned on. There’s nothing quite as annoying as turning on a motor only to find that you have lost WiFi because of it and now have no way to turn off the motor.

The robot chassis is a hand-built case made from two perspex sides, a wooden base, and a wooden front. I didn’t measure anything – it was all done by trial and error.

The tracks are from Tamiya (example store). The box comes with enough for a larger base, but I didn’t need it all.

The claw at the front doesn’t work perfectly yet. It’s one I bought a few years back, and it has never worked properly for me. I think I either need a stronger servo or need to replace the claw completely.

The servo cable has three wires – ground, power in, and signal. The ground and power wires can be plugged directly into the batteries. The signal wire I hooked to GPIO 1 on the Pi (using this wiring guide for the main GPIO connector), and the servo is then driven by pulse-width modulation (PWM) on that pin.
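
I haven’t settled on the final claw code yet, but as a minimal sketch of the PWM side – this assumes the wiringPi library and its software PWM, and the pin number and pulse values are guesses to be tuned for the actual servo:

#include <stdio.h>
#include <wiringPi.h>
#include <softPwm.h>

#define SERVO_PIN 1  // wiringPi pin 1 (assumed to match "GPIO 1" in the wiring guide)

int main(void) {
  if (wiringPiSetup() == -1) {
    fprintf(stderr, "wiringPiSetup failed\n");
    return 1;
  }

  // softPwm works in 100us steps, so a range of 200 gives a 20ms (50Hz)
  // frame, which is what hobby servos expect.
  softPwmCreate(SERVO_PIN, 0, 200);

  softPwmWrite(SERVO_PIN, 10);  // ~1.0ms pulse: claw one way
  delay(1000);
  softPwmWrite(SERVO_PIN, 20);  // ~2.0ms pulse: claw the other way
  delay(1000);
  softPwmWrite(SERVO_PIN, 15);  // ~1.5ms pulse: roughly centred
  delay(1000);

  softPwmWrite(SERVO_PIN, 0);   // stop sending pulses
  return 0;
}

softPwm runs a background thread, so compile with something like “gcc -o claw claw.c -lwiringPi -lpthread” (the file name is just whatever you saved it as) and run it as root so it can get at the GPIO.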

The motors for the treads are scavenged from the legs of a Robosapien bot I got for Christmas a few years back. These are standard DC motors, probably rated for up to 5V, but I’m running them off 3V and I’m happy with them.

To control the motors, I was initially planning to create my own motor controller using some PNP and NPN transistors, but found a motor controller circuit from an old Cybot that handily does exactly what I need.

IMAG0840
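
I’ll put the actual motor code on GitHub (see below), but the general shape of driving any H-bridge-style board like this one is just two direction pins per motor. A hypothetical sketch, again using wiringPi – the pin numbers and the Cybot board’s actual inputs are assumptions:

#include <wiringPi.h>

// Hypothetical pin assignments -- the real ones depend on how the
// controller board is wired to the Pi's GPIO header.
#define LEFT_FWD 4
#define LEFT_REV 5

int main(void) {
  if (wiringPiSetup() == -1) return 1;

  pinMode(LEFT_FWD, OUTPUT);
  pinMode(LEFT_REV, OUTPUT);

  // Forward: one input high, the other low.
  digitalWrite(LEFT_FWD, HIGH);
  digitalWrite(LEFT_REV, LOW);
  delay(2000);

  // Reverse: swap the inputs.
  digitalWrite(LEFT_FWD, LOW);
  digitalWrite(LEFT_REV, HIGH);
  delay(2000);

  // Stop: both low.
  digitalWrite(LEFT_FWD, LOW);
  digitalWrite(LEFT_REV, LOW);
  return 0;
}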

The camera is a standard web-cam, with the cables shortened.

Turning the machine on is done by simply connecting the little red cable between the battery-side and the “other stuff” side of the breadboard as you can see in the image above.

To charge the batteries, I simply hook a Li-ion charger directly into the left of the board (below). The charging circuit will happily charge multiple Li-ion batteries.

IMAG0841

Hardware-wise, I’m almost happy. I want to replace the claw soon, but apart from that, I’m ready to work on software.

I already have code written for controlling the motors, which I’ll upload to GitHub over the next few days. I’m looking into SLAM now for creating maps via the camera system. I might have to write the solution myself, though, as the code I’ve found so far is written in academicese and I don’t understand it.

Funny, that, as I’m certain I can write the bloody code, but can’t understand the words that the academics use!

28 Sep

networking a raspberry pi through your laptop

I finally got my own Raspberry Pi: a credit-card-sized computer that’s very cheap and low-power.

It didn’t come with any of the niceties that you would expect from another computer: no power supply, no case, no keyboard, no monitor, nothing. Basically, it’s like being given the motherboard of a desktop computer – the rest is up to you.

So the first thing was to install the operating system on it. This was easy: just buy an SD card, download the image of the OS that you want, and copy it onto the card.

I already had a micro-SD card from an old Bada phone, so I just stuck that in an adaptor to bring it up to SD card size, then installed the Fedora remix using the Fedora Arm Installer. Painless.

Raspberry Pi connected to laptop using cross-over cable

Next, we need to connect the machine to the network.

My network is mostly WiFi-based, so I chose to hook the RaspPi to the network by piping its network through my laptop.

We need to set up the laptop so it can hand out IP addresses, which means installing a DHCP server. I’m using Fedora, so I installed it with “yum install dhcp”, then edited /etc/dhcp/dhcpd.conf:

option domain-name "example.org";
option domain-name-servers ns1.example.org, ns2.example.org;
default-lease-time 600;
max-lease-time 7200;
log-facility local7;
subnet 10.5.5.0 netmask 255.255.255.224 {
  range 10.5.5.26 10.5.5.30;
  option domain-name-servers ns1.internal.example.org;
  option domain-name "internal.example.org";
  option routers 10.5.5.1;
  option broadcast-address 10.5.5.31;
  default-lease-time 600;
  max-lease-time 7200;
} 

Then start the DHCP server with “service dhcpd start”.

Next, we need to connect the laptop to the RaspPi. To do this, I made a cross-over cable, and plugged it into the RaspPi.

Before plugging it into the laptop, we need to tell the laptop’s network manager not to request an address over eth0 (as far as the cable is concerned, we’re the server, not the client). To do this in GNOME, right-click your network icon on the top-right, click Network Settings, and under Wired, click Options, then change the connection type to “Shared connection” (or whatever sounds like that).

Now plug the ethernet cable into the laptop, then plug a USB cable from the laptop into the RaspPi to give it power.

If you “tail -f /var/log/messages”, you should get something like the below after a minute:

Sep 28 21:12:30 iga dnsmasq-dhcp[30147]: DHCPDISCOVER(em1) 192.168.1.19 b8:27:eb:87:1d:86
Sep 28 21:12:30 iga dnsmasq-dhcp[30147]: DHCPOFFER(em1) 10.42.0.62 b8:27:eb:87:1d:86
Sep 28 21:12:30 iga dnsmasq-dhcp[30147]: DHCPREQUEST(em1) 10.42.0.62 b8:27:eb:87:1d:86
Sep 28 21:12:30 iga dnsmasq-dhcp[30147]: DHCPACK(em1) 10.42.0.62 b8:27:eb:87:1d:86 raspi

That 10.42.0.62 is the address of the RaspPi. You can ssh into it (username root, password fedoraarm), and do stuff!

27 Jun

Raspberry Pi coming

I got an email earlier today saying that my Raspberry Pi would be dispatched in two weeks.

Exciting! I’m looking forward to getting back into my robot and finally giving it some intelligence.

I’ve been working on this robot idea for years, and always wanted to basically have a very, very small programmable bot that could do some things intelligently, such as pick up rubbish, cut weeds, do a little bit of mapping, etc.

Now I need to decide what language I’ll use for the programming.

I’ve been doing PHP and JavaScript professionally for about 15 years, but Java and C are probably more appropriate.

I think I’ll be brushing the dust off my C books!

22 Mar

ToDo

List of things off the top of my head that I want to do:

  • write a book. already had a non-fiction book published, but I’d love to have an interesting and compelling original fiction idea to write about. I’m working on a second non-fiction book at the moment.
  • master a martial art. I have a green belt in Bujinkan Taijutsu (ninja stuff, to the layman), but that’s from ten years ago – found a Genbukan teacher only a few days ago so I’ll be starting that up soon (again, ninja stuff).
  • learn maths. A lot of the stuff I do involves guessing numbers or measuring. it’d be nice to be able to come up with formulas to generate optimal solutions.
  • learn electronics. what /is/ electricity? what’s the difference between voltage and amperage? who knows… I’d like to.
  • create a robot gardener. not just a remote-control lawn-mower. one that knows what to cut, what to destroy, that can prune bushes, till the earth, basically everything that a real gardener does.
  • rejuvenate, or download to a computer, whichever is possible first. science fiction, eh? you wait and see…
  • create an instrument. I’m just finishing off a clavichord at the moment. when that’s done, I think I’ll build another one, based on all the things I learned from the first. followed by a spinet, a harpsichord, a dulcimer, and who knows what else.
  • learn to play an instrument. I’m going for grades 2 and 3 in September for piano. I can play guitar pretty well, but would love to find a classical teacher.
  • write a computer game. I have an idea, based on Dungeon Keeper, for a massively multiplayer game. maybe I’ll do it through facebook…
  • write programs to:
    • take a photo of a sudoku puzzle and solve it. already wrote the solver.
    • take a photo of some sheet music and play it.
    • show some sheet music on screen, compare to what you’re playing on a MIDI keyboard, and mark your effort.
    • input all the songs you can play on guitar/keyboard. based on the lists of thousands of people, rate all these songs by difficulty, to let you know what you should be able to learn next.
    • input a job and your location. have other people near you auction themselves to do the job for you. or vice versa: input your location, and find all jobs within walking distance to you where you can do an odd job for some extra cash (nearly there: http://oddjobs4locals.com).
    • takes a photo and recognises objects in it (partly done)
    • based on above, but can also be corrected and will learn from the corrections (also partly done)
  • stop being damned depressed all the time.

There’s probably a load of other stuff, but that’s all I’ve got at the moment!

25 Dec

bot grasscutter progress

the robot so far

I spent today working on my robot (apart from the necessary hours spent trying to extricate the kids’ Christmas presents from their fiendish wrapping). I got the grass-cutting blade working.

For back story, I’m trying to build a robot which will do various things for me. For example, grass-cutting. I’m not trying to build a replacement for one of those 2000 euro robomower things. Instead, it’s a very small bot which will eventually do quite a few things.

Anyway – one thing I wanted was to have it take great care in what it does, and that means cutting each blade individually, mad as that may seem. It will eventually be able to decide whether a blade should be cut or not, for example I may give it an instruction “cut the marsh grass, but leave the ordinary stuff alone”.

So – today’s task was to build a cutter that can grab a single blade of grass and then cut it. I chose to build it as a kind of guillotine instead of a scissors mechanism, which had been my earlier plan. I may change this at some point, but it works for the moment.

the blade mechanism

The guillotine was built with some wood, Meccano, elastics, and a blade taken from a craft knife. First, a hook was built by hacksawing into a Meccano spar such that it could be used to “catch” a blade of grass when turned just so. The blade was glued to a second Meccano piece and they were both placed in some wood such that a channel allowed the blade to slide forwards and back.

The blade is pulled back by an elastic, and forward by a wire which is controlled by a servo (ultimately controlled by an SD-21). The screw that you see there is for helping to push the blade down, so it doesn’t just wave about but actually slices the grass.

It took a while to get the right angle for the blade to slide along the bottom Meccano “hook” so that it sat flush with the hook. If the blade is not flush, the grass just bends under it and doesn’t get cut (an advantage of the scissors idea…). I ended up having to bend the hook itself, and it’s still not perfect.

After that was all set up, a servo was glued to the board and an electrical wire (*shrug*) tied to the blade. The screw you see on the servo’s horn (the turny thing) is there so the wire can be pulled a bit further than the diameter of the horn alone would allow.

22 Nov

hooking up with an SD21

I bought some bits and pieces for the robot project. After a lot of consideration, I chose to go the servo route instead of the DC motor route. The difference is that with DC motors, you just turn them on and off they go, but with servos, you can tell the motors how far to go, and at what speed. I like the idea of having that fine control (which would probably be needed if I’m picking up twigs etc).

In the first incarnation of my robot, I used the parallel (LPT) port to turn motors on and off directly, using a home-built H-bridge to make the motors go forward or backward depending on which pins of the LPT I turned on. Unfortunately, the LPT port is pretty rare these days.

I opted instead to use USB and I2C. I2C is a cool little standard – it lets you chain a load of devices onto one connection. There is no I2C port on the outside of a laptop, though, so I bought a USB-I2C convertor.

I found a shop online that sold all the necessary parts. Robot Electronics appears to be the shop-front for Devantech, who make a load of brilliant gadgets such as ultra-sonic rangers (not related to power rangers), on-chip compasses, and what I was interested in – servo controllers.

I bought the SD21 servo controller (it controls up to 21 servos), a USB-I2C interface module so I could talk to the SD21, and two HS-55 Micro Lite servos so that the other stuff actually did something, as well as some leads to connect the interface module to the servo controller.

By the way, the word ‘micro’ is a very apt description – they’re tiny! The image on the right shows the two of them with my mouse in the background.

The stuff finally arrived. Next was the tricky part – figuring out how they all go together.

Using the tech specs for the USB-I2C and the SD21, it was easy to figure out. The only tricky part was that the USB-I2C module has 5 pins on it, although I2C actually only uses 4. In my photo above-left, the white cable is not used at all (it’s a test pin used by Devantech). So hook the USB-I2C and the SD21 together, connecting to one of the I2C connectors on the SD21 board.

In my case, I got a 5-way and a 4-way cable and just stuck the ends together. Connect black to black and red to red, but the blue in each cable connects to the yellow in the other. Then stick them onto the boards.

Power is provided by hooking 7.2V into the green block. I used six 1.2V rechargeable batteries (actually, four of them are 1.25V, but the board didn’t complain much). Be very careful that you hook them into the SD21 the right way around! I put the power in the wrong way once and had to yank it apart when I smelled burning parts! Luckily, there wasn’t anything wrong afterwards, but beware!

After all that, the hard part was figuring out how to actually talk to the thing. It took me some reading to figure out the obvious… you don’t speak directly to the SD21. You speak to the USB-I2C and it then talks to the SD21.

Here’s a small C program that turns the motors first in one direction, and then in the other. BTW: it’s been a century since I wrote C, so if I’ve done anything obviously wrong (especially that int to bytes conversion bit…) please tell me.

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <errno.h>
#include <termios.h>

struct termios options;

int open_port(void) {
  // this function taken from http://www.robot-electronics.co.uk/forum/viewtopic.php?f=5&t=165
  int fd; // File descriptor for the port
  fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
  if (fd == -1) {
    perror("open_port: Unable to open /dev/ttyUSB0 - "); // Could not open the port.
  }
  else {
    fcntl(fd, F_SETFL, 0);

    // Get the current options for the port...
    tcgetattr(fd, &options);

    // Set the baud rate to 19200 for both input and output...
    cfsetispeed(&options, B19200);
    cfsetospeed(&options, B19200);

    // Enable the receiver and set local mode...
    options.c_cflag |= (CLOCAL | CREAD);

    // Set no parity bit
    options.c_cflag &= ~PARENB;

    // Set two stop bits (the USB-I2C's serial settings are 19200 baud, 8 data bits, no parity, 2 stop bits)
    options.c_cflag |= CSTOPB;

    // Set the character size
    options.c_cflag &= ~CSIZE;
    options.c_cflag |= CS8;

    // Set the new options for the port...
    tcsetattr(fd, TCSANOW, &options);

    fcntl(fd, F_SETFL, FNDELAY);
  }
  return (fd);
}

void set_servo(unsigned char reg, int pos) {
  // sets servo 'reg' to position 'pos'
  unsigned char sbuf[7];
  unsigned char lbyte, hbyte;
  int fd;

  hbyte=(unsigned char) (pos >> 8);
  lbyte=(unsigned char) (pos);

  sbuf[0] = 0x55;  // USB-I2C "I2C_AD1" command: device with a 1-byte internal register address
  sbuf[1] = 0xC2;  // address of the i2c device
  sbuf[2] = reg*3; // each servo has three registers
  sbuf[3] = 0x03;  // bytes to send
  sbuf[4] = 0x00;  // set to full speed
  sbuf[5] = lbyte; // position low byte
  sbuf[6] = hbyte; // position high byte

  fd = open_port();
  write(fd, sbuf, 7);
  close(fd);
}

int main() {
  // take a step to the left
  set_servo(0, 800);
  set_servo(1, 800);

  sleep(2);

  // and then a jump to the right
  set_servo(0, 2200);
  set_servo(1, 2200);

  return 0;
}
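
To build and run it, something like “gcc -o sd21test sd21test.c” (the file name is just whatever I’d save it as) and then running the result with permission to open /dev/ttyUSB0 should be all that’s needed – the USB-I2C shows up as an ordinary FTDI serial port.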

20 Oct

let's begin again – robots!

For years, I’ve had a (mad?) plan to build a robot to handle gardening for me. And so far, I haven’t built it.

This is not because it is impossible or stupid. Far from it – when you consider the task step by step, it’s reasonable, and could even be very important.

  • build a robot which is completely wireless.
  • the robot must be able to geo-locate and find its way to its charge-point when it needs it.
  • teach the robot to “see” rubbish such as twigs and leaves.
  • teach the robot to pick up rubbish and place it in a designated rubbish area. At this point, we have something which can be developed and sold, although maybe just as a curiosity.
  • teach it to see grass and to judge whether the grass is too long or not.
  • teach it to cut the grass, one blade at a time, and compost the blades. My plan here is that the robot is very small (20cm cubed?), making it difficult to cut a lot of grass at a time, thus making it easier to cut one blade at a time. Even so, cutting one blade at a time allows every piece of compostable material to be composted, thus making the garden neater than if it was cut by “brute force”.
  • teach it to recognise weeds and destroy them or cut them as close as possible to the root. your average lawnmower can’t do that!

These are reasonable goals, and at the end, you have a small robot (or a few small robots) which can manage a medium-sized garden unattended better than you could do yourself. Now that’s a product that would sell.

So what’s so difficult? Why have I not built it? I think the problem is that I was aiming for perfection – I wanted to go straight to the end product, so I was buying only components that would fit in the 20cm-cube machine.

Unfortunately, I just don’t have the money for that. For example, the “brain” for the robot would need to be something like the Robostix, and that would set me back over €300 – money I just don’t seem to have lying around.

So, I’d dream and pine and do nothing about it.

The solution, which I have somehow failed to see for years, is to build something less than perfect, which does the job, and develop that into something that people can see actually does work. When that happens, someone will hand me the money to develop the proper thing, in the hope that they’ll make a tidy sum in return.

So, I’ve decided to resurrect some old laptops from the attic, in the hope that I can make them chew the grass for me. I’m going to stick wheels on them and give them knives and other blades to play with. I’ve dug out my Latitude C610 and Travelmate 2420.

One thing discovered so far – laptops don’t like it when you leave them alone in a damp attic eave for years on end. The Latitude’s hard-drive literally squealed a few times when I booted it, and it would only boot once. Every time after that, the hard-drive threw up errors like it was being killed (I will also mention that the HD’s file-system is ReiserFS, making it more ironic…).

The other worked fine though – it has a few lines on the screen, but nothing more serious (the Latitude has no screen at all).

Tomorrow I hope to build the base of the robot for the TravelMate laptop. I’m going to try to build two robots, one for each laptop. If I actually do it (notoriously lazy as I am), I’ll post photos.

Anyway – here comes world-domination step 1.

06 Jan

2008 plans

yes. another resolutions page. I’ll call them plans, though, as “resolutions” sounds a bit certain. in my experience, nothing is certain, even promises.

so, my plans:

  • lose a half-stone. when I married Herself, I was 12 stone. this was probably partly caused by the anti-depressants I was on at the time. since then, it took me a few months to manage to cut down to 11 stone. my ideal weight is more like 10 stone, but I’m finding it incredibly difficult to do that – I just seem to be stuck at 11. so, this year’s plan is to get down to 10.5.
  • build the new robot – I want a very small form-factor machine (gumstix or linutop). it should have caterpillar tracks, a robotic arm, and a cutting tool. my current bot is about 9″x14″. the new one should be at most 6″x6″. the software is not yet complete, but at least this will get the hardware out of the way
  • get my finances back under control. for the last few years, I’ve had negative values in almost all of my bank accounts. about time I managed to fix that. I have one account under control by basically denying myself access to it and setting up a standing order into it. the same plan should work with everything else. the plan is to on average raise the total value of each account by a set value every month. this value (none of your damned business 😉 ) should be measured on a specific day each month.
  • build a collapsible mini-ramp. I can’t make it to the local skate-park very often, so the plan is to build something that I can wheel out to the garden when I feel like some exercise. it should be big enough that I can use my bike on it as well as my skateboard.

notice that none of these are vague. it’s been shown that when a resolution is vague, it usually does not happen. an exact plan, though, gives solid goals that can be reached.

27 Oct

letter recognition network

Last week, I wrote a neural network that could balance a stick. That was a simple problem which really only takes a single neuron to figure out.

This week, I promised to write a net which could learn to recognise letters.

demo

For this, I enhanced the network a bit. I added a more sensible weight-correction algorithm, and separated the code (ANN code).

I was considering whether hidden units were required at all for this task. I didn’t think so – in my opinion, recognising the letter ‘I’, for example, should depend on information such as “does it look like a J, and not like an M?” – in other words, recognising a letter depends on how confident you are about whether the other outputs are right or wrong.

The network I chose to implement is, I think, called a “simple recurrent network” with stochastic elements. This means that every neuron reads from every other neuron and not itself, and corrections are not exact – there is a small element of randomness or “noise” in there.

The popular choice for this kind of test is a feed-forward network trained through back-propagation. That network requires hidden units, and each output (is it N, is it Q?) is totally ignorant of the other outputs, which I think is a detriment.
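
To make that concrete, here’s a toy sketch (not my actual code) of the kind of update I mean: one layer of output units, each reading the pixels plus every other unit’s previous output – but not its own – corrected with a delta rule that has a little random noise mixed in. The grid size, learning rate, and noise level are made-up numbers for illustration.

#include <stdio.h>
#include <stdlib.h>

#define N_IN  35   /* e.g. a 5x7 pixel grid */
#define N_OUT 26   /* one unit per letter   */

static double w_in[N_OUT][N_IN];    /* pixel -> output weights            */
static double w_lat[N_OUT][N_OUT];  /* output -> output (lateral) weights */

static double noise(double scale) {
  /* uniform random value in [-scale, scale] */
  return scale * (2.0 * rand() / RAND_MAX - 1.0);
}

/* One presentation of a letter: a forward pass, then a noisy correction. */
void train_step(const double *pixels, const double *target, double lr) {
  double prev[N_OUT], out[N_OUT];
  int i, j, k;

  /* First pass: activations from the pixels alone. */
  for (j = 0; j < N_OUT; j++) {
    double sum = 0.0;
    for (i = 0; i < N_IN; i++) sum += w_in[j][i] * pixels[i];
    prev[j] = (sum > 0.0) ? 1.0 : 0.0;
  }

  /* Second pass: each unit also reads every *other* unit's first-pass output. */
  for (j = 0; j < N_OUT; j++) {
    double sum = 0.0;
    for (i = 0; i < N_IN; i++) sum += w_in[j][i] * pixels[i];
    for (k = 0; k < N_OUT; k++)
      if (k != j) sum += w_lat[j][k] * prev[k];
    out[j] = (sum > 0.0) ? 1.0 : 0.0;
  }

  /* Delta-rule correction with a bit of randomness thrown in. */
  for (j = 0; j < N_OUT; j++) {
    double err = target[j] - out[j];
    for (i = 0; i < N_IN; i++)
      w_in[j][i] += lr * err * pixels[i] + noise(0.001);
    for (k = 0; k < N_OUT; k++)
      if (k != j) w_lat[j][k] += lr * err * prev[k] + noise(0.001);
  }
}

int main(void) {
  double pixels[N_IN] = {0}, target[N_OUT] = {0};
  int c;

  pixels[2] = pixels[16] = pixels[30] = 1.0;  /* a made-up glyph */
  target[0] = 1.0;                            /* "this is the letter A" */

  for (c = 0; c < 100; c++)
    train_step(pixels, target, 0.1);

  return 0;
}

The real network also uses the graded training schedule described below and a less naive correction, but the structure – lateral connections between the outputs instead of hidden units – is the part I wanted to show.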

My network variant has just finished a training run after 44 training cycles. That is proof that the simple recurrent network can learn to recognise simple letters without relying on hidden units.

Another interesting thing about the method I used is how the training works. Instead of throwing a huge list of tests at the network, I have 26 tests, but only a set number of them are run in each cycle depending on how many were gotten right up until then. For example, a training cycle with 13 tests will only be allowed if the network previously successfully passed 12 tests.

There are still a few small details I’d want to be sure about before pronouncing this test an absolute success, but I’m very happy with it.

Next week, I hope to have this demo re-written in Java, and a new demo recognising flowers in full-colour pictures (stretching myself, maybe…).

As always, this has the end goal of being inserted in a tiny robot which will then do my gardening for me. Not a mad idea, I think you’re beginning to see – just a lot of work.

update As I thought, there were some points which were not quite perfect. There was a part of the algorithm which would artificially boost the success of the net. With those deficiencies corrected, it takes over 500 cycles to get to 6 correct letters. I think this can be improved… (moments later – it now takes only 150-odd cycles to reach 6 letters)

07 Mar

ah… spring; when young men's thoughts turn to…

robots!

So anyway, I moved house (long story short), meaning that I get to think more clearly, as the house is less cluttered, and the route to work involves crossing fewer roads.

This morning, I was thinking about my current project – I’m writing a recurrent connectionist network so my new robot can learn to recognise things like grass and rubbish (to cut the former, and remove the latter).

The walk was getting tiring, so I was thinking about Segways as well, and wondering how easy it might be to make one.

This eventually evolved into an idea for a new transport system – you get a load of little robots (my gardening ones, for example), and get them to form a platform. Then a load more of them form another platform on top. Then, you stand on the top.

The “carpet” would move in the direction you lean. Of course, the speed wouldn’t be too impressive, but it would be better than walking.

When the lower layer encounters a rock on the road or something, it moves around it. The upper layer robots interlock with each other to allow the lower level bots to do this without having too much pressure from above.

When you reach where you are going, the robots then disperse and continue their gardening around the new area.

You could even form a baggage train using this idea – a few carpet networks would follow each other in marching-ant form.

This would be easier to do than to create a robot which does your gardening for you…