On October 8th, the 3rd edition of the CRJET International Robotics Competition took place in Cataluña.
Silvestre was the WINNER of the Line Following Robots category. During the morning qualifying session, Silvestre set the fastest time, completing 3 laps of the 11.752-meter track in 14.02 seconds (2.5 m/s average speed). Piolin - Silvestre's little brother - set the second-best qualifying time but ran into some issues and finished 3rd after losing to Shibuya in the semifinal round.
Its revolutionary positioning system allowed Silvestre to identify the main straight and speed up to 4.8 m/s. Thanks to its inertial sensors and wheel encoders, Silvestre could brake at exactly the right point, reaching the corner at a safe speed. This strategy had not been seen in any Line Follower competition so far, which made Silvestre look even more impressive.
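The strategy above boils down to simple kinematics. Here's a minimal sketch of the idea in C; the function names, constants and units are illustrative, not Silvestre's actual firmware:

```c
#include <stdbool.h>

/* Distance (m) needed to brake from v (m/s) down to v_safe (m/s)
 * at a constant deceleration a (m/s^2): d = (v^2 - v_safe^2) / (2a). */
double brake_distance(double v, double v_safe, double a)
{
    if (v <= v_safe)
        return 0.0;
    return (v * v - v_safe * v_safe) / (2.0 * a);
}

/* Decide whether to start braking: the wheel encoders give the distance
 * already covered on the straight; the rest is what remains to the corner. */
bool should_brake(double straight_len, double travelled,
                  double v, double v_safe, double a)
{
    double remaining = straight_len - travelled;
    return remaining <= brake_distance(v, v_safe, a);
}
```

With the numbers from the race (4.8 m/s on the straight, braking to a safe cornering speed), the robot only needs to compare the encoder-derived remaining distance against this braking distance once per control loop.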
Silvestre Highlights (the slow motion part is very cool 😉 )
Final Race, Silvestre VS Shibuya:
For the next contests, the challenge is to make another key improvement to keep Silvestre at the top of the podium.
Also, Tobias was awarded the "Most High-Tech" prize in the "Best of Show" category, so it was a pretty successful weekend 🙂
Silvestre is a line following robot who was born early this year. So far, he's competed in two national contests in Spain, finishing 5th in the first one and winning the other. Among his main features:
8 Infrared sensors
Two Maxon DC motors
Bluetooth enabled (telemetry and configuration)
LPC2148 ARM7 32-bit microcontroller
PD Loop running at 100Hz
Error estimation using cubic interpolation over the IR sensors data
Accelerometer and gyro sensors
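To give an idea of how sub-sensor resolution is obtained from the IR array, here's a simplified sketch in C. Silvestre uses cubic interpolation; the 3-point parabolic fit below shows the same idea with less math, and the sensor count and values are illustrative:

```c
#define NUM_SENSORS 8

/* Returns the estimated line position in "sensor units" relative to the
 * array center (negative = left, positive = right). A parabola is fitted
 * through the brightest sensor and its two neighbours; the vertex gives
 * a fractional offset between sensors. */
double line_position(const int ir[NUM_SENSORS])
{
    int i, peak = 0;
    for (i = 1; i < NUM_SENSORS; i++)       /* find the strongest reading */
        if (ir[i] > ir[peak])
            peak = i;

    double pos = (double)peak;
    if (peak > 0 && peak < NUM_SENSORS - 1) {
        double y0 = ir[peak - 1], y1 = ir[peak], y2 = ir[peak + 1];
        double denom = y0 - 2.0 * y1 + y2;
        if (denom != 0.0)                   /* parabola vertex offset */
            pos += 0.5 * (y0 - y2) / denom;
    }
    return pos - (NUM_SENSORS - 1) / 2.0;   /* center the scale */
}
```

The resulting fractional position feeds the PD loop as the error signal; the smoother the interpolation, the less the controller "steps" between adjacent sensors.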
In the video above you can see Silvestre running on an 8.44 m long track. Whenever he crosses the start mark, he sends the lap time over the Bluetooth link to a PC application, which displays the timing information along with some other data such as battery level. In the best lap in the video he reaches up to 2.24 m/s; as the wheels get dirty, the grip decreases and, therefore, the lap times get worse.
The microcontroller has 512 KB of flash. Half of this memory currently holds a custom filesystem used to store configuration profiles (speed, PID constants, enabled/disabled features, etc.) which can be loaded, stored and deleted through the PC software.
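As a rough illustration of what a stored profile might look like, here's a hedged C sketch; the field names, layout and the RAM "flash" stand-in are invented for the example, not the actual filesystem format:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Illustrative configuration profile; ordered to avoid struct padding. */
typedef struct {
    float    kp, kd;       /* PD constants */
    uint16_t speed_mm_s;   /* target speed */
    uint8_t  features;     /* bit flags: enable/disable behaviours */
    uint8_t  checksum;     /* simple integrity byte */
} profile_t;

static uint8_t flash_sim[256];   /* RAM stand-in for a flash sector */

static uint8_t checksum8(const uint8_t *p, size_t n)
{
    uint8_t c = 0;
    while (n--) c ^= *p++;
    return c;
}

void profile_store(int slot, profile_t p)
{
    p.checksum = checksum8((uint8_t *)&p, offsetof(profile_t, checksum));
    memcpy(&flash_sim[slot * sizeof p], &p, sizeof p);
}

/* Returns 0 on success, -1 if the stored profile is corrupted. */
int profile_load(int slot, profile_t *out)
{
    memcpy(out, &flash_sim[slot * sizeof *out], sizeof *out);
    return checksum8((uint8_t *)out, offsetof(profile_t, checksum))
           == out->checksum ? 0 : -1;
}
```

The integrity byte matters in practice: a half-finished flash write (e.g. battery pulled mid-save) is detected on load instead of silently feeding garbage PID constants to the robot.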
In order to speed up the communication between the computer and the robots, we designed a low-bandwidth binary protocol with error correction, which has been carefully implemented and optimized in all of our robots. This way, the development of new software (on both the robot and computer side) is simplified from the communication point of view, and there's no need to start over every time. You can see a screenshot of the PC application we've built:
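To illustrate the kind of framing such a protocol involves, here's a hedged C sketch; the byte layout, the XOR integrity check and all names are invented for the example (the real protocol is different and also performs error correction, not just detection):

```c
#include <stdint.h>
#include <stddef.h>

#define SOF 0xA5   /* illustrative start-of-frame marker */

/* Frame layout for this sketch: [SOF][len][id][payload...][crc] */
size_t frame_encode(uint8_t *out, uint8_t id,
                    const uint8_t *payload, uint8_t len)
{
    uint8_t crc = id ^ len;
    size_t n = 0;
    out[n++] = SOF;
    out[n++] = len;
    out[n++] = id;
    for (uint8_t i = 0; i < len; i++) {
        out[n++] = payload[i];
        crc ^= payload[i];
    }
    out[n++] = crc;
    return n;
}

/* Returns the payload length on success, -1 on a bad frame. */
int frame_decode(const uint8_t *in, size_t n, uint8_t *id, uint8_t *payload)
{
    if (n < 4 || in[0] != SOF) return -1;
    uint8_t len = in[1];
    if (n != (size_t)len + 4) return -1;
    uint8_t crc = in[1] ^ in[2];
    for (uint8_t i = 0; i < len; i++) {
        payload[i] = in[3 + i];
        crc ^= payload[i];
    }
    if (crc != in[3 + len]) return -1;
    *id = in[2];
    return len;
}
```

Once both sides share a framing layer like this, new commands are just new `id` values, which is what makes the robot-side and PC-side software reusable across projects.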
The inertial sensors help Silvestre measure how well he's performing and enable him to adjust the speed and PID constants as the wheels lose grip. Basically, the accelerometer data is used to accelerate faster toward the setpoint speed, whereas the gyro tells him whether he's starting to drift (too much angular acceleration).
In the contests held in Spain there are no marks indicating when a lap has started, so the robot has no way to figure out when to modify its parameters. Something else the gyro data can provide is the actual orientation of the robot: by integrating the gyro signal, Silvestre knows when he is facing the initial direction again, which is likely the starting point. However, this approach breaks down when the track has two or more straights in the same direction, because the robot will reach 360º before the lap has actually been completed.
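The lap heuristic can be sketched in a few lines of C; the sample handling below uses plain rectangular integration and an illustrative threshold, not the actual firmware:

```c
#include <stdbool.h>

static double heading_deg = 0.0;   /* accumulated rotation since start */

/* Call once per gyro sample; rate in deg/s, dt in seconds. Returns true
 * when the robot has rotated a net full turn since the lap started
 * (which, on most closed tracks, means it is back at the start line). */
bool lap_completed(double rate_dps, double dt)
{
    heading_deg += rate_dps * dt;   /* rectangular integration for brevity */
    if (heading_deg >= 360.0 || heading_deg <= -360.0) {
        heading_deg = 0.0;          /* re-arm for the next lap */
        return true;
    }
    return false;
}
```

As noted above, a track with two straights pointing the same way defeats this heuristic, since opposite-sense corners cancel out and the net rotation can hit 360º early or late.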
The key to improving Silvestre's performance has undoubtedly been the ability to gather data from the sensors in real time so that it can be later analysed on a computer.
My friend Alberto Calvo and I have worked hard on this robot and we're looking forward to new contests and challenges. Our main goal, rather than just following a line, was to research how inertial sensors and self-learning processes could be applied to this kind of robot. We're still working on Silvestre in our spare time, so I'll probably update this post soon.
I want to introduce you to the brand-new uXbot (micro xBot) robot. It has been designed for educational purposes, with the main goal of serving as the base for the Robotics Workshop at Campus-Party España 2010.
ARM Cortex-M3 32 bit Microcontroller (LPC1343)
Motor driver up to 3.5A
12 Infrared sensors
Integrated battery charger
Battery Voltage monitoring
Firmware Programming via USB
4 General Purpose LEDs
800mAh Li-Po battery
Metal Gearbox motors
In order to make the development process easier, some libraries have been written for the end user with the following layout:
HALLib (Hardware Abstraction Layer Library): Provides an interface to uXbot hardware and the microcontroller peripherals such as sensor reading, pushbutton, voltage monitor, motors, timing functions.
VCOMLib: Provides a USB Virtual COM Port driver (ACM profile under Linux) so that the user can communicate with a PC.
uXbotLib: This is a high-level library mainly intended for users who do not have (or need) any knowledge of the uXbot's underlying electronics. It provides an interface to move the robot in any direction, a PID controller for line following, sensor reading and filtering, etc.
All the tools used to develop applications for the uXbot are free and available for both Windows and Linux OS (32 and 64-bit).
In order to control the uXbot through the USB virtual COM port or Bluetooth, a C# application has been written. Anyone will be able to use it from Windows or Linux (using Mono). Here you can see a screenshot:
I want to show you two videos. In the first one you can see the uXbot being controlled from a Windows Mobile device using a simple C# application which sends commands to the motors depending on the PDA's accelerometer readings. It's pretty fun and addictive 🙂 In the second one, the uXbot follows a black line at an average speed of 1.65 meters per second.
I would like to publish all the source and diagrams soon, so stay tuned 🙂
PS. You can find more information at www.uxbot.es (Wiki & Forum in Spanish).
I've been playing around with SMS PDU encodings and recently noticed some curious things about the way cellphones treat text messages that include special characters. First, look at these two tables of the GSM alphabet:
The tables above show the GSM alphabet using a 7-bit encoding, which implies a maximum of 160 characters per text message. However, sometimes the count is a little trickier and not so straightforward: if, for instance, you type a character from the second table, it needs to be preceded by the escape character (0x1B), so it takes two characters instead of one. What happens if your text message contains 160 characters and one of them is a ']'? Simple: you will get charged for two text messages because you're exceeding the maximum length of a single PDU. Usually your cellphone won't warn you about this, and you will send it anyway without knowing that this text message will cost twice what you think. The same happens with the '€' symbol and some others not shown in the table above.

If you have a closer look at the tables, you might notice that the 'é' symbol appears, but where are 'á', 'í', 'ó' and 'ú'? The GSM alphabet was originally designed in France, where only 'é' is used, so there's no way to encode the other accents with this alphabet. I have tested some cellphones and they behave in one of two ways:
Removing the accents (except on 'é') and substituting the same character without the accent.
Using a 16-bit Unicode (UCS2) encoding to allow sending every character (in this case, the maximum length of the text message is 70 characters).
In case 1, the only 'side effect' is that the recipient of the text message won't get your accents, and you can send up to 160 characters. In case 2, your cellphone won't warn you, and you might send up to 160 characters thinking it will take just one text message. However, again, you will be charged for up to 3 text messages without knowing it! The recipient will get a multi-part message showing all the characters you sent without modification. Here you can see the decoding of a PDU (using PDUSpy) of a text message sent with accents and encoded using UCS2:
PROTOCOL IDENTIFIER (0x00)
MESSAGE ENTITIES : SME-to-SME
PROTOCOL USED : Implicit / SC-specific
DATA CODING SCHEME (0x08)
AUTO-DELETION : OFF
COMPRESSION : OFF
MESSAGE CLASS : NONE
ALPHABET USED : 16bit UCS2
If the accents are removed from the original text message, the cellphone will automatically use the GSM-7 alphabet and you will be allowed to send up to 160 characters in just one PDU (you will get charged once). All in all, be careful and, if possible, do some research to figure out what your cellphone does and check it against your bill, because you will probably save some (or a lot of) money. Cheers, D.
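The billing arithmetic above can be sketched in C. This toy version only knows a few extension-table characters and ignores the UCS2 case; a real implementation would map the full GSM 03.38 tables and detect characters that force UCS2:

```c
#include <string.h>
#include <stdbool.h>

/* Characters from the extension table need the 0x1B escape -> 2 septets.
 * Only the ASCII-representable ones are listed here; '€' is omitted. */
static bool gsm_extended(char c)
{
    return strchr("^{}\\[]~|", c) != NULL;
}

/* Number of septets the text occupies in the GSM-7 encoding. */
int gsm7_septets(const char *text)
{
    int n = 0;
    for (; *text; text++)
        n += gsm_extended(*text) ? 2 : 1;
    return n;
}

/* SMS parts needed: 160 septets fit in a single PDU; concatenated parts
 * carry only 153 each because the user-data header eats 7 septets. */
int gsm7_parts(const char *text)
{
    int n = gsm7_septets(text);
    return n <= 160 ? 1 : (n + 152) / 153;
}
```

This reproduces the ']' gotcha from the post: a 160-character message is one PDU, but swap any character for ']' and the count jumps to 161 septets, i.e. two billed messages.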
Since the very first moment I saw a 2-wheel self-balancing robot, I was amazed by all the engineering behind it and I was excited to build one myself. Now that it's become a reality, let me introduce you to TOBIAS.
Accelerometer: Slow response & sensitive to acceleration forces due to movement
Gyroscope: Fast response & integration drift for angle estimation
Need to mix up the information from both sensors: Kalman Filter
In this graphic you can see some data captured in real time by the microcontroller and then dumped to a PC for later analysis.
The blue signal represents the estimated angle using just the raw data from the accelerometer: arc-tangent of y-axis by x-axis acceleration.
The green signal is the integration of the gyro signal, which clearly shows the drift over time.
The red signal is the actual angle estimated by the Kalman Filter, which shows that in the balancing state the angle falls between -3 and 3 degrees.

Block Diagram:
Here you can see the block diagram of the complete system. First, you can observe that the signals are sampled at 3200 Hz (oversampling) and then low-pass filtered with a Finite Impulse Response (FIR) filter with a cutoff frequency of 100 Hz.
A 16x decimator is then used to bring the signals down to a 200 Hz sample rate with no aliasing. This filtering process improved the angle estimation a lot, since most of the noise was removed.
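Structurally, the filter-then-decimate stage looks like the sketch below. TOBIAS uses a properly designed FIR with a 100 Hz cutoff; the 4-tap moving average here is only a stand-in to show the decimation pattern:

```c
#include <stddef.h>

#define TAPS  4    /* illustrative FIR length (real filter is longer) */
#define DECIM 16   /* keep 1 of every 16 filtered samples */

/* Filters in[] (n samples) and keeps every 16th output; only the kept
 * outputs are actually computed, which is what makes decimation cheap.
 * Returns the number of samples written to out[]. */
size_t fir_decimate(const double *in, size_t n, double *out)
{
    size_t m = 0;
    for (size_t i = TAPS - 1; i < n; i++) {
        if ((i - (TAPS - 1)) % DECIM != 0)
            continue;                     /* decimation: skip 15 of 16 */
        double acc = 0.0;
        for (size_t k = 0; k < TAPS; k++) /* FIR convolution */
            acc += in[i - k];
        out[m++] = acc / TAPS;
    }
    return m;
}
```

Note the design choice: because the output rate is 16x lower, the FIR only has to run at 200 Hz even though the sensors are sampled at 3200 Hz.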
The inputs to the Kalman Filter are the angular rate and the angle estimated from the accelerometer, which is computed with an atan2 call. Tests showed that the angle output by the KF has a precision of about 0.1º, which is really accurate.
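For reference, a minimal 2-state (angle, gyro bias) Kalman filter of the kind described can be written as follows; the noise covariances below are illustrative guesses, not TOBIAS' tuned values:

```c
typedef struct {
    double angle, bias;      /* state: tilt angle and gyro bias */
    double P[2][2];          /* error covariance */
    double Q_angle, Q_bias;  /* process noise */
    double R;                /* measurement noise (accelerometer angle) */
} kalman_t;

void kalman_init(kalman_t *k)
{
    k->angle = k->bias = 0.0;
    k->P[0][0] = k->P[0][1] = k->P[1][0] = k->P[1][1] = 0.0;
    k->Q_angle = 0.001; k->Q_bias = 0.003; k->R = 0.03;  /* guesses */
}

/* rate: gyro reading (deg/s), accel_angle: atan2-derived angle (deg). */
double kalman_update(kalman_t *k, double rate, double accel_angle, double dt)
{
    /* Predict: integrate the bias-corrected rate. */
    k->angle += dt * (rate - k->bias);
    k->P[0][0] += dt * (dt * k->P[1][1] - k->P[0][1] - k->P[1][0] + k->Q_angle);
    k->P[0][1] -= dt * k->P[1][1];
    k->P[1][0] -= dt * k->P[1][1];
    k->P[1][1] += k->Q_bias * dt;

    /* Update: blend in the accelerometer angle. */
    double S  = k->P[0][0] + k->R;
    double K0 = k->P[0][0] / S, K1 = k->P[1][0] / S;
    double y  = accel_angle - k->angle;      /* innovation */
    k->angle += K0 * y;
    k->bias  += K1 * y;
    double P00 = k->P[0][0], P01 = k->P[0][1];
    k->P[0][0] -= K0 * P00;
    k->P[0][1] -= K0 * P01;
    k->P[1][0] -= K1 * P00;
    k->P[1][1] -= K1 * P01;
    return k->angle;
}
```

The second state (bias) is what kills the gyro drift shown in the green trace: the filter slowly learns how much the gyro reads when the robot isn't actually rotating.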
This angle is then processed to apply the right torque to the motors using a PID controller - tuned by hand, with more effort than expected (and desired). The integral term of the PID is computed with the trapezoidal rule, while the derivative term is calculated using a 7-point Savitzky-Golay differentiator.
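The two tricks just mentioned - trapezoidal integral and Savitzky-Golay derivative - can be sketched like this; the gains, the quadratic fit order and the window handling are illustrative, not the hand-tuned values in TOBIAS:

```c
#include <string.h>

#define SG_N 7   /* 7-point Savitzky-Golay window */

typedef struct {
    double kp, ki, kd;
    double dt;             /* control period (s) */
    double integral;
    double prev_err;
    double hist[SG_N];     /* last 7 error samples, hist[0] = oldest */
} pid_ctrl_t;

double pid_step(pid_ctrl_t *p, double err)
{
    /* Trapezoidal rule: average of current and previous error. */
    p->integral += 0.5 * (err + p->prev_err) * p->dt;
    p->prev_err = err;

    /* Shift the history window and append the new sample. */
    memmove(p->hist, p->hist + 1, (SG_N - 1) * sizeof(double));
    p->hist[SG_N - 1] = err;

    /* Savitzky-Golay smoothed first derivative (quadratic fit). */
    static const double c[SG_N] = { -3, -2, -1, 0, 1, 2, 3 };
    double deriv = 0.0;
    for (int i = 0; i < SG_N; i++)
        deriv += c[i] * p->hist[i];
    deriv /= 28.0 * p->dt;

    return p->kp * err + p->ki * p->integral + p->kd * deriv;
}
```

The point of the Savitzky-Golay derivative over a plain two-sample difference is noise rejection: it fits a polynomial across the window, so sensor jitter doesn't slam the D term.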
The output of the PID is then applied to both motors to keep the robot balanced.
The LPC2148 is a very powerful 32-bit microcontroller which shouldn't have many problems acting as TOBIAS' brain. However, the firmware was optimized as much as possible to leave room for future improvements and, in the meantime, keep the processor in power-down mode when idle to save battery (every mA counts ;)).
In order to figure out whether the microcontroller could handle all the tasks, the execution was profiled using a GPIO and a logic analyzer:
As you can see from the image above, there's plenty of time left for the microcontroller to do other things. This time was used mainly for logging and, in the current version, to read from an IR receiver so TOBIAS can be controlled with a cheap remote from an RC helicopter. This performance was achieved after optimizing the code of the most computationally expensive tasks (Kalman & PID). These functions also execute from RAM and try to make good use of the MAM (Memory Accelerator Module) hardware to speed up their execution as much as possible.
Considerations for future improvements:
The first goal was to build a fairly good balancing robot without spending too much money, and now I can say that it was definitely achieved.
The sensors in the IMU were taken from a cheap PS3 gamepad bought on eBay; apart from the inexpensive step-up/down DC-DC controller ($15), there are no commercial electronic boards (everything is our own design), and both the plastic sheets and the wheels are quite cheap. Thus, the overall cost of the robot doesn't go beyond 100€ ($140-$150).
However, the cheap motors made the whole project a little more difficult (and, at the same time, more challenging) than expected: they were not responsive enough, and the gearbox had so much backlash that the wheels could be turned freely by about 3 degrees.
I'm sure that with different motors TOBIAS' performance would have been way better, but it was more exciting to face the PID tuning and the signal processing under these 'adverse' conditions.
References & Greetings:
T.O.B.B. Balancing Robot by Matthias Toussaint: I would like to thank Matthias very much for answering my e-mails and pointing me in the right direction with his invaluable advice. All the signal processing was based on T.O.B.B.'s, and the only major difference is that T.O.B.B. uses a very interesting complementary filter (instead of Kalman) which works incredibly well, as you can see in the video posted on his site. Thanks once again, Matthias, I learnt a lot from you!!
Also, big thanks to my friend Alberto Calvo, the co-author, who also made the 3D artwork shown in this article 😉
All in all, it's been a very interesting project and, as a reward, TOBIAS won a prize in the Freestyle Robotics Contest at Campus Party '09 last summer.
In this video you can see TOBIAS in action:
Disclaimer: All the information posted is intended for illustrative and educational purposes only. I just want to show you that it's possible to set up the iPhone SDK on a virtual machine. Please BUY a Mac OS X license if you are going to use this, and BUY a Mac computer (Apple's EULA states that you cannot run Mac OS X on non-Apple hardware).
In this post I will explain how to set up the SDK for iPhone OS 3.1 on a PC running Windows (Vista 64 in my case). The SDK's readme file states:
Xcode 3.1.4, when used for Mac-only development, is compatible with Intel and PowerPC Macs running Mac OS X Leopard 10.5 and later. Use of the iPhone SDK requires an Intel-based Mac running Mac OS X Leopard version 10.5.7 or later.
So we need Leopard 10.5. By googling a little you will find that there are some modified, ready-to-use Leopard images out there that you can download and run out of the box on VMware.
I'm running version 10.5.2 (which takes about 5 minutes to boot on my quad core). Once you get Mac OS X running on your PC, you can download the free iPhone SDK from developer.apple.com and install it.
The iPhone SDK for OS 3.1 won't install on a version prior to 10.5.7, so I had to 'trick' the installer rather than update Mac OS X, which seems to be a painful process. To do this, you have to modify /System/Library/CoreServices/SystemVersion.plist and change both the ProductUserVisibleVersion and ProductVersion keys to 10.5.7.
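For reference, the relevant slice of SystemVersion.plist ends up looking something like this (the surrounding dict and the other keys are omitted here):

```xml
<key>ProductUserVisibleVersion</key>
<string>10.5.7</string>
<key>ProductVersion</key>
<string>10.5.7</string>
```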
With that done, I selected just the SDK for OS 3.1 (to save hard disk space) and the installation process began.
And 3.5 hours later....
Once you have the iPhone SDK installed, you can run Xcode (from Spotlight) and create a new project from a template just to try it out on the iPhone Simulator:
At this point you can develop your own iPhone applications and test them on the simulator. Also, if you join the Apple Developer Program (the standard one is $99), you can test them on your iPhone as well.
If you're planning to play around with the SDK, I strongly recommend signing up on the iPhone Dev Center, because there are lots of resources available: getting-started documents, videos, sample code, etc.
I've started to build some sort of two-wheel balancing robot, and before getting the party started I have to deal with fusing the data gathered from my inertial sensors.
My homebrew IMU (Inertial Measurement Unit) is composed of a 3-axis accelerometer and a 1-axis gyroscope. The gyro has an analog output (0.67 mV per degree/s) and the accelerometer has an I2C interface. I would love to share their part numbers with you but, since I wanted this unit to be as cheap as possible, the sensors were ripped out of (and reverse-engineered with a logic analyzer from) a cheap PS3 gamepad bought on eBay ;).
Since the output of the gyro is 'too low' for my ADC, I built a simple non-inverting amplifier and a low-pass filter with a cutoff frequency of 1 kHz. I also placed a high-pass filter (cutoff frequency at 0.3 Hz) to compensate for temperature drift. The data from the accelerometer is digitally filtered on the microcontroller with a simple 1st-order Butterworth filter.
The main idea behind this kind of robot (an inverted pendulum) is to measure the tilt angle in order to drive the wheels back under the center of mass. The higher the center of gravity, the easier balancing becomes. Why do we need both an accelerometer and a gyro?
- The accelerometer senses not only gravity (tilt) but also the acceleration forces along its axes. So it would only be reliable if the robot were static (no acceleration due to movement).
- The gyro outputs angular velocity (degrees per second) and is not sensitive to acceleration. To get the angular position we have to integrate this signal; however, the integral drifts over time, and the estimated tilt angle becomes inaccurate after a few seconds.
The 'trick' is to take the best of each sensor: the long-term information from the accelerometer and the short-term response from the gyro. One way to do this is with a 'black box' known as the Kalman Filter (if you are brave enough, have a look at the theory; I'm not :)), which fuses both signals to estimate the actual tilt angle.
I got some source code for this filter from rotomotion (http://scratchpad.wikia.com/wiki/RotomotionCode) and pushed it directly onto my microcontroller. The sensors are sampled at 50 Hz and fed to the KF at the same rate. In the next plot you can see the raw tilt angle (atan2(raw_accy, raw_accx)), the integral of the gyro signal (using the trapezoidal rule) and the output of the Kalman Filter.
As you can see, the raw tilt angle is a little noisy, while the gyro integration is very clean. There's also a lot of drift in that signal, yet the KF magically manages to estimate the angle very accurately and free of noise.
Now it's time to try the filter with stronger movements and vibrations before feeding its output to the PID which will - hopefully - make the robot balance 🙂
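For completeness: a much lighter-weight alternative to the Kalman 'black box' is a complementary filter, which high-passes the integrated gyro and low-passes the accelerometer angle. The 0.98/0.02 blend below is a typical illustrative choice, not something taken from my robot:

```c
/* One step of a complementary tilt filter: trust the gyro for fast
 * changes (it's clean short-term) and let the accelerometer slowly pull
 * the estimate back to the true tilt (it's drift-free long-term). */
double comp_filter(double angle, double gyro_rate,
                   double accel_angle, double dt)
{
    const double alpha = 0.98;   /* illustrative blend factor */
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle;
}
```

It's a single line of arithmetic per sample, which is why many balancing robots use it when a Kalman filter feels like overkill.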
I'll briefly explain how to generate a signature file for a given library so you can import it into IDA Pro and have the library functions identified by the disassembler (which can save you hours of digging through 'well-known' functions).
Requirements: FLAIR tools installed.
Execute the COFF parser
> pcf ms32.lib miracl
ms32.lib: skipped 0, total 432
>sigmake miracl miracl
You might get collision errors here:
See the documentation to learn how to resolve collisions.
: modules/leaves: 9021136/432, COLLISIONS: 382
At this point, just edit the .exc file, remove the comment lines at the top and re-run the sigmake command.
Now you'll see a miracl.sig file ready to be imported from the FLIRT signatures window in IDA Pro.
I've developed a little application that switches the audio output from the rear speaker to the front one and vice versa. This is useful for VoIP applications, which are quite unusable without headphones since the audio comes from the back speaker. It just runs for 10 minutes and it's supposed to work at least with the latest HTC models.
I'm working on a project which involves I2C communication between a master and some slaves. The master device uses an LPC2148 microcontroller running at 60 MHz, and the slaves use low-cost PIC microcontrollers. Each slave runs a different task, and some of them are more CPU-intensive than others.
The communication is essentially a query-response protocol in which the master requests some processing from the slaves and they send back the results. The processing time varies from one slave to another, and sometimes it is longer than the master's 'clock' time.
This led me to find a way to stall the master until the slave is done with its processing: clock stretching. The slave pulls the clock line low, causing the master to wait; when the slave finishes its task, it releases the SCL line (which goes high thanks to the pull-ups required on the I2C signals).
As I was writing the slave code in C using the PICC compiler, there was no way to implement this technique directly with the supplied i2c functions, so I had to implement it myself:
if(state >= 0x80) {        // Master is requesting data; hardware holds SCL low (CKP = 0)
    delay_us(400);         // Simulate processing delay while SCL is stretched
    bit_set(SSPCON, CKP);  // Release SCL so the master resumes clocking
}
Please note that the values defined for the SSPCON register/bits are for the PIC16F677 microcontroller and may differ on the one you're using.
Below you can see a screenshot of the logic analyzer output I used for testing. You can see the 400 µs delay between two consecutive reads from the master while the SCL line remains low.
All in all, it's a well-known technique described in the protocol specification but, as far as I can tell from googling a little, it isn't widely used and there isn't much source code out there addressing it.