Baud-rate analysis is performed using a 16-bit timer and information obtained from the external interrupts, INT1 and INT2. The time elapsed between each event is recorded in a buffer using the 16-bit timer. The timer is free-running, with any overflows recorded between external interrupt events.

The software sets the internal timer to three megahertz (the main clock divided by eight). This gives a timing resolution of 333.3 nanoseconds. So that it can also service human-interaction events, the timer is set to reset every five milliseconds. At each reset an overflow is recorded, the LEDs are updated and a data time-out is checked for.

Although mostly redundant, the logic level after each event is also recorded. The levels are stored as single bits, whereas the main event-time array stores the timer values as 32-bit numbers. The event memory is reset if no events have occurred for one second. In its current form the software has memory for up to 128 events; at a standard 8N1 encoding this equates to at least 12 characters, which may seem like a very small data set, but in practical tests it was found to be plenty. The event times are stored as raw clock counts.

When designing the software for the analysis, I aimed to simply mimic the actions I would take manually and automate them. The trouble with this approach was that so much of successfully tackling these problems in real life relies on educated guesswork from prior experience. I was able to emulate this through the use of a simple baud-rate table since, more often than not, a standard baud-rate is used (students are often amazed when you look at a rough 100-microsecond pulse and say, "hmmm, I bet that's 9600 baud").

I thought of using a similar trick for the encoding schemes but felt it was better to go for a brute-force approach, as this made it possible to find that one-off crazy setting that you would never use yourself. My manual approach is to try different encoding schemes in order of popularity from my experience, so I usually start with 8N1, 7N1, 8E1, etc., until I see some data that looks about right. The micro-controller is fast enough to simply try every valid possibility. Of course, the other difficulty is defining exactly how to quantify "about right" in terms of C code and arrays of data. Luckily, that is something UARTs do every day and is well documented. I found Jan Axelson's "Serial Port Complete" to be extremely helpful in this respect.


The shortest event is easily used to determine the baud-rate. It is then normalised by selecting the closest value from a table of common baud-rates.

The event data is then converted from time data to bit-length data using the nearest baud-rate as calculated above. So a 300-microsecond event is said to be three bits long if the baud-rate selected is 9600. This data is then used to look for consistent positioning of start, stop and parity bits.

Listing 1: Baud Rate guessing routine from main.c file

#define BAUDRATES 17

/* Initialiser values restored from the standard baud-rate series;
   the original listing's table contents were lost in extraction. */
const unsigned long BAUD_TABLE[BAUDRATES] = {
    50, 75, 110, 134, 150, 200, 300, 600, 1200, 1800,
    2400, 4800, 9600, 19200, 38400, 57600, 115200
};

/* ****************************************************************************
Name:       guess_baud
Parameters: none
Returns:    char index to BAUD_TABLE
**************************************************************************** */
char guess_baud(void){
    char guess, best = 0;
    unsigned long actual;
    float diff, last;

    if (event_total == 0)   /* no events captured yet */
        return 0;

    actual = int_baud();
    if (actual > BAUD_TABLE[0])
        last = (actual - BAUD_TABLE[0]);
    else
        last = (BAUD_TABLE[0] - actual);

    for (guess = 1; guess < BAUDRATES; guess++){
        if (actual > BAUD_TABLE[guess])
            diff = (actual - BAUD_TABLE[guess]);
        else
            diff = (BAUD_TABLE[guess] - actual);

        if (diff < last){
            last = diff;
            best = guess;
        }
    }
    return best;
}



When I first wrote this software I called it a "statistical analysis". I don't know if that term is strictly correct, but the idea is simply to try everything and see how often we were right, then pick the best solution. This type of software is, of course, only as accurate as the sample data and is prone to being wrong on rather small sample groups. The astounding thing was how accurate it turned out to be.

The first part of the encoding analysis is a simple routine that looks for start and stop bits and then counts the number of times it has successfully found a word at a given word length. This is repeated for each word length from 7 (5N1) to 12 (9x1). The one with the highest score is then assumed to be the total word length.

Listing 2: data length analysis routine from bit_analysis() in main.c file

// convert events to bit time (rounding to nearest period)
period = FREQ_RESOLUTION/BAUD_TABLE[guess_baud()];
for (i=0;i<event_total;i++){
    calc = ((float) event[i]+(period/2))/period;
    if (calc > 12)
        calc = 1; //filter large pauses
    event_bits[i] = (char) calc;
}

// produce histogram for statistical analysis
for (i=0;i<event_total;i++){
    if (event_bits[i] < HISTROGRAM_MAX)
        histogram[event_bits[i]]++;
}

//check start stop combinations 1 stop bit
for (bitLength=7;bitLength<13;bitLength++){
    bitsReceived = 0;
    for (i=0;i<event_total;i++){
        bitsReceived += event_bits[i];
        if (bitsReceived == (bitLength)) {//this is a single stop bit
            if (!(event_lvl[i/8] & (1<<i%8))) {// stop bit is set
                score[bitLength]++; // add one to the score
                bitsReceived = 0;
            }else{// we've got exact bits but wrong level?? do we go forward or back??
                bitsReceived = 0;
                i++;// try starting from next bit
                //i-=2;// try starting next word from previous bit.
            }
        }
        if (bitsReceived > (bitLength)) {// overran a stop bit
            // this should find next/last 0 bit
            if ((event_lvl[i/8] & (1<<i%8)))
                i++;// try starting next word from previous bit.
            bitsReceived = 0;
            // break;
        }
    }
}

// analyse best score/ total bits received.
// find best bit rate
j = 7;
for (i=7;i<13;i++){
    if (score[i]>score[j])
        j = i;
}

Once the word length has been established, the software assumes that there is only one stop bit and analyses the data for the parity bit. If it finds a given parity correct for 100% of the possible words in the event memory, the communication is deemed to use that parity. If it finds all the words correctly but with mixed parity, it determines there to be no parity. The software could easily be extended to look for more than one stop bit.

Listing 3: Parity analysis routine from bit_analysis() in main.c file




//error = (wordsTotal*100)/wordsPossible;
if (j>6){//found a winner
    wordlength = j;
    sprintf(display[0],"%d words",wordsTotal);
    sprintf(display[1],"%d bit",j);
    DisplayString(LCD_LINE1 ,display[0]);
    DisplayString(LCD_LINE2 ,display[1]);

    // analyse Parity
    parityBit = wordlength - 2; // parity sits just before the stop bit
    bitsReceived = 0;
    parity = 0;
    even = 0;
    odd = 0;
    for (i=1;i<event_total;i++){
        if (event_bits[i]>12)
            event_bits[i]=1; // blast long pauses
        bitval = ((event_lvl[(i-1)/8] & (1<<(i-1)%8))!=0);// get pulse value
        for (k=0; k<event_bits[i]; k++){ // expand out into individual bits
            word[k+bitsReceived]= bitval;
            parity += word[k+bitsReceived]; // keep track of parity
        }
        bitsReceived += event_bits[i];
        if (bitsReceived == (wordlength)) {//this is a single stop bit
            parity -= word[parityBit] + word[wordlength-1]; //don't count the stop bit and parity bit
            // check the parity bit
            if (parity & 0x01){ // odd number of ones
                even += word[parityBit];// 1 is correct
                odd += (1-word[parityBit]);// 0 is correct
            }else{
                even += (1-word[parityBit]);// 0 is correct
                odd += word[parityBit];// 1 is correct
            }
            bitsReceived = 0;
            parity = 0;
        }
        if (bitsReceived > (wordlength)) {// overran a stop bit, pause between chars??
            // this should find next/last 0 bit
            if (((char)event_lvl[i/8] & (1<<i%8)))
                i++;// try starting next word from previous bit.
            bitsReceived = 0;
            parity = 0;
            // alert();
            // while(S2);
        }
    }

    if (even == wordsTotal){ //even parity
        //Databits = wordLength - 3 // assuming 1 start 1 stop 1 parity
        sprintf(display[1],"%d E 1  ",wordlength - 3);
        DisplayString(LCD_LINE2 ,display[1]);
        gParity = EVEN;
        gDataBits = wordlength - 3;
    }else if (odd == wordsTotal){ // odd parity
        sprintf(display[1],"%d O 1  ",wordlength - 3);
        DisplayString(LCD_LINE2 ,display[1]);
        gParity = ODD;
        gDataBits = (char) wordlength - 3;
    }else if( ((odd+even)==wordsTotal) && (wordsTotal==wordsPossible)){// guess no parity
        sprintf(display[1],"%d N 1  ",wordlength - 2);
        DisplayString(LCD_LINE2 ,display[1]);
        gParity = NONE;
        gDataBits = wordlength - 2;
    }else{
        DisplayString(LCD_LINE1 ,"NoMATCH");
        // DisplayString(LCD_LINE2 ,display[0]);
    }
}

Future Plans

In developing QuickComs, I was amazed at how I was able to take a small sample of data, perform some simple analysis and have a useful device. I was also amazed at how tolerant the system was of errors given the simplicity of both the software and hardware.

Since designing this competition entry I have begun developing the all-in-one super RS232 tool. However, it is sadly still in development, neglected under some boxes just beside my workbench. With this latest reincarnation, I hope to integrate all the features of the QuickComs project with some new ideas, turning it into a pocket-sized stand-alone terminal and consolidating its place as the technician's best friend.