This is totally off-topic, and if you’re not interested in the business of blogging, this will probably bore you out of your gourd!

In an earlier post, I commented that there was more to blogging than met the eye. As part of my 30-day experiment (we’ll navel gaze about that tomorrow), I decided to make small improvements to the blog:

  • An RSS-feed subscription icon has been added to the sidebar.
  • A Technorati stats & search box has been added.
  • A subscriber-count graphic is now displayed.
  • Posts now have links to email them, or submit them to del.icio.us and Digg.

All these changes were made possible thanks to FeedBurner.  If you are blogging, do yourself a favor and, if you have not already, check them out.  They manage the RSS subscriptions.  Adding the various stats and linking abilities to your blog is a snap.  Simply click a few checkboxes and paste a bit of code into your blog template, and the various items show up automagically.

But the real reason to get on FeedBurner is the stats.  You can see how your subscriber base changes over time. During my period of low blogging, the subscriber count went down, and after 29 days of blogging, it went up. Imagine that!  The best part is the new site stats, where you can see how people landed on your page.  It’s always fun to see the search terms that lead to you.

We’re now returning to our regularly scheduled programming…

According to my better half, I have a serious problem.  It is bordering on an addiction. I like books! We have books strewn throughout the house.  I don’t think there is a room in the house without books; even the kitchen has over 50 books…

Like most people, we have numerous works of fiction, paperbacks & hardbacks alike… The usual authors liked by techies: Asimov, Vernor Vinge, Neal Stephenson and many popular ones.  In the office, a vortex of all things paper, technical books dominate.  Everything from the history of cryptography to Object-Oriented Design, Design Patterns, C++, Python, Tcl, compiler design, etc…  While I’m not the biggest spender, I’m a very good Amazonian :-)

A few years ago, O’Reilly came up with a service called Safari.  This is essentially an online reference library containing over 3000 complete books from a number of publishers.  As I have been a big fan of O’Reilly books since the early 90s, I decided to subscribe.  For $20/month, you get a bookshelf with 10 slots.  You can peruse the content of any book in enough detail to decide if it’s something that would interest you.  If it is, you simply add it to your bookshelf.  You can then browse the book online anytime you feel like it.  Once you’re done with it, you can remove it from your bookshelf and replace it with another book.

There are only a couple of rules:

  1. You must keep a book in your bookshelf for at least 30 days before you can remove it
  2. The books are for your personal use only.  No sharing!

You also get 5 tokens every month that allow you to get chapters in PDF format, so you can print them out if you are so inclined.

Another service offered is the availability of “Rough Cut” books, which are essentially pre-publication books.  It allows you to look at a book anywhere from 2 to 4 months before general availability.

They have also introduced the Safari Library, a $40/month plan which does away with the restrictions.  You have access to the entire library, with no limit on the number of books you can see at a time.

All in all, it’s a great service at an affordable price, but due to my Luddite tendencies, it will never replace the feel of a real book.  I have been using Safari for over 2 years now.  I find that it’s a great way to look at a book and evaluate the content.  If the book is a keeper, I end up purchasing a hardcopy.  There is just something about paper…

We have been using PIC based micro-controllers for the last year, and have generally been satisfied. One of the annoying things we ran across is that the smaller controllers (with only 8 pins) don’t support debugging. To develop (and debug) your code, you need to use a special ICD version of the chip which only comes on an adapter board. Honestly, that is just a minor annoyance.
The real problem we ran into is the eternal nemesis of embedded development: firmware upgrades. In order to upgrade the firmware on a PIC based device, you have to connect the programmer to the processor (often using some kind of clip), fire up the firmware upgrade program and download the new firmware image. No big deal, except when you’re in the field, must unmount the unit from the wall, open up the case, connect everything to your laptop, perform the update, close the case and remount the unit. Now multiply this by a few units scattered around campus, and you’ve eaten up a day…

Enter Atmel’s AVR family of processors. The primary reason we’re looking at them? The ability to perform a firmware update under software control. They call this self-programming flash. Essentially, with the AVR processors, you can partition your flash into a small bootloader section and an application section. When running under bootloader control, you can program the application into flash.

This, my friends, is a huge advantage when upgrading units in the field.

Microchip Technology Inc.

Cost: Free
The MPLAB IDE is an Integrated Development Environment for the PIC microcontroller family. Out of the box, it comes with an assembler, but you can integrate your compiler into it. It also supports debugging with one of the PIC debug cables. In addition to the assembler, there are two components which are worth the price: a hardware simulator and a stimulus generator. The hardware simulator allows you to step through your code without having any hardware. Very convenient when the prototype is not ready, or if you simply want to validate the operation of your code. The stimulus generator allows you to simulate activity on the I/O pins. This is nice if you want to debug your I/O handling routines.

One of the quirks I noticed in the simulator is stepping over a function call: when the system returns from the call, you will not break on the instruction after the call, but on the one after that. Only a minor annoyance.
If you are using one of the small processors, and do it all in assembler (like I am right now), MPLAB IDE is the de facto standard, and there is no reason to get anything else. If you are using one of the bigger PICs with tons (for a PIC) of RAM & ROM, and are programming it all in C, MPLAB IDE is still a nice tool to have, as you can use it as your single IDE (and not have to learn yet another one).
If you are developing something on a PIC processor, do yourself a favor, and get the free MPLAB IDE.

Dynamic C : $200
Dynamic C is Z-World’s compiler for the Rabbit series of micro-controllers. The Rabbits are based on the venerable Z-80 architecture, and Z-World (soon to be renamed Rabbit Semiconductor) has managed to make a series of very affordable and versatile processing modules, including Ethernet connectivity. The compiler includes an IDE used to perform the compilation, loading and debugging of the program.

The biggest issue you will face when using Dynamic C is the non-standard mechanism used by the compiler. Dynamic C does not have the normal concept of source files compiled to an object module, and the various objects linked together to produce the final image. Instead, it uses one .C file to hold the main program, and in essence, it includes a bunch of .lib files (really source files) which it compiles together to produce the firmware image.

Dynamic C does not have a normal #include directive. Instead it has a #use directive, which imports the above mentioned .LIB files. Since it does not have header files, functions are declared using a special type of comment:

/*** BeginHeader SerInitialize */
void SerInitialize(void);
/*** EndHeader */
void SerInitialize()
{ ... }
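For comparison, here is roughly how the two models differ when pulling in that serial library (the library name below is illustrative, not an actual Dynamic C library):

```
/* Standard C: include a header of declarations,
 * link against a separately compiled object:   */
#include "serial.h"

/* Dynamic C: no headers, no linker; #use pulls the
 * entire library source into the compilation:  */
#use "MYSERIAL.LIB"
```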

Dynamic C also extends the standard C language with the concepts of costatements and cofunctions. It is a way to perform multi-tasking, but much of the multi-tasking guts are hidden from the user. On the surface, this is appealing, but one quickly runs into limitations imposed by this particular approach.
The problem with this approach is that the code you write can’t easily be used as-is in a standard compiler. This makes it difficult to retarget some of the code, to run unit tests for example. If you have an extensive code base making use of the Dynamic C extensions, it would be nigh impossible to move to a different compiler, or to easily reuse that code.

The libraries that come with Dynamic C are extensive and provide a good amount of functionality. The whole TCP/IP stack is present, including DHCP, an HTTP server and other functionality. Standard functions are also available to access the Rabbit’s 6 serial ports.

Z-World rolls out new versions of the compiler fairly regularly. As a result of the compiler architecture, whenever a new processor module comes out, a new version of Dynamic C must also be released. Unfortunately, there have been a few instances where upgrading the compiler has proved problematic. Some of it has been related to major changes made to the TCP/IP stack between releases 8 and 9 of the compiler.

At heart, Z-World is a hardware company. Their bread and butter is selling their processors and processor modules. Their compiler is only an enabler; after all, if there was no compiler, nobody would choose their processors. Ultimately, the historical decision to have their own dialect of “C” does developers a disservice.

Unless you absolutely need the support for the latest & greatest Rabbit processor (the R4000), or you already have a significant investment in Dynamic C code, there is at least one other alternative compiler out there you should seriously consider.