> I've never been a huge Java fan, but then I don't write much that needs to be cross-platform compatible. I use Visual Basic for user interface stuff, and C++ for real programming. C is much better suited for microcontroller use than Java. Yeah, you've got to pay attention to pointers and memory allocation, but that close to the hardware you WANT that kind of control.

A lot of people go down this path of "Java is what you use for cross-platform", and then wander onto the path of "when you're close to the hardware, you need pointers". Java does in fact provide a road to cross-platform software. It does this by eliminating language constructs that are directly aligned with current computer and microcontroller hardware. What is more prominently at issue is the "memory and instructions" versus "objects and capabilities" paradigm. When you change your mindset to "objects and capabilities", you'll probably find yourself frustrated with "memory and instructions".
PICs have some problems in this area: you have to use macros all over the place to refer to control registers and to manage bank access to memory. You have to pay attention to the details and use good software engineering practices. My small experience with a handful of uC programmers has led me to believe that many are good at getting close to the hardware, being clever, and squeezing the most out of the platform they've been given.
MS-DOS is a great example of this whole concept. Because people were not kept away from the hardware by good APIs, it has taken forever to get the software away from the hardware. Only now is it finally far enough away that a good multi-user OS with real resource allocation can be squeezed in between the user apps and the hardware.
There are many C, C++, and VB APIs that are similar in capabilities, and some are appearing that follow the Java APIs. This is where the portability of Java (as opposed to VB) provides a decided advantage.
Currently, the choice of Java platforms for microcontroller applications is not as plentiful as for C, ASM, or C++. But consider the age of those languages versus the age of Java, and consider how much momentum Java has accumulated in the past 8 years (30,000 people at the JavaOne conference each year). I can appreciate everyone wanting to take advantage of their own experience and expertise on various platforms. I have spent considerable effort on applications for VMS, UN*X, proprietary platforms, embedded platforms, etc. over the past 20 years. When I look back at so many applications that I wrote, which I can't use anymore because of platform and language portability issues, I get a little frustrated.
What | Language | OS | When |
---|---|---|---|
vish - a shell | C | UN*X | 1983 |
Kermit - file transfer | Z80-ASM | TRSDOS | 1984 |
VITPU - vi emulator | TPU | VMS | 1985 |
SCRSMB - scripting print symbiont | Fortran | VMS | 1986 |
SCHED - A long term scheduler to manage CPU use for 20 math and statistics users on a VAX 11/750 | Fortran | VMS | 1986 |
CTLSMB - process control queue symbiont | Fortran | VMS | 1986 |
After college, I went to work at AT&T Bell Labs in Naperville, IL. At the AT&T (and later Lucent) 5ESS development labs, I worked in the software retrofit group for 9 years. We were responsible for making the software upgrades to new releases work while the switch was running. The software there was a mess because of too many languages and too many people without adequate training (remember when the 4E's crashed across the U.S. because someone used 'break' in a switch statement thinking it would take them out of the outer 'while'?).
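Java inherits this exact semantics from C: a `break` inside a `switch` exits only the switch, never an enclosing loop. A minimal sketch of the pitfall in Java (the loop, data, and messages here are illustrative, not from the actual 4ESS code), along with the labeled-break form that does what the original programmer intended:

```java
public class BreakPitfall {
    public static void main(String[] args) {
        int[] codes = {1, 2, 9, 3};

        // Buggy pattern: 'break' leaves only the switch, so the loop
        // keeps running even after we meant to stop at code 9.
        for (int code : codes) {
            switch (code) {
                case 9:
                    System.out.println("stop requested");
                    break;   // exits the switch, NOT the for loop
                default:
                    System.out.println("processing " + code);
            }
        }

        // The fix: a labeled break names the loop it should exit.
        outer:
        for (int code : codes) {
            switch (code) {
                case 9:
                    System.out.println("stopping for real");
                    break outer; // exits the for loop itself
                default:
                    System.out.println("processing " + code);
            }
        }
    }
}
```

In the first loop, code 3 is still processed after the "stop"; in the second, the labeled break actually terminates iteration.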
I wrote many tools for the Blit/630/730 terminals. We had a DMD-5620 at college, and I created some programs for these terminals then too. I wrote a window manager for the 730 series terminals to let you 'tile' windows away to conserve memory. I also wrote an email application that interfaced with the Rand MH mail client (I still use MH to this day but with exmh, a TCL/TK application). It showed faces using the Usenix faces database.
I finished writing a multi-user talk program that allowed conferences, calls, call barring, paging, and delayed messaging (ohh, ask me about digicom, which used the cassette port to send packets over 2M radio and was written in Z-80 assembler on a RadioShack Model-1 Level 2 computer owned by WB5RWS).
All of these things were written to facilitate more dynamic, participative computing environments. The problem was that all the 730 code was written for a specific platform, and the code and APIs were really tied to that hardware. The concepts expressed in the code, and the solutions they provided, were specific to that platform. It was difficult to let go of that stuff when we moved over to SPARC 5 Solaris boxes...
To address the too many languages and insufficient training issues, I created a language at the same time that James Gosling was working on Java, for all the same reasons he has discussed in subsequent publications.
My language was called RCL. RCL was targeted at the specific problems related to procedure execution control and error recovery of procedures used to manage the software upgrades of running 5ESS switches.
RCL implemented strict type checking of values. Variables holding references to those values had no type, so the language was extremely dynamic. I designed the language this way because the people who would be maintaining it were not compiler writers (compilers and OSes were my areas of concentration in college). I did not want the compiler to be something they had to mess with to add new features. Lex and Yacc are not hard to master, but they are another language to learn.
I wanted to limit what people had to deal with to, hopefully, just RCL. I designed it primarily to manage the execution of UNIX-RTR programs that we had to run to manipulate the hardware. We replaced shell scripts with RCL scripts because shell scripts are terrible when people create constructs such as the following.
```sh
answer=`getsomedata | processdata | grep answer`
```

What happens, of course, is that processdata ends up with a bug in it and dumps core. grep gets EOF and just exits with a non-zero exit code, and answer is "". So you never know that processdata has a problem; you just think it did not find the data in its input that you were looking for.
RCL has elaborate process management to let you make sure that you don't have this problem. You can put processes into arrays and then call wait() on the array to get the next termination. The process reference in the array that terminates is marked as terminated and is returned from wait() with the exit code set.
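Java's process API offers a rough analogue of this idea: each stage of a pipeline is its own `Process` object whose exit status you can check individually, so a crashed middle stage cannot hide behind the last stage's exit code the way it does in the shell snippet above. A hedged sketch using `ProcessBuilder.startPipeline` (Java 9+), assuming a Unix-like system with `echo` and `grep` on the PATH; the commands are stand-ins, not the original getsomedata/processdata tools:

```java
import java.io.IOException;
import java.util.List;

public class PipelineCheck {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Stand-in commands; in the shell example these would be
        // getsomedata | processdata | grep answer.
        List<ProcessBuilder> stages = List.of(
                new ProcessBuilder("echo", "answer=42"),
                new ProcessBuilder("grep", "answer"));

        // startPipeline wires stdout of each stage to stdin of the next
        // and returns every Process, so no failure is lost.
        List<Process> procs = ProcessBuilder.startPipeline(stages);

        for (Process p : procs) {
            int status = p.waitFor();
            if (status != 0) {
                System.err.println("pipeline stage " + p.pid()
                        + " failed with status " + status);
            }
        }
        System.out.println("all stages checked");
    }
}
```

Unlike `$?` in the shell, which reflects only the last stage, this loop surfaces a failure in any stage of the pipeline.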
Here is a simple example of RCL. This is not really a useful example, but it shows the main formatting concepts.
Here is a simple example of RCL. It is not really a useful example, but it shows the main formatting concepts.

```
class foo
    # class private variables
    member foo, g, q;

    # constructor
    method init()
    endmethod

    # A method with three arguments
    method bar( a, b, c )
        # check for class object as opposed to int etc.
        if( type_of(a) == CLASS ) then
            # a must have a getValue() method
            b = a.getValue() * c;
        else
            b = a * c;
        endif;
    endmethod;
endclass;
```

This example shows that 'a' doesn't have a type, and the programmer is left to implement the correct use of 'a'. Some people are uncomfortable with this level of dynamic programming because they feel there is no contract you can check to make sure the code is correct.
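For comparison, the closest Java analogue to RCL's `type_of(a) == CLASS` test is an `Object`-typed parameter inspected with `instanceof`. This sketch mirrors the `bar()` method; the `HasValue` interface and class names are mine, not part of RCL or Java's libraries:

```java
public class DynamicBar {
    // Stand-in for "a must have a getValue() method".
    interface HasValue {
        int getValue();
    }

    // Mirrors RCL's bar(): 'a' arrives effectively untyped (as Object),
    // and the code decides at runtime how to use it.
    static int bar(Object a, int c) {
        if (a instanceof HasValue) {            // like type_of(a) == CLASS
            return ((HasValue) a).getValue() * c;
        } else if (a instanceof Integer) {      // like the plain-int branch
            return ((Integer) a) * c;
        }
        throw new IllegalArgumentException("unsupported type: " + a.getClass());
    }

    public static void main(String[] args) {
        HasValue wrapped = () -> 7;              // lambda implements HasValue
        System.out.println(bar(wrapped, 3));     // 21
        System.out.println(bar(5, 3));           // 15
    }
}
```

The difference is that Java makes the "no contract" worry explicit: an unsupported type fails with an exception at the call, much like RCL's 'type use' exceptions described below.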
While converting all of our shell scripts over to RCL, I discovered that I personally did not have problems with this issue. The language checks for 'type use' contracts and throws exceptions if there are problems. I created a code-coverage analysis facility in RCL early on so that I could make sure my tests covered all paths through the code. When running in the labs, statistics files were created by the VM; I could take those and run the coverage tool against them, with the source files provided too. It generated printable output files with either the executed or the unexecuted code in bold. We could thus visually inspect the unexecuted code (typically exception handling) and pronounce it good to go, or write or alter a test to cover that case.
James Gosling, who was part of many large software projects, seems to have learned lessons similar to those that I have learned. He limited the things that were in the initial release of Java (he actually took out some things that will reappear in the language in version 1.5) to make sure that it was correct first.
For JDK1.2, Sun tried the 'features first' track, and got bit in the back side, big time. I personally did not use jdk1.2 for anything because it had so many problems. I used jdk1.1.8 until jdk1.3.1 came out.
There were many Java open-source projects that appeared on the web and just never happened in the jdk1.2 time frame, because the platform was just not ready for prime time at that point. My apps did not work reliably because I had highly dynamic memory use, and the reworked GC, plus the broken Symantec JIT compiler, just sank the ship.
If you look at other languages that have developed in the public eye, they have all had similar problems, with the authors yielding to the feature mongers and creating instability that was hard on all the users. The article by Larry Wall at http://www.perl.com/pub/a/2001/04/02/wall.html is a great example of how historical compromise away from 'correct' creates language problems that just can't be solved without further compromise. So old, broken, or useless features continue to exist in the language and allow people to make mistakes with them forevermore.
Sun continues to whittle away at Java, and for 1.5, they are going to add features. The implementation of those features has had careful consideration. I think there are good things happening, and some not-so-necessary things happening. But the more the community gets involved in the evolution of Java, and the more Sun bends toward the demands of those with the $$, the more you will see Java bending in the direction of the other warts festering amongst us...