Well, pardon me for sleeping.
Unfortunately for those of us who have to live in the real world, writers and pundits are paid to tell us what's going to "revolutionize the industry." It's no fun for them to tell us that a new technology is a waste of time. It also doesn't sell papers or magazines.
I can't imagine how many millions of dollars have been wasted chasing the illusory futures promised by computer writers. The pundits just move on to the Next Next Big Thing; those who invested time or money in their prognostications are rarely so fortunate.
So, buffeted as I have been by years of strife in the computer industry, allow me to say, "Java Shmava."
First, I have no personal stake in Java. I've tried it, and concluded it's a pointless product. The writers aren't so lucky. Come hell or high water, they have to pump out copy every month, lest the repo men come for the Saab.
Jumping on a bandwagon like Java can provide copy for months. First, the writers tout the arrival of a revolutionary new product, Java. They can do beta previews, reviews, sales projections, and opinion pieces. When the product eventually arrives, they can report on early adopters, industry alliances, companies that are "missing the boat," and write more sales projections.
When Java fails to set the world on fire, the copy will change to a minor key: "Java fails to meet sales projections." Companies that didn't jump into the bottomless pit will be praised as prescient. Columnists will begin to report the unfavorable financial results of firms that invested in Java. After a while, some writers will even begin to deny their previous efforts, claiming they knew Java was a bust all along. Finally, once the furor is completely over and the damage is done, the experts will dare to write "Whatever happened to Java?"
This cycle has been repeated dozens of times in the computer industry. The multimedia, pen computing, and object-oriented fads are among the most egregious examples in recent memory. All of those technologies turned out to be important, but not in the ways the pundits predicted. They also took much longer to blossom than anyone expected.
All too many computer publications simply regurgitate vendor press releases. Even those writers who try to think for themselves are often afraid to criticize something their peers have praised. They say to themselves, "All those other guys can't be wrong." Well, this time, they are wrong.
The classic example of Microsoft Envy is InfoWorld columnist Nicholas Petreley. Except for a recent two-week break, his columns for the last several months have dealt almost exclusively with the evil and perfidy of the bad boys in Redmond. The richer Bill Gates gets, the angrier Petreley gets.
According to Petreley, anyone who buys a Microsoft product is essentially a moron. We should all be using things like Linux, OS/2, Java, OpenDoc, and so on, not because they work better than what we already have (or even exist yet!), but because it's the Right Thing to Do. The Only Way to Stop Them. He always has technical reasons why Microsoft is bad, but the undertone of hysteria is omnipresent. Reading between the lines, the real reason to adopt all these unpopular technologies is that they don't come from Microsoft. Lately, I've begun to think that if Black & Decker introduced a new style of Toast-R-Oven, Nicholas Petreley would write a column about how this peerless bread-cooker will lead to the downfall of Bill Gates.
That's not to say that Microsoft is benevolent, believable, or even mildly harmless. On the contrary. It's always bad when a market is dominated by a single player. But there's something wrong when every writer in the industry either slavishly admires the Big M or spends every waking moment yearning for the day when Bill Gates will be living in a cardboard box. Whatever happened to objectivity and balance?
In this case, Microsoft Envy was a major contributor to the Java hoopla. Just as New York Times movie critics always give an extra star to foreign films, industry writers gave a big boost to Java because it came from Sun Microsystems. In Sun they saw a profitable company not dependent on Windows, Microsoft, or Intel. They said to themselves, "Aha! At last, the company that will defeat Microsoft!"
The pundits got so carried away that they soon began saying that existing applications would be rewritten in Java, and that Java would replace C as the professional development language of choice within a year or two. Needless to say, this is complete balderdash.
Rewriting in Java makes no sense whatever. Not only would it require a colossal effort, but the payoff is nonexistent. According to Java boosters, the chief benefit of their language is that it is cross-platform, but all platforms except Windows-on-Intel are rapidly becoming irrelevant.
Further, Java is only theoretically cross-platform. Only the lowest common denominator features are actually identical on all platforms. As soon as you try to use the language's more advanced features, platform differences begin to appear.
The classic case of this syndrome is Java run in browsers. Not only does the language behave differently in different browsers, it even behaves differently in different versions of the same browser. For example, several important features are implemented one way in Netscape Navigator Version 2 and another way in Version 3. These differences are so severe that sophisticated Java apps have to be written to either one browser or the other. Best of all, there's currently no reliable way to sense the version!
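Faced with behavior that shifts from one browser version to the next and no trustworthy version string, applet authors typically fall back on feature-testing at run time. Here's a minimal sketch of that workaround in ordinary Java; the approach is a common defensive idiom, not anything the browser vendors document, and the class names probed are purely illustrative:

```java
// Feature detection: instead of asking "which browser version is this?",
// probe for the presence of a class and degrade gracefully if it's missing.
public class FeatureCheck {
    // Returns true if the named class can be loaded in this runtime.
    public static boolean hasClass(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.lang.Object exists in every runtime; the second name is made up.
        System.out.println(hasClass("java.lang.Object")); // prints "true"
        System.out.println(hasClass("no.such.Widget"));   // prints "false"
    }
}
```

An applet can use a check like this to pick a code path that the hosting runtime actually supports, rather than guessing from a version number it can't reliably read.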
Remember that people will be comparing Java programs to the desktop applications they already use. Compared to products like Word for Windows or CorelDraw, the pathetic set of controls offered by Java's Abstract Window Toolkit (AWT) is laughable.
To make their applications comparable to what people are used to, Java developers will have to write and test thousands and thousands of lines of code, code that's already built and tested for Windows and the Mac. They will have to hope that Java's platform differences disappear, write to the lowest common denominator, or write to specific platforms just like today's programmers.
A C++ developer can rely on a class library, such as the Microsoft Foundation Classes, which contains thousands of pre-tested routines. Many of the stickiest Windows programming problems have already been solved in the class library. Java developers are largely starting over from scratch, using a language and development tools that are only months old.
Once again, the experts have it all backwards. According to them, the people who created the applications you use are going to realize that Java has made them irrelevant. All those programmers will just shoot themselves, or, if they're wise, rewrite their products in Java.
Let's think about this for a moment. Which do you think is easier: writing an embeddable, downloadable Excel-equivalent in Java, or adding Internet functionality to Excel? That's right.
Excel, the industry-leading spreadsheet, has been years in the making. Nobody is going to write a Java version of Excel any time soon. There will doubtless be a Java spreadsheet eventually, but it will never compare to the standard.
On the other hand, the Web is built using very simple technology. After all, Netscape Navigator itself is only a year or two old. How hard can it be to add whatever Internet functionality is needed to your favorite applications? The answer, of course, is that it isn't hard.
The pundits have been looking at the Internet the wrong way for quite some time. It's not the operating system or the applications that will disappear: it's the browsers. Microsoft is already showing a Web browser integrated into its operating system; the end of non-Microsoft browser technology on Wintel is now inevitable. Products like Netscape Navigator are classic "middleware": software that performs a function until it's integrated into the operating system.
Makes sense, at least until you think about it. The problem here is not that nobody predicted what the Net would become; it's that very few people stopped to try to understand why the Web suddenly turned white hot.
When you sit back and examine the Web phenomenon, the issue that comes up again and again is simplicity. It's simple to browse the Web, clicking to follow links from one page to the next. It's simple to create Web documents and put them on the Net. The costs are low and the technology is easy to learn.
What the Web did was make it possible for practically anyone to publish practically anything in a way that practically anybody could read. The Internet is one of the most vibrant and democratic institutions on Earth because it's not restricted to the "high priesthood" of computer nerds.
When thinking about Java, it's worth looking back at a computer language that really did revolutionize the computing industry: Visual Basic. Until VB was introduced, the only way to create Windows applications was to use C and the Microsoft Windows SDK.
With its visual metaphor, easy language, and gentle learning curve, Visual Basic made it possible for a large number of people to develop powerful applications quickly and easily. VB was the Windows equivalent of the Web: it opened application development to the masses.
Java does exactly the opposite: it enables a small group of highly skilled programmers to develop slow, limited applications using a cryptic language and primitive tools. Java is the Anti-Visual Basic.
Millions of people have put up documents on the Web, but only a handful of them have the knowledge, tools, and persistence to develop Java applets. Java is simply beyond the reach of most Web publishers. Only the obsessed (or the well-paid) can take time to learn it.
The truth is that Sun made a devastating mistake. They had a great idea: a portable language that could safely be run in any Web browser. Unfortunately, they designed a language that would be familiar to C++ programmers, who are relatively few, rather than Visual Basic-level programmers, who are numerous.
Java may well turn out to be the worst design choice in software history. Sun had an impressive and innovative design goal that they completely failed to fulfill. Java is a classic example of what not to do when solving a problem.
Java was intended to be easy to learn for programmers familiar with object-oriented development in general and C++ in particular. By chance or by design, Java's syntax is just enough like C++ to be completely infuriating. Knowing C++ is a good way to get into trouble in Java.
Further, for security reasons (and to make implementing the Java compiler easier) many important features of the C++ language were omitted from Java. The hard-core developers Sun was apparently courting will find these omissions a constant source of irritation.
For those who don't already know C++, Java might as well be Sanskrit. I have nothing but pity for the entry-level programmer trying to learn Java.
In other words, you can't win. If you have the skills and knowledge to program Java effectively, its idiosyncratic syntax and missing features will drive you insane. If you don't already have those skills, trying to acquire them may drive you to despair.
To really plumb the depths of the nightmare, factor in the Browser Security Paradox. If you're a developer, Java in a browser has way too much security. Almost all the things you'd like to do are prohibited, and more fences are being built every day.
If you're a user, Java doesn't have enough security. Every few weeks some hideous new security breach is detected. Seeing the words "Applet Running" in the browser status line is enough to give anyone the heebie-jeebies.
Fortunately for us chickens, there's no need to learn Java.
If you develop an application with a C++ compiler, the result is an executable machine-language file. Your program has access to all operating-system functions, including those that could be used to wreak havoc. Also, the program will only run on the processor and operating system you built it for. Obviously, these are serious disadvantages for a program meant for the Web.
When they set out to make Java safe for the average Web wanderer, Java's designers had a very clever idea. When a Java developer compiles his program, the result is an intermediate language called "byte code." This byte code doesn't execute directly. Instead, it is executed by another program called the "Java Virtual Machine."
This approach has two major advantages. First, the virtual machine can (theoretically, at least) prevent operations that might be destructive to the host system. Second, to support a new platform, only the Java Virtual Machine code needs to be ported; the Java applications themselves needn't change.
Byte code is the reason that Java itself will perish, yet the idea of Java will persist. The virtual machine executes byte code in the same way that the CPU in your computer executes machine code. Just as your system's CPU doesn't know whether the machine code it's running came from a Pascal, C, C++, or Assembler program, the Java Virtual Machine has no way of knowing what language actually created the incoming byte code.
In other words, you don't necessarily need to write Java to create Java byte code. That one simple fact will be the Web applet's salvation.
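The pipeline described above can be seen with the standard JDK tools. A minimal sketch (the class name is illustrative): `javac` turns the source into platform-neutral byte code in a `.class` file, and any conforming virtual machine then executes it without knowing or caring what language produced it.

```java
// Hello.java -- javac compiles this to Hello.class, a file of byte code.
// The byte code is executed by the Java Virtual Machine, not by the CPU
// directly, which is what makes the same .class file portable.
public class Hello {
    public static void main(String[] args) {
        // The hosting VM identifies itself through standard system properties.
        System.out.println("byte code running on JVM "
                + System.getProperty("java.version")
                + " (" + System.getProperty("os.name") + ")");
    }
}
```

Run `javac Hello.java` and then `java Hello`; the intermediate `Hello.class` is the byte code this column describes, and a compiler for any other language could emit a file in the same format.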
Just as application developers are feverishly adding Internet functionality to their products, so are programming tool developers writing compilers that will produce Java byte code. Within months, you will see C and C++ compilers for byte code. More importantly, you will see languages similar to Visual Basic that produce byte code.
That's what will explode on the Web. The browsers will never know the difference, but you and I will. Simplicity. The ability to develop applets for the Web without being a propeller-head. Give me a simple language that produces Java byte code and I will move the world.
In a year or two, you'll be able to ask yourself: if everybody's running Java byte code, but nobody's writing any Java, does Java exist?