I love Silicon Valley, but lately I’ve been having a hard time enjoying the current one. I’m much more infatuated with the one from the 70’s through the 90’s.
The 70’s had the rise of Pong and the first microcomputers (the Apple, CoCo, Atari, Sinclair, S-100, Osborne, and Commodore systems).
The 80’s had the flourishing of the Apple II, the Atari 800, and the introduction of the Macintosh. The Mac kicked off the 68000 era that the Atari ST and the Amiga followed. Then the next era belonged to the workstations from Sun, Apollo, DEC, SGI, and my personal favorite… NeXT.
The 90’s had the rise of the 486 and the Pentium, which made that technology affordable. Combine that with the Internet, the browser, and inexpensive networking, and some great things happened (look at Palm).
In the early 2000’s, all we’ve had are a few pops and bangs (Napster, Google), but nothing that makes me REALLY go Wow! I think this is due to the rise of open source and a tremendous lack of innovation. Look at the Web. So tired. Nothing new and improved is happening here. All the blog postings are about the “Standards-Based Web”. Most of these people do not know a bad design when they see one. How is the web going to become easier to use, more appealing, and more accessible if it requires huge codebases just to render a simple form? The VT100 of the 90’s has become the big, incompatible pig of the 2000’s.
Part of this rant came about because of some nighttime reading of Folklore.org and this old paper by Ted Kaehler on the Smalltalk VM system. Look at how simple it is. Look at how powerful it is. Why can’t this kind of design and innovation exist today? People in the industry continuously make some very poor choices, and the vast majority of the programming/technology industry just eats it up.
Maybe it is just this year…
I mean, the 2000’s did bring WiFi, some amazing 3D, and the sub-$1000 supercomputer.
Maybe it is just the copy-catting that goes on. There are now only two to four major CPU architectures, and only about two OSes. It’s kind of boring.