What a sweet looking little interface on the upcoming BenQ MID.
No, I am not a Linux user, and especially not a fanboy/fanboi, but from an interface design perspective, Linux is poised to become the clear winner of the alternative interface wars. The reasons are simple.
A) No corporate identity. If a Linux distro wants a certain branding-type identity, someone just branches the distribution and codes away. The roots of this 'feature' are in the design of the X Window System and the various desktop environments, like KDE and GNOME, and in the philosophy that the interface is an extension of the underlying commands in the system. Don't like that Apple menu bar at the top? Sick of a certain Start button? Too bad. Millions have been spent on figuring out how big to make the menu fonts, the order of each item, etc., and with the widespread adoption of Windows and MacOS in their iconic interface forms, probably 100x as much money is spent on making sure version X.5 looks and operates similarly enough to version X that people won't be calling tech support and businesses won't need to retrain employees - a huge time/money sink, and one that is almost never budgeted.
B) The unstoppable march of the digital device swarm. While the popular window managers are still 'meh' in many respects, they have also been competing with highly organized, profit-driven companies that spend millions of dollars on consumer research. People generally don't go to Linux because it is particularly easy to use or good looking, but now that companies are starting to put dollars behind Linux interface design (since it is widely recognized as the Achilles' heel of Linux), the money factor is about to change. Netbook, MID, smartphone, and specialty computer makers are realizing that they don't need all the library overhead and background processes used in a modern desktop OS on devices meant for stereotyped tasks. The result? Better battery life (fewer computations, simpler computations incorporated into low-level, modern instruction sets, less disk access), faster performance, and fewer errors/crashes/conflicts.
C) Multimedia barrage. I remember downloading MP3s back in 1997, while at college. You better believe that once I found a patch cable long enough to reach my stereo, my room was THE party room in the dorm. Pissed off that smug bastard with the 100-disc CD changer, too! But, seriously, if there is one positive outcome from the RIAA and MPAA* suing instead of innovating, it is that media formats have remained mostly the same. Good thing they spent years and tens to hundreds of millions of dollars to find out people want to play the music they buy on any device they choose (give Apple time - they'll pack it up soon enough), because encoding and decoding to the few popular codecs in use can now be done in hardware. Before this stagnation, it made less sense for, say, NVidia to put MPEG decoding in hardware, or at least to repeatedly refine the ability so that it consumes less power and operates with better stability. The same goes for software. Coders have been given the time to optimize, understand, and tweak media decoding. Sure, there are a couple dozen different codecs floating around, but that could have been much worse**. Let's not forget that the whole open source software movement was also, at the same time, figuring out how to manage collaborations among many programmers in sparse locations. The result is that we now have cell phones running full-motion video, etc. On the Linux side, this has translated into valuable time to support modern media formats (attract media-type programmers to a decidedly computer-y OS, work out revisioning logistics, create a legal base/consciousness within the community). Without the benefit of corporate talks/alliances/contracts, OSS needed to implement media-centric features in a stable and reliable fashion. That has been largely accomplished.
* - To their credit, the MPAA has been MUCH more reasonable on this front, which is why I still go to movies and pay for rentals, but have not purchased a new CD or a paid music download in at least the last 6 years. I've contributed to artists through secondary means, like concerts, buying games they have tracks on, and snatching up any DVD-Audio they might possibly release.
** - Innovation on the media front has been largely squashed by IP holders once they figured out people were copying content for free. Any new attempt to make a better codec would run the risk of being legally hammered into oblivion if it gained a sizable user base, getting mired in copy protection at the insistence of IP holders, or being developed just far enough for the owners to sell it off, get their money, and run. The legacy codecs we use are there only because corporations acted so remarkably slowly and then started a legal shit-storm for anyone associated. When was the last time you saw someone distributing a product for free to millions of people, customized to their exact tastes and with immediate access, and it took the IP holders years to even recognize its existence?
D) Dr. Halo or How I stopped worrying about my child's social life and learned to love the video game. Thank video games for the unstoppable march of realism and complexity in 3D environments. Unlike video, which has several targets (movie, transmission, streaming, local playback, etc.), video games have had relatively few, making hardware the limiting factor. We can argue about changing 3D standards helping this, but the mantra of "if you're not maxing out the hardware, you're not taking full advantage of it" has egged developers on. Video games also pioneered the HUD, the menu system, navigation nuances, and many foggy cognitive features of human-computer interaction whose surface we probably won't even scratch over the next 50 years. While 3D support for Linux is still largely dependent on how nice the hardware folks are about releasing source code (which varies greatly), basic 3D operations have been removed from proprietary APIs and placed in lower-level, quasi-auto-detected interactions of hardware and software (instruction sets), optimized at the library level, and generally 'universalized'. So, while specular mapping from 8x anisotropically filtered internal reflections on the bump-mapped surface of a volumetrically rendered light source might not be in the cards right now, fading and rotating a rectangle isn't a big deal, and that's all you need for a neat-o menu effect.
E) A new breed of 'digital ready' people. 97% of school children play video games. 1% of incoming freshmen at colleges have land lines, while 99% have cell phones. 17% of US homes overall (including mine) have only a cell phone. How many high school students do you think have ever seen a hand-cranked window in a car? And for the parents, remember when YOU discovered Playboy/Playgirl? Yeah, imagine now. The biggest change in culture between the PC of 1995 and today is the user, and this is arguably true for any type of significant tool at the timescale of generations. We evolve with the tools we produce because of the tools we produce. This rule applies to culture as much as it applies to the mess of neural networks in our heads.
Younger folks, and I like to include myself as one, just 'get' electronics. We can fuck with a car stereo for 30 seconds and, through some ingrained, trial-and-error process, figure out how to set the clock, add favorite radio stations, and contact the International Space Station. Whether it is Windows, Mac OS, Linux, BeOS, AmigaOS, or whatever, as long as some thought was put into the interface, we're (mostly) fearless. Yes, grandpa, we know all about how efficient the command line was for punch-card computing. Unless you are actually interested in computing, no one cares. This is where the Mac has generally shined, but mostly because it was only ever compared to Windows, and generally from the perspective of someone familiar enough with both to know that either tool would accomplish the tasks they needed a computer for.
Once this seemingly innate understanding of computing interfaces is itself understood, it will be implemented in niche devices. Right now the process is mostly trial, focus-group testing, and error, which has taken us far enough that makers of low-adoption devices have produced some pretty slick interfaces. Once we have better rules and theories about this generation's (X, Y, or whatever it is called) specific differences in HCI, it will be easier for those closest to the goal and with smaller user bases to adapt. Those devices are largely Linux based.
Anyhow, don't wipe your PC yet. I'm not talking about desktop computers, and those are likely to run a completely different course. But don't shy away from other devices just because they happen to have an unfamiliar interface. You might just find it more usable than your *ack!* iMac!