Wednesday, November 22, 2006

Death Wish of a Fading Rock Star?

I'm not coming at this as an Apple aficionado. It simply strikes me that people can be so arrogant and so blind to simple facts of life.

Ed Colligan is the CEO of Palm and obviously a bright person. Sure, this was something he answered on the spur of the moment when a reporter asked him to comment on rumours of a possible Apple phone. Spur-of-the-moment answers are interesting, though. They tell you either what a person's gut feeling is, if they haven't given the matter much thought, or what they truly think after having thought it through considerably. In either case, brushing Apple off in this manner is worrisome.

The situation reminds me very clearly of a company all-hands meeting in the summer of 2003, while I was working at Roxio.

BACKGROUND: Roxio had just recently bought the Napster brand and planned to re-launch it as Napster 2.0. Of course none of the original infrastructure could be used, because the original Napster didn't really have any infrastructure, so Pressplay was also acquired in May 2003 to actually run the (then) subscription-only service. Basically, Roxio had put almost all its spare cash into these purchases. More info here:
http://en.wikipedia.org/wiki/Napster#Current_status
and here:
http://en.wikipedia.org/wiki/Pressplay

In the meantime Apple was busy with a few things. Apple had started staging its multi-year strategy to regain market share and acceptance and to clean up the disastrous effects of the '90s. The first phase was a massive product line cleanup and a focus on core competencies. Part of this was reaching out to part of its core market through the "Rip, Mix, Burn" marketing and engineering campaign, in which the iMac was equipped with CD-burning hardware and software. Shortly after, in October 2001, Apple introduced the iPod, originally a Mac-only product. The term is now synonymous with MP3 player, but at the time it had just started generating buzz.

Then Apple made a marketing splash in July 2002 by announcing a Windows-compatible iPod. At the time Apple already had iTunes, considered by many to be the best music management software around, but it chose to release the Windows iPod with a special version of MusicMatch Jukebox, which was arguably one of the better suited pieces of software available on Windows to get the job done. Fast forward a few more months to the end of April 2003, and Apple announced the iTunes Music Store. With it, Apple had firmly planted its stake in the ground, stating that pay-per-download, NOT subscription, models were what users wanted. But at the time the iTunes Music Store was obviously available only through iTunes and hence only on the Mac; hey, it was Apple after all, what could you expect... right?
Watch the introduction of the original iPod:
http://uneasysilence.com/archive/2006/10/8008/.
Review information about the iPod and its timeline:
http://en.wikipedia.org/wiki/Ipod#iPod_models.
Background on the iTunes Music Store:
http://en.wikipedia.org/wiki/ITunes_Store#Background

So that should be more than enough background. During the Q&A session at the end of the all-hands meeting, one brave soul asked what Roxio's take on the iTunes Music Store was, since there had been some rumours of a Windows version. The response was as simple as can be: "It'll never happen." No one even paused, and the Q&A session went on. I told my boss at the time not to underestimate Apple's position in this market. Fast forward a few months, and in October 2003 Apple held its "hell froze over" event, announcing a Windows version of iTunes, the Windows version of the iTunes Music Store and a completely new generation of the iPod, all in time for the holiday season. Napster 2.0 wasn't even ready yet... Oops!

Now, back in the present, Napster is still met with lackluster acceptance, while iTunes and the iPod hold roughly 85% and 75% market share in the US respectively.

Those comments about Apple were made at a time when Apple had just set in place a strategy to come back from the brink of extinction, and it was only barely starting to work. Other than to the people who really wanted to believe Apple could make a comeback, or to companies like eMachines copying the iMac because it was trendy and fashionable, Apple was a nobody. It almost made sense for a comment like that to slip by at the time. But in this day and age? How can somebody as bright as the CEO of Palm say something like "it will take Apple at least as long as us to get the right mix"?

He makes some logical conclusions about how Apple might go about executing a mobile phone strategy, but the only problem is that his reasoning is set 3-4 years in the past. For one, you no longer have to worry nearly as much about the childish bickering among the various carriers: you can operate as a Mobile Virtual Network Operator (MVNO). Virgin Mobile is one of the very successful MVNOs, while companies like Disney have had more trouble (I think because they don't have the right mix). Here's a good article summing up the current state of MVNOs: http://news.zdnet.com/2100-1035_22-6106423.html. So let's see: Apple has core competencies in hardware and software (especially in integrating the two really, really well), it has shown a knack for finding the right mix of content and features that people will go for, it has a CEO who might just happen to have good connections at Disney (read: might have access to contacts to get a head start over the hurdles of setting up an MVNO), and it runs an incredibly tight ship.

I'm not saying that's a 100% formula for success, and of course in all of this discussion I took the luxury of assuming that the iPhone rumour was actually true, just so I could illustrate a point about Palm's position. It does seem like a risky move, when you are already struggling as a company, to completely brush off the media and pop-culture darling. I'm picking on Palm here because the example is too easy, but Palm is not alone in thinking like this (I've made my opinions on RIM quite clear in the past, although not on this blog).

Meta-blogging

I really need to blog more!

Why?

For lots and lots of reasons.
- As my day-to-day activities migrate away from academia (for the time being) and more towards programming, I want to keep practising the art of writing effective prose.
- Blogging is a very effective (modern) tool to capture the progression of my ideas so I can review them later.
- I have experienced tremendous internal resistance to blogging regularly. Figuring out what can reduce the activation energy to write regularly might hold the key to finding out how to design interfaces with low activation energy to accomplish tasks in general.
- I have this asinine fear of exposing things that aren't perfect or at least close to completed. I need to start embracing organic development and releasing my need to control everything so tightly.
- I need to figure out how to focus my thoughts. Too often I feel the need to give much too long a preamble to establish shared context, and then I diverge into way too many avenues of the ideas I'm trying to express. This is related to the need to feel like everything I write is "complete."
- But mostly to collect my ideas and share them; I have spurts of creativity that I want to capture, both for myself to come back to and think about more deeply later, and just so they get out there, since I don't have the time to pursue all the ideas that come to mind.
ASIDE: My thoughts on distributed cognition and the socio-emergence of ideas are probably best left for an entirely separate entry (or set of entries). That's all.

Thursday, October 05, 2006

Mental Explorations in the Land of Virtualization - MELV

I work on the team responsible for delivering VMware's hosted suite of products (Workstation, ACE, Server, Player). Looking at that line-up, you could say our part of the company is focused on delivering solutions that push the envelope of what can be done with virtualization. I personally have come to think that virtualization will provide the computer industry with its next paradigm shift; how very back to the future! I think part of the reason has to do with how nascent computing (the industry, field of research, and trade) really is compared to what else is around. Many of the things that are needed to make virtualization worthwhile are just starting to come together.

A particular interest of mine is how virtualization will fundamentally affect the way Joe Consumer uses his computer; how does virtualization affect the end consumer's computing experience? At this point it's a no-brainer to me that virtualization should be used in the data centre, and if you are a developer (web designer, software engineer, or any other flavour) and are not using Workstation, then you are at a disadvantage to your competitors who are using Workstation to be more productive. Plain and simple!

With the easy part out of the way, let's move on to the harder questions. The issue I'll target in this blog post is application packaging and virtual appliances.

I think virtual machines provide an interesting way of viewing how applications should be packaged, for consumers and enterprises alike. Imagine if Oracle developed their next database engine tied to a specific kernel that they tuned to meet their needs, and then shipped it as a VM that could simply be copied onto a SAN, with barely anything else to do to get it up and running. Better yet, imagine a game publisher having complete control over the runtime environment by compiling a custom system into a VM; no more need to worry about what other cruft is already on the gamer's system, since the game runs inside the VM.
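To make the "copy it and you're done" idea concrete, here's a minimal sketch of what deploying such an appliance could boil down to. Everything in it is an assumption for illustration: the appliance directory name, the datastore path, and the use of the vmrun command-line tool to power the VM on.

```python
import shutil
import subprocess
from pathlib import Path

# Hypothetical locations: the appliance is just a self-contained directory of
# VM files (virtual disk + .vmx config) handed to you by the vendor.
APPLIANCE_SRC = Path("/media/dvd/oracle-db-appliance")
DATASTORE = Path("/san/vms")

def deploy_appliance(src: Path, datastore: Path) -> Path:
    """'Install' the appliance by copying its directory onto shared storage."""
    dest = datastore / src.name
    shutil.copytree(src, dest)
    return dest

def power_on(vm_dir: Path) -> None:
    """Boot the appliance; assumes the vmrun CLI is available on the PATH."""
    vmx = next(vm_dir.glob("*.vmx"))  # the VM's configuration file
    subprocess.run(["vmrun", "start", str(vmx)], check=True)

if __name__ == "__main__":
    power_on(deploy_appliance(APPLIANCE_SRC, DATASTORE))
```

That's the whole pitch: installation reduces to a file copy plus a power-on, and the configuration lives inside the VM where the vendor already tuned it.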

VMware is promoting this concept under the moniker "virtual appliance." As far as I'm concerned, there have been a few imaginative examples of virtual appliances, but most are designed with virtualization as a peripheral thought. Another huge issue is the size of these appliances, often ranging from hundreds of megabytes to gigabytes; they are usually full distros plus the few tools that are actually pertinent to the appliance.

In some ways I see this as an opportunity for the second coming of micro-kernels (I know I'm painfully abusing the meaning of micro-kernels here). If there were some streamlined way to create a VM with an open-source kernel containing only the bits you need to run the VM, and then include your pertinent additions, you would end up with a truly streamlined solution. There might be some wasted hard drive space, but we are talking tens or maybe, maybe a couple hundred megabytes. RAM is even less of an issue because of the shared memory page techniques that the VMware platform can offer.
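To put very rough numbers on that, here's a back-of-the-envelope comparison; every size below is an illustrative guess on my part, not a measurement of any real appliance.

```python
# Back-of-the-envelope appliance sizes (illustrative guesses, not measurements).
full_distro_mb = {
    "general-purpose distro install": 2000,
    "application + its config": 50,
}

stripped_appliance_mb = {
    "kernel built with only the needed drivers": 5,
    "minimal userland (busybox-style)": 10,
    "runtime libraries the app actually links against": 25,
    "application + its config": 50,
}

for name, parts in (("full-distro appliance", full_distro_mb),
                    ("stripped-down appliance", stripped_appliance_mb)):
    print(f"{name}: ~{sum(parts.values())} MB")
# full-distro appliance: ~2050 MB
# stripped-down appliance: ~90 MB
```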

All right, enough rambling. I've decided to put my money where my mouth is and personally explore various applications. The basic premise is that you can be creative by, well... being creative. You need to jump into things and experiment with solutions, try them on for size. These are the sorts of ideas that grow organically. You start with some basic stuff and slowly your mind starts to form a mental model. I personally think that the concept of properly using virtual machines is so radically different from how most of us have grown to think about computing that we literally need to bootstrap a new model.

That's quite literally what I've set out to do. I'm going to start very simple (conceptually, anyway) and explore the various ideas I come up with that I think may be interesting to pursue. I suppose I should also set a few ground rules, but nothing so limiting that it will keep me from exploring an idea.

1 - I'll try to keep a series of blog posts about my experiences. All these blog entries will have subject lines ending in "- MELV" for easy searching.
2 - When applicable (if a usable VM is produced by an exploration), I'll do my best to make it available in some form.
3 - The initial explorations will focus on application packaging and virtual appliances. Given my current mental model and the ideas that have been brewing in my head, this is an easy place to start my bootstrapping process.

Note that I've made no promise that an exploration should end in a usable VM of sorts. That would sort of defeat the purpose of exploration now, wouldn't it?

------
Shawn Morel works for VMware but eats, sleeps, and blogs for himself... He certainly doesn't speak for VMware.

Friday, June 09, 2006

Microsoft's new type faces... drool

As some of you may or may not know, Microsoft's typography lab has been hard at work on a new version of ClearType and a new set of typefaces slated for release with Office 12 and Windows Vista.

I'm not ashamed to admit that I'm a typophile at heart, and I've been following the development of these typefaces on and off for a little more than a year. I was browsing the latest articles on OSNews (as I do daily) and came across "A Comprehensive Look at the New Microsoft Fonts," an article about the new typefaces. How could I resist!

Reading the article, I think the author missed some important points, though. Granted, the article does claim to be comprehensive, and my own writing has all the brevity of Buckminster Fuller or Hofstadter. I do commend the author on being brief; it's a skill I'm always working hard to achieve.

I must say that the new typefaces are absolutely beautiful. I'm a computer science student and I do contract work for a software company. I also happen to have a condition that makes my reading speed very slow (I fall in the 4th percentile of the population). For someone who stares at a computer screen all day, a new typeface like Consolas alone would warrant the cost of Vista. Yes, that's right, I would pay $200 for a monospaced font to use in my code editor!

Back to the article, though. The author first points out that the majority of the new typefaces are sans-serif (simply put, that means the letters don't have any decorations on their stems and terminals). A typical example of a serif font is Times New Roman; examples of sans-serif fonts are Futura and the more commonly known Arial. Obviously there are many typefaces of either kind. The author seems a little confused that the majority of the new fonts are sans-serif, since in theory serif fonts are supposed to be more legible.

Typography myth number 1: serif typefaces are easier to read. This is still a debated point, but there is not much conclusive evidence that serifs help with readability (based on testing the reading speed of on-screen type and print). The general idea was that well-designed serifs helped the eye follow the baseline, and that the more information (differentiation) a font displays in its outlines, the more easily the eye can distinguish between the letters of a typeface. What the research shows is that inter-letter spacing (kerning) and inter-word spacing play a much more important role in readability than serifs. This argument is sometimes taken to the point that serifs become a matter of preference and habit: someone who does a lot of reading, and only in a typeface with serifs, might struggle a little at first when they must read something in a sans-serif font, until they get used to it. The same would be true of someone who normally reads sans-serif and then must read a font with serifs.

The other important point to consider is that computer screens are incredibly low resolution compared to print. Considering that most modern LCDs fall somewhere in the 100 dpi range, every pixel you have to represent a letter counts. Representing the spine of the letter as truly as possible is much more important than sacrificing that to squeeze in serifs. Also, at 100 dpi it is nearly impossible to accurately represent the serifs, so the debatable good they may have brought to readability is negated; serifs at this resolution fall more into the category of visual noise than valuable structures of a typeface. Once screens start reaching 300 dpi (and for various reasons I would argue 600 dpi), then we could consider using serif and sans-serif fonts interchangeably for screen reading. Support for resolution independence in modern operating systems, and the social aspects of serif vs. sans-serif fonts brought about during the Bauhaus movement, are two separate articles altogether.
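To see just how few pixels a glyph gets to work with, here's a quick bit of arithmetic; the 10 pt body-text size is my own illustrative choice.

```python
def pixels_per_em(point_size: float, dpi: int) -> float:
    """Pixels spanned by the em square: 1 point = 1/72 inch."""
    return point_size / 72.0 * dpi

for dpi in (100, 300, 600):
    print(f"{dpi:>3} dpi -> about {pixels_per_em(10, dpi):.0f} px for a 10 pt em")
# 100 dpi -> about 14 px for a 10 pt em
# 300 dpi -> about 42 px for a 10 pt em
# 600 dpi -> about 83 px for a 10 pt em
```

At 100 dpi that leaves barely a pixel or two for each serif, which is exactly why they degrade into noise on today's screens.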

The author also brings up the use of the letter 'C' for the names of the new typefaces. These fonts all start with C because they are the first typefaces actually designed with Microsoft's ClearType technology in mind. Originally, fonts like Verdana were designed with the screen in mind and ClearType made them a little better. As we can see with these new typefaces, designing for ClearType from the start has produced some clearly impressive results.

I especially enjoyed some of the conspiracy-theory comments posted by readers about the name choices. Some went as far as to speculate that Microsoft wanted to ensure that the fonts would be at the top of font selection lists. I don't know about you, but personally I have a bunch of fonts whose names start with 'A' and 'B', and Microsoft's fonts will also be scattered among a bunch of fonts starting with 'C.' As for personal bias, well, I'll let you be the judge of that: on campus I work as the Apple Campus Rep, and I do contract work for a software company that is a direct competitor to Microsoft.

Typography myth number 2: anti-aliasing makes fonts more readable. This point was not made by the author but rather by one of the readers in a comment:

"I think you mean for the first time in history on MS-Windows systems. Mac OS X has had an advanced anti-aliasing system in place for a few years now."

Let me say this rather bluntly: anti-aliasing is bad for fonts! Anti-aliasing blurs edges to trick the eye into thinking that lines are continuous rather than discrete pixels. You want your font to be rendered with high fidelity, and anti-aliasing actually throws in some uncertainty, because you can't be sure how the spine of a letter will be deformed and blurred when letters land at arbitrary positions on the screen. I have to agree, though, that in practice OS X does look better for the time being. I think this is partly due to its type system making the spines of fonts a little bolder and having more flexibility in kerning adjustments. This is all going to change soon as we start to see 150 dpi and 200 dpi displays. Microsoft's type rendering engine is, hands down, technologically superior.
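To illustrate the blurring I'm complaining about, here's a toy model of plain grayscale coverage anti-aliasing applied to a one-pixel-wide vertical stem. It's a deliberate simplification (real rasterizers, ClearType's sub-pixel rendering included, do far more than this), and the positions are made up for illustration.

```python
def stem_coverage(stem_left, stem_width=1.0, num_pixels=6):
    """Fraction of each pixel (a 1-D slice across a vertical stem) the stem covers."""
    coverage = []
    for px in range(num_pixels):
        overlap = min(px + 1, stem_left + stem_width) - max(px, stem_left)
        coverage.append(round(max(0.0, overlap), 2))
    return coverage

# The same one-pixel-wide stem rendered at two sub-pixel positions:
print(stem_coverage(3.0))  # [0.0, 0.0, 0.0, 1.0, 0.0, 0.0]  -> one solid black pixel
print(stem_coverage(3.4))  # [0.0, 0.0, 0.0, 0.6, 0.4, 0.0]  -> two gray pixels, a softer stem
```

The same stem is either one crisp black pixel or two washed-out gray ones, depending purely on where it happens to land; that positional lottery is the uncertainty I mean.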

Finally, how can you talk about all these new fonts without even pointing to the people responsible for them?
An amazing video on Channel 9: ClearType Team - Typography in Windows Vista
Microsoft's typography group web site: www.microsoft.com/typography