Archives for: January 2008

Tire kickers

This has been on my mind since the Java Mobile & Embedded Developer Days, which was going on at the same time as a Blu-Ray Disc Java (BD-J) event over in Barcelona. The parallel between the two is the existence of a frustrated group of developers, chafing at the restrictions on developing and deploying for the mobile and Blu-Ray platforms.

What kind of crystallizes it is a pair of messages on the Blu-Ray forum, which I noticed on the day job. For background, consider this explanation from Sun’s Bill Foote about why the Blu-Ray culture is so different:

This is a really important thing to understand. The media industry,
and especially the optical disc/Hollywood/movie industry, is not the
same thing as the IT industry. Indeed, in many ways, they’re light-years
apart.

In terms of business culture, I personally think that the two will meet
somewhere in the middle. We’re already a whole bunch more open with
BD-J than was ever the case for legacy DVD, for example, but folks
coming from the IT or other more computer-science-y pursuits will
find some culture shocks along the way, too.

Unsurprisingly, most developers who’ve ever touched the web, or even the desktop, expect a high level of autonomy and freedom, something that they’re surprised to find absent on these private, committee-crafted platforms. Consider Bill Shepp’s BD-J forum message a few days later, in reply to a wide-eyed “everything should just be free and open” type plea:

Many of us agree, but for better or for worse some of the BDA Director companies take a far more cautious approach to the platform. We’re working to find a good compromise that makes developer information accessible to those with a bona fide interest without lowering the bar so far that tire-kickers clog up the system…

That spawned a quick counter from Endre Stølsvik:

When did tire-kickers become a problem, I have to ask? Have anyone of
the BDA Director companies that have this opinion had a look at the open
source scene at all? Tire-kickers are often the ones that start new
stuff. If something starts, and it is good, others will chip in, and
soon enough you will have a really good thing going. Stupid stuff, or
really bad stuff, dies all by itself.

Unsurprisingly, I think Endre’s right. But there’s more to it than that.

I don’t think it matters.

Here’s the key premise: the idea of reaching out to developers is to get a high quantity and quality of apps on your platform, preferably innovative apps that will give your platform a unique appeal.

Now, does Blu-Ray really need that? For all the shirt-puffing expounding about amazing next-generation features that you might find in the format’s PR and white papers, the fact is that very few customers are going to be motivated by a message about the “potential” for “innovation” in the format. For every one person who’s inspired by the cool BD-J apps that might come out some day, another 99,999 just want to know which high-def format has Harry Potter and The Little Mermaid.

Even with the format war seemingly won, it’s not like Blu-Ray couldn’t use a little help convincing people it’s better than sticking with DVDs (or just waiting for HD digital downloads). At the MEDDs, the presenter for Blu-Ray was surprisingly candid in admitting that most Blu-Ray titles right now use the simpler HDMV mode, rather than BD-J, for authoring their interactive features. Worse, she showed off a demo. Two years into Blu-Ray’s general availability, the BD-J apps are not amazing internet-enhanced media experiences, but trite arcade games. The demo consisted of the menuing for War (yawn) and a 2D Surf’s Up pinball game that might not pass muster against the superior Flash-based games on the Candystand site. I can’t imagine the salespeople at Fry’s are going to move a lot of Blu-Ray players with that as their essential feature.

But like I said, I don’t think it matters. Whatever the potential of BD-J, the trade association behind it has a narrow view of what they want to do with the technology, so even if it might be well suited for educational video, commercial uses (industrial training, direct marketing), etc., it’s fated not to be used that way. Apparently, the BDA thinks it will make all the money it needs by selling us our favorite movies again. And they may be right.

But that makes the outreach to developers at these Java conferences so strange. The message is like “get excited about Blu-Ray… but no, you can’t have an SDK.” With only a handful of studios putting out Blu-Ray discs, and many of them using HDMV, there can’t be that great a need for BD-J developers. Or, more accurately, there can’t be that many positions total… even though there might be high demand for those few positions, assuming Java-savvy media programmers for resource-constrained environments are a rare find.

It gets weirder. The MEDDs closed with a “fish bowl session”, a panel discussion seeded with a few speakers, who then give up their seats to audience members who want to chime in. After the discussion fixated solely on the topic of mobile “fragmentation” — incompatibilities between devices supposedly implementing the same standards — I joined in to say that that was just one of many barriers to developers in the field. Having to partner with carriers to get your apps signed, or having devices closed off to third parties entirely, was extremely uninviting, and with so many other things developers could be doing on the web with Ajax and Flash, or on the desktop, there might not be a lot of appeal in putting up with this. As an afterthought, I tossed in something like “even BD-J makes it hard for outsiders to get in, and they haven’t sabotaged themselves with incompatibilities.” To which former JCP chair Onno Kluyt came in to correct me… not to say that getting in was impossible, but that BD-J was also badly fragmented by incompatibilities between players. In other words: it’s not as bad as you say, Chris, it’s worse.

Whatever the state of BD-J, the idea of actively evangelizing a closed platform seems curiously pointless. You don’t see Apple evangelizing Nano/Classic iPod games to developers, since the SDKs are exposed only to a very small group of partners, quite probably solicited by Apple from among the top game developers (EA, Namco, Harmonix, Sega, etc.). There’s no intention to let Bob or Mary write an iPod game in the basement, and thus, no overt effort to recruit them to do so.

Of course, that brings up the issue of the iPhone SDK, expected to be released in February. I’m interested to understand what Apple gets out of this. The simple model is something like “you submit your app to Apple, they sell it on the iTunes store, and you get 25%.” Mmmmmaybe. But I wonder if the long game isn’t really meant to move the device itself, not third-party software. I can imagine a lot of large businesses being very interested in custom iPhone applications. Think of the pad the UPS guy has you sign for your package, which tracks both the package ID and the physical location of the signer. If the iPhone SDK gives you access to the camera (to read barcodes) and the location technologies, you could develop a similar app without having to spend millions developing and manufacturing your own devices, like UPS did.

And a business that can develop its own custom apps would then be in a position to make a bulk purchase of iPhones. So, whatever you thought Apple’s cut of your iTunes-marketed app was, I’ll bet their margin on a couple thousand iPhones would blow it away.

We’ll see what the iPhone SDK looks like, but I’m cautiously optimistic. I don’t think Apple’s making a charitable offering to the wild mob of developers; I think they want to use those developers to move more product, maybe by fostering innovation and buzz, but just as likely by being amenable to corporate apps that drive bulk purchases. Blu-Ray doesn’t seem to have similar motivations — or prospects, frankly — and is perfectly happy to coast on the appeal of its movie content, not its interactivity.

And it should work out for them pretty well. But why they feel the need to bring the dog and pony show to Java developer events, I just do not get.

New MacBook

My hated PowerBook G4’s drive inexplicably died this week… what I thought was some rogue process pounding the disk turns out to have been the death rattle of a three-year-old HDD. Not that I mind too much; the ridiculously poor wi-fi and the rapid obsolescence of the G4 CPU had me planning to replace it this year anyways.

So, thanks to the modest ADC Select discount (10% off consumer Macs, 20% off pros), and fairly speedy free shipping, my travel machine is now a 2.2 GHz Core 2 Duo MacBook.

My new MacBook

The install in this picture is Final Cut Express… with lots more hard drive space, it’ll be nice to have FCE and Soundtrack, and a machine fast enough to run them, while I’m on the road.

If wishes were horses, we’d all have high-speed internet

See if you can avoid laughing when you read that the White House says that it has met its goals for US broadband access:

In 2004, President Bush pledged that all Americans should have affordable access to high-speed Internet service by 2007. A report to be released Thursday by the administration says it has succeeded — mostly.

“Networked Nation: Broadband in America” is an upbeat assessment of the administration’s efforts to spur growth and competition in the high-speed Internet market. Critics said the report’s conclusion is too rosy.

[…]The report concludes that “a reasonable assessment of the available data indicates” that the objective of affordable access to broadband for all has been realized “to a very great degree.”

Unfurl the banners: Mission Accomplished! So what’s this about critics? Whatever could they object to?

The NTIA report drew its conclusion using data from the Federal Communications Commission and other sources. The FCC reported that more than 99 percent of all U.S. ZIP codes received broadband service from at least one provider by the end of 2006.

Critics say the FCC’s data is misleading. A broadband provider has to serve only a single residence in a ZIP code for it to be counted.

Oh, but wait, it gets better.

But defining broadband is a highly subjective exercise. The FCC defined it as 200 kilobits per second. That’s about four times the speed of a good dial-up connection and barely fast enough to stream video.

So, to summarize, if one household in one zip code can get 200 Kbps — an absolute joke of a speed — then the entire zip code is counted as having access to “broadband”.

Anyone reading this blog knows this is bullshit. There’s no point belaboring the point. Now consider the case for broadband made in this week’s Newsweek by Steven Levy, in a piece on proposed bandwidth caps for broadband:

A more profound problem with the metering scheme, however, involves not corporate competition but international competition. Here in America, where the Internet was born, we pay higher prices (seven times what they pay in Korea) for slower speeds (Japan’s users surf 13 times faster). Though President Bush promised that all citizens could get affordable broadband by 2007, tens of millions are still stuck with dial-up. Fast, cheap, abundant broadband is a fantastic economic accelerator, enabling breakout businesses and kick-starting new industries. Unless we move quickly, these will spring from foreign soil. Instead of testing systems that discourage people from vigorously using our overpriced, underpowered systems, government and industry should be working overtime to figure out how to get faster service for less money, and make sure that all citizens, no matter where they live, have affordable access to the high-speed Net. Maybe then we’ll get out of 24th place.

It’s become tiresome listening to the US broadband providers on why we can’t have more internationally competitive service. They say that it’s unfair to be compared to Japan or Korea, because the US is much larger and more sparsely populated. But that doesn’t explain why big, densely-populated US cities still get crappy, expensive service, and it ignores the fact that there are sparsely populated countries with better broadband, including Canada, Norway, Finland, and Sweden.

Going anecdotal for a second, I have an O’Reilly colleague in Scotland who’s very remote: 10 miles from the nearest town. Yet he got affordable, fast broadband from the government, which apparently has made “broadband for all” a policy priority.

That’s one option: you could make broadband a public good, and make sure anyone anywhere can get it, regardless of cost. That’s not necessarily efficient or even real smart, but at least it’s philosophically consistent. And you could go all the way the other direction and get the government completely out of broadband (or better yet, all communications), and say “if you can run a wire or purchase some spectrum, then as long as you don’t use force or fraud, it’s none of our business.” You’d expect to have extremes of innovation and competition that way, though without the guarantee of universal service.

But of course, the US has chosen a third way: protecting the financial interests of local telephone and cable TV companies, who enjoy monopolies granted to them by either states (phone) or local municipalities (cable). This arrangement doesn’t provide universal service, nor does it offer the benefits you’d expect from genuine competition (speed, price, quality, choice, etc.).

But at least our elected representatives can count on big checks each year from the cable and telephone companies. And if that’s not worth sacrificing our economic vitality for, then what is?

Things left unsaid

A couple of interesting comments from the first half day of Mobile & Embedded Developer Days:

  • Two speakers were hesitant to even use the word “android” at this Sun-hosted event, demurring with apologies like “I know that’s a dirty word around here”. Here’s the thing: both of these speakers were with outside companies. None of the Sun speakers have mentioned Android at all. All of which leads me to think the enmity towards Android among Java’s old guard is even worse than is generally known.

  • All the Sun people I’ve talked with about Java Media… and remember, I’m trying to retire from this topic… have referred over and over again to what they call “the codec problem.” I think the issue is really one of capability (the Java types generally think playback-only is “good enough”), but we’ve been over this, and it’s neither here nor there. People I know at Sun, even those in no way associated with Java Media, talk about “the codec problem” or “the codec question”, implying that what Sun is wrestling with is which video (and audio?) codecs to license.

    So James Gosling gave a brief keynote and brought up a slide on the JavaFX platform. One line item was a pair of key features: media and a scene graph. He noted that the scene graph is looking great (and it is; it’s already open-sourced), but on media, he made only a very fleeting reference to “dealing with the nightmare of MPEG-LA”. Which tells us two interesting things:

    1. They’re looking at some MPEG-LA-licensed codec, probably H.264, and
    2. it’s not going well

    Is it safe to assume that the price is too dear for Sun? Maybe. Sun did pull MP3 out of the Java Media Framework years ago when the license proved unaffordable (it came back later, but only for playback). Still, all the other companies that are serious about media — Apple, Sony, and now Adobe — have decided it was worth it.

    What about the alternatives? You have to wonder if they’re at least thinking about VC-1, especially given the new cozy relationship with Microsoft. The hoi polloi will probably demand Ogg, though I think that would be crazy.

TuneStudio not, in fact, looming

Despite my previous speculation that the Belkin Tune Studio was about to release, CES and MacWorld have come and gone without an actual release.

I think. It’s hard to tell. A press release says it’s available now in the US, but the product’s web page still says “coming soon”. Amusingly, this iPod-based audio mixer picked up a CES 2008 Best of Innovation honor from SlashGear… the comedy being that the Tune Studio was introduced at CES 2007, and picked up various honors back then. Pretty soon, it’ll be eligible for vaporware awards.

Apparently not deterred by not shipping their last clever iPod-based product, Belkin sent out a press release inviting podcasters to come check out the Podcast Studio (or is it the “Tune Talk Stereo”, as the press release calls it?), a device designed to wrap around your iPod like a sleeve and provide audio in (supporting 1/4″ and XLR mics!) and a compressor/limiter.

Belkin Podcast Studio (or Tune Talk Stereo?)

The prototype apparently shown at MacWorld didn’t work with all iPod models (for example, the 80GB Classic is supported but the 160GB is not), and a Wired blog says they’re shooting for a June release. Of course, that’s only a few months before Apple’s annual refresh of the iPod line, so if they slip, maybe they’ll hold it to ensure compatibility with the new iPods, and ultimately, it’ll be a race to see what gets released first: Tune Studio, Podcast Studio, Duke Nukem Forever, or Chinese Democracy.

If any of my dozens of readers saw either device on the CES or MacWorld show floor, I’d love to hear comments…

Why you don’t print rumors as news

CNBC looks like big-time idiots for running a story on a supposedly leaked set of Steve Jobs’ notes for today’s MWSF keynote, claiming:

  • 30 new indie labels in iTunes Plus (wrong, no such announcement)
  • New iPhone configurations (wrong, no such announcement)
  • Leopard 10.5.2 (wrong, no such announcement)
  • New black and silver MacBooks (wrong, no such announcement)
  • One More Thing™: an iTunes/YouTube partnership (wrong, no such announcement)

You can source it as rumor or say “and maybe it’s just a hoax” all you want, but someone at CNBC decided that this facile hoax, unsourced and unsourcable, deserved to be on their website, journalistic standards be damned. It makes them look stupid, and convinces anyone else halfway capable of forging documents that they too can hoax old media. If you’re gonna go with this, why not just go ahead and put the Howard Stern hoaxers on air too? Sheesh.

QuickTime 7.4, now with fewer export codecs

Apple will presumably have a support document on this soon enough, but in the meantime, the newly-released QuickTime 7.4 removes many of the older audio and video codecs from export dialogs by default.

If you have QuickTime Player Pro (or presumably any other media app that uses QuickTime, such as iMovie or Final Cut), open a movie, go to the export dialog, and bring up “Movie to QuickTime Movie”. Drill to the advanced options and check out your video and audio settings. Whereas the video codecs for output used to look like this:

Pre-QT 7.4 export video codecs

They now look like this:

QT 7.4 export video codecs

Audio has been similarly pared down, from:
Pre-QT 7.4 export audio codecs


QT 7.4 export audio codecs

This isn’t cause for alarm. It’s probably a good decision. It’s hard enough for typical end-users to know which codecs to use (in some cases, it’s better to make the choice for them by means of a good/better/best set of canned export settings), and many of these are unquestionably obsolete, such as Cinepak. It is interesting, however, to see Sorenson Video 3, such a linchpin of QuickTime for so long, relegated to footnote status.

Plus, all the codecs are still there, and can be shown by way of an option in the “advanced” tab of the QuickTime preferences:

QT 7.4 preference to show obsolete export codecs

Authoring instead of / in addition to coding

A commonality I’ve recently noticed is that some of the Mac frameworks encourage a combination of coding and authoring: for some set of functionality, access is possible both with user-level authoring (via visual tools, scripting, etc.) and with full-on coding. And this may be something that developers overlook.

It’s the old “if the only tool you have is a hammer, everything looks like a nail” analogy. Let me give you a practical example. Lots of people post to the various QuickTime lists, seeking to do some sort of “overlay”, taking a video and putting a static image (or, less commonly, another video) on top of it. When this comes up on the QTJ list, the poster almost always assumes they need to hack into the rendering pipeline, either stacking views atop one another with a JLayeredPane or, even worse, doing some sort of nasty callback-driven repainting hack.

Almost nobody realizes that they can do it with about 20 lines of XML:

<smil xmlns:qt="">
  <head>
    <layout>
      <root-layout id="root" width="320" height="240" />
      <region id="main" width="320" height="240"
              z-index="1" fit="meet" />
      <region id="logo-reg" width="286" height="94"
              z-index="2" left="17" top="146" />
    </layout>
  </head>
  <body>
    <par>
      <img src="sf-logo-2.png"
           region="logo-reg" dur="15" />
      <video src="" region="root" dur="15" />
    </par>
  </body>
</smil>

When opened in any QuickTime application, the result looks like this:

SMIL example 1

The above is a simple SMIL movie, which defines areas for the movie and overlay, and sets the compositing mode to punch out a transparency. The <par> tags indicate the elements are to be presented in parallel, which would allow you to add an audio file if you were so inclined. A corresponding <seq> tag lets you build sequences of clips played one after another. And you can composite video too, using a fit attribute on the region to scale the overlaid movie:

SMIL example 2
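For the record, the <seq> counterpart mentioned above looks much the same. Here’s a minimal sketch (the clip names are made up), playing two movies back to back in the same region:

```xml
<smil xmlns:qt="">
  <head>
    <layout>
      <root-layout id="root" width="320" height="240" />
      <region id="main" width="320" height="240" fit="meet" />
    </layout>
  </head>
  <body>
    <seq>
      <video src="clip1.mov" region="main" dur="15" />
      <video src="clip2.mov" region="main" dur="15" />
    </seq>
  </body>
</smil>
```

Same idea: the layout is declared once, and the <seq> plays its children in order instead of stacking them.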

One advantage of this approach is that all the rendering takes place in QuickTime, so you don’t sacrifice performance by having to copy your pixels over to Java2D/Swing.

But of course the big advantage is obviously the fact that it’s easy. It’s well within the range of the newbie QuickTime or Java programmer, as opposed to the pipeline-rendering hacks implied above. In this sense, it’s preferable to another option I’ve posted to the lists before: creating a one-sample video track with your overlay image. Creating the SMIL file does imply actually writing to disk somewhere, but I suspect that if you must do this on the fly and can’t write to disk, you could probably write the XML file in memory, wrap it with a QTHandle and then create the movie with Movie.fromHandle(). Bonus for experts: this might work faster / more reliably by creating a SMIL-specific MovieImporter and using its fromHandle(). Details…

See, that’s one of the things with a lot of the Mac technologies: there are multiple entry points. With QuickTime, there’s a bunch of functionality that you get by authoring, which you can do with GUIs (QuickTime Player Pro), scripting languages (AppleScript, JavaScript in a browser), or even markup, as seen here. These same things, and of course a lot more, can be achieved with code, but sometimes it makes sense to put down the compiler and just author your functionality. And QuickTime isn’t the only example. Don’t we also see this pattern in the following:

  • AppleScript – as opposed to low-level approaches to inter-application communication and automation
  • WebKit – HTML could be an option for large blocks of styled text, particularly if it needs to change at runtime
  • Quartz Composer

I’m working on an article or screencast proposal on the latter, and the “take” I’m using is that instead of writing your own fancy rendering code (yay, OpenGL), you could just use Quartz Composer to build a .qtz file with a few malleable properties, and then bind those to the values in your program that will change. Imagine a fancy, skinnable MP3 player: if you want the artist and title to be displayed with some crazy combination of gradients, reflections, 3D effects, etc., then instead of coding it, you (or better yet, a QC-savvy graphic designer) could just build those effects on dummy text with the Quartz Composer GUI, then expose those inputs as bindable values, which you’d then wire up in Interface Builder or directly in your code.

Java developers never think this way, perhaps because there are almost no examples of this kind of layered-technology approach in the Java world. About the only examples I can think of are Drools and Fitnesse, both of which are about coding your essential logic in something other than Java, usually because it’s meant for someone other than Java programmers to do. But why shouldn’t we want to see more of this kind of approach? Indeed, on the Java side, we now have all these scripting languages running on the JDK, and a JavaScript interpreter built into JDK 6. If there’s some part of your system that’s particularly volatile or meant to be hacked on by many people, why not just define that as being written in JavaScript? Early on at Pathfire, we had to handle different business logic for different customers — today, I might well create an API for customer-specific JavaScripts to talk to, with the added bonus that there are millions of techies who might not know (or want to know) Java, but who can bang out JavaScript easily enough.
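To make the “API for customer-specific JavaScripts” idea concrete, here’s a minimal sketch in plain JavaScript. Everything here is invented for illustration: the host defines the contract (what data the script sees, what it must return), and each customer supplies only the logic, as text that never gets compiled into the app.

```javascript
// Host-side entry point: compile the customer's script into a
// function whose only visible parameter is the order object we
// pass in. (The function name and rule below are hypothetical.)
function runCustomerRule(ruleSource, order) {
  const rule = new Function("order", ruleSource);
  return rule(order);
}

// A "customer-specific" rule, stored as plain text (in a database,
// a config file, wherever) rather than compiled into the app:
const acmeRule = `
  if (order.total > 100) {
    return order.total * 0.1;  // 10% off big orders
  }
  return 0;
`;

console.log(runCustomerRule(acmeRule, { total: 250 })); // prints 25
```

Swapping in a different customer is just a matter of loading different text; the Java (or here, host) side never changes, which is exactly the volatility-isolating payoff described above.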

Speaking of which, Quartz Composer also lets you put JavaScript inside a composition. Authoring begets scripting, and if applied correctly, allows us developers to work on the hard parts they pay us the big bucks for.

The goat is loose

Back in the old days of the Macintosh Runtime for Java, the name of the JVM on the “classic” MacOS, Jens Alfke was the Apple engineer who was frequently available and willing to answer questions about the Mac JVM. Granted, he was so frequently asked about “when is Java 2 coming” (it never did, not for the old MacOS anyways), that he complained it was really “getting his goat”, and from that the whole issue of Java 2 on the Mac became known simply as “Jens’ goat”. Compare to the current JDK 6 snit and you can’t help but think that the more things change, the more they stay the same.

Jens has posted a blog today, much linked from within the Mac community (I got it via IM from Daniel, FWIW), announcing that he’s left Apple and Gone Indie. His specific reasons for leaving — difficulty getting Apple interested in social networking applications, and an unease over the public visibility of employees — are understandable. Jens has done great work in the past, much of it apparently lost to Google (I cannot find his AWT widget collection, “Rich Chocolatey Goodness”), and I’m looking forward to seeing what he comes up with.

And maybe this is also a sign of the viability of the indie Mac developer. With the consistently measured growth of the Mac platform, and the historical willingness of Mac users to, you know, actually pay for stuff, this is going to be more of a trend. I’m surprised by how much useful payware is out there from the small developers, something that MacBreak Video‘s recent reviews of Acorn, Pixelmator, TextMate, Visual Hub, etc., has made me more aware of.

TuneStudio looms?

Could the vaporous TuneStudio, mentioned previously, be looming for a MacWorld or CES release? I’ve been stalking its web page in hopes of just such a release, and noticed that today the list price was reduced from $399.99 to a “special discounted price” of $249.99, matching its originally announced price. As of this blogging, you can still see the higher price in Google’s cache. The fact that they’re bothering to update price data in the page, and offer a special price that is presumably time-limited, might speak to a release soon.

Belkin web page showing $249.99 price for Tune Studio

Speaking of previously mentioned gadgets, my Bella Final Cut keyboard arrived yesterday. So far, I haven’t put it to significant editing use, so I can’t say too much yet about how well it performs at its intended task. It doesn’t come with built-in settings for Soundtrack, although the jog wheel by default rolls the playhead one tick in the timeline, which is a pretty appropriate and useful action. A very quick glance at Final Cut shows that jogging and shuttling through a video clip in preview “feels” right to me… like using a tape deck, albeit with a lot less physical resistance from the shuttle. I’ll report back on this after doing some serious FCE work with it.

In the meantime, it’s pretty reasonable for everyday typing, though moving the arrow keys down to the wrist rest is a pretty harsh compromise. I’m willing to just abandon the arrow keys in favor of emacs keybindings, which are quietly supported by many OS X apps (BBEdit, XCode, and Mail among them), but those don’t cover modified movement, such as command-left-arrow to go back to the beginning of a line. The keypad should be able to fill in as a cursor-movement device, but it turns out that the Num Lock key in Mac OS X is surprisingly non-standard. Look at Leopard’s help when you look up “Num Lock”:

Leopard help on Num Lock key

The description that only “some applications” support moving around your document with a NumLock’ed keypad is surprisingly unhelpful. How would I know which ones do and don’t? Why is there a difference? Why don’t text areas pick this up for free from the OS?

I built a quick Carbon app in XCode — just an editable text area in a window — and found it didn’t support NumLock… haven’t had time to try Cocoa, but since none of the apps that I found support NumLock are Cocoa, I’m not holding my breath.

I whined about this over in the Apple support forums. It’ll be interesting to see what the answer is.