
Archives for: December 2007

Cheap ponies

At the risk of beating a dead pony, I submit the following:


(click to go to Yahoo Finance chart)

This is a comparison of Apple and Sun stock since JavaOne 2000, when Steve Jobs made the famous comments about Apple’s commitment to Java. In the intervening seven and a half years, Apple stock is up over 700%, while Sun stock is down 90%.

Enjoy drawing your own conclusions or scathing diatribes. Merry Christmas.

There! There’s your damn pony!

After much gnashing of teeth and many ill-informed screeds against Apple, the company has released Developer Preview 8 of JDK 6 for Leopard. It’s surely not ready for prime time, as the small amount of public documentation notes that it is only available for 64-bit Intel Macs (see my earlier entry about the emergence of Intel-only Mac apps).

The belligerent arrogance of Java developers that I responded to before remains in full effect, as the Whiny Little Bitch contingent continues to make up facts and conveniently ignore the difference between Mac developers and Java developers. Javalobby regular Michael Urban is particularly wretched in this regard, with endless gainsaying insults of Apple and anyone who criticizes him. Here’s one recent winner:

Well, I would say the real problem here is, and always has been, Apple, and the fact that they think they can treat developers and high end power users the same way they treat their consumer end users. And the simple fact is, they can’t. Developers need information about what the future is going to look like when they are trying to target a platform. For example if I start a new project today, it’s unsafe for me to use Java 6 features because of Apple. Even though my project is just starting, and certainly Apple should [have] Java 6 released before it is finished, I still can’t plan on using Java 6 because Apple provides absolutely no information at all about what the future will look like.

I would argue that Java developers are more like end-users than they are like Mac developers. End-users consume products for their own needs. Developers, in the traditional sense, consume products (IDEs, APIs) in a software ecosystem but also create new products for that ecosystem (end-user apps, libraries, etc.) The difference is that the Java developer is rarely (if ever) developing Java apps to be run on the Mac — most are developing web apps. So from Apple’s point of view, the Java developer consumes but never produces. Which is fine: the same could be said of photographers, video professionals, artists, writers, teachers, etc. Thing is, those groups aren’t writing pissy posts about how they deserve special treatment.

Or, for that matter, vowing a mass exodus from the platform:

I’m not surprised about anything either. And I hope Apple isn’t surprised that more and more Java developers who once swore by the Mac, have abandoned it and moved to other platforms because they simply can’t deal with the way Apple has been handling Java anymore.

As I argued before, with the size and growth of the Mac platform, losing a few pissy Java developers wouldn’t even be noticed. No, not even if they told all their friends and their moms how much Apple sucks now.

I’m also waiting for some idiot to say that Apple was “threatened” by the release of Landon Fuller’s Soy Latte, a port of BSD Java to Mac OS X, and that this pressure was what got Apple to put out a new developer preview of JDK 6.

Look, we do not know what is up with Apple and Java. Maybe it’s technical, maybe it’s business priorities, maybe it’s a legal or licensing tussle with Sun. Since we don’t know, it’s foolish and pointless to speculate as to why they didn’t have a JDK 6 with Leopard, and why they put one out a few weeks later.

And I’d hate for Fuller to be dragged down into this. He was very gracious discussing the project with the Java Posse a few weeks back, and I’ve corresponded with him by e-mail and he seems very thoughtful, open, and fair. I don’t think he’s got some big axe to grind with Apple; I think he finds the project intellectually interesting, and that motivates him.

Honestly, if I could find the time this holiday season, I’d see if I could lend a hand with providing the Core Audio support his project needs. I’ve been reading up on CA. It’s nice. On the other hand, I loathe javax.sound.sampled; I dug in pretty deep for Swing Hacks, providing code to handle arbitrary-length streams in Hacks 76-78. Most JavaSound examples load an entire Clip into memory, because that API is easy to work with. Handling large audio files requires writing your own streamer to hand samples to JavaSound on a periodic basis. It’s a pain, and expecting every developer to baby-sit JavaSound for non-trivial sound probably explains why nobody’s really enthusiastic about doing it (indeed, when I was researching Swing Hacks, I found no tutorials, blogs, or other third-party documentation on using the library this way, so I was basically in green fields).
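For what it’s worth, the streaming approach boils down to a read/write pump that feeds a SourceDataLine one buffer at a time instead of loading a Clip. Here’s a minimal sketch of the idea (the file path comes from the command line; in a real GUI app you’d run this loop on a background thread and add proper error handling):

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;
import java.io.File;

public class StreamingPlayer {
    public static void main(String[] args) throws Exception {
        // Open the file as a stream instead of loading it all into a Clip
        AudioInputStream in = AudioSystem.getAudioInputStream(new File(args[0]));
        AudioFormat format = in.getFormat();
        SourceDataLine line = AudioSystem.getSourceDataLine(format);
        line.open(format);
        line.start();
        // Pump samples to the line a buffer at a time. write() blocks when
        // the line's internal buffer is full, which paces the loop for us.
        byte[] buffer = new byte[format.getFrameSize() * 4096];
        int read;
        while ((read = in.read(buffer, 0, buffer.length)) != -1) {
            line.write(buffer, 0, read);
        }
        line.drain();   // let the last buffer play out before closing
        line.close();
        in.close();
    }
}
```

That blocking write is the whole trick, and also the whole annoyance: somebody has to own that loop for the life of the playback.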

Ultimately, I think Soy Latte — possibly as an official part of the OpenJDK project — will probably be the major Mac JVM a few years from now. If the Java developer community is willing to do this work, and it matters much more to them than it does to Apple or most (any?) of its end-users, then that’s the end-game that makes the most sense. Developers would get an up-to-date VM to work with, and even if it ended up not being installed with the OS, who cares? It’s not like end-users would notice the difference between applets that use Java SE 5 and those that use Java SE 6. Most wouldn’t even notice the difference between having and not having Java. It’s not like we’re talking Flash here or anything…

One more note on Apple’s JDK: Apple replied to an open question I posted to clarify that specifics about JDK 6 DP 8 are not to be discussed publicly, as it’s distributed through ADC, which has a binding NDA.

So while I can’t comment on specifics, I can combine publicly-known facts to postulate on a rather important concern that’s going to come up. This build, at least, is 64-bit only. Remember, 64-bit and 32-bit code can’t exist in the same process, so a 32-bit browser couldn’t use a 64-bit JVM to support applets, and a 64-bit Java application can’t make a JNI call into any 32-bit library. Like, say, all of Carbon. And given that there’s a lot of Carbon-based JNI projects out there — QuickTime for Java, the Mac version of SWT, etc. — a lot of developers could be in for a whole lot of pain if they don’t have a Cocoa migration path figured out.
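A Java app that depends on 32-bit JNI libraries can at least detect the mismatch at launch instead of dying on a failed library load. A hedged sketch: sun.arch.data.model is a Sun-specific property and not guaranteed on every VM, hence the os.arch fallback.

```java
public class ArchCheck {
    // Decide 64-bit-ness from the data-model property when present,
    // falling back to the os.arch name. Both are JVM-reported strings.
    static boolean is64Bit(String dataModel, String osArch) {
        if (dataModel != null) {
            return dataModel.contains("64");
        }
        return osArch != null && osArch.contains("64");
    }

    static boolean is64Bit() {
        return is64Bit(System.getProperty("sun.arch.data.model"),
                       System.getProperty("os.arch"));
    }

    public static void main(String[] args) {
        if (is64Bit()) {
            System.out.println("64-bit JVM: 32-bit JNI libraries "
                + "(QTJ, Carbon-based SWT, etc.) won't load in this process");
        } else {
            System.out.println("32-bit JVM");
        }
    }
}
```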

QTJ not dead after all

I had thought that the disappearance of the QuickTime for Java docs might signal the end for this little-updated but beloved library, but Apple just announced that the docs are back up.

That QTJ isn’t long for this world is a given, since it calls into a number of deprecated technologies, most notably QuickDraw, and uses deprecated methods, like image compression via the older functions that can’t handle frame-reordering codecs (SCCompressSequenceFrame instead of the newer ICMCompressionSessionEncodeFrame). And ultimately, all of Carbon is presumably meant for deprecation in the foreseeable future, as it is not supported in 64-bit mode. But more on that another time.

I think the weird thing about QTJ’s history is that rather than not living up to its potential, its potential didn’t live up to QTJ. You have to consider the time when it was introduced: for a short time in the ’90s, there was this “everything’s going to be redone in Java” craze, and that included desktop apps. Until the first major AWT and Swing efforts crashed and burned (Netscape Navigator in Java, Corel’s attempt to do an Office suite in Java, etc.), there was a very real belief that Java would soon be the future. Apple was concerned about losing QuickTime developers, so it created QTJ — from the ruins of Kaleida, actually — for the express purpose of holding on to QuickTime developers while they changed languages. It was never intended to provide Java programmers with a media framework; it was meant to keep QuickTime developers in Apple’s ballpark. That’s why the documentation is so bad: the target audience already knew the native QuickTime API, so the Javadocs didn’t really need to describe methods in particular detail; they just needed to give the recent-Java-convert developer some idea what native QuickTime function he or she was really calling.

Of course, the “everything’s going to be in Java” era came to a brutal end, at least on the desktop, and while QTJ worked pretty well, the whole point behind it has been rendered moot. Understandably, it hasn’t been an Apple priority for a while, because Apple would be better off putting work into a single kick-ass native QuickTime framework (unfortunately, they currently have two, but that’s another story), and they really don’t need to be providing the Java community with a kick-ass media framework, since that’s really Sun’s problem to solve.

A few down attacks and reversals

I probably won’t blog a lot about gaming here, but it is a valid and important variety of media, one that I enjoy.

The other night, I happened upon the final round of the Championship Gaming Series on DirecTV. I don’t know who thinks that watching other people play video games is good TV, but in limited doses it does kind of work for gamers. I even found myself angry at what I was watching, as the gamers playing Dead Or Alive 4 were playing it as a sort of generic fighting game, and not actually as DOA.

We had DOA 2 on the Dreamcast and PS2 at Worthless Piece of Crap Wireless Software Company #1, and the gist of the game is “anti-predictability”. More than any other fighting game, DOA expects you to vary your style and improvise. The key is the reversal/hold system: if an opponent comes at you with a high or mid-level attack, you press your guard button and go back on the D-pad to grab the attack and reverse it. To reverse low attacks, you guard and press down and back. The system resists canned moves: if you keep showing favorite attacks, opponents will anticipate and reverse them. Reversal damage is often as great or greater than initiated attacks, so good DOA players combine an unpredictable set of attack moves with reversals, holds, and down attacks to win. But in the CGS, the players were playing “Generic Fighting Game”: attack middle, guard, back up, repeat. A decent DOA player should be able to dismantle such silliness. Every time a player went down, he failed to roll sideways for the wakeup game, and the attackers never used down attacks to further punish the knocked-down characters. These are the best players?

When Kelly walked in the room, she noticed the eye-candy babe with the cutoff shirt and short shorts whose only purpose was to ask the teams if they were ready and then say “go!” Kelly’s take: “couldn’t they just use the start button?” Yes, cheesy, but one of the many things they’ve done to adapt the competition to television. They also borrowed their set design from Who Wants to Be a Millionaire?, and the idea of coaches for videogame players seems silly. But you’re taking an activity that doesn’t actually involve a lot of physical action, and trying to make it televisual: the producers knew they had to do something.

Still, next year, let’s hope they work in some music-rhythm titles (Guitar Hero or Rock Band) or some Wii-mote slinging games to get the kids off their butts.

Speaking of the Wii, we snagged one in September for Christmas, marking the end of my run with Sony consoles. Of course, I’m not the only one. As of this writing, NexGenWars shows the Wii having passed the 360 in terms of installed base, with two and a half times as many units sold as the PS3. November’s numbers from Gamasutra show the trend continuing, with a modest upturn for PS3 following the introduction of a cheaper model. But it’s still not even selling as well as the PS2.

It doesn’t help that PS3 still doesn’t have appealing software, especially among top-tier would-be system sellers. Lair and Heavenly Sword can’t touch the 360’s Bioshock or Halo 3 for buzz, which in turn pales next to the Wii’s reach across demographics.

You know, at this point in the last generation, the PS1 had dropped to $50, yet the PS2 remains at $130, and still sells (better than the PS3). PS3 still needs to be $100 cheaper, but I wonder if Sony isn’t playing a long game — banking on the Wii to pass as a fad, and keeping some pricing power for the PS3 so they don’t totally lose their shirts. It’s possible they’re even using profits on the still-$130 PS2 to underwrite the PS3. But that can’t last forever. They either need something to make the PS3 appealing in its own right, or they need the Wii hype to suddenly go sour.

And no, Blu-Ray movies aren’t going to save them. HD-DVD seems to have been reinvigorated by being the cheaper of the standards, though Joe Sixpack is really not interested in potentially buying into the losing side of a format war with either one. The smart money says both will be displaced by high-def digital downloads anyways, but it remains to be seen if the various parties will ever agree to a workable system: Hollywood doesn’t want Apple dominating movie distribution like they have with music, yet the existing movie download services work with too few devices and offer too few titles to be viable.

One more thought on Blu-Ray. This year at JavaOne, we heard something of a mea culpa from Blu-Ray stakeholders in that they hadn’t done enough to reach out to open-source and indie developers, keeping the format out of reach with exorbitant license fees. I blogged these JavaOne sessions and wrote:

One comment that Sun’s Bill Foote made indicated that there was disagreement within the Blu-Ray Disc Association as to how to approach non-licensee developers. The current situation, with tools and specs only available to licensees (basically just the studios, as licensing costs are extraordinary), leaves the format with too few programmers to be viable, and while participants like Sun would clearly prefer to get information out to independent developers, this apparently doesn’t sit well with some BDA members, even though Foote reports agreement that some kind of overture to indie developers needs to be made.

I would just note that six months later, I have heard absolutely nothing on this front. I still see a few people trying to get into Blu-Ray, but there doesn’t seem to be anything done to make it easy, and there’s still no sign of BD-J being used for anything more than fancy menus and trivial games. The promise that Blu-Ray was going to let you, among other things, participate in interactive group viewings of your movies over the internet, with downloadable new content, continues to exist only in theory.

This is probably not a good sign for QTJ users

As reported on the quicktime-java list, Apple’s page of QuickTime for Java javadocs is now 404’ing. The QTJ home page and its list of Javadocs are still around, but both the online QTJ javadocs and the downloadable Mac and Windows versions are gone.

Maybe QTJ is less OK than I thought.

OpenID installed

I’ve installed the OpenID plug-in for WordPress, so you should be able to use your OpenID for comments rather than registering an account here. I’ll try commenting on this post with my own OpenID…

MacIntel-only arriving sooner than expected

Our DV camcorder is eating tapes when rewinding, so at some point soon, I’m going to need to just dump everything we have from tape into the G5, give up on it (no way would a repair make financial sense), and get a new camcorder. I think this might be a good time to get out of tapes and into hard-drive (or flash memory?) cameras, considering I’ve got a couple hundred gig of free hard drive space available.

The hard drive cameras don’t do DV, though. They largely use new codecs like AVCHD. Fortunately, Final Cut Express 4 supports AVCHD, but look at the little footnote:

* AVCHD video requires a Mac with an Intel processor.

Isn’t that interesting? This is one of the first Intel-only things from Apple, just over a year after they sold their last PowerPC Mac. They’re not the first, of course: Cider-powered games are Intel-only, as is Adobe Soundbooth. And Soy Latte, the open-source port of BSD Java to Mac OS X (I also expect Apple’s JDK 6, if they ever release one, to be Intel-only).

The move to Intel-only on the Mac is happening faster than a lot of people expected. Even me. Even though I was flamed for predicting it would be fast. In late 2005, I wondered aloud if it was worth waiting for Intel, and speculated that Leopard would be the last version of Mac OS X that ran on PowerPC. This was based on a prediction that Leopard would come out in 2007 (correct), and that Mac OS X 10.6 would therefore hit in 2010, four years after the Intel switch.

The Mac zealots ripped into me for offering “dumb advice”, even though my predictions have proven better than theirs. ianragsdale wrote:

You seem to be assuming that within 4 years, the number of Intel Macs will outnumber the installed base of PowerPC Macs to such an extent that it would make sense for Apple to stop development of the powerpc version, despite the fact that the hard part is pretty much all done. I think that’s a pretty big assumption.

According to browser statistics, Intel Macs on the web started to outnumber PowerPCs in November of this year, less than a year and a half after the last PowerPC Mac shipped. Give it another two whole years and the ratio will be wildly lopsided in favor of Intel. Part of this might be because the Mac is picking up market share even faster than optimists expected, so along with PPC owners trading up to new MacIntels, you have switchers plugging in new MacIntels.

nbh wrote:

A couple of things occured to me reading your article. There’s no guarantee that an Intel laptop will be released first, so how long do you expect someone to wait. And would you really suggest that someone buty the first rev of the Intel machines? You also assume that everyone immediately jumps on a new release of an OS, and that’s not the case, especially for home users. I think you should reevaluate what you’ve written. You’re offering dumb advice.

I didn’t assume that Apple would get a laptop out first, but they did, and in retrospect it makes perfect sense: Apple needed to go Intel to get better performance-per-watt than PowerPC offered, as evidenced by how badly the G4 PowerBooks were languishing. The other surprise, I think, is that Apple started the Intel switch in January 2006, just half a year after announcing the move to Intel, starting with an iMac and the MacBook Pro. I kind of thought they’d start the transition in mid-2006, but they were actually done by then. Also worth noting: these first Intel-based Macs did not generate complaints about being flaky 1.0 models.

Neil_McG wrote:

Is having a faster processor going make my iTunes music play faster? Will it make my word processor typing faster? Answer, No it will not. I don’t need it to.
[…]
I think if you are hanging waiting for any machine that will not date – you will never buy one.

On the first point, what we’re seeing from the faster processors is resolution independence, seamless backups, animation, faster media processing, etc. Also, the Intel-only software seems to primarily involve cases where there are existing Intel codebases, mostly written for Windows, that can be run on MacIntel, but are not cost-effective to rewrite and recompile for the totally different PowerPC architecture. So what you get from the new machine is new software that was never going to get PPC versions.

On the second point, that’s a strawman argument that many of my readers made, accusing me of saying “wait forever, something better’s coming”, when in fact the real issue is that the architecture change represented a unique and substantial change with atypical consequences. It’s not that the Intel CPUs are just faster, it’s that a) they enabled applications that would never come to PPC, and b) the expense of developing for two architectures is such that dumping the older CPU as soon as practical is highly desirable for software developers.

al_bickers predicted:

It will be years before anything is released just for the Intel Macs. I wouldn’t expect to see Intel only versions until at least 2009.

As I noted above, Intel-only Mac software is here in 2007.

OK, that’s enough slamming the Mac zealots of 2005 for one night. Time heals all wounds, after all.

Reversing the Xcode / Interface Builder relationship

A while back, when Leopard was still under NDA, I complained that stuff had moved. Having never gone back and explained…

If you work from earlier Xcode tutorials, you may well get thrown off by the fact that Interface Builder no longer has palettes or a “Classes” menu. The first you can do without, because the functionality has been replaced by the “Library” windoid. The second, though, requires that you change your whole workflow.

Many old guides would have you create a new NSObject subclass in Interface Builder, and then create an instance of it, all from IB’s Classes menu. Then you’d do “create classes for MyController” (or whatever), again from the Classes menu, and then go back to Xcode to change all the ids to IBOutlets and IBActions. Then you could control-drag to associate widgets and controller methods. But, like I said, the new Interface Builder doesn’t have a Classes menu, so how do you do all this?

Well, actually, you don’t use IB to create your model or controller classes anymore. The first I saw of this was in the QTKit Capture Programming Guide, whose Create the Project Using Xcode 3 section says:

This completes the first sequence of steps in your project. In the next sequence, you’ll move ahead to define actions and outlets in Xcode before working with Interface Builder. This may involve something of a paradigm shift in how you may be used to building and constructing an application with versions of Interface Builder prior to Interface Builder 3. Because you’ve already prototyped your QTKit still motion capture application, at least in rough form with a clearly defined data model, you can now determine which actions and outlets need to be implemented. In this case, you have a QTCaptureView object, which is a subclass of NSView, a QTMovieView object to display your captured frames and one button to record your captured media content and add each single frame to your QuickTime movie output.

In other words, the New World Order is to start by writing out a new class in Xcode, hand-coding your outlets and actions:

#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyCaptureController : NSObject {
	IBOutlet QTCaptureView *capturePreviewView;
	IBOutlet NSPopUpButton *videoDevicePopUp;
	IBOutlet NSPopUpButton *audioDevicePopUp;
	IBOutlet NSTextField *fileNameField;
	// ...etc.
}
- (IBAction) startRecording: (id) sender;
// ...etc.
@end

When I saw this, I really didn’t particularly care for it, because I enjoy using IB as a means of mocking up the GUI, starting the process by thinking of the UI and then backing it up with code. That, I thought, was the Mac way. The QTKit tutorial implicitly does this, but by sketching out the GUI on paper, then coding the outlets and actions, and then going to IB to build the view and wire it up. Well, if I’m going to mock up, I can do it faster and more accurately by dragging and dropping real widgets rather than sketching them on paper, sigh…

Still, after a couple of tries, I got the hang of it. The thing I was missing was that after coding my controller or document in Xcode, the key to getting it into IB is to drag an NSObject from the Library to the nib window, then use the inspector to assign it to the class you’ve created in Xcode. Then you can wire up the outlets and actions. It was easier in the document-based application, since you don’t have to create your own controller object: the MyDocument class serves this purpose, and has its own nib already in the project, so you just do your wiring there.

That said, the QTKit Capture tutorial is a little more strident about “Xcode first, then IB” than is perhaps necessary. The new Cocoa Application Tutorial shows a process by which you build the UI first in IB, then wire the UI to your models from the Xcode side… see the Bridging the Model and View: The Controller section to walk through it.

So, it’s OK. I think everyone will get used to it and maybe like it better. But there are going to be lots of questions from people with old books and tutorials, looking for IB’s missing “Classes” menu.

Considered obsolete: Desert Island Discs

Years ago, Tower Records’ Pulse magazine had a feature…

Wait, let me start over. Years ago there was a record store called Tower, and they had a magazine…

No, wait. There used to be these places called record stores…

OK, so there were things called “records”…

Eh, the heck with the context. This magazine used to have a feature called “Desert Island Discs”, in which readers and musicians would answer the rhetorical question, “if you were stranded on a desert island, what 10 records, tapes, or CDs would you want to have?” In other words, what’s your favorite music, to the degree that you could stand to have these 10 discs and nothing else?

Over the years I, like a lot of people, kept a list in my head of discs that were on and off the list. It’s an interesting exercise to track the evolution of your personal music taste.

So I started banging out an update to my list in a Dashboard sticky the other day, when it hit me:

Why am I trying to pick just 10, when all 700 or so of my CDs don’t even fill half of my iPod?

One of the Big Points of digital media is that it’s laying waste to scarcity, and physical storage space is a form of scarcity. In fact, one of the reasons I picked the Classic over the Touch was so that I wouldn’t have to manage scarce storage — instead of picking a hundred favorite CDs and a subset of my podcasts, it’s vastly easier to just take a hands-off “sync everything” approach.

OK, though, for what it’s worth:

It’s also interesting to compare what I say I like to what last.fm says I actually listen to.

Belkin’s TuneStudio looks wonderful, but will it ever come out?

I was looking at audio-in options for potentially using my 160 GB iPod for podcast recording, and came across the Belkin TuneStudio:

My goodness, that’s an attractive setup, and the features are wonderful: four inputs, 3-band EQ, phantom power for your mics, built-in compressor, USB streaming in or out to your Mac (so you can presumably use it as a home mixing board, without the iPod), recording to the iPod at 16-bit/44 kHz, etc. It works with the iPod Classic, iPod 5th gen, and all nanos.

Only gotcha I see is that it was announced way back in January and still isn’t out. In fact, November’s update to the press release pushes the release date back to January, 2008, and hikes the MSRP from $249 to $399. Unless it’s out on 1/2/08, it’ll miss the shows I need portable gear for, so I’m holding off on ordering until it’s really really available.

But jeez, it looks promising…