
Archives for: April 2008

The middle way is always wrong

Apropos of my buying a high-end Mac, I note a little dustup in the comments to an Apple Matters article on The Non Existent Glaring Hole in the Mac Lineup. The article basically attempts to puncture the argument for the mythical midrange Mac minitower, arguing that the market not served by the mini, iMac, and Mac Pro is too small to matter, if it exists at all.

By and large, I agree, and note that author Chris Seibold completely overlooked the argument that more than half of Macs sold are laptops (this was at least true of Q2 2007, and is in-line with a consumer preference for the flexibility of laptops). Continuing the argument, the demand for an expandable Mac that’s not quite a Mac Pro has to prove that you can divide the Mac market in half, take the smaller half, divide that in four (mini, iMac, MMMM, Mac Pro), and still have a viable product. Given that the mini’s been marginal for a while, I’m not sure I buy this.

The other thing is that expandability may be better in theory than in reality. When it comes to putting cards in your Mac, you take a chance that drivers will continue to be updated for your card as future versions of Mac OS X come out… not a safe bet in the small Mac market. Being able to swap out drives isn’t a huge deal, as external FireWire or USB is an easy option for most (but not all, since I don’t think you can boot into Windows via Boot Camp on an external device).

I had figured the PowerMac (dual 1.8 G5) that I just replaced was going to be a 4-5 year machine, since I’d be able to expand it at any point. I did upgrade the drives (from 80 to 350 and 500), but there was never an option to upgrade the CPU, as the Intel switch rendered PowerPC irrelevant. Really, this is something of a special case. Apple tends to use odd requirements for OS releases (“must have internal FireWire”, “must be 833 MHz or higher”) to drop old Macs when they’re about 3-4 years for consumer Macs, 4-5 years with pro Macs. But the Intel switch is pushing both pro and consumer PPC Macs out the door at the same time. So, atypical case.

Still, some of the best advice I’ve ever heard on computer shopping came from a Computer Science lecture I happened to see on campus TV at Michigan State years ago. Instead of the usual approach of “analyze your needs and figure out what hardware suits you”, the prof said “buy the most you can afford, or the cheapest thing they’ll sell you.” The reasoning being that the highest-end system will have the longest viability, whereas the cheap box obviously doesn’t cost much, so when it falls behind the curve, you’re not surprised, disappointed, or out a lot of money. Obviously, in the Mac world, these extremes are the mini and the Mac Pro. Actually, the mini may be a bit too extreme of an example, as it seems designed to give switchers a cheap way to try the Mac, bringing with them their existing monitor and possibly (if USB) keyboard and mouse.

So then, there’s the no-hassle iMac. No expandability, but the company’s bread-and-butter product, meaning you’re not going to suddenly find its video card not supported when OS X 10.6 comes out in 2010. Is there really a market between that and the big-ass Mac Pro? Are there enough gamers who want to dick around with graphic card upgrades? Probably not — there aren’t enough Mac games to matter, and those that exist have to target the last few iMac models anyways.

There are a few people who long for the MMMM, but I doubt there are truly enough real buyers to bother casting molds or making new packaging for.

Hail Xeon!

The freeze-out on using PowerPC for iPhone development — somewhat inexplicable since people have reported getting the various iPhone SDK betas mostly working on PPCs (seriously, Apple, would supporting PPC really be that burdensome?) — has forced me to replace my G5 with an Intel box a year sooner than planned.

Mac Pro (Yuna) and Power Mac (Aeris)

So, here they are, side by side after FireWiring them together for the Migration Assistant during setup. Hard to tell which is which, right? The one on the left is Yuna, the new Mac Pro, and the one on the right is Aeris, the old G5.

Anyone who knows my naming convention might have figured that “Yuna” was the next name in the series, since I name each machine’s partitions after characters in the Final Fantasy games, and I’m up to Final Fantasy X, meaning this box’s partitions are Yuna, Lulu, and Rikku. One thing I do with this convention is to use screenshots or fan art (thank goodness for DeviantArt and all the FF fanboys and fangirls out there) of the character whose name is used for the current partition, giving me an instant reminder of which partition or machine I’m on.

Yuna desktop

I’ve only been up a few hours, and haven’t really pushed the 8 cores very hard:

8 cores not doing much

I feel like I should re-export my AMV from Final Cut or do an MPEG transcode or something that’ll spin up the CPUs.

Or maybe I’ll hit the system-grinding Flash slowness of the My Coke Rewards site, which used to bring the old G4 laptop to a near halt (who knew you needed multi-core gigabyte speed to animate a dialog popping up… seriously Coke and/or Flash, what is your problem?)

Back online…

I feel like I shouldn’t complain when last night’s South Park portrayed the city’s panic when their internet went out.

Still, if either of the regular readers wondered why the site was unreachable Friday through Tuesday, it’s because we found that upon our return from a week in Michigan, our DSL provider was doing an amazing impersonation of a company that had suddenly gone out of business: all their clients’ connections offline, “all circuits busy” when you try to call, offices locked and blinds shut, with a handwritten note saying “please call our parent company in California”, etc. The chaos can be revisited on DSL Reports.

Anyways, it turns out they weren’t out of business, but while they were failing to respond in any meaningful way to the outage, I switched companies. Took a little longer than it actually needed to, but it’s done now, and we’re back.

Ridiculously unproductive week here, FWIW. The house was all torn up with painters working most of the week, I lost a lot of time with child maintenance, and to add insult to injury, I missed the FedEx truck with the new iPhone-SDK-compatible Mac Pro earlier today.

Still, I guess I’d better not complain too loudly, or I’ll end up looking like one of the jackass denizens of South Park.

Someone care to explain pointer arithmetic to me again?

So, one of the weird things I’m getting used to in the C-based frameworks (Core Audio, CFNetwork) is a pattern for holding “state data” together across a number of calls to related functions. Coming from 10+ years of OO, it seems unwieldy, but it does afford a certain freedom.

What these functions often have as their first parameter is a pointer to a user-defined structure that can have more or less whatever you want to put in it. The idea is that it needs to contain everything you’re going to need to go into or come out of any of a series of related function calls. For example, if you’re going to read bytes from a file and play them in an audio stream, your structure will have pointers to the AudioQueueRef, the AudioFileID, some ints for the buffer size and number of packets you read on your last call, etc. Each call may use only one or two of these, but you bundle them in one big typedef, and that block of data is passed to and from all the related calls.
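To make the pattern concrete, here’s a minimal sketch of the idea — the struct and function names are invented for illustration, not from any Apple API: every related call takes a pointer to the same user-defined struct as its first parameter, and each call reads or updates only the fields it cares about.

```c
#include <string.h>

/* Hypothetical "state struct": everything the related calls need,
   bundled into one typedef whose address is passed to every call. */
typedef struct {
    int  sourceHandle;   /* stand-in for something like an AudioFileID */
    long bytesRead;      /* running total, updated across calls        */
    int  packetsPerRead; /* tuning value some of the calls consult     */
} PlayerState;

/* Each function uses only a field or two...                    */
static void openSource(PlayerState *st, int handle) {
    st->sourceHandle = handle;
    st->bytesRead    = 0;
}

/* ...but they all share state through the same struct pointer. */
static void recordRead(PlayerState *st, long n) {
    st->bytesRead += n;
}
```

The OO analogue, of course, is that the struct pointer is the implicit `this`, and the “related functions” are the methods — which is why it feels both familiar and unwieldy at the same time.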

So, my current challenge is converting the example in the Audio Queue Services Programming Guide from one that plays bytes from a file to one that gets them from a network stream. What’s hanging me up — and will probably become an ADC support request in the next few days — is that the callback-oriented nature of these APIs requires me to buffer some packets extracted from the stream, and hang on to them until I get a callback from the queue telling me it’s ready for me to fill one of its audio buffers with packets. Typically, I get multiple “you have packets” callbacks for each “fill AQ buffers” callback, which is a complexity that the example doesn’t need to deal with, as it can read exactly the amount of data it needs from the file on demand.

Thus, my attempt to hold onto the packets from the callbacks is where the pointer math comes in. Start with my state structure:

typedef struct {
    AudioStreamBasicDescription mDataFormat;
    AudioQueueRef               mQueue;
    // ...
    void   *myAudioData;         // from last handle-packets callback
    UInt32  myAudioDataByteSize; // from last handle-packets callback
} AQPlayerState;

Notice the last two here: I have a big-ass buffer of data that I malloc at some point, and a size counter. The counter is effectively an offset: if I get another callback with more data, I need to copy the new data into the buffer at an offset of myAudioDataByteSize, and then update the offset, of course.

Now, since Dr. Gosling has kept me away from pointers for a good 12 years now, I’m not sure how I cajole the C syntax to express the concept of “the address pointed to by taking the address of myAudioData and adding myAudioDataByteSize bytes”. Currently I’m trying to do something like:

memcpy (&pAqData->myAudioData + pAqData->myAudioDataByteSize,
          (void*) inInputData,
          inNumberBytes);

And given that I’m segfaulting somewhere… probably here… I assume that the first argument is pointing to nonsense.

So, C-friendly people, assuming for a second that this approach is even valid (it may well not be), what’s the syntax for saying “take address A, add B, and treat the result as an address”?
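For the record, here’s my best understanding of the answer, sketched as a little helper (the `appendBytes` name is mine, and I’m assuming the buffer was malloc’d big enough). Two things were wrong above: `&pAqData->myAudioData` is the address of the pointer *field* itself, when `myAudioData` already *is* the address of the buffer; and you can’t do arithmetic on a `void*` anyway, so you cast to `char*` to make `+ n` mean “n bytes further along.”

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    void  *myAudioData;         /* malloc'd buffer                  */
    size_t myAudioDataByteSize; /* bytes used so far, i.e. offset   */
} AQPlayerState;

/* Append n bytes from src at the current end of the buffer.
   Assumes myAudioData has room for myAudioDataByteSize + n bytes. */
static void appendBytes(AQPlayerState *st, const void *src, size_t n) {
    /* cast to char* so pointer arithmetic is in bytes */
    memcpy((char *)st->myAudioData + st->myAudioDataByteSize, src, n);
    st->myAudioDataByteSize += n;  /* advance the offset for next time */
}
```

So the one-line answer to “take address A, add B, and treat the result as an address” is: `(char *)A + B`.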

I spy stupidity

As the popular adage says, Never attribute to malice that which can be adequately explained by stupidity.

We’re in Grand Rapids, picking out stuff for a new house, and I took Keagan over to the Apple Store. He loves the “I Spy” games on the demo machines, so I went ahead and got him one, thinking he could play it on the laptop back in the room during downtime.

But it didn’t work out. The “Made With Macromedia” logo on the box should have been my first hint, since that company was acquired by Adobe over two years ago, as well as the fact that the game’s requirements offer compatibility with systems as old as Mac OS 8.5.

But at any rate, I worked through the install problems one by one. First, it complained (with a series of dialogs using different look-and-feels) about not having write permissions to its own folder when run from Keagan’s non-admin account, so I reinstalled it to his home folder. Then I ran it again and hit the deal-breaker: the app couldn’t switch to “Thousands of Colors” mode, because the MacBook only supports millions of colors.

I e-mailed Scholastic support in hopes of getting an update and got a one-line reply: The program is not compatible with or supported on the Intel-based Macintoshes.

Now that’s pretty ridiculous, considering that the PowerPC transition is nearly two years behind us at this point, with all new Macs being Intel-based since May, 2006. And I wrote a duly harsh review on the Apple Store website to warn off potential customers.

But is there anything insidious about this? Can we accuse Apple of nefarious skullduggery! for using its demo machines to promote games that don’t work on modern Mac hardware, perhaps as a means of making the anemic lineup of games for the Mac look better than it really is?

Or is it more likely that the people who run the stores and set up the demo machines just aren’t aware of the problem? Or aren’t savvy enough to realize it could be a problem?

And while it sucks for Scholastic to not update the game for Intel Macs, and not pull it from the market when it’s clearly past its sell-by date, let’s be realistic: they used the half-assed Macromedia tools because they wanted a quick-to-market, cross-platform technology, for an application where the content matters a lot more than the interactivity. It’s not a “real” Mac application, and can’t realistically be updated because the compatibility limitations come from their choice of a third-party runtime, one that’s out of their control. So, insidious? No, just stupid and lazy.

Too bad they’re not interested in hiring competent programmers to write real code for their content, though. The “I Spy” concept might work really nicely as an iPhone / iPod Touch game.

I haven’t returned the game yet; I want to see if it’ll perhaps run on the Mini back home in Atlanta. And as open-box software, I may be stuck with it at any rate.

Everybody to the limit!

My college friend Mike can finally talk about what he’s up to. And for a change, it neither involves vaguely rabbity things, nor is it doomed to a three-year production schedule and subsequent non-release. Nay, it’s Strong Bad’s Cool Game for Attractive People for WiiWare:

SAN RAFAEL, CA, April 10, 2008 – Interactive entertainment pioneer Telltale, Inc. is pleased to announce Strong Bad’s Cool Game for Attractive People (SBCG4AP), a new series of episodic games for WiiWare™, in partnership with Videlectrix. Starring Strong Bad, the self-proclaimed coolest person ever, the series is based on Matt and Mike Chapman’s online animated series, which has been running at Homestarrunner.com since 2000. SBCG4AP will launch on WiiWare this June.

As the very first episodic series for connected consoles, Strong Bad’s Cool Game for Attractive People has been designed specifically for WiiWare, with easy-to-use controls and WiiConnect24™ features. Like Telltale’s popular Sam & Max series, SBCG4AP will be released as a five-episode “season” akin to a season of television. The episodes will come out on a monthly schedule. Release dates and pricing details will be revealed in a future announcement.

Bonus points for including an “About Videlectrix” section in the press release, for the fictional 2600-era game company.

Cooper, who lives “across the hall” from one of the Homestar guys, is duly psyched.

Nefarious Skullduggery!

Interesting story about how Adobe won’t be able to get Photoshop 64-bit on Mac for the current release, and are looking at a difficult effort porting it to Cocoa. Daring Fireball has an extensive analysis of the technical issues and how things have played out. My bet, posted in January and still on the table, is that Apple will deprecate Carbon at WWDC and roll out migration tools, based on their own experiences as they migrate apps like iTunes and Final Cut.

Still, the other thing that’s popping up in some quarters is that the unavailability of a 64-bit version of Carbon is part of a scheme to “force” developers to use Cocoa. Even if true — why wouldn’t Apple rather have one application framework to enhance and test rather than two? — this is hardly an issue of coercion.

What it really evinces is the lazy cynicism that all big tech companies are evil, so just like we bashed on Microsoft when they were the top dog, it’s time to bash Apple now that they’re successful. Apple’s decisions can no longer receive the benefit of the doubt, cannot possibly be the product of rational decision making or allocating limited resources. No, every time they backpedal on Carbon or don’t ship a Java 6 implementation, it’s prima facie evidence of nefarious skullduggery!

In fact, I’m adding a skullduggery! category right now to track blog entries about stuff like this.

Over on java.net, Fabrizio — who was constantly bashing Apple for not shipping JDK 6 until I guess he tired of it — typifies this line of thinking:

It’s also appalling to me to learn that strategic software manufacturers such as Adobe […] don’t get early warnings from Apple about its close technologies such Carbon and Cocoa.

Of course, if Apple only told its strategic partners about major OS strategies, and left all other developers out in the cold, we’d have a bunch more posts about the nefarious skullduggery! of that.

This makes me understand even better how important is to work with open technologies.

Yes, because open-source projects have such consistent and reliable software roadmaps, and will surely be the basis of Adobe’s products going forward.

Speaking of which, while praising Java for being 64-bit, it’s worth noting that neither Fabrizio nor anyone else thinks for a second that Java would be a viable technology for Photoshop. Even though Adobe has to do a massive rewrite to go to Cocoa anyway, and doing it in Java instead would give them 64-bit capability and an insta-port to every desktop platform, even we Java developers no longer consider the idea of major apps being written in Java.