
Archives for: April 2010

What You Missed at 360iDev

At the beginning of the month, I spoke at 360iDev in San Jose. I’d wanted to do a go-for-broke Core Audio talk for a long time, sent in an ambitious proposal, and got accepted, so this was destined to be it. Now I just had to come up with something that didn’t suck.

Luckily, there was another Core Audio talk, from Robert Strojan, that did a high-level tour of audio on the iPhone, with two deep dives into OpenAL and Audio Units. So that got everyone ready.

I called my talk Core Audio: Don’t Be Afraid To Play It LOUD (from a caption in the liner notes to Matthew Sweet’s Girlfriend), and went with the no fear angle by focusing almost entirely on audio units. The idea being: you’ve seen these low-latency audio apps that do crazy stuff, you know it must be possible, so how? The answer is: you involve yourself in the processing of samples down at the audio unit level.

Click through the 170+ slides — it’s not that bad, I just expanded all the Keynote “builds” into their own slides — and you’ll get the basic grounding in the stuff you always have to do with audio units, and the Remote I/O unit in particular. Stuff like finding the component, getting an instance, setting properties on the input and output scopes to enable I/O and set an AudioStreamBasicDescription, things you will get as used to as implementing init and dealloc in an Obj-C class.
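To make the stream-format step concrete, here’s a sketch of the kind of AudioStreamBasicDescription you end up filling in. The struct is mirrored locally so the snippet stands alone without the AudioToolbox headers, the 16-bit mono 44.1 kHz values are just illustrative, and the function name is mine, not from the session code.

```c
#include <stdint.h>

/* Local stand-in for Core Audio's AudioStreamBasicDescription, declared
   here so the snippet compiles without the AudioToolbox headers. On the
   device you'd use the real struct from <AudioToolbox/AudioToolbox.h>. */
typedef struct {
    double   mSampleRate;
    uint32_t mFormatID;
    uint32_t mFormatFlags;
    uint32_t mBytesPerPacket;
    uint32_t mFramesPerPacket;
    uint32_t mBytesPerFrame;
    uint32_t mChannelsPerFrame;
    uint32_t mBitsPerChannel;
    uint32_t mReserved;
} StreamDescription;

/* Four-char code for linear PCM ('lpcm'), spelled out byte by byte to
   avoid a multi-character-constant warning. */
static const uint32_t kFmtLinearPCM =
    ((uint32_t)'l' << 24) | ((uint32_t)'p' << 16) | ((uint32_t)'c' << 8) | 'm';

/* Fill a description for 16-bit mono linear PCM at the given sample rate,
   the sort of format you set on the Remote I/O unit's scopes. */
StreamDescription make_lpcm_mono_16(double sampleRate) {
    StreamDescription asbd = {0};
    asbd.mSampleRate       = sampleRate;
    asbd.mFormatID         = kFmtLinearPCM;
    asbd.mChannelsPerFrame = 1;
    asbd.mBitsPerChannel   = 16;
    asbd.mBytesPerFrame    = asbd.mChannelsPerFrame * (asbd.mBitsPerChannel / 8);
    asbd.mFramesPerPacket  = 1;   /* uncompressed audio: one frame per packet */
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;
    return asbd;
}
```

Once you’ve done this a few times, the derived fields (bytes per frame, bytes per packet) become second nature; getting them inconsistent is a classic source of -50 errors.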

The big win is the examples, which came together in a hideous mess of two spaghetti code files that I’m embarrassed to say are now in the hands of all the attendees. One reason for the blog entry you’re reading is to present cleaned up versions of the five sample applications from the session.

In the past, I’ve done an audio unit example that produces a sine wave by cranking out samples in the callbacks. It’s sort of like the hello world of audio in that it gets you down to the nitty-gritty of samples without too much pain, but it doesn’t play to a crowd all that well. FWIW, the second chapter of the Core Audio book writes a sine wave to a file, again to get you thinking about samples as soon as possible.

But for this talk, I decided to do a set of examples that work with audio input. That way, we got to play with the mic and the speaker — bus 1 and bus 0 for you folks who already know this stuff — and get some halfway interesting audio.

The first example goes through the drudgery of creating and initializing the Remote I/O unit, and connects bus 1 output to bus 0 input to do a pass through: anything that comes in on the mic goes out the speakers. I used to do this with an AU Graph and AUGraphConnectNodeInput(), not realizing it’s easily done without a graph, by just setting a unit’s kAudioUnitProperty_MakeConnection property. With that, I could speak into the simulator and get audio out over the PA (or into my device’s mic, but I used the simulator because it shows better).

Well, yay, I’ve turned the iPhone simulator into a mediocre PA system. What’s next? The key to the good stuff is being able to involve yourself in the processing of samples, so the next example replaces the direct connection with a render callback. This means we write a function to supply samples to a caller (the Remote I/O unit, which needs something to play). In the basic version, we call AudioUnitRender() on the I/O unit’s bus 1, which represents input into the device, to provide samples off the mic.

Still boring, but we’re getting there. Instead of just copying samples, example 3 performs a trivial effect by adding a gain slider, and applying that gain to every sample as it passes through.

Core Audio pass through with gain slider
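The per-sample gain is about as simple as DSP gets: one multiply per sample, plus clamping so loud input doesn’t wrap around. Here’s a sketch in plain C, assuming 16-bit signed samples; the function name is mine, and the session’s actual code did this inline in the render callback rather than in a standalone function.

```c
#include <stdint.h>

/* Apply a gain factor to a buffer of 16-bit signed samples in place,
   clamping to the legal range so loud input doesn't wrap around.
   This is the entire "effect" in example 3. */
void apply_gain(int16_t *samples, int count, float gain) {
    for (int i = 0; i < count; i++) {
        int32_t scaled = (int32_t)(samples[i] * gain);
        if (scaled >  32767) scaled =  32767;
        if (scaled < -32768) scaled = -32768;
        samples[i] = (int16_t)scaled;
    }
}
```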

Now we can apply any DSP that suits us as samples go through the render callback function. In example 4, we apply a ring modulator to the samples, which combines a 23 Hz sine wave with the input signal from the mic to create a reasonably plausible Dalek voice.

Dalek voice demo:

I was pretty much over time at this point, but the last example is too much fun to miss. To show off other units, I brought a multichannel mixer unit into play. On bus 0, it got a render callback to our existing Dalek code. For bus 1, I read an entire LPCM song into RAM (which is totally bad and would blow the RAM of earlier iPhones, but I couldn’t get the damned CARingBuffer working), and provided a render callback to supply its samples in order. The result, infamously, is “Dalek Sing-A-Long”:

Dalek sing-a-long

Dalek sing-a-long demo:

Anyways, great conference, great speakers, great attendees. Thanks for reading, and here’s the code:

What you missed at Voices That Matter: iPhone Developers Conference

Last weekend was the Voices That Matter: iPhone Developers Conference in Seattle, put on by Pearson, who’s publishing our Core Audio book. However, co-author Kevin Avila handled the Core Audio talk for this conference, so I took on two topics that I had fairly deep knowledge of, thanks to my work on Road Tip.

Core Location and Map Kit: Bringing Your Own Maps

This talk starts with a basic tour of Core Location’s more or less straightforward means of getting information about your current location, and getting updates as the data gets better (i.e., GPS kicks in) or you move. Then it gets into Map Kit and how to get map images of any given location. The sample app for this takes a list of Apple Store locations, which I converted from CSV to a plist and stuck in the app bundle, and shows you the nearest store to a given start location. Each time you hit a + button in the nav bar, it drops the next closest store as a pin on the map and re-scales the map so that all pins are visible.

This is where it gets good. Starting from a canned location at my house, the closest Apple Store is here in Grand Rapids. The next two that come up are outside Detroit (Ann Arbor and Novi). The fourth closest store is in Milwaukee. The comedy is when you see this on the map — Milwaukee is close only if you ignore the fact that going there would involve driving 100 miles straight across Lake Michigan. Since ferries across the lake are slow and expensive, and run only in summer, you would probably drive through or around Chicago to get to Milwaukee… and go right past 7 Apple Stores in the process.

Calculating as-the-crow-flies Apple Store distances from GRR

This is a common mistake to make — I forgot to show in my slides that the apple.com retail store finder does the same thing. Still, as-the-crow-flies distance calculations, paired with map images, can be a problem. Map Kit doesn’t know anything about roads, bodies of water, geographic features, political borders, etc… all it does is serve up images, and provide a touch UI to interact with them.

The last third of the talk is about “bringing your own maps”, meaning integration with third party map data providers to get some navigational intelligence into your app. The final code sample uses the MapQuest directions web service to get actual driving distances to the first few hits, and to keep the list of locations sorted in order by driving distance. This not only keeps me from going to Milwaukee, it even smartens up the earlier results: being right on I-96, the Novi store is now my second closest result, and as it turns out, even Indianapolis is a shorter drive than going around the lake to Milwaukee.

Calculating drivable Apple Store distances from GRR
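Once real driving distances come back from the web service, keeping the list sorted is a plain re-sort after each response. A sketch with a hypothetical Store record; none of these names come from the talk’s code:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical store record: once the directions web service returns a
   real driving distance for a store, re-sort the list so the UI shows
   nearest-by-road first, not nearest-by-crow. */
typedef struct {
    char   name[32];
    double drivingMiles;   /* filled in from the web service response */
} Store;

static int by_driving_distance(const void *a, const void *b) {
    double da = ((const Store *)a)->drivingMiles;
    double db = ((const Store *)b)->drivingMiles;
    return (da > db) - (da < db);   /* -1, 0, or 1 without float truncation */
}

void sort_stores(Store *stores, size_t count) {
    qsort(stores, count, sizeof(Store), by_driving_distance);
}
```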

In-App Purchase: Best/Worst Thing Ever

My second talk was on In-App Purchase. In many ways, it drew on the same hard-earned experience I covered in An In-App Purchase Brain Dump. I spent a little time on both the iPhone Provisioning Portal (to set up an AppID, authorize it for I-AP, and provide enough of an app submission to create purchase objects, without actually going into the review queue), and on iTunes Connect (creating purchase products). The sample app simulated an online social game, minus the game, in which users might be able to purchase digital goods like virtual clothes for their avatar. The example offered a tabbed UI with a blank view for the game, a table of purchased objects, and a table for the store. When you tap “store”, the app collects the available products with a call to Store Kit, and tapping one of them kicks off the purchase. If the purchase goes through, the item is added to the inventory page.

Purchasing an item with I-AP

Of course, this is easier and demos better because it uses non-consumable products, which have always been the best-understood and best-supported class of I-AP products. I did weigh in against the still-broken subscriptions, and how they have to be restored to a user’s many devices, even though restoreCompletedTransactions doesn’t support subscriptions, and nothing in Store Kit gives you a key you can associate with the user to save the purchase on your own server.

Erica Sadun talked with me before this session and mentioned an interesting workaround. Get your user to purchase a non-consumable item, and persist its transactionId. When they purchase subscriptions, log the purchase and this other transactionId on your server. Then, when they restore on another device, they’ll get this non-consumable back and its transactionId, which you can then use to query your server for subscription purchases. Depending on the nature of your app, and how well you hide the clunkiness from the user, this could be a very viable workaround.

That said, I still think I-AP subscriptions suck and hope that Apple deprecates them soon.

The last part of the talk covers doing commerce without I-AP, which is more viable than you might think. If you’re already selling digital goods from a web store, you can continue to do so, and make your iPhone app a rich client to your web-vended content. In this case, you’re already doing everything that I-AP offers, so there’s no reason to give Apple a 30% cut. For example, the various e-bookstores within Stanza don’t use I-AP — they go out to websites for O’Reilly, All Romance Books, etc., to complete the purchase. This might not be allowed for an iPhone-only app, but where the goods are available in multiple forms, it only makes sense for Apple to allow an iPhone app to be a rich client to such a store’s goods.

My final example is streaming anime: Crunchyroll sells premium content subscriptions on their website, and their free iPhone app lets you use the same login to get the same content. By contrast, The Anime Network has premium subscriptions on their website and via I-AP in their paid iPhone app, and credentials aren’t shared between the two. Worse, the iPhone app offers fewer videos for the same price. Their use of I-AP is bad for them (they’re paying Apple to process payments they’re already capable of handling), and bad for their users. WTF?

Coming next: my 360iDev slides, and the much-anticipated cleaned-up Core Audio sample code

Back from 360iDev San Jose

I just got back this morning on a red-eye from 360iDev (sorry, no link; not practical to do lots of HTML editing on the iPad, which I’m typing this on) and wanted to get a few thoughts out before I collapse from sleep.

It was so nice to get back into the world of iPhone OS, as I’ve been working on a day-job contract on the Mac, with far too much exposure to the broken parts of that platform, principally the installer technologies. The cleanups and relative lack of legacy entanglements make iPhone a more pleasant platform to develop for.

The conference organizers did a bang-up job of getting top-tier speakers and attendees. In a final panel, the speakers were nearly all people whose apps I used: Doodle Jump, Air Sharing, and so on.

It was also nice to run into my co-author on the iPhone book, Bill Dudney, his predecessor in iPhone evangelism, Matt Drance (who, in an ironic life swap, has replaced Bill in the Pragmatic Studio seminars), MapQuest’s Carl Edwards, Double Encore’s Dan Burcaw, and three fellow members of the Ann Arbor CocoaHeads group: Tom Hoag, Dan Hibbets, and Henry Balanon.

I did a highly ambitious talk on Core Audio, using all of my 80 minutes to get deeply into audio units, applying a ring modulator effect to captured samples from the mic, and mixing it with samples from a file to produce a “Dalek sing-a-long”. I need to clean up the example code, and will likely do a post on it sometime in the future. Still, this was the kind of conference where such an advanced talk would be well received. After Denver and San Jose, it will be interesting to see where they take this conference next.

The business sessions were at least as good as the tech sessions, maybe better. Harbor Master’s Natalia Luckoynova offered a lot of vital real-world lessons about making a go of the App Store in her talk, and Dan Burcaw’s tales of the enterprise rang all too true.

I’m not a fan of panels, but a wrap up panel on the iPad started well, and probably would be fondly remembered had they wrapped up at the scheduled time rather than going 40 minutes over.

I don’t really have a conclusion, so I’ll end with an observation: WordPress on the iPad offers no way to scroll through the list of category checkboxes. So I will have to save this post as a draft and finish up after getting back to a full-blown computer. Or maybe use my iPhone and the mobile site. Yeah, irony.