
The Audio Tools for Xcode 4.3 switcharoo

It’s great that Xcode 4.3 packs all the SDKs and essential helper apps into its own app bundle, which makes Mac App Store distribution more practical and elegant, and brings Apple into compliance with its own MAS rules. Adapting Xcode into this form is also a prerequisite for eventually delivering an iPad version, which I continue to believe is one of those things that may be impossible in the near term and inevitable in the long term.

However, this radical realignment means Xcode 4.3 has surely broken something in every Mac and iOS programming book. Lucky for me that mine are still in the works, though very close to being done. I updated the “Getting Started” material in iOS SDK Development this week to handle the changed install process, which along with the new chapter on testing and debugging should be going out to beta readers Real Soon Now.

And then there’s Learning Core Audio, where the situation is a little more involved. For one thing, the Core Audio book is content-complete and copy-edited, and we are now in the layout stage, with the dead-tree edition just weeks away. In fact, I have to finish going over the layouts for the last half of the book by Friday, something I’m doing with iAnnotate PDF on the iPad:

[Photo: marking up the layout PDFs in iAnnotate PDF on the iPad]

So this is the wrong time to be making changes, since we can only fix typos or (maybe) add footnotes. And that’s a problem because while Xcode 4.3 includes Core Audio itself in the OS X and iOS SDKs, it doesn’t include the extras anymore. And that breaks some of our examples.

This doesn’t hit us until chapter 8, where we create an audio pass-through app for OS X. This requires using the CARingBuffer, a C++ class that was already a problem because many versions of Xcode ship a broken version that needs to be replaced with an optional download. CARingBuffer lives as part of Core Audio’s “Public Utility” classes in /Developer/Extras/CoreAudio/PublicUtility.
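
To give a sense of what’s involved, here’s a rough sketch (not the book’s listing) of how a pass-through app uses the class: the input callback stores freshly captured frames into the ring buffer, and the output callback fetches them back out by sample time. The function names and the three-second capacity here are just illustrative.

    // Rough sketch only. Assumes CARingBuffer.h/.cpp from PublicUtility are
    // in the project, and that the input and output callbacks agree on the
    // stream format and share the same buffer.
    #include <AudioToolbox/AudioToolbox.h>
    #include "CARingBuffer.h"

    static CARingBuffer *gRingBuffer;  // hypothetical shared ring buffer

    void SetUpRingBuffer(const AudioStreamBasicDescription &asbd) {
        gRingBuffer = new CARingBuffer();
        // Allocate roughly 3 seconds of capacity at the stream's sample rate
        gRingBuffer->Allocate((int)asbd.mChannelsPerFrame,
                              asbd.mBytesPerFrame,
                              (UInt32)(asbd.mSampleRate * 3));
    }

    // Input side: stash captured frames, keyed by their sample time
    void StoreCapturedFrames(const AudioBufferList *abl, UInt32 frames,
                             const AudioTimeStamp *ts) {
        gRingBuffer->Store(abl, frames, (SampleTime)ts->mSampleTime);
    }

    // Output side: pull frames back out along the same timeline
    void FetchFramesForOutput(AudioBufferList *abl, UInt32 frames,
                              const AudioTimeStamp *ts) {
        gRingBuffer->Fetch(abl, frames, (SampleTime)ts->mSampleTime);
    }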

At least, that’s where PublicUtility lived until Xcode 4.3. Now that there’s no installer, there’s not necessarily a /Developer directory. Well, there is, but it lives inside the app bundle at Xcode.app/Contents/Developer, and it includes only some of its previous contents. For the rest, we have to use the “More Developer Tools…” menu item, which takes us to Apple’s site and some optional downloads:

[Screenshot: the Xcode 4.3 downloads page, showing the Audio Tools package]

This lets us download the optional Core Audio bits as a .dmg file, the contents of which look like this:

[Screenshot: the contents of the Audio Tools .dmg]

So there’s our CoreAudio/PublicUtility folder, along with the AU Lab and HALLab applications that used to live at /Developer/Applications/Audio. We use AU Lab in chapter 12 to create an AUSampler preset that we can play with MIDI, but apps can live anywhere, so this isn’t a really big problem.

PublicUtility does pose an interesting problem, though: where does it go? The code example needs the source file to build, and up to this point we’ve used an SDK-relative path. That no longer works: the SDK itself now lives inside Xcode’s app bundle, so the project can’t find PublicUtility along the old path and won’t build. So where can we put PublicUtility where projects can find it? A couple of options spring to mind:

  1. Leave the path as-is and move PublicUtility into the Xcode app bundle. Kind of nifty, but I have zero confidence that the MAS’ new incremental upgrade process won’t clobber it. And an old-fashioned download of the full Xcode.app surely would.
  2. Re-create the old /Developer directory and put PublicUtility at its previous path. This requires changing the project to use an absolute path instead of an SDK-relative one, but /Developer is a reasonable choice for a path on other people’s machines. It suited Apple all these years, after all.
  3. Forget installing PublicUtility at a consistent location and just add the needed classes to the project itself. Probably the least fragile choice in terms of builds, but means that readers might miss out on future updates to the Audio Tools download.

In the end, I chose option #2, in part because it also works for those happy souls who have not had to upgrade to Lion and are thus still on Xcode 4.2, still with a /Developer directory. There is, however, a danger that future Xcode updates will see this as a legacy directory that should be deleted, so maybe this isn’t the right answer. A thread on coreaudio-api asking where to put PublicUtility hasn’t garnered any followups, so we could be a long way from consensus or best practices here.
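
For what it’s worth, option #2 doesn’t touch the example source at all; it only changes where the Xcode project looks for the PublicUtility files. Roughly (this is illustration, not a listing from the book):

    // With option #2 in place, the chapter 8 example builds exactly as before.
    // The only thing that changed is the project's reference to the
    // PublicUtility sources: the file reference for CARingBuffer.cpp (and any
    // header search path) is now the absolute
    // /Developer/Extras/CoreAudio/PublicUtility, instead of a path relative
    // to an SDK that now lives inside Xcode.app.
    #include "CARingBuffer.h"   // resolved via the absolute reference above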

For now, that’s how I’m going to handle the problem in the book’s downloadable code. This afternoon, I ran every project on Xcode 4.3, changed the paths to CARingBuffer in the one example where it’s used, and bundled it all up in a new sample code zip, learning-core-audio-xcode4-projects-feb-26-2012.zip. That will probably be the version we put up on the book’s official page once Pearson is ready for us.

Now I just have to figure out how to address this mess in a footnote or other change that doesn’t alter the layout too badly.

Comments (13)

  1. Thanks for the info on the PublicUtility download. Why does Apple hate Core Audio so?

  2. Well, I don’t think that’s any more true than saying Apple hates OpenGL, IOKit, Dashboard, or other stuff that’s moved off to the optional downloads. The overwhelming majority of iOS developers — the ones making those hateful “Purchase 100 Smurf Coins for $0.99” freemium games — will never need AU Lab or CA’s PublicUtility or any of the other add-ons, so lightening the download is fine by me.

    After all, the commonly-used parts of Core Audio — Audio Toolbox, Core MIDI, Audio Units — are still in the OS X and iOS SDKs. It’s just the optional bits that are now separate downloads, and even among CA developers, there are only a few people who actually need this stuff. To some degree, the PublicUtility classes are out of favor with many iOS developers (myself included) because they’re written in C++.

    When Apple stops adding new stuff to Core Audio, then it will be time to be afraid. But if you keep up with the API diffs (cough, 10.8, cough), I think we’re in pretty good shape right now.

  3. jonaugust

    Chris,

    I posted the thread on the coreaudio-api list that hasn’t garnered any followups. In the meantime, I’ve been using Xcode 4.2.1. Just wondering about your option #2: how do I tell Xcode to use absolute paths to my PublicUtility?

    Also, while I’m asking you questions… Do you know of a simple way of slowing down audio on OS X without using CoreAudio? I have nightmares about CoreAudio.

    Thanks.

  4. Jon– I’m on the road and laptop-only at the moment, so I can’t send a screenshot, but the trick for using an absolute path is to select the file in the file navigator, then bring up the file inspector in the right pane. There’s a popup that lets you pick between different kinds of paths (absolute, SDK-relative, project-relative, etc.), and a tiny icon that brings up a file dialog. Taken together, these two widgets can set up your reference to a PublicUtility file like CARingBuffer.cpp.
    As for not using Core Audio… at that point, I guess it’s up to you to pull samples out of whatever your file format is and perform your own DSP. Scary as Core Audio might seem, it’s usually a lot scarier to go without.

  5. jonaugust

    Chris, thanks for the info. I think I figured out what you mean. This is it, right? http://i.imgur.com/4fUCb.png

    Unfortunately, that doesn’t seem to resolve my Linker issues… http://i.imgur.com/YpmeT.png

    I see you’re speaking, so I appreciate you taking the time to respond to my last comment. If you have any other ideas on getting this to compile, I would be really grateful.

  6. Orpheus

    Hi Chris, thanks for your work. I’m curious whether you’ve ever made public the code for a karaoke-style app using Core Audio.

  7. jonaugust

    Just a followup in case someone else runs into this. I resolved my problem by deleting my project completely and starting it fresh in Xcode 4.3.1. Clearly some kind of cross-version Xcode issue.

  8. Chris, excellent book so far, albeit currently a head-banger: I’ve been stuck in chapter 8, struggling to get CARingBuffer to play nicely and compile in the record/playback/mix project. I’ve been at it off and on for a couple of weeks, have tried a variety of paths, and have rebuilt several times (I am using the Apple update, btw), and am currently down to one Clang error of ‘no input file’, which I’m sure is still some kind of path problem. That all said, I will get this; beyond the struggle, I’m jazzed and enjoying your work leading me into Core Audio.

  9. tracyscottsf

    Just a heads up that the July 2012 Audio Tools for Xcode package only has the AU Lab and HALLab binaries (no source or headers). Grab the February 2012 Audio Tools for Xcode package instead.

    Thanks for the great book.

  10. Tracy: If you’re looking for PublicUtility, as of Xcode 4.5 it’s now in a sample code project in Xcode’s documentation browser. Search for “Core Audio Utility Classes” and it’ll come up as a pseudo “sample code” project. Open that and the README will tell you where to put the classes.

  11. kanstraktar

    Hi, Chris. I’ve been reading your book, but I still can’t figure out how you would let users change, say, an Audio Unit’s parameters in real time (via sliders, for example), since in the examples (which are great, btw) all parameter setting is done in “application:didFinishLaunchingWithOptions:”. I couldn’t find an example of this anywhere. I’m working on iOS and I’m kind of a noob. If you or someone else reading this could share some example code, that would be great, as I’ve been stuck on this for a few days now.

    Thanks a lot,
    Cheers

  12. kanstraktar: Properties and parameters of Audio Units are two different things. Properties are usually set once, when you’re creating and initializing the unit (or node); you wouldn’t want or need to (or perhaps even be able to) change properties after creating a unit. Parameters are set in real time, often by the user. Also note that properties can be of any type, while parameters are always Float32s.

    I have sliders that set parameters in my demos from CocoaConf Chicago 2013 (parameters on AUNewTimePitch) and, I think, in Voices That Matter Boston 2011 (volume on an AUMultichannelMixer). There’s also a rough sketch of the basic call below.

    Hope this helps.
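
    Here’s that sketch: a hypothetical helper (not from the book) that a slider’s action method could call with the slider’s current value. It assumes mixerUnit is the AUMultiChannelMixer unit you created when building your graph, and it does nothing fancier than log an error:

        // Hypothetical helper -- not from the book. Call from a slider's
        // action method with the slider's current value (e.g. 0.0 to 1.0).
        #include <AudioUnit/AudioUnit.h>
        #include <stdio.h>

        static void SetMixerInputVolume(AudioUnit mixerUnit,
                                        AudioUnitElement inputBus,
                                        AudioUnitParameterValue volume) {
            // Parameters (unlike most properties) can be set while the unit runs
            OSStatus err = AudioUnitSetParameter(mixerUnit,
                                                 kMultiChannelMixerParam_Volume,
                                                 kAudioUnitScope_Input,
                                                 inputBus,
                                                 volume,
                                                 0);  // buffer offset in frames
            if (err != noErr) {
                fprintf(stderr, "AudioUnitSetParameter failed (%d)\n", (int)err);
            }
        }

    From an Objective-C action method, that would just be SetMixerInputVolume(self.mixerUnit, 0, slider.value).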

  13. kanstraktar

    Thank you so much! I will give them a look.
