
Archives for: February 2010

Already thinking about a next book

I’ve been away from the blog on a pretty grueling day-job contract. It’s actually been positive for my work on the Core Audio book: nothing encourages a side project like disliking your day job.

Squirreling away some morning hours between 5 and 9 AM over the last two weeks, I’ve managed to get first drafts of the first two chapters done (admittedly with a helpful start from the fragments left from Mike and Kevin’s initial work on the book). I actually surprised myself by finding a concrete illustration of the Nyquist theorem, which (basically) says that to reproduce a frequency, you have to sample at at least twice that frequency. In chapter 2’s digital audio intro, I added an example to get readers looking at raw samples, writing out their own waves as PCM data. An inner loop uses a samplesPerWave value, and it occurred to me that something interesting happens at the Nyquist point. If you’re sampling at 44100 Hz, then for a 22050 Hz tone, samplesPerWave is 2. That’s the minimum for a perceivable, repeatable pattern: at any higher frequency, you have fewer than 2 samplesPerWave and therefore no repeating pattern, no wave.
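The exercise amounts to something like this in spirit (my sketch here, not the book’s actual listing; the function names are mine):

```c
#include <stdio.h>

#define SAMPLE_RATE 44100

// How many samples one full cycle of the given frequency occupies.
static int samples_per_wave(double frequency) {
    return (int)(SAMPLE_RATE / frequency);
}

// Write one second of a square wave as raw 16-bit PCM samples.
// At 22050 Hz, samples_per_wave() returns exactly 2 -- the Nyquist
// minimum; above that, there's no repeating pattern left to write.
// Returns the number of samples written, or -1 on failure.
static int write_square_wave(const char *path, double frequency) {
    int spw = samples_per_wave(frequency);
    if (spw < 2) return -1;              // past Nyquist: no wave
    FILE *out = fopen(path, "wb");
    if (!out) return -1;
    for (int i = 0; i < SAMPLE_RATE; i++) {
        // High for the first half of each cycle, low for the second.
        short sample = ((i % spw) < spw / 2) ? 32767 : -32768;
        fwrite(&sample, sizeof(sample), 1, out);
    }
    fclose(out);
    return SAMPLE_RATE;
}
```

Load the resulting raw file into an audio editor as signed 16-bit mono at 44.1 kHz and you can see (and hear) the Nyquist point for yourself.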

Not a big deal, but it was a nice little “aha” moment to stumble across.

I’m enjoying having momentum on Core Audio, and since I wrote that never-published series of articles (yep, the publisher is still wedged) on low-latency use of audio units and OpenAL, I’ve already worked through some of the conceptually hardest material, so there are fewer unknowns looming ahead than usual. Which I wouldn’t have expected for Core Audio, surely the hardest topic I’ve ever written about.

I’ve found myself wondering what would be a good topic to pitch and hopefully write about in late 2010, time and budget permitting (writing for anyone other than the Prags is a financial indulgence… it most certainly does not pay the bills).

Here are a few things I’m thinking about.

  • HTTP Live Streaming – The only choice for streaming video to an iPhone OS device, and a compelling choice for streaming media in general. Would probably cover setting up your own server, working with CDNs that support it, writing your own client (if QTX or MediaPlayer.framework doesn’t work for you), encryption and DRM, etc.
  • Core Services – all the C functions that none of the introductory Cocoa books tell you you’re going to need for a serious iPhone or Mac application. Core Foundation, CFNetwork, Keychain, System Configuration, Reachability, stuff like that.
  • C for iPhone – I’ve mentioned this before, how I wrote about 100 pages of this for the Prags, and everything was fine until it wasn’t. With all the people getting into iPhone OS development without a C background and the problems of applying the very dated K&R to modern C development (especially for beginners, who won’t follow its Unixisms or its analogies to Fortran and Pascal), I still think we badly need a C intro that can serve as a prerequisite to all the iPhone books (including ours) that assume C experience. Plus, I find it a perversely fun topic to speak about (see the slides of my CodeMash session).
  • OpenAL – There’s, like, next to nothing to get beginners started with this audio technology, and what’s out there is frequently wrong or outdated (for example, some “getting started” type blog entries use the deprecated loadWAVFile that isn’t even present on iPhone OS). I’m actually thinking of a radically different format for this book, but don’t want to give it away just yet.

I was going to make this a poll, but I don’t think I have enough readers to make that viable, and all the poll plugins for WordPress suck anyways. So if you’d care to vote for one of these options, please post a comment. Besides, I’d love to have new registered users who aren’t offshored spammers.

There will likely be an update to iPhone SDK Development as well, when SDK changes warrant, but that hasn’t happened yet. Let’s see when we get a 4.0 and what’s in it.

The Long and Winding Road Tip

Road Tip 1.1 is out today after a week in review, and that provides an opportunity to talk about my first foray into location-based development.

The Idea

The origins of Road Tip probably trace back to a trip to Southern California last year with my family. We were on the way back to San Diego from Disneyland, looking to grab some dinner. All kids are picky, ASD kids doubly so, which meant we were on the lookout for a Taco Bell. Not exactly a rarity, but as it turns out, California doesn’t have those blue “what’s at the next exit” signs so common in the midwest and south. Moreover, they also don’t seem to allow billboards. The practical upshot is that if you’re looking for a particular brand of roadside service, well, good luck finding it. I actually ended up typing the search term “taco bell” into the Maps app while driving 65 MPH, a stupidly foolhardy thing to do, but such was my frustration.

The thing is, it actually gets worse: Maps is only smart enough to do a radius search. It can find Taco Bells near you, but can’t account for whether they’re five miles off the freeway, or five miles behind you.

So later that fall, I came upon the idea of making an app to find stuff at exits by taking into account your direction of travel and looking for what would be on your way. In general, my plan was “find exits, then find stuff near them”.

Building the thing

This actually turns out to be an interesting problem to solve. Most of the map APIs out there are built around the concept of “given a destination, how do you get there”, and give you options like taking certain kinds of roads (avoiding traffic, keeping Rain Man off the freeways, etc.). But this app is different: instead of “given destination, find a route”, it’s “given a route, find some destinations”. It also twists the usual logic of point-of-interest searching. Instead of “find stuff near a point”, it’s “find stuff along a route.”

The iPhone includes APIs for working with locations and for providing map images, but not for working with directions. This is what Apple meant in their 2009 introduction of the iPhone 3.0 SDK when they said “bring your own maps”. I previously detailed my search for a map provider, based on the needs of an iPhone mapping application: an API that can be called from a C/Objective-C app, rich search functionality and a deep database of POIs, and license terms I can live with. I think a lot of people overlook this last point: most map websites are free, but using their data in a commercial application usually is not. I mean seriously, come on, do you think they pay their people to amass all this data and are then just going to give it to you because they like you, or because you make some speech about “information wants to be free” or some garbage like that? Puh-leeze.

Most people I know default to Google Maps, but from the API point of view, they were out of the running early. They seem primarily interested in their JavaScript API, which can’t usefully be called from an iPhone app (unless you’re just writing a webapp). Their HTTP webservice-ish API has a lot of holes, and their terms didn’t seem welcoming to commercial licensing.

I ended up contacting MapQuest about a commercial license, and went with their service. They have a number of APIs for various scripting languages, as well as a machine-to-machine XML API that these are all based on. I realized that as long as I was willing to send and parse a lot of XML, this would suit my needs perfectly well. After getting good results with a few proof-of-concept projects, I got in touch with MapQuest and worked out a commercial license to use their data in the app. In addition, I got access to their developer and marketing support, which has been very helpful, as I’ll get to in a little bit.

Building Road Tip

I had a few specific goals for Road Tip. I wanted it to be as un-fussy as possible: if it was going to be so distracting that it could only be used by a passenger, there’d be no point writing it; just have your spouse or your kids use the Maps app instead. Using a smartphone while behind the wheel may not be safe, but since people (myself included) are going to do it anyways, I wanted to make the app as non-distracting as possible. So my design goals were:

  • Glance-able: get the info you need with a 1-2 second glance at the screen, no longer than you’d look at a road sign or your radio.
  • One-thumb control: usable by holding the phone in one hand and interacting entirely with your thumb. That means no pinch-to-zoom, no typing. I also made most of the buttons bigger than the default, to make them more hittable.
  • Nothing extraneous: all I want is stuff on my current freeway, not what I might find if I turn off onto other highways. And it goes without saying that an app like this cannot use banner ads.

This accounts for the high-contrast, minimalist GUI. It’s gotten me bad reviews on the App Store for not jumping into the bling arms race, but the point of the app is to not look at it for any more than a second or two. Honestly, the ideal UI for this app isn’t visual at all: it would be far more useful to do voice synthesis and recognition, having the app read out potential destinations and letting the user speak his or her choices. If we get speech in iPhone SDK 4.0, you can bet this will be something I adopt for a Road Tip 2.0.

Finding Your Way

So then there’s the whole idea of the app logic. It’s not that hard to find exits in the map data: with MapQuest’s “navt” (Navteq) coverage set, you just search for objects of display type 1736. An Exit Finder demo from MapQuest illustrates using a different database to do the same thing (this example provides road names for the exits, instead of just numbers, something I adopted in 1.1). The problem is to find the right exits. In my first version, I found all exits in a cone-shaped search region ahead of the user, without knowing which freeway each belonged to. The problem was that when your freeway intersected another, you’d get info for exits that you couldn’t actually take or that wouldn’t do you any good (such as the other freeways’ exits onto your current freeway). I tried to filter these out by examining exit numbers, but this approach was filled with both false positives and false negatives.

My road tests proved this out pretty quickly. Consider the intersection of I-196 and US-131 in downtown Grand Rapids:

The exit numbers only differ by about 8, so a loose rule like “ignore exit numbers off by more than 20 from other nearby exits”, which works well most of the time, doesn’t work here. A worse test is US-127 south through Lansing, MI:

If you’re heading south on US-127, exit counting totally doesn’t work, because as I-496 comes in from the west, its exit numbers take over. So you count down through the 70s when suddenly Trowbridge Road is exit 8 and the exit numbers start counting up, to Jolly Road at exit 11.

It gets even worse if you use the relative location of exits to set up your next search for more exits, as I did. In that case, finding exits for the wrong freeway ends up sending the whole search off-course.

So what I needed was to really keep track of the road the user is actually on. MapQuest’s engineers gave me some advice for a technique that is harder and costlier, but maximally accurate: find the road segments themselves. The logic I ended up going with figures out the name of the road you’re on, then searches ahead and filters and sorts only segments that match this road. The practical upshot is that Road Tip can keep its search limited to the current freeway, ignoring intersecting freeways and even stretches where multiple freeways run together.
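The segment-matching filter boils down to something like this (again a sketch with hypothetical field names; MapQuest’s actual XML schema differs):

```c
#include <string.h>

// A road segment as it might come back from the XML search results.
// (Field names are my own, for illustration.)
typedef struct {
    char roadName[64];
    double lat, lon;
} RoadSegment;

// Keep only segments whose name matches the road we're on, so an
// intersecting freeway's segments never steer the search off-course.
// Returns the number of segments kept.
static int filter_by_road(const RoadSegment *found, int foundCount,
                          const char *currentRoad,
                          RoadSegment *kept, int keptMax) {
    int keptCount = 0;
    for (int i = 0; i < foundCount && keptCount < keptMax; i++) {
        if (strcmp(found[i].roadName, currentRoad) == 0) {
            kept[keptCount++] = found[i];
        }
    }
    return keptCount;
}
```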

To that end, here’s an internal debugging screen (using the built-in map tiles, sorry MapQuest), that shows how this kind of logic handles the US-127 torture test without turning onto either I-496 or I-96.

This approach also has the advantage of handling twisting roads really well. I use a series of searches for road segments, each one set up by the direction vector of the last segment of the previous search, so the app doesn’t lose the road as it curves (and, when searching all services, can kick off the gas/food/lodging search while still searching for more road segments). Here’s the debugging screen again, this time looking at I-80 East near Truckee, CA, where it goes mostly north (and occasionally a little south and a little west) on the way through the Sierras and into Nevada.
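The chaining step itself is simple in principle: project the next search center ahead along the direction of the last segment found. Something like this (my sketch, not the app’s exact math):

```c
// A latitude/longitude pair, in degrees.
typedef struct { double lat; double lon; } Point;

// Given the last two segment points from the previous search, extend
// that direction vector to pick the center of the next search.
// lookAheadFactor of 1.0 steps ahead by one segment-length; larger
// values search farther down the road per round trip.
static Point next_search_center(Point previous, Point last,
                                double lookAheadFactor) {
    Point next;
    next.lat = last.lat + (last.lat - previous.lat) * lookAheadFactor;
    next.lon = last.lon + (last.lon - previous.lon) * lookAheadFactor;
    return next;
}
```

Because each search is aimed by the road’s own most recent direction, a curve just bends the chain of searches along with it.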

Where’s my dinner?!

With a representation of the road accurate to a matter of feet/meters, I can then search for exits, and perform radius searches around each of them for restaurants, hotels, and gas stations.

I decided there were two primary use-cases to address: either the user knows they want to exit soon and wants to look exit-by-exit for what’s available if they get off now, or the user wants to know what services are on the road ahead, regardless of distance.

For the first case, an “exits” button lists the upcoming exits in a grouped UITableView:

Each row of the table is a button. Tap it to bring up the exit’s services, presented as a tab view of gas / food / lodging / all.

For the “hold out for Taco Bell” case, you take the other navigation branch, tapping “services”. This presents a short list to indicate what you’re searching for: gas, food, lodging, or “favorites” (which you set up in the Settings application, ideally before you go driving, since it’s detail-oriented and highly distracting). This view organizes service names (restaurant and hotel chains, for example) by the exits where their locations can be found:

In either branch, the final page is a map of your selected service, relative to the exit:

What’s Next?

So that’s Road Tip as of 1.1. I’m pretty happy with the utility of the app, even if the App Store reviews and ratings are a mixed bag. I built it to suit my needs, and use it on all my long drives, and if it’s not as bling as so many tricked-out iPhone apps, well, that’s really contrary to the point of the app.

The most useful criticism so far is that the app has a very long startup time. I need the current direction of travel in order to know which way to search for road segments, and the iPhone’s GPS can take up to 30 seconds to deliver that. Ugh! However, I’ve been experimenting over the last week to see if I can deliver a “good enough” estimate to start searching, even before I get the high-confidence course from Core Location. It’s a little tricky, because the first update from CL is usually junk. For example, consider the following map:

When I run the app at my desk and I’m not moving, point 1 is the first location provided by GPS. Problem is, point 2 is where I actually am. Draw a line between them and you’d infer that I’m moving east-southeast. So to do the estimate, I have to ignore the first update and wait for the second and third unique locations before estimating the direction of travel. Still, on the highway, I’m getting this in 5-15 seconds, which is about half the time of waiting for Core Location. So that’s going into 1.2.

After that, I don’t know… I’m very busy with a day-job Mac contract, and I really want to get back into media programming (and the Core Audio book), so this may have been a one-time-only foray into location-based iPhone app development for me. I can’t say at this point if it’s going to end up making financial sense to have spent so much time developing it and putting it out there on the App Store, but at least it’s been intellectually rewarding.