I’ve finally gotten a few livestreams out the door on invalidstream.com, so I think it would be useful to braindump some of what I’ve learned in getting to this point.
Next up on our tour of WWDC 2015 media sessions is the innocently-titled Content Protection for HTTP Live Streaming. Sounds harmless, but I think there’s reason for worry.
For content protection, HLS has always had a story: the transport segments get AES-128 encryption, and can be served from a dumb HTTP server (at CocoaConf a few years back, I demo’ed serving HLS from Dropbox, before it was https:-always). You’re responsible for guarding the keys and delivering them only to authenticated users. AV Foundation can fetch the keys, decrypt the segments, and play them with no client-side effort beyond handling the authentication. It’s a neat system, because it’s easy to deploy on content delivery networks, as you’re largely just dropping off a bunch of flat files, and the part you protect on your own server is tiny.
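As a rough sketch of what this looks like on the wire (the URLs and filenames here are placeholders, not from any real deployment), the media playlist simply tells the client where to fetch the key, via an EXT-X-KEY tag. The segments can live on the CDN; only the key URI needs to sit behind your authentication:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-KEY:METHOD=AES-128,URI="https://example.com/auth/key"
#EXTINF:9.009,
segment0.ts
#EXTINF:9.009,
segment1.ts
#EXT-X-ENDLIST
```

The client hits the key URI (presenting whatever cookie or token your auth scheme requires), gets back the 16-byte key, and decrypts the segments itself; the playlist and .ts files stay on dumb storage.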
So what’s “FairPlay Streaming”, then?
So it’s been a month since I taught my half-day Roku SDK class at CodeMash 2014 (sorry for the lack of blogging… client project in crunch mode). I’ve long since posted my slides, with the sample code in my Dropbox public folder.
Also, if you don’t already have a Roku, get one through my Amazon affiliate link and thereby incentivize me to blog more. Thanks.
I was really happy that we were able to convey the sense that streaming is a multi-disciplinary and holistic pursuit, tying in very different kinds of expertise on the client, the server, networking, encoding, content production, and business concerns. That’s something I tried to stress in my CocoaConf HLS talks. The irony is that my speaker feedback from those sessions would sometimes say “more of the iOS part, please”, and the fact is, creating an MPMoviePlayerController or an AVPlayer is easy compared to some of the other tasks involved — encoding for different bitrates, transcode/transmux for non-iOS clients, security, etc. To say nothing of acquiring or creating something worth streaming in the first place.
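For a sense of what the “encoding for different bitrates” part entails: the deliverable is a variant playlist pointing at several separately encoded renditions of the same content, something like this (the bandwidth figures, resolutions, and paths here are made up for illustration):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=480x270
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
medium/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1600000,RESOLUTION=1280x720
high/prog_index.m3u8
```

The player picks among these based on measured throughput, which means you have to encode and host every rendition — and that production work dwarfs the one line of client code that creates the player.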
So, good chat, and quite a brain-dump for a 45-minute podcast. The guys had lots of relevant expertise in encoding and hosting, which took the conversation in directions it needed to go. So, thanks iPhreaks. It was fun.
So I tried something different this year. I took a pass on the angst and nerd drama of the WWDC ticket lottery and saved my pennies for a completely different conference, Streaming Media West. As much as I like working on media applications, I’ve long believed that I’ll be far more useful as a consultant the more I understand the content side and the problems that clients are likely to bring my way. I really like the magazine and its website, and figured I could pick up some useful knowledge both for my development work and my aspirations to do more livestreaming of my own.
tl;dr, was it worth $2,500 of my own money for ticket, hotel, and airfare? No. Not in any obvious way. But the fact that it didn’t pan out may itself provide valuable insights.
For those of you who dig tech conferences held in the middle of winter at indoor waterparks, CodeMash sessions are up, and I’m doing something different this year. I’m teaching a half-day class on Roku development, on January 8, 2014.
Details are on the sessions page — you can’t link to a specific session, but if you filter by “precompiler” classes and type equals other, you’ll find it. Of course, this is three months away, so I don’t have the class materials prepared yet, but my plan is to get a quick win by streaming something simple like the Apple “BipBop” test stream, and then move on to building out a more sophisticated content browser with the various UI widgets.
Actually, we’ll probably dedicate the first hour just to getting set up. Everyone needs to get their Rokus onto the Kalahari wifi, put them into developer mode, get the IP address, and get set up to send zips to it with FTP and watch the debug messages with telnet. Also, everyone’s going to need to see the output from their Roku, either with a second monitor or (preferably) a capture box into their laptop. Lots to wrangle, and makes me appreciate the relative convenience of Xcode and its iOS simulator.
EDIT: Jeff Kelley reminds me I should score some Amazon affil linkage out of this. So if you want a top-of-the-line HD-only, HDMI-only box, check out the 3rd generation Roku 3, whereas I’ll be rocking the 2nd gen SD/HD Roku 2 XS.
Also, as I discovered the hard way at CocoaConf San Jose, a Roku on someone else’s wifi is worthless without the original remote control: there’s no way to connect it to the wifi, which in turn means you can’t find it with the Roku smartphone app. So, yeah, I was very careful to put that in the requirements/prerequisites for the class.
Anyways, that’ll be fun, and very much in CodeMash’s spirit of trying out new things. CodeMash registration is next Tuesday (Oct. 22) for alumni, and the following Tuesday (Oct. 29) for general public. Last year, each sold out in a WWDC-like 30 seconds; in fact, it was the CodeMash 2013 experience that convinced me that WWDC 2013 would sell out in less than a minute.
So, if you make it in, and want to learn Roku programming, I’ll see you in January!
And a week late, I’m finally writing my follow-up post to CocoaConf San Jose. Not that anybody’s missing anything, I think, because I’ve been doing the same two hour-long sessions for the CocoaConf Spring 2013 tour: “Core Audio in iOS 6” and “Mobile Movies with HTTP Live Streaming”. I’ve tweaked each repeatedly, although this time the only one with slides new enough to post to Slideshare is the Core Audio one, since I added some slides at the end to do an overview of Audiobus.
I’d hoped to get a full-blown Audiobus demo ready in time for the conference, but client work took priority, and in the midst of an 11-hour flight delay in Denver, I didn’t have the tools or the stamina to pull it off.
Actually, I’m thinking I may work on digging into Audiobus enough to get a whole one-hour talk on it ready for CocoaConf’s Fall tour. Doing so would also help me deal with the fact that some of the Fall conferences will likely fall inside the iOS 7 / OS X 10.9 NDA period, leaving us unable to talk about the new hotness from Cupertino. Audiobus is an interesting new topic that would not be encumbered by the Cupertino cone of silence.
For CocoaConf DC, I freshened up the HTTP Live Streaming talk with a demo of the streams created for the iOS app working as-is with a Roku HD purchased the night before at Target (because I forgot my Roku XS back in Grand Rapids). Actually, only the basic and variant streams work – I didn’t try getting encrypted streams to work, but Roku apparently supports it, so that’s something to work on for next time.
I also had time left over at the end of my regular Core Audio session in DC… with a few judicious cuts, I could carve out 10 minutes or so for an introduction to Audiobus. Anyone interested in that?
I mentioned a while back that I was bored now with Apple apparently deciding to take the first few months of 2013 off, at least in terms of shipping anything interesting. With all the laptops on a schedule of updating mid-year for back-to-school, and all the iOS devices apparently on a holiday season update, and the SDKs getting revved annually at WWDC, it leaves a big gaping hole of nothing at the beginning of the year.
I’d hoped we’d see an Apple TV SDK by now, and since we haven’t, I’ve gone looking for something else to do. I bought a Roku 2 XS Player (just in time for the Roku 3 to come out, wouldn’t you know), since the Roku platform is highly welcoming of third-party developers, and features a broad selection of third-party content (including, of course, another means of getting my Crunchyroll fix).
OK, let’s do this thing.
Attendees of the iPad Productivity Workshop — an all-day class I did for the first time, following a poll here on [Tc]; about new tutorial topics — have already written all the code, but for DC students who want an advanced peek (or anyone else who’s interested), here’s a zip of the project in its various stages.
The “staged examples” is an idea I got from Daniel Steinberg, who swears by it for his classes. The great thing about it is that if someone falls behind, they don’t get lost: they can just skip ahead to the next checkpoint in the code’s progression. In this class, we build an app that searches iTunes, puts the results in a UICollectionView, and then lets the user build a wishlist of items as a UIDocument. Along the way, we add in:
It turns out to be more than I can teach in 8 hours, so with the stages, we just skip ahead to a good starting point. In Chicago, we started at stage 3, with the search feature working and the split-view for wishlist browsing set up in the storyboard but not yet implemented. The code might get a few tweaks before DC — possibly sorting the .wishlist files in the master table, and supporting pasting into the wishlist — but overall things are in really good shape.
As for my other talks, I did Core Audio in iOS 6 and Mobile Movies with HTTP Live Streaming again. They’re good talks and pretty polished at this point, but they were in some ways meant as a placeholder in case Apple gave us something new to play with in time for CocoaConf. Obviously that hasn’t happened… it’s been a real boring Q1 in Apple-land.
If you’re here for the Core Audio, note that this is the corrected, works-on-iOS-6.1 code that I discussed in a previous blog entry.