Streaming the Streaming Talk (CocoaConf Raleigh, Dec. 2012)

Since Jeff Kelley demanded it, here's a dump of the setup I used to livestream my CocoaConf Raleigh session about HTTP Live Streaming (and the Core Audio session before it). Attendees of these talks are welcome to skip to the bottom of this post for the promised links to code and slides.

First off, links to the recorded livestreams on Ustream:

  • Core Audio in iOS 6 – stream
  • Mobile Movies with HTTP Live Streaming – stream pt. 1, pt. 2

The second talk is split into two parts because Wirecast crashed partway through the session, breaking the stream and forcing me to start the broadcast anew. And that speaks volumes about what a seat-of-the-pants operation this was.

Ustream screenshot of Core Audio talk

Actually, let's go back a few steps… I hadn't hosted a livestream since my late-summer iOS gaming streams. I did those with the basic Ustream tools, but for Black Friday, Telestream (makers of the great ScreenFlow) offered a 30%-off sale on all their products, which allowed me to pick up the full Wirecast livestream production application for $150 off its $500 tab. And I was eager to try out Wirecast's advanced features.

Wirecast broadcast document for CocoaConf Raleigh livestream

Perhaps more advanced than a 2012 MacBook Air is cut out for. Aside from the aforementioned crash, Wirecast warned me before I'd even started the broadcast that USB throughput had become overloaded, even though the only devices connected were the internal iSight webcam (whose bandwidth I would happily have reduced, had that been practical) and the USB-ethernet adapter.

Yeah, ethernet. CocoaConf graciously covered the cost of a wired connection for the upload of these talks’ livestreams, since the hotel wifi behaved pretty much like any developer conference hotel wifi typically does — hopelessly overloaded to the point where checking mail or refreshing Twitter is a 60-to-120-second slog.

So aside from the iSight (which I didn't use), I also had a camcorder on a tripod behind the projector, connected back to the Mac via a 15′ FireWire cable (mini 400 to regular 800, from Monoprice), which then connected via a Thunderbolt adapter (proving again that the point of Thunderbolt is to convert it to other standards that anyone anywhere actually uses). The camcorder was used for nearly all of the presentation, since I needed to focus on giving the talk to the on-site attendees and not worry about dicking around with shots in Wirecast. In the HTTP Live Streaming talk, I gave a brief demo of Wirecast, switching between desktop video captured with the built-in Desktop Presenter, the two cameras, and the classic Ellen Feiss "Get a Mac" ad (anime fans who look at my shot list above will see that I also loaded up the AMV Hell 3 "Osaka" version as a possible alternative).

I connected my iPhone to the projector (via the dock-to-VGA adapter) for this demo, in part to show the high latency (~30 sec.) inherent in HLS's backend transcode-and-segment architecture, but also because I didn't seem to have any other way to get my desktop out to the projector, since the Thunderbolt port was already in use. I later realized I could have used something like XDisplay to mirror the MacBook to my iPad and then projected that via VGA, although this would have put me at the mercy of the limited wifi bandwidth. So, there's no perfect solution… one reason why most of my future streams will be from home, where my Mac Pro may be old, but at least it's got lots of I/O ports (so how about a modern-era replacement, Mr. Cook?).
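
By the way, that ~30 seconds isn't arbitrary. Apple's tools default to 10-second segments, and the HLS spec has clients buffer roughly three segments before playback starts, so you're about 30 seconds behind live before you even count the encoder and segmenter overhead. For flavor, here's a sketch of the sort of live media playlist the client keeps re-fetching (segment names and sequence number invented for the example):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:137
    #EXTINF:10.0,
    segment137.ts
    #EXTINF:10.0,
    segment138.ts
    #EXTINF:10.0,
    segment139.ts

The tell that this is a live stream is the missing #EXT-X-ENDLIST tag, and that's also why the client has to keep polling the playlist, waiting for new segments to appear at the bottom.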

Aside: speaking of this wifi-constrained scenario, doesn't that seem like a real-world problem for the otherwise-clever Logitech Broadcaster Wi-Fi Webcam? I think you'd have to be nuts to depend on always having enough wifi bandwidth to support a 720p H.264 stream (requiring 2 Mbps, according to Logitech?). The design and form factor are lovely, but is it really going to be practical in a real-world shooting environment, or are you always going to have to bring your own MiFi for it?

My broadcast settings were a low-bandwidth 640×360 (16:9) preset at 916 Kbps, chosen in part because I didn't know whether the bandwidth crunch we were all experiencing was just the slammed wifi or a bottleneck in the hotel's internet connection itself. It seems to have been the former, because the wired uplink behaved just fine. The other reason I wanted to run at this bitrate is that it's pretty much what I'll be able to manage at home on my 1.5 Mbps AT&T U-verse upstream. Allowing for overhead and bitrate variability, I don't think I can reliably go much faster than about 1 Mbps (the next Wirecast preset up is 854×480, at a bitrate of 1278 Kbps).
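
Since this was an HLS talk, it's worth a note on how those two presets would look if they were offered as HLS variants rather than a single Wirecast-to-Ustream uplink: the master playlist advertises each tier's bandwidth and resolution, and the client switches tiers based on the throughput it actually measures. A sketch, with the variant paths hypothetical:

    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=916000,RESOLUTION=640x360
    low/prog_index.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1278000,RESOLUTION=854x480
    mid/prog_index.m3u8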

Anyways, I think it went pretty well, despite my making some errors in what I knew going in would be a long learning process. I got some nice tweets (1, 2) along the way, along with a complaint about slide readability. Oh well, stuff to work on. I'll get another shot at it when I run more or less this same setup again in a couple of weeks at Ann Arbor CocoaHeads, which of course will be streamed on the invalidstream.

And hopefully sometime early next year, I'll be able to bring some of my more elaborate livestreaming plans to fruition. If nothing else, expect more use of my amp-up pre-roll music, as heard under the bars at the beginning of Saturday's streams. (Did someone say "Contract?")

Anyways, that's surely enough about livestreaming for now. For those who attended my sessions, here are the usual links. For the all-day Core Audio tutorial (something I'll have a follow-up blog about soon), we got through all five of my sample projects, so that gets a new zip file. For the regular sessions, the code hasn't changed since the first run of those talks, so the links go to zips I've posted here from previous CocoaConfs.

Comments (5)

  1. Finally got around to trying out the AQTapDemo on iOS. It worked fine in the simulator, but there was no audio on the device. I could see the debug info for the packets and there seemed to be no renderErrs, but no sound!
    Thanks for this example. I hope this can make it into a future update of your book.
    At first I thought a processing tap would just let you set up a simple render callback to process the data, but wow! So complicated. This is not an easy thing to plug into an existing app, and even after several months there's a dearth of other examples or documentation!

  2. Works for me on the device – watch me demo it (after a few network false starts) at 01:12:00 in the livestream of the Raleigh talk. Not to be patronizing, but have you checked that the ring/silent switch isn't set to silent?

  3. Thanks! Ring/silent was the problem. Do you know why that would silence the audio for this app but none of my other audio apps? I've never come across this before. Thanks for the quick reply.

  4. Two bits says you set your audio category correctly, and I didn’t bother.
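
A postscript on this thread, since others may hit the same wall. The ring/silent behavior comes down to the audio session category: an iOS app defaults to SoloAmbient, which the ring/silent switch mutes, and a playback app has to opt into MediaPlayback. Here's a minimal sketch using the C-level Audio Session API of the iOS 6 era; the function and constant names are real, but everything around them is pared down:

    #include <AudioToolbox/AudioToolbox.h>

    // Opt out of ring/silent muting: the default category (SoloAmbient)
    // is silenced by the switch; MediaPlayback is not.
    static void useMediaPlaybackCategory(void) {
        AudioSessionInitialize(NULL, NULL, NULL, NULL);   // once per app launch
        UInt32 category = kAudioSessionCategory_MediaPlayback;
        OSStatus err = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                                               sizeof(category), &category);
        if (err == noErr) {
            AudioSessionSetActive(true);
        }
    }

And to gesture at the "wow! So complicated" in comment 1: the iOS 6 Audio Queue processing tap (AudioQueueProcessingTapNew() and friends) hands you a callback, but that callback has to pull its own source audio, and anything fancier, like routing the samples through effect units as AQTapDemo does, is left entirely to you. A bare skeleton, with all the app-specific plumbing omitted:

    #include <AudioToolbox/AudioToolbox.h>

    // The tap callback: pull the queue's source audio into ioData,
    // then inspect or modify the samples in place.
    static void tapCallback(void *inClientData, AudioQueueProcessingTapRef inAQTap,
                            UInt32 inNumberFrames, AudioTimeStamp *ioTimeStamp,
                            AudioQueueProcessingTapFlags *ioFlags,
                            UInt32 *outNumberFrames, AudioBufferList *ioData) {
        OSStatus err = AudioQueueProcessingTapGetSourceAudio(inAQTap, inNumberFrames,
                           ioTimeStamp, ioFlags, outNumberFrames, ioData);
        if (err != noErr) return;
        // ...process ioData here (this is where the real work goes)...
    }

    // Attach the tap to an existing audio queue.
    static AudioQueueProcessingTapRef attachTap(AudioQueueRef queue) {
        UInt32 maxFrames = 0;
        AudioStreamBasicDescription processingFormat = {0};
        AudioQueueProcessingTapRef tap = NULL;
        OSStatus err = AudioQueueProcessingTapNew(queue, tapCallback, NULL,
                           kAudioQueueProcessingTap_PreEffects, &maxFrames,
                           &processingFormat, &tap);
        return (err == noErr) ? tap : NULL;
    }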
