A Livestreaming Brain-Dump

I’ve finally gotten a few livestreams out the door, so I think it would be useful to brain-dump some of what I’ve learned in getting to this point.

Demoing Motion on invalidstream

I started playing with livestreaming a few years back and gave several talks at CocoaConf about HTTP Live Streaming in iOS, even livestreaming the talks themselves. Those first few efforts were done on Ustream, which is a perfectly reasonable way to get started (well, it was in 2012, not sure about now). In fact that’s true of a lot of the all-in-one shops, like Livestream, Twitch, YouTube, etc. — they deal with a lot of the hassle (and expense, which is not to be underestimated!), so you can learn the ropes. However, I didn’t want to rely on a third-party like that, particularly not with the high likelihood that the kinds of content I publish would get flagged and shut down by automated copyright bots like YouTube’s Content ID.

Servers and Streaming

So, if you’re going to go it alone, you need to have two things at a minimum: a streaming server, and a means of delivering live content to it.

10 years ago, a streaming server would have meant something like a Flash Streaming Server, which was expensive and difficult to scale. Newer adaptive streaming technologies like HTTP Live Streaming, MPEG-DASH, et al., stream what appear to be flat files over HTTP and port 80, so they’re a lot cheaper to work with; since HLS is just a huge collection of files being written on the fly, it works well with CDNs, as opposed to anything that requires keeping a socket open. If you’re just using HLS for video-on-demand — using adaptive streaming instead of flat files because of its multiple bitrates — then you can literally just dump the files on a normal webserver and be done with it. That’s what I did for the demos in my CocoaConf sessions.
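To make the “huge collection of files” concrete: an HLS deployment starts from a master playlist, a tiny text file that points at per-bitrate variant playlists (which in turn list the media segments). Here’s a minimal sketch that generates one; the bitrates, resolutions, and paths are hypothetical, not from any particular setup.

```python
# Sketch: build an HLS master playlist pointing at per-bitrate variant
# playlists. The variant paths and numbers here are made-up examples.
def master_playlist(variants):
    """variants: list of (bandwidth_bps, resolution, uri) tuples."""
    lines = ["#EXTM3U"]
    for bandwidth, resolution, uri in variants:
        lines.append(
            f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}"
        )
        lines.append(uri)
    return "\n".join(lines) + "\n"

print(master_playlist([
    (500_000, "640x360", "low/prog_index.m3u8"),
    (1_200_000, "854x480", "mid/prog_index.m3u8"),
]))
```

A client fetches this file, picks the variant that fits its measured bandwidth, and starts pulling segments — all plain HTTP GETs, which is exactly why it caches so well on a CDN or a dumb webserver.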

For true livestreaming, if you’re not going through an established provider, you’re likely working with some kind of hardware or software system to support your streaming protocol(s) of choice. I use Wowza Streaming Engine, which supports the various protocols, and can be set up on your own server or an AWS instance, as I do (though I may eventually switch over to Wowza Streaming Cloud, a streaming-as-a-service offering that might make more sense if my streaming continues to be ad hoc and off-the-cuff).

I set up a Wowza AWS instance, which is usually stopped, except when I am ready to kick off a stream. At that point, I wake it up, get its stream URL, and rewrite the livestream page to embed a <video> tag with the stream URL. At the moment, this does indeed limit me to browsers that handle HLS in the video tag. Hey, I’m learning. Ideally, I would have some sort of multi-browser player that would switch between the video tag, Flash player, etc., to handle a wider variety of browsers.
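The page-rewrite step is just string assembly: given the freshly-started instance’s hostname, build the HLS playlist URL and wrap it in a video tag. A sketch, with the caveat that the port, application name (“live”), and stream name (“myStream”) are assumptions based on a typical Wowza layout — yours will come from your server config.

```python
# Sketch: build the HLS URL and <video> embed for a just-started Wowza
# instance. App/stream names and the port are assumptions; check your
# server's configuration for the real values.
def hls_url(host, app="live", stream="myStream", port=1935):
    return f"http://{host}:{port}/{app}/{stream}/playlist.m3u8"

def video_tag(host):
    return f'<video src="{hls_url(host)}" controls autoplay></video>'

print(video_tag("ec2-1-2-3-4.compute-1.amazonaws.com"))
```

In practice the host changes every time the instance restarts (unless you attach an Elastic IP), which is why this has to be a rewrite step rather than a hard-coded URL.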

So how do you get stuff there? Wowza (and most streaming servers and services) expects to ingest a Flash RTMP connection. So how do you provide this? Some of the all-in-one shops will provide you with simple software that can at least capture from a webcam, and of course the gaming consoles have this built in for streaming to Twitch. To DIY, you need a hardware or software solution of your own. Twitch has a nice support page listing some popular options. The one I’ve heard of lots of people using, XSplit, is not available on the Mac. For Mac users, the options include Open Broadcaster Software (which I have not tried) and Wirecast, which I use.

Wirecast for producing invalidstream

Wirecast is expensive at $500 (which floors me as an iOS developer and my sad little world of users decrying $2.99 apps as a “total rip-off”), but absurdly powerful. In it, you have multiple layers that can be composited one atop another. For each layer, you can set up “shots” from live sources like cameras, static images, existing videos, screen captures, etc. Each shot lets you mix in your audio sources (microphones, system audio, the audio that’s already in an .mp4 or .mov file, etc.). A Wirecast document consists of all the shots you’ve set up, along with output settings. In the output settings, you determine the codec and bitrate to send to the server, and provide a username and password that the app can use to authenticate to your streaming server (something that you’d look up from the streaming server documentation… for example, a Wowza server will generally use “wowza” as the user name and the current AWS instance id for the password). You can have multiple outputs – commonly, one stream to upload, and another you record to disk for a permanent record of your stream (many servers and services can also record on the server side).

So, you build your shots in Wirecast, start the Wowza streaming server, click “Stream”, and off you go.

But what are you streaming?

Encoding Time!

Part of the purpose of my experiments has been purely technical – with my gear, what can I stream, and what does it look like? I’m currently using an Early 2008 8-core Mac Pro, with a Radeon 5770 graphics card. For my internet connection, I have AT&T U-verse: 20 Mbps down, 2 Mbps up.

This puts limits on what I can stream: if I send a 16:9 channel at 360p, the live encoding burns about 60% of CPU most of the time, although I briefly pegged at 100% last night using my iPad as an input and having a lot of onscreen action (while playing Love Live School Idol Festival, of all things). Wirecast’s handling of the iPad as an input device isn’t demonstrably better than my old approach — using AirPlay mirroring to send the video to Reflector on the Mac, and then just screen-capturing that window — so I may use that approach again in the future. This stream is about 960 Kbps on average, so even with occasional dips, I can send a constant stream to Wowza without drops (the stated guidance is to limit your stream bitrate to half your stated upstream bandwidth, so that’s right about where I am). In one experiment, I tried sending 16:9 480p and had some breakup. I initially attributed it to the 1200 Kbps bitrate, but with Wirecast’s CPU usage running in the 80% range, that may have been the limiting factor. Guess we’ll find out once (if?) Apple puts out a new Mac Pro that I think is worth updating to, or if I get desperate enough to roll the dice on an iMac (either way, I’ll have to blow a ton of money on a Thunderbolt HDD enclosure, because nothing Apple sells has enough storage for anyone who’s serious about video, and SSDs aren’t that useful for video work anyways).
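The rule of thumb mentioned above — keep your stream bitrate at or below half your stated upstream bandwidth — is worth making explicit, using the numbers from this post:

```python
# Rule of thumb: stream at no more than half your upstream bandwidth,
# leaving headroom for overhead and the inevitable dips.
def max_stream_kbps(upstream_mbps):
    return upstream_mbps * 1000 / 2

budget = max_stream_kbps(2)       # 2 Mbps up -> 1000 Kbps budget
print(budget)                     # -> 1000.0
print(960 <= budget)              # the ~960 Kbps 360p stream just fits
print(1200 <= budget)             # the 1200 Kbps 480p attempt does not
```

Which matches the experience described: the 360p stream at ~960 Kbps squeaks in under the 1000 Kbps budget, while the 1200 Kbps 480p attempt was already over it before CPU even entered the picture.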

My Wirecast shots include the webcam on myself (obviously), for which I have a couple of small fill lights so that I’m not backlit by the lights in the rest of the room. I also have two screen-recorder shots: one full-screen, and one for a 1280 x 720 area in the middle of the screen for application windows. For this latter one, I set my desktop pattern to show a rectangle exactly that size, as a guide for where to position windows so they land in the capture area (and where other windows may be safely placed without showing up on the stream). Then I have all the other videos for the show as flat files.
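If you want to draw that guide rectangle yourself, the only math involved is centering the capture region on the desktop. A sketch, with the screen resolution as an assumption (say, a 1920 x 1200 display on the Mac Pro):

```python
# Sketch: origin and size of a capture region centered on the desktop,
# for drawing a matching guide rectangle into a desktop picture.
# The 1920x1200 screen size below is an assumption, not from the post.
def centered_region(screen_w, screen_h, region_w=1280, region_h=720):
    x = (screen_w - region_w) // 2
    y = (screen_h - region_h) // 2
    return (x, y, region_w, region_h)

print(centered_region(1920, 1200))  # -> (320, 240, 1280, 720)
```

Feed those coordinates both to your image editor (for the desktop-pattern rectangle) and to the screen-capture shot’s crop settings, and the two will line up exactly.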


Having said that, what’s worth streaming anyway? I’m pulling from different inspirations and seeing what works. For example, the waiting room for a live recording of Accidental Tech Podcast is treated/subjected to playlists of jam bands (maybe it’s all Phish, I can’t tell), so my “waiting room” is of course a shuffle playlist of music from anime. But this being a video medium, I needed more. So I wrote a slideshow app that rotates between “cards” I made in Pixelmator, each with a scene from an anime I like, along with its title and streaming URL. This app runs in the middle of my screen (in a defined “safe space” that Wirecast’s desktop capture picks up), with the iTunes mini-player sitting atop it to show the current song.

Slideshow app with anime stills

Next is the “pre-show”. This was always my favorite part when JesuOtaku (now @ANNJakeH) had a weekly livestream: a curated-but-disparate collection of music videos, trailers, viral videos, and so on. I’ve tried to pick fan-made stuff like anime music videos and Hatsune Miku songs, plus anime trailers, classic ads from Apple and from ’80s video game consoles and home computers, and indie bands. Some of this gets me pretty far into the red zone for copyright violations (remember what I said about tempting fate with Content ID?), so I’ve tried to seek out things where the creators have more to gain from exposure than from cease-and-desist letters. “Better to beg forgiveness than ask permission”, right?

I edit the preroll videos together in advance in Final Cut and save the result as ProRes… not for quality reasons (my live 360p encode crushes any dreams of a nice image), but because ProRes is faster for Wirecast to decode live, and thus puts less strain on the system. Having an H.264 video stutter and break up live was a brutal way to learn this lesson, but I get why it happens. Of course, ProRes files are monstrously large; I record the stream live to ProRes, and it burns about 20 GB per hour at 720p. Not for nothing do I have a 4 TB drive set aside just for video.
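At 20 GB per hour, it’s worth doing the arithmetic on how long that dedicated drive actually lasts:

```python
# Back-of-envelope: hours of 720p ProRes recording a 4 TB drive holds,
# at the ~20 GB/hour rate observed above (decimal TB, as drives are sold).
GB_PER_HOUR = 20
DRIVE_TB = 4

hours = DRIVE_TB * 1000 / GB_PER_HOUR
print(hours)  # -> 200.0 hours before the drive is full
```

Two hundred hours sounds like a lot until you remember that source footage, edits, and renders for the preroll all live on the same drive — it goes faster than you’d think.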

If you haven’t figured it out yet, the point of the invalidstream is basically to mash up the things I’m most into, largely meaning anime and iOS/Mac stuff. Of course it’s not for everyone — that’s why I’m brain-dumping all this info so you can create your own stream — but it keeps it interesting for me and has something of a unifying theme.

So next up is the “Play” segment, which may be the easiest to understand, because streaming video games on the web is such a common thing thanks to Twitch. To keep it of interest to my crowd, I’m largely sticking to iOS games, though I may throw in one or two OS X or classic emulator games at some point. One thing I’m trying to do when I host these is not to rip on stuff — there’s so much garbage on the internet based on the misguided premise that loudly denouncing things is inherently entertaining. As I’m showing stuff off, I’m more interested in explaining and exploring; to see what something is about, what works, maybe what doesn’t, but mostly just to try different things.

Livestream lets play of Tanto Cuore for iOS

After Play, I have a “Build” segment, which is about making things. Mostly this will be Xcode demos, stuff from my conference talks, stuff from the book, stuff I’m playing around with. But I also want to use and talk about the creative professional tools, particularly the video tools I’m using, which is why the second episode has a 30-minute intro to Motion, which at this point may be my favorite Mac app. Anything that gets built on the stream gets put over on the Invalidstream GitHub site under a Creative Commons Zero license, so people can pick it up and use it.

There are other things I’ve wanted to do with the stream, like having a Skype discussion with friends about new anime on Crunchyroll, or spinning long out-of-print and unlicensed shows (I ripped sample episodes from eight years’ worth of DVDs from the defunct Newtype USA magazine for this purpose), but I already feel that the hour or so of content I’ve done for the test episodes is as much as is sustainable on a weekly or bi-weekly basis for me at this point. I do want to create some branding with Motion, like some “now / next / later” bumps, and I still feel like I need to find a way to keep the segments on time, rather than just going as long as they need, so that someone who just wants to drop in at 10:15 for Xcode stuff isn’t stuck watching me play a game for another 10 minutes. But if it’s a little looser and messier for now, that’s not the worst thing in the world.

The other thing I may eventually have to deal with is cost. I think my Wowza instance may still be running an evaluation license, although I also think the version I’m using is deprecated and going away at the end of the month, and that’ll probably force me into $15-50/month of Wowza Streaming Cloud. The other consideration is bandwidth. At around 1 Mbps, my AWS outgoing bandwidth rate means I’m paying about 10 cents per viewer-hour (and this rate is similar with Wowza’s cloud offering). My plan is to try to recapture some of this with affiliate links, which is why each show notes page (like this one) is awash in iTunes and Amazon links (I also have a Wirecast affil link in the sidebar, but I think they kicked me out of the affil program for not having a site up for a year after joining the program). I don’t know if this will actually work — in the last two years, my total iTunes affil earnings are still under $20, though Amazon is significantly better (I think some people buy my programming books through the affil links on this blog’s sidebar, and thank you by the way). If I do more anime stuff, I’d like to become a Crunchyroll affiliate, but we’ll see how that goes (an interesting thought: Crunchyroll allows actual embeds of their shows to certain affiliate sites like Anime Planet… in time, might they play ball with livestreams too?).
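The per-viewer-hour figure comes from simple arithmetic: each viewer pulls the stream for an hour, and AWS bills for the bytes going out. A sketch, with the $/GB rate left as a parameter, since egress pricing varies by region, tier, and year:

```python
# Back-of-envelope on streaming egress cost. The usd_per_gb rate is an
# assumption/parameter -- plug in the current data-transfer-out price.
def gb_per_viewer_hour(stream_mbps=1.0):
    bits = stream_mbps * 1_000_000 * 3600   # one hour of stream, in bits
    return bits / 8 / 1_000_000_000         # decimal GB

def cost_per_viewer_hour(stream_mbps=1.0, usd_per_gb=0.10):
    return gb_per_viewer_hour(stream_mbps) * usd_per_gb

print(gb_per_viewer_hour())   # -> 0.45 GB per viewer-hour at ~1 Mbps
```

The takeaway is that the cost scales linearly with both bitrate and audience — double the bitrate or double the viewers, and the bill doubles too.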

Beyond the Browser

I suppose it’s no surprise if I say that, ultimately, I want to get invalidstream beyond the browser and onto actual TVs. I’ve done Roku development, and as an iOS developer with an interest in video, it’s natural that I should want to get into Apple TV too. Thing is, I don’t want to come at Apple TV from the iOS developer mindset, which is where I already see a rush of attention. I’m wary of the mindset of seeing the TV as being a big iPad with a crappy remote, and just seeing how much current Cocoa Touch knowledge is applicable. I want to come at it from a content point of view, saying “OK, here’s an archive of episodes, here’s how to get to the livestream when it’s on, how do we present this and make it desirable?” There are a bunch of ideas I’m kicking around — instead of tweeting from @invalidstream when I’m about to go live, wouldn’t a push notification be just perfect?

So that’s the past, present, and possible future of the invalidstream. It has pushed me well outside of my comfort zone, but in a good way.
