Brain Dump: Capturing from an iOS Device in Wirecast

So, with the book nearly done (currently undergoing copy-editing and indexing), I’m using some of my time to get my livestreaming plans together. What I’m likely to do is give the “build” section of the show over to working through examples from the book, so those will be archived as video lessons. Then, along with the interstitials of conference updates, fun videos from the anime fan community, and a read-through of the Muv-Luv visual novels, I’ll be doing a bunch of Let’s Plays of mostly iOS games.

I did this with the first two test episodes: Tanto Cuore in Test Episode 1 and Love Live! School Idol Project in Test Episode 2. To do this, I need to be able to capture video from an iOS device and ingest it into Wirecast, so I can stream it.

Over the years, I’ve used different techniques for this, and decided to take some time today to figure out which works best on Wirecast for Mac. So, after the jump, behold the results of this project, plus instructions on how to configure each approach.

To explain what you’re seeing in the video below, I used each approach for playing two iPad games: Stern Pinball Arcade and Soul Calibur. These games give us different things to look at. When the Pinball Arcade camera moves, it’s mostly up and down, and it’s often fixed on the lower portion of the table with only a little movement of the ball and flippers, so that should give us a good look at almost sprite-like movement without needing a lot of keyframes. On the other hand, Soul Calibur’s free-roaming camera follows the action in three dimensions — and with my retreat/attack/dodge style of play, the z-axis gets a workout — so the entire scene is frequently changing in non-linear ways and will tend to demand more keyframes. I also put up some overlays to give Wirecast a typical compositing workload.

The tl;dw (too long; didn’t watch) is that capturing with QuickTime Player and then recording that part of the screen with Wirecast worked best, while directly using the iOS device as a native input to Wirecast is completely unacceptable.

Here’s how each approach works.

Direct Wirecast Capture

This is the most straightforward approach. Create a new shot in whichever layer makes sense for you (typically 3 or greater, assuming you’re saving 1 and 2 for overlays), select “Capture Devices”, then choose the name of your iPhone or iPad (“Ifrit” in this case), which must be connected via a Lightning cable.

List of attached capture devices in Wirecast

Easy peasy. But, sadly, completely unacceptable. I had previously had bad experiences with this approach, and was willing to chalk it up to both my iPad and Mac being out of date. But now I’m using a 9.7″ iPad Pro and an Early 2013 Mac Pro, so there’s really no excuse. With simple movement, there are dropped frames, and with a lot of movement (as with the camera swinging around in Soul Calibur), the video falls apart completely. Also, the audio is riddled with dropouts.

Video tearing with Wirecast direct capture

You might not even know that’s Edge Master (left) and Sophitia (right) if their names weren’t under the life bars at the top.

The poor performance surprised me, as I thought there was an AV Foundation API for doing screen capture from iOS devices as virtual cameras on macOS, but apparently not. I wrote a naive playground to try it out:

//: Playground - noun: a place where people can play

import Cocoa
import AVFoundation

// Enumerate every capture device AV Foundation can see; if iOS screen
// capture were exposed this way, the iPad should appear in the list.
if let devices = AVCaptureDevice.devices(),
    let avDevices = devices.filter(
        {$0 is AVCaptureDevice}) as? [AVCaptureDevice] {
    for device in avDevices {
        print("\(device.description)")
    }
}

And my iPad does not show up in the console output.

So, anyways, scratch that approach. On to Plan B.

QuickTime Player + Screen Capture

While it’s not obviously surfaced in AV Foundation (at least not from my cursory glance above), the Mac can easily get an audio/video stream from a Lightning-connected iOS device, because QuickTime Player does it. Open QTP, go to File -> New Movie Recording, and when the capture window pops up (probably showing your own pretty face from your Mac’s webcam), click the disclosure down-arrow next to the record button to show all available devices; your iOS device will be one of them.

This gets your iOS content on the screen. And, of course, Wirecast has always been able to grab from the screen. Create a new shot, set its input to “Screen Capture…”, and then go into its configuration sheet. Click “Select Window/Monitor” to choose which monitor you want to capture from.

Choosing a monitor for screen capture

Then switch the popup from “fullscreen” to “select screen region” to bring up a GUI where you can specify the region of the screen to be a rectangle around QuickTime’s capture preview window.

Selecting an area for screen capture in Wirecast
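
Incidentally, while Wirecast’s implementation is its own business, this kind of region grab is also exposed directly by AV Foundation via AVCaptureScreenInput and its cropRect property. Purely as a minimal sketch (the crop coordinates are placeholders; aim them at wherever the QuickTime preview window sits):

import Cocoa
import AVFoundation

// Capture a rectangle of the main display, much like Wirecast's
// "select screen region" mode. cropRect is in display coordinates.
let session = AVCaptureSession()
if let screenInput = AVCaptureScreenInput(displayID: CGMainDisplayID()) {
    screenInput.cropRect = CGRect(x: 0, y: 0, width: 1280, height: 960)
    screenInput.minFrameDuration = CMTime(value: 1, timescale: 60) // cap at 60 fps
    if session.canAddInput(screenInput) {
        session.addInput(screenInput)
    }
}
session.startRunning()
// Frames then flow to whatever AVCaptureOutput gets added to the session.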

If you watched (or at least scrubbed around) the video, you saw that this gives great results: the framerate is nice and fast and there’s no video tearing, even when Soul Calibur swings its camera around. This approach also produced the least slowdown on the iPad itself, which is nice for action games, particularly pinball, where flips need careful timing to hit their intended targets.

AirPlay + Screen Capture

My original approach (as seen on some UStream videos I did like four years ago) was to use Reflector to turn the Mac into an AirPlay display device, put the iOS device into AirPlay mirroring mode, and then use the same screen capture technique detailed above for QuickTime Player.

Mirror iOS device to Reflector AirPlay

After that, the technique is exactly the same as in approach 2: create a new shot in Wirecast to capture the screen region containing the Reflector window. Granted, that means they both share the same problem: it’s almost impossible to get the capture range pixel-accurate, so you may end up picking up pixels from other windows or your desktop at the bounds of the capture area.
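
For what it’s worth, the window geometry is at least discoverable programmatically. A hypothetical sketch, using the Quartz window list to fish out QuickTime Player’s exact bounds (the owner name is my assumption; substitute Reflector’s as needed):

import Cocoa

// Ask the window server for on-screen windows, and print the exact
// bounds of any QuickTime Player window -- the rectangle a truly
// pixel-accurate capture region would need.
if let windows = CGWindowListCopyWindowInfo(.optionOnScreenOnly, kCGNullWindowID)
    as? [[String: Any]] {
    for info in windows {
        if let owner = info[kCGWindowOwnerName as String] as? String,
            owner == "QuickTime Player",
            let boundsDict = info[kCGWindowBounds as String] as? NSDictionary,
            let bounds = CGRect(dictionaryRepresentation: boundsDict as CFDictionary) {
            // Note: kCGWindowBounds uses top-left-origin screen coordinates.
            print("Capture region candidate: \(bounds)")
        }
    }
}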

The Reflector approach also suffers from being wireless rather than wired — I have an access point in my office, so this is set up as well as it can be, and it’s pretty good. But looking at the source recordings (which I did in ProRes just to guard against secondary compression artifacts), the QuickTime approach provides somewhat better framerates and image quality than AirPlay.

So, with that, I think I have my approach going forward: capture over Lightning into QuickTime, and then capture that region of the screen. I’m kind of disappointed that direct input into Wirecast doesn’t work better than it does, but it may be that Apple doesn’t offer enough support for Wirecast to do a solid implementation of iOS screen recording as a virtual capture device.

Comments (3)

  1. Chris Adamson

    Thomas Zoechling (@weichsel on Twitter)
    pointed out a step I overlooked for getting iPad screen capture as an AVF capture input on macOS: you need to set the property kCMIOHardwarePropertyAllowScreenCaptureDevices, as shown in a WWDC 2014 session.

    The C code in the slides is a pretty straightforward call into Core Media I/O, but with Swift’s type-safety, it becomes quite burdensome. In the code above, you need to import CoreMediaIO and then add the following:

    
    // Allowing screen-capture devices is a system-wide Core Media I/O
    // property, off by default; setting it makes iOS devices visible
    // to AV Foundation as capture inputs.
    var prop = CMIOObjectPropertyAddress(
        mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
        mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
        mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
    var allow: UInt32 = 1
    CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                              &prop,
                              0,
                              nil,
                              UInt32(MemoryLayout<UInt32>.size),
                              &allow)
    

    What surprised me — although I guess it makes sense — is having to put those type-safe initializers around each member of the CMIOObjectPropertyAddress. Good for the compiler, bad for readability. I’ll probably end up using it in my Forward Swift / CocoaConf talk about using Swift with these lower-level media frameworks.
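
    If I end up toggling this from multiple playgrounds, the noise could at least be hidden behind a tiny wrapper; this is a hypothetical convenience of mine, not anything CoreMediaIO provides:

    import CoreMediaIO

    // Hypothetical helper -- not framework API -- to hide the type-safe
    // initializer boilerplate behind a single call.
    func setAllowScreenCaptureDevices(_ allow: Bool) {
        var prop = CMIOObjectPropertyAddress(
            mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
            mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
            mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
        var allowValue: UInt32 = allow ? 1 : 0
        CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                                  &prop, 0, nil,
                                  UInt32(MemoryLayout<UInt32>.size), &allowValue)
    }

    // Usage: setAllowScreenCaptureDevices(true), then enumerate
    // AVCaptureDevice.devices() as in the post.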

  2. Kevin Lewis

    Being an “old school” video guy and a programmer, I’ve sometimes found that code can get in the way. That said, I’ve had to capture over the years from VGA through HDMI/DVI outputs of various devices. I’ve used higher-end broadcast capture devices, and they are really picky about formats; I had to jump through lots of hoops. However, I discovered the AV.IO HD, a USB 3 capture device that’s compatible with Skype, FaceTime, etc. It captures cleanly and smoothly. Hope that helps.

    • Chris Adamson

      Ooh, neat. I like that the Amazon page specifically mentions using it with Wirecast or OBS. Pricey, but good stuff usually is. And it’s unlikely I’ll ever use my Canopus SD FireWire capture box again. Thanks for the tip!
