
So, How’s That Mac Pro Working Out For You?

It’s been a little over two months since I became that guy who actually bought a three-years-old-out-of-the-box Mac Pro. My reasoning and agony over the purchase have already been detailed on this blog, but now that I’ve used it for two months, let’s take a look back at how it’s working out.

The thing that surprises most people when they see the trashcan Mac Pro is how small it is. Remember, the computer is not even a foot tall (um, 30 cm for those of you in civilized parts of the world), so on my desk, it is barely taller than my speakers, and is actually kind of dwarfed by my microphone and pop filter.

Mac Pro on desk

One of the knocks against the design is that by banishing all the interior drive bays and PCI slots of the old tower format, you are doomed to a snaking collection of cables going to external devices. True enough, but I bought a Thunderbolt 2 drive enclosure and put it down on the floor behind the desk, so it genuinely is a case of “out of sight, out of mind.” No complaint about the aesthetics from me.

Another advantage of the drive enclosure’s bays was that they simplified migration from the 2008 Mac Pro: I just moved my drives over, lock, stock, and barrel, and then ran Migration Assistant from my old boot SSD to the new Mac Pro’s internal drive. And, of course, for working with video it’s nice to have big, inexpensive spinning disks, since random access doesn’t actually buy you much when your data is interleaved, read sequentially, and played back at a data rate far slower than what the drive can sustain.

I got a 28″ 4K monitor that runs at 1920×1080 HiDPI resolution (de facto Retina, if you will), and it is a tremendous relief to my 49-year-old eyes to have text this big and this clear when I’m writing and coding. The surprise was that the HDMI 1.4 port would only drive the display at 30 Hz; getting 60 Hz required a mail-order mini-DisplayPort-to-big-old-DisplayPort cable. But, hey, problem solved.
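
If you’re wondering why HDMI 1.4 caps out at 30 Hz for 4K, it’s bandwidth. Here’s a back-of-envelope sketch, using the nominal published link rates (signaling overhead nudges the real numbers around a bit):

```swift
// Back-of-envelope: why 4K at 60 Hz needs more than HDMI 1.4 offers.
// Raw pixel bandwidth = width x height x bits-per-pixel x refresh rate.
let width = 3840.0, height = 2160.0
let bitsPerPixel = 24.0                                   // 8 bits per RGB channel

let gbpsAt60 = width * height * bitsPerPixel * 60 / 1e9   // ~11.9 Gbit/s
let gbpsAt30 = width * height * bitsPerPixel * 30 / 1e9   // ~6.0 Gbit/s

// HDMI 1.4 carries roughly 8.16 Gbit/s of video data, while
// DisplayPort 1.2 carries about 17.28 Gbit/s; blanking intervals push
// the real requirement somewhat higher than the raw figures here.
print(gbpsAt60 > 8.16)    // true: 4K60 won't fit, hence the 30 Hz cap
print(gbpsAt30 <= 8.16)   // true: 4K30 fits fine
```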

Is it fast? I’m the wrong person to ask. Of course, everything is faster than it was on my 2008 machine, so that’s something I notice when building PDFs of the book or building Xcode projects (with the caveat that building Swift on the Mac Pro is surprisingly slow). But that was always going to be the problem with buying this model in 2017: it performs nowhere near the level a desktop should today, but it’s still an undeniable upgrade from what came before.

If this model had been periodically updated since 2013 (even if just to switch to USB-C and make sure to include the latest Bluetooth and Wi-Fi), nobody would be complaining. A lack of CPU speed-bumping would be forgivable, as the x86 architecture really hasn’t moved forward in that time (though I suppose GPUs are another story). But failure to do even this minimal amount of refreshing is what tells a lot of us that Apple doesn’t care about the Mac anymore.

Aside: I just noticed last night that there are fine patterns of dust in the chutes atop the lid, where the fan pushes out air. It’s no big deal (five seconds with a can of compressed air and it’s clean), just kind of funny to notice.

Lines of dust in Mac Pro fan chutes

Video

The thing that really surprises me is the degree to which I feel suddenly unleashed to work with video. For a couple of years, I had been slowly building a livestreaming site at invalidstream.com, blogging my progress now and then, and decided this winter to kick it into gear. In part, this was because the new hardware removed a key bottleneck: my old Mac Pro nearly topped out the CPU doing the live H.264 encodes (one to stream and one to save locally), even for fairly low-bitrate livestreams. With the new hardware (and, presumably, Wirecast’s ability to use GPU encoding on modern hardware), I rarely break 20% CPU.
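
For the curious, here’s a minimal sketch of what that hardware encoding looks like at the API level, assuming it goes through VideoToolbox the way Mac apps typically do (I have no insight into Wirecast’s actual internals, and the frame size and bitrate below are placeholders, not my real stream settings):

```swift
import VideoToolbox

// Minimal sketch: ask VideoToolbox for a hardware-accelerated H.264
// encoder. Presumably this is the kind of offload that keeps the CPU
// mostly idle while streaming. Error handling omitted for brevity.
var session: VTCompressionSession?
let spec = [
    // Fail instead of silently falling back to the software encoder.
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder as String: true
] as CFDictionary

let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920, height: 1080,             // placeholder stream size
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: spec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,                   // deliver frames via an encode handler instead
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    // Tune for live streaming: real-time priority, modest average bitrate.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_AverageBitRate,
                         value: NSNumber(value: 2_500_000))  // ~2.5 Mbit/s placeholder
}
```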

The other thing that unblocked me here is on the content side. I always wanted to do some teachable material, like coding, but didn’t know if I had enough to work with between my conference talks and just screwing around. But then I decided to just start working through all the sample code in the iOS 10 book, which solved a bunch of problems: it satisfies my need for content on the stream, it will eventually create a complete video record of the book’s contents, and it satisfies my publisher’s need for a promotional plan (plus, by posting on Vimeo and my Twitter accounts, it doesn’t depend solely on Pragmatic Programmers’ existing channels, like their newsletter and magazine).

Combined with the stuff I already knew I’d be doing — “let’s plays” of iOS/Mac games and visual novel read-throughs (which will be Muv-Luv for the indefinite future) — I am now generating nearly two hours per week of video content.

And that’s another reason I need big spinning disks. With just nine episodes in the can, I have generated nearly half a terabyte of data.

Folder info for my livestreams, showing I've used 487GB so far

Largely, that’s a result of my insistence on recording the episodes as ProRes 422, an absurdly high-quality format for the nature of the material. I just have a strong preference for not throwing away quality forever if I can help it. I remember all those times that master recordings have been lost (broadcast videotapes wiped for reuse, film masters thrown away or otherwise lost, etc.), and when storage is fairly cheap, I’d prefer not to irrevocably degrade or destroy my media. Plus, writing ProRes instead of H.264 should be less stress on Wirecast while I’m streaming. And at any rate, as my 4TB drive fills up, I can always dump those episodes to bare drives for archiving.
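
If you like, you can check the back-of-envelope math. A hedged sketch, assuming Apple’s published target bitrate for ProRes 422 at 1080p29.97 (my actual capture settings may differ):

```swift
// Back-of-envelope storage math for ProRes 422 recordings.
// Apple's published target for ProRes 422 at 1920x1080, 29.97 fps is
// about 117 Mbit/s (actual rates vary with the content).
let bitrateMbps = 117.0
let gigabytesPerHour = bitrateMbps / 8 * 3600 / 1000   // ~52.7 GB/hour

// 487 GB across nine episodes is roughly an hour of ProRes apiece,
// and a 4TB drive holds on the order of 75 hours before archive time.
let hoursOnDisk = 4000.0 / gigabytesPerHour            // ~76 hours
```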

Production Notes

If you’ll indulge a little navel-gazing, I want to note the pieces of the weekly broadcast, and how the Mac helps me create them.

Preroll

When podcasts record live, they often put some audio on the stream before the actual recording begins, so early arrivals know the stream is working. For example, Accidental Tech Podcast features songs from Marco Arment’s Phish collection (or, quite possibly, a single Phish jam that started in 2005 and is still going). The same principle applies for video livestreams, except that you need to provide an image too. Some streams will just put up a test pattern or a logo. Owing to my insane “iOS/Mac + anime” credo, I actually wrote a Core Image slideshow app (source is on GitHub) that cycles every 10 seconds through a set of 100 cards I created, each with a scene from an anime and a URL where you can legally stream it. Over this plays a 100-song playlist of anime theme songs (all legally purchased, most from iTunes), with the iTunes miniplayer at the top right of the screen. Here’s what that looks like:

Preroll anime card
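
The actual source is on GitHub, as noted, but the general shape of the app is simple enough to sketch. Here’s a hypothetical minimal version (not the real app’s code): a Timer advances to the next card every 10 seconds, and Core Image’s CIDissolveTransition supplies the cross-fade:

```swift
import Foundation
import CoreImage

// Hypothetical minimal version of the preroll slideshow (not the real
// app's source): cycle through the cards every 10 seconds, cross-fading
// between them with Core Image's dissolve transition.
final class PrerollSlideshow {
    private let cards: [CIImage]          // the 100 pre-rendered cards
    private var index = 0
    private var timer: Timer?
    var onFrame: ((CIImage) -> Void)?     // hand finished frames to the view layer

    init(cards: [CIImage]) { self.cards = cards }

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 10, repeats: true) { [weak self] _ in
            self?.advance()
        }
    }

    private func advance() {
        let current = cards[index]
        index = (index + 1) % cards.count
        let next = cards[index]

        // One mid-dissolve frame for illustration; a real implementation
        // would step inputTime from 0 to 1 to animate the transition.
        let dissolve = CIFilter(name: "CIDissolveTransition", parameters: [
            kCIInputImageKey: current,
            kCIInputTargetImageKey: next,
            kCIInputTimeKey: 0.5
        ])
        if let frame = dissolve?.outputImage {
            onFrame?(frame)
        }
    }
}
```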

Preshow

Inspired by livestreamers before me (most obviously Jacob Chapman’s livestreams as “JesuOtaku” from some years back), the preshow is a collection of different fun videos from various sources: mostly trailers, vintage Apple and ’80s video game ads, and anime music videos (AMVs). AMVs are fan-made videos that put anime footage atop popular music or other audio for artistic and entertainment effect, something Lawrence Lessig himself identified as a perfect example of remix culture (and which, nevertheless, would be immediately shut down by copyright bots that recognized the music, which is one of many reasons I use my own streaming server instead of something like YouTube). I edit the preshow in advance in Final Cut Pro; I’m barely scratching the surface of what FCP can do, but it’s straightforward work and allows me to balance disparate audio levels and smooth out harsh visual transitions.

Editing preroll with Final Cut Pro

Main segments

The show currently has three main segments: Build, Play, and Read (my idea was always to break these out as generic concepts that could be represented as verbs, so I could change formats and perhaps have “Discuss” or “Watch”). “Build” could be any kind of creative work (in test episodes I did Xcode and Motion), but for the time being, it’s going to be all coding examples from the book for reasons listed above. “Play” is a “let’s play” style presentation of playing a game with running commentary, usually something on iOS. “Read” is where I work through long visual novels, like the current project Muv-Luv Unlimited. All of these are broadcast in Wirecast by simply capturing a portion of the screen and, in some cases, mixing in a lower-left box shot of me from the webcam.

Coding demo from invalidstream

The iOS games are captured over a Lightning cable via QuickTime Player (for reasons discussed earlier), and Muv-Luv is captured from Parallels running Steam in Windows 10, because there is no way it would ever be allowed on any Apple platform (no boobs permitted in Steve’s walled garden!).

Interstitials

At two points during the live show, I run videos that list upcoming events: after the coding segment, I run a list of upcoming Mac/iOS developer conferences, and after the gaming, I run one of upcoming anime conventions. These are created with Motion, and I did a test livestream a while back showing how I build them.

For the very beginning of the show (after the preroll, but before the Build segment), I build a “now / next / later” bump, obviously in the style of Adult Swim’s “Toonami” block, showing the rundown for the evening. I also post this on the @invalidstream Twitter a few days before the show. Again, this is built with Motion, an app I enjoy the heck out of using.

Summary

I wouldn’t have gone weekly with this livestream if I didn’t think I could handle the production pace. But even before my day job evaporated, I’d found a routine that appeared to be sustainable: build the preshow and now/next/later videos on an evening early in the week; go over the book code I want to cover; stream live Friday night after the kids are in bed; sometime Saturday, go back into Final Cut Pro and cut videos of the three main segments (just adding a title card), which I upload to Vimeo overnight Saturday into Sunday; and then write the show notes page on Sunday, adding iTunes and Amazon affiliate links where I can, in desperate hope of losing less money on this fool’s errand than I already am. It seems sustainable to take it in little chunks like this, and while the audience is practically zero at this point, I’m willing to give it some time and see what develops.

Moreover, it serves as an example of how much you can do with a single computer and a mixture of Apple’s Pro apps (Final Cut Pro, Motion), third-party apps (Wirecast), and a little bit of my own work (the preroll slideshow). And when Tim Cook talks up the iPad by asking why anyone would want a PC… I don’t see any way that this would be possible with iOS anytime soon, if ever — the need for massive storage, heavy CPU load, external video devices, a Windows emulator, etc., are outside the realm of iOS’ competencies and in some ways directly opposed to it. This is a job for a pro Mac.

Which, expense be damned, is why I bought a Mac Pro.


Comment (1)

  1. Kevin

    I used to handle full seasons of broadcast NCAA basketball games for a university (Men’s & Women’s; home, away, and post season). I would record them off the air as ProRes (yes, I know they are essentially H.264, but I didn’t want to add that next round of copy loss). Anyway, when we got done for the year, highlights edited and such, we’d dump the entire game as MPEG-2 via Compressor. The space savings vs. conversion time was very, very acceptable. And even 10 years ago, the conversion back from MPEG-2 to ProRes to edit in FCP7 was quite, quite acceptable and non-stressful, even in time crunches. Even today, the people who took over for me are still using it for games that they now get as raw feeds from the broadcast truck, using a Ki Pro.
