
Archives for: August 2012

Warming up the Invalidstream

So, that networking problem I was asking about on Tuesday had an easy enough fix: a thread on the Tom’s Hardware forums offered a straightforward recipe for converting a wifi router into a second wifi access point on an existing wired network, which is exactly what I needed.

And what I needed it for was to get a strong enough wifi signal to reliably carry an AirPlay stream from my iPad to my Mac, so that I can then screen-capture the stream and livestream it.

Last night — in part as a tech test, as well as part of my vow to completely disengage from politics this year (and thus timed precisely to coincide with Mitt Romney’s speech to the Republican National Convention) — I did a two-hour livestream in which I played through all 12 tables in the Pinball Arcade for iPad, with narration, critical analysis, bald blather, or whatever else you’d care to call my beer-powered, pinball-distracted speaking.


Reflecting and streaming

I’m testing the viability of a project I’d like to launch soon, one element of which involves doing an internet livestream of iPhone and iPad apps. Not from the simulator, but real production apps running on real devices. And I’m going to need some advice from you people who know your networking.

The various streaming services make it possible to stream your desktop easily enough — so far I’m really impressed with UStream Producer, and I wonder why the kids I see livestreaming are all using Justin.tv or Twitch instead of UStream or LiveStream (is it the cost? Streaming Media ran a nice comparison by Jan Ozer of the different streaming platforms an issue or two ago). So that means you just have to get the iPhone output onto the Mac’s screen, and there you go.

So how do you do that? Amit Agarwal has a nice roundup of the options, saving the best for last: AirPlay. Using Reflection or AirServer, you can mirror the screen from a suitably new iPhone or iPad to a window on your Mac.

I evaluated the two and eventually bought Reflection, based on what seemed like much superior audio performance. But still, it seemed glitchy, and tended to freeze frame frequently. Reflection has a handy recording feature, so here’s what that looks like when playing the action game Deathsmiles:

Note that this video is somewhat deceiving in that the audio gets at least 10 seconds out of sync in the recorded version (you can hear the boss battle long before you can see it), which didn’t happen while I was playing. Nevertheless, the point is that this kind of freeze-up glitch would be totally unacceptable to livestream viewers.

But AirPlay works great on my AppleTV, so what’s the deal? Before hanging the problem entirely on Reflection, I played another round much closer to the wifi access point. Flawless. The video I mean… my gameplay leaves much to be desired:

The size and compression on this mp4 are a little different… the point is that the video doesn’t freeze as long as I’m close enough to the wifi access point.

So, the problem is that when I play down in my office, the wifi signal gets weak (I sometimes see 2 or even 1 bar), and that presumably drops below a threshold that’s practical for AirPlay. How do I fix this? This is where I need help. Here’s a simplified look at my home network:

The DSL (U-Verse now, I guess) comes in on the first floor on the left, behind a TV, where it connects to a D-Link router and wifi AP (shown as an AirPort base station in this picture; it uses network 192.168.2.x). I don’t use the U-Verse gateway’s built-in wifi because the D-Link has much better range, which is important for getting a strong signal to the Mini on the top floor, which was not pre-wired for ethernet. We have ethernet (cat 5e) on the first floor and in the basement, so one of the ethernet cables coming out of the D-Link goes down to a switch in the basement, and one of those connections goes to a switch in my office (shown with the Power Mac… hey, old stencils on Graffletopia, OK?).

So… how do I get a better wifi signal in the office? I have ethernet down here, so I’ve generally not really cared about wifi being any better than adequate. I have old wifi routers lying around, so I could easily set up another wifi network in place of the switch in the office, but then I’d be on a different network (192.168.3.x, I guess), and so I’d be cut off from the rest of the house, which hoses me out of my Subversion repository (on the Mini) and everyone else out of the color laser printer (in my office).

Another option would be a wifi repeater, I guess, but where to put it? On this floor, but directly under the wifi AP? Or just in the office itself? Is it going to make enough of a difference to be worth it?

Or is there some other kind of device I don’t know about, one that extends an existing wifi network via ethernet (so that the new AP, presumably down in my office, isn’t in the business of handing out DHCP addresses, but instead passes DHCP through over ethernet to the router on the first floor)?

Bounds on this problem: there’s no point spending more than $200 on a solution, because at that price-point, I could hard-wire the connection from the iPhone/iPad with VGA or HDMI output and either a VGA2USB or a BlackMagic Intensity Pro.

Comment section is below, or find me as @invalidname on Twitter and App.net. Thanks in advance for any suggestions you guys and girls care to offer.

Fall Conferences 2012

OK, quick conference update:

  • I’ll be at CocoaConf Portland from October 25-27, doing the all-day Core Audio workshop that went over quite well (and was well-attended) in Columbus. There’s also enough new Core Audio stuff in iOS 6 to be worth a session of its own, but of course we can’t talk about it until the NDA drops (and no, it’s not just the stuff in the WWDC session).

    I’ll also be doing the HTTP Live Streaming talk again (link goes to Columbus site), although the feedback on that from both DC and Columbus has been weirdly demanding of more client-side iOS code, despite the fact that getting a player going is like a 5-liner with the MediaPlayer framework (there’s a sketch of what I mean after this list). The things you need to learn about HLS are in the encoding, tooling, and deployment.

    Also Reverse Q&A again, which went better in Columbus than in DC. It works best when the panelists drive the conversation and are quick to follow up, and we don’t let it turn into an App Store policy bitch-fest, since those things are endless.

  • CocoaConf Raleigh… not sure. Probably? Anybody in the South or on the Eastern Seaboard up for a day of Core Audio?

  • And then there’s CodeMash 2013. I’ve had somewhat mixed feelings about CodeMash. A tweet from walterg2 in June asked if I’d do the HTTP Live Streaming talk again at CodeMash. I replied that CodeMash has never gone for media topics in a big way — none of mine have ever been accepted, nor have I ever seen any audio or video stuff on their program — and that CodeMash seems to treat iOS as something of a red-headed stepchild. Nevertheless, I got a reply from CodeMash organizer and colleague Dianne Marsh suggesting I submit it.

    That led to an offline discussion about whether CodeMash even wants iOS content. Last year, iOS — despite being the mobile platform that everyone seems to have spent the last five years chasing — was a third (and, frankly, the least of the three) of a mobile track that also featured Windows Mobile (note that Microsoft is a top sponsor of CodeMash) and Android (presumably more palatable to the sizable Java crowd that CodeMash always draws). And as a colleague noted, CodeMash seems eager to take these sorts of “how to write iPhone apps without using Apple’s tools or frameworks” talks, which is at best a backhanded compliment, if not a disavowal of the platform. Taken together, I had to ask myself if I and my core platform were even wanted there anymore; I don’t want to overstay my welcome if I’m no longer compatible with where CodeMash wants to take their content catalog.

    What I heard back from CodeMash is that they’re not deliberately dissing iOS or OSX, but that they have received few submissions for iOS/OSX talks, and taking the best of those only amounted to 2-3 talks last year.

    So, that’s why I’m putting this out there… are there other Midwestern iOS/OSX people who’d like to bolster the platforms’ presence at CodeMash? Considering CocoaConf had enough content for a three-track conference down in Columbus, it seems like there must be plenty of potential speakers who could spare a couple days for a January conference at an indoor waterpark. Submissions opened this week, explicitly list “iOS” (and “iPhone/Mac”[?]) as a topic, and are open through September 15.

    So for the time being, I’m going to take CodeMash at their word in saying that they’re interested in iOS/OSX talks and pitch them some new stuff. Maybe I’ll even propose an audio/video/streaming talk. One of these years, they just might bite…
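
A footnote to the HTTP Live Streaming item above: here’s roughly what I mean by getting a player going being a five-liner. This is only a sketch, written in Swift for brevity, using the MediaPlayer framework’s MPMoviePlayerController; the playlist URL is a placeholder and error handling is omitted.

```swift
import UIKit
import MediaPlayer

// Minimal sketch of an HLS client: hand MPMoviePlayerController the
// playlist URL and it takes care of fetching segments and playing them.
// The URL below is a placeholder; point it at a real .m3u8 playlist.
class StreamViewController: UIViewController {
    let player = MPMoviePlayerController(
        contentURL: URL(string: "http://example.com/stream/prog_index.m3u8")!)

    override func viewDidLoad() {
        super.viewDidLoad()
        player.view.frame = view.bounds          // fill this view controller's view
        player.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(player.view)
        player.play()                            // starts buffering and playback
    }
}
```

That’s the entire client side, which is why the talk spends its time on encoding, tooling, and deployment instead.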

Bittersweet Ending

Somewhere between the Happily Ever After and the Downer Ending, the Bittersweet Ending happens when victory came at a harsh price, when, for whatever reason, the heroes cannot fully enjoy the reward of their actions, when some irrevocable loss has happened during the course of the events, and nothing will ever be the same again.

So, Apple wins big in their patent case against Samsung, and reactions are pretty much not what you’d expect. While the Fandroids console themselves with a straw-man claim that “Apple patented rounded rectangles”, writers in iOS circles are hardly delighted. Pre-verdict, Matt Drance wrote “However the verdict falls, I feel like there are no winners here in the long term — certainly not us.” And a day after the verdict, John Gruber’s Daring Fireball hasn’t even mentioned the outcome.

Matt’s concern is giving Apple too much power to control the market, and the verdict likely does that. Following along with The Verge’s liveblog, I noticed that a few devices were found to be non-infringing. That’s got to be even worse news for Samsung and Google, because the jury is effectively saying that it is possible to make smartphones without copying Apple, and that Samsung largely (and willfully) chose not to. Combine this with the speed at which the jury reached its conclusions and it’s utterly damning.

And yet, on the other hand, what we’re discussing is patents like “slide to unlock”, which many (if not most) of us think is unworthy of patentability in the first place. And that’s what makes this so uncomfortable: Android’s and particularly Samsung’s copying of Apple was egregious and shameless, but since that itself is not illegal (and how could you even codify that as law?), does settling for a victory over stuff that probably shouldn’t even be patentable count as a victory at all? Making things worse, the jury had the option of invalidating patents on both sides, and declined to do so on every count.

Then again, what do I know? I thought ripping off the Java programming language and practically the entire API of Java SE was a lot worse, but the court said that was OK. So I guess stealing is bad, except when it’s not.

Yay?

My CocoaConf Columbus file-dump

I spent too much time away from sessions (readying my sessions, revising the book for iOS 6) at CocoaConf Columbus to do my usual “What You Missed…” style blog entry. I guess you could say I missed it too. But at any rate, I still owe the attendees some files.

Everyone in the all-day Core Audio tutorial should already have the five sample apps that we built in class, and a fallback zip was given to the group first thing in the morning, but if you somehow missed it, here you go:

The HTTP Live Streaming talk is little changed from its CocoaConf DC version, but for the addition of a few new slides that show off more HLS apps (like NBC’s Olympics apps) and some off-the-shelf encoding boxes that’ll set you back $25,000. Here are the updated slides; sample iPad code is the same as DC’s:

Twice now, I’ve tried to show actual live broadcasting, and been thwarted both times. There’s actually nothing in Apple’s toolchain that provides the MPEG-2 transport stream needed by the mediastreamsegmenter tool. There is a trick on Stack Overflow that gets a seemingly suitable stream out of VLC, but while the .ts segment files and the .m3u8 playlists are created and look correct, QuickTime X is unable to open the resulting stream. It doesn’t help matters that VLC can only capture video from the webcam and can’t do audio capture, nor can it use screen:// as an input in Lion or later, which breaks Erica Sadun’s HLS screencast trick of a few years ago.
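
For reference, the kind of sliding-window live playlist that mediastreamsegmenter writes looks something like this (the durations and segment names below are illustrative, not copied from my test run):

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:240
#EXTINF:10,
fileSequence240.ts
#EXTINF:10,
fileSequence241.ts
#EXTINF:10,
fileSequence242.ts
```

A live playlist keeps rolling forward and never gets the #EXT-X-ENDLIST tag that a finished video-on-demand playlist would end with. If the playlists and segments really are well-formed, that points the finger back at the transport stream VLC is handing to the segmenter.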

I’m going to do the all-day Core Audio tutorial and the HLS talk again at CocoaConf Portland in October, possibly something new too. Hope to see some west coast media developers out there.