Wistia Presentation Syncing
Summertime is play time! One of the things I’ve been playing with lately is a new video hosting service called Wistia. For a relatively low monthly cost, we can store lots of videos at a site with more privacy controls than YouTube. It also has some pretty nice features regarding user engagement and analytics.

The folks at Wistia have some really good (if somewhat goofy) videos on some production basics as well as a ton of other resources in their library. Like all good companies, you also need a space to experiment, and that’s where Wistia Labs comes in.

The example above shows their Presentation Sync feature from the labs. There are two components. First, it uses a service called Speaker Deck via their API: you upload your slides as a PDF and they convert them into an embeddable slide presentation. The second component is a video you have already uploaded to Wistia. Wistia's Presentation Sync then takes care of matching the two together. It's pretty easy to do, and the whole synchronized presentation can be embedded in a WordPress post (like this one). This isn't a new idea, but their code builder makes it pretty seamless.
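Under the hood, a presentation sync boils down to a list of slide-change timestamps checked against the video's playhead. Here's a minimal sketch of the idea (the timing data and function names are hypothetical, not Wistia's actual API):

```python
import bisect

# Hypothetical sync map: at each timestamp (in seconds), switch to this slide.
SYNC_POINTS = [
    (0.0, 1),    # title slide at the start
    (42.5, 2),   # advance to slide 2 at 0:42.5
    (97.0, 3),
    (180.3, 4),
]

def slide_at(playhead_seconds):
    """Return the slide number that should be visible at a given playhead time."""
    times = [t for t, _ in SYNC_POINTS]
    # Find the last sync point at or before the playhead.
    i = bisect.bisect_right(times, playhead_seconds) - 1
    return SYNC_POINTS[max(i, 0)][1]

print(slide_at(0.0))    # 1
print(slide_at(100.0))  # 3
```

The player just calls something like `slide_at()` on every time update and swaps the embedded slide when the answer changes.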

Two last points unrelated to Presentation Sync: remember to check the equipment bag to see that all of the parts are there – and no, I don't always wear the same shirt when I do my videos.

A Follow-up

I’ve been meaning to post this for a while. At the moment, I’m taking some needed days off, but I’m happy to say that “Andyland” is back online and the above video is submitted as proof. It seems like I blinked an eye and this CRAZY semester ended. Now it’s summertime and the livin’ . . . well, it’s not easy because there’s a ton of work to do. DTLT is also losing a couple of key players in Ryan and Jim, so part of the summer scramble will include search committees.

I have a bunch of summer projects to put the polish on some of the ITCC facilities, and I’m going to try to blog about them. However, my vacation days are ending and I’m missing my production space. So I’m posting a video to remind me.

A Setback

“I’ve lost all my work!” That was the first thought in my head as I found out that the Media Editing Suite, affectionately called “Andy Land”, had been flooded with water from the ceiling due to a … well, I still don’t know the cause really. I’ve listened to words come out of people’s mouths, but I can’t make sense of it. Frozen and thawed pipes. HVAC units failing. Antifreeze. The cause is immaterial. The incident has caused countless people headaches and almost criminal wastes of time.

So while waiting for a backup to occur on the Mac Pro that was in the room – it was still running when the Building Manager, Cartland, opened the door and then kicked off the power – I created a therapeutic video. Nothing special, just some footage that I was able to recover from the Blackmagic 4K camera that I had used just six days before. Yeah, you heard right. I was able to back up the hard drive on the Mac Pro, which I turned on today and got connected to some of the other equipment that I was able to get running.

My work, for the moment, is safe. I wasted a good amount of time the last couple of days. We had visitors from Michigan State that I would have liked to spend more time with. Other UMW personnel have lost some of their work time as well. If it wasn’t for Cartland coming in on his day off, the flood may have been discovered a lot later. DTLT folks have been displaced from their ITCC work spaces – only 7 months old – obviously the building isn’t able to walk by itself yet.

It’s been bittersweet this new building of ours. Student love the space. It’s a roaring success in terms of filling a need. I continue to hope that it remains worthy of the talented students we have at UMW. But Tuesday, February 17, 2015 was a setback for that hope. Next we’ll talk about who’s fault it was and who pays for the damage and what’s “covered”.

As I said on Twitter, “it’s just stuff”. But it’s also a representation of work, and not just mine. The people who helped order and receive the stuff. The people who delivered and even set up the stuff. We do our work to get paid, mostly. Some of us are lucky enough to work at things we’re passionate about. Sometimes we have to work on things we shouldn’t have to.

Our work is beyond the machines. Sometimes it’s inside of them, and we work to keep them working so that they can help us create more of our work. If the machines stop working, we can lose our work. Bits and bytes dried up and blown away. Or flooded and washed away with corrosive water. Something is keeping those bits of my work alive, and I’m grateful to whatever (and whoever) it is.

Now, can we please get back to work?

Gotchas in the 4K/Retina World

gotcha

Vexing problems are sometimes set aside to deal with at another time.

Let’s begin at the beginning. Since moving into the ITCC I have had the pleasure of using some 4K monitors (these Sharps – yes, plural!) on the new Mac Pro in my video editing room. And yes, I am completely spoiled now. However, you may wonder how a 4K monitor fits into a desktop setup. I have another post brewing about 4K in general, but 4K resolution brings with it a new term coined by Apple – Retina displays.

So how does a Retina display figure into this post? Well, one of the problems (believe me, the benefits outweigh the problems) is that some programs don’t know how to handle Retina, or HiDPI, mode. HiDPI mode essentially takes a high resolution and squeezes it into a smaller effective resolution – in this case, squeezing 4K (3840×2160, or 2160p) down to an effective 1080p desktop.
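To put numbers on that squeeze, here's the scaling arithmetic as a quick sketch (nothing Apple-specific, just the ratios):

```python
# HiDPI mode: the panel renders at its full native resolution, but the
# desktop is presented at a lower "effective" resolution so UI elements
# stay a readable size.
native = (3840, 2160)     # the 4K panel (2160p)
effective = (1920, 1080)  # what the desktop "looks like" (1080p)

scale = native[0] / effective[0]  # the backing scale factor
print(scale)  # 2.0 -- each logical point is drawn with 2x2 physical pixels

pixels_per_point = scale ** 2
print(pixels_per_point)  # 4.0
```

That 2x factor is why everything looks so crisp: each on-screen element gets four physical pixels for every one it would have had on a plain 1080p display.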

One example of a program that doesn’t handle HiDPI correctly is MPEG Streamclip, one of our favorite free programs we use to manipulate video. I wanted to do a screencast on the Mac Pro about MPEG Streamclip and it wasn’t behaving properly in HiDPI mode. The playbar was split and a small slice, including the “play” button, was off to the right, like this:

MPEG Streamclip split

I then had two choices. I could do the screencast at 4K resolution, which isn’t a good choice at this point in time (just trust me), or I could record it on another machine that isn’t using a Retina display. I wound up using a 21″ iMac that has a native resolution of 1080p (1920×1080). The resulting screencast is on YouTube.

The MPEG Streamclip/HiDPI problem was put on the back burner, but I eventually wanted to research if/how I could do screencasts with these problematic programs on my new Mac Pro (actually UMW’s new Mac Pro).

Today I was doing some clean-up on a website that is being resurrected – the Digital Media Cookbook site (yet another post is brewing about that). I was using a program called Image2icon. I wanted to create a new “favicon” for the site and knew that the “Pro” version of this program (it’s $4.00 if you’re interested) would do it. However, it wasn’t working. On their support page, an FAQ entry describes an issue the program has on Yosemite, the latest Mac OS, and the fix: enabling the program to “open in low resolution” mode. After that, Image2icon created my favicon without a hitch.

This got me thinking: is there an “Open in Low Resolution” checkbox for MPEG Streamclip? Take a look below to see the answer:

MPEG Streamclip Get Info

You right click on the program in the Applications folder and choose Get Info, then click the checkbox for “Open in Low Resolution”.

Two problems solved in one day! And now I don’t have to use another machine for screencasting. I can use HiDPI mode on any machine, including my home machine which coincidentally has a Dell 4K monitor (I got it as a Christmas present) that allows me to use HiDPI mode as well!

Gotcha squashed!

creative commons licensed ( BY-NC-ND ) flickr photo shared by Chad Horwedel

Iterating the Kit

media production

Transition. It can be fun. Frustrating. Overwhelming. Invigorating. All those and many other things. For me it’s always about the next step. No one I know is more critical of me than me. One of the “criticisms” I hear – in a total teasing, smack-down, weirdly motivating and one-upsmanship way – is that I, and other DTLTers, should be blogging more. And it’s true. We’re all doing good work and we should be sharing what we’re doing. Especially now.

So in that vein of sharing, I will start with an update to something I’ve gone back to the well on many times – The Kit.

For those of you who don’t know what “The Kit” is, it’s basically a live video stream setup that is as compact and portable (and inexpensive) as possible. At the time I first gave it the name, The Kit was a backpack with a Mac laptop and a Canon video camera. The only thing that didn’t fit in the backpack was a good, sturdy Manfrotto tripod. What made the live-streaming all work was a piece of software on the laptop called Wirecast. The limiting factor, however, was that multiple cameras were difficult (not enough “inputs” on the laptop – FireWire only goes so far). That, combined with the on-the-fly encoding and streaming that Wirecast was doing, would tax the computer well beyond its CPU capabilities. Reliability was an issue at times, especially if we wanted to push the capabilities (something we do in DTLT all the time).

The Kit made its debut in 2011 and remained largely composed of that basic set of components – until we began to get ready to move into the ITCC. As part of the research I was doing into the production spaces that would be a part of the new building, I came across what were essentially hardware solutions to what the software-based Kit provided. In many ways the production spaces of the ITCC are relatively expensive. I still wanted to maintain a relatively inexpensive “Kit”, and of course, keep it portable.

Many weeks ago, Jim Groom asked me if I would provide the live-stream for the 2nd Edition of the Open VA Conference in Virginia Beach. The main reason I said “absolutely” was to meet the challenge of the next version of the Kit, and to provide a multi-camera setup. I’ll be honest, a side reason I said yes was the “beach” part 😉

So enough of the reasons, let’s get to what the Open VA version of the Kit looked like. Here’s a fast and loose video I put together of what I envisioned I would use, and following that a list of the individual equipment used (the actual implementation was scaled back a bit):

Rack

  • Blackmagic Design (BMD) ATEM Television Studio – This is the heart of the system. More about this item below.
  • BMD Hyperdeck Shuttle Pro – records to SSD hard drives in ProRes format
  • Furman Power Conditioner – provides clean power
  • Nady 6-Channel Rack Mount Mixer (not currently used – don’t know if we ever will)
  • Includes 4U Case, AV cables (HDMI & SDI), ethernet cable

Camera Setup

  • Asus HDMI monitor – used for “Multiview” (Program/Preview monitor).
  • Canon Vixia HV40 camera (2) – generally any HDMI camera will do – the HDMI cameras are cheaper than any SDI model which start at over $2000.
  • BMD Mini-Converter HDMI to SDI (2) – we have to use these to get the signal converted into SDI for long runs. HDMI cables over 10’ or so just don’t work in this case.
  • Manfrotto tripods (2) – Model 055XPRO B w/ 701HDV head – classic solid tripods
  • HDMI cables (2) – out from camera into mini-converters.
  • Power Strips (2) – power the cameras and mini-converters.

Other Equipment

  • Backpack – Carry cameras, power supplies, converters, etc.
  • Computer (and software) to control the switcher interface – we use a MacBook Air or Pro, but it can be a Mac or Windows machine. The computer connects via Ethernet to the ATEM switcher.
  • We also have Photoshop on this computer, which provides on-air graphics editing that can be exported to the ATEM Media Player.
  • Ethernet cable – for above connection.
  • MagSafe Power adapter for Macbook Pro
  • Mac Mini – for “computer source” images (slides, web pages, video, etc.)
  • TP-LINK Wireless router – I set up an ad-hoc network for the ATEM. Various devices can connect to control the switcher, such as iPads, iPhones, other computers.
  • Live Stream Hardware – we use a Teradek VidiU w/Ethernet cable & HDMI cable – It’s $700, but it makes it so easy to stream to various CDNs like YouTube. Software solutions exist as well, such as Wirecast.
  • DVI cable –  we used this to get a direct view to Mac Mini. Hard to see detail on Mac on the “Multiview” monitor.
  • Mackie 402VLZ4 4-Channel mixer – great small footprint mixer that takes room audio in then we go out to cameras.
  • Stereo RCA to 3.5mm – we run this cable from the mixer to the Canon Vixia to provide “system” audio (it runs via HDMI/SDI to ATEM switcher). Obviously individual channels direct in would be better. This requires separate interface hardware to do analog to digital (AES/EBU) for ATEM Television Studio. A whole other conversation.

The Blackmagic Design ATEM Television Studio is one of those unique pieces of hardware that has almost a cult following, and I think with good reason. It is a 6-input (HD) switcher for less than $1000. Nothing else can touch that price point. It has so much built into such a small footprint, and the quality is outstanding. Keep in mind when you buy one of these, there are pieces that need to be added on if you get into a complex production, which the multiple inputs beg you to do. If you’re using only one camera, there’s really no need for the ATEM. If you’re using two cameras, and you have them in close proximity to the switcher (using HDMI cables of less than 10 feet), the ATEM is ideal. With more cameras, or cameras at a greater distance from the switcher, you need to use SDI, and hence you need to convert the HDMI camera’s signal to SDI. The Blackmagic Design HDMI to SDI Mini-Converter does the trick here. For about $300 it turns an inexpensive camcorder into a serviceable production camera. Inexpensive camcorders don’t have the power to send an HDMI signal long distances (again, we’re talking over 10-15 feet), so SDI must be used.
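That cable-run rule of thumb can be boiled down to a tiny decision helper – a sketch only, since the 10-foot threshold is ballpark experience with cheap camcorders, not a spec value:

```python
def signal_plan(run_feet, hdmi_limit_feet=10):
    """Suggest how to get a camera's signal to the switcher for a given cable run.

    Inexpensive HDMI camcorders can't push a signal reliably much past
    ~10 feet, so longer runs need an HDMI-to-SDI mini-converter at the
    camera end and an SDI cable back to the switcher.
    """
    if run_feet <= hdmi_limit_feet:
        return "HDMI direct to switcher"
    return "HDMI to SDI mini-converter at camera, SDI cable to switcher"

print(signal_plan(6))   # short run: HDMI is fine
print(signal_plan(50))  # long run: convert to SDI
```

In practice that second branch is exactly the Open VA setup above: HDMI out of each Vixia, into a mini-converter, then a 50-foot SDI run to the ATEM.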

I’ll talk more about how I’m using this kit, as well as future iterations, but here’s an example of one of the live streams at Open VA (featuring several of my DTLT colleagues):

Fun with YouTube Subtitles

weather is a network

I had a bit of unexpected fun yesterday. One of the things (on my long list of things) to explore this summer is closed captioning (subtitling/transcribing) videos and getting a manageable workflow going. As we begin the Fall semester in about 6 weeks, I want to have a plan for implementing transcriptions as a part of the many videos that we will begin to produce in the new building (you know that ITCC thing I keep talking about?). I’m working on that workflow and hope to have recommendations soon.

Meanwhile, I was playing around with the YouTube Closed Caption tool. It looks to be a great way to start the process of getting automatic transcriptions for video, although, as it is the subject of this post – it’s not perfect.

But, that’s the beauty of it. Let me show you.

One of the videos that was transcribed, again automatically by YouTube simply by uploading it, was a video on the Domain of One’s Own project. In the video, you’ll recognize some DTLT staff members, Martha Burtis, Ryan Brazell, and Tim Owens, as well as some UMW faculty, Jeff McClurken, Andi Livi Smith, and Sue Fernsebner, and one UMW student, Jack Hylan.

What was particularly entertaining was the attempt by the transcription service to get the terms Domain of One’s Own, and LMS, correct. On rare occasions it would get the terms right, spelling out the words “domain of one’s own”, albeit in lower case, and the acronym “LMS”. However, it did struggle. Here’s where it got entertaining. It seemed to pick on Martha and Jeff the most. First, Domain of One’s Own . . .

two main ones

munich won Tonys

demand of ones own

dominicans out

YouTube’s struggles with LMS (as in Learning Management System) were equally funny.

in relation to the Alamosa

Elemis

alum ask

As well as saucier versions . . .

alum ass

elem ass

And my favorite . . .

clothes wellsley Almazan

The actual spoken words from the above clip are “closed walls of the LMS”. See, YouTube Closed Captions can even teach you about geographic locations you didn’t know about – Almazán, Spain. And I never knew about its association with Wellesley. Oh, and don’t forget Alamosa, Colorado.

To finish up the fun, there were a couple more transcription errors – one just basically silly, and another one fun in a teenage boy kind of way. First . . .

hammock resume

You can guess what the real spoken words are in this next one . . .

the poop i need to jump through

After it’s all said and done, it is amazing what an accurate job this automatic transcription service does. Anyone who has the task of creating captions for a video might find it quite entertaining. I hope the student aides that I assign to this task think so as well.
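If this review work does go to student aides, a small script can pre-flag caption lines that garble a known term. Here's a sketch using a toy SubRip (.srt) snippet – the sample lines and suspect phrases echo the mis-hearings above, and the helper names are my own, not part of any captioning tool:

```python
# A toy SubRip (.srt) snippet with two lines like the ones discussed above.
SRT = """\
1
00:00:01,000 --> 00:00:04,000
we built two main ones

2
00:00:04,500 --> 00:00:08,000
outside the closed walls of the LMS
"""

def caption_texts(srt):
    """Yield just the spoken-text portion of each SRT cue."""
    for block in srt.strip().split("\n\n"):
        lines = block.splitlines()
        # line 0 is the cue index, line 1 the timecode, the rest is text
        yield " ".join(lines[2:])

# Phrases the auto-transcriber is known to mangle "Domain of One's Own" into.
SUSPECTS = ["two main ones", "munich won tonys", "demand of ones own"]

flagged = [text for text in caption_texts(SRT)
           if any(s in text.lower() for s in SUSPECTS)]
print(flagged)  # ['we built two main ones']
```

A reviewer could then jump straight to the flagged cues instead of proofreading every line from scratch.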

Transition of a Space

This is a follow-on from my last post. Some of us DTLT folks got another chance to see the progress of the ITCC (if you don’t know what that stands for, read more of my blog). As I mentioned previously, I’ve been concentrating on the DTLT Edit Suite, and the progress of that room is coming along nicely. Here is just a brief sequence of photos to show you where we are.

It started out with the framing:

Then it got walls:

As of June 11, 2014 it’s got paint and carpet:

Most of the equipment for this room has been ordered and will start to arrive soon. One of my next posts will go into detail of the actual equipment setup. Stay tuned.

Imagining a Space

DTLT Edit Suite

Imagine the possibilities. My mind is preoccupied with what the Information and Technology Convergence Center (ITCC) will do. I have to think of the possibilities of individual rooms, as well as how those rooms fit into the overall vision. I’ve got my vision for video and audio production in the ITCC and if I could sum it up in a word, it would be “enable”.

We have a video recording studio in the building unlike anything we could have imagined a few years ago. It’s very exciting, and I hope to write more about that space soon. Right next door is a space for editing digital projects (and even a vocal booth for quiet audio recordings). But to me, the whole building is a production studio. There are lots of great spaces to capture (i.e. video record) conversations, and as I’ve said before, it is about furthering digital scholarship.

The space I’m currently thinking the most about is within the DTLT suite. It is adjacent to the “bullpen”, but in many ways I’m thinking of it as an office – a word derived from two Latin words, opus (work) and facere (to make). The idea of this room is to serve as an editing suite – a new Mac Pro, two 32″ 4K monitors, a large 24TB RAID array, a microphone and new digital 4K camera for recording, along with video switching and routing equipment to, you guessed it, enable possibilities.

The other part of this space is a “viewing” area. Projects can be visualized at any given point in time on a large 4K home theater style monitor (I’m shooting for 70″). At any point in the production process we can suggest elements to add to a project such as music, sound effects, visual effects. Faculty and students (staff too?) will be able to sit comfortably in a space and help make editorial decisions. That’s something else that we couldn’t have imagined a few years ago.

So the space illustrated at the top of this post is a general idea of what I’m thinking. Here is what it looks like as of May 1, 2014:

DTLT Suite

Here’s another shot taken from inside the room:

DTLT Suite other corner

And here’s the visualization:

DTLT Edit Suite render (other corner)

With the help of some software, in this case a program called Live Interior 3D, I can quickly drag in some elements (although they’re somewhat generic – note the huge desktop PC element instead of the Mac Pro) to visualize the space. Don’t you think that rug ties the room together? 😉

I’ve also got a QuickTime VR video (download it for better performance) of the space, again courtesy of Live Interior 3D.

Anyway, this is the space my head is in lately. I’m imagining the space and also thinking about hardware and software that will help realize the visions of members of the UMW community.

imagine sponge bob

DTLT Tomorrow

One of the initiatives that I am currently working on here at UMW is something called the Digital Media Commons Initiative. Part of the purpose of that program is to get people up to speed with some more sophisticated digital video and audio equipment. We are going to have a full-blown studio in the new Information and Technology Convergence Center, so people will use some pretty high-end equipment in that space.

DTLT also has this thing called “The Kit“, which is a portable “studio” that can be set up in a variety of spaces. Mostly we have it set up in our office with a green screen, and we use Wirecast to control the broadcast (live-streaming and recording). Because of the nature of the laptop, it is limited in terms of the number of camera inputs, computer inputs, etc. We need to shift to the next gear.

The episode of DTLT Today (#112) included above, begins to describe what that next gear is. We needed a full-on switcher with true multiple inputs so we can do multiple camera angles, include computer content such as demoing websites, Skype conversations (or Google Hangouts), playing YouTube videos, and so on. The video is pretty rough, but it goes over some of the components that we used. I’ll let the video itself do the rest of the talking, but I did promise that I would list the equipment that we used, so here it is:

  • Blackmagic Design ATEM Television Studio (rack mounted w/power unit) ~ $1000
  • Blackmagic Design HDMI to SDI Converter (required to do long cable runs with SDI – HDMI will not do long cable runs) – * ~$300 ea.
  • Vixia HDMI Camera (we’ve used HV-40 and HF R400) – * ~$300 ea.
  • Mackie 402VLZ4 Mixer – * ~$100
  • RCA stereo to 3.5mm to run audio into camera
  • XLR audio cables
  • XLR microphones – In the video we use Shure SM58 (~$100), but we’ll also use Shure MX150 (~$300), Sennheiser MKE600 (~$330), and Sennheiser wireless
  • HDMI cables – standard (HV-40), or HDMI to mini HDMI
  • SDI cables – 50 ft.
  • Other inputs – Mac Mini via HDMI (input @ 1080i), GoPro via HDMI micro to HDMI
  • Mac Mini or other for computer input (Skype, Hangouts, YouTube, etc.)
  • MacBook Pro running ATEM Software Control (switcher app) – connected to switcher via USB and Ethernet
  • iPad running Strata Lite ($10) for ATEM switcher control
  • Teradek VidiU ($700) box for live streaming to YouTube – *
  • Things we plan to add or use on certain occasions – Atomos Samurai Blade ($1300) for monitoring and redundant recording via SDI, Blackmagic Design Hyperdeck Shuttle Recorder ($345) to include video playback

* – needs an A/C Adapter
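For budgeting purposes, those approximate prices add up quickly. Here's a rough tally in Python, covering only the items with prices listed above; the quantities are my guess at a two-camera setup, so treat the total as a ballpark:

```python
# Approximate prices (USD) from the equipment list above.
# (name, unit price, quantity for a two-camera setup)
kit = [
    ("ATEM Television Studio",  1000, 1),
    ("HDMI to SDI converter",    300, 2),
    ("Vixia HDMI camera",        300, 2),
    ("Mackie 402VLZ4 mixer",     100, 1),
    ("Shure SM58 mic",           100, 1),
    ("Teradek VidiU",            700, 1),
]

total = sum(price * qty for _, price, qty in kit)
print(f"~${total} for the core two-camera kit")  # prints "~$3100 for the core two-camera kit"
```

Cables, tripods, and the optional recorders push it higher, but even so it's a fraction of what a traditional multi-camera production setup costs.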

Let’s Un-F*** Up Our Internet

1173 20120820 The Internet Is Completely Over

UPDATE: We Un F’d-up the Internet!

Lately it’s been knee-jerk to Tweet an article that we recommend to our followers to read. I do it with articles, videos and funny pictures all the time. A long time ago, in a place not so far away (right here actually), I would blog about articles that I recommended. It would be a quick post with a link and maybe some short commentary. Blogging is not dead for me, even though we joke about it in the DTLT office. We are not as prolific as our fearless leader, our “Big (Blogging) Toe“.

However, now it’s time to BLOG about an article. One that I feel is extremely important. So important, I guess, that I didn’t Tweet it – I need to BLOG it!

Nilay Patel is a tech journalist for The Verge. I was first introduced to him back in 2010 when he talked with Christina Warren and Dan Benjamin about the legal issues surrounding the h.264 codec. He is a former copyright lawyer, and on that particular subject he meticulously explained the legal issues surrounding the licensing of the video codec.

Well, it’s time once again to listen to (read) the words of Nilay as he fires his warning shot about our F’d Up Internet. That’s it. No further introduction. Go Read It! (by the way, the subtitle is “but we can fix it”)

Did you read it? If you did, good. No, great! Now go act. Contact the FCC. Save the Internet before it’s too late. I’m not being hyperbolic. The Internet as we know it – or rather, knew it – is being morphed from something that serves the needs of the public into something that serves the needs of the few companies that provide services and access to it, with no competition and ever-rising prices for access.

OK, so maybe you need more evidence. Then check out this story on WNYC’s “On The Media” regarding the Comcast/Time Warner Cable merger. Hopefully this combination will compel you to act. Now will you go act?

photo credit: Chris Piascik via photopin cc