Category: Audio & Video

Iterating the Kit


Transition. It can be fun. Frustrating. Overwhelming. Invigorating. All those and many other things. For me it’s always about the next step. No one I know is more critical of me than me. One of the “criticisms” I hear – in a totally teasing, smack-down, weirdly motivating, one-upmanship way – is that I, and other DTLTers, should be blogging more. And it’s true. We’re all doing good work and we should be sharing what we’re doing. Especially now.

So in that vein of sharing, I will start with an update to something I’ve gone back to the well on many times – The Kit.

For those of you who don’t know what “The Kit” is, it’s basically a live video stream setup that is as compact, portable, and inexpensive as possible. At the time I first gave it the name, The Kit was a backpack with a Mac laptop and a Canon video camera. The only thing that didn’t fit in the backpack was a good, sturdy Manfrotto tripod. What made the live-streaming all work was a piece of software on the laptop called Wirecast. The limitation, however, was that using multiple cameras was difficult (not enough “inputs” on the laptop – firewire only goes so far). On top of that, the on-the-fly encoding and streaming that Wirecast performed would tax the computer well beyond its CPU capabilities. Reliability was an issue at times, especially if we wanted to push the capabilities (something we do in DTLT all the time).

The Kit made its debut in 2011 and remained largely composed of that basic set of components – until we began to get ready to move into the ITCC. As part of the research I was doing into the production spaces that would be a part of the new building, I came across what were essentially hardware solutions to what the software-based kit provided. In many ways the production spaces of the ITCC are relatively expensive, so I still wanted to maintain a relatively inexpensive “Kit”, and of course, keep it portable.

Many weeks ago, Jim Groom asked me if I would provide the live-stream for the 2nd Edition of the Open VA Conference in Virginia Beach. The main reason I said “absolutely” was to meet the challenge of the next version of the Kit, and to provide a multi-camera setup. I’ll be honest, a side reason I said yes was the “beach” part 😉

So enough of the reasons, let’s get to what the Open VA version of the Kit looked like. Here’s a fast and loose video I put together of what I envisioned I would use, and following that a list of the individual equipment used (the actual implementation was scaled back a bit):


  • Blackmagic Design (BMD) ATEM Television Studio – This is the heart of the system. More about this item below.
  • BMD Hyperdeck Shuttle Pro – records to SSD hard drives in ProRes format
  • Furman Power Conditioner – provides clean power
  • Nady 6-Channel Rack Mount Mixer (not currently used – don’t know if we ever will)
  • Also includes a 4U case, AV cables (HDMI & SDI), and an ethernet cable

Camera Setup

  • Asus HDMI monitor – used for “Multiview” (Program/Preview monitor).
  • Canon Vixia HV40 camera (2) – generally any HDMI camera will do – HDMI cameras are cheaper than any SDI model, which start at over $2000.
  • BMD Mini-Converter HDMI to SDI (2) – we have to use these to get the signal converted into SDI for long runs. HDMI cables over 10’ or so just don’t work in this case.
  • Manfrotto tripods (2) – Model 055XPRO B w/ 701HDV head – classic solid tripods
  • HDMI cables (2) – out from camera into mini-converters.
  • Power Strips (2) – power the cameras and mini-converters.

Other Equipment

  • Backpack – Carry cameras, power supplies, converters, etc.
  • Computer (and software) to control the switcher interface – we use a MacBook Air or Pro, but it can be a Mac or Windows machine. The computer connects via ethernet to the ATEM switcher.
  • We also have Photoshop on this computer, which provides on-air graphics editing that can be exported to the ATEM Media Player.
  • Ethernet cable – for above connection.
  • MagSafe power adapter for the MacBook Pro
  • Mac Mini – for “computer source” images (slides, web pages, video, etc.)
  • TP-LINK Wireless router – I set up an ad-hoc network for the ATEM. Various devices can connect to control the switcher, such as iPads, iPhones, other computers.
  • Live Stream Hardware – we use a Teradek VidiU w/ ethernet cable & HDMI cable – it’s $700, but it makes it easy to stream to various CDNs like YouTube. Software solutions such as Wirecast exist as well.
  • DVI cable – we used this to get a direct view of the Mac Mini, since it’s hard to see detail on the Mac through the “Multiview” monitor.
  • Mackie 402VLZ4 4-Channel mixer – great small footprint mixer that takes room audio in then we go out to cameras.
  • Stereo RCA to 3.5mm – we run this cable from the mixer to the Canon Vixia to provide “system” audio (which then runs via HDMI/SDI to the ATEM switcher). Obviously, running individual channels directly in would be better, but that requires separate interface hardware to do analog-to-digital conversion (AES/EBU) for the ATEM Television Studio. A whole other conversation.

The Blackmagic Design ATEM Television Studio is one of those unique pieces of hardware that has almost a cult following, and I think with good reason. It is a 6-input (HD) switcher for less than $1000. Nothing else can touch that price point. It has so much built into such a small footprint, and the quality is outstanding. Keep in mind when you buy one of these that there are pieces that need to be added on if you get into a complex production, which the multiple inputs beg you to do. If you’re using only one camera, there’s really no need for the ATEM. If you’re using two cameras in close proximity to the switcher (HDMI cables of less than 10 feet), the ATEM is ideal. With more cameras, or cameras at a greater distance from the switcher, you need to run SDI, and hence you need to convert the HDMI camera signal to SDI. The Blackmagic Design HDMI to SDI Mini Converter does the trick here. For about $300 each, they turn an inexpensive camcorder into a serviceable production camera, since inexpensive camcorders don’t have the power to send an HDMI signal long distances (again, we’re talking over 10-15 feet).

I’ll talk more about how I’m using this kit, as well as future iterations, but here’s an example of one of the live streams at Open VA (featuring several of my DTLT colleagues):

That’s Easy



Just Because!

The Steadicam of Life

This little video blew my mind in so many ways. I don’t expect it to blow the minds of very many others – or maybe it will – I don’t know. First, some background. The Techcast Focus Network (TTFN) is a group dedicated to informing would-be broadcasters about technology that delivers the highest bang for the buck. They are a consultant group and therefore get paid for what they do, but they give back to the community in many ways through their video reviews and coverage of technology shows – especially the National Association of Broadcasters Show (NAB Show).

I have subscribed to TTFN’s YouTube videos for a couple of years now, and they have informed some of my approaches to video and live production ever since. I especially like the fact that they have an ethics page to transparently disclose any corporate assistance they have received and to address what influence, if any, that may have on their reviews.

So imagine the aligning of planets that brought the TTFN folks together at the NAB Show at the Tiffen Booth to talk to Garrett Brown about the latest Steadicam products. “So what?”, you might say. Well, if you don’t know who Garrett Brown is, he is the inventor of the Steadicam. It is a tool to stabilize a hand-held camera for film, and now video. The Steadicam was first used in the Hal Ashby film Bound for Glory. It was perhaps more famously used in The Shining, following the character Danny on his trike through the Overlook Hotel, and later through the snow-filled hedge maze at the end of the film. It was revolutionary. It allowed the camera to go anywhere, at least anywhere a human could go with a camera strapped on. Cranes and dollies would be impractical to follow a character doing one complete revolution of the hotel perimeter in one complete take. The Steadicam made it possible – and mesmerizingly unique. Wikipedia has a picture of Brown walking and talking with Stanley Kubrick, with the device, on the set of The Shining.

So this got me thinking about my favorite subject, or at least my favorite work subject: education. Specifically, the tools that we use to not just enhance, but to transform education. What the Steadicam does is provide an extension to the body that allows the camera to “see” in the way closest to how humans see life, and to go anywhere that humans can go. As Brown says in the video above, we humans have a built-in Steadicam. The technological device transformed filmmaking. Tools like this have transformed education as well. I’ll let others argue the accuracy and application of the terms enhance and transform for now.

But we know some of the tools that have changed for the better how education can be delivered – tools that altered and enhanced learning by enabling networks and communities of practice. It’s OK to celebrate the tools for what they have enabled. The Steadicam didn’t save the film industry. No technology will save education. What has happened to Steadicam technology is that it has become less expensive, and therefore more democratized. It allows filmmakers on a small budget to get the look that films like The Shining have. You can get them for DSLRs and even get one for your iPhone for about $150.

Educational technology tools are similar. And the best ones are those that you don’t immediately think of as being specifically tools for education. Textbooks, film projectors, overheads, blackboards, and even computers all enhanced education in certain ways. To a small extent they changed how we “see” education, but in a literal way. The technologies that will transform education are the ideas that are born from them. My friend Gardner Campbell talks often and lovingly of Alan Kay’s aphorism that “the computer is an instrument whose music is ideas.” The Personal Cyberinfrastructure that Gardner has championed these last several years, and that DTLT uses as a frame for our “culture of innovation”, is some of that wonderful music born from the technology. A Domain of One’s Own is what we think will at least enhance one’s education and perhaps even transform it in profound ways.

Now, like all analyses of this type, discussions can get bogged down in over-analysis. What about “X” or “Y”? Time to rip your little dream-like analogy or aphorism to shreds. Steadicams are still relatively expensive for some. There are DIY versions of them out there. So how about DIY education? Questions and further analysis for another day. I prefer to bask in the strange and delightful performance of Garrett Brown hawking a product on a convention show floor – a product derived from a device he imagined and used on the set of a film by one of the greatest directors of all time. Life is full of delight.

oEmbed, Can You Hear Me?


We’ve arrived at the final post (for now) on using WordPress embeds (see part 1 and part 2). This post fills a gap that is obviously missing from our oEmbed examples. We saw that the popular media sites like Vimeo, YouTube, Flickr, Slideshare, and even Twitter all had the capability of simply copying and pasting a URL to the media page into a WordPress blog, and the full media shows up. The glaring omission is audio. If you look closely at the WordPress Embed Codex page, as of WordPress 3.5, SoundCloud is now an option for embed. Below is an example from the Radiolab show:

Again, WordPress knows what to do when I paste the URL from the media page and it gives us a nice SoundCloud player. SoundCloud is a popular service, but it’s limited in the amount of audio one can upload and play from an account for free. You can upgrade to a premium plan, but they start at 29 euros (which equals about 39 U.S. dollars today). To get unlimited downloads you need to pay 79 euros per year ($107 per year). That might be a bit too pricey for some. Another option is to create a “video” out of your music or audio file. By that I mean import your audio file into a video editor, then add an image or text/title, and finally export that as a video file to a site like YouTube. It’s slightly inelegant, but it gets it published for free. Be aware that even if you think you have a file that clears copyright (like Creative Commons audio), someone associated with YouTube may try to make a claim to it.

Let’s step back and think about what happens if we simply have an MP3 file on a server somewhere (maybe you uploaded it into the WordPress Media Library). Well, to follow our philosophy of using the minimal URL of a file to get our media embedded and playing in our posts, I recommend the oEmbed HTML5 Audio plugin. Now, I warned about using plugins in the first post of this series because plugins can become outdated and unsupported in the future. However, this plugin does not use any shortcodes in its implementation. What it does is emulate an oEmbed option for audio. Paste in the URL for the audio file and you’ll get a built-in player embedded in your post. It will even work on an iOS device because it supports HTML5. A caveat is that you generally have a limit on the size of the file you can upload to your WordPress account, which is usually 2MB. If you have control over your server you could raise that limit, or place the file somewhere else on a server you control (it’s why you need A Domain of Your Own!).

So I paste this:

And I get this:

If you have a URL to an MP3 elsewhere, you can just paste it into your post, and it should play. I used this technique when I posted the audio file on my Got Running post.
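Under the hood, what a plugin like this does is conceptually simple: find a bare media URL in the post and wrap it in an HTML5 player tag. Here’s a rough Python sketch of that idea (the URL and post text are made-up placeholders, and this illustrates the concept, not the plugin’s actual code):

```python
import re

# Match an http(s) URL ending in .mp3 that sits alone on its own line,
# which is how oEmbed-style filters decide a URL should become an embed.
AUDIO_URL = re.compile(r'^\s*(https?://\S+\.mp3)\s*$', re.MULTILINE)

def embed_audio(post_html: str) -> str:
    """Replace standalone MP3 URLs with HTML5 <audio> players."""
    return AUDIO_URL.sub(r'<audio controls src="\1"></audio>', post_html)

post = "Here is my run recap:\nhttp://example.com/uploads/got-running.mp3\nEnjoy!"
print(embed_audio(post))
```

Because the `<audio>` element is plain HTML5, the result plays in modern browsers and on iOS without Flash, which is exactly the appeal of this approach.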

Hopefully WordPress will begin to natively support this type of embedding in the future so you won’t have to install the plugin. In the meantime, even if the plugin goes away, you will still have the link to the audio file. Some users may not know where to go from there, but they could always ask their local instructional technologist.

The WordPress oEmbed technology seems to add new support with every new release of the software. I’ll be curious to see what they add in version 3.6.

This is part 3 of the series of posts on WordPress embeds. Here’s Part 1, and Part 2.

YouTube Time

In this 2nd post about WordPress embeds (here’s the 1st one), I wanted to point out a simple trick that is part of the API for the YouTube embeds. You may not notice anything special about the YouTube video included above, but if you click the play button you will notice that it does not start from the beginning, but at the 15:10 mark of the video instead.

This is accomplished by adding a small piece of extra “code” to the end of the YouTube link. Again, in the instance above we wanted to start 15 minutes and 10 seconds into the video, so we add the following to the standard YouTube link:


So the whole link looks like this:

Again, pretty easy. Wes Fryer also pointed me to a site that generates the extra time code for you. Sure, you can type in the code yourself, but laziness is the mother of invention.
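The trick boils down to appending a time fragment like #t=15m10s to the watch URL (newer players also accept ?t= with plain seconds). A tiny hypothetical helper shows the idea (VIDEO_ID is a placeholder, not a real video):

```python
# Build a YouTube link that starts playback at a given time by appending
# the "#t=XXmYYs" fragment to a standard watch URL.
def youtube_time_link(url: str, minutes: int, seconds: int) -> str:
    """Return the URL with a start-time fragment appended."""
    return f"{url}#t={minutes}m{seconds}s"

link = youtube_time_link("https://www.youtube.com/watch?v=VIDEO_ID", 15, 10)
print(link)  # https://www.youtube.com/watch?v=VIDEO_ID#t=15m10s
```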

Now, what if you want to END a YouTube video at a specific time? A little bit of research didn’t lead to any answers using the oEmbed API, though it may be that I missed it. What you can use is a site called TubeChop, which will generate code you embed into your posts. Just enter your YouTube video link and then, on the resulting page, choose the start and end points for the video. Finally, click the “chop it” button and you’ll see both a link and the embed code for the video. It would look something like this:

The downside to TubeChop appears to be that it generates only a Flash version of the outputted video so it’s a no-go on an iOS device.
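One Flash-free alternative worth noting: the standard YouTube iframe player accepts start and end parameters, both in plain seconds, so a clip can be defined right in the embed URL itself. A quick sketch of building such a URL (VIDEO_ID is a placeholder):

```python
from urllib.parse import urlencode

# Build an iframe embed URL that plays only a segment of a video, using
# the YouTube player's "start" and "end" parameters (values in seconds).
def youtube_clip_embed(video_id: str, start: int, end: int) -> str:
    """Return an embed URL limited to the start..end segment."""
    query = urlencode({"start": start, "end": end})
    return f"https://www.youtube.com/embed/{video_id}?{query}"

print(youtube_clip_embed("VIDEO_ID", 910, 970))
# https://www.youtube.com/embed/VIDEO_ID?start=910&end=970
```

Since the iframe player is HTML5-capable, a clip built this way should also work on iOS devices.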

So remember, when it comes to your WordPress YouTube video embeds, it’s all in the timing.

This is part 2 of the series of posts on WordPress embeds. Here’s Part 1.

Andy Rush TV


Stuff we said in the past sometimes comes back to haunt us. However, sometimes we are reminded of things we’ve said that sounded like good ideas at the time. We research whether it’s possible and then find what we need, or move on. Because of some of our recent work on the media server, I have gotten a chance to review some of the TV shows we did more than a year ago now. On December 14, 2011, Tim Owens and I were doing our 101st DTLT Today episode. It was a show about the opening of the Venice 2011 Art History Exhibit. Tim and I had recorded the students giving their presentations. We also streamed it live for anyone who cared to watch at that given moment. At about the 5:10 mark in the video, Tim talks about publicizing this idea to more professors and encouraging them to not keep student presentations “behind closed doors”.

That subsequently triggered an idea in my head (at about the 5:40 mark), “what if there were a channel” where fellow students, students’ friends at other campuses, or parents could watch these types of presentations on a continuous loop or on a specific schedule somewhere. You know like a TV station.

After the show I began the search for sites or services that offered such a concept. YouTube and Vimeo offer the idea of “channels” to their users. However, a channel isn’t much more than a “playlist” of videos played in sequence. You go to a YouTube channel and it starts with the same video every time. Vimeo has something similar now called “couch mode” that allows you to watch a channel as a playlist. Neither service has an option that allows you to play a video at 8:00pm on Saturday night. I had hopes back then that something called WorldTV might do what I wanted, and it did incorporate live streaming into the mix, but again, nothing really on a schedule. So there wasn’t much to move ahead with, and I moved on.

Fast forward to January 23, 2013, when, after re-watching that video, I decided to renew the research. What I stumbled upon turned out not to be as earth-shattering as I hoped, but cool enough to say it has a lot going for it. So my proof of concept is to build something that I am initially calling Andy Rush TV. The idea is to run videos from various sources, primarily YouTube and Vimeo. Occasionally, I will want to play a video at a precise time (like 8 o’clock on a Saturday). I will also want to stream a live show to this channel. Well, I’m currently experimenting with something called StationCreator.


StationCreator is a service that allows you to have a schedule, but also something called “autopilot” so you can have a “station” running 24/7. There are 3 pricing plans. A free account gives you one channel and you can only use autopilot. The “Pro” account, for $200 a month, gives you unlimited channels, as well as scheduling and even analytics. There is an “Enterprise” account (at $1,000 a month) that adds real-time tracking, video hosting and API access.

For this proof of concept, I created a new WordPress site. StationCreator gives you the embed code to place on your site, and that provides the window to your station. Whatever gets scheduled will play in that window. Autopilot videos are played when there is nothing scheduled. Scheduled shows start right when they should. I’m early on in this experimentation, so I haven’t incorporated live content yet.

I’ve had some wonky behavior when I insert scheduled videos into the list. Things eventually stabilize as long as you don’t make a live change, and then scheduled shows play nicely between autopilot shows. The service seems to be in a beta state right now – definitely functional, but with a few bugs. I am still trying to find a way to play the channel on my mobile device. Their site says “We give each station a mobile-ready fullscreen player page on our site, too.” I’m still looking for it.

As I say, it is early in the game, but it looks like even the “free” channel would work pretty well. When it’s working properly, you do get the “joined in progress” feeling of a real TV station. You also get widget code that can go into WordPress sidebars to show what’s playing currently and what’s coming up in the schedule. Autopilot will still give you a schedule of upcoming shows. LOTS of potential here, and I hope to put it to a good test. So go enjoy Andy Rush TV now!

A Mac-like Video Converter for Windows

Many moons ago I blogged about a video converter called Evom. I loved it (still do) for its simplicity and for its unique features. I’ve found something similar for the PC. It’s been available for a while, but a new version (3.0) has just been released that gets close to Evom for the PC. It’s called Miro Video Converter, and to use it you simply drag your video file into the window, select which device you want to convert for, and then click the convert button at the bottom of the window. There are tons of formats to convert files to. All of the latest Apple devices are listed, as well as Android devices and even the Kindle Fire. It also allows conversion to “open” format file types such as Ogg Theora (video) and Ogg Vorbis (audio). There’s even the choice of WebM for those of you still holding out hope for that format to catch on. Though my advice to you would be to exhale.

Now, my favorite feature from Evom was that you could drag a YouTube URL from a web browser window into Evom and it would begin downloading and converting your video. That feature works much less consistently now, if at all. So I still use Firefox and the Video DownloadHelper plugin to download YouTube videos. Once they are on my machine, I can then use Evom to convert them to an audio MP3 file. I’m happy to report that the MP3 conversion feature works in Miro Video Converter too, though quite a bit slower than in Evom. But hey, these are free programs we’re talking about.
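For the command-line inclined, converters like these are essentially friendly front-ends for something like ffmpeg. A hedged sketch of the equivalent MP3-extraction step (assuming ffmpeg is installed; the file names are placeholders):

```python
import subprocess

# Sketch: the ffmpeg arguments that pull MP3 audio out of a video file.
# "-vn" drops the video stream; "-acodec libmp3lame" encodes the audio as MP3;
# "-y" overwrites an existing output file without prompting.
def mp3_command(video_path: str, mp3_path: str) -> list:
    """Return the ffmpeg argument list for extracting MP3 audio."""
    return ["ffmpeg", "-y", "-i", video_path, "-vn", "-acodec", "libmp3lame", mp3_path]

cmd = mp3_command("clip.mp4", "clip.mp3")
print(" ".join(cmd))
# To actually run it (requires ffmpeg and the input file):
# subprocess.run(cmd, check=True)
```

Not as drag-and-drop friendly as Evom or Miro, of course, but it shows there is no magic involved.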

Miro is also available for the Mac, but I prefer Evom for most of what I do, mostly because it is faster. However, there is one other intriguing feature that Miro has: it can convert into what are known as “ingestion” formats, such as ProRes (what Final Cut Pro X likes), AVC-Intra, and DNxHD. What this means in theory is that you could convert videos into formats that are recognized natively by video editing software. How this would work in practice remains to be seen, but it’s interesting to see those options.

I have several students every semester ask how they can get the audio from a YouTube clip into their projects, and now I have a program that I can recommend for PC users.

iMacs for Student Editing


While we’re not quite ready for our Convergence Center yet, with the help of our head librarian and some funds from the state, we will soon establish a space in the Simpson Library to facilitate video editing. We are going to go with iMacs for the simple reason that PC video editors are just dogs. And I’m happy to have the argument – er, conversation – with anybody about which is better, just as long as we start with a comparison of the video editors that “come with” the Mac and the PC (iMovie vs. Movie Maker). I put “come with” in quotes because iMovie is pre-installed on all Macs, while Movie Maker is a free download for PCs. I will also accept submissions for the best paid video editing software on either platform, though I think Final Cut Pro X is the best, Adobe Premiere CS6 is a close 2nd (and available on both platforms), and back in the day Sony’s Vegas on the PC was cool. There. Let the comments fly.

Before I tell you what I’m spec’ing out, I will say that we will have both Final Cut Pro X and Adobe Premiere (as well as the rest of the Production Suite) on these iMacs. So here goes. I have a budget, but I don’t know the exact figure as I’m writing this. I had to approximate the cost of each iMac months ago, knowing there was a good chance a new version would be introduced. Therefore my approach is to get what I think are the minimum features and then add or subtract as needed. More importantly, I’ll give you my reasons (and a few beefs).

I’m going with the 2.9GHz quad-core Intel Core i5 version of the iMac, including 16GB of memory and the 1TB Fusion Drive, plus the wired Apple mouse and the wired keyboard with numeric keypad. Wired peripherals are much more reliable when editing, and the numeric keypad adds keyboard shortcuts that editors need. Now, the other iMac option at the 21.5″ screen size is a 2.7GHz model, but that doesn’t offer the Fusion Drive. What is a Fusion Drive, you may ask? It is essentially a “hybrid drive” made up of a solid-state drive (the iMac uses a 128GB drive) and a traditional platter-based drive, which together add up to 1TB of space. Solid-state drives are very similar to USB flash drives, but much faster and in a hard-drive form factor. Platter drives are what we’ve been using for over 20 years; only the speeds and capacities have changed. I remember purchasing hard drives measured in megabytes (MB). The “hybrid” part is software that controls how and where the data on the two drives is saved. Files or programs that are accessed more often will be available on the flash drive, while less frequently accessed files will reside on the platter drive. Software monitors this and operates in an optimized way (in theory).

As far as memory goes, it’s a choice of 8 or 16GB. I’m going with 16GB because programs like video editors like memory – usually the more the better. An extra $200 is a bit overpriced, but it is virtually impossible to add memory later, a beef that many people have raised already. For most (non-video-editing) people, 8GB would be fine for several years into the future, and only people who know the benefits of more memory will even consider 16GB. I don’t have the same beef about the memory issue. Don’t get me wrong: iMacs aren’t cheap, and when paying $1300 or more, some people expect more access to the components for upgrading. Well, to them I say that ship has sailed. We buy computers as appliances. PCs are still more accessible in terms of upgrades and tweaking. Macs are sealed up from the factory, except for the Mac Pro, and the Mac Mini to some extent. If you want to hack your Mac, you need to build it from scratch – Hackintoshes, anyone?

What does make me a bit nervous (and Apple Care will have to be added for insurance) is the hard drive going bad, as our current 27″ iMac suffered exactly that fate. Taking it apart (by an Apple tech who came to campus) was not for the faint of heart. He came with a toolkit that included suction cups – I kid you not – to take off the glass screen to get to the hard drive. In some ways the new iMac may be easier to get to the hard drive because it is the uppermost component underneath the screen. However, the screen needs to be pried open with what looks like a guitar pick and then presumably needs to be resealed with adhesive. *shudder*

The other thing that makes me nervous is simply, I hope the Fusion technology works. I’m a big believer in solid-state hard drives. You can’t just order a solid-state drive in the 21.5″ iMacs, however. It’s either a traditional drive or the Fusion. The 27″ iMac does have the solid-state option (as well as user accessible memory), but it’s a $1300 upgrade (for a 768GB drive)! An option for a 256GB drive on all of the iMacs would be my preference. Often a company called Other World Computing (OWC) offers various upgrades on Macs. They may have some choices in the future.

So once again:

  • 2.9GHz Quad-core Intel Core i5, Turbo Boost up to 3.6GHz
  • 16GB 1600MHz DDR3 SDRAM – 2x8GB
  • 1TB Fusion Drive
  • NVIDIA GeForce GT 650M 512MB GDDR5
  • Apple Mouse
  • Apple Keyboard with Numeric Keypad
  • Apple Care

That’s around $2000 before software. I may have to give up the Fusion drive and more memory if we’re over budget. Regardless, I still think this gives students a better creative experience for editing video (and if need be audio). And hey, we can always create a Windows partition if we need to…nahhh.

We’ll update as this story unfolds…


Google Earth Fun

I decided not to wait until Friday to have some fun. All of us in DTLT are acting a bit shell-shocked to be back after Thanksgiving num-nums. I started the week by going through the process of switching hosting for this site; I’m moving to MediaTemple for all of my business and personal hosting.

In the meantime, while waiting for files to FTP to a local drive, I have been playing with Google Earth. It’s been a while, but I have always known that I wanted to incorporate “fly-over” videos into some video projects. The Google Earth Pro program allows you to export your “tours” as movies, but the Pro version is $400. However, if you have good screen capture software (I used Telestream’s ScreenFlow), you can fake it with a little extra time and effort.

The above video was done by playing the Google Earth tour while recording in ScreenFlow. Then I cropped out the navigation buttons and graphics, added the Shining music, and exported the video. Pretty simple, as far as this goes. More experiments to come. I would love to see some folks do personalized versions of these types of videos. Make a tour from your home (or some other place in the world, if you’re paranoid) to work and add the Shining opening theme to it. It’s fun.

What Color is Your Bumbershoot?

Before I get to the subject of this post, I need to say something. If anyone ever asks you what Twitter is good for, ask them if anyone has ever shared something inspiring with them there. It happens often for me. If it doesn’t happen for you, maybe you need new people to follow.

Twitter is exactly where I found inspiration this weekend. My good friend and colleague Gardner Campbell started off my Saturday morning (told here using Storify):

Looking back at the archive, the president, after a standing ovation for parents, states “Wow, this is a great lively crowd, this is fun.” It was indeed!

There are links to a part 1 video, but part 2 has the Eugene Mirman talk (at 33:10).


* Bumbershoot – another name for an umbrella, and a festival in Seattle where Eugene Mirman played.


© 2017 And He Blogs
