Thanks, Tom Woodward

the majestic Tom Woodward

Photo by Serena Epstein

Thanks to Tom Woodward, I am doing some Thanksgiving WordPress blogging. He wrote this awesome, simple WP plugin that makes an “Easy Button” for writing posts. It gets placed in your Dashboard and beckons you to click the shiny button and BLOG! Just be sure to change the bit of code to reflect your website, or else you’ll find yourself trying to blog at Tom’s site.

Here’s the link to the GitHub site for the plugin.

Here’s a picture of MY button, available in my experimental “Testing” blog.


A Joyful Reunion

Gardner Campbell UNFIS
Photo by Mike Boyles

Hey, Gardner Campbell, I’d like to introduce you to my new friends.

Back on August 13th, I accepted a job to come to the University of North Florida to work at the Center for Instruction & Research Technology (CIRT). Exactly a month later, I posted my goodbye to UMW and Fredericksburg on Facebook (of all places). Among the well-wishes I received from that post was one from Gardner. He finished by saying, “Oh, and p.s.: see you in November!” I didn’t know at first what this meant, but I soon realized that he was referring to CIRT’s annual Innovation Symposium, UNFIS 2015. Gardner would be keynoting the symposium, and it would be my maiden voyage into helping host a conference at my new place of work.

Well, the conference has come and gone. I am now in video editing mode and will soon be posting all of the conference videos. It reminds me very much of the UMW Faculty Academies we did so long ago, including several with Gardner. True fact: one of the first times I videotaped Gardner was at a Faculty Academy in the Jepson Science Center, where he did a demo of Sound Forge on the PC. I believe that was in 1999 or 2000, well before anything known as DTLT existed, or Gardner was director of the talented “dream team”. We go back a ways.

Fast forward to 2015.

I do my best to actually pay attention to what people are saying during the presentations I’m taping, and not just adjust knobs and dials and try to get the recording perfect. I DID like what I was hearing from the UNF faculty participants, and I was getting excited about working with them. Overall it was an impressive conference and well attended (the highest attendance so far). It started two years ago, when another upstart, Jim Groom, gave a keynote talking about this crazy idea of a “Domain of One’s Own”.

Years earlier (at Open Ed 2009) there was a talk by Gardner that we will forever refer to as the Bags of Gold talk. Within the first 5 minutes of the video, Gardner expresses the frustration he has with some colleagues about living in what Clay Shirky calls “the largest increase in expressive capability in the history of the human race”. Here, watch:

“It’s a bag of gold, what part of that do you not understand.”

Now we have Dr. Campbell in 2015, thankfully seeing the fruition of his talks about “Personal Cyberinfrastructures” in the many “domains” projects at several universities, with Mr. Groom (successful businessman), along with his partner, my friend and colleague Tim Owens, spearheading the Reclaim project. Paying attention 😉 to Gardner’s talk was one of the better experiences I have had recently as far as edtech conferences go. It was a rush of emotion to bring the work that I and others have done over the many years and introduce it to a new, fertile campus.

So without further ado, here is Gardner’s UNFIS Keynote, complete with post-lunch opening statement, Jerome Bruner references, and a beautiful message at the end about how we can bond with our students. When he utters the line “I am SO not kidding”, a chill goes down my spine. Enjoy!

A Happy Ending/Beginning


This brief story started on September 7, 2015, when the good (did I say good? I mean great!) folks at the University of Mary Washington gave me a going away party at one of DTLT’s favorite hangouts – Hard Times Cafe (I always have the Alamo Chicken – highly recommended). The present they gave me was enough money on gift cards to get the new Apple TV when it came out. Well, today was finally the day that gift was obtained.

One cool thing is that Jacksonville has its own Apple Store. It’s about 4 miles from the University of North Florida where I work.

JAX Apple Store

So my lunch hour was a quick trip to the store to pick up the new Apple TV 4th Generation that I ordered online in the morning. As it turns out, this was the quickest way to get it, as opposed to pre-ordering online and having it delivered.

There were several people who, like me, ordered online to pick up at the store. As you might know, the Apple Store is a bit different from other stores in that you are greeted at the door for triage. I’m not sure the system is any better, but it is different. I gave them my name and eventually they came back with an all-black box (with another all-black box inside it). I signed the iPhone the gentleman handed me to verify my pick-up, and that was it. Tonight, I will bring it home and begin to play with it.

So it’s kind of bittersweet, because the Apple TV will remind me of the great people at my former workplace, whom I miss, but now I get to talk to my remote and have my TV do my bidding – “Siri, show me all of the films directed by Stanley Kubrick.”

So thank you to all those who made this day possible – Jim, Tim, Martha, Jerry, Shannon, Jeff, Mark, Steve, Leah, Debra, Cartland, Jess, John, and Betsy (and Lisa and Mary who couldn’t be there). Thank you all of UMW!


I lost count of how many times I saw this thumbnail in my Twitter stream yesterday:

sleeper thumb

Other folks in DTLT and I have been following the work of Michael Wesch for years now. I have written a few posts about him as well. He is particularly inspiring to me because he is not only a professor of Anthropology, but also a filmmaker.

He came to UMW in 2011 for our Faculty Academy and talked about students becoming “Knowledge-Able”.

Part 1:

Part 2:

So what were folks tweeting about yesterday? It was Wesch’s new video “The Sleeper“, a brilliant little short that reminds us what students bring to class. It reminds us what the idea of education is – student centered. However, it also reminds faculty to think about who is out there in the vastness of college lecture halls (or high school, middle school, and elementary school classrooms). It is the next generation of people that we are teaching. Those people have lives outside of class, and they all have unique stories – and problems – and obstacles. They are the Why. If we allow students to bring their stories into their schooling, we might well make better connections.

So after briefly bemoaning that he wasn’t making a difference, or wasn’t a good teacher, because this one student was sleeping or barely paying attention in class, Dr. Wesch did something simple. He asked the student to lunch. He had a conversation with his student. He found out more about someone in the sea of students in his class. I’ll let the video explain the rest, but there’s a particular part in the video, which Wesch animates himself, that reminds me of another great talk that Wesch gave at UC Irvine titled “Why We Need a Why?”. In it he gets up on the desk and “shakes his tailfeather” – in service of demonstrating an anthropological concept, of course.



Here’s “The Sleeper”

Thank you, again, Dr. Wesch!

Wistia Presentation Syncing


Summertime is play time! One of the things I’ve been playing with lately is a new video hosting service called Wistia. For a relatively low monthly cost, we can store lots of videos at a site with more privacy controls than YouTube. It also has some pretty nice features regarding user engagement and analytics.

The folks at Wistia have some really good (if somewhat goofy) videos on production basics, as well as a ton of other resources in their library. Like all good companies, Wistia also has a space to experiment, and that’s where Wistia Labs comes in.

The embed above is an example of their Presentation Sync feature from the labs. There are two components. First, it takes advantage of a service called Speaker Deck and uses their API. You upload your slides in PDF format and Speaker Deck converts them to an embeddable slide presentation. The other component is a video you have already uploaded to Wistia. Wistia’s Presentation Sync then takes care of matching the two together. It’s pretty easy to do, and the whole synchronized presentation can then be embedded in a WordPress post (like this one). This isn’t a new idea, but their code builder makes it pretty seamless.
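If you’re curious what the Speaker Deck half of that looks like under the hood, here’s a rough sketch in Python of asking Speaker Deck’s standard oEmbed endpoint for a deck’s embed code. To be clear, this is just my guess at the general idea, not Wistia’s actual integration, and the deck URL is a made-up placeholder:

    # Rough sketch (not Wistia's integration): fetch the embeddable player
    # markup for an already-uploaded Speaker Deck presentation via oEmbed.
    # The deck URL below is a made-up placeholder.
    import json
    import urllib.parse
    import urllib.request

    deck_url = "https://speakerdeck.com/someuser/some-deck"  # placeholder
    endpoint = "https://speakerdeck.com/oembed.json?" + urllib.parse.urlencode(
        {"url": deck_url}
    )

    with urllib.request.urlopen(endpoint) as resp:
        data = json.load(resp)

    # data["html"] holds the embeddable slide player that a tool like
    # Presentation Sync pairs up with the video's timeline.
    print(data["title"])
    print(data["html"])

Wistia’s code builder does all of this for you; I’m only showing it to demystify the “API” part.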

Two last points unrelated to the Presentation Sync – Remember to check the equipment bag to see that all of the parts are there – and no, I don’t always wear the same shirt when I do my videos.

A Follow-up

I’ve been meaning to post this for a while. At the moment, I’m taking some needed days off, but I’m happy to say that “Andyland” is back online and the above video is submitted as proof. It seems like I blinked an eye and this CRAZY semester ended. Now it’s summertime and the livin’ . . . well, it’s not easy because there’s a ton of work to do. DTLT is also losing a couple of key players in Ryan and Jim, so part of the summer scramble will include search committees.

I have a bunch of summer projects to put the polish on some of the ITCC facilities, and I’m going to try to blog about them. However, my vacation days are ending and I’m missing my production space. So I’m posting a video to remind me.

A Setback

“I’ve lost all my work!” That was the first thought in my head as I found out that the Media Editing Suite, affectionately called “Andy Land”, had been flooded with water from the ceiling due to a … well, I still don’t know the cause really. I’ve listened to words come out of people’s mouths, but I can’t make sense of it. Frozen and thawed pipes. HVAC units failing. Antifreeze. The cause is immaterial. The incident has caused countless people headaches and almost criminal wastes of time.

So while waiting for a backup to occur on the Mac Pro that was in the room – it was still running when the Building Manager, Cartland, opened the door and then kicked off the power – I created a therapeutic video. Nothing special, just some footage that I was able to recover from the Blackmagic 4K camera that I used just six days before. Yeah, you heard right. I was able to back up the hard drive on the Mac Pro, which I turned on today and got connected to some of the other equipment that I was able to get running.

My work, for the moment, is safe. I wasted a good amount of time the last couple of days. We had visitors from Michigan State that I would have liked to spend more time with. Other UMW personnel have lost some of their work time as well. If it weren’t for Cartland coming in on his day off, the flood might have been discovered a lot later. DTLT folks have been displaced from their ITCC work spaces – only 7 months old – obviously the building isn’t able to walk by itself yet.

It’s been bittersweet, this new building of ours. Students love the space. It’s a roaring success in terms of filling a need. I continue to hope that it remains worthy of the talented students we have at UMW. But Tuesday, February 17, 2015, was a setback for that hope. Next we’ll talk about whose fault it was, who pays for the damage, and what’s “covered”.

As I said on Twitter, “it’s just stuff”. But it’s also a representation of work, and not just mine. The people who helped order and receive the stuff. The people who delivered and even set up the stuff. We do our work to get paid, mostly. Some of us are lucky enough to work at things we’re passionate about. Sometimes we have to work on things we shouldn’t have to.

Our work is beyond the machines. Sometimes it’s inside of them, and we work to keep them working so that they can help us create more of our work. If the machines stop working, we can lose our work. Bits and bytes dried up and blown away. Or flooded and washed away with corrosive water. Something is keeping those bits of my work alive, and I’m grateful to whatever (and whoever) it is.

Now, can we please get back to work?

Gotchas in the 4K/Retina World


Vexing problems are sometimes set aside to deal with at another time.

Let’s begin at the beginning. Since moving into the ITCC I have had the pleasure of using some 4K monitors (these Sharps – yes, plural!) on the new Mac Pro in my video editing room. And yes, I am completely spoiled now. However, you may not understand how a 4K monitor fits into a desktop setup. I have another post brewing about 4K in general, but 4K resolution brings with it a new term coined by Apple – Retina displays.

So how does a Retina display figure into this post? Well, one of the problems (believe me, the benefits outweigh the problems) is that some programs don’t know how to handle Retina, or HiDPI, mode. HiDPI mode essentially takes a high resolution and squeezes it into a smaller one – in this case, squeezing 4K resolution (3840×2160, or 2160p) down to a 1080p-sized screen.
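To make the “squeezing” concrete, here’s the arithmetic in a few lines of Python – nothing macOS-specific, just the numbers:

    # HiDPI ("Retina") mode on a 4K display: the panel keeps all
    # 3840x2160 physical pixels, but the desktop is drawn at a
    # 1920x1080 "point" size, so every point is backed by a 2x2
    # block of physical pixels.
    native = (3840, 2160)   # physical pixels on the 4K panel
    logical = (1920, 1080)  # the 1080p-sized desktop HiDPI presents

    scale_x = native[0] / logical[0]
    scale_y = native[1] / logical[1]

    print(f"scale factor: {scale_x:.0f}x by {scale_y:.0f}x")
    print(f"physical pixels per point: {scale_x * scale_y:.0f}")
    # scale factor: 2x by 2x
    # physical pixels per point: 4

Same screen real estate as 1080p, but four times the detail – which is exactly why some older programs get confused.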

One example of a program that doesn’t handle HiDPI correctly is MPEG Streamclip, one of our favorite free programs we use to manipulate video. I wanted to do a screencast on the Mac Pro about MPEG Streamclip and it wasn’t behaving properly in HiDPI mode. The playbar was split and a small slice, including the “play” button, was off to the right, like this:

MPEG Streamclip split

I then had two choices. I could do the screencast at 4K resolution, which isn’t a good choice at this point in time (just trust me), or I could record it on another machine that isn’t using a Retina display. I wound up using a 21″ iMac that has a native resolution of 1080p (1920×1080). The resulting screencast is on YouTube.

The MPEG Streamclip/HiDPI problem was put on the back burner, but I eventually wanted to research if/how I could do screencasts with these problematic programs on my new Mac Pro (actually UMW’s new Mac Pro).

Today I was doing some clean-up on a website that is being resurrected – the Digital Media Cookbook site (yet another post is brewing about that). I was using a program called Image2icon. I wanted to create a new “favicon” for the site and knew that the “Pro” version (it’s $4.00 if you’re interested) of this program would do it. However, it wasn’t working. On their support page, an FAQ entry talked about an issue the program had on Yosemite, the latest Mac OS, and described enabling the program to “open in low resolution” mode. After that, Image2icon created my favicon without a hitch.

This got me thinking: is there an “Open in Low Resolution” checkbox for MPEG Streamclip? Take a look below to see the answer:

MPEG Streamclip Get Info

You right-click on the program in the Applications folder and choose Get Info, then click the checkbox for “Open in Low Resolution”.

Two problems solved in one day! And now I don’t have to use another machine for screencasting. I can screencast in HiDPI mode on any machine, including my home machine, which coincidentally has a Dell 4K monitor (I got it as a Christmas present) that supports HiDPI mode as well!

Gotcha squashed!

Creative Commons licensed (BY-NC-ND) Flickr photo shared by Chad Horwedel

Iterating the Kit

media production

Transition. It can be fun. Frustrating. Overwhelming. Invigorating. All those and many other things. For me it’s always about the next step. No one I know is more critical of me than me. One of the “criticisms” I hear – in a totally teasing, smack-down, weirdly motivating, one-upsmanship kind of way – is that I, and other DTLTers, should be blogging more. And it’s true. We’re all doing good work and we should be sharing what we’re doing. Especially now.

So in that vein of sharing, I will start with an update to something I’ve gone back to the well on many times – The Kit.

For those of you who don’t know what “The Kit” is, it’s basically a live video streaming setup that is as compact, portable, and inexpensive as possible. At the time I first gave it the name, The Kit was a backpack with a Mac laptop and a Canon video camera. The only thing that didn’t fit in the backpack was a good, sturdy Manfrotto tripod. What made the live-streaming work was a piece of software on the laptop called Wirecast. The limiting factor, however, was that multiple cameras were difficult (not enough “inputs” on the laptop – FireWire only goes so far). That, plus the on-the-fly encoding and streaming that Wirecast was doing, would tax the computer well beyond its CPU capabilities. Reliability was an issue at times, especially if we wanted to push the capabilities (something we do in DTLT all the time).

The Kit made its debut in 2011 and remained largely the same basic set of components – until we began to get ready to move into the ITCC. As part of the research I was doing into the production spaces that would be a part of the new building, I came across what were essentially hardware solutions to what the software-based Kit provided. In many ways the production spaces of the ITCC are relatively expensive. I still wanted to maintain a relatively inexpensive “Kit”, and of course, keep it portable.

Many weeks ago, Jim Groom asked me if I would provide the live-stream for the 2nd Edition of the Open VA Conference in Virginia Beach. The main reason I said “absolutely” was to meet the challenge of the next version of the Kit, and to provide a multi-camera setup. I’ll be honest, a side reason I said yes was the “beach” part 😉

So enough of the reasons, let’s get to what the Open VA version of the Kit looked like. Here’s a fast and loose video I put together of what I envisioned I would use, and following that, a list of the individual pieces of equipment used (the actual implementation was scaled back a bit):


  • Blackmagic Design (BMD) ATEM Television Studio – This is the heart of the system. More about this item below.
  • BMD Hyperdeck Shuttle Pro – records to SSD hard drives in ProRes format
  • Furman Power Conditioner – provides clean power
  • Nady 6-Channel Rack Mount Mixer (not currently used – don’t know if we ever will)
  • Includes 4U Case, AV cables (HDMI & SDI), ethernet cable

Camera Setup

  • Asus HDMI monitor – used for “Multiview” (Program/Preview monitor).
  • Canon Vixia HV40 camera (2) – generally any HDMI camera will do – HDMI cameras are cheaper than SDI models, which start at over $2,000.
  • BMD Mini-Converter HDMI to SDI (2) – we have to use these to get the signal converted into SDI for long runs. HDMI cables over 10’ or so just don’t work in this case.
  • Manfrotto tripods (2) – Model 055XPRO B w/ 701HDV head – classic solid tripods
  • HDMI cables (2) – out from camera into mini-converters.
  • Power Strips (2) – power the cameras and mini-converters.

Other Equipment

  • Backpack – Carry cameras, power supplies, converters, etc.
  • Computer (and software) to control the switcher interface – we use a MacBook Air or Pro, but it can be a Mac or Windows machine. The computer connects via Ethernet to the ATEM switcher.
  • We also have Photoshop on this computer. It provides on-air graphics that can be exported to the ATEM Media Player.
  • Ethernet cable – for above connection.
  • MagSafe Power adapter for Macbook Pro
  • Mac Mini – for “computer source” images (slides, web pages, video, etc.)
  • TP-LINK Wireless router – I set up an ad-hoc network for the ATEM. Various devices can connect to control the switcher, such as iPads, iPhones, other computers.
  • Live Stream Hardware – we use a Teradek VidiU w/ Ethernet cable & HDMI cable. It’s $700, but it makes it so easy to stream to various CDNs like YouTube. Software solutions such as Wirecast also exist.
  • DVI cable – we used this to get a direct view of the Mac Mini; it’s hard to see detail on the Mac in the “Multiview” monitor.
  • Mackie 402VLZ4 4-Channel mixer – great small-footprint mixer that takes room audio in, which we then send out to the cameras.
  • Stereo RCA to 3.5mm – we run this cable from the mixer to the Canon Vixia to provide “system” audio (it runs via HDMI/SDI to the ATEM switcher). Obviously, running individual channels directly in would be better, but that requires separate interface hardware to do analog-to-digital conversion (AES/EBU) for the ATEM Television Studio. A whole other conversation.

The Blackmagic Design ATEM Television Studio is one of those unique pieces of hardware that has almost a cult following, and I think with good reason. It is a 6-input (HD) switcher for less than $1,000. Nothing else can touch that price point. It has so much built into such a small footprint, and the quality is outstanding. Keep in mind when you buy one of these that there are pieces that need to be added on if you get into a complex production, which the multiple inputs beg you to do. If you’re using only one camera, there’s really no need for the ATEM. If you’re using two cameras, and you have them in close proximity to the switcher (using HDMI cables of less than 10 feet), the ATEM is ideal. With more cameras, or cameras at a greater distance from the switcher, you need to use SDI, and hence you need to convert the HDMI camera signal to SDI. The Blackmagic Design HDMI to SDI Mini Converter does the trick here. For about $300 apiece, they turn an inexpensive camcorder into a serviceable production camera. Inexpensive camcorders don’t have the power to send an HDMI signal long distances (again, we’re talking over 10–15 feet), so SDI must be used.

I’ll talk more about how I’m using this kit, as well as future iterations, but here’s an example of one of the live streams at Open VA (featuring several of my DTLT colleagues):

Fun with YouTube Subtitles

weather is a network

I had a bit of unexpected fun yesterday. One of the things (on my long list of things) to explore this summer is closed captioning (subtitling/transcribing) videos and getting a manageable workflow going. As we begin the Fall semester in about 6 weeks, I want to have a plan for implementing transcriptions as part of the many videos that we will begin to produce in the new building (you know, that ITCC thing I keep talking about?). I’m working on that workflow and hope to have recommendations soon.

Meanwhile, I was playing around with the YouTube Closed Caption tool. It looks to be a great way to start the process of getting automatic transcriptions for video, although, as is the subject of this post, it’s not perfect.

But, that’s the beauty of it. Let me show you.

One of the videos that was transcribed, again automatically by YouTube simply by uploading it, was a video on the Domain of One’s Own project. In the video, you’ll recognize some DTLT staff members, Martha Burtis, Ryan Brazell, and Tim Owens, as well as some UMW faculty, Jeff McClurken, Andi Livi Smith, and Sue Fernsebner, and one UMW student, Jack Hylan.

What was particularly entertaining was the transcription service’s attempt to get the terms Domain of One’s Own and LMS correct. On rare occasions it would get them right, spelling out the words “domain of one’s own”, albeit in lower case, and the acronym “LMS”. Mostly, however, it struggled. Here’s where it got entertaining. It seemed to pick on Martha and Jeff the most. First, Domain of One’s Own . . .

two main ones

munich won Tonys

demand of ones own

dominicans out

YouTube’s struggles with LMS (as in Learning Management System) were equally funny.

in relation to the Alamosa


alum ask

As well as saucier versions . . .

alum ass

elem ass

And my favorite . . .

clothes wellsley Almazan

The actual spoken words from the above clip are “closed walls of the LMS”. See, YouTube Closed Captions can even teach you about geographic locations you didn’t know about – Almazán, Spain. And I never knew about its association with Wellesley. Oh, and don’t forget Alamosa, Colorado.

To finish up the fun, there were a couple more transcription errors – one just basically silly, and another one fun in a teenage boy kind of way. First . . .

hammock resume

You can guess what the real spoken words are in this next one . . .

the poop i need to jump through

When it’s all said and done, it is amazing what an accurate job this automatic transcription service does. Anyone who has the task of creating captions for a video might find it to be quite entertaining. I hope the student aides that I assign to this task think so as well.
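And since I mentioned wanting a manageable captioning workflow at the top of this post, one piece of it will likely be a simple cleanup pass over the caption file once YouTube’s automatic transcription has done the heavy lifting. Here’s a minimal sketch in Python – the filenames and the correction list are placeholders I made up, not a finished tool:

    # Minimal sketch: apply a list of known corrections to an auto-generated
    # caption file (SRT and WebVTT are both plain text, so simple string
    # replacement works). Filenames and corrections below are placeholders.
    corrections = {
        "two main ones": "Domain of One's Own",
        "demand of ones own": "Domain of One's Own",
        "alum ask": "LMS",
        "the Alamosa": "the LMS",
    }

    with open("captions_auto.srt", encoding="utf-8") as f:
        text = f.read()

    for wrong, right in corrections.items():
        text = text.replace(wrong, right)

    with open("captions_fixed.srt", "w", encoding="utf-8") as f:
        f.write(text)

Somebody still has to watch the video and build that correction list – which, as you can see above, is half the fun.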