
Recent Blog Posts in June 2013

June 20, 2013
  Will my stream exceed my grasp?
Posted By Tom Coughlin

SMPTE Entertainment Technology in the Internet Age, June 18, 2013: IMF addresses a significant problem in creating alternative versions of a piece of content to meet language, subtitle, and other requirements for various markets. According to Howard Lukk from Disney Studios, up to 35,100 versions of a single piece of content are possible. IMF is a master file format that allows mezzanine-level data compression and stores differences between versions rather than flattened linear versions. This saves storage space and makes management and repurposing of content much easier. Pierre Lemieux of Sandflow Consulting said that IMF stands between the source master (the digital intermediate) and deliverable content for distribution channels.

IMF reuses proven technologies developed for digital cinema. It synchronizes content essence and metadata and provides a composition timeline broken into segments composed of sequences and captions. An Output Profile List (OPL) governs specified transformations of the essence. According to John Hurst of Cinecert, most IMF files are in XML format.
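
As a rough sketch of the idea only (the element names below are invented for illustration and do not follow the real IMF CPL schema), a composition can be modeled as a timeline of segments whose sequences reference shared essence by ID, so a regional version stores only what differs:

```python
import xml.etree.ElementTree as ET

# Toy model of an IMF-style composition: segments contain sequences that
# reference shared essence by ID. Element names are invented for this sketch;
# the actual IMF Composition Playlist schema differs.
def build_composition(title, segments):
    root = ET.Element("Composition", {"title": title})
    for seg in segments:
        seg_el = ET.SubElement(root, "Segment")
        for seq in seg:
            ET.SubElement(seg_el, "Sequence",
                          {"essence": seq["essence"], "track": seq["track"]})
    return root

# A hypothetical French version reuses the same picture essence and swaps
# only the audio track, instead of storing a second flattened master.
fr = build_composition("Feature_FR", [
    [{"essence": "picture_main", "track": "image"},
     {"essence": "audio_fr", "track": "audio"}],
])
print(ET.tostring(fr, encoding="unicode"))
```

The saving comes from the fact that "picture_main" is stored once and referenced by every language version.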

Content delivery over the Internet is increasingly popular, but that popularity has dangers. Mark Watson of Netflix says 84% of their customers stream video at least once per week. YouTube has 13 billion videos, with an average user viewing 401 minutes/month. Internet video traffic is now the majority of bits transferred over the Internet, and without new compression and delivery technologies video streaming could "break" the Internet, especially with even larger 4K video files on the horizon to feed new high-resolution consumer TVs.

Dynamic Adaptive Streaming over HTTP (DASH) is one important element in conserving bandwidth assets. DASH features seamless adaptive streaming of content. Will Law from Akamai and Jesse Rosenzweig from Elemental spoke about DASH. Nine companies are providing DASH players today, and the speakers believed that Adobe and Microsoft would switch to DASH in the future, leaving Apple's as the only remaining proprietary streaming format.
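
A minimal sketch of the adaptive idea behind DASH (the bitrate ladder values here are invented for illustration): the client measures recent throughput and picks the highest representation it can sustain for the next segment, dropping down a rung when bandwidth tightens:

```python
# Hypothetical bitrate ladder in kbit/s; a real DASH manifest advertises
# the available representations and the player chooses among them.
LADDER = [400, 1200, 2500, 5000]

def choose_bitrate(throughput_kbps, margin=0.8):
    """Pick the highest rung that fits within a safety margin of the
    measured throughput; serve the lowest rung when bandwidth is scarce."""
    budget = throughput_kbps * margin
    candidates = [r for r in LADDER if r <= budget]
    return max(candidates) if candidates else LADDER[0]

print(choose_bitrate(4000))  # headroom for the 2500 kbit/s rung
print(choose_bitrate(300))   # below the lowest rung: still serve something
```

Real players add smoothing and buffer-occupancy rules on top of this, but the rung-selection step is the core of "seamless adaptive streaming."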

New compression technology will also help control Internet traffic, particularly for 4K content. One conference participant told me that MPEG H.265 encoding (which promises up to 50% additional compression beyond H.264) requires 2-3X additional overhead for decoding (at the user). However, the processing load at the source is much greater: to get the best-quality delivery content, about 100X more overhead is required.
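
The bandwidth implication is easy to sketch (the bitrates below are illustrative assumptions, not measured figures): halving the bitrate at equal quality doubles the number of streams a fixed link can carry:

```python
# Illustrative numbers only: assume a 4K stream needs ~30 Mbit/s in H.264
# and that H.265/HEVC delivers similar quality at ~50% of that bitrate.
h264_4k_mbps = 30.0
hevc_saving = 0.5
hevc_4k_mbps = h264_4k_mbps * (1 - hevc_saving)

link_mbps = 300.0  # a hypothetical distribution link
streams_h264 = int(link_mbps // h264_4k_mbps)
streams_hevc = int(link_mbps // hevc_4k_mbps)
print(hevc_4k_mbps, streams_h264, streams_hevc)  # 15.0 10 20
```

That doubling is what buys the headroom for 4K; the cost is the 100X encode-side processing the participant described.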

IMF and DASH provide a framework for digital content delivery over the Internet. Combined with HEVC compression (MPEG H.265), these technologies pave the way for increases in video streaming in a world where many people don't want to physically possess or even have a local copy of content. These technologies can satisfy that customer demand for more content without exceeding available bandwidth.

Tom Coughlin, the founder of Coughlin Associates, has over 30 years of magnetic recording engineering and engineering management experience developing flexible tape, floppy disc, and rigid disk storage at such companies as Polaroid, Seagate Technology, Maxtor, Micropolis, Nashua Computer Products, Ampex and SyQuest.


June 20, 2013
  SMPTE and the Internet
Posted By Tom Coughlin

The SMPTE and Stanford Entertainment Technology in the Internet Age conference, June 18-19, 2013, included about 300 attendees interested in all aspects of the critical role that the Internet is playing in the media and entertainment industry. The sessions covered many topics related to connected media. On the first day these included Content Creation for the Internet: New Tools and Concepts; Flash Forward: How HTML5 and Canvas Will Become the Next Interactive Screen for Web Media; Future File Formats for Entertainment Media: What Are the Tech Trends and Implications for Internet Distribution?; Gaming, Entertainment and the Internet; Internet Media Delivery Formats: A DASH to the Races?; Next Generation Content in the Cloud: UltraViolet; and Mobile Internet Media: Content on the Go!

Content creation and distribution using the Internet was an important theme at the conference. Consumer interactivity and choice are increasingly important in the more open online world. New cloud-based technologies have removed the barriers to greater participatory entertainment in the future. Traditional laid-back entertainment has its place, but there are new models that include social interaction with content that are changing the nature of entertainment. Games (whether for entertainment or business) are also an important element in the growth of online entertainment and in many ways have prepared the ground for the development of second-screen and other interactive media.

Ann Greenberg from Sceneplay pointed out that fans and artists are more connected than ever before. Sceneplay allows users to be "co-producers" using micro-metadata that adds intelligence to scripts. Metadata capture and management is an important element in combining content from disparate sources. Carl Rosendahl from the CMU Extension Entertainment Technology Center in Mountain View, CA, has about 20 students per semester developing interactive technology for games and other entertainment.

Peter Hirshberg from Enterprise Marketing showed some interesting video. He was involved with Bill Gates in producing a video to accompany Gates's 1995 book, The Road Ahead. Peter said that the video (and book) missed important trends such as the Internet, Long Tail Content, the Ascendant Audience, Open Systems and Social Media. He pointed out that TV will not make a return to high viewership without the help of other media; note recent Netflix deals with studios. Personal communications, such as Twitter, give real-time metrics and are displacing traditional measurement methods such as those from Nielsen. He said that in the future you won't just watch television; television will also watch you. He pointed out patents that use cameras and other sensors built into TVs to observe a viewer in order to tailor advertising to the viewer.

Social media also leads to broader game activity, such as a Grand Central Game with global financial simulation. New technologies also allow unique ways to communicate, such as writing on water streams using water jets controlled by inkjet-printer-like technology, and airborne helicopter drones from MIT with lights on them that can be flown in tandem and controlled by individuals to create collective art and communication. An activity called "Conspiracy for Good" in Europe, supported by Nokia, involved 130 people in 5 countries in a three-month-long social-benefit storytelling alternate reality game.

Making video content interactive and game-like opens up entirely new possibilities for entertainment and for people to work together in new ways. Clearly this is an area that will develop much further in the future and is a great example of how the Internet is changing our interaction with the world around us and, as a consequence, changing the nature of media itself.

Tom Coughlin, the founder of Coughlin Associates, has over 30 years of magnetic recording engineering and engineering management experience developing flexible tape, floppy disc, and rigid disk storage at such companies as Polaroid, Seagate Technology, Maxtor, Micropolis, Nashua Computer Products, Ampex and SyQuest.

June 17, 2013
  Thoughts on the new Mac Pro
Posted By Larry Jordan
Last week, Apple gladdened the hearts of power users everywhere by providing a "sneak peek" at the new Mac Pro. Stylish, diminutive, and blindingly fast - at least according to the specs provided by Apple. Since that time, I've been thinking a lot about a system that is directly targeted to meet the performance needs of video editors, and other power users.

First, keep in mind that this was a "Sneak Peek" - a tantalizing glimpse of what is coming in the future, not a formal product launch. (This is similar to what Apple did a couple years ago when they provided an "advanced look" at Final Cut Pro X at the 2011 NAB SuperMeet.) Consequently, while this "peek" provided an overview, it was intentionally sparse in providing details. Partly, I suspect, because Apple wants to gather feedback from potential users before nailing down the final specs.


One of the key things I realized was that this system is envisioned to be highly configurable. Just as the current Mac Pro has a wide variety of options for RAM, GPU, storage, and connectivity, the new unit will be highly customizable as well.

If you think about it, the current Mac Pro is the most customizable system that Apple makes. Configuration is at the heart of the new Mac Pro as well. While I expect that there will be one physical unit, we will have a lot of choices about what goes into that unit.

This also means that we will see a variety of price points, depending upon how each system is configured. In this regard, the new Mac Pro is identical to the current Mac Pro.


Also keep in mind that Apple views Thunderbolt as more than a fast way to move data to and from a hard disk. Apple considers Thunderbolt a direct connection to the PCI bus of the computer, able to deliver up to 20 Gb/second of data. Think of Thunderbolt as a direct line connecting the PCI bus to the expansion chassis of your choice.

NOTE: According to a reader, Intel is claiming Thunderbolt 2 throughput of about 1.6 GB/second, which is still very fast.
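
The two figures are consistent once you account for units and protocol overhead (the efficiency factor below is an assumption for illustration): 20 Gb/second is the raw signaling rate in bits, while ~1.6 GB/second is delivered payload in bytes:

```python
# Convert Thunderbolt 2's raw 20 Gb/s (bits) to GB/s (bytes), then apply an
# assumed efficiency factor to approximate the ~1.6 GB/s delivered figure.
raw_gbps = 20.0
raw_gigabytes = raw_gbps / 8   # 2.5 GB/s theoretical maximum
efficiency = 0.64              # assumed protocol/encoding overhead, for illustration
delivered = raw_gigabytes * efficiency
print(raw_gigabytes, delivered)
```

So the reader's 1.6 GB/second number is not a contradiction of Apple's 20 Gb/second; it is roughly the usable share of the same pipe.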

For most people, a fast computer coupled with lots of RAM and a really fast storage system will be all they need. In fact, Philip Hodgetts has written that more than 80 percent of Mac Pro users don't have any PCI cards in their system, aside from the graphics card. For those users, the new Mac Pro fits their needs for raw power without adding tons of unneeded expansion slots.

NOTE: We used to think of PCIe card performance in terms of the number of "lanes" they used to connect to the motherboard. There were four, eight, and sixteen lane cards. The more lanes, the faster the potential communication speed between card and bus. With Thunderbolt, Apple is moving away from the concept of lanes, to straight data transfer speeds.

Thunderbolt 2 is fully backward-compatible with the original Thunderbolt. Thunderbolt devices can be connected by either copper or optical cables. Copper cables can be up to 3 meters in length (about 10 feet). Optical cables can extend up to 100 meters, for users that want to store their computers or RAIDs in a machine room for security, noise, or air conditioning reasons. Currently, optical cable lengths of 10, 20, and 30 meters are available on the market.

For users that need to expand the capabilities of their computer, for example with DSP audio cards, video ingest and capture cards, mini-SAS or eSATA cards, or more graphics cards, a very real question becomes "how many card slots should the computer hold?" Apple felt that picking any number of internal card slots would be limiting to some number of users. By moving all expansion cards outside the box, then connecting with the very high-speed Thunderbolt 2 data bus, Apple essentially provided a virtually unlimited number of card slots for users that need the maximum in expandability.

NOTE: As a sidelight, one Thunderbolt 2 connection provides sufficient data bandwidth to ingest uncompressed 4K images, or output video to a 4K video monitor, or support VGA, DVI, and DisplayPort computer monitors. Plus Apple put an HDMI port on the back of the Mac Pro just for good measure.

Already, ATTO and Sonnet, along with others, are offering Thunderbolt-to-"X" converter boxes: mini-SAS, FibreChannel, eSATA, Ultra-SCSI. And vendors such as AJA, Blackmagic Design, and Matrox offer ingest and monitoring options connected via Thunderbolt.

The one missing piece is the lack of high-speed Thunderbolt-native RAID 5 storage systems, with the notable exception of Promise. There are plenty of two-drive RAID 0 and RAID 1 systems, but very, very few 5 to 10 drive RAID 5 systems, which we editors need the most. I've heard lots of rumors of what's causing the problem. Without pointing fingers, I hope this bottleneck gets resolved quickly.


We also need to consider that this is a system and not focus on one single element. The new CPU is twice as fast as the current Mac Pro in floating point operations. Memory bandwidth has doubled and now supports four channels of communication between RAM and the CPU.

The big news, though, was the addition of multiple GPUs. Although the AMD FirePros were featured, I suspect other options will also be available as part of the customization options Apple offers at launch.

NOTE: In terms of Final Cut Pro X, the GPUs determine performance for rendering effects, real-time playback of multiple layers, optical flow retiming, exporting, and video output to external monitors.

Here, things get interesting.

On Monday, Apple made a point of saying that a new version of Final Cut Pro X would be released to support the Mac Pro. That instantly made me think that all applications would need to be rewritten in order to run on the Mac Pro, which would make this new system a non-starter.

This is not the case.

Instead, think of the dual GPUs in the Mac Pro as similar to when Apple released multi-processor Macs. All applications would run on a multi-processor system, but until they were rewritten to support multi-threading (the technique software uses to take advantage of more than one processor), an application was limited to using only one processor. This was one of the big limitations of Final Cut Pro 7.
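
The analogy can be sketched in a few lines (a toy example for illustration, not anything from Final Cut): the same workload produces the same result on one worker or two, but it only uses the second worker if the program is written to divide the job:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for a renderable workload: sum per-frame "costs".
def render_chunk(frames):
    return sum(f * f for f in frames)

frames = list(range(1000))

# Single-worker path: the whole job on one processor, like a non-threaded app.
single = render_chunk(frames)

# Multi-worker path: the same job explicitly split in two, the way a
# multi-threaded (or dual-GPU-aware) app must be structured.
with ThreadPoolExecutor(max_workers=2) as pool:
    parallel = sum(pool.map(render_chunk, [frames[:500], frames[500:]]))

print(single == parallel)  # True: identical result, different scheduling
```

The point is structural: the split does not happen automatically, which is why software must be reworked before a second GPU (or processor) does any good.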

So, the Mac Pro will run all current Mac software. However, if the software wants to take advantage of the dual GPUs, it may need to be reconfigured to do so. This is not a small task for developers, but it isn't impossible. This is what Apple was referring to when they said a new version of Final Cut Pro X would be released to support the Mac Pro.

NOTE: Once developers know they can count on dual GPUs, they can design new software from scratch to take advantage of them, the way that everyone writes software today to take advantage of multiple processors and multiple cores.

UPDATE: A reader points out: "When using OpenCL, no code modification is required (a problem only for devs who don't use OpenCL). Some use the CUDA API (Nvidia), and this requires re-coding."

UPDATE: Another reader points out that the next versions of Adobe Premiere and After Effects already support OpenCL.

And the performance results of optimizing for dual GPUs can be astounding. Grant Petty, CEO of Blackmagic Design, tweeted earlier this week that they have been testing Resolve 10 on the new Mac Pro and it "screams."


Apple designed the Mac Pro as its most powerful and flexible desktop computer. They architected it to reflect where they see computers going for the next ten years. They provided a wealth of Thunderbolt ports - and converters - so that all legacy monitors, storage, and cards can be supported.

This has the potential to be an amazing piece of gear and I can't wait to learn more at the launch.

As always, I'm interested in your thoughts.

Larry Jordan is a producer, director, editor, author, and Apple Certified Trainer with more than 35 years' experience. Based in Los Angeles, he's a member of the Directors Guild of America and the Producers Guild of America. Visit his website at
