
Yesterday (11/30/2010) Adobe announced the beta release of Flash Player 10.2 for Windows, Mac, and Linux.  This update introduces some key enhancements in the area of video playback, including a new API known as Stage Video that dramatically improves performance for HD content delivery, an API that enables native custom mouse cursors, and support for full-screen playback while using multiple displays.

Stage Video

Perhaps the biggest change comes with the introduction of the Stage Video API, which provides an alternative to rendering video through a Video object in the display list.  Video can now be rendered onto a flash.media.StageVideo object, which is created by Flash Player and composited through the GPU.  When playing video that has been encoded to the H.264 specification, and thus optimized for GPU acceleration, Flash Player 10.2 can decode and render video entirely on the GPU without sending that data to the CPU for processing, dramatically decreasing CPU load.

This reduction in processing demand means that higher quality video (read: higher bitrates and framerates) can now be rendered successfully by less powerful machines.  This is a huge plus for mobile devices, set-top boxes, and TVs, which typically have powerful video rendering capabilities but lack the CPU power that a desktop computer may have.  Indeed, sites like YouTube are already preparing to support Stage Video, and Google TV currently supports Stage Video on its set-top devices.

Content providers can still use their existing content with Stage Video; there’s no need to re-encode video assets after implementing the Stage Video API in an application.  More information on how developers can incorporate Stage Video into their sites is available here.
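For a sense of what this looks like in code, here is a minimal, hypothetical ActionScript 3 sketch of the pattern Thibault Imbert’s article (referenced below) describes: wait for the player to report Stage Video availability, then attach a NetStream to a StageVideo object. The class name, video path, and viewport size are placeholders, not values from the original article.

package
{
        import flash.display.Sprite;
        import flash.events.StageVideoAvailabilityEvent;
        import flash.geom.Rectangle;
        import flash.media.StageVideo;
        import flash.media.StageVideoAvailability;
        import flash.net.NetConnection;
        import flash.net.NetStream;

        public class StageVideoSketch extends Sprite
        {
                private var _ns:NetStream;

                public function StageVideoSketch()
                {
                        // Wait for Flash Player to report whether Stage Video is available
                        stage.addEventListener( StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvailability );
                }

                private function onAvailability( event:StageVideoAvailabilityEvent ):void
                {
                        if ( event.availability == StageVideoAvailability.AVAILABLE && stage.stageVideos.length > 0 )
                        {
                                var nc:NetConnection = new NetConnection();
                                nc.connect( null ); // progressive playback, no server connection

                                _ns = new NetStream( nc );
                                _ns.client = { onMetaData: function( info:Object ):void {} };

                                // Attach the stream to the GPU-composited StageVideo
                                // instead of a display-list Video object
                                var sv:StageVideo = stage.stageVideos[0];
                                sv.viewPort = new Rectangle( 0, 0, 640, 360 ); // placeholder size
                                sv.attachNetStream( _ns );
                                _ns.play( "video.mp4" ); // placeholder path
                        }
                        // Otherwise, fall back to a regular Video object in the display list
                }
        }
}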

Demo For Flash Player 10.2 Beta From Adobe Labs

Here’s a quick demo taken from Adobe Labs:

Install Flash Player 10.2 beta to preview Stage Video hardware acceleration in the demos below. We’ve found the beta to be stable and ready for broad testing, but keep in mind this is a sneak peek, and not everything will be fully baked just yet.

Click on the thumbnail below for a simple example of Stage Video using the example code described in Thibault Imbert’s Stage Video developer article Getting Started with the StageVideo API.

(c) copyright 2008, Blender Foundation / www.bigbuckbunny.org

The Stage Video API for advanced GPU acceleration, the ability for developers to implement custom, native mouse cursors in their applications, Internet Explorer 9 GPU support, enhanced text rendering capabilities, and full-screen support while using multiple displays all add up to a solid set of improvements in the Flash Player 10.2 beta.

The GTrack OSMF Plugin

The GTrack plugin was built as an example of an OSMF proxy plugin. It can send page tracking and event tracking for an OSMF MediaElement, and it uses the gaforflash library to send the tracking data to Google Analytics.

Page tracking is per MediaElement: when the MediaElement is loaded and begins to play, the URL of the resource is sent as a pageView to Google Analytics. You can also configure the value that is sent by adding metadata to the URLResource for the MediaElement. An example of this can be found in the Configuration section.

Events are sent to Google Analytics based on an XML configuration. Each of the MediaElement’s main traits can be configured to send a tracking event.

The GTrack plugin determines which tracking should be sent and what values are sent to Google Analytics via XML configuration.

The <account> node specifies the Google Analytics account to associate the tracking with. You can specify multiple <account> nodes to send tracking to multiple accounts. The value for this node can be found in your Google Analytics account and should look similar to ‘UA-1234567-1’.

Example:

        <account><![CDATA[UA-1782464-4]]></account>
        <account><![CDATA[UA-1782464-5]]></account>

The <url> node is the URL that was set as the profile URL to be tracked by Google Analytics.

Example:

        <url><![CDATA[http://osmf.realeyes.com]]></url>

The <event> node is what defines the tracking that will be sent to your Google Analytics account. The ‘name’ attribute of the node is the key that tells the GTrack plugin to send an event, so the names must match exactly. There are multiple types of events that can be tracked:

Example:

        <event name="percentWatched" category="video" action="percentWatched">
                <marker percent="0" label="start" />
                <marker percent="25" label="25PercentView" />
                <marker percent="50" label="50PercentView" />
                <marker percent="75" label="75PercentView" />
        </event>

This configuration example will track the start, 25, 50, and 75 percent markers as the media item is played. Completion is tracked by the complete event; see MediaElement Events below.

Example:

        <event name="timeWatched" category="video" action="timeWatched">
                <marker time="5" label="5sec" />
                <marker time="10" label="10sec" />
                <marker time="20" label="20sec" />
        </event>

This example will send tracking at 5, 10, and 20 seconds respectively.

MediaElement events are based on the MediaElement’s available traits. If the MediaElement supports a specific trait and there is an event that can be associated with that trait, tracking can be defined for it. Example:

        <event name="complete" category="video" action="complete" label="trackingTesting" value="1" />

This example will send tracking when the MediaElement has completed playing.

The <updateInterval> node defines the interval at which the GTrack plugin checks the current time and position of the currently playing MediaElement to determine when to send the time- and/or percentage-based tracking.

The <debug> node is not currently used but is planned to be implemented as a custom logging & debugging feature.

The node attributes (except for the name attribute) correspond to the tracking values defined in the Google Analytics Tracking API for Event Tracking:

  • category: String – The general event category (e.g. “Videos”).
  • action: String – The action for the event (e.g. “Play”).
  • label: String – An optional descriptor for the event.
  • value: Int – An optional value associated with the event. You can see your event values in the Overview, Categories, and Actions reports, where they are listed by event or aggregated across events, depending upon your report view.
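To make the mapping concrete, here is a hedged sketch of how a configured event might translate into gaforflash calls. The tracker setup shown is illustrative only and is not how the plugin is implemented internally; the account ID is the sample value from above.

import com.google.analytics.AnalyticsTracker;
import com.google.analytics.GATracker;

// Create a tracker for one of the configured <account> values
var tracker:AnalyticsTracker = new GATracker( this, "UA-1782464-4", "AS3", false );

// Page tracking: the resource URL, or the configured pageURL metadata value
tracker.trackPageview( "AnalyticsTestVideo" );

// Event tracking: category, action, label, and value from an <event> node
tracker.trackEvent( "video", "complete", "trackingTesting", 1 );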

Sample XML configuration:

<value key="reTrackConfig" type="class">
        <!-- Set your analytics account ID -->
        <account><![CDATA[UA-1782464-4]]></account>

        <!-- You can track to multiple analytics accounts by adding additional <account /> nodes -->
        <!-- <account><![CDATA[{ADDITIONAL_GA_ID}]]></account> -->

        <!-- Set the url that you registered with your GA account -->
        <url><![CDATA[http://osmf.realeyes.com]]></url>

        <!-- Set up the percent based tracking -->
        <event name="percentWatched" category="video" action="percentWatched">
                <marker percent="0" label="start" />
                <marker percent="25" label="view" />
                <marker percent="50" label="view" />
                <marker percent="75" label="view" />
        </event>

        <!-- Set up the event tracking for the completed event -->
        <event name="complete" category="video" action="complete" label="trackingTesting" value="1" />

        <!-- Set up page view tracking -->
        <event name="pageView" />

        <!-- These are the other available events that can be tracked -->
        <!--
        <event name="autoSwitchChange" category="video" action="autoSwitchChange" />
        <event name="bufferingChange" category="video" action="bufferingChange" />
        <event name="bufferTimeChange" category="video" action="bufferTimeChange" />
        <event name="bytesTotalChange" category="video" action="bytesTotalChange" />
        <event name="canPauseChange" category="video" action="canPauseChange"  />
        <event name="displayObjectChange" category="video" action="displayObjectChange"  />
        <event name="durationChange" category="video" action="durationChange"  />
        <event name="loadStateChange" category="video" action="loadStateChange"  />
        <event name="mediaSizeChange" category="video" action="mediaSizeChange"  />
        <event name="mutedChange" category="video" action="mutedChange"  />
        <event name="numDynamicStreamsChange" category="video" action="numDynamicStreamsChange"  />
        <event name="panChange" category="video" action="panChange"  />
        <event name="playStateChange" category="video" action="playStateChange"  />
        <event name="seekingChange" category="video" action="seekingChange"  />
        <event name="switchingChange" category="video" action="switchingChange"  />
        <event name="traitAdd" category="video" action="traitAdd" />
        <event name="traitRemove" category="video" action="traitRemove"  />
        <event name="volumeChange" category="video" action="volumeChange" />
        <event name="recordingChange" category="dvr" action="recordingChange" />
        -->
        <!-- Time based tracking (in seconds)-->
        <!--                            
        <event name="timeWatched" category="video" action="timeWatched">
                <marker time="5" label="5sec" />
                <marker time="10" label="10sec" />
                <marker time="20" label="20sec" />
        </event>
        -->
        <debug><![CDATA[true]]></debug>
        <!-- How often you want the timer to check the current position of the media (milliseconds) -->
        <updateInterval><![CDATA[250]]></updateInterval>
</value>

To send a custom page tracking value for a MediaElement instead of its URL:

  1. Add the <event> node to the XML configuration:

        <event name="complete" category="video" action="complete" label="trackingTesting" value="1" />

  2. Add a MetadataValue to the URLResource object for the MediaElement:

        var resource:URLResource = new URLResource( PROGRESSIVE_PATH );
        resource.addMetadataValue( GTRACK_NAMESPACE, {pageURL:"AnalyticsTestVideo"} );

  3. Create the MediaElement:

        var element:MediaElement = mediaFactory.createMediaElement( resource );

This will track the String ‘AnalyticsTestVideo’ instead of the URL of the media resource.

To load the GTrack plugin and wire up tracking in an OSMF player:

  1. Set up the player:

        protected function initPlayer():void
        {
                mediaFactory = new DefaultMediaFactory();
                player = new MediaPlayer();
                container = new MediaContainer();
                this.addChild( container );
                loadPlugin( {PATH_TO_GTRACK_PLUGIN_SWF} );
        }

  2. Set up a loadPlugin() method.
  3. Create a URLResource that points to the GTrackPlugin.swf:

        var pluginResource:MediaResourceBase = new URLResource( {PATH_TO_GTRACK_PLUGIN_SWF} );

  4. Add the XML configuration to the pluginResource as a metadata value – ‘gTrackPluginConfigXML’ is an XML variable:

        pluginResource.addMetadataValue( "http://www.realeyes.com/osmf/plugins/tracking/google", gTrackPluginConfigXML );

  5. Listen for the plugin load events:

        mediaFactory.addEventListener( MediaFactoryEvent.PLUGIN_LOAD, onPluginLoaded );
        mediaFactory.addEventListener( MediaFactoryEvent.PLUGIN_LOAD_ERROR, onPluginLoadFailed );

  6. Load the plugin:

        mediaFactory.loadPlugin( pluginResource );

  7. The complete loadPlugin() method should look something like:

        private function loadPlugin( source:String ):void
        {
                // Create the plugin resource
                var pluginResource:MediaResourceBase = new URLResource( source );

                // Add the configuration data as metadata to the pluginResource
                pluginResource.addMetadataValue( GTRACK_NAMESPACE, gTrackPluginConfigXML );

                // Set up the plugin listeners
                mediaFactory.addEventListener( MediaFactoryEvent.PLUGIN_LOAD, onPluginLoaded );
                mediaFactory.addEventListener( MediaFactoryEvent.PLUGIN_LOAD_ERROR, onPluginLoadFailed );

                // Load the plugin
                mediaFactory.loadPlugin( pluginResource );
        }

  8. Once the plugin is loaded, remove the plugin listeners and load the media:

        protected function onPluginLoaded( event:MediaFactoryEvent ):void
        {
                // Remove the plugin listeners
                mediaFactory.removeEventListener( MediaFactoryEvent.PLUGIN_LOAD, onPluginLoaded );
                mediaFactory.removeEventListener( MediaFactoryEvent.PLUGIN_LOAD_ERROR, onPluginLoadFailed );

                // Create the media resource
                var resource:URLResource = new URLResource( PROGRESSIVE_PATH );

                // Set up the page tracking
                resource.addMetadataValue( GTRACK_NAMESPACE, { pageURL:"AnalyticsTestVideo" } );

                // Create & set the MediaElement
                var element:MediaElement = mediaFactory.createMediaElement( resource );
                player.media = element;
                container.addMediaElement( element );
        }

http://code.google.com/p/reops/source/browse/#svn/trunk/plugins/tracking/google/GTrackPlugin

Contact RealEyes

For more information about OSMF plug-ins, or to inquire about custom plug-in development:

Flash Media Server 4 Released

Posted on September 09, 2010 at 10:37 am in Development, Media Solutions, Strategic Consulting

Adobe announced today that Flash Media Server 4 is now available for trial and purchase.  In its most basic form, Flash Media Server 4 (FMS 4) is an advanced media delivery solution for those looking to leverage the advantages of dynamically streamed video over progressive download.  However, the FMS 4 family is capable of much more than that.  Aside from the free Flash Media Development Server, which is used to develop new applications for Flash Media Interactive Server as well as run very low-volume streaming applications, FMS 4 comes in three flavors:

  • Flash Media Streaming Server
  • Flash Media Interactive Server
  • Flash Media Enterprise Server

All three versions offer new features beyond what was available in FMS 3.5, such as:

Full 64-bit support – Improved use of server resources through support for 64-bit processors, and installation on a wider range of platforms. FMS 4 now supports CentOS 5.3, Red Hat Enterprise Linux 5.3, and Windows Server 2008.

Enhanced buffer performance – You can now take advantage of Flash Player 10.1’s ability to interactively access media held in the buffer, which allows for actions such as fast motion, slow motion, etc.

Live HTTP Dynamic Streaming – In addition to streaming content via RTMP/RTMPE, FMS 4 can now take advantage of the industry-standard HTTP protocol while still enjoying the quality-of-service features provided by dynamic streaming. FMS 4 also allows for the addition of DRM protection with Flash Access 2.

Faster switching with RTMP Dynamic Streaming – FMS 4 provides improved adaptive bitrate delivery, giving your end users seamless video playback regardless of their bandwidth stability.
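As a rough illustration of what dynamic stream switching looks like from the client side, the sketch below uses Flash Player’s NetStream.play2() API to request a transition between renditions. The stream names are placeholders, and ns is assumed to be a NetStream already connected to an FMS application.

import flash.net.NetStream;
import flash.net.NetStreamPlayOptions;
import flash.net.NetStreamPlayTransitions;

// Describe the switch from the current rendition to a higher-bitrate one
var options:NetStreamPlayOptions = new NetStreamPlayOptions();
options.oldStreamName = "mp4:video_800kbps"; // placeholder stream names
options.streamName = "mp4:video_1500kbps";
options.transition = NetStreamPlayTransitions.SWITCH;

// Request a seamless, server-assisted switch on the existing NetStream (ns)
ns.play2( options );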

The Three Flavors of FMS 4

Flash Media Streaming Server 4

  • Designed as an affordable step up from progressive download video delivery.
  • Faster dynamic switching, HTTP dynamic streaming, and server-side Access C++ plug-ins to enable more secure communication with Adobe Flash Media Live Encoder 3.1.
  • $995

Flash Media Interactive Server 4

  • Takes advantage of new IP multicast to maximize network efficiencies.
  • Enhanced multi-user experiences, such as chat, VoIP, video overlays, server-side playlists, and server-side recording.
  • $4,500

Flash Media Enterprise Server 4

  • Utilizes the new RTMFP (Real Time Media Flow Protocol) to allow for peer-to-peer assisted networking, drastically increasing network efficiency by leveraging peer-to-peer communication that is not routed through a server.
  • Takes advantage of both IP multicast and application multicast.
  • Call for pricing (303-862-8611)


With the release of FMS 4, the possibilities for creating interactive, seamless video delivery experiences have never been more promising. By taking advantage of the many features available from the Flash Media Server family, as well as from the production tools provided in the OSMF, Creative Suite, etc., media content providers are now in a great position to make exciting advances in the area of media content delivery.

  • For more information about the Flash Media Server family, or to inquire about purchasing options, please call 303-862-8611.

Previously we introduced three media players built using the Open Source Media Framework (OSMF): the RealEyes OSMF Sample Player (REOPS), the Strobe Media Playback component, and the Flash Media Playback component.  All three of these players offer various degrees of customization, with REOPS and Strobe being the most flexible.  However, even though the Flash Media Playback component is a pre-built player hosted by Adobe, it can still be customized in several ways to fit your needs.  In this article we discuss what it takes to skin the Flash Media Playback component with some custom bitmap images and an XML file that directs the player to those images.

Skinning the Player

First, you need to determine which elements of the player’s user interface you’d like to modify. As expected, certain visual elements in the player will have different appearances in different states: a play button, for example, will have default, over, and down states, and you should consider this when creating your custom bitmap images.

Next, you’ll need to get the ID for the element that you wish to change. A comprehensive listing of the editable user interface elements, along with their default sizes and descriptions, can be found here. I’m choosing to modify the play button overlay, and the IDs for this element and its states are as follows:

  • playButtonOverlayNormal
  • playButtonOverlayDown
  • playButtonOverlayOver

I’ve created my own custom bitmap images to replace the default play button overlay for each of its states.  Images must be saved as either JPEG, GIF, or PNG.

From here you can choose between two methods for telling your player where to look for the image files. You can set a FlashVar skin variable in your page’s HTML code that contains the path to the image, or you can use an XML configuration file. I chose the latter, and my XML looks like this:

<skin>
<element id = "playButtonOverlayNormal" src = "http://www.realeyes.com/assets/PlayNormal.png"/>
<element id = "playButtonOverlayOver" src = "http://www.realeyes.com/assets/Play_Over.png"/>
<element id = "playButtonOverlayDown" src = "http://www.realeyes.com/assets/Play_Down.png"/>
</skin>

From here, I simply uploaded my XML file and the three custom bitmap images to the same directory on a web server.  I then used the Flash Media Playback setup configurator page to automatically generate HTML code that directs my player to use these new images. In the configurator’s “advanced” section, I entered the URL of the uploaded XML file where it asks for the “skinning file location” (not where you would enter a “configuration file location”).  Clicking Preview at this point should show you the player with your new skin applied. I copied the HTML that the setup page generated for me and pasted it into this page. That’s it!

Flash Media Playback Component With Custom Play Button Overlay Applied

In a previous post, we introduced you to the Open Source Media Framework (OSMF) created by Adobe, Akamai, et al., a new, optimized media delivery platform based on the Open Video Player project.  Born from a desire to create a common platform for media playback, advertising, customizable branding experiences, analytics, and more, OSMF is a Flash-based solution that addresses the many challenges of bringing media content to the web.

Developers can benefit from OSMF’s free and open-source code base by leveraging its pre-built, best-practices-based components to quickly build their media delivery solutions. An important advantage for developers using OSMF is that they can spend less time creating the player itself, and more time perfecting the user experience by taking advantage of customizable skinning, plug-in implementation, and so on.

Those interested in creating their own custom media players built with OSMF should consider the following two projects:

The RealEyes OSMF Player Sample (REOPS)

RealEyes Media has developed a sample player based on OSMF with a very extensible control bar skinning solution, full screen support, Closed Captioning from an external file, and OSMF dynamic plugin support. REOPS can be used to develop media players that support progressive video playback, video on demand streaming, as well as live and dynamic streaming. Read More

An example of the RealEyes OSMF Player in action

Strobe Media Playback

Strobe Media Playback is an OSMF-based media player available free as a compiled SWF along with source code.  Strobe Media Playback, like OSMF, supports progressive download, RTMP and live streaming, HTTP dynamic streaming, as well as content protection with Adobe® Flash® Access™ 2.0. Read More

Those interested in a quick, easy-to-use solution for including media assets in a blog, website, etc. should consider using Flash Media Playback.

Flash Media Playback

Flash Media Playback is a free media player from Adobe based on OSMF.  This player, unlike REOPS and Strobe Media Playback, is hosted on Adobe’s servers.  Simply provide the player with the location of your media asset, assuming it’s on a web server, and Flash Media Playback will take care of the rest.

Configuration of the Flash Media Playback component can be achieved easily by using the Flash Media Playback Setup configuration site. This site will automatically generate HTML code based on the parameters you choose to edit, which can then be pasted into your web page.  The only two parameters that you must provide are your media asset’s URL and the dimensions of the media player (the default size is 470 x 320). Read More

Flash Media Playback example

This video player was easily added to the page by simply pasting in the generated HTML code from the Flash Media Playback Setup page.

OSMF provides developers and content providers alike with exciting new opportunities for delivering their media content to the web. In future installments, we will explore more of the features and possibilities available to you from the Open Source Media Framework.

RealEyes Media, VideoPress & OSMF

Posted on August 19, 2010 at 12:39 pm in Development, Media Solutions

The VideoPress media player – built using OSMF & REOPS

When VideoPress decided to rebuild their media player, they chose to do it using the Open Source Media Framework (OSMF). By leveraging the work done by RealEyes Media on the REOPS project, the VideoPress media player provides a robust media playback control that can be easily implemented. In addition to the built-in features of the player (media control, sharing, etc.), extending the player through the OSMF plug-in architecture allows for partner integrations, such as stats and CDN integration, as well as additional functionality from other plug-in providers.

What is OSMF?

Open Source Media Framework (OSMF) simplifies the development of media players by allowing developers to assemble components to create high-quality, full-featured video playback experiences. This open framework enables development focused on web-based video monetization, with lower costs and faster turnaround.

What is REOPS?

Like OSMF, REOPS is open source and available for developers to create their own custom players on top of it. With REOPS, even those without development experience can customize skins and functionality using the several premade skin templates and the system’s XML configuration file.

Why REOPS for VideoPress?

The VideoPress media player needed to be a flexible and robust player that provides user customization and the ability to easily manage the media displayed on one’s site. REOPS leverages OSMF to provide ease of configuration for media and look and feel, as well as extensibility for the VideoPress media player through the OSMF plug-in architecture.

Configuration

Everything from the media being played to the skin of the player is controlled at runtime via an XML or AMF configuration object. This configuration object allows the player to be configured each time it loads, meaning different media, and even a different look and feel, can be presented based on the configuration provided at load time.

Media Definition

The configuration object begins by defining the media to be played back. As an example, the following XML snippet defines a progressive video:


<mediaElement>
    <media url="http://www.server.com/video/MyVideo.flv" />
</mediaElement>

This configuration snippet adheres to the Flash Media Manifest (F4M) format and allows for explicit definition of media types (live streams, dynamic streams for switching, etc.) to be provided to the player. To learn more about the additional configuration settings, you can continue reading the Building and Configuration Adobe Developer Connection article.

Skinning

An important feature of the VideoPress media player is customization of the player skin. For example, a sports team’s site can allow for the customization of the player colors to match the team colors. REOPS provides the ability to completely re-skin a player. The skin is created using a Flash template file; the assets in the template file can be changed to reflect the new look and feel. A skin SWF file is then compiled, and that skin can be applied via the configuration object. Below are 3 examples of control bar skins that can be applied to a player via the configuration object:



This flexibility allows the VideoPress media player to be customized via a player skin. Details on creating a skin for the REOPS player can be found at the Adobe Developer Center in the Skinning and Control Bar REOPS article, and in this article written by Juan Sanchez.

Configuring skins in the REOPS player is as simple as specifying the correct skin SWF file in the configuration object. An XML example of this would be:


<skin path="assets/skins/RE_Skin.swf">
    <skinElement id="controlBar"
        elementClass="com.realeyes.osmfplayer.controls.ControlBar"
        initMethod="initControlBarInstance"
        scaleMode="NONE"
        hAdjust="0" vAdjust="0"
        vAlign="BOTTOM"
        autoPosition="true"
        draggable="true"
        autoHide="true" />
</skin>

The skin configuration allows one to specify additional features of the skin using the <skinElement> subnode. In the above example, we are configuring a custom control bar that will provide the main media controls for the player.

Plug-ins

In addition to the built-in features of the VideoPress player, the configuration object allows for the loading of OSMF plug-ins. The OSMF plug-in architecture allows developers and partners to create plug-ins that unobtrusively enhance and extend the functionality of a media player.

The VideoPress player includes a built-in Closed Captioning plug-in that was provided with the OSMF framework, as well as additional custom plug-ins built to enhance the base functionality of the player. Other plug-ins could include analytics tracking, functionality enhancements or restrictions, as well as advertising and playlist control. The flexibility of the OSMF plug-in architecture allows for endless possibilities in plug-in development. The osmf.org site has more information about OSMF partners who are actively developing plug-ins.

As an example of configuring additional plug-ins for the REOPS player, this XML snippet tells the player to load a simple plug-in that overlays a watermark image over the media that is being played:


<plug-in path="http://www.realeyes.com/SampleBugPlug-in.swf">
    <metaData namespace="http://www.realeyes.com/watermark">
        <value key="watermarkURL">
            <id>http://www.realeyes.com/watermark.png</id>
        </value>
    </metaData>
</plug-in>

In the above sample, the <plug-in> node tells the REOPS player where to load the plug-in from. This example also includes a metadata definition that REOPS uses to pass the location of the watermark image to the plug-in, so the plug-in knows what image to display over the media.

Summary

The Open Source Media Framework and REOPS were identified by VideoPress as two tools that would lead to a best-in-class media player, providing the features and functionality their users and clients needed. With the flexibility provided by OSMF and the configuration flexibility provided by REOPS, the VideoPress media player allows for ease of deployment and the feature-rich implementation that needed to be developed quickly and easily.

Another important item to note is the open source nature of the VideoPress project. Each component on its own has been released under an open source license for the community to use and improve upon.

Adobe’s Open Source Media Framework (OSMF) enables developers to easily assemble pluggable components to create high-quality, full-featured playback experiences. The open aspect of the framework enables collaborative development focused on web video monetization with lower costs and faster turnaround.

RealEyes’ David Hassoun and John Crosby have just released the second in a series of articles on OSMF and REOPS for Adobe’s Developer Connection.  This article is a follow-up to their previous article, Part 1: Setup and deployment, which introduced the REOPS project. Part 2: Building & Configuration will show you how to build a very simple OSMF player to get familiar with the building blocks of the OSMF. John and David then explain each of these OSMF building blocks in simple terms, as well as how the REOPS project integrates each of them.
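As a taste of those building blocks, the sketch below assembles a bare-bones OSMF player. This is a hedged approximation rather than the article’s exact code, and the video URL is a placeholder.

import org.osmf.containers.MediaContainer;
import org.osmf.elements.VideoElement;
import org.osmf.media.MediaPlayer;
import org.osmf.media.URLResource;

// Point a resource at the media to play (placeholder URL)
var resource:URLResource = new URLResource( "http://www.example.com/video.mp4" );

// A VideoElement wraps the resource, a MediaPlayer controls it,
// and a MediaContainer displays it
var element:VideoElement = new VideoElement( resource );
var player:MediaPlayer = new MediaPlayer( element );
var container:MediaContainer = new MediaContainer();
container.addMediaElement( element );
addChild( container ); // assumes this code runs inside a Sprite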

The RealEyes OSMF Player Sample (REOPS) is a project developed by RealEyes Media to provide the community a sample and an optional starting point for a robust video player utilizing the Open Source Media Framework (OSMF) from Adobe. The REOPS project is meant to be a building block for developers, as well as a visual representation to illustrate the capabilities of the OSMF framework and a “how to” for using it.

This article is an introduction to the RealEyes OSMF Player Sample (REOPS) that provides developers, designers, and/or implementers with the proper knowledge to customize and deploy a video solution based on the Open Source Media Framework (OSMF).

RealEyes’ David Hassoun and John Crosby have just released the first in a series of articles on OSMF and REOPS for Adobe’s Developer Connection.  This article provides a brief introduction to the visual and technical capabilities and implementation parameters of this new sample. Future articles will dive deeper into how REOPS uses the OSMF to build a robust media player solution.  Read the article to get up and running with OSMF and the REOPS project.

For a recent project, RealEyes was asked to transcribe the speech from a video provided to us by a client. The transcribed text was to be included in the video and presented in subtitle fashion. To complete this task, we relied on the new speech-to-text capabilities available in Adobe Premiere CS4 and Soundbooth CS4. We also utilized some motion graphics firepower with After Effects CS4, with the help of a customized expression courtesy of Dan Ebberts’ excellent Developer Connection article, XMP metadata in Creative Suite 4 Production Premium.

In this tutorial, we’ll walk through the process of transcribing speech from a QuickTime video file (.mov) using Adobe Premiere CS4, and creating subtitles in the video from the transcription using Adobe After Effects CS4. We’ll incorporate the use of an expression, courtesy of Dan Ebberts (slightly modified to suit our needs), that is needed to display the transcribed text in our video.

In order to get the most out of this tutorial, you’ll need the following:

Creating the Transcription in Adobe Premiere CS4

Create a New Project:

  1. Start Adobe Premiere Pro.
  2. From the Welcome screen, select New Project.
  3. In the New Project panel, specify a name and location for the project.
  4. Click OK.
  5. In the New Sequence dialog box, choose DV-NTSC/Standard 48kHz.
  6. Click OK.

Import Your Video Clip:

  1. Choose File > Import, then navigate to the “BarackInLax.mov” clip.
  2. Double-click the video file to import it into Premiere.
  3. Once the file is in the Project panel in Premiere, double-click it again to open it in the Source Monitor.
  4. Use the transport controls in the Source Monitor to preview the video for a few seconds. Be sure to pay close attention to the audio during the preview (so that you can later tell whether or not Premiere did a good job with the speech transcription).

Open Metadata View:

  1. Choose Window > Metadata and check to see that the Metadata View is enabled.
  2. The Metadata View should have three main sub-headings: “Clip”, “File”, and “Speech Transcript”. If you import a video clip into your project that has been previously transcribed, that transcription metadata will be displayed in the “Speech Transcript” panel. Of course, we haven’t transcribed our clip yet, so this panel is blank for now.

Create the Transcription:

  1. Make sure that your video clip is selected and highlighted in the Project panel, and click “Transcribe” in the Speech Transcript panel of the Metadata View.
    The Metadata View

  2. A dialog box appears, giving you a few options. Choose the proper language setting based on your video’s speech content.
  3. Choose between high-quality processing (slower) and medium-quality processing (faster).
  4. If your video clip contains multiple speakers, you can choose to check the “identify speakers” checkbox.
  5. Click OK.
    Transcription Dialog
  6. Transcriptions done in Adobe Premiere and Adobe Soundbooth are handled by Adobe Media Encoder. Adobe Media Encoder automatically launches when you complete the transcription dialog.
  7. Your video clip should load into Adobe Media Encoder’s queue automatically, with the appropriate format and preset for speech transcription already set.
  8. Click “Start Queue”.
  9. Adobe Media Encoder will now encode your clip, adding the speech transcription metadata to the video file.
    Adobe Media Encoder (Video Asset Loaded By Premiere)

Working With The New Transcription Metadata

Correcting Mistakes In The Speech Transcript
At this point you should see the transcription metadata in the Speech Transcript panel of the Metadata View in Premiere. It is likely that the output text doesn’t quite match what was said in the video clip, so some editing may be in order. Single-clicking any word in the transcribed text highlights that word and displays some potentially valuable metadata at the bottom of the Speech Transcript panel: information about when the selected speech occurred in the timeline, as well as the duration of that text.

By double-clicking on a word in the transcription, you can edit the text to match what was actually said in the video. Right-clicking gives you more options. For example, often you’ll find that you’ll need to replace two words with one, or insert a missing word before or after another word, etc. You can use the options available to you by right-clicking on a word in the transcription to make these kinds of edits. This can be a tedious task, as it involves meticulously going through the video content to match it with the transcribed speech output. For this reason, we haven’t edited our example transcription for accuracy, but for the purposes of this tutorial, it will work fine.
Video Transcription Text (Needs Editing)

Adding The Transcribed Speech To Your Video With After Effects CS4

Create a New Project:

  1. Open After Effects CS4.
  2. Close the Welcome Screen dialog box.

Import Your Transcribed Video Clip and Create a New Composition

  1. Choose File > Import, then navigate to the “BarackInLax.mov” clip.
  2. Drag the newly imported clip over the “Create a New Composition” icon at the bottom of the project panel. This automatically creates a new composition with settings that match those of your video clip.
  3. Adobe Media Encoder has included the new transcription metadata in the video file itself, so you should now see that it appears in the timeline. (You may need to adjust the timeline zoom in order to see the text clearly.)
    Setting Up a New Composition in After Effects

Create a New Text Layer to Display the Transcription Text

  1. Select the Horizontal Type Tool from the tool panel and click on your video in the Composition panel in the area where you’d like the transcription text to be displayed.
  2. Make sure the Paragraph panel is open, and with the new text layer still selected, ensure that “Center text” is selected.
  3. At this time, feel free to set the font style and font size for your text layer. (You can also hold off on this step if you’d rather wait to see how the transcribed text looks after it’s displayed in your video).
    New Text Layer Created

Using an Expression to Display the Transcribed Speech
An “expression”, in After Effects, is a small piece of software that you use in your project to efficiently control a single layer property. By adding an expression to a layer, you can create animations that might have otherwise required you to set hundreds of individual keyframes. Expressions written for After Effects are based on basic JavaScript. You can find more information about using expressions here, here, and here.

Modifying the Sample Expression
In order to make Dan’s expression work for our project, we need to change a couple of lines of code. First, we need to make sure that the expression references our video, and not the one Dan used in his example.

  1. Change the line that reads:

        L = thisComp.layer("Legato_Ames_BTS_02_trans.mov");

     to this:

        L = thisComp.layer("BarackInLax.mov");

     so that it references our video clip.
  2. Next, take a look at this line of code:

        max = 5; // number of words to display

     This tells After Effects to display the transcription with a maximum of five words at a time. That works for us, so we’ll keep it as-is. However, if you’d like a different setting for your project, you can make that adjustment here.
  3. Finally, let’s look at this bit of code toward the end of the expression:

        s += L.marker.key(i).comment + "  ";

     Part of what this line does is determine the spacing between words in the displayed text. You can control how much space goes between each word by manipulating the whitespace between the quotation marks at the end of the statement. Currently there are two spaces between the quotes, which may or may not spread your displayed text out too much, depending on what you’re going for. For our project, we left only one space between the quotation marks, and that seemed to work well.
  4. Confirm that the necessary changes have been made to the expression, and save.
    The Expression With Adjustments Made For Current Project
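For reference, here is a rough, hypothetical reconstruction of what a marker-reading Source Text expression of this shape can look like. It is assembled from the modifications described above, not copied from Dan Ebberts’ article, so treat it as a sketch.

// Read transcript words from layer markers and show the last "max" of them
L = thisComp.layer("BarackInLax.mov"); // layer carrying the transcript markers
max = 5; // number of words to display

s = "";
if (L.marker.numKeys > 0) {
    // Index of the most recent marker at or before the current time
    n = L.marker.nearestKey(time).index;
    if (L.marker.key(n).time > time) n--;

    // Concatenate up to "max" of the most recent words
    for (i = Math.max(1, n - max + 1); i <= n; i++) {
        s += L.marker.key(i).comment + " ";
    }
}
s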

Prepare the Text Layer for the Expression
The expression will be copied and pasted into a single property of the text layer in the timeline.

  1. First, open the text layer’s property panel by clicking on the sideways triangle next to the layer name, and then again on the sideways triangle next to the “text” property to open this property.
  2. Alt-click (Windows) or Option-click (Mac OS) the stopwatch icon to the left of the Source Text property’s name.
  3. Notice that doing this reveals the property “text.sourceText” in the timeline.
  4. Make sure that “text.sourceText” is highlighted, and replace the current setting by pasting your edited expression over “text.sourceText”.
    Timeline Ready To Have Expression Added

Observe the Transcribed Speech in Your Video Clip
Move the Current Time Indicator to a spot on the timeline that contains transcribed text. If everything worked correctly, you should see the transcribed text displayed in the video in the Composition panel. At this time, you may wish to make adjustments to your text layer’s font size, etc., to optimize the display of text in your video.
The Transcribed Text Shows Up in The Video

Moving On

The speech-to-text capabilities found in Adobe Premiere CS4 and Adobe Soundbooth CS4 represent a nice leap forward in terms of what’s now possible when working with video. Transcribing a video’s speech is just one of many new possibilities: the ability to search for text within a video and to use this metadata in custom applications are two more exciting prospects that come to mind.

Please consider the following resources to learn more about using metadata in your multimedia projects:

Peer to Peer using the Adobe Flash Platform

Posted on February 22, 2010 at 3:25 pm in Development, Media Solutions

Adobe Flash Player 10.1, Adobe Stratus 2, and the Real-Time Media Flow Protocol (RTMFP) are setting a firm foundation for peer-to-peer (P2P) and peer-assisted networking. Using the capabilities of groups and the new features around them, you can build deployments of nearly any scale and take advantage of multiuser interactive applications for data and media. Everything from application-level video multicasting to swarming file delivery and multiuser games is within easy reach of developers, without laying a heavy burden on server infrastructure.
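To ground that in code, here is a minimal, hypothetical sketch of joining an RTMFP group with the Flash Player 10.1 NetGroup API and posting a message to it. The Stratus address, developer key, and group name are placeholders.

import flash.events.NetStatusEvent;
import flash.net.GroupSpecifier;
import flash.net.NetConnection;
import flash.net.NetGroup;

var nc:NetConnection = new NetConnection();
var group:NetGroup;

nc.addEventListener( NetStatusEvent.NET_STATUS, onStatus );
nc.connect( "rtmfp://stratus.adobe.com/{YOUR_DEVELOPER_KEY}" ); // placeholder

function onStatus( event:NetStatusEvent ):void
{
        switch ( event.info.code )
        {
                case "NetConnection.Connect.Success":
                        // Describe the group: allow posting, and let the server
                        // introduce peers to each other
                        var spec:GroupSpecifier = new GroupSpecifier( "com.example/demoGroup" );
                        spec.postingEnabled = true;
                        spec.serverChannelEnabled = true;

                        group = new NetGroup( nc, spec.groupspecWithAuthorizations() );
                        group.addEventListener( NetStatusEvent.NET_STATUS, onStatus );
                        break;

                case "NetGroup.Connect.Success":
                        // Broadcast a message to all peers in the group
                        group.post( { user: "demo", text: "Hello, peers!" } );
                        break;
        }
}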

RealEyes’ David Hassoun and Jun Heider have just released the first in a series of articles for Adobe’s Developer Connection that focus on the P2P capabilities of the Adobe Flash Platform, Adobe Stratus, and RTMFP. Future articles will dive deeper and provide a hands-on approach to utilizing the new groups and peer-assisted network topologies to make corporate enterprise, social media, and entertainment applications.  Read the article.