Development

Load testing service APIs got you down? How about load testing PHP-based AMF service APIs? Thought so. Fear not, because John Crosby recently posted his findings about two AMF load testing tools he says are great. He’s talking about soapUI and loadUI, the free, open-source tools created by the fine people at SmartBear.

John shows you how to use these tools, walking you step-by-step through setting up a project, configuring an AMF request, and setting up load testing with soapUI. He also walks you through load testing with loadUI.

It’s clear that John is pretty excited about the handiness of these two load testing applications, and he’s already looking forward to integrating them with our Continuous Integration (CI) system. Stay tuned for more on that soon! For now, happy testing!

Read the original article

The latest version of Flash Player (v.11.0) includes some exciting new features, including performance upgrades such as native 64-bit support and asynchronous bitmap decoding. Perhaps most newsworthy, though, is Flash Player’s new capability to encode live video streams to the H.264/AVC standard. This new feature allows developers to create real-time, high-quality live video streaming applications for chat, conferencing, and live event broadcasting.

The following article demonstrates how to take advantage of Flash Player 11.0’s new H.264 encoding capabilities within a video streaming application built using Flash Builder 4.5. The application does the following:

  • Captures live video from a webcam
  • Establishes a connection to Flash Media Server 4.5 using the NetConnection class
  • Publishes video stream from application to FMS using an instance of the NetStream class
  • Displays outgoing video stream from camera (prior to being encoded) in a Video component within the application
  • Sends encoding parameters to Flash Player 11.0 to encode the raw webcam video to H.264
  • Displays encoded video’s metadata, demonstrating that encoding worked
  • Streams live, encoded video from FMS to the application using another instance of the NetStream class
  • Displays newly encoded, streamed live video in another Video component within the application

H.264 Encoding in Flash Player 11.0 Example Application

Example Application showing live stream from webcam (left) and stream encoded to H.264 in Flash Player 11.0 (right).

To follow along with the example, please be sure to have the following:


Getting Started – Configuring Compiler Settings

To develop applications that target the new features available in Flash Player 11.0, it is necessary to configure the compiler to target player version “11.0” and SWF version “13”, and to use the playerglobal.swc for Flash Player 11.0. To make these changes:

    1. Download the new playerglobal.swc for Flash Player 11.0, and rename this file from “playerglobal11_0.swc” to “playerglobal.swc”.
    2. Create a folder named “11.0” in the directory “frameworks\libs\player” that is inside your Flex SDK installation folder. (Fig. 1.0)
    3. Put the playerglobal.swc inside the new folder (“11.0”).
    4. Locate the file “flex-config.xml”, which is in the “frameworks” folder within your Flex SDK installation directory.
    5. Within “flex-config.xml”, locate the “target-player” tag, which specifies the minimum player version that will run the compiled SWF.
    6. Set the “target-player” value to “11.0”. (Fig. 1.1)
    7. Also within “flex-config.xml”, locate the “swf-version” tag, which specifies the version of the compiled SWF.
    8. Set the “swf-version” value to “13”. (Fig. 1.1)
    9. Save “flex-config.xml”.
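If you compile from the command line rather than through flex-config.xml, the same two settings can be passed as flags to the mxmlc compiler (a sketch; the source file name here is a placeholder):

```shell
# Target Flash Player 11.0 and SWF version 13 when compiling
mxmlc -target-player=11.0 -swf-version=13 H264_Encoder.as
```

Options passed on the command line override the defaults in flex-config.xml, which is handy for testing a single project against the new player without changing the SDK-wide configuration.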


Figure 1.0. Create a folder for the playerglobal.swc named “11.0”.


Figure 1.1. Edit values of “target-player” and “swf-version” tags within the flex-config.xml file.


Setting Up the Project in Flash Builder 4.5

The example application is a simple ActionScript 3.0 project (not a Flex or AIR project). To create a similar project in Flash Builder:

    1. Choose File -> New -> ActionScript project.
    2. Name the project “H264_Encoder”, and click “Finish”.
    3. In Flash Builder, with the H264_Encoder project selected, choose Project -> Properties.
    4. Verify that the compiler is targeting Flash Player 11.0. (Fig. 1.2) If it isn’t, select the “Use a specific version” radio button, and type “11.0.0” for the value.


Figure 1.2. Make sure that the compiler is targeting Flash Player 11.0 by inspecting the project’s properties.

At this point, the application should look similar to the following:

package
{
import flash.display.Sprite;

public class H264_Encoder extends Sprite
{
public function H264_Encoder()
{
}
}
}

Next up, you’ll be modifying the application so that it can communicate with your webcam. In addition, you’ll add the code necessary for establishing a NetConnection to connect the application to Flash Media Server, as well as two NetStream instances: one responsible for getting the video from the application into Flash Media Server, and one for bringing it back from the server into the application.


Coding the Application – Connecting a Camera, Establishing a NetConnection and NetStreams
    1. Directly under the opening class definition statement, but before the constructor method, create a private variable named “nc”, data typed as a NetConnection. Use code hinting to have Flash Builder generate the necessary import statements for you by starting to type “NetC..”, then hitting CTRL-SPACE. Select “NetConnection” from the list, and notice that Flash Builder has imported the NetConnection class from the flash.net package. If for some reason the import fails, go ahead and import it manually. Your code should appear as follows:

package
{
import flash.display.Sprite;
import flash.net.NetConnection;

public class H264_Encoder extends Sprite
{
private var nc:NetConnection;

public function H264_Encoder()
{
}
}
}

    1. Create two private variables, data typed as NetStream, to represent the streams: one for the stream going from the application to the server (ns_out), and another for the stream coming back into the application from the server (ns_in). Remember to use code hinting to have Flash Builder import the necessary classes.

package
{
import flash.display.Sprite;
import flash.net.NetConnection;
import flash.net.NetStream;

public class H264_Encoder extends Sprite
{
private var nc:NetConnection;
private var ns_out:NetStream;
private var ns_in:NetStream;

public function H264_Encoder()
{
}
}
}

    1. Next, create a private variable named “cam” of type “Camera”, and set its value to “Camera.getCamera()”. The Camera class is a little different from other classes, in that you don’t call a constructor to instantiate a Camera object. Instead, you call the static getCamera() method of the Camera class. This method returns a Camera instance unless no camera is attached to the computer, or the camera is in use by another application.

private var cam:Camera = Camera.getCamera();

Make sure the Camera class was imported:

import flash.media.Camera;

    1. It is now time to add code that will allow the application to connect to Flash Media Server using an instance of the NetConnection class. Under the import statements, the instance variables, and the closing brace of the constructor function, create a private function named initConnection() that takes no arguments and returns void:

private function initConnection():void
{
}

    1. As the first line of the function body, instantiate the nc NetConnection variable, which you declared in step 1:

nc = new NetConnection();

    1. It’s always good practice to verify that the NetConnection was successful. Add an event listener for the NetStatusEvent.NET_STATUS event, with a handler function named onNetStatus(). You will create the onNetStatus() handler in the next section:

nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);

Be sure to either use code hinting, or import manually, the NetStatusEvent class, which is in the flash.events package:

import flash.events.NetStatusEvent;

    1. Next, and still within the initConnection() function body, tell the NetConnection where to connect by calling the connect() method of the NetConnection class. As an argument to this method, pass the URL of the “live” folder within the installation of Flash Media Server you want to connect to. The URL in the example uses the RTMP protocol, and connects to the “live” folder within a copy of Flash Media Server installed on one of our servers. You can also stream to a local installation of Flash Media Server, if you have one, by setting the URL to “rtmp://localhost/live”.

nc.connect("rtmp://office.realeyes.com/live");

    1. Finally, tell the NetConnection where Flash Media Server should invoke callback methods by setting the value of the NetConnection’s “client” property to “this”. Callback methods are special handler functions invoked by Flash Media Server when a client application establishes a NetConnection. Later on in this example you will work with the “onMetaData()” and “onBWDone()” callback methods. You will include these callback methods within the main application class, which is in fact the same object that will establish the NetConnection, and therefore the value of the NetConnection instance’s (nc) client property should be set to “this”.

nc.client = this;

The completed initConnection() function should appear as follows:

private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://office.realeyes.com/live");
nc.client = this;
}


Coding the Application – Verifying a Successful NetConnection
    1. As mentioned, it’s always good practice to verify the success of a NetConnection attempt. To do this, create a protected function named onNetStatus() that takes a single argument named “event”, of type NetStatusEvent, and returns void:

protected function onNetStatus(event:NetStatusEvent):void
{
}

    1. Within the onNetStatus() function body, create a Trace statement that outputs the value of event.info.code to the console during debugging. The code property of the event’s info object contains String data that indicates the status of the attempted NetConnection, such as “NetConnection.Connect.Success” or “NetConnection.Connect.Failed”. Tracing the value of this property allows you to confirm the status of the NetConnection simply by running the application in debug mode.

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
}

    1. Next, within the function body, and beneath the existing Trace statement, create a conditional statement that compares the value of event.info.code to “NetConnection.Connect.Success”. If they match, call three functions that you will create in the next section: one that publishes the outgoing video stream, one that displays the incoming video from the webcam, and one that displays the video stream being sent back to the application from the server. The completed onNetStatus() function should appear as follows:

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);

if(event.info.code == "NetConnection.Connect.Success")
{
publishCamera();
displayPublishingVideo();
displayPlaybackVideo();
}
}

    1. This example attempts to connect to the server and start publishing and playing video automatically when launched. To achieve this, call initConnection() from within the main class’ constructor method:

public function H264_Encoder()
{
initConnection();
}

At this point, you have included the code necessary to establish a NetConnection, and verify the success or failure of that connection with a Trace statement. In addition, you’ve included calls to functions that, when written, will handle the publishing and playback of the video from the webcam, as well as the video coming back from the server.

If you save the application now, you’ll notice some errors. The calls to publishCamera(), displayPublishingVideo(), and displayPlaybackVideo() generate errors because those functions haven’t been written yet. You can comment out the calls to these functions and run the application in debug mode. If everything is set up correctly, you should see the Trace output “NetConnection.Connect.Success”.


However, you should also see this error in the console: “ReferenceError: Error #1069: Property onBWDone not found on flash.net.NetConnection and there is no default value.”. This is because Flash Media Server is attempting to invoke a callback function on the application that hasn’t been written yet. In the next section you will include those callback functions.



Coding the Application – Including the Callback Functions, and Creating a TextField to Display Metadata

The sample application contains two callback functions – onBWDone() and onMetaData(). The onBWDone() callback checks for available bandwidth, which can be useful in applications that need to dynamically switch video assets according to the bandwidth that’s currently available. Although it’s necessary to include this function in the client code (omitting it will generate an error when the server tries to make the function call), it’s not necessary to actually do anything with it. This application isn’t concerned with monitoring bandwidth, so it can be left as an empty function.

The onMetaData() callback function is useful for accessing a video stream’s metadata, and you will be adding code to this callback to do just that. The onMetaData() callback receives a generic Object whose properties represent the video stream’s metadata. In the next section, you will create those properties to represent various metadata values, and access them in order to display the information within the UI. For now, you will simply add the two callback functions, and add some code to onMetaData() to access that metadata. In addition, you will create a TextField that you will eventually use to display the metadata in the UI.

    1. Create a new private instance variable named “metaText”, data typed as TextField. Set its initial value to “new TextField()”:

private var metaText:TextField = new TextField();

*Note – At this point you are simply creating the metaText object in memory. You won’t actually add it to the display list until later in the example.

Be sure to import the necessary TextField class:
import flash.text.TextField;

    1. Include the required onMetaData() callback function. Create a new public function named “onMetaData()” that accepts an Object named “o” as its only parameter, and returns void:

public function onMetaData( o:Object ):void
{
}

    1. To access the video stream’s metadata, you will loop through the properties of the Object received by the onMetaData() callback function. Again, you will create those properties in the next section, but for now, within the onMetaData() function create a “for…in” loop. Within the loop’s initializer, declare a local variable named “settings”, data typed as a String, to iterate over the properties of the Object “o”:

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
}
}

    1. Next, within the loop body, include a Trace statement that will output the name of each “settings” property received by onMetaData(), concatenated with “=” and the property’s value:

trace(settings + " = " + o[settings]);

    1. Finally, inside the for…in loop body, append to metaText’s text value each property’s name, concatenated with “=” and the property’s value. Create a new line for each iteration, and adjust the spacing between the double quotes (adding an extra “\n” if you want to double-space the text) to lay out the text properly in the UI:

metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";

*Note* The layout and styling in this example are not intended to be examples of UI programming best practices. UI programming is outside the scope of this article.

The completed onMetaData() callback function should be similar to this:

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
trace(settings + " = " + o[settings]);
metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";
}
}

    1. Next, add the onBWDone() callback function. Create a new public function named “onBWDone()” that takes no arguments, and returns void:

public function onBWDone():void
{
}

Remember that the onBWDone() callback function is what Flash Media Server uses to report available bandwidth, and this application doesn’t require that information. It still must be included, however, since the server will be calling it on the application object. To avoid a runtime error, simply include an empty onBWDone() callback.


Now that the application has the necessary callback functions, and it loops through the objects returned by onMetaData() to populate a TextField with that data, it’s time to add code that enables the application to read webcam data, encode that webcam data to the H.264 standard, and to then stream the encoded video.


Coding the Application – Setting Up H.264 Encoding, and Publishing to the NetStream

In this next section, you will attach your webcam to an instance of the Camera class. You will then configure H.264 encoding for the webcam input using properties of the Camera class and the new H264VideoStreamSettings class. Certain encoding parameters can’t yet be set with the H264VideoStreamSettings class (although support for this will hopefully come soon), so you’ll be setting those values through properties of the Camera class instead.

Next, you will attach the encoded video to a live video stream, and stream it to Flash Media Server’s “live” directory. (You will bring a new stream back into the application from Flash Media Server in the next section)

Finally, in order to read the metadata of the newly encoded video stream, you will call the send() method of the NetStream class (available only when using Flash Media Server). As arguments to the send() method, you will pass “@setDataFrame”, a special handler method within Flash Media Server; the name of the onMetaData() callback method you added earlier to listen for the metadata client-side; and finally a local variable (“metaData”), data typed as an Object, used to represent the desired metadata items. First:

    1. Create a protected function named “publishCamera()” that takes no arguments and returns void:

protected function publishCamera():void
{
}

    1. In the first line of this new function, instantiate the ns_out NetStream object by calling its constructor. Pass the constructor the NetConnection instance “nc”:

ns_out = new NetStream(nc);

    1. On the next line, attach the Camera instance “cam” to the outgoing NetStream by calling the attachCamera() method of the NetStream class. Pass this method the cam instance:

ns_out.attachCamera(cam);

    1. Next, create a new local variable named “h264Settings”, data typed as H264VideoStreamSettings, and set its initial value equal to “new H264VideoStreamSettings()”:

var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();

Be sure to import the H264VideoStreamSettings class:

import flash.media.H264VideoStreamSettings;

    1. Call the setProfileLevel() method of the H264VideoStreamSettings class on the h264Settings object to encode the video using the “BASELINE” profile, and a level of “3.1”:

h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

Be sure to import both the H264Profile class, and the H264Level class:

import flash.media.H264Level;
import flash.media.H264Profile;

    1. Next, use the setQuality() method of the Camera class to encode the video stream with a maximum bandwidth of 90000 bytes per second, and a quality setting of “90”:

cam.setQuality(90000, 90);

    1. Use the setMode() method of the Camera class to set the video’s width, height, and frames per second. The fourth parameter tells the camera to maintain the requested capture size, at the expense of frame rate if necessary, when the camera has no native mode matching the request:

cam.setMode(320, 240, 30, true);

    1. Next, using the setKeyFrameInterval() method of the Camera class, set the video’s keyframe interval to 15 (at 30 fps, two keyframes per second):

cam.setKeyFrameInterval(15);

    1. To set the outgoing video’s compression settings, assign the value of the h264Settings variable to the videoStreamSettings property of the outbound stream, “ns_out”:

ns_out.videoStreamSettings = h264Settings;

    1. Call the publish() method of the NetStream class on the outgoing NetStream, and pass it parameters to provide a name for the stream (“mp4:webCam.f4v”), as well as a destination folder in Flash Media Server (“live”):

ns_out.publish("mp4:webCam.f4v", "live");

    1. Now it’s time to create the object that will hold the metadata values of the encoded video you will access at runtime. Create a new local variable named “metaData”, data typed as an Object, and set its initial value equal to “new Object()”:

var metaData:Object = new Object();

    1. The metaData object is generic, meaning you can assign any name/value pairs you like. For example, there’s no encoding setting that comes from the Camera, VideoStreamSettings, or H264VideoStreamSettings classes that would allow you to display a copyright, but you can add one easily enough like this:

metaData.copyright = "Realeyes Media, 2011";

Of course, you can also create properties with values that do come from settings within the aforementioned classes, such as:

metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;

    1. Create the following metaData properties and add them to the publishCamera() function:

metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";

    1. Call the send() method of the NetStream class on the ns_out object and pass it the name of the handler method “@setDataFrame”, the callback method “onMetaData”, and the local variable metaData:

ns_out.send("@setDataFrame", "onMetaData", metaData);

The completed publishCamera() function should resemble the following, with the exception of the commented-out code:

protected function publishCamera():void
{
ns_out = new NetStream(nc);
ns_out.attachCamera(cam);
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

// ALTHOUGH FUTURE VERSIONS OF FLASH PLAYER SHOULD SUPPORT SETTING
// ENCODING PARAMETERS ON h264Settings BY
// USING THE setQuality() AND setMode() METHODS,
// FOR NOW YOU MUST SET THE PARAMETERS ON THE CAMERA FOR:
// BANDWIDTH, QUALITY, HEIGHT, WIDTH, AND FRAMES PER SECOND.
// h264Settings.setQuality(30000, 90);
// h264Settings.setMode(320, 240, 30);

cam.setQuality(90000, 90);
cam.setMode(320, 240, 30, true);
cam.setKeyFrameInterval(15);
ns_out.videoStreamSettings = h264Settings;
trace(ns_out.videoStreamSettings.codec + ", " + h264Settings.profile + ", " + h264Settings.level);
ns_out.publish("mp4:webCam.f4v", "live");

var metaData:Object = new Object();
metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";
ns_out.send("@setDataFrame", "onMetaData", metaData);
}


Coding the Application – Displaying and Encoding the Video From the Webcam, and Displaying Video Streamed Back From the Server

The application needs to display both the raw, un-encoded incoming video from the webcam, as well as the inbound streaming video after it has been encoded to H.264 in the Flash Player, sent to Flash Media Server, and then back to the application. In addition, the metadata that you defined in the previous section needs to be displayed in the UI to reveal the encoding settings defined in publishCamera().

In this next section, you will create two functions, displayPublishingVideo(), and displayPlaybackVideo() to play the streams and display the metadata on screen.

    1. Create a new private instance variable named vid_out, and set its data type to Video:

private var vid_out:Video;

Be sure to import the Video class:

import flash.media.Video;

This new instance of the Video class will be used to play back the not-yet-encoded video coming in from the webcam.

    1. Next, create a protected function named displayPublishingVideo() that takes no arguments and returns void:

protected function displayPublishingVideo():void
{
}

    1. In the first line of the function body, instantiate the vid_out variable by calling the constructor method of the Video class:

vid_out = new Video();

    1. To place the new Video component on screen correctly, assign x and y values to vid_out so that x = 300 and y = 10:

vid_out.x = 300;
vid_out.y = 10;

    1. Next, use the height and width values from the webcam to set the height and width of the video display:

vid_out.width = cam.width;
vid_out.height = cam.height;

    1. To allow the vid_out component to display video coming from the webcam, call the attachCamera() method of the Video class, and pass that method the instance of the Camera class that represents the webcam:

vid_out.attachCamera(cam);

    1. Finally, add vid_out to the display list by calling the addChild() method of the DisplayObjectContainer class:

addChild(vid_out);

At this point, the displayPublishingVideo() function should look similar to:

protected function displayPublishingVideo():void
{
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
vid_out.width = cam.width;
vid_out.height = cam.height;
vid_out.attachCamera(cam);
addChild(vid_out);
}

If you run the application at this point, provided you have a webcam attached to your computer (and you un-commented the calls to the functions publishCamera(), and displayPublishingVideo() within onNetStatus()), you should see the Flash Player dialog that asks permission to access your camera. Grant Flash Player permission, and you should now see a live video feed coming from your webcam.

Next, you’ll add code to the displayPublishingVideo() function that will display the metadata objects you created earlier. The metadata text won’t show up until the code is in place to handle the incoming stream, however. This is because metaText’s text property is set within the onMetaData() function, and onMetaData() is run only when Flash Media Server sends the stream back to the application. You’ll start by adding the metaText TextField object to displayPublishingVideo() and assigning values for its properties:

    1. In the displayPublishingVideo() function, directly under the existing addChild() method call, set metaText’s “x” value to “0”, its “y” value to “55”, its width to “300”, and its height to “385”:

metaText.x = 0;
metaText.y = 55;
metaText.width = 300;
metaText.height = 385;

    1. Assign color values for the backgroundColor, textColor, and borderColor of metaText. In order to display the backgroundColor and borderColor, you must also set both the background and border properties to “true”:

metaText.background = true;
metaText.backgroundColor = 0x1F1F1F;
metaText.textColor = 0xD9D9D9;
metaText.border = true;
metaText.borderColor = 0xDD7500;

    1. Add the metaText TextField object to the display list by calling the addChild() method, and passing it the metaText object:

addChild(metaText);
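With the metaText additions in place, the updated displayPublishingVideo() function should now resemble the following:

```actionscript
protected function displayPublishingVideo():void
{
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
vid_out.width = cam.width;
vid_out.height = cam.height;
vid_out.attachCamera(cam);
addChild(vid_out);

metaText.x = 0;
metaText.y = 55;
metaText.width = 300;
metaText.height = 385;
metaText.background = true;
metaText.backgroundColor = 0x1F1F1F;
metaText.textColor = 0xD9D9D9;
metaText.border = true;
metaText.borderColor = 0xDD7500;
addChild(metaText);
}
```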

Next, you’ll create a function that will bring the video stream back in from Flash Media Server, and display it in another Video object.

    1. Create a new private instance variable named vid_in, and data type it as a Video:

private var vid_in:Video;

    1. Next, create a new protected function called “displayPlaybackVideo()” that takes no arguments and returns void:

protected function displayPlaybackVideo():void
{
}

    1. In the first line of the function body, instantiate ns_in, the NetStream variable you declared earlier, passing the “nc” NetConnection as an argument to its constructor:

ns_in = new NetStream(nc);

    1. Instead of calling the attachCamera() method, as you did for the previous NetStream, set the client property of the new NetStream to “this”:

ns_in.client = this;

    1. Next, call the play() method of the NetStream class, and pass it the String value for the name of the stream. This should match the name of the outgoing stream:

ns_in.play("mp4:webCam.f4v");

    1. Instantiate the vid_in variable by calling its constructor:

vid_in = new Video();

    1. Next, set some sizing and layout properties for the new Video object so that it sits properly on the stage:

vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
vid_in.width = cam.width;
vid_in.height = vid_out.height;

    1. Attach the incoming NetStream to the Video object to have it play back the video:

vid_in.attachNetStream(ns_in);

    1. Finally, add vid_in to the display list by calling the addChild() method and passing vid_in as its only argument:

addChild(vid_in);

Make sure to un-comment the call to displayPlaybackVideo() in the onNetStatus() function, and then save and run the application. You should see a dark rectangle appear that displays the video’s encoding settings, and two video streams, side-by-side. The video on the left is the raw video footage coming from the webcam, and the one on the right is the stream coming back from Flash Media Server.
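Assembled from the steps above, the completed displayPlaybackVideo() function should look similar to this:

```actionscript
protected function displayPlaybackVideo():void
{
ns_in = new NetStream(nc);
ns_in.client = this;
ns_in.play("mp4:webCam.f4v");

vid_in = new Video();
vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
vid_in.width = cam.width;
vid_in.height = vid_out.height;
vid_in.attachNetStream(ns_in);
addChild(vid_in);
}
```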

Coding the Application – Adding Some Finishing Touches

The application is almost done! It could stand a little visual cleanup, however.

    1. First, add SWF metadata above the class declaration to set the width and height of the application to something more reasonable:

[SWF( width="940", height="880" )]

Next, you’ll create three more TextFields that will display a simple label for the encoding settings list, as well as information about each of the separate video streams. You’ll also work with some simple text formatting to size the text differently from the default.

    1. Create three new TextField variables, one named “vid_outDescription”, one named “vid_inDescription”, and one named “metaTextTitle”. Data type each of them as TextField, and call the constructor for each.

private var vid_outDescription:TextField = new TextField();
private var vid_inDescription:TextField = new TextField();
private var metaTextTitle:TextField = new TextField();

    1. Within the displayPublishingVideo() function, directly below the call to add metaText to the display list, add a line that sets the text property for metaTextTitle. Play with spacing between the double quotes and add a “\n” to get the positioning the way you’d like it.

metaTextTitle.text = "\n - Encoding Settings -";

    1. Next, create a local variable named “stylr” that is an instance of the TextFormat class. Instantiate this variable by calling its constructor:

var stylr:TextFormat = new TextFormat();

Ensure that the TextFormat class has been imported.

import flash.text.TextFormat;

    1. Set the size property of the new TextFormat object to “18”:

stylr.size = 18;

    1. Apply the style defined with the stylr variable to the metaTextTitle TextField by calling the setTextFormat() method of the TextField class and passing it the TextFormat object “stylr” as an argument:

metaTextTitle.setTextFormat(stylr);

    7. Add more styling and layout property values to metaTextTitle, the same way you added them to metaText earlier:

metaTextTitle.textColor = 0xDD7500;
metaTextTitle.width = 300;
metaTextTitle.y = 10;
metaTextTitle.height = 50;
metaTextTitle.background = true;
metaTextTitle.backgroundColor = 0x1F1F1F;
metaTextTitle.border = true;
metaTextTitle.borderColor = 0xDD7500;

    8. Create descriptive text to be displayed for the outbound video stream. Set the text property of the vid_outDescription TextField to display it. Again, adjust the spacing and newlines to position it correctly.

vid_outDescription.text = "\n\n\n\n Live video from webcam \n\n" +
" Encoded to H.264 in Flash Player 11 on output";

    9. Add both the metaTextTitle and vid_outDescription TextFields to the display list.

addChild(vid_outDescription);
addChild(metaTextTitle);

    10. Add descriptive text for the incoming video stream in the same manner. Set values for the properties on vid_inDescription, and add the TextField to the display list.

vid_inDescription.text = "\n\n\n\n H.264-encoded video \n\n" +
" Streaming from Flash Media Server";
vid_inDescription.background = true;
vid_inDescription.backgroundColor = 0x1F1F1F;
vid_inDescription.textColor = 0xD9D9D9;
vid_inDescription.x = vid_in.x;
vid_inDescription.y = cam.height;
vid_inDescription.width = cam.width;
vid_inDescription.height = 200;
vid_inDescription.border = true;
vid_inDescription.borderColor = 0xDD7500;
addChild(vid_inDescription);

There you have it! The application should now automatically attach a webcam, display the webcam video, encode that video to H.264, and then stream it to and from Flash Media Server, displaying the end result in another video. The source files can be downloaded here. The completed code should appear as follows:

package
{
import flash.display.DisplayObject;
import flash.display.Sprite;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.text.TextField;
import flash.text.TextFormat;

[SWF( width="940", height="880" )]
public class H264_Encoder extends Sprite
{
private var nc:NetConnection;
private var ns_out:NetStream;
private var ns_in:NetStream;
private var cam:Camera = Camera.getCamera();
private var vid_out:Video;
private var vid_in:Video;
private var metaText:TextField = new TextField();
private var vid_outDescription:TextField = new TextField();
private var vid_inDescription:TextField = new TextField();
private var metaTextTitle:TextField = new TextField();

public function H264_Encoder()
{
initConnection();
}

private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://office.realeyes.com/live");
nc.client = this;
}

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
if(event.info.code == "NetConnection.Connect.Success")
{
publishCamera();
displayPublishingVideo();
displayPlaybackVideo();
}
}

protected function publishCamera():void
{
ns_out = new NetStream(nc);
ns_out.attachCamera(cam);
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

// ALTHOUGH FUTURE VERSIONS OF FLASH PLAYER SHOULD SUPPORT SETTING ENCODING PARAMETERS
// ON h264Settings USING THE setQuality() AND setMode() METHODS, FOR NOW YOU MUST
// SET THE PARAMETERS ON THE CAMERA FOR: BANDWIDTH, QUALITY, HEIGHT, WIDTH, AND FRAMES PER SECOND.

// h264Settings.setQuality(30000, 90);
// h264Settings.setMode(320, 240, 30);

cam.setQuality(90000, 90);
cam.setMode(320, 240, 30, true);
cam.setKeyFrameInterval(15);
ns_out.videoStreamSettings = h264Settings;
// trace(ns_out.videoStreamSettings.codec + ", " + h264Settings.profile + ", " + h264Settings.level);
ns_out.publish("mp4:webCam.f4v", "live");

var metaData:Object = new Object();
metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";
ns_out.send("@setDataFrame", "onMetaData", metaData);
}

protected function displayPublishingVideo():void
{
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
vid_out.width = cam.width;
vid_out.height = cam.height;
vid_out.attachCamera(cam);
addChild(vid_out);
metaText.x = 0;
metaText.y = 55;
metaText.width = 300;
metaText.height = 385;
metaText.background = true;
metaText.backgroundColor = 0x1F1F1F;
metaText.textColor = 0xD9D9D9;
metaText.border = true;
metaText.borderColor = 0xDD7500;
addChild(metaText);
metaTextTitle.text = "\n - Encoding Settings -";
var stylr:TextFormat = new TextFormat();
stylr.size = 18;
metaTextTitle.setTextFormat(stylr);
metaTextTitle.textColor = 0xDD7500;
metaTextTitle.width = 300;
metaTextTitle.y = 10;
metaTextTitle.height = 50;
metaTextTitle.background = true;
metaTextTitle.backgroundColor = 0x1F1F1F;
metaTextTitle.border = true;
metaTextTitle.borderColor = 0xDD7500;
vid_outDescription.text = "\n\n\n\n Live video from webcam \n\n" +
" Encoded to H.264 in Flash Player 11 on output";
vid_outDescription.background = true;
vid_outDescription.backgroundColor = 0x1F1F1F;
vid_outDescription.textColor = 0xD9D9D9;
vid_outDescription.x = 300;
vid_outDescription.y = cam.height;
vid_outDescription.width = cam.width;
vid_outDescription.height = 200;
vid_outDescription.border = true;
vid_outDescription.borderColor = 0xDD7500;
addChild(vid_outDescription);
addChild(metaTextTitle);
}

protected function displayPlaybackVideo():void
{
ns_in = new NetStream(nc);
ns_in.client = this;
ns_in.play("mp4:webCam.f4v");
vid_in = new Video();
vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
vid_in.width = cam.width;
vid_in.height = vid_out.height;
vid_in.attachNetStream(ns_in);
addChild(vid_in);
vid_inDescription.text = "\n\n\n\n H.264-encoded video \n\n" +
" Streaming from Flash Media Server";
vid_inDescription.background = true;
vid_inDescription.backgroundColor = 0x1F1F1F;
vid_inDescription.textColor = 0xD9D9D9;
vid_inDescription.x = vid_in.x;
vid_inDescription.y = cam.height;
vid_inDescription.width = cam.width;
vid_inDescription.height = 200;
vid_inDescription.border = true;
vid_inDescription.borderColor = 0xDD7500;
addChild(vid_inDescription);
}

public function onBWDone():void
{

}

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
trace(settings + " = " + o[settings]);
metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";
}
}

}
}


Download Source Here

Get Ready – Adobe MAX 2011 is Near!

Posted on September 28, 2011 at 9:42 am in Development, Training


There’s a great deal of excitement in the AIR here at RealEyes Media as the premier Adobe conference of the year, Adobe MAX 2011, rapidly approaches! This Saturday, October 1st, the epicenter of design, media, and development will be Los Angeles, California, as Adobe settles in for the 3rd year in a row at the L.A. Convention Center and the beautiful Nokia Theatre L.A. LIVE. The Adobe MAX conference has always been the place to listen to and meet world-renowned speakers, learn about the latest tools and techniques, and connect with potential clients, new partners, and old friends… and this year is no exception!

If you’ve ever been to MAX, you know that Adobe pulls out all the stops for this event. Keynote addresses are given by the biggest names in tech and entertainment. In case you haven’t already heard, the musical entertainment for this year’s MAX Bash will be provided by the band Weezer!



A chance to learn from the best

Whether you’re a designer, developer, or business strategist, MAX is an environment that deepens your expertise and ultimately makes you more productive in your work. Every skill and experience level is welcome during this five-day learn-a-thon. Whether you’ve never opened Photoshop or you’re interested in creating high-tech video players destined for multiple devices, there’s something at MAX for you.



Learn about multiscreen development with Realeyes’ own David Hassoun, John Crosby, and Jun Heider

The last few years have seen a steady upswing in multiscreen application development, and MAX has responded to this trend by providing developers, designers, and entrepreneurs with the best resources for learning how to rise to the top in this environment.


Realeyes Media is pleased to announce that Jun Heider, David Hassoun, and John Crosby will be speaking at MAX! Be sure to check out the following sessions:


David Hassoun & John Crosby – Video Player Development for Multiple Devices

“Learn how to create compelling, robust, and high-performing video player experiences for desktops, tablets, and smartphones including HTML5 and Adobe AIR for iOS. This lab for developers will step through what’s needed to develop and optimize the video experience across all devices. Using Adobe Flash Media Server on the back end, you’ll use Adobe Flash Builder and Open Source Media Framework to create video players that just work. Explore how to tune hardware acceleration with Stage Video to optimize battery life.”

  • Are you attending this session? Would you like early access to the sample files? We can help out. Sign up here to download the files.

Jun Heider – Multiscreen Project Best Practices

“Prepare to take the next step in multiscreen development. Review important considerations in planning multiscreen projects geared toward efficient code reuse and workflow. Also, see how to structure projects to match the strategy chosen to fit the application’s use case. By the end of the session, you’ll walk away with an understanding of how to start architecting your multiscreen Adobe Flash Platform applications and build them using Adobe Flash Builder.”

  • If you’re attending this session you can sign up here to receive the presentation materials from Jun’s session.

These guys know their stuff, so if you’re interested in developing applications destined for multiple screens, be sure to attend their sessions, ask questions, and meet them in person. You won’t be disappointed!

Can’t make it to MAX? Attend “Mini-MAX”!

Here in Denver, the Rocky Mountain Adobe User Group traditionally puts on an annual “mini-MAX” for those who couldn’t make the trip out to California. Those who were there give us their take on the conference, providing remote insight into MAX’s highlights. Join us on 11/08/11 at Casselman’s, 2620 Walnut Street in North Denver, CO, as well as every 2nd Tuesday of the month throughout the year, to talk all things Adobe!

Adobe Releases OSMF, Strobe Media Playback 1.6

Posted on September 08, 2011 at 3:38 pm in Development, Media Solutions

Back in early June, we reported on the pre-release of Adobe’s OSMF 1.6 and its support for late-binding audio. Adobe has been working hard to improve upon the upgrades delivered in the OSMF 1.6, Sprint 5 release, and to add even more new features for mobile as well. Today Realeyes Media is pleased to announce that OSMF 1.6 and Strobe Media Playback 1.6 have been granted their final release status.

A brief overview of the updates available in OSMF and Strobe Media Playback 1.6:

OSMF 1.6
  • In regards to late-binding audio, as promised, today’s release supports live playback as well as video on demand (VOD).
  • Also for late-binding audio, seek issues have been fixed.
  • For mobile – Stage Video support for hardware-accelerated video presentation (requires Flash Player 10.2+).
  • DVR rolling window support, which lets you specify how far back from the live point viewers can rewind (requires the newly released FMS 4.5).
Strobe Media Playback 1.6
Core Framework
  • Improvements to HTTP Dynamic Streaming as well as the ability to better manage bitrate profiles with multi-level manifests.
Documentation

http://sourceforge.net/apps/mediawiki/osmf.adobe

This is exciting news for those of us using OSMF and/or the Strobe Media Playback. Thank you to Cathi Kwon and the rest of the OSMF team for giving us these new and powerful feature updates!



For information on how Realeyes Media can help you integrate OSMF into your media solutions, please feel free to contact us today.


Scott Sheridan writes about, and messes around with, the latest technologies in digital motion media at Realeyes. He also does triathlons. Really big triathlons.

Feel free to reach out with any questions; we’re glad to help!

scott@realeyesmedia dot com

Jun Heider gave a really nice presentation this morning on how to leverage the Adobe Flash Platform P2P API to create applications for sharing video, audio, and data among application peers. In his talk, Jun demonstrated P2P technologies working across multiple devices and taking advantage of the flexible RTMFP protocol, an Adobe technology that allows for maximum scalability coupled with a dramatic reduction in server infrastructure and bandwidth costs.

View the presentation:

Meeting recording

Presentation slides (PDF)

Additional Resources

You can also check out a recent screencast from Jun’s blog demonstrating a multiscreen P2P call center application.


The following are demonstrations of some of the ways in which this technology can be implemented:

Basic Demos (right-click demos to view source)

Metrics Demo (Serverless)

Multiuser Video Demo

Elearning Demo

Adobe AIR – File Sharing

File Sharing Demo (AIR)

File Sharing Demo Source

Byte Array Chunker Utility

Part 1 of this series discussed HTTP Dynamic Streaming (HDS) at a fairly high level. The next few editions in the series will explore some of the more powerful features that make using this protocol advantageous. Multi-bitrate stream switching and file encryption are two important features that we’ll cover in the very near future, as they’re among the biggest reasons to choose streaming over any protocol. In this article, however, I’d like to discuss a brand-new feature of the Open Source Media Framework (OSMF) known as “late-binding audio”.


Late Binding Audio Defined

Late-binding audio refers to the ability to stream a video with multiple associated audio tracks, which makes it possible to play an alternative audio track on the client side using the same video file. There’s no need to encode, store, and deliver separate video + audio assets for each version you would like to provide. Say, for example, that you would like to provide video content with audio translated into multiple languages. Instead of creating separate video + audio files for each language, you encode the video only once and include the alternate audio-only tracks along with it. This represents a huge savings in time, storage, and bandwidth that anyone making the switch to HTTP Dynamic Streaming can take advantage of.

Updates to OSMF that came in version 1.6, Sprint 5 make streaming late-binding audio files over HTTP possible. Specifically, the MediaPlayer class now contains the read-only public property hasAlternativeAudio : Boolean. By using the LateBindingAudio example application included in the latest OSMF release, I’ll demonstrate step-by-step how to get this new feature to work.
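As a minimal sketch of what that property enables on the client: once media is loaded, a player can check for alternate tracks and switch between them. (hasAlternativeAudio comes straight from the release notes above; the other member names below are my reading of the OSMF 1.6 API, so treat them as assumptions.)

```actionscript
// Assumes mediaPlayer is an org.osmf.media.MediaPlayer whose loaded media
// is an .f4m manifest that exposes late-binding audio tracks
if (mediaPlayer.hasAlternativeAudio)
{
    // Number of audio-only tracks exposed by the manifest (assumed property name)
    trace("Alternate audio tracks: " + mediaPlayer.numAlternativeAudioStreams);

    // Switch to the first alternate track (assumed method name); OSMF swaps
    // the audio without interrupting video playback
    mediaPlayer.switchAlternativeAudioIndex(0);
}
```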

Many of the steps we’ll be taking are the same steps we took when packaging our files for simple streaming over HTTP, so if you’d like to review, please check out HTTP Dynamic Streaming – Part 1: An Introduction to Streaming Media.


Late-Binding Audio, Step-by-Step

1. Gather your media assets

In this example, we’ll be working with a video that has one alternate audio track (President Barack Obama’s speech from July 25th, and an alternate audio track of the transcription translated into Spanish). You can include as many alternate audio tracks as you’d like; however, there are some recommendations from the OSMF team regarding how you prepare your media. One suggestion is that you should use audio tracks that are at least as long as the main video + audio track to ensure smooth stream switching. Other guidelines relate to encoding best practices for streaming over HTTP in general. You can read the white paper on encoding standards here. A list of known issues with OSMF 1.6 Sprint 5 can be found in the release notes.

The creation of the media assets prior to packaging them for HTTP streaming is beyond the scope of this article, but for your information:

  • I used Adobe Premiere Pro 5.5 to edit the original video file down to something shorter (~2 min).
  • I used Adobe Audition CS 5.5 to edit the audio, and to create the alternate audio track.
  • I encoded the video and audio files to .f4v using Adobe Media Encoder (see part 1 of the series for file type requirements).
  • I happily found a transcription of the speech online.
  • Google Translate helped me with the translation (it’s been a while since I’ve spoken Spanish).
  • AT&T Natural Voices text-to-speech demo provided me with the .wav files of the Spanish audio.
So, to start you’ll need a minimum of 3 separate files:
  • The original video + audio file encoded into an .flv or MP4-compatible format
  • The audio track from the original video + audio encoded the same as above
  • An alternate audio track, hopefully of the same duration as the original audio, encoded the same as above
2. Package your media using the f4fpackager tool

This step is the same as it is for packaging files for simple streaming over HTTP, covered in part 1.

using the f4fpackager to package the media files

At this point, if you’d like to send additional arguments to the packager, you can enter them here and they’ll show up in the XML of the .f4m file; otherwise, use the minimum arguments. We’ll be editing the XML of the main video’s .f4m file in the next step. After you’ve packaged all of the files, it’s time to create a “master” .f4m file. I’m using 3 source files, so I have 3 sets of 3 packaged files:

  • Obama.f4m
  • ObamaSeg1.f4x
  • ObamaSeg1.f4f
  • Obama_Audio.f4m
  • Obama_AudioSeg1.f4x
  • Obama_AudioSeg1.f4f
  • Obama_altAudio.f4m
  • Obama_altAudioSeg1.f4x
  • Obama_altAudioSeg1.f4f
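As a sketch, the three packager runs that produced the files above might look like this at the Windows command line (the install and media paths are assumptions; --input-file and --output-path are the same minimum arguments described in part 1):

```shell
cd "C:\Program Files\Adobe\Flash Media Server 4.5\tools\f4fpackager"

f4fpackager.exe --input-file=C:\media\Obama.f4v --output-path=C:\media\packaged
f4fpackager.exe --input-file=C:\media\Obama_Audio.f4v --output-path=C:\media\packaged
f4fpackager.exe --input-file=C:\media\Obama_altAudio.f4v --output-path=C:\media\packaged
```

Each run emits the corresponding .f4m, .f4f, and .f4x files into the output folder.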
3. Create master .f4m file

Next, we’ll be adding some information from the two audio tracks’ .f4m files (the separated audio from the original video, and our alternate Spanish track) to the .f4m of the packaged main video file. Copy the “bootstrapInfo” and “media” tags from inside the .f4m files of the two audio tracks, and paste them into the main video’s .f4m file.

Add media and bootstrapInfo tags to main .f4m file

4. Add attributes to media tags in master .f4m

In order for late-binding audio to work, we’ll need to add a few attributes to the media tags inside the main .f4m file. In the media tag of your alternate audio, add:

  • alternate="true"
  • type="audio"
In order to get the example application that I’m using to behave the way I’d like, I added another attribute to the alternate audio’s media tag:
  • lang="Spanish"
The player uses that attribute to populate a dropdown menu of available alternate audio tracks, and by including it, I get a nicely-named menu item in the player.
*Note* I’ve noticed that when using packaged .f4v’s, the example player can’t load the files unless I add yet another attribute to (every) media tag:
  • bitrate=""
Apparently, the player doesn’t care what the bitrate value is, even if it’s an empty String, but it does seem to require that attribute when streaming packaged .f4v’s.
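Pulling steps 3 and 4 together, the master Obama.f4m might end up looking roughly like the sketch below. The id values are illustrative, and the base64 bootstrap payloads are elided as "..."; the attributes on the alternate track’s media tag are the ones discussed above (the article adds them to the Spanish track; depending on your player, the original audio-only track may need them as well):

```xml
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <id>Obama</id>
  <streamType>recorded</streamType>

  <!-- Main video + audio track; bitrate may be empty, but must be present -->
  <bootstrapInfo profile="named" id="bootstrapVideo">...</bootstrapInfo>
  <media url="Obama" bitrate="" bootstrapInfoId="bootstrapVideo"/>

  <!-- Original audio-only track, copied from Obama_Audio.f4m -->
  <bootstrapInfo profile="named" id="bootstrapAudio">...</bootstrapInfo>
  <media url="Obama_Audio" bitrate="" alternate="true" type="audio" lang="English"
         bootstrapInfoId="bootstrapAudio"/>

  <!-- Alternate Spanish track, copied from Obama_altAudio.f4m; alternate="true"
       and type="audio" are required, lang="Spanish" labels the player dropdown -->
  <bootstrapInfo profile="named" id="bootstrapAltAudio">...</bootstrapInfo>
  <media url="Obama_altAudio" bitrate="" alternate="true" type="audio" lang="Spanish"
         bootstrapInfoId="bootstrapAltAudio"/>
</manifest>
```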
Updated master .f4m file

5. Place all packaged files into vod folder in the webroot of your Apache server

When done, it should look something like this (“readme.htm” and “sample2_1000kbps.f4v” are files that come with Flash Media Server and can be ignored):

Packaged files in the vod folder on the server


Setting Up Flash Builder

6. Make sure you’re using the latest versions of Flash Builder, Flash Player, and OSMF

In order for this example to work, you’ll need to ensure that you’re using Flash Builder 4.5.1 and the latest OSMF .swc. You’ll need to replace the OSMF .swc that comes with the latest Flex SDK with the one from OSMF 1.6 Sprint 5, and deploy your project to the latest version of Flash Player (at least 10.2).

Use Flex 4.5.1, and Flash Player 10.2 and up

Use the latest OSMF .swc – OSMF 1.6, Sprint 5

As mentioned earlier, this example uses the LateBindingAudioSample application that comes bundled with the latest OSMF release. It can be found in OSMF/apps/samples/framework/LateBindingAudioSample. Modify this application to point to your main video’s .f4m file on the server.

That’s it! Ensure that your Apache web server is running, and if you’re using the same example application, run the application in debug mode to get valuable information about the stream in the Console. Select your video asset from the dropdown menu up top, and hit “Play”. Choose the alternate audio stream at any time from the dropdown in the lower left of the application.



Where to go from here

For a more in-depth look into HDS, including discussions on file encryption and live streaming, please refer to John Crosby’s series on HTTP Dynamic Streaming.

For an informative look into the world of OSMF, including deep dives into such things as building custom media players and plugin integration and development, please see David Hassoun and John Crosby’s article series “Mastering OSMF” on the Adobe Developer Connection site.


For information on how Realeyes Media can help you make the switch to HTTP Dynamic Streaming, please feel free to contact us today.


Documentation

Adobe HTTP Dynamic Streaming documentation

OSMF.org

f4fpackager documentation

F4M file format specification

Downloads

HTTP origin module

f4fpackager

Flash Media Development Server (free)

Apache web server



This article has been translated into Serbo-Croatian by Vera Djuraskovic of "http://webhostinggeeks.com/" Webhostinggeeks.com, at "http://science.webhostinggeeks.com/http-dinamic-streaming".

As technologies related to the production and delivery of digital motion media continue to advance, so do consumer demands for an increasingly varied and rich media viewing experience. High-definition video is now being delivered to more users, on a wider variety of devices, and through more complex networks than ever before. For content providers, this of course means more available avenues for media distribution and monetization.

Then

Traditionally, video has been delivered to the client in one of two ways: either by progressive download using the widely supported HTTP protocol, or by streaming, using protocols such as RTP, RTMP, UDP, or TCP in conjunction with specialized server-side software to handle the stream (e.g., Flash Media Server or Windows Media Services). Each delivery method has its advantages and disadvantages. Streaming protocols offered the viewer a better experience by allowing the video to play back right away, without first waiting for a complete download. They also made possible features such as adaptive bitrate streaming to compensate for fluctuations in user bandwidth, live viewing, content encryption, and smart scrubbing. These features often came at a significant cost, however, and as such weren't a viable option for many content providers. Delivery over HTTP, on the other hand, required a complete file download before viewing could start. In addition, content transferred this way was stored on the end user's hard drive, and was therefore often not the best solution for copyright-protected content. Support for the HTTP protocol, however, was and remains very widespread. No specialized server technology is required to deliver content over HTTP: a simple (and free) web server will do. Being supported by existing, widespread server hardware and caching infrastructure continues to be one of the major advantages of the HTTP protocol.

Now

In the past, content providers often faced a difficult dilemma. Should they make a relatively large financial investment in order to provide the best streaming video experiences to their end users? Or would their ROI be better served by delivering a less robust viewing experience, albeit to a potentially larger audience, over HTTP? Companies such as Move Networks, Microsoft, Adobe, and Apple have come up with their own unique solutions to this problem: dynamically streaming media over the HTTP protocol. Each solution involves breaking the encoded media files into smaller chunks, which are then re-assembled by the media player on the client end.

A few adaptive bitrate streaming solutions:

HTTP Dynamic Streaming – Adobe

Since the release of Adobe Flash Player 10.1 and the Open Source Media Framework 1.0 (OSMF), content delivery providers, creators, and publishers have had the option of leveraging HTTP Dynamic Streaming to vastly increase their reach when it comes to delivering quality video experiences to the client. HTTP Dynamic Streaming (HDS) is a true streaming technology, yet it does not depend on specialized streaming servers or proprietary transfer protocols. In addition, the tools required to make your media files streamable over HTTP are provided free from Adobe.


To prepare your media for HDS, you do the following:

Package your FLV, F4V, or other MP4-compatible files using the free f4fpackager tool.

Download f4fpackager The f4fpackager is a command-line tool, available for Windows and Linux, that converts your source media files into the fragmented files required for streaming. You can get the packager for free on its own, or use the version that ships with Flash Media Server 4.0 and up. The process is fairly simple and quick, much faster than encoding the source files to begin with! To run the packager on Windows, at the command line, "cd" into the "tools\f4fpackager" folder of your Flash Media Server or Apache web server installation. From there, simply type "f4" and press Tab to let the command window auto-complete your prompt and launch f4fpackager.exe. Give the packager at least the following arguments:

--input-file=theFullPathToYour/Media
--output-path=theFullPathToTheOutputLocationOfYourChoice

Alternatively, you can omit the output argument, and the packager will place the packaged files into the source directory. More packager arguments for doing things like declaring a file’s bitrate, encrypting, etc. can be found here.
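The invocation above can be sketched as a short shell session. The packager path, media filename, and bitrate value below are hypothetical placeholders; substitute the paths from your own installation:

```shell
# Hypothetical locations; substitute your own installation and media paths.
PACKAGER="/opt/adobe/fms/tools/f4fpackager/f4fpackager"
INPUT="/media/source/sample_1500kbps.f4v"
OUTDIR="/webroot/vod"

# --bitrate is one of the optional arguments; declaring it helps players
# pick the right rendition when you package multiple bitrates for
# adaptive switching.
ARGS="--input-file=$INPUT --output-path=$OUTDIR --bitrate=1500"

# Print the full command before running it (remove 'echo' to execute).
echo "$PACKAGER" $ARGS
```

Running one such command per rendition (e.g., 500, 1000, and 1500 kbps versions of the same source) is how you would prepare a multi-bitrate set for adaptive streaming.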

Using the f4fPackager at the command line

If everything goes well, you should have 3 new files for every source file you sent to the packager:

  1. .F4M (manifest) file
  2. .F4F (fragment) file
  3. .F4X (index) file

The manifest file (.F4M) is an XML file that contains pertinent information about your media that the media player parses in order to play back the file appropriately. To learn more about the .F4M, and .F4F file types, please check out John Crosby’s series on HTTP Dynamic Streaming.
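For illustration, an abridged manifest might look like the following. The element values here are hypothetical, and a real manifest generated by f4fpackager carries base64-encoded bootstrap data where the ellipsis appears:

```xml
<!-- Abridged, illustrative .F4M manifest; all values are hypothetical. -->
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <id>sample</id>
  <streamType>recorded</streamType>
  <duration>120</duration>
  <!-- bootstrapInfo normally contains base64-encoded fragment run data -->
  <bootstrapInfo profile="named" id="bootstrap1">...</bootstrapInfo>
  <!-- one media element per rendition; multiple entries enable bitrate switching -->
  <media url="sample" bitrate="1500" bootstrapInfoId="bootstrap1"/>
</manifest>
```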

The 3 packaged files: .F4M, .F4F, and .F4X

Ensure that you have the HTTP Origin Module ready to go.

Install and configure the HTTP Origin Module into an existing Apache web server installation. The HTTP Origin Module is an extension to the Apache web server that is necessary for streaming media via HTTP to the Flash Player. You can download the module here. Alternatively, both the HTTP Origin Module and the Apache web server come bundled and configured with Flash Media Server versions 4.0 and up. *Note* Make sure your Apache server is running. On Windows, by default, the Apache web service start configuration within Flash Media Server 4 is set to “manual”. You may want to switch this to “automatic”.

Place your packaged media files into the vod folder of your Apache web server. (webroot/vod/).

Once you have your files properly packaged, and you've installed and configured your Apache web server along with the HTTP Origin Module (standalone, or bundled within Flash Media Server), all you need to do on the server side is place the 3 packaged files into the vod folder within your Apache server and grab the URL(s) of the media file(s) you wish to stream. *Note* Apache is set to listen on port 80 by default, and to switch over to port 8134 if port 80 is in use. However, you may configure your Apache server to listen on any available port.

Make sure Apache service is running

Configure your media player to point to the URL of the media within your vod directory.

You’re welcome to build your own custom player to stream via HTTP, however, the fine people at Adobe and Realeyes Media have already done a lot of the work for you. Give any or all of the following example players a try:

REOPS Player  A powerful, OSMF-based media player from Realeyes Media The Realeyes OSMF Player Sample (REOPS) offers an excellent base for creating a robust video player utilizing the Open Source Media Framework (OSMF) from Adobe. REOPS is meant to be a building block for developers, as well as a visual demonstration of the capabilities of the OSMF framework. The REOPS project includes a very extensible and robust control bar skinning solution, with templates to help customize the control bar, as well as full-screen support, closed captioning from an external file, and OSMF dynamic plugin support. The REOPS project can be used to deploy easily customized video players that support progressive video playback, video-on-demand streaming, live streaming, and dynamic streaming. What's more, all of these features are configurable from an external XML file.

Flash Media Playback A free, standard media player for the Adobe Flash Platform Flash Media Playback can be used by any website with only a few lines of HTML, enabling playback of video and other media in minutes. Its extensible plug-in architecture enables easy integration with content delivery networks (CDNs) and advertising platforms, as well as support for analytics and additional third-party services. With support for the latest delivery methods, Flash Media Playback enables web developers of all levels to fully utilize the powerful video features of the Flash Platform.

Strobe Media Playback A free, OSMF-based media player from Adobe Strobe Media Playback is an Open Source Media Framework (OSMF) based media player that you can quickly and easily integrate into your website. The compiled SWF and its source code are available for free download here.

…Or Build Your Own With These Tutorials! Mastering OSMF-Adobe Developer Connection Series John Crosby and David Hassoun of Realeyes Media have written an excellent series of articles and walkthrough tutorials that teach how to work with OSMF. They start with building a simple media player, and then dive deeper into more complex topics such as separating out controls, incorporating media overlays, and integrating and developing custom plugins.

Whether you decide to build your own, or use a media player that's been provided, you'll need to point your application to the .F4M file within your Apache server's vod directory. Again, this is the media manifest file, an XML file that the media player uses to parse important information about the media, such as bitrate, duration, etc.
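As a sketch, pointing a player such as Strobe Media Playback at your manifest can be as simple as an embed whose source variable references the .F4M URL. The SWF path, dimensions, and media URL below are hypothetical placeholders, so adjust them to wherever you host the player and your packaged files:

```html
<!-- Illustrative embed; the SWF path and f4m URL are placeholders. -->
<object width="640" height="360">
  <param name="movie" value="StrobeMediaPlayback.swf" />
  <!-- src points at the .F4M manifest served from the vod directory -->
  <param name="flashvars" value="src=http://localhost/vod/sample.f4m" />
  <embed src="StrobeMediaPlayback.swf" width="640" height="360"
         flashvars="src=http://localhost/vod/sample.f4m" />
</object>
```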

*Note*
 HTTP Dynamic Streaming requires Flash Player 10.1 and above. Any version of OSMF, starting with 1.0, will be capable of HDS.


Streaming Demo

Below is an example of the embedded Flash Media Playback media player delivering a video over HTTP. If you'd like, you can configure your own player here. If you would like to use the same packaged files playing in the demo, you can download them here. Of course, this demonstration merely shows a video stream via HTTP; it's not an example of the more powerful features available with HDS, such as variable bitrate switching, encryption, or late-binding audio. Stay tuned for coverage of those topics and more.


Where to go from here

For a more in-depth look into HDS, including discussions on file encryption, and live streaming, please refer to John Crosby’s series on HTTP Dynamic Streaming

For an informative look into the world of OSMF, including deep dives into such things as building custom media players and plugin integration and development, please see David Hassoun and John Crosby's article series "Mastering OSMF" on the Adobe Developer Connection site.


For information on how Realeyes Media can help you make the switch to HTTP Dynamic Streaming, please feel free to contact us today.


Documentation

Adobe HTTP Dynamic Streaming documentation

OSMF.org

f4fpackager documentation

F4M file format specification

Downloads

HTTP origin module

f4fpackager

Flash Media Development Server (free)

Apache web server


Scott Sheridan writes about, and messes around with, the latest technologies in digital motion media at Realeyes. He also does triathlons. Really big triathlons.

Feel free to reach out with any questions; we're glad to help!

scott AT realeyes DOT com

Adobe Labs is currently previewing its latest animation tool, Adobe Edge, which you will be able to use to create animations destined for screens of all sizes. By using the latest web standards, such as HTML/HTML5, CSS3, and JavaScript, animators will be able to use Edge to create motion content with its easy-to-use, timeline-based interface. Edge will allow you to create compositions from scratch, or to import and animate existing web graphics (bitmap or SVG) and CSS-based HTML layouts.

Here’s an introduction to Adobe Edge from Doug Winnie:


Stay tuned, as an early preview release should be available soon. (July-ish)

  • You can sign up to be notified when the preview release becomes available here.
  • Follow Adobe Edge on Facebook

Today Adobe released version 4.5 of Flash Builder and the open-source Flex SDK. Arguably the most important update in this release is the added support for mobile application development. Developers can leverage Flash Builder 4.5 and the Flex 4.5 SDK to create applications destined for multiple mobile platforms using a common code base (Flex and/or ActionScript 3.0). The new SDK includes 21 new, ready-made components that developers can use to build applications for Android, BlackBerry, or iOS. Testing and debugging can be done on the desktop by leveraging the AIR-based device emulator, or by connecting devices locally and using a one-click process to package, deploy, and launch the application. Flash Builder 4.5 will generate platform-specific installer files to upload to a mobile application distribution site or store.

Adobe’s Serge Jespers demonstrates how to create an application that can run on several devices running Android, BlackBerry, or iOS using Flex and Flash Builder.

More on Flash Builder 4.5, and Flex 4.5:

Article: Coding productivity enhancements in Flash Builder 4.5

Video: Testing Android applications on the desktop

Download Adobe Flash Builder/Flex 4.5


Adobe Flex SDK

Adobe Flash Builder
