New in Flash Player 11: Encoding Live Video to H.264/AVC

Posted on October 31, 2011 at 6:59 pm in Development, Media Solutions

The latest version of Flash Player (v. 11.0) includes some exciting new features, including performance upgrades such as native 64-bit support and asynchronous bitmap decoding. Perhaps most newsworthy, though, is Flash Player’s new capability to encode live video streams to the H.264/AVC standard. This new feature allows developers to create real-time, high-quality, live video streaming applications for chat, conferencing, and live event broadcasting.

The following article demonstrates how to take advantage of Flash Player 11.0’s new H.264 encoding capabilities within a video streaming application built using Flash Builder 4.5. The application does the following:

  • Captures live video from a webcam
  • Establishes a connection to Flash Media Server 4.5 using the NetConnection class
  • Publishes video stream from application to FMS using an instance of the NetStream class
  • Displays outgoing video stream from camera (prior to being encoded) in a Video component within the application
  • Sends encoding parameters to Flash Player 11.0 to encode the raw webcam video to H.264
  • Displays encoded video’s metadata, demonstrating that encoding worked
  • Streams live, encoded video from FMS to the application using another instance of the NetStream class
  • Displays newly encoded, streamed live video in another Video component within the application

H.264 Encoding in Flash Player 11.0 Example Application

Example Application showing live stream from webcam (left) and stream encoded to H.264 in Flash Player 11.0 (right).

To follow along with the example, please be sure to have the following:

  • A webcam attached to your computer
  • Flash Builder 4.5
  • The playerglobal.swc for Flash Player 11.0
  • Access to an installation of Flash Media Server 4.5 (local or remote)

Getting Started – Configuring Compiler Settings

To develop applications that target the new features available in Flash Player 11.0, it is necessary to configure the compiler to target player-version “11.0” and SWF-version “13”, as well as the playerglobal.swc for Flash Player 11.0. To make these changes:

    1. Download the new playerglobal.swc for Flash Player 11.0, and rename this file from “playerglobal11_0.swc” to “playerglobal.swc”.
    2. Create a folder named “11.0” in the directory “frameworks\libs\player” that is inside your Flex SDK installation folder. (Fig. 1.0)
    3. Put the playerglobal.swc inside the new folder (“11.0”).
    4. Locate the file “flex-config.xml”, which is in the “frameworks” folder within your Flex SDK installation directory.
    5. Within “flex-config.xml”, locate the “target-player” tag, which specifies the minimum player version that will run the compiled SWF.
    6. Set the “target-player” value to “11.0”. (Fig. 1.1)
    7. Also within “flex-config.xml”, locate the “swf-version” tag, which specifies the version of the compiled SWF.
    8. Set the “swf-version” value to “13”. (Fig. 1.1)
    9. Save “flex-config.xml”.
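
As a rough sketch of what steps 5–8 produce (the exact position of these tags within your copy of flex-config.xml may differ by SDK version), the two edited tags should end up looking like this:

```xml
<flex-config>
    <!-- Minimum player version that will run the compiled SWF -->
    <target-player>11.0</target-player>
    <!-- Version of the compiled SWF -->
    <swf-version>13</swf-version>
</flex-config>
```

If you compile from the command line rather than from Flash Builder, the same values can be passed directly to mxmlc with the -target-player=11.0 and -swf-version=13 options.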


Figure 1.0. Create a folder for the playerglobal.swc named “11.0”.


Figure 1.1. Edit values of “target-player” and “swf-version” tags within the flex-config.xml file.


Setting Up the Project in Flash Builder 4.5

The example application is a simple ActionScript 3.0 project (not a Flex or AIR project). To create a similar project in Flash Builder:

    1. Choose File -> New -> ActionScript project.
    2. Name the project “H264_Encoder”, and click “Finish”.
    3. In Flash Builder, with the H264_Encoder project selected, choose Project -> Properties.
    4. Verify that the compiler is targeting Flash Player 11.0. (Fig. 1.2) If it isn’t, select the “Use a specific version” radio button, and type “11.0.0” for the value.


Figure 1.2. Make sure that the compiler is targeting Flash Player 11.0 by inspecting the project’s properties.

At this point, the application should look similar to the following:

package
{
public class H264_Encoder extends Sprite
{
public function H264_Encoder()
{
}
}
}

Next up, you’ll be modifying the application so that it can communicate with your webcam. In addition, you’ll add the code necessary for establishing a NetConnection to connect the application to Flash Media Server, as well as two NetStream instances; one responsible for getting the video from the application into Flash Media Server, and one for bringing it back from the server into the application.


Coding the Application – Connecting a Camera, Establishing a NetConnection and NetStreams
    1. Directly under the opening class definition statement, but before the constructor method, create a private variable named “nc”, data typed as NetConnection. Use code hinting to have Flash Builder generate the necessary import statements for you by starting to type “NetC…”, then hitting CTRL-SPACE to receive code hinting. Select “NetConnection” from the list, and notice that Flash Builder has imported the NetConnection class from the flash.net package. If for some reason the import fails, go ahead and import it manually. Your code should appear as follows:

package
{
import flash.net.NetConnection;

public class H264_Encoder extends Sprite
{
private var nc:NetConnection;

public function H264_Encoder()
{
}
}
}

    2. Create two private variables, data typed as NetStream, to represent the streams. Create one for the stream going from the application to the server (ns_out), and another for the stream coming back into the application from the server (ns_in), and remember to use code hinting to have Flash Builder import the necessary classes.

package
{
import flash.net.NetConnection;
import flash.net.NetStream;

public class H264_Encoder extends Sprite
{
private var nc:NetConnection;
private var ns_out:NetStream;
private var ns_in:NetStream;

public function H264_Encoder()
{
}
}
}

    3. Next, create a private variable named “cam” of type Camera, and set its value to Camera.getCamera(). The Camera class is a little different from other classes, in that you don’t call a constructor to instantiate a Camera object. Instead, you call the static getCamera() method of the Camera class. This method returns a Camera instance, or null if no camera is attached to the computer or the camera is in use by another application.

private var cam:Camera = Camera.getCamera();

Make sure the Camera class was imported:

import flash.media.Camera;
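
Since getCamera() can return null, a defensive check is worth adding before the cam variable is used. The following is a sketch, not part of the original example; the guard could live anywhere the camera is about to be used:

```actionscript
// Sketch: getCamera() returns null when no camera is attached,
// or when the camera is in use by another application.
if (cam == null)
{
trace("No camera available - publishing will not work.");
}
```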

    4. It is now time to add code that will allow the application to connect to Flash Media Server using an instance of the NetConnection class. Under the import statements, the local variables, and the closing brace of the constructor function, create a private function named initConnection() that takes no arguments and returns void:

private function initConnection():void
{
}

    5. As the first line of the function body, create a new NetConnection by instantiating the nc:NetConnection variable, which you declared in step 1:

nc = new NetConnection();

    6. It’s always a good practice to verify that the NetConnection was successful. Next, add an event listener for the NetStatusEvent.NET_STATUS event, handled by a function named onNetStatus(). You will create the onNetStatus() handler in the next section:

nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);

Be sure to either use code hinting, or import manually, the NetStatusEvent class, which is in the flash.events package:

import flash.events.NetStatusEvent;

    7. Next, and still within the initConnection() function body, tell the NetConnection where to connect by calling the connect() method of the NetConnection class. As an argument to this method, pass the URL for the location of the “live” folder within the Flash Media Server installation you want to connect to. The URL included in the example uses the RTMP protocol, and connects to the “live” folder within a copy of Flash Media Server installed on one of our servers. You can also stream to a local version of Flash Media Server, if you have one installed, by setting the URL to “rtmp://localhost/live”.

nc.connect("rtmp://office.realeyes.com/live");

    8. Finally, tell the NetConnection where Flash Media Server should invoke callback methods by setting the NetConnection’s “client” property to “this”. Callback methods are special handler functions invoked by Flash Media Server when a client application establishes a NetConnection. Later in this example you will work with the onMetaData() and onBWDone() callback methods. Because these callbacks live in the main application class, which is the same object that establishes the NetConnection, the NetConnection instance’s (nc) client property should be set to “this”.

nc.client = this;

The completed initConnection() function should appear as follows:

private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://office.realeyes.com/live");
nc.client = this;
}


Coding the Application – Verifying a Successful NetConnection
    1. As mentioned, it’s always a good practice to verify the success of a NetConnection attempt. To do this, create a protected function named onNetStatus() that takes an event, named “event”, of type “NetStatusEvent” as its only argument, and returns void:

protected function onNetStatus(event:NetStatusEvent):void
{
}

    2. Within the onNetStatus() handler body, create a trace statement that outputs the value of event.info.code to the console during debugging. The code property of the event’s info object contains String data that indicates the status of the attempted NetConnection, such as “NetConnection.Connect.Success” or “NetConnection.Connect.Failed”. Tracing the value of this property allows you to confirm the status of the NetConnection simply by running the application in debug mode.

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
}

    3. Next, within the function body, and beneath the existing trace statement, create a conditional statement that checks the value of event.info.code and compares it to the value “NetConnection.Connect.Success”. If event.info.code == “NetConnection.Connect.Success”, call three functions that you will create in the next section; one that publishes an outgoing video stream, one that displays the incoming video from the webcam, and one that displays the video stream being sent back to the application from the server. The completed onNetStatus() function should appear as follows:

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
if(event.info.code == "NetConnection.Connect.Success")
{
publishCamera();
displayPublishingVideo();
displayPlaybackVideo();
}
}

    4. This example attempts to connect to the server and start playing/publishing video automatically when launched. To achieve this, call initConnection() from within the main class’s constructor method:

public function H264_Encoder()
{
initConnection();
}

At this point, you have included the code necessary to establish a NetConnection, and verify the success or failure of that connection with a trace statement. In addition, you’ve included calls to functions that, when written, will handle the publishing and playback of the video from the webcam, as well as the video coming back from the server.

If you save the application now, you’ll notice some errors. The calls to publishCamera(), displayPublishingVideo(), and displayPlaybackVideo() generate errors because those functions haven’t been written yet. You can comment out the calls to these functions and run the application in debug mode. If everything is set up correctly, you should see the trace output “NetConnection.Connect.Success”.

Comment out calls to unwritten functions

However, you should also see this error in the console: “ReferenceError: Error #1069: Property onBWDone not found on flash.net.NetConnection and there is no default value.”. This is because Flash Media Server is attempting to invoke a callback function on the application that hasn’t been written yet. In the next section you will include those callback functions.

connect success and bwdone error in console


Coding the Application – Including the Callback Functions, and Creating a TextField to Display Metadata

The sample application contains two callback functions – onBWDone() and onMetaData(). The onBWDone() callback reports available bandwidth, which can be useful in applications that need to dynamically switch video assets according to the bandwidth that’s currently available. Although it’s necessary to include this function in the client code (omitting it will generate an error when the server tries to make the function call), it’s not necessary to actually do anything with it. This application isn’t concerned with monitoring bandwidth, so it can be left as an empty function.

The onMetaData() callback function is useful for accessing a video stream’s metadata, and you will be adding code to this callback to do just that. The onMetaData() callback receives a generic Object whose properties represent the video stream’s metadata. In the next section, you will create those properties to represent various metadata, and access their values in order to display the information within the UI. For now, you will simply add the two callback functions, and add some code to onMetaData() to access that metadata. In addition, you will create a TextField that you will eventually use to display the metadata in the UI.

    1. Create a new private instance variable named “metaText”, and type it as an instance of the TextField class. Set its initial value to “new TextField()”:

private var metaText:TextField = new TextField();

*Note – At this point you are simply creating the metaText object in memory. You won’t actually add it to the display list until later in the example.

Be sure to import the necessary TextField class:
import flash.text.TextField;

    2. Include the required onMetaData() callback function. Create a new public function named “onMetaData()” that accepts an Object named “o” as its only parameter, and returns void:

public function onMetaData( o:Object ):void
{
}

    3. To access the video stream’s metadata, you will loop through the properties of the Object passed to the onMetaData() callback function. Again, you will create those properties in the next section, but for now, within the onMetaData() function create a “for…in” loop. Within the loop’s initializer, declare a local variable named “settings”, typed as String, to hold each property name in the Object “o”.

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
}
}

    4. Next, within the loop body, include a trace statement that will output the name of each property passed to onMetaData(), concatenated with “=” and the property’s value.

trace(settings + " = " + o[settings]);

    5. Finally, inside the for…in loop body, append to metaText’s text value each property’s name, concatenated with “=” and the property’s value. Create a new line for each iteration, and adjust the spacing between the double quotes (adding an extra “\n” if you want to double-space the text) to properly lay out the text in the UI.

metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";

*Note* The layout and styling in this example are not intended to be examples of UI programming best practices. UI programming is outside the scope of this article.

The completed onMetaData() callback function should be similar to this:

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
trace(settings + " = " + o[settings]);
metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";
}
}

    6. Next, add the onBWDone() callback function. Create a new public function named “onBWDone()” that takes no arguments, and returns void:

public function onBWDone():void
{
}

Remember that the onBWDone() callback is invoked by Flash Media Server to report available bandwidth, and this application doesn’t require that information. It still must be included, however, since the server will be calling it on the application object. To avoid a runtime error, simply leave the onBWDone() callback empty, as shown above.

Now that the application has the necessary callback functions, and it loops through the metadata properties passed to onMetaData() to populate a TextField with that data, it’s time to add code that enables the application to read webcam data, encode that data to the H.264 standard, and then stream the encoded video.


Coding the Application – Setting Up H.264 Encoding, and Publishing to the NetStream

In this next section, you will attach your webcam to an instance of the Camera class. You will then encode the webcam input to H.264 using properties of the Camera class and the new H264VideoStreamSettings class. Certain encoding parameters can’t be set with the new H264VideoStreamSettings class yet (although support for this is hopefully coming soon), so you’ll be setting those values from properties in the Camera class.

Next, you will attach the encoded video to a live video stream, and stream it to Flash Media Server’s “live” directory. (You will bring a new stream back into the application from Flash Media Server in the next section)

Finally, in order to read the metadata of the newly encoded video stream, you will call the send() method of the NetStream class (available only when using Flash Media Server). As arguments to the send() method, you will pass “@setDataFrame” (a special handler method within Flash Media Server), the name of the onMetaData() callback method you added earlier to listen for the metadata client-side, and finally a local variable (“metaData”), data typed as an Object, that holds the desired metadata items. First:

    1. Create a protected function named “publishCamera()” that takes no arguments and returns void:

protected function publishCamera():void
{
}

    2. In the first line of this new function, instantiate the ns_out NetStream object by calling its constructor. Pass the constructor the NetConnection instance “nc”:

ns_out = new NetStream(nc);

    3. On the next line, attach the Camera instance “cam” to the outgoing NetStream by calling the attachCamera() method of the NetStream class. Pass this method the cam instance:

ns_out.attachCamera(cam);

    4. Next, create a new local variable named “h264Settings”, data typed as H264VideoStreamSettings, and set its initial value equal to “new H264VideoStreamSettings()”:

var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();

Be sure to import the H264VideoStreamSettings class:

import flash.media.H264VideoStreamSettings;

    5. Call the setProfileLevel() method of the H264VideoStreamSettings class on the h264Settings object to encode the video using the “BASELINE” profile and a level of “3.1”:

h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

Be sure to import both the H264Profile class, and the H264Level class:

import flash.media.H264Level;
import flash.media.H264Profile;

    6. Next, use the setQuality() method of the Camera class to set the outgoing stream’s maximum bandwidth to 90000 bytes per second, with a quality setting of “90”:

cam.setQuality(90000, 90);

    7. Use the setMode() method of the Camera class to set the video’s width, height, and frames per second. The fourth parameter determines whether the capture size should be maintained when the camera has no native mode matching these values:

cam.setMode(320, 240, 30, true);

    8. Next, using the setKeyFrameInterval() method of the Camera class, set the video’s keyframe interval to 15 (at 30 fps, two keyframes per second):

cam.setKeyFrameInterval(15);

    9. To set the outgoing video’s compression settings, assign the h264Settings variable to the videoStreamSettings property of the outbound stream, “ns_out”:

ns_out.videoStreamSettings = h264Settings;

    10. Call the publish() method of the NetStream class on the outgoing NetStream, and pass it parameters to provide a name for the stream (“mp4:webCam.f4v”), as well as a destination folder in Flash Media Server (“live”):

ns_out.publish("mp4:webCam.f4v", "live");

    11. Now it’s time to create the object that will hold the metadata values of the encoded video you will access at runtime. Create a new local variable named “metaData”, data typed as an Object, and set its initial value equal to “new Object()”:

var metaData:Object = new Object();

The metaData object is generic, meaning you can assign any name/value pairs you like. For example, there’s no encoding setting that comes from the Camera, VideoStreamSettings, or H264VideoStreamSettings classes that would allow you to display a copyright, but you can add one easily enough like this:

metaData.copyright = "Realeyes Media, 2011";

Of course, you can also create properties with values that do come from settings within the aforementioned classes, such as:

metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;

    12. Create the following metaData properties and add them to the publishCamera() function:

metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwith = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";

    13. Call the send() method of the NetStream class on the ns_out object and pass it the name of the handler method “@setDataFrame”, the callback method name “onMetaData”, and the local variable metaData:

ns_out.send("@setDataFrame", "onMetaData", metaData);

The completed publishCamera() function should resemble the following, with the exception of the commented-out code:

protected function publishCamera():void
{
ns_out = new NetStream(nc);
ns_out.attachCamera(cam);
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

// ALTHOUGH FUTURE VERSIONS OF FLASH PLAYER SHOULD SUPPORT SETTING
// ENCODING PARAMETERS ON h264Settings BY
// USING THE setQuality() and setMode() METHODS,
// FOR NOW YOU MUST SET THE PARAMETERS ON THE CAMERA FOR:
// BANDWIDTH, QUALITY, HEIGHT, WIDTH, AND FRAMES PER SECOND.
// h264Settings.setQuality(30000, 90);
// h264Settings.setMode(320, 240, 30);

cam.setQuality(90000, 90);
cam.setMode(320, 240, 30, true);
cam.setKeyFrameInterval(15);
ns_out.videoStreamSettings = h264Settings;
trace(ns_out.videoStreamSettings.codec + ", " + h264Settings.profile + ", " + h264Settings.level);
ns_out.publish("mp4:webCam.f4v", "live");

var metaData:Object = new Object();
metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwith = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";
ns_out.send("@setDataFrame", "onMetaData", metaData);
}


Coding the Application – Displaying and Encoding the Video From the Webcam, and Displaying Video Streamed Back From the Server

The application needs to display both the raw, un-encoded incoming video from the webcam, as well as the inbound streaming video after it has been encoded to H.264 in the Flash Player, sent to Flash Media Server, and then back to the application. In addition, the metadata that you defined in the previous section needs to be displayed in the UI to reveal the encoding settings defined in publishCamera().

In this next section, you will create two functions, displayPublishingVideo(), and displayPlaybackVideo() to play the streams and display the metadata on screen.

    1. Create a new private instance variable named vid_out, and set its data type to Video:

private var vid_out:Video;

Be sure to import the Video class:

import flash.media.Video;

This new instance of the Video class will be used to play back the not-yet-encoded video coming in from the webcam.

    2. Next, create a protected function named displayPublishingVideo() that takes no arguments and returns void:

protected function displayPublishingVideo():void
{
}

    3. In the first line of the function body, instantiate the vid_out variable by calling the constructor method of the Video class:

vid_out = new Video();

    4. To place the new Video component on screen correctly, assign x and y values to vid_out so that x = 300 and y = 10:

vid_out.x = 300;
vid_out.y = 10;

    5. Next, use the height and width values from the webcam to set the height and width of the video display:

vid_out.width = cam.width;
vid_out.height = cam.height;

    6. To allow the vid_out component to display video coming from the webcam, call the attachCamera() method of the Video class, and pass that method the instance of the Camera class that represents the webcam:

vid_out.attachCamera(cam);

    7. Finally, add vid_out to the display list by calling the addChild() method of the DisplayObjectContainer class:

addChild(vid_out);

At this point, the displayPublishingVideo() function should look similar to:

protected function displayPublishingVideo():void
{
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
vid_out.width = cam.width;
vid_out.height = cam.height;
vid_out.attachCamera(cam);
addChild(vid_out);
}

If you run the application at this point, provided you have a webcam attached to your computer (and you un-commented the calls to publishCamera() and displayPublishingVideo() within onNetStatus()), you should see the Flash Player dialog that asks permission to access your camera. Grant Flash Player permission, and you should see a live video feed coming from your webcam.

Next, you’ll add code to the displayPublishingVideo() function that will display the metadata objects you created earlier. The metadata text won’t show up until the code is in place to handle the incoming stream, however. This is because metaText’s text property is set within the onMetaData() function, and onMetaData() is run only when Flash Media Server sends the stream back to the application. You’ll start by adding the metaText TextField object to displayPublishingVideo() and assigning values for its properties:

    1. In the displayPublishingVideo() function, directly under the existing addChild() method call, set metaText’s “x” value to “0”, its “y” value to “55”, its width to “300”, and its height to “385”:

metaText.x = 0;
metaText.y = 55;
metaText.width = 300;
metaText.height = 385;

    2. Assign color values for the backgroundColor, textColor, and borderColor of metaText. In order to display the backgroundColor and borderColor, you must set both the background and border properties to “true”:

metaText.background = true;
metaText.backgroundColor = 0x1F1F1F;
metaText.textColor = 0xD9D9D9;
metaText.border = true;
metaText.borderColor = 0xDD7500;

    3. Add the metaText TextField object to the display list by calling the addChild() method, and passing it the metaText object:

addChild(metaText);

Next, you’ll create a function that will bring the video stream back in from Flash Media Server, and display it in another Video object.

    1. Create a new instance variable named vid_in and data type it as a Video.

private var vid_in:Video;

    2. Next, create a new protected function called “displayPlaybackVideo()” that takes no arguments and returns void:

protected function displayPlaybackVideo():void
{
}

    3. In the first line of the function body, instantiate ns_in, the NetStream variable you declared earlier, passing the “nc” NetConnection as an argument to the constructor:

ns_in = new NetStream(nc);

    4. Instead of calling the attachCamera() method, as you did for the previous NetStream, set the client property of the new NetStream to “this”:

ns_in.client = this;

    5. Next, call the play() method of the NetStream class, and pass it the String value for the name of the stream. This should be the same name as the outgoing stream:

ns_in.play("mp4:webCam.f4v");

    6. Instantiate the vid_in variable by calling its constructor:

vid_in = new Video();

    7. Next, set some sizing and layout properties for the new Video object so that it sits properly on the stage:

vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
vid_in.width = cam.width;
vid_in.height = vid_out.height;

    8. Attach the incoming NetStream to the Video object to have it play back the video:

vid_in.attachNetStream(ns_in);

    9. Finally, add vid_in to the display list by calling the addChild() method and passing vid_in as its only argument:

addChild(vid_in);

Make sure to un-comment the call to displayPlaybackVideo() in the onNetStatus() function, and then save and run the application. You should see a dark rectangle appear that displays the video’s encoding settings, and two video streams, side-by-side. The video on the left is the raw video footage coming from the webcam, and the one on the right is the stream coming back from Flash Media Server.

Coding the Application – Adding Some Finishing Touches

The application is almost done! It could stand a little visual cleanup, however.

    1. First, add SWF metadata above the class declaration to set the height and width of the application to something more reasonable:

[SWF( width="940", height="880" )]

Next, you’ll create three more TextFields that will display a simple label for the encoding settings list, as well as information about each of the separate video streams. You’ll also work with some simple text formatting to size the text differently from the default.

    2. Create three new TextField variables, one named “vid_outDescription”, one named “vid_inDescription”, and one named “metaTextTitle”. Data type each of them as TextField, and call the constructor for each.

private var vid_outDescription:TextField = new TextField();
private var vid_inDescription:TextField = new TextField();
private var metaTextTitle:TextField = new TextField();

    3. Within the displayPublishingVideo() function, directly below the call to add metaText to the display list, add a line that sets the text property for metaTextTitle. Play with spacing between the double quotes and add a “\n” to get the positioning the way you’d like it:

metaTextTitle.text = "\n - Encoding Settings -";

    4. Next, create a local variable named “stylr”, an instance of the TextFormat class. Instantiate this variable by calling its constructor:

var stylr:TextFormat = new TextFormat();

Ensure that the TextFormat class has been imported.

import flash.text.TextFormat;

    5. Set the size property of the new TextFormat object to “18”:

stylr.size = 18;

    6. Apply the style defined in the stylr variable to the metaTextTitle TextField by calling the setTextFormat() method of the TextField class and passing it the TextFormat object “stylr” as an argument:

metaTextTitle.setTextFormat(stylr);

    7. Add more styling and layout property values to metaTextTitle the same way you added them to metaText earlier:

metaTextTitle.textColor = 0xDD7500;
metaTextTitle.width = 300;
metaTextTitle.y = 10;
metaTextTitle.height = 50;
metaTextTitle.background = true;
metaTextTitle.backgroundColor = 0x1F1F1F;
metaTextTitle.border = true;
metaTextTitle.borderColor = 0xDD7500;

    8. Create descriptive text to be displayed for the outbound video stream. Set the text property for the vid_outDescription TextField to display this descriptive text. Again, play with the spacing and new lines to get it positioned correctly:

vid_outDescription.text = "\n\n\n\n Live video from webcam \n\n" +
" Encoded to H.264 in Flash Player 11 on output";

    1. Add both the metaTextTitle TextField and the vid_outDescription TextField to the display list.

addChild(vid_outDescription);
addChild(metaTextTitle);

    1. Add descriptive text for the incoming video stream in the same manner: set values for the properties of vid_inDescription, and add the TextField to the display list.

vid_inDescription.text = "\n\n\n\n H.264-encoded video \n\n" +
" Streaming from Flash Media Server";
vid_inDescription.background = true;
vid_inDescription.backgroundColor = 0x1F1F1F;
vid_inDescription.textColor = 0xD9D9D9;
vid_inDescription.x = vid_in.x;
vid_inDescription.y = cam.height;
vid_inDescription.width = cam.width;
vid_inDescription.height = 200;
vid_inDescription.border = true;
vid_inDescription.borderColor = 0xDD7500;
addChild(vid_inDescription);
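Since the three information TextFields share most of their styling, you could optionally consolidate the repeated property assignments into a small helper. The styleInfoField() function below is not part of the original tutorial; it’s just a sketch of one way to reduce the duplication:

```actionscript
// Hypothetical helper (not in the original source): applies the shared
// "panel" styling used by metaText, vid_outDescription, and vid_inDescription.
private function styleInfoField(field:TextField, x:Number, y:Number,
                                w:Number, h:Number,
                                textColor:uint = 0xD9D9D9):void
{
    field.x = x;
    field.y = y;
    field.width = w;
    field.height = h;
    field.textColor = textColor;
    field.background = true;
    field.backgroundColor = 0x1F1F1F;
    field.border = true;
    field.borderColor = 0xDD7500;
}
```

With a helper like this in place, the incoming-stream description could be set up with a single call, e.g. styleInfoField(vid_inDescription, vid_in.x, cam.height, cam.width, 200); followed by addChild(vid_inDescription);.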

There you have it! The application should now automatically attach a webcam, display the webcam video, encode that video to H.264, stream it to and from Flash Media Server, and display the end result in a second Video component. The source files can be downloaded here. The completed code should appear as follows:

package
{
    import flash.display.DisplayObject;
    import flash.display.Sprite;
    import flash.events.NetStatusEvent;
    import flash.media.Camera;
    import flash.media.H264Level;
    import flash.media.H264Profile;
    import flash.media.H264VideoStreamSettings;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.text.TextField;
    import flash.text.TextFormat;

    [SWF( width="940", height="880" )]
    public class H264_Encoder extends Sprite
    {
        private var nc:NetConnection;
        private var ns_out:NetStream;
        private var ns_in:NetStream;
        private var cam:Camera = Camera.getCamera();
        private var vid_out:Video;
        private var vid_in:Video;
        private var metaText:TextField = new TextField();
        private var vid_outDescription:TextField = new TextField();
        private var vid_inDescription:TextField = new TextField();
        private var metaTextTitle:TextField = new TextField();

        public function H264_Encoder()
        {
            initConnection();
        }

        private function initConnection():void
        {
            nc = new NetConnection();
            nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
            nc.connect("rtmp://office.realeyes.com/live");
            nc.client = this;
        }

        protected function onNetStatus(event:NetStatusEvent):void
        {
            trace(event.info.code);
            if(event.info.code == "NetConnection.Connect.Success")
            {
                publishCamera();
                displayPublishingVideo();
                displayPlaybackVideo();
            }
        }

        protected function publishCamera():void
        {
            ns_out = new NetStream(nc);
            ns_out.attachCamera(cam);
            var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
            h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

            // ALTHOUGH FUTURE VERSIONS OF FLASH PLAYER SHOULD SUPPORT SETTING ENCODING PARAMETERS
            // ON h264Settings BY USING THE setQuality() AND setMode() METHODS, FOR NOW YOU MUST
            // SET THE PARAMETERS ON THE CAMERA FOR: BANDWIDTH, QUALITY, HEIGHT, WIDTH, AND FRAMES PER SECOND.

            // h264Settings.setQuality(30000, 90);
            // h264Settings.setMode(320, 240, 30);

            cam.setQuality(90000, 90);
            cam.setMode(320, 240, 30, true);
            cam.setKeyFrameInterval(15);
            ns_out.videoStreamSettings = h264Settings;
            // trace(ns_out.videoStreamSettings.codec + ", " + h264Settings.profile + ", " + h264Settings.level);
            ns_out.publish("mp4:webCam.f4v", "live");

            var metaData:Object = new Object();
            metaData.codec = ns_out.videoStreamSettings.codec;
            metaData.profile = h264Settings.profile;
            metaData.level = h264Settings.level;
            metaData.fps = cam.fps;
            metaData.bandwidth = cam.bandwidth;
            metaData.height = cam.height;
            metaData.width = cam.width;
            metaData.keyFrameInterval = cam.keyFrameInterval;
            metaData.copyright = "Realeyes Media, 2011";
            ns_out.send("@setDataFrame", "onMetaData", metaData);
        }

        protected function displayPublishingVideo():void
        {
            vid_out = new Video();
            vid_out.x = 300;
            vid_out.y = 10;
            vid_out.width = cam.width;
            vid_out.height = cam.height;
            vid_out.attachCamera(cam);
            addChild(vid_out);
            metaText.x = 0;
            metaText.y = 55;
            metaText.width = 300;
            metaText.height = 385;
            metaText.background = true;
            metaText.backgroundColor = 0x1F1F1F;
            metaText.textColor = 0xD9D9D9;
            metaText.border = true;
            metaText.borderColor = 0xDD7500;
            addChild(metaText);
            metaTextTitle.text = "\n - Encoding Settings -";
            var stylr:TextFormat = new TextFormat();
            stylr.size = 18;
            metaTextTitle.setTextFormat(stylr);
            metaTextTitle.textColor = 0xDD7500;
            metaTextTitle.width = 300;
            metaTextTitle.y = 10;
            metaTextTitle.height = 50;
            metaTextTitle.background = true;
            metaTextTitle.backgroundColor = 0x1F1F1F;
            metaTextTitle.border = true;
            metaTextTitle.borderColor = 0xDD7500;
            vid_outDescription.text = "\n\n\n\n Live video from webcam \n\n" +
                " Encoded to H.264 in Flash Player 11 on output";
            vid_outDescription.background = true;
            vid_outDescription.backgroundColor = 0x1F1F1F;
            vid_outDescription.textColor = 0xD9D9D9;
            vid_outDescription.x = 300;
            vid_outDescription.y = cam.height;
            vid_outDescription.width = cam.width;
            vid_outDescription.height = 200;
            vid_outDescription.border = true;
            vid_outDescription.borderColor = 0xDD7500;
            addChild(vid_outDescription);
            addChild(metaTextTitle);
        }

        protected function displayPlaybackVideo():void
        {
            ns_in = new NetStream(nc);
            ns_in.client = this;
            ns_in.play("mp4:webCam.f4v");
            vid_in = new Video();
            vid_in.x = vid_out.x + vid_out.width;
            vid_in.y = vid_out.y;
            vid_in.width = cam.width;
            vid_in.height = vid_out.height;
            vid_in.attachNetStream(ns_in);
            addChild(vid_in);
            vid_inDescription.text = "\n\n\n\n H.264-encoded video \n\n" +
                " Streaming from Flash Media Server";
            vid_inDescription.background = true;
            vid_inDescription.backgroundColor = 0x1F1F1F;
            vid_inDescription.textColor = 0xD9D9D9;
            vid_inDescription.x = vid_in.x;
            vid_inDescription.y = cam.height;
            vid_inDescription.width = cam.width;
            vid_inDescription.height = 200;
            vid_inDescription.border = true;
            vid_inDescription.borderColor = 0xDD7500;
            addChild(vid_inDescription);
        }

        public function onBWDone():void
        {
        }

        public function onMetaData( o:Object ):void
        {
            for (var settings:String in o)
            {
                trace(settings + " = " + o[settings]);
                metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";
            }
        }
    }
}
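As noted in the setup section, this class must be compiled against the Flash Player 11 playerglobal.swc with the player version raised to 11.0 and the SWF version to 13. If you compile from the command line rather than Flash Builder, the mxmlc invocation would look roughly like the following sketch (the SDK and playerglobal.swc paths are placeholders for your own installation):

```shell
# Assumed Flex SDK 4.5.x command-line compile; adjust paths to your setup.
mxmlc -target-player=11.0 -swf-version=13 \
      -external-library-path+=/path/to/flashplayer11/playerglobal.swc \
      H264_Encoder.as -output=H264_Encoder.swf
```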


Download Source Here

John Crosby (72 Posts)

John is a partner at Realeyes Media and has had his fingers in many projects and technologies from Realeyes’ beginning. John has created training curriculum for the Flash Platform and surrounding technologies as well as researched and implemented new processes and tools for management and development teams. Problem solving and providing solutions is where John excels. Aside from the courseware, development and consulting, John has published multiple articles and papers and continues to do so at his blog and on Adobe’s Developer Connection. As a former professional foodie turned keyboard jockey, John spends his time away from the internet as a father, husband, vinophile & productivity monk. Regardless of the subject, John is always on the lookout for tools, ideas and people that make food, code, business – whatever it is – more fun.




Comments

  1. John Kohl 28 January 2012 at 10:05 am permalink

    Exactly what I have been looking for! Great info, thanks for posting! Very thorough and comprehensive.

  2. D Mahoney 2 February 2012 at 3:40 pm permalink

    Thanks for posting this! The H.264 part works great.

    However, I seem to still be getting Speex audio. At least, the Wowza server I’m streaming to complains about it. I didn’t see anything in the blog post about how to specify AVC (AAC) audio; could you offer any advice on that? I’d love to look at the docs but it appears Adobe hasn’t released any significant documentation on those classes yet :(

  3. Scott Sheridan 2 February 2012 at 5:14 pm permalink

    @John-Thanks! Glad you liked it!

    @D Mahoney-Unfortunately, I don’t think AAC is supported yet. I’ll take another look, and if I’m wrong, I’ll point you in the right direction

  4. Matt 4 February 2012 at 7:48 am permalink

    Great article! Thank you for taking the time to publish it.
    I’m seconding what D Mahoney wrote – I assume one of the goals here is to be able to encode both Audio and Video using a Flash, and be able to play it back on an iOS device.
    If I’m not mistaken, what we’re missing from Adobe is the AAC support.

  5. Matt 4 February 2012 at 1:43 pm permalink

    D Mahoney – it looks like if you specify an H.264 Baseline codec for the video, and use the Speex codec for the audio, Wowza should be able to transcode the live stream into an H.264/AAC stream – that can be viewed live on an iOS device. I’m giving this a try.

  6. Arnold 29 February 2012 at 2:13 am permalink

    I publish the video stream with H.264 settings to fms 3.5 as recorded and at the receiver’s end play it as live. The problem is that at the receiver’s end stream is played from the last 4-5 seconds. I want it to play from the current position of the live stream.

    Any Help please???

    netstrm = new NetStream(nc);

    netstrm.play("mp4:"+instanceName+".f4v", -1);

    h264Settings= new H264VideoStreamSettings();

    h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_1_1);

    nsPublish.videoStreamSettings = h264Settings;

    nsPublish.publish("mp4:"+instanceName+".f4v","record");

  7. D Mahoney 2 April 2012 at 10:35 am permalink

    In “Coding the Application” you set the frame rate to 30 and set key frame interval to 15, then in the text you describe that as a key frame every two seconds. Wouldn’t that actually be two key frames per second?

    • S Sheridan 2 April 2012 at 11:46 am permalink

      @D Mahoney-Yes, you are correct, thank you for catching this! Will make the correction.

  9. Eddie 17 April 2012 at 7:38 pm permalink

    This sample works well with Flash Player 11.1.*. But, if you update to 11.2.202.233, you won’t get instant video feed when you change webcam device.

  11. Francis J 1 May 2012 at 11:48 pm permalink

    Thanks for posting this! Great info. I was wondering if its possible to dynamic stream with multiple bit rates. If so what would be the easiest approach?

  12. Khurram 2 May 2012 at 1:44 am permalink

    Can any one please advice or give me code to add audio also in this application.

    Also if camera does not attach then the application won’t show i want that application should run but if camera does not attach then it will show any messages like “Camera is not attached” or show blank video area with message “Camera is not attached”. I am new in action script so unable to add this in the code if any one can help me by providing me the code then I’ll be really thankful.

    • Trevor 14 August 2013 at 3:03 pm permalink

      To attach audio to your outbound stream, add this before publishing the stream:
      ns_out.attachAudio(Microphone.getMicrophone());

  13. Elena 24 July 2012 at 6:44 am permalink

    Hi. How can I check if H.264 is really used for encoding video stream. I’m recording video stream on server and in recorded file I see “Sorenson” codec.

    • Scott 24 July 2012 at 12:51 pm permalink

      Hi Elena,
      Please check to make sure that you are using the mp4 prefix in your stream name when you publish. If you leave this out, you will be creating an .flv by default.

      Does this help?
      -Scott

      • Elena 25 July 2012 at 5:05 am permalink

        Yes, I use “mp4:webCam.f4v” name for publishing stream. I’m just trying to find out if there is any limitations for using H264 encoding in Air mobile projects for android.

        • Scott 25 July 2012 at 9:25 am permalink

          @Elena,
          Gotcha. Unfortunately, encoding to H.264 in Flash Player is limited to the desktop.

          • David 12 October 2012 at 1:01 pm permalink

            Actually, it works under mobile with 11.4. I tried it using your code with no changes (except using /livepkgr instead).

  14. Ray 26 July 2012 at 3:31 pm permalink

    When we use FMLE with HTTP streaming we must set streamsynchronization to “true” Is there a way to set this using your approach?