Development

HTML5 Boilerplate: Making Web Development Easier

Posted on April 24, 2012 at 11:38 am in Blog, Development


Boilerplate: Web design and development ain’t as easy as it used to be – it’s easier!

NOTE: This look at Boilerplate is part of an upcoming look at the Roots WordPress Theme and, as such, it focuses mostly on v2. Keep in mind that Boilerplate is under constant development (v3 was released in February). In fact, you could think of the Boilerplate changelog as the pulse of HTML5 development. Stay tuned for a look at some fascinating changes in v3.

Ah… life used to be so much simpler.

In 1998 I picked up a book called ‘Teach Yourself HTML 4 in 24 Hours’. A couple of days and 350 pages later, I had designed, coded, and validated my first site.

Of course, that site didn’t do very much or even look very good by today’s standards.

Fast-forward to today, and the sheer number of languages, frameworks, and browser quirks can be overwhelming. The good news is that an incredible community of developers is furiously creating fantastic (and free!) tools to make all of this easier.

But this leads to another problem – which tools do I use and trust?

For example, hop into a front-end developer discussion group or forum, ask what HTML5 framework you should use, and see how many different recommendations you get. Whew!

So, what if you wanted a default template for your development that already had all the tried-and-true, up-to-date tools installed and ready to be adapted to your project’s needs – a tool-kit, if you will?

Well, we have those too.

And probably the most popular right now is called HTML5 Boilerplate.

HTML5 Boilerplate (H5BP) is the brainchild of superstar developers Paul Irish and Divya Manian.
I won’t go into all of H5BP’s features (they are covered much better here), but the bottom line is this: H5BP is like having a team of developers work for several years to hand you an HTML5 template with all the best practices they learned the hard way baked in.

H5BP seems especially suited for designers with deadlines who want to focus on presentation and not have to monkey around with a lot of project set-up. Just dump the H5BP files into your project and get to work. Depending on which version you’re using – 1, 2, or (new as of February) 3 – here’s what you’ll be starting with:

  • Reset CSS with normalized fonts (Eric Meyer’s reset reloaded with HTML5 Baseline and YUI CSS fonts) or Nicolas Gallagher’s Normalize.css.
  • Basic print and mobile styles
  • .htaccess and other server config files (full of really clever snippets), an empty cross-domain policy file for Flash, robots.txt, favicon, apple-touch-icon, and 404 files
  • HTML5-ready. H5BP uses a tool called Modernizr that includes another tool called the HTML5 Shim (among other things like feature detection) to make sure your HTML5 code looks fine across all browsers, including IE6
  • jQuery loaded from the Google CDN, with a local fallback if the CDN copy fails to load
  • DD_belatedPNG for an IE6 PNG fix
  • YUI profiling
  • Optimized Google Analytics script
  • Cool little things like a fix to avoid console.log errors in IE, a fix for document.write issues, etc.
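The CDN-plus-local-fallback item above works because a failed CDN load leaves window.jQuery undefined, which the page can then test. Here is a minimal sketch of that decision logic as a pure function – the local path is illustrative, not H5BP’s actual file layout:

```javascript
// Sketch of the jQuery-fallback idea: if the CDN <script> tag failed,
// window.jQuery is undefined, so we emit a <script> tag pointing at a
// local copy. (The "\/" keeps the closing tag from prematurely ending
// an inline script block when this string appears inside HTML.)
function jqueryFallbackTag(jQueryGlobal, localPath) {
  if (jQueryGlobal) {
    return '';  // CDN copy loaded fine; nothing to inject
  }
  return '<script src="' + localPath + '"><\/script>';
}

// In the page itself, H5BP's snippet boils down to something like:
//   window.jQuery ||
//     document.write(jqueryFallbackTag(window.jQuery, 'js/libs/jquery.min.js'));
```

Passing window.jQuery in explicitly is just to keep the sketch testable outside a browser.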

The latest H5BP is version 3 and over the past couple of years the development team has grown and the product has been continuously improved. Recently the focus has been on web site performance. To this end, Paul and the crew have developed the H5BP ‘Build Script’. This is something that you run when you’ve finished your design/development work that handles optimizing and minification to make your site a lean and mean web machine.

Ultimately we live in a world of paradox. While the world of web design and development is more complex than ever, there has also never been a better time to work in this field thanks to well thought-out and free tools like HTML5 Boilerplate.

Want to learn more?

Check out this video in which Paul Irish walks through the entire Boilerplate template – it’s a great resource.

or

contact us to learn more.

Javascript Selector API – Should I care?

Posted on April 09, 2012 at 12:09 pm in Development, Training


What is it?

Using JavaScript with CSS selectors, particularly classes, has traditionally been a little awkward. You end up needing dozens of lines of code, with fun stuff like regular expressions, to do something simple like toggle a class. Looking for a better way to do this is how many of us got introduced to jQuery and its easy access to the DOM.

Native JavaScript has now shown up to the party by implementing the W3C Selectors API.

What does it look like?

It looks a lot like jQuery.

The following example would select all p elements in the document that have a class of either “error” or “warning”.

var alerts = document.querySelectorAll("p.warning, p.error");

(Example above taken from the API Examples)

I’ve created a demo that shows this example in action. It uses the classList property, so don’t try this in IE. :)
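To see why classList is so welcome, compare it with what toggling a class used to require. The helper below is a rough, hypothetical reimplementation of the string surgery developers wrote by hand; with classList the whole thing collapses to a single call.

```javascript
// Pre-classList class toggling: split the className string, hunt for
// the class name, and add or remove it manually.
function toggleClass(className, name) {
  var classes = className.split(/\s+/).filter(Boolean);
  var i = classes.indexOf(name);
  if (i === -1) {
    classes.push(name);    // not present: add it
  } else {
    classes.splice(i, 1);  // present: remove it
  }
  return classes.join(' ');
}

// With the modern API, the same operation on an element is just:
//   element.classList.toggle('warning');
```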

In addition to querySelectorAll, we can use querySelector, which returns only the first matching element. Also, these methods are not restricted to CSS IDs and classes – you can use them with HTML5 elements as well:

document.querySelector('footer');

Um…What About Browser Support?

querySelector and querySelectorAll are supported by all the major browsers from IE8 and up. Of course, you need to be careful which CSS selectors you query, because not all browser versions recognize all selectors.

Should I care?

Poke around inside jQuery and you’ll find references to querySelector – looks like jQuery is using this native API too (when it can). So, if you’re already using jQuery in your project and you’re more comfortable with jQuery selectors this new API isn’t going to rock your world. If you’re not using jQuery, are not worried about old pre-IE8 browsers and are trying to keep your project super-lightweight then these new selectors will make your coding much easier. So it looks like it is up to you and your situation.
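The “use the native API when you can” pattern that jQuery follows internally can be sketched like this. The fallback function is a hypothetical stand-in for a hand-rolled selector engine such as Sizzle, and the document is passed in explicitly only so the sketch can be exercised outside a browser:

```javascript
// Prefer the native Selectors API when the document provides it;
// otherwise hand off to a (hypothetical) manual-matching fallback.
function select(doc, selector) {
  if (typeof doc.querySelectorAll === 'function') {
    return doc.querySelectorAll(selector);  // fast native path
  }
  return legacySelect(doc, selector);       // slow manual path
}

// Stand-in for a hand-rolled selector engine like Sizzle; a real one
// would walk the DOM and match selectors by hand.
function legacySelect(doc, selector) {
  throw new Error('manual DOM matching would go here');
}
```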

Want to Improve your JavaScript Chops?

Sencha Animator: A Test Drive

Posted on March 27, 2012 at 12:55 pm in Blog, Development

What is Sencha Animator?
Sencha Animator is a new tool that makes it easy to create CSS3 transformation-based animations. So easy that you don’t even need a whiff of CSS3 skills!
Actually, working with Animator will look very familiar to anyone who’s used the Flash IDE (or any tool that uses timelines) to create animations.

Let’s walk through a simple Sencha Animator project.

Our finished project will look like this.

First, you’ll need to download and install Animator – get it here.

1. Set Up Your Project
Once you get it up and running you’ll select File–>New Project and set the size (ours is 600×320). Next, save your project (File –> Save) where you can find it again.

2. Add Images
For our project we’ll be fading in each of the four elements of our logo. Assuming we’ve already separated the logo into PNGs, the first step is to place the images onto the Canvas.
Select the Image Tool and then click anywhere on the Canvas.

Now we have a placeholder graphic on the Canvas. Let’s link this to our image. Click the button next to the default image name in the General Object panel and browse to your image.

While in the Object Panel with your image selected you’ll also want to set the image Name and Position.

Repeat these steps with the additional images. You should now have 3 layers in your Object Tree. You can rearrange these so that the layers are stacked correctly.

3. Well, that’s great – LET’S ANIMATE!
Set the Playhead between 0s and 1s and double-click in the timeline of the bottom layer (ours is called ‘LeftThing’).
This will create a white Keyframe and the Properties for this Keyframe will be displayed.

Under Properties, change the Easing to ‘Linear’. This will connect the Keyframe to another Keyframe at 0s.
Select the Keyframe at 0s and change the Opacity to 0% so that this element will appear to fade in to the scene.
(You can scrub the playhead to watch it fading in — ooohhh, aaahhh!)

Repeat this process with the other two images so that each element fades in on top of the previous one. Your timeline should look similar to this.

4. Add Some Text
Let’s create some text and fade that in too.
Select the Text Tool and click on the Canvas.
Just like with the Image Tool, we need to adjust the properties of our new Text Element.
See the screenshot below for the settings that we used.

To simulate our logo, we duplicated the text layer (Ctrl-D) and changed the Content to a left parenthesis and then repeated to create a right parenthesis. We then positioned and changed the Fill Color of these new text layers to match our RealEyes logo.

Next we’ll animate these layers to fade in like the previous layers.

5. Add Interactivity
Excellent. Now we have a logo whose various elements fade in, and then the animation stops.
So, how hard would it be to add some interactivity and make the animation repeat if the user clicked on the logo?
Easy!
Here’s how.
Select the top-most image layer (‘Yellow Thing’ in our example) and open the Actions panel. You’ll notice several interactions to choose from.
Select ‘click’ and then ‘Go to next scene’ from the drop-down menu.

6. Export the Project
Almost done! Lastly, we need to select File–>Export Project and then FTP this to our favorite web server – or simply open the HTML file that Animator creates as it exports the project.
Voilà – you have some snappy animation that looks a whole lot like Flash – but isn’t!

Conclusion:
With browser support for CSS3 animation growing every day, designers and developers have been turning to frameworks, libraries and plugins like transform.js, paper.js, move.js and JSAnim to simplify their workflow. However, making convincing animations with pure code can be a frustrating and ultimately disappointing process. Because successful animation depends on nuance and timing, creating them with some kind of IDE or GUI has always been the natural solution (Flash owes a lot of its success to its easy-to-use and powerful timeline controls).
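For a sense of what Animator generates under the hood, the fade-in we built in the walkthrough corresponds roughly to a CSS3 animation like the one below. The class name is illustrative, and 2012-era browsers would also need -webkit- and -moz- prefixed variants of both rules:

```css
/* Fade an element in over one second with linear easing,
   like the keyframe pair set up in step 3. */
@keyframes fade-in {
  from { opacity: 0; }
  to   { opacity: 1; }
}

.left-thing {
  animation: fade-in 1s linear;
}
```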

Without getting into advanced easing, multiple scenes, z-axis rotations, etc., we’re really just scratching the surface of what this tool is capable of. While Sencha Animator is still a work in progress and will never be able to offer the power of the Flash IDE, we’ve seen that Animator is intuitive, easy to learn and offers a time-saving GUI for modifying CSS properties over time.
Another plus – the version that we used (1.2) seemed very stable.

Interested in learning more about the power of Sencha or their tools? Contact us!

The past week has brought a series of announcements from Adobe that has elicited myriad speculation and concern from the Flash Platform and Adobe community.  As a leading Adobe Solutions provider for Flash Platform solutions, RealEyes wants to address these announcements and how we see them impacting our focus in the technological ecosystem.

Before we begin this analysis, from our vantage point, the largest issue with these announcements is the way in which they were communicated—to the public, to partners, everyone.  There was much good news in what Adobe announced; unfortunately, their public relations team chose to focus largely on what was being deprecated, which colored the resulting dialog.

We’d like to take a moment to refocus this conversation for our customers and community.  Contrary to popular debate, Flash is NOT dead.  And here’s why:

Adobe Focus on Mobile Applications

Adobe announced that it would be more aggressively contributing to HTML5, with future Flash Platform development to focus on personal computer and mobile applications.  Great!  Our clients who are developing mobile experiences are universally doing so with the intention of making installable applications.  More Adobe focus in this area will only enhance the experiences that we are able to work with them to deliver.

The Flash Platform is still the best way to develop mobile application experiences intended to be deployed across the major application marketplaces: Apple, Android, and Blackberry.

However, what got the most attention in this announcement was that Adobe is discontinuing development of Flash Player for the mobile browser.  While this got many people up in arms, declaring the general demise of the Flash Player, we at RealEyes can respect this decision and see the validity of it.  For Adobe, the return on investment for this runtime simply wasn’t there, and with the fragmented nature of Android (and a few other issues that contribute to delivering an application to all browser, OS, and mobile hardware configurations) the continued development of the mobile Flash Player would be exponentially complex.

For application developers, the mobile Flash Player was never as good a runtime as the desktop one.

So, how is the discontinuation of mobile Flash Player affecting our clients? Really, it isn’t.

Because mobile device users are more likely to look exclusively toward installable applications for rich media content—and RealEyes’ Flash Platform applications largely deliver rich media content—our customers have been developing applications built using the Flash Platform and relying less on the mobile web.  Mike Chambers does a nice job of discussing the differences in how users consume rich content on mobile devices compared to the desktop, and we agree wholeheartedly that this is the way to go.

Because Flash Player doesn’t have the same ubiquity on mobile devices as it does for desktop browsers, RealEyes was already advising our clients to create fallback experiences for their Flash content for mobile browsers.  For most of them we could achieve the same functionality in HTML as in Flash (video being the exception, as you’ll see below).  Why not forgo Flash entirely and have a single HTML codebase to support?  Seems like a decision that makes good business sense.

Not that we aren’t sad to see mobile Flash Player go: we are.

If only because we don’t want the web to have missing plugin alerts. Having the Flash Player plugin available to Android and BlackBerry mobile browsers was a convenience that offered a great marketing pitch, but, truthfully, delivered very little.  This is due, in large part, to the fact that the majority of the web was designed for the desktop and was not meant for (nor is it very functional for) mobile phones – period, full stop.

In truth, we’ve seen very few Flash applications developed specifically for the mobile browser.  We at RealEyes have developed just one of these for commercial release. And this application was built before AIR for Android and was always intended to be a stop-gap until this runtime was available.

Now, tablets make a better use case for Flash’s place in the mobile ecosystem; however, the tablets that support Flash account for under 30% of the market.  Given this and Apple’s seeming prohibition on Flash, the Flash Player was just never going to achieve on tablets – or on mobile phones, for that matter – the same ubiquity it has on the desktop.

Adobe Supports HTML5 Development

As Adobe is a multimedia creation company it will want to be at the forefront of whatever technology is defining exceptional user experiences for multimedia delivery.  And, for a few years now, Adobe’s been looking toward HTML5.  Unfortunately, the announcement from Adobe that contains the information about the discontinuation of the mobile Flash Player makes it sound like Adobe’s just jumping on HTML as a development platform.  That’s just not true.

Even more unfortunate in the present debate is the perception that Steve Jobs’ thoughts on Flash have somehow won and that this was just fallout from an Apple v. Adobe war.  Not so fast.  Apple and, to some degree, Microsoft have done much to market HTML5 development, to the point that its perception overpromises what it can deliver.  Although Adobe has been working to educate its community about the benefits of the Flash Player over HTML5 and was backed by legions of developers, animators, designers, and content creators, it couldn’t overcome the tactics of such powerful and cunning marketing machines.  While standing its ground on the mobile Flash Player, Adobe was, in many ways, able to achieve what critics said was not possible with Flash Player on mobile devices.

So, if Steve didn’t win, who did?

Well, Adobe is still poised to win and … more importantly so is its community of developers and customers.  Look at tools like Adobe Edge and the new mobile enhancements to Dreamweaver.  Also, with Adobe’s acquisition of PhoneGap, Adobe developers are poised to deliver the best HTML5 experiences out there.  Yeah, it’s not Flash … but that’s OK. While it seems like Adobe’s making a sharp turn toward HTML5, from where we sit, they are more fully committing to a direction that Macromedia, and then Adobe, started in some time ago.  Remember the HTML and Flash being friends video from Adobe MAX last year?

And, with other recent innovations for mobile AIR such as the availability of native extensions, the future of mobile development is exhilarating for any Flash Platform developer.  We’re hopeful that Adobe will use this opportunity to sharpen their focus on native mobile functionality and continue the path of making the Flash Platform the best choice for developing multi-platform mobile applications with a single code base.

However, the perception that Adobe’s making a rash decision is very damaging and something that we’re working with our clients to help them understand.  The reality of the situation is that not much has changed; however, poor communication, horrible messaging, and virtually no community outreach from Adobe regarding this messaging has made the perception the accepted reality in the short term.

And, if that weren’t enough news for one week …

Adobe Really Open Sources Flex

In clarifying its future plans for the Flex SDK, Adobe announced that the Flex SDK will be contributed to an open source foundation.  The good news in this move is that the Flex community is mature enough to take on the governance of this robust framework moving forward.  This wasn’t the case in February of 2008 when Adobe released Flex 3 as open source (Adobe had been planning to open source it since April of 2007).

For several years now, Adobe has been moving towards more open standards in its development, and this decision to contribute the Flex SDK to an open source foundation isn’t something Adobe has done in isolation, nor something limited to the Flash Platform.  Some other projects that are on this path are:

  • PhoneGap
  • BlazeDS
  • Flex SDK

And, in reading Adobe’s clarification to this open source announcement, we see even more reason to be excited.  They are also open sourcing tools that support Flex including an experimental one (Falcon JS) that cross-compiles MXML and ActionScript to HTML and JavaScript.  Now, that’s exciting!  And, we’re sure that more is on the horizon.  Maybe HTML and Flash can be friends after all.

And, let’s be honest, the original model that Adobe used to open source Flex didn’t go as planned.  While Adobe always said they welcomed contributions from the community to grow and improve the Flex SDK, the process for getting a change accepted was unclear and many community contributions were rejected for any number of reasons (valid or invalid).  Adobe simply did not have the process or the resources to handle the influx of developers who wanted to contribute.  It was a frustrating situation for the Flex development community (and arguably Adobe as well).

So, the vibrant Flex community answered back earlier this year by creating the Spoon Project to better organize and test Flex SDK modifications submitted by the Flex community.  It proved to be an excellent model, drove innovation of the Framework, and was an initial step toward the full open source move that Adobe just announced.

Who’s governing the future of Flex? We are!

In case the nuance in what’s different now versus Adobe’s 2007 decision to open source Flex isn’t apparent, the major difference is that the Flex community will extend the Flex code base without needing Adobe’s permission to do so.  A new governance, following Apache’s well-established community rules, will be formed to determine the future direction of the codebase.

RealEyes has been in close contact with Adobe’s Flash Platform team since our inception, and we’re excited for this change in governance. RealEyes has always been super excited about the Spoon Project, and our Development Manager (Jun Heider) is very active in this community as the Infrastructure Chairman.  We’ve seen that this is truly a community-driven initiative, supported by Adobe, to increase the volume, speed (and maybe even the quality) with which the Flex framework can grow.

We are excited to contribute further to the future of Flex and confident that, like other successful open source communities, the language will continue to evolve.

Also … Flex isn’t all of the Flash Platform

Sadly, many of the announcements that we’ve been talking about, including the open sourcing of Flex, led many to say that Flash is dead. That simply isn’t true.  Let’s talk about what the Flex framework actually is: a particular framework used to structure Flash Platform development.  Do you have to use it to develop Flash Platform applications? No. And, to be honest, RealEyes doesn’t use Flex in every Flash Platform project, because sometimes the framework can make applications too “heavy”.  If performance is of paramount concern for a Flash Platform application, Flex often cannot replace pure ActionScript.

Flash and Flex are not going away.  Adobe is still committed to developing tooling to support development for the Flash Platform. Further, Adobe hasn’t open sourced the Flash Player, the most installed piece of software in the history of the internet.  Adobe plans on steadily contributing to the Flex SDK in its open sourced project, and we are working with the Flex community to become contributors as well.

Adobe and Enterprise Applications

In a week of poorly handled communication, probably RealEyes’ largest concern was Adobe’s statement that “In the long-term, we believe HTML5 will be the best technology for enterprise application development.” Ouch.  Big enterprises have invested millions upon millions of dollars in the development and maintenance of Flash Platform applications.  At the very least, that statement can erode the confidence that large companies (or companies of any size, really) have when building systems based upon Adobe technology – something that we feel is probably a bit of an overreaction.

Also, without context this statement is very misleading.  Currently, HTML5 does not have full functional parity with the Flash Platform.  A few days after making this statement, Adobe clarified it by indicating the timeframe it expects HTML5 to need before it can truly compete with Flash Platform development: three to five years. That timeframe could be heavily extended when considering corporate browser adoption timelines.

There’s no enterprise that can wait three to five years for functionality.

As Adobe stated, “Flex has now, and for many years will continue to have, advantages over HTML5 for enterprise application development – in particular:

  • Flex offers complete feature-level consistency across multiple platforms
  • The Flex component set and programming model makes it extremely productive when building complex application user interfaces
  • ActionScript is a mature language, suitable for large application development
  • Supporting tools (both Adobe’s and third-party) offer a productive environment with respect to code editing, debugging, profiling, and automation.”

We see all of that as being the case, and more:

  • Enterprise clients tend to have slower adoption rates for software, meaning that not all enterprises support the advanced HTML5 features that exist.
  • In particular, the video capabilities in HTML5 are not as robust as what is available in the Flash Platform including multicasting with integrated hardware acceleration and advanced security models.
  • The testing issues for supporting browser fragmentation can be daunting to enterprises, compared with supporting a Flash Platform application that can be deployed across desktop browsers with consistent display and functionality.

RealEyes will continue to recommend Flex and Flash Platform development to our clients where it makes real business sense to do so.  That said, there are reasons to use HTML over (or alongside) the Flash Platform, and we have plenty of clients we support who do that as well.

The Impact to RealEyes

So, what does all of this mean to RealEyes?  In the short term, it has meant a challenge to bring context to Adobe’s announcements and dispel rumors and misinformation to our clients. In the long run, it probably doesn’t mean a lot.

We have already been on a path of technology diversification with continued focus and adoption of HTML5, its supporting technologies, and native mobile development. Many of us are in the technology space because we enjoy the challenge of evolving our skills as the industry grows.  However, for the next few years, we anticipate that the Flash Platform will continue to be our predominant focus.

Our development specialty has been in delivering industry-leading streaming media solutions and multiscreen development. Flash and AIR are still the best solutions for this and will be for a while.  The timeline for that largely depends on Adobe and, as a valued Adobe Solutions Partner, we will continue to support them in as educated and balanced a way as possible.

We are actively involved in the future of the Flex framework through the Spoon Project and excited about the potential for future growth for that project.  We are now even more apt to contribute to the betterment of this already robust framework for the benefit of the Flex community.

Finally, RealEyes has always helped our clients to choose the best technology to power a given project, and we will continue to do this.  And, as HTML5 becomes a more comprehensive solution, we will likely recommend it more frequently. It is truly about what is right for the present and the future, on a case-by-case basis. Our clients and projects will continue to be industry leaders, no matter the technology behind them.

——————–

Now, we can’t see all of the news in a positive light.  And not all of it is positive – certainly not for the 750 Adobe employees who were laid off and their families. However, this degree of restructuring in the fourth quarter isn’t unprecedented for Adobe.  We’ve seen this over the past couple of years.  This year, as in years past, we lost meaningful relationships with Adobe employees that we’ve been happy to collaborate with on community and development projects.  We at RealEyes have close contact with Adobe and tend to focus on how individuals shape the platforms, products, and communities that we work with instead of quarterly earnings and fiscal projections.  While adjusting to this restructuring, we wish all of the affected employees only the best in their next moves and hope that they will continue to make positive contributions to the technical community they have helped to shape.

Additional Links:

The JavaScript-based rich text editor CKEditor is a great tool to use in projects that require you to give your clients the ability to edit their own HTML pages. Implementation is dead simple, and the list of configuration options is long, giving you a great deal of flexibility in terms of the functionality you can implement in the HTML editor. For example, in addition to editing text, you can configure CKEditor to give your clients the ability to add images to their pages, either by linking to existing ones on the web, or uploading them from their own computer.

If this sounds like a tool you’d like to use in your own workflow, be sure to check out what RealEyes developer Nils Thingvall has to say about it. Nils’ article gives you helpful tips on configuring your editor to work with image uploading, a topic that is unfortunately under-documented on the CKEditor site. Thanks, Nils!
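As a hedged sketch of the image-upload setup (filebrowserUploadUrl is a documented CKEditor config option; the URL and element ID are placeholders, not values from Nils’ article), the configuration boils down to pointing the image dialog at a server-side upload handler:

```javascript
// Illustrative CKEditor configuration: filebrowserUploadUrl adds an
// "Upload" tab to the image dialog and POSTs the chosen file to the
// given server-side script (the URL here is a placeholder).
var editorConfig = {
  filebrowserUploadUrl: '/uploads/upload.php',
  toolbar: 'Full'
};

// In the page, a <textarea id="editor1"> would then be replaced with:
//   CKEDITOR.replace('editor1', editorConfig);
```

The server-side script is your responsibility: it receives the file and must respond in the format CKEditor expects.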

Load testing service APIs got you down? How about load testing PHP-based AMF service APIs? Thought so. Fear not, because John Crosby recently posted his findings about two AMF load testing tools he says are great! He’s talking about soapUI and loadUI, the free-of-charge, open-source tools created by the fine people at SmartBear.

John shows you how to use these tools, walking you through step-by-step as you set up a project, configure an AMF request, and set up load testing using soapUI. He also walks you through load testing with loadUI.

It’s clear that John is pretty excited about the handiness of these two load testing applications, and he’s already looking forward to integrating them with our Continuous Integration (CI) system. Stay tuned for more on that soon! For now, happy testing!

Read the original article

The latest version of Flash Player (v.11.0) includes some exciting new features, including performance upgrades such as native 64-bit support and asynchronous bitmap decoding. Perhaps most newsworthy, though, is Flash Player’s new capability to encode live video streams to the H.264/AVC standard. This new feature will allow developers to create real-time, high-quality, live video streaming applications for chat, conferencing, and live event broadcasting.

The following article demonstrates how to take advantage of Flash Player 11.0’s new H.264 encoding capabilities within a video streaming application built using Flash Builder 4.5. The application does the following:

  • Captures live video from a webcam
  • Establishes a connection to Flash Media Server 4.5 using the NetConnection class
  • Publishes video stream from application to FMS using an instance of the NetStream class
  • Displays outgoing video stream from camera (prior to being encoded) in a Video component within the application
  • Sends encoding parameters to Flash Player 11.0 to encode the raw webcam video to H.264
  • Displays encoded video’s metadata, demonstrating that encoding worked
  • Streams live, encoded video from FMS to the application using another instance of the NetStream class
  • Displays newly encoded, streamed live video in another Video component within the application


Example Application showing live stream from webcam (left) and stream encoded to H.264 in Flash Player 11.0 (right).

To follow along with the example, please be sure to have the following:


Getting Started - Configuring Compiler Settings

To develop applications that target the new features available in Flash Player 11.0, it is necessary to configure the compiler to target player-version “11.0” and SWF-version “13”, as well as the playerglobal.swc for Flash Player 11.0. To make these changes:

    1. Download the new playerglobal.swc for Flash Player 11.0, and rename this file from “playerglobal11_0.swc” to “playerglobal.swc”.
    2. Create a folder named “11.0” in the directory “frameworks\libs\player” that is inside your Flex SDK installation folder. (Fig. 1.0)
    3. Put the playerglobal.swc inside the new folder (“11.0”).
    4. Locate the file “flex-config.xml”, which is located in the “frameworks” folder within your Flex SDK installation directory.
    5. Within “flex-config.xml”, locate the “target-player” tag, which specifies the minimum player version that will run the compiled SWF.
    6. Set the “target-player” value to “11.0”. (Fig. 1.1)
    7. Also within “flex-config.xml”, locate the “swf-version” tag, which specifies the version of the compiled SWF.
    8. Set the “swf-version” value to “13”. (Fig. 1.1)
    9. Save “flex-config.xml”.
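After steps 4 through 8, the relevant portion of flex-config.xml should look roughly like this (surrounding elements omitted):

```xml
<!-- flex-config.xml (excerpt): minimum player version and SWF version -->
<target-player>11.0</target-player>
<swf-version>13</swf-version>
```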

CreateFolderForPlayerGlblSwc

Figure 1.0. Create a folder for the playerglobal.swc named "11.0".

Edit Values in flex-config

Figure 1.1. Edit values of “target-player” and “swf-version” tags within the flex-config.xml file.


Setting Up the Project in Flash Builder 4.5

The example application is a simple ActionScript 3.0 project (not a Flex or AIR project). To create a similar project in Flash Builder:

    1. Choose File -> New -> ActionScript project.
    2. Name the project “H264_Encoder”, and click “Finish”.
    3. In Flash Builder, with the H264_Encoder project selected, choose Project -> Properties.
    4. Verify that the compiler is targeting Flash Player 11.0. (Fig. 1.2) If it isn't, select the "Use a specific version" radio button, and type "11.0.0" for the value.

SettingFlashPlayerVersionInFlashBuilder

Figure 1.2. Make sure that the compiler is targeting Flash Player 11.0 by inspecting the project’s properties.

At this point, the application should look similar to the following:

package
{
import flash.display.Sprite;

public class H264_Encoder extends Sprite
{
public function H264_Encoder()
{
}
}
}

Next up, you’ll be modifying the application so that it can communicate with your webcam. In addition, you’ll add the code necessary for establishing a NetConnection between the application and Flash Media Server, as well as two NetStream instances: one responsible for getting the video from the application into Flash Media Server, and one for bringing it back from the server into the application.


Coding the Application – Connecting a Camera, Establishing a NetConnection and NetStreams
    1. Directly under the opening class definition statement, but before the constructor method, create a private variable named “nc”, and data typed as a NetConnection. Use code hinting to have Flash Builder generate the necessary import statements for you by starting to type “NetC..”, then hit CTRL-SPACE to receive code hinting. Select “NetConnection” from the list, and notice that Flash Builder has imported the NetConnection class from within the flash.net package. If for some reason the import fails, go ahead and import it manually. Your code should appear as follows:

package
{
import flash.display.Sprite;
import flash.net.NetConnection;

public class H264_Encoder extends Sprite
{
private var nc:NetConnection;

public function H264_Encoder()
{
}
}
}

    2. Create two private variables for the NetStreams, data typed as NetStream: one for the stream going from the application to the server (ns_out), and another for the stream coming back into the application from the server (ns_in). Remember to use code hinting to have Flash Builder import the necessary classes.

package
{
import flash.display.Sprite;
import flash.net.NetConnection;
import flash.net.NetStream;

public class H264_Encoder extends Sprite
{
private var nc:NetConnection;
private var ns_out:NetStream;
private var ns_in:NetStream;

public function H264_Encoder()
{
}
}
}

    3. Next, create a private variable named "cam" of type Camera, and set its value to "Camera.getCamera()". The Camera class is a little different from other classes, in that you don't call a constructor to instantiate a Camera object. Instead, you call the static getCamera() method of the Camera class. This method returns a Camera instance unless there is no camera attached to the computer, or the camera is in use by another application.

private var cam:Camera = Camera.getCamera();

Make sure the Camera class was imported:

import flash.media.Camera;

    4. It is now time to add code that allows the application to connect to Flash Media Server using an instance of the NetConnection class. Below the instance variables and the closing brace of the constructor function, create a private function named initConnection() that takes no arguments and returns void:

private function initConnection():void
{
}

    5. As the first line of the function body, instantiate the nc:NetConnection variable, which you declared in step 1:

nc = new NetConnection();

    6. It's always good practice to verify that the NetConnection was successful. Add an event listener for the NetStatusEvent.NET_STATUS event, with a handler function named onNetStatus(). You will create the onNetStatus() handler in the next section:

nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);

Be sure to import the NetStatusEvent class, either via code hinting or manually; it lives in the flash.events package:

import flash.events.NetStatusEvent;

    7. Next, and still within the initConnection() function body, tell the NetConnection where to connect by calling the connect() method of the NetConnection class. As an argument to this method, pass the URL of the "live" application within the Flash Media Server installation you want to connect to. The URL included in the example uses the RTMP protocol, and connects to the "live" application on one of our servers. You can also stream to a local installation of Flash Media Server, if you have one, by setting the URL to "rtmp://localhost/live".

nc.connect("rtmp://office.realeyes.com/live");
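An RTMP URL follows the familiar scheme://host/application pattern, where the final path segment ("live" here) names the Flash Media Server application the client connects to. A quick sketch in Python (purely illustrative, not part of the ActionScript application) shows how the example URL breaks apart:

```python
from urllib.parse import urlparse

# The example server URL from the tutorial; swap in rtmp://localhost/live
# if you are running a local copy of Flash Media Server.
url = urlparse("rtmp://office.realeyes.com/live")

print(url.scheme)            # protocol used for streaming -> rtmp
print(url.netloc)            # the Flash Media Server host -> office.realeyes.com
print(url.path.lstrip("/"))  # the FMS application name    -> live
```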

    8. Finally, tell the NetConnection which object Flash Media Server should invoke callback methods on by setting the NetConnection's "client" property to "this". Callback methods are special handler functions invoked by Flash Media Server on a connected client application. Later in this example you will work with the onMetaData() and onBWDone() callback methods. You will include these callback methods within the main application class, which is the same object that establishes the NetConnection, so the value of the NetConnection instance's (nc) client property should be set to "this".

nc.client = this;

The completed initConnection() function should appear as follows:

private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://office.realeyes.com/live");
nc.client = this;
}


Coding the Application – Verifying a Successful NetConnection
    1. As mentioned, it’s always a good practice to verify the success of a NetConnection attempt. To do this, create a protected function named onNetStatus() that takes an event, named “event”, of type “NetStatusEvent” as its only argument, and returns void:

protected function onNetStatus(event:NetStatusEvent):void
{
}

    2. Within the onNetStatus() function body, create a trace statement that outputs the value of event.info.code to the console during debugging. The code property of the info object in the NetStatusEvent contains String data that indicates the status of the attempted NetConnection, such as "NetConnection.Connect.Success" or "NetConnection.Connect.Failed". Tracing the value of this property lets you confirm the status of the NetConnection simply by running the application in debug mode.

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
}

    3. Next, within the function body, and beneath the existing trace statement, create a conditional statement that compares the value of event.info.code to "NetConnection.Connect.Success". If they match, call three functions that you will create in later sections: one that publishes an outgoing video stream, one that displays the incoming video from the webcam, and one that displays the video stream being sent back to the application from the server. The completed onNetStatus() function should appear as follows:

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
if(event.info.code == "NetConnection.Connect.Success")
{
publishCamera();
displayPublishingVideo();
displayPlaybackVideo();
}
}

    4. This example attempts to connect to the server and start playing/publishing video automatically when launched. To achieve this, call initConnection() from within the main class's constructor method:

public function H264_Encoder()
{
initConnection();
}

At this point, you have included the code necessary to establish a NetConnection, and to verify the success or failure of that connection with a trace statement. In addition, you've included calls to functions that, once written, will handle the publishing and playback of the video from the webcam, as well as the video coming back from the server.

If you save the application now, you'll notice some errors. The calls to publishCamera(), displayPublishingVideo(), and displayPlaybackVideo() generate errors because those functions haven't been written yet. Comment out the calls to these functions and run the application in debug mode. If everything is set up correctly, you should see the trace output "NetConnection.Connect.Success".

Comment out calls to unwritten functions

However, you should also see this error in the console: "ReferenceError: Error #1069: Property onBWDone not found on flash.net.NetConnection and there is no default value." This occurs because Flash Media Server is attempting to invoke a callback function on the application that hasn't been written yet. In the next section you will add the callback functions.

connect success and bwdone error in console


Coding the Application – Including the Callback Functions, and Creating a TextField to Display Metadata

The sample application contains two callback functions: onBWDone() and onMetaData(). The onBWDone() callback checks for available bandwidth, which can be useful in applications that need to dynamically switch video assets according to the bandwidth that's currently available. Although it's necessary to include this function in the client code (omitting it will generate an error when the server tries to make the function call), it's not necessary to actually do anything with it. This application isn't concerned with monitoring bandwidth, so it can be left as an empty function.

The onMetaData() callback function is useful for accessing a video stream's metadata, and you will be adding code to this callback to do just that. The onMetaData() callback receives a generic Object whose properties represent the video stream's metadata. In the next section, you will create those properties to represent various metadata, and access their values in order to display the information within the UI. For now, you will simply add the two callback functions, and add some code to onMetaData() to access that metadata. In addition, you will create a TextField that you will eventually use to display the metadata in the UI.

    1. Create a new private instance variable named “metaText”, and type it as an instance of the TextField class. Set its initial value to “new TextField()”

private var metaText:TextField = new TextField();

*Note – At this point you are simply creating the metaText object in memory. You won’t actually add it to the display list until further on in the example.

Be sure to import the necessary TextField class:
import flash.text.TextField;

    2. Include the required onMetaData() callback function. Create a new public function named "onMetaData()" that accepts an Object named "o" as its only parameter, and returns void.

public function onMetaData( o:Object ):void
{
}

    3. To access the video stream's metadata, you will loop through the properties of the object received by the onMetaData() callback function. Again, you will create those properties in the next section, but for now, within the onMetaData() function, create a "for...in" loop. In the loop's initializer, declare a local variable named "settings", data typed as String, to iterate over the property names of the Object "o".

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
}
}

    4. Next, within the loop body, include a trace statement that outputs the name of each "settings" property received by onMetaData(), concatenated with "=" and the property's value.

trace(settings + " = " + o[settings]);

    5. Finally, inside the for...in loop body, append each property's name, concatenated with "=" and the property's value, to metaText's text property. Create a new line for each iteration, and adjust the spacing between the double quotes (adding an extra "\n" if you want to double-space the text) to lay out the text properly in the UI.

metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";

*Note* The layout and styling in this example are not intended to be examples of UI programming best practices. UI programming is outside the scope of this article.

The completed onMetaData() callback function should be similar to this:

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
trace(settings + " = " + o[settings]);
metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";
}
}

    6. Next, add the onBWDone() callback function. Create a new public function named "onBWDone()" that takes no arguments and returns void. Remember that the onBWDone() callback is what Flash Media Server uses to report available bandwidth, and this application doesn't require that information. It still must be included, however, since the server will be calling it on the application object. To avoid a runtime error, simply include an empty onBWDone() callback:

public function onBWDone():void
{
}

Now that the application has the necessary callback functions, and loops through the metadata received by onMetaData() to populate a TextField with that data, it's time to add code that enables the application to read webcam data, encode it to the H.264 standard, and then stream the encoded video.


Coding the Application – Setting Up H.264 Encoding, and Publishing to the NetStream

In this next section, you will attach your webcam to an instance of the Camera class. You will then encode the webcam input to H.264 using properties of the Camera class and the new H264VideoStreamSettings class. Certain encoding parameters can't yet be set with the new H264VideoStreamSettings class (although support for this will hopefully come soon), so you'll be setting those values with methods of the Camera class instead.

Next, you will attach the encoded video to a live video stream, and stream it to Flash Media Server’s “live” directory. (You will bring a new stream back into the application from Flash Media Server in the next section)

Finally, in order to read the metadata of the newly encoded video stream, you will call the send() method of the NetStream class (available only when using Flash Media Server). As arguments to the send() method, you will pass "@setDataFrame", a special handler method within Flash Media Server; the name of the onMetaData() callback method you added earlier to receive the metadata client-side; and finally, a local variable ("metaData"), data typed as an Object, used to represent the desired metadata items. First:

    1. Create a protected function named “publishCamera()” that takes no arguments and returns void:

protected function publishCamera():void
{
}

    2. In the first line of this new function, instantiate the ns_out NetStream object by calling its constructor. Pass the constructor the NetConnection instance "nc":

ns_out = new NetStream(nc);

    3. On the next line, attach the Camera instance "cam" to the outgoing NetStream by calling the attachCamera() method of the NetStream class. Pass this method the cam instance:

ns_out.attachCamera(cam);

    4. Next, create a new local variable named "h264Settings", data typed as H264VideoStreamSettings, and set its initial value equal to "new H264VideoStreamSettings()":

var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();

Be sure to import the H264VideoStreamSettings class:

import flash.media.H264VideoStreamSettings;

    5. Call the setProfileLevel() method of the H264VideoStreamSettings class on the h264Settings object to encode the video using the "BASELINE" profile and a level of "3.1":

h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

Be sure to import both the H264Profile class, and the H264Level class:

import flash.media.H264Level;
import flash.media.H264Profile;

    6. Next, use the setQuality() method of the Camera class to limit the outgoing video stream to 90000 bytes per second (roughly 720 Kbps), with a quality setting of "90":

cam.setQuality(90000, 90);

    7. Use the setMode() method of the Camera class to set the video's width, height, and frames per second. The fourth parameter determines whether the requested capture size should be favored if the camera has no native mode that exactly matches these values:

cam.setMode(320, 240, 30, true);

    8. Next, using the setKeyFrameInterval() method of the Camera class, set the video's keyframe interval to 15 (at 30 frames per second, that's two keyframes per second):

cam.setKeyFrameInterval(15);
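As a quick sanity check on these numbers (an illustrative sketch, not part of the ActionScript application): the setQuality() bandwidth value is expressed in bytes per second, and the keyframe cadence follows from the frame rate divided by the keyframe interval.

```python
# Quick arithmetic behind the encoding settings used above.

bandwidth_bytes_per_sec = 90000   # first argument to cam.setQuality()
fps = 30                          # third argument to cam.setMode()
key_frame_interval = 15           # argument to cam.setKeyFrameInterval()

# Convert the bandwidth cap to kilobits per second.
kbps = bandwidth_bytes_per_sec * 8 / 1000
print(kbps)  # 720.0

# A keyframe every 15 frames at 30 fps means 2 keyframes per second.
keyframes_per_sec = fps / key_frame_interval
print(keyframes_per_sec)  # 2.0
```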

    9. To set the outgoing video's compression settings, assign the value of the h264Settings variable to the videoStreamSettings property of the outbound stream, "ns_out":

ns_out.videoStreamSettings = h264Settings;

    10. Call the publish() method of the NetStream class on the outgoing NetStream, and pass it parameters to provide a name for the stream ("mp4:webCam.f4v"), as well as the publishing type ("live"):

ns_out.publish("mp4:webCam.f4v", "live");

    11. Now it's time to create the properties that will hold the metadata values of the encoded video you will access at runtime. Create a new local variable named "metaData", data typed as an Object, and set its initial value equal to "new Object()":

var metaData:Object = new Object();

    12. The metaData object is generic, meaning you can assign any name/value pairs you like. For example, there's no encoding setting that comes from the Camera, VideoStreamSettings, or H264VideoStreamSettings classes that would allow you to display a copyright, but you can add one easily enough like this:

metaData.copyright = "Realeyes Media, 2011";

Of course, you can also create properties with values that do come from settings within the aforementioned classes, such as:

metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;

    13. Create the following metaData properties and add them to the publishCamera() function:

metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";

    14. Call the send() method of the NetStream class on the ns_out object and pass it the name of the handler method "@setDataFrame", the callback method name "onMetaData", and the local variable metaData:

ns_out.send("@setDataFrame", "onMetaData", metaData);

The completed publishCamera() function should resemble the following, with the exception of the commented-out code:

    protected function publishCamera():void
    {
    ns_out = new NetStream(nc);
    ns_out.attachCamera(cam);
    var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
    h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

    // ALTHOUGH FUTURE VERSIONS OF FLASH PLAYER SHOULD SUPPORT SETTING
    // ENCODING PARAMETERS ON h264Settings BY
    // USING THE setQuality() AND setMode() METHODS,
    // FOR NOW YOU MUST SET THE PARAMETERS ON THE CAMERA FOR:
    // BANDWIDTH, QUALITY, HEIGHT, WIDTH, AND FRAMES PER SECOND.
    // h264Settings.setQuality(30000, 90);
    // h264Settings.setMode(320, 240, 30);

    cam.setQuality(90000, 90);
    cam.setMode(320, 240, 30, true);
    cam.setKeyFrameInterval(15);
    ns_out.videoStreamSettings = h264Settings;
    trace(ns_out.videoStreamSettings.codec + ", " + h264Settings.profile + ", " + h264Settings.level);
    ns_out.publish("mp4:webCam.f4v", "live");

    var metaData:Object = new Object();
    metaData.codec = ns_out.videoStreamSettings.codec;
    metaData.profile = h264Settings.profile;
    metaData.level = h264Settings.level;
    metaData.fps = cam.fps;
    metaData.bandwidth = cam.bandwidth;
    metaData.height = cam.height;
    metaData.width = cam.width;
    metaData.keyFrameInterval = cam.keyFrameInterval;
    metaData.copyright = "Realeyes Media, 2011";
    ns_out.send("@setDataFrame", "onMetaData", metaData);
    }


Coding the Application – Displaying and Encoding the Video From the Webcam, and Displaying Video Streamed Back From the Server

The application needs to display both the raw, un-encoded incoming video from the webcam and the inbound streaming video after it has been encoded to H.264 in Flash Player, sent to Flash Media Server, and then streamed back to the application. In addition, the metadata that you defined in the previous section needs to be displayed in the UI to reveal the encoding settings defined in publishCamera().

In this next section, you will create two functions, displayPublishingVideo() and displayPlaybackVideo(), to play the streams and display the metadata on screen.

    1. Create a new private instance variable named vid_out, and set its data type to Video:

private var vid_out:Video;

Be sure to import the Video class:

import flash.media.Video;

This new instance of the Video class will be used to play back the not-yet-encoded video coming in from the webcam.

    2. Next, create a protected function named displayPublishingVideo() that takes no arguments and returns void:

protected function displayPublishingVideo():void
{
}

    3. In the first line of the function body, instantiate the vid_out variable by calling the constructor method of the Video class:

vid_out = new Video();

    4. To place the new Video component on screen correctly, assign x and y values to vid_out so that x = 300 and y = 10:

vid_out.x = 300;
vid_out.y = 10;

    5. Next, use the height and width values from the webcam to set the height and width of the video display:

vid_out.width = cam.width;
vid_out.height = cam.height;

    6. To allow the vid_out component to display video coming from the webcam, call the attachCamera() method of the Video class, and pass that method the instance of the Camera class that represents the webcam:

vid_out.attachCamera(cam);

    7. Finally, add vid_out to the display list by calling the addChild() method of the DisplayObjectContainer class:

addChild(vid_out);

At this point, the displayPublishingVideo() function should look similar to:

protected function displayPublishingVideo():void
{
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
vid_out.width = cam.width;
vid_out.height = cam.height;
vid_out.attachCamera(cam);
addChild(vid_out);
}

If you run the application at this point (provided you have a webcam attached to your computer, and you un-commented the calls to publishCamera() and displayPublishingVideo() within onNetStatus()), you should see the Flash Player dialog asking permission to access your camera. Grant permission, and you should see a live video feed coming from your webcam.

Next, you’ll add code to the displayPublishingVideo() function that will display the metadata objects you created earlier. The metadata text won’t show up until the code is in place to handle the incoming stream, however. This is because metaText’s text property is set within the onMetaData() function, and onMetaData() is run only when Flash Media Server sends the stream back to the application. You’ll start by adding the metaText TextField object to displayPublishingVideo() and assigning values for its properties:

    1. In the displayPublishingVideo() function, directly under the existing addChild() method call, set metaText's "x" value to "0", its "y" value to "55", its width to "300", and its height to "385":

metaText.x = 0;
metaText.y = 55;
metaText.width = 300;
metaText.height = 385;

    2. Assign color values for the backgroundColor, textColor, and borderColor properties of metaText. In order for backgroundColor and borderColor to display, you must set both the background and border properties to "true".

metaText.background = true;
metaText.backgroundColor = 0x1F1F1F;
metaText.textColor = 0xD9D9D9;
metaText.border = true;
metaText.borderColor = 0xDD7500;

    3. Add the metaText TextField object to the display list by calling the addChild() method, and passing it the metaText object.

addChild(metaText);

Next, you’ll create a function that will bring the video stream back in from Flash Media Server, and display it in another Video object.

    1. Create a new instance variable named vid_in and data type it as a Video.

private var vid_in:Video;

    2. Next, create a new protected function called "displayPlaybackVideo()" that takes no arguments and returns void.

protected function displayPlaybackVideo():void
{
}

    3. In the first line of the function body, instantiate ns_in, the NetStream variable you declared earlier, passing the "nc" NetConnection to its constructor.

ns_in = new NetStream(nc);

    4. Instead of calling the attachCamera() method, as you did for the previous NetStream, set the client property of the new NetStream to "this".

ns_in.client = this;

    5. Next, call the play() method of the NetStream class, and pass it the String value for the name of the stream. This should match the name of the outgoing stream.

ns_in.play("mp4:webCam.f4v");

    6. Instantiate the vid_in variable by calling its constructor.

vid_in = new Video();

    7. Next, set some sizing and layout properties for the new Video object so that it sits properly on the stage.

vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
vid_in.width = cam.width;
vid_in.height = vid_out.height;

    8. Attach the incoming NetStream to the Video object to have it play back the video.

vid_in.attachNetStream(ns_in);

    9. Finally, add vid_in to the display list by calling the addChild() method and passing vid_in as its only argument.

addChild(vid_in);

Make sure to un-comment the call to displayPlaybackVideo() in the onNetStatus() function, and then save and run the application. You should see a dark rectangle that displays the video's encoding settings, and two video streams side by side. The video on the left is the raw footage coming from the webcam, and the one on the right is the stream coming back from Flash Media Server.

Coding the Application – Adding Some Finishing Touches

The application is almost done! It could stand a little visual cleanup, however.

    1. First, add a [SWF] metadata tag above the class declaration to set the width and height of the application to something more reasonable.

[SWF( width="940", height="880" )]

Next, you’ll create three more TextFields that will display a simple label for the encoding settings list, as well as information about each of the separate video streams. You’ll also work with some simple text formatting to size the text differently from the default.

    2. Create three new TextField variables: one named "vid_outDescription", one named "vid_inDescription", and one named "metaTextTitle". Data type each of them as TextField, and call the constructor for each.

private var vid_outDescription:TextField = new TextField();
private var vid_inDescription:TextField = new TextField();
private var metaTextTitle:TextField = new TextField();

    3. Within the displayPublishingVideo() function, directly below the call that adds metaText to the display list, add a line that sets the text property for metaTextTitle. Play with the spacing between the double quotes and add a "\n" to get the positioning the way you'd like it.

metaTextTitle.text = "\n - Encoding Settings -";

    4. Next, create a local variable named "stylr" that is an instance of the TextFormat class. Instantiate this variable by calling its constructor.

var stylr:TextFormat = new TextFormat();

Ensure that the TextFormat class has been imported.

import flash.text.TextFormat;

    5. Set the size property of the new TextFormat instance to "18".

stylr.size = 18;

    6. Apply the style defined in the stylr variable to the metaTextTitle TextField by calling the setTextFormat() method of the TextField class, and passing that method the TextFormat object "stylr" as an argument.

metaTextTitle.setTextFormat(stylr);

    7. Add more styling and layout property values to metaTextTitle the same way you added them to metaText earlier:

metaTextTitle.textColor = 0xDD7500;
metaTextTitle.width = 300;
metaTextTitle.y = 10;
metaTextTitle.height = 50;
metaTextTitle.background = true;
metaTextTitle.backgroundColor = 0x1F1F1F;
metaTextTitle.border = true;
metaTextTitle.borderColor = 0xDD7500;

    8. Create descriptive text to be displayed for the outbound video stream. Set the text property of the vid_outDescription TextField to display this text. Again, play with the spacing and new lines to get it positioned correctly.

vid_outDescription.text = "\n\n\n\n Live video from webcam \n\n" +
" Encoded to H.264 in Flash Player 11 on output";

    9. Add both the metaTextTitle TextField and the vid_outDescription TextField to the display list.

addChild(vid_outDescription);
addChild(metaTextTitle);

    10. Add descriptive text for the incoming video stream in the same manner. Set values for properties on vid_inDescription, and add the TextField to the display list.

vid_inDescription.text = "\n\n\n\n H.264-encoded video \n\n" +
" Streaming from Flash Media Server";
vid_inDescription.background = true;
vid_inDescription.backgroundColor =0x1F1F1F;
vid_inDescription.textColor = 0xD9D9D9;
vid_inDescription.x = vid_in.x;
vid_inDescription.y = cam.height;
vid_inDescription.width = cam.width;
vid_inDescription.height = 200;
vid_inDescription.border = true;
vid_inDescription.borderColor = 0xDD7500;
addChild(vid_inDescription);

There you have it! The application should now automatically attach a webcam, display the webcam video, encode that video to H.264, and then stream it to and from Flash Media Server, displaying the end result in another video. The source files can be downloaded here. The completed code should appear as follows:

package
{
import flash.display.DisplayObject;
import flash.display.Sprite;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.H264Level;
import flash.media.H264Profile;
import flash.media.H264VideoStreamSettings;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.text.TextField;
import flash.text.TextFormat;

[SWF( width="940", height="880" )]
public class H264_Encoder extends Sprite
{
private var nc:NetConnection;
private var ns_out:NetStream;
private var ns_in:NetStream;
private var cam:Camera = Camera.getCamera();
private var vid_out:Video;
private var vid_in:Video;
private var metaText:TextField = new TextField();
private var vid_outDescription:TextField = new TextField();
private var vid_inDescription:TextField = new TextField();
private var metaTextTitle:TextField = new TextField();

public function H264_Encoder()
{
initConnection();
}

private function initConnection():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
nc.connect("rtmp://office.realeyes.com/live");
nc.client = this;
}

protected function onNetStatus(event:NetStatusEvent):void
{
trace(event.info.code);
if(event.info.code == "NetConnection.Connect.Success")
{
publishCamera();
displayPublishingVideo();
displayPlaybackVideo();
}
}

protected function publishCamera():void
{
ns_out = new NetStream(nc);
ns_out.attachCamera(cam);
var h264Settings:H264VideoStreamSettings = new H264VideoStreamSettings();
h264Settings.setProfileLevel(H264Profile.BASELINE, H264Level.LEVEL_3_1);

// ALTHOUGH FUTURE VERSIONS OF FLASH PLAYER SHOULD SUPPORT SETTING ENCODING PARAMETERS
// ON h264Settings BY USING THE setQuality() AND setMode() METHODS, FOR NOW YOU MUST
// SET THE PARAMETERS ON THE CAMERA FOR: BANDWIDTH, QUALITY, HEIGHT, WIDTH, AND FRAMES PER SECOND.

// h264Settings.setQuality(30000, 90);
// h264Settings.setMode(320, 240, 30);

cam.setQuality(90000, 90);
cam.setMode(320, 240, 30, true);
cam.setKeyFrameInterval(15);
ns_out.videoStreamSettings = h264Settings;
// trace(ns_out.videoStreamSettings.codec + ", " + h264Settings.profile + ", " + h264Settings.level);
ns_out.publish("mp4:webCam.f4v", "live");

var metaData:Object = new Object();
metaData.codec = ns_out.videoStreamSettings.codec;
metaData.profile = h264Settings.profile;
metaData.level = h264Settings.level;
metaData.fps = cam.fps;
metaData.bandwidth = cam.bandwidth;
metaData.height = cam.height;
metaData.width = cam.width;
metaData.keyFrameInterval = cam.keyFrameInterval;
metaData.copyright = "Realeyes Media, 2011";
ns_out.send("@setDataFrame", "onMetaData", metaData);
}

protected function displayPublishingVideo():void
{
vid_out = new Video();
vid_out.x = 300;
vid_out.y = 10;
vid_out.width = cam.width;
vid_out.height = cam.height;
vid_out.attachCamera(cam);
addChild(vid_out);
metaText.x = 0;
metaText.y = 55;
metaText.width = 300;
metaText.height = 385;
metaText.background = true;
metaText.backgroundColor = 0x1F1F1F;
metaText.textColor = 0xD9D9D9;
metaText.border = true;
metaText.borderColor = 0xDD7500;
addChild(metaText);
metaTextTitle.text = "\n - Encoding Settings -";
var stylr:TextFormat = new TextFormat();
stylr.size = 18;
metaTextTitle.setTextFormat(stylr);
metaTextTitle.textColor = 0xDD7500;
metaTextTitle.width = 300;
metaTextTitle.y = 10;
metaTextTitle.height = 50;
metaTextTitle.background = true;
metaTextTitle.backgroundColor = 0x1F1F1F;
metaTextTitle.border = true;
metaTextTitle.borderColor = 0xDD7500;
vid_outDescription.text = "\n\n\n\n Live video from webcam \n\n" +
" Encoded to H.264 in Flash Player 11 on output";
vid_outDescription.background = true;
vid_outDescription.backgroundColor = 0x1F1F1F;
vid_outDescription.textColor = 0xD9D9D9;
vid_outDescription.x = 300;
vid_outDescription.y = cam.height;
vid_outDescription.width = cam.width;
vid_outDescription.height = 200;
vid_outDescription.border = true;
vid_outDescription.borderColor = 0xDD7500;
addChild(vid_outDescription);
addChild(metaTextTitle);
}

protected function displayPlaybackVideo():void
{
ns_in = new NetStream(nc);
ns_in.client = this;
ns_in.play(“mp4:webCam.f4v”);
vid_in = new Video();
vid_in.x = vid_out.x + vid_out.width;
vid_in.y = vid_out.y;
vid_in.width = cam.width;
vid_in.height = vid_out.height;
vid_in.attachNetStream(ns_in);
addChild(vid_in);
vid_inDescription.text = "\n\n\n\n H.264-encoded video \n\n" +
" Streaming from Flash Media Server";
vid_inDescription.background = true;
vid_inDescription.backgroundColor = 0x1F1F1F;
vid_inDescription.textColor = 0xD9D9D9;
vid_inDescription.x = vid_in.x;
vid_inDescription.y = cam.height;
vid_inDescription.width = cam.width;
vid_inDescription.height = 200;
vid_inDescription.border = true;
vid_inDescription.borderColor = 0xDD7500;
addChild(vid_inDescription);
}

public function onBWDone():void
{

}

public function onMetaData( o:Object ):void
{
for (var settings:String in o)
{
trace(settings + " = " + o[settings]);
metaText.text += "\n" + " " + settings.toUpperCase() + " = " + o[settings] + "\n";
}
}

}
}


Download Source Here

Get Ready – Adobe MAX 2011 is Near!

Posted on September 28, 2011 at 9:42 am in Development, Training

Adobe MAX 2011

MAX!

There’s a great deal of excitement in the AIR here at RealEyes Media as the premiere Adobe conference of the year, Adobe MAX 2011, rapidly approaches! This Saturday, October 1st, the epicenter of design, media, and development will be Los Angeles, California, as Adobe settles in for the 3rd year in a row at the L.A. Convention Center and the beautiful Nokia Theater L.A. LIVE. The Adobe MAX conference has always been the place to listen to and meet world-renowned speakers, learn about the latest tools and techniques, and connect with potential clients, new partners, and old friends… and this year is no exception!

If you’ve ever been to MAX, you know that Adobe pulls out all the stops for this event. Keynote addresses are given by the biggest names in tech and entertainment. In case you haven’t already heard, the musical entertainment for this year’s MAX Bash will be provided by the band Weezer!

Weezer!


A chance to learn from the best

Whether you’re a designer, developer, or business strategist, MAX is an environment that deepens your expertise and ultimately makes you more productive in your work. Every skill and experience level is welcome during this five-day learn-a-thon. Whether you’ve never opened Photoshop or you’re interested in creating high-tech video players destined for multiple devices, there’s something at MAX for you.

learn at MAX


Learn about multiscreen development with Realeyes’ own David Hassoun, John Crosby, and Jun Heider

The last few years have seen a steady upswing in multiscreen application development, and MAX has responded to this trend by providing developers, designers, and entrepreneurs with the best resources for learning how to rise to the top in this environment.

David, John, and Jun

Realeyes Media is pleased to announce that Jun Heider, David Hassoun, and John Crosby will be speaking at MAX! Be sure to check out the following sessions:


David Hassoun & John Crosby: Video Player Development for Multiple Devices

“Learn how to create compelling, robust, and high-performing video player experiences for desktops, tablets, and smartphones including HTML5 and Adobe AIR for iOS. This lab for developers will step through what’s needed to develop and optimize the video experience across all devices. Using Adobe Flash Media Server on the back end, you’ll use Adobe Flash Builder and Open Source Media Framework to create video players that just work. Explore how to tune hardware acceleration with Stage Video to optimize battery life.”

  • Are you attending this session? Would you like early access to the sample files? We can help out. Sign up here to download the files.

Jun Heider: Multiscreen Project Best Practices

“Prepare to take the next step in multiscreen development. Review important considerations in planning multiscreen projects geared toward efficient code reuse and workflow. Also, see how to structure projects to match the strategy chosen to fit the application’s use case. By the end of the session, you’ll walk away with an understanding of how to start architecting your multiscreen Adobe Flash Platform applications and build them using Adobe Flash Builder.”

  • If you’re attending this session you can sign up here to receive the presentation materials from Jun’s session.

These guys know their stuff, so if you’re interested in developing applications that are destined for multiple screens, be sure to attend their sessions, ask questions, and meet them in person. You won’t be disappointed!

Can’t make it to MAX? Attend “Mini-MAX”!

Here in Denver, the Rocky Mountain Adobe User Group traditionally hosts an annual “mini-MAX” for those who couldn’t make the trip out to California. Those who attended give us their take on the conference, providing a remote insight into MAX’s highlights. Join us on 11/08/11 at Casselman’s - 2620 Walnut Street in North Denver, CO, as well as every 2nd Tuesday of the month throughout the year to talk all things Adobe!

Adobe Releases OSMF, Strobe Media Playback 1.6

Posted on September 08, 2011 at 3:38 pm in Development, Media Solutions

Back in early June, we reported on the pre-release of Adobe’s OSMF 1.6 and its support for late-binding audio. Adobe has been working hard to improve upon the upgrades in the OSMF 1.6 Sprint 5 release and to add even more new features for mobile as well. Today Realeyes Media is pleased to announce that OSMF 1.6 and Strobe Media Playback 1.6 have reached final release status.

A brief overview of the updates available in OSMF and Strobe Media Playback 1.6:

OSMF 1.6
  • As promised, late-binding audio now supports live playback as well as video on demand (VOD).
  • Also for late-binding audio, seek issues have been resolved.
  • For mobile – Stage Video support for hardware-accelerated video presentation (requires Flash Player 10.2+).
  • DVR rolling window support, which allows you to specify how far back from the live point viewers can rewind (requires the newly released FMS 4.5).
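To give a flavor of what late-binding audio looks like in code, here is a rough sketch using OSMF 1.6’s alternative-audio API. The manifest URL is a placeholder, and the property names reflect our understanding of the 1.6 `MediaPlayer` API, so treat this as a starting point rather than production code:

```actionscript
import org.osmf.media.DefaultMediaFactory;
import org.osmf.media.MediaPlayer;
import org.osmf.media.URLResource;

// Assumes an F4M manifest that declares alternative audio tracks.
var factory:DefaultMediaFactory = new DefaultMediaFactory();
var player:MediaPlayer = new MediaPlayer();
player.media = factory.createMediaElement(
    new URLResource("http://example.com/live/event.f4m"));

// Once the media is playable, check for alternative audio streams
// and switch to the second one (e.g. an alternate-language track).
if (player.hasAlternativeAudio)
{
    trace("Alternative audio streams: " + player.numAlternativeAudioStreams);
    player.switchAlternativeAudio(1);
}
```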
Strobe Media Playback 1.6
Core Framework
  • Improvements to HTTP Dynamic Streaming as well as the ability to better manage bitrate profiles with multi-level manifests.
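For context on multi-level manifests: a “set-level” F4M simply points at one stream-level manifest per bitrate, which is what lets the player manage bitrate profiles. Here is a minimal, hypothetical example (the URLs and bitrates are made up):

```xml
<manifest xmlns="http://ns.adobe.com/f4m/2.0">
  <baseURL>http://example.com/vod/</baseURL>
  <!-- One entry per rendition; bitrate is in kbps -->
  <media href="sample_500.f4m" bitrate="500"/>
  <media href="sample_1000.f4m" bitrate="1000"/>
  <media href="sample_1500.f4m" bitrate="1500"/>
</manifest>
```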
Documentation

http://sourceforge.net/apps/mediawiki/osmf.adobe

This is exciting news for those of us using OSMF and/or the Strobe Media Playback. Thank you to Cathi Kwon and the rest of the OSMF team for giving us these new and powerful feature updates!



For information on how Realeyes Media can help you integrate OSMF into your media solutions, please feel free to contact us today.


Scott Sheridan writes about, and messes around with, the latest technologies in digital motion media at Realeyes. He also does triathlons. Really big triathlons.

Feel free to reach out with any questions; we’re glad to help!

scott@realeyesmedia dot com

Jun Heider gave a really nice presentation this morning on how to leverage the Adobe Flash Platform P2P API to create applications for sharing video, audio, and data among application peers. In his talk, Jun demonstrated P2P technologies working across multiple devices and taking advantage of the flexible RTMFP protocol, an Adobe technology that allows for maximum scalability coupled with a dramatic reduction in server infrastructure and bandwidth costs.
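If you want to experiment with the P2P API yourself, the core pieces are an RTMFP `NetConnection` and a `NetGroup`. Here is a bare-bones sketch; the Cirrus URL and developer key are placeholders you would replace with your own:

```actionscript
import flash.events.NetStatusEvent;
import flash.net.GroupSpecifier;
import flash.net.NetConnection;
import flash.net.NetGroup;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
// Connect to Adobe's Cirrus rendezvous service over RTMFP.
nc.connect("rtmfp://p2p.rtmfp.net/", "YOUR-DEVELOPER-KEY");

function onStatus(event:NetStatusEvent):void
{
    if (event.info.code == "NetConnection.Connect.Success")
    {
        // Describe the group: allow posting and let the server
        // help bootstrap peer discovery.
        var spec:GroupSpecifier = new GroupSpecifier("com.example/chat");
        spec.postingEnabled = true;
        spec.serverChannelEnabled = true;
        var group:NetGroup = new NetGroup(nc, spec.groupspecWithAuthorizations());
    }
    else if (event.info.code == "NetGroup.Connect.Success")
    {
        trace("Joined the group");
    }
}
```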

View the presentation:

Meeting recording

Presentation slides (PDF)

Additional Resources

You can also check out a recent screencast from Jun’s blog demonstrating a multiscreen P2P call center application.


The following are demonstrations of some of the ways in which this technology can be implemented:

Basic Demos (right-click demos to view source)

Metrics Demo (Serverless)

Multiuser Video Demo

Elearning Demo

Adobe AIR – File Sharing

File Sharing Demo (AIR)

File Sharing Demo Source

Byte Array Chunker Utility