AngularJS, what is it?

AngularJS, Simply Put

AngularJS

AngularJS is a JavaScript framework. It provides an Application Programming Interface (API) that allows you to build web applications using JavaScript, HTML, & CSS. The API that Angular provides is a well-thought-out and robust set of solutions that gives you the ability to create powerful web applications more simply & quickly.

AngularJS lets you extend HTML vocabulary for your application. The resulting environment is extraordinarily expressive, readable, and quick to develop. – angularjs.org

Thinking Differently

For me, and I imagine many others, Angular required a paradigm shift in the way I thought about application development. Angular doesn’t just allow you to manipulate the DOM. What Angular does is allow you, as the developer, to extend the existing features of HTML and add to the capabilities of the browser in a very slick way. Angular’s angle (ha!) is to control the view by inspecting the DOM for an injection point where you can then create the features of your application. These injections can be objects that act as the logical control center for an existing tag like an <h1>, or attributes added on to that same tag. Angular also allows you to create completely new tags that can be used in your application. The results then become the components, views, and logic of your application.

The point is that you now have control over the features and functionality of the browser for your application. Understanding how your new features are added to the browser, and how you can best structure your application for simplified and maintainable development, is the first step. The way I tend to think of it is that Angular is responsible for telling the browser what to do; you just need to provide the logic and data to allow it to do so. That is a very important concept to grasp with Angular – the data is what drives the application. You don’t, and shouldn’t, manipulate the DOM directly. You change the data, and then it is up to Angular to render the view correctly based on that data.

For example, instead of using jQuery (or Angular’s jqLite) to manipulate the DOM to show some text or change a class, we change the data model. This change is evaluated, and Angular then updates and renders the view appropriately based on that changed data. You can see this in the sample below:
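
Here is a minimal sketch of such a sample (the model name “yourName” is arbitrary, and any Angular 1.x build will do):

[xml]
<!DOCTYPE html>
<html ng-app>
  <head>
    <!-- Just include the Angular library - no custom JavaScript needed -->
    <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js"></script>
  </head>
  <body>
    <!-- ng-model binds this input to a model property named "yourName" -->
    <input type="text" ng-model="yourName" placeholder="Type a name">

    <!-- Angular re-renders this expression whenever the model changes -->
    <h1>Hello, {{yourName}}!</h1>
  </body>
</html>
[/xml]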

The displayed H1 value changes when we type into the text input. This is Angular being notified that the model has updated and that the view needs to re-render the updated data. There is nothing we need to do programmatically to set the “innerHTML” of a specific div with the new data. Also notice that there is no custom JavaScript in the sample above. We only have to include the library & specify the model & bindings.

For me, it took some time to get it. But, once I did, I loved it!

The Browser Is Where It’s At

I used to create a lot of Flash-based content – be it video players, widgets, or full-on applications, what I built was built on the Flash Platform. I haven’t opened the Flash IDE or Flash Builder to create a SWF in a long, long…long time. What does this mean? First, technology is changing – no doubt. Second, I get to learn some new stuff. Guess what? I’m okay with that.

Users, platforms, and developers have forced browsers to evolve. The browser is no longer just a window to view content. It is an environment that applications execute in. It used to be that you’d open a browser, search for something, or read something, then close it down and get to work. Now, what you work on is in the browser, and those browsers are open all day.

What should I learn? What should you learn, if you aren’t already? Learn about the stuff that happens in the browser – yep, JavaScript, CSS, & HTML. Learn the existing APIs as well as the upcoming API changes and additions.

Banner of Browser Logos

The browser as a first-class citizen?

Browsers are more powerful and more feature-rich, and they are becoming first-class citizens in terms of how people use them. I don’t think it is 100% where it needs to be, but it won’t be long until we hit the tipping point that causes a shift in how people use and think about the browser. Consider Google services – GMail, Drive & Docs, and Calendar, to name a few. These services all run in the browser, and each week new features are added that make them comparable to, and sometimes better than, their desktop counterparts. It used to be that everyone relied on Microsoft Office – Word to create documents, edit, and track changes, and Outlook to manage their email and calendar. Now all of that and more is in the browser. Of course you can still use desktop applications to manage that data, but, like I said, there will be a tipping point. The point of this? Pay attention to the browser; the browser is where things are headed.

Why the browser?

Because:

  • Browsers are familiar to users
  • They exist for all major platforms
  • Browsers have established a quick and easy update path
  • Browsers will become more accepted by the enterprise
  • They take advantage of HTTP protocols
  • Browsers leverage new and existing technology

Familiarity

A browser is an easy point of entry. It is a simple concept to grasp and easy to explain and learn. And although it has a low learning curve, browsers have been, and can be, extended to provide the functionality needed for today’s and tomorrow’s users.

All major platforms

All major platforms have a browser. Desktop & mobile, even TVs and DVD players have browsers. For developers, the headache is supporting the different platforms – you may still have to provide platform-specific code. But the main point here is that HTTP, JavaScript, & CSS are, and will be, supported by more and more platforms.

Quick and easy update path

Chrome and Firefox update at lightning speed, for many users without them even knowing. This helps roll out new features (WebRTC, Media APIs, etc.) more quickly. There is a major barrier when it comes to the enterprise and government, but this is something that I think will change in the near future.

Accepted by the enterprise & government

Currently these are two areas where updates and browser versions can really hold back innovation. But with the current direction of, and additions to, APIs and security, this issue should become a problem of the past as browser updates become easier, more secure, and the norm rather than the exception for the enterprise and government.

HTTP Protocol

HTTP has been around forever, and for good reason. It works. It is flexible and powerful. Innovation and increased bandwidth allow for more interesting uses of the protocol. HTTP video streaming is a great example of this: the client is responsible for managing the HTTP requests it needs to successfully play back video served up in HTTP chunks, while still providing the expected functionality to the user. We still have conversations about “chatty” applications, but those conversations will be minimized as different perspectives and different technologies emerge that leverage HTTP to a greater and more efficient degree.

Leveraging Existing & New Technology

As with HTTP, other established technologies will be accepted and leveraged by the browser. For instance, browsers are finally getting around to integrating media playback, and WebRTC is another example. Between things like WebSockets, Node.js, and socket.io, there are some really interesting things going on, and I’m excited about the next-gen applications and tools that will be created.

What I see

All of this isn’t to say that what we as developers are doing now will go away. Things certainly won’t change immediately. But I am looking to the future, evaluating trends, technology, and emerging conversations, and what I see is the browser. Maybe not in its current incarnation, but the browser is what I see.

What do you see?

HDS & Bootstrap Data


Working with HDS Bootstrap Data

I’ve always been curious about the bootstrap data for HDS content. Recently, I had the chance to find out more about it and get in some fun development with Node.js. We’ve been kicking around the idea of building a tool set for Adobe Media Server using Node.js and possibly socket.io. Last weekend we got some of the ideas going, and one of those was parsing the HDS bootstrap data created when content is packaged for HDS delivery.

The bootstrap data can live in a couple of places:

  1. In the <metadata> node of an F4M file
  2. In an external .bootstrap file

The .bootstrap file contains binary data, and the F4M file contains the same binary data, Base64 encoded. So getting to the data is pretty trivial – either read in the .bootstrap file or decode the Base64 string that is in the F4M. Getting to the data contained in the bootstrap binary is the fun part.
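
In Node.js, getting at the raw bytes might look something like the following sketch (the file name is from the example later in this post, and the Base64 string is a placeholder):

[js]
var fs = require('fs');

// Case 1: a standalone .bootstrap file is already the raw binary data
var bootstrapFromFile = fs.readFileSync('mydata.bootstrap');

// Case 2: the F4M's <metadata> node holds the same bytes, Base64 encoded.
// Assume the encoded string has already been read out of the XML:
var base64FromF4M = '...'; // placeholder for the <metadata> contents
var bootstrapFromF4M = new Buffer(base64FromF4M, 'base64');

// Either way, we end up with a Buffer containing the binary box data
[/js]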

Understanding the bootstrap data

To do so, check out the F4V file format specification. This PDF gives you the details of the entire F4V file format. If you read through it, you’ll see that the format is built using what are called “boxes”. These boxes are given identifiers such as “abst”, “adaf”, “adkm”, “aeib”, “afra”, & “afrt”, to name a few. Each box contains a header; that header identifies the box by its identifier and lets you know how much data is contained in the box. The boxes are also arranged into a hierarchy, so each box holds data that is specific to some part of the file.

It is all in the boxes

The boxes that we are concerned with are “abst”, the bootstrap information box; “asrt”, the segment run table box; and “afrt”, the fragment run table box.

The abst box

The bootstrap information box contains the information needed to bootstrap playback of HDS content – specifically, to construct the URLs necessary to retrieve the fragments for playback. This includes information about the server, the media, & the segments.

The asrt box

The segment run table box contains data about the segments for the media item. There can be multiple asrt boxes – each representing a different quality level. There are some rules that you’ll want to pay attention to for the data in the asrt box:

  • An asrt box can represent segment runs for several quality levels.
  • Each entry gives the first segment number for a run of segments with the same count of fragments.
    • The count of segments having this same count of fragments can be calculated by subtracting the first segment number in this entry from the first segment number in the next entry.

The afrt box

The fragment run table box is used to find the fragment corresponding to a given time. Similar to the asrt box, there are some rules that you’ll want to pay attention to:

  • Fragments are individually identifiable by the URL scheme, based on segment number and fragment number.
  • Fragments may vary both in duration and in number of samples.
  • The durations of the fragments are stored in this box.
  • A fragment run table may represent fragments for more than one quality level.
  • Each fragment run table entry gives the first fragment number for a run of fragments with the same duration.
    • The count of fragments having this same duration can be calculated by subtracting the first fragment number in this entry from the first fragment number in the next entry (see the sketch just after this list).
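
To make that subtraction rule concrete, here is a tiny sketch – the entries are made-up values standing in for what you’d parse out of an afrt box:

[js]
// Hypothetical entries parsed from a fragment run table (afrt box)
var entries = [
  { firstFragment: 1, duration: 4000 },  // a run of 4-second fragments...
  { firstFragment: 26, duration: 2000 }  // ...until fragment 26, where 2-second fragments begin
];

// The count of fragments in the first run is the difference between the
// first fragment numbers of consecutive entries
var fragmentsInRun = entries[1].firstFragment - entries[0].firstFragment;
console.log(fragmentsInRun); // 25
[/js]

The same calculation applies to segment runs in the asrt box.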

Parsing the bootstrap data using Node.js

Parsing binary data in Node.js can be done using “Buffer”. For the most part, parsing the bootstrap data was pretty straightforward. The one issue I ran into was with 64-bit integers, which was solved easily enough (there are Node modules for just about anything) by using the node-int64 module to represent them. Once that was solved, it was just a matter of parsing through each box header to figure out where you are in the dataset, and then creating the appropriate data structures to represent what you want and need from the bootstrap data.
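
The general shape of the header parsing looks something like the following sketch (simplified from what we actually built; the layout – a 32-bit size, a 4-character type, and an optional 64-bit extended size – comes from the spec):

[js]
var fs = require('fs');
var Int64 = require('node-int64');

var buffer = fs.readFileSync('mydata.bootstrap');
var offset = 0;

// Each box header starts with a 32-bit size followed by a 4-character type
var boxSize = buffer.readUInt32BE(offset);
var boxType = buffer.toString('ascii', offset + 4, offset + 8);
offset += 8;

// A size of 1 means the real size is in the next 8 bytes - this is where
// node-int64 comes in, since a JavaScript Number can't safely represent
// the full 64-bit range
if (boxSize === 1) {
  boxSize = new Int64(buffer, offset).toNumber();
  offset += 8;
}

console.log('Found box "' + boxType + '" containing ' + boxSize + ' bytes');
[/js]

From there it is just a matter of recursing into the child boxes you care about – abst, asrt, & afrt.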

In our case we want to be able to monitor live events across multiple servers to make sure that they are all on the same segment and fragment. We’re building a service that, if something happens to a server and it goes haywire, will notify another service that can then restart or shut down that particular server, or let caching servers know that they need to flush or refresh their cache. We’re still dreaming up things we can use this type of data for.
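
As a sketch of the monitoring idea (the parsed values here are made up – in reality they’d come from parsing each server’s bootstrap data as described above):

[js]
// Pretend we've already fetched and parsed each server's bootstrap data
var results = [
  { server: 'origin-a', segment: 12, fragment: 48 },
  { server: 'origin-b', segment: 12, fragment: 47 }
];

// Compare every server against the first one and flag any that drift
var reference = results[0];
results.forEach(function(result) {
  if (result.segment !== reference.segment || result.fragment !== reference.fragment) {
    console.warn(result.server + ' is out of sync with ' + reference.server);
  }
});
[/js]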

Just want to get to that data?

If you have a .bootstrap file you can use f4fpackager.exe, part of the Adobe Media Server toolset, to inspect the bootstrap data. All you need to do is run the tool with the argument “--inspect-bootstrap”. So the command looks something like the following if you have a bootstrap file named mydata.bootstrap:

[shell]f4fpackager.exe --input-file=mydata.bootstrap --inspect-bootstrap[/shell]

Anyways, if you have any questions or input let me know in the comments.

Creating Set-level Manifest Files Using the F4M Configurator Tool

Here is a quickie on how to use Adobe’s F4M configurator tool to create set level manifest files.

The configurator is installed with AMS 5.0 and can be found in the following directory: {AMS_INSTALL}/tools/f4mconfig/configurator/

  1. Open the f4mconfig.html file in a browser.
    Adobe Media Server - F4M Configurator Tool
  2. Enter the path to your server, application, and event. For example, for an event named “myliveevent” using the “livepkgr” application, the Base URL would look like: http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/
  3. If you are going to use DVR, enter a value for “DVR Window Duration”. A value of -1 configures the DVR window for all of the available content. A value greater than zero configures the amount of time in seconds available before the live point. We’ll set a 30 minute DVR window, so 1800 seconds.
  4. Enter the stream name and bit rate for each bit rate you are encoding. For this example, let’s say we have a single bit rate of 300 for a stream named “mylivestream”.
    Adobe's F4M Configurator - Stream Name and DVR Window
  5. Click the “Save Manifest” button. A file will be created and you will be prompted to save it. Save the file and open it.
  6. The file should look similar to the following:
    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <baseURL>http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/</baseURL>
      <dvrInfo windowDuration="1800"/>
      <media href="mylivestream" bitrate="300"/>
    </manifest>
  7. This file can now be used to specify live DVR content. If you add an additional bitrate, you now have a set-level F4M file for multi-bitrate streaming.

Hope this helps and saves a bit of time for you.

Configure Adobe Flash Media Server for Live HTTP Dynamic Streaming

How to set up Live HTTP Dynamic Streaming

So you want to stream a live event using HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS)? No problem. Adobe Media Server (AMS) provides a solution right out of the box. To do so, you’ll need to:

  1. Download and install Flash Media Live Encoder (FMLE)
  2. Make a small configuration change to the encoder
  3. Set up your live event
  4. Begin streaming
  5. Set up a player

Installing and configuring Flash Media Live Encoder

  1. Download FMLE from http://www.adobe.com/products/flash-media-encoder.html
  2. Once it is installed open the config.xml file from
    1. Windows: C:\Program Files\Adobe\Flash Media Live Encoder 3.2\conf
    2. Mac: /Applications/Adobe/Flash Media Live Encoder 3.2/conf/
  3. Locate the “streamsynchronization” tag under flashmedialiveencoder_config -> mbrconfig -> streamsynchronization and set the value for “enable” to “true”. The streamsynchronization node should look similar to the following:
    <flashmedialiveencoder_config>
     <mbrconfig>
       <streamsynchronization>
         <enable>true</enable>
       </streamsynchronization>
    ...
  4. Save and close the file.

Setting up the live event

Streaming a live event involves using the “livepkgr” application that comes installed with AMS. The livepkgr application comes with a preconfigured event named “liveevent”. We’ll use this as a template for our live event.

  1. On your server navigate to the {AMS_INSTALL}/applications/livepkgr/events/_definst_ directory.
  2. We’re going to call our event “myliveevent”. Create a new directory and name it “myliveevent”.
  3. Open the newly created myliveevent directory and create a new XML file named “Event.xml”. This file is used to configure the just-in-time (JIT) packaging settings for your HDS content. Add the following XML to the file. Note: You can also copy the Event.xml file from the liveevent directory that is set up by default. Just update the EventID to match the folder name.
    <Event> 
      <EventID>myliveevent</EventID> 
      <Recording> 
        <FragmentDuration>4000</FragmentDuration> 
        <SegmentDuration>16000</SegmentDuration> 
        <DiskManagementDuration>3</DiskManagementDuration> 
      </Recording> 
    </Event>

    For more information about the values in the Event.xml  file you can review Adobe’s documentation – link in the resources section below.

  4. Save and close the file.
  5. Your event is now set up. You can reuse this event all you want, or create another one for a different event name.

Begin streaming

Now we can start up FMLE and set it up to connect to our livepkgr application and begin streaming.

  1. In the left panel of FMLE make sure the “Video” and “Audio” sections are both checked.
  2. Video
    1. In the video section, set the format to be “H.264” and then click the button with the wrench icon.
    2. In the resulting pop-up window, make sure the settings match the following:
      1. Profile: Main
      2. Level: 3.1
      3. Keyframe Frequency: 4 seconds
        Live HTTP Dynamic Streaming H.264 Settings
    3. Click “OK” to close the pop-up window.
    4. In the “Bit Rate” section make sure you only have one of the bit rates selected. We’re only creating a single stream for now.
      Live HTTP Dynamic Streaming Video Encoder Settings
  3. Audio
    1. In the Audio section, set the format to “AAC”
      Live HTTP Dynamic Streaming Audio Encoder Settings
  4. In the right panel set “FMS URL” to point to your server and the livepkgr application:
    1. Example: rtmp://192.168.1.113/livepkgr
  5. Set the “Stream” value to be mylivestream?adbe-live-event=myliveevent
    1. “mylivestream” is the name of the stream and can be anything you’d like. The actual files that AMS creates will be stored in the livepkgr/streams/_definst_/mylivestream directory.
    2. “?adbe-live-event=myliveevent” tells the livepkgr application to use the Event.xml in the livepkgr/events/_definst_/myliveevent directory that we created.
      Live HTTP Dynamic Streaming RTMP Server Settings
  6. Click the “Connect” button. If all goes well, you’ll connect to your server. If not, check to make sure there aren’t any typos in the values for “FMS URL” and “Stream” and that you can connect to your server and it is running.
  7. Click the big green “Start” button to begin streaming.
    Live HTTP Dynamic Streaming Big Green Start Button
  8. You now have a stream. Let’s see if we can get a player to play it back.

Setting up the player

Getting to the HDS content for your new stream involves requesting a URL that lets Apache (installed with AMS) know what we are looking for. The path will consist of the following parts:

  1. The protocol: http://
  2. The server location: 192.168.1.113/ (in my case, yours will be different)
  3. The Location that is configured to deliver live streams. By default these are:
    1. HDS: hds-live/
    2. HLS: hls-live/
  4. The application name: livepkgr/
  5. The instance name (we’ll use the default): _definst_
  6. The event name: myliveevent
  7. The stream name: mylivestream
  8. The file extension: .f4m for HDS, or .m3u8 for HLS.

So if we put all of that together we’ll get a URL that looks like:

  • HDS: http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m
  • HLS: http://192.168.1.113/hls-live/livepkgr/_definst_/myliveevent/mylivestream.m3u8

Note: You may need to add the 8134 port to the URL if you didn’t install AMS on port 80: http://192.168.1.113:8134/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m

  1. Open a browser window and navigate to that URL. You should see the F4M’s XML content.
    Live HTTP Streaming F4M XML
  2. Open the following URL: http://www.osmf.org/configurator/fmp/#
  3. Set your F4M url as the value for “Video Source”
  4. Select the “Yes” radio button for “Are you using HTTP Streaming or Flash Access 2.0?”
  5. Set “Autoplay Content” to “Yes”
    Live HTTP Dynamic Streaming Player Settings
  6. Click the Preview button at the bottom of the page.
  7. Congratulations. You are now streaming live media over HTTP.

To verify the HTTP streaming, open a tool that will let you inspect the HTTP traffic (something like Developer Tools or Firebug). You should see requests for resources like “mylivestreamSeg1-Frag52” and “mylivestream.bootstrap”. This is the player requesting HDS fragments, and Apache and AMS working together to package them just-in-time for the player.
Live HTTP Dynamic Streaming HTTP Traffic

Hopefully this provides you with some good information about Live HTTP Dynamic Streaming and clarifies some of the setup and configuration details. Please, if you have any questions, let me know in the comments or contact me.

Resources

Take the PIA out of load testing AMF services

Testing service APIs is a big pain. More so when you are using a format that isn’t widely supported by testing tools. Load testing AMF services…ugh.

Recently I was tasked with finding a way to load test some AMF services built using PHP. In the past I had used jMeter for load testing, so that is where I started. jMeter is a good tool, but not for AMF. A few extensions for jMeter have been built for AMF, but they are a PIA to set up.

I found a couple of tools that have made load testing AMF services a snap. Both of the tools are from SmartBear, and both are open source and free to use. Bonus!

  1. Tool one: soapUI
  2. Tool two: loadUI

The following are the basic steps to get a simple load test set up:

Setting up the project

  1. Download and install soapUI (you’ll need Java too).
  2. Start up soapUI.
    soapUI Startup
  3. Create a new soapUI project – File -> ‘New soapUI Project’
  4. Name your project and click ‘OK’.
    soapUI New Project
  5. Add a new TestSuite – right-click your new project -> ‘New TestSuite’
  6. Name your TestSuite and click ‘OK’
    soapUI New TestSuite
  7. Create a new TestCase – right-click the new TestSuite -> ‘New TestCase’
  8. Name your TestCase and click ‘OK’
    soapui New TestCase
  9. Expand the Test Case
  10. Right-click ‘Test Steps’, select ‘Add Step’ -> AMF Request.
    soapui New AMF Request
  11. Your project is set up and ready-to-roll. Save it.

Configuring the AMF Request

Now that we have a request set up, we need to specify the arguments for the call. This is where I had trouble with jMeter – setting up the data required proxies and additional tools. There was nowhere to easily create & edit AMF objects to pass along in the AMF request.

Enter soapUI. Let’s say we have an API method called getMonkeys(). getMonkeys() requires an array of ids that specifies which monkeys we want in the list. The name of this parameter is ‘monkeyIds’.

  1. In soapUI, right-click the AMF Request object and select ‘Open Editor’. You should see a window similar to the following:
    AMF Request Editor
  2. In the text field for ‘Endpoint’ enter your service end point. For example: http://www.thekuroko.com/Monkey/amfphp/gateway.php
  3. Enter the name of your call in the text field for the AMF Call setting. For example: Monkey.getMonkeys
  4. Just under the entry for Endpoint, add a property to the property list for the call by clicking the ‘Add Property’ button.
    New Property Button
  5. Enter ‘monkeyIds’ as the name of the property. If the value for this property were a simple value we could enter it into the value column. We need an array though.
  6. To set the value for the property we’ll use the script window just under the property list.
  7. In the script window, enter the following to create an Array that contains the id values 1,2 & 3 and assigns that Array to the monkeyIds parameter.
    parameters["monkeyIds"] = [1,2,3];
  8. That is it. The call for getMonkeys() is set up.
  9. To test the call click the little green arrow in the top left of the AMF Request editor.
  10. If your paths and data are set up correctly, you should see an XML formatted response in the window to the right of the Script editor.
    soapUI AMF Request Result

Creating and Running a Load Test

So now we have a test for a service, but we wanted to get some load testing done. If you’re looking for quick and simple load testing, you don’t have to go much further than soapUI itself. To create a load test in soapUI:

  1. Right-click the ‘Load Tests’ node in your TestCase -> ‘New Load Test’
  2. Name the new load test and click ‘OK’
  3. That’s it – two steps and the load test is set up. You can run the test “as-is”.

Now, this is a very simple load test, and there are a ton of things you can add and adjust to build more useful load tests within soapUI.

Running Load Test with loadUI

The other tool I mentioned, loadUI, is built to integrate with soapUI and make load testing “easier and more efficient”.

Once loadUI is installed, you can execute the test case that you set up in soapUI.

  1. Right-click the test case, then select ‘Run with loadUI’.
  2. You will be prompted to save the project, do so.
  3. Select ‘Fixed Rate’ for the ‘Default Generator’ selection – this will determine how “clients” are generated for the load test.
  4. Select ‘Statistics’ for the ‘Default Statistics’ selection – this will display a graph for the load test metrics.
    loadUI Test Settings
  5. Click ‘OK’.
  6. loadUI will launch.
    loadUI Startup
  7. Click the ‘Play/Stop’ button to start the load test.
    loadUI Play/Stop Button

You can play around with the Generator settings to change the rate at which clients are created to see changes in the load test results while the load test is running.
loadUI Generator

loadUI Stats

To view a report of the results you can click the ‘Summary Report’ button in the top right of the loadUI interface.
loadUI Summary Report Button

This is just a simple load test and there are plenty of additional settings, assertions and analysis tools that can be added, adjusted and tweaked to improve the validity of the load tests.

Next Steps

Our next step is to integrate the tests into our Continuous Integration (CI) system. We use Jenkins and I saw this post about Automating loadUI and Jenkins (formerly Hudson). So, in theory it can be done. I’ll let you know what we get worked out on that end when we get there.

So far, I’m pretty excited about the two tools. They are very useful, and free to boot. Hey SmartBear – you really are smart, thank you – you rule.

Resources:

Mobile Flex: View Data

From the previous post you should know how to navigate from one view to the next using the ViewNavigator. Now, you want some data in that view, right? No problem – this is where the View object’s ‘data‘ property comes into play. Setting the data property is accomplished by passing the data object, in addition to the View’s class name, into the pushView() method on the navigator object.

Example:

[as3]navigator.pushView(MyNewView, dataObject);[/as3]

This effectively calls the setter for the data property of the new View (MyNewView) object that is created.

Managing View Data

You could work with the data property on the View object directly. For instance, if the data object passed into the View via the pushView() method was a simple user object that contained a name property, you could bind the name property to a label control.

Example:

[xml]<s:Label id="name_lbl" text="{data.name}" />[/xml]

Overriding the Data Property Setter

Usually, though, you’d want to override the setter for the data property. Then you can strongly type your object and work with it in a better manner.

Example:

[as3][Bindable]
protected var user:User;

override public function set data(value:Object):void
{
    super.data = value;
    user = value as User;
}[/as3]

[xml]<s:Label text="{user.name}" />[/xml]

So now we’ve got the data in the view. The next step is to manage the state of each view. With mobile apps you can’t count on the view staying around, so we’ll need to keep tight control of the state of each view. That way we can bring the user right back to where they expect to be when they come back to the app – after a phone call, for example. In the next post we’ll look into how to do this. Stay tuned.

This article has also been posted on the Realeyes website.

Mobile Flex: ViewNavigator Basics

Flex 4.5 provides some pretty slick updates and enhancements, not the least of which is the addition of the Mobile components and the ability to slam out some pretty nice mobile apps easily. The first thing I’d like to talk about is a new concept, the ViewNavigator. The ViewNavigator provides some pretty intense functionality, such as view management.

What is the ViewNavigator?

The ViewNavigator keeps track of your views. It does this by keeping your views in a list. To add a new view you ‘push’ the view onto the list; to remove a view you ‘pop’ a view off the list. You can think of it as a stack – first in, last out – and the last view in is the visible view.

Pushing a View into ViewNavigator's 'stack'
Popping a view out of the ViewNavigator's 'stack'

Using the ViewNavigator

Using the ViewNavigator is a pretty straightforward process of capturing a user interaction, such as a button click, then pushing the new View onto the ViewNavigator’s stack.

For example, let’s pretend that you have a new Flex mobile project. The default view of that project contains a button that, when clicked, should display another view named MyNewView. MyNewView also contains a button that, when clicked, returns you to the home view.

Home View Component

In the Home View component, all you really need to worry about is the click handler on the button:

[xml]<s:Button id="next_btn"
label="NEXT"
width="100%"
click="navigtor.pushView(MyNewView)" />[/xml]

The click handler calls the pushView() method on ‘navigator‘, a property available from the View class, passing it the class name of the View that you want to display. We’ll cover getting data into that view and transitions in other posts. The creation of the new View & default transition are all handled by the framework.

MyNewView Component

The MyNewView View component is basically the same thing:

[xml]<s:Button id="back_btn"
label="BACK"
width="100%"
click="navigtor.popView()" />[/xml]

You call popView() on the ‘navigator‘ property, which removes the view from the stack, displaying the Home view again.

Here is a quick screen cast of an application using similar code:
[kml_flashembed publishmethod="static" fversion="9.0.0" movie="http://thekuroko.com/wp-content/uploads/2011/03/ViewNavigatorSample.swf" width="485" height="785" targetclass="flashmovie"]Get Adobe Flash player

[/kml_flashembed]

This article has also been posted on the Realeyes website.

OSMF Custom Media Elements

OSMF Video Sample

A good argument for using a framework is the ability to extend the built-in capabilities of the framework. For example, there was a comment on the ‘Getting Started with OSMF Plugins‘ post that asked about using embedded images in the WatermarkPlugin sample.

Here are the steps that I took to get an embedded asset (instead of a ‘loadable’ asset) to show as a watermark:

1. Create a new class that extends MediaElement (this is a simple element, but you could extend any existing element depending on your needs). I named mine StaticImageElement.

[actionscript3]
package com.realeyes.osmf.plugin.element
{
    public class StaticImageElement extends MediaElement
    {

    }
}
[/actionscript3]

2. Add a private Bitmap property with a getter and setter to the class – I named mine _bitmap.

[actionscript3]
private var _bitmap:Bitmap;

public function get bitmap():Bitmap
{
    return _bitmap;
}

public function set bitmap( value:Bitmap ):void
{
    if( value != _bitmap )
    {
        _bitmap = value;
    }
}
[/actionscript3]

3. In the setter for the bitmap property add the DisplayObjectTrait to the StaticImageElement

[actionscript3]
addTrait( MediaTraitType.DISPLAY_OBJECT, new DisplayObjectTrait( _bitmap as DisplayObject, bitmap.width, bitmap.height ) );
[/actionscript3]

4. The completed class is pretty simple because we get to use everything already created for OSMF.

[actionscript3]
package com.realeyes.osmf.plugin.element
{
    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.display.DisplayObject;

    import org.osmf.media.MediaElement;
    import org.osmf.traits.DisplayObjectTrait;
    import org.osmf.traits.MediaTraitType;

    public class StaticImageElement extends MediaElement
    {
        private var _bitmap:Bitmap;

        public function StaticImageElement()
        {
            super();
        }

        public function get bitmap():Bitmap
        {
            return _bitmap;
        }

        public function set bitmap( value:Bitmap ):void
        {
            if( value != _bitmap )
            {
                _bitmap = value;

                addTrait( MediaTraitType.DISPLAY_OBJECT, new DisplayObjectTrait( _bitmap as DisplayObject, bitmap.width, bitmap.height ) );
            }
        }
    }
}
[/actionscript3]

5. Create the embedded asset in the WatermarkProxyElement class
[actionscript3]
[Embed( "/assets/osmf_logo.png" )]
protected static const OSMF_LOGO:Class;
[/actionscript3]

6. Now all we need to do in the WatermarkProxyElement is set the bitmap property on a new instance of the StaticImageElement, instead of creating an ImageElement with the watermark URL and an ImageLoader.
Before:

[actionscript3]
var watermark:ImageElement = new ImageElement( new URLResource( watermarkURL ), new ImageLoader() );
[/actionscript3]

After:

[actionscript3]
var watermark:StaticImageElement = new StaticImageElement();
watermark.bitmap = new OSMF_LOGO();
[/actionscript3]

Bonus points for developing with a framework – more specifically OSMF! The embedded watermark shows up.

Download the original sample code:
[dm]10[/dm]

UPDATE: I’ve created an additional custom MediaElement called InteractiveImageElement.as. Thanks for the idea @cucu_adrian! The new element handles rollover and rollout by adjusting the image’s alpha property and setting the cursor to a button cursor. It also navigates to a URL specified in the class – this would be an easy thing to make configurable though.
[dm]11[/dm]

Adobe Marketplace and the GTrack OSMF Plugin

The Adobe Marketplace launched recently:

Adobe Marketplace is the ultimate destination for all things Adobe — If you’ve created a technology or service that enhances or integrates with Adobe AIR, Photoshop, or the Open Source Media Framework, we invite you to register with the largest community of AIR users, Photoshop enthusiasts and OSMF developers in the world.

We’ve added the GTrack plug-in (part of the REOPS project) as an offering in the OSMF marketplace. You can check out the Marketplace offering here.

New things are being added to the OSMF Marketplace on a daily basis, so keep your eye on it for great tools, tips, and downloads!