HDS & Bootstrap Data

Working with HDS Bootstrap Data

I’ve always been curious about the bootstrap data for HDS content. Recently, I had the chance to find out more about it and get in some fun development with Node.js. We’ve been kicking around the idea of building a tool set for Adobe Media Server using Node.js and possibly socket.io. Last weekend we got some of those ideas going, and one of them was parsing the HDS bootstrap data created when content is packaged for HDS delivery.

The bootstrap data can live in a couple of places:

  1. In the <metadata> node of an F4M file
  2. In an external .bootstrap file

The .bootstrap file contains binary data and the F4M file contains the same binary data that has been Base64 encoded. So, getting to the data is pretty trivial – either read in the .bootstrap file or decode the Base64 string that is in the F4M. Getting to the information contained in the bootstrap binary data is the fun part.

Understanding the bootstrap data

To do so, check out the F4V file format specification. This PDF gives you the details for the entire F4V file format. If you read through it, you’ll see that the format is built from what are called “boxes”. Boxes are given identifiers such as “abst”, “adaf”, “adkm”, “aeib”, “afra”, & “afrt”, to name a few. Each box contains a header; that header identifies the box by its identifier and tells you how much data the box contains. Boxes are also arranged into a hierarchy, so each box holds data that is specific to some part of the file.

It is all in the boxes

The boxes that we are concerned with are “abst” or the bootstrap information box, “asrt” or the segment run table box, and “afrt” or the fragment run table box.

The abst box

The bootstrap information box contains information needed to bootstrap playing of HDS content – specifically to construct the URLs necessary to retrieve the fragments for playback. This includes information about the server, media, & segment information.

The asrt box

The segment run table box contains data about the segments for the media item. There can be multiple asrt boxes – each representing a different quality level. There are some rules that you’ll want to pay attention to for the data in the asrt box:

  • An asrt box can represent fragment runs for several quality levels.
  • Each entry gives the first segment number for a run of segments with the same count of fragments.
    • The count of segments having this same count of fragments can be calculated by subtracting the first segment number in this entry from the first segment number in the next entry.
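That subtraction rule can be sketched as a small helper (the entry shape here is my own simplification, not the spec’s exact field names):

```javascript
// Each asrt entry: { firstSegment, fragmentsPerSegment }.
// The number of segments in a run is the gap to the next entry's firstSegment;
// for the last entry, the run extends to the final segment.
function segmentsInRun(entries, index, totalSegments) {
  const current = entries[index];
  const next = entries[index + 1];
  const end = next ? next.firstSegment : totalSegments + 1;
  return end - current.firstSegment;
}
```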

The afrt box

The fragment run table box is used to find the fragment corresponding to a given time. Similar to the asrt box, there are some rules that you’ll want to pay attention to:

  • Fragments are individually identifiable by the URL scheme, based on segment number and fragment number.
  • Fragments may vary both in duration and in number of samples.
  • Durations of the fragments are stored in this box.
  • A Fragment Run Table may represent fragments for more than one quality level.
  • Each fragment run table entry gives the first fragment number for a run of fragments with the same duration.
    • The count of fragments having this same duration can be calculated by subtracting the first fragment number in this entry from the first fragment number in the next entry.
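The same run-table idea lets you locate the fragment that covers a given time. A hedged sketch, again using a simplified entry shape of my own, with times in milliseconds:

```javascript
// Each afrt entry: { firstFragment, firstFragmentTimestamp, duration }.
// Find the run that contains the requested time, then step forward by
// whole fragment durations to get the fragment number.
function fragmentForTime(entries, timeMs) {
  for (let i = entries.length - 1; i >= 0; i--) {
    const e = entries[i];
    if (timeMs >= e.firstFragmentTimestamp) {
      const offset = Math.floor((timeMs - e.firstFragmentTimestamp) / e.duration);
      return e.firstFragment + offset;
    }
  }
  return entries.length ? entries[0].firstFragment : -1;
}
```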

Parsing the bootstrap data using Node.js

Parsing binary data in Node.js can be done using “Buffer”. For the most part, parsing the bootstrap data was pretty straightforward. The one issue I ran into was with 64-bit integers, which was solved easily enough (there are Node modules for just about anything) using the node-int64 module to represent them. Once that was solved, it was just a matter of parsing through the box headers to figure out where you are in the dataset, and then creating the appropriate data structures to represent what you want and need from the bootstrap data.
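For illustration, a sketch of a box-header reader (the function name is mine). At the time I used node-int64 for the 64-bit fields; current Node.js (12+) can read them natively as BigInt:

```javascript
// A box header is a 32-bit big-endian size followed by a 4-character type.
// A size of 1 signals that a 64-bit extended size follows the type field.
function readBoxHeader(buffer, offset) {
  let size = buffer.readUInt32BE(offset);
  const type = buffer.toString('ascii', offset + 4, offset + 8); // e.g. "abst", "asrt", "afrt"
  let headerSize = 8;
  if (size === 1) {
    // Node.js 12+ reads 64-bit big-endian values as BigInt;
    // node-int64 fills the same role on older runtimes.
    size = Number(buffer.readBigUInt64BE(offset + 8));
    headerSize = 16;
  }
  return { size, type, headerSize };
}
```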

In our case, we want to be able to monitor live events across multiple servers to make sure that they are all on the same segment and fragment. We’re building a service that, if a server goes haywire, will notify another service that can then restart or shut down that particular server, or let caching servers know that they need to flush or refresh their cache. We’re still dreaming up things we can use this type of data for.

Just want to get to that data?

If you have a .bootstrap file you can use the f4fpackager.exe that is part of the Adobe Media Server toolset to inspect the bootstrap data. All you need to do is run the tool with the argument “--inspect-bootstrap”. So the command looks something like the following if you have a bootstrap file named mydata.bootstrap:

[shell]f4fpackager.exe --input-file=mydata.bootstrap --inspect-bootstrap[/shell]

Anyways, if you have any questions or input let me know in the comments.

Creating Set-level Manifest Files Using the F4M Configurator Tool

Here is a quickie on how to use Adobe’s F4M configurator tool to create set-level manifest files.

The configurator is installed with AMS 5.0 and can be found in the following directory: {AMS_INSTALL}/tools/f4mconfig/configurator/

  1. Open the f4mconfig.html file in a browser.
    Adobe Media Server - F4M Configurator Tool
  2. Enter the path to your server, application and event. For example for an event named “myliveevent” using the “livepkgr” application the Base URL would look like: http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/
  3. If you are going to use DVR, enter a value for “DVR Window Duration”. A value of -1 configures the DVR window for all of the available content. A value greater than zero configures the amount of time in seconds available before the live point. We’ll set a 30 minute DVR window, so 1800 seconds.
  4. Enter the stream name and bit rate for each bit rate you are encoding. For this example, let’s say we have a single bit rate of 300 for a stream named “mylivestream”.
    Adobe's F4M Configurator - Stream Name and DVR Window
  5. Click the “Save Manifest” button. A file will be created and you will be prompted to save it. Save the file and open it.
  6. The file should look similar to the following:
    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <baseURL>http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/</baseURL>
      <dvrInfo windowDuration="1800"/>
      <media href="mylivestream" bitrate="300"/>
    </manifest>
  7. This file can now be used to specify live DVR content. If you add an additional bitrate, you now have a set-level F4M file for multi-bitrate streaming.

Hope this helps and saves a bit of time for you.

Configure Adobe Flash Media Server for Live HTTP Dynamic Streaming

How to set up Live HTTP Dynamic Streaming

So you want to stream a live event using HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS)? No problem. Adobe Media Server (AMS) provides an out-of-the-box solution for you. To do so, you’ll need to:

  1. Download and install Flash Media Live Encoder (FMLE)
  2. Make a small configuration change to the encoder
  3. Setup your live event
  4. Begin streaming
  5. Set up a player

Installing and configuring Flash Media Live Encoder

  1. Download FMLE from http://www.adobe.com/products/flash-media-encoder.html
  2. Once it is installed open the config.xml file from
    1. Windows: C:\Program Files\Adobe\Flash Media Live Encoder 3.2\conf
    2. Mac: /Applications/Adobe/Flash Media Live Encoder 3.2/conf/
  3. Locate the “streamsynchronization” tag under flashmedialiveencoder_config -> mbrconfig -> streamsynchronization and set the value for “enable” to “true”. The streamsynchronization node should look similar to the following:
    <flashmedialiveencoder_config>
     <mbrconfig>
       <streamsynchronization>
         <enable>true</enable>
       </streamsynchronization>
    ...
  4. Save and close the file.

Setting up the live event

Streaming a live event involves using the “livepkgr” application that comes installed with AMS. The livepkgr application comes with a preconfigured event named “liveevent”. We’ll use this as a template for our live event.

  1. On your server navigate to the {AMS_INSTALL}/applications/livepkgr/events/_definst_ directory.
  2. We’re going to call our event “myliveevent”. Create a new directory and name it “myliveevent”.
  3. Open the newly created myliveevent directory and create a new XML file named “Event.xml”. This file is used to configure the just-in-time (JIT) packaging settings for your HDS content. Add the following XML to the file. Note: You can also copy the Event.xml file from the liveevent directory that is set up by default. Just update the EventID to match the folder name.
    <Event> 
      <EventID>myliveevent</EventID> 
      <Recording> 
        <FragmentDuration>4000</FragmentDuration> 
        <SegmentDuration>16000</SegmentDuration> 
        <DiskManagementDuration>3</DiskManagementDuration> 
      </Recording> 
    </Event>

    For more information about the values in the Event.xml  file you can review Adobe’s documentation – link in the resources section below.

  4. Save and close the file.
  5. Your event is now set up. You can reuse this event all you want, or create another one for a different event name.

Begin streaming

Now we can start up FMLE and set it up to connect to our livepkgr application and begin streaming.

  1. In the left panel of FMLE make sure the “Video” and “Audio” sections are both checked.
  2. Video
    1. In the video section, set the format to be “H.264” and then click the button with the wrench icon.
    2. In the resulting pop-up window, make sure the settings match the following:
      1. Profile: Main
      2. Level: 3.1
      3. Keyframe Frequency: 4 seconds
        Live HTTP Dynamic Streaming H.264 Settings
    3. Click “OK” to close the pop-up window.
    4. In the “Bit Rate” section make sure you only have one of the bit rates selected. We’re only creating a single stream for now.
      Live HTTP Dynamic Streaming Video Encoder Settings
  3. Audio
    1. In the Audio section, set the format to “AAC”
      Live HTTP Dynamic Streaming Audio Encoder Settings
  4. In the right panel set “FMS URL” to point to your server and the livepkgr application:
    1. Example: rtmp://192.168.1.113/livepkgr
  5. Set the “Stream” value to be mylivestream?adbe-live-event=myliveevent
    1. “mylivestream” is the name of the stream and can be anything you’d like. The actual files that AMS creates will be stored in the livepkgr/streams/_definst_/mylivestream directory.
    2. “?adbe-live-event=myliveevent” tells the livepkgr application to use the Event.xml in the livepkgr/events/_definst_/myliveevent directory that we created.
      Live HTTP Dynamic Streaming RTMP Server Settings
  6. Click the “Connect” button. If all goes well, you’ll connect to your server. If not, check to make sure there aren’t any typos in the values for “FMS URL” and “Stream” and that you can connect to your server and it is running.
  7. Click the big green “Start” button to begin streaming.
    Live HTTP Dynamic Streaming Big Green Start Button
  8. You now have a stream. Let’s see if we can get a player to play it back.

Setting up the player

Getting to the HDS content for your new stream involves requesting a URL that lets Apache (installed with AMS) know what we are looking for. The path will consist of the following parts:

  1. The protocol: http://
  2. The server location: 192.168.1.113/ (in my case, yours will be different)
  3. The Location that is configured to deliver live streams. By default these are:
    1. HDS: hds-live/
    2. HLS: hls-live/
  4. The application name: livepkgr/
  5. The instance name (we’ll use the default): _definst_
  6. The event name: myliveevent
  7. The stream name: mylivestream
  8. The F4M file extension for HDS – .f4m or the M3U8 file extension for HLS.

So if we put all of that together we’ll get a URL that looks like:

  • HDS: http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m
  • HLS: http://192.168.1.113/hls-live/livepkgr/_definst_/myliveevent/mylivestream.m3u8

Note: You may need to add the 8134 port to the URL if you didn’t install AMS on port 80: http://192.168.1.113:8134/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m
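Putting the parts together is simple string assembly. A small sketch using this article’s example values (the helper name and parameter shape are mine):

```javascript
// Build a live HDS/HLS request URL from its parts, mirroring the list above.
// Omit `port` (or pass 80) when AMS is installed on the default HTTP port.
function liveStreamUrl({ server, port, location, app, instance, event, stream, ext }) {
  const host = port && port !== 80 ? `${server}:${port}` : server;
  return `http://${host}/${location}/${app}/${instance}/${event}/${stream}.${ext}`;
}
```

Calling it with `location: 'hds-live'` and `ext: 'f4m'` yields the HDS URL above; swapping in `'hls-live'` and `'m3u8'` yields the HLS one.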

  1. Open a browser window and navigate to that URL; you should see the F4M’s XML content.
    Live HTTP Streaming F4M XML
  2. Open the following URL: http://www.osmf.org/configurator/fmp/#
  3. Set your F4M url as the value for “Video Source”
  4. Select the “Yes” radio button for “Are you using HTTP Streaming or Flash Access 2.0?”
  5. Set “Autoplay Content” to “Yes”
    Live HTTP Dynamic Streaming Player Settings
  6. Click the Preview button at the bottom of the page.
  7. Congratulations. You are now streaming live media over HTTP.

To verify the HTTP streaming, open a tool that will let you inspect the HTTP traffic (something like Developer Tools or Firebug). You should see requests for resources like “mylivestreamSeg1-Frag52” and “mylivestream.bootstrap”. This is the player requesting HDS fragments, and Apache and AMS working together to package them just-in-time for the player.
Live HTTP Dynamic Streaming HTTP Traffic

Hopefully this provides you with some good information about Live HTTP Dynamic Streaming and clarifies some of the setup and configuration details. Please, if you have any questions, let me know in the comments or contact me.

Resources

OSMF Fix: “Local HDS” Playback

OSMF & Local F4M Files

The Open Source Media Framework (OSMF) doesn’t respond well when you pass it a local F4M file path for playback. Playing back local HDS fragments is an edge case, but enough people have encountered this problem that I think it deserves a little attention.

Fixing the IndexHandler

The basics involve extending the HTTPStreamingF4FIndexHandler then overriding the constructFragmentRequest() method & removing the requirement that HTTP be part of the URI.

So, after you’ve created your new class that extends org.osmf.net.httpstreaming.f4f.HTTPStreamingF4FIndexHandler, add the following override:

[actionscript3]override protected function constructFragmentRequest(serverBaseURL:String, streamName:String, segmentId:uint, fragmentId:uint):String
{
    var requestUrl:String = "";
    requestUrl += streamName + "Seg" + segmentId + "-Frag" + fragmentId;
    return requestUrl;
}[/actionscript3]

The original constructFragmentRequest() method looks like:

[actionscript3]protected function constructFragmentRequest(serverBaseURL:String, streamName:String, segmentId:uint, fragmentId:uint):String
{
    var requestUrl:String = "";
    if (streamName.indexOf("http") != 0)
    {
        requestUrl = serverBaseURL + "/";
    }
    requestUrl += streamName + "Seg" + segmentId + "-Frag" + fragmentId;
    return requestUrl;
}[/actionscript3]

So you can see we just removed the “http” requirement.

Now you need to get OSMF to use your adjusted IndexHandler. This means you need to create a custom NetLoader, that creates a custom StreamingFactory & then the Factory creates your IndexHandler. Once you have all of this done, you can then create a custom MediaFactory (I usually just extend DefaultMediaFactory) and create a MediaFactoryItem that can handle your local F4M file resource, & specify your new NetLoader for the element that is returned.

Convoluted eh? Let’s break it down.

Creating the Factory

Since we are dealing with HDS content extend org.osmf.net.httpstreaming.f4f.HTTPStreamingF4FFactory & override the createIndexHandler() method. In this method set the indexHandler property equal to an instance of your IndexHandler (mine is named “LocalFileStreamingF4FIndexHandler”).

Example:
[actionscript3]override public function createIndexHandler(resource:MediaResourceBase, fileHandler:HTTPStreamingFileHandlerBase):HTTPStreamingIndexHandlerBase
{
    indexHandler = new LocalFileStreamingF4FIndexHandler(fileHandler);
    return indexHandler;
}[/actionscript3]

Creating a NetLoader

Now we need our NetLoader – this is where the Factory is created. I extended org.osmf.net.httpstreaming.HTTPStreamingNetLoader & created an override for the createHTTPStreamingFactory() method.

Example:
[actionscript3]override protected function createHTTPStreamingFactory():HTTPStreamingFactory
{
    return new LocalFileStreamingF4FFactory();
}[/actionscript3]

Creating a custom MediaFactory

Now we need to be able to inject this NetLoader into the system. So extend org.osmf.media.DefaultMediaFactory & add a MediaFactoryItem that returns a new VideoElement. Pass an instance of your NetLoader to the VideoElement constructor.

Example:
[actionscript3]localFileStreamingNetLoader = new LocalFileStreamingNetLoader();
localHDSFileNetLoader = new LocalHDSContentNetLoader();
addItem( new MediaFactoryItem( "com.realeyes.osmf.elements.video.httpstreaming",
    localHDSFileNetLoader.canHandleResource,
    function():MediaElement
    {
        return new VideoElement( null, localHDSFileNetLoader );
    }
) );[/actionscript3]

Specifying the canHandle and passing in the NetLoader

You can also see in the previous bit of code that the MediaFactoryItem asks for a canHandleResource function. This is where conditions are added to make sure that you’re working with the right type of media. For this situation you can default to the super’s canHandleResource(), which checks the resource’s HTTP Streaming metadata.

Now you have a setup that can handle a local F4M file and the downloaded fragments. There are a couple of considerations you’ll need to keep in mind:

  1. The fragments and F4M file must reside in the same directory. OSMF looks for the fragments in the directory that the F4M is in.
  2. Multi-bitrate won’t work, unless you can download the fragments for all the bitrates and work out storage for them.

The provided example is the least obtrusive way to get local HDS content to play. Alternatively, you can edit the HTTPStreamingIndexHandler and compile an OSMF SWC to use for a quick and dirty fix.

HTTP Dynamic Streaming Content Download and Playback

Recently I’ve been working on a system to play back HTTP Dynamic Streaming (HDS) content locally, from a single file. If you have seen my previous post on HTTP Dynamic Streaming (HDS) or are already familiar with it, you know that the media item is packaged in such a way that there are multiple segments and fragments that make up an entire media item. Similar to the image below:

A sample segment and its fragments

The system involves a client, an AIR application, requesting some remote HDS content (basically an F4M file). The client downloads the fragment data for the media item and writes it to disk. Instead of writing each fragment to a separate file, the fragment data is written to a single file. This part alone is pretty straight forward. The tricky part is when you want to play the content back.

A few problems needed to be overcome to get playback to work. First, to get the local fragments to play back, I needed to fix an issue in the OSMF framework, which only accounts for requests for remote HDS fragments. This was accomplished by overriding the HTTPStreamingIndexHandler class and removing some code that only accounted for “HTTP” being part of the request. Second, and more importantly, I needed to intercept the request for the HDS fragment that is generated when OSMF is playing back HDS content, and use the request to determine where the fragment’s byte data exists in the local file that was created when the client downloaded the content. Then I return this byte data to the rest of the OSMF code, which parses it into FLV data to pass on to the appendBytes() method on the NetStream.

On top of that, we wanted to allow for playback while the fragments were still downloading. On OS X this wasn’t a huge deal because AIR on OS X can have multiple FileStreams open the same file. On Windows the file is locked when it is opened by the first FileStream. This is a problem because I want to write the downloaded fragment data to the file and read fragment data for playback at the same time. This issue was solved with a little utility library that uses only one FileStream instance and manages read and write requests by queuing them up and only allowing them to happen one at a time.
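The utility library itself is ActionScript for AIR, but the queuing idea translates to any language; here is the same serialize-everything-through-one-owner pattern sketched in JavaScript (class and method names are mine):

```javascript
// Serialize async operations so reads and writes never overlap:
// each queued task waits for the previous one to finish before it
// touches the shared resource (in the AIR case, the single FileStream).
class SerialQueue {
  constructor() {
    this.tail = Promise.resolve();
  }
  // task is a function returning a value or a Promise.
  enqueue(task) {
    const result = this.tail.then(() => task());
    // Keep the chain alive even if a task fails.
    this.tail = result.catch(() => {});
    return result;
  }
}
```

Usage would look like `queue.enqueue(() => writeFragment(bytes))` followed by `queue.enqueue(() => readFragment(seg, frag))`; the read is guaranteed not to start until the write completes.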

It was a huge headache and lots of time was spent in the OSMF rabbit hole, but I now have a great file IO library for AIR and I’m able to download and play back HDS content locally.

HTTP Dynamic Streaming & Live Events

Flash Media Server (FMS) allows for the streaming of live video over HTTP. The following is a list of concepts that apply to HTTP Dynamic Streaming & live streams:
  • To stream live video over HTTP, FMS uses the concept of live events.
  • A live event is configured as part of an FMS application.
  • An application configured for live events can have multiple events.
  • As with normal FMS applications, the name of the directory is the name of the event.
  • Live events can be configured for multi-bitrate streaming.
  • The streams are packaged as fragments and written to disk on the fly, meaning a player can access the content after the live stream has stopped.
  • Live events support DVR and DRM.

Creating a live event

The steps involved in creating a live event are very similar to creating a normal FMS application with the addition of the Event.xml file. They are:

  1. Create the FMS application {FMS_INSTALL}/applications/{APPLICATION_NAME}
  2. In the application directory create the following directory structure: events/_definst_/{EVENT_NAME}
  3. Create an Event.xml file in the ‘events/_definst_/{EVENT_NAME}’ directory and at a minimum specify the event ID and the fragment & segment durations:[xml]<Event>
    <EventID>{EVENT_NAME}</EventID>
    <Recording>
      <FragmentDuration>4000</FragmentDuration>
      <SegmentDuration>10000</SegmentDuration>
    </Recording>
    </Event>[/xml]

When you install Flash Media Server a preconfigured live event is installed as well. The application is called ‘livepkgr’ with a live event called ‘liveevent’. The configuration exists in ‘{FMS_INSTALL}/applications/livepkgr/events/_definst_/liveevent’.

Packaging for Live Events

The Event.xml contains the configuration information for the live event. This includes:

  • Segment and fragment settings
  • Content protection (Flash Access) information

A sample Event.xml file with segment/fragment settings as well as content protection configuration:

[xml]
<Event>
  <EventID>liveevent</EventID>
  <Recording>
    <FragmentDuration>4000</FragmentDuration>
    <SegmentDuration>10000</SegmentDuration>
    <ContentProtection enabled="true">
      <ProtectionScheme>FlashAccessV2</ProtectionScheme>
      <FlashAccessV2>
        <ContentID>foo</ContentID>
        <CommonKeyFile>common-key.bin</CommonKeyFile>
        <LicenseServerURL>http://dill.corp.adobe.com:8090</LicenseServerURL>
        <TransportCertFile>production_transport.der</TransportCertFile>
        <LicenseServerCertFile>license_server.der</LicenseServerCertFile>
        <PackagerCredentialFile>production_packager.pfx</PackagerCredentialFile>
        <PackagerCredentialPassword>hbXX5omIhzI=</PackagerCredentialPassword>
        <PolicyFile>policy01.pol</PolicyFile>
      </FlashAccessV2>
    </ContentProtection>
  </Recording>
</Event>[/xml]

The parts to pay attention to in the Event.xml file above are:

  • SegmentDuration: The length of the segments in milliseconds. Each F4F file contains one segment.
  • FragmentDuration: The length of the fragments in milliseconds. Each segment can contain multiple fragments.
  • ContentProtection: Specifies if the content is protected as well as the details necessary for Flash Access content protection.

DVR & Multi-bitrate Live Events

To create a DVR or multi-bitrate (MBR) live event you will also need to create a Manifest.xml file in the live event folder (events/_definst_/{EVENT_NAME}/Manifest.xml). This file will contain the DVR and MBR settings for the live event. Sample Manifest.xml file:

[xml]<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <dvrInfo beginOffset="0" endOffset="0"></dvrInfo>
  <media streamId="livestream1" bitrate="100" />
  <media streamId="livestream2" bitrate="200" />
  <media streamId="livestream3" bitrate="350" />
</manifest>[/xml]

The <dvrInfo> node:

The <dvrInfo> node contains 2 attributes and controls the DVR functionality for the live event:

  • beginOffset: The value (in seconds) is where the client players will begin viewing the stream. The default is 0 & negative values are treated as 0.
  • endOffset: This value in seconds specifies how many seconds behind the current duration of the stream clients can view. The default is 0 & negative values are treated as 0.

Do not include this node if you do not want to use DVR functionality.

The <media> node:

The <media> nodes are used to specify the multi-bitrate (MBR) settings for the live event. The file is parsed and used by FMS to create the streams for the live event based on the settings specified in the media nodes. The manifest file is then updated with the nodes and data necessary (id, duration, bootstrap and metadata) for a client player to play the stream. The initial file should contain the following attributes for each <media> node:

  • streamId: The name of the publishing stream
  • bitrate: The bitrate the stream was encoded at

If you do not want to use MBR do not include the media node(s).
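Generating that initial file is straightforward templating. A sketch that mirrors the sample Manifest.xml above (the helper name and argument shape are mine):

```javascript
// Produce the initial Manifest.xml body from optional DVR offsets
// and a list of MBR stream descriptions ({ streamId, bitrate }).
function buildManifest(dvr, streams) {
  const media = streams
    .map((s) => `  <media streamId="${s.streamId}" bitrate="${s.bitrate}" />`)
    .join('\n');
  // Omit <dvrInfo> entirely when DVR is not wanted, per the note above.
  const dvrNode = dvr
    ? `  <dvrInfo beginOffset="${dvr.beginOffset}" endOffset="${dvr.endOffset}"></dvrInfo>\n`
    : '';
  return `<manifest xmlns="http://ns.adobe.com/f4m/1.0">\n${dvrNode}${media}\n</manifest>`;
}
```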

Playing Back Live Events

Setting up an FMS application to publish a live event is as simple as creating the application (directory), and the Event.xml file. Of course adding MBR, DVR and content protection will add to the setup and configuration, but it is still a pretty straight forward process. Once the FMS application is configured we’ll need to do a couple of things:

  1. Publish a stream to the application and associate the stream with the live event
  2. Play back the stream in a player. We’ll use the Flash Media Playback to keep things simple.

Publish a Stream and Associate it with a Live Event

To publish a stream we’ll use the Flash Media Live Encoder. We’ll also need to associate the stream to the live event. This will be accomplished with a main.asc file and a bit of server side code. First we’ll cover the server side code, then we’ll set up the Flash Media Live Encoder.

Associating a Stream with a Live Event

I used the main.asc found in the ‘livepkgr’ application installed with Flash Media Server as a base (I’ve changed a couple of things). You can download the main.asc I use here. The main.asc file just needs to go in the application directory ({FMS_INSTALL}/applications/{APPLICATION_NAME}). A little explanation of what is going on in the main.asc: the stream is associated to the live event in the onPublish() method (ln. 60 in main.asc):

[as3]s.liveEvent = liveEventName;[/as3]

The variable ‘liveEventName’ defaults to the stream name, or is set to the value passed in as the URL variable ‘event’ (ln. 34 – 44 in main.asc):

[as3]var nvpairs = new LoadVars();
nvpairs.decode(queryString);
for( var nv in nvpairs )
{
    var nval = nvpairs[nv];
    if( nv.localeCompare( "event" ) == 0 )
    {
        liveEventName = nval;
        break;
    }
}[/as3]

Now to publish the stream.

Publishing the Stream Using Flash Media Live Encoder

To publish the stream we’ll need to connect to the FMS application and then pass the name of the event that we want to stream to. This is accomplished by adding URL variables to the stream name, for example: {STREAM_NAME}?event={EVENT_NAME} (the ‘event’ URL variable is what the main.asc above looks for). Make sure to replace {APPLICATION_NAME} with the actual name of your live event application and {EVENT_NAME} with the actual name of your event. You will also need to make sure you are using H.264 and set up any MBR streams. Click the start button, and the stream should start publishing to your FMS server.

Playing Back the Live Event

To play back the stream, we’ll use the Flash Media Playback. So go to http://osmf.org/configurator/fmp to set up the player.

  1. Set the ‘Video Source’ to: http://{FMS_SERVER}:{FMS_HTTP_PORT}/live/events/{APPLICATION_NAME}/events/_definst_/{EVENT_NAME}.f4m
    • For FMS 4.5 the path will be http://{FMS_SERVER}:{FMS_HTTP_PORT}/hds-live/{APPLICATION_NAME}/{INSTANCE_NAME}/{EVENT_NAME}/{STREAM_NAME}.f4m
  2. Select Yes for ‘Are you using HTTP Streaming or Flash Access 2.0?’
  3. Remove the value for ‘Poster frame file location’
  4. Select Yes for ‘Autoplay Content’
  5. Click the ‘Preview’ button.

The live event should start playing. There you go – live events streamed over HTTP with support for multi-bitrate, DVR & content protection using Flash Access. In the next few articles I’ll dive into Flash Access and what is required to get your content protected and secured. Resources:

  • Flash Media Server Developer’s Guide: http://help.adobe.com/en_US/flashmediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725dfa0-8000.html

HTTP Dynamic Streaming Presentation Summary

Okay, after a crazy week buried under bugs and project management tasks I’ve finally carved out some time to post the slides and resources from my HTTP Dynamic Streaming presentation I gave last month.

You can grab the slides on Slide Share, although they won’t be too useful.

The presentation was recorded – which will be more useful than the slides alone.  You can view the Connect session here: http://bit.ly/isfJmo.

Next on the list of stuff I’d like to get up on the site is a write-up about creating live events. So look for that in the next few days now that I have some room to breathe.

HTTP Dynamic Streaming: Simple HDS Playback

In my last few posts I’ve covered:

  • What is HDS? – Describing what HTTP Dynamic Streaming is and where it is useful
  • Getting Started with HDS – The pieces and parts that you need to be familiar with to start using HDS video content

Now let’s get to the fun part. Let’s play back our HTTP Dynamic Streaming videos. First we’ll use the Flash Media Playback to play our HDS content, then we’ll build a simple OSMF player that plays back the same HDS content. Both of the examples that I’ll walk through use the Open Source Media Framework. OSMF provides the code necessary for the heavy-lifting parts of HDS playback. This includes:

  • Loading the F4M file
  • Determining if multi-bitrate (MBR) is available
  • Determining the best MBR content for playback
  • Extracting and inspecting the bootstrap data contained in the F4M file
  • Using the bootstrap data to determine what segment and fragment to call based on the current time of the video playback
  • Assembling the video data for playback as the fragments are loaded
  • Playing back the video stream

There are a couple of things that we’ll need to make sure we have in place to get the players working:

  1. Hopefully you have some HDS video files to play with. If you don’t, you can download some that I’ve packaged here.
  2. If you haven’t already, install and configure the HTTP Origin module on your Apache 2.2 web server, or install Flash Media Server and its pre-configured web server.

Deploying the video content

Depending on how you’ve configured the HTTP Origin Module, the directory you will need to deploy your files to will differ from mine, so we’ll use ‘{HTTP_STREAMING_DIR}’ to represent the path to the directory configured to deliver HDS content using the HTTP Origin Module. On my Windows server my path is:
C:\Program Files\Adobe\FMS\wwwroot\vod

Here are the 2 simple steps to deploying your files:

  1. Upload: Copy your packaged HDS files to the {HTTP_STREAMING_DIR}
  2. Verify: Check to make sure that the files are accessible by requesting the F4M file in a browser

* Note: you’ll also need to have a crossdomain.xml file if you are loading across domains. You can put this in the {HTTP_STREAMING_DIR} as well.

Playing the content back using Flash Media Playback

Now that we have our content online, we can see if we can load it. I use Flash Media Playback to test this. Flash Media Playback is a hosted OSMF player built by Adobe. If you want something that you host yourself, check out Strobe Media Playback – another OSMF player provided by Adobe that you can download and deploy to your own server.

To use Flash Media Playback, go to http://osmf.org/configurator/fmp/

You should see a page that looks similar to the following:
Flash Media Playback

We’ll need to update a couple of things to get our HDS content playing:

  1. Change the video source URL to point to the F4M file that you deployed to your Apache server. Mine is ‘http://office.realeyes.com:8134/vod/hds-sample/sample.f4m’
  2. Change the selected value for the streaming type option to indicate HTTP Dynamic Streaming content

The settings should resemble the following:
HDS Flash Media Playback Settings

Click the preview button and the video should begin playing. I’ve embedded the Flash Media Playback here with the above settings (except autoplay is false):

Now that we know that our HDS content is working correctly, let’s build a simple OSMF player that plays back the same content.

  1. Create a new ActionScript 3 project in Flash Builder and name it ‘SimplePlayer’.
  2. In the constructor of the class that is created for you, ‘SimplePlayer.as’, create a local variable named mediaFactory typed as MediaFactory. Set mediaFactory equal to a new DefaultMediaFactory().
    [as3]var mediaFactory:MediaFactory = new DefaultMediaFactory();[/as3]
  3. Create a local URLResource variable named resource and set it equal to a new URLResource(). Pass the URLResource constructor the URL to your F4M file. Mine is ‘http://office.realeyes.com:8134/vod/hds-sample/sample.f4m’
    [as3]var resource:URLResource = new URLResource( "http://office.realeyes.com:8134/vod/hds-sample/sample.f4m" );[/as3]
  4. Create a MediaElement local variable named element and set it equal to the result of calling the createMediaElement() method on the mediaFactory object. Make sure to pass ‘resource’ as the only argument to the createMediaElement() method.
    [as3]var element:MediaElement = mediaFactory.createMediaElement( resource );[/as3]
  5. Create a MediaPlayer local variable named mediaPlayer and set it equal to a new MediaPlayer, passing the constructor the element variable as the only argument.
    [as3]var mediaPlayer:MediaPlayer = new MediaPlayer( element );[/as3]
  6. Create a local variable named mediaContainer typed as a MediaContainer. Set it equal to a new MediaContainer.
    [as3]var mediaContainer:MediaContainer = new MediaContainer();[/as3]
  7. Call the addMediaElement() method on the mediaContainer object, passing the element object in as the only argument.
    [as3]mediaContainer.addMediaElement( element );[/as3]
  8. Add the mediaContainer to the display list by calling the addChild() method making sure to pass in the mediaContainer object as the only argument.
    [as3]addChild( mediaContainer );[/as3]
  9. The completed class should resemble the following:
    [as3]package
    {
        import flash.display.Sprite;

        import org.osmf.containers.MediaContainer;
        import org.osmf.media.DefaultMediaFactory;
        import org.osmf.media.MediaElement;
        import org.osmf.media.MediaFactory;
        import org.osmf.media.MediaPlayer;
        import org.osmf.media.URLResource;

        public class SimplePlayer extends Sprite
        {
            public function SimplePlayer()
            {
                var mediaFactory:MediaFactory = new DefaultMediaFactory();

                var resource:URLResource = new URLResource( "http://office.realeyes.com:8134/vod/hds-sample/sample.f4m" );

                var element:MediaElement = mediaFactory.createMediaElement( resource );

                var mediaPlayer:MediaPlayer = new MediaPlayer( element );

                var mediaContainer:MediaContainer = new MediaContainer();

                mediaContainer.addMediaElement( element );
                addChild( mediaContainer );
            }
        }
    }[/as3]

When you run this, the video should play. There aren’t any controls, but the important thing to note here is that there wasn’t anything special we needed to do to play back our HDS content. We relied on OSMF’s DefaultMediaFactory to determine the type of media and play it back. You can download the project source here: [dm]12[/dm]

Getting Started with HTTP Dynamic Streaming

As described in the first post in this series: What is Dynamic Streaming?, HTTP Dynamic Streaming (HDS) allows for the delivery of streaming content over the HTTP protocol. Streaming video over HTTP requires some knowledge and preparation. The following is a rundown and description for each of the components involved with HTTP streaming.

The files: The F4F and F4M file formats

The F4F file format specification (http://www.adobe.com/devnet/f4v.html) describes how to divide media content into segments and fragments. These segments and fragments are what make ‘streaming’ content over HTTP possible. Basically, the file is broken up into multiple pieces and parts when it is prepared for HTTP streaming.

A sample segment and its fragments
A sample segment and its fragments

The way the segments and each segment’s fragments are created is based on multiple factors, such as the length of the video content, the number of keyframes, and the length specified for each segment. The result of packaging your video content for HDS delivery is a set of files that looks similar to the following for a simple, single-bitrate file:

HTTP Dynamic Streaming Packaged Files for a short video file
The set of files created by the f4fpackager for a short video clip.

From the image above:

  • sample.f4m: This is an XML file created by the f4fpackager that contains the information necessary for a client player to playback the video file. There will be only one of these for each source video file.
  • sampleSeg1.f4f: This segment file contains the fragments that the client player will request for playback. There can be multiple of these files for each source video file.
  • sampleSeg1.f4x: This is an index file that contains specific information about the fragments inside the segment files. There will be one of these types of files for each segment file. The HTTP Origin Module uses the data in this file to determine the fragment to send back to the client after a request.
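For reference, a file set like the one described above comes out of running the f4fpackager against your source file, along these lines (I’m working from memory of the f4fpackager documentation here, so confirm the exact flag names with the tool’s built-in help):

f4fpackager --input-file=sample.f4v --bitrate=800

Each run packages a single source file, so for MBR content you run the packager once per bitrate rendition.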

Each F4F file is a single segment and the segment can contain multiple fragments. So, if you inspect the HTTP requests as a media item is being played back you will see files being requested that map to a fragment within a segment. For example:

Segments and fragments being requested over HTTP
Segments and fragments being requested for HDS video

This request tells the HTTP Origin Module which segment’s fragment to pull out of the F4F file and deliver to the client for playback. These requests are based on the time code of the media item during playback. So, if you scrub to a later point in the video, the requests might look something like the following:

Scrubbing HTTP Dynamic Streaming Video Content
Segments and fragment requests while scrubbing HDS video content

This allows the client to request any piece of the video content and start playing back almost immediately. You can control how the files are segmented and fragmented when preparing content with the f4fpackager. The basic concept is to balance the size of the fragments being delivered against the number of HTTP requests that a client needs to make for playback. Please note that the fragment numbers may not be sequential, so you cannot rely on sequential numbering when requesting content. The fragments are based on the settings passed into the packager and can skip fragment numbers. So, you can expect to see fragment sequences like the following (no scrubbing involved):

  1. sampleSeg1-Frag1
  2. sampleSeg1-Frag2
  3. sampleSeg1-Frag3
  4. sampleSeg1-Frag5
  5. sampleSeg1-Frag6

This doesn’t mean that the file is incomplete, it is just how the fragments were created by the packager.
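To put that in terms of actual requests: using my sample URL from earlier, the sequence above would show up in an HTTP monitor as something similar to the following (note the jump from Frag3 to Frag5):

http://office.realeyes.com:8134/vod/hds-sample/sampleSeg1-Frag1
http://office.realeyes.com:8134/vod/hds-sample/sampleSeg1-Frag2
http://office.realeyes.com:8134/vod/hds-sample/sampleSeg1-Frag3
http://office.realeyes.com:8134/vod/hds-sample/sampleSeg1-Frag5
http://office.realeyes.com:8134/vod/hds-sample/sampleSeg1-Frag6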

The Details

The F4M File Format

The F4M or Flash Media Manifest file format contains the information about the package of files created when video content is packaged for HDS. The information included in the manifest file can include some or all of the following:

  • Media location
  • Media type
  • Media bootstrap
  • Multi-bitrate (MBR) availability
  • Codecs
  • Resolutions
  • Digital Rights Management (DRM) authentication
  • DVR information
  • moov atom, metadata block and XMP metadata block

When playing back HDS video content, the F4M file is expected to be loaded as the ‘media file’. The client is responsible for inspecting the data included in the F4M file to authenticate (if DRM authentication is required), determine if MBR content is available, select the best MBR version of the content, and then request the media from the server.

Here is a sketch of a simple F4M file with a single media item (element and attribute names are per the F4M 1.0 format; the values are placeholders and the Base64 bootstrap data is truncated):

[xml]<manifest xmlns="http://ns.adobe.com/f4m/1.0">
    <id>sample</id>
    <streamType>recorded</streamType>
    <duration>120</duration>
    <bootstrapInfo profile="named" id="bootstrap1">...Base64 bootstrap data...</bootstrapInfo>
    <media url="sample" bootstrapInfoId="bootstrap1" />
</manifest>[/xml]
Installing and configuring the Apache HTTP Origin Module: http://help.adobe.com/en_US/HTTPStreaming/1.0/Using/WS7b362c044b7dd076-735e76121260080a90e-8000.html