Configure Adobe Flash Media Server for Live HTTP Dynamic Streaming

How to set up Live HTTP Dynamic Streaming

So you want to stream a live event using HTTP Dynamic Streaming (HDS) and HTTP Live Streaming (HLS)? No problem. Adobe Media Server (AMS) provides an out-of-the-box solution for you. To do so, you’ll need to:

  1. Download and install Flash Media Live Encoder (FMLE)
  2. Make a small configuration change to the encoder
  3. Set up your live event
  4. Begin streaming
  5. Set up a player

Installing and configuring Flash Media Live Encoder

  1. Download FMLE from http://www.adobe.com/products/flash-media-encoder.html
  2. Once it is installed, open the config.xml file from:
    1. Windows: C:\Program Files\Adobe\Flash Media Live Encoder 3.2\conf
    2. Mac: /Applications/Adobe/Flash Media Live Encoder 3.2/conf/
  3. Locate the “streamsynchronization” tag under flashmedialiveencoder_config -> mbrconfig -> streamsynchronization and set the value for “enable” to “true”. The streamsynchronization node should look similar to the following:
    <flashmedialiveencoder_config>
      <mbrconfig>
        <streamsynchronization>
          <enable>true</enable>
        </streamsynchronization>
    ...
  4. Save and close the file.

Setting up the live event

Streaming a live event involves using the “livepkgr” application that comes installed with AMS. The livepkgr application ships with a preconfigured event named “liveevent”. We’ll use this as a template for our live event.

  1. On your server navigate to the {AMS_INSTALL}/applications/livepkgr/events/_definst_ directory.
  2. We’re going to call our event “myliveevent”. Create a new directory and name it “myliveevent”.
  3. Open the newly created “myliveevent” directory and create a new XML file named “Event.xml”. This file configures the just-in-time (JIT) packaging settings for your HDS content. Add the following XML to the file. Note: You can also copy the Event.xml file from the “liveevent” directory that is set up by default; just update the EventID to match the folder name.
    <Event> 
      <EventID>myliveevent</EventID> 
      <Recording> 
        <FragmentDuration>4000</FragmentDuration> 
        <SegmentDuration>16000</SegmentDuration> 
        <DiskManagementDuration>3</DiskManagementDuration> 
      </Recording> 
    </Event>

    FragmentDuration and SegmentDuration are both in milliseconds, so the settings above produce 4-second fragments grouped into 16-second segments, and DiskManagementDuration controls how many hours of content are kept on disk. For more information about the values in the Event.xml file, you can review Adobe’s documentation (link in the resources section below).

  4. Save and close the file.
  5. Your event is now set up. You can reuse this event all you want, or create another one for a different event name.

Begin streaming

Now we can launch FMLE, point it at our livepkgr application, and begin streaming.

  1. In the left panel of FMLE, make sure the “Video” and “Audio” sections are both checked.
  2. Video
    1. In the Video section, set the format to “H.264” and then click the button with the wrench icon.
    2. In the resulting pop-up window, make sure the settings match the following:
      1. Profile: Main
      2. Level: 3.1
      3. Keyframe Frequency: 4 seconds (this matches the 4000 ms FragmentDuration in Event.xml, so each fragment can start on a keyframe)
        [Screenshot: Live HTTP Dynamic Streaming H.264 settings]
    3. Click “OK” to close the pop-up window.
    4. In the “Bit Rate” section, make sure only one of the bit rates is selected. We’re only creating a single stream for now.
      [Screenshot: Live HTTP Dynamic Streaming video encoder settings]
  3. Audio
    1. In the Audio section, set the format to “AAC”.
      [Screenshot: Live HTTP Dynamic Streaming audio encoder settings]
  4. In the right panel set “FMS URL” to point to your server and the livepkgr application:
    1. Example: rtmp://192.168.1.113/livepkgr
  5. Set the “Stream” value to be mylivestream?adbe-live-event=myliveevent
    1. “mylivestream” is the name of the stream and can be anything you’d like. The actual files that AMS creates will be stored in the livepkgr/streams/_definst_/mylivestream directory.
    2. “?adbe-live-event=myliveevent” tells the livepkgr application to use the Event.xml in the livepkgr/events/_definst_/myliveevent directory that we created.
      [Screenshot: Live HTTP Dynamic Streaming RTMP server settings]
  6. Click the “Connect” button. If all goes well, you’ll connect to your server. If not, check for typos in the “FMS URL” and “Stream” values, and verify that the server is running and reachable.
  7. Click the big green “Start” button to begin streaming.
    [Screenshot: the big green Start button]
  8. You now have a stream. Let’s see if we can get a player to play it back.
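
Before moving on to the player, you can also sanity-check the server side. AMS records the live stream under the streams directory mentioned above, and it should contain something like the following (the exact segment files vary with how long you’ve been streaming):

    livepkgr/streams/_definst_/mylivestream/
      mylivestream.bootstrap
      mylivestream.meta
      mylivestreamSeg1.f4f
      mylivestreamSeg1.f4x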

Setting up the player

Getting to the HDS content for your new stream involves requesting a URL that lets Apache (installed with AMS) know what we are looking for. The path will consist of the following parts:

  1. The protocol: http://
  2. The server location: 192.168.1.113/ (in my case, yours will be different)
  3. The Location that is configured to deliver live streams. By default these are:
    1. HDS: hds-live/
    2. HLS: hls-live/
  4. The application name: livepkgr/
  5. The instance name (we’ll use the default): _definst_/
  6. The event name: myliveevent/
  7. The stream name: mylivestream
  8. The file extension: .f4m for HDS or .m3u8 for HLS.

So if we put all of that together we’ll get a URL that looks like:

  • HDS: http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m
  • HLS: http://192.168.1.113/hls-live/livepkgr/_definst_/myliveevent/mylivestream.m3u8

Note: You may need to add port 8134 to the URL if you didn’t install AMS on port 80: http://192.168.1.113:8134/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m
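
If you’re assembling these URLs in code (say, in a player shell or a test harness), it’s just string concatenation. Here’s a small illustrative ActionScript 3 helper; the function and parameter names are mine, not part of any AMS API:

    // Illustrative helper: builds the playback URL from the parts above.
    // location is "hds-live" or "hls-live"; extension is "f4m" or "m3u8".
    function buildLiveUrl(server:String, location:String, eventName:String,
                          streamName:String, extension:String):String {
        return "http://" + server + "/" + location + "/livepkgr/_definst_/" +
               eventName + "/" + streamName + "." + extension;
    }

    trace(buildLiveUrl("192.168.1.113", "hds-live", "myliveevent", "mylivestream", "f4m"));
    // http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m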

  1. Open a browser window and navigate to that URL; you should see the F4M file’s XML content.
    [Screenshot: Live HTTP Streaming F4M XML]
  2. Open the following URL: http://www.osmf.org/configurator/fmp/#
  3. Set your F4M URL as the value for “Video Source”
  4. Select the “Yes” radio button for “Are you using HTTP Streaming or Flash Access 2.0?”
  5. Set “Autoplay Content” to “Yes”
    [Screenshot: Live HTTP Dynamic Streaming player settings]
  6. Click the Preview button at the bottom of the page.
  7. Congratulations. You are now streaming live media over HTTP.
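
If you’d rather build a player yourself instead of using the configurator, OSMF keeps it short. Here’s a minimal ActionScript 3 sketch, assuming the OSMF library is on your build path (swap in your own server address, event, and stream names):

    package {
        import flash.display.Sprite;
        import org.osmf.media.MediaPlayerSprite;
        import org.osmf.media.URLResource;

        // MediaPlayerSprite wires up the media factory, player, and
        // container for us, and autoplays by default.
        public class LiveHDSPlayer extends Sprite {
            public function LiveHDSPlayer() {
                var player:MediaPlayerSprite = new MediaPlayerSprite();
                addChild(player);
                player.resource = new URLResource(
                    "http://192.168.1.113/hds-live/livepkgr/_definst_/myliveevent/mylivestream.f4m");
            }
        }
    }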

To verify the HTTP streaming, open a tool that lets you inspect HTTP traffic (something like your browser’s Developer Tools or Firebug). You should see requests for resources like “mylivestreamSeg1-Frag52” and “mylivestream.bootstrap”. This is the player requesting HDS fragments, with Apache and AMS working together to package them just-in-time for the player.
[Screenshot: Live HTTP Dynamic Streaming HTTP traffic]

Hopefully this provides you with some good information about Live HTTP Dynamic Streaming and clarifies some of the setup and configuration details. Please, if you have any questions, let me know in the comments or contact me.

Resources

HTTP Dynamic Streaming Content Download and Playback

Recently I’ve been working on a system to play back HTTP Dynamic Streaming (HDS) content locally, from a single file. If you have seen my previous post on HTTP Dynamic Streaming (HDS), or are already familiar with it, you know that the media item is packaged in such a way that multiple segments and fragments make up an entire media item, similar to the image below:

[Image: a sample segment and its fragments]

The system involves a client, an AIR application, requesting some remote HDS content (basically an F4M file). The client downloads the fragment data for the media item and writes it to disk. Instead of writing each fragment to a separate file, the fragment data is written to a single file. This part alone is pretty straightforward. The tricky part is playing the content back.
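
To make the download-and-append part concrete, here’s a rough ActionScript 3 sketch of the idea. The fragment URL and file name are placeholders; a real client would parse the F4M to discover the fragment URLs and record each fragment’s offset and length so playback can find the bytes again later:

    import flash.events.Event;
    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;
    import flash.net.URLLoader;
    import flash.net.URLLoaderDataFormat;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;

    var target:File = File.applicationStorageDirectory.resolvePath("media.hds"); // placeholder name
    var fragmentUrl:String = "http://example.com/hds/someitemSeg1-Frag1";         // placeholder URL

    var loader:URLLoader = new URLLoader();
    loader.dataFormat = URLLoaderDataFormat.BINARY;
    loader.addEventListener(Event.COMPLETE, function(e:Event):void {
        // Append the fragment's bytes to the single local file.
        var out:FileStream = new FileStream();
        out.open(target, FileMode.APPEND);
        out.writeBytes(loader.data as ByteArray);
        out.close();
    });
    loader.load(new URLRequest(fragmentUrl));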

A few problems had to be overcome to get playback working. First, to get the local fragments to play back, I needed to fix an issue in the OSMF framework, which only accounts for requests for remote HDS fragments. This was accomplished by overriding the HTTPStreamingIndexHandler class and removing some code that assumed “http” would be part of the request. Second, and more importantly, I needed to intercept the request for an HDS fragment that OSMF generates during HDS playback, use that request to determine where the fragment’s byte data lives in the local file created when the client downloaded the content, and then return the byte data to the rest of the OSMF code, which parses it into FLV data and passes it to the appendBytes() method on the NetStream.
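
The hand-off at the end of that pipeline is standard Flash Player API: NetStream’s “data generation mode”. Here’s a stripped-down, frame-script-style sketch of those calls, where readFragmentFromLocalFile() is a hypothetical helper standing in for the interception work described above:

    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.NetStreamAppendBytesAction;
    import flash.utils.ByteArray;

    var connection:NetConnection = new NetConnection();
    connection.connect(null);

    var stream:NetStream = new NetStream(connection);
    stream.client = {};

    var video:Video = new Video(640, 360);
    addChild(video);
    video.attachNetStream(stream);

    // play(null) puts the NetStream in data generation mode: it expects
    // its media from appendBytes() instead of from a named stream.
    stream.play(null);
    stream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);

    // FLV header and tags produced by OSMF's fragment parsing.
    var flvBytes:ByteArray = readFragmentFromLocalFile(); // hypothetical helper
    stream.appendBytes(flvBytes);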

On top of that, we wanted to allow playback while the fragments were still downloading. On OS X this wasn’t a huge deal, because AIR on OS X can have multiple FileStreams open the same file. On Windows, however, the file is locked by the first FileStream that opens it. This is a problem because I want to write the downloaded fragment data to the file and read fragment data for playback at the same time. The issue was solved with a little utility library that uses a single FileStream instance and manages read and write requests by queuing them and allowing only one to run at a time.
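
Here’s a stripped-down sketch of that utility. The class and method names are mine, not the actual library’s, and where the real library queues asynchronous operations, this version simply serializes synchronous calls through one stream:

    package {
        import flash.filesystem.File;
        import flash.filesystem.FileMode;
        import flash.filesystem.FileStream;
        import flash.utils.ByteArray;

        // One FileStream opened once for both reading and writing, so
        // Windows never sees a second handle on the locked file.
        public class FileQueue {
            private var stream:FileStream = new FileStream();
            private var pending:Vector.<Function> = new Vector.<Function>();
            private var busy:Boolean = false;

            public function FileQueue(file:File) {
                stream.open(file, FileMode.UPDATE); // UPDATE allows read + write
            }

            // Queue a write of downloaded fragment bytes at a given offset.
            public function writeAt(offset:Number, bytes:ByteArray):void {
                enqueue(function():void {
                    stream.position = offset;
                    stream.writeBytes(bytes);
                });
            }

            // Queue a read of a fragment's bytes for playback.
            public function readAt(offset:Number, length:uint, onBytes:Function):void {
                enqueue(function():void {
                    var result:ByteArray = new ByteArray();
                    stream.position = offset;
                    stream.readBytes(result, 0, length);
                    onBytes(result);
                });
            }

            private function enqueue(op:Function):void {
                pending.push(op);
                drain();
            }

            // Run the queued operations in order, one at a time.
            private function drain():void {
                while (!busy && pending.length > 0) {
                    busy = true;
                    pending.shift()();
                    busy = false;
                }
            }
        }
    }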

It was a huge headache and a lot of time was spent in the OSMF rabbit hole, but I now have a great file I/O library for AIR, and I’m able to download and play back HDS content locally.

OSMF 1 Day Course: What do you want?

We are working on a 1 day Open Source Media Framework (OSMF) course at Realeyes and wanted to know what should be included in the training. Here is the initial course description:

This intense one-day, hands-on course is a serious dive into the Open Source Media Framework (OSMF). Starting with an introduction to OSMF, the course quickly picks up pace and covers everything from simple implementations, where four lines of code or less yield a powerful media player application, to the creation, understanding, and use of the more robust features for dynamic media management, from progressive playback to HTTP Dynamic Streaming. The course will also cover creating and using HTML-embeddable players, implementing brandable media player user interfaces, and advanced stream and connection management, as well as a solid exploration of the advanced plugin architecture available with OSMF.

And here is the initial outline:

  • Unit 1: Getting Started
    • Introducing OSMF
    • Using the Strobe Media Player
    • The basics of OSMF
      • Building the Simplest Player
      • Breaking down and understanding the parts of OSMF
        • MediaElement
        • MediaPlayer
        • MediaContainer
        • MediaFactory
      • Handling different types of media (Images/SWF/Video/Audio)
      • Understanding Traits
      • Utilizing Metadata
      • Making an Embeddable Media Player
        • Using FlashVars to control the player
        • Using the HTMLMediaContainer
  • Unit 2: Enhanced Media Experiences
    • Creating and implementing User Interface control with OSMF
    • Media Compositions & base level custom advertising/monetization
      • Serial
        • Poster art, playlists, pre/post/in-play injection
      • Parallel
        • Overlay/PiP, outside synchronized advertising
          • Basics of Layout controls
  • Unit 3: Extensibility
    • Advanced low level control techniques
      • Advanced NetConnection Management & Control
      • Advanced NetStream Management & Control
      • Advanced Buffer Management
    • Plugins
      • Understanding the different types:
        • Static and Dynamic
          • Reference and Proxy
      • CDN Plugins
        • Implementing a CDN plugin
          • Akamai, Limelight
      • Analytics Plugins
        • Building a simple plugin with JS integration/reporting
        • Implementing the GTrack Plugin
      • Advertising Plugins
        • VAST/MAST – pre-roll and overlay

Do you see anything missing?

Something you’d like to have in the 1 day course?

Would you be interested in sitting in on a test run of the course in the future?

Let me know.