Roku SDK Documentation : Media Playback Markup

Finally, let's see how to play streaming media files in a Roku SceneGraph application. To do that, we add one (or both) of the following media playback nodes to the application:

  • Audio

  • Video

The two media playback nodes have very similar uses and markup.

Audio Markup

Example Application: AudioExample.zip
Node Class Reference: Audio

To play audio, create an Audio node, create a ContentNode node whose url Content Meta-Data attribute is set to the URL of an MP3 audio file, and assign the ContentNode node to the Audio node content field. Then set the Audio node control field to play, and listen to the sound of non-silence, as shown in the audioscene.xml component file in AudioExample.zip:

Audio Markup Example
<component name = "AudioExample" extends = "Scene" >

  <script type = "text/brightscript" >

    <![CDATA[

    sub init()
      m.top.backgroundURI = "pkg:/images/rsgde_bg_hd.jpg"

      audio = createObject("roSGNode", "Audio")

      audiocontent = createObject("RoSGNode", "ContentNode")
      audiocontent.url = "http://www.sdktestinglab.com/Tutorial/sounds/audionode.mp3"

      audio.content = audiocontent

      m.top.appendChild(audio)

      audio.control = "play"

      m.top.setFocus(true)
    end sub

    ]]>

  </script>

</component>

For this example, we've used a very simple ContentNode node, and for some Audio node use cases, that may be exactly what you want. You might only want to play a single sound effect over a scene in your application, with no additional visual element associated with the sound, so something like this example is all you need. For this reason, the Audio node was designed without any built-in visual user interface elements. This example is so simple that the audio plays to completion unless the Back key is pressed to exit the application; if the audio does play to completion, pressing the Back key is the only way to exit.

But if you want to play a number of sounds, and show some visual elements that are associated with the sounds, you'll need a ContentNode node with several more Content Meta-Data attributes...and you'll have to build your own custom visual user interface elements in markup.
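
For instance, a ContentNode node for a list of sounds can be built in BrightScript with one child ContentNode node per audio item, each carrying the playback attributes plus any descriptive attributes the user interface needs. A minimal sketch (the description value here is just an illustrative placeholder):

' build a parent ContentNode node with one child ContentNode node per audio item
audiolistcontent = createObject("roSGNode", "ContentNode")

song = audiolistcontent.createChild("ContentNode")
song.title = "Main Menu"
song.streamformat = "mp3"
song.url = "http://www.sdktestinglab.com/Tutorial/sounds/mainmenu.mp3"
song.description = "Looping menu background music"   ' placeholder descriptive attribute

' ...additional children are created the same way; the parent node can then be
' assigned to the content field of a list or grid node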

Audio List Markup

Example Application: AudioListExample.zip
Node Class References: Audio, LabelList

Some applications that use an Audio node require a user interface that lets the user select from a set of audio items, along with visual indicators of the currently-selected audio item.

For these applications, the ContentNode node that is assigned to the Audio node content field can contain as many Content Meta-Data descriptive attributes as needed to meet the user interface requirements of the application.

For this Audio node example, we'll just allow a user to select an audio item from a LabelList node list. We could use any of the list and grid nodes described in Creating Lists and Grids, but for this example we'll use a simple list. We could also set up a more elaborate user interface that shows more descriptive text and images for each audio item as the user scrolls down the list. For example, we could show information about each song in a list of songs, such as the length, musicians, songwriters, and so forth. For ideas on how to implement that type of interface, see Video List Markup below.

But we will add some additional control functions for the audio playback, compared to the very simple Audio Markup above, to allow users to stop an audio item during playback, and to reset the Audio node after an audio item has completed playback. From audiolistscene.xml in AudioListExample.zip, note the playaudio() and controlaudioplay() callback functions, triggered by the LabelList node itemSelected and Audio node state fields respectively:

sub init()
  ...
  m.audiolist.observeField("itemSelected", "playaudio")
  ...
  m.audio.observeField("state", "controlaudioplay")
  ...
end sub

...

sub playaudio()
  m.audio.control = "stop"
  m.audio.control = "none"
  m.audio.control = "play"
end sub

sub controlaudioplay()
  if (m.audio.state = "finished")
    m.audio.control = "stop"
    m.audio.control = "none"
  end if
end sub

Remember Observed Fields Markup? When the user selects an audio item from the list, the playaudio() function is called. When the audio item has a playback state change, the controlaudioplay() function is called to reset the Audio node if playback has completed. We also need a remote key event handler so the user can press the Back key to stop audio item playback (review Key Events Markup if needed):

function onKeyEvent(key as String, press as Boolean) as Boolean
  if press then
    if key = "back"
      if (m.audio.state = "playing")
        m.audio.control = "stop"
        return true
      end if
    end if
  end if

  return false
end function

The remainder of the audiolistscene.xml file is the <children> element, containing the XML Group node layout of the scene visual elements: the LabelList and Poster nodes.
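
The exact values in the example file may differ, but the structure of that <children> element is along these lines (the id strings, image URI, and layout numbers here are placeholders):

<children >

  <Poster
    id = "audioposter"
    uri = "pkg:/images/audiolist.jpg"
    translation = "[ 800, 200 ]" />

  <LabelList
    id = "audiolist"
    translation = "[ 160, 200 ]"
    itemSize = "[ 440, 72 ]" />

</children>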

You can see how the content meta-data for the LabelList node is used to set the audio item strings in the list using the Content Meta-Data title attribute, as well as to set the streamformat and url Content Meta-Data attributes for audio playback:

http://www.sdktestinglab.com/Tutorial/content/audiocontent.xml

<Content >
  <item
    title = "Main Menu"
    streamformat = "mp3"
    url = "http://www.sdktestinglab.com/Tutorial/sounds/mainmenu.mp3" />
  ...
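
The example application downloads this XML file and converts it into a ContentNode node tree using a Task node (this is the Task node Content Meta-Data URL transfer mentioned in Video List Markup below). The component in the example ZIP may differ in detail; a minimal sketch of that kind of Task node, with hypothetical component and field names, could look like this:

<component name = "ContentReader" extends = "Task" >

  <interface>
    <field id = "contenturi" type = "uri" />
    <field id = "content" type = "node" />
  </interface>

  <script type = "text/brightscript" >

    <![CDATA[

    sub init()
      m.top.functionName = "getcontent"
    end sub

    sub getcontent()
      ' download the Content Meta-Data XML file from the server
      contentxfer = createObject("roUrlTransfer")
      contentxfer.setUrl(m.top.contenturi)
      xmlstring = contentxfer.getToString()

      ' parse each <item> element into a child ContentNode node
      listcontent = createObject("roSGNode", "ContentNode")
      xmlcontent = createObject("roXMLElement")
      if xmlcontent.parse(xmlstring)
        for each item in xmlcontent.getChildElements()
          itemcontent = listcontent.createChild("ContentNode")
          itemcontent.setFields(item.getAttributes())
        end for
      end if

      m.top.content = listcontent
    end sub

    ]]>

  </script>

</component>

In the scene, you would create this Task node, set its contenturi field to the URL of the content XML file, observe its content field, and set its control field to RUN; when the content field is populated, the result can be assigned to the LabelList node content field.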

Video Markup

Example Application: VideoExample.zip
Node Class Reference: Video

It's time to actually play a video. The Video node, like the Audio node, requires a ContentNode node to be assigned as the value of the content field. The ContentNode node must contain all of the required Content Meta-Data attributes for playback of a video, but can also contain additional descriptive Content Meta-Data attributes, some of which are used in the Video node user interface.

The videoscene.xml component file in VideoExample.zip shows close to the minimum amount of markup required to play a video:

Video Markup Example
<component name = "VideoExample" extends = "Scene" >

  <script type = "text/brightscript" >

    <![CDATA[

    sub init()
      videocontent = createObject("RoSGNode", "ContentNode")

      videocontent.title = "Example Video"
      videocontent.streamformat = "mp4"
      videocontent.url = "http://roku.cpl.delvenetworks.com/.../rr_123_segment_1_072715.mp4"

      video = m.top.findNode("exampleVideo")
      video.content = videocontent

      video.setFocus(true)
      video.control = "play"
    end sub

    ]]>

  </script>

  <children >

    <Video id = "exampleVideo"/>

  </children>

</component>

First, we create a ContentNode node that contains the Content Meta-Data attributes for the video. Of the three attributes added to the ContentNode node, only one is actually required: url, the location of the video file. The streamformat attribute is often not required, because the Video node can determine the streaming file format from the file suffix (mp4 in this case), but you should include it in your server video content meta-data files to avoid any possible problems. The title attribute is only used in parts of the Video node user interface, and the video will play without it.

Once the required and optional playback attributes are set in a ContentNode node, we assign the ContentNode node to the content field of the Video node that was created in the <children> element of the <component>. After that, we activate the Video node user interface by setting focus on the Video node, and set the control field to play to start playback. In this simple example, the video plays to completion unless the Back key is pressed to exit the application; if it does play to completion, pressing the Back key is the only way to exit.
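
If you want the application to react when playback completes instead, you can observe the Video node state field, just as the audio list example above observes the Audio node state field. A minimal sketch of that variation (the callback name is arbitrary):

sub init()
  ' ...same set-up as above, but keep a reference to the Video node in m...
  m.video = m.top.findNode("exampleVideo")
  m.video.observeField("state", "onvideostate")
  m.video.setFocus(true)
  m.video.control = "play"
end sub

sub onvideostate()
  ' reset the Video node once playback has completed
  if (m.video.state = "finished")
    m.video.control = "stop"
  end if
end sub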

Video List Markup

Example Application: VideoListExample.zip
Node Class References: Video, LabelList

Finally, let's look at how you can create a dynamic screen that allows a user to browse through a list of video items, look at some details about each video item, then select a particular video for playback. Yes, this will show you the methods used to implement video-on-demand in a Roku application.

We've already seen how to select audio from a list of sounds in Audio List Markup, and we'll use the same basic techniques to integrate the Video node with a LabelList node (which could be any of the list and grid nodes described in Creating Lists and Grids). But for this example, we'll use several more Content Meta-Data descriptive attributes in the ContentNode node for the screen. This allows the application to display the text and images set in those attributes as the user scrolls down the list of video items.

You will note that the videolistscene.xml file in VideoListExample.zip is set up in a very similar manner to the audiolistscene.xml file of AudioListExample.zip (see Audio List Markup above). This is because the Audio and Video nodes are very similar, and both examples use a LabelList node to select the media playback items. So the Task node Content Meta-Data URL transfer, list selection, and media playback control callback functions are mostly the same. Both examples use a similar <children> element, containing the XML Group node layout of the scene visual elements.
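
For example, the list selection callback in videolistscene.xml follows the same pattern as playaudio() above, with the Video node also taking focus so that its built-in playback user interface receives remote control events. A sketch of that pattern (the function name here may not match the example file exactly):

sub playvideo()
  m.video.setFocus(true)
  m.video.control = "stop"
  m.video.control = "none"
  m.video.control = "play"
end sub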

But the user interface is enhanced by the addition of two lines to the setvideo() callback function (equivalent to setaudio() in audiolistscene.xml), triggered by the LabelList node itemFocused field:

sub init()
  ...
  m.videolist.observeField("itemFocused", "setvideo")
  ...
end sub

...

sub setvideo()
  videocontent = m.videolist.content.getChild(m.videolist.itemFocused)
  m.videoposter.uri = videocontent.hdposterurl
  m.videoinfo.text = videocontent.description
  m.video.content = videocontent
end sub

This is where we assign the hdposterurl and description Content Meta-Data attributes to the m.videoposter Poster node uri field and the m.videoinfo Label node text field, respectively. As the user scrolls down the list of videos, each change of the LabelList node itemFocused field triggers a setvideo() callback. The index number in the itemFocused field is used to get the ContentNode node child containing the Content Meta-Data for the currently-focused video item, so all of that item's Content Meta-Data can be assigned to various node fields in the application. As each video item is focused, these node field values are reassigned with the Content Meta-Data for that item.
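
In the <children> element of videolistscene.xml, this means the LabelList node is joined by a Poster node, a Label node, and the Video node itself. The id strings and layout values below are placeholders, but the structure is similar:

<children >

  <LabelList
    id = "videolist"
    translation = "[ 160, 200 ]"
    itemSize = "[ 440, 72 ]" />

  <Poster
    id = "videoposter"
    translation = "[ 700, 200 ]"
    width = "540"
    height = "304" />

  <Label
    id = "videoinfo"
    translation = "[ 700, 540 ]"
    width = "540"
    wrap = "true" />

  <Video
    id = "videoplayer"
    width = "1280"
    height = "720" />

</children>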

You can see how the content meta-data for the LabelList node is used to set the video item strings in the list using the Content Meta-Data title attribute, as well as setting the other descriptive and media playback Content Meta-Data attributes:

http://www.sdktestinglab.com/Tutorial/content/videocontent.xml

<Content >
  <item
    title = "Puppies and Kittens"
    description = "So Cute!"
    hdposterurl = "http://s2.content.video.llnw.net/lovs/images-prod/.../ZLh.540x304.jpeg"
    streamformat  =  "mp4"
    url = "http://roku.cpl.delvenetworks.com/media/.../rr_123_segment_1_072715.mp4" />
  ...

 

Attachments:

AudioExample.zip (application/zip)
AudioListExample.zip (application/zip)
VideoExample.zip (application/zip)
VideoListExample.zip (application/zip)