Video Playback App on Android* Whitepaper

Downloads

VideoPlayback.zip [4 MB]

Basic Principles

Before describing the sample app programming, it helps to understand the basics of video processing and video content.

Video codecs and containers

Video has a high data rate during playback, so it must be processed before being transferred over bandwidth-limited transport. There are two layers of processing: encoding and packaging. First the video and audio must be compressed with a compression algorithm, and then they must be packaged into a container so that the video and audio streams are multiplexed together.

There are many kinds of video compression algorithms defined by standards; we normally call them codecs. The most commonly used video codecs are H.264, MPEG-4, and VC-1 (WMV).

The most commonly used video containers are MP4, QuickTime, ASF, 3GP, and RM.

A codec is not tied to a single container. For example, H.264, a commonly used codec, can be packaged in MP4, QuickTime, 3GP, and other containers.

Video processing

Because video content is delivered in a container, playing it on a device requires the following steps:


Figure 1: Video playback process pipeline


The de-multiplex step splits the data in the video container into the video and audio streams. Each stream is then decompressed and reconstructed into the original pictures by its decoder, and finally drawn to the display screen by the renderer.
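
To make the de-multiplex step concrete, here is a minimal sketch using the MediaExtractor class (a public API from later Android releases, shown only as an illustration; the file path is a placeholder):

    import java.io.IOException;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.util.Log;

    // Sketch: enumerate the elementary streams multiplexed inside a container.
    private void listTracks(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);                  // e.g. "/sdcard/sample.mp4"
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            // Typical results: "video/avc" (H.264) and "audio/mp4a-latm" (AAC)
            Log.d("Demux", "track " + i + ": " + mime);
        }
        extractor.release();
    }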

Video programming with Android* application framework

Because the Android application framework is designed to parse the media file and set up the processing pipeline automatically, the programmer does not have to deal with the details of the video content and its processing. The framework has a unified interface that accepts media content in different codecs.

Although the media codec used in this sample app is H.264, the same code applies to all the codecs and containers that Android supports, for example MPEG-4. See the Android supported media formats page listed in the References for the full list.
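
As a quick illustration of that unified interface, the sketch below points the same setup code at different content; the "mpeg4" resource name is hypothetical:

    // Sketch: the player setup is identical for any supported codec/container;
    // only the raw resource changes ("mpeg4" is a hypothetical resource name).
    private void setContent(MediaPlayer player, int rawResId) throws IOException {
        Uri uri = Uri.parse("android.resource://" + getPackageName() + "/" + rawResId);
        player.setDataSource(this, uri);   // e.g. R.raw.h264 or R.raw.mpeg4
    }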

The video playback sample code

We show an example of H.264 playback from our sample app suite and briefly describe how it is organized.

    import android.app.Activity;
    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import android.media.MediaPlayer.OnPreparedListener;
    import android.net.Uri;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.widget.ImageButton;

    public class VideoPlayback extends Activity implements OnPreparedListener {
        private static final String TAG = "VideoPlayback";

        private SurfaceView mPreview;
        private SurfaceHolder holder;
        private ImageButton mPlay;
        private ImageButton mPause;
        private ImageButton mStop;
        private MediaPlayer mMediaPlayer;
        private Uri videoUri;
        private boolean videoStopped = true;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.media);
            mPreview = (SurfaceView) findViewById(R.id.surface);
            holder = mPreview.getHolder();
            // Set up the play/pause/stop buttons
            mPlay = (ImageButton) findViewById(R.id.play);
            mStop = (ImageButton) findViewById(R.id.stop);
            mPause = (ImageButton) findViewById(R.id.pause);

            mPlay.setOnClickListener(new View.OnClickListener() {
                public void onClick(View view) {
                    // Create and prepare the player on first use, or after a stop
                    if (mMediaPlayer == null) {
                        prepareVideo();
                    }
                    if (mMediaPlayer != null && !mMediaPlayer.isPlaying()) {
                        mMediaPlayer.start();
                        videoStopped = false;
                    }
                }
            });

            mPause.setOnClickListener(new View.OnClickListener() {
                public void onClick(View view) {
                    if (mMediaPlayer != null) {
                        mMediaPlayer.pause();
                    }
                }
            });

            mStop.setOnClickListener(new View.OnClickListener() {
                public void onClick(View view) {
                    if (mMediaPlayer != null) {
                        stopVideo();
                    }
                }
            });
        }

        private void stopVideo() {
            videoStopped = true;
            if (mMediaPlayer != null) {
                mMediaPlayer.stop();
                mMediaPlayer.release();   // "End" state: the object cannot be reused
                mMediaPlayer = null;      // a new player is created on the next play
            }
        }

        private void prepareVideo() {
            // Create a new media player and set the listeners
            try {
                videoUri = Uri.parse("android.resource://"
                        + getPackageName() + "/" + R.raw.h264);
                mMediaPlayer = new MediaPlayer();
                mMediaPlayer.setDataSource(this, videoUri);   // "Initialized" state
                mMediaPlayer.setDisplay(holder);              // render target
                mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
                mMediaPlayer.prepare();                       // blocks until "Prepared"
                mMediaPlayer.setOnPreparedListener(this);
            } catch (Exception e) {
                Log.e(TAG, "error: " + e.getMessage(), e);
                finish();
            }
            videoStopped = false;
        }

        @Override
        public void onPrepared(MediaPlayer mp) {
            // Called once the player reaches the "Prepared" state
            mMediaPlayer.start();
            videoStopped = false;
        }
    }

The VideoPlayback class extends the Activity class, which creates a UI thread and a graphic presentation for the user in order to render the application content and handle user events. It also implements the OnPreparedListener interface to handle the preparation event of the MediaPlayer object.

The video content is assigned by a URI object, videoUri. In the example code this is generated by parsing the resource path of a media file labelled “R.raw.h264”, which is a video file encoded with H.264. Although the example uses H.264 content, the file can be replaced with content encoded by any other supported codec.

For user control, this class creates three buttons (play, stop, and pause) and registers an onClick() handler for each of them.

When the user clicks the play button, onClick() finds that the MediaPlayer object is null and calls prepareVideo(), which performs the following actions to initialize and configure the player for the next operations:

  • Get the URI path to the video content. In this example, the video content is an MP4 file with an H.264-encoded video stream. The source could also be a user-selected video file or an RTSP video stream;
  • Call the MediaPlayer constructor to create the object; the MediaPlayer object is in the “Idle” state when created;
  • Set the data source to the URI from the previous step; the MediaPlayer class has several setDataSource() overloads that accept content types other than a URI. After this call, the MediaPlayer object is in the “Initialized” state;
  • Call setDisplay() to tell the object where to render the video; MediaPlayer accepts either a Surface or a SurfaceHolder object;
  • Call setAudioStreamType() to set the audio stream type. According to the function reference, this call must be made before prepare();
  • With the data source and surface set, prepare() can be called to create the data processing path from de-multiplexing through decoding to rendering. This call blocks until all of its operations have finished, and it changes the MediaPlayer state to “Prepared”. In this state the MediaPlayer object can be played, paused, stopped, and so on;
  • Finally, setOnPreparedListener() is called to register VideoPlayback as the callback object. prepare() only configures the data processing path; the onPrepared() callback signals that the data source is actually ready, and playback is started there.

Exception handling is included around these operations so that the app can deal gracefully with an error anywhere in this complicated process.

Playback starts after start() is invoked on the MediaPlayer object inside the onPrepared() callback. The MediaPlayer object is now in the “Started” state and can be paused and stopped.

When the user clicks the pause button, its onClick() simply pauses the MediaPlayer object. Since this only pauses the operation, the MediaPlayer moves into the “Paused” state and keeps all of its resources. To resume playback, simply call start() again.

When the user clicks the stop button, onClick() invokes the stopVideo() function, which does the following:

  • Call stop() on the MediaPlayer to stop playback. This puts the object in the “Stopped” state; once stopped, the MediaPlayer must be prepared again before it can play;
  • Call release(). This function frees all the resources held by the object, including the data content, the data path, and the hardware and software components involved in the current playback. The object is now in the “End” state and can no longer be reused; a new MediaPlayer must be created for the next playback;
  • Set the object reference to null so that the current object can be destroyed by the garbage collector. Android’s garbage collector behaves like Java’s: it destroys unreferenced objects automatically.


The user interface design of the video playback app

    <FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent">
        <SurfaceView android:id="@+id/surface"
            android:layout_width="fill_parent"
            android:layout_height="fill_parent"
            android:layout_gravity="center" />
        <LinearLayout
            android:orientation="horizontal"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content"
            android:layout_gravity="bottom|center"
            android:padding="10dip">
            <ImageButton android:id="@+id/play"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:padding="10dip"
                android:src="@drawable/play" />
            <ImageButton android:id="@+id/pause"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:padding="10dip"
                android:src="@drawable/pause" />
            <ImageButton android:id="@+id/stop"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:padding="10dip"
                android:src="@drawable/stop" />
        </LinearLayout>
    </FrameLayout>

Looking at the app’s layout file, we can see that it defines one SurfaceView and a group of three buttons inside a FrameLayout; the buttons are grouped in a LinearLayout. This design sets the relative positions of the graphic components, which are then retrieved in the app through findViewById() calls.

The media classes in the Android application framework

By default, Android supplies two classes that handle video directly: MediaPlayer and VideoView.

MediaPlayer

MediaPlayer is a controller class with many low-level function calls that directly invoke functions in the native media framework library; it handles not only video but audio content as well.

MediaPlayer has several internal states, which change as different operations are performed. It is easy for developers to make mistakes by invoking an operation while the MediaPlayer is in the wrong state. Based on the sample code above, the MediaPlayer object goes through the following state changes:


Figure 2: MediaPlayer internal state changes

The state changes shown here are only part of the picture. For the complete diagram, please refer to the MediaPlayer page in the Android reference.
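
One state in the complete diagram deserves particular care: the “Error” state, which the player enters when an operation fails. Below is a minimal defensive sketch (an illustration, not part of the sample app) that logs the raw error codes and recovers; it could be registered in prepareVideo() right after the MediaPlayer is created:

    // Sketch: handle the "Error" state instead of crashing.
    mMediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
        public boolean onError(MediaPlayer mp, int what, int extra) {
            Log.e(TAG, "MediaPlayer error: what=" + what + " extra=" + extra);
            mp.reset();     // reset() moves the player from "Error" back to "Idle"
            return true;    // true means the error was handled here
        }
    });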

VideoView

VideoView is an Android widget. It is a composite class that inherits from SurfaceView and uses a MediaPlayer as an internal object. Looking at the VideoView sample code on the Android developer site, it is much simpler to use than MediaPlayer.

Since VideoView is a subclass of SurfaceView, we can reuse the layout file from the MediaPlayer sample by swapping the SurfaceView for a VideoView and removing the three-button group, then create the VideoView object with findViewById(). The playback controls do not need to be hand-built: attaching a MediaController provides standard play, pause, and seek controls, as sketched below.
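
Here is a minimal sketch of that VideoView path; it assumes a layout that declares a VideoView with the hypothetical id “videoview” and reuses the same raw H.264 resource as the sample:

    // Sketch: VideoView playback, e.g. in onCreate() after setContentView().
    // The widget owns its MediaPlayer internally; MediaController supplies
    // the stock play/pause/seek controls.
    VideoView videoView = (VideoView) findViewById(R.id.videoview);
    videoView.setMediaController(new MediaController(this));
    videoView.setVideoURI(Uri.parse("android.resource://"
            + getPackageName() + "/" + R.raw.h264));
    videoView.start();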

Video processing in MediaPlayer

Inside the media framework, the MediaPlayer does the following steps to process the video content:

  • Identify the codec by parsing (de-multiplexing) the video container when setDataSource() is called;
  • Configure the software stack to create a data path that parses the container, decodes the streams, renders into the frame buffer, and displays on the screen;
  • Manage the data path dynamically during playback.

The video playback work flow in the media framework

Based on the above description, below is the detailed work done inside the MediaPlayer object:

  • setDataSource(): This call passes through several layers and finally reaches the method AwesomePlayer::setDataSource() in the native media framework library. A MediaExtractor object is created to de-multiplex the data content and check whether the video content is valid. It also creates the video and audio tracks and saves the original video resolution;
  • prepare(): This call eventually invokes AwesomePlayer::prepare(). This is the point where the native media framework changes the hardware, driver-layer, and system states to configure the whole data path for playback. The main objects involved are the video decoder, the audio decoder, the A/V sync logic, and the graphics driver;
  • The Android media framework has two preparation interfaces, prepare() and prepareAsync(). They perform the same preparation work, except that prepareAsync() does not wait for completion; the application must instead be designed to wait for the completion event from the preparation process before performing further operations, as sketched below.
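
A sketch of the asynchronous variant follows; this is an illustration, not code from the sample app. The OnPreparedListener is registered before prepareAsync() is called, and playback is started only inside the callback (the fields are the same as in the sample above):

    // Sketch: non-blocking preparation; this would replace the prepare() /
    // setOnPreparedListener() pair inside prepareVideo()'s try block.
    mMediaPlayer = new MediaPlayer();
    mMediaPlayer.setDataSource(this, videoUri);
    mMediaPlayer.setDisplay(holder);
    mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        public void onPrepared(MediaPlayer mp) {
            mp.start();            // the player is in the "Prepared" state here
        }
    });
    mMediaPlayer.prepareAsync();   // returns immediately; onPrepared() fires when ready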

Conclusion

From the video playback example, we analyzed video processing in the Android media framework. As the center of media processing, MediaPlayer represents the media engine at the application layer. The media engine is a controller object for media processing; its goal is to configure the media data path into an automated real-time process. Taking playback as an example, the data path runs from de-multiplexing through decoding to rendering, for both the video and the audio sinks. This process reflects the general steps of media processing and can be applied to other platforms.

References

Android Developer Site: http://developer.android.com/index.html

Android Reference Page: http://developer.android.com/reference/packages.html

Android VideoView sample: http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/media/VideoViewDemo.html

Android supported media format: http://developer.android.com/guide/appendix/media-formats.html

About The Author

Mark Liu worked in UMG as a software engineer developing a validation framework for Android-based devices. He also worked on several Android projects in UMG, covering both smartphone and tablet devices; much of the work in these projects related to media playback, video conferencing, and software stack performance tuning.

After joining the SSG Atom device software enabling team, he worked on several horizontal efforts, including development of the Android sample media app, system optimization of the Windows 8 media framework, and documentation of media capabilities. He is the media lead of the horizontal support effort.