Intel® INDE Media for Mobile Tutorials - Advanced Video Capturing for Unity3d* Applications on Android*

At the end of the previous tutorial we noted an issue with the Unity* GUI layer. But what if you already have a complicated Unity game that makes heavy use of the GUI? What then? Read this tutorial. It covers a more advanced use of Intel® INDE Media for Mobile. Moreover, this time we can use the free version of Unity. How? We will explore an approach that does not rely on fullscreen image post-processing effects.

Prerequisites:

First of all, integrate Intel® INDE Media for Mobile into your game as described in the first tutorial. We won't walk through that process again; instead, we will focus on the changes.

Open the Capturing.java file. The class now has to look as follows:

package com.intel.inde.mp.samples.unity;

import com.intel.inde.mp.IProgressListener;
import com.intel.inde.mp.domain.Resolution;
import com.intel.inde.mp.android.graphics.FullFrameTexture;
import com.intel.inde.mp.android.graphics.FrameBuffer;
import com.intel.inde.mp.android.graphics.EglUtil;

import android.os.Environment;
import android.content.Context;

import java.io.IOException;
import java.io.File;

public class Capturing
{
    private static FullFrameTexture texture;
    private FrameBuffer frameBuffer;
    private VideoCapture videoCapture;

    private IProgressListener progressListener = new IProgressListener() {
        @Override
        public void onMediaStart() {
        }

        @Override
        public void onMediaProgress(float progress) {
        }

        @Override
        public void onMediaDone() {
        }

        @Override
        public void onMediaPause() {
        }

        @Override
        public void onMediaStop() {
        }

        @Override
        public void onError(Exception exception) {
        }
    };

    public Capturing(Context context, int width, int height)
    {
        videoCapture = new VideoCapture(context, progressListener);

        frameBuffer = new FrameBuffer(EglUtil.getInstance());
        frameBuffer.setResolution(new Resolution(width, height));

        texture = new FullFrameTexture();
    }

    public static String getDirectoryDCIM()
    {
        return Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator;
    }

    public void initCapturing(int width, int height, int frameRate, int bitRate)
    {
        VideoCapture.init(width, height, frameRate, bitRate);
    }

    public void startCapturing(String videoPath)
    {
        if (videoCapture == null) {
            return;
        }
        synchronized (videoCapture) {
            try {
                videoCapture.start(videoPath);
            } catch (IOException e) {
                // If the output file can't be opened, capturing simply never starts
            }
        }
    }

    public void beginCaptureFrame()
    {
        // Redirect all further rendering into the off-screen frame buffer
        frameBuffer.bind();
    }

    public void captureFrame(int textureID)
    {
        if (videoCapture == null) {
            return;
        }
        synchronized (videoCapture) {
            videoCapture.beginCaptureFrame();
            texture.draw(textureID);
            videoCapture.endCaptureFrame();
        }
    }

    public void endCaptureFrame()
    {
        // Switch back to the on-screen frame buffer, encode the captured
        // texture, then draw the same texture to the screen
        frameBuffer.unbind();
        int textureID = frameBuffer.getTextureId();
        captureFrame(textureID);
        texture.draw(textureID);
    }

    public void stopCapturing()
    {
        if (videoCapture == null) {
            return;
        }
        synchronized (videoCapture) {
            if (videoCapture.isStarted()) {
                videoCapture.stop();
            }
        }
    }

}

As you can see, there are some changes. The main one is the frameBuffer member. The constructor now accepts width and height parameters to create a properly sized FrameBuffer. There are three new public methods: beginCaptureFrame(), captureFrame(), and endCaptureFrame(). Their purpose will become clear later, on the C# side.
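The call order of these methods is what makes the approach work: the frame is first rendered off-screen, and only afterwards is it both encoded and redrawn to the display. A minimal sketch of that sequence, using hypothetical string stand-ins for the real FrameBuffer, VideoCapture, and FullFrameTexture objects:

```java
import java.util.ArrayList;
import java.util.List;

// Records the order of operations performed by beginCaptureFrame() and
// endCaptureFrame(); the GL objects are replaced by descriptive strings here.
public class CaptureFlow {
    public final List<String> calls = new ArrayList<String>();

    public void beginCaptureFrame() {
        calls.add("frameBuffer.bind");         // scene now renders off-screen
    }

    public void endCaptureFrame() {
        calls.add("frameBuffer.unbind");       // back to the on-screen buffer
        calls.add("videoCapture.encodeFrame"); // off-screen texture goes to the encoder
        calls.add("texture.drawToScreen");     // the same texture is blitted to the display
    }
}
```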

Leave the VideoCapture.java file unchanged. Note the package name: keep it the same as the Bundle Identifier in Unity's Player Settings. Don't forget about the manifest file; set all the necessary permissions and features.
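The tutorial does not list the exact permissions; at minimum, writing the recorded video into the DCIM folder requires external-storage write access, and audio capture (if you enable it) needs the microphone. A sketch of the likely AndroidManifest.xml entries:

```xml
<!-- Minimal permissions sketch; adjust to the features your capture actually uses -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Only needed if you also record audio -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```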

Now we have our AndroidManifest.xml and our Java* files under /Plugins/Android. Create an Apache* Ant* script and build everything with it; for more details, see the previous tutorial. Note the new Capturing.jar file in the directory.
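The previous tutorial covers the Ant script in detail; as a reminder, a minimal hypothetical build.xml compiling the Java sources against the Android and Unity class libraries and packaging them into Capturing.jar might look like this (the classpath locations are placeholders):

```xml
<project name="Capturing" default="jar">
  <!-- Placeholder paths; point them at your Android SDK and Unity installation -->
  <property name="android.jar" value="path/to/android.jar"/>
  <property name="unity.classes" value="path/to/classes.jar"/>

  <target name="compile">
    <mkdir dir="classes"/>
    <javac srcdir="." destdir="classes" classpath="${android.jar};${unity.classes}"/>
  </target>

  <target name="jar" depends="compile">
    <jar destfile="Capturing.jar" basedir="classes"/>
  </target>
</project>
```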

Switch to Unity. Open Capture.cs and replace its content with the following code:

using UnityEngine;
using System.Collections;
using System.IO;
using System;

[RequireComponent(typeof(Camera))]
public class Capture : MonoBehaviour
{
	public int videoWidth = 720;
	public int videoHeight = 1094;
	public int videoFrameRate = 30;
	public int videoBitRate = 3000;

	private string videoDir;
	public string fileName = "game_capturing-";
	
	private float nextCapture = 0.0f;
	public bool inProgress { get; private set; }
	private bool finalizeFrame = false;

	private AndroidJavaObject playerActivityContext = null;
	
	private static IntPtr constructorMethodID = IntPtr.Zero;
	private static IntPtr initCapturingMethodID = IntPtr.Zero;
	private static IntPtr startCapturingMethodID = IntPtr.Zero;
	private static IntPtr beginCaptureFrameMethodID = IntPtr.Zero;
	private static IntPtr endCaptureFrameMethodID = IntPtr.Zero;
	private static IntPtr stopCapturingMethodID = IntPtr.Zero;

	private static IntPtr getDirectoryDCIMMethodID = IntPtr.Zero;

	private IntPtr capturingObject = IntPtr.Zero;

	void Start()
	{
		if (!Application.isEditor) {
			// First, obtain the current activity context
			using (AndroidJavaClass jc = new AndroidJavaClass("com.unity3d.player.UnityPlayer")) {
				playerActivityContext = jc.GetStatic<AndroidJavaObject>("currentActivity");
			}

			// Search for our class; the path must match the package declared in Capturing.java
			IntPtr classID = AndroidJNI.FindClass("com/intel/inde/mp/samples/unity/Capturing");

			// Search for its constructor
			constructorMethodID = AndroidJNI.GetMethodID(classID, "<init>", "(Landroid/content/Context;II)V");

			// Register our methods
			initCapturingMethodID = AndroidJNI.GetMethodID(classID, "initCapturing", "(IIII)V");
			startCapturingMethodID = AndroidJNI.GetMethodID(classID, "startCapturing", "(Ljava/lang/String;)V");
			beginCaptureFrameMethodID = AndroidJNI.GetMethodID(classID, "beginCaptureFrame", "()V");
			endCaptureFrameMethodID = AndroidJNI.GetMethodID(classID, "endCaptureFrame", "()V");
			stopCapturingMethodID = AndroidJNI.GetMethodID(classID, "stopCapturing", "()V");

			// Register and call our static method
			getDirectoryDCIMMethodID = AndroidJNI.GetStaticMethodID(classID, "getDirectoryDCIM", "()Ljava/lang/String;");
			jvalue[] args = new jvalue[0];
			videoDir = AndroidJNI.CallStaticStringMethod(classID, getDirectoryDCIMMethodID, args);

			// Create Capturing object
			jvalue[] constructorParameters = AndroidJNIHelper.CreateJNIArgArray(new object [] { playerActivityContext, Screen.width, Screen.height });
			IntPtr local_capturingObject = AndroidJNI.NewObject(classID, constructorMethodID, constructorParameters);
			if (local_capturingObject == IntPtr.Zero) {
				Debug.LogError("Can't create Capturing object");
				return;
			}

			// Keep a global reference to it
			capturingObject = AndroidJNI.NewGlobalRef(local_capturingObject);
			AndroidJNI.DeleteLocalRef(local_capturingObject);

			AndroidJNI.DeleteLocalRef(classID);
		}
		inProgress = false;
		nextCapture = Time.time;
	}

	void OnPreRender()
	{
		if (inProgress && Time.time > nextCapture) {
			finalizeFrame = true;
			nextCapture += 1.0f / videoFrameRate;
			BeginCaptureFrame();
		}
	}

	public IEnumerator OnPostRender()
	{
		if (finalizeFrame) {
			finalizeFrame = false;
			yield return new WaitForEndOfFrame();
			EndCaptureFrame();
		} else {
			yield return null;
		}
	}

	public void StartCapturing()
	{
		if (capturingObject == IntPtr.Zero)
			return;

		jvalue[] videoParameters =  new jvalue[4];
		videoParameters[0].i = videoWidth;
		videoParameters[1].i = videoHeight;
		videoParameters[2].i = videoFrameRate;
		videoParameters[3].i = videoBitRate;
		AndroidJNI.CallVoidMethod(capturingObject, initCapturingMethodID, videoParameters);
		DateTime date = DateTime.Now;
		string fullFileName = fileName + date.ToString("ddMMyy-hhmmss.fff") + ".mp4";
		jvalue[] args = new jvalue[1];
		args[0].l = AndroidJNI.NewStringUTF(videoDir + fullFileName);
		AndroidJNI.CallVoidMethod(capturingObject, startCapturingMethodID, args);

		inProgress = true;
	}

	private void BeginCaptureFrame()
	{
		if (capturingObject == IntPtr.Zero)
			return;

		jvalue[] args = new jvalue[0];
		AndroidJNI.CallVoidMethod(capturingObject, beginCaptureFrameMethodID, args);
	}

	private void EndCaptureFrame()
	{
		if (capturingObject == IntPtr.Zero)
			return;

		jvalue[] args = new jvalue[0];
		AndroidJNI.CallVoidMethod(capturingObject, endCaptureFrameMethodID, args);
	}

	public void StopCapturing()
	{
		inProgress = false;

		if (capturingObject == IntPtr.Zero)
			return;

		jvalue[] args = new jvalue[0];
		AndroidJNI.CallVoidMethod(capturingObject, stopCapturingMethodID, args);
	}

}

This is where most of the changes take place, but the logic behind them is simple. We pass the screen dimensions to the Capturing.java constructor; note its new signature: (Landroid/content/Context;II)V. On the Java side we create a FrameBuffer. OnPreRender() is called before the camera starts rendering the scene; we bind our FrameBuffer there, so all rendering of the scene happens off-screen. OnPostRender() is called after the camera has finished rendering the scene. We wait until the end of the frame, switch back to the default on-screen framebuffer, and copy the texture directly to the screen (see the endCaptureFrame() method in Capturing.java). We can't use Graphics.Blit() because it requires Unity Pro, so we draw the same texture we use to capture the frame.
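One detail worth spelling out is the pacing in OnPreRender(): capture is throttled to the requested video frame rate regardless of how fast the game renders, because nextCapture advances in fixed steps of 1 / videoFrameRate. A self-contained sketch of that logic (plain Java, no Unity types):

```java
// Capture pacing as in OnPreRender(): a frame is encoded only when game time
// passes nextCapture, and nextCapture advances by 1 / videoFrameRate per
// capture, decoupling the video frame rate from the render frame rate.
public class CapturePacing {
    private float nextCapture = 0.0f;
    private final float captureInterval;

    public CapturePacing(int videoFrameRate) {
        captureInterval = 1.0f / videoFrameRate;
    }

    // Returns true when the frame rendered at 'time' should also be encoded.
    public boolean shouldCapture(float time) {
        if (time > nextCapture) {
            nextCapture += captureInterval;
            return true;
        }
        return false;
    }
}
```

For example, rendering at roughly 60 FPS with a 30 FPS capture rate encodes about every second frame.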

It is also useful to see how the capturing algorithm affects your game's performance, so let's create a simple FPSCounter class:

using UnityEngine;
using System.Collections;

public class FPSCounter : MonoBehaviour
{
	public float updateRate = 4.0f; // 4 updates per sec.

	private int frameCount = 0;
	private float nextUpdate = 0.0f;
	private float fps = 0.0f;
	private GUIStyle style = new GUIStyle();

	void Start()
	{
		style.fontSize = 48;
		style.normal.textColor = Color.white;

		nextUpdate = Time.time;
	}

	void Update()
	{
		frameCount++;
		if (Time.time > nextUpdate) {
			nextUpdate += 1.0f / updateRate;
			fps = frameCount * updateRate;
			frameCount = 0;
		}
	}

	void OnGUI()
	{
		GUI.Label(new Rect(10, 110, 300, 100), "FPS: " + fps, style);
	}
}

Add this script to any object in your scene.
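The counter's arithmetic is simple: frames are counted over one update interval (1 / updateRate seconds), then scaled back up to a per-second figure. A standalone sketch of just that calculation:

```java
// FPS estimation as in FPSCounter.Update(): frames counted during one
// update interval, multiplied by the number of intervals per second.
public class FpsMath {
    public static float fps(int framesSinceUpdate, float updateRate) {
        return framesSinceUpdate * updateRate;
    }
}
```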

That’s all. Now Build & Run your test application for the Android platform. You can find the recorded videos in the /mnt/sdcard/DCIM/ folder of your Android device.

Known issues:

  • With this approach we can’t capture any off-screen rendering (drop shadows, deferred shading and fullscreen post-effects).
  • Not working since Unity 4.5, due to recent changes in the OnPreRender() internals. Instead, use the method described in the first article.

8 comments

Angie T.:

Hi, please help. I tried the tutorial, but the camera still does not render and the output video is an unsupported file. I am using Unity 4.6.1 free.

Ben:

I also faced the same issue as Huy: the scene is twinkling many times.

Is there anybody who can help me?

Nick A (Intel):

Hi alok, the main approach was suggested by Ilya in another thread: https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials-video-capturing-for-unity3d-applications?page=2

"While recording, FPS drops anyway. You can decrease resolution to reduce this side effect. But much better to use adaptive rendering - simplify complex shaders and geometry, reduce drawing distance and etc." 

alok k.:

Hi Huy,

Were you able to fix the FPS drop issue?

I am facing the same issue but was unable to fix it.

Thanks in advance.

Huy T.:

Hi Ilya,

I'm very glad to tell you that I have fixed this issue. But now I'm facing a new issue: while recording, FPS drops from 60 to 44. Can you help me improve performance?

BTW, I want to share my project code with you, but it belongs to my company and I can't publish it. Would you please tell me your email address so I can share a Dropbox link?

Ilya Aleshkov:

Hi Huy T.,

Sorry for the long delay. I'm trying to reproduce your issue.

Huy T.:

I used the Tracer for OpenGL ES tool to debug. The GL_INVALID_OPERATION error occurs only when I use texture.draw(textureID); to draw again to the window display.

Would you please tell me the reason for this bug?

Any help will be greatly appreciated.

Huy T.:

I tried your solution on a Samsung Galaxy S4 device. The output video is OK, but while recording the scene is twinkling many times. Maybe when recording, Unity cannot render the gameplay texture to the device screen. BTW, the game performance is not good; FPS is reduced to 36-40.

Would you please tell me the reason of this bug? Any help will be greatly appreciated.