Custom Video Rendering (Android)

Overview

This tutorial walks through the steps required to make minor modifications to the video renderer used by a Subscriber object. You can also use the same techniques to modify the video renderer used by a Publisher object (though this example only illustrates a custom renderer for a subscriber).

Setting up your project

The code for this section is in the basic-renderer branch of the learning-opentok-android repo. If you haven't already, clone the repo into a local directory from the command line:

git clone https://github.com/opentok/learning-opentok-android.git

Then check out the branch:

git checkout basic-renderer

Open the project in Android Studio to follow along.

Exploring the code

In this example, the app uses a custom video renderer to display a black-and-white version of the Subscriber object's video.

BlackWhiteVideoRender is a custom class that extends BaseVideoRenderer (a class defined in the OpenTok Android SDK). BaseVideoRenderer lets you define a custom video renderer to be used by an OpenTok publisher or subscriber. The app instantiates this renderer in the main ChatActivity class:

mSubscriberRenderer = new BlackWhiteVideoRender(this);

After initializing the Subscriber object, ChatActivity calls the subscriber's setRenderer() method to set the custom video renderer:

mSubscriber = new Subscriber(this, stream);
mSubscriber.setRenderer(mSubscriberRenderer);
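
In the sample, this happens once the session has a stream to subscribe to. Here is a minimal sketch of that flow, assuming the subscription is set up in the Session listener's onStreamReceived() callback (the exact callback and field names used in the repo may differ):

@Override
public void onStreamReceived(Session session, Stream stream) {
    // Create the subscriber and swap in the custom renderer before subscribing,
    // so the first decoded frame already goes through BlackWhiteVideoRender.
    mSubscriber = new Subscriber(this, stream);
    mSubscriber.setRenderer(mSubscriberRenderer);
    mSession.subscribe(mSubscriber);
}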

The BlackWhiteVideoRender() constructor sets the mRenderView field to a GLSurfaceView object. The app uses this view to display the video using OpenGL ES 2.0. The renderer for the GLSurfaceView object is set to a GLRendererHelper object. GLRendererHelper is a custom class that implements the GLSurfaceView.Renderer interface, and it is used to render the subscriber video to the GLSurfaceView object:

public BlackWhiteVideoRender(Context context) {
    mRenderView = new GLSurfaceView(context);
    mRenderView.setEGLContextClientVersion(2);

    mRenderer = new GLRendererHelper();
    mRenderView.setRenderer(mRenderer);

    mRenderView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
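
This GLSurfaceView is also what the renderer hands back to the app when the subscriber needs a view to add to the layout. A minimal sketch of that override, assuming the class exposes the view through BaseVideoRenderer's getView() method (check the repo for the exact signature used):

@Override
public View getView() {
    // ChatActivity retrieves this view (via the subscriber) and adds it to its
    // layout, so the black-and-white frames are displayed on screen.
    return mRenderView;
}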

The BlackWhiteVideoRender class includes the code that converts each video frame to a black-and-white representation.

The BaseVideoRenderer.onFrame() method is called when the subscriber (or publisher) passes a video frame to the video renderer. The frame is a BaseVideoRenderer.Frame object (defined by the OpenTok Android SDK). The BlackWhiteVideoRender implementation of this method takes the frame's image buffer, which is a YUV representation of the frame, and converts it to black and white. It then passes the frame to the displayFrame() method of the GLRendererHelper object and calls the requestRender() method of the GLSurfaceView object:

@Override
public void onFrame(Frame frame) {
    ByteBuffer imageBuffer = frame.getBuffer();

    // The image buffer is made up of three planes: Y, U and V.
    // The data is laid out linearly in the imageBuffer variable:
    // the Y plane comes first, and its size equals the image size (width * height).
    // The U and V planes follow. To produce a black-and-white image, we set
    // every sample in both chroma planes to the same constant value.

    int startU = frame.getWidth() * frame.getHeight();
    for (int i = startU; i < imageBuffer.capacity(); i++) {
        imageBuffer.put(i, (byte)-127);
    }

    mRenderer.displayFrame(frame);
    mRenderView.requestRender();
}
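
To make the buffer layout concrete, here is the arithmetic for a hypothetical 640 x 480 frame (the actual stream resolution varies), assuming the standard YUV 4:2:0 layout in which each chroma plane is a quarter the size of the luma plane:

int width = 640, height = 480;            // example resolution only
int ySize = width * height;               // 307,200 bytes, at offset 0
int uSize = (width / 2) * (height / 2);   //  76,800 bytes, at offset 307,200
int vSize = (width / 2) * (height / 2);   //  76,800 bytes, at offset 384,000
// imageBuffer.capacity() is 460,800 bytes, so the loop above starts at
// startU = 307,200 and rewrites the remaining 153,600 bytes, flattening both
// chroma planes and leaving only the luminance (Y) data intact.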

The GLRendererHelper class renders the frame contents to an OpenGL surface in Android.
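
The full class in the repo also contains the shader and texture setup for converting the YUV data to RGB on the GPU. The following is a heavily trimmed structural sketch of what such a helper looks like; the method bodies are elided, and the displayFrame() signature is assumed from the call in onFrame() above:

import android.opengl.GLSurfaceView;
import com.opentok.android.BaseVideoRenderer;
import java.util.concurrent.locks.ReentrantLock;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

class GLRendererHelper implements GLSurfaceView.Renderer {
    private final ReentrantLock mFrameLock = new ReentrantLock();
    private BaseVideoRenderer.Frame mCurrentFrame;

    // Called from BlackWhiteVideoRender.onFrame(); stores the latest frame so
    // the GL thread can pick it up on the next onDrawFrame() pass.
    public void displayFrame(BaseVideoRenderer.Frame frame) {
        mFrameLock.lock();
        mCurrentFrame = frame;
        mFrameLock.unlock();
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Compile the YUV-to-RGB shader program and create the Y, U and V textures.
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        // Update the GL viewport so the video scales with the view.
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Upload the current frame's Y, U and V planes to textures and draw a
        // textured quad; with RENDERMODE_WHEN_DIRTY, this runs only after
        // requestRender() is called.
    }
}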

Congratulations! You've finished the Custom Video Rendering Tutorial for Android.
You can continue to play with and adjust the code you've developed here, or check out the Next Steps below.