Custom Video Rendering Step 1: Implementing custom video rendering

To see the code for this sample, switch to the basic-renderer branch of the learning-opentok-android repo:

git checkout basic-renderer

This page describes the differences between the basic-renderer branch and the basics.step-6 branch, on which this branch is based.

This branch shows you how to make minor modifications to the video renderer used by a Subscriber object. The same techniques apply to the video renderer used by a Publisher object, although this example only illustrates a custom renderer for a subscriber.

In this example, the app uses a custom video renderer to display a black-and-white version of the Subscriber object's video.

BlackWhiteVideoRender is a custom class that extends the BaseVideoRenderer class (defined in the OpenTok Android SDK). The BaseVideoRenderer class lets you define a custom video renderer to be used by an OpenTok publisher or subscriber. The app instantiates this custom renderer, passing in the activity as the Android context:

mSubscriberRenderer = new BlackWhiteVideoRender(this);
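
The following sketch shows the rough shape of the class. It is not the branch's exact source, and the overridden methods shown (getView(), onPause(), onResume(), onVideoPropertiesChanged(), and setStyle()) are the ones OpenTok custom-renderer samples typically implement, so check the BaseVideoRenderer reference for your SDK version:

import android.content.Context;
import android.opengl.GLSurfaceView;
import android.view.View;

import com.opentok.android.BaseVideoRenderer;

public class BlackWhiteVideoRender extends BaseVideoRenderer {

    private GLSurfaceView mRenderView;
    private GLRendererHelper mRenderer;

    public BlackWhiteVideoRender(Context context) {
        // Sets up the GLSurfaceView and GLRendererHelper (full listing below).
    }

    @Override
    public void onFrame(Frame frame) {
        // Converts the frame to black-and-white and hands it to the helper
        // (full listing below).
    }

    @Override
    public void setStyle(String key, String value) {
        // No custom styles in this sample.
    }

    @Override
    public void onVideoPropertiesChanged(boolean videoEnabled) {
        // React to the video being enabled or disabled, if needed.
    }

    @Override
    public View getView() {
        return mRenderView;
    }

    @Override
    public void onPause() {
        mRenderView.onPause();
    }

    @Override
    public void onResume() {
        mRenderView.onResume();
    }
}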

In the main ChatActivity class, after initializing a Subscriber object, the app calls the setRenderer() method of the Subscriber object to set the custom video renderer for the subscriber:

mSubscriber = new Subscriber(this, stream);
mSubscriber.setRenderer(mSubscriberRenderer);
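
For context, this is roughly how the renderer setup could fit into the session's onStreamReceived() callback in ChatActivity, alongside the subscribe call and view setup from the tutorial this branch builds on. The mSession and mSubscriberViewContainer fields are assumed from that tutorial code, not part of this branch's diff:

@Override
public void onStreamReceived(Session session, Stream stream) {
    if (mSubscriber == null) {
        mSubscriber = new Subscriber(this, stream);

        // Set the custom renderer before subscribing, so the subscriber's
        // video is rendered by BlackWhiteVideoRender.
        mSubscriber.setRenderer(mSubscriberRenderer);

        mSession.subscribe(mSubscriber);

        // Add the subscriber's view (the renderer's GLSurfaceView) to the layout.
        mSubscriberViewContainer.addView(mSubscriber.getView());
    }
}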

The BlackWhiteVideoRender() constructor sets the mRenderView property to a GLSurfaceView object. The app uses this object to display the video using OpenGL ES 2.0. The renderer for this GLSurfaceView object is set to a GLRendererHelper object. GLRendererHelper is a custom class that implements the GLSurfaceView.Renderer interface, and it is used to render the subscriber video to the GLSurfaceView object. The render mode is set to RENDERMODE_WHEN_DIRTY, so the view is redrawn only when its requestRender() method is called (which happens for each new frame, as shown in the onFrame() method below):

public BlackWhiteVideoRender(Context context) {
    mRenderView = new GLSurfaceView(context);
    mRenderView.setEGLContextClientVersion(2);

    mRenderer = new GLRendererHelper();
    mRenderView.setRenderer(mRenderer);

    mRenderView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}

The BlackWhiteVideoRender class converts each video frame to a black-and-white representation before passing it to the GLRendererHelper object for rendering.

The BaseVideoRenderer.onFrame() method is called when the subscriber (or publisher) passes a video frame to the video renderer. The frame is a BaseVideoRenderer.Frame object (defined by the OpenTok Android SDK). The BlackWhiteVideoRender implementation of this method takes the frame's image buffer, which is a YUV representation of the frame, and converts it to black-and-white. It then passes the frame to the displayFrame() method of the GLRendererHelper object and calls the requestRender() method of the GLSurfaceView object:

@Override
public void onFrame(Frame frame) {
    ByteBuffer imageBuffer = frame.getBuffer();

    // The image buffer is represented using three planes: Y, U, and V.
    // Data is laid out linearly in the imageBuffer variable.
    // The Y (luma) plane comes first, and its size is the same as the image (width * height).
    // The U and V (chroma) planes follow. To produce a B&W image, we overwrite both
    // planes with a constant value close to 128 (the neutral chroma value), which
    // removes the color information while leaving the luminance untouched.

    int startU = frame.getWidth() * frame.getHeight();
    for (int i = startU; i < imageBuffer.capacity(); i++) {
        imageBuffer.put(i, (byte)-127);
    }

    mRenderer.displayFrame(frame);
    mRenderView.requestRender();
}

The GLRendererHelper class renders the frame contents to an OpenGL ES surface in Android.
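
The full GLRendererHelper source is in the basic-renderer branch. As a rough structural sketch (not the branch's actual code), a GLSurfaceView.Renderer used this way stores the most recent frame handed to displayFrame() and draws it on the GL thread in onDrawFrame() when requestRender() fires; the texture upload and YUV-to-RGB shader details are omitted here:

import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

import com.opentok.android.BaseVideoRenderer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class GLRendererHelperSketch implements GLSurfaceView.Renderer {

    private final Object mFrameLock = new Object();
    private BaseVideoRenderer.Frame mCurrentFrame;

    // Called from BlackWhiteVideoRender.onFrame(); the frame is drawn later,
    // on the GL thread, when requestRender() triggers onDrawFrame().
    public void displayFrame(BaseVideoRenderer.Frame frame) {
        synchronized (mFrameLock) {
            // Release the previously stored frame here if the SDK requires it
            // (see the sample source for how it disposes of frames).
            mCurrentFrame = frame;
        }
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Compile the shaders and create the Y/U/V textures here.
        GLES20.glClearColor(0f, 0f, 0f, 1f);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        synchronized (mFrameLock) {
            if (mCurrentFrame == null) {
                return;
            }
            // Upload the frame's Y, U, and V planes as textures and draw a
            // full-screen quad with a YUV-to-RGB fragment shader (omitted).
        }
    }
}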