This tutorial demonstrates how to use the OpenTok iOS SDK to publish a screen-sharing stream, using the device's screen as the source of the stream's video.
The code for this section is in the Screen-Sharing folder of the opentok-ios-sdk-samples project on GitHub. If you haven't already, you'll need to clone the repo into a local directory — this can be done using the command line:
git clone https://github.com/opentok/opentok-ios-sdk-samples.git
The Screen-Sharing application shows you how to capture the screen (a UIView) using a custom video capturer. Open the project in Xcode to follow along.
The custom video capturer, TBScreenCapture, is initialized with a display link object that calls the captureView method at 30 frames per second. The createPixelBuffer and createCGContextFromPixelBuffer helper methods set up the pixel buffer and bitmap context that hold a snapshot of the view being shared; that buffer is then fed to the videoCaptureConsumer in the captureView method. (A sketch of those two helpers appears after the initializer below.)
- (instancetype)initWithView:(UIView *)view
{
    self = [super init];
    if (self) {
        _view = view;
        _queue = dispatch_queue_create("SCREEN_CAPTURE", NULL);
        _screenScale = [[UIScreen mainScreen] scale];
        // The display link invokes captureView at up to 30 frames per second.
        _displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(captureView)];
        _displayLink.preferredFramesPerSecond = 30;
        // Allow only one frame to be in flight at a time.
        _capturingSemaphore = dispatch_semaphore_create(1);
        [self createPixelBuffer];
        [self createCGContextFromPixelBuffer];
    }
    return self;
}
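
The bodies of createPixelBuffer and createCGContextFromPixelBuffer are not shown in this tutorial. The following is a minimal sketch of what they might look like, assuming an ARGB pixel buffer sized to the view at screen scale; the exact attributes and coordinate handling in the sample may differ:

- (void)createPixelBuffer
{
    size_t width  = (size_t)(self.view.bounds.size.width * _screenScale);
    size_t height = (size_t)(self.view.bounds.size.height * _screenScale);
    NSDictionary *attributes = @{
        (id)kCVPixelBufferCGImageCompatibilityKey: @NO,
        (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @NO
    };
    // Create one reusable ARGB buffer that every captured frame is drawn into.
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          width,
                                          height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)attributes,
                                          &_pixelBuffer);
    NSAssert(status == kCVReturnSuccess, @"Could not create the pixel buffer");
}

- (void)createCGContextFromPixelBuffer
{
    // The buffer stays locked for the capturer's lifetime so the context can
    // keep drawing into its memory without per-frame copies.
    CVPixelBufferLockBaseAddress(_pixelBuffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    _bitmapContext = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(_pixelBuffer),
                                           CVPixelBufferGetWidth(_pixelBuffer),
                                           CVPixelBufferGetHeight(_pixelBuffer),
                                           8,
                                           CVPixelBufferGetBytesPerRow(_pixelBuffer),
                                           colorSpace,
                                           kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Big);
    // The view renders in points; scale so the snapshot fills the buffer in pixels.
    CGContextScaleCTM(_bitmapContext, _screenScale, _screenScale);
    CGColorSpaceRelease(colorSpace);
}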
- (void)captureView
{
    if (!(_capturing && self.videoCaptureConsumer)) {
        return;
    }
    // Wait until consumeImageBuffer is done.
    if (dispatch_semaphore_wait(_capturingSemaphore, DISPATCH_TIME_NOW) != 0) {
        return;
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.view.layer renderInContext:self->_bitmapContext];
        // Don't block the UI thread
        dispatch_async(self->_queue, ^{
            CMTime time = [self getTimeStamp];
            [self.videoCaptureConsumer consumeImageBuffer:self->_pixelBuffer
                                              orientation:OTVideoOrientationUp
                                                timestamp:time
                                                 metadata:nil];
            // Signal for more frames
            dispatch_semaphore_signal(self->_capturingSemaphore);
        });
    });
}
The snapshot is taken with [self.view.layer renderInContext:self->_bitmapContext], which draws the view hierarchy into the bitmap context backed by the shared pixel buffer.
The sample app uses the OTVideoCapture protocol methods to manage the display link object along with a _capturing flag; a sketch of those methods follows the captureSettings: listing below. The captureSettings: method sets the video format's pixel format to OTPixelFormatARGB:
- (int32_t)captureSettings:(OTVideoFormat*)videoFormat
{
    videoFormat.pixelFormat = OTPixelFormatARGB;
    return 0;
}
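
The startCapture and stopCapture bodies are not reproduced in this tutorial. Here is a minimal sketch, assuming they simply toggle the _capturing flag and attach or detach the display link from the main run loop (the sample's actual implementations may differ):

- (int32_t)startCapture
{
    _capturing = YES;
    // Begin the 30 fps captureView callbacks.
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                       forMode:NSRunLoopCommonModes];
    return 0;
}

- (int32_t)stopCapture
{
    _capturing = NO;
    // Stop the callbacks without destroying the display link, so capture can resume.
    [_displayLink removeFromRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSRunLoopCommonModes];
    return 0;
}

- (BOOL)isCaptureStarted
{
    return _capturing;
}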
The ViewController class creates a session, instantiates subscribers, and sets up the publisher. The publisher is the main workhorse of the app; its videoType, audioFallbackEnabled, videoCapture, and videoContentHint properties are configured in the doPublish method:
- (void)doPublish
{
    ...
    [_publisher setVideoType:OTPublisherKitVideoTypeScreen];
    // This disables the audio fallback feature when using routed sessions.
    _publisher.audioFallbackEnabled = NO;

    // Finally, wire up the video source.
    TBScreenCapture* videoCapture =
        [[TBScreenCapture alloc] initWithView:self.view];
    [_publisher setVideoCapture:videoCapture];
    _publisher.videoCapture.videoContentHint = OTVideoContentHintText;
    ...
}
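
Setting videoContentHint to OTVideoContentHintText signals that the frames contain mostly text, so the encoder favors sharpness over motion smoothness. Also note that publishing can only begin once the session has connected; a minimal sketch of how doPublish is typically invoked from the OTSessionDelegate callback (the exact wiring in the sample's ViewController may differ):

// OTSessionDelegate callback: the session is connected, so it is safe to publish.
- (void)sessionDidConnect:(OTSession *)session
{
    [self doPublish];
}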
Lastly, the app adds a running timer label to the view. Because the label is part of the captured view, its constantly changing text is shared with subscribers:
dispatch_source_set_event_handler(_timer, ^{
    double timestamp = [[NSDate date] timeIntervalSince1970];
    int64_t timeInMilisInt64 = (int64_t)(timestamp * 1000);
    NSString *mills = [NSString stringWithFormat:@"%lld", timeInMilisInt64];
    dispatch_sync(dispatch_get_main_queue(), ^{
        [self.timeDisplay setText:mills];
    });
});
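
For reference, a GCD timer like _timer is typically created and started along these lines (the interval and queue here are illustrative assumptions, not values taken from the sample):

_timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0,
                                dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0));
dispatch_source_set_timer(_timer,
                          dispatch_time(DISPATCH_TIME_NOW, 0),
                          100 * NSEC_PER_MSEC,  // fire every 100 ms
                          10 * NSEC_PER_MSEC);  // allow 10 ms of leeway
// ... install the event handler shown above ...
dispatch_resume(_timer);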
Congratulations! You've finished the Screen-Sharing tutorial for iOS.
You can continue to play with and adjust the code you've developed here, or check out the Next Steps below. For more information on screen sharing with OpenTok, see the OpenTok Screen sharing developer guide for iOS.
When you're finished here, continue building and enhancing your OpenTok application with these helpful resources: