Once you have connected to a session, you can publish a stream that other clients connected to the session can view.
To publish a stream, add an OTPublisher component as a child of the OTSession object:
<OTSession
  apiKey="the API key"
  sessionId="the session ID"
  token="the token">
  <OTPublisher />
</OTSession>
The publisher starts streaming when the client connects to the session. The OTPublisher object dispatches a streamCreated event when it starts streaming to the session, and it dispatches an error event if there is an error publishing. Set the eventHandlers prop of the OTPublisher component, and set the streamCreated and error properties of that object to callback functions:
<OTPublisher
  eventHandlers={{
    streamCreated: () => {
      console.log('The publisher started streaming.');
    },
    error: event => {
      console.log('Publisher error:', event);
    },
  }}
/>
Once you have connected to a session, you can check whether the client can publish. Set a reference to the OTSession object and call its getCapabilities() method in the sessionConnected event handler. This method returns a promise that resolves with an object that includes a canPublish property. You can then conditionally publish based on that value:
import React, {Component} from 'react';
import {View} from 'react-native';
import {OTSession, OTPublisher, OTSubscriber} from 'opentok-react-native';

class App extends Component {
  constructor(props) {
    super(props);
    this.apiKey = 'your API key';
    this.sessionId = 'a session ID';
    this.token = 'a valid token';
    this.state = {canPublish: false};
    this.sessionEventHandlers = {
      sessionConnected: event => {
        // Check the client's capabilities once it has connected.
        this.session.getCapabilities().then(capabilities => {
          this.setState({canPublish: capabilities.canPublish});
        });
      },
    };
  }

  render() {
    return (
      <View>
        <OTSession
          apiKey={this.apiKey}
          sessionId={this.sessionId}
          token={this.token}
          ref={instance => {
            this.session = instance;
          }}
          eventHandlers={this.sessionEventHandlers}>
          {this.state.canPublish ? <OTPublisher /> : null}
          <OTSubscriber />
        </OTSession>
      </View>
    );
  }
}

export default App;
To publish, the client must connect to the session with a token that is assigned a role that supports publishing.
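For example, here is a minimal server-side sketch, assuming you generate tokens with the OpenTok Node server SDK (the opentok npm package); apiKey, apiSecret, and sessionId stand in for your own values:

const OpenTok = require('opentok');

const opentok = new OpenTok(apiKey, apiSecret);

// The 'publisher' role (the default) and the 'moderator' role can publish;
// a 'subscriber' token cannot.
const token = opentok.generateToken(sessionId, {
  role: 'publisher',
});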
You can have the publisher use the rear-facing camera of the device by setting the properties prop of the OTPublisher component and setting the cameraPosition property of that object to "back":
<OTPublisher
  properties={{
    cameraPosition: 'back',
  }}
/>
Note that you can also publish a screen-sharing stream — one in which the source is the client's screen, not a camera. For details, see Screen sharing.
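For instance, a minimal sketch, assuming the videoSource setting of the OTPublisher properties prop described in the Screen sharing topic:

<OTPublisher
  properties={{
    videoSource: 'screen', // publish the device screen instead of a camera
  }}
/>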
You can stop the publisher from streaming to the session by unmounting it (removing it from the parent OTSession component). For example, the following code unpublishes the stream 10 seconds after the publisher starts streaming:
import React, {Component} from 'react';
import {View} from 'react-native';
import {OTSession, OTPublisher} from 'opentok-react-native';

class App extends Component {
  constructor(props) {
    super(props);
    this.apiKey = 'your-api-key';
    this.sessionId = 'valid-session-id';
    this.token = 'valid-token';
    this.publisherOptions = {
      publishCaptions: true,
      publishVideo: true,
      publishAudio: false,
    };
    this.state = {
      publishing: true,
    };
    this.publisherEventHandlers = {
      streamCreated: event => {
        // Unpublish 10 seconds after the publisher starts streaming.
        setTimeout(() => {
          console.log('Unpublishing the stream.');
          this.setState({publishing: false});
        }, 10000);
      },
    };
  }

  render() {
    return (
      <View>
        <OTSession
          apiKey={this.apiKey}
          sessionId={this.sessionId}
          token={this.token}>
          {this.state.publishing ? (
            <OTPublisher
              properties={this.publisherOptions}
              eventHandlers={this.publisherEventHandlers}
              ref={instance => {
                this.publisher = instance;
              }}
            />
          ) : null}
        </OTSession>
      </View>
    );
  }
}

export default App;
Note that you can individually stop sending video or audio (while still publishing). For more information, see Adjusting audio and video.
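For example, a minimal sketch that keeps publishing video while muting and unmuting audio by toggling the publishAudio property (the same property used in publisherOptions above); this.state.audioEnabled is assumed to be toggled elsewhere, such as by a mute button:

<OTPublisher
  properties={{
    // Flipping publishAudio mutes or unmutes the published audio
    // without unpublishing the stream.
    publishAudio: this.state.audioEnabled,
    publishVideo: true,
  }}
/>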
The OTPublisher object dispatches a streamDestroyed event when it stops streaming to the session:
<OTPublisher
  eventHandlers={{
    streamDestroyed: () => {
      console.log('The publisher stopped streaming.');
    },
  }}
/>
The OTPublisher audioNetworkStats and videoNetworkStats events provide an array of objects defining the current audio or video statistics for the publisher. For a publisher in a routed session (one that uses the OpenTok Media Router), each event's array includes one object, defining the statistics for the audio or video sent to the OpenTok Media Router. In a relayed session, the array includes an object for each subscriber to the published stream. Each object in the array has the following properties (property names shown for the videoNetworkStats event; the audioNetworkStats event reports the corresponding audio statistics):
videoBytesSent — The total number of video bytes sent
videoPacketsSent — The total number of video packets sent
videoPacketsLost — The total number of video packets lost
Additionally, for a publisher in a relayed session, each object in the array contains the following two properties:
connectionId — The unique ID of the subscribing client's connection, which matches the connectionId property of the connectionCreated event for that client's connection.
subscriberId — The unique ID of the subscriber.
The following code logs these stats for the publisher's stream every second:
<OTPublisher
  eventHandlers={{
    audioNetworkStats: event => {
      console.log('publisher audioNetworkStats event', event);
    },
    videoNetworkStats: event => {
      console.log('publisher videoNetworkStats event', event);
    },
  }}
/>
To get more detailed stream statistics, use the OTPublisher.getRtcStatsReport() method. Calling this method results in the OTPublisher instance dispatching an rtcStatsReport event:
<OTPublisher
  ref={instance => {
    this.publisher = instance;
  }}
  eventHandlers={{
    streamCreated: event => {
      console.log('publisher streamCreated', event);
      // Request an RTC stats report 12 seconds after publishing starts.
      setTimeout(() => this.publisher.getRtcStatsReport(), 12000);
    },
    rtcStatsReport: event => {
      console.log('publisher rtcStatsReport event', event);
    },
  }}
/>
For a publisher in a routed session, the event array includes one object, defining the stats for the stream sent to the OpenTok Media Router. In a relayed session, the array includes an object defining the RTC stats report for each subscriber to the published stream. Each object in the array has a jsonArrayOfReports property that includes the data. The structure of the JSON array is similar to the format of the RTCStatsReport object implemented in web browsers (see the Mozilla docs). Also see this W3C documentation.
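For example, a sketch of reading the report data in the rtcStatsReport event handler, assuming the event payload is the array described above and that each jsonArrayOfReports property holds a JSON string in the RTCStatsReport format:

<OTPublisher
  eventHandlers={{
    rtcStatsReport: event => {
      // One entry per peer connection: a single entry in a routed session,
      // one per subscriber in a relayed session.
      event.forEach(entry => {
        const reports = JSON.parse(entry.jsonArrayOfReports);
        reports
          .filter(report => report.type === 'outbound-rtp')
          .forEach(report => {
            console.log(report.kind, 'bytes sent:', report.bytesSent);
          });
      });
    },
  }}
/>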
You can apply video filters to published video — see this topic.
You can set a video content hint to improve the quality and performance of a published video. This can be useful in certain situations:
When publishing screen-sharing video that will primarily contain either text or video content.
When using a camera video source, if you would prefer to degrade frame rate and maintain resolution, you can set the content hint to "text" or "detail". In a routed session, the publisher will send a full-resolution, low frame-rate stream and — if network conditions permit — a full-resolution, regular frame-rate stream. The OpenTok Media Router will forward one of those streams to the subscribers.
This tells the client to use encoding or processing methods more appropriate to the type of content you specify.
Set the videoContentHint property of the OTPublisher properties prop:
<OTPublisher
  properties={{
    videoContentHint: 'text',
  }}
/>
Set the video content hint to one of the following values:
""
— No hint is provided (the default). The publishing client will make a best guess at how video content should be treated.
"motion"
— The track should be treated as if it contains video where motion is important. For example, you may use this seeting for a screen-sharing video stream that contains video.
"detail"
— The track should be treated as if video details are extra important. For example, you may use this seeting for a screen-sharing video stream that contains text content, painting, or line art.
"text"
— The track should be treated as if text details are extra important. For example, you may use this seeting for a screen-sharing video stream that contains text content.
With the "text" and "detailed" content hints, the client attempts to maintain high resolution, even if it must reduce the video frame rate. For the "motion" content hint, the client reduces resolution to prevent the frame rate from stalling.
You can read more about these options in the W3C Working Draft.
If you can accept a slow frame rate, you may also consider restricting the frame rate of subscribed streams.
See the developer guide for Adjusting audio and video.
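For example, a sketch, assuming the OTSubscriber properties prop supports the preferredFrameRate setting described in that guide (it applies to streams in routed sessions):

<OTSubscriber
  properties={{
    preferredFrameRate: 15, // ask the OpenTok Media Router for at most 15 frames per second
  }}
/>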