QuickBlox Documentation

QuickBlox provides a powerful Chat API and SDKs to add real-time messaging and video calls to your web and mobile apps. Learn how to integrate QuickBlox across multiple platforms and devices. Check out our detailed guides to make integration easy and fast.

Advanced

Learn how to mute audio, disable video, switch camera, share your screen, configure media settings, etc.

Media management

To manage audio and video streams, QBRTCSession provides the QBMediaStreamManager class.
QBMediaStreamManager holds the user's local audio and video tracks and provides a way to change the video capturer.

🚧

Pay attention

QBMediaStreamManager is tied to the QBRTCSession lifecycle, so you should use it only while the QBRTCSession is active.

Mute audio

Mute the audio by calling the localAudioTrack.setEnabled() or qbMediaStreamManager.setAudioEnabled() method. These methods tell the SDK whether to send audio data from a local or remote peer in the specified call session.

QBMediaStreamManager qbMediaStreamManager = currentSession.getMediaStreamManager();
        
QBRTCAudioTrack localAudioTrack = qbMediaStreamManager.getLocalAudioTrack();
        
// Mute
localAudioTrack.setEnabled(false);
// or
qbMediaStreamManager.setAudioEnabled(false);

// Unmute
localAudioTrack.setEnabled(true);
// or
qbMediaStreamManager.setAudioEnabled(true);

// Is muted?
boolean isEnabled = localAudioTrack.enabled();
// or 
qbMediaStreamManager.isAudioEnabled();
val qbMediaStreamManager = currentSession.mediaStreamManager

val localAudioTrack = qbMediaStreamManager.localAudioTrack

// Mute
localAudioTrack.setEnabled(false)
// or
qbMediaStreamManager.isAudioEnabled = false

// Unmute
localAudioTrack.setEnabled(true)
// or
qbMediaStreamManager.isAudioEnabled = true

// Is muted?
val isEnabled = localAudioTrack.enabled()
// or
qbMediaStreamManager.isAudioEnabled

Disable video

Turn off the video by calling the localVideoTrack.setEnabled() or qbMediaStreamManager.setVideoEnabled() method. These methods tell the SDK whether to send video data from a local or remote peer in the specified call session.

QBMediaStreamManager qbMediaStreamManager = currentSession.getMediaStreamManager();

QBRTCVideoTrack localVideoTrack = qbMediaStreamManager.getLocalVideoTrack();

// Disable
localVideoTrack.setEnabled(false);
// or
qbMediaStreamManager.setVideoEnabled(false);

// Enable
localVideoTrack.setEnabled(true);
// or
qbMediaStreamManager.setVideoEnabled(true);

// Is Enabled?
boolean isEnabled = localVideoTrack.enabled();
// or
qbMediaStreamManager.isVideoEnabled();
val qbMediaStreamManager = currentSession.mediaStreamManager

val localVideoTrack = qbMediaStreamManager.localVideoTrack

// Disable
localVideoTrack.setEnabled(false)
// or
qbMediaStreamManager.isVideoEnabled = false

// Enable
localVideoTrack.setEnabled(true)
// or
qbMediaStreamManager.isVideoEnabled = true

// Is Enabled?
val isEnabled = localVideoTrack.enabled()
// or 
qbMediaStreamManager.isVideoEnabled

Capture video from camera

When a call session is started, the camera capturer is used by default. If you want to set it manually, set QBRTCCameraVideoCapturer as the video capturer.

try {
    currentSession.getMediaStreamManager().setVideoCapturer(new QBRTCCameraVideoCapturer(this, null));
} catch (QBRTCCameraVideoCapturer.QBRTCCameraCapturerException e) {
    e.printStackTrace();
}
try {
    currentSession.mediaStreamManager.videoCapturer = QBRTCCameraVideoCapturer(this, null)
} catch (e: QBRTCCameraVideoCapturer.QBRTCCameraCapturerException) {
    e.printStackTrace()
}

🚧

Pay attention

Creating a new instance of QBRTCCameraVideoCapturer can throw a QBRTCCameraCapturerException, so you should handle this exception.

Switch camera

You can switch the video camera during a call. The front camera is used by default.

QBRTCCameraVideoCapturer videoCapturer = (QBRTCCameraVideoCapturer) currentSession.getMediaStreamManager().getVideoCapturer();
videoCapturer.switchCamera(cameraSwitchHandler);
val videoCapturer = currentSession.mediaStreamManager.videoCapturer as QBRTCCameraVideoCapturer
videoCapturer.switchCamera(cameraSwitchHandler)

You should use CameraSwitchHandler to handle the camera switching process.

CameraVideoCapturer.CameraSwitchHandler cameraSwitchHandler = new CameraVideoCapturer.CameraSwitchHandler() {
    @Override
    public void onCameraSwitchDone(boolean isFrontCamera) {

    }

    @Override
    public void onCameraSwitchError(String errorDescription) {

    }
};
val cameraSwitchHandler = object : CameraVideoCapturer.CameraSwitchHandler {
    override fun onCameraSwitchDone(isFrontCamera: Boolean) {

    }

    override fun onCameraSwitchError(errorDescription: String?) {

    }
}

Change capture format

You can change the frame rate and frame size during an active call session using the video capturer.

QBRTCCameraVideoCapturer videoCapturer = (QBRTCCameraVideoCapturer) currentSession.getMediaStreamManager().getVideoCapturer();
videoCapturer.changeCaptureFormat(width, height, framerate);
val videoCapturer = currentSession.mediaStreamManager.videoCapturer as QBRTCCameraVideoCapturer
videoCapturer.changeCaptureFormat(width, height, framerate)

Screen sharing

To share the screen of your device with the opponents, follow the steps below:

  1. Ask for the appropriate permission.
if (Build.VERSION.SDK_INT > Build.VERSION_CODES.LOLLIPOP) {
    QBRTCScreenCapturer.requestPermissions(CallActivity.this);
}
if (Build.VERSION.SDK_INT > Build.VERSION_CODES.LOLLIPOP) {
    QBRTCScreenCapturer.requestPermissions(this@CallActivity)
}

🚧

Note

Instead of CallActivity.this, you can use the context of the activity where you are asking this permission.

  2. Handle the result of the permission request.
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
    if (requestCode == QBRTCScreenCapturer.REQUEST_MEDIA_PROJECTION) {
        if (resultCode == Activity.RESULT_OK) {
            // Now you can start Screen Sharing
            startScreenSharing(data);
        }
    }
}
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    if (requestCode == QBRTCScreenCapturer.REQUEST_MEDIA_PROJECTION) {
        if (resultCode == Activity.RESULT_OK) {
            // Now you can start Screen Sharing
            startScreenSharing(data)
        }
    }
}

🚧

Note

You should pass the Intent (data) to the startScreenSharing() method so it can be used for setVideoCapturer().

  3. Set QBRTCScreenCapturer as the video capturer.
QBRTCSession currentSession = getCurrentSession(); // Simply use your current session variable
currentSession.getMediaStreamManager().setVideoCapturer(new QBRTCScreenCapturer(data, null)); // data is the Intent from onActivityResult
val currentSession = getCurrentSession() // Simply use your current session variable
currentSession.mediaStreamManager.videoCapturer = QBRTCScreenCapturer(data, null) // data is the Intent from onActivityResult

WebRTC stats reporting

You can receive a report with information about the current connection, audio and video tracks, and other useful metrics. To set the receiving time interval, use the code snippet below.

QBRTCConfig.setStatsReportInterval(60);
QBRTCConfig.setStatsReportInterval(60)

Then add a QBRTCStatsReportCallback to the session.

currentSession.addStatsReportCallback(new QBRTCStatsReportCallback() {
    @Override
    public void onStatsReportUpdate(QBRTCStatsReport qbrtcStatsReport, Integer userID) {
        qbrtcStatsReport.getAudioReceivedCodec();
        qbrtcStatsReport.getAudioReceivedBitrate();

        qbrtcStatsReport.getVideoSendCodec();
        qbrtcStatsReport.getVideoSendBitrate();
        qbrtcStatsReport.getVideoReceivedBitrate();

        qbrtcStatsReport.getVideoReceivedFps();

        qbrtcStatsReport.getAudioSendInputLevel();

        qbrtcStatsReport.getVideoReceivedWidth();
        qbrtcStatsReport.getVideoReceivedHeight();
    }
});
currentSession.addStatsReportCallback { qbrtcStatsReport, userID ->
    qbrtcStatsReport.audioReceivedCodec
    qbrtcStatsReport.audioReceivedBitrate

    qbrtcStatsReport.videoSendCodec
    qbrtcStatsReport.videoSendBitrate
    qbrtcStatsReport.videoReceivedBitrate
  
    qbrtcStatsReport.videoReceivedFps

    qbrtcStatsReport.audioSendInputLevel

    qbrtcStatsReport.videoReceivedWidth
    qbrtcStatsReport.videoReceivedHeight
}

πŸ“˜

Note

The qbrtcStatsReport.audioReceivedCodec and qbrtcStatsReport.videoReceivedFps fields are just examples; QBRTCStatsReport provides many more metrics.

πŸ“˜

Note

Using QBRTCStatsReport, you can define when your opponent is speaking by using the qbrtcStatsReport.getAudioReceiveOutputLevel() parameter. This parameter is the microphone level from the participant’s audio track at the moment of collecting the statistics report.
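For example, you could smooth the reported level and compare it against a threshold to decide whether the opponent is currently speaking. The helper below is a hypothetical sketch, not part of the SDK; the smoothing factor and the threshold are assumptions you should tune for your app.

```java
// Hypothetical voice-activity helper fed from the stats callback.
// Feed it the (parsed) value of getAudioReceiveOutputLevel() on each report.
class VoiceActivityDetector {
    private final double threshold; // assumed speech threshold, app-specific
    private final double alpha;     // smoothing factor in (0, 1]
    private double smoothedLevel;   // exponential moving average of the level

    VoiceActivityDetector(double threshold, double alpha) {
        this.threshold = threshold;
        this.alpha = alpha;
    }

    // Call once per stats report; returns true while the smoothed level
    // stays above the threshold.
    boolean onLevel(double rawLevel) {
        smoothedLevel = alpha * rawLevel + (1 - alpha) * smoothedLevel;
        return smoothedLevel > threshold;
    }
}
```

Smoothing avoids a flickering "speaking" indicator when the level briefly dips between stats reports.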

Group calls

You can make multi-user calls between several users. With WebRTC, your device connects to each user in the call, which means each device must handle as many connections as there are other users in the current call.

This table shows how many connections your device must establish to make a group call with a different number of users in the call.

Users in call | Connections number | Per user
2             | 2 connections      | 2 connections
3             | 6 connections      | 4 connections
4             | 12 connections     | 6 connections
5             | 20 connections     | 8 connections

As you can see, to make a call with 4 other participants, each device has to handle 8 different connections to send and receive audio and video tracks with every other call participant. This requires high-performance devices.
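These numbers follow directly from the mesh formulas: with n users, the call carries n * (n - 1) one-way connections in total, and each device handles 2 * (n - 1) of them. A minimal sketch of the arithmetic:

```java
// Connection counts for a mesh call, matching the table above.
class MeshConnections {
    // Total one-way connections in the call: n * (n - 1)
    static int totalConnections(int users) {
        return users * (users - 1);
    }

    // Connections each device handles (send + receive per peer): 2 * (n - 1)
    static int perUserConnections(int users) {
        return 2 * (users - 1);
    }

    public static void main(String[] args) {
        for (int users = 2; users <= 5; users++) {
            System.out.println(users + " users: " + totalConnections(users)
                    + " total, " + perUserConnections(users) + " per device");
        }
    }
}
```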

The other solution we provide requires only 1 output and 1 input connection to make a multi-user call with up to 10 users.

🚧

Pay Attention

Please use this WebRTC Video Calling to make group calls with 4 or fewer users.
Because of the Mesh architecture we use for multipoint calls, where every participant sends and receives media to and from all other participants, the current solution supports group calls with up to 4 people.

QuickBlox also provides a different solution for up to 10 people in a call at the same time.
If you need to make group calls with more than 4 users, please Contact Us, and we will offer you a simple solution for multi-user conference calls.

General settings

You can change different settings for your calls using the QBRTCConfig class. All of them are listed below.
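For instance, these settings can be applied together once during app initialization. The values below are illustrative; each setting is described in its own section.

```java
// Illustrative values only; defaults are noted in the sections below.
QBRTCConfig.setAnswerTimeInterval(60); // seconds to wait for an answer
QBRTCConfig.setDisconnectTime(10);     // seconds allowed to repair a lost connection
QBRTCConfig.setDialingTimeInterval(5); // seconds between dialing notifications
QBRTCConfig.setMaxOpponentsCount(6);   // max opponents in a group call
```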

Answer time interval

If an opponent does not answer within the answer time interval, the onUserNotAnswer callback is fired.

// Time interval to wait opponents answer
QBRTCConfig.setAnswerTimeInterval(answerTimeInterval);
// Time interval to wait opponents answer
QBRTCConfig.setAnswerTimeInterval(answerTimeInterval)

πŸ“˜

Note

By default, the answer time interval is 60 seconds.

Disconnect time interval

Set the maximum allowed time to repair a connection after it is lost.

// Time to repair the connection after it was lost
QBRTCConfig.setDisconnectTime(disconnectTimeInterval);
// Time to repair the connection after it was lost
QBRTCConfig.setDisconnectTime(disconnectTimeInterval)

πŸ“˜

Note

By default, the disconnect time interval is 10 seconds.

Dialing time interval

The dialing time interval indicates how often notifications about your call are sent to your opponents.

// Time interval for establishing connection with the opponent
QBRTCConfig.setDialingTimeInterval(dialingTimeInterval);
// Time interval for establishing connection with the opponent
QBRTCConfig.setDialingTimeInterval(dialingTimeInterval)

πŸ“˜

Note

By default, the dialing time interval is 5 seconds.

Maximum number of opponents

Set the maximum number of opponents in a group call using the snippet below.

// Max number of opponents in group call
QBRTCConfig.setMaxOpponentsCount(maxOpponentCount);
// Max number of opponents in group call
QBRTCConfig.setMaxOpponentsCount(maxOpponentCount)

πŸ“˜

Note

By default, the maximum number of opponents is 10.

Custom ICE servers

You can customize the list of ICE servers. By default, the WebRTC module uses internal ICE servers that are usually enough, but you can always set your own. The WebRTC engine chooses the TURN relay with the lowest round-trip time, so setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users. Review our Setup guide to learn how to configure custom ICE servers.
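As a rough sketch (the exact steps live in the Setup guide; the server URL, credentials, and the setIceServerList() call shown here are assumptions to verify against it):

```java
// Sketch only: placeholder TURN server and credentials; verify the exact
// API against the Setup guide before using.
List<PeerConnection.IceServer> iceServers = new LinkedList<>();
iceServers.add(new PeerConnection.IceServer("turn:turn.example.com:3478", "user", "password"));
QBRTCConfig.setIceServerList(iceServers);
```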

Media settings

You can use QBRTCMediaConfig class instance to configure a variety of media settings such as video/audio codecs, bitrate, fps, etc.

Video codecs

A video codec is a video stream encoding parameter. You can choose from the following values: H264, VP8, and VP9.

QBRTCMediaConfig.VideoCodec videoCodec;
videoCodec = QBRTCMediaConfig.VideoCodec.H264;
videoCodec = QBRTCMediaConfig.VideoCodec.VP8;
videoCodec = QBRTCMediaConfig.VideoCodec.VP9;

QBRTCMediaConfig.setVideoCodec(videoCodec);
var videoCodec: QBRTCMediaConfig.VideoCodec
videoCodec = QBRTCMediaConfig.VideoCodec.H264
videoCodec = QBRTCMediaConfig.VideoCodec.VP8
videoCodec = QBRTCMediaConfig.VideoCodec.VP9

QBRTCMediaConfig.setVideoCodec(videoCodec)

Camera resolution

A camera resolution is a video stream encoding parameter. It's possible to set the custom video resolution to provide guarantees for the predictable behavior of the video stream.

int videoWidth = QBRTCMediaConfig.VideoQuality.QBGA_VIDEO.width;
int videoHeight = QBRTCMediaConfig.VideoQuality.QBGA_VIDEO.height;

// VGA Resolution
//videoWidth = QBRTCMediaConfig.VideoQuality.VGA_VIDEO.width;
//videoHeight = QBRTCMediaConfig.VideoQuality.VGA_VIDEO.height;

// HD Resolution
//videoWidth = QBRTCMediaConfig.VideoQuality.HD_VIDEO.width;
//videoHeight = QBRTCMediaConfig.VideoQuality.HD_VIDEO.height;

// Custom Resolution (for example FullHD)
//videoWidth = 1920;
//videoHeight = 1080;
        
QBRTCMediaConfig.setVideoWidth(videoWidth);
QBRTCMediaConfig.setVideoHeight(videoHeight);
var videoWidth = QBRTCMediaConfig.VideoQuality.QBGA_VIDEO.width
var videoHeight = QBRTCMediaConfig.VideoQuality.QBGA_VIDEO.height

// VGA Resolution
//videoWidth = QBRTCMediaConfig.VideoQuality.VGA_VIDEO.width
//videoHeight = QBRTCMediaConfig.VideoQuality.VGA_VIDEO.height

// HD Resolution
//videoWidth = QBRTCMediaConfig.VideoQuality.HD_VIDEO.width
//videoHeight = QBRTCMediaConfig.VideoQuality.HD_VIDEO.height

// Custom Resolution (for example FullHD)
//videoWidth = 1920
//videoHeight = 1080

QBRTCMediaConfig.setVideoWidth(videoWidth)
QBRTCMediaConfig.setVideoHeight(videoHeight)

Audio codecs

An audio codec is an audio stream encoding parameter. You can choose from the following values: ISAC and OPUS. Default: ISAC.

QBRTCMediaConfig.AudioCodec audioCodec;
audioCodec = QBRTCMediaConfig.AudioCodec.OPUS;
audioCodec = QBRTCMediaConfig.AudioCodec.ISAC;

QBRTCMediaConfig.setAudioCodec(audioCodec)
var audioCodec: QBRTCMediaConfig.AudioCodec
audioCodec = QBRTCMediaConfig.AudioCodec.OPUS
audioCodec = QBRTCMediaConfig.AudioCodec.ISAC

QBRTCMediaConfig.setAudioCodec(audioCodec)

Bitrate

A bitrate is a video stream encoding parameter. It's possible to set the custom bitrate to provide guarantees for the predictable behavior of the video stream.

int startBitrate = 0;
//startBitrate = 2000;
QBRTCMediaConfig.setVideoStartBitrate(startBitrate);
var startBitrate: Int = 0
//startBitrate = 2000
QBRTCMediaConfig.setVideoStartBitrate(startBitrate)

Hardware acceleration

Enable hardware acceleration if the device supports it. Default: false.

boolean useHWAcceleration = true;
QBRTCMediaConfig.setVideoHWAcceleration(useHWAcceleration);
val useHWAcceleration = true
QBRTCMediaConfig.setVideoHWAcceleration(useHWAcceleration)

Frames per second

The fps is a video stream encoding parameter. It's possible to set the custom fps to provide guarantees for the predictable behavior of the video stream.

int fps = 30;
QBRTCMediaConfig.setVideoFps(fps);
val fps = 30
QBRTCMediaConfig.setVideoFps(fps)

Acoustic echo cancellation

Enable a built-in acoustic echo cancellation if the device supports it. Default: true.

boolean useAEC = true;
QBRTCMediaConfig.setUseBuildInAEC(useAEC);
val useAEC = true
QBRTCMediaConfig.setUseBuildInAEC(useAEC)

Open sound library for embedded systems

Enable open sound library for embedded systems (OpenSL ES audio) if the device supports it. Default: false.

boolean useOpenSLES = true;
QBRTCMediaConfig.setUseOpenSLES(useOpenSLES);
val useOpenSLES = true
QBRTCMediaConfig.setUseOpenSLES(useOpenSLES)

Audio processing

Enable audio processing if the device supports it. Default: true.

boolean useAudioProcessing = true;
QBRTCMediaConfig.setAudioProcessingEnabled(useAudioProcessing);
val useAudioProcessing = true
QBRTCMediaConfig.setAudioProcessingEnabled(useAudioProcessing)
