Setup

Learn how to add and configure QuickBlox SDK for your app.

Follow the instructions below to ensure that QuickBlox Android SDK runs smoothly with your app.

Visit our Key Concepts page to get an overall understanding of the most important QuickBlox concepts.

Get application credentials

A QuickBlox application includes everything that brings messaging right into your application - chat, video calling, users, push notifications, etc. To create a QuickBlox application, follow the steps below:

  1. Register a new account following this link. Type in your email and password to sign in. You can also sign in with your Google or GitHub account.
  2. Create the app by clicking the New app button.
  3. Configure the app. Type in the information about your organization into corresponding fields and click the Add button.
  4. Go to the Dashboard => YOUR_APP => Overview section and copy your Application ID, Authorization Key, Authorization Secret, and Account Key.

Requirements

The minimum requirements for QuickBlox Android SDK are:

  • Android 5.0 (API 21)
  • Android Studio
  • Gradle 4.6
  • Gradle plugin 2.5.1

Install QuickBlox SDK into your app

To connect QuickBlox SDK to your app, import the SDK dependencies via Gradle.
First, add a reference to the QuickBlox SDK repository to your project-level build.gradle file in the root directory. Gradle uses this repository URL to locate the SDK artifacts.

allprojects {
    repositories {
        google()
        jcenter()
        maven {
            url "https://github.com/QuickBlox/quickblox-android-sdk-releases/raw/master/"
        }
    }
}

Next, add the SDK modules you need as dependencies to your app-level build.gradle file. Gradle imports the corresponding artifacts from the QuickBlox repository.

dependencies {
    implementation "com.quickblox:quickblox-android-sdk-messages:4.0.3"
    implementation "com.quickblox:quickblox-android-sdk-chat:4.0.3"
    implementation "com.quickblox:quickblox-android-sdk-content:4.0.3"
    implementation "com.quickblox:quickblox-android-sdk-videochat-webrtc:4.0.3"
    implementation "com.quickblox:quickblox-android-sdk-conference:4.0.3"
    implementation "com.quickblox:quickblox-android-sdk-customobjects:4.0.3"
}

The dependencies above correspond to the following SDK modules:

  • messages - Push Notifications module. Enables working with push notifications and alerts to users.
  • chat - Chat module. Allows creating dialogs and sending messages into these dialogs.
  • content - Content module. Enables file storage and creating chat attachments for your app.
  • webrtc - Video Calling module. Adds video and audio calling features to your app.
  • conference - Video Conference module. Allows setting up a video conference between 10-12 people in your app.
  • customobjects - Custom Objects module. Provides flexibility to define any data structure you need for your app.

Add permissions

To use the QuickBlox SDK, you need to add the following permissions to your app manifest:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />

To use video calling in your app, make sure to add the following minimum required permissions to your app manifest:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />

To use chat functionality in your app and be able to send/receive files in the messages, make sure to add the following minimum required permissions to your app manifest:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

If you need to add any other functionality to your app, you can add the relevant permissions to your app manifest. See this section to learn how to declare permissions. See this section to learn how to request permissions.

📘

Note that mentioning the camera and microphone permissions in the manifest isn't always enough. You need to request camera and microphone permissions additionally at runtime.
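
For example, here is a minimal sketch of requesting the camera and microphone permissions at runtime from an Activity using the standard ContextCompat/ActivityCompat APIs; the requestCallPermissions() helper and the request code value are illustrative, not part of the SDK.

// A minimal sketch of requesting call permissions at runtime (Android 6.0+).
// The helper method and request code are illustrative; handle the user's choice in onRequestPermissionsResult().
private static final int CALL_PERMISSIONS_REQUEST_CODE = 100;

private void requestCallPermissions() {
    boolean cameraGranted = ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED;
    boolean microphoneGranted = ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            == PackageManager.PERMISSION_GRANTED;

    if (!cameraGranted || !microphoneGranted) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO},
                CALL_PERMISSIONS_REQUEST_CODE);
    }
}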

Initialize QuickBlox SDK

Initialize the framework with your application credentials. Pass Application ID, Authorization Key, Authorization Secret, and Account Key to the init() method.

🚧

You must initialize the SDK before calling any other SDK methods; only the init() method itself can be called first. If you attempt to call a method before initializing, an error is returned.

static final String APPLICATION_ID = "67895";
static final String AUTH_KEY = "lkjdueksu7392kj";
static final String AUTH_SECRET = "BTFsj7Rtt27DAmT";
static final String ACCOUNT_KEY = "9yvTe17TmjNPqDoYtfqp";

QBSettings.getInstance().init(getApplicationContext(), APPLICATION_ID, AUTH_KEY, AUTH_SECRET);
QBSettings.getInstance().setAccountKey(ACCOUNT_KEY);
private const val APPLICATION_ID = "67895"
private const val AUTH_KEY = "lkjdueksu7392kj"
private const val AUTH_SECRET = "BTFsj7Rtt27DAmT"
private const val ACCOUNT_KEY = "9yvTe17TmjNPqDoYtfqp"

QBSettings.getInstance().init(applicationContext, APPLICATION_ID, AUTH_KEY, AUTH_SECRET)
QBSettings.getInstance().accountKey = ACCOUNT_KEY
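
A common place for this initialization is the onCreate() method of a custom Application subclass, so the SDK is configured before any other component uses it. Below is a minimal sketch; the App class name is hypothetical and must be registered in AndroidManifest.xml via android:name, and the constants are the ones defined above.

// Hypothetical Application subclass; uses the credential constants defined above
public class App extends Application {

    @Override
    public void onCreate() {
        super.onCreate();
        // Initialize the SDK once at app startup, before any other QuickBlox call
        QBSettings.getInstance().init(getApplicationContext(), APPLICATION_ID, AUTH_KEY, AUTH_SECRET);
        QBSettings.getInstance().setAccountKey(ACCOUNT_KEY);
    }
}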

❗️

Security

It's not recommended to keep your authKey and authSecret inside the application in production. Instead, store them on your backend.

  • APPLICATION_ID (required) - Application identifier.
  • AUTH_KEY (required) - Authorization key.
  • AUTH_SECRET (required) - Authorization secret.
  • ACCOUNT_KEY (required) - Account key. Required to get actual Chat and API endpoints for the right server.

Initialize QuickBlox SDK without Authorization Key and Secret

You may not want to store authKey and authSecret inside the application for security reasons. In that case, you can initialize QuickBlox SDK with applicationId and accountKey only and keep your authKey and authSecret on your backend. If so, the authentication with QuickBlox should also be implemented on your backend.

static final String APPLICATION_ID = "67895";
static final String ACCOUNT_KEY = "9yvTe17TmjNPqDoYtfqp"; 

QBSDK.initWithAppId(getApplicationContext(), APPLICATION_ID, ACCOUNT_KEY);
private const val APPLICATION_ID = "67895"
private const val ACCOUNT_KEY = "9yvTe17TmjNPqDoYtfqp"

QBSDK.initWithAppId(applicationContext, APPLICATION_ID, ACCOUNT_KEY)

Then, using your backend, you can authorize a user in the QuickBlox system, send the user session token back to the app, and set it in the QuickBlox SDK using the QBAuth.startSessionWithToken() method. You can find out more about this in the Set existing session section.
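
For illustration, once your backend returns the session token, handing it over to the SDK could look like the sketch below; the token value is made up, and the synchronous perform() call is only one option (see the Set existing session section for the exact flow).

// Illustrative only: the session token is obtained from your own backend
String sessionToken = "31ed199120fc48d1f27dd2082a28551436b52215";

try {
    // Set the existing session instead of authenticating with authKey/authSecret
    QBAuth.startSessionWithToken(sessionToken).perform();
} catch (QBResponseException exception) {
    // handle exception
}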

Point SDK to the enterprise server

To point QuickBlox SDK to the QuickBlox enterprise server, you should pass the API_ENDPOINT and CHAT_ENDPOINT to the setEndpoints() method. You can call this method only after initializing the SDK.

private static final String API_ENDPOINT  = "https://apicustomdomain.quickblox.com";
private static final String CHAT_ENDPOINT = "chatcustomdomain.quickblox.com";
 
// use this method only if you need to set custom endpoints
QBSettings.getInstance().setEndpoints(API_ENDPOINT, CHAT_ENDPOINT, ServiceZone.PRODUCTION);
QBSettings.getInstance().setZone(ServiceZone.PRODUCTION);
private const val API_ENDPOINT = "https://apicustomdomain.quickblox.com"
private const val CHAT_ENDPOINT = "chatcustomdomain.quickblox.com"

// use this method only if you need to set custom endpoints
QBSettings.getInstance().setEndpoints(API_ENDPOINT, CHAT_ENDPOINT, ServiceZone.PRODUCTION)
QBSettings.getInstance().zone = ServiceZone.PRODUCTION

  • API_ENDPOINT (required) - API endpoint.
  • CHAT_ENDPOINT (required) - Chat endpoint.
  • serviceZone (required) - Connection service zone. The area where push notifications and messages can work.

📘

Contact our sales team to get API endpoint and chat endpoint.

Enable logging

Logging functionality allows you to keep track of all events and activities while running your app. As a result, you can monitor the operation of the SDK and improve debugging efficiency. There are 3 logging use cases:

  • Server API logging is used to monitor Server API calls.
  • Chat logging is used to monitor chat issues.
  • WebRTC logging is used to monitor issues with video calling.

Server API logging

Enable Server API calls debug console output using the code snippet below.

QBSettings.getInstance().setLogLevel(LogLevel.DEBUG);
QBSettings.getInstance().logLevel = LogLevel.DEBUG

  • LogLevel.NOTHING - Write nothing. Turns off logs.
  • LogLevel.DEBUG - Enable logs (default value).
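
A common pattern is to keep the Server API logs enabled only in debug builds, for example (a sketch assuming the standard BuildConfig.DEBUG flag):

// Enable Server API logs in debug builds only and silence them in release builds
if (BuildConfig.DEBUG) {
    QBSettings.getInstance().setLogLevel(LogLevel.DEBUG);
} else {
    QBSettings.getInstance().setLogLevel(LogLevel.NOTHING);
}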

Chat logging

Use the method below to enable detailed XMPP logging in the console output. true is enabled, false is disabled. Default: false.

QBChatService.setDebugEnabled(true);
QBChatService.setDebugEnabled(true)

WebRTC logging

To enable WebRTC logging, use the setDebugEnabled() method. true is enabled, false is disabled. Default: true.

QBRTCConfig.setDebugEnabled(true);
QBRTCConfig.setDebugEnabled(true)

Enable auto-reconnect to Chat

QuickBlox Chat runs over the XMPP protocol. To receive messages in real time, the application should be connected to the Chat over XMPP. To enable auto-reconnect to Chat, call the setReconnectionAllowed() method and pass true.

QBChatService.getInstance().setReconnectionAllowed(true);
QBChatService.getInstance().isReconnectionAllowed = true

📘

By default, the autoreconnection functionality is enabled. Set autoreconnection before calling the QBChatService.getInstance().login() method so that it applies to the current chat session.

Message carbons

Message carbons functionality allows for multi-device support: all user messages get copied to all of their devices so that each device keeps up with the current state of the conversation. For example, User A has the conversation open on both a phone and a desktop, while User B has it open on a desktop. When User B sends a message to User A, the message shows up on both the desktop and the phone of User A.

Enable message carbons

try {
   QBChatService.getInstance().enableCarbons();
} catch (XMPPException exception) {
   exception.printStackTrace();
} catch (SmackException exception) {
   exception.printStackTrace();
}
try {
   QBChatService.getInstance().enableCarbons()
} catch (exception: XMPPException) {
   exception.printStackTrace()
} catch (exception: SmackException) {
   exception.printStackTrace()
}

Disable message carbons

try {
   QBChatService.getInstance().disableCarbons();
} catch (XMPPException exception) {
   exception.printStackTrace();
} catch (SmackException exception) {
   exception.printStackTrace();
}
try {
   QBChatService.getInstance().disableCarbons()
} catch (exception: XMPPException) {
   exception.printStackTrace()
} catch (exception: SmackException) {
   exception.printStackTrace()
}

📘

By default, message carbons functionality is disabled.

📘

Since message carbons functionality works over XMPP connection, make sure to enable it after the QBChatService.getInstance().login() method is called.
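
For example, a minimal sketch of enabling message carbons only after a successful Chat login is shown below; qbUser stands for an already authorized QBUser instance and is illustrative.

// Enable carbons in the login callback so it runs over an established XMPP connection.
// qbUser is an illustrative, already authorized QBUser instance.
QBChatService.getInstance().login(qbUser, new QBEntityCallback<Void>() {
    @Override
    public void onSuccess(Void result, Bundle bundle) {
        try {
            QBChatService.getInstance().enableCarbons();
        } catch (XMPPException exception) {
            exception.printStackTrace();
        } catch (SmackException exception) {
            exception.printStackTrace();
        }
    }

    @Override
    public void onError(QBResponseException exception) {
        // handle login error
    }
});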

Stream management

Stream management has two important features, Stanza Acknowledgements and Stream Resumption:

  • Stanza Acknowledgements is the ability to know if a stanza or series of stanzas has been received by one's peer. In other words, a reply is requested on every sent message. If the reply is received, the message is considered as delivered.
  • Stream Resumption is the ability to quickly resume a stream that has been terminated. Once a connection is re-established, Stream Resumption is executed. By matching the sequence numbers assigned to each Stanza Acknowledgement, a server and a client can verify which messages are missing and request that they be resent.

Enable stream management using the setUseStreamManagement() method:

QBChatService.getInstance().setUseStreamManagement(true);
QBChatService.getInstance().setUseStreamManagement(true)

📘

By default, stream management functionality is disabled.

📘

You should enable Stream Management before calling login() because Stream Management is initialized while the Chat login is performed.

Stream Management defines an extension for the active management of a stream between a client and a server, including features for stanza acknowledgments.

Configure media settings

Use the QBRTCMediaConfig class to configure various media settings such as video/audio codecs, bitrate, FPS, etc.

QBRTCMediaConfig.setAudioCodec(QBRTCMediaConfig.AudioCodec.ISAC);
QBRTCMediaConfig.setAudioCodec(QBRTCMediaConfig.AudioCodec.OPUS);

QBRTCMediaConfig.setVideoCodec(QBRTCMediaConfig.VideoCodec.H264);
QBRTCMediaConfig.setVideoCodec(QBRTCMediaConfig.VideoCodec.VP8);
QBRTCMediaConfig.setVideoCodec(QBRTCMediaConfig.VideoCodec.VP9);

QBRTCMediaConfig.setAudioStartBitrate(audioStartBitrate);

QBRTCMediaConfig.setVideoStartBitrate(videoStartBitrate);

QBRTCMediaConfig.setVideoWidth(videoWidth);
QBRTCMediaConfig.setVideoHeight(videoHeight);

// Enable Hardware Acceleration if device supports it
QBRTCMediaConfig.setVideoHWAcceleration(true);

// Set frames-per-second in transmitting video stream
QBRTCMediaConfig.setVideoFps(videoFPS);

// Enable built-in AEC if device supports it
QBRTCMediaConfig.setUseBuildInAEC(true);

// Enable OpenSL ES audio if device supports it
QBRTCMediaConfig.setUseOpenSLES(true);

QBRTCMediaConfig.setAudioProcessingEnabled(true);
QBRTCMediaConfig.setAudioCodec(QBRTCMediaConfig.AudioCodec.ISAC)
QBRTCMediaConfig.setAudioCodec(QBRTCMediaConfig.AudioCodec.OPUS)

QBRTCMediaConfig.setVideoCodec(QBRTCMediaConfig.VideoCodec.H264)
QBRTCMediaConfig.setVideoCodec(QBRTCMediaConfig.VideoCodec.VP8)
QBRTCMediaConfig.setVideoCodec(QBRTCMediaConfig.VideoCodec.VP9)

QBRTCMediaConfig.setAudioStartBitrate(audioStartBitrate)

QBRTCMediaConfig.setVideoStartBitrate(videoStartBitrate)

QBRTCMediaConfig.setVideoWidth(videoWidth)
QBRTCMediaConfig.setVideoHeight(videoHeight)

// Enable Hardware Acceleration if device supports it
QBRTCMediaConfig.setVideoHWAcceleration(true)

// Set frames-per-second in transmitting video stream
QBRTCMediaConfig.setVideoFps(videoFPS)

// Enable built-in AEC if device supports it
QBRTCMediaConfig.setUseBuildInAEC(true)

// Enable OpenSL ES audio if device supports it
QBRTCMediaConfig.setUseOpenSLES(true)

QBRTCMediaConfig.setAudioProcessingEnabled(true)

📘

Make sure to set up media settings before initiating a call.

Synchronous and Asynchronous performers

You can execute requests synchronously using perform() or asynchronously using performAsync(). Note that the QBUsers.getUser() method is used here only as an example; you can use these performers in all cases where they are available. With the synchronous approach, you should handle QBResponseException.

try {
    QBUsers.getUser(user.getId()).perform();
} catch (QBResponseException exception) {
      // handling exception
}
try {
    QBUsers.getUser(user.id).perform()
} catch (exception: QBResponseException) {
      // handling exception
}

With the asynchronous approach, you can implement further logic based on the server response.

QBUsers.getUser(user.getId()).performAsync(new QBEntityCallback<QBUser>() {
    @Override
    public void onSuccess(QBUser user, Bundle bundle) {

    }

    @Override
    public void onError(QBResponseException exception) {

    }
});
QBUsers.getUser(user.id).performAsync(object : QBEntityCallback<QBUser> {
    override fun onSuccess(user: QBUser?, bundle: Bundle?) {

    }

    override fun onError(exception: QBResponseException?) {

    }
})

Custom ICE servers

You can customize the list of ICE servers. By default, the WebRTC module uses internal ICE servers that are usually enough, but you can always set your own. The WebRTC engine chooses the TURN relay with the lowest round-trip time, so setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

// Set custom ICE servers up. Use it in case you want to set YOUR OWN servers instead of the defaults

List<PeerConnection.IceServer> iceServerList = new LinkedList<>();

iceServerList.add(new PeerConnection.IceServer("stun:stun.randomserver.example","", ""));
iceServerList.add(new PeerConnection.IceServer("stun:stun.randomserver.example","stun_login", "hdccn97ba2d56d72i426eb9875bya7yte8"));
iceServerList.add(new PeerConnection.IceServer("turn:turn.randomserver.example:6789?transport=udp","turn_login", "78kkb67m2f45h1e27ub9886gt70109"));
iceServerList.add(new PeerConnection.IceServer("turn:turn.randomserver.example:6789?transport=tcp","turn_login", "78kkb67m2f45h1e27ub9886gt70109"))
        
QBRTCConfig.setIceServerList(iceServerList);
// Set custom ICE servers up. Use it in case you want to set YOUR OWN servers instead of the defaults

val iceServerList = LinkedList<PeerConnection.IceServer>()

iceServerList.add(PeerConnection.IceServer("stun:stun.randomserver.example", "", ""))
iceServerList.add(PeerConnection.IceServer("stun:turn.randomserver.example", "stun_login", "hdccn97ba2d56d72i426eb9875bya7yte8"))
iceServerList.add(PeerConnection.IceServer("turn:turn.randomserver.example:6789?transport=udp", "turn_login", "78kkb67m2f45h1e27ub9886gt70109"))
iceServerList.add(PeerConnection.IceServer("turn:turn.randomserver.example:6789?transport=tcp", "turn_login", "78kkb67m2f45h1e27ub9886gt70109"))

QBRTCConfig.setIceServerList(iceServerList)

What’s Next