Android Sample

Two samples are provided in the Android version of the Cochl.Sense SDK: sense-file and sense-stream.

  • sense-file performs a prediction on an audio file (wav or mp3).

  • sense-stream performs a prediction on an audio stream (e.g. microphone input).

1. Check the Requirements


Follow our Getting started section to set up the environment.

Android Requirements

The Sense SDK for Android supports Android API level 26 (Android 8.0 “Oreo”) or later.
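
In Gradle terms, this means the app module's minimum SDK version must be at least 26. A sketch of the relevant defaultConfig block (other values in your build.gradle may differ):

android {
    defaultConfig {
        minSdkVersion 26  // Android 8.0 "Oreo" or later is required by the Sense SDK
    }
}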

2. Prepare the Sample


The samples can be found in the following repository:

git clone https://github.com/cochlearai/sense-sdk-android-tutorials.git

Unzip the SDK into the corresponding sample's app directory:

# sense-file
unzip path/to/sdk/sense-sdk-<version>-android.zip \
  -d path/to/sample/sense-sdk-android-tutorials/sense-file/app

# sense-stream
unzip path/to/sdk/sense-sdk-<version>-android.zip \
  -d path/to/sample/sense-sdk-android-tutorials/sense-stream/app

Android Studio Setup

Edit the app/build.gradle file to add the SDK library as a dependency:

dependencies {
    implementation files('libs/sense-sdk-v<version>.aar')  // Cochl.Sense SDK
}

Edit the app/src/main/AndroidManifest.xml file to declare the required permissions:

<uses-permission android:name="android.permission.INTERNET"/>

<!-- sense-stream -->
<uses-permission android:name="android.permission.RECORD_AUDIO"/>

<!-- sense-file -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>

3. How to use Sense SDK Android


Sense SDK Initialization

You can initialize the Sense SDK by calling init(projectKey, senseParams) on the ai.cochl.sensesdk.Sense class. Initialization may take a while if a model download is required: the Sense SDK downloads a model when no model is found on the device or when a newer version of the model is available.

Parameters

Refer to the C++ sample page for more information about these parameters.

public static class Parameters {
    /// The name that will be shown on your Cochl dashboard for this device.
    public String deviceName = "";

    public ModelDelegate modelDelegate = ModelDelegate.Default;
    /// Number of threads available to the TFLite interpreter.
    /// NOTE: numThreads must be >= 0.
    /// Pass 0 to let the Sense SDK use all available processors.
    public int numThreads = 0;

    public Metrics metrics = new Metrics();

    /// Specify the desired log level for this device:
    /// 0: Debug
    /// 1: Information
    /// 2: Warning
    /// 3: Error
    /// The default log level is 1 (Information).
    public int logLevel = 1;

    /// Path to the directory on the device that holds
    /// the Hexagon NN shared libraries.
    /// Must be set if you use the Hexagon model delegate.
    public String hexagonSharedLibsFolderPath = "";

    /// New features
    public HopSizeControl hopSizeControl = new HopSizeControl();
    public SensitivityControl sensitivityControl = new SensitivityControl();
    public ResultAbbreviation resultAbbreviation = new ResultAbbreviation();
    public LabelHiding labelHiding = new LabelHiding();
}

Possible ModelDelegate options

enum ModelDelegate {
    Default(0), Hexagon(1), NnApi(2), Gpu(3);

    public final int value;

    ModelDelegate(int value) {
        this.value = value;
    }
}

Open sense-file/app/src/main/java/ai/cochl/examples/MainActivity.java in the sample directory, enter your project key, and set the parameters:

import ai.cochl.sensesdk.Sense;

public class MainActivity extends AppCompatActivity {
    private final String projectKey = "Your project key";
    private Sense sense = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        sense = Sense.getInstance();

        Sense.Parameters senseParams = new Sense.Parameters();
        senseParams.metrics.retentionPeriod = 0;  // days
        senseParams.metrics.freeDiskSpace = 100;  // MB
        senseParams.metrics.pushPeriod = 30;      // seconds

        senseParams.deviceName = "Android device.";

        senseParams.logLevel = 0;

        senseParams.hopSizeControl.enable = true;
        senseParams.sensitivityControl.enable = true;
        senseParams.resultAbbreviation.enable = true;
        senseParams.labelHiding.enable = false;  // stream mode only

        sense.init(projectKey, senseParams);
    }
}

Open sense-stream/app/src/main/java/ai/cochl/examples/MainActivity.java in the sample directory, enter your project key, and set the parameters:

import ai.cochl.sensesdk.Sense;

public class MainActivity extends AppCompatActivity {
    private final String projectKey = "Your project key";
    private Sense sense = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        sense = Sense.getInstance();

        Sense.Parameters senseParams = new Sense.Parameters();
        senseParams.metrics.retentionPeriod = 0;  // days
        senseParams.metrics.freeDiskSpace = 100;  // MB
        senseParams.metrics.pushPeriod = 30;      // seconds

        senseParams.deviceName = "Android device.";

        senseParams.logLevel = 0;

        senseParams.hopSizeControl.enable = true;
        senseParams.sensitivityControl.enable = true;
        senseParams.resultAbbreviation.enable = true;
        senseParams.labelHiding.enable = true;

        sense.init(projectKey, senseParams);
    }
}

Audio Input

The Sense SDK receives audio data and returns the list of detected sound tags in JSON format. It supports two types of audio input: stream and file.

Read a file and pass it to the Sense SDK.

import java.io.File;

(...)

// Storage which contains the audio file
File sdcard;
if (android.os.Build.VERSION.SDK_INT < 29)
    sdcard = Environment.getExternalStorageDirectory();
else  // android.os.Build.VERSION.SDK_INT >= 29
    sdcard = this.getExternalFilesDir(null);

// Create a File object and add it as an input to the Sense
sense.addInput(new File(sdcard, "some_audio_file.wav"));

android.media.AudioRecord is used to receive an audio stream from the device.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

(...)

private final int AUDIO_SOURCE = MediaRecorder.AudioSource.UNPROCESSED;
private final int SAMPLE_RATE = 22050;
private final int CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_MONO;
// Supported audio formats: PCM_8BIT, PCM_16BIT, PCM_FLOAT;
private final int AUDIO_FORMAT = AudioFormat.ENCODING_PCM_FLOAT;
private final int RECORD_BUF_SIZE = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                                                                 CHANNEL_CONFIG,
                                                                 AUDIO_FORMAT);

// Create an AudioRecord object and add it as an input to the Sense
sense.addInput(new AudioRecord(AUDIO_SOURCE,
                               SAMPLE_RATE,
                               CHANNEL_CONFIG,
                               AUDIO_FORMAT,
                               RECORD_BUF_SIZE));

4. Predict on Device


NOTE

The sample rate of the input audio must be 22,050 Hz or higher; audio with a lower sample rate cannot be used. If the sample rate is higher than 22,050 Hz, the Sense SDK will downsample the audio internally.

Do NOT change the SAMPLE_RATE parameter for the audio stream prediction.

private static final int SAMPLE_RATE = 22050;
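
If you are not sure whether an audio file meets the minimum sample rate, you can inspect it before adding it as an input. A small sketch using MediaExtractor (this helper is not part of the samples):

import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.File;
import java.io.IOException;

(...)

// Returns true if the first track's sample rate is at least 22,050 Hz
private boolean hasSupportedSampleRate(File audioFile) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    try {
        extractor.setDataSource(audioFile.getAbsolutePath());
        MediaFormat format = extractor.getTrackFormat(0);
        return format.getInteger(MediaFormat.KEY_SAMPLE_RATE) >= 22050;
    } finally {
        extractor.release();
    }
}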

JSON result format

{
  "tags": [
    {
      "name"        : <string>,  // name of the predicted tag (e.g. "Siren")
      "probability" : <float>    // probability of the predicted tag
    }
  ],
  "start_time"      : <float>,   // starting time of this prediction window
  "end_time"        : <float>,   // ending time of this prediction window
  "prediction_time" : <double>   // time it took to process this window
                                 // (in milliseconds)
}
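
The fields of each result can be read with the standard org.json classes. A minimal sketch, assuming result is an object that follows the format above (in the callbacks below it is obtained via json.getJSONObject("result") for files or json.getJSONObject("frame_result") for streams):

import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

(...)

// Print the name and probability of every tag in one prediction window
void printTags(JSONObject result) throws JSONException {
    JSONArray tags = result.getJSONArray("tags");
    for (int i = 0; i < tags.length(); ++i) {
        JSONObject tag = tags.getJSONObject(i);
        System.out.println(tag.getString("name") + " : "
                + tag.getDouble("probability"));
    }
}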

LIMITATION: The audio file must be at least 1 second long.

The complete flow for a prediction on an audio file:

import ai.cochl.sensesdk.Sense;
import ai.cochl.sensesdk.CochlException;

(...)

private final String projectKey = "Your project key";

// Get the Sense singleton class
sense = Sense.getInstance();

// Set the project key and parameters
Sense.Parameters senseParams = new Sense.Parameters();
senseParams.metrics.retentionPeriod = 0;  // days
senseParams.metrics.freeDiskSpace = 100;  // MB
senseParams.metrics.pushPeriod = 30;      // seconds

senseParams.deviceName = "Android device.";

senseParams.logLevel = 0;

senseParams.hopSizeControl.enable = true;
senseParams.sensitivityControl.enable = true;
senseParams.resultAbbreviation.enable = true;
senseParams.labelHiding.enable = false;  // stream mode only

sense.init(projectKey, senseParams);

// Storage which contains the audio file
File sdcard;
if (android.os.Build.VERSION.SDK_INT < 29) {
    sdcard = Environment.getExternalStorageDirectory();
} else {  /* android.os.Build.VERSION.SDK_INT >= 29 */
    sdcard = this.getExternalFilesDir(null);
}

// Create a File object and add it as an input to the Sense
sense.addInput(new File(sdcard, "some_audio_file.wav"));

// Set the callback function that is called after the prediction is done
sense.predict(new Sense.OnPredictListener() {
    @Override
    public void onReceivedResult(JSONObject json) {
        try {
            String result = json.getJSONObject("result").toString(2);
        } catch (JSONException e) {
            e.printStackTrace();
        } finally {
            runOnUiThread(() -> progressBar.setStop());
        }
    }

    @Override
    public void onError(CochlException e) {
        runOnUiThread(() -> {
            sense.stopPredict();
        });
    }
});

The equivalent flow for an audio stream is shown below. The input is an AudioRecord object instead of a File, labelHiding can be enabled, and the result key is "frame_result" instead of "result".

import ai.cochl.sensesdk.Sense;
import ai.cochl.sensesdk.CochlException;

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

(...)

private final String projectKey = "Your project key";

// Set the parameters for the AudioRecord object
private final int AUDIO_SOURCE = MediaRecorder.AudioSource.UNPROCESSED;
private final int SAMPLE_RATE = 22050;
private final int CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_MONO;
// Supported audio formats: PCM_8BIT, PCM_16BIT, PCM_FLOAT;
private final int AUDIO_FORMAT = AudioFormat.ENCODING_PCM_FLOAT;
private final int RECORD_BUF_SIZE = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                                                                 CHANNEL_CONFIG,
                                                                 AUDIO_FORMAT);

(...)

// Get the Sense singleton class
sense = Sense.getInstance();

// Set the project key and parameters
Sense.Parameters senseParams = new Sense.Parameters();
senseParams.metrics.retentionPeriod = 0;  // days
senseParams.metrics.freeDiskSpace = 100;  // MB
senseParams.metrics.pushPeriod = 30;      // seconds

senseParams.deviceName = "Android device.";

senseParams.logLevel = 0;

senseParams.hopSizeControl.enable = true;
senseParams.sensitivityControl.enable = true;
senseParams.resultAbbreviation.enable = true;
senseParams.labelHiding.enable = true;

sense.init(projectKey, senseParams);

// Create an AudioRecord object and add it as an input to the Sense
sense.addInput(new AudioRecord(AUDIO_SOURCE,
                               SAMPLE_RATE,
                               CHANNEL_CONFIG,
                               AUDIO_FORMAT,
                               RECORD_BUF_SIZE));

// Set the callback function that is called after the prediction is done
sense.predict(new Sense.OnPredictListener() {
    @Override
    public void onReceivedResult(JSONObject json) {
        try {
            String frame_result = json.getJSONObject("frame_result").toString(2);
        } catch (JSONException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onError(CochlException e) {
        runOnUiThread(() -> {
            sense.stopPredict();
        });
    }
});

5. Pause & Resume (Stream only)


While processing a stream, you can pause and resume the inference as needed.

Pause & Resume

pauseBtn = findViewById(R.id.pauseBtn);
pauseBtn.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        sense.pause();
    }
});

resumeBtn = findViewById(R.id.resumeBtn);
resumeBtn.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        sense.resume();
    }
});

6. Stop Prediction


If an unexpected error occurs during a prediction, you can abort it by calling the stopPredict() method. This method should also be called before changing the audio input of the Sense.

sense.stopPredict();
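
For example, to switch to a different audio input (a sketch; the file name and listener are placeholders):

// Stop the running prediction before attaching a new input
sense.stopPredict();

// Attach the new input and start predicting again
sense.addInput(new File(sdcard, "another_audio_file.wav"));
sense.predict(onPredictListener);  // an OnPredictListener like the ones shown above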

7. Terminate


The Sense SDK allocates multiple resources during its initialization. To ensure all of those resources are released safely, call the terminate() method before exiting.

@Override
protected void onDestroy() {
    if (sense != null) {
        sense.terminate();
        sense = null;
    }
    super.onDestroy();
}

(Note) Reference


Sense

java.lang.Object
    ai.cochl.sensesdk.Sense;

Sense is available as a singleton class and works with a project key. After successful initialization, you can add an audio input to the Sense instance and run predictions on it. A compact lifecycle sketch follows the method list below.

Public Methods

public static Sense getInstance()

  • Return an instance of the Sense singleton class.

public static String getSdkVersion()

  • Return the Sense SDK version.

public void init(String projectKey, Parameters senseParams)

  • Authenticate the user with the project key (throws an exception if the authentication process fails).

public void addInput(AudioRecord audioRecord)

  • Add an AudioRecord object to perform a prediction on an audio stream.

public void addInput(File file)

  • Add a File object to perform a prediction on an audio file.

public void predict(Sense.OnPredictListener listener)

  • Start the prediction with a callback function that is called after each audio frame has been processed.

public void pause()

  • Pause the prediction for the audio stream.

public void resume()

  • Resume the prediction for the audio stream.

public void stopPredict()

  • Stop the prediction, allowing you to add a new audio input.

public Parameters getParameters()

  • Return the parameters set during initialization.

public void terminate()

  • Release the resources allocated by the Sense SDK after a successful initialization.
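
Putting these methods together, a compact lifecycle for a file prediction might look like this (a sketch only; sdcard and the file name are placeholders, and error handling is omitted):

Sense sense = Sense.getInstance();

Sense.Parameters senseParams = new Sense.Parameters();
sense.init("Your project key", senseParams);

sense.addInput(new File(sdcard, "some_audio_file.wav"));
sense.predict(new Sense.OnPredictListener() {
    @Override
    public void onReceivedResult(JSONObject json) { /* handle the JSON result */ }

    @Override
    public void onError(CochlException e) { sense.stopPredict(); }
});

// ...when the prediction is no longer needed
sense.stopPredict();
sense.terminate();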

CochlException

java.lang.Object
    java.lang.Throwable
        java.lang.Exception
            java.lang.RuntimeException
                ai.cochl.sensesdk.CochlException

A custom exception that may be thrown by the Sense SDK; a short handling example follows the constructor list.

Public Constructors

Available constructors for the CochlException class.

public CochlException()

public CochlException(String message)

public CochlException(String format, Object... args)

public CochlException(String message, Throwable throwable)

public CochlException(Throwable throwable)
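
Because CochlException extends RuntimeException, it does not have to be declared, but it can be caught around any SDK call that may fail. A minimal sketch around initialization (the log tag is a placeholder):

import android.util.Log;

import ai.cochl.sensesdk.CochlException;

(...)

try {
    sense.init(projectKey, senseParams);
} catch (CochlException e) {
    Log.e("SenseSample", "Sense SDK initialization failed: " + e.getMessage());
}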