ML Kit is a mobile SDK that brings Google's machine learning expertise to Android and iOS apps in a powerful yet easy-to-use package. Whether you're new to machine learning or an experienced practitioner, you can implement the functionality you need in just a few lines of code. To get started with the built-in models, follow our Detect and label objects in images with ML Kit for Firebase codelab -- there's no need for deep knowledge of neural networks or model optimization. If you are an experienced ML developer, complete this codelab to learn how ML Kit makes it easy to use custom TensorFlow Lite models in your mobile apps.

How does it work?

ML Kit makes it easy to apply ML techniques in your apps by bringing Google's ML technologies, such as the Google Cloud Vision API, Mobile Vision, and TensorFlow Lite, together in a single SDK. Whether you need the power of cloud-based processing, the real-time capabilities of Mobile Vision's on-device models, or the flexibility of custom TensorFlow Lite models, ML Kit makes it possible with just a few lines of code.

This codelab will walk you through creating an Android app that can automatically detect and label images using a custom TensorFlow Lite machine learning model with ML Kit for Firebase.

What you will build

In this codelab, you're going to build an Android app with ML Kit for Firebase. You will:

  • Learn how to host your custom pre-trained TensorFlow Lite model using Firebase
  • Use the ML Kit Custom Model API to download the pre-trained TensorFlow Lite model to your app
  • Use the downloaded model to run inference and label images

What you'll need

  • A recent version of Android Studio
  • An Android device or emulator
  • The sample code (downloaded in the next step)

This codelab is focused on ML Kit. Non-relevant concepts and code blocks are glossed over and provided for you to simply copy and paste.

Download the Code

Click the following link to download all the code for this codelab:

Download source code

Unpack the downloaded zip file. This will unpack a root folder (mlkit-android) with all of the resources you will need. For this codelab, you will only need the resources in the custom-model subdirectory.

The custom-model subdirectory in the mlkit-android folder contains two directories, one for each of two Android Studio projects:

  • starter -- the starting code that you will build on in this codelab
  • final -- the completed code for the finished sample app

Download the TensorFlow Lite model

Click the following link to download the pre-trained TensorFlow Lite model we will be using in this codelab:

Download model

Unpack the downloaded zip file. This will unpack a root folder (mobilenet_v1_1.0_224_quant) inside which you will find the TensorFlow Lite custom model we will use in this codelab (mobilenet_v1_1.0_224_quant.tflite).

Create a Firebase console project

  1. Go to the Firebase console.
  2. Click Add project. In the prompt, name your project "ML Kit Custom Model Codelab" and click Create Project.

Connect your Android app to your Firebase project

  1. From the overview screen of your new Firebase project, click Add Firebase to your Android app.
  2. Enter the codelab's package name: com.google.firebase.codelab.mlkit_custommodel.

Add google-services.json file to your app

After adding the package name and selecting Continue, your browser automatically downloads a configuration file that contains all the necessary Firebase metadata for your Android app. Copy the google-services.json file into the .../custom-model/starter/app directory in your project.

Add the dependencies for ML Kit and the google-services plugin to your app

The google-services plugin uses the google-services.json file to configure your application to use Firebase, and the ML Kit dependencies allow you to integrate the ML Kit SDK in your app. The following lines should already be present at the end of the build.gradle file in the app directory of your project (check to confirm):

dependencies {
  // ...
  implementation 'com.google.firebase:firebase-ml-model-interpreter:15.0.0'
}
apply plugin: 'com.google.gms.google-services'

Sync your project with Gradle files

To be sure that all dependencies are available to your app, sync your project with the Gradle files at this point: select Sync Project with Gradle Files from the Android Studio toolbar.

Now that you have imported the project into Android Studio, configured the google-services plugin with your JSON file, and added the dependencies for ML Kit, you are ready to run the app for the first time. Connect your Android device or start an emulator, and click Run in the Android Studio toolbar.

The app should launch on your device or emulator. At this point, you should see a basic layout with a drop-down field that allows you to select between several images. In the next section, you will add image labeling to your app to identify the objects in the images.

The pre-trained TensorFlow Lite model we will be using in our app is the MobileNet_v1 model, which has been designed for low-latency, low-power environments and offers a good compromise between model size and accuracy. In this step, we will host this model with Firebase by uploading it to our Firebase project. This lets apps that use the ML Kit SDK automatically download the model to devices, and makes it easy to manage model versions in the Firebase console.

Host the custom model with Firebase

  1. Go to the Firebase console.
  2. Select your project.
  3. Select ML Kit under the DEVELOP section in the left-hand navigation.
  4. Click on the CUSTOM tab.
  5. Click on Add another model and use "mobilenet_v1_224_quant" as the name. This is the name we will later use to download our custom model in our Android code.
  6. In the TensorFlow Lite model section, click BROWSE and upload the mobilenet_v1_1.0_224_quant.tflite file you downloaded earlier.
  7. Click PUBLISH.

We are now ready to modify our Android code to use this hosted model.

Download the custom model from Firebase

Now that we have hosted a pre-trained custom model by uploading it to our Firebase Project, we will modify our app code to automatically download and use this model.

Add the following fields to the top of the MainActivity class to define the model interpreter and the model's input/output configuration.

MainActivity.java

    /**
     * An instance of the driver class to run model inference with Firebase.
     */
    private FirebaseModelInterpreter mInterpreter;
    /**
     * Data configuration of input & output data of model.
     */
    private FirebaseModelInputOutputOptions mDataOptions;
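
The snippets in this and the following steps also reference a handful of constants that the starter project defines for you, so you don't need to add them yourself. For reference, here is a plausible set of definitions; the names match what the later snippets use, and the values follow from the quantized MobileNet_v1 model (one 224x224 RGB image per inference) and the model name we published in the Firebase console:

    // Name under which the model is published in the Firebase console (see the hosting step).
    private static final String HOSTED_MODEL_NAME = "mobilenet_v1_224_quant";
    // File name of the model bundled in app/src/main/assets for offline use.
    private static final String LOCAL_MODEL_ASSET = "mobilenet_v1_1.0_224_quant.tflite";
    // MobileNet_v1 takes a single 224x224 RGB image per inference.
    private static final int DIM_BATCH_SIZE = 1;
    private static final int DIM_IMG_SIZE_X = 224;
    private static final int DIM_IMG_SIZE_Y = 224;
    private static final int DIM_PIXEL_SIZE = 3; // R, G, B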

Then add the following code to the end of the onCreate method of the MainActivity class.

MainActivity.java

        int[] inputDims = {DIM_BATCH_SIZE, DIM_IMG_SIZE_X, DIM_IMG_SIZE_Y, DIM_PIXEL_SIZE};
        int[] outputDims = {DIM_BATCH_SIZE, mLabelList.size()};
        try {
            mDataOptions =
                    new FirebaseModelInputOutputOptions.Builder()
                            .setInputFormat(0, FirebaseModelDataType.BYTE, inputDims)
                            .setOutputFormat(0, FirebaseModelDataType.BYTE, outputDims)
                            .build();
            FirebaseModelDownloadConditions conditions =
                    new FirebaseModelDownloadConditions.Builder()
                            .requireWifi()
                            .build();
            FirebaseLocalModelSource localModelSource =
                    new FirebaseLocalModelSource.Builder("asset")
                            .setAssetFilePath(LOCAL_MODEL_ASSET)
                            .build();

            FirebaseCloudModelSource cloudSource =
                    new FirebaseCloudModelSource.Builder(HOSTED_MODEL_NAME)
                            .enableModelUpdates(true)
                            .setInitialDownloadConditions(conditions)
                            // You could also specify different conditions for updates.
                            .setUpdatesDownloadConditions(conditions)
                            .build();
            FirebaseModelManager manager = FirebaseModelManager.getInstance();
            manager.registerLocalModelSource(localModelSource);
            manager.registerCloudModelSource(cloudSource);
            FirebaseModelOptions modelOptions =
                    new FirebaseModelOptions.Builder()
                            .setCloudModelName(HOSTED_MODEL_NAME)
                            .setLocalModelName("asset")
                            .build();
            mInterpreter = FirebaseModelInterpreter.getInstance(modelOptions);
        } catch (FirebaseMLException e) {
            showToast("Error while setting up the model");
            e.printStackTrace();
        }

Note how we use FirebaseModelInputOutputOptions in the code to specify the input our custom model expects and the output it generates. For the MobileNet_v1 model, the input is a single 224x224 pixel image and the output is a one-dimensional array with one confidence value per label. We then set up the conditions under which our custom model should be downloaded to the device and register both the local and cloud model sources with FirebaseModelManager.
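
The outputDims above also depend on mLabelList, the list of class names that the starter project loads from an asset file. If you're curious how such a list can be built, here is a minimal sketch; the labels.txt file name is an assumption, so check the starter project for the actual asset name:

    /** Reads one label per line from an asset file. (Sketch; "labels.txt" is assumed.) */
    private List<String> loadLabelList(Activity activity) throws IOException {
        List<String> labelList = new ArrayList<>();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(activity.getAssets().open("labels.txt")));
        String line;
        while ((line = reader.readLine()) != null) {
            labelList.add(line);
        }
        reader.close();
        return labelList;
    }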

Bundle a local version of the model for offline scenarios

Hosting a model with Firebase allows you to make updates to the model and have those updates automatically downloaded to your users. However, in situations with poor internet connectivity, you may also want to bundle a local version of your model. By both hosting the model on Firebase and bundling it locally, you ensure that the most recent version of the model is used when network connectivity is available, and that your app's ML features still work when the Firebase-hosted model isn't available.

We have already added the code to do this in the previous code snippet: we created a FirebaseLocalModelSource and called registerLocalModelSource to register it with our FirebaseModelManager. All you have to do is add the mobilenet_v1_1.0_224_quant.tflite file you downloaded earlier to the assets folder in your project.
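
If you want to confirm that the asset was bundled correctly before running the app, an optional sanity check (not part of the codelab) could look like this:

        // Optional: verify that the bundled model can be opened from assets.
        try {
            getAssets().open(LOCAL_MODEL_ASSET).close();
        } catch (IOException e) {
            Log.e(TAG, "Local model asset not found in app/src/main/assets", e);
        }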

In this step, we will define a function that uses the FirebaseModelInterpreter we configured in the previous step to run inference using the downloaded or local custom model.

Add code to use the downloaded/local model in your app

Copy the following method into the MainActivity class.

MainActivity.java

    private void runModelInference() {
        if (mInterpreter == null) {
            Log.e(TAG, "Image classifier has not been initialized; Skipped.");
            return;
        }
        // Create input data.
        ByteBuffer imgData = convertBitmapToByteBuffer(mSelectedImage, mSelectedImage.getWidth(),
                mSelectedImage.getHeight());

        try {
            FirebaseModelInputs inputs = new FirebaseModelInputs.Builder().add(imgData).build();
            // Here's where the magic happens!!
            mInterpreter
                    .run(inputs, mDataOptions)
                    .continueWith(
                            new Continuation<FirebaseModelOutputs, List<String>>() {
                                @Override
                                public List<String> then(Task<FirebaseModelOutputs> task) {
                                    byte[][] labelProbArray = task.getResult()
                                            .<byte[][]>getOutput(0);
                                    List<String> topLabels = getTopLabels(labelProbArray);
                                    mGraphicOverlay.clear();
                                    GraphicOverlay.Graphic labelGraphic = new LabelGraphic
                                            (mGraphicOverlay, topLabels);
                                    mGraphicOverlay.add(labelGraphic);
                                    return topLabels;
                                }
                            });
        } catch (FirebaseMLException e) {
            e.printStackTrace();
            showToast("Error running model inference");
        }

    }
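
The method above relies on convertBitmapToByteBuffer, a helper that the starter project already provides. For reference, here is a minimal sketch of what such a helper does for a quantized model; the starter project's actual implementation may differ in details:

    /** Converts a Bitmap into the byte buffer layout the quantized model expects. */
    private ByteBuffer convertBitmapToByteBuffer(Bitmap bitmap, int width, int height) {
        ByteBuffer imgData = ByteBuffer.allocateDirect(
                DIM_BATCH_SIZE * DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y * DIM_PIXEL_SIZE);
        imgData.order(ByteOrder.nativeOrder());
        // Scale the source image (its width/height may vary) to the model's 224x224 input.
        Bitmap scaled = Bitmap.createScaledBitmap(bitmap, DIM_IMG_SIZE_X, DIM_IMG_SIZE_Y, true);
        int[] pixels = new int[DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y];
        scaled.getPixels(pixels, 0, scaled.getWidth(), 0, 0, scaled.getWidth(), scaled.getHeight());
        for (int pixel : pixels) {
            // Quantized model: one unsigned byte per RGB channel, no normalization.
            imgData.put((byte) ((pixel >> 16) & 0xFF)); // R
            imgData.put((byte) ((pixel >> 8) & 0xFF));  // G
            imgData.put((byte) (pixel & 0xFF));         // B
        }
        return imgData;
    }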

ML Kit handles downloading and running the model automatically (or using the local bundled version if the hosted model can't be downloaded), and provides the results with task.getResult(). We then sort and display these results in our app UI.
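
getTopLabels is likewise provided by the starter project. Conceptually, it converts the quantized byte outputs back into 0-1 confidences and keeps only the highest-scoring labels. Here is a hedged sketch; RESULTS_TO_SHOW is an assumed constant for how many labels to display:

    /** Converts quantized outputs to confidences and returns the top labels. */
    private synchronized List<String> getTopLabels(byte[][] labelProbArray) {
        // Min-heap ordered by confidence; we evict the smallest to keep the top K.
        PriorityQueue<Map.Entry<String, Float>> sortedLabels = new PriorityQueue<>(
                RESULTS_TO_SHOW,
                new Comparator<Map.Entry<String, Float>>() {
                    @Override
                    public int compare(Map.Entry<String, Float> o1,
                                       Map.Entry<String, Float> o2) {
                        return o1.getValue().compareTo(o2.getValue());
                    }
                });
        for (int i = 0; i < mLabelList.size(); i++) {
            // Interpret the quantized byte as unsigned and scale it to 0..1.
            sortedLabels.add(new AbstractMap.SimpleEntry<>(
                    mLabelList.get(i), (labelProbArray[0][i] & 0xff) / 255.0f));
            if (sortedLabels.size() > RESULTS_TO_SHOW) {
                sortedLabels.poll();
            }
        }
        List<String> topLabels = new ArrayList<>();
        while (!sortedLabels.isEmpty()) {
            Map.Entry<String, Float> label = sortedLabels.poll();
            topLabels.add(label.getKey() + ": " + label.getValue());
        }
        return topLabels;
    }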

Run the app on the emulator

Now click Run in the Android Studio toolbar. Once the app loads, make sure that Image 1 is selected in the drop-down field and click the RUN MODEL button.

Your app should now display the model inference results: the labels detected in the image, each with its confidence level.

You have used ML Kit for Firebase to easily add advanced machine learning capabilities to your app.

What we've covered

  • Hosting a custom pre-trained TensorFlow Lite model with Firebase
  • Using the ML Kit Custom Model API to download the hosted model to an Android app, with a bundled local model as a fallback
  • Using the model to run inference and label images