In this codelab, we will learn how to create a camera app that uses CameraX to show a viewfinder, take photos, and analyze an image stream from the camera.

To achieve this we will introduce the concept of use cases in CameraX, which can be used for a variety of camera operations from displaying a viewfinder to analyzing frames in real time.

What we'll learn

Hardware we'll need

Software we'll need

Using the Android Studio menu, start a new project and select Empty Activity when prompted.

Next, we can pick any name that we want -- we ingeniously chose "CameraX App". We should make sure that the language is set to Kotlin, the minimum API level is 21 (which is the minimum required for CameraX) and that we use AndroidX artifacts.

To get started, let's add the CameraX dependencies to our app Gradle file, inside the dependencies section:

def camerax_version = "1.0.0-alpha01"
implementation "androidx.camera:camera-core:${camerax_version}"
implementation "androidx.camera:camera-camera2:${camerax_version}"

When prompted, click Sync Now, and we will be ready to use CameraX in our app.

We will be using a SurfaceTexture to display the camera viewfinder. In this codelab, we will display the viewfinder in a square format of fixed size. For a more comprehensive example that shows a responsive viewfinder, check out the official sample.

Let's edit the activity_main layout file under res > layout > activity_main.xml:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
        xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:tools="http://schemas.android.com/tools"
        xmlns:app="http://schemas.android.com/apk/res-auto"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        tools:context=".MainActivity">

    <TextureView
            android:id="@+id/view_finder"
            android:layout_width="640px"
            android:layout_height="640px"
            app:layout_constraintTop_toTopOf="parent"
            app:layout_constraintBottom_toBottomOf="parent"
            app:layout_constraintStart_toStartOf="parent"
            app:layout_constraintEnd_toEndOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

A crucial part of adding any functionality that uses the camera to our project is requesting the appropriate CAMERA permission. First, we must declare it in the manifest, before the Application tag:

<uses-permission android:name="android.permission.CAMERA" />

Then, inside of MainActivity we need to request permissions at runtime. We will be making the changes in the MainActivity file under java > com.example.cameraxapp > MainActivity.kt:

At the top of the file, outside of the MainActivity class definition, let's add the following constants and imports:

// Your IDE likely can auto-import these classes, but there are several
// different implementations so we list them here to disambiguate
import android.Manifest
import android.util.Size
import android.graphics.Matrix
import java.util.concurrent.TimeUnit

// This is an arbitrary number we are using to keep track of the permission
// request. Where an app has multiple contexts for requesting permission,
// this can help differentiate the different contexts
private const val REQUEST_CODE_PERMISSIONS = 10

// This is an array of all the permissions specified in the manifest
private val REQUIRED_PERMISSIONS = arrayOf(Manifest.permission.CAMERA)

Inside of the MainActivity class, add the following fields and helper methods which are used to request permissions and trigger our code once we know that all permissions have been granted:

class MainActivity : AppCompatActivity(), LifecycleOwner {

    override fun onCreate(savedInstanceState: Bundle?) {
        ...
    }

    // Add this after onCreate

    private lateinit var viewFinder: TextureView

    private fun startCamera() {
        // TODO: Implement CameraX operations
    }

    private fun updateTransform() {
        // TODO: Implement camera viewfinder transformations
    }

    /**
     * Process result from permission request dialog box, has the request
     * been granted? If yes, start Camera. Otherwise display a toast
     */
    override fun onRequestPermissionsResult(
        requestCode: Int, permissions: Array<String>, grantResults: IntArray) {
        if (requestCode == REQUEST_CODE_PERMISSIONS) {
            if (allPermissionsGranted()) {
                viewFinder.post { startCamera() }
            } else {
                Toast.makeText(this,
                    "Permissions not granted by the user.", 
                    Toast.LENGTH_SHORT).show()
                finish()
            }
        }
    }

    /**
     * Check if all permissions specified in the manifest have been granted
     */
    private fun allPermissionsGranted() = REQUIRED_PERMISSIONS.all {
        ContextCompat.checkSelfPermission(
               baseContext, it) == PackageManager.PERMISSION_GRANTED
    }
}

Finally, we put everything together inside of onCreate to trigger the permission request when appropriate:

override fun onCreate(savedInstanceState: Bundle?) {
    ...

    // Add this at the end of onCreate function

    viewFinder = findViewById(R.id.view_finder)

    // Request camera permissions
    if (allPermissionsGranted()) {
        viewFinder.post { startCamera() }
    } else {
        ActivityCompat.requestPermissions(
            this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS)
    }

    // Every time the provided texture view changes, recompute layout
    viewFinder.addOnLayoutChangeListener { _, _, _, _, _, _, _, _, _ ->
        updateTransform()
    }
}

Now, when the application starts, it will check if it has the appropriate camera permissions. If it does, it will call `startCamera()` directly. Otherwise, it will request the permissions and, once granted, call `startCamera()`.

For most camera applications, showing a viewfinder to the users is very important -- otherwise it's very difficult for users to point the camera at the right place. A viewfinder can be implemented by using the CameraX `Preview` class.

To use Preview, we need to first define a configuration which then gets used to create an instance of the use case. The resulting instance is what we need to bind to the CameraX lifecycle. We will be doing this within the `startCamera()` method; fill out the implementation with this code:

private fun startCamera() {

    // Create configuration object for the viewfinder use case
    val previewConfig = PreviewConfig.Builder().apply {
        setTargetAspectRatio(Rational(1, 1))
        setTargetResolution(Size(640, 640))
    }.build()

    // Build the viewfinder use case
    val preview = Preview(previewConfig)

    // Every time the viewfinder is updated, recompute layout
    preview.setOnPreviewOutputUpdateListener {

        // To update the SurfaceTexture, we have to remove it and re-add it
        val parent = viewFinder.parent as ViewGroup
        parent.removeView(viewFinder)
        parent.addView(viewFinder, 0)

        viewFinder.surfaceTexture = it.surfaceTexture
        updateTransform()
    }

    // Bind use cases to lifecycle
    // If Android Studio complains about "this" being not a LifecycleOwner
    // try rebuilding the project or updating the appcompat dependency to
    // version 1.1.0 or higher.
    CameraX.bindToLifecycle(this, preview)
}

At this point, we need to implement the mysterious `updateTransform()` method. Inside of `updateTransform()` the goal is to compensate for changes in device orientation to display our viewfinder in upright rotation:

private fun updateTransform() {
    val matrix = Matrix()

    // Compute the center of the view finder
    val centerX = viewFinder.width / 2f
    val centerY = viewFinder.height / 2f

    // Correct preview output to account for display rotation
    val rotationDegrees = when(viewFinder.display.rotation) {
        Surface.ROTATION_0 -> 0
        Surface.ROTATION_90 -> 90
        Surface.ROTATION_180 -> 180
        Surface.ROTATION_270 -> 270
        else -> return
    }
    matrix.postRotate(-rotationDegrees.toFloat(), centerX, centerY)

    // Finally, apply transformations to our TextureView
    viewFinder.setTransform(matrix)
}

To implement a production-ready app, take a look at the official sample to see what else needs to be handled. For the sake of keeping this codelab short, we are taking a few shortcuts. For example, we are not keeping track of some configuration changes such as 180-degree device rotations, which do not trigger our layout change listener. Non-square viewfinders also need to compensate for aspect ratio changing when the device rotates.
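To give a feel for what that aspect-ratio compensation involves, here is a rough sketch in plain Kotlin. The `previewScale` helper below is hypothetical (not part of CameraX or this codelab's code); it only illustrates the scale factors a `Matrix` would need so a rotated preview is not stretched:

```kotlin
// Hypothetical helper illustrating the aspect-ratio compensation a
// non-square viewfinder would need: after rotating the texture by 90 or
// 270 degrees, its width and height are effectively swapped, so we
// scale to undo the stretch. Returns (scaleX, scaleY) for the Matrix.
fun previewScale(viewWidth: Int, viewHeight: Int, rotationDegrees: Int): Pair<Float, Float> =
    if (rotationDegrees == 90 || rotationDegrees == 270) {
        Pair(viewHeight.toFloat() / viewWidth, viewWidth.toFloat() / viewHeight)
    } else {
        // 0- and 180-degree rotations keep the original aspect ratio
        Pair(1f, 1f)
    }

fun main() {
    // A 480x640 portrait view rotated 90 degrees needs x stretched
    // and y squeezed by the view's aspect ratio
    println(previewScale(480, 640, 90))
    println(previewScale(480, 640, 0))
}
```

Note that for a square view both factors come out to 1, which is why the codelab's fixed 640x640 viewfinder can skip this step entirely.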

If we build and run the app, we should now see a live preview. Nice!

To let users capture images, we will be providing a button as part of the layout after the texture view in res > layout > activity_main.xml:

<ImageButton
        android:id="@+id/capture_button"
        android:layout_width="72dp"
        android:layout_height="72dp"
        android:layout_margin="24dp"
        app:srcCompat="@android:drawable/ic_menu_camera"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

Other use cases work very similarly to Preview. First, we must define a configuration object which is used to instantiate the actual use case object. To capture photos when the capture button is pressed, we need to update the `startCamera()` method and add a few more lines of code at the end, before the call to CameraX.bindToLifecycle:

private fun startCamera() {

    ...

    // Add this before CameraX.bindToLifecycle

    // Create configuration object for the image capture use case
    val imageCaptureConfig = ImageCaptureConfig.Builder()
        .apply {
            setTargetAspectRatio(Rational(1, 1))
            // We don't set a resolution for image capture; instead, we
            // select a capture mode which will infer the appropriate
            // resolution based on aspect ratio and requested mode
            setCaptureMode(ImageCapture.CaptureMode.MIN_LATENCY)
    }.build()

    // Build the image capture use case and attach button click listener
    val imageCapture = ImageCapture(imageCaptureConfig)
    findViewById<ImageButton>(R.id.capture_button).setOnClickListener {
        val file = File(externalMediaDirs.first(),
            "${System.currentTimeMillis()}.jpg")
        imageCapture.takePicture(file,
            object : ImageCapture.OnImageSavedListener {
            override fun onError(error: ImageCapture.UseCaseError,
                                 message: String, exc: Throwable?) {
                val msg = "Photo capture failed: $message"
                Toast.makeText(baseContext, msg, Toast.LENGTH_SHORT).show()
                Log.e("CameraXApp", msg)
                exc?.printStackTrace()
            }

            override fun onImageSaved(file: File) {
                val msg = "Photo capture succeeded: ${file.absolutePath}"
                Toast.makeText(baseContext, msg, Toast.LENGTH_SHORT).show()
                Log.d("CameraXApp", msg)
            }
        })
    }

    // Bind use cases to lifecycle
    // If Android Studio complains about "this" being not a LifecycleOwner
    // try rebuilding the project or updating the appcompat dependency to
    // version 1.1.0 or higher.
    CameraX.bindToLifecycle(this, preview)
}

Then, update the call to CameraX.bindToLifecycle to include the new use case:

CameraX.bindToLifecycle(this, preview, imageCapture)

And just like that, we have implemented a functional photo-taking button.

A very interesting feature of CameraX is the ImageAnalysis class. It allows us to define a custom class implementing the ImageAnalysis.Analyzer interface, which will be called with incoming camera frames. In line with the core vision of CameraX, we won't have to worry about managing the camera session state or even disposing of images; binding to our app's desired lifecycle is sufficient, as with other lifecycle-aware components.

First, we will implement a custom image analyzer. Our analyzer is quite simple -- it just logs the average luma (luminosity) of the image, but exemplifies what needs to be done for arbitrarily complex use cases. All we need to do is override the `analyze` function in a class that implements the ImageAnalysis.Analyzer interface. We can define our implementation as an inner class within MainActivity:

private class LuminosityAnalyzer : ImageAnalysis.Analyzer {
    private var lastAnalyzedTimestamp = 0L

    /**
     * Helper extension function used to extract a byte array from an
     * image plane buffer
     */
    private fun ByteBuffer.toByteArray(): ByteArray {
        rewind()    // Rewind the buffer to zero
        val data = ByteArray(remaining())
        get(data)   // Copy the buffer into a byte array
        return data // Return the byte array
    }

    override fun analyze(image: ImageProxy, rotationDegrees: Int) {
        val currentTimestamp = System.currentTimeMillis()
        // Calculate the average luma no more often than every second
        if (currentTimestamp - lastAnalyzedTimestamp >=
            TimeUnit.SECONDS.toMillis(1)) {
            // Since format in ImageAnalysis is YUV, image.planes[0]
            // contains the Y (luminance) plane
            val buffer = image.planes[0].buffer
            // Extract image data from callback object
            val data = buffer.toByteArray()
            // Convert the data into an array of pixel values
            val pixels = data.map { it.toInt() and 0xFF }
            // Compute average luminance for the image
            val luma = pixels.average()
            // Log the new luma value
            Log.d("CameraXApp", "Average luminosity: $luma")
            // Update timestamp of last analyzed frame
            lastAnalyzedTimestamp = currentTimestamp
        }
    }
}
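One detail worth pausing on: `Byte` in Kotlin (as on the JVM) is signed, so the `and 0xFF` mask is what recovers the unsigned 0-255 luminance values before averaging. Because this extraction-and-averaging logic is plain JVM code, it can be sanity-checked outside Android; the `averageLuma` helper below is hypothetical, written only to mirror the analyzer's math:

```kotlin
import java.nio.ByteBuffer

// Hypothetical helper mirroring the analyzer's math: copy a plane
// buffer into a byte array and average the unsigned pixel values.
fun averageLuma(buffer: ByteBuffer): Double {
    buffer.rewind()
    val data = ByteArray(buffer.remaining())
    buffer.get(data)
    // Byte is signed; "and 0xFF" maps the byte 0xFF (-1) back to 255
    return data.map { it.toInt() and 0xFF }.average()
}

fun main() {
    val buffer = ByteBuffer.wrap(byteArrayOf(0x00, 0x80.toByte(), 0xFF.toByte()))
    println(averageLuma(buffer)) // averages 0, 128 and 255
}
```

Without the mask, `0xFF.toByte()` would be read as -1 and drag the average down instead of contributing a bright pixel.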

With our class implementing the ImageAnalysis.Analyzer interface, all we need to do is instantiate the ImageAnalysis like all other use cases and update the `startCamera()` function once again, before the call to CameraX.bindToLifecycle:

private fun startCamera() {

    ...

    // Add this before CameraX.bindToLifecycle

    // Setup image analysis pipeline that computes average pixel luminance
    val analyzerConfig = ImageAnalysisConfig.Builder().apply {
        // Use a worker thread for image analysis to prevent glitches
        val analyzerThread = HandlerThread(
            "LuminosityAnalysis").apply { start() }
        setCallbackHandler(Handler(analyzerThread.looper))
        // In our analysis, we care more about the latest image than
        // analyzing *every* image
        setImageReaderMode(
            ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE)
    }.build()

    // Build the image analysis use case and instantiate our analyzer
    val analyzerUseCase = ImageAnalysis(analyzerConfig).apply {
        analyzer = LuminosityAnalyzer()
    }

    // Bind use cases to lifecycle
    // If Android Studio complains about "this" being not a LifecycleOwner
    // try rebuilding the project or updating the appcompat dependency to
    // version 1.1.0 or higher.
    CameraX.bindToLifecycle(this, preview, imageCapture)
}

And we also update the call to CameraX.bindToLifecycle to bind the new use case:

CameraX.bindToLifecycle(
    this, preview, imageCapture, analyzerUseCase)

Running the app now will produce a message similar to this in logcat approximately every second:

D/CameraXApp: Average luminosity: ...

To test the app, all we have to do is click the Run button in Android Studio, and our project will be built, deployed, and launched on the selected device or emulator. Once the app loads, we should see the viewfinder, which will remain upright even after rotating the device thanks to the orientation-handling code we added earlier, and we should also be able to take photos using the button:

You've successfully finished the codelab! Looking back, you implemented the following in a new Android app from scratch:

If you are interested in reading more about CameraX and the things that you can do with it, check out the documentation or clone the official sample.