In this codelab you will create a stable and reusable testing harness to run performance tests on a very simple existing app. The harness will automate the collection of information such as systrace logs, location requests, batterystats, graphics profiling, and more. Test failures will also be logged to files, and we'll show you an example of how to write a performance-based test. Wow, that's a lot, right?! Good luck!! :)

What you'll learn

Since the number of moving parts and components within this codelab is rather large, you will enable each piece of the harness by uncommenting existing code. This should allow you to become familiar with the harness while also allowing you to perform similar steps on your own projects. We'll end up with a test harness that logically looks something like this.

Let's get started!

First, you need to set up your development environment.

What you'll need

A computer with the following packages installed:

The following environment conditions should also be met:

Finally, you should have an Android device connected to your computer via USB cable that can be used to run tests.

In the next step, we will download and build the test app.

In this step, we build and run the provided test app on your Android device.

Get the code

Run the following to download the sample code from GitHub. Alternatively, to download it directly as a zip file, go here.

git clone

When this is done, you should have a folder on your machine called android-perf-testing.

Open up Android Studio. On the Quick Start screen, click "Open an existing Android Studio project" (or select File > Open).

In the import chooser window open the android-perf-testing folder cloned in the previous step. Select the settings.gradle file and click Choose.

Open the Project Navigator by clicking the Project text button on the top left edge of the Android Studio window. At the top of the Project Navigator you can select different view perspectives; after an import you're often defaulted to the Android perspective. Change this to Project to ensure you can see all the files we'll be working with. This process can be seen in the screenshot below.

Wait for the import process to complete by watching for a "Gradle build finished" message in the status bar at the very bottom of the Android Studio screen.

You may see build errors, but we will resolve those shortly. Once the Gradle build is complete, click the green Run icon in the top of the Android Studio menu to install and run the app.

If you haven't already, plug in your Android device and unlock the device's screen. Select the device from the list of devices available to run the app in the window that appears.

Once the app is running on your device, it should look similar to this. It can take a minute or two for the app to deploy and open the first time.

You'll notice the test app has very limited functionality. Use the app to become familiar with it:

Now we have the sample app's source code imported, and the app builds and runs. Before we automate some testing, let's take a quick look at how the app is performing and the tooling we would normally use to diagnose a performance issue; then we'll be better prepared to automate those steps.

We're going to manually inspect the performance issue in the app. We can't automate performance testing if we don't understand what we are looking for, so let's manually look for some jank.

Launch the app again and touch the Open List View button. You'll be presented with a screen that looks like this:

Scroll to the bottom of the list; the app gets jankier the farther you scroll. This is because the app is having trouble providing a frame to the rendering subsystem every 16 milliseconds (the primary cause of most UI-related performance issues); therefore, Android is skipping frames, causing visible jank. The app skips more frames as you scroll down the page. If you're persistent enough, you may even notice it crash before reaching the bottom of the list.
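Where does the 16 millisecond number come from? Most displays refresh at 60 frames per second, so each frame must be produced in 1000 ms / 60 ≈ 16.7 ms. The following back-of-the-envelope sketch (ours, not part of the codelab's source) shows how a single slow frame translates into missed vsync deadlines:

```java
public class FrameBudget {
    // At 60 frames per second, each frame must be produced within ~16.7 ms.
    static final double FRAME_BUDGET_MS = 1000.0 / 60.0;

    // Number of vsync deadlines a single slow frame causes the app to miss.
    static long missedDeadlines(double frameTimeMs) {
        return Math.max(0, (long) Math.ceil(frameTimeMs / FRAME_BUDGET_MS) - 1);
    }

    public static void main(String[] args) {
        System.out.println(missedDeadlines(10)); // within budget -> prints 0
        System.out.println(missedDeadlines(50)); // a 50 ms frame -> prints 2
    }
}
```

A frame that takes 50 ms therefore blows through two consecutive deadlines, which the user perceives as a visible stutter.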

Luckily, we don't have to depend on careful eyesight to see the jank on our device. Within the Developer Options on most Android devices there is a Profile GPU rendering option (near the bottom of the list). Enable this option with the Show on screen as bars setting, then open the app and use the Open List View option again. Your screen should look similar to this.

The green line toward the bottom/middle of the screen on the Android device indicates the important 16 millisecond barrier your app should never cross while providing frames to the rendering subsystem. Frames are drawn as a time series horizontally across the screen, with each vertical bar representing one frame. The height of a bar indicates how long that frame took to draw; when any colored part of a bar rises above the green line, the frame missed the 16 millisecond deadline and caused jank. The colors within each bar indicate the time spent in each major phase of frame rendering. The orange part of the bar represents time spent in app process code; this app is wasting massive amounts of time there.

When you see jank in a ListView like this you typically turn to a tool called Systrace to dig deeper and figure out what's going on. Let's do that really quick.

We aren't going to deep-dive into Systrace, but we will quickly debug the app with it to diagnose the jank issue. This step will also make sure Systrace is configured correctly on your system; if you are well-versed in Systrace you can skip it.

What is Systrace?

From the Android documentation:

Systrace is a tool that will help you analyze the performance of your application by capturing and displaying execution times of your applications processes and other Android system processes.

What are we using it for?

We're using Systrace to look for performance issues in our app. When we run Systrace while using our app, it will collect data from Android then format it into an html file we can view in a web browser. When we open the Systrace results, we're hoping it will be able to help us resolve some of the underlying issues associated with our app. So far all we know is that our app is janky; it would be nice if the tooling helped us figure out why.

How to run Systrace

Click the "Terminal" text button in the bottom left of the Android Studio window.

Run the following commands there:

On Mac / Linux:

python $ANDROID_HOME/platform-tools/systrace/ --time=10 -o ~/trace.html gfx view res

On Windows:

python %ANDROID_HOME%/platform-tools/systrace/ --time=10 -o %userprofile%/trace.html gfx view res

This will run Systrace for 10 seconds, giving you enough time to reproduce the janky behaviour you saw in previous steps. Run the above command, and while Systrace is running, open the app and scroll through the Simple List View again. Systrace collects data while you use the app and writes the results to the output file trace.html.

Open a browser and view the trace.html file. You should see a display similar to this:

This is a visualization of the performance data from your app and the system as a linear timeline during the 10 second time period.

Again, this isn't a Systrace tutorial, but notice the floating navigation bar in the upper right corner of this window that puts your cursor into different modes to interact with the trace. From top to bottom in the following screenshot, the pictured icons allow your pointer to affect the screen in the following ways:

Systrace tool used for zooming and panning.

You can also use the W, A, S, D keys on your keyboard to zoom and pan if that's easier.

First focus your attention on the top-most horizontal section of Alerts.

The Alerts row here highlights possible issues that came up during the tracing; if you click on one, a detail panel will open at the bottom of your browser and you can read the alert details. For example:

You'll notice the app's ListView implementation has multiple issues. The alerts give you detailed information about performance improvements as well as links to documentation that can assist in resolving the issue.

The next horizontal section typically has a header with your package name in it. If it doesn't, use the arrows to the right of the other package names to collapse those sections until your package name is visible.

If this is your first time using the tool you will want to focus on the Frames row (see below for a preview).

You'll find explanations for non-green Alerts at the bottom of the browser window again. In particular, look for alerts indicating your app is missing the dreaded 16 ms window of time to produce a frame. The Alerts also flag other issues, like failing to recycle views or inflating layouts at the wrong time. Click some red alerts in this row until you find one that describes Inflation during ListView recycling. There should be a lot of them.

Think about why it might be telling you that. How would you go about resolving it?

Whenever you have performance issues, Systrace is one of the first places you should look. Here we validated that frames are being dropped and we found an issue with our app. But we really shouldn't be finding this issue by manually coordinating our test run with Systrace. In the next step we will automate the test we performed; then we'll follow up by automating Systrace.

Now, let's start automating the manual test we performed before: Opening the ListView and scrolling the entire list.

As a quick Android testing primer: there are typically two types of tests written for Android apps. Unit tests typically exercise discrete bits of code. Android instrumented tests exercise app components and usually simulate user input and/or mock the parts of Android not specifically being tested. The Espresso library is used to assist with Android instrumented tests.

Espresso Test Library

Espresso is a testing library that provides APIs for writing UI tests to simulate user interactions within a single app. Most user interactions can be quickly and succinctly scripted in a test using Espresso and other parts of the Android Testing Support Library.

Note: Sometimes Android tests require specific architectural designs to allow mocking parts of the Android framework. This is a complicated subject outside the scope of this particular codelab.

Espresso Library Dependencies

The first step to using Espresso is adding the library dependency to the project. Add the library by uncommenting the following lines in the app/build.gradle file within the dependencies section. This snippet includes a couple of other libraries that are typically used in conjunction with Espresso.

androidTestCompile "${supportLibVersion}"
androidTestCompile ''
androidTestCompile ''
androidTestCompile ''

Every time you edit the Gradle script, Android Studio will ask, in a yellow bar at the top of the file viewer, if you would like to sync with Gradle. Click Sync Now and wait for the build and re-configuration of the project.

Writing Espresso Tests

Now we'll write a test that scrolls the app ListView the same way we did manually previously.

Navigate to the androidTest source directory in app/src/androidTest/java/. Some test classes are already defined within the package. Open the SimpleListActivityTest class. This class holds tests you would logically write for the SimpleListActivity class.

Uncomment the ActivityTestRule defined in the file (see below). This JUnit Rule ensures that the specified activity is created prior to running any tests defined in the class.

public ActivityTestRule<SimpleListActivity> mActivityRule = new ActivityTestRule<>(SimpleListActivity.class);

Next, uncomment the test method called scrollFullList and review the code. The test obtains the ListView within the activity under test, scrolls the entire list, and waits for the scrolling to complete. After reviewing the code you will need to resolve the missing import errors; right-click each error in the file.

public void scrollFullList() throws InterruptedException {
ListView listView = (ListView) mActivityRule.getActivity().findViewById(;

Finally, uncomment the @PerfTest annotation at the top of the SimpleListActivityTest class declaration. This is important for ensuring your test class is picked up by the instrumentation later.

public class SimpleListActivityTest {

Running Tests

To run all tests configured for your app, run the connectedCheck Gradle task. There are a few ways to run this task in Android Studio; it's good to know all of them, so run through the following. Remember to make sure your test device's screen is on and the device is unlocked before running your tests.

./gradlew :app:connectedCheck

As you ran the task, you probably saw exceptions while the tests executed. On most devices the poorly written ListView causes an OutOfMemoryException. This can be seen in the Gradle Console, available via the text button in the bottom right of the Android Studio window.

Let's continue by automating Systrace so we can detect jank in our tests.

Automating Systrace is a little tricky since it needs to run in parallel with the test suite. The Android development kit comes with a tool called MonkeyRunner which we can use for this. MonkeyRunner lets you create scripts using Python 2.7 syntax to orchestrate a connected Android device. This allows us to start Systrace and then kick off the test suite in parallel threads (among other things).

MonkeyRunner script

The script to automate Systrace and the test suite is rather long, so it has been included in the repo. Review the function invocation calls at the end of the script as a summary of the logic required for automation. You will notice the script follows this general structure:

  1. Check environment variables
  2. Define functions
  3. Clear local data from previous runs
  4. Find an Android device
  5. Enable and clear graphics info dumpsys
  6. Start a systrace thread & test suite thread in parallel
  7. Wait for both threads to complete
  8. Download files from device
  9. Run analysis on downloaded files
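Steps 6 and 7 are the interesting part: tracing and the test suite must run at the same time, and neither result is useful without the other. In plain Java (the script itself uses MonkeyRunner's Python), that orchestration looks roughly like the following sketch; the Runnable stand-ins for the systrace and test-suite work are ours:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class ParallelRun {
    // Start the trace "task" and the test-suite "task" together, then wait
    // for both to finish before pulling files, mirroring the script's
    // thread orchestration.
    static void runInParallel(Runnable trace, Runnable tests) throws InterruptedException {
        Thread traceThread = new Thread(trace, "SystraceThread");
        Thread testThread = new Thread(tests, "TestThread");
        traceThread.start();
        testThread.start();
        traceThread.join(); // wait for both threads to complete
        testThread.join();
    }

    public static void main(String[] args) throws InterruptedException {
        AtomicInteger done = new AtomicInteger();
        runInParallel(done::incrementAndGet, done::incrementAndGet);
        System.out.println(done.get()); // prints 2: both tasks completed before join returned
    }
}
```

Only after both joins return does the script move on to downloading and analyzing the collected files.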

You'll notice the script takes two parameters: (1) a directory to log data to and (2) an Android device ID. Since you're running this script manually, you have to retrieve the device ID yourself by running this command in the Android Studio terminal window.

On Mac / Linux:

${ANDROID_HOME}/platform-tools/adb devices -l

On Windows:

%ANDROID_HOME%\platform-tools\adb devices -l

It will output something similar to this; select the device ID on the left side of the output to run the test on (in case you have more than one device).

List of devices attached
84B0456625000123       device usb:337123368X product:angler model:Nexus_6P device:angler

Now run the script in the Android Studio terminal window by typing these two commands after replacing <INSERT_ID> with the device ID. Remember to unlock your connected Android device if it has gone to sleep before running the second command.

On Mac / Linux:

./gradlew :app:assembleDebug :app:assembleDebugAndroidTest :app:installDebug :app:installDebugAndroidTest
${ANDROID_HOME}/tools/monkeyrunner ./ ./ <INSERT_ID> 

On Windows:

gradlew :app:assembleDebug :app:assembleDebugAndroidTest :app:installDebug :app:installDebugAndroidTest

%ANDROID_HOME%\tools\monkeyrunner .\ <INSERT_ID>

The output for the first command should look similar to the text below. Make sure that it confirms the APKs were installed onto your test device.

:app:preDexDebugAndroidTest UP-TO-DATE
:app:dexDebugAndroidTest UP-TO-DATE
:app:packageDebugAndroidTest UP-TO-DATE
:app:assembleDebugAndroidTest UP-TO-DATE
Installing APK 'app-debug.apk' on 'Nexus 6P - 6.0'
Installed on 1 device.
Installing APK 'app-debug-androidTest-unaligned.apk' on 'Nexus 6P - 6.0'
Installed on 1 device.


Total time: 21.073 secs

The second command should complete with a log similar to this.

Writing logs to: ./
Using device_id: 84B0115625000732
Your ANDROID_HOME is set to: /Users/paulrashidi/android_sdk2
Cleaning data files
Waiting for a device to be connected.
Device connected.
Starting dump permission grant
Starting storage permission grant
Clearing gfxinfo on device
Starting test
Executing systrace
Exception in thread TestThread:Traceback (most recent call last):
  File "/Users/paulrashidi/android_sdk2/tools/lib/jython-standalone-2.5.3.jar/Lib/", line 179, in _Thread__bootstrap
  File "/Users/paulrashidi/android_sdk2/tools/lib/jython-standalone-2.5.3.jar/Lib/", line 170, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/paulrashidi/verytmp/android-perf-testing/./", line 51, in perform_test
    print device.instrument(test_runner, params)['stream']
KeyError: 'stream'
Done running tests
Done systrace logging
Systrace Thread Done
Test Thread Done
Time between test and trace thread completion: 0
Starting adb pull for test files

FAIL: Could not find file indicating the test run completed.

OVERALL: FAILED. See above for more information.

You should have a perftesting directory that has a limited file set populated similar to the screenshot below.

Adding Gradle Tasks for MonkeyRunner

Now, let's make the script a little bit more native to the development environment so we can run it from Gradle and/or Android Studio. The easiest way to do this is to create a Gradle task that will run the monkeyrunner script. In Gradle we can do this for a typical project by defining a new task in the buildSrc directory; let's do that.

Navigate to buildSrc/src/main/groovy/com/google/android/perftesting/RunLocalPerfTestsTask and uncomment the code there to implement the task. Android Studio might report build errors on these Gradle tasks in the file editing window related to duplicate class definitions; it is safe to ignore those.

You should also peruse the buildSrc/src/main/groovy/com/google/android/perftesting/PerfTestTaskGeneratorPlugin file. This is a custom Gradle plugin that queries for connected Android devices and then sets up a RunLocalPerfTestsTask Gradle task for each connected device. The custom Gradle plugin also creates an additional generic RunLocalPerfTests task that, when run, executes each of the device-specific gradle tasks.

Now let's install the custom Gradle plugin into our project so that all of the custom Gradle tasks will be available.

Navigate to app/build.gradle and uncomment the following lines of code. It is important that this plugin be applied after the Android Gradle plugin is configured; therefore, it is placed at the end of the build.gradle file.

// Create performance testing tasks for all connected Android devices using a Gradle plugin defined
// in the buildSrc directory.
apply plugin: PerfTestTaskGeneratorPlugin

Then add this code snippet as the very first line in the app/build.gradle file.


There will be a message at the top of the Android Studio screen asking if you want to sync the Gradle files again. Perform the Gradle sync, then navigate to the Gradle task listing. You should see the new tasks added by the Gradle plugin. If you have multiple devices connected, you will have more tasks than shown in the screenshot below.

The runLocalPerfTests Gradle task is the task that runs the performance tests. A major advantage of running the monkeyrunner script from Gradle is that the new performance test tasks depend on the installation of the app and test APKs. Any time we change the codebase, as long as we're using the Gradle tasks to execute the tests, Gradle will analyze the project files and rebuild the associated APKs as necessary before running the tests.

Let's make sure the new task works. Double-click the runLocalPerfTests task to run it. Be patient; it might take a while to build, install, and run your app and start the monkeyrunner instrumentation. Ensure the screen of your device shows the performance test running.

To make things easier, let's go ahead and add the runLocalPerfTests gradle task to the Run Configurations menu using the Save Configuration option as pictured here.

You should now have a run configuration for runLocalPerfTests in your Gradle sidebar, as pictured below:

Great!! So now you can run the performance test across the devices connected to your computer from within Android Studio, and both the app and test APKs will be rebuilt as needed as you make changes to the source code. You also have the Systrace trace.html file in the testdata directory for debugging, but currently you don't have a whole lot of data about the tests being run. Let's fix that.

We've enabled Systrace and test execution, but we want more information. Additionally, since we're now running the tests via MonkeyRunner we've lost the test success/failure information. Let's fix that.

Collecting more data

The team developed some example JUnit Rules we can use. These rules are available in the app/src/androidTest/java/ directory. Let's add them to the existing test.

Navigate to the app/src/androidTest directory. Open the SimpleListActivityTest class and uncomment the @Rule-annotated member variables whose class name begins with Enable, such as the lines below. Resolve the import errors as well.

public EnableTestTracing mEnableTestTracing = new EnableTestTracing();
public EnablePostTestDumpsys mEnablePostTestDumpsys = new EnablePostTestDumpsys();

public EnableLogcatDump mEnableLogcatDump = new EnableLogcatDump();

public EnableNetStatsDump mEnableNetStatsDump = new EnableNetStatsDump();

Each of these rules causes a different set of data to be logged.

Now uncomment the GlobalTimeout rule to add a performance constraint. This rule ensures that any test in this class throws an error if it takes longer than the specified amount of time. You'll notice the scrollFullList test method has a while loop at the end. Together, the while loop and the GlobalTimeout rule cause a failure if the ListView code isn't performant enough to let someone scroll the whole list in a reasonable amount of time.

public Timeout globalTimeout = new Timeout(
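Conceptually, a timeout rule runs the test body on a worker thread and fails the test if it doesn't finish within the budget. Here's an illustrative plain-Java sketch of that mechanic (this is not JUnit's actual implementation, and the helper name finishesWithin is ours):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimeoutCheck {
    // Run the body on another thread and report whether it completed
    // within the allotted time, as a timeout rule would.
    static boolean finishesWithin(Runnable body, long millis) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<?> future = pool.submit(body);
        try {
            future.get(millis, TimeUnit.MILLISECONDS);
            return true;
        } catch (TimeoutException e) {
            future.cancel(true); // interrupt the still-running body
            return false;
        } finally {
            pool.shutdownNow();
        }
    }

    static void sleep(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException ignored) { }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(finishesWithin(() -> { }, 500));        // prints true
        System.out.println(finishesWithin(() -> sleep(300), 50));  // prints false
    }
}
```

A test that scrolls the whole list too slowly would hit the second case and be reported as a failure.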

Now navigate to the TestListener class in the same directory. Uncomment the code for the testRunStarted and testRunFinished methods there, resolving any import build errors.

public void testRunStarted(Description description) throws Exception {
    Log.w(LOG_TAG, "Test run started.");
    // Cleanup data from past test runs.
public void testRunFinished(Result result) throws Exception {
    Log.w(LOG_TAG, "Test run finished.");

This enables battery stats collection and test failure logging; after the tests complete, this code also moves all collected log files to an accessible location where they can be pulled from the host computer via simple adb commands.
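The "move all log files to an accessible location" step boils down to moving every log from the test run's working directory into a directory that can be pulled with adb. A minimal sketch of that operation, with illustrative paths and file names (not the ones the codelab's TestListener actually uses):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class MoveLogs {
    // Move every *.log file from the working directory into the
    // destination directory, returning how many files were moved.
    static int moveLogs(Path from, Path to) throws IOException {
        Files.createDirectories(to);
        int moved = 0;
        try (DirectoryStream<Path> logs = Files.newDirectoryStream(from, "*.log")) {
            for (Path log : logs) {
                Files.move(log, to.resolve(log.getFileName()),
                        StandardCopyOption.REPLACE_EXISTING);
                moved++;
            }
        }
        return moved;
    }

    public static void main(String[] args) throws Exception {
        Path from = Files.createTempDirectory("work");
        Path to = from.resolve("testdata");
        Files.write(from.resolve("battery.dumpsys.log"), "stats".getBytes());
        System.out.println(moveLogs(from, to)); // prints 1
    }
}
```

On device, the destination would be a directory readable by adb pull; the host-side script then downloads everything it finds there.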

Run the tests again

Go ahead and run the performance tests again using the runLocalPerfTests Run Configuration at the top of the Android Studio window.

You might notice that the monkeyrunner script already flags more issues, since more files are now being pulled and can therefore be inspected. Navigate to the testdata directory in the root directory of your project. You'll find quite a bit of information now being logged there, in a directory structure similar to the test class package structure. Browse these files to look over the information being logged.

In the next step we'll look at the output of the current script and start fixing performance issues with information from the harness.

In your last performance test run you had performance issues. Let's now utilize the information gathered by the harness to resolve them.

1) scrollFullList(
Script: org.junit.runners.model.TestTimedOutException: test timed out after 2500 milliseconds
Script:         at java.lang.Thread.sleep(Native Method)
Script:         at java.lang.Thread.sleep(
Script:         at java.lang.Thread.sleep(
Script:         at
Script:         at java.lang.reflect.Method.invoke(Native Method)
Script:         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(
Script:         at
Script:         at org.junit.runners.model.FrameworkMethod.invokeExplosively(
Script:         at org.junit.internal.runners.statements.InvokeMethod.evaluate(
Script:         at org.junit.internal.runners.statements.FailOnTimeout$
Script:         at org.junit.internal.runners.statements.FailOnTimeout$
Script:         at
Script:         at

This error is also logged in a file called test.failure.log in the subdirectory of the testdata directory that corresponds to the test class and method names.

If you open the gfxinfo.dumpsys.log file, you'll see a line noting that an excessive amount of jank is present (in the sample below it was approximately 92%).

** Graphics info for pid 31367 [] **

Stats since: 19840518836088ns
Total frames rendered: 323
Janky frames: 296 (91.64%)
90th percentile: 117ms
95th percentile: 125ms
99th percentile: 133ms
Number Missed Vsync: 290
Number High input latency: 0
Number Slow UI thread: 295
Number Slow bitmap uploads: 286
Number Slow issue draw commands: 15
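The headline number in that dump is simply janky frames divided by total frames rendered. A quick sanity check of the arithmetic (our snippet, not part of the harness):

```java
public class JankStats {
    // Jank percentage as gfxinfo reports it: janky frames / total frames.
    static double jankPercent(int totalFrames, int jankyFrames) {
        return 100.0 * jankyFrames / totalFrames;
    }

    public static void main(String[] args) {
        // Values from the dumpsys output above: 296 janky frames out of 323.
        System.out.printf("%.2f%%%n", jankPercent(323, 296)); // prints 91.64%
    }
}
```

Anything in this range means nearly every frame missed its deadline; a healthy list scroll should be in the low single digits.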

Open the Systrace output and observe the number of Alerts indicating performance issues.

The Systrace alerts are clearly pointing to ListView view-recycling issues, as well as the fact that calling getView() is taking too long. To fix the problem, open up the list adapter class and resolve the lint error displayed.

LayoutInflater inflater = LayoutInflater.from(getContext());

// This line is wrong, we're inflating a new view always instead of only if it's null.
// For demonstration purposes, we will leave this here to show the resulting jank.
convertView = inflater.inflate(R.layout.item_contact, parent, false);

Should become this:

if (convertView == null) {
    LayoutInflater inflater = LayoutInflater.from(getContext());
    convertView = inflater.inflate(R.layout.item_contact, parent, false);
}

Run the perf test again and refresh the trace.html file in your web browser. Hmm... it looks like getView() is still causing performance problems. After looking at the Bitmap code in getView a little more, you realize it should be using something with a cache to load the Bitmap; a common choice is Glide. So change this code:

// Let's just create another bitmap when we need one. This makes no attempts to re-use
// bitmaps that were previously used in rendering past list view elements, causing a large
// amount of memory to be consumed as you scroll farther down the list.
Bitmap bm = BitmapFactory.decodeResource(convertView.getResources(), R.drawable.bbq);

to this:


Run the perf test again and refresh the trace.html file in your web browser. You could keep optimizing, but you get the idea. Now you have a repeatable way of flagging issues and then running tests to see whether specific changes make a difference.

Congrats! You've finished the codelab. Let's take just a few more minutes to sum up what you've learned and some key things to remember.

What we've covered

Things to remember

Give us feedback

We'd really appreciate it if you could fill out some feedback on your codelab experience. Click the link below to fill out a short survey, and we'll use this information to iterate on and improve the codelab over time.

Tell us how we did

Learn more

If you'd like to learn more about performance testing see Testing Display Performance on the Android Developer documentation site.

If you'd like to learn more about Systrace, see the official documentation here.

If you're more curious about Espresso and UI testing, check out these official docs.