Android Vendor Test Suite (VTS) consists of four products:

- VTS: the VINTF compliance test suite
- CTS-on-GSI: CTS run on the General System Image
- VTS-*: optional functional and non-functional tests for platform quality assurance
- VTI: the cloud-based Vendor Test Infrastructure

VTS itself is the compliance test suite for the Android Vendor Interface (VINTF).

VINTF is a versioned, stable interface for Android vendor implementations. It was introduced in Android 8.0 (Oreo) to improve the engineering productivity, launch velocity, security, and reliability of the Android device ecosystem.

VTS has a set of test cases designed to directly test the following components under VINTF:

- kernel
- HAL (hardware abstraction layer)
- VNDK (vendor native development kit) libraries

CTS-on-GSI stands for Compatibility Test Suite (CTS) on General System Image (GSI). It is specifically designed to test Android devices that use a GSI, i.e., the stock AOSP (Android Open Source Project) system image. Most CTS-on-GSI tests are SDK API tests and test VINTF indirectly. The suite consists of a subset of CTS test cases plus some new test cases designed specifically for VINTF.


VTS-* covers quality assurance of the Android platform software. VTS-* includes optional functional and non-functional tests. The default test plan names for functional VTS and VTS-* are vts and vts-star, respectively.

The non-functional tests include performance tests (e.g., vts-performance) and fuzz tests (e.g., vts-fuzz). VTS-* also includes test case development tools. Two example test development tools are a HAL API call trace recording tool and a native code coverage measurement tool.

Android Lab and VTI

Vendor Test Infrastructure (VTI) is a set of cloud-based infrastructure for Android device partners and the Open Source Software (OSS) ecosystem. It lets partners easily create a cloud-based continuous integration service for VTS tests. This is possible thanks to the Android Lab infrastructure, which can be used to build and operate an Android test lab that streamlines VTS, VTS-*, and CTS-on-GSI executions.

Related Codelabs

Android VTS v8.0 Codelab

Android VTS Lab v9.0 Codelab

Are you interested in using and developing some VTS or CTS-on-GSI tests now? Then please click the next button.

Establishing a test environment

Recommended system environment:

To set up a testing environment:

  1. Install Python development kit:
$ sudo apt-get install python-dev
  2. Install Protocol Buffer tools (for Python):
$ sudo apt-get install python-protobuf
$ sudo apt-get install protobuf-compiler
  3. Install Python virtual environment-related tools:
$ sudo apt-get install python-virtualenv
$ sudo apt-get install python-pip
  4. (Optional) Download Python packages from PyPI to a local directory:
# To permanently set $VTS_PYPI_PATH, please add the following line to ~/.bashrc
$ export VTS_PYPI_PATH=<your directory>
$ curl | base64 -d > pip_requirements.txt
$ pip download -d $VTS_PYPI_PATH -r pip_requirements.txt --no-binary protobuf,grpcio,matplotlib,numpy,Pillow,scipy
  5. Connect your device to the host:
$ adb devices
$ adb shell

Testing a patch

To test a patch:

  1. Build a VTS host-side package:
$ . build/
$ lunch aosp_arm64-userdebug
$ make vts -j
  2. Run the default VTS tests:
$ vts-tradefed
> run vts     // here `vts` is the test plan name

VTS plans

Available VTS test plans include:



> run vts

For default VTS tests

> run vts-star

For default VTS-* tests (excluding tests in the default VTS plan).

> run vts-hal

For default VTS HAL (hardware abstraction layer) tests

> run vts-kernel

For default VTS kernel tests

> run vts-vndk

For default VTS VNDK (vendor native development kit) tests

> run cts-on-gsi

For default CTS-on-GSI (general system image) tests.

To view a list of all plans, refer to /test/vts/tools/vts-tradefed/res/

VTS TradeFed Console Options

Available VTS TradeFed console options include:



> run vts -m <test module>

Runs one specific test module.

> run vts -m <test module> -t <test case>

Runs one specific test case. Multiple test cases can be specified as a comma separated list (e.g. test_case1,test_case2,...).

> run vts -l INFO

Prints detailed console logs. Applied to both VTS Java and Python test frameworks.

> list invocations (or "l i" for short)

Lists all invocation threads.

> list results (or "l r" for short)

Lists previous results.

> run vts --primary-abi-only

Runs a test plan against only the primary ABI (Application Binary Interface), e.g., ARM64. Running a single ABI bitness shortens test execution time.

> run vts -s <device serial>

Selects a device to use when multiple devices are connected.

> invocation <command ID> stop (or "i <command ID> stop" for short)

Interrupts running command and outputs partial result.

> run vts --shards <number of devices> -s <device serial> -s ...

Splits the vts plan into shards and runs them on multiple devices. The --shards option is used only for VTS.

> run cts-on-gsi --shard-count <number of devices> -s <device serial> -s ...

Splits the cts-on-gsi plan into shards and runs them on multiple devices. The --shard-count option is used only for CTS-on-GSI.

> run vts --retry <session ID>

Retries failed and not executed tests with previous configuration and command options. Used only for VTS.

> run cts-on-gsi-retry --retry <session ID>

Retries for CTS-on-GSI always use cts-on-gsi-retry as the plan name.

> run vts --collect-tests-only

Performs a dry run: generates a report in which all tests are marked as passed without actually being executed.

> help

Prints help page that lists other console options.

> run vts --help

Prints common options for vts plan.

> run vts --help-all

Prints all options for vts plan.

For Windows Host

While building VTS on Windows is not supported, it is possible to run most of VTS tests on a Windows host machine with Python, Java, and ADB installed.

  1. Download links:
    Python 2.7
    ADB 1.0.39
    Install the required Python packages by using pip.
  2. Build VTS on Linux:
$ . build/
$ lunch aosp_arm64-userdebug
$ make vts -j
  3. Copy out/host/linux-x86/vts/ to your Windows host and extract it.
  4. (Optional) Download Python packages from PyPI to a local directory:
# Permanently set VTS_PYPI_PATH in Advanced System Settings.
$ set VTS_PYPI_PATH=<your directory>
# Copy to local pip_requirements.txt.
$ pip download -d %VTS_PYPI_PATH% -r pip_requirements.txt
  5. Add adb.exe to PATH and run vts-tradefed_win.bat:
$ vts-tradefed_win.bat
> run vts     // where vts is the test plan name

Build and run locally

For ARM devices, please run:

$ . build/
$ lunch aosp_arm64
$ make vts -j && vts-tradefed run commandAndExit <test plan name>
$ make vts -j TARGET_PRODUCT=aosp_arm64 && vts-tradefed run commandAndExit <test plan name> --primary-abi-only -l INFO --module <test module name>

For x86 devices, please use `aosp_x86_64` build target.

To run a test case

$ vts-tradefed run commandAndExit <test plan name> -m <test module> -t <test case>

An example:

$ vts-tradefed run commandAndExit vts -m VtsHalRadioV1_0Target -t RadioHidlTest.getAvailableNetworks

Download packages and run

If you have downloaded a VTS package from its official release page or an Android build system, please run:

$ unzip
$ cd android-vts
$ export PATH=`pwd`/bin:$PATH
$ cd tools
$ ./vts-tradefed

Build a native test binary and run directly (only for development)

For a cc_test test module, please run:

$ m <test module> && adb sync data && adb shell data/nativetest64/<test module>/<test module>

Use atest (alpha version)

Atest is a script that builds only required build targets. It can be used for certain VTS and CTS-on-GSI test modules today. The supported VTS test modules include most of the VTS HAL tests. Atest is supported from v9.0 (P).

$ atest <test module>
$ atest <test module> -- -t <test case>
$ atest VtsCodelabHelloWorldTest -- -t testEcho1
$ atest ./ (where current working directory is //test/vts/testcases/codelab/hello_world/)

Here are some examples:

$ atest VtsCodelabHelloWorldTest
$ atest VtsCodelabHelloWorldTest -- -t testEcho1
$ atest VtsHalLightV2_0Target
$ atest VtsHalRadioV1_0Target -- -t RadioHidlTest.getAvailableNetworks

You can also run a test in the current working directory.

$ cd test/vts/testcases/codelab/hello_world/
$ atest ./

How to Flash a Device

Flashing uses a <fish>-user build and a GSI userdebug build. The flashing steps are:

Step 1. Flash a <fish>-user build.

$ fastboot update ${TARGET_PRODUCT}-img-$ --skip-reboot

Step 2. Select a GSI build. The GSI artifact file name is aosp_<arch>_<build AB type>-img-<build ID>.zip where <arch> is either arm64 or x86_64 (e.g., aosp_arm64_ab-img-<build ID>.zip). Specifically, for P branch, please use:

Then please unzip the GSI artifact file and use system.img in it.

Step 3. Flash a GSI

$ fastboot flash system system.img

If a device uses Android verified boot, please turn that off for testing by running:

$ fastboot flash vbmeta vbmeta.img

where vbmeta.img can be downloaded from an Android build system or built locally.

Build a GSI image locally

$ lunch aosp_<arch>_<build AB type>-userdebug  # e.g., aosp_arm64_ab-userdebug
$ make -j

Build vbmeta.img locally

$ make avbtool -j
$ avbtool make_vbmeta_image --flag 2 --output vbmeta.img

How to Run a CTS-on-GSI test

Since v8.1

Please download a VTS package, unzip it, and run:

$ vts-tradefed
> run cts-on-gsi -m <test module> -t <test case>

Here -m and -t are optional.

For v8.0

Please use the CTS package that is downloadable from the CTS download page.

$ cts-tradefed
> run cts-reference-aosp -m <test module> -t <test case>

All VTS, VTS-*, and VTI code is kept in AOSP (Android Open Source Project). Let's download the AOSP source code by following the 'Downloading the Source' manual.

Write a Host-Side Python Test

We will extend the provided VTS HelloWorld Codelab test. Before actually extending that test, let's build and run that test.

$ make vts -j
$ vts-tradefed
> run vts-codelab -m VtsCodelabHelloWorldTest

If your VTS TradeFed console printed the following result (e.g., PASSED: 4), that means you can run VtsCodelabHelloWorldTest successfully on your device and thus are ready for this part of the codelab.

E/BuildInfo: Device build already contains a file for VIRTUALENVPATH in thread Invocation-<ID>
E/BuildInfo: Device build already contains a file for PYTHONPATH in thread Invocation-<ID>
I/VtsMultiDeviceTest: Setting test name as VtsCodelabHelloWorldTest
I/ConsoleReporter: [<ID>] Starting armeabi-v7a VtsCodelabHelloWorldTest with 2 tests
I/ConsoleReporter: [1/2 armeabi-v7a VtsCodelabHelloWorldTest <ID>] VtsCodelabHelloWorldTest#testEcho1 pass
I/ConsoleReporter: [2/2 armeabi-v7a VtsCodelabHelloWorldTest <ID>] VtsCodelabHelloWorldTest#testEcho2 pass
I/ConsoleReporter: [<ID>] armeabi-v7a VtsCodelabHelloWorldTest completed in 2s. 2 passed, 0 failed, 0 not executed
W/CompatibilityTest: Inaccurate runtime hint for armeabi-v7a VtsCodelabHelloWorldTest, expected 1m 0s was 19s
I/ResultReporter: Test Result: <omitted>/out/host/linux-x86/vts/android-vts/results/2017.04.21_11.27.07/test_result_failures.html
I/ResultReporter: Test Logs: <omitted>/out/host/linux-x86/vts/android-vts/logs/2017.04.21_11.27.07
I/ResultReporter: Invocation finished in 43s. PASSED: 4, FAILED: 0, MODULES: 2 of 2

It also shows where the test logs are kept (out/host/linux-x86/vts/android-vts/logs/2017.04.21_11.27.07) and where the XML report is stored (out/host/linux-x86/vts/android-vts/results/2017.04.21_11.27.07).

You may use the -l INFO option to have logs printed to the console during test execution.

The VtsCodelabHelloWorldTest code is stored in <your AOSP repo's local home dir>/test/vts/testcases/codelab/hello_world/. That directory has the following four files:

Let's look into each of the first three files.

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := VtsCodelabHelloWorldTest
VTS_CONFIG_SRC_DIR := testcases/codelab/hello_world
include test/vts/tools/build/

This says that the test module's build module name is VtsCodelabHelloWorldTest and that its source code is kept in the testcases/codelab/hello_world directory. The last line includes the predefined VTS build rule.


<configuration description="Config for VTS CodeLab HelloWorld test case">
    <option name="config-descriptor:metadata" key="plan" value="vts-codelab" />
    <target_preparer class="">
        <option name="push-group" value="HostDrivenTest.push" />
    </target_preparer>
    <test class="">
        <option name="test-module-name" value="VtsCodelabHelloWorldTest"/>
        <option name="test-case-path" value="vts/testcases/codelab/hello_world/VtsCodelabHelloWorldTest" />
    </test>
</configuration>

This xml file tells VTS TradeFed how to prepare and run the VtsCodelabHelloWorldTest test. It uses a VTS TradeFed test preparer: VtsFilePusher, which pushes all the files needed for a host-driven test. The actual list is defined in HostDrivenTest.push file which includes VtsDriverHal.push and VtsDriverShell.push files. Those included files may include some other push files defined in the same directory.

The actual test execution is specified by the VtsMultiDeviceTest class where option test-module-name specifies the actual test module name, which we can use when we do > run vts -m <Test Module Name> from a VTS TradeFed console, and option test-case-path specifies the path of the actual test source file (excluding .py extension).

import logging

from vts.runners.host import asserts
from vts.runners.host import base_test
from vts.runners.host import const
from vts.runners.host import test_runner

class VtsCodelabHelloWorldTest(base_test.BaseTestClass):
    """Two hello world test cases which use the shell driver."""

    def setUpClass(self):
        self.dut = self.android_devices[0]
        self.shell = self.dut.shell

    def testEcho1(self):
        """A simple testcase which sends a command."""
        results = self.shell.Execute("echo hello_world")  # runs a shell command.
        logging.info(str(results[const.STDOUT]))  # prints the stdout
        asserts.assertEqual(results[const.STDOUT][0].strip(),
                            "hello_world")  # checks the stdout
        asserts.assertEqual(results[const.EXIT_CODE][0],
                            0)  # checks the exit code

    def testEcho2(self):
        """A simple testcase which sends two commands."""
        results = self.shell.Execute(["echo hello", "echo world"])
        logging.info(str(results[const.STDOUT]))
        asserts.assertEqual(len(results[const.STDOUT]),
                            2)  # check the number of processed commands
        asserts.assertEqual(results[const.STDOUT][0].strip(), "hello")
        asserts.assertEqual(results[const.STDOUT][1].strip(), "world")
        asserts.assertEqual(results[const.EXIT_CODE][0], 0)
        asserts.assertEqual(results[const.EXIT_CODE][1], 0)

if __name__ == "__main__":
    test_runner.main()

This file contains the actual test source code. It has the test class VtsCodelabHelloWorldTest, which inherits from the BaseTestClass class. This class can have four default methods: setUpClass, called once at the beginning for setup; setUp, called before each test case; tearDown, called after each test case; and tearDownClass, called once at the end for cleanup. In this case, only setUpClass is defined (by overriding); it simply gets a DUT (Device Under Test) instance and a shell instance for the DUT.
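
The lifecycle described above can be illustrated with a small plain-Python sketch. This is an illustration only: FakeBaseTest and MyTest are hypothetical stand-ins, not the real VTS BaseTestClass or runner.

```python
# Illustration of the BaseTestClass lifecycle ordering described above.
# NOT the real VTS runner; class and method names only mimic it.
class FakeBaseTest:
    def __init__(self):
        self.calls = []

    def setUpClass(self):
        self.calls.append("setUpClass")

    def setUp(self):
        self.calls.append("setUp")

    def tearDown(self):
        self.calls.append("tearDown")

    def tearDownClass(self):
        self.calls.append("tearDownClass")

    def run(self):
        # Collect methods whose names start with "test" (as the VTS
        # runner does) and wrap each one with setUp/tearDown.
        self.setUpClass()
        for name in sorted(dir(self)):
            if name.startswith("test"):
                self.setUp()
                getattr(self, name)()
                self.tearDown()
        self.tearDownClass()
        return self.calls


class MyTest(FakeBaseTest):
    def testEcho1(self):
        self.calls.append("testEcho1")

    def testEcho2(self):
        self.calls.append("testEcho2")


print(MyTest().run())
```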

A test case is a method whose name has the test prefix (e.g., testEcho1 and testEcho2). The testEcho1 test case sends the 'echo hello_world' shell command to a target device and then verifies the results: the stdout and the exit code of the echo command. The testEcho2 test case shows how to send multiple shell commands in one Python function call.

To extend this test module, let's add the following method to VtsCodelabHelloWorldTest class.

    def testListFiles(self):
        """A simple testcase which lists files."""
        results = self.shell.Execute("ls /data/local/tmp")
        logging.info(str(results[const.STDOUT][0]))
        asserts.assertEqual(results[const.EXIT_CODE][0], 0)

Then, run the following commands to test it:

$ make vts
$ vts-tradefed
> run vts-codelab -m VtsCodelabHelloWorldTest -t testListFiles -l INFO

You can check the test logs to see whether all files are correctly listed. That result can be validated by using adb shell ls /data/local/tmp command.

Because this test is written in Python and executed on the host, we call this a host-side Python test.

Write a Target-Side C/C++ Binary Test

This part explains how to package a target-side binary or a shell script as a VTS test by using the BinaryTest template. Let's assume your binary test module name is `vts_sample_binary_test`, which exits with 0 when the test passes. You can easily wrap the test with the VTS BinaryTest template by specifying the test module path and type in AndroidTest.xml:

<test class="">
    <option name="test-module-name" value="VtsSampleBinaryTest" />
    <option name="binary-test-source" value="DATA/nativetest/vts_sample_binary_test" />
</test>

The `binary-test-source` option specifies where the binary is packaged within VTS. The BinaryTest template pushes the test binary to a default location on the device and deletes it after the test finishes.

You can also specify a test tag, which is often used to distinguish 32-bit tests from 64-bit tests.

<test class="">
    <option name="test-module-name" value="VtsSampleBinaryTest" />
    <option name="binary-test-source" value="_32bit::DATA/nativetest/vts_sample_binary_test" />
    <option name="binary-test-source" value="_64bit::DATA/nativetest64/vts_sample_binary_test" />
</test>

An example test is available at $ANDROID_BUILD_TOP/test/vts/testcases/codelab/target_binary/.

Using a script to auto-generate a new test project

A basic test project can be created using the script located in test/vts/script/.

To create a codelab-like project, first load the environment from the repo root:

$ . build/

Then call the project creation script with the required --name and --dir options:

$ test/vts/script/ --name SampleTestModule --dir sample_test_module

A new test project will be created under test/vts/testcases/sample_test_module

Filtering test cases

Test case filter can be configured from AndroidTest.xml.

There are three options that can be used to configure the test filter:

- include-filter (string): Test case include filter.
  Example: test1,r(test2.*),-r(.*64bit)

- exclude-filter (string): Test case exclude filter.
  Example: test2,test3_64bit

- exclude-over-include (boolean): Whether the exclude filter has higher priority than the include filter.
  Default: false
  Example: true

Filter priority

By default, include-filter has higher priority than exclude-filter. This means that if there are any items in include-filter, exclude-filter is simply ignored.

This is useful when users want to run specific tests directly from the command line even though those tests have been excluded in the configuration due to some error. See the "Specify filters from run command" section below for details on how to use filters from the run command.

However, if the exclude-over-include option is set to true, tests matched by exclude-filter will not run even when they are also specified in include-filter.

Multiple filters

include-filter and exclude-filter can be specified multiple times in AndroidTest.xml, and their values will be combined into a list. For example:

<test class="">
    <option name="include-filter" value="test1" />
    <option name="include-filter" value="test2" />
</test>

Comma separation

The value can also be comma separated. For example:

<test class="">
    <option name="include-filter" value="test1,test2" />
    <option name="include-filter" value="test3" />
</test>

This also means test names must not contain commas.

Regular expression

A regular expression can be used by wrapping the test name with r(...). For example, the following configuration runs only tests whose names end with 32bit:

<test class="">
    <option name="include-filter" value="r(.*32bit)" />
</test>

The regular expression wrapping can be escaped with a backslash. For example, \r(test1) is an exact test name match.

Bitness expansion

By default, a bitness postfix will be added to each non-regular-expression filtering rule that doesn't already have one. For example, the following two configurations are equal:

Configuration 1:

<test class="">
    <option name="include-filter" value="test1" />
</test>

Configuration 2:

<test class="">
    <option name="include-filter" value="test1" />
    <option name="include-filter" value="test1_32bit" />
    <option name="include-filter" value="test1_64bit" />
</test>

If a filter is a regular expression, or already has a 32bit or 64bit postfix, it will not be automatically expanded.
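
The expansion rule can be sketched in Python as follows. This is a hypothetical helper illustrating the rule above, not the actual VTS implementation:

```python
def expand_bitness(filters):
    """Expand each non-regex filter without a bitness postfix.

    Illustration of the bitness expansion rule: a plain test name is
    kept and also duplicated with _32bit and _64bit postfixes; regex
    filters (r(...)) and names already ending in 32bit/64bit are left
    untouched.
    """
    expanded = []
    for f in filters:
        is_regex = f.startswith("r(") and f.endswith(")")
        has_bitness = f.endswith("32bit") or f.endswith("64bit")
        expanded.append(f)
        if not is_regex and not has_bitness:
            expanded.append(f + "_32bit")
            expanded.append(f + "_64bit")
    return expanded

# Configuration 1 above becomes Configuration 2:
print(expand_bitness(["test1"]))
```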

Minus sign prefix in include-filter

A minus sign prefix in the include filter adds the value to the exclude filter instead. For example, the following two configurations have the same effect:

Configuration 1:

<test class="">
    <option name="include-filter" value="-r(.*32bit)" />
</test>

Configuration 2:

<test class="">
    <option name="exclude-filter" value="r(.*32bit)" />
</test>

Specify filters from run command

The include-filter can also be configured in test run command.

vts-tradefed run vts-star -m TestModule -t test1,r(test2.*),-r(.*64bit)

Since the minus sign adds the item to exclude-filter, the above command has the same effect as the following configuration:

<test class="">
    <option name="include-filter" value="test1" />
    <option name="include-filter" value="r(test2.*)" />
    <option name="exclude-filter" value="r(.*64bit)" />
</test>
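
The filter syntax above can be sketched as a small parser. This is an illustration under the stated rules, not the actual VTS parser; it assumes no commas inside r(...) patterns, which holds since test names must not contain commas:

```python
def parse_filters(spec):
    """Split a comma-separated filter spec into include and exclude lists.

    Sketch of the command-line filter rules above: items are comma
    separated, and a leading '-' moves an item to the exclude list.
    """
    include, exclude = [], []
    for item in spec.split(","):
        item = item.strip()
        if not item:
            continue
        if item.startswith("-"):
            exclude.append(item[1:])  # minus prefix -> exclude filter
        else:
            include.append(item)
    return include, exclude

# The -t value from the run command example above:
inc, exc = parse_filters("test1,r(test2.*),-r(.*64bit)")
```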

Using a VTS template, you can quickly develop a VTS test for a specific objective. This part of the codelab explains a few commonly used templates.

Wrap a target side GTest binary with GtestBinaryTest template

If your test binary is a GTest (Google Test), you may still use the BinaryTest template, but it will treat the whole test module as a single test case in result reporting. Instead, you can specify the `gtest` binary test type so that individual test cases are correctly parsed:

<test class="">
    <option name="test-module-name" value="VtsSampleBinaryTest" />
    <option name="binary-test-source" value="_32bit::DATA/nativetest/vts_sample_binary_test" />
    <option name="binary-test-source" value="_64bit::DATA/nativetest64/vts_sample_binary_test" />
    <option name="binary-test-type" value="gtest" />
</test>

The GtestBinaryTest template will first list all the available test cases and then run them one by one through shell commands with the --gtest_filter flag. This means each test case is executed in its own Linux process, so global static variables must not be shared across test cases.
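
The list-then-filter flow can be sketched as follows. The sketch parses the standard `--gtest_list_tests` output format (suite lines ending with a period, followed by indented case names); it is an illustration, not the VTS implementation:

```python
def parse_gtest_list(output):
    """Parse `--gtest_list_tests` output into full test case names.

    Gtest prints each suite name ending with '.', followed by indented
    test case names; parameterized cases may carry a trailing
    '# GetParam() = ...' comment, which this sketch drops.
    """
    cases = []
    suite = ""
    for line in output.splitlines():
        if not line.strip():
            continue
        if not line.startswith(" "):        # suite line, e.g. "RadioHidlTest."
            suite = line.strip()
        else:                               # indented test case line
            name = line.strip().split()[0]  # drop any parameter comment
            cases.append(suite + name)
    return cases

sample = """RadioHidlTest.
  getAvailableNetworks
  getOperator
"""
for case in parse_gtest_list(sample):
    # Each case would then run in its own process as:
    # <binary> --gtest_filter=<case>
    print(case)
```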

Wrap a target side HIDL HAL test binary with HalHidlGtest template

Since Android 8.0 (O), the Hardware Interface Definition Language (HIDL) is used to specify HAL interfaces. With VTS, HIDL HAL testing can be done effectively because the VTS framework handles the non-conventional test steps transparently and provides various useful utilities that a HIDL HAL test case can use.

A HIDL HAL target-side test often needs setup steps such as disabling the Java framework, setting the SELinux mode, toggling between passthrough and binder mode, and checking HAL service status.

Let's assume your test AndroidTest.xml looks like:

<test class="">
    <option name="test-module-name" value="VtsHalMyHidlTargetTest"/>
    <option name="binary-test-source" value="..." />
</test>

The following option is needed to use the HIDL HAL gtest template.

<option name="binary-test-type" value="hal_hidl_gtest" />

Use test template to write your target-side HIDL HAL test

VTS provides a number of test templates for HAL developers to write target-side HIDL HAL tests. These templates let developers make use of VTS test framework features.


A HIDL GTest extending VtsHalHidlTargetTestBase allows the VTS framework to toggle between passthrough and binder mode for performance comparison.

The VTS HIDL target templates are located in the `VtsHalHidlTargetTestBase` module, which you may include through your Android.bp file in the following way:

cc_test {
    name: "VtsHalHidlSampleTest",
    defaults: ["hidl_defaults"],
    srcs: ["SampleTest.cpp"],
    shared_libs: [
        ...
    ],
    static_libs: ["VtsHalHidlTargetTestBase"],
}

And in `SampleTest.cpp`:

#include <VtsHalHidlTargetTestBase.h>

class SampleTest : public ::testing::VtsHalHidlTargetTestBase {
  Interface int_ = ::testing::VtsHalHidlTargetTestBase::getService<IInterface>();
};



VtsHalHidlTargetCallbackBase is another template in that runner. It offers utility functions such as WaitForCallback and NotifyFromCallback. A typical usage is as follows:

class CallbackArgs {
  ArgType1 arg1;
  ArgType2 arg2;
};

class MyCallback
    : public ::testing::VtsHalHidlTargetCallbackBase<CallbackArgs>,
      public CallbackInterface {
  void CallbackApi1(ArgType1 arg1) {
    CallbackArgs data;
    data.arg1 = arg1;
    NotifyFromCallback("CallbackApi1", data);
  }

  void CallbackApi2(ArgType2 arg2) {
    CallbackArgs data;
    data.arg2 = arg2;
    NotifyFromCallback("CallbackApi2", data);
  }
};

Test(MyTest) {
  auto result = cb_.WaitForCallback("CallbackApi1");
  // cb_ is an instance of MyCallback; result is an instance of
  // ::testing::VtsHalHidlTargetCallbackBase::WaitForCallbackResult.
  EXPECT_TRUE(result.no_timeout);  // Check wait did not time out.
  EXPECT_TRUE(result.args);  // Check CallbackArgs is received (not
                             // nullptr). This is optional.
  // Here check the value of args using the pointer result.args.
  result = cb_.WaitForCallback("CallbackApi2");
  // Here check the value of args using the pointer result.args.

  // Additionally, a test can wait for one of multiple callbacks.
  // In this case, wait will return when any of the callbacks in the provided
  // name list is called.
  result = cb_.WaitForCallbackAny(<vector_of_string>);
  // When vector_of_string is not provided, all callback functions will
  // be monitored. The name of the callback function that was invoked
  // is stored in the returned result.
}


VtsHalHidlTargetTestEnvBase is a template for creating a gtest environment. It provides an interface for developers to register the HAL services accessed by the test, and the VTS test framework uses this info to support:

To use VtsHalHidlTargetTestEnvBase, first define a testEnvironment based on VtsHalHidlTargetTestEnvBase and register the target HALs accessed by the test (android.hardware.Foo@1.0::IFoo in the example):

#include <android/hardware/foo/1.0/IFoo.h>
#include <VtsHalHidlTargetTestEnvBase.h>

class testEnvironment : public ::testing::VtsHalHidlTargetTestEnvBase {
  virtual void registerTestServices() override { registerTestService<IFoo>(); }
};

testEnvironment* testEnv;

Next, register the test environment in your main() function and call init() to initialize the test environment:

int main(int argc, char** argv) {
  testEnv = new testEnvironment();
  ::testing::AddGlobalTestEnvironment(testEnv);
  ::testing::InitGoogleTest(&argc, argv);
  testEnv->init(argc, argv);
  return RUN_ALL_TESTS();
}

Now, wherever you need to call getService() in your test to get a client handle to the target HAL service, pass the service name retrieved from the test environment (i.e., the service name of the HAL instance identified by the VTS test framework) as the parameter, for example:


The docstrings in the source code under test/vts/runners/target/ contain a more detailed explanation of each API.

Test template to write your host-side HIDL HAL test

VTS also provides a Python test template to help developers write host-side HIDL HAL tests. A test written with the hal_hidl_host_test template automatically supports the following VTS features:

To use the hal_hidl_host_test template, your test will look like:

class ExampleTest(hal_hidl_host_test.HalHidlHostTest):
    # Target HAL service(s); IFoo here follows the earlier example.
    TEST_HAL_SERVICES = {"android.hardware.foo@1.0::IFoo"}

    def setUpClass(self):
        super(ExampleTest, self).setUpClass()

    def testCase1(self):
        ...

As shown in the example code, the first step is to define a test class based on hal_hidl_host_test.HalHidlHostTest. Then we explicitly declare the target HAL service(s) of the test as TEST_HAL_SERVICES; this step is similar to the HAL service registration in the target-side HIDL HAL test and informs the VTS test framework of the HAL service(s) accessed by the test. Finally, when we call InitHidlHal to get the client handler of the target HAL, we pass the hw_binder_service_name parameter by calling the getHalServiceName() method provided by the template, which automatically identifies the HAL service instance available on the target device and passes the corresponding service name.

Customize your test configuration (Optional)

AndroidTest.xml file

Pre-test file pushes from host to device can be configured for `VtsFilePusher` in AndroidTest.xml with the push-group option. An individual file push can be defined with push. Please refer to the TradeFed documentation for more detail.

Python module dependencies can be specified with the dep-module option for VtsPythonVirtualenvPreparer in AndroidTest.xml. This triggers the runner to install or update the modules using pip in the Python virtual environment before running tests.

    <multi_target_preparer class="">
        <option name="dep-module" value="numpy" />
        <option name="dep-module" value="scipy" />
        <option name="dep-module" value="matplotlib" />
        <option name="dep-module" value="Pillow" />
    </multi_target_preparer>

`VtsPythonVirtualenvPreparer` will install a set of packages including future, futures, enum, and protobuf by default. To add a dependency module, please add `<option name="dep-module" value="<module name>" />` inside `VtsPythonVirtualenvPreparer` in `AndroidTest.xml`.

Test case config

Optionally, a .runner_conf file can be used to pass variables in JSON format to a test case.

To add one, create a .runner_conf file under your project directory using the project name:

$ vi test/vts/testcases/host/<your project directory>/<your project name>.runner_conf

Then edit its contents to:

{
    <key_1>: <value_1>,
    <key_2>: <value_2>
}

And in your test case Python class, you can get the JSON value by using the self.getUserParams method.

For example:

    key1 = self.getUserParams("key_1")
    logging.info("%s: %s", "key_1", key1)

At last, add the following option to the test runner class in AndroidTest.xml:

<option name="test-config-path" value="vts/testcases/<your project directory>/<your project name>.runner_conf" />

If there is a conflict, your config file will overwrite fields in the default JSON object defined at test/vts/tools/vts-tradefed/res/default/DefaultTestCase.config.
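
The overwrite behavior amounts to a JSON merge in which keys from your .runner_conf take precedence over the defaults. A minimal sketch, where DEFAULT_CONFIG is a hypothetical stand-in for the contents of DefaultTestCase.config:

```python
import json

# Hypothetical defaults; the real ones live in DefaultTestCase.config.
DEFAULT_CONFIG = {"timeout": 60, "log_level": "WARNING"}

def load_test_config(runner_conf_text):
    """Merge a .runner_conf JSON object over the default config.

    Keys present in both take the value from the .runner_conf file,
    matching the overwrite behavior described above.
    """
    config = dict(DEFAULT_CONFIG)
    config.update(json.loads(runner_conf_text))
    return config

merged = load_test_config('{"log_level": "INFO", "key_1": "value_1"}')
```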

Binary Test Configurations

Binary tests, including gtests and HIDL tests, have the following AndroidTest.xml configuration options:

- binary-test-source (string): Binary test source paths relative to the vts test-case directory on host.
  Example: DATA/nativetest/test

- binary-test-working-directory (string): Working directories (device-side path).
  Example: /data/local/tmp/testing/

- binary-test-envp (string): Environment variables for the binary.
  Example: PATH=/new:$PATH

- binary-test-args (string): Test arguments or flags.
  Example: --gtest_filter=test1

- binary-test-ld-library-path (string): LD_LIBRARY_PATH environment variable.
  Example: /data/local/tmp/lib

- binary-test-disable-framework (boolean): Run adb stop to turn off the Android Framework before the test.
  Example: true

- binary-test-stop-native-servers (boolean): Stop all properly configured native servers during the testing.
  Example: true

- binary-test-type (string): Template type. Other template types extend from this template, but you don't have to specify this option for this template because you already specified binary-test-source.
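
To see how these options fit together, here is a rough sketch of composing a device-side shell command from them. This is illustrative only, not how VTS actually builds the command:

```python
def build_command(binary_path, working_dir=None, envp=None, args=None,
                  ld_library_path=None):
    """Compose a device shell command from binary test options.

    Sketch only: maps working directory, environment variables,
    LD_LIBRARY_PATH, and test arguments onto a single shell line.
    """
    parts = []
    if working_dir:
        parts.append("cd %s &&" % working_dir)       # working directory
    if ld_library_path:
        parts.append("LD_LIBRARY_PATH=%s" % ld_library_path)
    if envp:
        parts.append(envp)                           # extra environment vars
    parts.append(binary_path)                        # the pushed test binary
    if args:
        parts.append(args)                           # e.g. --gtest_filter=...
    return " ".join(parts)

cmd = build_command("/data/local/tmp/vts_sample_binary_test",
                    working_dir="/data/local/tmp/testing/",
                    ld_library_path="/data/local/tmp/lib",
                    args="--gtest_filter=test1")
```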

For more information, please refer to this page.

SoC vendors and ODMs may define and implement HAL extensions for their products. VTS supports testing those HAL extensions: as long as the test cases follow the target-/host-side templates, VTS can run them like the compliance tests in the vts-hal plan.

The source code of the HAL can be placed in any project. The naming convention for a HAL extension is vendor.<VENDOR_NAME>.<HAL_NAME>@<HAL_VERSION>; for the corresponding VTS module, it is VtsHal<VENDOR_NAME><HAL_NAME><HAL_VERSION><TEST_TYPE>. For example, vendor.example.light@2.0 and VtsHalExampleLightV2_0Target.
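
The naming convention can be sketched as a small helper that derives the VTS module name from a HAL package name. This is an illustration of the convention above; HAL names with multiple dot-separated parts may need extra handling:

```python
def vts_module_name(hal_package, test_type="Target"):
    """Derive the VTS module name from a vendor HAL package name.

    Follows the convention described above, e.g.
    vendor.example.light@2.0 -> VtsHalExampleLightV2_0Target.
    Assumes a simple package of the form vendor.<VENDOR>.<HAL>@<VERSION>.
    """
    package, version = hal_package.split("@")
    tokens = package.split(".")        # ["vendor", "<VENDOR>", "<HAL>"]
    vendor, hal = tokens[1], tokens[-1]
    version_str = "V" + version.replace(".", "_")
    return "VtsHal%s%s%s%s" % (vendor.capitalize(), hal.capitalize(),
                               version_str, test_type)

print(vts_module_name("vendor.example.light@2.0"))
```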

In this codelab, let's assume we have a HAL extension defined under vendor/qcom/light/2.0 as vendor.qcom.light@2.0.

1. Suppose we have already defined a test binary VtsHalQcomLightV2_0TargetTest under vendor/qcom/light/2.0/vts/functional for our target-side test. We first need to include the binary in the VTS package by adding it to ${ANDROID_BUILD_TOP}/test/vts/tools/build/tasks/list/vendor.qcom.light@2.0

2. Create VTS configurations for the test case. A script located under ${ANDROID_BUILD_TOP}/test/vts-testcase/hal/script helps developers generate the config files (Android.mk and AndroidTest.xml) for VTS HAL tests. An example usage of the script is:

$ cd ${ANDROID_BUILD_TOP}/test/vts-testcase/hal/script
$ ./ --test_type target --package_root vendor.qcom --path_root vendor/qcom/ --test_binary_file VtsHalQcomLightV2_0TargetTest.cpp vendor.qcom.light@2.0

The generated config files will be stored under the same directory that stores the .hal files (${ANDROID_BUILD_TOP}/vendor/qcom/light/2.0/ in this case). You can also customize the directory used to store the config files by passing the --test_config_dir option to the script.

3. Build and run the VTS module on a device with the HAL extension.

$ m vts -j
$ vts-tradefed
> run vts-codelab -m VtsHalQcomLightV2_0Target

The HAL adapter test is designed to verify the backward compatibility of a system image with an older vendor image. The idea is to adapt a HAL service to a lower minor version (e.g., adapting android.hardware.vibrator@1.1 down to android.hardware.vibrator@1.0) and verify that the device still functions, as long as the system claims to be compatible with the lower version of the given HAL.
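The versioning rule behind adapters can be sketched as follows. parse_fq_name and can_adapt are hypothetical helpers illustrating the compatibility condition (same package, same major version, lower minor version), not part of the VTS framework:

```python
def parse_fq_name(fq_name):
    """Split 'android.hardware.vibrator@1.1' into (package, major, minor)."""
    package, version = fq_name.split("@")
    major, minor = (int(x) for x in version.split("."))
    return package, major, minor


def can_adapt(hal, target):
    """An adapter may lower a HAL to target iff the package and major version
    match and the target minor version is lower."""
    pkg1, maj1, min1 = parse_fq_name(hal)
    pkg2, maj2, min2 = parse_fq_name(target)
    return pkg1 == pkg2 and maj1 == maj2 and min2 < min1


print(can_adapt("android.hardware.vibrator@1.1",
                "android.hardware.vibrator@1.0"))  # True
```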


To write a HAL adapter test, there is no need to write any test code, as these tests are typically based on existing CTS tests that are already packaged. What we need to do is create the right test configuration.

As usual, you need an and an AndroidTest.xml to deploy the HAL adapter test as part of VTS.

Let's assume you want to create an adapter for HAL android.hardware.vibrator@1.1. In your AndroidTest.xml, first configure to push the adapter script as:

<target_preparer class="">
    <option name="abort-on-push-failure" value="true"/>
    <option name="push" value="script/target/>/data/local/tmp/"/>
</target_preparer>

Next, configure the VtsHalAdapterPreparer as:

<target_preparer class="">
    <option name="adapter-binary-name" value="android.hardware.vibrator@1.0-adapter"/>
    <option name="hal-package-name" value="android.hardware.vibrator@1.1"/>
</target_preparer>

Add a device health test to the HAL adapter test module

The device health test is a test that verifies the device system is running normally without any service or app crashing. It is a basic test to verify that the device functions with a HAL adapter. To add a device health test to your HAL adapter test module, add the following lines to your AndroidTest.xml:

<target_preparer class="">
    <option name="test-file-name" value="DeviceHealthTests.apk" />
    <option name="cleanup-apks" value="true" />
</target_preparer>
<test class="">
    <option name="package" value=""/>
    <option name="runner" value=""/>
</test>

Add CTS test case with the HAL adapter test module

Various CTS test cases have been introduced to verify framework functionality based on the vendor image. We can pick one or more CTS test cases that heavily exercise the HAL under test (i.e., android.hardware.vibrator@1.0 in this case) and add them to your HAL adapter test. Let's say we have picked CtsDevicePolicyManagerTestCases and CtsMonkeyTestCases for this adapter test; just add the following lines to your AndroidTest.xml:

<include name="CtsDevicePolicyManagerTestCases.config"/>
<include name="CtsMonkeyTestCases.config"/>

The final AndroidTest.xml looks like this:

<configuration description="Config for VTS VtsHalVibratorV1_1Adapter test cases">
    <option key="plan" name="config-descriptor:metadata" value="vts-hal-adapter"/>
    <target_preparer class="">
        <option name="abort-on-push-failure" value="true"/>
        <option name="push" value="script/target/>/data/local/tmp/"/>
    </target_preparer>
    <target_preparer class="">
        <option name="adapter-binary-name" value="android.hardware.vibrator@1.0-adapter"/>
        <option name="hal-package-name" value="android.hardware.vibrator@1.1"/>
    </target_preparer>
    <target_preparer class="">
        <option name="test-file-name" value="DeviceHealthTests.apk" />
        <option name="cleanup-apks" value="true" />
    </target_preparer>

    <test class="">
        <option name="package" value=""/>
        <option name="runner" value=""/>
    </test>
    <include name="CtsDevicePolicyManagerTestCases.config"/>
    <include name="CtsMonkeyTestCases.config"/>
</configuration>

HAL API call latency profiling

By enabling API call latency profiling for your VTS HIDL HAL test, you are expected to get:

1. Add profiler library to VTS

To enable profiling for your HAL testing, we need to add the corresponding profiler library in: The name of the profiling library follows the pattern:

We will use the NFC HAL as a running example throughout this section, so the profiler library name is

2. Modify Your VTS Test Case

If you have not already, the Codelab for Host-Driven Tests gives an overview of how to write a VTS test case. This section assumes you have completed that codelab and have at least one VTS test case (either host-side or target-side) for which you would like to enable profiling.

2.1. Target-Side Tests

This subsection describes how to enable profiling for target-side tests. To enable profiling for host-side tests, follow the same steps, replacing target with host everywhere.

Copy an existing test directory

$ cd test/vts-testcase/hal/nfc/V1_0/
$ cp target target_profiling -rf

Note that nfc can be replaced by the name of your HAL, and V1_0 by the version of your HAL in the format V<MAJOR_VERSION>_<MINOR_VERSION>.

Then rename the test name from VtsHalNfcV1_0Target to VtsHalNfcV1_0TargetProfiling everywhere.

Add the following lines to the corresponding AndroidTest.xml file under the target_profiling directory to push the profiler libraries to the target.

<option name="push" value="DATA/lib/hal_profiling_lib->/data/local/tmp/32/"/>
<option name="push" value="DATA/lib64/hal_profiling_library->/data/local/tmp/64/"/>

Note: if the HAL under test depends on another HAL (e.g., android.hardware.nfc@2.0 depends on android.hardware.nfc@1.0), we need to push the profiler library for the dependency as well.

Add the following line to the corresponding AndroidTest.xml file under the target_profiling directory to enable profiling for the test.

<option name="enable-profiling" value="true" />

An example AndroidTest.xml file looks like:

<configuration description="Config for VTS VtsHalNfcV1_0TargetProfiling test cases">
    <option name="config-descriptor:metadata" key="plan" value="vts-hal-profiling" />
    <target_preparer class="">
        <option name="push-group" value="HalHidlTargetProfilingTest.push" />
        <option name="cleanup" value="true"/>
        <option name="push" value="DATA/lib/>/data/local/tmp/32/"/>
        <option name="push" value="DATA/lib64/>/data/local/tmp/64/"/>
    </target_preparer>
    <multi_target_preparer class="" />
    <test class="">
        <option name="test-module-name" value="VtsHalNfcV1_0TargetProfiling" />
        <option name="binary-test-source" value="_32bit::DATA/nativetest/VtsHalHalV1_0TargetTest/VtsHalHalV1_0TargetTest" />
        <option name="binary-test-source" value="_64bit::DATA/nativetest64/VtsHalHalV1_0TargetTest/VtsHalHalV1_0TargetTest" />
        <option name="binary-test-type" value="hal_hidl_gtest" />
        <option name="enable-profiling" value="true" />
        <option name="precondition-lshal" value="android.hardware.nfc@1.0"/>
        <option name="test-timeout" value="1m" />
    </test>
</configuration>

3. Schedule the profiling test

Add the following line to vts-serving-staging-hal-hidl-profiling.xml:

<option name="compatibility:include-filter" value="VtsHalProfilingTestName" />

4. Subscribe the notification alert emails

Please check notification page for the detailed instructions.

Basically, everything is now set, so let's wait a day or so and then visit your VTS Dashboard. At that time, you should be able to add VtsHalNfcV1_0TargetProfiling to your favorite list.

That is all you need to do in order to subscribe to alert emails, which will be sent if any notable performance degradations are found by your profiling tests.

Also, if you click VtsHalNfcV1_0TargetProfiling on the dashboard main page, the test result page shows up; its top-left side shows the list of APIs that have measured performance data.

5. Where to find the trace files?

All the trace files generated during the tests are by default stored under /tmp/vts-test-trace/.

To change the directory that stores the trace files, create a config file, e.g., Test.config, under the test directory with

    "profiling_trace_path": "path_to_store_the_trace_file"

and add the following lines to the corresponding AndroidTest.xml file:

<option name="save-trace-file-remote" value="true" />
<option name="test-config-path" value="path/to/your/test/Test.config" />

Custom profiling points and post-processing

1. Prerequisites

Let's assume you have created a performance benchmark binary that can run independently on the device. We use BinderPerformanceTest, an available test module in the vts-performance test plan, as an example. The source code of the performance benchmark is located at test/vts-testcase/performance.

BinderPerformanceTest measures roundtrip HwBinder RPC latency (nanoseconds) for the following message sizes (bytes): [4, 8, 16, 32, 64, 128, 256, 512, 1024, 2k, 4k, 8k, 16k, 32k, 64k], and outputs the roundtrip time as real time, CPU time, and number of iterations. Here is an example of the benchmark output when running it directly on the device:

2. Integrate the benchmark as a VTS test

2.1. Add benchmark binary to VTS

To package benchmark binary with VTS, add it to after the vts_test_bin_packages variable

2.2. Add VTS host side script

The host-side script controls the benchmark execution and the processing of the benchmark results. It typically contains the following major steps.

i. Register device controller and invoke shell on the target device.

def setUpClass(self):
    self.dut = self.registerController(android_device)[0]"one")

ii. Setup the command to run the benchmark on the device.

results =[
    "%s" % path_to_my_benchmark_test,
])

where path_to_my_benchmark_test represents the full path of the benchmark binary on the target device. The default path is /data/local/tmp/my_benchmark_test

iii. Validate benchmark test results.

            "Benchmark test failed.")

iv. Parse the benchmark test results and upload the metrics to VTS web dashboard.

Depending on the output format of the test results, we need to parse the STDOUT content in the returned results into performance data points. Currently, VTS supports processing and displaying two performance data types. One is the timestamp sample, which records the start and end timestamps for a particular operation. The other is the vector data sample, which records a list of profiling data along with data labels.

Take the vector data sample as an example: suppose we have parsed the benchmark results into two vectors. One stores the performance data (e.g., the latency of the API call); the other stores the corresponding data labels (e.g., the input size of the data package).
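The parsing step can be sketched as below. The output format assumed here (one "<label> <latency_ns>" pair per line) is purely illustrative, not the actual BinderPerformanceTest output format, and parse_benchmark_output is a hypothetical helper:

```python
def parse_benchmark_output(stdout):
    """Parse benchmark STDOUT into parallel label and value vectors.

    Assumes each line is "<label> <latency_ns>"; other lines are skipped.
    """
    labels, values = [], []
    for line in stdout.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[1].isdigit():
            labels.append(parts[0])
            values.append(int(parts[1]))
    return labels, values


labels, values = parse_benchmark_output("4 1200\n8 1350\n16 1500\n")
print(labels)  # ['4', '8', '16']
print(values)  # [1200, 1350, 1500]
```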

Call AddProfilingDataLabeledVector to upload the vector data sample to VTS web as follows:

            "Benchmark name",

2.3. Configure the VTS test

Follow the same instructions in Codelab for Host-Driven Tests ("Write a VTS Test" section) to create a host-side VTS test using the host-side script created in section 2.2.


Support for native coverage through VTS depends on a functioning instance of the VTS Dashboard, including integration with a build server and a Gerrit server. See the documentation for VTS Dashboard setup and configuration before proceeding.

Building a Device Image

The first step in measuring coverage is creating a device image that is instrumented for gcov coverage collection. This can be accomplished with a flag in the device manifest and a few build-time flags.

Let's add the following code segment to the file:

# Set if a device image has the VTS coverage instrumentation.
ifeq ($(NATIVE_COVERAGE),true)

This will have no impact on the device when coverage is disabled at build time but will add a read-only device property in the case when coverage is enabled.

Next, we can build a device image. The continuous build server must be configured to execute the build command with two additional flags: NATIVE_COVERAGE and COVERAGE_PATHS. The former is a global flag to enable or disable coverage instrumentation; the latter specifies comma-separated paths to the source that should be instrumented for coverage.

As an example, consider measuring coverage on the NFC implementation. We can configure the build command as follows:

> make NATIVE_COVERAGE=true COVERAGE_PATHS="hardware/interfaces/nfc,system/nfc"

Modifying Your Test for Host-Driven HIDL HAL Tests

In most cases, no additional test configuration is needed to enable coverage. By default, coverage processing is enabled on the target if it is coverage instrumented (as per the previous section) and the test is a HAL test (either a target-side HAL test or a host-side HAL test).

Configuring the Test for Coverage (optional)

The VTS framework automatically derives the information it needs to process the coverage data emitted from the device after test execution and to query Gerrit for the relevant source code. However, it relies on the assumption that there is a symmetry between git project names and the full Android source tree; specifically, the project name and the path to the project from the Android root may differ by at most one relative node in order for the VTS framework to identify the source code. For instance, the following two paths would both be identified as the project platform/test/vts:


On the other hand, a project with the path "android/platform/test/vts" would not be automatically matched with the project by name "platform/test/vts".

In cases where the project name differs significantly from the project's path from the Android root, a manual configuration must be specified in the test configuration JSON file. We must specify a list of dictionaries, each containing the module name (i.e., the module name in the makefile for the binary or shared library) as well as the git project name and path for the source code included in the module. For example, add the following to the configuration JSON file:

"modules": [{
               "module_name": "<module name>",
               "git_project": {
                                  "name": "<git project name>",
                                  "path": "<path to git project root>"

For the lights HAL, the test configuration file would look like:

"modules": [{
        "module_name": "vendor/lib64/hw/lights.msm8994",
        "git_project": {
                          "name": "platform/hardware/qcom/display",
                          "path": "hardware/qcom/display"
        "module_name": "system/lib64/hw/android.hardware.light@2.0-impl",
        "git_project": {
                          "name": "platform/hardware/interfaces",
                          "path": "hardware/interfaces"

Running VTS

If you run a VTS test with the vts-hal plan on a coverage-instrumented device, coverage data will automatically be collected and processed by the VTS framework with no additional effort required. The processed coverage data will be uploaded to the VTS Dashboard along with the test results so that the source code can be visualized with a line-level coverage overlay. If you want to run a VTS test with a custom plan and measure the coverage rate of that test, add the following line in your-plan.xml:

<target_preparer class="" />

Note that two external dependencies are necessary to support coverage:

  1. A Gerrit server with REST API must be available and configured to integrate with the VTS Dashboard. See the Dashboard setup for integration directions.
  2. A build artifact server with a REST API must be configured to integrate with the VTS runner. This allows the runner to fetch both a build-time coverage artifact from the building step above and the source version information for each git project within the repository at build time. The VTS framework expects a JSON file named "BUILD_INFO" which contains a dictionary of source project names to git revisions under the key "repo-dict".
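Reading the revision map out of such a BUILD_INFO file can be sketched as follows; the project names and revisions in this snippet are made up for illustration:

```python
import json

# Minimal sketch: parse a BUILD_INFO artifact and extract the dictionary of
# source project names to git revisions stored under the "repo-dict" key.
build_info = json.loads(
    '{"repo-dict": {"platform/hardware/interfaces": "e35089b8", '
    '"platform/test/vts": "abc123"}}')
revisions = build_info["repo-dict"]
print(revisions["platform/test/vts"])  # abc123
```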

Visualizing Coverage

After completing the build step and the test execution step, the VTS Dashboard should display a test result with a link in the row labeled "Coverage". This will display a user interface similar to the one below.

Lines executed are highlighted in green, while lines not exercised by the test are highlighted in red. White lines are not executable lines of code, such as comments and structural components of the coding language.

Generate raw coverage report

Besides the visualized coverage displayed on the dashboard, we can also configure the test to output a raw coverage report for later processing. The raw coverage report is a proto file similar to:

coverage {
  file_path: "light/2.0/default/service.cpp"
  project_name: "platform/hardware/interfaces"
  revision: "e35089b8bbae459fbd601fc1dcf2ff80024e32e0"
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: -1
  line_coverage_vector: 0
  total_line_count: 1
  covered_line_count: 0
}

As we can see, the report contains coverage info for each file, with basic info such as file_path and project_name. Each (line_coverage_vector: number) pair indicates the number of times that line was executed during the test, where -1 means the line is not executable. The aggregated coverage data is shown at the end, including the total number of executable lines in the file and the number of lines covered by the test.
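The aggregation at the end of the report can be sketched as follows. summarize is a hypothetical helper that reproduces the counting rule described above:

```python
def summarize(line_coverage_vector):
    """Aggregate a line_coverage_vector: -1 marks a non-executable line,
    0 an executable but uncovered line, and >0 the execution count."""
    executable = [c for c in line_coverage_vector if c >= 0]
    return {
        "total_line_count": len(executable),
        "covered_line_count": sum(1 for c in executable if c > 0),
    }


# Matches the sample report above: 25 non-executable lines and one uncovered line.
print(summarize([-1] * 25 + [0]))  # {'total_line_count': 1, 'covered_line_count': 0}
```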

To generate such raw coverage report, add the following lines to your AndroidTest.xml:

<option name="output-coverage-report" value="true" />

The coverage report will by default be stored under /tmp/vts-test-coverage/. You can change the output directory by adding the following line in AndroidTest.xml:

<option name="coverage-report-path" value="your/directory/" />

Measure CTS Native Code Coverage

A similar approach can also be applied to CTS tests to measure the native code exercised during the test. The coverage data will be collected at the end of each CTS test module and processed to generate a raw coverage report. Unfortunately, we currently do not support displaying CTS test results on the VTS Dashboard, so there is no visualized coverage report for CTS tests.

To measure native code coverage for CTS tests, first build a local device image:

> . build/
> lunch <product name>-userdebug
> make NATIVE_COVERAGE=true COVERAGE_PATHS="<list of paths to instrument with coverage>"

Next, we flash the device with the coverage-instrumented device image and run a CTS test with the cts-hal-coverage plan. If you want to run the test with a custom plan and measure its code coverage, add the following lines to your-plan.xml:

<target_preparer class="" >
    <option name="coverage-report-dir" value="vts-coverage" />
</target_preparer>
<metrics_collector class="" />

You will find the raw coverage report after the test run under the vts-coverage subdirectory under the test result directory for your test invocation. For example: $BUILD_TOP/out/host/linux-x86/vts/android-vts/results/2018.04.20_15.04.05/vts-coverage

Offline Coverage

If the user would like to measure coverage without integrating with the VTS Dashboard or a build server, offline coverage measurement is also possible. First, we build a local device image:

> . build/
> lunch <product name>-userdebug
> make NATIVE_COVERAGE=true COVERAGE_PATHS="<list of paths to instrument with coverage>"

Next, we flash the device with the coverage-instrumented device image and run a VTS test with the vts-hal plan. Finally, we can process the files by pulling the output GCDA files from the device using adb and matching them with the source code and GCNO files produced at build time. The file structure is symmetric across the Android source code, the out directory, and the data partition of the device.

These three inputs can be fed into a tool such as gcov or lcov to produce a local coverage report.
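The GCDA-to-GCNO matching can be sketched as below, relying on the symmetric file layout described above. The helper name and all paths are illustrative assumptions:

```python
import os

def gcno_for_gcda(gcda_path, device_root, out_root):
    """Map a .gcda file pulled from the device to the .gcno file with the
    same relative path in the build output directory."""
    rel = os.path.relpath(gcda_path, device_root)
    return os.path.join(out_root, os.path.splitext(rel)[0] + ".gcno")


print(gcno_for_gcda("/pulled/obj/nfc/Nfc.gcda", "/pulled", "/out"))
# /out/obj/nfc/Nfc.gcno
```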

Background and FAQs


To measure coverage, the source file is divided into units called basic blocks, each of which may contain one or more lines of code. All code in the same basic block is accounted for together. Some lines of code (e.g., variable declarations) are not executable and thus belong to no basic block. Some lines of code actually compile to several executable instructions (e.g., shorthand conditional operators) and belong to more than one basic block.

The generated coverage report displays a color-coded source file with numerical annotations on the left margin. The row fill indicates whether or not a line of code was executed when the tests were run: green means it was covered, red means it was not. The corresponding numbers on the left margin indicate the number of times the line was executed. Lines of code that are not colored and have no execution count in the margin are not executable instructions.


Why do some lines have no coverage information?

The line of code is not an executable instruction. For example, comments and structural coding language elements do not reflect instructions to the processor.

Why are some lines called more than expected?

Since some lines of code may belong to more than one basic block, they may appear to have been executed more than expected. For example, an inline conditional statement may cause a line to belong to two basic blocks. Even if a line of code belongs to only one basic block, it may be displayed as having been executed more than it actually was. This can occur when one or more other lines of code in the same basic block were executed, causing the execution count of the whole basic block to increase.

What does HIDL HAL Interface Fuzzer do?

The HIDL HAL Interface Fuzzer (interface fuzzer) is a fuzzer binary built using LLVM asan, sancov, and libFuzzer. It runs against a user-specified target HIDL HAL, calling HAL functions in random order with random inputs until a terminating condition is reached, e.g., a HAL crash, a sanitizer violation, or a timeout.
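As a toy illustration of this loop (not the real libFuzzer-based binary), a random-order driver could look like the sketch below, with an exception standing in for a HAL crash and an iteration budget standing in for a timeout:

```python
import random

def fuzz(functions, iterations=1000, seed=0):
    """Call functions in random order with random integer inputs until a
    crash (exception) or the iteration budget is exhausted."""
    rng = random.Random(seed)
    for i in range(iterations):
        fn = rng.choice(functions)
        arg = rng.randint(-(2**31), 2**31 - 1)
        try:
            fn(arg)
        except Exception as e:
            return "crash at iteration %d: %r" % (i, e)
    return "no crash"
```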

More information about asan, sancov, and libFuzzer.


All the code for the HIDL HAL interface fuzzer is already built and packaged. In other words, no additional test code needs to be written or compiled. Only configuration is needed to run the interface fuzzer against a target HAL.

As usual, you need an and an AndroidTest.xml to deploy the fuzz test as part of VTS.

Assume your test is named VtsHalBluetoothV1_0IfaceFuzzer. Then AndroidTest.xml should look something like this:

<target_preparer class="">
    <option name="push-group" value="IfaceFuzzerTest.push"/>
</target_preparer>
<multi_target_preparer class=""/>
<test class="">
    <option name="test-module-name" value="VtsHalBluetoothV1_0IfaceFuzzer"/>
    <option name="hal-hidl-package-name" value="android.hardware.bluetooth@1.0"/>
    <option name="test-case-path" value="vts/testcases/fuzz/template/iface_fuzzer_test/iface_fuzzer_test"/>
    <option name="test-timeout" value="3h"/>
</test>

This should look fairly standard. The only things to pay attention to are these three lines:

  1. This option specifies which files need to be pushed onto the device. Contents of IfaceFuzzerTest.push.
<option name="push-group" value="IfaceFuzzerTest.push"/>
  2. This option specifies the Bluetooth HAL as our fuzz target.
<option name="hal-hidl-package-name" value="android.hardware.bluetooth@1.0"/>
  3. This option specifies the host code used to deploy the fuzzer binary.
<option name="test-case-path" value="vts/testcases/fuzz/template/iface_fuzzer_test/iface_fuzzer_test"/>


To run the fuzzer, you need to compile VTS with the appropriate asan and sancov build options. From the Android source root directory, do:

$ SANITIZE_TARGET="address coverage"  make vts -j64
$ vts-tradefed run commandAndExit vts -l VERBOSE --module VtsHalBluetoothV1_0IfaceFuzzer

This will run the VtsHalBluetoothV1_0IfaceFuzzer test, print logs to the screen, and return to the shell.


You will have to rely on logs to identify fuzzer failures. If the fuzzer encounters an error (e.g., a segfault or buffer overflow), you will see something like this in your log:

==15644==ERROR: AddressSanitizer: SEGV on unknown address 0x000000000110 (pc 0x0077f8776e80 bp 0x007fe11bed90 sp 0x007fe11bed20 T0)
==15644==The signal is caused by a READ memory access.
==15644==Hint: address points to the zero page.
    #0 0x77f8776e7f  (/vendor/lib64/hw/
    #1 0x77f87747e3  (/vendor/lib64/hw/
    #2 0x77f87e384b  (/system/lib64/
    #3 0x79410ae4df  (/data/local/tmp/64/
    #4 0x794498c90f  (/data/local/tmp/64/
    #5 0x5f42e82ca3  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0xaca3)
    #6 0x5f42e8f08f  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0x1708f)
    #7 0x5f42e8f27b  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0x1727b)
    #8 0x5f42e900a7  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0x180a7)
    #9 0x5f42e90243  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0x18243)
    #10 0x5f42e88cff  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0x10cff)
    #11 0x5f42e8655f  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0xe55f)
    #12 0x7944aef5f3  (/system/lib64/
    #13 0x5f42e8029b  (/data/local/tmp/libfuzzer_test/vts_proto_fuzzer+0x829b)

AddressSanitizer can not provide additional info.

This means that the fuzzer was able to trigger a segfault somewhere in the Bluetooth HAL implementation. Unfortunately, we don't have a way to symbolize this stack trace yet.

However, the log will contain the last call sequence batch that triggered the failure.


Let's assume you have trace files for your test (e.g., by running the tests with profiling enabled; see the instructions about HAL API call latency profiling). The trace files should be stored under test/vts-testcase/hal-trace/<HAL_NAME>/<HAL_VERSION>/

where <HAL_NAME> is the name of your HAL and <HAL_VERSION> is the version of your HAL with format V<MAJOR_VERSION>_<MINOR_VERSION>.

We will use the Vibrator HAL as a running example throughout this section, so the traces are stored under test/vts-testcase/hal-trace/vibrator/V1_0/

Create a HIDL HAL replay test

Follow the same instructions in Codelab for Host-Driven Tests to create a host-side VTS test named VtsHalVibratorV1_0TargetReplay.

Add the following line to the corresponding AndroidTest.xml under the test configuration to use the VTS replay test template.

<option name="binary-test-type" value="hal_hidl_replay_test" />

Add the following line to the corresponding AndroidTest.xml under the test configuration to add the trace file for replay.

<option name="hal-hidl-replay-test-trace-path" value="test/vts-testcase/hal-trace/vibrator/V1_0/vibrator.vts.trace" />

Note: if you want to replay multiple traces within the test, add each trace file using the above configuration.

Add the following line to the corresponding AndroidTest.xml under the test configuration to indicate the HIDL HAL package name for the test.

<option name="hal-hidl-package-name" value="android.hardware.vibrator@1.0" />

An example AndroidTest.xml for a replay test looks as follows:

<configuration description="Config for VTS VtsHalVibratorV1_0TargetReplay test cases">
    <option name="config-descriptor:metadata" key="plan" value="vts-hal-replay" />
    <target_preparer class="">
        <option name="abort-on-push-failure" value="false"/>
        <option name="push-group" value="HalHidlHostTest.push"/>
        <option name="cleanup" value="true" />
        <option name="push" value="spec/hardware/interfaces/vibrator/1.0/vts/Vibrator.vts->/data/local/tmp/spec/target.vts" />
        <option name="push" value="DATA/lib/>/data/local/tmp/32/"/>
        <option name="push" value="DATA/lib64/>/data/local/tmp/64/"/>
    </target_preparer>
    <multi_target_preparer class=""/>
    <test class="">
        <option name="test-module-name" value="VtsHalVibratorV1_0TargetReplay"/>
        <option name="binary-test-type" value="hal_hidl_replay_test" />
        <option name="hal-hidl-replay-test-trace-path" value="test/vts-testcase/hal-trace/vibrator/V1_0/vibrator.vts.trace" />
        <option name="hal-hidl-package-name" value="android.hardware.vibrator@1.0" />
        <option name="test-timeout" value="2m"/>
    </test>
</configuration>

Schedule the replay test

Add the following line to vts-serving-staging-hal-hidl-replay.xml

<option name="compatibility:include-filter" value="VtsHalVibratorV1_0TargetReplay"/>

Basically, everything is now set, so let's wait a day or so and then visit your VTS Dashboard. At that time, you should be able to add VtsHalVibratorV1_0TargetReplay to your favorite list.

Multi-device tests are supported since VTS 9.0. Here is a sample multi-device test:

Configure test plan

<configuration description="VTS Codelab Plan">
    <device name="device1">
        <build_provider class="" />
        <target_preparer class="" />
        <target_preparer class="" />
    </device>
    <device name="device2">
        <build_provider class="" />
        <target_preparer class="" />
        <target_preparer class="">
            <option name="push-group" value="HostDrivenTest.push" />
        </target_preparer>
    </device>
    <option name="compatibility:include-filter" value="VtsCodelabHelloWorldMultiDeviceTest" />
</configuration>

Here, two device entries are set in the test plan config to request two devices, and VtsFilePusher options are used for each device to push the default host-driven driver files.

Call both devices' shells in a test script

def setUpClass(self):'number of devices: %s', self.android_devices)
    asserts.assertEqual(len(self.android_devices), 2, 'number of devices is wrong.')
    self.dut1 = self.android_devices[0]
    self.dut2 = self.android_devices[1]
    self.shell1 =
    self.shell2 =

def testSerialNotEqual(self):
    '''Checks that the serial numbers from the two devices are not equal.'''
    command = 'getprop | grep ro.serial'
    res1 = self.shell1.Execute(command)
    res2 = self.shell2.Execute(command)

    def getSerialFromShellOutput(output):
        '''Gets the serial from the getprop query.'''
        return output[const.STDOUT][0].strip().split(' ')[-1][1:-1]

    serial1 = getSerialFromShellOutput(res1)
    serial2 = getSerialFromShellOutput(res2)'Serial number of device 1 shell output: %s', serial1)'Serial number of device 2 shell output: %s', serial2)
    asserts.assertNotEqual(serial1, serial2, 'serials from two devices should not be the same')
    asserts.assertEqual(serial1, self.dut1.serial, 'serial got from device system property is different from allocated serial')
    asserts.assertEqual(serial2, self.dut2.serial, 'serial got from device system property is different from allocated serial')

self.android_devices is a list of AndroidDevice controllers. In the code above, we run the getprop command on each device through its shell to get the device serial and verify the values.
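The serial-extraction expression can be exercised standalone. Assuming a getprop output line of the form "[ro.serialno]: [ABC1234]" (the property name here is illustrative), the serial is the last space-separated token with the surrounding brackets stripped:

```python
def serial_from_getprop_line(line):
    """Extract the serial value from a getprop line like
    '[ro.serialno]: [ABC1234]' using the same slicing as the test above."""
    return line.strip().split(' ')[-1][1:-1]


print(serial_from_getprop_line('[ro.serialno]: [ABC1234]'))  # ABC1234
```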

VtsVndkAbi ensures that the ABI of VNDK libraries is consistent with that of the generic system image. ABI compatibility is one of the requirements for system-only OTA.

The ABI test compares VNDK, VNDK-SP, and extension libraries with pre-generated dump files that include symbols and vtables. When the test runs, it looks for the VNDK libraries in the device's partitions in the order odm, vendor, then system. If a library is not present on the device, it is considered unused and not installed, and the test skips the comparison. If a library is present, it is required to define all symbols and vtables that the corresponding dump file has, with the following exceptions:

  1. NO_TYPE, undefined (SHN_UNDEF), and weak symbols (STB_WEAK) are not included in the dump files. Weak symbols are normally generated from inline functions and template instantiations. Even though they are defined in the library, they are not always part of the library's public header.
  2. The symbols defined in three static libraries are skipped: libgcc.a, libatomic.a, and libcompiler_rt-extras.a. They are runtime libraries that contain symbols depending on CPU variants, e.g., __aeabi_idiv.
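The comparison rule can be sketched as follows. missing_symbols is a hypothetical helper, and the symbol names are made up; the real test also compares vtables, which this sketch omits:

```python
def missing_symbols(dump_symbols, lib_symbols, skipped=()):
    """Return symbols required by the golden dump that the installed library
    fails to define, ignoring the skipped categories described above."""
    return sorted(s for s in dump_symbols
                  if s not in lib_symbols and s not in skipped)


dump = {"android::Light::flash", "__aeabi_idiv"}
lib = {"android::Light::flash"}
print(missing_symbols(dump, lib, skipped={"__aeabi_idiv"}))  # []
```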

Generate ABI dump

  1. Select a generic lunch target and build the libraries.
$ . build/
$ lunch aosp_arm64_ab-userdebug
$ make vndk libgcc libatomic libcompiler_rt-extras -j30
  2. Compile vndk-vtable-dumper.
$ make vndk-vtable-dumper -j30
  3. Create a text file that consists of library names, one library per line. In this example, the file name is vndk_list.txt.
  4. Run to generate the dump files. In addition to the text file, the script accepts library names as arguments. {VNDK_VER} in a library name is replaced with "-" + PLATFORM_VNDK_VERSION, which is a make variable.
$ cd ${ANDROID_BUILD_TOP}/test/vts-testcase/vndk/golden
$ rm -rf ./P
$ ./ vndk_list.txt
$ ./ vndk-sp{VNDK_VER}/
  5. The script detects the target CPU architecture and searches ${ANDROID_PRODUCT_OUT}/system/lib[64] for the libraries. The dump files are output to PLATFORM_VNDK_VERSION/TARGET_ARCH/lib[64]/. For example, P/arm64/lib64/vndk-sp-P/libutils.so_symbol.dump
  6. Delete ${ANDROID_HOST_OUT}/vts/android-vts/testcases/vts/testcases/vndk/golden/P and build VTS. The new VTS package will contain the dump files.
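The {VNDK_VER} substitution described above can be sketched as a one-line helper; expand_vndk_ver is hypothetical, and PLATFORM_VNDK_VERSION is assumed to be "P" as in the example:

```python
def expand_vndk_ver(name, platform_vndk_version):
    """Replace the {VNDK_VER} placeholder with '-' + PLATFORM_VNDK_VERSION."""
    return name.replace("{VNDK_VER}", "-" + platform_vndk_version)


print(expand_vndk_ver("vndk-sp{VNDK_VER}/", "P"))
# vndk-sp-P/
```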

If you run into any problems with VTS, you can report an issue to the android-vts Google group. In order for the VTS team to address your issue effectively, please run the Python script (located under test/vts/script) and include the logged output in your report. The script prints important system platform information such as Python and Pip versions.

$ python test/vts/script/

Example output:

===== Current date and time =====
2017-05-23 10:37:35.117056
===== OS version =====
===== Python version =====
2.7.6 (default, Oct 26 2016, 20:30:19)
[GCC 4.8.4]
===== Pip version =====
pip 9.0.1 from /usr/local/lib/python2.7/dist-packages (python 2.7)
===== Virtualenv version =====
===== Target device info [] =====