Camera Image Test Suite (ITS) is a framework for running tests on images produced by an Android camera. The general goal of each test in ITS is to configure the camera in a specific manner, capture one or more shots, and examine the shots to see if they contain the expected image data. Many of the tests require the camera to be pointed at a specific target chart or be illuminated at a specific intensity.
ITS is located in the CTS Verifier test harness in cts/apps/CameraITS.
As a subset of CTS, devices must pass the ITS tests that correspond to the supported features the camera framework advertises for third-party apps.
Setup
To run ITS tests, the following must be set up:
- A device under test (DUT)
- A host machine (for example, a Linux desktop or laptop)
- A scene that the camera photographs
Device under test (DUT) setup
To set up a DUT, follow these steps:
- Connect the DUT to a host machine over USB.
- Grant permissions for the host to access the DUT over ADB.
- Install the CTS Verifier app (CtsVerifier.apk) onto the device. For more information, see Using CTS Verifier.
extract root/out/host/linux-x86/cts-verifier/android-cts-verifier.zip
cd android-cts-verifier
adb install -r -g CtsVerifier.apk
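To confirm that the host can access the DUT over ADB before continuing, you can list the attached devices:
adb devices
The DUT must appear in the output with the state device (not unauthorized).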
Host setup
ITS requires the host machine to be connected to the DUT through USB, be able to use ADB for device control and communication, and have the required software installed.
To set up your host machine, ensure the following software is installed.
Android SDK Platform Tools
The Android SDK Platform Tools must be installed and ADB must be in the executable path of the shell or terminal running on the host machine. For the publicly released version of the Android SDK Platform Tools, see SDK Platform Tools release notes.
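As an illustration, the following assumes the platform tools are unpacked to ~/Android/platform-tools (the location is an assumption, not a requirement); it adds them to the path and verifies that ADB resolves:
export PATH=$PATH:$HOME/Android/platform-tools
adb version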
Python
Python must be installed on the host machine. We recommend using a bundled Python distribution to ensure support for compatible versions. For details on which Python and package versions to install for a specific release, see the Camera ITS release notes for the corresponding release.
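To check which interpreter and version the shell resolves before running ITS, you can use, for example:
python3 --version
which python3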
Mobly
For Android 12 and higher, the Mobly test framework
must be installed. Mobly lets you set up a DUT and chart tablet in the
its_base_test
class. To install the Mobly test framework, run:
pip install mobly
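To verify that Mobly is installed into the same Python environment, you can query the package, for example:
pip show mobly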
Environment setup
To set up the test environment, run:
cd CameraITS
source build/envsetup.sh
This command checks the Python installation, sets up the PYTHONPATH
environment variable, and runs unit tests on the utils/*.py
modules. If no
errors are printed to the terminal, the environment is ready to run the ITS
tests.
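As an optional sanity check, you can confirm that the variable was set; the exact paths depend on where your CameraITS checkout lives:
echo $PYTHONPATH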
Scene setup
To set up the scenes, we recommend using the Camera ITS-in-a-box setup for ease in automation, reliability, and efficiency in testing. The ITS-in-a-box test rigs support all the lighting, centering, and chart changing requirements for ITS. Also, ITS-in-a-box is required for camera extensions testing.
For manual testing, ensure the following:
- The DUT is on a tripod.
- The DUT is pointed at the correct scene for each test. (The ITS test script provides prompts to change the scene setup before starting tests in a new scene.)
- The DUT is connected to the host machine over USB.
- The DUT doesn't move during the test run.
- The scene is illuminated with a steady, non-fluctuating light source. (Don't use a fluorescent light because this introduces flicker.)
The phone orientation must be set so that the camera takes images with no rotation. The easiest way to check this is with the face scenes in scene2. For most phones, this means landscape orientation, with the phone rotated counterclockwise for the rear camera and clockwise for the front camera.
Configuration files
Using the Mobly framework, you must create a config.yml
configuration file to
define the Mobly testbed. The following are examples for different use cases.
Tablet-based scenes config.yml file
The following is an example config.yml
file for tablet-based scenes. For
tablet-based testing, the keyword TABLET
must be in the testbed name. During
initialization, the Mobly test runner initializes the parameters in the file
and passes them to the individual tests.
TestBeds:
- Name: TEST_BED_TABLET_SCENES
# Test configuration for scenes[0:4, 6, _change]
Controllers:
AndroidDevice:
- serial: 8A9X0NS5Z
label: dut
- serial: 5B16001229
label: tablet
TestParams:
brightness: 192
chart_distance: 22.0
debug_mode: "False" # "True" or "False"; quotes needed
lighting_cntl: <controller-type> # "arduino" or "None"; quotes needed
lighting_ch: <controller-channel>
camera: 0
foldable_device: "False". # set "True" if testing foldable
scene: <scene-name> # if <scene-name> runs all scenes
To invoke the test bed, run tools/run_all_tests.py
. If there are no command
line values specifying cameras or scenes, the test is run with the config.yml
file values. If there are command line values for cameras or scenes, these
override the values in the TestParams
section of the config.yml
file.
For example:
python tools/run_all_tests.py
python tools/run_all_tests.py camera=1
python tools/run_all_tests.py scenes=2,1,0
python tools/run_all_tests.py camera=1 scenes=2,1,0
sensor_fusion scene config.yml file
The following is an example config.yml file for sensor_fusion tests.
For sensor_fusion
testing, the keyword SENSOR_FUSION
must be in the testbed
name. Android 13 and higher support only the Arduino
controller for sensor fusion because of preview and video stabilization testing.
Android 12 supports Arduino and Canakit controllers.
TestBeds:
- Name: TEST_BED_SENSOR_FUSION
# Test configuration for sensor_fusion/test_sensor_fusion.py
Controllers:
AndroidDevice:
- serial: 8A9X0NS5Z
label: dut
TestParams:
fps: 30
img_size: 640,480
test_length: 7
debug_mode: "False"
chart_distance: 25
rotator_cntl: arduino
rotator_ch: 1
camera: 0
To run sensor_fusion
tests with the
sensor fusion box, run:
python tools/run_all_tests.py scenes=sensor_fusion
python tools/run_all_tests.py scenes=sensor_fusion camera=0
Multiple testbeds config.yml file
The following is an example config.yml
file with multiple testbeds, a
tablet testbed and a sensor_fusion
testbed. The correct testbed is determined
by the scenes tested.
TestBeds:
- Name: TEST_BED_TABLET_SCENES
# Test configuration for scenes[0:4, 6, _change]
Controllers:
AndroidDevice:
- serial: 8A9X0NS5Z
label: dut
- serial: 5B16001229
label: tablet
TestParams:
brightness: 192
chart_distance: 22.0
debug_mode: "False"
chart_loc_arg: ""
camera: 0
      scene: <scene-name> # if <scene-name> is left as-is, all scenes are run
- Name: TEST_BED_SENSOR_FUSION
# Test configuration for sensor_fusion/test_sensor_fusion.py
Controllers:
AndroidDevice:
- serial: 8A9X0NS5Z
label: dut
TestParams:
fps: 30
img_size: 640,480
test_length: 7
debug_mode: "False"
chart_distance: 25
rotator_cntl: arduino # cntl can be arduino or canakit
rotator_ch: 1
camera: 0
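With this file in place, the scene selection on the command line determines which testbed is used; for example (scene numbers are illustrative):
python tools/run_all_tests.py scenes=0,1
python tools/run_all_tests.py scenes=sensor_fusion
The first command runs on the TEST_BED_TABLET_SCENES testbed and the second on the TEST_BED_SENSOR_FUSION testbed.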
Manual testing config.yml file
The following is an example config.yml
file for manual testing. From
Android 14, manual
testing is supported for all tests except for the
scene_extensions
tests. For manual testing, the keyword MANUAL
must be in the testbed name.
Also, the AndroidDevice
section can't include a serial or label section for
a tablet.
TestBeds:
- Name: TEST_BED_MANUAL
Controllers:
AndroidDevice:
- serial: 8A9X0NS5Z
label: dut
TestParams:
debug_mode: "False"
camera: 0
scene: 1
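With this file saved as config.yml in the CameraITS/ directory, a manual run is invoked the same way as a tablet-based run; for example:
python tools/run_all_tests.py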
Running ITS tests
This section describes how to run ITS tests.
Invoking tests
After the device, host machine (including environment), and physical scene are set up, run the ITS tests using the following process.
Open the CTS Verifier app. In the tests menu, select Camera ITS Test.
From the host machine, run the ITS tests from the CameraITS/ directory. For example, for a device with front and rear cameras, run the following command:
python tools/run_all_tests.py
The script iterates through cameras and test scenes based on the config.yml file. For debugging setups, we recommend running one of the scene2 scenes with a single test for the fastest turnaround.
For manual testing, before starting to run the set of ITS tests on each scene, the script takes a picture of the current scene, saves it as a JPEG, prints the path to the JPEG to the console, and asks the user to confirm whether the image is okay. This capture-and-confirm flow loops until the user confirms that the image is okay. The following are the messages in this flow:
Preparing to run ITS on camera 0
Start running ITS on camera: 0
Press Enter after placing camera 0 to frame the test scene:
scene1_1
The scene setup should be:
A grey card covering at least the middle 30% of the scene
Running vendor 3A on device
Capture an image to check the test scene
Capturing 1 frame with 1 format [yuv]
Please check scene setup in /tmp/tmpwBOA7g/0/scene1_1.jpg
Is the image okay for ITS scene1_1? (Y/N)
Each run of the script prints a log showing PASS, FAIL, FAIL*, or SKIP for each ITS test. FAIL* indicates that the test failed, but because the test isn't yet mandated, it's reported as a PASS to CTS Verifier. SKIP indicates that the test passed because the device didn't advertise the underlying capability being tested. For example, if a device doesn't advertise through the camera interfaces that it supports DNG, tests related to DNG file capture are skipped and counted as a PASS.
To acknowledge that the tests have met the test requirements, tap the green check mark button. The Camera ITS Test entry in the CTS Verifier tests menu then turns green, signifying that the phone has passed Camera ITS.
Parallel DUT testing
Devices running Android 14 or higher support parallel DUT testing. This lets you test DUTs in parallel with multiple rigs to speed up overall testing. For example, parallel testing lets you test camera 0 in one rig and camera 1 in another rig at the same time. All testing for parallel testing sessions is aggregated on the CTS Verifier session on the reference DUT. You must run parallel testing with Arduino lighting control, as manual lighting control isn't supported with parallel testing. Make sure that a different channel on the same Arduino controller controls the lighting for each rig.
The following is a sample config.yml
file that defines three testbeds to run
in parallel.
TestBeds:
- Name: TEST_BED_TABLET_SCENES_INDEX_0
Controllers:
AndroidDevice:
- serial: <device-id-0>
label: dut
- serial: <tablet-id-0>
label: tablet
TestParams:
brightness: 192
chart_distance: 22.0
debug_mode: "False"
lighting_cntl: "arduino"
lighting_ch: <controller-channel-0>
camera: 0
      scene: <scene-name> # if <scene-name> is left as-is, all scenes are run
foldable_device: "False"
- Name: TEST_BED_TABLET_SCENES_INDEX_1
Controllers:
AndroidDevice:
- serial: <device-id-1>
label: dut
- serial: <tablet-id-1>
label: tablet
TestParams:
brightness: 192
chart_distance: 22.0
debug_mode: "False"
lighting_cntl: "arduino"
lighting_ch: <controller-channel-1>
camera: 1
      scene: <scene-name> # if <scene-name> is left as-is, all scenes are run
foldable_device: "False"
# TEST_BED_SENSOR_FUSION represents testbed index 2
# Parallel sensor_fusion is currently unsupported due to Arduino requirements
- Name: TEST_BED_SENSOR_FUSION
# Test configuration for sensor_fusion
Controllers:
AndroidDevice:
- serial: <device-id>
label: dut
TestParams:
fps: 30
img_size: 640,480
test_length: 7
debug_mode: "False"
chart_distance: 25
rotator_cntl: "arduino"
rotator_ch: <controller-channel-2>
camera: <camera-id>
foldable_device: "False"
tablet_device: "False"
lighting_cntl: "None"
lighting_ch: <controller-channel>
scene: "sensor_fusion"
To run the testbeds in parallel, use the following command:
for i in 0 1 2; do python3 tools/run_all_tests.py testbed_index=$i num_testbeds=3 & done; wait
DNG noise model
Devices that advertise the ability to capture RAW or DNG must provide a noise model in the capture result metadata of each raw shot. This noise model must be embedded into the camera HAL for each camera (for example, front and back cameras) on the device that claims support.
Noise model implementation
To implement a noise model, follow these steps to generate a noise model and embed the model into the camera HAL.
- To generate a noise model for each camera, run the dng_noise_model.py script in the tools directory. This outputs a C code snippet. For more information on how to set up the camera and capture environment, see the DngNoiseModel.pdf document in the tools directory. (An example invocation is sketched after these steps.)
- To implement the noise model for the device, cut and paste the C code snippet into the camera HAL.
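As a rough sketch of the generation step (the exact command-line options vary by release and are described in DngNoiseModel.pdf), the script is invoked from the tools directory with the DUT connected:
cd CameraITS/tools
python dng_noise_model.py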
Noise model validation
The automated ITS test tests/scene1_1/test_dng_noise_model.py validates the noise model by verifying that the noise values for the shot exposure and gain provided in the camera data are correct.
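To exercise this check on its own, you can limit a run to the scene that contains this test; for example, assuming your release accepts the abbreviated scene name 1_1 on the command line:
python tools/run_all_tests.py scenes=1_1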