You can run the benchmark with custom settings via Android Debug Bridge (ADB). 

Edit the config.json file

The app ZIP package includes a predefined configuration file, config.json, in the automation directory, shown below. Edit the settings in this file to define your custom benchmark run.

{
  "custom":true,
  "numOfThreads":4,
  "rounds":20,
  "stressMode":false,
  "useF16Precision":true,
  "testNames":[
    "MOBILENET_V3_INT8",
    "MOBILENET_V3_FLOAT32",
    "SSD_MOBILENET_V3_INT8",
    "SSD_MOBILENET_V3_FLOAT32",
    "DEEPLAB_MOBILENET_V2_INT8",
    "DEEPLAB_MOBILENET_V2_FLOAT32",
    "INCEPTION_V4_INT8",
    "INCEPTION_V4_FLOAT32",
    "CUSTOM_CNN_INT8",
    "CUSTOM_CNN_FLOAT32"
  ]
}
  • Set custom to true to tell the app that the benchmark should use custom settings.
  • numOfThreads sets the number of threads used to run the benchmark. The optimal value is 4.
  • rounds sets the number of inference rounds for each test. The default value is 20.
  • When stressMode is true, the benchmark will run for 10,000 rounds using NNAPI only.
  • When useF16Precision is true, the benchmark relaxes the precision from F32 to F16.
  • Add or remove models in testNames to choose which tests to include in the benchmark run and the order in which they run (see the example configuration below).
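
For example, to run a shorter custom benchmark that covers only the MobileNet V3 tests without F16 precision relaxation, the edited file could look like the following. The field names and test names are taken from the default file above; the values shown are just one possible combination.

{
  "custom":true,
  "numOfThreads":4,
  "rounds":20,
  "stressMode":false,
  "useF16Precision":false,
  "testNames":[
    "MOBILENET_V3_INT8",
    "MOBILENET_V3_FLOAT32"
  ]
}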

Push the config.json file to the app

Run the following command to push the config.json file to the config folder in the app's directory.

$ adb push config.json /storage/emulated/0/Android/data/com.ul.benchmarks.procyonai/files/config
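
You can optionally confirm that the file is in place before starting the run. The command below is a standard directory listing, not part of the app's interface, and it assumes the app has been launched at least once so that its files directory exists; on some Android versions, shell access to an app's Android/data directory may be restricted.

$ adb shell ls /storage/emulated/0/Android/data/com.ul.benchmarks.procyonai/files/config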

Start a custom benchmark run via ADB

After the configuration file has been pushed to the app, run the following command to start the custom benchmark run:

$ adb shell am start -a android.intent.action.MAIN -n com.ul.benchmarks.procyonai/.MainActivity --ez "runTest" true --ez "custom" true
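
The am start command returns immediately; it does not wait for the benchmark to finish. One simple way to check progress from the host, which is not a documented feature of the app, is to list the results directory described below from time to time; a new ZIP file there indicates that the run has completed.

$ adb shell ls /sdcard/Android/data/com.ul.benchmarks.procyonai/files/results/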

Result files

When the benchmark is complete, the result is saved as a ZIP file at the following path on the device.

/sdcard/Android/data/com.ul.benchmarks.procyonai/files/results/
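
To copy the results to your development machine, you can pull the whole directory with adb. The destination path used here is only an example.

$ adb pull /sdcard/Android/data/com.ul.benchmarks.procyonai/files/results/ ./procyon-results/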

The result ZIP contains the following files:

  • config.json contains the settings that were used by the benchmark.
  • device_score.json contains all the calculated benchmark scores and a timestamp for the result.
  • monitoring_data.csv contains the hardware monitoring data collected during the benchmark run.
  • results.csv contains the inference time, initialization time, and model quality score metrics from each test.
  • systeminfo.json contains the device identification data.
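
The ZIP can be inspected with standard tools on the host. The file name below is a placeholder for the actual name of the result file produced on your device, and the commands assume the files listed above sit at the top level of the archive.

$ unzip -o ./procyon-results/<result>.zip -d ./procyon-results/extracted
$ cat ./procyon-results/extracted/device_score.json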