Error in record #1218

Open · vickariofillis opened this issue Jul 10, 2023 · 7 comments
@vickariofillis

I tried using the record functionality, but I am getting the following error. I get the same error regardless of the benchmark.

INFO     Setting up target
INFO     Deploying gfxbench
.
.
.
INFO     Press Enter when you have finished recording TEARDOWN...

INFO     Pulling 'lahaina.teardown.revent' from device
INFO     Tearing down gfxbench
ERROR      File "/usr/local/lib/python3.10/dist-packages/wlauto-3.4.0.dev1+bf72a576-py3.10.egg/wa/framework/entrypoint.py", line 149, in main
ERROR        sys.exit(command.execute(config, args))
ERROR      File "/usr/local/lib/python3.10/dist-packages/wlauto-3.4.0.dev1+bf72a576-py3.10.egg/wa/commands/revent.py", line 129, in execute
ERROR        self.workload_record(args)
ERROR      File "/usr/local/lib/python3.10/dist-packages/wlauto-3.4.0.dev1+bf72a576-py3.10.egg/wa/commands/revent.py", line 219, in workload_record
ERROR        workload.teardown(context)
ERROR      File "/usr/local/lib/python3.10/dist-packages/wlauto-3.4.0.dev1+bf72a576-py3.10.egg/wa/framework/workload.py", line 364, in teardown
ERROR        self.gui.teardown()
ERROR      File "/usr/local/lib/python3.10/dist-packages/wlauto-3.4.0.dev1+bf72a576-py3.10.egg/wa/framework/workload.py", line 568, in teardown
ERROR        raise RuntimeError('Commands have not been initialized')
ERROR    
ERROR    RuntimeError(Commands have not been initialized)

This happens after I press Enter to finish recording the teardown stage. I am using the following command to run it.

wa record -a -w gfxbench

I am using the latest version.

wlauto-3.4.0.dev1+bf72a576-py3.10.egg

@marcbonnici (Contributor)

Hi, the revent recording mechanism was designed for game workloads (e.g. templerun, angrybirds), since it is not possible to automate them with UIAutomator.
The benchmark workloads, e.g. gfxbench, are implemented with UIAutomator because it allows for more reliable and repeatable runs.

Are you able to use the existing configuration of the workloads to execute the required tests, rather than relying on revent?
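For reference, running the packaged workload through its normal UIAutomator automation needs only a minimal agenda. A sketch (the file name and layout here are illustrative; device configuration is omitted):

# agenda.yaml -- run the packaged gfxbench workload end to end
workloads:
  - gfxbench

which you would then execute with:

wa run agenda.yaml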

@vickariofillis (Author)

I was looking into having more flexibility in terms of which sub-benchmarks are executed. For example, Geekbench supports only the CPU option, and GFXBench runs only a subset of all sub-benchmarks by default. Implementing selection of Compute as well was not as straightforward as I expected*, so I was trying to find a solution via the record-replay route.

  • I get errors while using UIAutomatorViewer, so I have no way of discovering the names and types of the applications' various UI elements.

Any suggestions for broadening the options for selecting sub-benchmarks? Is there a directory/website with other people's implementations (e.g., Geekbench with Compute)?

@marcbonnici (Contributor)

Hmm, I see. The main problem with using revent for this is that you will have no way to extract the scores from the workload results: revent can only play back inputs, so it cannot react to screen elements, wait for tests to complete, detect results, etc.

What errors do you face when trying to use UIAutomator? Are you having problems with just the benchmark, or with obtaining any captures from your device?

@vickariofillis (Author)

I don't care about recording the benchmark-specific scores, so that's not an issue. I'm interested only in running the benchmarks.

Regarding UIAutomatorViewer: this is the error I get while trying to capture a device screenshot (UI XML snapshot).

Unexpected error while obtaining UI hierarchy
java.lang.reflect.InvocationTargetException

Are you having problems with just the benchmark, or with obtaining any captures from your device?

What exactly do you mean by that? UIAutomator scripts run fine.

In general, I'm interested in using a profiler tool (Snapdragon Profiler or ARM Streamline) to collect device data while automating the running of various benchmarks (and sub-benchmarks).

@scojac01 (Contributor)

You can change which GFXBench subtests are run here: https://github.com/ARM-software/workload-automation/blob/master/wa/workloads/gfxbench/__init__.py#L41
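For example, the selection can be overridden from an agenda along these lines (a sketch only: the parameter name and the test identifiers are assumptions, so take the real ones from the linked __init__.py for your WA version):

workloads:
  - name: gfxbench
    params:
      tests:              # assumed parameter name -- check the linked file
        - Car Chase       # illustrative test identifiers
        - Manhattan 3.1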

I found that UiAutomatorViewer stopped working when my Android tools were out of date relative to the device. Have you tried updating your tools?

@vickariofillis (Author)

You can change which GFXBench subtests are run here: https://github.com/ARM-software/workload-automation/blob/master/wa/workloads/gfxbench/__init__.py#L41

Thanks for the heads up.

I found that UiAutomatorViewer stopped working when my Android tools were out of date relative to the device. Have you tried updating your tools?

Yeah. Unfortunately, I still get an error.

@marcbonnici (Contributor)

Yeah. Unfortunately, I still get an error.

I would be tempted to suggest doing a clean install of the latest available Android tools and SDK, just to check that an older version is not somehow being picked up and still causing the error.

Or, worst case, you could always take a UI dump on the device manually and open the resulting files in UiAutomatorViewer (screencap -p > /sdcard/ui_dump.png && uiautomator dump).
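Driven from the host over adb, that would look something like the following (a sketch; the file names are arbitrary, and uiautomator dump writes to /sdcard/window_dump.xml if no path is given):

adb shell uiautomator dump /sdcard/ui_dump.xml
adb shell screencap -p /sdcard/ui_dump.png
adb pull /sdcard/ui_dump.xml .
adb pull /sdcard/ui_dump.png .

The pulled XML and screenshot can then be opened together in UiAutomatorViewer via its open-snapshot option.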

I don't care about recording the benchmark-specific scores so that's not an issue. I'm interested in only running the benchmarks.

I would suggest trying to get the UIAutomator approach working first. However, as a workaround, one option would be to create a new revent version of the workload, which would allow the use of the record command. You would still encounter the limitations of revent, i.e. only being able to measure for a fixed amount of time rather than detecting when a test has completed, since the timings of the benchmark / revent playback may not be consistent across runs.

To create a new revent workload, we have the following command to help with this [1]:

wa create workload -k apkrevent gfxbench_revent 

This will generate a workload file for you, which can then be modified to add the appropriate package name (which can be taken from the existing workload) and parameters (if required).
This will allow WA to auto-launch the application and let you use the record command as initially attempted.

[1] https://workload-automation.readthedocs.io/en/latest/developer_information.html#adding-a-reventapk-workload
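For reference, the generated file should end up with roughly this shape (a sketch based on the ApkReventWorkload pattern in [1]; the package name below is a placeholder -- copy the real value from the existing gfxbench workload):

from wa import ApkReventWorkload

class GfxbenchRevent(ApkReventWorkload):

    # Name used to refer to the workload in agendas and on the command line.
    name = 'gfxbench_revent'

    # Placeholder package name -- take the actual value from
    # wa/workloads/gfxbench/__init__.py.
    package_names = ['net.kishonti.gfxbench.gl.v50000.corporate']

Once the file is picked up from your plugin directory (by default ~/.workload_automation/plugins), the recording can be attempted as before with wa record -a -w gfxbench_revent.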
