You can see all of the submissions in the linked repository. They are all JSON files, so they should be simple to process in any language. There is no schema yet (now that I've realized this I will probably write one) but the format is rather simple to follow and understand.
The only non-obvious part is the treatment of attachments. JSON is defined as all-Unicode, so to preserve any binary blob we may need to carry (a screenshot, some random binary, or mostly-text-with-GARBAGE-in-the-middle) we simply base64-encode it.
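To illustrate, here is a minimal sketch of that round trip. The blob and the key names are hand-written stand-ins, not taken from a real submission file:

```python
import base64
import json

# Hypothetical attachment: mostly text with raw binary bytes in the middle,
# which plain JSON strings cannot carry directly.
blob = b"dmesg output\x00\xff with binary garbage"

# Base64-encode so the bytes survive inside an all-Unicode JSON document.
encoded = base64.b64encode(blob).decode("ascii")
document = json.dumps({"attachments": {"example.log": encoded}})

# A consumer reverses the process to recover the original bytes.
recovered = base64.b64decode(json.loads(document)["attachments"]["example.log"])
assert recovered == blob
```

The cost is roughly a 33% size increase for each attachment, but in exchange the JSON file stays valid Unicode end to end.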
Each JSON file (example) has a few top-level elements: results, resources and attachments. Plainbox, which generates the file, can do a bit more, but that's all we need in Lantern. Looking at the result map we can see that the keys are test identifiers and the values are little structures with a few bits of data. The most interesting part there is the outcome, which encodes whether the test passed, failed or was skipped.
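A quick sketch of what reading the result map looks like. The submission is a hand-written stand-in here; in practice you would load one of the JSON files from the repository:

```python
import json

# Minimal stand-in for a real submission file; the result_map key and the
# outcome field follow the format described in the text.
submission = json.loads("""
{
  "result_map": {
    "2015.pl.zygoon.lantern::test/intel_backlight/software-control": {
      "outcome": "pass"
    }
  }
}
""")

# Keys are test identifiers, values hold the outcome among other bits.
for test_id, result in submission["result_map"].items():
    print(test_id, "->", result["outcome"])
```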
There are two kinds of tests that are interesting in the v2 Lantern submissions. The first looks like 2015.pl.zygoon.lantern::test/intel_backlight/software-control; note that the intel_backlight part is variable and can be acpi_video0 or nv_backlight or anything else really. This test checks if software control is working through that device. The second test, which is only executed if software control works, is 2015.pl.zygoon.lantern::test/intel_backlight/brightness-0-is-visible. This test checks if setting brightness to zero actually turns the panel off.
Now I wrote this second batch of Lantern tests to check a theory:
- a firmware-based brightness control device, one where /sys/class/backlight/*/type is equal to firmware, keeps the panel dim but lit when brightness zero is requested
- a raw driver will happily turn the panel backlight off

We now have the first essential piece of the puzzle: we know if the panel was turned off or not. The only missing bit is to know what kind of backlight control device we had: raw, firmware or platform. This data is saved in two places. The most natural way to access it is to look at a resource. Resources are a Plainbox concept that allows tests to generate structured data and keep that data inside the testing session. Plainbox uses this to probe the system and later determine which tests to run. In Lantern we use this in a few places (1, 2). Since we also store this data, and it is structured and easy to access, we can simply look at it there. The interesting job identifier is 2015.pl.zygoon.lantern::probe/backlight_device. It can be found in the resource_map element and is always an array. Each element is an object with fields defined by the resource job itself. Here it has the sysfs_type field, which is exactly what we wanted to know.
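In code, reading that resource looks roughly like this. The submission below is a hand-written stand-in with invented device entries; the job identifier and the name/sysfs_type fields come straight from the description above:

```python
import json

# Hand-written stand-in for a submission; in reality this would be loaded
# from one of the JSON files in the repository.
submission = json.loads("""
{
  "resource_map": {
    "2015.pl.zygoon.lantern::probe/backlight_device": [
      {"name": "intel_backlight", "sysfs_type": "raw"},
      {"name": "acpi_video0", "sysfs_type": "firmware"}
    ]
  }
}
""")

# The resource is always an array of objects, one per backlight device.
devices = submission["resource_map"]["2015.pl.zygoon.lantern::probe/backlight_device"]
for device in devices:
    print(device["name"], "->", device["sysfs_type"])
```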
So how do you analyze a v2 submission? Simple:
1. Load each JSON file.
2. For each backlight device listed in resource_map["2015.pl.zygoon.lantern::probe/backlight_device"]:
3. Memorize its ["sysfs_type"] and ["name"].
4. If sysfs_type is equal to "firmware", look at result_map["2015.pl.zygoon.lantern::test/" + name + "/brightness-0-is-visible"]["outcome"] to see if it is "pass".
5. If sysfs_type is equal to "raw", look at result_map["2015.pl.zygoon.lantern::test/" + name + "/brightness-0-is-visible"]["outcome"] to see if it is "fail".

Each device that matches point 4 or 5 confirms our theory.
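The steps above can be sketched as a short Python function. The file-loading step is left out and a hand-written submission is inlined instead, so the logic is easy to follow; everything else (job identifiers, field names, pass/fail expectations) follows the recipe:

```python
import json

def check_submission(submission):
    """Yield (device_name, sysfs_type, confirms_theory) for each device."""
    devices = submission.get("resource_map", {}).get(
        "2015.pl.zygoon.lantern::probe/backlight_device", [])
    for device in devices:
        name = device["name"]
        sysfs_type = device["sysfs_type"]
        test_id = "2015.pl.zygoon.lantern::test/%s/brightness-0-is-visible" % name
        result = submission.get("result_map", {}).get(test_id)
        if result is None:
            continue  # the test never ran (e.g. software control failed)
        outcome = result["outcome"]
        # Theory: firmware devices keep the panel visible (pass),
        # raw devices turn it off entirely (fail).
        if sysfs_type == "firmware":
            yield name, sysfs_type, outcome == "pass"
        elif sysfs_type == "raw":
            yield name, sysfs_type, outcome == "fail"

# Hand-written example submission standing in for a loaded JSON file.
submission = json.loads("""
{
  "resource_map": {
    "2015.pl.zygoon.lantern::probe/backlight_device": [
      {"name": "intel_backlight", "sysfs_type": "raw"}
    ]
  },
  "result_map": {
    "2015.pl.zygoon.lantern::test/intel_backlight/brightness-0-is-visible": {
      "outcome": "fail"
    }
  }
}
""")

results = list(check_submission(submission))
print(results)  # → [('intel_backlight', 'raw', True)]
```

A real script would wrap this in a loop over the JSON files in the repository and tally how many devices confirm or contradict the theory.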
I preferred to write this description rather than the actual script to familiarize everyone with the Plainbox result format. It's a simple and intuitive format for storing all kinds of test results. If you want to contribute and write this analysis script, just fork the code on GitHub and start hacking. It should be simple, and I will gladly mentor anyone who wants to start coding or contribute something simple to the project.