Thoughts on testing the LCFG server

As part of the LCFG server refactoring project we want to be able to test any changes made to the LCFG server code to ensure that bugs are not introduced. The initial aim is that the output will be functionally identical: we are cleaning up the code, not altering the expected behaviour.

The general idea is that given the same input and configuration two versions of the server should generate the same XML output for each source profile.

We will need a script that collects together all the input (schemas, headers, package lists and source profiles) and configuration files along with the generated output.
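
As an illustration, the collection step might look something like the sketch below; the directory locations are purely hypothetical placeholders, not the real LCFG server paths, and would need to be filled in.

    # Minimal sketch of a snapshot step; all paths are hypothetical
    # placeholders for the real LCFG server locations.
    import shutil
    from pathlib import Path

    def snapshot(sources, dest):
        """Copy each named input/output directory into one snapshot tree."""
        dest = Path(dest)
        dest.mkdir(parents=True, exist_ok=True)
        for name, src in sources.items():
            shutil.copytree(src, dest / name)

    snapshot({
        'schemas':  '/path/to/schemas',
        'headers':  '/path/to/headers',
        'packages': '/path/to/packagelists',
        'sources':  '/path/to/sourceprofiles',
        'config':   '/path/to/config',
        'output':   '/path/to/generated-xml',
    }, '/tmp/lcfg-snapshot')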

Comparing several thousand profiles could take quite a while, so to do this efficiently we plan to take an md5sum of each XML profile and compare those. Only if the checksums differ will the profiles be compared in more detail. This does rely on us not altering the ordering of the generated XML (where it has no functional effect) or the whitespace around the tags, so in the first stage of the changes it is probably a good idea to maintain the ordering of elements. One option would be to run the XML profiles through a tool such as xmltidy to get them into a consistent layout, but that is quite an expensive operation.
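
A rough sketch of the quick checksum pass is below; it assumes the two output directories hold profiles with matching file names, which is an assumption about the layout.

    import hashlib
    from pathlib import Path

    def md5sum(path):
        return hashlib.md5(Path(path).read_bytes()).hexdigest()

    def changed_profiles(dir_a, dir_b):
        """Return profiles whose checksums differ and so need a
        more detailed comparison."""
        differing = []
        for xml_a in sorted(Path(dir_a).glob('*.xml')):
            xml_b = Path(dir_b) / xml_a.name
            if not xml_b.exists() or md5sum(xml_a) != md5sum(xml_b):
                differing.append(xml_a.name)
        return differing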

To make the XML profiles comparable a few nodes need to be modified to remove the variable data: published_by, published_at and server_version.
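
Something along these lines could blank out the variable nodes before checksumming; searching by tag name is an assumption here, since the exact position of these nodes in the profile tree is not spelled out.

    import xml.etree.ElementTree as ET

    VARIABLE_TAGS = ('published_by', 'published_at', 'server_version')

    def normalise(xml_in, xml_out):
        """Blank the variable nodes so two profiles can be compared."""
        tree = ET.parse(xml_in)
        for tag in VARIABLE_TAGS:
            for node in tree.getroot().iter(tag):
                node.text = ''    # keep the element, drop the value
        tree.write(xml_out)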

A comparison of XML profiles needs to compare the list of components, the list of top-level resources for each component, the sub-trees below each resource node and the list of packages.

The list of components, the list of top-level resources and the list of packages are sorted by the current server before being written out. Ordering for these does not actually matter, so we might want the option to sort each list before comparison. The sub-trees below each resource node must be compared directly; the structure should remain the same.
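
A sketch of that comparison might look like the following; it is generic ElementTree-style code rather than the real LCFG profile schema, so how the components, resources and packages are located within the profile is assumed.

    def same_list(items_a, items_b, sort=False):
        """Compare two lists of names. Sorting is optional, for the
        lists whose ordering does not matter (components, top-level
        resources, packages)."""
        if sort:
            items_a, items_b = sorted(items_a), sorted(items_b)
        return list(items_a) == list(items_b)

    def same_subtree(node_a, node_b):
        """The sub-tree below a resource node must match exactly,
        including the order of child elements."""
        if node_a.tag != node_b.tag:
            return False
        if (node_a.text or '').strip() != (node_b.text or '').strip():
            return False
        kids_a, kids_b = list(node_a), list(node_b)
        if len(kids_a) != len(kids_b):
            return False
        return all(same_subtree(a, b) for a, b in zip(kids_a, kids_b))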

We will ignore derivation information for now as we know it will differ, but we might put together something later to ensure that it is still being generated correctly.

The plan would be to have a script which takes two directories of output and looks for differences.

The comparison must include checking that directories exist for all expected domains, that all expected profiles exist, and that no extra domains or XML profiles have appeared.
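
A sketch of that check, assuming the layout is a directory per domain containing the XML profiles:

    from pathlib import Path

    def compare_layout(dir_a, dir_b):
        """Return the domains/profiles present on one side only."""
        def listing(top):
            top = Path(top)
            return {p.relative_to(top) for p in top.rglob('*')
                    if p.is_dir() or p.suffix == '.xml'}
        only_in_a = listing(dir_a) - listing(dir_b)
        only_in_b = listing(dir_b) - listing(dir_a)
        return only_in_a, only_in_b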

As a bonus it would be nice to be able to point the script at two directories on the web and get it to pull them down for comparison, but the basic script MUST be able to work without network connectivity.
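
The remote mode could be bolted on as an optional fetch step along these lines; how the list of profile names is discovered on a remote server is left open (it is simply passed in here), and the local-directory route remains the default so nothing requires the network.

    import tempfile
    import urllib.request
    from pathlib import Path

    def fetch(base_url, profile_names):
        """Pull the named XML profiles into a temporary directory."""
        dest = Path(tempfile.mkdtemp(prefix='lcfg-compare-'))
        for name in profile_names:
            with urllib.request.urlopen(f'{base_url}/{name}') as resp:
                (dest / name).write_bytes(resp.read())
        return dest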

-- Main.squinney - 2009-07-02
