Computer software does sometimes require some engineering discipline to keep it running smoothly. Software sometimes needs to be spoon-fed information, and sometimes that information comes in the form of documents similar to those read by humans (invoices can be sent from computer system to computer system, perhaps over the Internet, to simplify and speed up business transactions). Sometimes humans need to use software to write documents and software to read those same documents. The software used to write a document might be different from the software used to read it (Word at one end and OpenOffice at the other, say). On the Internet, web browsers need to read websites written and served up with various kinds of software packages from various software producers. All this makes it important at times to apply some engineering practices to ensure things work well. One such practice is the process from specification to conformance test or interoperability test. The specification for, say, a document might require that software reading the document handle it in this way or that. It might also say how the document is to be written, perhaps using one of the many markup languages based on the W3C standards body's eXtensible Markup Language (XML). One software engineering practice used over and over for decades of computer system history is the production of many test assertions corresponding to statements in a specification. These are atomic restatements which get numbered or indexed in some way so that a test based on the spec can be tied to an individual statement in the spec. The test assertions are a bit like a special engineering index for the spec, there to help with testing.
Now a test assertion exists so that testers know there is something particular which they ideally ought to make a special point of testing. They can refer to the test assertion in their test so that a failure of a component being tested can be tied, in a test report, to exactly the relevant item in the spec. Not complicated really. The individual tests are often called 'test cases'. Large, complicated systems can involve many thousands of test assertions (for large, complicated specifications, of course) and similarly large numbers of test cases based on those assertions. The test assertions could fill a fair bit of a database or a pretty big file of data, depending on how they are stored. Then you have to be able to cater for various versions of the specs and various versions of the software or documents being tested. Simplicity might help make it all manageable and help real people keep track of it all.
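To sketch that indexing idea (every element and attribute name here is made up purely for illustration, not taken from any particular standard), a test assertion and a test case referring back to it might look something like this in XML:

    <testAssertion id="TA-07">
      <source doc="Widget Invoice Spec" version="1.2" section="4.3.1"/>
      <statement>An invoice MUST contain exactly one due date.</statement>
    </testAssertion>

    <testCase id="TC-0042" assertionRef="TA-07">
      <description>Submit an invoice with two due dates and check the reader rejects it.</description>
    </testCase>

The assertionRef is the whole point: when TC-0042 fails, the report can point straight back at TA-07 and, through it, at the exact statement in the spec.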
Now I have an interest at the moment in a particular technology and I'd like to explore an idea: for XML document testing, and maybe any kind of testing where the tests are documented first as XML documents, you can either write the tests using this technology or use it to turn test results into test reports when the tests themselves are run some other way. The technology in question is W3C XML Schema version 1.1. It allows an assertion, a bit like a test assertion, to be written in a special expression syntax (XPath) and inserted into the middle of the definition of a particular part of an XML document. Test assertions targeting XML documents do not have to be expressed so that they can be executed as tests applied to those documents. There are some well known examples where they are written this way (some WS-I web services specs have such test assertions), but really those test assertions are then doubling up as test cases. I reckon the assertion feature in XML Schema 1.1 might be used for such test-assertion-like test cases, but how? If you write a set of descriptions of tests using XML markup of some kind and define that markup using XML Schema 1.1, I reckon you could match every atomic statement of a requirement (or the like) in a spec to a test assertion for that requirement, put an element in the markup to report on the testing of that test assertion, and define that element in the XML Schema. If the XML Schema is written using version 1.1 then you can put an assertion into the definition of the element which, when executed as an expression (with a Boolean result) against the element in the test report, produces a yes/no answer (a true/false answer, it being a 'Boolean', 'predicate' expression).
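To make that concrete, here is a minimal sketch of what I mean, as a fragment of an XML Schema 1.1 schema (the usual xs prefix bound to the XML Schema namespace; the names testResult, observed, maximumAllowed and assertionRef are all made up). The xs:assert element is the version 1.1 feature; its test attribute holds the XPath expression which gives the true/false answer when a test report element is validated:

    <xs:element name="testResult">
      <xs:complexType>
        <xs:sequence>
          <xs:element name="observed" type="xs:integer"/>
          <xs:element name="maximumAllowed" type="xs:integer"/>
        </xs:sequence>
        <xs:attribute name="assertionRef" type="xs:string"/>
        <!-- the Boolean 'predicate' expression: true means this result conforms -->
        <xs:assert test="observed le maximumAllowed"/>
      </xs:complexType>
    </xs:element>

Validate the test report with an XML Schema 1.1 processor (Apache Xerces and Saxon-EE both have 1.1 support, as far as I know) and out comes the yes/no layer.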
That might be one way to do it, resulting in a layer of yes/no answers which can be overlaid on the report to show whether test results conform to the test assertions or not. Another way works if the target is itself an XML document, but here it gets more challenging. The assertions are run against the XML target, but how do they get stored in the XML Schema, I wonder? The test assertions might themselves be documented using markup such as the OASIS Test Assertion Guidelines Technical Committee's (in progress) Test Assertion Markup Language (which I helped write up). Then we are left with the test cases, and I wonder whether they can somehow be put into the form of a W3C XML Schema 1.1 schema. They could be put into the form of a Test Assertion Markup Language XPath profile document and executed that way against the target XML using a tool like the Google Code project's Tamelizer (by Fujitsu America, based on previous work with WS-I), but that doesn't use XML Schema 1.1, it uses XSLT 2.0, which is cool. (Schematron is a similar alternative which also uses XSLT.) Perhaps the test cases could be converted (XPath being common to XML Schema 1.1 and the above alternatives) into some kind of schema, but it isn't obvious how to do so. I think I'd need a report-like XML structure with one element for each test case and, for each element (or element's 'type', an XML Schema thing), a schema definition where I can insert the XPath assertion expression. That doesn't work. I don't want to run the schema against the XML structure, I want to run the assertions in it against the target XML. No good. The target XML might have its own schema. What I have to do is create a schema for the target and put my assertions there. But that constrains the way I define my schema, perhaps not the way I want to do it for the type of XML document I am testing. Still, it is an option. Another is to write tests and report on them in a test report document and define the test report document's XML using the schema where I put the assertions. That means two steps and is the same as the first option I looked at above, which need not be limited to XML targets (or even software targets; similarly with Tamelizer).
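For comparison, the Schematron flavour of running assertions directly against the target XML looks something like this (again with made-up target element names and a made-up test assertion identifier; the namespace is the standard ISO Schematron one). The rule's context picks out the part of the target document a test assertion is about, and the assert carries the XPath test, with the message tying a failure back to the test assertion:

    <sch:schema xmlns:sch="http://purl.oclc.org/dsdl/schematron">
      <sch:pattern>
        <!-- TA-12 (made up): every order line must give a positive quantity -->
        <sch:rule context="order/line">
          <sch:assert test="number(@quantity) > 0">TA-12 failed: line quantity must be positive</sch:assert>
        </sch:rule>
      </sch:pattern>
    </sch:schema>

Tamelizer does much the same sort of thing in spirit, executing XPath test assertions with XSLT 2.0 as mentioned above, just with its own markup for the test assertions.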
Right, so I might decide I can indeed use XML Schema 1.1 to define the target and put my assertions into the individual elements' definitions. I'd probably want a 'global' element definition for each element which always has the same test assertion(s) applying to it wherever it occurs. If the test assertions depend on where the element occurs then I might have to have local elements defined, so they can have a set of test assertions applied to them depending on their context (where they occur in the document). This messes with my design a bit but seems to be part and parcel of this technique. All in all, I need my assertions to map to the test assertions and the test assertions to map to the 'normative' statements in the spec for that type of document. OK, fine. It means that to use this approach I have to base my schema design not just on the document's XML structure and the processes intended for the document (which might impose requirements on the schema design) but also partly on the spec's design, insofar as that dictates the test assertions and their granularity. I might then have a mix of global element definitions and local element definitions, but I foresee possible problems if an element has some assertions relating to it as a target which depend on its context in the document and some which don't. In these cases the local, context-dependent requirements (and their test assertions) trump the global, context-independent ones and the element might just have to be defined with one or more local definitions in the schema.
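Here is a sketch of that global-versus-local split, again with invented element names (party, seller, taxNumber) and invented test assertion identifiers: the global declaration carries the assertion which applies wherever the element occurs, while the local declaration inside seller adds a context-dependent one:

    <!-- global declaration: TA-31 applies to 'party' wherever it occurs -->
    <xs:element name="party" type="partyType"/>
    <xs:complexType name="partyType">
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
        <xs:element name="id" type="xs:string"/>
      </xs:sequence>
      <!-- TA-31 (made up): a party must carry a non-empty id -->
      <xs:assert test="string-length(id) gt 0"/>
    </xs:complexType>

    <!-- local declaration: a context-dependent assertion for 'party' inside 'seller' -->
    <xs:element name="seller">
      <xs:complexType>
        <xs:sequence>
          <xs:element name="party">
            <xs:complexType>
              <xs:sequence>
                <xs:element name="name" type="xs:string"/>
                <xs:element name="id" type="xs:string"/>
                <xs:element name="taxNumber" type="xs:string"/>
              </xs:sequence>
              <!-- TA-32 (made up): a seller's party must also give a tax number -->
              <xs:assert test="string-length(taxNumber) gt 0"/>
            </xs:complexType>
          </xs:element>
        </xs:sequence>
      </xs:complexType>
    </xs:element>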
I conclude I'm fairly comfortable with the use of W3C XML Schema 1.1 for associating test assertions with XML documents, and even with other targets for testing, but it might be limited to applying a kind of truth table of yes/no test results as a layer over the top of a more general test report, for tests made some way other than with the assertion expressions themselves; this goes along similar lines to those used in tools like Tamelizer. To go further and make the schema double as the set of test cases (the test suite or part of it) requires, I think, that the target be an XML document which is itself defined using a schema under the control of the test assertion and/or test case author(s). Cool.