Create openEHR Conformance Definition specification



Pablo Pazos
August 24, 2017, 12:36 AM

This is my review from Aug 11 (sent to the SEC list; it had some text coloring that doesn't appear here).

1.1. Purpose

Shouldn't we add something about helping organizations to actually create/write conformance statement documents? Like guidelines and maybe some formatting suggestions?

IMO that is the first step before any testing, and it actually defines what can be tested. For instance, I wrote this doc for EHRServer (3rd PDF on this page). It would be desirable to have a guide on how to write those, and for that to be part of the purpose of this spec.

2. Overview

Conformance specification and testing ...

2.1. Scope

"A conformance point is understood as a testable functional capability of a system..."

Three paragraphs below it says "non-functional capabilities, including ..."

This might be confusing. Maybe we should say in the first sentence that there are both functional and non-functional conformance points.

"It is however recommended that supply agreements for operational solutions include criteria for these factors, as relevant to the situation." — this is part of what should be included in the "how to create/write a conformance statement" guidelines.

IMO there is no need to mention the definitions of capacity, performance, etc. here; just reference the corresponding standards, to avoid redefining terms that are already defined there.

This should not be part of the spec; too open and ambiguous IMO: "It may be possible to develop a band-based rating sytem for capacity. Performance, availability, consistency and related characteristics may be assessed using a framework that takes account of the CAP and / or PACELC theorems."

"... a minimal portal tool is provided to enable viewing or data and other testable events"

"Portal" implies the software is web based; we shouldn't make any assumptions.

2.2. Conformance Certificate

It is not clear: what is this? How is it created? Who can create it? Who can grant it? Who verifies it? The same questions will come from any vendor.

I can interpret a lot of things from this definition, please elaborate.

3. Evaluation Environment

Conformance evaluation relies on some normative idea of systems that may be tested against a set of specifications. This section describes the assumptions that are made about systems that may be reasonably tested according to this specification.

We should use stricter terms. IMO we should not explain how we decide to test systems, just say what is valid and what is not, e.g.:

Conformance evaluation should be tested against a set of specifications. This section describes how conformance should be tested.

BTW, it is not clear which specifications the tests should be done against: the conformance specs, the openEHR specs, the app/system specs, the app/system's conformance statement with openEHR, etc.

Because one spec is used to guide the assessment (Conformance), another is used to test against (openEHR), and another is needed to know what and how to test (the app/system spec).

"...certain types of components are assumed and named in this specification, e.g. 'EHR persistence', 'Demographic persistence', 'Querying' and so on."

Add glossary section with term definitions.

"....system provided to be tested for conformance according to this specification is of a platform nature"

"The functionality assumed to be in these components is indirectly defined by the openEHR REST APIs..."

We need to be careful with that: some products might not fit into those categories but still implement openEHR, and their vendors might want to test conformance. I prefer not to assume a lot about systems just because it makes testing easier, since that will generate problems in the community.

Another point: if we want to provide conformance only for platforms, not systems or apps, we should state that explicitly to avoid confusion.

"A further assumption about an openEHR platform is that there may be a IHE ATNA-based System Log service, and if it exists, it is accessible in the test environment via an appropriate interface."

Too open: either the log service is there or it is not. Also, I'm not sure whether that should be provided by the vendor or by the test environment created by openEHR. Are we evaluating whether they provide such a component, or whether their solution uses that component? A solution can use an external log service without providing one.

"Lastly it is assumed that there may be a generic data viewer available..."

Why? If the solution is a service-oriented system, it might not have anything for actually viewing data. And if the test suite will automatically test data retrieved via services, why is a view for humans needed? I hope we are not going to assess systems manually by looking at screens.

Related to 3.2. Manual Testing: IMO manual steps are something that should be avoided.

4.2.1. Data Persistence Components

I know of persistence based on ADL 1.4, like the implementation done in China that generates tables based on ADL structures. Should that be mentioned here?

Can the Archetype Validation conformance points be defined in more detail? These items could mean a lot of things:

Accept valid content
Reject invalid archetype
Reject invalid content
1. Is this about a composition/data commit, or about the archetypes themselves?

2. Does that validate an ADL, or archetype references inside an OPT?

Also, should any of those points be related to data validation using a TDS?
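To make the ambiguity concrete, here is a minimal sketch (in Python) of one possible reading of those three points as data-commit checks. The `validate_commit` function and its rules are hypothetical, invented for this example only; none of these names come from the openEHR specifications.

```python
# Hypothetical reading of the three conformance points as data-commit
# checks. `validate_commit` is a stand-in for whatever validation a
# server performs; it is NOT part of the openEHR specifications.

def validate_commit(composition: dict, known_archetypes: set) -> str:
    """Classify a commit: accepted, rejected because the referenced
    archetype is unknown, or rejected because the content is invalid."""
    archetype_id = composition.get("archetype_node_id")
    if archetype_id not in known_archetypes:
        return "rejected: invalid archetype"   # 'Reject invalid archetype'
    if not composition.get("content"):
        return "rejected: invalid content"     # 'Reject invalid content'
    return "accepted"                          # 'Accept valid content'

known = {"openEHR-EHR-COMPOSITION.encounter.v1"}

print(validate_commit(
    {"archetype_node_id": "openEHR-EHR-COMPOSITION.encounter.v1",
     "content": [{"some": "entry"}]}, known))  # → accepted
print(validate_commit(
    {"archetype_node_id": "unknown.v1",
     "content": [{"some": "entry"}]}, known))  # → rejected: invalid archetype
print(validate_commit(
    {"archetype_node_id": "openEHR-EHR-COMPOSITION.encounter.v1",
     "content": []}, known))                   # → rejected: invalid content
```

Even under this one reading, whether "content" means committed data instances or the archetype/OPT definitions themselves changes what each check must do, which is exactly why the spec should spell it out.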

All access methods are REST. I understand that is an easy way of doing automatic tests, but it might not be suited to all solutions.
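For what it's worth, REST does make fully automated checks simple to drive. The sketch below (Python standard library only) runs a commit-accept/commit-reject check against an in-process stub server; the `/composition` path, the JSON payload shape, and the 201/400 status codes are assumptions made for this example, not taken from the openEHR REST API specs.

```python
# Sketch of an automated REST conformance check against a stub server.
# The '/composition' path and the 201/400 codes are assumptions made
# for this example only, not the actual openEHR REST API.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request, error

class StubHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        try:
            # Accept the commit only if it has non-empty content.
            valid = bool(json.loads(body).get("content"))
        except json.JSONDecodeError:
            valid = False
        self.send_response(201 if valid else 400)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/composition"

def commit(payload: bytes) -> int:
    """POST a commit payload and return the HTTP status code."""
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    try:
        return request.urlopen(req).status
    except error.HTTPError as e:
        return e.code

results = [commit(b'{"content": [1]}'),  # valid commit
           commit(b'{"content": []}')]   # invalid commit
print(results)  # → [201, 400]
server.shutdown()
```

A solution without a REST interface (e.g. one exposing only a messaging or library API) couldn't be exercised this way, which is the point: if REST is the only access method the test suite supports, the spec should say so explicitly.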


Thomas Beale