Jama Test Tracing

Andi Lamprecht · 2 min read · Draft
ADR-0247 · Author: Eric Gesell · Date: 2025-08-18 · Products: shared
Originally ADR-0119-Jama-Test-Tracing (v2) · Source on Confluence ↗

Tracing Automated Tests In Jama

Context

As part of our new development process we require traceability from the high-level requirements all the way down to code and tests. Ideally, all of those links can be traced in a single source, namely Jama. This ADR covers methods for applying test results to the relevant Jama requirements.

Decision

Use the Jama API to annotate Jama features with their associated test results.

The process to use the Jama API works as follows:

  1. Requirements are written following the process defined in X.
  2. Integration tests are written in a behavior-driven dialect such as Cucumber. This step is a collaboration between product (who defines the requirements) and engineering (who implements the low-level parts of each test). Each test includes the Project ID of the Jama requirement it is associated with.
  3. The tests are run via GitHub Actions workflows, either when a branch is deployed or whenever the workflow is triggered manually.
  4. After the test run, the results are published to Jama via a small script that updates the relevant items, using the Project ID as a unique identifier.
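The publishing step (4) could be sketched as follows. The `@JAMA-<id>` tag convention and the shape of the result data are illustrative assumptions, not part of the defined process:

```python
from collections import defaultdict

def aggregate_results(test_results):
    """Group pass/fail counts by the Jama Project ID tagged on each test.

    `test_results` is a list of (tags, passed) pairs, where `tags` are the
    Cucumber tags on a scenario, e.g. ["@JAMA-REQ-1"] (the tag format here
    is an assumption, not the team's actual convention).
    """
    by_item = defaultdict(lambda: {"passed": 0, "failed": 0})
    for tags, passed in test_results:
        for tag in tags:
            if tag.startswith("@JAMA-"):
                item_id = tag[len("@JAMA-"):]
                by_item[item_id]["passed" if passed else "failed"] += 1
    return dict(by_item)
```

The resulting per-item counts would then be handed to the small publishing script, which looks each item up by its Project ID.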

Reporting Test Results to Jama

We have built a prototype GitHub workflow based on the Jama Python API example. This prototype proves that we can update existing Jama objects from a GitHub workflow, and it opens the door to other Jama integrations and more complex workflows.
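As a rough illustration of what the update step involves, the sketch below builds a PATCH request against the Jama REST `/items/{id}` endpoint. The instance URL, the patched field, and the JSON Patch body are placeholder assumptions to be checked against the Jama REST API documentation; the actual prototype is based on the Jama Python API example.

```python
import json
import urllib.request

JAMA_BASE = "https://example.jamacloud.com/rest/v1"  # placeholder instance URL

def build_update_request(item_id, status_text):
    """Build the URL and JSON Patch body for annotating a Jama item.

    The /items/{id} path follows the Jama REST API; the field being
    patched ("description" here) is a placeholder assumption.
    """
    url = f"{JAMA_BASE}/items/{item_id}"
    body = json.dumps([{"op": "replace",
                        "path": "/fields/description",
                        "value": status_text}]).encode()
    return url, body

def publish(item_id, status_text, token):
    """Send the update to Jama. Requires network access and credentials."""
    url, body = build_update_request(item_id, status_text)
    req = urllib.request.Request(
        url, data=body, method="PATCH",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Separating payload construction from the HTTP call keeps the script easy to test without hitting a live Jama instance.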

Expanding the Scope

The current prototype publishes fake test results directly to the Jama requirements. Another option would be to publish these results to Jama test case objects. Using Test Cases would allow us to use the built-in reporting functions provided by Jama.
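Under the Test Case option, the script would set the status of a test run associated with each Test Case rather than annotating the requirement directly. The payload sketch below is hypothetical; the field names ("testRunSteps", "actualResults") are assumptions to verify against the Jama REST API reference.

```python
def build_test_run_payload(passed, notes=""):
    """Construct a hypothetical test-run update payload.

    Jama derives a run's overall status from its step statuses; the
    field names used here are assumptions, not confirmed API fields.
    """
    status = "PASSED" if passed else "FAILED"
    return {"fields": {"testRunSteps": [{"status": status}],
                       "actualResults": notes}}
```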

Consequences

Once we implement the flow described in this document, we will have direct links from requirements to test results. We will be able to run the tests whenever needed to get updated reporting in Jama.

Alternatives Considered

The main alternative considered was TestRail. It was rejected for the following reasons:

  • Learning a new third-party tool
  • Redundancy, since similar test reporting can be achieved with Jama Test Case objects
  • Cost
  • Third-party data risk
  • Integration costs, which would likely be similar to the development cost of building our own scripts