The Algorithmic Transparency Recording Standard (ATRS) is the UK government's answer to a question that quietly went unanswered for too long: when a public body uses AI to make or assist a decision, how is that visible to the people affected by it? The standard is now mandated for central government and increasingly adopted across the wider public sector. The records are public, structured, and auditable.
Background
ATRS originated from the Cabinet Office and the Centre for Data Ethics and Innovation, with current stewardship through DSIT (the Department for Science, Innovation and Technology). The standard sets out a structured template for documenting an algorithmic tool: what it does, who is affected, and what oversight exists. The records are published on the Algorithmic Transparency Hub.
The standard is mandatory across central government for in-scope AI tools and is being adopted across local government, the NHS, and devolved administrations through a mix of policy mandates and procurement pressure. Treating ATRS as optional is increasingly a procurement risk.
When it applies
Not every algorithm needs an ATRS record. The standard is aimed at AI tools that meaningfully shape decisions affecting the public. A typical positive case is an AI model that classifies, prioritises, or routes citizen interactions. A typical negative case is a back-office analytics dashboard that helps managers see trends but does not directly affect any individual decision.
The line is a judgement call, not a bright-line rule. The DSIT guidance provides examples and a self-assessment framework. The question to ask is whether a member of the public could reasonably want to know that AI was involved in how their request was handled. If yes, an ATRS record is appropriate.
What an ATRS record contains
The standard is organised into two tiers. Tier 1 is a brief summary: what the tool does, the organisation accountable for it, and a contact for queries. Tier 2 goes deeper: detailed information about how the tool works, the data it uses, the technical specification, the oversight in place, and the risks, mitigations, and impact assessments. The published record is intended to be readable by a member of the public, not just a technical reviewer.
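One way to picture the tiered structure described above is as a typed record. This is a minimal sketch only: the field names below are illustrative shorthand for the kinds of information each tier holds, not the standard's official schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class Tier1:
    # Brief, public-facing summary (illustrative field names, not the ATRS schema)
    name: str
    summary: str
    accountable_organisation: str
    contact_email: str

@dataclass
class Tier2:
    # Deeper detail for auditors and technical reviewers (again, illustrative)
    how_it_works: str
    data_used: list
    technical_specification: str
    human_oversight: str
    risks_and_mitigations: list

@dataclass
class TransparencyRecord:
    tier1: Tier1
    tier2: Tier2

    def to_dict(self) -> dict:
        """Flatten into a plain dict, e.g. for publication as JSON."""
        return asdict(self)

record = TransparencyRecord(
    tier1=Tier1(
        name="Correspondence triage model",
        summary="Classifies and routes incoming citizen correspondence.",
        accountable_organisation="Example Department",
        contact_email="ai-governance@example.gov.uk",
    ),
    tier2=Tier2(
        how_it_works="A text classifier assigns each item to a service queue.",
        data_used=["correspondence text", "service metadata"],
        technical_specification="Fine-tuned transformer, retrained quarterly.",
        human_oversight="Low-confidence items are routed to a human reviewer.",
        risks_and_mitigations=["Misrouting delays; mitigated by a confidence threshold."],
    ),
)
```

Keeping the two tiers as separate types mirrors the standard's split between a public summary and the deeper technical detail, and makes it easy to publish one without the other.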
Writing a record by hand for every AI tool, every model update, and every change of scope is the kind of task that gets postponed until a regulator asks. Generating the record automatically from the platform's configuration and operational metadata removes that excuse and produces something more accurate than a hand-written document.
What good looks like
A working ATRS process for operational AI has three properties. The record is generated from real configuration and live operational data, not curated for the publication event. The record updates when the model changes, the training data changes, or the oversight model changes, without anyone needing to remember to do it. The record is clear enough that a member of the public reading it could form a reasoned view of whether the tool seems fair, with enough technical depth that an auditor reading it could form a reasoned view of whether the tool is sound.
The record should also link back to the operational evidence: classification thresholds, override rates, confidence distribution, and the human-in-the-loop policy. These are the things an auditor will eventually ask about, and a record that links to them is far easier to defend than one that asserts them.
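The link from record to operational evidence can be sketched as a small builder that reads live configuration and metrics rather than hand-typed claims. Everything here is a hypothetical illustration: the function name, config keys, and metric keys are assumptions, not any real platform API.

```python
def build_evidence_section(config: dict, metrics: dict) -> dict:
    """Assemble the operational-evidence portion of a transparency record
    from live configuration and metrics (illustrative keys, not the ATRS schema)."""
    total = metrics["decisions_total"]
    overridden = metrics["decisions_overridden"]
    return {
        "classification_threshold": config["confidence_threshold"],
        "human_in_the_loop_policy": config["hitl_policy"],
        # Computed from live counters, so the published figure matches reality
        "override_rate": round(overridden / total, 4) if total else None,
        "confidence_distribution": metrics["confidence_histogram"],
    }

evidence = build_evidence_section(
    config={"confidence_threshold": 0.85, "hitl_policy": "human review below threshold"},
    metrics={
        "decisions_total": 12000,
        "decisions_overridden": 240,
        "confidence_histogram": {"<0.5": 300, "0.5-0.85": 1700, ">=0.85": 10000},
    },
)
# With these example counters, override_rate comes out at 0.02
```

Because the figures are derived at generation time, the record asserts nothing that the platform's counters cannot back up, which is the property an auditor will test.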
How Jarsis handles it
Jarsis Platform generates ATRS-aligned records automatically for every AI feature in scope: classification, document metadata extraction, and any other model the platform runs. The records are populated from the live platform configuration and operational metrics, so what is published reflects what is actually running. When a model or threshold changes, the record updates with it.
The records can be exported in the structure expected by the Algorithmic Transparency Hub for publication, or consumed internally for an organisation's own governance. The Compliance & Audit module gives the governance team the same view, with sign-off and review workflows attached.
See ATRS records being generated
Book a 30-minute walkthrough. We will show you the AI features running, the records they generate, and how the governance workflow looks in practice.
