Early access open — 121 days to EU AI Act enforcement

When the regulator asks,
you'll have the answer.

Tracient logs every AI agent action and generates regulator-ready DORA and EU AI Act evidence packs on demand. Purpose-built for compliance officers and DPOs at regulated fintechs, neo-banks, and payment institutions.

See how it works →

The compliance gap

Your AI agents
have no paper trail.

DORA Articles 9, 12, and 13 require regulated financial institutions to maintain auditable records of every automated process. When your AI agent queries your transaction database, calls a sanctions API, or flags a customer account — that is a regulated action. Today, most compliance officers cannot answer the regulator's first question: what did your agents access, and when?

Your IAM platform covers your people. Your GRC tool maps your policies. Neither captures what your AI agents actually did — and neither produces the evidence pack your auditor will ask for. Tracient does.

See how it works →
The question your auditor will ask
"Provide the access log for all AI systems that processed customer data in Q2 2026."
Without Tracient
No log exists. Cannot demonstrate compliance. Regulatory exposure.
With Tracient
One-click DORA evidence pack. Complete audit trail. Ready in seconds.

How it works

Install in an afternoon.
Audit-ready by morning.

Tracient sits as a thin SDK layer between your AI agents and your systems. No changes to your existing agent code. No new infrastructure. No replacement of your existing compliance tooling — Tracient fills the gap that tooling leaves.

01

Install the SDK

One pip install. Hooks into LangChain's native callback system without modifying your agent code. Works with your existing orchestration setup.

$ pip install tracient

from tracient import AuditLayer
agent = AuditLayer.wrap(your_agent)
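Conceptually, the wrapper intercepts each tool call and records a structured event before delegating to the underlying agent. The sketch below is illustrative only — `AuditLayer`, `EchoAgent`, and the field names are hypothetical stand-ins, not the actual Tracient API:

```python
import time
import uuid


class AuditLayer:
    """Illustrative sketch of a tool-call audit wrapper (not the real API)."""

    def __init__(self, agent, agent_id):
        self._agent = agent
        self._agent_id = agent_id
        self.events = []  # in the real product this would be a durable store

    @classmethod
    def wrap(cls, agent, agent_id="agent"):
        return cls(agent, agent_id)

    def call_tool(self, tool_name, **kwargs):
        # Record the structural facts of the call before executing it.
        self.events.append({
            "event_id": str(uuid.uuid4()),
            "event": "tool_call",
            "agent": self._agent_id,
            "tool": tool_name,
            "timestamp": time.time(),
        })
        return self._agent.call_tool(tool_name, **kwargs)


class EchoAgent:
    """Stand-in for a real agent, used only to demonstrate the wrapper."""

    def call_tool(self, tool_name, **kwargs):
        return f"{tool_name} ok"


wrapped = AuditLayer.wrap(EchoAgent(), agent_id="fraud-detection-v2")
result = wrapped.call_tool("postgres.query", sql="SELECT 1")
```

Because the wrapper delegates every call unchanged, the agent's behaviour is untouched; only the audit trail is added.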
02

Every action captured

Agent identity, model version, every tool call, data source, permissions, trigger, and timestamp — all logged automatically in a tamper-evident audit store.

event: tool_call
agent: fraud-detection-v2
tool: postgres.query
dora: Art.9(2)(c) mapped
euai: Art.12 mapped
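"Tamper-evident" typically means each log entry cryptographically commits to the one before it, so altering any historical record invalidates every hash after it. The hash-chain sketch below illustrates the general technique; it is not Tracient's actual storage format:

```python
import hashlib
import json


class HashChainedLog:
    """Append-only log where each entry commits to the previous entry's hash."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        # Recompute the chain from the start; any edit breaks it.
        prev_hash = "0" * 64
        for entry in self.entries:
            payload = json.dumps(entry["record"], sort_keys=True)
            expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
            if entry["prev"] != prev_hash or entry["hash"] != expected:
                return False
            prev_hash = entry["hash"]
        return True


log = HashChainedLog()
log.append({"event": "tool_call", "agent": "fraud-detection-v2", "tool": "postgres.query"})
log.append({"event": "tool_call", "agent": "fraud-detection-v2", "tool": "sanctions.check"})
```

An auditor can re-verify the entire chain independently, which is what makes after-the-fact edits detectable.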
03

Evidence packs on demand

Generate a formatted DORA Article 9, 12, and 13 evidence pack — or EU AI Act transparency documentation — at the click of a button. Pre-structured for regulators.

DORA Art.9 access evidence
DORA Art.12 incident extract
EU AI Act Art.13 transparency
evidence_q2_2026.pdf
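At its core, an evidence pack is a filter over the audit trail: select the events mapped to a given article within the reporting window. A minimal sketch, assuming hypothetical event fields that mirror the example above:

```python
from datetime import datetime, timezone


def build_evidence_pack(events, article, start, end):
    """Return the events mapped to `article` within [start, end)."""
    return [
        e for e in events
        if article in e.get("mappings", [])
        and start <= e["timestamp"] < end
    ]


events = [
    {"agent": "fraud-detection-v2", "tool": "postgres.query",
     "timestamp": datetime(2026, 5, 14, tzinfo=timezone.utc),
     "mappings": ["DORA Art.9(2)(c)", "EU AI Act Art.12"]},
    {"agent": "kyc-agent", "tool": "sanctions.check",
     "timestamp": datetime(2026, 1, 3, tzinfo=timezone.utc),  # outside Q2
     "mappings": ["DORA Art.9(2)(c)"]},
]

# "Provide the access log for all AI systems... in Q2 2026."
q2_pack = build_evidence_pack(
    events, "DORA Art.9(2)(c)",
    start=datetime(2026, 4, 1, tzinfo=timezone.utc),
    end=datetime(2026, 7, 1, tzinfo=timezone.utc),
)
```

The formatted PDF is then a rendering of this selection; the point is that the selection itself is mechanical, not a manual log-translation exercise.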

Regulatory coverage

Three regulations.
One audit trail.

Every agent action is mapped to the specific articles your institution is subject to — so you're never manually translating raw logs into evidence.

DORA

Digital Operational Resilience Act

Applies from 17 January 2025 · Enforcement ongoing

Requires regulated financial entities to maintain auditable ICT risk frameworks. All automated processes — including AI agents — must be logged, governed, and demonstrably controlled.

  • Art. 9 — ICT risk management & access controls
  • Art. 12 — Backup & recovery logging
  • Art. 13 — Learning & evolving (post-incident review)
  • Art. 28 — Third-party ICT risk management
EU AI Act

EU Artificial Intelligence Act

High-risk provisions: 2 August 2026

Classifies AI in credit scoring, fraud detection, and underwriting as high-risk. Requires automatic event logging, human oversight mechanisms, and transparency documentation.

  • Art. 12 — Automatic logging of events
  • Art. 13 — Transparency & information
  • Art. 14 — Human oversight requirements
  • Art. 17 — Quality management systems
NIS2

Network & Information Security Directive

Transposition deadline: 17 October 2024

Extends cybersecurity obligations across financial services. Requires incident reporting, audit trails of ICT events, and supply chain security measures for all significant systems.

  • Art. 21 — Cybersecurity risk management
  • Art. 23 — Incident reporting obligations
  • Art. 24 — Use of cybersecurity certification schemes
  • Art. 32 — Supervisory enforcement

Who it's for

Built for the people
who get the call.

Compliance and risk professionals at regulated financial institutions deploying AI agents — who cannot yet demonstrate to a regulator what those agents are doing.

Head of Compliance

Series B fintech · DORA regulated

"We've deployed three AI agents this quarter. Our SailPoint instance covers human identity. Nobody has an answer for what the agents are doing."

Chief Risk Officer

Neo-bank · Payment institution licence

"Our next audit is Q3. When they ask for the agent access log, I need to hand them something — not explain that we haven't built it yet."

Chief Information Security Officer

Insurtech · Solvency II + DORA scope

"We know what every human user touches. We have no idea what our underwriting agent accessed last Tuesday. That's a gap I can't defend."

FAQ

Common
questions.

Everything you need to know before requesting a pilot.

We already use SailPoint or Saviynt — do we need this?

SailPoint and Saviynt govern human identities — your employees. They have no visibility into AI agents querying your databases or calling your APIs. Tracient provides the same governance layer for your non-human identities, in a format that maps directly to DORA and EU AI Act requirements. The two are complementary, not competing.

We already have a GRC platform for compliance — why do we need this too?

GRC platforms are excellent at mapping your policies, documenting controls, and tracking framework obligations. What they cannot do is capture what your AI agents actually did at runtime — the tool calls, data sources, timestamps, and permissions that constitute the evidence itself. Tracient generates that runtime evidence. Your GRC platform then has something real to attach to the control. They work together.

Which agent frameworks do you support?

Early access supports LangChain natively — the most widely deployed framework in regulated financial services. CrewAI, LlamaIndex, and AutoGen integrations are on the roadmap for Q3 2026. If your agents use a custom orchestration layer, we can discuss during the pilot.

Where is the audit log data stored?

Logs are stored in a tamper-evident, append-only store with EU data residency. We capture structural metadata — agent ID, tool, resource, timestamp, permission — not the content of prompts or outputs. Full data processing agreement available on request. SOC 2 Type II in progress for Q4 2026.
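Metadata-only capture can be enforced with an explicit allowlist: structural fields pass through, everything else — including prompt and output content — is dropped before anything is written. A sketch with illustrative field names:

```python
# Only structural facts of a call are retained; content never is.
ALLOWED_FIELDS = {"agent_id", "tool", "resource", "timestamp", "permission"}


def to_audit_record(raw_event: dict) -> dict:
    """Keep allowlisted metadata fields; drop prompt/output content."""
    return {k: v for k, v in raw_event.items() if k in ALLOWED_FIELDS}


raw = {
    "agent_id": "underwriting-v1",
    "tool": "postgres.query",
    "resource": "customers",
    "timestamp": "2026-06-02T10:15:00Z",
    "permission": "read",
    "prompt": "Summarise the customer's risk profile",  # never stored
    "output": "...",                                    # never stored
}
record = to_audit_record(raw)
```

An allowlist fails safe: a new content-bearing field added upstream is excluded by default, rather than logged by accident.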

What does the early access pilot involve?

A 90-day pilot at no cost. We install the SDK in your environment, configure the DORA and EU AI Act mappings for your specific agent workflows, and generate your first evidence pack within a week. In exchange: a weekly feedback call and the right to reference your experience as an anonymised case study.

What does it cost after the pilot?

Early access organisations lock in founding pricing: £750/month for up to 5 agents, £1,800/month for up to 25. Scale pricing for larger estates on request. All plans include DORA and EU AI Act evidence packs, unlimited log storage, and the compliance dashboard.

Early access

2 August 2026
is closer than
your next audit.

121 days to EU AI Act enforcement

Join compliance and risk professionals from regulated fintechs, neo-banks, and payment institutions getting early access before the EU AI Act deadline.

90-day free pilot · EU data residency · No credit card
