One platform.
Every regulator's question, answerable.
Kirimana is the open-source, AI-native data contract and automation platform. The capabilities are the same in every industry: an owner on every contract, classification before any model sees the data, lineage from business goal to source row, an audit entry on every AI call. What changes is the regulator and the question. Below: how those capabilities meet the realities of four industries we know well.
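Those four capabilities can be pictured as fields on a single contract object. A minimal sketch, assuming a hypothetical contract shape; the class, field, and method names here are illustrative, not Kirimana's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataContract:
    name: str                       # canonical dataset name
    owner: str                      # an owner on every contract
    classification: str             # checked before any model sees the data
    lineage: list[str] = field(default_factory=list)    # business goal -> source row
    audit_log: list[str] = field(default_factory=list)  # one entry per AI call

    def record_ai_call(self, model: str) -> None:
        # Hypothetical hook: append an audit entry before data reaches a model.
        self.audit_log.append(f"model={model} classification={self.classification}")

contract = DataContract(
    name="retail.customers",
    owner="data-platform-team@example.org",   # example owner, not a real address
    classification="confidential",
    lineage=["goal:churn-report", "table:crm.accounts"],
)
contract.record_ai_call("churn-scorer-v2")
```

The point of the sketch is that owner, classification, lineage, and audit live on the contract itself, not in any one platform's tooling.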
Every industry is becoming a data industry — and an AI-governed one.
The pattern is the same in manufacturing, retail, telco, energy, logistics, and SaaS: data sits in many platforms, the AI conversation is moving faster than the governance, and someone has to be able to answer the regulator (or the auditor, or the board) when they ask where a number came from. Kirimana doesn't replace your data platform. It puts a contract on top of it that travels with the data — across teams, across vendors, across years.
- A patchwork of data tools — warehouse, lakehouse, an old data lake, vendor-specific governance UI for each — and no single canonical truth about what each dataset means.
- AI usage is accelerating, but security and compliance can't tell which model saw which classification of data.
- Schema drift breaks dashboards on Monday morning; someone discovers it from a complaint, not a check.
- When the company moves platforms (cloud migration, M&A, vendor change), the governance work is redone from scratch.
A mid-size company moves from a self-managed data warehouse to a managed lakehouse. Without Kirimana, the contracts, classifications, and audit history are tied to the old tooling — every dataset gets re-curated by hand. With Kirimana, the canonical contracts and the audit trail are platform-agnostic; the platform adapter changes, the governance does not.
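The adapter idea in that migration scenario can be sketched in a few lines. This is an assumption about the architecture, not Kirimana's actual interfaces; the class and method names are invented for illustration:

```python
from abc import ABC, abstractmethod

class PlatformAdapter(ABC):
    """Hypothetical adapter boundary: only this layer knows the backend."""
    @abstractmethod
    def read_schema(self, dataset: str) -> dict: ...

class WarehouseAdapter(PlatformAdapter):
    def read_schema(self, dataset: str) -> dict:
        return {"backend": "warehouse", "dataset": dataset}

class LakehouseAdapter(PlatformAdapter):
    def read_schema(self, dataset: str) -> dict:
        return {"backend": "lakehouse", "dataset": dataset}

def contract_check(adapter: PlatformAdapter, dataset: str, contract_fields: set) -> bool:
    # The governance check is identical regardless of which backend answers.
    schema = adapter.read_schema(dataset)
    return schema["dataset"] == dataset and bool(contract_fields)

# Migration: swap the adapter; the contract check does not change.
ok_before = contract_check(WarehouseAdapter(), "sales.orders", {"order_id"})
ok_after = contract_check(LakehouseAdapter(), "sales.orders", {"order_id"})
```

Because the check depends only on the adapter interface, the contracts and audit history survive the platform change intact.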
When the regulator asks where a number came from, you have an answer.
Banking data isn't just sensitive — it's reportable. Basel Committee on Banking Supervision Standard 239 (BCBS 239) demands lineage and timeliness on risk data. The European Union's Digital Operational Resilience Act (DORA) demands operational evidence on every change. The European Union Artificial Intelligence Act (EU AI Act) demands classification and audit on every model. Markets in Financial Instruments Directive II (MiFID II) demands traceability on transaction reporting. None of these regimes cares which cloud you're on. They care that you can produce the trail. Kirimana gives you the trail by design, not as a quarterly project.
- Risk data aggregation across trading, treasury, and retail divisions — a BCBS 239 obligation — is built on lineage that only one engineer remembers.
- AI-assisted decisions in credit, fraud, and Anti-Money Laundering (AML) need to be explainable to model risk and to the regulator. Today they aren't.
- Schema drift in a feeder system breaks regulatory reports two weeks later, with no warning.
- Sensitive customer data leaves the tenant when a developer pastes it into a public AI tool.
An internal audit team is asked to produce, within five business days, the full lineage of the Tier 1 capital ratio number reported last quarter — every input, every transformation, every owner. With Kirimana, the goal-to-data lineage and the audit log together produce that evidence pack from a single command. The auditor reads it; the regulator sees the same trail.
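The evidence pack in that scenario amounts to an upstream walk of a lineage graph. A minimal sketch, assuming a hypothetical graph shape; the node names, fields, and the `evidence_pack` helper are all illustrative, not the platform's actual model or command:

```python
# Hypothetical lineage graph: each node records its owner and its inputs.
lineage = {
    "report:tier1_capital_ratio": {"owner": "risk-reporting", "inputs": ["calc:rwa"]},
    "calc:rwa": {"owner": "risk-engineering", "inputs": ["src:trading", "src:retail"]},
    "src:trading": {"owner": "trading-data", "inputs": []},
    "src:retail": {"owner": "retail-data", "inputs": []},
}

def evidence_pack(node: str, graph: dict) -> list[dict]:
    """Depth-first upstream walk: every input, every step, every owner."""
    seen: set = set()
    pack: list = []
    def visit(n: str) -> None:
        if n in seen:
            return
        seen.add(n)
        pack.append({"node": n, "owner": graph[n]["owner"]})
        for upstream in graph[n]["inputs"]:
            visit(upstream)
    visit(node)
    return pack

pack = evidence_pack("report:tier1_capital_ratio", lineage)
```

One traversal yields the report, the risk-weighted-assets calculation, and both source systems, each with its owner: the same trail the auditor reads and the regulator sees.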
Sovereignty, transparency, accountability — without locking yourself to one vendor.
Public-sector data programmes operate under a different gravity. Procurement is multi-vendor by mandate. Data residency is sovereign — citizens' data stays in jurisdiction. Transparency obligations cut both ways: the public has a right to know, and the agency has an obligation to redact what the law says must be redacted. Budgets per agency are smaller than the headlines suggest. Open source isn't a preference; it's a procurement strategy. Kirimana fits the public-sector contract because it was designed open from the first commit.
- Vendor lock-in conflicts directly with EU sovereign-cloud and procurement-diversification policies.
- Citizens' data must stay in jurisdiction. The platform must self-host on European Union (EU) infrastructure or on-premises — and prove it.
- Multiple agencies want to share contract definitions (a 'person', an 'organisation', a 'case file') but each runs a different stack.
- Transparency requests, freedom-of-information requests, and General Data Protection Regulation (GDPR) Article 15 access requests collide with Article 17 erasure obligations unless the audit trail is built for both disclosure and deletion.
Three agencies in the same ministry need a shared definition of 'organisation' that traces to the national business registry. Without Kirimana, this becomes a six-figure procurement for a master-data tool. With Kirimana, one agency publishes the canonical contract to a federated library; the other two adopt it, version-pin it, and continue running their existing platforms. No vendor consolidation required.
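The publish-and-pin flow in that scenario can be sketched as a tiny federated library. This is an assumption about the mechanics; the `publish`/`adopt` functions and the contract fields are invented for illustration:

```python
# Hypothetical federated contract library, keyed by (name, version).
library: dict = {}

def publish(name: str, version: str, definition: dict) -> None:
    library[(name, version)] = definition

def adopt(name: str, pinned_version: str) -> dict:
    # Adopting agencies pin a version, so upstream changes never surprise them.
    return library[(name, pinned_version)]

# Agency A publishes the canonical 'organisation' contract.
publish("organisation", "1.0.0", {
    "traces_to": "national-business-registry",
    "fields": ["org_id", "legal_name", "jurisdiction"],
})
# A later revision coexists; nobody is forced onto it.
publish("organisation", "1.1.0", {
    "traces_to": "national-business-registry",
    "fields": ["org_id", "legal_name", "jurisdiction", "sector"],
})

# Agencies B and C adopt the same pinned version on their own stacks.
agency_b = adopt("organisation", "1.0.0")
agency_c = adopt("organisation", "1.0.0")
```

The two adopting agencies resolve to the identical definition while continuing to run different platforms, which is the whole point of sharing the contract rather than the tooling.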
Patient data, research data, and AI assistance — under one classification gate.
Healthcare data flows in three currents that should not mix: clinical operations (a patient's record, in care), research (de-identified data under ethics approval), and population-level analytics (aggregated, public-health). The boundaries are statutory. The Health Insurance Portability and Accountability Act (HIPAA) in the United States, GDPR Article 9 special-category data in Europe, ethics-board approvals on every research dataset, and the rapidly evolving guidance on Artificial Intelligence in clinical decision support — all converge on the same operational question: can you prove which data, under which consent, fed which algorithm? Kirimana makes that question answerable.
- Electronic Health Record (EHR) data exports to a research warehouse without a clear consent boundary; a Data Protection Officer (DPO) finds out through a complaint.
- AI-assisted diagnostic models are trained on data classifications nobody can reproduce six months later.
- Clinical trial integrity (Good Clinical Practice / GxP) requires a chain of custody from consent form to published result; today that chain is reconstructed manually for each audit.
- Hospital information-technology budgets won't support a per-seat governance product on top of the EHR licence.
An analyst writes a query that joins data classified as restricted (containing identifiable clinical attributes) with data classified as confidential research output. Without Kirimana, the join runs and a Data Protection Officer is notified weeks later, if at all. With Kirimana, the AI policy gate and the contract's classification refuse the join at apply-time, the lineage shows the upstream sensitivity, and a second query is written within the boundary. Once the clinical-tier layer and consent-expiry fields ship in the spec, the same enforcement extends to the ethics-approval lifecycle.
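An apply-time gate of that kind can be sketched as a classification check that runs before any join executes. The labels follow the scenario; the ranking, the `gate_join` helper, and the exception are assumptions for illustration, not Kirimana's actual policy engine:

```python
# Hypothetical classification ordering, lowest to highest sensitivity.
RANK = {"public": 0, "confidential": 1, "restricted": 2}

class PolicyViolation(Exception):
    pass

def gate_join(left_class: str, right_class: str, allowed_max: str = "confidential") -> str:
    """Refuse the join at apply-time if any input exceeds the permitted tier."""
    highest = max(left_class, right_class, key=RANK.__getitem__)
    if RANK[highest] > RANK[allowed_max]:
        raise PolicyViolation(
            f"join refused: {highest} data cannot enter a {allowed_max} workspace"
        )
    return highest  # the result inherits the highest input classification

# The analyst's first query is blocked before it runs...
try:
    gate_join("restricted", "confidential")
    blocked = False
except PolicyViolation:
    blocked = True

# ...and a second query within the boundary is allowed.
result_class = gate_join("confidential", "confidential")
```

Note the design choice in the return value: the join's output carries the highest classification of its inputs, so downstream lineage keeps showing the upstream sensitivity.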
Don't see your industry?
The capabilities are universal: owner-on-every-contract, classification-before-AI, lineage from goal to source. The regulators differ; the architecture doesn't. Talk to Kiri about your specific stack and obligations, or request early access and we'll work the mapping with you directly.