What is FASO

FASO is an independent organisation that produces reproducible, audit-bound observations about safety-relevant change in advanced AI systems. We are not a regulator, not a lab, and not a command centre. We do not issue directives. We build trusted visibility: what changed, how it was verified, and what confidence is warranted.

FASO combines technology and human review to turn scattered signals into something decision-makers can actually use. Our systems gather and organise evidence into provenance-bound records, then render it into clear, bounded visuals that show what changed, how confident we are, and what the evidence does not support. Human analysts then verify, interpret, and explain those displays, internally for disciplined analysis and externally in publication-safe form, so that labs, regulators, and safety bodies can understand an observation quickly without needing privileged access to raw data. The result is neutral, auditable visibility designed to improve decision quality under uncertainty.

Why FASO Needs to Exist

AI capability and deployment are moving faster than traditional institutional cycles. Decisions are increasingly made under uncertainty, with fragmented evidence, and with incentives that do not always align with public safety. FASO exists to reduce that uncertainty by turning scattered signals into disciplined, verifiable records and clear, bounded outputs that serious readers can trust.