Methodology Provenance - Andrew Greene - ORCID: 0009-0003-7735-8000
Where This Work Came From
The AI safety conversation is dominated by people who came from academia or software.
Almost no one in it has spent time in environments where epistemic failure has
physical consequences: where a wrong document number causes a valve to be installed
backwards, where a missing revision causes a structural assessment to be run against
the wrong load case. This is the lineage that produced this methodology.
Non-Affiliation Disclosure
Ontological Engineering Pty Ltd is operated strictly outside corporate hours,
entirely independently of the author's primary employment at InEight. All research,
publications, and commercial activities of Ontological Engineering Pty Ltd are personal
and non-affiliated. The methodology, the intellectual property, and all associated
artifacts are the author's own. InEight has no involvement in, knowledge of, or
responsibility for any work published under this entity.
Document control at industrial scale is applied epistemology. Systems must know
what they know, record what they record, and refuse to assert what they cannot verify.
The chain of custody is not bureaucracy: it is the mechanism by which accountability
survives contact with complexity.
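To make that concrete, here is a minimal sketch of the principle in Python. Every name in it (ControlledDocument, read, verified_by) is illustrative, invented for this page rather than taken from any real document control system:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass(frozen=True)
    class ControlledDocument:
        # A record that carries its own chain of custody: identity,
        # revision state, and who verified it (all field names hypothetical).
        doc_number: str                     # what the record claims to be
        revision: str                       # which state of the record this is
        content: str                        # the engineering data itself
        verified_by: Optional[str] = None   # custodian who signed off, or nobody

        def read(self) -> str:
            # Refuse to assert what cannot be verified: an unverified record
            # surfaces as an explicit failure, never as quietly wrong data.
            if self.verified_by is None:
                raise LookupError(
                    f"{self.doc_number} rev {self.revision}: no verified custodian"
                )
            return self.content

The exception is the point of the paragraph above: a consumer either receives verified content or an explicit refusal, and accountability for the difference is recorded in the object itself.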
When generative AI arrived, the failure mode was immediately recognisable.
These systems have no document control. No chain of custody. No trip state:
no fail-safe condition the system drops into when a reading goes out of bounds.
They are the most authoritative-sounding information systems ever built, operating
without the safety architecture that any offshore engineer would consider baseline.
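What a trip state could look like around a generative system, sketched under the same assumption of invented names; Tripped, generate, and check_provenance are stand-ins for this illustration, not features of any existing model API:

    from typing import Callable

    class Tripped(Exception):
        # The safe state: an explicit refusal rather than an unverified answer.
        pass

    def answer_with_trip(
        question: str,
        generate: Callable[[str], str],           # e.g. a model call (hypothetical)
        check_provenance: Callable[[str], bool],  # True only if sources verify
    ) -> str:
        draft = generate(question)
        if not check_provenance(draft):
            # Trip: fail to a known-safe state instead of emitting an
            # authoritative-sounding but unverifiable claim.
            raise Tripped("provenance check failed; refusing to answer")
        return draft

The design mirrors an industrial interlock: the refusal path is first-class, not an afterthought.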
Ontological Engineering Pty Ltd exists to apply that discipline to AI systems.
The Right to Refuse methodology did not emerge from a literature review. It emerged
from six years managing the engineering information on a $54 billion offshore LNG
project, where the cost of epistemic failure is measured in safety incidents
and regulatory liability, not benchmark scores.