I hereby coin the term "Ptolemaic Code" to refer to software that appears functional but is based on a fundamentally incorrect model of the problem domain. As more code is generated by AI, the prevalence of such code is likely to increase.
1/7
Like the ancient Ptolemaic model of the solar system, which compensated for its fundamentally incorrect Earth-centred picture by adding complex "epicycles" to force its predictions to match observed reality,
2/7
this code passes all its tests and satisfies its specifications, yet is built on fundamentally flawed logic.
AI code generation, which relies on examples, will likely produce significant amounts of Ptolemaic Code.
3/7
It attempts to fit solutions from potentially very different domains together with corrective code "epicycles" until it satisfies its context, the user, and tests.
4/7
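To make the "epicycle" idea concrete, here is a minimal, hypothetical Python sketch (not from any real codebase): the core model is wrong, and each observed failure is patched with a corrective rule rather than the model being fixed.

```python
# Hypothetical "Ptolemaic Code": the core model is wrong (every month is
# assumed to have 30 days), and each bug report is answered with another
# corrective "epicycle" instead of a correct model of the calendar.

def days_in_month(month: int, year: int) -> int:
    days = 30  # flawed core model: all months have 30 days

    # Epicycle 1: the 31-day months, added after the first bug report.
    if month in (1, 3, 5, 7, 8, 10, 12):
        days = 31

    # Epicycle 2: February, added after the next bug report.
    if month == 2:
        days = 28

    # Epicycle 3: leap years, added after yet another bug report
    # (still wrong for century years such as 1900).
    if month == 2 and year % 4 == 0:
        days = 29

    return days
```

Each patch makes one more test pass while the underlying model stays wrong, so the next failure can only ever be met with another patch.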
This is a trap: while the code works within its parameters, its internal model is incoherent. When it inevitably fails, debugging or correcting from that incoherent basis leads to yet more "epicycles" being added, increasing the system's complexity and brittleness.
5/7
The Ptolemaic model of the solar system, though completely incorrect, allowed sailors to navigate the globe for centuries; its failure was not practical but explanatory. Ptolemaic Code behaves the same way, working until deeper issues arise.
6/7
Ptolemaic Code works, but when it breaks, it can only be patched, not made correct.
7/7