Towards an epistemological, model-based approach in Computational Law

A Russian version is available.

A December speech by Daniel Greenwood, Head of Computational Law R&D at MIT.

The speaker clearly outlines the objectives of Computational Law; on the other hand, I do not agree with how the boundaries of the discipline are defined. This is the neopositivist mainstream: bring everything down to language and semantics. Split laws into semantic atoms, then link them with logic, statistics, ontologies, and services; construct the Semantic Web and Linked Data. This has worked relatively well for years, but the pace at which it all moves forward can hardly be considered sufficient. Compare it with the recent advances in Deep Learning, which exploded about 10 years ago and is moving at an outstanding pace.
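To make the language-centric pipeline concrete, here is a minimal sketch of the "semantic atoms plus logic" approach: facts as subject-predicate-object triples, with hand-written inference rules over them. The predicates, entities, and rules here are my own hypothetical illustrations, not part of any real Semantic Web stack.

```python
# A toy "semantic atoms + logic" pipeline: a statute is atomized into
# triples, and explicit rules derive new knowledge from them.
# All predicates and entities below are hypothetical illustrations.

facts = {
    ("Alice", "age", 17),
    ("Bob", "age", 34),
    ("Contract1", "party", "Alice"),
    ("Contract1", "party", "Bob"),
}

def derive_minors(triples):
    """Rule: age(X) < 18  ->  status(X, minor)."""
    return {(s, "status", "minor") for (s, p, o) in triples
            if p == "age" and isinstance(o, int) and o < 18}

def voidable_contracts(triples):
    """Rule: party(C, X) and status(X, minor)  ->  voidable(C)."""
    minors = {s for (s, p, o) in triples if p == "status" and o == "minor"}
    return {s for (s, p, o) in triples if p == "party" and o in minors}

kb = facts | derive_minors(facts)
print(voidable_contracts(kb))  # {'Contract1'}
```

Every piece of knowledge here had to be manually translated from language into atoms and rules first, which is exactly the bottleneck the rest of this post argues against.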

My thesis is that equating text/language with knowledge is a weak epistemological stance. Language-centricity is exactly what holds us back here.

In the AGI field there is a branch of emergent systems, where the knowledge of an agent is embodied: not imposed in the form of restrictions/rules, but learned through feedback and practice in an environment, including communication. «The epistemology of emergent systems [is] shared consensual experiences among phylogenetically compatible agents.» (Vernon 2007) «Rules» or «laws» are therefore emergent characteristics, that is, explicit generalizations of these shared states. I am not an AGI optimist, but important technologies are spreading out from AGI in different directions; let's take them for free.

A stronger epistemological thesis is that for knowledge representation we need to move from the language-centric, text-based approach towards an NNN-centric, model-based one. This understanding is steadily reaching people even beyond model-based systems engineering (MBSE), where it is already the mainstream. But what stands behind this 'NNN', especially in domains with more than 4 dimensions, is the key issue. What do we replace language with in this formula?
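One way to read the text-based versus model-based contrast (my illustration, not the author's formalism): the same legal norm can live either as a sentence that must be parsed before a machine can use it, or as a typed model whose meaning is carried by structure and is directly computable. The schema below is hypothetical.

```python
from dataclasses import dataclass

# Text-based representation: the knowledge is locked inside language
# and accessible only through parsing/NLP.
norm_as_text = "A party under 18 years of age may rescind the contract."

# Model-based representation (hypothetical schema): the same norm as a
# typed object whose applicability is a computation, not a reading.
@dataclass(frozen=True)
class Norm:
    condition: str        # human-readable label only
    age_threshold: int
    consequence: str

    def applies_to(self, age: int) -> bool:
        return age < self.age_threshold

rescission_norm = Norm(
    condition="party is a minor",
    age_threshold=18,
    consequence="may rescind the contract",
)

print(rescission_norm.applies_to(17))  # True
print(rescission_norm.applies_to(34))  # False
```

The open question the post raises is what plays the role of `Norm` in general, i.e. what the 'NNN' is for domains where no obvious 4D physical model exists.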

In mechanical engineering, physical 4D models were chosen as the 'NNN', and kept engineers happy for the next 200 years. In «humanitarian» areas, people normally try to employ logical formalisms or statistical ensembles. But this is a palliative: both derive from language and carry the same limitations that language has. The time of these approaches is coming to an end.

The strong epistemological approach: move from language to pragmatically defined epistemes. Develop and use a computational epistemology based on an epistemological theory of attention, instead of a computational semantics based on a theory of language. Since jurisprudence is an environment of high variability and connectivity, computational support for such architectures must also scale well along these dimensions. Fact-oriented ontological systems are poorly suited to scale this way. Neural network systems, although they scale better horizontally, require special epistemic architectures to climb the generalization ladder. And no such architectures exist yet.

It is not possible to develop an epistemically centered, model-based system in the labyrinths of DOOM, where today's AGI prototypes practice and do reinforcement learning. Epistemically, this is an extremely poor environment. But the maze of legislation and contractual obligations is exactly the right fit. The resulting intelligence will still not be general but specific, and therefore more useful.
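A toy illustration of learning a «law» as an emergent regularity rather than a stated rule (entirely my sketch, with a hypothetical environment): the agent never sees the rule text, only penalty feedback from a contractual environment, yet its learned behavior ends up encoding the rule.

```python
import random

# Hypothetical environment: each step the agent pays on time or late;
# the (unseen) "law" fines late payment. The agent observes only the
# reward, never the rule itself.
random.seed(0)

def environment(action: str) -> float:
    return -10.0 if action == "pay_late" else -1.0  # fine vs. normal cost

q = {"pay_on_time": 0.0, "pay_late": 0.0}   # action-value estimates
alpha, epsilon = 0.2, 0.1                   # learning rate, exploration rate

for _ in range(500):
    # epsilon-greedy: mostly exploit, sometimes explore
    if random.random() < epsilon:
        action = random.choice(list(q))
    else:
        action = max(q, key=q.get)
    reward = environment(action)
    q[action] += alpha * (reward - q[action])  # simple value update

# The learned preference is an implicit, emergent form of the rule
# "do not pay late", which was never stated to the agent.
print(max(q, key=q.get))  # pay_on_time
```

Generalizing such implicit regularities back into explicit «rules» is the step that, per the post, still lacks an epistemic architecture.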
