Comments on ontologies

An article from Caverne des 1001 nuits.

Version of 22 December 2013 at 09:19 by 1001nuits (Talk | Contributions)

Some comments can be made about ontologies.

Comment 1: This is a big one. Ontologies are a data-centric approach, yet data is not absolute in itself: it is a consequence of the viewer's point of view.

Comment 2: There is a design problem in distinguishing concept from attribute: an attribute can itself be a concept. This brings us back to classic OO class-diagram design issues.

Comment 3: Ontologies are too generic a tool. There could be intermediate levels of abstraction.

Comment 4: This is a big one. Ontologies should be able to represent much more than classic OO concepts. It should be possible to generalize the "instance of" concept: A is an instance of B, which is itself an instance of C. Generally, in language, we can observe this kind of multiple level of instantiation.
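The chain of instantiation described above can be sketched with Python metaclasses, where each level is an instance of the level above. The names (AircraftKind, Airliner, concorde) are illustrative assumptions, not part of any standard ontology vocabulary:

```python
class AircraftKind(type):
    """Level 2: a metaclass whose instances are classes."""
    pass

class Airliner(metaclass=AircraftKind):
    """Level 1: a class, itself an instance of AircraftKind."""
    pass

concorde = Airliner()  # level 0: an object, instance of Airliner

# Each level is an "instance of" the level above:
assert isinstance(concorde, Airliner)
assert isinstance(Airliner, AircraftKind)
assert isinstance(AircraftKind, type)
```

Classic OO fixes the ladder at two rungs (object, class); this sketch shows a third, and the pattern could in principle be extended further.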

Comment 5: This is also a big one. A tree is one of the most ambiguous ways of representing knowledge. Knowledge is more of a graph of nodes with various kinds of links. In ontologies, trees are used to represent inheritance, but inheritance can be multiple. Inheritance is indeed a very strong connection between classes, and the tree view suggests the interpretation that things could be represented as a tree (which is not the case at all). This can be misleading. In practice, the use of the tree progressively degenerates: the temptation to use the tree anyway, for processing reasons, leads to a bad way of representing things.

Example: the aircraft tree and the painting.
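The tree-versus-graph contrast above can be made concrete with a minimal sketch. The node names (seaplane, watercraft, painting) are assumptions chosen to echo the aircraft example; the point is that a single-parent tree cannot hold a second parent or a non-inheritance link, while a graph of typed links can:

```python
tree = {              # child -> unique parent: the tree forces one choice
    "airliner": "aircraft",
    "fighter": "aircraft",
    "seaplane": "aircraft",   # its "boat" side is simply lost
}

graph = [             # (source, link_type, target): typed links, no limit
    ("airliner", "is_a", "aircraft"),
    ("seaplane", "is_a", "aircraft"),
    ("seaplane", "is_a", "watercraft"),   # second parent: impossible in the tree
    ("painting", "depicts", "aircraft"),  # a link that is not inheritance at all
]

def parents(node):
    """All 'is_a' targets of a node; the graph allows more than one."""
    return [t for s, link, t in graph if s == node and link == "is_a"]

assert parents("seaplane") == ["aircraft", "watercraft"]
```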

Comment 6: This is also a very big one. Ontology examples can change the perspective on the representation of the structure of things. The world is already standardized in many ways, and things have already been ordered in the past by authorities. Take the example of an ontology describing the French wine classification. This example will converge on the proper classification already standardized; very often, that classification was made for regulatory or standardization purposes. The ontology example can then be flawed, because it models not disorganized knowledge but already organized knowledge. In terms of language, it does not describe knowledge in the absolute sense of the term but a human-made hierarchy or classification. Indeed, such ontologies should not be classified as knowledge representation, because they already have their equivalent outside of the ontology world.

Comment 7: Ontologies cannot be absolute, because they only represent what is useful to a certain set of use cases. Indeed, many ontologies with many different structures can be valid knowledge representations.

Comment 8: There is a difference between concept and action. A concept is a class, and an action is an oriented link (as in a sentence: noun + verb). The role of links can hide further concepts, and this is where the ontology can become erroneous.
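The way an oriented link can hide a concept can be sketched as follows. The names (Pilot, flies, Flight) are illustrative assumptions; the technique shown is reification, turning the verb of a noun-verb-noun triple into a concept of its own:

```python
# A bare triple: two concepts joined by an oriented link (noun + verb + noun).
triples = [("Pilot", "flies", "Aircraft")]

# Reifying the link "flies" into a concept ("Flight") exposes what the
# bare link was hiding: the action can carry attributes of its own.
flight = {
    "concept": "Flight",
    "agent": "Pilot",
    "object": "Aircraft",
    "duration_hours": 2.5,  # an attribute the oriented link alone could not hold
}
```

If the modeler never reifies, every attribute of the action has nowhere to live, and the ontology quietly becomes wrong.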

Conclusion 1: OO modeling is richer than ontology descriptions because it adds the dynamic dimension.

Conclusion 2: Semantic language descriptions such as Archimate are much more powerful than the generic descriptions found in ontologies. Archimate introduces a supplementary level in the knowledge representation. The objective is not to do natural language => ontology; the objective is to do natural language (domain) => domain-specific semantic modeling => ontology.

Indeed, every item of the ontology is "typed", and every link between concepts is typed as well, with a grammar that is domain-specific. That is much more powerful and usable, because this two-step process makes it possible to restrict (or, mathematically, to "project") a domain onto a semantic representation of that domain. Patterns can then be found in the semantic representation, where some links are not possible because of the nature of the objects.
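The idea of a domain-specific link grammar can be sketched minimally. The element and link names below are loose assumptions in the spirit of Archimate-style layers, not its actual metamodel; the point is that the grammar makes some links structurally impossible:

```python
grammar = {
    # link type -> (allowed source concept type, allowed target concept type)
    "realizes": ("ApplicationComponent", "BusinessService"),
    "serves":   ("BusinessService", "BusinessActor"),
}

def link_is_valid(src_type, link, dst_type):
    """A link holds only if the grammar allows it between these two types."""
    return grammar.get(link) == (src_type, dst_type)

assert link_is_valid("ApplicationComponent", "realizes", "BusinessService")
assert not link_is_valid("BusinessActor", "realizes", "BusinessService")
```

A generic ontology would accept both links above; the typed grammar is the "projection" that rejects the second.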

This means that ontologies are too vague and too arbitrary, and allow too few normalized constraints, to be useful. Ontologies are less powerful (because less consistent) than UML models (which are themselves only a projection of reality onto a syntactic space).

Part to be done: knowledge representation is a projection of multidimensional graphs onto some mastered spaces.

Conclusion 3: Semantic multiple instantiation can be done in programming languages such as Lisp, which makes ontologies look like a very restrictive tool by comparison.