On names
Recently I have been thinking about the importance of standardizing definitions and names in Science.
I was looking at the notion of regular category, which is the correct setting for defining relations. Unfortunately the definition of this important concept has not completely settled. A regular category is in the first place finitely complete. Beyond that, some papers assume the existence of all coequalisers, but most (correctly) prove the existence of coequalisers of kernel pairs from a factorization system into extremal epis and monos, plus the fact that extremal epis are preserved by pullback (Joyal).
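To make the contrast concrete, here is a rough sketch of the two formulations, in my own words rather than quoted from any particular paper. In the kernel-pair style, a category C is regular if (1) C is finitely complete, (2) the kernel pair of every morphism has a coequaliser, and (3) regular epimorphisms are stable under pullback. In the factorization style, one asks instead that (1) C is finitely complete, (2) every morphism factors as an extremal epi followed by a mono, and (3) extremal epis are stable under pullback. Neither version assumes that C has all coequalisers.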
Unfortunately what I just called extremal epis are sometimes called strong epis or even special epis. Peter Johnstone in his Elephant book (which has a rather good treatment of regular categories) introduces another concept, namely covers.
My point is that such a basic and important concept should not have a confusion of names or definitions.
However I really wanted to talk about a much more difficult problem, the naming of concepts in computer science.
The problem arises from two sources. One is that different forms of a concept have arisen in many different fields, and hence there are many existing names, usually with a restricted sense, and hence difficult to use in an extended sense without upsetting the existing use. The second problem is confusion - the concepts have not been sufficiently analysed, or a single name is used for several different concepts.
I could give a long list of examples. Let's take a few names and try to untangle them. The names I have in mind are "concurrent", "process", "system", "processor", "machine", "agent", "automaton".
Concurrent: the word appears to signify entities, or behaviours of entities (two different things already), which run together, at the same time. In fact, the word has come to refer not to entities but to behaviours, and to behaviours which do not run together but interleave.
Process: it is not clear in the literature whether process refers to an entity, or an entity in a state, or the behaviour of an entity. These are such fundamentally different things that they should not be confused (see the sketch after this list).
System: the word system is so vague that it is difficult to use - but in the end we have resorted to using it in a recent paper. At least it is clear that a system is an entity.
Processor: is clearly an entity but seems to refer to a hardware component, so is insufficiently abstract.
Machine: similarly.
Agent: seems to denote an entity but is used by many (e.g. Milner) to mean a process in a certain state (and what is a process?).
Automaton: seems like a good word for an entity, but a difficulty is that the word is either used in a very limited sense (by finite state automata theorists) or has too many different uses.
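To pin down the distinction running through these entries, here is a small sketch in Haskell. It is purely illustrative, and the type names Automaton, Configured and Behaviour are my own choices, not anyone's established terminology: the point is only that the entity, the entity in a particular state, and the behaviour of the entity are three different kinds of thing.

-- The entity itself: a state space together with a transition rule.
data Automaton s a = Automaton { initial :: s, step :: s -> a -> s }

-- The entity in a particular state (the "process in a certain state" sense mentioned above).
data Configured s a = Configured (Automaton s a) s

-- The behaviour of the entity: here, simply a collection of traces it can perform.
type Behaviour a = [[a]]

Using a single word such as "process" for all three hides which of the three is meant.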
One more word, rather different. The word "series" has long been used in electrical circuit theory to denote a system physically constructed as a sequence of components. It does not mean that the behaviour is sequential, although if the circuit contains pulses it may be. In at least one paper we used the phrase "series composition" to mean "communicating parallel composition", as it does in such circuits. I think we were mistaken to do so, because "series" certainly suggests "sequential", which we definitely did not intend.
Labels: category theory, computing