A Broken Parity
Somebody once said that a philosopher is a person capable of discovering an unacceptable complexity in the simple and generally accepted. Consider the following mathematical illustration, at the secondary-school level.
There is a notion of the partitioning of a set into a number of non-intersecting classes. That is, each element of the set is believed to belong to some class, but it cannot belong to several classes at once. One could fancy many such decompositions, which, in general, have nothing to do with each other.
There is yet another notion: a symmetric, transitive and reflexive relation could be defined on the same set, which is traditionally called an equivalence. Equivalent elements do not necessarily coincide; they are only treated as interchangeable under this particular relation. Of course, there are as many kinds of equivalence as one can invent, and one may be very unlike another.
Straight away, a school teacher proves that every partitioning of a set determines some equivalence, and conversely, the classes of mutually equivalent elements of a set necessarily provide a specific partitioning of it. In this picture, partitions and equivalences can be regarded as merely two modes of speaking about the same thing.
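For the reader who prefers the constructive mode of speech, here is a minimal sketch of the round trip (in Python; the toy set and the sample relation, remainders modulo 3, are invented for the illustration):

    # A toy universe and an equivalence: "has the same remainder modulo 3".
    universe = {0, 1, 2, 3, 4, 5, 6}
    equivalent = lambda a, b: a % 3 == b % 3

    # From equivalence to partition: collect the class of each element.
    classes = {frozenset(x for x in universe if equivalent(x, y)) for y in universe}
    # classes == {{0, 3, 6}, {1, 4}, {2, 5}}: non-intersecting and covering

    # From partition to equivalence: two elements are equivalent
    # iff some class contains them both.
    def same_class(a, b):
        return any(a in c and b in c for c in classes)

    assert all(equivalent(a, b) == same_class(a, b)
               for a in universe for b in universe)

The round trip restores exactly the relation we started from, which is what the school proof asserts.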
At that peaceful moment, a philosopher breaks in and immediately starts to wonder whether any set at all can be represented as a number of non-intersecting classes, and whether any set allows for a globally defined equivalence. For, if the notions are not universally definable, there may be doubts as to the admissibility of their comparison, to say nothing of any formal proofs.
The questions are phenomenally stupid. If we have deliberately composed the original set as we like, who can prevent us from picking out a few arbitrary elements to produce a subset? Then, everything that does not belong to this subset will form its complement, so that the subset taken together with its complement would provide a constructive example of a formally correct partitioning. Just the same holds for equivalence: take a few elements, call them equivalent by definition, and let the rest be treated as equivalent among themselves, but not equivalent to any of the chosen few. That is, at least two equivalence classes can be constructed in any case; more sophisticated constructs are easily introduced using the three magic words: and so on... Or the three magic letters: etc. Or just three dots at the end of a phrase.
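In the same constructive spirit, the whole argument fits in a few lines (a sketch; the particular elements are, of course, arbitrary):

    S = {1, 2, 3, 4, 5}
    A = {1, 3}                    # a few arbitrarily picked elements
    partition = [A, S - A]        # the subset together with its complement

    # The corresponding two-class equivalence: two elements are
    # equivalent iff they fall on the same side of the split.
    equivalent = lambda a, b: (a in A) == (b in A)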
The sour look of our philosopher indicates that he cannot get the point and remains utterly perplexed. "Good gracious!" he pleads, "just think: non-belonging to one set does not mean belonging to any other. If this particular object is not a nail, it does not need to be a screw! There are lots of other kinds of ironware, like pins, tacks, bolts, clinchers, or anything at all that we may be unaware of until it eventually comes our way. Claiming that there is nothing in the original set that would not belong to either the chosen subset or its complement, you implicitly assume that the set is already split into two non-intersecting parts, in which case the sought-for partitioning certainly does exist! This, however, does not exclude any less trivial occurrences."
In absolutely the same manner, the possibility of explicitly demanding the equivalence of some elements of the base set in no way implies the mutual equivalence of the rest. The usual "proof" takes for granted that the non-equivalence of two elements to a given element different from them both means their mutual equivalence. Isn't that a somewhat too strong assumption? Its point is to establish an interdependence between two quite different relations, which may only hold under rather restrictive conditions that need to be properly explicated; in fact, once again, you implicitly postulate what you are going to deduce.
Consider a very simple example. Let us admit that, in our theory, only the open balls of some space can be taken for sets; obviously, none of these sets can be split into any number of non-intersecting sets in the same topology (an open ball is connected, and hence it is never a disjoint union of two or more nonempty open sets); moreover, even a finite (or countable) cover may not always be possible. To introduce partitioning, one needs to extend the original universe, which may lead to a quite different mathematics. Similarly, any equivalence taken as it is (as a set-theoretic relation) does not refer to any other relations (including any non-equivalence). We can always decide whether two particular elements of the set are equivalent or not, and all the rules of equivalence will perfectly hold, so that each of the mutually equivalent elements could represent the same equivalence class; but, in general, we do not know whether there is a collection of representatives representative enough to allow the corresponding classes to cover the whole original universe.
This is what elementary logic has to say. In practice, we can always restrict ourselves to the "right" sets behaving as expected, to ensure the validity of deduction. Still, such "proofs" can hardly be very convincing, as all we can do is formally construct the objects of a given class, stretching the theory toward the desired result.
Is there anything to blame? As a matter of fact, all the sciences do exactly the same, trying to adapt the inner structure of the science to the natural phenomena. Demystified (objective) mathematics is to take its place among the other sciences, instead of remaining a sort of religious revelation; this will make it much friendlier and readier to accept all the signs of our profound esteem.
Well, certain consequences may seem less comfortable to a working mathematician. For instance, the much-worshipped proofs by contradiction will no longer be treated as rigorous proofs, but rather as convenient heuristics, the public motives of important decisions, whose truth is yet to be sought elsewhere. Indeed, even provided we have found what this particular thing is not, we still have to guess what it really is. Negative definitions constitute an important stage of any cognition: they are a kind of search activity, an overall orientation. However, since all the other modes of deduction become entirely conventional and do not lead to anything but working hypotheses, there is nothing tragic in this inherent insufficiency at all: we do what we do and produce what we can, while it is up to the practical implementations to put things right and eternalize the truly valuable.
A mathematical theory grows like any other: the empirical considerations outline a certain class of phenomena to cope with (the object area, the "universe"); we decide which features of the objects should be of primary importance in this context, while all the rest is to be somehow (in as formal a manner as possible) related to that basic foundation. That is, there is no goal of "proving" anything, or of being convincing enough; a theoretician is just to relate one thing to another, to bring things together, rather than deduce. When the object area is structured enough, it may be quite admissible to reason by analogy, turning the already found regularities into formal schemes allowing us both to predict something on the basis of the available facts (thus putting forth a hypothesis) and, conversely, to suggest a range of formal solutions for a given practical problem (the activity of justification). This is much like the way a physicist plans a series of experiments or, say, draws conclusions about the inner structure of a quantum system from the observable spectrum. Well, the same can happen to anybody in everyday life: finding a crack in the wall, we envision the possibility of further destruction and try to collect the tools and materials necessary to patch it up. It is self-understood that some happenings won't lie in line with our theory, and we'll need to consider a different methodology.
The dry residue: the traditional mathematical approach is quite useful in circumstances similar to those that led to the present state of affairs; the object area is assumed to be broad enough, as we position ourselves far from its natural boundaries. However, since no scheme can be absolutely universal, a grain of humor may serve us well in assessing the results: normally, we obtain what we intend to obtain, while nothing prevents us from getting anything else when things change.
Now, let's get back to partitions and equivalence. The school attitude employs the principal postulate of modern mathematics: what can be built has already been built. Mathematics does not distinguish actual existence from mere possibility. In particular, any set is prepared for us beforehand, just patiently waiting for a glimpse of our attention and a condescending couple of words. This means that no set operations are capable of producing new sets: all they do is relate one existing set to another. Thus, speaking of a union, or an intersection, we only express the fact that two sets (from some static universe) are related to another set; similarly, the idea of a subset is to convey a special mode of linking one set to another. All such bindings have long since been established, and we treat them in a static manner, as ready-made. Formally, such structures are introduced using the intuitive (undefinable) construct of an "ordered pair", so that any relations between sets can be represented by sets of ordered pairs.
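A sketch of this static representation (in Python; the particular universe and pairs are invented): the relation is not computed but simply read off a ready-made set of ordered pairs.

    # A relation on a static universe, given as a set of ordered pairs.
    U = {'a', 'b', 'c'}
    R = {('a', 'a'), ('b', 'b'), ('c', 'c'), ('a', 'b'), ('b', 'a')}

    # The axioms of equivalence, verified against the ready-made pairs.
    reflexive  = all((x, x) in R for x in U)
    symmetric  = all((y, x) in R for (x, y) in R)
    transitive = all((x, w) in R
                     for (x, y) in R for (z, w) in R if y == z)
    assert reflexive and symmetric and transitive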
As one can easily observe, the union (or intersection) of two sets is not always conceivable, since this possibility depends on the presence of a set in the same universe that could be associated with the union (or intersection). For instance, in the above universe of open balls, the union of two balls is only definable when one ball is embedded in the other; the union of two non-intersecting (or partially intersecting) balls will no longer be a ball. The same holds for the intersection. In this case, for any embedding, the union and the intersection are to select the outer and inner balls respectively: they become operations of projecting an ordered pair onto one of its components (similar to arithmetic maximum and minimum).
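A sketch of such a restricted universe (in Python; planar balls and the containment test are assumptions made for the example): the operations are partial, and outside the nested case there is simply nothing to return.

    import math
    from typing import NamedTuple, Optional

    class Ball(NamedTuple):   # an open ball in the plane
        x: float
        y: float
        r: float

    def contains(outer: Ball, inner: Ball) -> bool:
        # inner lies within outer iff the distance between the centers
        # plus the inner radius does not exceed the outer radius
        return math.hypot(outer.x - inner.x, outer.y - inner.y) + inner.r <= outer.r

    def union(a: Ball, b: Ball) -> Optional[Ball]:
        # defined only for nested balls: project the pair onto the outer component
        if contains(a, b): return a
        if contains(b, a): return b
        return None           # not a ball: inconceivable in this universe

    def intersection(a: Ball, b: Ball) -> Optional[Ball]:
        # defined only for nested balls: project the pair onto the inner component
        if contains(a, b): return b
        if contains(b, a): return a
        return None

    assert union(Ball(0, 0, 5), Ball(1, 0, 2)) == Ball(0, 0, 5)
    assert intersection(Ball(0, 0, 5), Ball(1, 0, 2)) == Ball(1, 0, 2)
    assert union(Ball(0, 0, 1), Ball(3, 0, 1)) is None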
Consequently, the theoretically deducible properties of sets are entirely dependent on the structure of the universe, so that all the sophisticated theorems merely reconfirm what we have already assumed by the very choice of the object area. In particular, the existence of a partitioning is out of the question if its components do not belong to the original universe.
On the other hand, since any relations are formally defined as sets, the proof of the close correspondence between partitioning and equivalence is a sheer tautology: the validity of a static notion of equivalence is based on the assumption that the corresponding classes have already been built.
Yet another traditional approach starts with the algebraic shift of focus from the objects to the possible modes of operation. The background universe (a configuration space) is still involved; however, in general, it does not need to be a set, and all we demand is the existence of elements combinable through a number of (informal) manipulations. We do not care what an element, or an operation, exactly is; it is quite enough to consider the overall features of this manipulative activity as the only available deductive scheme. Just for convenience, it is often assumed that the operations are defined for all elements of the universe; what does not suit us is to be deliberately put aside. In the presence of several operations, their domains may differ; still, it is always possible to restrict the theory to the commonly definable, possibly allowing for a few singular points. Such theories are known as complete. Obviously, there is nothing to expect in the outcome beyond sheer tautologies. Why not? One is free to play with void forms for one's personal entertainment.
Algebraic structures tend to glide smoothly towards the idea of representing operations with elements, and elements with operations; the usual set-theoretic methods then work there in full. One of the fundamental tricks in this approach is to postulate the actual existence of any domain. For instance, each element invokes all the elements involved in its production via a (presumably) fixed combination of operations (a function). On the assumption that all the possible functions have already been evaluated, such prototypes are readily joined into a static object, an equivalence class. The built-in completeness immediately entails the existence of covers or partitions.
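A sketch of the trick (in Python; the function is an arbitrary stand-in): the prototypes of each value are joined into a static class, and an everywhere-evaluated function automatically covers the universe with such classes.

    # Group the universe into the "fibers" of a function: for each value,
    # the class of all the elements involved in its production.
    universe = range(10)
    f = lambda n: n % 4            # a fixed combination of operations

    fibers = {}
    for n in universe:
        fibers.setdefault(f(n), set()).add(n)
    # fibers == {0: {0, 4, 8}, 1: {1, 5, 9}, 2: {2, 6}, 3: {3, 7}}

    # Since f has been evaluated on every element, the classes cover all.
    assert set().union(*fibers.values()) == set(universe)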
Some algebraic structures introduce a kind of partitioning from the very beginning. For instance, the bulk of the abstract (informal) elements can be complemented with an already known algebraic structure (or a set). This leads to discriminating functions by the type of the object returned: basic operations always return an element of the "syncretic" part of the universe, while some functions evaluate to a "number" (an element of the structured field). This does not change anything in principle. The admissibility of treating the domain of a function as a class is a very strong assumption that virtually "preinstalls" all the deducible properties in the theory.
An object-bound mathematics would consider definite (though not always formally definable) kinds of abstract objects, aiming at an explicit description of the possible regularities (links) as a sort of interpolation scheme for concluding on the properties of anything constructible in compliance with the rules of the game. In the static case, all the answers are implicitly presupposed. There is no significant difference between statements of the various types, and no reason for obstinately preferring some schemes to others. The logic of a theory is to comprise all the diversity of formal tricks, and any level of deduction is to be built upon all the rest, since it is only in their integrity that the object of the science is expressed. Thus, the usual habit of operating with "logical" junctions (not, and, or...) in no way means that this "symbolic" logic could exist by itself, producing some "pure" knowledge regardless of the object area. In each case, we only try to control our objective acts by bringing them under familiar forms; this initiates an activity of reinterpreting the abstract logic in terms of object manipulations, which drives us to rearrange the object area to comply with our behavioral stereotypes. Some objects will not fit the template; in that case, we modify the scheme, effectively replacing one logic with another. Fortunately, the continuity of scientific development rarely comes to any radical revision, and many techniques get productively employed in the course of several millennia (albeit with innumerable refinements and specifications). Still, the search for new paradigms may at times be of crucial importance.
Our cognition develops from syncretism, an overall impression and intuitive judgment, towards all kinds of distinctions, and then to attempts at recovering the former entirety, reconstructing the whole from the diversity of disparate components. The modern culture, with its market economy based on the universal division of labor, won't allow the synthesis to be completed, and science mainly turns around grouping and grading, with class inequality as a generic theme. Hence all those decompositions and partitionings in mathematics; we need equivalence just to stress the differences, and all the significant features are normally introduced with respect to the quotient sets, that is, for classes rather than elements. What is incompatible with any ranking is outside science (and the civilized society as well). The world of the future, presumably inhabited by perfect singularities, will require a different mathematics. Any alliance is to become transient, relative, limited to the demands of a specific problem, while extensive task swapping will cause frequent rearrangements in the object area. This is one of the determinative principles of the hierarchical approach.
As earlier indicated, the object-bound attitude implicitly lies at the foundation of any formal universe, which is absolutely necessary to reconcile the theory with practical needs. The next step to take is to abandon any "final" definitions at all, admitting a "natural" diversity never subject to any universal formalization. At any rate, the very idea of formalization implies a partial treatment of many special cases. A true science is not to invent theories of everything; it must be capable of producing a straightforward and minimally complicated theory for each particular aspect of the object area, with a commendable soberness of mind preventing us from exaggerating our ingenuity.
In the example of sets and equivalence, the bent for complete classifications (partitioning a set into non-intersecting classes) might yield to the discussion of various scales, collections of zones, with the elements of the universe treatable as relatively equivalent within each zone. Why relatively? Because every scale is hierarchical, with the scales of different levels never reducible to each other. Equivalence on one level may well account for qualitative distinction on another. For example, in music, the same note can be pitched a little higher or lower (within the same pitch zone), depending on the musical context and the artistic sense; similarly, in dancing, the same typical figures allow for quite different arrangements. Well, this is what makes the essence of art; still, in science, there are many levels of formalization, and one cannot easily tell a fundamental theory from a semiempirical model.
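A sketch of zonal equivalence (in Python; the cent measure and the zone width are conventional assumptions made for the example): within a zone, two pitches count as the same note, although they remain distinct on a finer level of the scale.

    # Pitches measured in cents; each nominal note owns a zone 100 cents wide.
    def zone(pitch_cents: float) -> int:
        # the level of the scale: which nominal note this pitch falls into
        return round(pitch_cents / 100.0)

    a, b = 398.0, 412.0           # two intonations of the "same" note
    assert zone(a) == zone(b)     # equivalent on the zonal level
    assert a != b                 # yet distinct on the finer level

Note that a raw tolerance relation ("within 30 cents of each other") would not even be transitive; it is the zoning that restores a well-defined equivalence on each level.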
No scale can embrace everything; still, the uniformity of the universe admits hierarchies unfolded in full starting from any individual element. This uniformity is quite like the inner symmetry of the scale, the equivalence of the elements within a zone. Normally, any "outer" symmetry means the existence of a level of the scale where this invariance comes up as zonal equivalence. In this way, the scale can be uniformly extended to the whole object area; completeness is thus restored, on a hierarchical basis.
Every time we consider a subset of a given set, a static scale is really meant. It comes to the neighborhood of a given element, its close enough vicinity, while the rest is beyond the limits of measurability. Another subset will determine a different scale. To relate one subset to another, we have to introduce yet another level of hierarchy containing objects of a new kind, associated with some inner hierarchy. Thus, in music, a harmony can be treated as a hierarchical structure, a union of several zones; on the other hand, polyfunctional notes exhibit an analog of set intersection. Such "local" hierarchies are not generally reducible to the incident elements and scales. It is only in a static context that a set becomes the collection of all its elements, while an element can be defined as the collection of all sets containing it. A subset and its complement are defined through each other in exactly the same way. The fundamental logical problem is to somehow restrict the essentially non-formalizable notions of all and everything; the many variants of such specification lead to quite distinct theories.
When it comes to creating and destroying mathematical objects, the issues of consistent formalization come to the fore. For instance, people keep being born and passing away, and there is a set of living people at every moment (in some conventional scale); however, there is no static set to contain every person at all, as there are at least those who have not yet come to life. Hence any theories of universal values (including the universality of mathematics) are beneath all criticism. Other object areas reveal similar peculiarities. Thus, electrons may form a Fermi gas; but, under certain conditions, they get bound into Cooper pairs, forming a Bose condensate, a state of an entirely different symmetry. Our notions of food and dwelling are historically mutable, which makes the sets of culinary ingredients and construction materials just as flexible. In the same line, what can prevent us from considering algebraic structures where the admissibility of operations depends on the elementary composition of the base, and where the elements themselves may change in the course of certain operations? What shall we take for the analogs of partitioning and equivalence in that case?
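Returning to the first example of this paragraph, here is a sketch of a time-bound set (in Python; the records are invented): membership is only defined relative to a moment, and no single snapshot contains every person at all.

    # Each person exists between birth and death (None = still alive).
    people = {
        'Anna':  (1920, 1999),
        'Boris': (1950, None),
        'Clara': (1985, None),
    }

    def living(year: int) -> set:
        # the set of living people exists only relative to a moment
        return {name for name, (born, died) in people.items()
                if born <= year and (died is None or year < died)}

    assert living(1960) == {'Anna', 'Boris'}
    assert living(2000) == {'Boris', 'Clara'}
    # No static set collects "every person at all": those not yet born
    # are absent from every snapshot.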
Jan 1983