The cult of skepticism

Part 6

Missing the real core of Science

We reviewed some usually unrecognized contradictions between the views of diverse skeptics behind their naturalistic front, about the precise kind of fundamental reality they believe in: a large number of them worship science, and especially physics. We shall now dig further into their contradictions by analyzing their attitude towards the God of their God: while the God of Naturalism is Physics, the God of Physics is Mathematics. Yet much of Naturalism leans towards a disdain of Mathematics (as illustrated by the scarcity of mathematicians in skeptical organizations).

First we need to explain how Mathematics is the God of Physics, and which mathematics we are talking about. Mathematics is generally the only way to form a clear concept of a law of regularity, unaffected by fantasies about how any given regularity should be interpreted in each case. So it is essentially required as a cornerstone of naturalistic expectations.
Then, I would roughly divide mathematics into two kinds: the "low" and the "high" styles of mathematics. This division is of course not strict. Different criteria can be considered for this distinction; they are not always equivalent, but are interestingly correlated.

One criterion is to label as "low" the finitistic mathematics (which deals with finite systems only), in other words the mathematics of algorithms. The "high" mathematics is then the kind which involves infinite or continuous systems.

Another criterion is to label as "high" the mathematics of symmetric systems, and as "low" that of asymmetric ones. A correlation between both criteria can be illustrated by the following example, which is actually a toy model of mathematical facts that play an effective role in theoretical physics.

It is possible to approximate a circle by a regular polygon with a large number of sides. This approximates the continuous symmetries of the circle by the still numerous symmetries of the regular polygon. However, when the circle is replaced by a sphere, this possibility breaks down: the 3D analogues of regular polygons are the Platonic solids, which are very few. So the symmetry group of the sphere cannot be approximated by any group of exact symmetries of a large but finite system: whatever system you take in trying to approximate a sphere, as soon as more than a rather small number of spherical symmetries act on it, infinitely many more of them still do, and indeed essentially all of them. When symmetry matters, no small escape from continuity is mathematically conceivable.
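This contrast in symmetry counts can be made concrete with a small sketch (the numbers are the standard group orders, stated rather than computed from geometry): the symmetry group of a regular n-gon grows without bound as n increases, while the symmetry groups of the five Platonic solids are capped at a fixed size.

```python
# Order of the symmetry group of a regular n-gon (the dihedral
# group D_n): n rotations plus n reflections.
def dihedral_order(n: int) -> int:
    return 2 * n

# Orders of the full symmetry groups (rotations and reflections)
# of the five Platonic solids -- standard values, fixed forever.
PLATONIC_SYMMETRIES = {
    "tetrahedron": 24,
    "cube": 48,
    "octahedron": 48,
    "dodecahedron": 120,
    "icosahedron": 120,
}

# A polygon can approximate the circle's symmetry arbitrarily well...
print(dihedral_order(1_000_000))          # 2000000 symmetries
# ...but no polyhedron can do the same for the sphere:
print(max(PLATONIC_SYMMETRIES.values()))  # caps at 120
```

So beyond 120, any further exact spherical symmetry of a finite system forces infinitely many, which is the point of the example above.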

Daringly, a third criterion is sociological: low-style math is the math of non-mathematicians, for whom math is "just a tool", while high-style math is that of mathematicians, who see math as their home and as a reality in its own right.

In these terms, the large anti-mathematical trend I mean to point out among naturalists consists in dismissing the value of high-style mathematics, essentially for its touch of mysticism, and in only accepting the legitimacy of low-style mathematics. Or rather, since it may look bad to make this position explicit, many of them express it in a completely implicit manner, by giving descriptions of mathematics or physics as if low-style mathematics were the only mathematics that existed, or sufficed to express physics.

However, the known laws of physics belong to high-style mathematics. This is what I see as actually meant by the famous Unreasonable Effectiveness of Mathematics in the Natural Sciences (wikipedia - a page I wrote with notes and references on the debate).
Physicists did not seem to expect this; but why should it be so? I can anyway offer the following clue as to why high-style math is likely to be more suitable than low-style math for expressing a law of Nature: high-style math is needed for a law to meaningfully serve as a law of Nature, because only that style allows a law to carry its own necessity (a virtue of logical unbreakability).

Indeed, low-style mathematical laws would be unable to carry their own necessity:

On the other hand, high-style laws are more likely to provide their own necessity. In particular, the concept of a continuous transition of the state of a physical system has a certain virtue of carrying, by continuity, the necessity of the final state from that of the initial state. (Some people will object that quantum mechanics denies continuity, as it only recognizes locally finite numbers of states. In a sense this is true; however, this discreteness is only an appearance in the measurement results, which do not themselves belong to the laws of quantum mechanics, as explained below. The law of quantum mechanics in itself is fundamentally continuous, as manifested in the continuous values it provides for the probabilities.)

Now, for anyone who has actually studied the laws of physics, their character of intrinsic necessity of continued validity appears quite impressive. A striking example is the conservation of energy. Naive expressions of this law present it as a postulate, coming without a reason and therefore questionable. In General Relativity, however, it turns out to be a theorem of geometry, once energy is seen as defined from space-time geometry by Einstein's field equation, regardless of the details of this geometry (only assumed to be that of a pseudo-Riemannian manifold with signature (3,1)).

Concretely, here is how this fact answers the people who imagine the conservation of energy as a postulate: consider a universe A with a certain distribution of energy at a given time t. Can we imagine that at just the next time t+dt, the energy appears to vary at one given place, in violation of the conservation of energy, everything else being the same? We can proceed by figuring out a different universe B which at that time looks very similar to A, with essentially the same distribution of energy, except for a different concentration of energy at that precise place. Now how can we transition from universe A to universe B? Of course we might proceed just by destroying the universe A at time t, so that it no longer exists from then on, then creating the universe B to start existing only from that same time, as if it were old while it isn't. However, that may or may not satisfy us, for the following reason: once we have taken our big scissors to cut the space-times of A and B along hopefully "the space-like slices marked by the same time all over the place", and kept just the past side of A and the future side of B, we still need a way to somehow glue both pieces together. And this is where we run into trouble. Even allowing ourselves to do it completely arbitrarily, it is just mathematically impossible to do it without violating the laws of physics not just at one place, but more or less everywhere. Concretely, any energy added at one place must come from somewhere else.

That would not even be much of a problem from a low-style mathematical perspective: seeing the laws of physics as a computation, the cosmic cut-and-glue process, no matter how dirty, can just be done by switching off the cosmic computer (Doomsday), modifying its memory as much as we need, and then restarting it all Last Thursday. By the way, if consciousness were a computation, as naturalism claims, then for the same reason there would be no way to ensure the validity of our own thoughts (especially our memory), which would condemn us to skepticism and hyperbolic doubt. Indeed, skeptics are really subscribing to hyperbolic doubt when, faced with the case of Pam Reynolds (who had OBE perceptions while her brain was totally inactive), they put forward the possibility for the brain to build up memories of perceptions that were never real in order to fill the gaps of a given scenario. This idea that memories can be, and even sometimes did happen to be, completely made up with all their vividness about experiences that were never real, is indeed a logical consequence of the hypothesis of the material nature of memory, and has no essential difference from Last Thursdayism.

The conservation of the electric charge has the same necessity as the conservation of energy. Maybe not all other physical laws have the same spectacular character of necessity, yet it remains hard to mathematically conceptualize, in quantum field theory, any process leading to some outcome which the laws would forbid. A semi-exception to this is given by the "wavefunction collapse", which we shall comment on below.

That is why and how the laws of physics belong to a high style of mathematics and need to be accepted as such. This does not contradict the possibility of re-expressing them algorithmically, which is indeed another requirement for a mathematical theory to qualify as a physical law: it must be possible to effectively compute its predictions. Yet there remains a deep structural difference between the law in its high-style formulation and any algorithm which can produce its predictions. Namely, the former looks much more elegant than the latter.

But what does it mean for some mathematics to be elegant? Interestingly, this is precisely a typical example of an essential topic of disagreement between naturalism and supernaturalism: the status of qualia.

A famous illustration is given by Mary's room thought experiment: is a sensation fully explained by the physical description of its neuronal stimulation pattern, or is there something more to learn by having this sensation as one's own personal experience? In other words, is a sensation, such as that of a color, anything more than the pattern of neuronal stimulations which forms it in the brain? This question may sound hard to figure out, because we do not have at hand a ready scientific description of the neuronal stimulation patterns of sensations.
Another famous example of the same question is "What is it like to be a bat?".
Yet another example of, I would say, roughly the same question is so much easier to actually experiment with that probably most people have already done it. Here it is: the difference between a melody and its score. Someone can have learned music theory, read a music score, know in principle how it should be played on the piano, be familiar with the sound of the piano and of each note, and yet not figure out how nice the written melody is until actually hearing it. Of course good musicians can manage to read a music score in the sense of playing it in imagination and thereby effectively feeling it, but that takes special training or effort. I even once watched on a screen the display of the sound analysis of a piece of music I was hearing. I am not sure how faithful this display actually was, or whether it lost many fine crucial aspects of the sound, but I was very puzzled to experience how hard it seemed to figure out the connection between what I heard and what I saw.
So, when perceived by the right senses, a structure can induce a feeling (qualia, subjective appearance), that is, something more than the sum of the parts the structure was made of. Yet this feeling is not something complex like the structure was, but something simpler, and of a completely different nature.

Now the point of this reminder is that I see a very similar phenomenon crucially concerning the core sciences: mathematics and physics. Mathematics is much more than the sum of its theorems, and the laws of physics are much more than the sum of their experimental predictions. And the way they are more is not anything more complex, but something much simpler than the parts they are made of, and at the same time something transcending these parts: they make sense. In other words, I find there is a special qualia of what it is like to be a mathematician, and what it is like to be a physicist. There is a qualia of the activity of exploring high-style mathematics, which may be more or less lacking in that of low-style mathematics. Metaphorically speaking, the laws of physics are written in symbols of color, whose qualia we need to feel in order to fully understand these laws, even though, strictly speaking, no qualia exists among physical objects, the laws of physics, or generally any mathematical theory or entity.
Of course I cannot prove these things, since it is generally impossible to prove the existence of some qualia to those who do not happen to perceive them. For this reason, even those who perceive them may remain doubtful, whether challenged to justify them in debates, or even for themselves.

Now this is an important dimension of the opposition I see between Skepticism and Science: skeptics ignore and even despise the heart of science. For the core of the sciences is math and/or physics, the core of physics is mathematics, the core of mathematics is its high-style fields (some of which are also the part of math that matters for physics), and the core of these high-style fields of math is the qualia we can experience by studying them. And that qualia is a general light of scientific understanding which transcends all particular scientific facts or methods. Skeptics with their favorite methods miss that completely. Mathematical concepts and theorems are neither proven by double-blind randomized testing nor falsifiable, and yet they perfectly belong to Science.

An important qualia of mathematics is the sense that mathematics forms its own reality. This is usually called mathematical Platonism. Unfortunately, the usual attempts of philosophers to describe it are quite terrible, usually made of claims which I would dismiss as category mistakes rather than as meaningfully true, false or debatable. So, let me offer my own formulation of mathematical Platonism. Actually I would split it into two theses which may be considered independently of each other.

The second point can actually be demonstrated by giving an explicit description of this architecture, namely the tandem of set theory and model theory, which the main focus of my work has been to clarify precisely. A crucial point is the completeness theorem and its proof, which for any consistent theory expressed in first-order logic (= "classical logic") provides an arithmetical (but not algorithmic) construction of a system it describes. In short, consistent description provably implies mathematical existence.
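For reference, the standard statement of that theorem (in its semantic form, for a first-order theory $T$) can be written compactly; the model whose existence it asserts is built, in Henkin-style proofs, from the syntax of $T$ itself, which is why the construction is arithmetical without being algorithmic:

```latex
% Gödel's completeness theorem, semantic form:
% every consistent first-order theory has a model.
T \nvdash \bot
\quad\Longrightarrow\quad
\exists\, \mathcal{M}\ \text{such that}\ \mathcal{M} \models T
```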

This second point, once made, supports the credibility of the first point, as there is no less reason to believe in the mathematical reality than in the physical reality, once it is found that mathematics has all the qualities in the name of which the physical universe is usually accepted as real (unless I missed something). Moreover, the "unreasonable effectiveness of mathematics", with the observed crucial role of high-style math in theoretical physics, supports mathematical Platonism. Yet this support may not be very clear because, inside the world of mathematics, the precise part which provides what I will call here the native mathematical ontology is the tandem of set theory and model theory; it is not the same as those parts of mathematics which matter in physics, so that both ontologies (the mathematical and the physical) do not coincide. This gap will be further commented on below.

But the validity of mathematical Platonism, whether with the native mathematical ontology or in some variant more suited to physics, remains controversial among both philosophers and physicists. For example, Peter Woit strongly supports the value of high-style math, while Carlo Rovelli and Lee Smolin oppose mathematical Platonism by denying high-style math. (Both Rovelli and Smolin work on Loop Quantum Gravity, a tentative approach to quantum gravity based on a kind of discretization of space-time, though from what I saw on Wikipedia it seems unclear whether this approach has succeeded at anything at all.)

Along such lines, here is an excerpt of the debate:

Actually he is the one who understood nothing: what low-mindedness of him to imagine that I need his explanations for such trivialities! Of course it is possible to play this petty game of writing such stuff, vaguely looking like an explanation of "gauge invariance". The true reason why I said that I don't see how this stuff can explain gauge invariance is that the kind of gauge invariance which is really needed in physics is not the same, and cannot be the same, as any which can ever be reached by such methods. Where he sees no weakness, I see a fatal flaw. Yet I cannot explain why. It is the qualia which tells me so.

A much more caricatural situation occurred one day when I discussed with a philosophy student. He was skeptical of the claim that the sciences, especially physics with its mathematical theories, have achieved any success in understanding the universe. He thought that theories of physics are made up, and that physicists just give themselves the illusion of verifying their theories: since they need their theories to interpret data, their reasoning would be circular, and there would be no way to prove theories. I tried to explain that it really makes precise sense to say that we have successful theories verified by observation, and that the criterion for this is the success of these theories in reducing the entropy of observed data. But he did not understand what I meant, because he had not studied the concept of information entropy. So he needed to study it first. But he dismissed this request, claiming that if I need mathematics to define the concept of entropy and thereby justify the success of mathematics in physics, then my reasoning is circular. Well, in the same way, animals could claim that human language does not make any sense: if we tried to explain what human language can be useful or meaningful for, we would need to use human language for the explanation, so the argument would be "circular" as well.
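The entropy criterion invoked here can be illustrated by a minimal sketch (the data and the "theory" y = 3x + 7 are hypothetical toys of my own choosing, not anything from the discussion): a successful theory turns spread-out observations into concentrated residuals, so far fewer bits per sample are needed to encode what remains unexplained.

```python
import math
from collections import Counter

def entropy_bits(values):
    """Shannon entropy (bits per sample) of the empirical distribution."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Toy observations: a linear law plus small deterministic "noise".
xs = range(100)
ys = [3 * x + 7 + ((37 * x) % 5 - 2) for x in xs]

# Without a theory, each observation is roughly its own value:
raw = entropy_bits(ys)
# With the theory y = 3x + 7, only the residuals need encoding,
# and they take just 5 distinct values:
residuals = [y - (3 * x + 7) for x, y in zip(xs, ys)]
red = entropy_bits(residuals)

print(f"raw data: {raw:.2f} bits/sample, residuals: {red:.2f} bits/sample")
```

The verification of a theory is then not circular at all: the entropy reduction is a measurable fact about the data, whichever theory produced the predictions.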

In a similar vein, many philosophers of mathematics are very fond of so-called intuitionistic logic™ (a phrase so trademarked by one of the most absurdly anti-intuitive ideologies ever) and/or finitism, giving these topics an extremely oversized importance compared to their interest for mathematicians. As far as I could see, such philosophies are mainly about arbitrarily denying the validity of much of mathematics or its proofs, for hardly any serious or fruitful reason beyond the fun of this denialism. I mean, I can accept the study of what can be done with finitistic mathematics as a possible specialized topic among hundreds of others in mathematics; however, those who focus on philosophical issues making a fuss of their skepticism about the law of excluded middle are really wasting time. Well, of course everyone is free to undertake such an exploration of the mathematical world restricted to these viewpoints of mentally transabled people. However I would beg them not to go as far as expecting the taxpayers to provide for their metaphorical wheelchairs. Because, well, hopefully the world has more useful jobs for people who do not really like math, beyond the job of just making a fuss about it.

More generally, aside from such extreme cases, a kind of obscurantism can be found to have taken the role of orthodoxy in academic philosophy, where the respective statuses of knowledge and ignorance have essentially been switched: consideration is only given to reasonings based on the purest scientific ignorance and immaturity of understanding. This is well aligned with the general principles of skepticism, of the form "what can I discover and verify if I am the most stupid and ignorant person possible", taking such a framework as a prerequisite for the validity of judgements. It is on this basis that some ill-defined concept of "naturalism" is regarded by philosophers as the most advanced thought of the time, raising the enigma of how this obscurantist orthodoxy of naturalism should be articulated with the scientific orthodoxy of mathematical Platonism. Hence the great debate about which of the two oxymorons, "Naturalized Platonism" versus "Platonized Naturalism", would be a better fit.

Now, here are some explanations I can think of for this attitude, by both some physicists and many philosophers, of rejecting mathematical Platonism while still recognizing the physical (the so-called "concrete objects", as if this concept of a concrete object made any clear sense; a presumption of meaningfulness which reflects so much ignorance of physics) as "more real" than the mathematical, or even as the only reality.

The main explanation is possibly that so many people are more familiar with the physical than with the mathematical. At least, it takes more work, and usually comes later in life, to become familiar enough with mathematics to perceive this familiarity as a case for mathematical Platonism, than it took to get familiar with the physical and accept the reality of the physical on that basis.

Then, for those who finally become familiar enough with math for it to weigh as much as physics, the reluctance to endorse mathematical Platonism may of course be partly explained by the general difficulty of switching views. But more precisely here, the difficulty comes from the question it raises of how all these things can fit together. Namely, the difficulty comes from the temptation of monism: the assumption that there should be only one kind of reality, since a plurality of ontologies raises difficulties in figuring out their articulations. Once the mathematical and the physical are thus seen as competing for the role of the unique reality able to contain everything, the physical indeed seems a better candidate than the mathematical.

Yet Mathematical Monism, also called the Mathematical universe hypothesis, has some supporters, most of whom also support the Many-Worlds interpretation of quantum physics. These should be distinguished from the computationalists, who are roughly the most common kind of naturalists, and who may be considered more or less mathematical monists as well, but low-style ones (finitists), thus generally offset from genuine theoretical physics. This position is usually labelled the "Simulation hypothesis" or "Digital physics". Among its proponents, Gerard 't Hooft and H. Pierre Noyes are the only physicists I can see, while the others are usually clueless in physics (Stephen Wolfram, Brian Whitworth...).

On the other hand, Roger Penrose offers a trio of realities: the mathematical, the physical and the mental. This may sound good, but the problem is how they relate to each other. He describes them as cyclically dependent on each other, which is quite a mysterious architecture.

Now, mathematical Platonism brings an ontological challenge to naturalism (especially physicalism), as it is a clue about ontology, thus a unique chance to bring light to the previously mentioned ontological question on the nature of consciousness. This challenge consists in the following questions:

I see three possible ways out for naturalists. But already, between the mathematical and the physical ontologies, I see especially two fundamental differences. Then I can still criticize all three of the above ways out for naturalists as ineffective solutions: while they remove the qualia (unified understanding) of the problem, they still cannot remove the large number of relatively more concrete problems of which this qualia was the unified understanding.

Among these numerous problems are, for example

I wrote on these ontological issues, with further details and references, in another text.

Aside from these ontological issues, another qualia which emerges from the study of both mathematics and physics is the qualia of recognizing the situation described in the previous section: that the laws of physics indeed form a metaphorical locker, and that no mathematical key (candidate complement to the laws of physics) can fit it. This is the unified understanding behind the diverse no-go theorems against classical realism, and the feeling of "conspiracy" in the face of all the experiences of failure in trying diverse candidate mathematical keys and seeing them not fit.

A first approach is the idea of accepting as plausible the message which happens to be rather clearly given to us by Nature, rather than going for a headlong rush of seeing it as an accidental conspiracy of appearances hiding opposite kinds of underlying facts.
More precisely, this approach is the message of Logical Positivism, which was needed to motivate the success of modern physics away from imaginary obstacles: if some hypothetical parameter stubbornly and perfectly escapes measurability across a given range of diverse experiments (with no good "natural reason" for this), then it is reasonable to dismiss this parameter from the "explanations", i.e. the expression of the laws of physics relevant for these experiments. Then the success of such theories which do not use this parameter, in terms of both mathematical consistency and confirmed predictions, validates their terms.

An important point, which so many people could be led to underestimate by their naturalistic prejudices, is how amazingly great the success of our theories of modern physics is, explaining so well so many experiments on the basis of a formalism which rejects such naively expected parameters. Now this success is so well known to anyone caring to get informed, and other physicists have done this supporting work already, that any persistent belief in the underlying existence of these parameters in some "true" fundamental laws of physics which remain to be discovered is totally irrational. This is why I see no duty to develop this point in much detail myself.

So here is the resulting picture:

This message from Nature, thus, was that the laws of physics form a locker whose key is nonphysical and non-mathematical, from which it is natural to infer that it must be consciousness.

Now my skeptic debater appeared quite reluctant towards such a logical positivist attitude. So he expressed his skepticism towards these great advances of modern science as follows:

Yet the question may remain of assessing precisely how hard (far-fetched) it would be to try to reject this message: how hard would it be for any mathematical key to possibly fit this locker (even if such a key was not the one involved in reality)? There are two general kinds of keys usually searched for, with toy models proposed by specialists. However, they are never really satisfactory. An important inconvenience is that they usually depend on a structure of absolute simultaneity along which instantaneous distant effects occur (breaking relativistic invariance). So they might only (more or less) "explain" non-relativistic versions of quantum mechanics. Searching for relativistic versions of such theories would be much more acrobatic, and finally hopeless.

Besides these is the range of superdeterministic locally causal theories, which is usually not considered by specialists, but which, compared to the above, would have the great advantage of not involving instantaneous action at a distance. It happens to be my debater's favorite one. Here are some excerpts of his replies:

Yet superdeterminism is hardly ever considered seriously by specialists. One may be tempted to complain about the lack of a visible strong reason for this (in comparison with the above-mentioned clear defects of non-local hidden variables and spontaneous collapse), as if it were a taboo subject. Actually this request for strong reasons against it is a good question, to which I see two main answers.

The clearest reason is about quantum computation, which cannot be accounted for by locally causal theories (assuming behaviors given by some kind of classical computation with locally limited resources). Actually I had to raise the issue in the discussion, because he was not aware of it: he had not even figured out that there was such a concept as quantum computation, which may sometimes be more powerful than classical computation. His first reaction was

After some research he wrote the following reply. In short, all this is to say that his attempts to inform himself about quantum computing still left him skeptical about what the heck the whole point of research in quantum computing could be.
To excuse him, the above excerpt is from January 2018, thus 20 months before Google's breaking news on the topic, which finally forced everyone to hear that it had to mean something. On the other hand, more knowledgeable physicists can be better aware of this discrepancy between the predictions of quantum mechanics and what can be expected from classical computational theories. Namely, 't Hooft himself wrote in a comment to P. Woit's blog article in 2012 that his ideas "...could well lead to new predictions, such as a calculable string coupling constant g_s, and (an older prediction) the limitations for quantum computers". A prediction which may thus be considered to have failed with the news of Google's success, depending on where those limitations were thought to lie.

But what a strange universe it would be that would basically work in one kind of way (local classical computation), but on top of that would be programmed in such an extraordinary way as to maintain the appearance of perfectly following a completely different kind of laws (quantum mechanics) for seemingly all experiments, until the day when its success at doing so would completely break down because of special circumstances making it run out of the computing power needed to maintain this appearance. Actually I once saw a physicist (Valerio Scarani, if I remember well) make a similar remark about the relevance of experiments testing the violation of Bell's inequalities: it would be so strange to see quantum mechanics perfectly predict everything in all previous experiments, including those on entanglement with defective devices failing to meet the conditions for violating Bell's inequalities, and then suddenly break down when devices are perfected enough to meet them. Now, since experiments confirmed the expectation that this paradoxical prediction of quantum mechanics would keep working on perfected devices, against classical realistic expectations, why would the case of quantum computing be different?

This leads us to the other answer I see: the violation of Bell's inequalities (the prediction of quantum mechanics which is the whole point of the concept of "superdeterminism": the virtue of a local deterministic law to match this prediction). Why this is a really strong problem is not so easy to explain clearly, but I will try, by analogy with a different, more simply expressible mathematical problem which is subject to the same issues. It is just one arbitrary example of an endless range of similar problems.

Here is the example. Let us call a conspirational number any nonzero integer n such that the exponential e^n has only finitely many 3s in its decimal expansion.
Now, does a conspirational number exist? The point is that any good mathematician would be confident that no such number exists, and yet would not be able to write a proof of this. It is worth wondering whether this conjecture is even provable. Presumably no simple proof exists, so any proof would be extremely complicated. But the point is that mathematicians do not need such a proof, since their intuition suffices, instead of a proof, to know the non-existence of conspirational numbers anyway.
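That intuition can at least be probed heuristically (a sketch, emphatically not a proof: it only samples finitely many digits, while the conjecture is about infinite tails): for any n one tries, the digit 3 keeps occurring in e^n at roughly the expected frequency of one in ten.

```python
from decimal import Decimal, getcontext

def count_threes(n: int, digits: int = 900) -> int:
    """Count the digit 3 among the first `digits` significant
    decimal digits of exp(n)."""
    getcontext().prec = digits + 50  # extra guard digits
    expansion = str(Decimal(n).exp()).replace(".", "")
    return expansion[:digits].count("3")

# If some e^n were conspirational, its tail would be 3-free;
# empirically every sampled stretch shows 3s at ~10% density.
for n in (1, 5, 17):
    print(n, count_threes(n))
```

Of course no finite sample can settle the conjecture; the point of the analogy is exactly that the confidence comes from intuition, not from such checks.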

Now, superdeterministic laws are like conspirational numbers: mathematical intuition (the qualia of what it is like to be a mathematician) can give confidence that no such law can exist, despite the lack of proof of this impossibility. The reason is essentially the same for both the confidence in the conjecture that no superdeterministic law can exist, and the lack of means to prove this non-existence. Of course I cannot prove the validity of this analogy between the two problems; I am simply confident in it, just as I cannot prove, but am confident, that conspirational numbers do not exist.

Here, by a superdeterministic law, I do not even require one which matches many details of the Standard Model of particle physics. I expect even toy models to be impossible as well; well, maybe not the most absurdly simple straw ones. So let us be more precise: my conjecture is the non-existence of any toy model of a superdeterministic law (i.e. one which statistically violates Bell's inequalities) that would be Turing-complete on a visible level, with a lower bound on the efficiency with which it can process visible computation. Here, by "visible" I mean as opposed to the "hidden" character of hidden variables. So I mean the law of a world in which it is possible to build a visible computer, in the same sense as the computers we have in our world, able to perform classical computations (and to use the result as a choice of direction in which to measure one entangled particle).

Here is a quote from the conversation:

Indeed, you do not need to know much particle physics to undertake a search for conspirational numbers. And it would be a huge success to discover a conspirational number, even though I am skeptical about its usefulness for anything whatsoever; it would still not be the same as a superdeterministic theory matching all predictions of the Standard Model of particle physics. In particular, I am skeptical about how much closer such a discovery would bring us to that dream. I am even skeptical about how much closer it would bring us to a much more modestly different dream, such as that of discovering a nonzero integer whose exponential has only finitely many 4s in its decimal expansion.

Should the Universe be designed for the Method or vice versa

Let us sum up the diverse ways, developed in previous sections, in which skepticism and its methods turn out to work as an opposite of science. In short: science expands our understanding, while skepticism restricts and amputates it; the true spirit of science is non-essentialist, while skepticism perverts science, turning it into its opposite, by essentializing it. It does so by picking out a few of its elements (methods and principles) which, in themselves, are genuine, but which become perverted by being used out of context and in inappropriately exclusivist ways. The problem is not that these points of focus are wrong, nor even that they are not the best ones (insofar as such a comparison makes sense). But quite generally, even the best principles in the world can become ineffective and even misleading when focused on too literally, without proper discernment. And the skill of proper discernment has little to do with questions of methods or principles.

The precise aspects of science picked out by skepticism are naturally those most likely to be picked when approaching science in a religious manner, that is, as an object of evangelization. They may be the clearest answers to the question of what is most straightforward to share if you want to spread a scientific mindset. In doing so, skeptics may have overlooked the fact that this question of how to most clearly and easily share a scientific mindset may be the wrong question to ask about such a mindset, and that their dear goal of popularizing science may be a kind of oxymoron: the better something fits into the holy format of popularization, the less likely it may be to reflect a proper scientific mindset. In particular, popularization concerns are more typical of extroverts, while a proper scientific mindset may actually require some introversion instead. Other crucial aspects of science are thus missed, which would be much harder, or sometimes impossible, to share in such popularized ways.

Another vice of "scientific skepticism" is its usual tendency to focus its attacks on the caricatural and the politically weak (actually distinct categories, which it conflates), while failing to challenge the stronger, institutionalized, or subtler forms of pseudo-science.

I pointed out that scientific methods could, in some cases, be usefully complemented by methods and inspirations from libertarianism. Indeed, some good ideas and tools can be found there. Some of its principles are already exemplified by the success of free-market economies over Soviet ones; along similar lines, more advanced social technologies remain to be designed and implemented through a better use of the information technologies which themselves came as a fruit of science.

Unfortunately, the current mainstream libertarian ideologies fell into traps, with respect to the potential of libertarian solutions, similar to those skepticism fell into with respect to scientific ideals: the trap of perverting their best principles by essentializing them and neglecting the need for thorough analysis to discover the appropriate ways of applying them. Namely, many fell to the temptation of extrapolating their confidence in the few classical methods of the free market, justified by their known successes in classical cases, into a dream of the straightforward universal applicability of these methods to all cases. They can rightly point out cases where these methods were attacked for wrong or controversial reasons, such as the institution of rather heavy taxes to maintain sub-optimal systems of social security and official education; but they extrapolate from there, less wisely, to dismiss as similar "attacks on liberties" any other calls for regulation, namely those motivated by environmental concerns.

So in the discussion, I drew the parallel between both ideologies as follows:

So, searching for truths about the universe, or for solutions to real problems, is like traveling: there are some powerful means of transportation, but no single one of them is appropriate for exploring everything. Different parts of reality may have such different structures that they require different kinds of methods to be uncovered. In the face of this diversity of needs and aspects of reality, researchers may need to adapt and innovate in their methods. In the name of what could anyone expect otherwise? What sense would it make to expect reality to be adapted to the requirement of being best investigated by some given simple methods fixed in advance? The Universe has no such duty towards scientists.

Ironically, this strange hypothesis that the universe happens to be adapted to the expectations of methodologists, for the intellectual comfort of investigators (in terms of explorability: the superiority of a fixed exploration method), would actually be very bad news for these investigators. Indeed:

  1. It would make their work boring;
  2. It would open the way to the automation of their work, so that machines could ultimately out-compete them and leave them jobless.
Finally, I would place such considerations on the expected meaning and value of scientific investigation back in the perspective of their context, as particular cases of the meanings of life in this universe which I sketched near the end of Part 2: the general quest for truth is just one of many possible games we have the opportunity to play here.
So this quest for truth, undertaken under human conditions, is just a game. It does not really aim to uncover these truths, providing an access to them which did not exist before, since all (or at least most) relevant truths to be so uncovered were already known before the game started, and will be disclosed again anyway once the game is over. Instead, the real goal of the game is to play. It needs to have some other, visible goals, and these visible goals need to appear serious, since this is needed to keep the motivation and opportunity to play this game seriously.
This game of the quest for truth, like any other game, is neither fair nor unfair. Rather than worrying for those who fail at the game of truth-seeking (insofar as their failure does not cause any concrete tragedy, even if concrete tragedies themselves may in some sense be less serious than they seem), it is allowed to laugh at them; yet those who laugh for the wrong reasons will be laughed at.

In particular, different people may happen to be at different stages of their spiritual evolution (in particular, of their series of reincarnations). It can seem funny to stare at children's mistakes; but everyone has been a child someday, and the experience of committing and struggling with mistakes can be a necessary part of the learning process.

Let me reply to the possible suspicion that, by the above picture, I would be undermining the sense of seriousness (meaningful purpose) of scientific investigation. I happen to be very serious by nature, tempted to take everything extremely seriously. This went to the point of leading me to spoil my life trying to follow purposes and requirements which claimed to be serious but actually weren't, such as "giving my life to God" with Evangelical Christianity, and following the academic system up to a PhD as if that were needed to fulfill my wish of doing great science in my life, when it was actually a huge waste of energy. After such disastrous attempts to integrate myself into the university system, I finally left it for good, disgusted by the reign of vanity and the lack of meaningful purpose in so much of what goes on there. I already commented above on the mess of scientific popularization. I also reported elsewhere the degree of vanity I found in academia, in both research and teaching activities. Indeed, what is the sense of passing an exam, really? What is the sense of repeating the same lecture every year, on the same topic which thousands of other teachers are also teaching around the world, and still doing it as badly as decades ago, without anyone giving serious thought to the needed restructuring? What is the sense of racing to publish a given finding in a given popular research field, in the hope of doing it either before or at the same time as others, so as to be listed as one more of its co-discoverers? What is the sense of doing some "great" work which will only interest the curiosity of a handful of specialists of the same topic, but has no chance of being of any use to the rest of mankind?
What is even the sense of all the specialists of a given field focusing their work on publishing material to be read by their peers in the same field around the world, while none of them even cares to help maintain a list of the existing research teams in that field (I was the one doing it for them, despite being outside that system)?

One vanity for another: let us compare the value of rational investigation to that of passing an exam. What really matters there is not, in itself, the discovery of the right result, since the truths to be discovered are in any case no mystery from the viewpoint by which the ultimate value of the investigation will finally be appreciated. What matters instead is the method which is followed. Not because the right method is fixed in advance, but precisely because it isn't: you have to invent your own, and see how it goes. You have a large freedom in your choices of both target questions and methods to investigate them, and no secret will remain about this.
A possibly legitimate method is to copy your work from your neighbor's. Indeed, this can be valuable in two ways. One is to save your time, using the fruits of his work as a basis needed for your own other works, the different game you want to play, where you can then develop and test different skills. The other is to test your skill at discerning which works are worth copying from, and which aren't.
But if you altogether undertake to get and wear dark glasses, proclaim that these glasses are the brightest of all (since only some of the brightest lights are visible through them), and complain against your neighbor, who disagrees with you on this, that you cannot see whether his work is worth copying from when you look at it through your glasses, then you may be the one who is actually failing.

Previous : Part 1 - Part 2 - Part 3 - Part 4 - Part 5
List of links on skepticism
Back to site : Antispirituality main page