Submitted by egdaylight
Scrutinizing the history of mathematics and computer science has led me to the following illustration & line of argumentation, followed by two typical responses and then a question which I pose to my readers. Perhaps the sequel contains a few insights for the reader too; alas, some sections are peppered with nonsense (most of which I have labelled accordingly).
An Illustration
I have a stage-by-stage process P, which operates thus:

Stage 1: enumerate all bit strings of length 1

...

Stage i: enumerate all bit strings of length i

... ad infinitum
When transitioning from Stage i to Stage i+1, process P never erases anything but only appends more bits. At each Stage i, process P computes the diagonal and the antidiagonal of the enumerated bits. It is easy to program process P and subsequently run it, as the following example illustrates:
Stage 1: Running process P leads to the following matrix at Stage 1:

0

1
The diagonal is 0. The antidiagonal is 1.
Stage 2: The matrix at Stage 2 contains all bit strings of length two and is obtained by merely appending bits to the matrix at Stage 1. The result is:

0 0

1 0

1 1

0 1
The diagonal is 0 0. The antidiagonal is 1 1.
Stage 3: Likewise, to transition to Stage 3, process P will append bits to the matrix obtained so far. The result is the following matrix of all eight bit strings of length three:

0 0 0

1 0 0

1 1 0

0 1 0

0 0 1

0 1 1

1 0 1

1 1 1
The diagonal is 0 0 0. The antidiagonal is 1 1 1.
Stage i: At stage i, process P has computed a matrix consisting of all bit strings of length i. In the running example, the diagonal is a string of zeroes of length i. And the antidiagonal is a string of ones of length i.
And so on.
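The stages above can be sketched in a few lines of Python. This is only one way of realizing process P (the order in which the newly introduced rows appear may differ slightly from the worked example above), but the never-erase discipline, the diagonal and the antidiagonal are as described:

```python
def process_P(stages):
    """Run process P for the given number of stages; return the final
    matrix together with its diagonal and antidiagonal."""
    rows = [[0], [1]]  # Stage 1: all bit strings of length 1
    for i in range(2, stages + 1):
        # Transition to Stage i: never erase, only append.
        for r in rows:          # every existing row gets a 0 appended ...
            r.append(0)
        # ... and each previous-stage string reappears with a 1 appended,
        # so the matrix again holds all bit strings of length i.
        rows += [r[:-1] + [1] for r in rows]
    diagonal = [rows[j][j] for j in range(stages)]
    antidiagonal = [1 - b for b in diagonal]
    return rows, diagonal, antidiagonal
```

Note, in passing, that at every stage the antidiagonal (a run of ones) is itself one of the rows of the matrix.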
Which antidiagonal is obtained at some distant enough infinity (to make this fuzzy question sensible)? For the present example, we seem to have:
 the diagonal is a bit string as long as the naturals (consisting of zeroes) followed by another bit string at least as long as the naturals
 the antidiagonal is a bit string of ordinal type omega (consisting of ones) followed by another bit string of at least ordinal type omega
The tentative (nonsensical) conclusions are as follows:
 the antidiagonal string is not an infinite string of ordinal type omega; presumably, the ordinal at hand is uncountable
 process P has, at some distant infinity, successively enumerated through all infinite bit strings
The question then is whether process P is an ordinary process. Doesn't P take too long to be an ordinary algorithmic process? Perhaps my way of reasoning is too extreme a form of constructivism? Maybe so, but I am interested in an account which is fully compliant with Turing's 1936 notion of computability; that is, with his reception of Ernest Hobson's writings and his circle-free machines. On the one hand, those machines don't incorporate the computational complexity of transitioning from Stage i to Stage i+1; that is, when reasoning intensionally we apparently need not worry about the complexity at hand. On the other hand, it definitely makes sense to contemplate a process P which avoids storing obsolete entries in the matrix, and even this may be far from enough to have extensional import; see also my reply to Respondent 2 below.
Having presented some intuition, I now turn to my short argument.
My Argument
I have an endless process, illustrated above with stage-by-stage process P, which gradually builds the following three objects, each as a growing totality:
 a bijection between all (finite) bit strings on the one hand and all infinite bit strings on the other hand
 the diagonal of that bijection
 the antidiagonal
Each stage of the process contains a finite amount of information. Implementing item 1 requires an appropriate form of dovetailing, as illustrated with process P above. Intensionally, my endless process complies with Turing's 1936 notion of a circle-free machine.
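The "appropriate form of dovetailing" can be made concrete. The sketch below is one standard scheme (my own choice of schedule, not the only possibility): a single sequential process visits every pair (row n, bit position k) in finitely many steps, so each row keeps growing forever while new rows keep being introduced.

```python
def dovetail(steps):
    """One standard dovetailing schedule: at step s, work on bit
    position s - n of every row n <= s.  No row is ever finished,
    yet every (row, position) pair is reached after finitely many steps."""
    for s in range(steps):
        for n in range(s + 1):
            yield (n, s - n)  # "produce bit s - n of row n"
```

Every pair is visited exactly once; row n receives its k-th bit at step n + k.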
In my reading of Hobson & Turing & Church, we have the following terminology:
 Hobson-Turing computability refers to what Turing wrote before Church reviewed Turing's 1936 paper
 Church-Turing computability refers to "Turing machine computability"
But didn't Turing and Church agree on what computability entails? Not quite. I will provide a link to my historical findings in the near future. I conclude my argument thus:
 Extensionally, my antidiagonal is Hobson-Turing computable
 Extensionally, my antidiagonal is not Church-Turing computable
(But this ain't no research paper; this is work in progress and, in retrospect, a bit of nonsense. Who says you will get more than merely one infinite string of ones? Sure, but how do you prove that?)
Response 1
The response which I often receive is one in which my process P is recast by the respondent into a "description of a function f," with f : N → (N → N), where N denotes the set of natural numbers. The respondent then goes on to apply Cantor's diagonal argument in this Platonic setting of functions. This is all fine with me. But then he immediately recasts his obtained result back onto my intensional reasoning about my stage-by-stage process P. It is precisely this move which was eschewed by various historical actors. (I'm thinking of men like Poincaré, Hobson and Turing, but the history is, well, complicated.)
The respondent does not even look at how my process P is defined. The fact that I'm claiming that P is successfully enumerating all infinite bit strings suffices for him to (a) extrapolate to his Platonic realm of functions and (b) apply Cantor's diagonal argument and (c) project his impossibility result (which is correct in a Platonic realm) back onto my stage-by-stage realm in which Turing was (primarily) reasoning.
The crux, so I maintain, is that there are multiple ways to jump from intensional to extensional reasoning (and back again). At any rate, it is becoming clear (to me, in my ongoing research) that various historical players each had a different perspective on the matter at hand, also after the formative years in the 1930s, contrary to what is suggested (if not taught) today in, say, computer science textbooks.
Response 2
Not unlike the previous response, I have received the following commentary which I initially took to be persuasive:
In order to set up a bijection between bit strings and "all" the reals, you would need to use uncountably many infinite bit strings. Now, what goes beyond the countable transcends the computable as well; so, your procedure is not of the kind a Turing machine could accomplish. If you restrict the construction of your bijection so that it becomes a computable procedure, there will be a computable antidiagonal, revealing that any computable such procedure can be extended to a more encompassing similar procedure.
Unpacking the first sentence: "In order to set up a bijection" pertaining to "all the reals" belongs to a Platonic narrative about the idealized end result of running my process P. And "you would need to use" is part of an operational (stage-by-stage) exposition. Therefore, I'm not sure the implication conveyed in the first sentence is obvious: it goes from a final result in an idealized world (of set theorists, not engineers) to a statement about engineering or computer programming. Of course, the statement is informative; perhaps it is more informative than I'm willing to presently accept (for the sake of argument).
The respondent is reasoning under the assumption that my process P has successfully enumerated through all the reals, while I have presented P as a process that is successfully enumerating through all the reals. Specifically, my process P is, stage by stage, outputting more bits of the antidiagonal. Process P can erase those entries in the matrix which have become obsolete for the task of computing the antidiagonal. (This clarification is thanks to Respondent 2 in reaction to my — now futile — stipulation above that "process P never erases anything.")
Intensionally, my process P is always making progress: from stage to stage it prints out more bits of the antidiagonal. From a bird's-eye view, however, my process P is generating more & more work for itself to carry out at a later stage. If the reader wishes to take a Platonic snapshot at ordinal number omega, then he will correctly observe — pace the present & previous respondent — that process P has failed to complete the bijection. I suggest rephrasing the observation thus: process P has not yet succeeded in completing the bijection. This line of reasoning brings me to my general question: Does it make sense to contemplate taking Platonic snapshots at ordinal numbers beyond omega? If not for the present case study, then perhaps for others.
Continuing with the feedback from the respondent; he says:
So, here you have a result: any computable bijection between bit strings and real numbers can be extended into a more complete one, or equivalently (since any output of an algorithm is at most countably infinite): there is no inextensible recursive enumeration of real numbers, because its antidiagonal will be computable.
I understand the statement "any output of an algorithm is at most countably infinite" to mean that, by convention, we today agree to always take the Platonic snapshot at omega. The historical question then is: When, where and how did that convention originate? (It definitely did not originate in engineering, for engineers typically work with finite abstractions.)
Revisiting Both Responses
At first glance & in retrospect, I simply 'know' that both respondents are right in their conclusion; I'm just not convinced of their line of reasoning. The crux, I guess, is that a computation (of my process P) is, itself, a string. And so P will merely traverse a denumerable part of the continuum. This is probably it. And then the responses presented above are compatible with the contents of this paragraph after all. I am now applying both intensional & Platonic reasoning (i.e., when talking about treating computations as strings) and I take this to be compliant with Martin Davis's 1958 book.
At second glance, we actually have the following situation:
 1. Aristotelian stage-by-stage reasoning, i.e., in the limit: the antidiagonal is in the matrix at any stage
 2. Platonism, i.e., at countable infinity, at the limit: the antidiagonal is STILL in the (now-infinitely large) matrix at stage omega.
That is, we do not have a discontinuity when going from in-the-limit reasoning (1.) to at-the-limit reasoning (2.). There is no purported bijection between the naturals and all the reals in case 2, just like there is none in case 1. Hence, applying Cantor's diagonal argument in (2.) is unwarranted. That seems to be the real & right answer. Case closed.
Moreover, suppose my process P works twice as fast at Stage i+1 compared to Stage i. (I take this to be an extensional claim about P.) At first glance, this means that P will have successfully produced the antidiagonal's first bit string of ordinal type omega (which consists of all ones) in a fixed, finite number of moves. At second glance, this is not the case: for the amount of work to be carried out doubles from Stage i to Stage i+1. Moreover, let me also assume that process P uses only half as much space at Stage i+1 as at Stage i. All in all, we now have a standard process P. In fact, let me upgrade my latest process P to process P* such that it can achieve any computational task in fixed, finite time and space. Even then, isn't process P* just going to return a completed antidiagonal consisting of one infinite string of ordinal type omega? (And then the critique of both respondents applies.) Well, assume P* can simulate all other processes p1, p2, p3, ... in sequence. Even then, P* will merely traverse a countable part of the continuum.
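A back-of-the-envelope check of the doubling claim, using the bit counts of the running example (the matrix at Stage i holds i * 2**i bits, so roughly twice as many bits must be written at each successive stage): even if the speed doubles from stage to stage, the time spent on Stage i still grows linearly in i, so the total time diverges all the same.

```python
def total_time(n):
    """Total time to reach Stage n when Stage i runs at 2**(i - 1)
    bits per time unit, i.e., the speed doubles from stage to stage."""
    t = 0.0
    for i in range(1, n + 1):
        appended = i * 2**i - (i - 1) * 2**(i - 1)  # bits written at Stage i
        speed = 2**(i - 1)
        t += appended / speed  # simplifies to i + 1: the speed-up is eaten up
    return t
```

In closed form, `total_time(n)` equals n * (n + 3) / 2, so no fixed, finite number of moves ever suffices.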
My Question
What does all of the above mean? [Not much, due to some nonsense.] Here is my tentative interpretation: Turing's own 1936 extensional notion of computability does not fully comply with that of Church. Turing's automatic machines are not equivalent to Church's lambda-definable functions; instead, they were made to be equivalent. (Church was the reviewer of Turing's paper and Church had his agenda.)
 Quoting the historian Rod Adams: "Turing computable functions ... were specifically created to be equivalent to the class of effectively calculable functions" — from p.vii in: An Early History of Recursive Functions and Computability (Docent Press, 2011)
 Paraphrasing Wittgenstein: It's just a modelling exercise. That's all folks.
Q & A

Q: Are you saying that the set of all infinite bit strings is countable?

A: May I ask which line of thought makes you pose that particular question? Where, precisely, do you think I'm saying that? If I were to lay out all the infinite bit strings in front of me, hypothetically, and if I were then to claim that a bijection exists between the (finite) bit strings and the infinite bit strings, then (indeed) I would be trying to do the impossible: to disprove Cantor's diagonal argument. Note, moreover, that those bit strings must be abstract mathematical objects; that is, Cantor's diagonal argument does not work in the concrete changing world of individuals, i.e., in a world of stage-by-stage processing — at least not in the way which Cantor contemplated. In contrast to Cantor and set theorists, I am not working Platonically most of the time, i.e., when I explain the do's and don'ts of my process. Instead, I am working stage by stage and then (just like Turing by the way) I only briefly jump from intensional to extensional reasoning, i.e., from finite approximations in a changing world to one global, conceptual snapshot of an infinite string of bits. So I take just one Platonic snapshot and conclude; I do not start applying diagonal arguments and the like in that Platonic realm. I believe this is a faithful account of Ernest Hobson & Alan Turing's philosophical positions. (More on this in forthcoming work.)
[Last update: 29 September 2022. This topic is abstruse. Therefore I might want to refine & correct the present blog post several times in the near future. If this post contains new material, then a 'thank you' is in order to Julian Rohrhuber for pushing my thinking on several occasions (in private correspondence).]