Be sure to take advantage of the following resources:
Homework Solution Sessions: Saturdays 15:30 - 16:30, Sundays 12:30 - 13:30.
Weekly Review Sessions: Saturdays 12:00 - 13:00.
Get to know your mentor and reach out to them if you need help - that’s what they’re here for!
Informally, a Turing machine is a finite state machine with access to a tape (memory) that is either infinite in one direction or infinite in both directions (in this course, usually infinite in both directions). Initially, the input to the TM is placed on this tape. At each step, the machine makes the following decisions (based on the state it is in and the symbol its tape head is currently reading): change the state, write some symbol at the cell currently under the tape head, and move the tape head one cell to the left or to the right.
Formally, we define a Turing machine to be a 7-tuple $M = (Q, q_0, q_{\text{acc}}, q_{\text{rej}}, \Sigma, \Gamma, \delta)$, where $Q$ is the set of states, $q_0 \in Q$ is the start state, $q_{\text{acc}} \in Q$ is the accepting state, $q_{\text{rej}} \in Q$ is the rejecting state, $\Sigma$ is the input alphabet, $\Gamma \supseteq \Sigma$ is the tape alphabet, and $\delta : Q \times \Gamma \to Q \times \Gamma \times \{\text{L}, \text{R}\}$ is the transition function. Once a TM reaches $q_{\text{acc}}$ or $q_{\text{rej}}$, it halts; until it reaches one of these states, it keeps going (possibly forever).
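To make the definition concrete, here is a minimal Python sketch of a TM and a routine that runs it. The dictionary-based tape, the blank symbol '_', and the step bound are implementation conveniences for this sketch, not part of the formal definition; the toy machine shown accepts exactly the strings over {0,1} that contain at least one 1.

# delta maps (state, read symbol) -> (new state, symbol to write, head direction)
delta = {
    ('scan', '0'): ('scan', '0', 'R'),
    ('scan', '1'): ('acc',  '1', 'R'),
    ('scan', '_'): ('rej',  '_', 'R'),
}

def run_tm(delta, q_start, q_acc, q_rej, x, blank='_', max_steps=10_000):
    tape = dict(enumerate(x))              # sparse bi-infinite tape: cell index -> symbol
    state, head = q_start, 0
    for _ in range(max_steps):             # step bound so the sketch always returns
        if state == q_acc:
            return True                    # halt and accept
        if state == q_rej:
            return False                   # halt and reject
        symbol = tape.get(head, blank)
        state, write, move = delta[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1   # move one cell right or left
    raise RuntimeError("step budget exceeded; the machine may be looping")

print(run_tm(delta, 'scan', 'acc', 'rej', '0010'))   # True
print(run_tm(delta, 'scan', 'acc', 'rej', '000'))    # False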
Unlike DFAs, TMs need not halt on all inputs. A Turing machine $M$ is called a decider if for all inputs $x$, it halts and either accepts or rejects $x$.
A language $L$ is called decidable if there exists a Turing machine $M$ such that if $x \in L$, $M$ accepts $x$, and if $x \notin L$, $M$ rejects $x$. Note that such a TM halts on all inputs, and therefore is a decider TM.
For a TM $M$, $L(M)$ denotes the set of all strings that $M$ accepts. Note that for strings not in $L(M)$, $M$ either rejects or loops forever.
The following is an equivalent definition of decidability: $L$ is called decidable if there exists a decider Turing machine $M$ such that $L(M) = L$.
Let $A$ and $B$ be languages, where $B$ is decidable. We say that solving $A$ reduces to solving $B$ (or simply, $A$ reduces to $B$, denoted $A \leq B$), if we can decide $A$ by using a decider for $B$ as a subroutine (helper function).
Let $M_1$ and $M_2$ be two Turing machines that decide the languages $L_1$ and $L_2$ respectively. We construct Turing machines $M_3$ and $M_4$ that decide the concatenation $L_1 L_2$ and the star $L_1^*$ respectively:
def M3(x):
    for each of the |x| + 1 ways to divide x as yz:
        simulate M1 on y
        if M1 accepts:
            simulate M2 on z
            if M2 accepts, accept
    reject
def M4(x):
    if length of x is 0:
        accept
    for each sorted list of indices [0, a1, a2, ..., |x|]:
        // the indices are a subset of {0, 1, 2, ..., |x|}
        // each list starts with 0 and ends with |x|
        string_is_good = true
        for each ordered pair of adjacent indices (p, q):
            simulate M1 on x[p:q]
            // x[p:q] is the section of x from the p-th to the (q-1)-th character
            if M1 accepts:
                pass // i.e. keep executing
            else:
                string_is_good = false
                break
        if string_is_good:
            accept
    reject
Note that we’ve implicitly appealed to the Church-Turing thesis here. One implication of the Church-Turing thesis is that for any algorithm with precise instructions written in English, there is a corresponding TM that does the same thing. We have written our algorithms in English to show the existence of two Turing machines.
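For readers who prefer running code, here is the same idea as a Python sketch, under the assumption that the deciders $M_1$ and $M_2$ are handed to us as always-halting Python predicates m1 and m2 (these names are stand-ins chosen for illustration; they are not part of the formal construction):

from itertools import combinations

def m3(x, m1, m2):
    # Decide the concatenation L1 L2: try every way to split x as y z.
    for i in range(len(x) + 1):
        if m1(x[:i]) and m2(x[i:]):
            return True
    return False

def m4(x, m1):
    # Decide the star L1*: try every way to cut x into consecutive pieces.
    if len(x) == 0:
        return True
    interior = range(1, len(x))                  # possible interior cut points
    for r in range(len(x)):
        for cuts in combinations(interior, r):
            indices = [0, *cuts, len(x)]         # starts with 0, ends with |x|
            if all(m1(x[p:q]) for p, q in zip(indices, indices[1:])):
                return True
    return False

# Example: with m1 = "even length" and m2 = "all 1's", m3 decides their concatenation.
print(m3('00111', lambda y: len(y) % 2 == 0, lambda z: all(c == '1' for c in z)))  # True
print(m4('0101', lambda y: y == '01'))                                             # True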
We define the language $\emptyset_{\textsf{DFA}}$ as follows: $\emptyset_{\textsf{DFA}} = \{\langle D \rangle : D \text{ is a DFA with } L(D) = \emptyset\}$. Prove that the following languages are decidable by reducing them to $\emptyset_{\textsf{DFA}}$:
1. $\{\langle D \rangle : D \text{ is a DFA that accepts no string with an odd number of 1's}\}$;
2. $\{\langle D \rangle : D \text{ is a DFA and } L(D) \text{ is infinite}\}$.
Part 1: Let $\textsf{ODD}$ be the language of all strings with an odd number of 1's. We leave it as a short exercise to show $\textsf{ODD}$ is regular by drawing a DFA.
Let $M_\emptyset$ be a decider for $\emptyset_{\textsf{DFA}}$. We construct a decider for the first language as follows.
Given input $\langle D \rangle$, where $D$ is a DFA:
Construct a DFA $D'$ such that $L(D') = L(D) \cap \textsf{ODD}$.
Run $M_\emptyset$ on $\langle D' \rangle$ and return the answer.
Proof of correctness:
To prove the correctness, we need to argue that no matter what the input is, our TM returns the correct answer. We do so in two cases.
Suppose that $\langle D \rangle$ is in the language, i.e., $D$ accepts no string with an odd number of 1's. Then $L(D') = L(D) \cap \textsf{ODD} = \emptyset$. So $M_\emptyset$ (and therefore our decider) will accept, as desired.
Now suppose that $\langle D \rangle$ is not in the language, i.e., $D$ accepts some string with an odd number of 1's. Then $L(D') \neq \emptyset$. So $M_\emptyset$ (and therefore our decider) will reject, as desired.
(Note that there is also the case when the input string does not correspond to a valid encoding of a DFA. Even though this is not explicitly written, we implicitly assume that the first thing our machine does is check whether the input is a valid encoding of an object with the expected type. If it is not, the machine rejects. If it is, then it will carry on with the specified instructions. The important thing to keep in mind is that in our descriptions of Turing machines, this step of checking whether the input string has the correct form (i.e. that it is a valid encoding) will never be explicitly written, and we don’t expect you to explicitly write it either.)
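Here is a Python sketch of the construction above, assuming a DFA is encoded as a tuple (states, alphabet, delta, start, accepting) and that we are handed a subroutine decides_empty_dfa standing in for the decider $M_\emptyset$. The encoding and every name in the sketch are hypothetical, chosen only for illustration.

def intersect(d1, d2):
    # Product construction: a DFA whose language is L(d1) ∩ L(d2).
    (Q1, sigma, delta1, s1, F1), (Q2, _, delta2, s2, F2) = d1, d2
    states = {(q1, q2) for q1 in Q1 for q2 in Q2}
    delta = {((q1, q2), c): (delta1[(q1, c)], delta2[(q2, c)])
             for (q1, q2) in states for c in sigma}
    accepting = {(q1, q2) for (q1, q2) in states if q1 in F1 and q2 in F2}
    return (states, sigma, delta, (s1, s2), accepting)

def odd_ones_dfa(sigma):
    # The two-state DFA for ODD: flip parity on '1', ignore every other symbol.
    delta = {(p, c): (('odd' if p == 'even' else 'even') if c == '1' else p)
             for p in ('even', 'odd') for c in sigma}
    return ({'even', 'odd'}, sigma, delta, 'even', {'odd'})

def decides_part1(dfa, decides_empty_dfa):
    # Accept <D> iff D accepts no string with an odd number of 1's.
    sigma = dfa[1]
    d_prime = intersect(dfa, odd_ones_dfa(sigma))    # L(D') = L(D) ∩ ODD
    return decides_empty_dfa(d_prime)                # return M_∅'s answer unchanged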
Part 2: We first prove the following lemma: If a DFA with $k$ states accepts some string $w$ with $|w| > k$, then it will accept infinitely many strings.
Proof of lemma:
Let $D$ be a DFA with $k$ states accepting the string $w$, where $|w| > k$. By the pigeonhole principle, there exist indices $i$ and $j$, $0 \leq i < j \leq |w|$, such that the prefixes $w_1 \cdots w_i$ and $w_1 \cdots w_j$ end on the same state $r$. Write $w = xyz$, where $x = w_1 \cdots w_i$, $y = w_{i+1} \cdots w_j$ (note $y$ is nonempty), and $z = w_{j+1} \cdots w_{|w|}$. It follows that $x y^t$ ends on $r$ for all $t \geq 0$. So, all the strings of the form $x y^t z$ for $t \geq 0$ are accepted. This completes the proof of the lemma.
We now move on to the main claim. We construct a decider for the second language as follows:
Given input $\langle D \rangle$ where $D$ is a DFA, let $k$ be the number of states of $D$.
Construct a DFA $D_k$ such that $L(D_k) = \{ w : |w| > k \}$. (We leave the proof that this language is regular as an exercise.)
Construct a DFA $D'$ such that $L(D') = L(D) \cap L(D_k)$.
Let $M_\emptyset$ be a decider for $\emptyset_{\textsf{DFA}}$ and run $M_\emptyset$ on $\langle D' \rangle$.
If $M_\emptyset$ accepts, reject. Else accept.
To prove the correctness, we need to argue that no matter what the input is, our TM returns the correct answer. We do so in two cases.
Suppose $\langle D \rangle$ is in the language. This means $L(D)$ is infinite. Since there are only finitely many strings of length at most $k$, $D$ accepts some string of length greater than $k$. So $L(D')$ is nonempty. This means $M_\emptyset$ rejects, which means our decider accepts, as desired.
Now suppose $\langle D \rangle$ is not in the language. This means $L(D)$ is finite. Then by the contrapositive of our lemma, $D$ accepts no strings of length greater than $k$. It follows that $L(D') = \emptyset$, so $M_\emptyset$ accepts and our decider rejects, as desired.
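The second reduction can be sketched the same way, reusing the hypothetical DFA encoding, the intersect helper, and the decides_empty_dfa stand-in from the Part 1 sketch; the only new piece is the DFA $D_k$ for strings of length greater than $k$.

def longer_than_k_dfa(sigma, k):
    # D_k with L(D_k) = { w : |w| > k }: count symbols read, capping the count at k+1.
    states = set(range(k + 2))
    delta = {(i, c): min(i + 1, k + 1) for i in states for c in sigma}
    return (states, sigma, delta, 0, {k + 1})

def decides_part2(dfa, decides_empty_dfa):
    # Accept <D> iff L(D) is infinite.
    states, sigma = dfa[0], dfa[1]
    k = len(states)
    d_prime = intersect(dfa, longer_than_k_dfa(sigma, k))   # L(D') = L(D) ∩ L(D_k)
    return not decides_empty_dfa(d_prime)                   # flip M_∅'s answer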
Suppose we change the definition of a TM so that the transition function has the form $\delta : Q \times \Gamma \to Q \times \Gamma \times \{\text{R}, \text{S}\}$, where the meaning of $\text{S}$ is "stay". That is, at each step, the tape head can move one cell to the right or stay in the same position. Suppose $M$ is a TM of this new kind, and suppose also that $M$ is a decider. Show that $L(M)$ is a regular language by describing a DFA solving $L(M)$. A proof of correctness is not required.
We construct a DFA $D = (Q', \Sigma, \delta', q_0', F')$ that solves the language $L(M)$. The components of the 5-tuple will be specified below.
We say that a state is pointless if it can never be reached (no matter what the input is). Let $M'$ be $M$ with all the pointless states removed. Note that $M'$ has the exact same behavior as $M$ on all inputs. Therefore it suffices to prove $L(M')$ is regular. The reason for removing the pointless states will become clear when we define the transition function of our DFA. We drop the prime and refer to $M'$ as $M$ from here on.
We now specify how the components of our DFA are defined.
We let $Q' = Q$. And of course the input alphabet $\Sigma$ does not change. Furthermore, the initial state of the DFA will be the same as the initial state of the TM, i.e., $q_0' = q_0$.
To define the transition function $\delta'$, we have to define what the output $\delta'(q, a)$ should be for an arbitrary $q \in Q$ and $a \in \Sigma$. To make our definition, we'll imagine the TM is in state $q$ when the tape head first moves right onto a symbol $a$ of the input. There are several cases to consider.
Case 1: If $q$ is such that $q = q_{\text{acc}}$ or $q = q_{\text{rej}}$, then define $\delta'(q, a) = q$. So the $q_{\text{acc}}$ and $q_{\text{rej}}$ states are sink states in the DFA.
Case 2: If we are not in Case 1, then we consider whether the transition $\delta(q, a)$ moves the tape head to the right or stays. Let's consider what happens when the head moves right. In this case, the DFA's transition will proceed similarly to the TM and change the state in the same way. What is written on the tape by the TM is not important, as we can never go back and read it. So if $\delta(q, a) = (q', b, \text{R})$, then define $\delta'(q, a) = q'$.
Case 3: Suppose $\delta(q, a)$ does not move the tape head, i.e., suppose $\delta(q, a) = (q', b, \text{S})$. In this case one of two things can happen. Either the machine reaches an accepting or rejecting state before ever moving the tape head right, or the machine will eventually move the tape head right and enter some state $q''$. If the machine reaches an accepting state, define $\delta'(q, a) = q_{\text{acc}}$. If it reaches a rejecting state, define $\delta'(q, a) = q_{\text{rej}}$. Otherwise, define $\delta'(q, a) = q''$.
This completes the definition of the transition function $\delta'$.
There is only one thing left, which is to define $F'$. We will have $q_{\text{acc}} \in F'$, but $F'$ may contain other states as well. In particular, let's call a state $q$ good if running the TM on the blank input (an all-blank tape), starting at $q$, ends at $q_{\text{acc}}$. Then we put all the good states in $F'$. Take a moment to understand why we are defining $F'$ in this way. Think about what behavior we want once the TM reaches the end of the input and is pointing to the first blank symbol. At that point it will be at some state $q$, and note that we are calling $q$ a good state if the TM eventually reaches the accepting state from that point on.
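The whole construction can also be phrased as a short program. Below is a Python sketch, assuming the stay-machine is given as a dictionary delta mapping (state, symbol) to (new state, written symbol, 'R' or 'S'), together with its states, start/accept/reject states, blank symbol, and input alphabet. All of these names and encodings are hypothetical, and we assume (as above) that pointless states have already been removed.

def dfa_transition(delta, q_acc, q_rej, q, a):
    # delta'(q, a): follow the TM from state q reading a until it halts or moves right.
    state, symbol = q, a
    seen = set()
    while (state, symbol) not in seen:
        seen.add((state, symbol))
        if state in (q_acc, q_rej):
            return state                           # Case 1: sink states
        new_state, written, move = delta[(state, symbol)]
        if move == 'R':
            return new_state                       # Case 2: the head moves on
        state, symbol = new_state, written         # Case 3: 'S', keep simulating in place
    raise ValueError("in-place loop; cannot happen for a decider without pointless states")

def is_good(delta, q_acc, q_rej, blank, q):
    # Does the TM, started in state q with only blanks ahead, reach q_acc?
    state, symbol = q, blank
    seen = set()
    while (state, symbol) not in seen:
        seen.add((state, symbol))
        if state == q_acc:
            return True
        if state == q_rej:
            return False
        new_state, written, move = delta[(state, symbol)]
        # after a right move the head sits on a fresh blank cell
        state, symbol = (new_state, blank) if move == 'R' else (new_state, written)
    return False                                   # a repeated configuration means a loop

def tm_to_dfa(states, sigma, delta, q_start, q_acc, q_rej, blank):
    dfa_delta = {(q, a): dfa_transition(delta, q_acc, q_rej, q, a)
                 for q in states for a in sigma}
    accepting = {q for q in states if is_good(delta, q_acc, q_rej, blank, q)}
    return (states, sigma, dfa_delta, q_start, accepting)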
Fix an alphabet $\Sigma$. We say that a language $L \subseteq \Sigma^*$ is semi-decidable (or recognizable) if there is a TM $M$ such that $L(M) = L$. (Recall that a language $L$ is decidable if there is a decider TM $M$ such that $L(M) = L$.) So for all $x \in L$, $M(x)$ accepts, and for all $x \notin L$, $M(x)$ either rejects or loops forever. Show that $L$ is decidable if and only if $L$ and its complement $\overline{L} = \Sigma^* \setminus L$ are semi-decidable.
We first prove that if $L$ is decidable, then $L$ and $\overline{L}$ are semi-decidable. So suppose $L$ is decidable. Note that if a language is decidable, then it is automatically semi-decidable. Therefore we know $L$ is semi-decidable. So all we need to argue is that $\overline{L}$ is semi-decidable. Observe that decidable languages are closed under complementation: if we take a decider for $L$, and reverse the accepting and rejecting states, then we'll have a decider for $\overline{L}$. This shows $\overline{L}$ is decidable, and therefore semi-decidable.
We now prove that if $L$ and $\overline{L}$ are semi-decidable, then $L$ is decidable. Let $M$ be a semi-decider for $L$ and let $M'$ be a semi-decider for $\overline{L}$. We want to describe a decider TM $N$ for $L$. As a first attempt, consider the following TM.
def N(x):
    run M(x)
    if it accepts, accept
    run M'(x)
    if it accepts, reject
What is wrong with the above TM? Why doesn't it correctly decide $L$?
Here is a description of a correct decider $N$ for $L$.
def N(x):
    for t = 1, 2, 3, ...:
        simulate M(x) for t steps
        if it halts and accepts, accept
        simulate M'(x) for t steps
        if it halts and accepts, reject
To see that this is indeed a correct decider for $L$, let's consider what happens for all possible inputs.
If the input $x$ is such that $x \in L$, then we know that since $M$ is a semi-decider for $L$, $M(x)$ accepts after some finite number of steps. We also know that $M'$ only accepts strings that are not in $L$, and therefore $M'$ does not accept $x$. So $N$ will correctly accept.
If on the other hand $x$ is not in $L$, then we know that $x$ is in $\overline{L}$. Since $M'$ is a semi-decider for $\overline{L}$, $M'(x)$ accepts after some finite number of steps. We also know that $M$ only accepts strings that are in $L$, and therefore $M$ does not accept $x$. So $N$ will correctly reject.
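In code, the dovetailing looks like the following Python sketch, where run_for_steps(machine, x, t) is an assumed helper (not a real library function) that simulates the given machine on x for t steps and reports 'accept', 'reject', or 'running':

def N(x, M, M_prime, run_for_steps):
    t = 1
    while True:
        if run_for_steps(M, x, t) == 'accept':         # eventually fires when x is in L
            return True                                # accept
        if run_for_steps(M_prime, x, t) == 'accept':   # eventually fires when x is not in L
            return False                               # reject
        t += 1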
Dr. Hyper Turing Machines Inc LLC is selling a whole host of new Turing machines, each for $19.99:
Bi-infinite TMs - with a tape that stretches infinitely in both directions!
Infinitely-scalable TMs - choose however many tape heads you like!
Quad-core TMs - now with 4 tapes (each with its own tape head)
Normal TMs usually go for $9.99 these days. Your friend (who's not very Turing-savvy) is in the market for a new Turing machine and just texted you asking for purchasing advice. Your instincts tell you that maybe most of this is marketing hype. But some of those improvements do sound pretty compelling... Your friend doesn't use their TM for all that much - mostly just browsing the web and checking email. What should you recommend they do?
If your friend isn't concerned about performance, they should just buy a normal TM. All of the above are equivalent to normal TMs. (Note that the descriptions below are informal. We are not after formal proofs in this question; we just want to build some intuition.)
We can easily simulate a bi-infinite tape TM with a singly-infinite one. The idea is to index the cells on a bi-infinite tape with the integers and the cells of the singly-infinite tape with the naturals, and then perform the usual bijection. Given input $w$, we first space out $w$ by inserting one blank cell between each pair of adjacent characters. After this, we can simulate a move of the tape head by moving two cells in the same direction (if it is on an odd-numbered cell), and by moving two cells in the opposite direction (if it is on an even-numbered cell). The only exception to this scheme is when we are on the first cell. If we move left from the first cell in the bi-infinite TM, we just move one cell to the right in the singly-infinite TM. How do we know that we are on the first cell? We 'mark' the symbol on the first cell at the very beginning, and keep it marked. We achieve this by adding a marked symbol $\hat{a}$ for every symbol $a$ in the alphabet.
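Here is one concrete way to write down "the usual bijection" as a tiny Python sketch, under the assumption that the cells of the singly-infinite tape are numbered 1, 2, 3, ... and that bi-infinite cell 0 (holding the first input character) lands on cell 1. With this choice, the non-negative cells sit at odd positions (where a move keeps its direction) and the negative cells sit at even positions (where a move flips direction), matching the description above; the function names are just for illustration.

def to_single(i):
    # Cell i of the bi-infinite tape -> cell number (1, 2, 3, ...) on the singly-infinite tape.
    return 2 * i + 1 if i >= 0 else -2 * i

def to_bi(n):
    # The inverse map.
    return (n - 1) // 2 if n % 2 == 1 else -(n // 2)

assert all(to_bi(to_single(i)) == i for i in range(-5, 6))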
If $M$ is a $k$-tape-head TM, then we describe a single-head TM $M'$ that simulates $M$. We add to our tape alphabet the ability to 'dot' symbols (i.e., for each symbol $a$ in the tape alphabet, add the symbol $\dot{a}$), demarcating where $M$'s tape heads are. To simulate a single move of $M$, $M'$ first makes a pass to determine what symbols are underneath $M$'s tape heads. $M'$ then makes a second pass and performs the appropriate transitions for each virtual tape head.
In a 4-tape TM, we have 4 separate tapes, each with its own tape head. A single application of the transition function reads the symbols under the tape heads, changes them, and then moves the heads left or right. What is written on the tapes and which direction the tape heads move are determined separately for each tape.
We can simulate a 4-tape Turing machine $M$ with a normal 1-tape Turing machine $M'$. At each step of the computation, suppose $w_i$ is the contents of tape $i$, where $w_i$ is some string in $\Gamma^*$. We keep the contents of the 4 tapes on a single tape, with a # as a separator: $\# w_1 \# w_2 \# w_3 \# w_4 \#$. The first # is at index 0 of the tape. We expand the tape alphabet to allow ourselves to "dot" a symbol (this keeps track of where the "virtual tape heads" are).
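As a quick illustration of this layout, here is a Python sketch that builds the single-tape encoding from four virtual tapes, assuming each tape is given as a string and each head position as an index into it; the "dot" on the symbol under a head is rendered as a '*' written after it (purely a display choice for this sketch).

def encode(tapes, heads):
    # Lay out the four tapes as #w1#w2#w3#w4#, marking each head's symbol with '*'.
    pieces = []
    for contents, h in zip(tapes, heads):
        marked = ''.join(c + '*' if i == h else c for i, c in enumerate(contents))
        pieces.append(marked)
    return '#' + '#'.join(pieces) + '#'

print(encode(['0110', '1_', 'ab', '_'], [2, 0, 1, 0]))
# prints #011*0#1*_#ab*#_*#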
To simulate a single move, $M'$ scans from the 1st # to the 5th # in order to figure out what symbols lie underneath the virtual tape heads. $M'$ then performs a second pass to update the tapes according to what $M$'s transition function dictates. An edge case that we need to handle: if $M'$ ever moves one of the virtual heads onto a #, this signifies that $M$ has moved one of its heads onto a previously unread tape cell. In this case, $M'$ copies everything to the right of that position over 10 spots and writes blanks on the newly generated space (this is kind of like requesting memory via malloc).
The fact that Turing machines can be changed in so many ways and remain equivalent in power provides evidence that the Turing machine is a quite robust model of computation. This robustness may help justify Turing machines as a reasonable abstraction of computation.