Lecture 21 - Notes
Goals
- Get a general idea for the meaning of computability theory and understand the Halting problem
- Get a general idea for the meaning of complexity theory
- Understand the purpose of automata theory as a tool for studying computability and complexity
- Get a general idea for what an automaton is.
Announcements
- New teams for today
- Midterm grading is underway; back sometime this week
- A5 due Wednesday
- Please fill out Week 5 Survey by Wednesday
New Teams - New Norms
Team Norms
Take five minutes with your team to:
- Discuss your reflections on past group work experiences, in or outside of this class. Possibly include:
  - Was it helpful to have team norms in your last team?
  - How could you adjust your norms, or adjust your usage of them, to make them more helpful?
- Construct a set of norms that you all agree upon.
Write your norms on the Team Norms sheet.
I’ll collect these after class and redistribute them with the Exercise/Problem sheets each day.
We’ll occasionally revisit norms and make any adjustments deemed necessary.
Theory of Computation - A High-Level Introduction
Computability theory: what can be computed?
It turns out that not everything is computable. Example: the halting problem.
Construct a program \(H\) that takes as input another program \(P\) and its input \(I\). The correct output of \(H\) is True if \(P\) ever terminates when given input \(I\), and False otherwise.
It’s impossible to write \(H\)!
Here’s a proof, which Alan Turing first showed in 1936(!):
Proof by contradiction: Suppose \(H\) exists. Consider the program \(Z\):
program Z( String x ):
    if H(x, x):      # would program x terminate when run on its own source?
        loop forever
    else:
        return
Now imagine running program \(Z\) with \(Z\) (its own source code) as input. This calls \(H(Z, Z)\), which tells us whether \(Z\) terminates when given \(Z\) as input. Thus there are two cases:
- \(H(Z, Z)\) is true, i.e., \(H\) tells us that \(Z\) terminates; then according to the code, \(Z\) runs forever - a contradiction
- \(H(Z, Z)\) is false, i.e., \(H\) tells us that \(Z\) doesn’t terminate; then according to the code, \(Z\) terminates - a contradiction
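Here is the same diagonal argument as a Python sketch. The function halts below is a hypothetical stand-in for \(H\) (the names halts and Z_source are mine, not part of any real library); the whole point of the proof is that no real implementation of it can exist.

def halts(program_source, program_input):
    # Hypothetical stand-in for H: it would return True exactly when the
    # program whose source code is program_source terminates when run on
    # program_input. The argument above shows it cannot actually be written.
    raise NotImplementedError

def Z(x):
    # The "contrary" program: do the opposite of whatever halts() predicts
    # the program x would do when run on its own source code.
    if halts(x, x):
        while True:      # predicted to terminate, so loop forever instead
            pass
    else:
        return           # predicted to run forever, so terminate immediately

# Running Z on its own source code forces the contradiction: if
# halts(Z_source, Z_source) returned True, Z would loop forever; if it
# returned False, Z would terminate. Either way, halts gave the wrong answer.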
Computational complexity: how can we classify computable problems by difficulty?
Not all computable problems are equally easy to solve.
Example: \(P\) vs \(NP\).
A problem is in \(P\) if it can be solved in “polynomial time”.
A problem is in \(NP\) if a solution to the problem can be verified in “polynomial time”.
We don’t know whether these are the same! That is, we don’t know if there are problems in \(NP\) that are not in \(P\).
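To make the solve/verify distinction concrete, here is a sketch using Subset-Sum (a standard example, not one from the lecture): given a list of integers and a target, is there a subset that sums to the target? Checking a proposed subset is fast, but the obvious way to find one tries exponentially many candidates. The helper names below are my own.

from itertools import combinations

def verify(numbers, target, candidate):
    # Verification is cheap (polynomial time): confirm that candidate really
    # is a sub-collection of numbers and that it adds up to the target.
    remaining = list(numbers)
    for x in candidate:
        if x in remaining:
            remaining.remove(x)
        else:
            return False
    return sum(candidate) == target

def solve_by_brute_force(numbers, target):
    # The obvious search tries all 2^n subsets, which is not polynomial time.
    for k in range(len(numbers) + 1):
        for subset in combinations(numbers, k):
            if sum(subset) == target:
                return list(subset)
    return None

print(verify([3, 7, 12, 5], 15, [3, 12]))        # True: easy to check
print(solve_by_brute_force([3, 7, 12, 5], 15))   # [3, 12]: found by slow search

Whether every problem whose solutions can be verified this quickly can also be solved this quickly is exactly the \(P\) vs \(NP\) question.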
Automata Theory: the study of abstract “machines”
To study these formally, we need to model concepts like computer and problem mathematically.
Traditional methods for doing this come from automata theory, which is the study of simple, abstract models of computation.
In this approach, an automaton is a simple model of computation, and the problems automata solve relate to the acceptance of languages. Before defining these formally, let’s look at an example.
Suppose you’re programming a toll gate, and the toll required to open the gate is 15 cents. The machine can accept nickels (5¢ coins) and dimes (10¢ coins). You could imagine writing code for this:
total = 0                  # amount inserted so far, in cents
while total < 15:          # keep accepting coins until the toll is paid
    coin = accept_coin()
    total += coin.value
open_gate()
We could also more abstractly model this using a “state machine”:
(See the picture in the whiteboard notes)
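Since the picture isn’t reproduced here, here is a rough code sketch of the same state machine (the state names and transition table are my reconstruction, not necessarily what was drawn in class). Each state records how much has been inserted so far, each coin triggers a transition, and reaching 15¢ or more is the accepting state that opens the gate.

TRANSITIONS = {
    # (current state, coin inserted) -> next state, with states named by the
    # total paid so far; 15 stands for "15 cents or more", the accepting state
    (0, 5): 5,    (0, 10): 10,
    (5, 5): 10,   (5, 10): 15,
    (10, 5): 15,  (10, 10): 15,
}

def toll_gate_accepts(coins):
    state = 0                        # start state: nothing inserted yet
    for coin in coins:
        if state >= 15:              # already in the accepting state
            break
        state = TRANSITIONS[(state, coin)]
    return state >= 15               # True means the gate opens

print(toll_gate_accepts([5, 10]))    # True: 15 cents paid
print(toll_gate_accepts([5, 5]))     # False: only 10 cents so far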
(End of what we covered in L21)
Alphabets, Strings, and Languages
Let’s now get more formal about what these machines can do; we will abstract away the specifics of the problem by rephrasing it in terms of the language accepted by the machine. Some definitions:
- In the context of strings, an alphabet is a finite set, and its members are called symbols. Symbols can be anything, really, but they are usually represented as numbers or letters.
  - Examples: \(\Sigma = \{0, 1\}\) and \(\Sigma = \{a, b, c, \ldots, z\}\).
- A string \(w\) over an alphabet \(\Sigma\) is a finite (ordered) sequence of symbols, where each symbol is an element of the alphabet. For example, \(110\) and \(0010\) are strings over the alphabet \(\Sigma = \{0, 1\}\).
- The length of a string \(w\), written \(|w|\), is the number of symbols in the string. For example, \(|110| = 3\) and \(|0010| = 4\).
- The empty string, written \(\epsilon\), is the string whose length is zero.
- A language over an alphabet \(\Sigma\) is a set of strings over \(\Sigma\). For example: \(\{1, 0, 01, 10\}\) is a language over the alphabet \(\{0, 1\}\).
- The set of all strings (of any length) that can be made from an alphabet \(\Sigma\) is written \(\Sigma^*\). So formally \(L\) is a language over \(\Sigma\) if \(L \subseteq \Sigma^*\).
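To make these definitions concrete, here is a small Python sketch (my own illustration; the names SIGMA, is_string_over, and L are not from the lecture) that models an alphabet as a set of symbols, a string as an ordinary Python string, and a language as a set of strings.

SIGMA = {"0", "1"}                   # the alphabet Sigma = {0, 1}

def is_string_over(w, alphabet):
    # w is a string over `alphabet` if every symbol of w belongs to it
    return all(symbol in alphabet for symbol in w)

L = {"1", "0", "01", "10"}           # a finite language over {0, 1}

print(is_string_over("0010", SIGMA))               # True: 0010 is a string over Sigma
print(is_string_over("012", SIGMA))                # False: 2 is not a symbol of Sigma
print(len("110"), len(""))                         # |110| = 3 and |epsilon| = 0
print(all(is_string_over(w, SIGMA) for w in L))    # True: L is a subset of Sigma*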