CS373: Intro to Theory of Computation
Spring 2010 Prof. Madhusudan Parthasarathy
Skills list for the final exam
The final exam will be cumulative, but will concentrate a little more
on material covered since the second midterm. In addition to the
material from the two midterms, we will be testing
material through lecture 25. The material
presented in lecture 26 is not very important (we may ask only
small factual questions about it).
The overall look-and-feel of the exam will be similar to the mock final
(i.e. HW10), which you have already done.
Look at the skills lists for the first two midterms; that material will also
be included on the final. In addition, the new material for this exam includes:
- Turing machine recognizers:
- Designing recognizers for languages; dovetailing
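As an informal illustration (not official course material), dovetailing can be sketched in Python: run countably many computations in rounds, starting one new computation per round and giving every started computation one more step, so that no single non-halting computation blocks the rest. Here the "machines" are modeled as Python generators that yield False while running and True on acceptance; the specific toy family below is a made-up example.

```python
from itertools import count

def dovetail(machines):
    """Dovetail over an infinite sequence of computations.

    `machines` is an iterator of zero-argument functions, each returning a
    generator that yields False while still running and True on acceptance
    (a generator may run forever).  Returns the index of the first machine
    found to accept; loops forever if none accepts."""
    running = []              # (index, generator) pairs started so far
    source = iter(machines)
    for round_no in count():
        # start one new computation per round
        try:
            running.append((round_no, next(source)()))
        except StopIteration:
            pass
        # give every started computation one more step
        for idx, gen in running:
            try:
                if next(gen):
                    return idx
            except StopIteration:
                pass

# Toy family: machine i loops forever unless i == 3, which accepts after
# 10 steps.  Simulating machines sequentially would get stuck on machine 0,
# but dovetailing finds the accepting one.
def make_machine(i):
    def run():
        steps = 0
        while True:
            steps += 1
            yield i == 3 and steps >= 10
    return run
```

This is exactly why a recognizer for, say, "some TM in an infinite family accepts" cannot simply simulate the machines one after another.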
- Decidability and undecidability
- Know that a language is decidable if and only if the language
and its complement are both recognizable.
- Know that decidable languages are closed under complement, but
recognizable languages aren't.
- Prove that a language is undecidable or not recognizable, using a
simple proof by contradiction that relies on closure properties or
similar basic facts.
- Know the statement of Rice's theorem. Recognize when a decision
problem is covered by this theorem and, thus, is undecidable.
Recognize when a decision problem does not fit the conditions of
Rice's Theorem.
- Proving languages undecidable using reductions
- Understand the diagonalization proof that
A_TM is undecidable. You don't have to be able
to construct a whole proof like this from scratch.
- Know how to figure out the language accepted by
the auxiliary TM M_w used in the reductions that
look like Rice's theorem. (E.g. like question 6 on quiz 6.)
- Know how to prove that a language is undecidable using
a reduction from another language already known to be
undecidable. Proofs on the exam are likely to have a
very simple/standard
outline.
- Context-free grammars
- Formal definition of a context-free grammar (tuple notation).
- Define what it means for one string to yield or derive another string.
- Define what it means for a string w to be in the language of a grammar G.
- Design a context-free grammar that generates a specific language L.
- Describe the language generated by a specific grammar G.
- Draw a parse tree for a word w using a grammar G.
- Give a derivation of a word w using a grammar G.
- Know that a single string may have more than one distinct parse tree. Produce examples illustrating this phenomenon.
- Algorithms on CFGs: fixed-point algorithms (e.g., deciding whether the language of a grammar
is empty), similar to those used when converting grammars to CNF.
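As a study aid (an unofficial sketch, with a hypothetical grammar encoding), here is the fixed-point emptiness test in Python: repeatedly mark a nonterminal as "generating" once some production body consists entirely of terminals and already-marked nonterminals, until nothing changes. L(G) is empty iff the start symbol never gets marked.

```python
def generating_symbols(grammar, terminals):
    """Fixed-point computation of the nonterminals that derive some
    terminal string.  `grammar` maps each nonterminal to a list of
    production bodies, each a tuple of symbols (terminals or
    nonterminals); the empty tuple encodes an epsilon-production."""
    gen = set()
    changed = True
    while changed:
        changed = False
        for A, bodies in grammar.items():
            if A in gen:
                continue
            for body in bodies:
                # a body of all-generating symbols makes A generating
                if all(s in terminals or s in gen for s in body):
                    gen.add(A)
                    changed = True
                    break
    return gen

def language_is_empty(grammar, terminals, start):
    # L(G) is empty iff the start symbol is not generating
    return start not in generating_symbols(grammar, terminals)
```

The same "grow a set until it stabilizes" pattern handles nullable and reachable symbols in the CNF conversion.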
- Chomsky normal form
- Define Chomsky normal form.
- Recognize whether a grammar is in this form.
- Conversion from CFGs to Chomsky normal form.
- The CYK algorithm for deciding membership of a word in the language of a CFG in Chomsky normal form.
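For reference, a minimal CYK sketch in Python (an unofficial illustration; the grammar encoding is an assumption): T[i][j] collects the nonterminals deriving the substring w[i..j], filled by increasing length using the CNF two-nonterminal bodies.

```python
def cyk(grammar, start, w):
    """CYK membership test for a grammar in Chomsky normal form.

    `grammar` maps each nonterminal to a set of bodies: either a
    1-tuple (terminal,) or a 2-tuple (B, C) of nonterminals.
    Assumes w is nonempty."""
    n = len(w)
    # T[i][j] = set of nonterminals deriving w[i..j]
    T = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(w):                    # substrings of length 1
        for A, bodies in grammar.items():
            if (ch,) in bodies:
                T[i][i].add(A)
    for length in range(2, n + 1):                # longer substrings
        for i in range(n - length + 1):
            j = i + length - 1
            for k in range(i, j):                 # split point
                for A, bodies in grammar.items():
                    for body in bodies:
                        if (len(body) == 2 and body[0] in T[i][k]
                                and body[1] in T[k + 1][j]):
                            T[i][j].add(A)
    return start in T[0][n - 1]

# Example: a CNF grammar for {a^n b^n : n >= 1}
g = {'S': {('A', 'B'), ('A', 'C')},
     'C': {('S', 'B')},
     'A': {('a',)},
     'B': {('b',)}}
```

Note the running time is O(n^3 · |G|), which is why CNF matters here.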
- Recursive automata
- Definition of recursive automata, syntax and tuple notation.
- Know the formal definition of acceptance of words using configurations that involve both state and stack.
- Know how to specify a recursive automaton using tuple notation and state diagrams, and convert between the two notations.
- Construct a recursive automaton to recognize some specific language L.
- Describe the language recognized by a given RA.
- Know how to transform RAs to another RA in order to accept a different language; in particular, closure constructions on RAs.
- Know that recursive automata are equivalent to context-free grammars. Know the conversion between RAs and CFGs, and know how to do it on particular examples.
- Context-free languages
- Closure properties of CFLs: closure under union, concatenation, Kleene star, homomorphism, intersection with a regular language, etc.
- Know that CFLs are not closed under intersection and complementation.
- For languages that strongly resemble examples covered in class and/or in the text (e.g. section 2.3), be able to tell whether a specific language L is context-free.
- Use closure properties to prove that a language is not context-free, based on the fact that some other "seed" language is not context-free.
- Know the *corollary* to the pumping lemma for context-free languages (or the pumping lemma itself);
be able to understand and demonstrate it for sample CFLs.
- Know how to prove that a particular language is not context-free using the corollary to the pumping lemma (or the pumping lemma itself).
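As an unofficial sanity check of the pumping-lemma argument for L = {a^n b^n c^n}, the case analysis can be brute-forced for a small pumping length p: enumerate every legal decomposition w = uvxyz of a^p b^p c^p (|vxy| <= p, |vy| >= 1) and confirm that some pumping count i throws the string out of L. This checks one concrete p, so it illustrates the proof rather than replacing it.

```python
def in_L(s):
    # membership in {a^n b^n c^n : n >= 0}
    if len(s) % 3:
        return False
    k = len(s) // 3
    return s == 'a' * k + 'b' * k + 'c' * k

def every_decomposition_fails(p):
    """For w = a^p b^p c^p, check that no decomposition w = uvxyz with
    |vxy| <= p and |vy| >= 1 can be pumped: some i in {0, 2} already
    gives a string outside L."""
    w = 'a' * p + 'b' * p + 'c' * p
    n = len(w)
    hi = lambda s1: min(s1 + p, n)        # vxy must fit in a window of size p
    for s1 in range(n + 1):
        for e1 in range(s1, hi(s1) + 1):
            for s2 in range(e1, hi(s1) + 1):
                for e2 in range(s2, hi(s1) + 1):
                    u, v, x, y, z = w[:s1], w[s1:e1], w[e1:s2], w[s2:e2], w[e2:]
                    if len(v) + len(y) == 0:
                        continue          # pumping lemma requires |vy| >= 1
                    if all(in_L(u + v * i + x + y * i + z) for i in (0, 2)):
                        return False      # found a pumpable decomposition
    return True
```

The brute force mirrors the written proof: since |vxy| <= p, the window touches at most two of the three letter blocks, so pumping down (i = 0) always unbalances the counts.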
- Complexity theory
- Definition of classes P and NP
- Definition of NP-completeness; P vs NP and NP-completeness
- Vertex-cover, Hamiltonian path, and clique are NP-complete.
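To make the definition of NP concrete, here is an unofficial sketch of a polynomial-time certificate verifier for VERTEX-COVER, plus the contrasting exponential search over all candidate certificates (the function names are my own):

```python
from itertools import combinations

def verify_vertex_cover(edges, cover, k):
    """Polynomial-time verifier: is `cover` a vertex cover of size <= k?
    A valid `cover` is exactly a 'yes' certificate for VERTEX-COVER."""
    cover = set(cover)
    return len(cover) <= k and all(u in cover or v in cover
                                   for (u, v) in edges)

def has_vertex_cover(vertices, edges, k):
    """Brute-force decision procedure: try every candidate certificate.
    Exponential in general, which is the whole point of P vs NP."""
    return any(verify_vertex_cover(edges, set(c), k)
               for r in range(k + 1)
               for c in combinations(vertices, r))
```

Membership in NP needs only the fast verifier; NP-hardness is shown separately by reduction.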
Here are some specific topics from earlier in the course which
are particularly worth reviewing:
- proving non-regularity of languages (using the direct argument),
- strong induction (e.g. on parse trees or derivations)
- the product construction for recognizing the union or
intersection of two regular languages
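The product construction can be reviewed with a small unofficial Python sketch: the product state is simply the pair (p, q) of the two DFAs' current states, and acceptance is the conjunction (intersection) or disjunction (union) of the two acceptance conditions. The DFA encoding here is an assumption.

```python
def product_dfa(d1, d2, mode='and'):
    """Product construction for two DFAs over the same alphabet.

    Each DFA is (delta, start, accepting) with delta mapping
    (state, symbol) -> state.  The product runs both machines in
    lockstep on the pair state (p, q); mode 'and' gives intersection,
    'or' gives union.  Returns an acceptance-testing function."""
    delta1, start1, F1 = d1
    delta2, start2, F2 = d2
    def accepts(w):
        p, q = start1, start2              # product start state
        for ch in w:
            p, q = delta1[(p, ch)], delta2[(q, ch)]
        if mode == 'and':
            return p in F1 and q in F2
        return p in F1 or q in F2
    return accepts

# D1: even number of a's.  D2: ends with b.  (Alphabet {a, b}.)
delta1 = {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 0, (1, 'b'): 1}
delta2 = {('x', 'a'): 'x', ('x', 'b'): 'y',
          ('y', 'a'): 'x', ('y', 'b'): 'y'}
```

Only the accepting set differs between the union and intersection constructions; the transition structure Q1 x Q2 is identical.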
- the subset construction for converting an NFA to a DFA
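The subset construction can likewise be sketched in a few lines of Python (unofficial; for brevity this sketch assumes the NFA has no epsilon-moves, otherwise each subset must be closed under epsilon-reachability):

```python
def subset_construction(nfa_delta, start, accepting, alphabet):
    """Convert an NFA (no epsilon-moves) to an equivalent DFA.

    `nfa_delta` maps (state, symbol) -> set of states.  DFA states are
    frozensets of NFA states; a DFA state accepts iff it contains an
    accepting NFA state."""
    dfa_start = frozenset([start])
    dfa_delta = {}
    todo, seen = [dfa_start], {dfa_start}
    while todo:                              # explore reachable subsets only
        S = todo.pop()
        for a in alphabet:
            T = frozenset(t for s in S
                          for t in nfa_delta.get((s, a), set()))
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    dfa_accepting = {S for S in seen if S & accepting}
    return dfa_delta, dfa_start, dfa_accepting

def dfa_accepts(dfa, w):
    delta, state, F = dfa
    for ch in w:
        state = delta[(state, ch)]
    return state in F

# NFA for strings over {a, b} ending in "ab"
nfa = {('q0', 'a'): {'q0', 'q1'},
       ('q0', 'b'): {'q0'},
       ('q1', 'b'): {'q2'}}
```

Building only the reachable subsets (rather than all 2^|Q| of them) is the usual practical refinement of the textbook construction.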
- closure properties for regular and context-free languages
- use tuple notation to describe how to modify an input machine
to produce a new machine.