CS 2800 Notes 14



Regular Languages and Finite Automata

Theorem: Every regular language is accepted by some finite automaton.

Proof: We proceed by induction on the (length of / structure of) the description of the regular language. We need to show that:

∅ is accepted by a finite automaton. Easy: build an automaton where no input ever reaches a final state.

{λ} is accepted by a finite automaton. Easy: an automaton where the initial state accepts.

{x}, for each x ∈ I, is accepted by a finite automaton. Easy: an automaton with two states, where x leads from s_0 to a final state.
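To make the constructions concrete, here is a minimal Python sketch of one possible NFA encoding and the three base-case automata; the tuple layout and helper names are our own, not from the notes.

```python
# A hypothetical NFA encoding: (states, alphabet, delta, start, finals),
# where delta maps (state, symbol) to the set of possible successor states.

def empty_language_nfa(alphabet):
    """Accepts the empty language: no input ever reaches a final state."""
    return ({'s0'}, alphabet, {}, 's0', set())

def empty_string_nfa(alphabet):
    """Accepts {lambda}: the initial state is final and there are no transitions."""
    return ({'s0'}, alphabet, {}, 's0', {'s0'})

def single_symbol_nfa(x, alphabet):
    """Accepts {x}: two states, where x leads from s0 to a final state."""
    return ({'s0', 's1'}, alphabet, {('s0', x): {'s1'}}, 's0', {'s1'})

def accepts(nfa, word):
    """Run an NFA by tracking the set of states reachable on the input so far."""
    states, alphabet, delta, start, finals = nfa
    current = {start}
    for symbol in word:
        current = set().union(*(delta.get((s, symbol), set()) for s in current))
    return bool(current & finals)

# For example, accepts(single_symbol_nfa('a', {'a', 'b'}), 'a') is True,
# while accepts(empty_language_nfa({'a', 'b'}), 'a') is False.
```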

If A and B are accepted, so is AB.

Proof: Suppose that M_A = (S_A, I, f_A, s_A, F_A) accepts A and M_B = (S_B, I, f_B, s_B, F_B) accepts B. Suppose that M_A and M_B are NFAs, and S_A and S_B are disjoint (without loss of generality).

Idea: We hook M_A and M_B together.

Let the NFA M_AB = (S_A ∪ S_B, I, f_AB, s_A, F_B^+), where

F_B^+ = F_B ∪ F_A if λ ∈ B; F_B otherwise,

and t ∈ f_AB(s, i) iff either s ∈ S_A and t ∈ f_A(s, i), or s ∈ S_B and t ∈ f_B(s, i), or s ∈ F_A and t ∈ f_B(s_B, i).

Idea: given input xy ∈ AB, the machine guesses when to switch from running M_A to running M_B.

M_AB accepts AB.
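A rough Python sketch of this concatenation construction, using the same hypothetical NFA encoding as above (it assumes S_A and S_B have already been made disjoint):

```python
def concat_nfa(nfa_a, nfa_b):
    """Build M_AB from NFAs M_A and M_B with disjoint state sets."""
    sa, alphabet, fa, start_a, finals_a = nfa_a
    sb, _, fb, start_b, finals_b = nfa_b

    def moves(s, i):
        targets = set()
        if s in sa:
            targets |= fa.get((s, i), set())
        if s in sb:
            targets |= fb.get((s, i), set())
        if s in finals_a:
            # Guess that the A-part of the input is done: also act like M_B's start state.
            targets |= fb.get((start_b, i), set())
        return targets

    # F_B^+ also contains F_A exactly when lambda is in B,
    # i.e. when M_B's start state is itself final.
    finals = finals_b | (finals_a if start_b in finals_b else set())
    delta = {(s, i): moves(s, i)
             for s in sa | sb for i in alphabet if moves(s, i)}
    return (sa | sb, alphabet, delta, start_a, finals)
```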

If A and B are accepted, so is A ∪ B.

M_{A∪B} = (S_A ∪ S_B ∪ {s_0}, I, f_{A∪B}, s_0, F_{A∪B}), where s_0 is a new state, not in S_A ∪ S_B.

f_{A∪B}(s, i) = f_A(s, i) if s ∈ S_A; f_B(s, i) if s ∈ S_B; f_A(s_A, i) ∪ f_B(s_B, i) if s = s_0.

F_{A∪B} = F_A ∪ F_B ∪ {s_0} if λ ∈ A ∪ B; F_A ∪ F_B otherwise.

M_{A∪B} accepts A ∪ B.
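The union construction admits a similar sketch (again a hedged illustration, not code from the notes); the fresh state s_0 simply imitates both start states at once:

```python
def union_nfa(nfa_a, nfa_b, new_start='s_union'):
    """Build M_{A union B}; new_start plays the role of the fresh state s0."""
    sa, alphabet, fa, start_a, finals_a = nfa_a
    sb, _, fb, start_b, finals_b = nfa_b
    assert new_start not in sa | sb   # s0 must not be in S_A or S_B

    delta = dict(fa)
    delta.update(fb)
    for i in alphabet:
        # From s0, move exactly as M_A's or M_B's start state would.
        targets = fa.get((start_a, i), set()) | fb.get((start_b, i), set())
        if targets:
            delta[(new_start, i)] = targets

    finals = finals_a | finals_b
    if start_a in finals_a or start_b in finals_b:
        finals |= {new_start}         # s0 is final iff lambda is in A or B
    return (sa | sb | {new_start}, alphabet, delta, new_start, finals)
```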


If A is accepted, so is A*.

M_{A*} = (S_A ∪ {s_0}, I, f_{A*}, s_0, F_A ∪ {s_0}), where s_0 is a new state, not in S_A;

f_{A*}(s, i) = f_A(s, i) if s ∈ S_A - F_A; f_A(s, i) ∪ f_A(s_A, i) if s ∈ F_A; f_A(s_A, i) if s = s_0.

M_{A*} accepts A*.
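And the star construction, in the same hedged style:

```python
def star_nfa(nfa_a, new_start='s_star'):
    """Build M_{A*}: the fresh start state is final, since lambda is in A*."""
    sa, alphabet, fa, start_a, finals_a = nfa_a
    assert new_start not in sa

    delta = {}
    for s in sa | {new_start}:
        for i in alphabet:
            if s in finals_a:
                # From a final state we may also restart A.
                targets = fa.get((s, i), set()) | fa.get((start_a, i), set())
            elif s in sa:
                targets = fa.get((s, i), set())
            else:  # s == new_start, i.e. s0
                targets = fa.get((start_a, i), set())
            if targets:
                delta[(s, i)] = targets
    return (sa | {new_start}, alphabet, delta, new_start, finals_a | {new_start})
```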


    A Non-Regular Language

Not every language is regular (which means that not every language can be accepted by a finite automaton).

Theorem: L = {0^n 1^n : n = 0, 1, 2, ...} is not regular.

Proof: Suppose, by way of contradiction, that L is regular. Then there is a DFA M = (S, {0, 1}, f, s_0, F) that accepts L. Suppose that M has N states. Let s_0, ..., s_{2N} be the sequence of states that M goes through on input 0^N 1^N.

Thus f(s_i, 0) = s_{i+1} for i = 0, ..., N-1.

Since M has N states, by the pigeonhole principle (remember that?), at least two of s_0, ..., s_N must be the same. Suppose it's s_i and s_j, where i < j, and j - i = t.

Claim: M accepts 0^N 0^t 1^N, 0^N 0^{2t} 1^N, 0^N 0^{3t} 1^N, ....

Proof: Starting in s_0, 0^i brings the machine to s_i; another 0^t brings the machine back to s_i (since s_j = s_{i+t} = s_i); another 0^t brings the machine back to s_i again. After going around the loop for a while, the machine can continue just as it would on 0^N 1^N and accept. But strings like 0^{N+t} 1^N are not in L, so M does not accept exactly L: a contradiction.
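The pigeonhole step can be checked mechanically. The sketch below is our own illustration, with a hypothetical DFA encoding (S, I, f, s_0, F) where f maps (state, symbol) to a state; it runs a DFA and reports the first repeated state, which gives exactly the loop 0^t used above.

```python
def find_repeated_state(dfa, word):
    """Return indices (i, j), i < j, such that the DFA is in the same state after
    reading word[:i] and word[:j]; a repeat must occur once len(word) >= |S|."""
    states, alphabet, f, s0, finals = dfa
    seen = {s0: 0}                    # state -> first prefix length at which it appeared
    state = s0
    for k, symbol in enumerate(word, start=1):
        state = f[(state, symbol)]
        if state in seen:
            return seen[state], k     # the loop has length t = k - seen[state]
        seen[state] = k
    return None                       # can only happen if len(word) < |S|
```

On input 0^N with N at least the number of states, this is guaranteed to return some pair (i, j); pumping the loop of length t = j - i gives the extra accepted strings in the claim.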



    The Pumping Lemma

The techniques of the previous proof generalize. If M is a DFA and x is a string accepted by M such that |x| ≥ |S|

(|S| is the number of states; |x| is the length of x)

    then there are strings u, v, w such that

    x = uvw,

|uv| ≤ |S|, |v| ≥ 1,

u v^i w is accepted by M, for i = 0, 1, 2, ....

    The proof is the same as on the previous slide.

In the previous proof, x was 0^N 1^N, u = 0^i, v = 0^t, w = 0^{N-t-i} 1^N.
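Here is a hedged sketch of how the lemma is used to prove non-regularity: for the language {0^n 1^n}, every admissible decomposition of x = 0^N 1^N pumps out of the language, so no DFA with at most N states (and hence, since N is arbitrary, no DFA at all) can accept it. The function names and membership test are ours.

```python
def violates_pumping(x, in_language, max_prefix):
    """Return True if EVERY split x = u v w with len(u + v) <= max_prefix and
    len(v) >= 1 has some i for which u + v*i + w leaves the language.
    If x is in the language and len(x) >= max_prefix, this rules out any DFA
    with at most max_prefix states."""
    for end_v in range(1, max_prefix + 1):
        for end_u in range(end_v):
            u, v, w = x[:end_u], x[end_u:end_v], x[end_v:]
            # Checking i = 0, 1, 2 is enough to expose the problem for this language.
            if all(in_language(u + v * i + w) for i in range(3)):
                return False          # this split pumps safely; test is inconclusive
    return True

def in_0n1n(s):
    """Membership test for {0^n 1^n : n = 0, 1, 2, ...}."""
    n = len(s) // 2
    return len(s) % 2 == 0 and s == '0' * n + '1' * n

N = 8
print(violates_pumping('0' * N + '1' * N, in_0n1n, N))   # True, for any N you pick
```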

We can use the Pumping Lemma to show that many languages are not regular:

{1^{n^2} : n = 0, 1, 2, ...}: homework

{0^{2n} 1^n : n = 0, 1, 2, ...}: homework

{1^n : n is prime}: ...


    More Powerful Machines

    Finite automata are very simple machines.

    They have no memory

Roughly speaking, they can't count beyond the number of states they have.

Pushdown automata have states and a stack, which provides unlimited memory. They can recognize all languages generated by context-free grammars (CFGs).

CFGs are typically used to characterize the syntax of programming languages.

They can recognize the language {0^n 1^n : n = 0, 1, 2, ...} (a stack-based recognizer is sketched below) but not the language L′ = {0^n 1^n 2^n : n = 0, 1, 2, ...}.
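A hedged sketch of such a stack-based recognizer (our own illustration of the idea, not a full pushdown automaton):

```python
def accepts_0n1n_with_stack(s):
    """Recognize {0^n 1^n : n >= 0} the way a pushdown automaton would:
    push a marker for every 0, pop one for every 1, reject 0s after 1s."""
    stack = []
    seen_one = False
    for ch in s:
        if ch == '0':
            if seen_one:
                return False          # a 0 after a 1: reject
            stack.append('X')
        elif ch == '1':
            seen_one = True
            if not stack:
                return False          # more 1s than 0s
            stack.pop()
        else:
            return False              # symbol outside the alphabet
    return not stack                  # accept iff every 0 was matched

# The same trick fails for {0^n 1^n 2^n}: after the 1s have emptied the stack,
# the count n is gone, so the 2s cannot be checked against it.
```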

Linear bounded automata can recognize L′.

More generally, they can recognize the languages generated by context-sensitive grammars (CSGs).

CSGs are (almost) good enough to characterize the grammar of real languages (like English).


    Most general of all: Turing machine (TM)

Given a computable language, there is a TM that accepts it.

This is essentially how we define computability.

If you're interested in these issues, take CS 3810!


    Coverage of Final

everything covered by the first prelim

    emphasis on more recent material

    Chapter 4: Fundamental Counting Methods

Permutations and combinations
Combinatorial identities
Pascal's triangle
Binomial Theorem (but not the multinomial theorem)
Balls and urns
Inclusion-exclusion
Pigeonhole principle

    Chapter 6: Probability:

6.1-6.5 (but not the inverse binomial distribution)
basic definitions: probability space, events
conditional probability, independence, Bayes' Theorem
random variables
uniform and binomial distributions
expected value and variance



    Chapter 7: Logic:

7.1-7.4, 7.6, 7.7; *not* 7.5
translating from English to propositional (or first-order) logic
truth tables and axiomatic proofs
algorithm verification
first-order logic

    Chapter 3: Graphs and Trees

basic terminology: digraph, dag, degree, multigraph, path, connected component, clique

Eulerian and Hamiltonian paths
algorithm for telling if a graph has an Eulerian path

BFS and DFS
bipartite graphs
graph coloring and chromatic number
graph isomorphism

    Finite State Automata

describing finite state automata

regular languages and finite state automata
nondeterministic vs. deterministic automata
pumping lemma (understand what it's saying)


Some Bureaucracy

The final is on Friday, May 15, 2-4:30 PM, in Olin

If you have a conflict and haven't told me, let me know now

Also tell me the courses and professors involved (with emails)

Also tell the other professors

Office hours go on as usual during study week; check the course web site soon.

There may be small changes to accommodate the TAs' exams

There will be two review sessions: May 12 (7 PM) and May 13 (4:45)


    Ten Powerful Ideas

Counting: Count without counting (combinatorics).

Induction: Recognize it in all its guises.

Exemplification: Find a sense in which you can try out a problem or solution on small examples.

Abstraction: Abstract away the inessential features of a problem.

    One possible way: represent it as a graph

Modularity: Decompose a complex problem into simpler subproblems.

Representation: Understand the relationships between different possible representations of the same information or idea.

    Graphs vs. matrices vs. relations

Refinement: The best solutions come from a process of repeatedly refining and inventing alternative solutions.

Toolbox: Build up your vocabulary of abstract structures.


Optimization: Understand which improvements are worth it.

Probabilistic methods: Flipping a coin can be surprisingly helpful!



    Connections: Random Graphs

Suppose we have a random graph with n vertices. How likely is it to be connected?

    What is a random graph?

If it has n vertices, there are C(n, 2) possible edges, and 2^{C(n, 2)} possible graphs. What fraction of them is connected?

One way of thinking about this: build a graph using a random process that puts in each edge with probability 1/2.

Given three vertices a, b, and c, what's the probability that there is an edge between a and b and between b and c? 1/4

What is the probability that there is no path of length 2 between a and c? (3/4)^{n-2}

What is the probability that there is a path of length 2 between a and c? 1 - (3/4)^{n-2}

What is the probability that there is a path of length 2 between a and every other vertex? > (1 - (3/4)^{n-2})^{n-1}


Now use the binomial theorem to compute (1 - (3/4)^{n-2})^{n-1}:

(1 - (3/4)^{n-2})^{n-1} = 1 - (n-1)(3/4)^{n-2} + C(n-1, 2)(3/4)^{2(n-2)} - ...

    For sufficiently large n, this will be (just about) 1.
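A quick numerical check of this bound, plus a small simulation of the edge-probability-1/2 model, might look like the sketch below (the function names and the choice n = 30 are ours):

```python
import itertools
import random

def length_two_bound(n):
    """The slide's lower bound: probability that a fixed vertex a has a path of
    length 2 to every other vertex when each edge appears with probability 1/2."""
    return (1 - (3 / 4) ** (n - 2)) ** (n - 1)

def random_graph_is_connected(n, p=0.5):
    """Sample a graph on n vertices with edge probability p and check
    connectivity by a depth-first search from vertex 0."""
    adj = {v: set() for v in range(n)}
    for u, v in itertools.combinations(range(n), 2):
        if random.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

n = 30
print(length_two_bound(n))                                            # about 0.99
print(sum(random_graph_is_connected(n) for _ in range(200)) / 200)    # close to 1
```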

Bottom line: If n is large, then it is almost certain that a random graph will be connected.

Theorem [Fagin, 1976]: If P is any property expressible in first-order logic, it is either true in almost all graphs, or false in almost all graphs.

    This is called a 0-1 law .


    Connection: First-order Logic

Suppose you wanted to query a database. How do you do it?

Modern database query languages date back to SQL (structured query language), and are all based on first-order logic.

The idea goes back to Ted Codd, who invented the notion of relational databases.

Suppose you're a travel agent and want to query the airline database about whether there are flights from Ithaca to Santa Fe.

How are cities and flights between them represented? How do we form this query?

You're actually asking whether there is a path from Ithaca to Santa Fe in the graph.

This fact cannot be expressed in first-order logic!
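To see concretely what the query is asking, here is a hedged sketch with a made-up flight relation; the reachability check iterates to a fixed point (a transitive closure), which is exactly the kind of recursion a single first-order query cannot perform:

```python
# Hypothetical flight relation: (origin, destination) pairs.
FLIGHTS = {('Ithaca', 'Philadelphia'), ('Philadelphia', 'Dallas'),
           ('Dallas', 'Santa Fe'), ('Ithaca', 'Newark')}

def reachable(origin, destination, flights):
    """Is there any sequence of flights from origin to destination?
    Repeatedly extend the set of reachable cities until nothing new is added."""
    reached = {origin}
    while True:
        new = {dst for (src, dst) in flights if src in reached} - reached
        if not new:
            return destination in reached
        reached |= new

print(reachable('Ithaca', 'Santa Fe', FLIGHTS))   # True: Ithaca, Philadelphia, Dallas, Santa Fe
```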


    (A Little Bit on) NP

(No details here; just a rough sketch of the ideas. Take CS 3810/4820 if you want more.)

    NP = nondeterministic polynomial time

a language (set of strings) L is in NP if, for each x ∈ L, you can guess a witness y showing that x ∈ L and quickly (in polynomial time) verify that it's correct

    Examples:

    Does a graph have a Hamiltonian path? guess a Hamiltonian path

Is a formula satisfiable? guess a satisfying assignment

Is there a schedule that satisfies certain constraints? ...

Formally, L is in NP if there exists a language L′ such that

1. x ∈ L iff there exists a y such that (x, y) ∈ L′, and

2. checking if (x, y) ∈ L′ can be done in polynomial time
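For instance, a polynomial-time verifier for the Hamiltonian path problem might look like this sketch (the graph encoding and example are ours): given the graph x and a claimed path y as the witness, checking the pair is fast even though finding y may not be.

```python
def verify_hamiltonian_path(n, edges, path):
    """Check a witness: path must list all n vertices exactly once,
    with consecutive vertices joined by an edge. Clearly polynomial time."""
    edge_set = {frozenset(e) for e in edges}
    return (len(path) == n
            and set(path) == set(range(n))
            and all(frozenset((path[i], path[i + 1])) in edge_set
                    for i in range(n - 1)))

# Hypothetical example: a 4-cycle with one chord.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(verify_hamiltonian_path(4, edges, [1, 0, 3, 2]))   # True
print(verify_hamiltonian_path(4, edges, [0, 1, 3, 2]))   # False: 1-3 is not an edge
```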



    NP-completeness

A problem is NP-hard if every NP problem can be reduced to it.

A problem is NP-complete if it is in NP and NP-hard.

    Intuitively, if it is one of the hardest problems in NP.

    There are lots of problems known to be NP-complete

If any NP-complete problem is doable in polynomial time, then they all are.

Hamiltonian path
satisfiability
scheduling
...

If you can prove P = NP, you'll get a Turing Award.
