[LAC] Nondeterministic Finite Automata (NFA)

A “nondeterministic” finite automaton has the power to be in several states at once. This ability is often expressed as an ability to “guess” something about its input. For instance, when the automaton is used to search for certain sequences of characters (e.g., keywords) in a long text string, it is helpful to “guess” that we are at the beginning of one of those strings and use a sequence of states to do nothing but check that the string appears, character by character.

An informal view of NFA

Like the DFA, an NFA has states, input symbols, a transition function, etc. The transition function of an NFA takes a state and an input symbol as arguments, but returns a set of zero, one, or more states (rather than returning exactly one state, as a DFA must).

Example of NFA:
(Figure: transition diagram of an NFA with states q0, q1, q2)
The job of this NFA is to accept all and only the strings of 0’s and 1’s that end in 01.
State q0 is the start state, and we can think of the automaton as being in state q0 whenever it has not yet “guessed” that the final 01 has begun. It is always possible that the next symbol does not begin the final 01, even if that symbol is 0. Thus, state q0 has transitions to itself on both 0 and 1.
However, if the next symbol is 0, this NFA also guesses that the final 01 has begun. An arc labeled 0 therefore leads from q0 to state q1.

Notice that there are two arcs labeled 0 out of q0: the NFA has the option of going either to q0 or to q1.
In state q1, the NFA checks that the next symbol is 1; if so, it goes to q2 and accepts.

While a DFA has exactly one arc out of each state for each input symbol, an NFA has no such constraint: in the diagram above, q1 has no arc labeled 0 and q2 has no arcs at all. When no transition exists on the current symbol, we say that thread of the computation “dies”.

Thus, if 00101 comes in, the NFA’s possible sequences of states form this diagram:
(Figure: tree of the states this NFA can be in while reading 00101)
Because at least one path leads to the accepting state q2, the string 00101 is accepted.
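The set-of-states view of this run can be sketched in Python (a minimal sketch; the state names and the dict encoding of the diagram are my own, not from the text):

```python
# Transition table of the NFA described above; arcs that do not exist
# are simply absent (equivalently, they map to the empty set).
delta = {
    ("q0", "0"): {"q0", "q1"},
    ("q0", "1"): {"q0"},
    ("q1", "1"): {"q2"},
}

states = {"q0"}  # the NFA starts in the set containing only q0
for symbol in "00101":
    # follow every arc labeled `symbol` out of every current state;
    # a state with no such arc contributes nothing (its thread "dies")
    states = {r for s in states for r in delta.get((s, symbol), set())}

print(sorted(states))  # ['q0', 'q2'] -- contains the accepting state q2
```

Tracking the whole set of possible states at each step is exactly the “being in several states at once” described above.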

Definition of NFA

Now, let us introduce the formal notions associated with nondeterministic finite automata. An NFA is represented essentially like a DFA:

A = (Q, Σ, δ, q0, F) 
where 
Q is a finite set of states
Σ is a finite set of input symbols
q0, a member of Q, is the start state
F, a subset of Q, is the set of final (or accepting) states
δ, the transition function, is a function that takes a state in Q and an input symbol in Σ as arguments and returns a subset of Q. Notice that the only difference between an NFA and a DFA is in the type of value that δ returns: a set of states in the case of an NFA and a single state in the case of a DFA.

Transition table of the example above

| State | 0        | 1    |
|-------|----------|------|
| → q0  | {q0, q1} | {q0} |
| q1    | ∅        | {q2} |
| *q2   | ∅        | ∅    |

The only difference is that each entry in the NFA table is a set, even if the set is a singleton (has one member). Also, notice that when there is no transition at all from a given state on a given input symbol, the proper entry is ∅, the empty set.
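The table can be encoded directly as a mapping from (state, symbol) pairs to sets, with an explicit empty set for missing arcs (a sketch; the dict layout is my choice):

```python
Q = {"q0", "q1", "q2"}  # states
Sigma = {"0", "1"}      # input alphabet
delta = {               # every entry is a set, possibly empty
    ("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"},
    ("q1", "0"): set(),        ("q1", "1"): {"q2"},
    ("q2", "0"): set(),        ("q2", "1"): set(),
}

# sanity check: delta maps Q x Sigma to subsets of Q
assert all(v <= Q for v in delta.values())
assert delta[("q1", "0")] == set()  # empty-set entry for a missing arc
```

Representing dead ends as the empty set (rather than omitting them) mirrors the table exactly: every (state, symbol) pair has a defined entry.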

The extended transition function

The function δ* (written δ̂ in some texts) takes a state q and a string of input symbols w, and returns the set of states that the NFA is in if it starts in state q and processes the string w.

  • Basis: δ*(q, ε) = {q}; that is, if we are in state q and read no input, then we are still in state q. Notice that the result is now a set, {q}.
  • Induction: Suppose w is of the form w = xa, where a is the final symbol of w and x is the rest of w. Also suppose that δ*(q, x) = {p1, p2, …, pk}. Let
  • δ(p1, a) ∪ δ(p2, a) ∪ … ∪ δ(pk, a) = {r1, r2, …, rm}.
  • Then δ*(q, w) = {r1, r2, …, rm}.
  • Less formally, we compute δ*(q, w) by first computing δ*(q, x), and then following any transition labeled a from any of those states.
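The basis and induction above translate directly into a small function (a sketch, assuming the transition table of the running example; missing entries stand for the empty set):

```python
delta = {
    ("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"},
    ("q1", "1"): {"q2"},
}

def delta_star(q, w):
    """delta*(q, w): the set of states the NFA can be in after
    starting in state q and reading the string w."""
    states = {q}          # basis: delta*(q, "") = {q}
    for a in w:           # induction: extend the processed prefix x by a
        states = {r for p in states for r in delta.get((p, a), set())}
    return states
```

For instance, `delta_star("q0", "001")` returns `{"q0", "q2"}`, matching the hand computation below.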

Example: Let us use δ* to describe the processing of input 00101 by the NFA above.
  1. δ*(q0, ε) = {q0}
  2. δ*(q0, 0) = δ(q0, 0) = {q0, q1}
  3. δ*(q0, 00) = δ(q0, 0) ∪ δ(q1, 0) = {q0, q1} ∪ ∅ = {q0, q1}
  4. δ*(q0, 001) = δ(q0, 1) ∪ δ(q1, 1) = {q0} ∪ {q2} = {q0, q2}
  5. δ*(q0, 0010) = δ(q0, 0) ∪ δ(q2, 0) = {q0, q1} ∪ ∅ = {q0, q1}
  6. δ*(q0, 00101) = δ(q0, 1) ∪ δ(q1, 1) = {q0} ∪ {q2} = {q0, q2}
Since δ*(q0, 00101) contains the accepting state q2, the input 00101 is accepted.

The language of an NFA

As we have suggested, an NFA accepts a string w if it is possible to make some sequence of choices of next state, while reading the characters of w, and go from the start state to an accepting state.
If A = (Q, Σ, δ, q0, F) is an NFA, then
L(A) = {w | δ*(q0, w) ∩ F ≠ ∅}
That is, L(A) is the set of strings w in Σ* such that δ*(q0, w) contains at least one accepting state.
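Membership in L(A) then reduces to checking whether δ*(q0, w) has a nonempty intersection with F (a sketch, again using the running example’s table; names are my own):

```python
delta = {
    ("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"},
    ("q1", "1"): {"q2"},
}
F = {"q2"}  # accepting states

def accepts(w, start="q0"):
    """True iff delta*(start, w) contains at least one accepting state."""
    states = {start}
    for a in w:
        states = {r for p in states for r in delta.get((p, a), set())}
    return bool(states & F)  # nonempty intersection with F

print(accepts("00101"), accepts("0010"))  # True False
```

Note that this subset simulation is deterministic even though the automaton is not: we never have to guess, because we carry every possible guess along at once.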
