Theorem. The following problem is undecidable:
• input: a weighted automaton;
• question: is some word mapped to $0$?
The archetypal function that can be computed by a weighted automaton maps a string of digits to its value as a fraction written in binary (or ternary, etc.) notation. This construction is described in the following lemma.
Lemma 1. For every alphabet $\Sigma$ there is a finite weighted automaton which computes an injective function from $\Sigma^*$ to the strictly positive reals.
Proof. Without loss of generality, assume that $\Sigma = \{0, 1, \ldots, k-1\}$ for some $k \geq 2$. The idea is to treat an input $a_1 a_2 \cdots a_n \in \Sigma^*$ as a fraction in base $k$:
$$0.a_1 a_2 \cdots a_n \quad = \quad \sum_{i=1}^{n} a_i \cdot k^{-i}.$$
The only problem with this representation is that trailing zeros are ignored: $0.a_1 \cdots a_n$ and $0.a_1 \cdots a_n 0$ denote the same number. Therefore, the automaton adds an imaginary 1 to the end of the input. To implement this procedure by an automaton, we do the following. Its state space is $\mathbb{R}^2$. The idea is that the first coordinate stores the value of the input seen so far, while the second stores $k^{-n}$, where $n$ is the length of the input read so far. The initial vector is $(0, 1)$. For a letter $a \in \Sigma$, the state update function is the linear function
$$(x, y) \quad \mapsto \quad \left(x + a \cdot \frac{y}{k},\ \frac{y}{k}\right).$$
The output function is
$$(x, y) \quad \mapsto \quad x + \frac{y}{k},$$
which corresponds to appending the imaginary 1 at the end of the input.
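The construction above can be simulated directly. The following is a minimal sketch (the function name and the choice $k = 10$ are mine, not from the notes), using exact rationals so that the injectivity is visible:

```python
from fractions import Fraction

def evaluate(word, k=10):
    """Run the 2-dimensional weighted automaton on a list of digits in {0, ..., k-1}."""
    x, y = Fraction(0), Fraction(1)          # initial vector (0, 1)
    for a in word:
        x, y = x + Fraction(a, k) * y, y / k  # linear update for letter a
    return x + y / k                          # output: append the imaginary digit 1

# Words differing only by trailing zeros get different values:
evaluate([3, 0])   # 0.301 in base 10 = Fraction(301, 1000)
evaluate([3])      # 0.31  in base 10 = Fraction(31, 100)
```

Because every value ends with the appended digit 1, distinct words always produce distinct strictly positive rationals.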
For a Turing machine $M$, define $\Gamma$ to be the work alphabet of the machine plus pairs of the form (letter of the work alphabet, state of the machine). A configuration of the machine is represented as a word over this alphabet, where exactly one letter is of the pair type, and describes the position of the head. For example, the word
$$a \ b \ (b, q) \ a$$
represents a configuration where the tape content is $abba$ and the head is over the third cell with state $q$.
Below, by a transducer we mean a deterministic finite automaton where each transition is labelled by a possibly empty output word. This is a slightly more general model than the one in the lecture on determinisation.
Lemma 2. Let $M$ be a Turing machine. There exists a transducer
$$\mathrm{succ} : \Gamma^* \to \Gamma^*$$
such that if a word $w$ represents a configuration of $M$, then $\mathrm{succ}(w)$ represents its successor configuration.
Proof. The automaton has a one-letter buffer, in case the head will want to move to the left: it outputs each letter with a delay of one, so that when it reads the cell carrying the head, the preceding letter has not yet been output and can still receive the head.
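The buffering idea can be sketched as follows (names and the encoding of transitions are mine: `delta` maps a pair (state, letter) to (new state, written letter, direction), and a configuration is a list whose head cell is a pair). This is a plain function rather than a genuine finite transducer, and it ignores the boundary cases where the head falls off either end of the tape:

```python
def successor(config, delta):
    """Compute the successor configuration using a one-letter buffer."""
    out = []
    buffer = None   # last plain letter read, not yet output
    carry = None    # state to attach to the next cell (after a right move)
    for sym in config:
        if carry is not None:
            out.append((sym, carry))       # the head lands on this cell
            carry = None
        elif isinstance(sym, tuple):       # the cell carrying the head
            a, q = sym
            q2, b, move = delta[(q, a)]
            if move == 'L':                # head lands on the buffered letter
                out.append((buffer, q2))
                out.append(b)
                buffer = None
            else:                          # head moves right: flush the buffer
                if buffer is not None:
                    out.append(buffer)
                    buffer = None
                out.append(b)
                carry = q2
        else:
            if buffer is not None:
                out.append(buffer)
            buffer = sym
    if buffer is not None:
        out.append(buffer)
    return out
```

For a toy one-state machine that rewrites $a$ to $b$ and moves right, the head shifts one cell to the right; with a left-moving transition, it lands on the buffered previous letter.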
The following lemma shows some closure properties of functions recognised by weighted automata.
Lemma 3. Suppose that functions
$$f, g : \Sigma^* \to \mathbb{R}$$
are recognised by finite weighted automata, and let
$$h : \Delta^* \to \Sigma^* \qquad \text{and} \qquad L \subseteq \Sigma^*$$
be a transducer and a regular language, respectively. Then the following functions are also recognised by finite weighted automata:
• the sum $f + g$;
• the scalar multiple $c \cdot f$, for $c \in \mathbb{R}$;
• the composition $f \circ h$;
• the function which is defined as $f$ on $L$ and as $g$ outside $L$.
Proof. The sum and scalar multiple are immediate. For composition with a transducer, suppose that the weighted automaton for $f$ has state space $\mathbb{R}^Q$ and the transducer has state set $P$. Then the weighted automaton for the composition has state space $\mathbb{R}^{Q \times P}$. After reading an input on which the transducer would end up in state $p$, the number stored in coordinate $(q, p)$ is the same as the number stored in coordinate $q$ by the original weighted automaton after reading the output produced so far by the transducer, and the coordinates $(q, p')$ with $p' \neq p$ store zero. The “if then else” construction in the last item of the lemma can be seen as a special case of composition with a transducer, by using a transducer which appends to the word a bit which says if the input belonged to $L$.
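The first two closure operations have a transparent matrix construction: represent a weighted automaton as an initial row vector, one matrix per letter, and an output column vector; the sum is a direct sum of the two automata, and the scalar multiple rescales the initial vector. A minimal sketch (names are mine, not from the notes):

```python
import numpy as np

def run(aut, word):
    """Evaluate a weighted automaton (initial vector, letter -> matrix, output vector)."""
    alpha, mats, eta = aut
    v = alpha
    for a in word:
        v = v @ mats[a]
    return float(v @ eta)

def sum_automaton(A, B):
    """Direct sum: run A and B side by side; the outputs add up."""
    (aA, mA, eA), (aB, mB, eB) = A, B
    alpha = np.concatenate([aA, aB])
    mats = {a: np.block([[mA[a], np.zeros((len(aA), len(aB)))],
                         [np.zeros((len(aB), len(aA))), mB[a]]]) for a in mA}
    eta = np.concatenate([eA, eB])
    return alpha, mats, eta

def scale_automaton(A, c):
    """Scalar multiple: rescale the initial vector."""
    alpha, mats, eta = A
    return c * alpha, mats, eta

# Example: a 2-state automaton counting occurrences of 'a'.
count_a = (np.array([1., 0.]),
           {'a': np.array([[1., 1.], [0., 1.]]), 'b': np.eye(2)},
           np.array([0., 1.]))
```

Composition with a transducer is the product construction described in the proof and is not sketched here.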
Suppose that $M$ is a Turing machine. We will construct a weighted automaton with input alphabet $\Gamma + \{\#\}$, where $\#$ is a fresh separator symbol, such that the Turing machine has an accepting computation if and only if the automaton maps some nonempty word to $0$. The automaton works as follows. Apply Lemma 1 to get an injective function
$$\sigma : (\Gamma + \{\#\})^* \to \mathbb{R}_{>0},$$
and apply Lemma 2 to the machine, yielding a transducer $\mathrm{succ}$ which computes the successor configurations of $M$. Consider now the function
$$g : (\Gamma + \{\#\})^* \to \mathbb{R}$$
defined as follows:
• if the input is of the form
$$c_1 \# c_2 \# \cdots \# c_n$$
where every $c_i$ represents a configuration, $c_1$ is the initial configuration over the empty input, and $c_n$ contains an accepting state, then the output is
$$\sigma(c_2 \# c_3 \# \cdots \# c_n) \ - \ \sigma(\mathrm{succ}(c_1) \# \mathrm{succ}(c_2) \# \cdots \# \mathrm{succ}(c_{n-1}));$$
• otherwise, the output is $1$.
The function $g$ is defined in terms of the four closure operations given in Lemma 3: the condition in the first item is a regular language, the two words inside $\sigma$ are produced from the input by transducers, and the difference is a sum with a scalar multiple by $-1$. Therefore $g$ is recognised by a finite weighted automaton. Since $\sigma$ is injective, the difference in the first item is zero if and only if $c_{i+1} = \mathrm{succ}(c_i)$ holds for every $i$, so the only way for the function to output $0$ is to get on input an accepting computation of the Turing machine.