On the average complexity of Boolean functions
The computational complexity of
Boolean functions has been investigated for a long time, and these investigations
have been fruitful [1], [2]. Circuits are a rather general computing model
used in this area of research. An important property of this model is that
every circuit performs the same number of elementary operations on any tuple of
arguments, i.e., the worst-case complexity of Boolean functions is investigated
in this model. However, there are situations when the number of elementary
operations (gates in circuits) is not a realistic measure of the complexity. Often,
it is more important to know the average number of elementary operations taken
over all inputs. This number might differ greatly from the number of elementary
operations performed in the worst case. One of the ways to investigate the
average complexity of Boolean functions is to use the model of straight-line programs with
a conditional stop. These
programs generalize the notion of circuits and are a natural model of
straight-line computations without if-instructions or indirect addressing, but with
the ability to terminate the computation when some condition is satisfied. The
computation by straight-line programs with a conditional stop can be
represented in the following way. There is a computing unit that performs
computations. The unit has a Random Access Memory (RAM). The unit can compute any
Boolean function of two variables. The set of such functions is the basis of
computation. At every moment of time, the unit has access to any cell of the memory
to read or write. The unit is controlled by a program that is a sequence of instructions
of two basic types. Each instruction of the first type computes the value of
some basis function whose arguments are the values of certain memory cells. The
value computed is written to a memory cell. An instruction of the second type can terminate
the computation. Such an instruction has one argument that is the value of a certain
memory cell. If the value of the argument equals 1, then the computing unit terminates
the computation. If the value of the argument equals 0, then the next instruction
of the program is executed. The values of certain memory cells chosen in
advance are called the result of the execution of the program. A natural
measure of the complexity of such programs is the average execution time.
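To make the model concrete, here is a minimal sketch in Python. The encoding of instructions, the function names run and average_time, and the demo program are my own illustrative choices, not part of the formal definition of the model. The sketch executes a straight-line program with a conditional stop on every input and computes the average number of executed instructions.

```python
from itertools import product

# Illustrative encoding of the two instruction types:
#   ("op", dest, g, a, b) -- write g(mem[a], mem[b]) to mem[dest], g a two-variable basis function
#   ("stop", a)           -- terminate the computation if mem[a] == 1
# The inputs x1..xn are preloaded into cells 0..n-1; the remaining cells are working memory.

def run(program, memory):
    """Execute the program on the given memory; return the number of instructions executed."""
    steps = 0
    for instr in program:
        steps += 1                      # every executed instruction costs one unit of time
        if instr[0] == "op":
            _, dest, g, a, b = instr
            memory[dest] = g(memory[a], memory[b])
        elif memory[instr[1]] == 1:     # the conditional stop fires
            break
    return steps

def average_time(program, n, work_cells):
    """Average number of executed instructions over all 2^n input tuples."""
    total = sum(run(program, list(x) + [0] * work_cells)
                for x in product((0, 1), repeat=n))
    return total / 2 ** n

OR = lambda u, v: u | v

# Demo: x1 OR x2 OR x3 accumulated in cell 3, with a stop after the first OR.
program = [
    ("op", 3, OR, 0, 1),  # cell 3 = x1 OR x2
    ("stop", 3),          # terminate early if the partial disjunction is already 1
    ("op", 3, OR, 3, 2),  # cell 3 = x1 OR x2 OR x3
]
print(average_time(program, n=3, work_cells=1))  # 2.25, while the worst case executes all 3 instructions
```

On this tiny example the stop does not yet pay off (a stop-free program of two OR instructions always executes 2 instructions); the benefit of early termination appears as the number of variables grows, which is the subject of the results discussed below.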
In the investigation of
average complexity, it is interesting to find out how it relates to the circuit
complexity, which is the worst-case complexity. In particular, an interesting
question is how large the gap between these two measures is for a particular
function. It will be shown below that the effects of a conditional stop already appear
when computing Boolean functions of three variables. It is well known that there exist
functions of three variables whose circuit complexity equals 4. Meanwhile, the
average-case complexity of any Boolean function of three variables is not greater
than 2.5. The gap between
worst- and average-case complexities can be much larger when the number of
variables increases without limit. The average-case complexity of conjunction, disjunction,
and a linear Boolean function depending on a growing number of variables is considered
in [3]. The average-case complexity of disjunction is
asymptotically equal to the constant 8/3. The average-case complexity of
conjunction is asymptotically equal to 11/3. Thus, the ratio of the circuit
complexity to the average-case complexity for disjunction and conjunction
equals the number of their variables up to a multiplicative constant (a simple numerical
illustration of this gap is given at the end of this section). The
average-case complexity of a linear Boolean function is the same as the circuit
complexity, i.e., a conditional stop does not decrease the computational cost for
this function. In [4] it has been shown that there exist Boolean
functions of n variables for which the order of the ratio of their
circuit complexity to their average-case complexity equals (2^n/n)^{1/2}.
Moreover, there are no Boolean functions for which the order of this ratio is
greater. When the number of variables grows, the average-case
complexity of almost all Boolean functions and almost all Boolean (n,m)-operators
(an (n,m)-operator is a system of m functions depending on n
variables) differs from their circuit complexity only by a constant factor. In
particular, the following theorem holds.
Theorem 1. Let n tend to infinity. Let m = n^{O(1)}.
Then:
(i) almost all (n,m)-operators f satisfy
(ii)
each (n,m)-operator f satisfies
The proof
of this theorem is cumbersome; the proof of a weaker statement is given below.
Almost all functions in
many natural classes of Boolean functions of large cardinality have a circuit
complexity that is determined only by the cardinality of the class, and this
complexity is close to the lower bound implied by that cardinality. However, for straight-line programs
with a conditional stop, there are classes of sufficiently large cardinality
where the gap between the average-case complexity and circuit complexity of
almost all functions increases with the number of their arguments. An
interesting example of such a class is the class of monotone functions. The
following theorem gives bounds for the average-case complexity of monotone
functions.
Theorem 2. Let n tend to infinity. Then:
(i) almost all monotone functions f of n variables satisfy
(ii)
each monotone function f of n variables satisfies
where a and b are positive
constants.
It follows from Theorem 2
that the average-case complexity of almost every monotone function of n
variables is n^{1/2} times less than its circuit complexity [6].
Some other aspects
connected with the relation between the average-case and circuit complexity are
considered in [4] and [5]. A comparison of the results of [3], [4], Theorem 1, and Theorem 2 poses a number of
interesting questions. For example, what is the upper bound for the circuit
complexity of a Boolean function (Boolean operator) if its average-case
complexity is no less (asymptotically no less) than its circuit complexity?
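To make the gap discussed above for disjunction tangible, the following sketch (a naive illustration; it is not the construction from [3], and all names are my own) measures the average number of executed instructions of a straightforward left-to-right program for x1 OR ... OR xn that stops as soon as the accumulated disjunction becomes 1. The program length, like the circuit complexity of disjunction, grows linearly in n, while the measured average stays close to a constant (about 3 for this naive program; the construction behind the 8/3 bound in [3] is sharper).

```python
from itertools import product

def average_or_time(n):
    """Average number of executed instructions, over all 2^n inputs, of the naive program
       acc = x1 OR x2; stop if acc; acc = acc OR x3; stop if acc; ...; acc = acc OR xn,
       which contains 2*n - 3 instructions in total (for n >= 2)."""
    total = 0
    for x in product((0, 1), repeat=n):
        acc = x[0] | x[1]
        steps = 1                  # the first OR instruction
        for k in range(2, n):
            if acc:                # the conditional stop fires
                steps += 1
                break
            steps += 2             # a stop that does not fire, then the next OR
            acc |= x[k]
        total += steps
    return total / 2 ** n

for n in (3, 6, 9, 12):
    print(n, 2 * n - 3, round(average_or_time(n), 3))
# The program length (second column) grows linearly with n,
# while the average execution time (third column) approaches 3.
```

Since a circuit for the disjunction of n variables requires n - 1 gates, on this family the ratio of circuit complexity to average execution time already grows linearly in n, in line with the results of [3].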
REFERENCES
[1] Lupanov O.B. A method of synthesis of control systems: the principle of local coding // Probl. Kibern. 1965. Vol. 14. P. 31-110 (in Russian).
[2] Sholomov L.A. On the realization of underdetermined Boolean functions by circuits of functional elements // Probl. Kibern. 1969. Vol. 21. P. 215-226 (in Russian).
[3] Chashkin A.V. On the average time of computing the values of elementary Boolean functions.
[4] Chashkin A.V. Average case complexity for finite Boolean functions // Discrete Appl. Math. 2001. Vol. 114, No. 1-3. P. 43-59.
[5] Chashkin A.V. Computation of Boolean functions by randomized programs // Discrete Appl. Math. 2004. Vol. 135, No. 1-3. P. 65-82.
[6] Zabaluev R.N. On the average-case complexity of monotone Boolean functions // Proceedings of the XV International School-Seminar “Synthesis and complexity of control systems”. Novosibirsk, 2004. P. 40-45 (in Russian).