Neural chips that accelerate neural network computation have recently been under active development. Since a conventional neural chip requires quantized synaptic weights, it is important to understand the properties of a network with such weights. Devices implementing a two-layer network with equal numbers of input and output elements are well developed, and such a device can serve as an autocorrelation associative memory by feeding its output back to its input.

This paper theoretically investigates the capacity of an autocorrelation associative memory whose synaptic weights are quantized with a finite number of bits, and proposes an optimum quantization function. A system with a finite number of elements is evaluated by computer simulation. The proposed method can be applied to the design of a neural chip for associative memory.
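To illustrate the setting, the following minimal NumPy sketch builds an autocorrelation associative memory (outer-product learning), quantizes its weights with a simple uniform b-bit quantizer, and recalls a stored pattern by feeding the output back to the input. The uniform quantizer is an assumption for illustration only; it is not the optimum quantization function derived in the paper, and the pattern count and network size are arbitrary choices.

```python
import numpy as np

def train_autocorrelation(patterns):
    # Outer-product (autocorrelation) learning rule; self-connections zeroed.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def quantize_uniform(W, bits):
    # Illustrative uniform quantizer: round each weight to one of
    # 2**bits evenly spaced levels spanning [-|W|_max, +|W|_max].
    levels = 2 ** bits
    w_max = np.abs(W).max()
    if w_max == 0.0:
        return W
    step = 2.0 * w_max / (levels - 1)
    return np.round(W / step) * step

def recall(W, x, max_steps=10):
    # Feed the output back to the input until the state is stable.
    for _ in range(max_steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))   # 3 stored patterns, 64 elements
W = quantize_uniform(train_autocorrelation(patterns), bits=3)

probe = patterns[0].copy()
probe[:5] *= -1                                 # corrupt 5 of 64 elements
recovered = recall(W, probe)
print(float((recovered * patterns[0]).mean()))  # overlap with the stored pattern
```

Because only 3 patterns are stored in a 64-element network, the load is far below capacity, so recall succeeds even with 3-bit weights; the paper's analysis concerns how this capacity shrinks as the number of quantization bits decreases.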