Abstract: Neurons, modeled as linear threshold units (LTUs), can in theory compute all threshold functions. In practice, however, some of these functions require synaptic weights of arbitrarily large precision. We show here that dendrites can alleviate this requirement. We introduce the non-linear threshold unit (nLTU), which integrates synaptic input sub-linearly within distinct subunits to account for local saturation in dendrites. We systematically search the parameter space of the nLTU and the LTU to compare them. First, we show that the nLTU can compute all threshold functions with lower-precision weights than the LTU. Second, we show that an nLTU can compute significantly more functions than an LTU when an input can make only a single synapse. This work paves the way for a new generation of networks made of nLTUs with binary synapses.
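To make the contrast between the two units concrete, one plausible formalization is sketched below; the notation (inputs $x_i \in \{0,1\}$, weights $w_i$, threshold $\theta$, subunit partition $S_j$, and saturation function $f$) is ours and is only an assumption about the models the abstract describes, not the paper's definitions.

\[
\text{LTU:}\quad y = \mathbf{1}\!\left[\sum_{i} w_i x_i \ge \theta\right],
\qquad
\text{nLTU:}\quad y = \mathbf{1}\!\left[\sum_{j} f\!\left(\sum_{i \in S_j} w_i x_i\right) \ge \theta\right],
\]

where the sets $S_j$ partition the inputs into dendritic subunits and $f$ is a sub-linear (saturating) function, for instance $f(u) = \min(u, s)$ with saturation level $s$, capturing the local dendritic saturation the abstract refers to.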