A new Kolmogorov-Arnold network (KAN) is proposed to approximate potentially irregular functions in high dimensions. We show that it outperforms multilayer perceptrons in terms of accuracy and that it converges faster. We also compare it with ReLU-KAN, a recently proposed network: the proposed network is more time-consuming than ReLU-KAN, but more accurate.