Quantum Contextuality for Training Neural Networks
Graphical Abstract
Abstract
In the training of Neural networks (NNs), the selection of hyper-parameters is crucial and largely determines the final performance of the model. Among these techniques, Learning rate decay (LRD) can improve learning speed and accuracy, while Weight decay (WD) reduces over-fitting to varying degrees. However, these decay methods still suffer from problems such as hysteresis and stiffness in parameter adjustment, which can leave the final model inferior. Based on the theory of Quantum contextuality (QC), we propose a Quantum contextuality constraint (QCC) that constrains the node weights of NNs to further improve training. On a simple classification model, we combine this constraint with different LRD and WD methods to verify that QCC further improves training beyond the decay methods alone. The experimental results show that QCC can significantly improve the convergence and accuracy of the model.
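As background for the setting described above, the following is a minimal sketch of a simple classifier trained with both learning rate decay and weight decay. PyTorch is assumed here purely for illustration; the abstract does not name a framework, and the `qcc_penalty` function is a hypothetical placeholder hook for an additional weight constraint, not the QCC defined in the paper.

```python
# Minimal sketch (assumed PyTorch) of combining LRD and WD on a simple classifier.
# `qcc_penalty` is a hypothetical placeholder for an extra constraint on node
# weights; it does NOT implement the paper's QCC, which is defined in the main text.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
criterion = nn.CrossEntropyLoss()

# WD: L2 penalty applied by the optimizer; LRD: step-wise decay of the learning rate.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

def qcc_penalty(parameters, strength=1e-3):
    """Hypothetical placeholder for an additional weight constraint term."""
    return strength * sum(p.pow(2).sum() for p in parameters)

def train_epoch(loader):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets) + qcc_penalty(model.parameters())
        loss.backward()
        optimizer.step()
    scheduler.step()  # apply learning rate decay once per epoch
```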