In this paper, we present a novel incremental and decremental learning method for the least-squares support vector machine (LS-SVM). The goal is to adapt a pre-trained model to changes in the training dataset, without retraining the model on all the data, where the changes can include both addition and deletion of data samples. We propose a provably exact method in which the updated model is identical to a model trained from scratch on the entire (updated) training dataset. Our method only requires access to the updated data samples, the previous model parameters, and a single fixed-size matrix that summarizes the effect of the previous training dataset. This approach significantly reduces the storage requirements of model updating, preserves the privacy of unchanged training samples without any loss of model accuracy, and improves computational efficiency. Experiments on a real-world image dataset validate the effectiveness of our proposed method.
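To make the idea concrete, here is a minimal sketch of exact incremental and decremental updating for a *simplified linear* LS-SVM (ridge-regression form, no bias term). The class name, the choice of summary statistics `A = XᵀX + I/γ` and `b = Xᵀy`, and the regularization setup are illustrative assumptions, not the paper's actual formulation; the key point it demonstrates is that a fixed-size summary of the data suffices for updates that exactly match retraining from scratch.

```python
import numpy as np

class IncrementalLSSVM:
    """Hypothetical sketch of an exact incremental/decremental linear LS-SVM.

    The d x d matrix A = X^T X + I/gamma and the d-vector b = X^T y
    summarize the entire training set, so samples can be added or
    deleted via exact rank-one updates without storing the data.
    """

    def __init__(self, dim, gamma=1.0):
        self.A = np.eye(dim) / gamma   # fixed-size summary matrix
        self.b = np.zeros(dim)         # fixed-size summary vector

    def add(self, x, y):               # incremental step: rank-one update
        self.A += np.outer(x, x)
        self.b += y * x

    def remove(self, x, y):            # decremental step: rank-one downdate
        self.A -= np.outer(x, x)
        self.b -= y * x

    @property
    def w(self):                       # current model weights
        return np.linalg.solve(self.A, self.b)

# Build a model incrementally, then delete one sample.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(20, 5)), rng.normal(size=20)
model = IncrementalLSSVM(dim=5, gamma=1.0)
for xi, yi in zip(X, y):
    model.add(xi, yi)
model.remove(X[0], y[0])

# Exactness check: the updated model equals retraining from scratch
# on the remaining 19 samples.
Xr, yr = X[1:], y[1:]
w_batch = np.linalg.solve(Xr.T @ Xr + np.eye(5), Xr.T @ yr)
assert np.allclose(model.w, w_batch)
```

In this toy setting the "unchanged training samples" never need to be revisited or stored: only the new or deleted samples touch the summary `(A, b)`, which mirrors the storage and privacy benefits described above.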