In this paper, we present a novel approach to incorporating memory into the least squares support vector machine (LSSVM). Our method introduces a memory influence mechanism that enables accurate partitioning of the training set without overfitting, while preserving the equality constraints of the original LSSVM. We propose two memory models, namely the maximum impact memory model (MIMM) and the weighted impact memory model (WIMM), both of which can be reduced to the LSSVM. In addition, we design different memory impact functions for the MIMM and WIMM. Experimental results demonstrate that the MIMM and WIMM achieve better generalization performance than the LSSVM and offer significant advantages in time cost over other memory models.