Recent research has demonstrated the potential of pre-trained models for 3D molecular representation. However, existing models focus primarily on equilibrium data and neglect off-equilibrium conformations. Extending these methods to off-equilibrium data is challenging because their training objectives assume that conformations lie at local energy minima. To bridge this gap, we propose a force-centric pretraining model for 3D molecular conformations that covers both equilibrium and off-equilibrium data. For off-equilibrium data, our model learns directly from the atomic forces. For equilibrium data, we introduce zero-force regularization and force-based denoising techniques to approximate near-equilibrium forces. This approach yields a unified pre-trained model for 3D molecular representation, trained on a diverse set of over 15 million conformations. Experimental results demonstrate that our pre-training objective improves force prediction accuracy by approximately a factor of three compared to an Equivariant Transformer trained from scratch. Additionally, the regularizations on equilibrium data resolve the instability issues that vanilla Equivariant Transformers exhibit in molecular dynamics (MD) simulations. Our approach achieves state-of-the-art simulation performance while running 2.45 times faster than NequIP at inference. Acting as a powerful molecular encoder, our pre-trained model achieves performance comparable to state-of-the-art methods on property prediction tasks.
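
To illustrate how a force-centric objective might combine the two data regimes described above, the sketch below outlines one possible formulation of the training loss: direct force matching on off-equilibrium conformations, plus zero-force regularization and force-based denoising on equilibrium conformations. The model interface, field names, noise scale, and loss weights are all assumptions made for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def force_centric_loss(model, batch, noise_scale=0.05, w_zero=0.1, w_denoise=1.0):
    """Hedged sketch of a force-centric pretraining objective.

    Off-equilibrium conformations carry reference forces and are fit directly;
    equilibrium conformations are handled with zero-force regularization and a
    force-based denoising term. All names and weights are illustrative, and
    `model(atom_types, coords)` is assumed to return per-atom force predictions.
    """
    coords = batch["coords"]          # [B, N, 3] atomic coordinates
    atom_types = batch["atom_types"]  # [B, N] atomic numbers
    forces = batch["forces"]          # [B, N, 3] reference forces (off-equilibrium only)
    is_eq = batch["is_equilibrium"]   # [B] boolean mask for equilibrium conformations

    pred_forces = model(atom_types, coords)

    # Off-equilibrium: direct force matching against labeled atomic forces.
    off_eq = ~is_eq
    loss_force = (
        F.mse_loss(pred_forces[off_eq], forces[off_eq])
        if off_eq.any() else coords.new_zeros(())
    )

    loss_zero = coords.new_zeros(())
    loss_denoise = coords.new_zeros(())
    if is_eq.any():
        # Zero-force regularization: at a local energy minimum the true forces
        # vanish, so predictions on clean equilibrium structures are pushed to zero.
        loss_zero = pred_forces[is_eq].pow(2).mean()

        # Force-based denoising: perturb equilibrium coordinates and train the
        # model to predict a restoring force proportional to the added noise,
        # approximating near-equilibrium forces under a harmonic assumption.
        noise = torch.randn_like(coords[is_eq]) * noise_scale
        pred_noisy = model(atom_types[is_eq], coords[is_eq] + noise)
        loss_denoise = F.mse_loss(pred_noisy, -noise / noise_scale**2)

    return loss_force + w_zero * loss_zero + w_denoise * loss_denoise
```

In this sketch the denoising target is the (scaled) negative noise, which plays the role of an approximate near-equilibrium force; the relative weights of the three terms are placeholders that would need to be tuned in practice.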