This paper presents a probabilistic algorithm for integer factorisation using deep learning. The algorithm is based on Lawrence's extension of Fermat's factorisation algorithm, which transforms the factorisation problem into a binary classification task. To address this classification task, a large corpus of training data is generated synthetically, exploiting the ease of generating large pseudo-random primes. The algorithm is introduced, followed by a summary of the experiments conducted. The paper also discusses the limitations of these experiments and invites others to reproduce, verify, and potentially improve the algorithm so that it becomes a practical and scalable factorisation method.
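For context, the classical Fermat factorisation on which the paper builds writes an odd composite n as a difference of squares, n = a² − b² = (a − b)(a + b). The sketch below illustrates only this classical search; it does not include Lawrence's extension or the learned classifier described in the paper, and the function name is illustrative.

```python
import math

def fermat_factor(n):
    """Classical Fermat factorisation for an odd composite n.

    Searches for a such that a^2 - n is a perfect square b^2,
    which yields the factorisation n = (a - b)(a + b).
    """
    assert n > 1 and n % 2 == 1, "n must be an odd integer > 1"
    # Start the search at the ceiling of sqrt(n).
    a = math.isqrt(n)
    if a * a < n:
        a += 1
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:
            # a^2 - b^2 = n, so the factors are (a - b) and (a + b).
            return a - b, a + b
        a += 1
```

For example, `fermat_factor(5959)` finds a = 80, b = 21 and returns the factor pair (59, 101). The search terminates quickly when the two factors are close together, which is the regime the classical method handles well.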