Federated Learning (FL) trains a global model while preserving the privacy of client data. However, FL faces challenges due to the non-IID data distributions among clients. Clustered FL (CFL) has emerged as a promising solution to this problem, yet most existing CFL frameworks adopt synchronous training and therefore lack asynchrony. SDAGFL, a framework based on directed acyclic graph distributed ledger techniques (DAG-DLT), has been proposed to introduce asynchrony, but its complete decentralization leads to high communication and storage costs. Motivated by this, we propose DAG-ACFL, an asynchronous clustered FL framework also based on DAG-DLT, and detail its components. We design a tip selection algorithm that aggregates models from clients with similar data distributions based on the cosine similarity of their model parameters. Furthermore, we introduce an adaptive tip selection algorithm that dynamically determines the number of selected tips via change-point detection. We evaluate DAG-ACFL on multiple datasets and analyze its communication and storage costs. The results demonstrate the superiority of DAG-ACFL in asynchronous clustered FL. By combining DAG-DLT with clustered FL, DAG-ACFL achieves robust, decentralized, and private model training with efficient performance.
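To make the similarity-based tip selection concrete, the following is a minimal Python sketch, not the paper's actual implementation: all names (`select_tips`, `flatten_params`) are illustrative assumptions, and the adaptive tip count is approximated here by a largest-drop heuristic standing in for the change-point detection the abstract mentions.

```python
import numpy as np

def flatten_params(model_params):
    """Flatten a list of parameter arrays into a single 1-D vector."""
    return np.concatenate([np.asarray(p).ravel() for p in model_params])

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D parameter vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def select_tips(local_params, tip_params_list, num_tips=None):
    """Rank DAG tips by cosine similarity to the local model and select the top ones.

    If num_tips is None, the count is chosen adaptively by a simple
    heuristic: cut the ranked list at the largest drop in similarity
    (a stand-in assumption for the paper's change-point detection).
    """
    local_vec = flatten_params(local_params)
    sims = np.array([cosine_similarity(local_vec, flatten_params(t))
                     for t in tip_params_list])
    order = np.argsort(sims)[::-1]        # tip indices, most similar first
    sorted_sims = sims[order]

    if num_tips is None:
        if len(sorted_sims) > 1:
            drops = sorted_sims[:-1] - sorted_sims[1:]   # successive decreases
            num_tips = int(np.argmax(drops)) + 1         # keep tips before the largest drop
        else:
            num_tips = len(sorted_sims)

    return order[:num_tips].tolist()
```

The intuition is that clients with similar data distributions tend to produce nearby model parameters, so ranking tips in parameter space implicitly clusters updates from the same distribution; the largest similarity drop then marks the boundary between in-cluster and out-of-cluster tips.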