In recent years, large language models (LLMs) have demonstrated remarkable capabilities across research fields such as natural language processing, computer vision, and molecular modeling. Building on these advances, we introduce the Materials Informatics Transformer (MatInFormer), which applies LLMs to material property prediction. Our approach learns the grammar of crystallography by tokenizing relevant space group information. We further demonstrate the versatility of MatInFormer by incorporating task-specific data for Metal-Organic Frameworks (MOFs). Through attention visualization, we identify the key features the model prioritizes during property prediction. Empirical validation across 14 distinct datasets highlights the effectiveness of the proposed model, underscoring its potential for accurate material property prediction in high-throughput screening.
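As a rough illustration of what tokenizing space group information can mean (a hedged sketch, not the paper's actual implementation; the vocabulary layout, token names, and function names below are assumptions), one can map the 230 crystallographic space groups onto a transformer vocabulary:

```python
# Hypothetical sketch: mapping the 230 crystallographic space groups to
# token ids in a transformer vocabulary. Names (SPECIAL_TOKENS, VOCAB,
# tokenize_space_group) are illustrative, not from MatInFormer itself.

SPECIAL_TOKENS = ["[CLS]", "[PAD]", "[UNK]"]

# Special tokens occupy the first ids; space groups 1..230 follow.
VOCAB = {tok: i for i, tok in enumerate(SPECIAL_TOKENS)}
VOCAB.update({f"SG_{n}": len(SPECIAL_TOKENS) + n - 1 for n in range(1, 231)})

def tokenize_space_group(space_group_number: int) -> int:
    """Map a space group number (1-230) to its vocabulary token id."""
    token = f"SG_{space_group_number}"
    return VOCAB.get(token, VOCAB["[UNK]"])

# Example: rock-salt NaCl crystallizes in space group 225 (Fm-3m),
# so a minimal token sequence could start as:
ids = [VOCAB["[CLS]"], tokenize_space_group(225)]
```

In this sketch, each crystal is represented by discrete symmetry tokens that a transformer can embed and attend over, analogous to word tokens in text.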