In recent times, topic modeling approaches for adaptive language modeling have been extensively explored for Natural Language Processing applications such as machine translation and speech recognition. A language model is fragile when adapting to the required domain, so it needs to be steered towards a particular area or topic to produce optimal results. This motivates the investigation of topic modeling approaches that infer knowledge from large corpora. In this paper, we leverage three topic modeling techniques: Latent Semantic Indexing, Latent Dirichlet Allocation and Hierarchical Dirichlet Process. The baseline language model is dynamically adapted to different topics, and the results are analyzed for these three topic modeling approaches. © 2017 IEEE.
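As a rough illustration of one of the three techniques named above, the following is a minimal, self-contained collapsed Gibbs sampling sketch of Latent Dirichlet Allocation. The toy corpus, hyperparameters, and function name are illustrative assumptions for this sketch, not the paper's actual implementation or data:

```python
import random
from collections import defaultdict

def lda_gibbs(docs, K=2, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Toy collapsed Gibbs sampler for LDA (illustrative, not the paper's code)."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})          # vocabulary size
    # random initial topic assignment for every token
    z = [[rng.randrange(K) for _ in d] for d in docs]
    ndk = [[0] * K for _ in docs]                  # doc-topic counts
    nkw = [defaultdict(int) for _ in range(K)]     # topic-word counts
    nk = [0] * K                                   # tokens per topic
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            t = z[di][wi]
            ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    for _ in range(iters):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                # remove current assignment, then resample the topic
                t = z[di][wi]
                ndk[di][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                # unnormalized posterior: (doc-topic) * (topic-word)
                weights = [(ndk[di][k] + alpha) * (nkw[k][w] + beta)
                           / (nk[k] + V * beta) for k in range(K)]
                t = rng.choices(range(K), weights=weights)[0]
                z[di][wi] = t
                ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return ndk, nkw

# Hypothetical toy corpus: two "language" documents, two "sports" documents.
docs = [["language", "model", "speech"], ["model", "translation", "language"],
        ["soccer", "goal", "match"], ["goal", "soccer", "league"]]
ndk, nkw = lda_gibbs(docs, K=2)
```

The returned document-topic counts (`ndk`) give, after normalization, the per-document topic mixtures that an adaptive language model could use to interpolate topic-specific models into the baseline.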