# BioGPT

> Biomedical domain GPT developed by Microsoft.

Biomedical domain [GPT](https://wiki.g15e.com/pages/Generative%20Pre-trained%20Transformer.txt) developed by [Microsoft](https://wiki.g15e.com/pages/Microsoft.txt).

https://github.com/microsoft/BioGPT

> [Pre-trained](https://wiki.g15e.com/pages/Pre-trained%20model.txt) language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general domain. Among the two main branches of pre-trained language models in the general language domain, i.e. BERT (and its variants) and [GPT](https://wiki.g15e.com/pages/Generative%20Pre-trained%20Transformer.txt) (and its variants), the first one has been extensively studied in the biomedical domain, such as BioBERT and PubMedBERT. While they have achieved great success on a variety of discriminative downstream biomedical tasks, the lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative [Transformer](https://wiki.g15e.com/pages/Transformer%20architecture.txt) language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most tasks. Especially, we get 44.98%, 38.42% and 40.76% F1 score on BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2% accuracy on PubMedQA, creating a new record. Our case study on text generation further demonstrates the advantage of BioGPT on biomedical literature to generate fluent descriptions for biomedical terms.
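
As an illustration of the text-generation ability the abstract describes, here is a minimal sketch of prompting BioGPT with a biomedical term, assuming the `microsoft/biogpt` checkpoint on Hugging Face and a `transformers` version that includes the BioGPT classes (the prompt and generation settings are examples, not values from the paper):

```python
# Minimal sketch: generating a literature-style continuation with BioGPT.
# Assumes: pip install torch transformers sacremoses
# (the BioGPT tokenizer depends on sacremoses for Moses tokenization).
from transformers import BioGptForCausalLM, BioGptTokenizer

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

# Prompt with a biomedical term; the model continues it as fluent biomedical text.
inputs = tokenizer("COVID-19 is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,  # length of the generated continuation
    num_beams=5,        # beam search for more fluent, deterministic output
    early_stopping=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```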