# Pre-trained model

A pre-trained model refers to a model or a saved [network](https://wiki.g15e.com/pages/Artificial%20neural%20network.txt) created by someone else and trained on a large dataset to solve a similar problem. [AI](https://wiki.g15e.com/pages/Artificial%20intelligence.txt) teams can use a pre-trained model as a starting point, instead of building a model from scratch. Examples of successful large-scale pre-trained language models are Bidirectional Encoder Representations from Transformers (BERT) and the [Generative Pre-trained Transformer](https://wiki.g15e.com/pages/Generative%20Pre-trained%20Transformer.txt) (GPT-n) series.
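The "starting point" idea can be sketched in a toy form: keep a pre-trained component frozen and train only a small new part on top of it. The snippet below is a minimal numpy illustration, not a real workflow; the random matrix `W_pretrained` merely stands in for weights that would in practice come from an actual pre-trained network (e.g. loaded from a checkpoint), and the logistic-regression head is an arbitrary choice for the new task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pre-trained feature extractor: a fixed linear map
# whose weights we pretend were learned elsewhere on a large dataset.
W_pretrained = rng.normal(size=(4, 8))  # maps 4-dim inputs to 8-dim features


def extract_features(x):
    """Frozen pre-trained layer: reused as-is, never retrained."""
    return np.tanh(x @ W_pretrained)


# New downstream task with a small dataset: train only a tiny head
# on top of the frozen features, instead of training from scratch.
X = rng.normal(size=(64, 4))
y = (X[:, 0] > 0).astype(float)  # simple synthetic binary target

feats = extract_features(X)
w_head = np.zeros(8)

for _ in range(200):  # plain gradient steps for a logistic-regression head
    p = 1.0 / (1.0 + np.exp(-feats @ w_head))
    w_head -= 0.5 * feats.T @ (p - y) / len(y)

accuracy = float(np.mean((feats @ w_head > 0) == (y > 0.5)))
```

In practice the frozen component would be something like BERT or a GPT-style model loaded from a published checkpoint, and fine-tuning might also unfreeze some of its layers; the split between a fixed extractor and a trainable head shown here is the simplest variant of that idea.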