# Large language model

> Generative AI based on the transformer architecture whose input/output modalities include text. It became widely known to the public through ChatGPT.

Generative AI based on the [transformer architecture](https://wiki.g15e.com/pages/Transformer%20architecture.txt) whose input/output modalities include text. It became widely known to the public through [ChatGPT](https://wiki.g15e.com/pages/ChatGPT.txt).

## Major models

- OpenAI's [ChatGPT](https://wiki.g15e.com/pages/ChatGPT.txt)
- Anthropic's [Claude](https://wiki.g15e.com/pages/Claude%20%28AI%29.txt)
- [Google](https://wiki.g15e.com/pages/Google.txt)'s Gemini
- Meta's [Llama](https://wiki.g15e.com/pages/Llama.txt)
- Model database: https://models.dev/

## Articles

- 2024-11-21 - [Re-Invoke: Tool invocation rewriting for zero-shot tool retrieval](https://research.google/blog/re-invoke-tool-invocation-rewriting-for-zero-shot-tool-retrieval/)
- 2024-08-14 - [LLMs develop their own understanding of reality as their language abilities improve](https://news.mit.edu/2024/llms-develop-own-understanding-of-reality-as-language-abilities-improve-0814)
- 2023-12-11 - [Why We Support and Encourage the Use of Large Language Models in NEJM AI Submissions | NEJM AI](https://ai.nejm.org/doi/full/10.1056/AIe2300128)

## See also

- [ML crash course - LLM](https://wiki.g15e.com/pages/ML%20crash%20course%20-%20LLM.txt)
- [LLM token sampling](https://wiki.g15e.com/pages/LLM%20token%20sampling.txt)
- [Free LLMs](https://wiki.g15e.com/pages/Free%20LLMs.txt)

## External links

- https://applied-llms.org/ - What We've Learned From A Year of Building with LLMs
- [LLM Engineer's Almanac](https://modal.com/llm-almanac/advisor)
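
The text-in, text-out generation described above happens one token at a time: the model produces logits over its vocabulary and the next token is sampled from them (see the LLM token sampling page). Below is a minimal, self-contained sketch of temperature-scaled sampling; the tiny vocabulary and logit values are hypothetical toy data, not from any real model.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Scale logits by temperature, then normalize into probabilities.
    # Subtracting the max keeps exp() numerically stable.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(vocab, logits, temperature=1.0, rng=random):
    # Draw one token according to the temperature-adjusted distribution.
    probs = softmax(logits, temperature)
    return rng.choices(vocab, weights=probs, k=1)[0]

# Toy example (hypothetical vocabulary and logits).
vocab = ["the", "cat", "sat", "mat"]
logits = [2.0, 1.0, 0.5, 0.1]
print(sample_token(vocab, logits, temperature=0.7))
```

Lowering the temperature sharpens the distribution toward greedy decoding (always picking the highest-logit token), while raising it flattens the distribution and makes output more varied.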