TinyLlama
Overview :
The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens. With the right optimizations, this can be done in just 90 days on 16 A100-40G GPUs; training began on 2023-09-01. TinyLlama adopts the same architecture and tokenizer as Llama 2, so it can drop into many open-source projects built on top of Llama. And at only 1.1B parameters, it is compact enough for applications with tight compute and memory budgets.
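As a quick illustration of that compatibility, the sketch below loads TinyLlama through the standard Llama code path in Hugging Face transformers. The checkpoint name TinyLlama/TinyLlama-1.1B-Chat-v1.0 is an assumption here; any TinyLlama checkpoint on the Hub should load the same way.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub checkpoint name; TinyLlama reuses the Llama 2 architecture,
# so the generic Auto* classes resolve to the Llama implementation.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

inputs = tokenizer("TinyLlama is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))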
Target Users :
TinyLlama can be fine-tuned into chat models and used for general text generation.
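For the fine-tuning use case, one common lightweight approach is LoRA adapters via the peft library; a minimal sketch follows. The checkpoint name, dataset file, and hyperparameters are illustrative assumptions, not a recipe from the TinyLlama project.

from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Wrap the base model with small LoRA adapters so only a tiny
# fraction of the weights is trained.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM"))

# Any causal-LM text corpus works; "train.txt" is a placeholder file.
dataset = load_dataset("text", data_files={"train": "train.txt"})["train"]
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)
dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tinyllama-lora",
                           per_device_train_batch_size=2,
                           num_train_epochs=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()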
Total Visits: 29.7M
Top Region: US (17.94%)
Website Views: 67.9K
Use Cases :
https://github.com/jzhang38/TinyLlama
https://huggingface.co/docs/transformers/main/en/chat_templating
https://github.com/huggingface/transformers.git
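The chat templating docs linked above describe how transformers renders a list of messages into the prompt format a chat model was fine-tuned on. A hedged sketch of that workflow, again assuming the TinyLlama chat checkpoint:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "What is TinyLlama?"},
]
# Render the conversation into the model's expected prompt string,
# ending with the cue for the assistant's reply.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True)
print(prompt)  # ready to tokenize and pass to model.generate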
Features :
Text Generation
Transformers
Safetensors