Jais
Overview
Jais is a pre-trained bilingual large language model with 13 billion parameters, supporting Arabic and English. It was trained on 72 billion Arabic tokens and 279 billion English and code tokens; the Arabic data was iterated over for 1.6 epochs versus 1 epoch for the English and code data, for a total of 395 billion training tokens. The model is based on a decoder-only Transformer architecture (GPT-3 style) with the SwiGLU activation function, and it uses ALiBi positional embeddings, which allow extrapolation to sequence lengths longer than those seen in training and improve long-context handling.
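As a rough illustration of the two architectural choices named above, here is a minimal PyTorch sketch of a SwiGLU feed-forward block and the ALiBi attention bias. The layer names and dimensions are illustrative assumptions, not Jais's actual implementation.

```python
# Minimal sketch (illustrative, not Jais's actual code) of the SwiGLU
# feed-forward block and the ALiBi attention bias described above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLU(nn.Module):
    """Feed-forward block: silu(x @ W1) * (x @ W2), projected back by W3."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.w1 = nn.Linear(d_model, d_ff, bias=False)  # gate projection
        self.w2 = nn.Linear(d_model, d_ff, bias=False)  # value projection
        self.w3 = nn.Linear(d_ff, d_model, bias=False)  # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w3(F.silu(self.w1(x)) * self.w2(x))

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    """Per-head linear distance penalties added to attention scores.

    ALiBi replaces learned position embeddings with a fixed bias of
    -slope * distance, so it extends to sequence lengths beyond those
    seen in training. Slopes follow the geometric sequence 2^(-8h/H).
    """
    slopes = torch.tensor([2.0 ** (-8.0 * (h + 1) / n_heads) for h in range(n_heads)])
    pos = torch.arange(seq_len)
    # distance[i, j] = j - i is <= 0 for keys at or before the query;
    # positions above the diagonal are handled by the separate causal mask.
    distance = (pos[None, :] - pos[:, None]).tril()
    return slopes[:, None, None] * distance[None, :, :]  # (heads, seq, seq)
```

Because the bias is a deterministic function of token distance rather than a learned table, the same slopes apply at any sequence length, which is what enables the extrapolation mentioned above.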
Target Users
Research purposes
Commercial use, such as chat assistants and customer service
Total Visits: 29.7M
Top Region: US (17.94%)
Website Views: 43.6K
Use Cases
Used as a foundational model for Arabic natural language processing research
Developing applications with integrated Arabic-language capabilities (see the loading sketch after this list)
Fine-tuning for downstream tasks such as chat assistants
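A hedged sketch of loading a Jais checkpoint for generation with Hugging Face transformers follows. The hub id inception-mbzuai/jais-13b and the trust_remote_code flag are assumptions based on how custom-architecture checkpoints are typically published; verify both against the model card.

```python
# Hedged usage sketch: loading a Jais checkpoint for text generation.
# The hub id below is an assumption; check it against the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inception-mbzuai/jais-13b"  # assumed Hugging Face hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
# trust_remote_code is assumed necessary because Jais ships a custom
# (ALiBi/SwiGLU) architecture rather than a stock transformers class.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "عاصمة دولة الإمارات العربية المتحدة هي"  # "The capital of the UAE is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```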
Features
Generative dialogue support in Arabic and English
Fine-tuning for specific downstream tasks (see the fine-tuning sketch after this list)
Context awareness capability
Support for long sequence generation
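For the fine-tuning feature above, here is a minimal parameter-efficient sketch using LoRA via the peft library. This is not Jais's documented recipe: the hub id and the target module name are assumptions to be checked against the actual checkpoint.

```python
# Hedged sketch of parameter-efficient fine-tuning (LoRA) for a downstream
# task such as a chat assistant. Not Jais's documented recipe; the hub id
# and target module name are assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "inception-mbzuai/jais-13b",  # assumed hub id
    trust_remote_code=True,
)
config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["c_attn"],  # assumed attention projection name; inspect the model
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # only the small LoRA adapters are trained
```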