contrastors
Overview:
contrastors is a contrastive learning toolkit that lets researchers and engineers train and evaluate contrastive models efficiently. Built on Flash Attention, it supports multi-GPU training and GradCache for large-batch training in memory-constrained environments. It integrates with Hugging Face for seamless loading of common models, and it also supports masked language modeling (MLM) pretraining and Matryoshka representation learning.
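The Hugging Face integration means embedding models trained with contrastors can be loaded straight from the Hub. A minimal sketch, assuming the nomic-ai/nomic-embed-text-v1 checkpoint and sentence-transformers >= 2.3; the trust_remote_code flag and the "search_document:" task prefix follow that model card and should be treated as assumptions for other checkpoints:

```python
# Minimal sketch: load a contrastors-trained embedding model from the
# Hugging Face Hub. Model name and task prefix are taken from the
# nomic-embed-text-v1 model card, not from the contrastors docs.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
embeddings = model.encode(
    ["search_document: Contrastive learning pulls paired texts together."]
)
print(embeddings.shape)  # e.g. (1, 768)
```

Matryoshka-trained variants additionally allow truncating these embeddings to smaller dimensions with modest quality loss.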
Target Users:
Researchers and engineers who want to efficiently train and evaluate contrastive learning models.
Use Cases
Training a BERT model from scratch on 8 GPUs using contrastors
Fine-tuning the nomic-bert-embed-v1-unsupervised model using contrastors
Generating your own dataset for contrastive learning with the contrastors scripts (see the data-format sketch after this list)
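Contrastive training consumes paired data. A minimal sketch of the general shape of such data, assuming a hypothetical JSONL layout with "query"/"document" keys; the contrastors scripts define their own formats:

```python
# Hypothetical JSONL layout for query/document pairs; illustrates the
# shape of contrastive training data, not contrastors' actual format.
import json

pairs = [
    {"query": "what is contrastive learning?",
     "document": "Contrastive learning trains encoders to pull paired texts together."},
    {"query": "why use GradCache?",
     "document": "GradCache decouples contrastive batch size from GPU memory."},
]

with open("pairs.jsonl", "w") as f:
    for pair in pairs:
        f.write(json.dumps(pair) + "\n")
```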
Features
Multi-GPU support
GradCache support for large-batch training (see the sketch after this list)
Support for masked language model pretraining
Hugging Face integration for loading common models (see the loading example in the Overview)
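GradCache makes large in-batch-negative losses feasible on limited memory by splitting the batch into chunks: embeddings are first computed without autograd, the full-batch contrastive loss is backpropagated only to those cached embeddings, and each chunk is then re-encoded with autograd to push the cached gradients through the encoder. A minimal PyTorch sketch of the technique (not contrastors' implementation; the encoder, chunk size, and InfoNCE loss here are illustrative assumptions):

```python
# GradCache-style training step (sketch). Assumes a deterministic
# encoder; the real technique also replays RNG state for dropout.
import torch
import torch.nn.functional as F

def info_nce(q, d, temperature=0.05):
    # In-batch negatives: document i is the positive for query i.
    logits = (F.normalize(q, dim=-1) @ F.normalize(d, dim=-1).T) / temperature
    return F.cross_entropy(logits, torch.arange(q.size(0), device=q.device))

def grad_cache_step(encoder, queries, docs, optimizer, chunk=8):
    # 1) Embed chunk by chunk without building an autograd graph.
    with torch.no_grad():
        q_emb = torch.cat([encoder(c) for c in queries.split(chunk)])
        d_emb = torch.cat([encoder(c) for c in docs.split(chunk)])
    # 2) Full-batch loss on the cached embeddings; collect d(loss)/d(emb).
    q_emb, d_emb = q_emb.requires_grad_(), d_emb.requires_grad_()
    loss = info_nce(q_emb, d_emb)
    loss.backward()
    # 3) Re-encode each chunk with autograd and push the cached
    #    gradients through the encoder parameters.
    optimizer.zero_grad()
    for inputs, grads in ((queries, q_emb.grad), (docs, d_emb.grad)):
        for c, g in zip(inputs.split(chunk), grads.split(chunk)):
            (encoder(c) * g).sum().backward()
    optimizer.step()
    return loss.item()

# Toy usage: a linear encoder over random features.
enc = torch.nn.Linear(32, 16)
opt = torch.optim.AdamW(enc.parameters(), lr=1e-4)
print(grad_cache_step(enc, torch.randn(64, 32), torch.randn(64, 32), opt))
```

The effective contrastive batch (here 64) never has to pass through the encoder with autograd all at once, so batch size is decoupled from activation memory.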