LongCite
Overview:
LongCite is an open-source project that trains large language models (LLMs) to generate accurate responses with precise sentence-level citations in long-context question-answering (QA) scenarios. Its significance lies in improving the accuracy and credibility of QA systems by allowing users to verify the sources of the generated output. LongCite supports context windows of up to 128K tokens and provides two models, LongCite-glm4-9b and LongCite-llama3.1-8b, trained on GLM-4-9B and Meta-Llama-3.1-8B, respectively.
Target Users:
The primary audience for LongCite is researchers and developers in natural language processing, particularly those building and optimizing question-answering systems for long-context settings. The technology helps them improve system accuracy and user trust.
Use Cases
Researchers use the LongCite model to automatically cite relevant studies in academic papers.
Developers integrate LongCite into their question-answering systems to enhance response quality and credibility.
Educational institutions utilize the LongCite model to teach students about academic citation practices.
Features
Supports long text question answering, generating accurate responses and precise sentence-level citations.
Provides two pre-trained models: LongCite-glm4-9b and LongCite-llama3.1-8b.
Supports context lengths of up to 128K tokens.
Offers environment setup guides and model deployment methods.
Provides CoF (Coarse to Fine), an automated pipeline for constructing long-context QA data with precise sentence-level citations.
Includes detailed guides for model training and evaluation.
Includes LongBench-Cite, an automated benchmark for measuring citation quality and answer correctness.
How to Use
1. Install the necessary software and libraries according to the environment setup guide.
2. Use the provided code examples to download and set up the LongCite model.
3. Prepare the long text context and queries.
4. Call the model's query_longcite function, passing in the context and queries (see the sketch after this list).
5. Obtain the answers and citations generated by the model.
6. Adjust model parameters as needed, such as input length and the number of new tokens.
7. Deploy the model to either a server or a local environment for practical application.
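Below is a minimal sketch of steps 2 through 5, assuming the model is loaded through the Hugging Face Transformers library. The model ID (THUDM/LongCite-glm4-9b), the keyword-argument names, and the structure of the returned result are assumptions for illustration; consult the official LongCite repository for the exact interface.

```python
# Minimal sketch: load LongCite-glm4-9b and ask a question over a long context.
# Assumed prerequisites: pip install transformers torch
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

MODEL_ID = "THUDM/LongCite-glm4-9b"  # assumed Hugging Face model ID

# Step 2: download and set up the model (custom code is loaded via trust_remote_code).
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

# Step 3: prepare the long text context and a query against it.
context = open("long_document.txt").read()
query = "What are the main findings of the report?"

# Step 4: call the model's query_longcite function; the parameter names below
# (tokenizer, max_input_length, max_new_tokens) are assumptions.
result = model.query_longcite(
    context,
    query,
    tokenizer=tokenizer,
    max_input_length=128_000,
    max_new_tokens=1024,
)

# Step 5: the result is expected to contain the answer together with its
# sentence-level citations into the context.
print(result)
```

The max_input_length and max_new_tokens values shown here correspond to the parameters mentioned in step 6 and can be adjusted to fit the available context window and the desired response length.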