

kelindar/search
Overview:
kelindar/search is a Go library for embedded vector search and semantic embeddings, built on llama.cpp. It is aimed at small to medium-sized projects that need solid semantic search without the complexity of a dedicated search system. The library works with BERT models in GGUF format, so you can use modern embedding models directly, and it offers GPU acceleration for fast computation on supported hardware. If your dataset contains fewer than roughly 100,000 entries, the library can be integrated directly into your Go application to provide semantic search.
Target Users:
The target audience is developers who need to integrate semantic search into their Go applications, particularly those working with small to medium-sized datasets who want to use BERT embeddings and optional GPU acceleration without running a separate search service.
Use Cases
Use this library to create a vector index for documents or articles, enabling quick retrieval.
In a recommendation system, use vector embeddings derived from user behavior to recommend similar items.
In natural language processing applications, generate semantic embeddings of text with a BERT model for text similarity analysis.
Features
Integration with llama.cpp without cgo: the library is written in plain Go and loads the prebuilt llama.cpp shared library directly, which simplifies integration, deployment, and cross-compilation.
BERT model support: Integrates BERT models via llama.cpp as long as they are in GGUF format (see the sketch after this list).
Precompiled binaries with Vulkan GPU support: Provides precompiled binaries for Windows and Linux, compiled with Vulkan for GPU acceleration.
Embedded search index: Supports creating search indices from computed embeddings that can be saved to disk for later use, suitable for basic vector searches in small-scale applications.
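To illustrate the model-loading and embedding features above, here is a minimal sketch of loading a GGUF BERT model and computing an embedding. The model file name is a placeholder, the second argument to `search.NewVectorizer` is assumed to control GPU offloading, and `EmbedText` is assumed to return a float32 slice; check the library's documentation for the exact signatures.

```go
package main

import (
	"fmt"

	"github.com/kelindar/search"
)

func main() {
	// Load a GGUF BERT model. The second argument is assumed to control
	// GPU offloading (0 = CPU only); consult the library documentation.
	m, err := search.NewVectorizer("minilm-l6-v2.q8_0.gguf", 0)
	if err != nil {
		panic(err)
	}
	defer m.Close()

	// Compute a semantic embedding for a piece of text
	// (assumed to return a []float32 and an error).
	vec, err := m.EmbedText("embedded vector search in Go")
	if err != nil {
		panic(err)
	}
	fmt.Println("embedding dimensions:", len(vec))
}
```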
How to Use
1. Install the library: Download the precompiled binaries for Windows or Linux, or compile from source.
2. Load the model: Initialize a model using the `search.NewVectorizer` function with a GGUF file.
3. Generate text embeddings: Use the `EmbedText` method to create vector embeddings for the provided text input.
4. Create an index and add vectors: Use `search.NewIndex` to create a new index, and use the `Add` method to add multiple vectors along with their corresponding labels.
5. Search the index: Execute a search using the `Search` method, passing in the embedding vector and the number of results to retrieve.
6. Print results: Iterate through the search results and print each result along with its relevance score; a combined sketch of these steps follows below.
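Putting the steps above together, a minimal end-to-end sketch might look like the following. The generic type parameter on `search.NewIndex`, the GPU argument to `search.NewVectorizer`, and the result field names `Value` and `Relevance` are assumptions for illustration and may differ from the library's actual API; the model path and example texts are placeholders.

```go
package main

import (
	"fmt"

	"github.com/kelindar/search"
)

func main() {
	// Step 2: load a GGUF BERT model (the second argument is assumed to
	// control GPU offloading; 0 keeps everything on the CPU).
	m, err := search.NewVectorizer("minilm-l6-v2.q8_0.gguf", 0)
	if err != nil {
		panic(err)
	}
	defer m.Close()

	// Steps 3-4: embed a few documents and add them to a new index
	// together with their labels.
	index := search.NewIndex[string]()
	for _, doc := range []string{
		"order a pizza with extra cheese",
		"book a flight to Tokyo",
		"reset my account password",
	} {
		vec, err := m.EmbedText(doc)
		if err != nil {
			panic(err)
		}
		index.Add(vec, doc)
	}

	// Step 5: embed the query and retrieve the two closest entries.
	query, err := m.EmbedText("I forgot my login credentials")
	if err != nil {
		panic(err)
	}
	results := index.Search(query, 2)

	// Step 6: print each result with its relevance score
	// (the Value and Relevance field names are assumptions).
	for _, r := range results {
		fmt.Printf("%s (%.2f)\n", r.Value, r.Relevance)
	}
}
```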
Featured AI Tools

PseudoEditor
PseudoEditor is a free online pseudocode editor. It features syntax highlighting and auto-completion, making it easier to write pseudocode, and a pseudocode compiler feature for testing your code. No download is required; you can start using it immediately.
Development & Tools

Coze
Coze is a next-generation AI chatbot building platform that enables the rapid creation, debugging, and optimization of AI chatbot applications. Users can quickly build bots without writing code and deploy them across multiple platforms. Coze also offers a rich set of plugins that extend what bots can do, allowing them to interact with data, turn ideas into bot skills, retain long-term memory, and proactively initiate conversations.
Development & Tools