Turn search results into answers.
Polymathy is a Rust web service that transforms traditional search into an answer engine. Instead of returning a list of links, it fetches content from search results, chunks it semantically, and returns the actual information you're looking for.
Think of it as the backend for building your own Perplexity-style search experience.
```
Query → SearxNG → URLs → Content Processor → Semantic Chunks → You
```
- Your query hits SearxNG (privacy-respecting metasearch)
- Polymathy grabs the top 10 result URLs
- Each URL is sent to a content processor that extracts text, chunks it, and generates embeddings
- You get back actual content snippets with their sources
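The first hop is just an HTTP GET against Polymathy. A minimal sketch of building that request URL by hand (the `/v1/search` path and port come from the examples below; the percent-encoder is a naive stand-in for illustration, not Polymathy code):

```rust
// Naive query-string encoder: keeps URL-safe characters, turns spaces
// into '+', and percent-encodes everything else. Good enough for ASCII
// queries; a real client would use a URL library.
fn encode_query(q: &str) -> String {
    q.chars()
        .map(|c| match c {
            'A'..='Z' | 'a'..='z' | '0'..='9' | '-' | '_' | '.' | '~' => c.to_string(),
            ' ' => "+".to_string(),
            other => format!("%{:02X}", other as u32),
        })
        .collect()
}

fn search_url(base: &str, query: &str) -> String {
    format!("{base}/v1/search?q={}", encode_query(query))
}

fn main() {
    let url = search_url("http://localhost:8080", "rust async patterns");
    assert_eq!(url, "http://localhost:8080/v1/search?q=rust+async+patterns");
    println!("{url}");
}
```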
```sh
git clone https://github.com/skelfresearch/polymathy.git
cd polymathy

# Configure your services
cp sample.env .env
# Edit .env with your SearxNG and processor URLs

cargo run --release
```

Hit the API:

```sh
curl "http://localhost:8080/v1/search?q=rust+async+patterns"
```

Response:

```json
{
  "0": ["https://blog.rust-lang.org/...", "Async functions in Rust return a Future..."],
  "1": ["https://tokio.rs/...", "Tokio provides an async runtime for Rust..."],
  "2": ["https://example.com/...", "When working with async Rust, you'll often..."]
}
```

- Rust 1.70+
- SearxNG instance - either self-hosted or a public instance
- Content processor - A service that handles chunking and embedding (you'll need to provide this)
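Since the content processor is yours to provide, here is a deliberately naive sketch of the chunking step it would perform. Real semantic chunking splits on meaning (sentence boundaries, topic shifts, embeddings); this placeholder just splits on blank lines and caps chunk length:

```rust
// Placeholder chunker: paragraphs become chunks, and oversized
// paragraphs are cut into max_chars-sized pieces on char boundaries
// (so multibyte text stays valid UTF-8). Not Polymathy code.
fn chunk_text(text: &str, max_chars: usize) -> Vec<String> {
    let mut chunks = Vec::new();
    for para in text.split("\n\n") {
        let para = para.trim();
        if para.is_empty() {
            continue;
        }
        let chars: Vec<char> = para.chars().collect();
        for piece in chars.chunks(max_chars) {
            chunks.push(piece.iter().collect());
        }
    }
    chunks
}

fn main() {
    let doc = "First paragraph.\n\nSecond paragraph, a bit longer.";
    let chunks = chunk_text(doc, 100);
    assert_eq!(chunks.len(), 2);
    assert_eq!(chunks[0], "First paragraph.");
}
```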
Create a `.env` file:

```sh
SEARXNG_URL=http://localhost:8888/search
PROCESSOR_URL=http://localhost:8081/v1/process
SERVER_HOST=127.0.0.1
SERVER_PORT=8080
```

| Variable | Required | Description |
|---|---|---|
| `SEARXNG_URL` | Yes | Your SearxNG search endpoint |
| `PROCESSOR_URL` | Yes | Content processor endpoint |
| `SERVER_HOST` | No | Defaults to `127.0.0.1` |
| `SERVER_PORT` | No | Defaults to `8080` |
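A minimal sketch of resolving these variables with the documented defaults. The `Config` struct and lookup-by-map shape are illustrative (the map stands in for the process environment), not Polymathy's actual config code:

```rust
use std::collections::HashMap;

// Illustrative config shape: two required URLs, two optional
// server settings with the defaults from the table above.
struct Config {
    searxng_url: String,
    processor_url: String,
    server_host: String,
    server_port: u16,
}

fn load_config(vars: &HashMap<String, String>) -> Result<Config, String> {
    Ok(Config {
        // Required: fail loudly if missing.
        searxng_url: vars
            .get("SEARXNG_URL")
            .cloned()
            .ok_or_else(|| "SEARXNG_URL is required".to_string())?,
        processor_url: vars
            .get("PROCESSOR_URL")
            .cloned()
            .ok_or_else(|| "PROCESSOR_URL is required".to_string())?,
        // Optional: fall back to the documented defaults.
        server_host: vars
            .get("SERVER_HOST")
            .cloned()
            .unwrap_or_else(|| "127.0.0.1".into()),
        server_port: vars
            .get("SERVER_PORT")
            .and_then(|p| p.parse().ok())
            .unwrap_or(8080),
    })
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("SEARXNG_URL".to_string(), "http://localhost:8888/search".to_string());
    vars.insert("PROCESSOR_URL".to_string(), "http://localhost:8081/v1/process".to_string());
    let cfg = load_config(&vars).expect("required vars present");
    assert_eq!(cfg.server_host, "127.0.0.1");
    assert_eq!(cfg.server_port, 8080);
    assert!(cfg.searxng_url.ends_with("/search"));
    let _ = cfg.processor_url;
}
```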
`GET /v1/search?q=...` returns a map of chunk IDs to `[source_url, content]` pairs.
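A sketch of consuming that map on the client side. The types are illustrative (a real client would deserialize the JSON, e.g. with Serde), and it assumes lower chunk IDs mean higher-ranked results; the point is that the IDs are strings and need a numeric sort:

```rust
use std::collections::HashMap;

// Orders chunks by their numeric ID. A plain string sort would put
// "10" before "2", so we parse the keys first. Hypothetical helper,
// not part of Polymathy's API.
fn ordered_chunks(resp: &HashMap<String, (String, String)>) -> Vec<(String, String)> {
    let mut ids: Vec<(usize, &String)> = resp
        .keys()
        .filter_map(|k| k.parse::<usize>().ok().map(|n| (n, k)))
        .collect();
    ids.sort_by_key(|&(n, _)| n);
    ids.into_iter().map(|(_, k)| resp[k].clone()).collect()
}

fn main() {
    let mut resp = HashMap::new();
    resp.insert(
        "1".to_string(),
        ("https://tokio.rs/...".to_string(), "Tokio provides an async runtime...".to_string()),
    );
    resp.insert(
        "0".to_string(),
        ("https://blog.rust-lang.org/...".to_string(), "Async functions...".to_string()),
    );
    let chunks = ordered_chunks(&resp);
    assert_eq!(chunks[0].0, "https://blog.rust-lang.org/...");
}
```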
Interactive API docs are served at:

- `/swagger` - Swagger UI
- `/redoc` - ReDoc
- `/rapidoc` - RapiDoc
- `/scalar` - Scalar
- `/openapi.json` - Raw spec
Add Polymathy as a dependency:

```toml
[dependencies]
polymathy = "0.2"
```

Or install the binary directly:

```sh
cargo install --git https://github.com/skelfresearch/polymathy
```

Then run the server from your own code:

```rust
use polymathy::run;

#[tokio::main]
async fn main() -> std::io::Result<()> {
    run().await
}
```

```
src/
├── lib.rs        # Public API exports
├── api.rs        # HTTP server, endpoints, request handling
├── search.rs     # Data structures (SearchQuery, ProcessedContent)
├── index.rs      # USearch vector index (384-dim, inner product)
└── bin/
    └── polymath.rs  # Binary entry point
```
Stack: Actix-web, Tokio, USearch, Reqwest, Serde
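`index.rs` configures USearch for 384-dimensional vectors compared by inner product. The metric itself is just a dot product; a from-scratch illustration of it (the real index uses SIMD and an approximate-nearest-neighbor graph, not a linear loop):

```rust
// Inner-product similarity between two equal-length vectors.
// With unit-length embeddings this equals cosine similarity,
// which is why it works as a relevance score.
fn inner_product(a: &[f32], b: &[f32]) -> f32 {
    assert_eq!(a.len(), b.len(), "vectors must share a dimension");
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    // Toy 3-dim unit vectors standing in for 384-dim embeddings.
    let a = [0.6f32, 0.8, 0.0];
    let b = [0.6f32, 0.8, 0.0];
    let c = [0.0f32, 0.0, 1.0];
    assert!((inner_product(&a, &b) - 1.0).abs() < 1e-6); // identical → 1.0
    assert!(inner_product(&a, &c).abs() < 1e-6); // orthogonal → 0.0
}
```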
Polymathy doesn't include:
- A content processor (you need to build or provide one)
- An embedding model (the processor handles this)
- A complete RAG solution (it's one piece of the puzzle)
It's infrastructure for building answer engines, not an out-of-the-box product.
Full documentation: https://docs.skelfresearch.com/polymathy
Or run locally:
```sh
cd documentation
pip install mkdocs-material
mkdocs serve
```

PRs welcome. Please run `cargo fmt` and `cargo clippy` before submitting.
GPL-3.0 - See LICENSE