Highly customizable async caching framework for Rust - from in-memory to distributed solutions, designed for high-performance applications
Updated Mar 5, 2026 · Rust
Elara DB is an easy-to-use, lightweight, persistent key-value store that can also serve as a fast in-memory cache. Manipulate data structures in memory, encrypt database files, and export data. 🎯
Scan-resistant, sharded Go cache. Admission-LFU/LRU/LFU/FIFO evictions with object pooling and optional embedded, p2p mesh cluster.
A high-performance, generic, thread-safe, zero-dependency, in-memory key-value cache
An in-memory cache with expiration and eviction policies.
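Several of the entries above describe caches with per-entry expiration. As a generic illustration of the idea (not code from any listed library — the `TTLCache` name and API are hypothetical), a minimal sketch looks like this:

```python
import time


class TTLCache:
    """Minimal in-memory cache with per-entry time-to-live expiration.

    Illustrative sketch only; not taken from any library listed above.
    """

    def __init__(self, default_ttl=60.0):
        self._store = {}  # key -> (value, expires_at)
        self._default_ttl = default_ttl

    def set(self, key, value, ttl=None):
        ttl = self._default_ttl if ttl is None else ttl
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Lazily evict the expired entry on read.
            del self._store[key]
            return default
        return value
```

Real implementations typically add background or amortized sweeping so expired entries do not linger until the next read.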
TCP in-memory data structure store
A thread-safe, network-accessible LRU cache server written in Go.
A simple in-memory caching solution for Node.js, with a flexible design where you can define your own cache policies.
PSR-6 & PSR-16 implementations of Null and in-memory caches for PHP ^7.3 & ^8.0
High-performance in-memory cache for Go with TTL, distributed invalidation, and zero dependencies.
A lightweight Swift library for caching Identifiable values with optional expiry, supporting both in-memory and file-backed storage.
A users CRUD REST API built with Python 3, Flask, and FastAPI
Easy ASGI/FastAPI endpoint rate limiter integration as middleware with Redis caching.
A tiny, zero-dependency, generic, sharded in-memory cache for Go with TTL and LRU eviction
Implements a caching system in ASP.NET Core Web API, making use of an in-memory cache on a single server
Redis distributed and in-memory caching for .NET 6.0 MVC apps
High-performance, thread-safe caching for Go with automatic function wrapping, TTL, Redis backend, and multiple eviction strategies (LRU/LFU/FIFO)
A high-performance, lightweight TTL (Time-To-Live) and LRU (Least Recently Used) based in-memory cache for both Node.js and browser environments. Built with TypeScript for type safety and excellent developer experience.
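LRU (least-recently-used) eviction, mentioned by several of the libraries above, bounds memory by discarding the entry that has gone longest without access once capacity is reached. A minimal generic sketch (the `LRUCache` name and API are hypothetical, not from any listed project):

```python
from collections import OrderedDict


class LRUCache:
    """Fixed-capacity cache that evicts the least-recently-used entry.

    Generic sketch of the eviction strategy, not code from any
    library listed above.
    """

    def __init__(self, capacity=128):
        self._data = OrderedDict()  # insertion order tracks recency
        self._capacity = capacity

    def get(self, key, default=None):
        if key not in self._data:
            return default
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self._capacity:
            self._data.popitem(last=False)  # drop least recently used
```

Production caches usually combine this recency tracking with TTLs and sharding so that hot keys stay resident while the lock on any one shard stays cheap.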
dalahash is an in-memory data store (partial Redis/memcached compatibility) built for high-throughput, low-latency serving on Linux.
High-performance caching engine for Python — sync & async, in-memory or Redis, with decorators, tag-based invalidation, encryption & metrics.
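Decorator-based caching, as described by the entry above, wraps a function so repeated calls with the same arguments return a stored result. A hedged sketch of the pattern (the `ttl_cached` name is hypothetical and not the API of any listed library):

```python
import functools
import time


def ttl_cached(ttl=60.0):
    """Memoize a function's results for `ttl` seconds, keyed by positional args.

    Generic illustration of decorator-based caching; hypothetical name.
    """

    def decorator(fn):
        cache = {}  # args tuple -> (result, expires_at)

        @functools.wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit is not None and now < hit[1]:
                return hit[0]  # fresh cached result
            result = fn(*args)
            cache[args] = (result, now + ttl)
            return result

        return wrapper

    return decorator
```

Libraries built on this pattern typically extend it with keyword-argument handling, async support, tag-based invalidation, and pluggable backends such as Redis.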