After spending way too much time, I went with a totally original, never-before-seen name for an in-memory cache: inmem. Creative, right?
Jokes aside, it all started as a side experiment and went through more refactor cycles than I care to admit. But now, it’s grown into something clean, fast, and actually useful. Introducing inmem: an embedded caching library written in pure Go, designed with simplicity and performance in mind. It supports eviction policies, sharding for concurrency, and transactions for atomic operations.
So, what is inmem?
It is a fast, embedded in-memory caching library written in Go. It's like a mini key-value store, living happily inside your app, keeping your data hot and your latency low.
💡 Why I built it
- I was bored
⚙️ Key Features
- Eviction Support: Keep memory usage in check with optimized TTL-based eviction, plus efficient LRU and LFU policies. ARC (Adaptive Replacement Cache) is next in line!
- Sharding: Out-of-the-box sharding spreads keys across multiple internal maps, so no single map (or its lock) becomes a bottleneck.
- Transactions: Atomic read/write/delete operations across multiple keys. No need to reinvent locking or worry about inconsistent state.
- Thread-safe: Go nuts with goroutines; its internal locking is shard-aware and handles the dirty work for you.
- Standalone eviction cache: Use the eviction cache (LRU or LFU) independently.
- Persistence: Periodically save cache data to disk and load it on startup.
🚀 Quick Start
Install it the usual way:
go get github.com/achu-1612/inmem
🧪 Usage
Basic Usage
A simple example showing how to store and retrieve a custom struct (User) from the cache. Don't forget to gob.Register
your types if you're storing structs!
package main

import (
	"context"
	"encoding/gob"
	"fmt"

	"github.com/achu-1612/inmem"
)

type User struct {
	Name string
	Age  int
}

func main() {
	// Register the struct type so gob can encode/decode it.
	gob.Register(User{})

	cache, err := inmem.New(context.Background(), inmem.Options{})
	if err != nil {
		panic(err)
	}

	// Store the value; the last argument is the TTL.
	cache.Set("key", User{Name: "Achu", Age: 25}, 0)

	value, found := cache.Get("key")
	if found {
		fmt.Println("Found value:", value)
	} else {
		fmt.Println("Value not found")
	}
}
Sharded Cache
Enable sharding to reduce lock contention and improve performance under concurrent access. You can configure the number of shards as needed.
cache, err := inmem.New(context.Background(), inmem.Options{
	Sharding:   true,
	ShardCount: 4,
})
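Sharding doesn't change the API at all; the same Set/Get calls from the basic example work, just with less lock contention. Here's a minimal sketch (assuming the usual fmt and sync imports) that hits the cache from several goroutines at once:

var wg sync.WaitGroup

for i := 0; i < 8; i++ {
	wg.Add(1)
	go func(i int) {
		defer wg.Done()

		key := fmt.Sprintf("key-%d", i)
		cache.Set(key, "value", 0)

		if v, ok := cache.Get(key); ok {
			fmt.Println(key, "=>", v)
		}
	}(i)
}

wg.Wait()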
Transactions
Enable transactional support to perform atomic operations (like multi-key updates/deletes) safely. Currently supports optimistic and atomic transactions.
cache, err := inmem.New(context.Background(), inmem.Options{
	TransactionType: inmem.TransactionTypeOptimistic,
})
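I'm not walking through the transaction API itself in this post, so treat the snippet below purely as an illustration of the idea (multi-key updates that either all land or all roll back). The Transaction method, the Tx type, and the callback shape are hypothetical placeholders here, not the library's confirmed signature; check the repo for the real API.

// Hypothetical sketch only; the actual inmem transaction API may differ.
err = cache.Transaction(func(tx *inmem.Tx) error {
	tx.Set("balance:alice", 75, 0)
	tx.Set("balance:bob", 125, 0)
	tx.Delete("pending:transfer")
	return nil // returning an error rolls the whole batch back
})
if err != nil {
	// With optimistic transactions, a conflicting concurrent write can surface here.
	fmt.Println("transaction failed:", err)
}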
Persistence
Enable persistence to periodically write cache data to disk. Useful if you want the cache to survive restarts.
cache, err := inmem.New(context.Background(), inmem.Options{
	Sync:           true,
	SyncInterval:   time.Minute,
	SyncFolderPath: "cache_data",
})
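To see it in action, write a few entries, let the cache sync to disk, and then point a fresh instance at the same folder after a restart. A rough sketch (assuming at least one sync interval elapsed before the original process exited):

cache.Set("greeting", "hello", 0)

// ... later, in a new process, pointing at the same folder:
restored, err := inmem.New(context.Background(), inmem.Options{
	Sync:           true,
	SyncInterval:   time.Minute,
	SyncFolderPath: "cache_data",
})
if err != nil {
	panic(err)
}

if v, ok := restored.Get("greeting"); ok {
	fmt.Println("survived the restart:", v)
}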
Eviction
Configure the cache to automatically evict entries based on policy (e.g., LRU) when the size limit is reached.
cache, err := inmem.New(context.Background(), inmem.Options{
	EvictionPolicy: eviction.PolicyLRU,
	MaxSize:        2,
})
if err != nil {
	panic(err)
}

cache.Set("key1", "value", 0)
cache.Set("key2", "value", 0)
cache.Get("key2") // touch key2 so key1 becomes the least recently used entry
cache.Set("key3", "value", 0) // key1 is evicted: under LRU, key2 was accessed more recently
Standalone Eviction Cache
You can also use the underlying eviction engine independently if all you need is a simple key-value store with LRU or LFU eviction.
cache := eviction.New(eviction.Options{
	Capacity: 1000,
	Policy:   eviction.PolicyLRU,
	// Policy: eviction.PolicyLFU,
})
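The snippet above only shows construction. Purely as an illustration, if you assume Set/Get-style methods in the same spirit as the main cache (hypothetical names here; double-check the repo for the exact API), usage would look something like:

// Hypothetical usage sketch; method names may differ in the eviction package.
cache.Set("user:1", "Achu")
cache.Set("user:2", "Guest")

if v, ok := cache.Get("user:1"); ok {
	fmt.Println("hot entry:", v)
}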
🎯 Final Thoughts
It all started as a simple experiment, but it’s grown into something I'm genuinely excited to share. Whether you're building a blazing-fast API, a CLI tool, or just need a clean way to manage in-memory data — I hope inmem
makes your life a bit easier (and faster).
If you give it a try and something breaks — that’s probably my fault.
If you try it and it works flawlessly — tell everyone it was intentional.
The project is open-source, and I’d love your feedback, suggestions, or contributions. Star it, fork it, break it, fix it — all welcome!
🙌 Thanks for Reading!
If you made it this far, you either really like caching or you're just incredibly patient. Either way, thanks for stopping by!
Have a good day.