
Meet Gocache

Gocache is a multi-store cache library written in Go. It brings you a lot of capabilities for caching data.

Supported features

Multiple cache stores: memory, Redis, or your own custom storage driver. The following features are supported:

✅ Chained caches: use multiple caches in priority order (e.g. an in-memory cache that falls back to a shared Redis cache).

✅ Loadable cache: lets you specify a callback function that automatically reloads data into the cache.

✅ Metrics cache that allows you to store metrics about cache usage (hit, miss, set success, set error...).

✅ Automatic marshalling/unmarshalling of cached values via a marshaller, handy for caching structs.

✅ Define default values in storage and override them when setting data.

✅ Cache invalidation via expiration time and/or using tags.

✅ Use of generics.
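The chained-cache idea from the list above can be pictured without the library: try each cache tier in priority order and, on a hit in a slower tier, backfill the faster ones. A minimal sketch (the `tier` type and `chainGet` function are ours for illustration, not part of Gocache):

```go
package main

import (
	"errors"
	"fmt"
)

// tier is a toy cache layer; a real setup would pair an in-memory
// store with a shared Redis store.
type tier struct {
	name string
	data map[string]string
}

func (t *tier) get(key string) (string, error) {
	if v, ok := t.data[key]; ok {
		return v, nil
	}
	return "", errors.New("miss")
}

// chainGet tries each tier in priority order and backfills earlier
// (faster) tiers when a later one hits.
func chainGet(tiers []*tier, key string) (string, error) {
	for i, t := range tiers {
		if v, err := t.get(key); err == nil {
			for _, earlier := range tiers[:i] { // backfill faster tiers
				earlier.data[key] = v
			}
			return v, nil
		}
	}
	return "", errors.New("miss in all tiers")
}

func main() {
	local := &tier{name: "memory", data: map[string]string{}}
	shared := &tier{name: "redis", data: map[string]string{"user:1": "alice"}}

	v, _ := chainGet([]*tier{local, shared}, "user:1")
	fmt.Println(v)                    // found in the shared tier
	fmt.Println(local.data["user:1"]) // now backfilled into the local tier
}
```

Gocache's Chain adapter works along these lines behind a single cache interface, so callers never see which tier answered.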

By default, Gocache supports the following cache drivers:

  1. Memory (bigcache) (allegro/bigcache).
  2. Memory (ristretto) (dgraph-io/ristretto).
  3. Memory (go-cache) (patrickmn/go-cache).
  4. Memcache (bradfitz/gomemcache).
  5. Redis (go-redis/redis).
  6. Freecache (coocood/freecache).
  7. Pegasus (apache/incubator-pegasus).

Why it was developed

The author's blog explains the motivation:

When I started implementing caching on my GraphQL Go project, it already had an in-memory cache that used a small library with a simple API, plus another in-memory cache library, with a different API, used to load data in batch mode. Both did the same thing: cache items. Later, we had another requirement: on top of this in-memory cache, we wanted to add a distributed caching layer using Redis, mainly to avoid new Kubernetes pods starting with an empty cache when deploying new versions of the application to production.

So the author thought it was time to have a unified API to manage multiple cache stores: in-memory, Redis, Memcache, or whatever you want.

How to use

Install

To start using the latest version of Gocache, run:

 go get github.com/eko/gocache/v3

To avoid any errors when trying to import the library, use the following import statement:

 import (
    "github.com/eko/gocache/v3/cache"
    "github.com/eko/gocache/v3/store"
)

If you encounter any errors, be sure to run go mod tidy to clean up your go.mod file.

Storage adapters

First, when you want to cache an item, you have to choose where to cache it: in memory? In a shared Redis or Memcache? Or possibly in another store. Currently, Gocache implements the following stores:

  1. BigCache: In-memory storage.
  2. Ristretto: another in-memory store provided by DGraph.
  3. Memcache: a Memcache store based on the bradfitz/gomemcache client library.
  4. Redis: a Redis store based on the go-redis/redis client library.

All of these stores implement a very simple API that follows this interface:

 type StoreInterface interface {
    Get(key interface{}) (interface{}, error)
    Set(key interface{}, value interface{}, options *Options) error
    Delete(key interface{}) error
    Invalidate(options InvalidateOptions) error
    Clear() error
    GetType() string
}

This interface represents all the operations you can perform on a store, and each operation calls the necessary methods in the client library. Each of these stores has a different configuration depending on the client library you want to use; for example, to initialize the Memcache store:

 store := store.NewMemcache(
    memcache.New("10.0.0.1:11211", "10.0.0.2:11211", "10.0.0.3:11212"),
    &store.Options{
        Expiration: 10*time.Second,
    },
)

Then, the initialized store must be passed to the cache object constructor.
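To make the interface concrete, here is a toy map-backed store satisfying the same method set. The `Options`/`InvalidateOptions` types below are simplified stand-ins for the library's `store.Options`/`store.InvalidateOptions`, and the tag bookkeeping is our own sketch of how tag-based invalidation could work:

```go
package main

import (
	"errors"
	"fmt"
)

// Simplified stand-ins for store.Options / store.InvalidateOptions.
type Options struct{ Tags []string }
type InvalidateOptions struct{ Tags []string }

// mapStore is a toy store implementation backed by plain maps.
type mapStore struct {
	items map[interface{}]interface{}
	tags  map[string][]interface{} // tag -> keys, for tag-based invalidation
}

func newMapStore() *mapStore {
	return &mapStore{
		items: map[interface{}]interface{}{},
		tags:  map[string][]interface{}{},
	}
}

func (s *mapStore) Get(key interface{}) (interface{}, error) {
	if v, ok := s.items[key]; ok {
		return v, nil
	}
	return nil, errors.New("value not found")
}

func (s *mapStore) Set(key, value interface{}, options *Options) error {
	s.items[key] = value
	if options != nil {
		for _, tag := range options.Tags { // remember keys per tag
			s.tags[tag] = append(s.tags[tag], key)
		}
	}
	return nil
}

func (s *mapStore) Delete(key interface{}) error {
	delete(s.items, key)
	return nil
}

// Invalidate removes every key recorded under the given tags.
func (s *mapStore) Invalidate(options InvalidateOptions) error {
	for _, tag := range options.Tags {
		for _, key := range s.tags[tag] {
			delete(s.items, key)
		}
		delete(s.tags, tag)
	}
	return nil
}

func (s *mapStore) Clear() error {
	s.items = map[interface{}]interface{}{}
	s.tags = map[string][]interface{}{}
	return nil
}

func (s *mapStore) GetType() string { return "map" }

func main() {
	s := newMapStore()
	s.Set("k", "v", &Options{Tags: []string{"books"}})
	v, _ := s.Get("k")
	fmt.Println(v) // v
	s.Invalidate(InvalidateOptions{Tags: []string{"books"}})
	_, err := s.Get("k")
	fmt.Println(err != nil) // true: invalidated by tag
}
```

This also shows the tag-based invalidation mentioned in the feature list: setting a value with tags records the key, and invalidating a tag removes every key set under it.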

Cache adapters

One cache interface to rule them all. The cache interface is exactly the same as the store interface because, basically, the cache performs its operations on the store:

 type CacheInterface interface {
    Get(key interface{}) (interface{}, error)
    Set(key, object interface{}, options *store.Options) error
    Delete(key interface{}) error
    Invalidate(options store.InvalidateOptions) error
    Clear() error
    GetType() string
}

Using this interface, I can perform all the necessary operations on cached items: set, get, delete, and invalidate data, and clear the whole cache, plus one more method (GetType) that tells me the current cache type, which is useful in some cases.

Starting from this interface, the cache types implemented are as follows:

Cache: allows manipulating basic cached data in a given store.

Chain: a special cache adapter that allows chaining multiple caches (for example, because you have an in-memory cache and a Redis cache).

Loadable: a special cache adapter that lets you specify a callback function that automatically reloads data into the cache when it expires or is invalidated.

Metric: a special cache adapter that stores metrics about cached data: the number of items set, fetched, and invalidated, successfully or not.

The beauty is that all of these caches implement the same interface and can wrap each other: a metrics cache can wrap a loadable cache, which can wrap a chained cache, which can wrap multiple caches.
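The loadable pattern in particular is easy to picture: on a miss, call the load callback, put the result back into the cache, and return it. A hand-rolled sketch (the `loadableCache` type is our illustration, not Gocache's implementation):

```go
package main

import "fmt"

// loadableCache wraps a simple cache map with a load callback
// that is invoked on a cache miss.
type loadableCache struct {
	data map[string]string
	load func(key string) (string, error)
}

func (c *loadableCache) Get(key string) (string, error) {
	if v, ok := c.data[key]; ok {
		return v, nil // hit: serve from cache
	}
	v, err := c.load(key) // miss: reload via the callback
	if err != nil {
		return "", err
	}
	c.data[key] = v // put the loaded value back into the cache
	return v, nil
}

func main() {
	calls := 0
	c := &loadableCache{
		data: map[string]string{},
		load: func(key string) (string, error) {
			calls++ // e.g. a database query or API call
			return "value-for-" + key, nil
		},
	}
	v1, _ := c.Get("a")
	v2, _ := c.Get("a") // second call is served from the cache
	fmt.Println(v1, v2, calls)
}
```

Because the wrapper exposes the same Get signature as a plain cache, callers cannot tell whether a value came from the cache or from the callback, which is what lets the adapters compose.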

Here is a simple Memcache example:

 memcacheStore := store.NewMemcache(
    memcache.New("10.0.0.1:11211", "10.0.0.2:11211", "10.0.0.3:11212"),
    &store.Options{
        Expiration: 10*time.Second,
    },
)

cacheManager := cache.New(memcacheStore)
err := cacheManager.Set("my-key", []byte("my-value"), &store.Options{
    Expiration: 15*time.Second, // Override default value of 10 seconds defined in the store
})
if err != nil {
    panic(err)
}

value, err := cacheManager.Get("my-key")

cacheManager.Delete("my-key")

cacheManager.Clear() 
// Clears the entire cache, in case you want to flush all cache

Now, suppose you want a chained cache with an in-memory Ristretto store and a distributed Redis store as a fallback, plus a marshaller and metrics:

 // Initialize Ristretto cache and Redis client
ristrettoCache, err := ristretto.NewCache(&ristretto.Config{NumCounters: 1000, MaxCost: 100, BufferItems: 64})
if err != nil {
    panic(err)
}

redisClient := redis.NewClient(&redis.Options{Addr: "127.0.0.1:6379"})

// Initialize stores
ristrettoStore := store.NewRistretto(ristrettoCache, nil)
redisStore := store.NewRedis(redisClient, &store.Options{Expiration: 5*time.Second})

// Initialize Prometheus metrics
promMetrics := metrics.NewPrometheus("my-amazing-app")

// Initialize chained cache
cacheManager := cache.NewMetric(promMetrics, cache.NewChain(
    cache.New(ristrettoStore),
    cache.New(redisStore),
))

// Initialize a marshaler
marshal := marshaler.New(cacheManager)

key := BookQuery{Slug: "my-test-amazing-book"}
value := Book{ID: 1, Name: "My test amazing book", Slug: "my-test-amazing-book"}

// Set the value in cache using given key
err = marshal.Set(key, value)
if err != nil {
    panic(err)
}

returnedValue, err := marshal.Get(key, new(Book))
if err != nil {
    panic(err)
}

// Then, do what you want with the value

We haven't talked about the Marshaler yet, but it's another Gocache feature: a service that automatically marshals/unmarshals your objects to/from your store.

This is useful both for struct keys and for stores other than in-memory ones, because the objects have to be converted to bytes.
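The idea behind the marshaller can be shown with the standard library: serialize the struct to bytes before Set, and decode the bytes back on Get. This json-based sketch (our own `byteStore`, `marshalSet`, and `marshalGet` names, not Gocache's API) keeps the example dependency-free:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type Book struct {
	ID   int
	Name string
	Slug string
}

// byteStore mimics a store that can only hold bytes
// (as Memcache or Redis effectively do).
type byteStore map[string][]byte

// marshalSet serializes a struct to bytes before storing it.
func marshalSet(s byteStore, key string, value interface{}) error {
	b, err := json.Marshal(value)
	if err != nil {
		return err
	}
	s[key] = b
	return nil
}

// marshalGet decodes the stored bytes back into the caller's struct.
func marshalGet(s byteStore, key string, out interface{}) error {
	b, ok := s[key]
	if !ok {
		return fmt.Errorf("key %q not found", key)
	}
	return json.Unmarshal(b, out)
}

func main() {
	store := byteStore{}
	marshalSet(store, "book:1", Book{ID: 1, Name: "My test amazing book", Slug: "my-test-amazing-book"})

	var got Book
	marshalGet(store, "book:1", &got)
	fmt.Println(got.Name) // My test amazing book
}
```

Gocache's marshaler wraps any cache in exactly this kind of encode-on-set / decode-on-get pair, so the underlying store only ever sees bytes.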

All of these features: a chained cache with in-memory and Redis stores, Prometheus metrics, and a marshaler, in about 20 lines of code.

Write your own cache or store

If you want to implement your own cache type, that's easy too. Here's a simple example that logs every action performed on the cache (not a good idea in production, but it makes a simple demo):

 package customcache

import (
    "log"

    "github.com/eko/gocache/cache"
    "github.com/eko/gocache/store"
)

const LoggableType = "loggable"

type LoggableCache struct {
    cache cache.CacheInterface
}

func NewLoggable(cache cache.CacheInterface) *LoggableCache {
    return &LoggableCache{
        cache: cache,
    }
}

func (c *LoggableCache) Get(key interface{}) (interface{}, error) {
    log.Print("Get some data...")
    return c.cache.Get(key)
}

func (c *LoggableCache) Set(key, object interface{}, options *store.Options) error {
    log.Print("Set some data...")
    return c.cache.Set(key, object, options)
}

func (c *LoggableCache) Delete(key interface{}) error {
    log.Print("Delete some data...")
    return c.cache.Delete(key)
}

func (c *LoggableCache) Invalidate(options store.InvalidateOptions) error {
    log.Print("Invalidate some data...")
    return c.cache.Invalidate(options)
}

func (c *LoggableCache) Clear() error {
    log.Print("Clear some data...")
    return c.cache.Clear()
}

func (c *LoggableCache) GetType() string {
    return LoggableType
}

Likewise, you can implement a custom store as well. If you think others could benefit from your cache or store implementation, please don't hesitate to open a pull request and contribute directly to the project, so we can discuss your ideas together and make the caching library more robust.

Testing

Generate the mocks used by the tests:

 go get github.com/golang/mock/mockgen
make mocks

Run the test suite with:

 make test # run unit test
make benchmark-store # run benchmark test

