First, the background. With the rollout of the unified platform, both the number of users and the data volume grew significantly. The application had already been scaled out to three service nodes for load balancing, but load testing showed that the Redis used for caching API responses had become a single-point bottleneck. We had previously split the caching into two Redis clusters (each a one-master, two-replica Sentinel setup): one for login session state, one for API data such as the user and organization dictionaries. Even with three service nodes, all Redis reads and writes still go through the master, and as the data volume grew, so did the size of the values cached under each key, which turned the master into the bottleneck. (The platform is a read-heavy, write-light middle platform, so API responses are all cached; and because it must serve consistent data to the business systems, we did not offload reads to the replicas.)

To raise concurrency and throughput and reduce the overall load, the approach was to extend Spring Cache with a Redis + Caffeine two-level cache and to control concurrent loading with a double-check pattern: a distributed lock (implemented with Redisson) plus a local lock (Caffeine's own per-key loading lock, which is essentially a ConcurrentHashMap). The implementation flow:

[figure: Redis + Caffeine two-level cache flow]

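The CacheLock used for cross-node coordination is an in-house abstraction and is not included in this post; only lock(String) is called in the cache code below. A minimal sketch of what a Redisson-backed implementation might look like (the class name, the unlock method, and the wait/lease times are assumptions, not the actual code):

import org.redisson.api.RLock;
import org.redisson.api.RedissonClient;

import java.util.concurrent.TimeUnit;

// The interface as used by the cache code below; only lock(String) appears there.
interface CacheLock {
    void lock(String cacheKey);
    void unlock(String cacheKey);
}

// Hypothetical Redisson-backed implementation: one distributed lock per cache key.
public class RedissonCacheLock implements CacheLock {

    private final RedissonClient redissonClient;

    public RedissonCacheLock(RedissonClient redissonClient) {
        this.redissonClient = redissonClient;
    }

    @Override
    public void lock(String cacheKey) {
        RLock lock = redissonClient.getLock("cache-lock:" + cacheKey);
        try {
            // Wait up to 3s for the lock and let it auto-expire after 10s so a crashed
            // node cannot hold the key forever (both values are assumptions).
            lock.tryLock(3, 10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    @Override
    public void unlock(String cacheKey) {
        RLock lock = redissonClient.getLock("cache-lock:" + cacheKey);
        if (lock.isHeldByCurrentThread()) {
            lock.unlock();
        }
    }
}
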
Profiling also showed that the biggest remaining costs were big keys, serialization time, and overhead in the gateway: the larger the API payload, the larger the loss, reaching roughly 30-40%. The gateway stays for now because it is part of the current microservice architecture and has not been replaced. After stripping unused fields and splitting the big keys, the end result was about a 50% increase in throughput and an average response time cut roughly in half.

The core of the change is a custom CacheManager and Cache. The idea, visible in lookup() and multiSave() below, is that Redis stores only a small digest of the response under the base cache key, plus the full payload under a second key suffixed with that digest; each node keeps the full payload in its local Caffeine cache under the same digest key. A request therefore normally pays only for a small Redis read to validate its local copy, and pulls the full payload from Redis just once per content version. The implementation:

public class RedisCaffeineCacheManager extends AbstractTransactionSupportingCacheManager {

    private final RedisSerializer redisSerializer;

    private final RedisCacheWriter cacheWriter;

    private final CacheLock cacheLock;

    private final DefaultRedisScript script;

    private final Map<String, RedisCacheConfiguration> initialCacheConfiguration = new LinkedHashMap<>();

    public RedisCaffeineCacheManager(List<String> initialCacheNames, RedisCacheWriter cacheWriter, RedisCacheConfiguration initialCacheConfiguration, RedisSerializer redisSerializer, CacheLock cacheLock, DefaultRedisScript defaultRedisScript) {
        this.cacheWriter = cacheWriter;
        this.redisSerializer = redisSerializer;
        this.cacheLock = cacheLock;
        for (String cacheName : initialCacheNames) {
            this.initialCacheConfiguration.put(cacheName, initialCacheConfiguration);
        }
        this.script = defaultRedisScript;
    }

    @Override
    protected Collection<? extends Cache> loadCaches() {
        List<Cache> list = new ArrayList<>();
        initialCacheConfiguration.forEach((cacheName, redisCacheConfiguration) -> {
            RedisCaffeineCache cache = new RedisCaffeineCache(cacheName, cacheLock, cacheWriter, redisCacheConfiguration, redisCacheConfiguration.getConversionService(), redisSerializer, script);
            list.add(cache);
        });
        return list;
    }

}

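Before the Cache class itself, a note on wiring: the manager above has to be registered as the application's CacheManager. None of this configuration appears in the post, so the following is only a sketch built on assumptions: Redisson is on the classpath, CustomDefaultRedisCacheWriter (not shown here) has a constructor taking a RedisConnectionFactory, RedissonCacheLock is the hypothetical lock sketched earlier, and the Lua source is inferred from the KEYS/ARGV layout in multiSave() further below (KEYS[1] = base cache key, KEYS[2] = digest-suffixed key, ARGV[1] = TTL in seconds, ARGV[2] = serialized digest Result, ARGV[3] = serialized payload). Cache names, TTL and serializer are placeholders.

import java.time.Duration;
import java.util.Arrays;

import org.redisson.api.RedissonClient;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.cache.RedisCacheConfiguration;
import org.springframework.data.redis.cache.RedisCacheWriter;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.core.script.DefaultRedisScript;
import org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer;
import org.springframework.data.redis.serializer.RedisSerializationContext;
import org.springframework.data.redis.serializer.RedisSerializer;

@Configuration
@EnableCaching
public class TwoLevelCacheConfig {

    // Hypothetical script content, inferred from multiSave(): store the digest under the
    // base key and the full payload under the digest key, both with the same TTL, atomically.
    private static final String SAVE_DIGEST_AND_DATA_LUA =
            "redis.call('SET', KEYS[1], ARGV[2], 'EX', ARGV[1]) " +
            "redis.call('SET', KEYS[2], ARGV[3], 'EX', ARGV[1]) " +
            "return 1";

    @Bean
    public CacheManager cacheManager(RedisConnectionFactory connectionFactory,
                                     RedissonClient redissonClient) {
        RedisSerializer<Object> valueSerializer = new GenericJackson2JsonRedisSerializer();

        RedisCacheConfiguration cacheConfig = RedisCacheConfiguration.defaultCacheConfig()
                .entryTtl(Duration.ofMinutes(10))
                .serializeValuesWith(RedisSerializationContext.SerializationPair.fromSerializer(valueSerializer));

        // Hypothetical constructor; the real CustomDefaultRedisCacheWriter is not shown in this post.
        RedisCacheWriter cacheWriter = new CustomDefaultRedisCacheWriter(connectionFactory);

        DefaultRedisScript<Long> script = new DefaultRedisScript<>(SAVE_DIGEST_AND_DATA_LUA, Long.class);

        CacheLock cacheLock = new RedissonCacheLock(redissonClient);

        return new RedisCaffeineCacheManager(
                Arrays.asList("userDict", "orgDict"),   // placeholder cache names
                cacheWriter, cacheConfig, valueSerializer, cacheLock, script);
    }
}

Callers do not need to change for this wiring; the standard @Cacheable/@CacheEvict annotations keep working against this manager (a usage sketch is at the end of the post).
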
public class RedisCaffeineCache extends AbstractValueAdaptingCache {

    private static final Logger logger = LoggerFactory.getLogger(RedisCaffeineCache.class);

    private final String cacheName;

    private final com.github.benmanes.caffeine.cache.LoadingCache<Object, Object> caffeineCache;

    private static final byte[] BINARY_NULL_VALUE = RedisSerializer.java().serialize(NullValue.INSTANCE);
    private final CustomDefaultRedisCacheWriter cacheWriter;
    private final RedisCacheConfiguration cacheConfig;
    private final ConversionService conversionService;

    private final RedisSerializer redisSerializer;

    private final DefaultRedisScript script;

    private final CacheLock cacheLock;

    public RedisCaffeineCache(String cacheName, CacheLock cacheLock, RedisCacheWriter cacheWriter, RedisCacheConfiguration cacheConfig, ConversionService conversionService, RedisSerializer redisSerializer, DefaultRedisScript script) {
        super(true);
        this.cacheName = cacheName;
        this.cacheWriter = (CustomDefaultRedisCacheWriter)cacheWriter;
        this.cacheConfig = cacheConfig;
        this.conversionService = conversionService;
        this.redisSerializer = redisSerializer;
        this.cacheLock = cacheLock;
        this.script = script;
        this.caffeineCache = Caffeine.newBuilder()
                .maximumSize(5000)
                .expireAfterWrite(Expiration.from(cacheConfig.getTtl().toMillis(), TimeUnit.MILLISECONDS).getConverted(TimeUnit.SECONDS), TimeUnit.SECONDS)
                .build((key)->{
                    //logger.error("线程安全加载缓存" + key); //这里会加锁,同时只有一个请求会进来,底层是ConcurrentHashMap
                    return deserializeCacheValue(cacheWriter.get(cacheName, serializeCacheKey(key.toString())));
                });
    }

    @Override
    protected Object lookup(Object key) {
        String cacheKey = createCacheKey(key);
        CacheKeyContextHolder.setCacheKey(cacheKey);
        byte[] andConvertCacheKey = serializeCacheKey(cacheKey);
        byte[] value = cacheWriter.get(cacheName, andConvertCacheKey);
        if (value == null) {
            // Nothing in Redis yet: take the distributed lock so that, across all nodes,
            // only one request rebuilds this key. (The lock is not released in this method;
            // the CacheLock implementation is expected to release it elsewhere or via a lease timeout.)
            cacheLock.lock(cacheKey);
            value = cacheWriter.get(cacheName, andConvertCacheKey); // re-read after locking: double check
            if (value == null) {
                return null;
            }
        }
        // The base Redis key only holds a digest of the payload; use it to build the digest key
        // and read the full payload from the local Caffeine cache (which loads it from Redis
        // once per content version).
        Result digestResult = (Result) deserializeCacheValue(value);
        Object data = digestResult.getData();
        String digestCacheKey = createDigestCacheKey(key, data.toString());
        return caffeineCache.get(digestCacheKey);
    }

    @Override
    public String getName() {
        return cacheName;
    }

    @Override
    public Object getNativeCache() {
        return this;
    }

    @Override
    public <T> T get(Object key, Callable<T> valueLoader) {
        logger.error("RedisCaffeineCache.get该方法未实现");
        return null;
    }

    @Override
    public void put(Object key, Object value) {
        Object cacheValue = preProcessCacheValue(value);
        if (!isAllowNullValues() && cacheValue == null) {
            throw new IllegalArgumentException(String.format(
                    "Cache '%s' does not allow 'null' values. Avoid storing null via '@Cacheable(unless=\"#result == null\")' or configure RedisCache to allow 'null' via RedisCacheConfiguration.",
                    cacheName));
        }

        cacheWriter.saveDigestAndData(cacheName, connection -> multiSave(key, connection, cacheValue, script));
    }

    @Override
    public ValueWrapper putIfAbsent(Object key, @Nullable Object value) {
        Object cacheValue = preProcessCacheValue(value);
        if (!isAllowNullValues() && cacheValue == null) {
            return get(key);
        }
        Object result = cacheWriter.saveIfAbsentDigestAndData(cacheName, connection -> multiSave(key, connection, cacheValue, script));
        if(result == null) {
            return null;
        }
        return new SimpleValueWrapper(fromStoreValue(deserializeCacheValue((byte[]) result)));
    }

    private Object multiSave(Object key, RedisConnection connection, Object cacheValue, DefaultRedisScript script) {
        List<Object> keys = new ArrayList<>();
        keys.add(createAndConvertCacheKey(key));
        byte[] valueBytes = serializeCacheValue(cacheValue);
        String digest; // MD5 digest computed from the serialized cache value
        try {
            digest = MD5Utils.md5Hex(valueBytes);
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e);
        }
        keys.add(createAndConvertDigestCacheKey(key, digest));
        Result digestResult = ResultBuilder.create().data(digest).build();
        byte[] digestResultBytes = serializeCacheValue(digestResult);
        Object[] args = new Object[]{Expiration.from(cacheConfig.getTtl().toMillis(), TimeUnit.MILLISECONDS).getConverted(TimeUnit.SECONDS),
                digestResultBytes, valueBytes};
        final byte[][] keysAndArgs = keysAndArgs(keys, args);
        final int keySize = keys.size();
        Object result = null;
        try {
            result = connection.evalSha(script.getSha1(), ReturnType.fromJavaType(script.getResultType()), keySize,  keysAndArgs);
        } catch (Exception e) {
            if (!exceptionContainsNoScriptError(e)) {
                throw e instanceof RuntimeException ? (RuntimeException) e : new RedisSystemException(e.getMessage(), e);
            }
            result = connection.eval(script.getScriptAsString().getBytes(StandardCharsets.UTF_8), ReturnType.fromJavaType(script.getResultType()), keySize, keysAndArgs);
        }
        return result;
    }

    @Override
    public void evict(Object key) {
        cacheWriter.remove(cacheName, serializeCacheKey(createCacheKey(key) + "_*"));
    }

    @Override
    public void clear() {
        byte[] pattern = conversionService.convert(createCacheKey("*"), byte[].class);
        cacheWriter.clean(cacheName, pattern);
    }


    static boolean exceptionContainsNoScriptError(Throwable e) {
        if (!(e instanceof NonTransientDataAccessException)) {
            return false;
        }
        Throwable current = e;
        while (current != null) {
            String exMessage = current.getMessage();
            if (exMessage != null && exMessage.contains("NOSCRIPT")) {
                return true;
            }
            current = current.getCause();
        }
        return false;
    }

    protected byte[][] keysAndArgs(List<Object> keys, Object[] args) {
        final int keySize = keys != null ? keys.size() : 0;
        byte[][] keysAndArgs = new byte[args.length + keySize][];
        int i = 0;
        if (keys != null) {
            for (Object key : keys) {
                if (cacheConfig.getKeySerializationPair() == null || key instanceof byte[]) {
                    keysAndArgs[i++] = (byte[]) key;
                } else {
                    keysAndArgs[i++] = serializeCacheKey(key.toString());
                }
            }
        }
        for (Object arg : args) {
            if (cacheConfig.getValueSerializationPair() == null || arg instanceof byte[]) {
                keysAndArgs[i++] = (byte[]) arg;
            } else {
                keysAndArgs[i++] = serializeCacheValue(arg);
            }
        }
        return keysAndArgs;
    }

    public CacheStatistics getStatistics() {
        return cacheWriter.getCacheStatistics(getName());
    }

    public void clearStatistics() {
        cacheWriter.clearStatistics(getName());
    }

    public RedisCacheConfiguration getCacheConfiguration() {
        return cacheConfig;
    }

    @Nullable
    public Object preProcessCacheValue(@Nullable Object value) {
        if (value != null) {
            return value;
        }
        return isAllowNullValues() ? NullValue.INSTANCE : null;
    }

    protected byte[] serializeCacheKey(String cacheKey) {
        return ByteUtils.getBytes(cacheConfig.getKeySerializationPair().write(cacheKey));
    }

    protected byte[] serializeCacheValue(Object value) {
        if (isAllowNullValues() && value instanceof NullValue) {
            return BINARY_NULL_VALUE;
        }
        return ByteUtils.getBytes(cacheConfig.getValueSerializationPair().write(value));
    }

    @Nullable
    protected Object deserializeCacheValue(byte[] value) {
        if (isAllowNullValues() && ObjectUtils.nullSafeEquals(value, BINARY_NULL_VALUE)) {
            return NullValue.INSTANCE;
        }
        if(value == null) {
            return null;
        }
        return cacheConfig.getValueSerializationPair().read(ByteBuffer.wrap(value));
    }

    protected String createCacheKey(Object key) {
        String convertedKey = convertKey(key);
        if (!cacheConfig.usePrefix()) {
            return convertedKey;
        }
        return prefixCacheKey(convertedKey);
    }

    public String createDigestCacheKey(Object key, String digest) {
        return createCacheKey(key) + "_" + digest; // base cache key suffixed with the content digest
    }

    protected String convertKey(Object key) {
        if (key instanceof String) {
            return (String) key;
        }
        TypeDescriptor source = TypeDescriptor.forObject(key);
        if (conversionService.canConvert(source, TypeDescriptor.valueOf(String.class))) {
            try {
                return conversionService.convert(key, String.class);
            } catch (ConversionFailedException e) {
                // may fail if the given key is a collection
                if (isCollectionLikeOrMap(source)) {
                    return convertCollectionLikeOrMapKey(key, source);
                }
                throw e;
            }
        }
        Method toString = ReflectionUtils.findMethod(key.getClass(), "toString");
        if (toString != null && !Object.class.equals(toString.getDeclaringClass())) {
            return key.toString();
        }
        throw new IllegalStateException(String.format(
                "Cannot convert cache key %s to String. Please register a suitable Converter via 'RedisCacheConfiguration.configureKeyConverters(...)' or override '%s.toString()'.",
                source, key.getClass().getSimpleName()));
    }

    private String convertCollectionLikeOrMapKey(Object key, TypeDescriptor source) {
        if (source.isMap()) {
            StringBuilder target = new StringBuilder("{");
            for (Map.Entry<?, ?> entry : ((Map<?, ?>) key).entrySet()) {
                target.append(convertKey(entry.getKey())).append("=").append(convertKey(entry.getValue()));
            }
            target.append("}");
            return target.toString();
        } else if (source.isCollection() || source.isArray()) {
            StringJoiner sj = new StringJoiner(",");
            Collection<?> collection = source.isCollection() ? (Collection<?>) key
                    : Arrays.asList(ObjectUtils.toObjectArray(key));
            for (Object val : collection) {
                sj.add(convertKey(val));
            }
            return "[" + sj.toString() + "]";
        }
        throw new IllegalArgumentException(String.format("Cannot convert cache key %s to String.", key));
    }

    private boolean isCollectionLikeOrMap(TypeDescriptor source) {
        return source.isArray() || source.isCollection() || source.isMap();
    }

    private byte[] createAndConvertCacheKey(Object key) {
        return serializeCacheKey(createCacheKey(key));
    }

    private byte[] createAndConvertDigestCacheKey(Object key, String digest) {
        return serializeCacheKey(createDigestCacheKey(key, digest));
    }

    private String prefixCacheKey(String key) {
        // allow contextual cache names by computing the key prefix on every call.
        return cacheConfig.getKeyPrefixFor(cacheName) + key;
    }

}
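
MD5Utils, called from multiSave() above, is an in-house helper that is not shown either; md5Hex presumably hashes the serialized bytes with MD5 and hex-encodes the result, roughly:

import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical sketch of the MD5Utils.md5Hex helper used in multiSave().
public final class MD5Utils {

    private MD5Utils() {
    }

    public static String md5Hex(byte[] input) throws NoSuchAlgorithmException {
        byte[] hash = MessageDigest.getInstance("MD5").digest(input);
        StringBuilder hex = new StringBuilder(hash.length * 2);
        for (byte b : hash) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }
}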

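With the manager registered, callers do not change: the standard Spring Cache annotations work against the two-level cache as before. A placeholder example (the service, cache name and key are made up for illustration):

import java.util.Collections;
import java.util.Map;

import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class OrgDictService {

    // The return value is cached through the two-level RedisCaffeineCache described above.
    @Cacheable(cacheNames = "orgDict", key = "#tenantId")
    public Map<String, Object> loadOrgDict(String tenantId) {
        // ... expensive query against the database or upstream service (placeholder)
        return Collections.emptyMap();
    }
}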