Work Log 35: API data cleaning and rate limiting

The API calls a third-party service to perform asynchronous data cleaning. To avoid overwhelming that service, we add Guava's rate-limiting strategy (RateLimiter) to the interface.


import com.google.common.util.concurrent.RateLimiter;
import lombok.extern.slf4j.Slf4j;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.concurrent.TimeUnit;

/**
 * @author 寒夜
 */
@RestController
@Slf4j
public class Controller {

    /** Allows 10 permits (requests) per second in total. */
    private final RateLimiter limiter = RateLimiter.create(10);

    /**
     * Non-blocking rate limiting.
     *
     * @param count number of permits (tokens) to consume for this request
     * @return success/fail
     */
    @GetMapping("/tryAcquire")
    public String tryAcquire(Integer count) {
        if (limiter.tryAcquire(count)) {
            log.info("success, rate is {}", limiter.getRate());
            return "success";
        } else {
            log.info("fail, rate is {}", limiter.getRate());
            return "fail";
        }
    }

    /**
     * Non-blocking rate limiting with a timeout.
     *
     * @param count   number of permits (tokens) to consume for this request
     * @param timeout maximum time to wait for the permits, in seconds
     * @return success/fail
     */
    @GetMapping("/tryAcquireWithTimeOut")
    public String tryAcquireWithTimeOut(Integer count, Integer timeout) {
        if (limiter.tryAcquire(count, timeout, TimeUnit.SECONDS)) {
            log.info("success, rate is {}", limiter.getRate());
            return "success";
        } else {
            log.info("fail, rate is {}", limiter.getRate());
            return "fail";
        }
    }

    /**
     * Blocking (synchronous) rate limiting: waits until the permits are available.
     *
     * @param count number of permits (tokens) to consume for this request
     * @return success
     */
    @GetMapping("/acquire")
    public String acquire(Integer count) {
        limiter.acquire(count);
        log.info("success, rate is {}", limiter.getRate());
        return "success";
    }
}

RateLimiter is safe to use in concurrent scenarios and will limit the total rate of all threads, but it does not guarantee fairness.
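A minimal sketch of that claim (not from the original post; the class name, pool size, and rate of 5 permits/second are illustrative): several worker threads share one RateLimiter, and their combined throughput stays around the configured rate.

import com.google.common.util.concurrent.RateLimiter;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SharedRateLimiterDemo {

    // One limiter shared by all threads: caps the TOTAL rate at 5 permits/second
    private static final RateLimiter LIMITER = RateLimiter.create(5);

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        long start = System.currentTimeMillis();

        for (int i = 0; i < 20; i++) {
            final int taskId = i;
            pool.submit(() -> {
                LIMITER.acquire(); // blocks until a permit is available
                System.out.printf("task %d acquired at %d ms%n",
                        taskId, System.currentTimeMillis() - start);
            });
        }

        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        // 20 tasks at 5 permits/second take roughly 4 seconds in total,
        // no matter how many threads call acquire() concurrently
    }
}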

The code above is single-machine rate limiting. For distributed rate limiting with Redisson's RRateLimiter, see: https://blog.csdn.net/truelove12358/article/details/127751211

We usually use Semaphore, Guava's RateLimiter, and similar tools to limit concurrency or rate on a single machine. However, a single-machine limit cannot express a threshold for the service as a whole, for example an overall cap on access to a shared DB or a limit on a batch of related interfaces. Distributed rate limiting enforces an overall limit across the whole service cluster, and it avoids the fine-grained per-node tuning that uneven traffic distribution within the cluster would otherwise require, which makes it a good strategy for these cases.
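A hedged sketch of the distributed approach with Redisson's RRateLimiter mentioned above (not part of the original post; the Redis address, limiter name, and rate of 100 permits/second are illustrative). RateType.OVERALL makes the budget shared by every service instance that uses the same limiter key in Redis.

import org.redisson.Redisson;
import org.redisson.api.RRateLimiter;
import org.redisson.api.RateIntervalUnit;
import org.redisson.api.RateType;
import org.redisson.api.RedissonClient;
import org.redisson.config.Config;

public class DistributedRateLimiterDemo {

    public static void main(String[] args) {
        // Illustrative Redis address; replace with your own
        Config config = new Config();
        config.useSingleServer().setAddress("redis://127.0.0.1:6379");
        RedissonClient redisson = Redisson.create(config);

        // Every instance that fetches "dataCleanLimiter" shares one token bucket in Redis
        RRateLimiter limiter = redisson.getRateLimiter("dataCleanLimiter");
        // RateType.OVERALL: 100 permits per second for the whole cluster
        limiter.trySetRate(RateType.OVERALL, 100, 1, RateIntervalUnit.SECONDS);

        if (limiter.tryAcquire(1)) {
            // permit granted: call the third-party data-cleaning service here
            System.out.println("permit acquired, calling downstream service");
        } else {
            System.out.println("rate limit exceeded, rejecting request");
        }

        redisson.shutdown();
    }
}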

Single-machine rate limiting:

Advantages:

1. Simple to implement: the limiter runs on a single server and needs no complex architecture or configuration.
2. Low latency: requests are handled on one server, so the limiter adds no network-communication delay.
3. Lower cost: compared with distributed rate limiting, no additional hardware or network infrastructure is required.

Disadvantages:

1. Single point of failure: if the limiter is deployed on only one server, the whole system loses rate limiting once that server fails or becomes unavailable.
2. Limited scalability: a single machine struggles to cope with bursts of large-scale traffic and cannot provide high throughput or scalability.

Distributed rate limiting:

Advantages:

1. High availability: the limiting state is shared across multiple servers, so even if one server becomes unavailable the others can still enforce the limit and keep the system available.
2. Horizontal scalability: distributed rate limiting scales better and can handle larger request volumes by adding servers.
3. Flexibility: the limit can be adjusted dynamically according to the load on different servers, adapting better to changes in the system.

Disadvantages:

1. Complexity: distributed rate limiting requires designing and implementing a more complex architecture, including load balancing, messaging, and coordination mechanisms.
2. Higher latency: because multiple servers must communicate, distributed rate limiting may introduce additional latency, especially under large-scale request volumes.


Origin: blog.csdn.net/u013553309/article/details/131975402