LinkedIn uses Protobuf for microservice development, reducing network latency by up to 60%


LinkedIn adopted Protobuf for more efficient data transfer between the microservices across its platform and integrated it with its open-source framework Rest.li. After the company-wide rollout was complete, LinkedIn had reduced latency by up to 60% while improving resource utilization.

The LinkedIn platform is built on a microservice architecture, and for many years JSON was the serialization format used across the more than 50,000 API endpoints exposed by its microservices. To help teams build consistent interactions between services, LinkedIn created and open-sourced a Java framework called Rest.li.

The framework can be used to build servers and clients that communicate in a REST style, abstracting away many aspects of data exchange such as networking, serialization, and service discovery. Rest.li primarily supports Java and Python, but also works with languages like Scala, Kotlin, JavaScript, Go, and more.


Rest.li's default serialization format is JSON. It is supported in virtually every language and is human-readable, which brings real benefits, but it also causes significant performance problems, especially around latency.

LinkedIn engineers Karthik Ramgopal and Aman Gupta described the challenges of using JSON for service-to-service communication:

The first challenge is that JSON is often too verbose as a text format, which leads to undesirable results in increased network bandwidth usage and latency. (…) The second challenge we faced was that the textual nature of JSON resulted in suboptimal latency and throughput for serialization and deserialization.

The LinkedIn team looked for an alternative to JSON, one with a compact payload and efficient serialization that would reduce latency and increase throughput. They also wanted a solution that would not limit the supported language stacks and that could be rolled out gradually by integrating the new serialization mechanism into Rest.li. After weighing the options, LinkedIn chose Protobuf, which scored highest across all of these criteria.
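The payload-size difference between a textual and a binary format can be seen with a minimal sketch. The record and the hand-rolled binary layout below are illustrations only: the layout mimics Protobuf's spirit (no field names on the wire, fixed-width integers, length-prefixed strings) but is not the real Protobuf wire format.

```python
import json
import struct

# A small member record such as two services might exchange (assumed data).
record = {"id": 184467, "name": "Alice", "connections": 912}

# JSON repeats field names and punctuation in every single message.
json_bytes = json.dumps(record, separators=(",", ":")).encode("utf-8")

# A hand-rolled binary layout (illustrative only, not the real Protobuf
# wire format): fixed-width integers plus a length-prefixed UTF-8 string.
name_bytes = record["name"].encode("utf-8")
binary_bytes = struct.pack(
    f"<IB{len(name_bytes)}sI",
    record["id"], len(name_bytes), name_bytes, record["connections"],
)

print(len(json_bytes), len(binary_bytes))  # the binary form is far smaller
```

Beyond size, a binary format also avoids the cost of parsing and generating text, which is the second challenge the engineers describe above.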

The main difficulty in integrating Protobuf into Rest.li lay in supporting PDL, the framework's custom schema definition language, whose Protobuf schemas must be generated dynamically. The solution relies on symbol tables used to generate Protobuf schema definitions on the fly, but how a symbol table is delivered depends on the type of client: backend clients fetch and cache symbol tables on demand, while for web and mobile applications the symbol table is generated at build time and versioned as a dependency.
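The symbol-table idea can be sketched in a few lines. Everything here is a hypothetical illustration, not LinkedIn's actual tables or APIs: field names from a dynamically generated schema map to small integer ids, so payloads can reference fields by number the way Protobuf does, as long as both sides share the same table.

```python
# Assumed table contents, for illustration only.
SYMBOL_TABLE = {"id": 1, "name": 2, "connections": 3}
REVERSE_TABLE = {v: k for k, v in SYMBOL_TABLE.items()}

def encode(record):
    # Replace every field name with its integer id from the shared table,
    # so the names themselves never travel on the wire.
    return {SYMBOL_TABLE[key]: value for key, value in record.items()}

def decode(payload):
    # A client holding the same table (fetched and cached on demand, or
    # baked in at build time) restores the original field names.
    return {REVERSE_TABLE[key]: value for key, value in payload.items()}

message = {"id": 7, "name": "Alice"}
assert decode(encode(message)) == message
```

This is why table delivery matters: a client decoding with a stale or mismatched table would recover the wrong field names, hence the on-demand caching for backend clients and the versioned build-time tables for web and mobile.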


After making these changes to the framework, the LinkedIn team gradually migrated clients, using HTTP headers to switch the payload format from JSON to Protobuf. With Protobuf, response throughput increased by 6.25% on average and request throughput by 1.77% on average. For large payloads, the team also saw latency drop by as much as 60%.


Based on the experience gained from the Protobuf adoption, the LinkedIn team plans to migrate Rest.li to gRPC in the future. gRPC also uses Protobuf, adds support for streaming, and has a large community behind it.

Reference document: https://www.infoq.com/news/2023/07/linkedin-protocol-buffers-restli/


Origin: blog.csdn.net/xiangzhihong8/article/details/132532697