How to measure the time difference between servers?

Question:

Each server has its own local clock, and comparisons of local clocks across servers are meaningless.

For example: suppose the local time is ta=1 on server A and tb=2 on server B. Even though ta < tb, we cannot conclude that the event at ta happened earlier, because the two servers' clocks differ.

Since the local clocks of the two servers differ, how can we measure the time difference between them?

 

Packet measurement method:

(Figure: server A sends a probe message to server B and gets a reply back, collecting the timestamps Ta1, Tb, and Ta2 along the way.)

The message exchange proceeds as shown above:

1) Server A records its local time Ta1 and then sends a message to server B;

2) Upon receiving the message, server B reads its local time Tb, puts it into the message, and sends the message back to server A;

3) When server A receives the reply, it records its local time Ta2.

Note that Ta1 and Ta2 are readings of server A's local clock, while Tb is a reading of server B's local clock.

Assuming the one-way delay is the same in both directions (a reasonable assumption), the one-way delay is

x = (Ta2 - Ta1)/2

If you worry that a single measurement gives an inaccurate x, you can send many round-trip requests and combine the results to obtain a more accurate value of x, as in the sketch below.
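As a minimal sketch of that idea (round_trip is a hypothetical helper that performs steps 1) to 3) once and returns one (Ta1, Tb, Ta2) triple), you can repeat the exchange and keep the sample with the shortest round trip, since that exchange suffered the least queuing delay; averaging the per-sample x values is another reasonable choice:

def measure_delay(round_trip, samples=100):
    # round_trip() is a hypothetical helper that performs steps 1) to 3)
    # above and returns one (ta1, tb, ta2) triple of timestamps.
    best = None
    for _ in range(samples):
        ta1, tb, ta2 = round_trip()
        rtt = ta2 - ta1                     # round-trip time of this exchange
        if best is None or rtt < best[0]:
            best = (rtt, ta1, tb, ta2)      # keep the fastest exchange
    rtt, ta1, tb, ta2 = best
    x = rtt / 2                             # estimated one-way delay
    return x, (ta1, tb, ta2)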

If the clocks of server A and server B were perfectly synchronized (an unreasonable assumption), then

Tb = Ta1 + x = (Ta2 + Ta1)/2

But in fact there is a time difference between server A and server B. Call this difference delta; then

Tb + delta = Ta1 + x = (Ta2 + Ta1)/2

So, delta = (Ta2 + Ta1)/2 - Tb

This delta is the time difference between the clocks of server A and server B.
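To make the whole procedure concrete, here is a rough Python sketch over UDP; the port, message format, host name, and function names are made up for illustration, and time.time() stands in for each machine's local clock:

import socket
import struct
import time

# Runs on server B: answer every probe with B's local receive time Tb.
def run_server_b(port=9000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        _, addr = sock.recvfrom(64)
        tb = time.time()                        # B's local clock
        sock.sendto(struct.pack("!d", tb), addr)

# Runs on server A: one exchange, then compute the delay x and the difference delta.
def measure_offset(server_b=("b.example.com", 9000)):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    ta1 = time.time()                           # step 1): record Ta1, send the probe
    sock.sendto(b"ping", server_b)
    data, _ = sock.recvfrom(64)                 # step 2): B replied with Tb
    ta2 = time.time()                           # step 3): record Ta2
    (tb,) = struct.unpack("!d", data)
    x = (ta2 - ta1) / 2                         # assumed symmetric one-way delay
    delta = (ta1 + ta2) / 2 - tb                # time difference between A and B
    return x, delta

Once delta is known (and assuming it stays roughly constant between measurements), server A can translate a timestamp taken on server B into its own clock by adding delta to it.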

