
> But I do agree with leap seconds: it's absolute trivia, not a useful thing for a programmer to know.

By and large, I agree with this.

But I've always found it a bit funny when a large organisation [1] says "our servers have sub-millisecond timing accuracy, thanks to GPS synchronization and these PCIe rubidium atomic clock cards we've developed" while at the same time saying [2] "we smear leap seconds over the course of a day; in practice it doesn't matter if a server's time is off by ±0.5 seconds".

[1] https://engineering.fb.com/2021/08/11/open-source/time-appli... [2] https://engineering.fb.com/2020/03/18/production-engineering...
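For intuition, here's what a linear 24-hour smear looks like as arithmetic — a minimal sketch assuming a noon-to-noon window centered on the leap boundary (the scheme Google describes; the exact window is an assumption here, not necessarily what FB does):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical example: the leap second inserted at the end of 2016-12-31,
# smeared linearly over a 24-hour window centered on the boundary.
LEAP = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
WINDOW = timedelta(hours=24)
START = LEAP - WINDOW / 2  # smear begins 12 hours before the boundary

def smear_offset(t: datetime) -> float:
    """Seconds by which the smeared clock lags true UTC at time t."""
    if t <= START:
        return 0.0
    if t >= START + WINDOW:
        return 1.0
    return (t - START) / WINDOW  # linear ramp from 0 to 1 second

# At the leap boundary (mid-window) the clock is off by exactly half a second:
print(smear_offset(LEAP))  # 0.5
```

So for half a day every clock in the fleet is knowingly wrong by up to half a second relative to true UTC — but wrong by the *same* amount everywhere, which is the point.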



The thing that super-accurate timestamps buy you is common agreement across your infrastructure as to what the time is. That's basically what makes distributed systems work faster/better/whatever.

The relation between that time and what the rest of the world thinks the time is, is actually less relevant.


I think the irony comes full circle when you then use the `unsmear` library to reverse the leap smearing in NTP:

https://github.com/google/unsmear
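To be fair, undoing a known linear smear is just arithmetic. A hypothetical sketch of the inversion — this is *not* the actual API of google/unsmear, just the underlying math for a 24-hour smear around the 2016 leap second:

```python
from datetime import datetime, timedelta, timezone

# Assumed smear parameters (must match whatever the NTP servers applied).
LEAP = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
WINDOW = timedelta(hours=24)
START = LEAP - WINDOW / 2
ONE_SEC = timedelta(seconds=1)

def unsmear(s: datetime) -> datetime:
    """Map a smeared timestamp back to true (post-leap) UTC."""
    if s <= START:
        return s  # before the smear window: clocks agree
    if s >= START + WINDOW:
        return s + ONE_SEC  # after the window: smeared clock lags by a full second
    # Inside the window the smeared clock ran slow by a factor of
    # WINDOW / (WINDOW + 1s); invert that rate.
    frac = (s - START) / WINDOW
    return START + (WINDOW + ONE_SEC) * frac
```

For example, a smeared timestamp exactly at the leap boundary maps back to half a second later in true UTC, matching the 0.5 s mid-window offset.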


And then you validate against Microsoft's default `ClockSkew` of 5 minutes.



