Relativity is exactly the problem: time runs at a different rate on the Moon. I have no idea how this is supposed to be handled in software.
Say you have a lunar instant and an Earth instant and want to figure out how much time elapsed between them: I guess you'd essentially need to pick a frame of reference and then account for relativity as you convert the lunar time to UTC. But I'm not a physicist; I'm not sure doing that even makes sense.
Time passes differently for satellites in low Earth orbit as well. They just adjust the clocks to take up the slack, though it has to be done at very high precision.
Honestly, it should be easy to measure. Take two synchronized high-precision clocks, put one in orbit, keep one on Earth, and subtract one reading from the other after a few days. (At that precision, you also need to account for the time it takes to radio the signal back to Earth.)
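You can even predict what that experiment should show. Here's a rough back-of-the-envelope sketch (my own, not from any flight software) for a clock on a circular orbit: special relativity slows it down because of its orbital speed, while general relativity speeds it up because it sits higher in Earth's gravity well. It ignores the ground clock's own rotation and Earth's oblateness, so treat the numbers as approximate.

```python
# Approximate clock-rate offset (satellite minus ground) for a circular orbit.
# Simplifications: non-rotating spherical Earth, ground clock at mean radius.

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0     # speed of light, m/s
R_EARTH = 6.371e6     # mean Earth radius, m

def clock_rate_offset_us_per_day(orbit_radius_m: float) -> float:
    """How much the orbiting clock gains (+) or loses (-) per day, in microseconds."""
    v2 = GM / orbit_radius_m                    # v^2 for a circular orbit
    special = -v2 / (2 * C**2)                  # moving clock runs slow
    general = GM / C**2 * (1 / R_EARTH - 1 / orbit_radius_m)  # higher clock runs fast
    return (special + general) * 86_400 * 1e6

# At GPS orbit radius (~26,571 km) the textbook answer is about +38 us/day,
# which is exactly the offset GPS satellite clocks are pre-adjusted for.
print(round(clock_rate_offset_us_per_day(26_571_000), 1))
```

Note the sign flip: low orbits (where the speed term dominates) run slow relative to the ground, high orbits run fast.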
The main issue comes when someone tries to build a library for this. For example, try answering the question "what time was it 2y 46d 2h 15m ago on the moon, in lunar time?" (assuming it was asked on Earth, in some known timezone). Needing to consult a table on a time server to answer every such question sounds like a nightmare.
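Even before you get to the lunar part, "2y 46d ago" is already ambiguous on Earth, because "2y" can mean a fixed number of days or a calendar shift, and the two disagree across leap years. A small sketch (dates and timezone are arbitrary, chosen for illustration):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# "Now" pinned for reproducibility; the timezone is an arbitrary choice.
now = datetime(2024, 3, 1, 12, 0, tzinfo=ZoneInfo("America/New_York"))

# Interpretation 1: "2y" = 2 * 365 days, a fixed-length duration.
naive = now - timedelta(days=2 * 365 + 46, hours=2, minutes=15)

# Interpretation 2: "2y" = same month/day two calendar years earlier.
calendar = now.replace(year=now.year - 2) - timedelta(days=46, hours=2, minutes=15)

print(naive)
print(calendar)
print(naive - calendar)  # off by exactly the leap day 2024-02-29
```

And that's with a single, well-understood calendar; a lunar timescale adds a variable-rate conversion on top.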
On Earth, there is a table of leap seconds… and in principle they can even be negative (none has been needed so far). That alone is a good reason why writing time libraries is better left to people who specialize in writing time libraries.
The relativity part also made me think: the Moon orbits Earth at about 3,600 km/h… but Earth's equator itself "orbits" Earth's poles at about 1,600 km/h… so if one speed has relativistic effects on time, half that speed must have some effect too, right…? Someone at the South Pole would see a clock on the equator run a tiny bit slower each day… and the same for clocks at every latitude, everyone relative to everyone else, so you can't tell the time on Earth "precisely" without taking the exact location into account… 😬
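The effect is real but smaller than it sounds, and a naive special-relativity-only estimate shows why (my own quick calculation, not from any reference implementation). In practice it's also largely moot: the equatorial bulge raises the equator's gravitational potential just enough that clocks anywhere on the geoid (roughly, sea level) tick at essentially the same rate, which is precisely why UTC can ignore latitude.

```python
# Naive SR-only estimate: how much slower would an equatorial clock tick,
# relative to a polar one, purely from its ~1,600 km/h rotational speed?
# (Ignores the gravitational-potential difference that nearly cancels this.)

C = 299_792_458.0        # speed of light, m/s
v = 1600e3 / 3600        # equatorial rotation speed, m/s (~444 m/s)

fractional = v**2 / (2 * C**2)       # low-speed time-dilation factor
ns_per_day = fractional * 86_400 * 1e9

print(round(ns_per_day))  # on the order of 100 ns/day, not microseconds
```

So the velocity term alone is tenths of a microsecond per day, and the geoid cancellation shrinks the net effect between sea-level clocks to far below that.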