So, the basic definition of epoch time (a.k.a. 'unixtime') is clear enough. However, neither the Wikipedia page nor any other resource I can find mentions that epoch time gets serialized in various equally modern formats with varying degrees of precision.
(Note: I am NOT asking about ISO-formatted date/time strings and their variants; that's a separate topic. Nor am I talking about low-level programming representations of these numbers, like signed vs. unsigned integers, 32 vs. 64 bit, or leap seconds vs. none; Wikipedia actually covers all that. I mean only the various serializations of the epoch/unix time itself, i.e. in a console, in JSON, in a database, etc.)
For instance, I've stumbled across at least 3 basic variants (illustrated in the sketch after this list):
- 1707882022 - In seconds
- 1707882022000 - In milliseconds, no decimal indicator
- 1707882022.1598356 - Fractional seconds (roughly microsecond precision), with a decimal indicator
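For concreteness, here is a minimal Python sketch (my own, purely to illustrate what I mean; the exact digits will differ on your machine) producing all three shapes from the same instant:

```python
import time

now = time.time()                 # float seconds, e.g. 1707882022.1598356

whole_seconds = int(now)          # 1707882022          (variant 1: whole seconds)
millis = int(now * 1000)          # 1707882022159       (variant 2: integer milliseconds, JS-style)
fractional = now                  # 1707882022.1598356  (variant 3: fractional seconds)

print(whole_seconds, millis, fractional)
```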
JavaScript, for instance, defaults to integer milliseconds (e.g. Date.now()), while Python 3's time.time() defaults to fractional seconds at roughly microsecond precision. That would not be a big deal, except for the inexplicable decision (in JS, at least) to express millisecond precision as one long integer, with no decimal indicator between the seconds and the milliseconds. (Which kinda breaks all known UX principles, given that we all learn to use decimal points from about first grade. But I digress.)
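To make that mismatch concrete, the conversion between the two defaults is just a factor of 1000. Here is the kind of thing I end up doing by hand, sketched in Python only (the 1707882022000 value is a stand-in for whatever a JS Date.now() produced):

```python
from datetime import datetime, timezone

js_millis = 1707882022000                        # integer milliseconds, JS-style

# Python's datetime.fromtimestamp() expects seconds, so divide by 1000 first
dt = datetime.fromtimestamp(js_millis / 1000, tz=timezone.utc)

# And back again: seconds -> integer milliseconds, dropping the decimal point
back_to_js_style = round(dt.timestamp() * 1000)  # 1707882022000
```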
My question is: are those all the possible options/formats? Are they defined/specified anywhere, or are they just ad-hoc decisions by language designers?
A bonus would be the answer to: are these all fully interoperable across all major programming languages? Do the equivalent new Date() methods all support all three formats equally well, just by detecting the string/integer size? Or is it necessary in some places to do manual conversion when multiple language environments share a common datastore? So far I haven't encountered any glitches, and the machine parses the differences much more easily than my eyes do. I'm just wondering if it's something to watch out for...
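For what it's worth, the only kind of "detection" I can picture is a magnitude heuristic like the sketch below. This is entirely my own ad-hoc guesswork, not something from any spec, which is partly why I'm asking whether this is defined anywhere:

```python
def to_seconds(ts) -> float:
    """Ad-hoc heuristic: guess the unit of an epoch value from its magnitude
    and normalize it to float seconds. Assumes timestamps for dates between
    roughly 1973 and 5000, so the three ranges can't overlap."""
    ts = float(ts)
    if ts >= 1e14:       # looks like microseconds, e.g. 1707882022159835
        return ts / 1e6
    if ts >= 1e11:       # looks like milliseconds, e.g. 1707882022000
        return ts / 1e3
    return ts            # plain (possibly fractional) seconds

# All three of these come back as ~1707882022.x seconds:
print(to_seconds(1707882022), to_seconds(1707882022000), to_seconds(1707882022.1598356))
```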