Timestamp to date
Convert timestamp to human readable date. Supports seconds, milliseconds, and nanoseconds.
Unix time (epoch time) is a system for storing dates and times as the number of seconds (or milliseconds) that have elapsed since midnight on January 1, 1970 (UTC). This standard is widely used in programming, databases, and network protocols because of its simplicity and because dates can be compared as plain numbers. A value of 0 represents exactly that moment; each subsequent second adds 1 to the counter.
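As a quick sketch of the idea, here is how a timestamp maps to a calendar date using Python's standard library:

```python
from datetime import datetime, timezone

# A timestamp of 0 is exactly midnight, January 1, 1970 (UTC)
epoch_start = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch_start.isoformat())  # 1970-01-01T00:00:00+00:00

# 1,704,067,200 seconds after the epoch is midnight, January 1, 2024 (UTC)
ts = 1704067200
print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())  # 2024-01-01T00:00:00+00:00
```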
The most common format is Unix time in seconds (e.g. 1704067200), used in Linux and PHP systems. The second popular format is Unix time in milliseconds (e.g. 1704067200000), used in JavaScript and on the JVM. The third is nanoseconds (e.g. 1704067200000000000), used in systems requiring extreme precision, such as databases or financial systems.
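Because the three units differ by orders of magnitude, a converter can guess the unit from the size of the number. The thresholds below are an assumption that works for dates in the modern era (seconds are ~10 digits, milliseconds ~13, nanoseconds ~19); they are a heuristic sketch, not a universal rule:

```python
from datetime import datetime, timezone

def to_datetime(ts: int) -> datetime:
    """Convert a Unix timestamp to a UTC datetime, guessing the unit
    (seconds, milliseconds, or nanoseconds) from its magnitude."""
    if abs(ts) >= 10**17:    # ~19 digits: nanoseconds
        seconds = ts / 1e9
    elif abs(ts) >= 10**11:  # ~13 digits: milliseconds
        seconds = ts / 1e3
    else:                    # ~10 digits: seconds
        seconds = ts
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

# All three spellings of the same instant decode identically
print(to_datetime(1704067200))           # seconds
print(to_datetime(1704067200000))        # milliseconds
print(to_datetime(1704067200000000000))  # nanoseconds
```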
Traditional Unix time on 32-bit systems has a maximum value of 2,147,483,647 seconds, which corresponds to January 19, 2038, at 03:14:07 UTC. One second after that, a signed 32-bit counter overflows and wraps to a negative value, which reads back as a date in December 1901. Modern systems solve this by using 64-bit numbers, which extend the range by billions of years. New applications should store timestamps as 64-bit values to avoid compatibility issues.
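The year-2038 rollover can be demonstrated directly by converting the boundary values of a signed 32-bit integer:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647: the last representable second

# The final moment a signed 32-bit counter can hold
rollover = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, which a 32-bit
# system would interpret as a date in December 1901
wrapped = datetime.fromtimestamp(-2**31, tz=timezone.utc)
print(wrapped)  # 1901-12-13 20:45:52+00:00
```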
A Unix timestamp is always in UTC, which means it contains no time zone information. To display the date locally, the system adds the local time zone offset. For example, the same timestamp is displayed in Poland 1 or 2 hours ahead of UTC (depending on daylight saving time), while in the USA it is displayed several hours behind UTC. This makes timestamps ideal for storing and transferring dates between time zones.
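The point can be illustrated by rendering one timestamp in several zones; the zone names below use the standard IANA identifiers available through Python's `zoneinfo` module:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library since Python 3.9

ts = 1704067200  # one instant, the same everywhere

# The same number, rendered with three different zone offsets
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
warsaw = datetime.fromtimestamp(ts, tz=ZoneInfo("Europe/Warsaw"))
new_york = datetime.fromtimestamp(ts, tz=ZoneInfo("America/New_York"))

print(utc)       # 2024-01-01 00:00:00+00:00
print(warsaw)    # 2024-01-01 01:00:00+01:00  (UTC+1 in winter)
print(new_york)  # 2023-12-31 19:00:00-05:00  (UTC-5 in winter)

# All three still denote the identical moment in time
assert utc == warsaw == new_york
```

Only the display changes; the stored value never does, which is why the timestamp itself is safe to pass between systems in different zones.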