
Unix Timestamps Explained — Concepts and Conversion
Learn what a Unix timestamp is, how it works, and how to convert it to a human-readable date. Essential knowledge for every developer.
What Is a Unix Timestamp?
A Unix timestamp (also called Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC. This reference point is called the "Unix Epoch."
Examples
- 0 = January 1, 1970, 00:00:00 UTC
- 1000000000 = September 9, 2001, 01:46:40 UTC
- 1710547200 = March 16, 2024, 00:00:00 UTC
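These values can be checked in any JavaScript console; note that Date works in milliseconds, so each timestamp is multiplied by 1000:
new Date(0).toISOString();                 // '1970-01-01T00:00:00.000Z'
new Date(1000000000 * 1000).toISOString(); // '2001-09-09T01:46:40.000Z'
new Date(1710547200 * 1000).toISOString(); // '2024-03-16T00:00:00.000Z'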
Why Use Unix Timestamps?
- Universal — no time zone confusion; the same number means the same moment everywhere
- Sortable — events can be ordered with a simple integer comparison (see the sketch after this list)
- Compact — a single integer instead of a formatted date string
- Language-agnostic — every programming language can handle an integer
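As a minimal sketch of the "sortable" point, with made-up event data and timestamps in seconds:
const events = [
  { name: 'deploy', ts: 1710550800 },
  { name: 'build',  ts: 1710543600 },
  { name: 'test',   ts: 1710547200 },
];
// Chronological order falls out of a plain numeric comparison
events.sort((a, b) => a.ts - b.ts); // build, test, deploy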
Using Unix Timestamps in JavaScript
// Current timestamp (seconds)
const nowSeconds = Math.floor(Date.now() / 1000);

// Convert a timestamp to a Date object (Date expects milliseconds)
const timestamp = 1710547200;
const date = new Date(timestamp * 1000); // March 16, 2024, 00:00:00 UTC

// Convert a date to a timestamp (date-only strings are parsed as UTC)
const seconds = Math.floor(new Date('2024-03-16').getTime() / 1000); // 1710547200
The Year 2038 Problem
Unix timestamps stored as signed 32-bit integers will overflow on January 19, 2038, at 03:14:08 UTC, when the count exceeds the type's maximum value of 2,147,483,647 (2^31 − 1). Modern systems use 64-bit integers, which will not overflow for approximately 292 billion years.
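You can see the boundary for yourself in a JavaScript console by mapping the largest signed 32-bit value back to a date:
// 2147483647 is the largest value a signed 32-bit integer can hold
const max32 = 2 ** 31 - 1;
new Date(max32 * 1000).toISOString(); // '2038-01-19T03:14:07.000Z'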
Milliseconds vs Seconds
- Unix timestamp (seconds): 1710547200 — 10 digits
- JavaScript timestamp (milliseconds): 1710547200000 — 13 digits
If a timestamp has 13 digits, it is in milliseconds. Divide by 1000 to get seconds.
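That rule can be wrapped in a small helper; the following toSeconds function is a hypothetical sketch that treats any value of 1e12 or more (13 digits) as milliseconds:
function toSeconds(ts) {
  // Values >= 1e12 have 13+ digits and are assumed to be milliseconds
  return ts >= 1e12 ? Math.floor(ts / 1000) : ts;
}

toSeconds(1710547200000); // 1710547200
toSeconds(1710547200);    // 1710547200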
Convert Now
Use the free Unix Timestamp Converter to instantly convert between Unix timestamps and human-readable dates.