## What is a Unix Timestamp?
A Unix timestamp (also called Epoch time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC, not counting leap seconds.
## Why January 1, 1970?
This date was chosen when Unix was being developed in the early 1970s. It's called the "Unix Epoch" - a simple, arbitrary starting point.
## Examples
- Timestamp `0` → Thu Jan 01 1970 00:00:00 UTC
- Timestamp `1000000000` → Sun Sep 09 2001 01:46:40 UTC
- Timestamp `1704067200` → Mon Jan 01 2024 00:00:00 UTC
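You can check these examples yourself in JavaScript. The only wrinkle is that the `Date` constructor expects milliseconds, so the Unix seconds must be multiplied by 1000:

```javascript
// Verify the example timestamps (Date takes milliseconds,
// so multiply Unix seconds by 1000).
console.log(new Date(0).toUTCString());
// "Thu, 01 Jan 1970 00:00:00 GMT"
console.log(new Date(1000000000 * 1000).toUTCString());
// "Sun, 09 Sep 2001 01:46:40 GMT"
console.log(new Date(1704067200 * 1000).toUTCString());
// "Mon, 01 Jan 2024 00:00:00 GMT"
```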
## Seconds vs Milliseconds

| System | Unit | Example |
|---|---|---|
| Unix/Linux | Seconds | 1704067200 |
| JavaScript | Milliseconds | 1704067200000 |
| Java (modern) | Milliseconds | 1704067200000 |
```javascript
// JavaScript uses milliseconds
Date.now()           // e.g. 1704067200000
new Date().getTime() // same value as Date.now()

// Convert to whole seconds
Math.floor(Date.now() / 1000) // e.g. 1704067200
```
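The conversion also matters in the other direction: a Unix timestamp in seconds must be multiplied by 1000 before being handed to the `Date` constructor, or you get a date a few weeks after the epoch instead of the one you meant. A short sketch using the example value from the table:

```javascript
// A Unix timestamp in seconds (from the table above).
const unixSeconds = 1704067200;

// Multiply by 1000 to get the millisecond value Date expects.
const date = new Date(unixSeconds * 1000);
console.log(date.toISOString()); // "2024-01-01T00:00:00.000Z"

// Forgetting the conversion lands you in January 1970:
console.log(new Date(unixSeconds).toISOString());
// "1970-01-20T17:21:07.200Z"
```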
## The Year 2038 Problem
Many 32-bit systems store timestamps as signed 32-bit integers:
- Maximum value: 2,147,483,647
- This represents: Tue Jan 19 2038 03:14:07 UTC
One second after this moment, the counter overflows to a negative value, breaking date calculations.
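JavaScript's own `Date` uses 64-bit floating-point milliseconds, so it is not affected, but its bitwise operators work on 32-bit signed integers and can be used to reproduce the wraparound a 32-bit `time_t` would suffer:

```javascript
// The largest value a signed 32-bit integer can hold:
const max32 = 2147483647;
console.log(new Date(max32 * 1000).toUTCString());
// "Tue, 19 Jan 2038 03:14:07 GMT"

// One second later, a 32-bit counter wraps to a negative value
// (`| 0` forces 32-bit signed integer arithmetic in JS):
const wrapped = (max32 + 1) | 0;
console.log(wrapped); // -2147483648

// A negative timestamp means "before 1970":
console.log(new Date(wrapped * 1000).toUTCString());
// "Fri, 13 Dec 1901 20:45:52 GMT"
```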
## Advantages of Timestamps
- Timezone agnostic: Same value everywhere
- Easy math: Add/subtract seconds directly
- Compact: Single number vs date string
- Sortable: Natural chronological order
- Unambiguous: No date format confusion