Convert epoch timestamps to human dates and back. Live clock, relative time, multiple formats. Runs entirely in your browser.
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC — known as the Unix epoch. It's the standard way computers track time internally.
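The definition is easy to check in a browser console: timestamp 0 is the epoch itself, and each 86,400-second step advances one day.

```javascript
// Epoch second 0 corresponds to the Unix epoch itself.
console.log(new Date(0).toISOString()); // "1970-01-01T00:00:00.000Z"

// One day later (86,400 seconds) lands on January 2, 1970.
console.log(new Date(86400 * 1000).toISOString()); // "1970-01-02T00:00:00.000Z"
```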
Most Unix timestamps are in seconds (10 digits, e.g., 1712019960). JavaScript, Java, and some APIs use milliseconds (13 digits, e.g., 1712019960000). This tool auto-detects which format you're using.
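One plausible way to implement that auto-detection is a digit-count heuristic. This sketch is an assumption about how such detection might work, not the tool's actual code; the 13-digit threshold and the `parseEpoch` name are illustrative.

```javascript
// Hypothetical digit-count heuristic: 13 or more digits are treated as
// milliseconds, anything shorter as seconds.
function parseEpoch(input) {
  const n = Number(input);
  if (!Number.isFinite(n)) throw new Error("not a number");
  const digits = String(Math.trunc(Math.abs(n))).length;
  const ms = digits >= 13 ? n : n * 1000; // Date expects milliseconds
  return new Date(ms);
}

console.log(parseEpoch("1712019960").toISOString());    // "2024-04-02T01:06:00.000Z"
console.log(parseEpoch("1712019960000").toISOString()); // same instant, given in ms
```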
JavaScript: Math.floor(Date.now() / 1000)
Python: import time; int(time.time())
PHP: time()
Go: time.Now().Unix()
SQL (PostgreSQL): EXTRACT(EPOCH FROM NOW())
Bash: date +%s
Systems that store Unix time as a signed 32-bit integer will overflow on January 19, 2038 at 03:14:07 UTC (the Year 2038 problem). Most modern systems use 64-bit timestamps, which won't overflow for roughly 292 billion years.
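You can verify that exact moment by converting the largest signed 32-bit value yourself:

```javascript
// Largest value a signed 32-bit integer can hold: 2^31 - 1.
const MAX_INT32 = 2 ** 31 - 1; // 2147483647

// Interpreted as Unix seconds, this is the last representable instant.
const lastMoment = new Date(MAX_INT32 * 1000);
console.log(lastMoment.toISOString()); // "2038-01-19T03:14:07.000Z"
```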
Paste your Unix timestamp into the input field and the tool instantly converts it to a human-readable date and time. It shows results in both UTC and your local timezone and supports both 10-digit (seconds) and 13-digit (milliseconds) timestamps.
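In JavaScript, the timestamp-to-date conversion looks roughly like this (a sketch of the idea, not the tool's implementation):

```javascript
// Convert a 10-digit (seconds) timestamp to human-readable strings.
const ts = 1712019960; // seconds since epoch

const date = new Date(ts * 1000); // Date expects milliseconds

console.log(date.toISOString());    // UTC, ISO 8601: "2024-04-02T01:06:00.000Z"
console.log(date.toUTCString());    // UTC, RFC-style
console.log(date.toLocaleString()); // viewer's local timezone
```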
A Unix timestamp counts the seconds (or milliseconds) elapsed since January 1, 1970 00:00:00 UTC, known as the Unix epoch. It's a universal, timezone-free way to store time, used across databases, APIs, and programming languages.
Use the date input field to pick any date and time. The tool instantly converts it to a Unix timestamp in both seconds and milliseconds, ready to copy for API calls or database queries.
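The reverse conversion, from a picked date to both timestamp formats, can be sketched like this (the example date is an assumption for illustration):

```javascript
// Convert a calendar date/time to Unix timestamps (seconds and milliseconds).
const picked = new Date("2026-04-02T12:00:00Z"); // example date, given in UTC

const ms = picked.getTime();           // milliseconds since epoch
const seconds = Math.floor(ms / 1000); // seconds since epoch

console.log(seconds); // 1775131200
console.log(ms);      // 1775131200000
```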
The tool shows the current Unix timestamp in real time, updating every second. Copy it directly for use in code, logs, or API requests.
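A once-per-second live clock can be sketched in a few lines (in the real tool the value would be rendered into the page; here it is just logged, and the loop stops itself so the sketch terminates):

```javascript
// Current Unix timestamp in seconds.
function currentEpochSeconds() {
  return Math.floor(Date.now() / 1000);
}

// Log a few ticks, one per second, then stop.
let ticks = 0;
const timer = setInterval(() => {
  console.log(currentEpochSeconds());
  if (++ticks >= 3) clearInterval(timer);
}, 1000);
```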
10-digit timestamps represent seconds since epoch. 13-digit timestamps represent milliseconds. JavaScript's Date.now() uses milliseconds by default, while most Unix systems use second precision.