Unix Timestamp Converter
Convert Unix timestamps to human-readable dates and vice versa. Supports seconds and milliseconds. Shows current epoch time live. Free online tool for developers.
What is a Unix Timestamp?
A Unix timestamp (also called Epoch time, POSIX time, or seconds since epoch) is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC — a moment known as the Unix Epoch. This deceptively simple concept is the foundation of how computers track time. Nearly every operating system, programming language, database, and web API uses Unix timestamps internally, even when it displays human-friendly dates on the surface.
The elegance of Unix timestamps lies in their universality: a timestamp of 1715000000 means the exact same moment in time regardless of timezone, locale, or calendar system. This makes them ideal for storing, comparing, and transmitting time data across distributed systems. When your API returns a timestamp, when your database stores a created_at field, when your JWT token has an exp claim — they're all using Unix timestamps under the hood.
This converter handles both seconds (10-digit) and milliseconds (13-digit) timestamps, automatically detecting which format you've entered. It displays the result in multiple formats including UTC, your local timezone, ISO 8601, and relative time. The live counter at the top shows the current Unix timestamp ticking in real time — useful as a quick reference when debugging time-related issues.
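Under the hood, each of those output formats is only a line or two of code. Here is a minimal sketch of what producing them can look like (formatTimestamp is an illustrative helper, not this tool's actual source):
// Sketch: render a timestamp (in seconds) in several common formats.
function formatTimestamp(seconds) {
  const date = new Date(seconds * 1000);
  return {
    utc: date.toUTCString(),     // "Mon, 06 May 2024 12:53:20 GMT"
    local: date.toString(),      // the same moment in your local timezone
    iso8601: date.toISOString(), // "2024-05-06T12:53:20.000Z"
  };
}
formatTimestamp(1715000000);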
Timestamps are deeply intertwined with other developer tools. The exp and iat claims in a JSON Web Token are Unix timestamps. UUID v7 embeds a millisecond timestamp in its first 48 bits. Cron expressions schedule jobs that fire at specific timestamps. And when debugging API responses, you'll often need to convert timestamps in JSON payloads to understand when events occurred.
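Because a UUID v7's first 48 bits hold a Unix millisecond timestamp, you can even recover a v7 ID's creation time directly. A quick sketch (the helper name is illustrative):
// Recover the millisecond timestamp embedded in a UUID v7:
// the first 48 bits (12 hex digits) are Unix milliseconds.
function uuidV7Timestamp(uuid) {
  const hex = uuid.replace(/-/g, '').slice(0, 12); // first 48 bits
  return new Date(parseInt(hex, 16));              // interpreted as milliseconds
}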
Seconds vs Milliseconds
- Seconds (10 digits): e.g., 1715000000. Used by Unix systems, PHP's time(), Python's time.time(), Ruby's Time.now.to_i, and most database TIMESTAMP columns.
- Milliseconds (13 digits): e.g., 1715000000000. Used by JavaScript's Date.now(), Java's System.currentTimeMillis(), and many modern APIs that need sub-second precision.
- Microseconds (16 digits): e.g., 1715000000000000. Used by Python's time.time_ns() // 1000, PostgreSQL's TIMESTAMP type internally, and high-precision logging systems (see the conversion sketch below).
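Converting between these units is just a factor of 1,000 at each step; for example:
// Each unit differs from the next by a factor of 1,000.
const seconds = 1715000000;
const millis = seconds * 1000;                   // 1715000000000
const micros = millis * 1000;                    // 1715000000000000
const backToSeconds = Math.floor(millis / 1000); // 1715000000 (truncate, don't round)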
Practical Examples
1. Get the current Unix timestamp in different languages:
# JavaScript
Date.now()                    // 1715000000000 (milliseconds)
Math.floor(Date.now() / 1000) // 1715000000 (seconds)
# Python
import time
int(time.time())  # 1715000000
# PHP
time(); // 1715000000
# Bash
date +%s  # 1715000000
2. Convert a timestamp to a human-readable date:
# JavaScript
new Date(1715000000 * 1000).toISOString()
// "2024-05-06T12:53:20.000Z"
# Python
from datetime import datetime, timezone
datetime.fromtimestamp(1715000000, tz=timezone.utc)
# datetime(2024, 5, 6, 12, 53, 20, tzinfo=timezone.utc)
# PHP
gmdate('Y-m-d H:i:s', 1715000000); // "2024-05-06 12:53:20" (gmdate formats in UTC; date() uses the server timezone)
3. Convert a date string to a Unix timestamp:
# JavaScript
new Date('2024-05-06T12:53:20Z').getTime() / 1000
// 1715000000
# Python
from datetime import datetime, timezone
dt = datetime(2024, 5, 6, 12, 53, 20, tzinfo=timezone.utc)
int(dt.timestamp())  # 1715000000
4. Check if a JWT token has expired:
// Decode the JWT payload and check the exp claim.
// JWT segments are base64url-encoded, so swap the URL-safe characters before atob().
const base64 = token.split('.')[1].replace(/-/g, '+').replace(/_/g, '/');
const payload = JSON.parse(atob(base64));
const isExpired = payload.exp < Math.floor(Date.now() / 1000);
console.log(isExpired ? 'Token expired' : 'Token valid');
5. Calculate the difference between two timestamps:
const start = 1715000000;
const end = 1715086400;
const diffSeconds = end - start; // 86400
const diffHours = diffSeconds / 3600; // 24
const diffDays = diffSeconds / 86400; // 1
Notable Timestamps
- 0: January 1, 1970 00:00:00 UTC (the Unix Epoch)
- 1000000000: September 9, 2001 01:46:40 UTC (the "Billennium")
- 1234567890: February 13, 2009 23:31:30 UTC
- 2000000000: May 18, 2033 03:33:20 UTC
- 2147483647: January 19, 2038 03:14:07 UTC (Y2K38: the maximum value of a signed 32-bit integer)
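You can spot-check any of these yourself in one line, for example:
// Convert each notable timestamp to ISO 8601 to verify the dates above.
[0, 1000000000, 1234567890, 2000000000, 2147483647].forEach(ts =>
  console.log(ts, new Date(ts * 1000).toISOString())
);
// e.g. 0 → "1970-01-01T00:00:00.000Z", 2147483647 → "2038-01-19T03:14:07.000Z"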
The Year 2038 Problem (Y2K38)
Systems that store Unix timestamps as signed 32-bit integers will overflow on January 19, 2038 at 03:14:07 UTC. After this moment, the timestamp wraps to a large negative number, which the system interprets as a date in December 1901. Most modern systems use 64-bit integers (which won't overflow for 292 billion years), but embedded systems, legacy databases, and older file formats may still be vulnerable.
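You can model the wraparound in a few lines; here a typed array stands in for a signed 32-bit time_t (a demonstration of the arithmetic, not of any particular system's failure mode):
// Simulate a signed 32-bit time_t crossing the Y2K38 boundary.
const time32 = new Int32Array(1);
time32[0] = 2147483647; // 2038-01-19T03:14:07Z, the 32-bit maximum
time32[0] += 1;         // one second later: the value wraps negative
console.log(time32[0]); // -2147483648
console.log(new Date(time32[0] * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"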
Frequently Asked Questions
What is the Unix Epoch and why does it start at January 1, 1970?
The Unix Epoch (January 1, 1970 00:00:00 UTC) was chosen as the reference point when the Unix operating system was being developed at Bell Labs in the early 1970s. The original Unix stored time as a 32-bit integer counting seconds. The developers needed a recent-enough date that the counter wouldn't overflow too quickly, and January 1, 1970 was a round, convenient date close to when the system was created. It has since become the universal standard for computer timekeeping.
How do I tell if a timestamp is in seconds or milliseconds?
Count the digits. A Unix timestamp in seconds has 10 digits (e.g., 1715000000) and represents dates from 2001 to 2286. A timestamp in milliseconds has 13 digits (e.g., 1715000000000). If you see a number with 10 digits that starts with 1 or 2, it's almost certainly seconds. If it has 13 digits, it's milliseconds. Some systems also use microseconds (16 digits) or nanoseconds (19 digits). JavaScript uses milliseconds, while most server-side languages default to seconds.
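In code, that digit-count heuristic might look like this (a sketch; production code should validate its input first):
// Guess a timestamp's unit from its digit count.
function detectUnit(ts) {
  const digits = String(ts).replace(/\D/g, '').length;
  if (digits <= 10) return 'seconds';
  if (digits <= 13) return 'milliseconds';
  if (digits <= 16) return 'microseconds';
  return 'nanoseconds';
}
detectUnit('1715000000');    // "seconds"
detectUnit('1715000000000'); // "milliseconds"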
Do Unix timestamps account for leap seconds?
No. Unix timestamps (POSIX time) explicitly do not count leap seconds. Every day is treated as exactly 86,400 seconds. When a leap second occurs, the Unix timestamp either repeats a second or skips one, depending on the system's implementation. This means Unix timestamps don't precisely represent UTC, but the simplification makes time arithmetic much easier — the difference between two timestamps always gives you the exact number of elapsed SI seconds minus any leap seconds in that interval. As of 2023, there have been 27 leap seconds since 1972.
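The consequence is easy to demonstrate: December 31, 2016 ended with a leap second (23:59:60 UTC), yet in Unix time that day still spans exactly 86,400 seconds:
// 2016-12-31 contained a leap second, but Unix time doesn't count it.
const dec31 = Date.UTC(2016, 11, 31) / 1000; // 2016-12-31T00:00:00Z
const jan1 = Date.UTC(2017, 0, 1) / 1000;    // 2017-01-01T00:00:00Z
console.log(jan1 - dec31);                   // 86400, not 86401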
What is the Year 2038 problem?
The Year 2038 problem (Y2K38) occurs because many systems store Unix timestamps as signed 32-bit integers, which have a maximum value of 2,147,483,647 — corresponding to January 19, 2038 at 03:14:07 UTC. After this moment, the integer overflows and wraps to -2,147,483,648, which systems interpret as December 13, 1901. Modern 64-bit systems are unaffected (they won't overflow for 292 billion years), but embedded systems, IoT devices, legacy databases, and some file formats still use 32-bit timestamps and will need updates before 2038.
How do timezones work with Unix timestamps?
Unix timestamps are always in UTC — they represent an absolute moment in time with no timezone information. When you convert a timestamp to a human-readable date, you apply a timezone offset to get the local time. For example, timestamp 1715000000 is "2024-05-06 12:53:20 UTC", which is "2024-05-06 08:53:20 EDT" (UTC-4) or "2024-05-06 21:53:20 JST" (UTC+9). This is why timestamps are ideal for storing time in databases — you store one value and convert to the user's timezone at display time.
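In JavaScript, for example, one stored timestamp can be rendered in any timezone at display time:
// One stored UTC timestamp, localized only when displayed.
const date = new Date(1715000000 * 1000);
console.log(date.toLocaleString('en-US', { timeZone: 'UTC' }));              // "5/6/2024, 12:53:20 PM"
console.log(date.toLocaleString('en-US', { timeZone: 'America/New_York' })); // "5/6/2024, 8:53:20 AM"
console.log(date.toLocaleString('en-US', { timeZone: 'Asia/Tokyo' }));       // "5/6/2024, 9:53:20 PM"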
Related Tools
- JWT Decoder — Decode JWTs that contain timestamp claims (exp, iat, nbf)
- Cron Expression Generator — Schedule jobs that fire at specific times
- UUID Generator — UUID v7 embeds timestamps for time-ordered IDs
- JSON Formatter — Format API responses containing timestamp fields