
Timestamp Converter: Unix Timestamp ↔ Readable Date

Convert a Unix timestamp to a human-readable date — or any date back to Unix epoch time — instantly in your browser. Automatically handles both seconds and milliseconds.


Instant client-side conversion · No data sent to any server · Works offline after first load

What Is a Unix Timestamp?

A Unix timestamp is an integer that counts the total number of seconds elapsed since January 1, 1970 at 00:00:00 UTC — a fixed reference point called the Unix epoch. It was introduced with the Unix operating system in the early 1970s and has since become the universal standard for representing time in software.

Because a timestamp is just a number, it carries no time-zone information and is the same value anywhere in the world. Whether a system is running in New York, London or Tokyo, the timestamp 1712188800 always refers to exactly the same moment: April 4, 2024 at 00:00:00 UTC. Converting it to a local time is a separate step that depends on the observer's time zone.
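A minimal sketch in JavaScript (the language this page already leans on) shows the same epoch value rendered for several observers, without the underlying instant ever changing:

```javascript
// One epoch value, three renderings: the instant itself never changes.
const ts = 1712188800; // seconds since the Unix epoch

const d = new Date(ts * 1000); // the Date constructor expects milliseconds

const utc = d.toISOString();
console.log(utc); // "2024-04-04T00:00:00.000Z"

// Local renderings differ per observer but describe the same moment.
console.log(d.toLocaleString("en-US", { timeZone: "America/New_York" })); // evening of April 3
console.log(d.toLocaleString("en-GB", { timeZone: "Asia/Tokyo" }));      // morning of April 4
```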

10-Digit vs 13-Digit — Seconds vs Milliseconds

The original Unix standard measures time in seconds, giving a 10-digit number for any date from September 2001 until the year 2286. Most backend languages — Python, PHP, Go, Ruby — return seconds by default. However, JavaScript's Date.now() and Java's System.currentTimeMillis() return milliseconds, producing a 13-digit number.

10-digit → Seconds

1712188800

April 4, 2024 00:00:00 UTC
Used by: Python, PHP, Go, Linux, SQL databases

13-digit → Milliseconds

1712188800000

Same moment, × 1,000
Used by: JavaScript, Java, many web APIs

This tool automatically detects which format you have entered by counting the digits. You never need to choose manually. The detected format is shown directly below the input field.
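The detection described above amounts to a digit-count check. A minimal sketch of that heuristic (the helper name `toMillis` is ours, not part of the tool):

```javascript
// Normalize any timestamp input to milliseconds using the digit-count rule:
// 10 digits (or fewer) => seconds, 13 digits => milliseconds.
function toMillis(input) {
  const digits = String(input).trim();
  if (!/^\d+$/.test(digits)) throw new Error("not a timestamp");
  return digits.length <= 10 ? Number(digits) * 1000 : Number(digits);
}

toMillis("1712188800");    // 1712188800000 (seconds detected, scaled up)
toMillis("1712188800000"); // 1712188800000 (already milliseconds)
```

Treating anything with ten or fewer digits as seconds also covers 9-digit values from before September 2001.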

How to Convert a Unix Timestamp to a Date

Paste your timestamp into the first input field above. Conversion happens automatically as you type — no button press needed. The tool shows six output formats simultaneously:

  • Local Date & Time — displayed in your browser's system time zone, with the zone abbreviation (e.g. GMT+5:30, EST).
  • UTC — the time in Coordinated Universal Time, independent of your location.
  • ISO 8601 — machine-readable format (2024-04-04T00:00:00.000Z) used in APIs, databases and log files.
  • Unix seconds and milliseconds — both epoch forms side by side for easy copying.
  • Relative time — a human-friendly label such as "3 days ago" or "in 2 hours" for quick sanity-checking.
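The outputs above can be reproduced with the native Date API alone; a rough sketch (the relative-time line is a crude day-level approximation, not the tool's exact logic):

```javascript
// Six views of one timestamp, all derived from a single Date object.
const ms = 1712188800 * 1000; // epoch seconds scaled to milliseconds
const d = new Date(ms);

const outputs = {
  local: d.toLocaleString(),               // browser's system time zone
  utc: d.toUTCString(),                    // "Thu, 04 Apr 2024 00:00:00 GMT"
  iso: d.toISOString(),                    // "2024-04-04T00:00:00.000Z"
  seconds: Math.floor(d.getTime() / 1000), // 1712188800
  milliseconds: d.getTime(),               // 1712188800000
  // Crude relative label; a real implementation would pick finer units.
  relative: `${Math.round((d.getTime() - Date.now()) / 86400000)} days from now`,
};
```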

UTC vs Local Time — Why Both Matter

A Unix timestamp is always stored in UTC. When you display it to a user, you convert it to their local time zone. When you store or transmit it between systems, you keep it as UTC or as the raw timestamp.

This distinction matters in real situations:

  • A server log entry at 2024-04-04T00:00:00Z is midnight UTC — which is 8:00 AM in Singapore (UTC+8) and 8:00 PM the previous day in New York (UTC-4).
  • An API call that sends a timestamp to a backend should always use UTC or the raw epoch number to avoid ambiguity.
  • Displaying dates to end users should always use local time so the time feels natural in their context.
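One way to follow these rules in practice is to keep the raw epoch value on the wire and convert only at the display edge. A sketch with hypothetical helper names (`serializeEvent`, `renderEvent` are ours, for illustration):

```javascript
// Store and transmit the zone-free epoch value; localize only for display.
function serializeEvent(when) {
  return { at: Math.floor(when.getTime() / 1000) }; // raw epoch seconds
}

function renderEvent(payload, timeZone) {
  const d = new Date(payload.at * 1000);
  return d.toLocaleString("en-GB", { timeZone }); // local time, display only
}

const wire = serializeEvent(new Date("2024-04-04T00:00:00Z"));
// wire.at === 1712188800, unambiguous for any backend
renderEvent(wire, "Asia/Singapore"); // April 4, 08:00 local (UTC+8)
```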

Common Use Cases

Developers

Debug API responses with timestamp fields. Verify token expiry times. Check created_at, expires_at and iat/exp values in JWTs.
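For JWTs specifically, iat and exp are epoch seconds, so they convert the same way. A debugging-only sketch (the helper name `decodeJwtTimes` is ours; it assumes Node.js `Buffer` and does not verify the signature):

```javascript
// Read iat/exp from a JWT payload for inspection only -- no signature check.
function decodeJwtTimes(token) {
  // JWT segments are base64url; swap the URL-safe characters before decoding.
  const b64 = token.split(".")[1].replace(/-/g, "+").replace(/_/g, "/");
  const payload = JSON.parse(Buffer.from(b64, "base64").toString("utf8"));
  return {
    issuedAt: new Date(payload.iat * 1000),  // iat is epoch seconds
    expiresAt: new Date(payload.exp * 1000), // exp is epoch seconds
  };
}
```

In a browser, `atob()` plays the role of `Buffer` with the same base64url fixup.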

QA & Testers

Confirm that a feature triggered at the right time. Check event ordering in test logs. Verify time-based conditions behave correctly across time zones.

Log Analysis

Server logs, application traces and error reports often include raw Unix timestamps. Convert them instantly to understand when an incident occurred.

Databases & APIs

SQL databases frequently store timestamps as plain integers, and REST and GraphQL APIs often return epoch values. Convert them to verify data before writing migrations or integration tests.

Why Does My Timestamp Show the Wrong Date?

There are two very common causes:

Cause 1 — Seconds treated as milliseconds (or vice versa)

If you pass 1712188800 (seconds) to JavaScript's new Date() constructor without multiplying by 1,000, you will get a date in January 1970 — because the constructor expects milliseconds. Always check your digit count: 10 digits = seconds, 13 digits = milliseconds.
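The bug is easy to reproduce deliberately:

```javascript
// The classic off-by-1000 bug: epoch seconds fed where milliseconds are expected.
const seconds = 1712188800;

const wrong = new Date(seconds);        // read as milliseconds: ~19.8 days after the epoch
const right = new Date(seconds * 1000); // the intended moment in 2024

console.log(wrong.toISOString()); // "1970-01-20T19:36:28.800Z"
console.log(right.toISOString()); // "2024-04-04T00:00:00.000Z"
```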

Cause 2 — Time zone confusion

The UTC output and local output will differ by your time zone offset. If you expect midnight UTC but your system is UTC-5, local time will show 7:00 PM the previous day. This is correct behaviour — both values represent the same moment. Use the UTC output for system comparisons; use local time for user-facing display.

Practical Examples

Example 1 — 1712188800

  • Format detected: 10-digit seconds
  • UTC: Thu, 04 Apr 2024 00:00:00 GMT
  • ISO 8601: 2024-04-04T00:00:00.000Z
  • In milliseconds: 1712188800000

Example 2 — 1712188800000

  • Format detected: 13-digit milliseconds
  • UTC: Thu, 04 Apr 2024 00:00:00 GMT
  • ISO 8601: 2024-04-04T00:00:00.000Z
  • In seconds: 1712188800

Both examples refer to the identical moment. They differ only in unit.
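The equivalence is mechanical and easy to verify:

```javascript
// Both example values decode to the same instant once units are normalized.
const fromSeconds = new Date(1712188800 * 1000);
const fromMillis = new Date(1712188800000);

console.log(fromSeconds.getTime() === fromMillis.getTime()); // true
console.log(fromSeconds.toISOString()); // "2024-04-04T00:00:00.000Z"
```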

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp is the total number of seconds elapsed since January 1, 1970 at 00:00:00 UTC — called the Unix epoch. It is a plain integer with no time-zone component, making it the most reliable way to store and compare time across distributed systems, databases and APIs.

What is the difference between a 10-digit and a 13-digit timestamp?

A 10-digit timestamp is in seconds (e.g. 1712188800). A 13-digit timestamp is in milliseconds (e.g. 1712188800000). They represent the same moment — just at different precision. JavaScript uses milliseconds; most other languages and databases use seconds. This tool detects the format automatically.

How do I convert a timestamp to a readable date?

Paste your Unix timestamp into the top input field. The tool converts it instantly as you type and shows six formats: local time, UTC, ISO 8601, epoch in seconds, epoch in milliseconds, and relative time (e.g. "3 hours ago"). Each output has its own Copy button.

Why is my timestamp showing the wrong time?

The two most common causes: (1) mixing up seconds and milliseconds — a 10-digit seconds value passed to JavaScript's new Date() without multiplying by 1,000 will appear to be in 1970; (2) time-zone confusion — the UTC and local outputs will differ by your offset, but both are correct representations of the same moment.

Does this tool show UTC or local time?

Both, always. Every conversion shows your local time (in your browser's system time zone), UTC, and ISO 8601 (which uses the Z suffix for unambiguous UTC). The relative time — such as "2 days ago" — is always calculated from the current moment and is time-zone independent.
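A relative label like "2 days ago" can be produced with the built-in Intl.RelativeTimeFormat; a simplified sketch (the unit thresholds are our own choice, not necessarily the tool's):

```javascript
// Pick a unit by the size of the difference, then let Intl word the label.
function relativeLabel(ms, now = Date.now()) {
  const rtf = new Intl.RelativeTimeFormat("en", { numeric: "always" });
  const diffSec = Math.round((ms - now) / 1000); // negative = past
  if (Math.abs(diffSec) < 3600) return rtf.format(Math.round(diffSec / 60), "minute");
  if (Math.abs(diffSec) < 86400) return rtf.format(Math.round(diffSec / 3600), "hour");
  return rtf.format(Math.round(diffSec / 86400), "day");
}

relativeLabel(Date.now() - 3 * 3600 * 1000); // "3 hours ago"
```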

Is this timestamp converter free and private?

Yes — completely free, no account required. All conversions happen instantly in your browser using the native JavaScript Date API. No data is transmitted to any server, and nothing is stored or logged. The tool also works offline once the page has loaded.