
Unix Timestamp Converter


Understanding Unix Timestamps

A Unix timestamp (also called Epoch time, POSIX time, or Unix time) is a way of tracking time as a single number. It represents the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC—a moment known as the "Unix epoch."

For example, the timestamp 1732642098 represents a specific moment in November 2024. This format is universal and works across all time zones and programming languages, making it ideal for logging events, storing dates in databases, and synchronizing time across distributed systems.
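
Here's a minimal sketch of both directions in TypeScript, using only the standard Date APIs (assuming a Node or browser runtime):

```typescript
// Current Unix timestamp in seconds; Date.now() returns milliseconds.
const nowSeconds: number = Math.floor(Date.now() / 1000);

// Convert a known timestamp back to a date; the Date constructor expects milliseconds.
const ts = 1732642098;
const asDate = new Date(ts * 1000);

console.log(nowSeconds);           // e.g. 1732642098
console.log(asDate.toISOString()); // "2024-11-26T17:28:18.000Z"
```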

💡 Expert Tip: Always Store UTC

Unix timestamps are always in UTC (Coordinated Universal Time), not your local timezone. When you convert a timestamp to a human-readable date, your system automatically adjusts it to your local timezone. If you're building an app that serves users globally, always store timestamps in UTC and convert to local time only when displaying to users. This prevents timezone conversion bugs.
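
A sketch of that pattern with standard Date formatting ("Asia/Tokyo" is just an arbitrary example zone):

```typescript
// Store: a Unix timestamp is already UTC; keep the plain integer.
const storedTimestamp = Math.floor(Date.now() / 1000);

// Display: convert to a timezone only when rendering for a user.
const forDisplay = new Date(storedTimestamp * 1000);
console.log(forDisplay.toISOString());     // always UTC, good for logs and storage
console.log(forDisplay.toLocaleString());  // viewer's local timezone
console.log(forDisplay.toLocaleString("en-US", { timeZone: "Asia/Tokyo" })); // explicit zone
```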

⚠️ Common Mistake: Seconds vs. Milliseconds

JavaScript's Date.now() returns the timestamp in milliseconds (e.g., 1732642098000—13 digits), while most Unix systems, PHP, Python, and databases use seconds (e.g., 1732642098—10 digits). If you see a date showing as 1970 or a year in the far future, you probably mixed the two. To convert: divide milliseconds by 1000 to get seconds, or multiply seconds by 1000 to get milliseconds.
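
A small TypeScript sketch of that conversion; normalizeToSeconds is a hypothetical helper built on a rough size check, not a standard API:

```typescript
const ms = Date.now();                 // 13 digits, e.g. 1732642098000
const seconds = Math.floor(ms / 1000); // 10 digits, e.g. 1732642098

// Heuristic: anything above 1e11 is almost certainly milliseconds,
// since 100,000,000,000 seconds after the epoch is past the year 5100.
function normalizeToSeconds(ts: number): number {
  return ts > 1e11 ? Math.floor(ts / 1000) : ts;
}

console.log(normalizeToSeconds(1732642098000)); // 1732642098
console.log(normalizeToSeconds(1732642098));    // 1732642098
```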

The Year 2038 Problem (Y2K38)

Old 32-bit systems store timestamps as signed 32-bit integers, which can only hold values up to 2,147,483,647 seconds. This maxes out at 03:14:07 UTC on January 19, 2038. One second later, the number overflows and wraps to a negative value, causing the system to interpret the date as December 13, 1901. Modern 64-bit systems don't have this problem and can represent dates billions of years into the future.
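
You can simulate the wraparound in TypeScript by forcing the value through JavaScript's 32-bit bitwise conversion (a sketch of the overflow behavior, not how a real kernel stores time_t):

```typescript
const INT32_MAX = 2147483647;            // 03:14:07 UTC, January 19, 2038

// Bitwise operators coerce to a signed 32-bit integer, so adding 1 wraps negative.
const wrapped = (INT32_MAX + 1) | 0;
console.log(wrapped);                    // -2147483648

console.log(new Date(INT32_MAX * 1000).toISOString()); // "2038-01-19T03:14:07.000Z"
console.log(new Date(wrapped * 1000).toISOString());   // "1901-12-13T20:45:52.000Z"
```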

Key Takeaways

  • Epoch Start: January 1, 1970, 00:00:00 UTC
  • Seconds: Standard Unix timestamp (10 digits)
  • Milliseconds: JavaScript/modern APIs (13 digits)
  • Always UTC: Timestamps don't carry timezone info—convert locally for display
  • Negative Timestamps: Valid! They represent dates before 1970 (e.g., -86400 = Dec 31, 1969); see the quick check below
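
A quick check of that last point, using the same Date constructor as above:

```typescript
// -86400 seconds is exactly one day before the epoch.
const dayBeforeEpoch = new Date(-86400 * 1000);
console.log(dayBeforeEpoch.toISOString()); // "1969-12-31T00:00:00.000Z"
```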

Reviewed by: Michael Chang, Senior Software Engineer
Last updated: November 26, 2025

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp (also called Epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC (the Unix epoch). It's a universal way to represent time as a single number, making it easy to store and compare dates across different systems and programming languages.

Why does Unix time start at 1970?

Unix time starts at January 1, 1970, because that's when the Unix operating system was being developed at Bell Labs. The developers chose this date as an arbitrary reference point (epoch) for their new time system. It became the standard because Unix became widely adopted, and now virtually all modern systems use it.

What's the difference between a Unix timestamp in seconds and one in milliseconds?

A standard Unix timestamp counts seconds since the epoch and is typically 10 digits (e.g., 1732642098). A timestamp in milliseconds counts milliseconds and is typically 13 digits (e.g., 1732642098000). JavaScript's Date.now() returns milliseconds, while most Unix/Linux systems and databases use seconds. To convert: divide milliseconds by 1000 to get seconds, or multiply seconds by 1000 to get milliseconds.

What is the Year 2038 problem?

The Year 2038 problem (Y2K38) occurs because 32-bit systems store Unix timestamps as signed 32-bit integers, which max out at 2,147,483,647 seconds after epoch—that's 03:14:07 UTC on January 19, 2038. After this, the number overflows and wraps to a negative value, causing systems to think the date is December 13, 1901. Modern 64-bit systems don't have this issue and can represent dates billions of years into the future.