Concepts · 7 min read · Updated January 2025

What Is Password Entropy? A Plain English Explanation

Entropy is the only honest measure of password strength. Here is exactly what it means, how to calculate it, and why length matters more than complexity.

What is password entropy?

Password entropy is a measurement of how unpredictable a password is — specifically, how many guesses an attacker would need to crack it through brute force. It is expressed in bits, and each additional bit doubles the number of guesses required.

The concept comes from information theory, specifically Claude Shannon's 1948 paper "A Mathematical Theory of Communication." Shannon entropy measures the minimum number of bits needed to represent a piece of information — in the password context, how much information content (unpredictability) a password contains.

A password with 40 bits of entropy requires up to 2⁴⁰ guesses to crack by brute force — roughly 1.1 trillion guesses. With 80 bits, that becomes 2⁸⁰, or about 1.2 × 10²⁴ guesses. The difference is not linear — it is exponential. This is why security professionals care so much about entropy.

The key formula: Entropy (bits) = log₂(pool size^length) = length × log₂(pool size)

How to calculate entropy

Password entropy depends on two variables: the length of the password and the size of the pool of possible characters at each position.

The formula is: H = L × log₂(N)

Where H is entropy in bits, L is password length, and N is the size of the character pool.

Common character pool sizes:

| Character set | Pool size (N) | log₂(N) |
| --- | --- | --- |
| Digits only (0–9) | 10 | 3.32 bits/char |
| Lowercase letters (a–z) | 26 | 4.70 bits/char |
| Lower + uppercase | 52 | 5.70 bits/char |
| Alphanumeric (lower + upper + digits) | 62 | 5.95 bits/char |
| Full printable ASCII (including symbols) | 94 | 6.55 bits/char |
| Full ASCII + Unicode extensions | 100+ | 6.64+ bits/char |

Worked examples:

  • 8-character alphanumeric password: 8 × log₂(62) = 8 × 5.95 = 47.6 bits
  • 12-character full ASCII password: 12 × log₂(94) = 12 × 6.55 = 78.6 bits
  • 20-character full ASCII password: 20 × log₂(94) = 20 × 6.55 = 131 bits

This is the theoretical maximum entropy — the actual effective entropy depends on how randomly the password was generated. A human-chosen "random" password almost always has far lower effective entropy because humans are predictable.
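To make the arithmetic concrete, the formula and the worked examples above can be reproduced in a few lines of TypeScript (the `entropyBits` helper is our own illustrative name, not part of any library):

```typescript
// Theoretical entropy of a randomly generated password:
// H = L * log2(N), where L is the length and N is the character pool size.
function entropyBits(length: number, poolSize: number): number {
  return length * Math.log2(poolSize);
}

// The worked examples above:
console.log(entropyBits(8, 62));  // 8-char alphanumeric, ≈ 47.6 bits
console.log(entropyBits(12, 94)); // 12-char full ASCII, ≈ 78.7 bits
console.log(entropyBits(20, 94)); // 20-char full ASCII, ≈ 131 bits
```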

Entropy vs. perceived strength

This is where most password strength meters mislead users. A typical strength meter checks for uppercase, lowercase, numbers, and symbols and reports "strong" when all four are present. But a password like P@ssw0rd1 has high perceived strength (4 character types, 9 characters) and very low actual strength — it appears on every cracked password list.

Why? Because entropy measures statistical unpredictability, not just character diversity. When a human chooses a password by substituting @ for a and 0 for o, they are following an extremely predictable pattern. Modern cracking tools handle these substitutions trivially — they are baked into dictionary attack rule sets.

True entropy requires true randomness. A password generator using a cryptographically secure pseudorandom number generator (CSPRNG) — like crypto.getRandomValues() in the browser, which PassGeni uses — produces passwords with entropy close to the theoretical maximum. A human choosing a "random" password typically achieves 10–30% of the theoretical maximum entropy for the same length and character set.

Entropy and crack time

Crack time estimates depend on the attacker's hardware and the hashing algorithm used to store the password. Modern GPU-accelerated cracking rigs can test billions to trillions of guesses per second against weakly hashed passwords (MD5, SHA-1). Against modern slow hashes (bcrypt, Argon2id), the rate drops dramatically.

| Entropy | Guesses required | Time at 10B/sec (MD5) | Time at 10K/sec (bcrypt) |
| --- | --- | --- | --- |
| 28 bits | ~268 million | < 1 second | ~7 hours |
| 40 bits | ~1.1 trillion | ~110 seconds | ~3.5 years |
| 56 bits | ~72 quadrillion | ~83 days | ~230,000 years |
| 80 bits | ~1.2 × 10²⁴ | Millions of years | Trillions of years |
| 128 bits | ~3.4 × 10³⁸ | Impossible | Impossible |

Key takeaway: against a slow hash (which any serious application uses), 40+ bits of entropy is sufficient for most purposes. Against an offline attack on leaked MD5 hashes — the scenario after a major breach — you want 60+ bits. For long-term secrets, 80+ bits provides a meaningful safety margin against near-future computational improvements.
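The table rows above follow from one division; a quick sketch (the helper name and the two example rates are illustrative):

```typescript
// Worst-case crack time: 2^H guesses divided by the attacker's guess rate.
function crackSeconds(entropy: number, guessesPerSecond: number): number {
  return Math.pow(2, entropy) / guessesPerSecond;
}

const SECONDS_PER_YEAR = 31_557_600;

// 40 bits against a fast hash (10 billion guesses/sec) vs bcrypt (10,000/sec):
console.log(crackSeconds(40, 1e10)); // ≈ 110 seconds
console.log(crackSeconds(40, 1e4) / SECONDS_PER_YEAR); // ≈ 3.5 years
```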

Why pool size matters

Expanding the character pool is the most efficient way to increase entropy per character. Compare:

  • Lowercase only: 4.70 bits per character
  • Full ASCII (lowercase + uppercase + digits + 32 symbols): 6.55 bits per character
  • Difference: 1.85 bits per character

Over a 12-character password, that difference is 22 bits — equivalent to adding roughly 3–4 characters to a lowercase-only password. This is why including symbols is genuinely valuable, even as NIST warns against requiring them (because requirements lead to predictable substitutions).

The practical recommendation: use a password generator that draws from a large pool. PassGeni's default pool (lowercase + uppercase + digits + symbols) gives 6.55 bits per character. At 18 characters (the default length), that is approximately 118 bits — sufficient against any foreseeable attack.

Length vs. complexity: the numbers

The NIST guidance to prefer length over complexity is mathematically correct, but the two are not mutually exclusive. The data:

| Configuration | Entropy | Verdict |
| --- | --- | --- |
| 8 chars, full ASCII (complex) | 52 bits | Marginal |
| 12 chars, lowercase only (long) | 56 bits | Marginal |
| 12 chars, full ASCII | 79 bits | Good |
| 16 chars, full ASCII | 105 bits | Strong |
| 20 chars, full ASCII | 131 bits | Post-quantum safe |
| 5-word passphrase (EFF list, 7,776 words) | 65 bits | Good for memorability |
| 6-word passphrase (EFF list) | 78 bits | Strong |

The ideal is both: long and drawn from a large pool. A randomly generated 16-character password using the full ASCII printable set achieves 105 bits — better than any human-chosen passphrase, and immune to the pattern-based attacks that defeat complexity rules.
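The passphrase rows use the same formula with words instead of characters; a brief sketch, with illustrative helper names:

```typescript
// Entropy of a random character password vs. a random Diceware passphrase.
const charEntropy = (length: number, poolSize: number): number =>
  length * Math.log2(poolSize);
const phraseEntropy = (words: number, listSize: number = 7776): number =>
  words * Math.log2(listSize); // EFF long list: 7,776 words, ~12.9 bits/word

console.log(charEntropy(16, 94)); // 16-char full ASCII, ≈ 105 bits
console.log(phraseEntropy(6));    // 6-word EFF passphrase, ≈ 77.5 bits
```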

Practical entropy targets

What entropy should you target for different use cases?

| Use case | Entropy target |
| --- | --- |
| General online account (with bcrypt storage) | ≥ 56 bits |
| HIPAA / PCI-DSS compliant systems | ≥ 78 bits (12 chars, full ASCII) |
| Admin and privileged accounts | ≥ 105 bits (16 chars, full ASCII) |
| Long-lived secrets, API keys | ≥ 128 bits |
| Post-quantum resistant (long-term) | ≥ 128 bits (20 chars, full ASCII) |
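These targets can be turned into a required length by inverting the formula: L = ⌈H / log₂(N)⌉. A sketch (the helper name is ours):

```typescript
// Minimum password length needed to reach a target entropy H
// from a pool of N characters: ceil(H / log2(N)).
function minLength(targetBits: number, poolSize: number): number {
  return Math.ceil(targetBits / Math.log2(poolSize));
}

console.log(minLength(128, 94)); // full ASCII for 128 bits: 20 characters
console.log(minLength(128, 26)); // lowercase only: 28 characters
console.log(minLength(56, 62));  // alphanumeric for a bcrypt-backed account: 10
```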

Common entropy myths

Myth: "My password is strong because it has uppercase, numbers, and symbols." Character diversity contributes to pool size but not if the underlying pattern is predictable. P@ssw0rd! has all four types and zero real entropy because it appears in every cracking dictionary.

Myth: "I just need to avoid dictionary words." Modern cracking tools combine dictionary attacks with rule-based transforms, making this insufficient at short lengths. Below 12 characters, even non-dictionary passwords can be cracked quickly against weakly hashed stores.

Myth: "Entropy is the same thing as strength." Entropy measures theoretical unpredictability of random selection. Real-world password strength also depends on: the hashing algorithm used to store it, whether it appears in breach databases, the attacker's specific tooling, and rate limiting on the authentication system. A 40-bit randomly generated password stored with Argon2id is practically much stronger than an 80-bit password stored as unsalted MD5.

Myth: "Passphrases always have more entropy than passwords." A randomly chosen 4-word passphrase from the EFF list has about 51 bits of entropy — less than a 9-character random full-ASCII password. Passphrases win on memorability, not raw entropy at equal length.

PassGeni shows the exact entropy in bits for every generated password, alongside an estimated crack time. Both figures assume cryptographically random generation — which PassGeni uses via crypto.getRandomValues() directly in your browser.

Frequently asked questions

What is password entropy?

Password entropy is a measure of how unpredictable a password is, expressed in bits. It is calculated as E = length × log₂(character pool size). Higher entropy means more possible combinations and harder brute-force attacks. A 20-character random password from the full ASCII set has ~131 bits of entropy.

How many bits of entropy is a secure password?

NIST considers 80+ bits sufficient for most applications. For highly sensitive accounts, 100+ bits is recommended. Post-quantum security requires 128+ bits to maintain 64-bit effective security against Grover's algorithm. PassGeni's default 18-character output provides approximately 118 bits.

How is password entropy calculated?

Entropy formula: E = L × log₂(N), where L is password length and N is the size of the character pool. Example: 20 characters from the full printable ASCII set (94 characters) = 20 × log₂(94) = 20 × 6.55 = 131 bits of entropy.

Does password complexity increase entropy?

Yes — larger character pools increase entropy per character. Switching from lowercase only (26 chars, 4.70 bits/char) to full ASCII (94 chars, 6.55 bits/char) adds about 1.85 bits per character. But length has a larger effect: each added character multiplies the number of possible combinations by the full pool size.

What is the entropy of a passphrase?

A 4-word Diceware passphrase from a 7,776-word list has entropy of 4 × log₂(7776) = 51.7 bits. A 5-word passphrase gives 64.6 bits, 6 words gives 77.5 bits. These assume random word selection — user-chosen words have significantly lower effective entropy.

Why doesn't high entropy guarantee security?

Entropy calculation assumes truly random generation. Human-created passwords have much lower effective entropy because they follow predictable patterns. 'Summer2024!' technically draws from a 94-character pool but has near-zero effective entropy because it appears in cracking dictionaries. Use a cryptographic generator like PassGeni.

What is the difference between entropy and crack time?

Entropy measures theoretical unpredictability; crack time estimates how long a specific attack would take. Crack time depends on entropy AND hashing algorithm. A 50-bit entropy password takes seconds to crack if stored as MD5, but thousands of years if stored as Argon2id. PassGeni's Strength Checker shows both.

How does PassGeni calculate entropy?

PassGeni uses the standard formula E = L × log₂(N), where N is the actual character pool size used in generation. For a password drawing on symbols, digits, uppercase, and lowercase letters, N = 94. The entropy shown in the Strength Checker reflects the randomness of the generation process, not the statistical properties of any one specific output.
