macro
Hello, a byte of coding. I've professionally developed web software for sixteen years and have been leading a team of platform engineers for two. I write for myself and my friends and it looks like someone thought you should read this page too. I hope to broaden your perspective, enlighten you on some neat thing, or challenge your way of thinking. If you can handle fictional characters: please, be my guest and continue reading.

Ed25519 Deep Dive Addendum

- 70 min read - Text Only

As an application developer, you have likely connected to GitHub or an SSH server with an SSH key. To do this successfully without anyone else impersonating you, a Digital Signature Algorithm (DSA) is used to authenticate you. Several DSAs can be used to authenticate over SSH; Ed25519 is a great option, and not only for SSH keys. In A Deep dive into Ed25519 Signatures, I suggest the Ed25519 algorithm is a modern, fast, small, misuse-resistant, and secure option for signing data; I also included an exercise in creating Ed25519 keys, signing data, and verifying signatures.

What's in this article?

While my last article (A Deep dive into Ed25519 Signatures) aimed to give developers a detailed practical introduction to Ed25519, I have since learned there are several fine details that ought to be considered when working with Ed25519 and selecting an implementation.

  1. First, I look into a post I saw on the 🍊 site in 2020, which describes how validation criteria differ across implementations.

    • Within, I learn that batch verification is a thing and briefly investigate what it means.
    • I find out that signature malleability is a problem, which I later learn is tied to exclusive ownership.
    • Then I learn that there are important differences between implementations which make them incompatible, especially in the face of maliciously crafted signatures.
  2. Second, I research what "exclusive ownership" means and find a paper. I do a double take when I learn that Ed25519 went nearly nine years without a formal security proof.
  3. Third, I review a technical specification for Ed25519: RFC8032.
  4. Fourth, I discuss deterministic signatures.

    • To set a baseline, I cover how ECDSA signatures work, which Ed25519 improves upon.
    • Then I elaborate on deterministic signatures and how it is implemented.
    • But there are faults with deterministic signatures too, which I cover.
  5. And lastly, I wrap up by covering how Ed25519's reference implementation encourages a misuse vulnerability that was widely publicized this year.
hi
Hey there! This article is quite long and detailed. Cryptography often ends up being difficult to grok, especially for those who add to this world with technology. I have written this content for those who wish to improve their application security with applied cryptography.
You will see me throughout this post too. My words are by the same author but with an emphasis on the mathematical side of cryptography. If you come out with a greater understanding of cryptography at the end, then let me know!
hi

When I wrote the deep dive

I did my best with my knowledge at the time to write that deep dive, and I shared what I knew about Ed25519 with the world. Or more accurately, with the internet, where the best way to get the right answer is not to ask a question; it's to post the wrong answer. Several corrections were applied to A Deep dive into Ed25519 Signatures. However, that experience left me feeling less informed than when I started.

So, I scraped even my vaguest memories for any leads.

It's 255:19 AM. Do you know what your validation criteria are?

weary
I remember this graphic of a bunch of invalid curve points or keys or something, where several different implementations were being compared (Dalek, LibSodium); it mentioned incompatibility between several and patchy coverage of rejecting certain inputs. Do you happen to remember anything like that?
mood
I spent an hour searching for that. Thank you.

Image from 255:19AM

reading
Finally finished reading it. That was dense. Some of it stuck.
That diagram, what do all those squares mean? I'd think it would be related to tests passing and failing but I'm not sure which is which and what that means in practice.
hmm
sword
For each implementation, a 14x14 matrix of maliciously crafted signatures is tested. A dark pixel means that a test case was rejected. A light pixel means the test case was accepted and a malicious signature will be accepted by that implementation.
Ah! I see. That's... concerning though. This algorithm is so core to the internet but implementations disagree that much? That's... about what I expected really.
happy

It's 255:19AM. Do you know what your validation criteria are? helped me understand that additional verification of public keys (contrary to the claims in the original Ed25519 paper) is necessary. The original paper (which I read later) suggests that all public keys are valid. While the math does not divide by zero, it also does not satisfy certain security properties.

I learned a new concept: unbatched and batch signature verification. Most applications authenticate requests or transactions one at a time in real-time. However, some use cases (e.g., blockchain consensus) benefit more from a batched verification technique where a sufficient number of messages and signatures can be verified together with less computation.

Batched verification?

Outside of the blockchain, where is batch signature verification useful?
puppy-confused
looking
In the paywalled paper Batch Verification Suitable for Efficiently Verifying a Limited Number of Signatures, the authors present a use case for verifying camera and sensor data in real time. Many sensor readings and surveillance images are verified to be authentic with digital signature algorithms by using batch verification.
grump
I do not support paywalls for cryptographic research. By limiting access to this material, publishers are constraining the security of our past, present, and future. Please support The International Association for Cryptologic Research (IACR) instead.
Networked IP Cameras which sign their images
A reproduction of a figure in the paper titled "Batch Verification Suitable for Efficiently Verifying a Limited Number of Signatures."
go-on
If your threat model includes tampered data at scale, which can be authenticated on a batch basis, then batched verification may be a good choice for you. Unfortunately, the batched and unbatched verification equations behave differently for Ed25519 when given dishonest signatures, as sketched below.
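For the curious, here is roughly what the two kinds of equations look like. Treat this as a sketch rather than any particular library's behavior (implementations differ on details like the cofactor 8, which is part of the problem). B is the base point, L is the group order, h = SHA-512(R || A || M), and each z_i is a random scalar picked by the verifier.

Unbatched, one signature at a time (some implementations multiply both sides by the cofactor 8):

    S·B = R + h·A

Batched, many signatures at once, following the original paper (Σ means the sum over all signatures):

    8·(Σ z_i·S_i mod L)·B = 8·Σ z_i·R_i + 8·Σ (z_i·h_i mod L)·A_i

Honest signatures satisfy both. Maliciously crafted signatures can be built to satisfy one but not the other, and notice that the mod L reductions only appear in the batched form.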

Signature malleability?

It briefly mentions signature malleability.

... which chose to write a paragraph arguing that signature malleability is never a problem instead of performing the check.
oh
anime-glasses
It might be similar to Base64 malleability, where the meaning of the message is the same but either the author or an adversary can tweak the bytes without the application realizing it has been tampered with.

Multiple incompatible implementations

As you might guess from that graphic above, the RFC8032 specification, OpenSSL, dalek, LibSodium, zebra, donna, and even the Go crypto library all behave differently. Not only that, but the same library can differ over time between versions, such as LibSodium.

What ramifications does that have?
thinking
dot-dot-dot-surprised
Differences in individual and batch verification can lead to denial-of-service within an application or across distributed consensus applications. This can be triggered by maliciously crafted signatures, either by a dishonest signer, or a dishonest participant with an existing honest signature.
Could you give some examples?
notes
bonk
Tor could be crashed by sending multiple signatures that fail the batched verification equation but pass the unbatched, individual verification equation. It could not determine which signature to remove.
heck
ZCash was taken by surprise when LibSodium updated its validation rules to reject more signatures. This led to the development of ed25519-zebra, which replicated LibSodium's older implementation bug for bug so ZCash agents would not inadvertently disagree. This disagreement posed an existential risk to the operation of the ZCash blockchain; in other words, the chain could be halted by a denial of service.
checkmark
ZIP-215: Explicitly Defining and Modifying Ed25519 Validation Rules was then proposed to formally change Ed25519 validation so that batch and individual verification behave consistently. This provided a successful evolution of Ed25519 in ZCash's history.
Note:
I have no ties to ZCash the currency or the foundation at this time. I just find this use case interesting and concerning.

The original paper

With the malleability comment from "It's 255:19AM..." in mind, I checked the original paper "High-speed high-security signatures". Indeed, there is a paragraph for Malleability which starts with:

We also see no relevance of "malleability" to the standard definition of signature security.

social-credit-20
That's how you respond to a known issue?
say-disappointed
Malleability took me by surprise with Base64, and I found it concerning. Distinguished Encoding Rules (DER), which is used in the very certificate that secures this website over HTTPS, was designed specifically to remove malleability. This is an important problem in cryptography, and it should not be hand-waved away.
What about the rest of the paper?
blep
math
It was mostly math and benchmark comparison.
cooking
He mentions that other pseudorandom functions (PRFs) may be used by Ed25519, like SHA-3 (which was not finalized at the time).
I might try mixing and matching my own choices in this, though I understand it will not be compatible with other implementations.
wizard
boi
In practice, application developers and cryptography engineers do not go out of their way to change the reference behavior. Doing so creates a compatibility nightmare and introduces risk.
guess-i-will-die
In Dalek, isis specifically notes that they are not willing to change behavior for something that is nearly a decade old, even if it is poorly designed. Compatibility matters to cryptographers too.

isis agora lovecruft (they/them)

... one doesn't simply get to change the definition of a cryptographic primitive ten years after-the-fact with zero consideration for backwards compatibility in hardware and protocols which already have the older definition baked in.

Dalek keypair.rs line 388-392

isis agora lovecruft (they/them)
dumb-math2
Also, the mod L detail during verification only appears in the batched equation but not the unbatched equation. I'll refer to this as the "L bound check" later.
That's such a little detail though!
surprise
humph
In cryptography, these small details really matter. Thousands of people-hours have been sunk into discovering and working around rough edges in Ed25519 specifically. Maybe this is why NIST hesitates to include DJB's cryptography?

I left this paper knowing it lacked due diligence, but just how much? Also, this paper was incredibly difficult to read. This was easily the worst paper I have read in months. I do not recommend you put your time towards it.

Provable security of Ed25519

After publishing my deep dive, I asked Soatok for feedback. In response to one of the trivia points, he mentioned:

The inclusion of the public key in R provides a property called "Exclusive Ownership" :3

ECDSA lacks it

Lightning Round - Commitments has a write-up on it

finger-guns

To find a definition of "Exclusive Ownership", I searched around and found a comprehensive paper with relevant material to cite.

I found the paper The Provable Security of Ed25519: Theory and Practice on IACR's ePrint archive when looking for more details. I did a double take when I started reading.

maybe-you

You're telling me there's a paper saying we have a signature scheme in something as important as TLS 1.3 and it says:

No detailed proofs have ever been given for these security properties for EdDSA, and in particular its Ed25519 instantiations.
maybe-you-should-not
I cannot believe we have gotten this far without a formal analysis of such an important and now widespread technology.

This paper holds nothing back; right at the start, the authors address one of my significant concerns:

The papers do refer to malleability and argue that it is not relevant for the standard definitions of signature scheme security, but their definition of malleability does not agree with the common usage of the term.

Without even a moment to breathe, the next line mentions that the source code does not even match the paper.

It also transpires that, whilst the source code presented alongside the paper accepts mangled signatures (hence is not SUF-CMA), the additional check included in the paper’s description but not the source code, is actually sufficient to prove SUF-CMA.
What's SUF-CMA?
puppy-confused
you-got-it
That one means they can't tweak your signature and have it still work.
hmm
Also, the paper only described the L bound check on the batched equation, and the source code implemented the unbatched equation with single verification.
Why is SUF-CMA important?
pick-me
laptop
In a use case where the message and signature should be unique, you must have the SUF-CMA property in your system. Otherwise, mangling the signature will yield more valid message and signature combinations which are seen as unique. This is particularly important in blockchain applications.

This paper was highly educational for me. I finally started to understand the design decisions behind Ed25519. For example, why is the public key mixed into the signature? The method is called Key-Prefixing and it defends against key substitution attacks. Mixing in the key may not be required for this defense. This paper references another, Optimal Security Proofs for Signatures from Identification Schemes, but I have not reviewed it yet.

Thankfully the proof this paper prepares is far more accessible than the mathy benchmark paper that DJB et al published. I came out with a better understanding of what's going on and why. Like "It's 255:19 AM...", this paper also compares variants of Ed25519.

Another variant is the use of Ristretto encodings for Ed25519. Whilst this appears promising, the draft RFC is still under active development and we do not analyze it here.
Will we see any Ristretto in any new standards?
surprise
faceplant
I asked about this. It is unlikely. The IETF process has been blocking the draft for Ristretto for years and it is unclear if they want it. As of now, it is expired and archived.

I specifically learned about "pre-hashing" and that while it is supported in the IETF standard, it is not often employed.

ECDSA and RSA signatures do that! But not Ed25519?
satisfied
01010101
It hashes the public key point and the full message at once to create the signature, so no, it does not take a raw hash. That explains my confusion when I wrote the deep dive.

The paper finally gets to comparing the original reference implementation, the IETF specification, and the latest LibSodium implementation.

comparison chart of Ed25519 implementations

The reference implementation was almost SUF-CMA but lacked the essential L bound check, which the IETF specification required. But what are these other checks that LibSodium added? And what are these other security properties?

A few security properties defined in straightforward language
Existential Unforgeability Under Chosen Message attacks EUF-CMA
An attacker cannot forge a signature for a new message that has never been signed before.
Strong Unforgeability Under Chosen Message Attacks SUF-CMA
An attacker cannot forge a different signature for an already signed message that also verifies for the original public key.
Strong Universal Exclusive Ownership S-UEO
An attacker that knows another honest public key cannot create their own public key and message that verify with an existing signature from that honest public key.
Malicious Strong Universal Exclusive Ownership M-S-UEO
An attacker that can create two malicious public keys and two messages cannot create a single signature that verifies for both messages and public keys. Also, if you have the M-S-UEO property, you also have the S-UEO property.
Message Bound Signature MBS
An attacker cannot create a single signature that verifies two distinct messages with the same public key.

The original release of Ed25519 achieves the minimum requirement of EUF-CMA but not SUF-CMA.

What's EUF-CMA?
puppy-confused
the-more-you-know
That one says others cannot forge signatures and say you signed them.
Oh! That is essential to a digital signature! If someone could sign things for me without my key, then what is the point of my secret key?
surprised-pikachu
ceiling-watching
Exactly! That said, it is still possible to create alternative signatures which can disrupt systems that expect one message - one public key - one and only one signature.

The IETF specification achieves SUF-CMA by ensuring that the signature S is within a proper bound. On top of the signature bound, LibSodium achieved M-S-UEO and MBS by checking:

  • that the signature's commitment R is within a proper bound;
  • and that the public key A is within a proper bound.

So the original idea was sound but they dropped the ball on mandating some bounds checks?
ughhh
ughhh
Exactly, while saying malleability isn't a problem in the original paper. History begs to differ; malleability is a problem.
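If you rely on LibSodium's high-level signing API, you get those stricter checks during verification without extra work. Here is a minimal sketch (assuming a recent LibSodium; error handling is trimmed for brevity):

#include <sodium.h>
#include <stdio.h>

int main(void) {
  if (sodium_init() < 0) return 1;

  unsigned char pk[crypto_sign_PUBLICKEYBYTES];
  unsigned char sk[crypto_sign_SECRETKEYBYTES];
  unsigned char sig[crypto_sign_BYTES];
  const unsigned char msg[] = "attack at dawn";

  crypto_sign_keypair(pk, sk);
  crypto_sign_detached(sig, NULL, msg, sizeof msg - 1, sk);

  /* Returns 0 only when the signature verifies; LibSodium also rejects
     out-of-range S values, small-order public keys, and other malicious
     encodings discussed above. */
  if (crypto_sign_verify_detached(sig, msg, sizeof msg - 1, pk) == 0) {
    puts("accepted");
  } else {
    puts("rejected");
  }
  return 0;
}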
What about that old RSA that everything still uses? Is it EUF-CMA?
popcorn
looking-ych
According to On the Security of the PKCS#1 v1.5 Signature Scheme, it meets EUF-CMA, though it calls it "UF-CMA."
What about small subgroup and invalid point attacks? Are those a thing for Ed25519?
tech-support
derpy
That's an attack that exposes a victim's secret to the attacker through key agreement! Key agreement is different from digital signature algorithms; DSAs like Ed25519 only reveal whether a message, signature, and public key were accepted or rejected.
fist-bump-ych
That said, maliciously crafted public keys and signatures can be used to create malleable signatures. LibSodium rejects these signatures by checking the public key, the signature, and the commitment R.

For those wishing to read the same paper, check out my annotated version of The Provable Security of Ed25519: Theory and Practice

What about clamping? You highlight that a few times in this paper.
poof
exasperated
Clamping is actually a touchy subject. Clamping made the security proof very difficult, according to this paper. It has caused problems for hierarchical key generation with Tor and wallet key derivation too.
Could you elaborate?
cucumber
dumb-math
Clamping is mathematically unsound; some even regard it as offensive. The lower bits can be clamped with a mathematically acrobatic operation, but the upper bit cannot be set so soundly. It is argued that the upper bit is set to enforce constant time computation for everybody. The lower bits supposedly make the cofactor go away.
objection
Implementations do not actually use the scalar multiplication technique whose timing the high bit was meant to protect. Adding the high bit only makes this construction harder to extend in practice to interesting applications. The original goal may have been a misuse-resistant implementation, but in practice it has been a waste of people's time.
you-dense
This is not key agreement! The lower bits do not mitigate the cofactor in any way for signatures. Instead other bound checks have been found and applied to libraries like LibSodium.
So should I add clamping or not?
shy
bullshit2
If you are developing an Ed25519 implementation from scratch for educational purposes, to remain compatible with existing implementations you should apply clamping. You'll find out that the secret key itself is not clamped. The secret key is hashed and the hash is clamped before use with elliptic curve mathematics.
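To make that concrete, here is roughly what the hash-then-clamp step looks like in code. This is an educational sketch (the helper name is mine; the hash comes from LibSodium), and the bit operations mirror the ref10 code quoted later in this article.

#include <sodium.h>  /* crypto_hash_sha512 */
#include <string.h>

/* Hash the 32-byte secret key (seed), clamp the first half into the secret
   scalar used for curve math, and keep the second half as the prefix used
   later for deterministic commitments. */
static void ed25519_expand_seed(unsigned char scalar[32],
                                unsigned char prefix[32],
                                const unsigned char seed[32]) {
  unsigned char h[64];
  crypto_hash_sha512(h, seed, 32);

  h[0]  &= 248; /* clear the lowest three bits        */
  h[31] &= 63;  /* clear the top two bits ...         */
  h[31] |= 64;  /* ... and set the second-highest bit */

  memcpy(scalar, h, 32);
  memcpy(prefix, h + 32, 32);
}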

Now that I was aware of how this algorithm works, the theory behind its security, and how it is realized in practice, I was ready to review a proper specification. If you are interested in reading this paper, check out my annotated copy of The Provable Security of Ed25519.

Reading RFC8032

If you thought the original Ed25519 paper was a specification, it really is not. It describes an idea and then fills every page with benchmark bragging, mathematical transformations, and comparisons to competing research. It is a scientific paper with an accompanying implementation called "ref10" that demonstrates the performance touted on every page.

Unlike the original paper, the IETF's RFC8032 is actually useful. IETF documents are written for engineers tasked with implementing a solution to a known problem on the internet. A good RFC has an appendix with a reference implementation and/or "test vectors" to verify a successful implementation.

What is an RFC, and what is the IETF?
hi
talk-w-bubble

A "Request for Comments" or RFC is an accessible document made by the "internet community" for the community to advance industry standards for connected software. The "Internet Engineering Task Force" or IETF is one such community. Other communities exist too, but the IETF uses "RFC" as a name for their informational, best practice, and specification documents.

This is my definition. Others may disagree.

What does it take for something to become an RFC?
hmm
maybe-you

Several things make for a successful RFC! Here's my opinion on what it takes:

  • The internet community has tried to solve a complex problem involving internet communication and that community has come up with several similar solutions.
  • Those creating solutions are likely making incompatible implementations for the problem in question.
  • These solutions may be entirely proprietary and the internet community would benefit by having a freely available standard that others could implement.
  • A committee of volunteers with industry clout deem an idea worth standardizing and support or refuse to support a standardization effort.
  • Those submitting the standard draft have the patience to trudge through political mud and mailing lists for years.
That last half sounds very discouraging. Maybe that is why Google ditched the standards process for JWT.
nah
What do you recommend referencing while reading RFC8032?
peeking
thumbs-up

Read the source code for ed25519-dalek. isis agora lovecruft is the primary author of Dalek cryptography. It is very well written, easy to explore, and straightforward to read.

As far as I can tell, isis's name is intentionally lowercase.

concern
However, there is a known misuse vulnerability that has not been addressed, and the library has not had a new version published in two years. Some in the Rust community report it as no longer maintained. While the source is easy to read, consider another implementation if you are using Rust.
ed25519-dalek is deprecated
Libraries that included Dalek now consider it deprecated and are replacing its usage.
A list of unsafe libraries is available on GitHub.
todays-interesting-thing
But back to RFCs: once a technology component (like Ed25519) is standardized, extending other IETF specs to support those components is relatively easy and the documents are quite short. For example, soon after RFC8032 (Ed25519) was finalized, RFC8037 was accepted which specified how to use Ed25519 signatures with JWTs.
Enough about the IETF, what did you actually learn from reading RFC8032?
reading
RFC 8032 Annotations
If you are interested, read my annotated copy of RFC 8032.

If you go into a specification, you should understand that you will learn how to make the thing, not necessarily how to use it effectively or correctly. There are "best practice" documents on using a technology, but unless it is a living draft, the knowledge documented will be out of date in months or years.

deus-ex-new
The community may submit errata, but the RFC text itself is not updated. I consider this aspect of RFCs to be unfortunate. It should not fall upon a reader 5 years later to patch the document in their head. This is a usability defect. Implementors will not always review the errata.

As I read RFC8032 and referenced Dalek and LibSodium source code, I learned how Ed25519 was constructed and implemented. If you read it, and re-read it five times, you should develop a better understanding of the following:

  • How Ed25519 secret keys (the "k") are prepared for elliptic curve operations (the secret scalar "s")
  • How Ed25519 secret scalars are converted to public keys (the "A")
  • How Ed25519 signature commitments (the "R") are deterministically generated
  • How Ed25519 signature verifiers (the "S") are created
  • How to verify a signature with a public key, the message, the signature commitment, and the signature verifier
  • That implementations may differ by having different contexts in the hashing process (which creates incompatible signatures)
  • A better intuition of elliptic curve math in practice
Oh! That's some useful information, but will this help me as an application developer?
excited
i-dunno-man-im-just-a
No, not really. It will remind a developer about the basic interface of key generation, signing, and verifying. But a library that implements it should too. Reading RFC8032 will not help you secure your application.
ych-chomp
Though it does hint that not every Ed25519 signature will be compatible with the same message and key, due to contexts. This is a different kind of incompatibility from rejecting malicious signatures and malicious keys.
yeee
As a developing cryptographer, it was incredibly informative to me. If you would like to learn how cryptographic constructions are assembled, then do check out RFC8032 as a functioning example.
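For reference, here is the skeleton of those relationships as I understand them from RFC8032. SHA-512 is used throughout, B is the base point, L is the group order, and || means concatenation:

  • Secret key "k": 32 random bytes. SHA-512(k) is split into two halves; the first half is clamped into the secret scalar "s", and the second half becomes the prefix.
  • Public key: A = s·B
  • Commitment secret: r = SHA-512(prefix || M) mod L, and the commitment R = r·B
  • Verifier: S = (r + SHA-512(R || A || M)·s) mod L
  • The signature is R and S together (64 bytes). Verification recomputes SHA-512(R || A || M) and checks S·B = R + SHA-512(R || A || M)·A, optionally with the cofactor 8 multiplied onto both sides.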

After finishing up my review of RFC8032 and the other papers, I felt Ed25519 is a quirky design. It works, but it is clear that DJB et al were racing to make something so performant on Intel chips that it would compel adoption. I'm not saying don't use it. Just that whatever we use next should face a fair number of questions on design and implementation.

Note that RFC8032 does not have the safety checks that LibSodium has. The RFC will never be updated to warn about malicious keys that contain small-order or otherwise malicious points. A future informational RFC might be published, but by the time it comes out, it will be far too late.

While RFCs may describe how a technology should be implemented or behave, they do not specify how it should be used or how it should be exposed to a developer. RFCs are also not a reliable source for guidance on preventing misuse of a technology.

Sure, RFCs come with a section called "security considerations", but the content is intended for implementers, not developers. The only thing in RFC8032 that extends to developers says "don't sign large amounts of data at once." Well, thanks, but what if I need to? No advice is given here.

Deterministic signatures and their weaknesses

First, an introduction to signatures based upon elliptic curve technology. This knowledge is necessary to understand why deterministic signatures are advantageous.

What problem do deterministic signatures solve?

The following describes how ECDSA works; consider reading the weaknesses at the end before skipping ahead to deterministic signatures.

The Elliptic Curve Digital Signature Algorithm (ECDSA) is based on a commitment and a verifier. With ECDSA, the commitment is typically a random point on the elliptic curve, and the verifier is an unguessable number which satisfies a verification equation that also takes a public key and message. Corruption of either the commitment or the verifier will result in an unverifiable signature.

Creating a signature involves both a probabilistic process and a few deterministic processes.

puppy-confused
Could you explain what probabilistic and deterministic mean?

A probabilistic process is a computation that produces something that appears random; inputs may be given that tweak the properties of what comes out.

A deterministic process is a sequence of steps that always results in the same thing given the exact same inputs.

thumbs-up
Also, I will refer to the private key, what you might see in a PEM file, as a "signing secret" or "signing secret key". Other created and forgotten secrets are involved as you will see next.
typing

A commitment secret is chosen randomly and is a large number. This secret should be greater than zero and less than the (prime) order of the curve's group. This commitment secret will be used in the signature equation later and, as named, should be kept secret. It is generated with a probabilistic process.

Then, the commitment secret is transformed into a public format called the commitment. This generally involves math where the "base point" or "generator" of the curve is multiplied by itself many times. If the commitment secret were 500 then the base point would be multiplied by itself 500 times. This is also referred to as exponentiation.

"Base point" and "generator" are terms used to refer to a point value defined by the curve. Points are defined as having an x and y coordinate. If you apply a whole number (called a "scalar") as an exponent to the coordinate, an elliptic curve operation is executed to compute a new point on the elliptic curve.
heart
gendo
What about this mathematical operation makes it important?
Reversing this exponentiation operation is considered a computationally-hard problem. It is also sometimes called a "trap door function", where it is easy to fall in, but not to climb out. Often this problem is referred to as "The Discrete Log Problem", or "DLog." Computationally-hard problems underpin all of modern practical cryptography.
teaching

In specifications and papers, we often call the commitment "R", as it is a random point. The verifying application cannot confirm how the commitment was constructed. It may be maliciously or unsafely constructed, for example if the commitment secret were 0 or greater than the group order.

peeking-from-door
What's this about a prime-order?
ECDSA is based upon elliptic curves whose definitions include a large prime group order. When the "base point" is multiplied by itself more times than this order, the result wraps around, which can introduce bias or malleability. So it is important that scalars used in exponentiation do not equal or exceed the group order in honest implementations.
crossmark

Before creating the verifier, the message is hashed (along with any other contextual information) and the hash is treated as a big number, in other words a "scalar." This hash is something the verifying application must recreate when using a verification equation. It is generated with a deterministic process.

The verifier is constructed with an equation which takes the secret signing key, the message hash, the public key, the commitment, and the curve parameters as inputs. Unlike the commitment which is a point, the verifier is just a large number.

A signature is the commitment and the verifier bundled together. Additional info like what curve, what hash algorithm, the public key identifier, etc. may also be included.

To verify a signature, an application must reconstruct the hash from the message and have the public key of the signer. A signature is verified with the signature components (the commitment and verifier) and contextual components like the message hash, the public key, and the curve parameters. It is verified by using an equation which, if satisfied, strongly indicates that the private key tied to the public key was used to sign the message hash.
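Written out with the usual symbols, here is a sketch of those equations (d is the secret signing key, Q = d·G is the public key, G is the base point, n is the group order, h is the message hash, and k is the commitment secret):

To sign:

    R = k·G,  r = (x-coordinate of R) mod n,  s = k^-1 · (h + r·d) mod n

The pair (r, s) is the signature: r stands in for the commitment and s is the verifier.

To verify, recompute h and check:

    u1 = h·s^-1 mod n,  u2 = r·s^-1 mod n,  accept if (x-coordinate of u1·G + u2·Q) mod n equals r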

To see more on how ECDSA is implemented, check out RFC6979 section 2.3.5 and section 2.4. RFC6979 describes a deterministic implementation; to read it as a non-deterministic implementation, consider step 2 of section 2.4 to only contain:

A random value modulo q, dubbed k, is generated [with a cryptographic number generator].

And ignore section 3 entirely for a probabilistic ECDSA implementation.

notes

ECDSA comes with a few weaknesses:

  • If the same commitment is signed with two verifiers using the same public key, the private key can be trivially recovered;
  • Insufficient randomness or broken randomness for commitment generation will allow private keys to be recovered;
  • Libraries may allow the application to set the commitment, which may let the attacker influence or set the commitment and therefore allow private keys to be recovered;
  • And if the public key is maliciously constructed, its signatures might be accepted for any or several messages.
You can find out more about how ECDSA was poorly implemented in Sony PlayStation 3 ECDSA random number reuse (archived).
point-left
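To spell out the first weakness with the same notation as above: suppose the same commitment secret k is used to sign two different message hashes h1 and h2. Both signatures then share the same r, and anyone holding both can solve for the secrets:

    s1 = k^-1 · (h1 + r·d) mod n
    s2 = k^-1 · (h2 + r·d) mod n

Subtracting and rearranging:

    k = (h1 - h2) · (s1 - s2)^-1 mod n
    d = (s1·k - h1) · r^-1 mod n

This is exactly the mistake that exposed Sony's PlayStation 3 signing key.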

Feel free to share this topic: Elliptic Curve Digital Signature Algorithms.

How are deterministic signatures implemented?

In practice, we see a need for caution and care around the commitment part of a signature. If the commitment secret is duplicated for different messages, then the secret signing key can be recovered. When an attacker holds the key, they may impersonate the honest key holder and defeat the security of the scheme that relies upon that authenticity.

tell-me-more
If you are unfamiliar with some of the vocabulary in this section, see Elliptic Curve Digital Signature Algorithms. When I mention "recovered", consider it equal to "leaked" or "stolen."

The security of a signature scheme requires that all commitment secrets be unique (at least as far as distinct messages go). One way to do this is to record information that changes over time, called state. It could be a counter or a collection of all previously issued commitments. Unfortunately, state in this scenario requires consistency for any application that holds the private key. Also, maintaining a set of all commitments that can be efficiently searched is difficult at scale. An alternative is to partition counters such that each signer has a reserved counter space, so other signers will not accidentally duplicate commitment secrets that will collide.

If this is starting to sound over engineered, that is because it is.
yuck

Instead, we have a cryptographic tool which is collision-resistant. Cryptographic hashes, or message digests, exist to fill the role of what is called a "random oracle." In computer science theory, an oracle is something that gives an answer; a random oracle gives a random answer that has never been seen before for each new question, and the same answer when asked the same question again. Replying with a pre-existing signature does not reveal more information about the secret key. By definition, something that has never been seen before cannot collide with previous answers.

We deem a hash function as broken when a collision is found. MD5 is easily broken, and SHA-1 (see shattered) can be broken with sufficient money and server time.

One more computer science thing to mention: we generally do not care if a message is verified twice. The verifier will not be able to determine if a signer had signed the same message again or replied with a saved signature when the message is the same. Because of that, it is safe to use deterministic signatures which use deterministic commitments.
ok

To create a deterministic commitment secret with a hash function, we need to maintain one important requirement: the hash digest used to create the commitment secret must be unguessable. Also, the public form of the commitment which goes into the signature should not reveal the commitment secret.

If you see the commitment of any signature repeat for a different message or key, that is a strong sign that something is wrong! Examples of this are on the blockchain.
surprise

Ed25519 uses the message and part of a SHA-512 hash of the signing secret key to create a deterministic commitment secret and then the commitment is made from that commitment secret. Because secret bits of sufficient entropy were added to the hash of the message, it is too difficult to guess or reverse what the commitment secret is.

Deterministic ECDSA (RFC6979) uses a unique approach to probe for safe commitment secrets using HMACs. The message and signing secret key are mixed together with HMACs to create a possible commitment secret. If the commitment secret is unsatisfactory, it probes for another value with a deterministic algorithm. For example, a commitment secret of all 0’s is unsatisfactory.

A deterministic commitment secret protects the secret signing key from being recovered. This protection relies on the collision resistance property we get from cryptographic hashes. Besides key generation, commitment secret generation is the only probabilistic part of creating a signature. By replacing that probabilistic process with a deterministic one, the complete signature process for a pre-existing key becomes deterministic.

Deterministic signatures remove the risk of exposing a secret signing key when commitment secrets could be misused. This provides a defense against application developer ignorance (like Sony) and poor randomness from the environment (like some transactions on the blockchain).

An interesting consequence of Ed25519’s design: since the secret scalar is derived from the first 256 bits of the SHA-512 hash and the secret commitments are made from the other 256 bits, if the secret scalar were leaked but not the private key, then signatures made with the stolen secret scalar must differ from honest signatures for messages that have not been honestly signed before.
standard
However, a verifier cannot tell the difference between the impersonator and the honest signer since the commitment is made from secret information.
dizzy

Feel free to share this topic: Deterministic Digital Signature Algorithms.

Deterministic signatures in practice

Ed25519 removed the biggest landmine that traditional ECDSA still has to this day. By removing user choice and randomness from this signature algorithm, we can feel safer in using it.
angel
puking
Yep! It is not all rainbows, however.
Did I miss something?
beg
wrench
The theory is good! However, the real world has a few more tricks for deterministic algorithms.
Do tell.
standard

Assuming your deterministic algorithm runs in constant time, a remote attacker should not be able to use side channels (e.g., timing analysis) to discover information about the secret key.

However, the story is quite different for a local attacker. Power analysis is a technique that samples power consumption or monitors radio frequency noise for indications of computation.


Yarom and Genkin show on stage that they could recover an Ed25519 key with a cheap software-defined radio by getting the device to sign a variety of data on the primary CPU.

In the same line as power analysis, an attacker might try to use a fault attack to corrupt the commitment secret. A fault attack involves tampering with the device’s power for a short duration of time. Instead of a full computer, fault attacks generally target smart cards or dedicated chips that perform cryptographic operations. These trusted devices are important for identity between machines.

What devices should I keep secure if fault attacks are a threat to me?
jaw-dropped
access-granted
If you have a hardware security module (HSM), a hardware wallet, or even a WebAuthN key, you should keep it secure. There are devices that self-destruct when tampered with. That is also a choice, but make a plan in case the device is lost or destroyed.

Often fault attacks are performed by tapping into the processor’s power and then intentionally manipulating the power when the device performs a computation. As an example, if a fault is introduced when the process should have hashed the message but it did not, then the hash that is signed may be predictable. That is, the commitment secret has been coerced to a predictable value and the signing code may proceed without realizing anything is wrong. The signature that comes out of the device may then be examined to recover the private key.

Suppose I wanted to defend against fault attacks and power analysis, what can I do?
flails
heh-heh
I have heard a few things. For applications, a developer may try to use blinded keys. This does make it slower but if you’re only signing something once a minute you will not notice. An example of how to do this is included in "Draft: Key Blinding for Signature Schemes."
ready
For hardware developers, I have heard a suggestion where a power source like a capacitor is charged, the main power is disconnected, and the capacitor is used as a power source for the computation. See Protecting smart cards from passive power analysis with detached power supplies. The patent for this idea has expired so try it out!
I was just interested. Turns out local attackers are not part of my threat model. Is there anything else?
angel
yes
There is! Keep reading!

Public Key Confusion

If you take a look at RFC8032, you will find that the signing function takes the public key as part of the hash which is signed. Also, you will see that the public key is the same in both the signing hash and the verifying hash.

thumbs-up
By including the public key in the hash, assuming other checks are in place like LibSodium, this signature scheme produces message bound signatures. Soatok called this exclusive ownership.

Now what do developers do when they see duplicate domain specific code? It is refactored into its own function.

And what do developers do when they see something is deterministic and effectively static? It is memoized and cached.

In the original implementation, we see the following code. If you find it hard to read, you are not alone. This is why I recommend reading Dalek instead. The Rust compiler can do so much more than the C compiler to make an optimized executable.

int crypto_sign(
  unsigned char *sm,
  unsigned long long *smlen,
  const unsigned char *m,
  unsigned long long mlen,
  const unsigned char *sk
)
{
  unsigned char pk[32];
  unsigned char az[64];
  unsigned char nonce[64];
  unsigned char hram[64];
  ge_p3 R;

  memmove(pk,sk + 32,32);

  crypto_hash_sha512(az,sk,32);
  az[0] &= 248;
  az[31] &= 63;
  az[31] |= 64;

  *smlen = mlen + 64;
  memmove(sm + 64,m,mlen);
  memmove(sm + 32,az + 32,32);
  crypto_hash_sha512(nonce,sm + 32,mlen + 32);
  memmove(sm + 32,pk,32);

  sc_reduce(nonce);
  ge_scalarmult_base(&R,nonce);
  ge_p3_tobytes(sm,&R);

  crypto_hash_sha512(hram,sm,mlen + 64);
  sc_reduce(hram);
  sc_muladd(sm + 32,hram,az,nonce);

  return 0;
}

Source for the above code snippet

In the incredibly inaccessible code above, we see that the public key "pk" is created on the fly and that it sneaks into the signing buffer which is hashed at the end.

As a reminder, RFC8032 helpfully describes that this hash takes its inputs in the following order:

  • The commitment "R"
  • The public key "A" in RFC, "pk" in the above
  • The message "M" in the RFC, "m" in the above
teaching
I am getting heartburn trying to read and explain ed25519 ref10
It'll be over soon. No wonder everyone wanted to rewrite it. This is absolutely terrible code.
dizzy

A small but impactful change is often introduced. I have found multiple instances of developers independently introducing this change into this function in their commit histories.

Here is the most readable change I have seen.

void ed25519_sign(
  unsigned char *signature,
  const unsigned char *message,
  size_t message_len,
  const unsigned char *public_key,
  const unsigned char *private_key) {
    sha512_context hash;
    unsigned char hram[64];
    unsigned char r[64];
    ge_p3 R;

    sha512_init(&hash);
    sha512_update(&hash, private_key + 32, 32);
    sha512_update(&hash, message, message_len);
    sha512_final(&hash, r);

    sc_reduce(r);
    ge_scalarmult_base(&R, r);
    ge_p3_tobytes(signature, &R);

    sha512_init(&hash);
    sha512_update(&hash, signature, 32);
    sha512_update(&hash, public_key, 32);
    sha512_update(&hash, message, message_len);
    sha512_final(&hash, hram);

    sc_reduce(hram);
    sc_muladd(signature + 32, hram, private_key, r);
}

Source code for the above snippet

This code made several changes:

  • The secret signing key is no longer an input. Instead, a private_key is given as input.
  • The secret scalar that is derived from the secret signing key is preserved. It is the first 32 bytes of the private_key variable.
  • The key tag is preserved. It is another 32 bytes (after the secret scalar) from the private_key variable.
  • The public key is now supplied as another variable to this function. Previously, the public key was calculated from the private key each time the signature function executed.
  • The hash function switches from a single call to a stateful call using the initialize, update, and finalize pattern.

Unfortunately, these changes have introduced a misuse vulnerability.

glare
Remember, the problem is accidentally reusing the same commitment in two different signatures. The last thing that can be tweaked in the signing function is the public key!
But I can pass in the public key that comes from my private key, what is the problem?
dot-dot-dot-surprised
smug
What if you don’t pass the public key that you specifically derived from the secret key?
Then the hash, which takes in the public key, comes out differently.
laptop-thinking
panic2
And so, the verifier changes in the signature, but not the commitment!
Which creates two different signatures by the same public key on the same commitment.
idea
failure
That means the secret scalar (which is effectively the secret signing key to any verifying application) may be recovered with some math.

This misuse is demonstrated in ed25519-unsafe-libs/ed25519-chalkias-exploit.
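The math behind the exploit is short. Using the values from the code above: the commitment secret r depends only on the key tag and the message, while hram depends on whichever public key the caller passed in. Signing the same message with the same private_key but two different public keys A1 and A2 gives:

    S1 = (r + hram1·s) mod L, where hram1 = SHA-512(R || A1 || M)
    S2 = (r + hram2·s) mod L, where hram2 = SHA-512(R || A2 || M)

Both signatures share the same R = r·B, so subtracting cancels r:

    s = (S1 - S2) · (hram1 - hram2)^-1 mod L

Every value on the right side is public, so anyone who sees both signatures recovers the secret scalar.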

What can we learn from this issue? Besides "don’t do that."
screaming-inside
flails2
The public key is bound to the secret key in the signing process. This change unbinds the public key from the secret key.
The secret scalar and key tag appear bound together.
thinking
shades
Indeed. If the public key were a part of the signing key variable, then it would be bound to the signing key and this issue would be avoided.
That would require more offsets and then I’d have to keep that consistent across all the call points.
wtfdidijustread
ych-doot
There is a better way... C has structures as does practically every language. You should leave behind these dangerous techniques that use byte pointers and offsets and just use structs. All the data in this scenario has fixed sizes. It can be passed by value or reference with ease.
Also try Rust! Rust is designed to prevent misuse involving bytes and pointers. As long as you do not use unsafe. Why use buffers in Rust when you could use structures that pass around slices?
excited
proud
By composing related and bound data together in a structure, we can be more confident that it will not be misused, compared to the alternative where the program's structure is the only thing keeping the data together.
Could you elaborate more?
beg
looking-ych
If you have ever heard of static application security testing (SAST), it works by discovering the structures of how data flows in a program. While some structures are technically safe at one point in time, the program structure may be changed with new code to be unsafe later.
galaxy-brain
Data structures can be understood and reasoned about more easily, while program structures require a holistic analysis — which is often beyond the scope of what a developer can do in their head.
All that said, what can application developers take away from this experience?
typing
sneeze
If you are supplying a public key to a signing (not verifying) operation, then this is an area prone to mistakes and misuse. Insulate this code from the outside with a few changes!
laptop2
Immediately wrap the signing function from a misuse-prone library with your own function which derives the public key from the secret signing key. Then, look for alternative implementations which do not include this misuse prone function design.
corporate-drone
You could go further and wrap the signing key to memoize the public key next to it so it is never separated.
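Here is a sketch of both suggestions combined. The struct and wrapper names are hypothetical, and ed25519_sign is the misuse-prone function quoted earlier; the point is that callers can no longer supply a mismatched public key.

#include <stddef.h>

/* Declaration of the misuse-prone signing function shown earlier. */
void ed25519_sign(unsigned char *signature,
                  const unsigned char *message, size_t message_len,
                  const unsigned char *public_key,
                  const unsigned char *private_key);

/* Keep the public key memoized next to the private key so the two are
   never separated or mixed up by a caller. */
typedef struct {
  unsigned char private_key[64]; /* secret scalar || key tag, as above */
  unsigned char public_key[32];  /* derived once from the same secret  */
} signing_key;

/* The wrapper always signs with the public key bound to the private key. */
void signing_key_sign(unsigned char signature[64],
                      const signing_key *key,
                      const unsigned char *message,
                      size_t message_len) {
  ed25519_sign(signature, message, message_len,
               key->public_key, key->private_key);
}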

Conclusion

The Ed25519 digital signature algorithm raised the bar for signature key safety by introducing reliable and efficient deterministic signatures. Unfortunately, the original authors hand-waved away important security guarantees in the name of efficiency. These authors produced an inaccessible scientific document focused entirely on performance. The community had to review, adopt, and integrate an inaccessible implementation from Daniel J. Bernstein's SUPERCOP codebase because the paper did such a poor job on describing its implementation. Daniel J. Bernstein et al were so close to providing message bound signatures but failed to add trivial checks to the reference implementation, likely in the name of performance. The paper also proposed two equations, one that was fast for individual verification and one that was acceptable for batch verification. However, these equations behave differently for maliciously crafted signatures. This difference in behavior has led to a denial-of-service problem for batch verification use cases. Also, the paper did not provide proofs for its security properties.

A combination of no formal specification, no human-friendly framework for how Ed25519 is made, and a completely inaccessible codebase has led to misuse as Ed25519's adoption has proliferated. RFC8032 helped formalize what an Ed25519 implementation should look like but lacked several safety features required to complete the message bound signature security property. Those security features included: checking that the commitment was within an acceptable range, checking that the public key was within an acceptable range, and verifying that the public key was not maliciously constructed with known points that break the mathematical properties of the elliptic curve. Again, LibSodium provides these protections now.

Without a reliable baseline, several diverse implementations came to be. Each implementation had its own validation criteria on signatures and hardly any of them actually implemented RFC8032 faithfully. Some of these implementations, such as LibSodium, changed over time to restrict signatures to those signed only by honest signers. In distributed consensus environments, changes in verification behavior must be versioned and coordinated. Because verification behavior changed in LibSodium, a denial-of-service attack was a threat if some contributors to consensus accepted maliciously crafted signatures.

The reference code was also so inaccessible that developers would introduce a misuse vulnerability when trying to improve their application while minimizing impact to the API. A developer looking to improve the accessibility of their code, minimize impact to usage of that code, and improve performance by memoizing the public key would unknowingly introduce risk into their application.

I believe that the next generation of digital signature algorithms, including post-quantum algorithms, should:

  • Be resistant to timing attacks
  • Be efficient for signing and verification
  • Prevent misuse which can reveal the private key
  • Prevent misuse on signing and verifying large messages, including partial signing and partial verification of messages
  • Prevent misuse on signing content by correctly binding all key specific material to the signing key up front
  • Include an accessible implementation with clear documentation for both API usage and the internal code in how it functions
  • Supply a reference implementation with the knowledge that it will proliferate for a decade instead of being rewritten by another competent patient cryptographer
  • Provide a draft specification for the digital signature algorithm
  • Ensure the draft specification and the reference implementation are consistent
  • Minimize acrobatic math for odd behavior like clamping and instead rely upon mathematically sound approaches when required
  • Actually provide formal proofs for security properties and how the security properties are realized in the specification
  • Include security properties such as M-S-UEO and MBS and Non Re-Signability
Non Re-signability is a newly proposed security property in BUFFing signature schemes beyond unforgeability and the case of post-quantum signatures. In short it means that without the message, there should be no way for someone else to come up with their own signature for their own public key.
point-left
cupcake
I might write about this later!
i-sleep
This article is part of the series Creating Authenticated QR Codes. The next article has not been published yet! When it is, I will post it on Twitter, Telegram, and through my RSS feed here.

Final Words

My goal was to really understand Ed25519 so that I could explain it here in a way where you, the reader, do not have to dive into the depths that I had to in order to understand the content and its qualities.

My journey continually confirmed that cryptography resources remain inaccessible to many audiences. I had to reach out to experts (see acknowledgement below) to clarify or confirm my understanding. Some of my expert interactions required a significant mathematical understanding of what goes into a cryptographic tool like a digital signature algorithm. I would not have been able to follow along three years ago.

After publishing A Deep dive into Ed25519 Signatures, I received enough feedback to convince me that I had much more to learn. While investigating that feedback, I made small corrections and added hints to the original article, and put larger items onto a backlog which led to this addendum. However, while preparing the pre-draft of this article, I burned out and could not write for months.

Once I had regained my writing capability in August, I resumed my education on Ed25519 to write this addendum. This article took nearly a month to research and write. I hope that my language is more accessible than what I had to read and that it is correct enough to inform application developers that make decisions on what cryptographic tools to employ in their applications.

hi
Hey, if you think something is inaccurate, please contact me over email. See my contact page for details. I would appreciate feedback that includes references that support your views and knowledge.

Thank you for taking the time to get this far. I wish you the best in securing the future of your teams and customers.

Acknowledgements

Thank you to Thomas Pornin, author of RFC6979 and BearSSL, for taking the time to connect everything together for signature schemes and the math involved. Thank you to Filippo Valsorda, author of age encryption which uses Ed25519 keys, for recommending what content to include. Thank you to Nadim Kobeissi, who is invested in raising the baseline for cryptography education in neat and fun ways, for recommending what content to include. Thank you to Soatok, another furry blogger who seeks to make cryptography accessible, for leads and feedback while researching Ed25519 and for final proofreading of this article. Thank you to Ibzan for content and editing review. Thank you to Xe Iaso for reviewing this article before publication. Thank you to Glitch for permission to use his character in dialog; Ibzan, Soatok, and Xe also have characters depicted above with permission.