Conversation
Thanks for the PR. I'm hoping to get #213 squared away first, and then I'll take a look at these new vectors if nobody else beats me to it.
Good question. I didn't address that need in #213, but #193 mentions adding coverage for randomized signatures, so I think it's a welcome addition as follow-up work (in this PR or otherwise). I didn't add a "resolved" linkage to my PR, with the goal of leaving #193 open until we cover that (and potentially port more of the leancrypto vectors).
Exactly. I don't think my remaining time budget for Wycheproof this week will fit both, but I will get to it "soon" ™️ :-)
Understood! I'm in no particular rush. I am able to use these in their current form, but was asked to contribute them "upstream" in case they help others. Let me know how I can be of help!
|
@chrisfenner - Thanks for your patience. I took a look at this branch and think we should rework the arrangement slightly, if you're open to it. WDYT about rebasing this on

The main benefit from my perspective is that it would let us roll the new vectors into the pre-existing

If it turns out we do need a distinct schema, I think we'll want to avoid adding a new vector definition for it inside

Right now adding new vectors to existing files is mechanically annoying, so I would be happy to have you make the described changes to use the shared schemas but continue to provide the new vectors as distinct files. I can take on the work of merging them once we're done with review, if that makes life easier for you.
|
👋 @chrisfenner Have you had a chance to think about my feedback? Is that something you would be interested in addressing, or should we try to find someone to take over getting this across the line? |
|
Sorry @cpu, I was busy and then on vacation for a couple of weeks there and did not get back to you. I need to take a closer look at your suggested schema change, but I should be able to do it!
|
No worries, I appreciate the update. Let us know if it turns out you don't think you'll be able to find the time. |
|
@cpu quick question about using the test group to store the private key: each of the test cases in this PR (which are actually just adapted from https://github.com/post-quantum-cryptography/KAT) has a different private key. Is it fine to have 100 test groups in the new files? How would you like these numbered? 100 groups with
|
Another obvious option: just generate a bunch of actually new vectors. Really, the only reason I started with p-q-c/KAT's test cases was to start from data that was already known to be correct.
I think it's sub-optimal, but probably OK if necessary. Incrementing the tcId across groups is the best bet so we still retain a unique tcId per test case per file.
If you're up for that, I think it would be nicer than having many test groups. I think the existing schema nudges us in this direction. |
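For concreteness, here's a rough sketch of the numbering I mean (illustrative only — the field names `testGroups`/`tests`/`tcId`/`privateKey` are my guesses at a Wycheproof-style layout, not necessarily this repo's actual schema):

```python
# Sketch: assign a file-wide unique tcId across many test groups, so each
# test case in a file keeps a distinct id even when every group holds its
# own private key. Field names here are hypothetical.

def number_test_cases(groups):
    """Assign sequential tcIds across all groups in one file."""
    tc_id = 1
    for group in groups:
        for test in group["tests"]:
            test["tcId"] = tc_id
            tc_id += 1
    return groups

groups = [
    {"privateKey": "aa" * 32, "tests": [{}, {}]},  # group 1: two cases
    {"privateKey": "bb" * 32, "tests": [{}]},      # group 2: one case
]
numbered = number_test_cases(groups)
ids = [t["tcId"] for g in numbered for t in g["tests"]]
# ids == [1, 2, 3]
```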
I'll just plan to do that, then. As a reviewer, what makes it easy for you to review a PR with 300 test cases? I'm assuming you would like to be pretty confident that the data is right, but I'm unfamiliar with whatever validation tools we use in this repo. |
|
FWIW, I seem to remember that the https://github.com/post-quantum-cryptography/KAT vectors are random, not each targeting different edge cases. If that's the case, I think having one for each parameter set and input combination is enough. In other words, if two vectors can't be expected to ever fail separately, there's no need to have two. Looking only at the PR description, I think we already have Mu and seeds in the current schemas, so maybe we just need to add an optional randomness field to the existing schema and one vector per parameter set (-44, -65, and -87) with non-missing randomness?
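To make that concrete, an optional-field shape could look something like this (a sketch only — `rnd`, `msg`, and the other names are assumptions, not the repository's actual schema):

```python
# Hypothetical signing test cases: a deterministic case omits the randomness
# field, while a hedged (randomized) case carries the 32-byte randomness as
# hex. Field names are illustrative, not the real schema.
deterministic_case = {
    "tcId": 1,
    "msg": "48656c6c6f",
}
randomized_case = {
    "tcId": 2,
    "msg": "48656c6c6f",
    "rnd": "00" * 32,  # optional: present only for randomized signing
}

def signing_mode(case):
    """A consumer can branch on whether the optional field is present."""
    return "hedged" if "rnd" in case else "deterministic"
```

Keeping the field optional would let the existing deterministic vectors validate against the same schema unchanged.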
This PR introduces 100 happy-path ML-DSA signing cases for each of ML-DSA-44, -65, and -87 that include:
This requires some new schema work. For the new test cases I have introduced a new `CompletePureMlDsaSignTestVector` to the common ML-DSA schema file.

These are taken from https://github.com/post-quantum-cryptography/KAT with a small modification by myself (https://github.com/chrisfenner/mldsa-test-vectors) to compute and verify Mu values.
Apologies if I've messed up any schema or naming conventions; I've made my best attempt to conform to the existing patterns.