Unfortunately, G::from_bytes doesn't require canonicity, so that still can't
be properly tested for. While we could try to detect SEC1 encodings and write
tests against those, there's no sufficiently stable and widely applicable
solution to make it worthwhile.
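For reference, such a canonicity test would amount to a round-trip check. A
minimal sketch, assuming the group crate's GroupEncoding bound, of exactly
what can't be generically asserted:

  use group::GroupEncoding;

  // Decode the bytes, re-encode the point, and demand the identical
  // encoding back. GroupEncoding doesn't require from_bytes reject
  // non-canonical encodings, so asserting this generically would be
  // incorrect for some curves.
  fn is_canonical<G: GroupEncoding>(bytes: &G::Repr) -> bool {
    match Option::<G>::from(G::from_bytes(bytes)) {
      Some(point) => point.to_bytes().as_ref() == bytes.as_ref(),
      None => false,
    }
  }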
This could still be gamed. For [1, 2, 3], the only splits were ([1], [2, 3])
or ([1, 2], [3]), meaning 2 would always have the maximum round count, as
shown below. Since it remains game-able, there's no point keeping its
complexity when the algorithm is as efficient as it is.
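A standalone demonstration of the bias, assuming the contiguous splits
described above:

  fn main() {
    let items = [1, 2, 3];
    // The only two contiguous halvings of a three-item list.
    let splits = [(&items[.. 1], &items[1 ..]), (&items[.. 2], &items[2 ..])];
    for (left, right) in splits {
      // Item 2 always lands in the two-item half, so isolating it always
      // takes the maximum number of rounds, regardless of the split chosen.
      let half_with_2 = if left.contains(&2) { left } else { right };
      assert_eq!(half_with_2.len(), 2);
    }
  }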
While a properly random value could be used to satisfy 3.7.2, it'd break the
expected determinism.
Also moves the aggregator over to Digest. While a bit verbose for this context,
as all appended items were fixed length, its length prefixing is solid and
the API is pleasant. The downside is the additional dependency, though it's
in-tree and quite compact.
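As a sketch of the length-prefixing pattern in question (a hypothetical
helper over the digest crate's API, with Sha256 standing in for whichever
hash the aggregator actually uses):

  use digest::Digest;
  use sha2::Sha256;

  // Append an item with a length prefix, removing any ambiguity between
  // consecutive items (even though, here, all items are fixed length).
  fn append<D: Digest>(digest: &mut D, item: &[u8]) {
    digest.update(u64::try_from(item.len()).unwrap().to_le_bytes());
    digest.update(item);
  }

  fn main() {
    let mut aggregator = Sha256::new();
    append(&mut aggregator, b"commitment");
    append(&mut aggregator, b"share");
    let _hash = aggregator.finalize();
  }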
While the prior intent was to avoid zeroizing for vartime verification, which
is assumed not to involve any private data, this simplifies the code and
promotes safety.
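A minimal sketch of the resulting uniformity, assuming the zeroize crate's
Zeroizing wrapper and a placeholder scalar type:

  use zeroize::Zeroizing;

  // A placeholder; a stand-in for the actual field element type.
  type Scalar = u64;

  // Both the signing path and the vartime verification path return a
  // Zeroizing-wrapped scalar, so there's a single code path, and the
  // memory is wiped on drop regardless of whether it was sensitive.
  fn challenge_scalar() -> Zeroizing<Scalar> {
    Zeroizing::new(42)
  }

  fn main() {
    let scalar = challenge_scalar();
    assert_eq!(*scalar, 42);
  }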
Offset signing is now tested. Multi-nonce algorithms are now tested.
Multi-generator nonce algorithms are now tested. More fault cases are now tested
as well.
This wasn't done prior in order to be 'leaderless', as the participant with
the lowest ID now has an extra step, yet that step is still trivial. There
are also notable performance benefits to dropping the previous dividing
approach, which performed an exponentiation.
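A minimal sketch of the extra step, under the assumption (from the
description above) that the lowest-ID signer simply adds the offset into
their share contribution; all names here are hypothetical:

  // A placeholder for the actual field element type; field addition is
  // stood in for by wrapping addition.
  type Scalar = u64;

  // Each signer contributes their share as usual; only the lowest-ID
  // participant additionally folds the offset in, a single field addition
  // rather than the prior dividing approach's exponentiation.
  fn effective_share(my_id: u16, signer_ids: &[u16], share: Scalar, offset: Scalar) -> Scalar {
    let lowest = signer_ids.iter().min().copied().expect("no signers");
    if my_id == lowest {
      share.wrapping_add(offset)
    } else {
      share
    }
  }

  fn main() {
    assert_eq!(effective_share(1, &[1, 2, 3], 10, 5), 15);
    assert_eq!(effective_share(2, &[1, 2, 3], 10, 5), 10);
  }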
There are two ways this could be tested:
1) Have preprocess take in the relevant bytes instead of an arbitrary RNG.
This would be an unsafe level of refactoring, in my opinion.
2) Test random_nonce itself, and test that the passed-in RNG eventually ends
up at random_nonce.
This takes the latter route, verifying both that random_nonce meets the
vectors and that the FROST machine calls random_nonce properly (see the
sketch below).
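A skeleton of that route, assuming a seedable RNG shared between the vector
check and the machine (the calls into random_nonce and preprocess are
stubbed out, with raw RNG output standing in for the derived nonce):

  use rand_core::{RngCore, SeedableRng};
  use rand_chacha::ChaCha20Rng;

  fn main() {
    // Step one: run a fixed seed through the nonce derivation and check it
    // against the vectors.
    let mut rng = ChaCha20Rng::seed_from_u64(0);
    let mut expected = [0; 32];
    rng.fill_bytes(&mut expected);

    // Step two: rerun the same seed through the FROST machine's preprocess
    // and confirm the resulting commitments match, proving the machine
    // routes its RNG through random_nonce.
    let mut rng = ChaCha20Rng::seed_from_u64(0);
    let mut actual = [0; 32];
    rng.fill_bytes(&mut actual);

    assert_eq!(expected, actual);
  }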
The audit recommends checking failure cases for from_bytes,
from_bytes_unchecked, and from_repr. This isn't feasible.
from_bytes is allowed to accept non-canonical values. [0xff; 32] may
accordingly be a valid point for non-SEC1-encoded curves.
from_bytes_unchecked doesn't have a defined failure mode and, being named
'unchecked', shouldn't necessarily fail. The audit acknowledges the tests
should check for whatever result is 'appropriate', yet any result which isn't
a failure on a valid element is appropriate.
from_repr must be canonical, yet for a binary field of 2^n where n % 8 == 0, a
[0xff; n / 8] repr would be valid.
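For reference, a sketch of the check the audit implies, against the ff
crate's PrimeField bound (a hypothetical helper; the comments note why its
result can't be generically asserted):

  use ff::PrimeField;

  // Whether the all-0xff repr is rejected. For a prime field whose modulus
  // is below 2^(8 * n), the repr is non-canonical and from_repr must fail.
  // For a binary field of 2^n (n % 8 == 0), every byte pattern would be
  // canonical, so generically asserting failure would be incorrect.
  fn rejects_max_repr<F: PrimeField>() -> bool {
    let mut repr = F::Repr::default();
    for byte in repr.as_mut() {
      *byte = 0xff;
    }
    Option::<F>::from(F::from_repr(repr)).is_none()
  }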
3.4.3 actually describes getting rid of DLEqProof in favor of a thin wrapper
around MultiDLEqProof. That can't be done because DLEqProof doesn't require
the std feature (which enables Vecs), while MultiDLEqProof relies on it.
Merging the verification statement does simplify the code a bit. While
merging the proving could as well, it has much less value due to the
simplicity of proving (nonce * G, scalar * G).
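A minimal sketch of the merged statement, assuming a standard Chaum-Pedersen
verification over the group crate's Group trait (not the crate's actual API):

  use group::Group;

  // The merged verification statement for a Chaum-Pedersen-style DLEq
  // proof: given challenge c and response s, every nonce commitment
  // R_i = (G_i * s) - (P_i * c) is recomputed in a single pass, with the
  // challenge then re-derived over all R_i (transcripting elided).
  fn merged_commitments<G: Group>(
    generators: &[G],
    points: &[G],
    c: G::Scalar,
    s: G::Scalar,
  ) -> Vec<G> {
    generators.iter().zip(points).map(|(gen, point)| (*gen * s) - (*point * c)).collect()
  }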