This commit greatly expands the usage of black_box/zeroize on bits, as should
have been done originally. It is likely overkill and leads to less efficient
code generation, yet it does its best to be comprehensive where
comprehensiveness is extremely annoying to achieve.
In the future, it may be desirable to move this usage of black_box into its
own crate.
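As an illustration of the pattern (a minimal sketch, not the crate's actual
helpers; the function name is hypothetical), a bit can be routed through
black_box before use, with local copies zeroized after:

```rust
use core::hint::black_box;
use zeroize::Zeroize;

// Select `a` when bit == 1 and `b` when bit == 0, without branching on
// the bit. black_box hinders the compiler from specializing on the
// bit's value; zeroize clears the local mask once it's been consumed.
fn select(bit: u8, a: u64, b: u64) -> u64 {
    let mut mask = black_box(u64::from(bit)).wrapping_neg();
    let res = (a & mask) | (b & !mask);
    mask.zeroize();
    res
}
```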
Credit to @AaronFeickert for identifying that the original commit was
incomplete.
Unfortunately, G::from_bytes doesn't require canonicity, so that still can't
be properly tested for. While we could try to detect SEC1 encodings and write
tests against those, there's no suitably stable/widespread solution to make it
worth it.
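For context, the strongest check available without a canonicity guarantee is
a round-trip, sketched below against the group crate's GroupEncoding trait.
It only rejects encodings which re-serialize differently, not all
non-canonical inputs:

```rust
use group::GroupEncoding;

// Decode, re-encode, and compare. A non-canonical encoding accepted by
// from_bytes is caught here only if to_bytes produces a different
// (canonical) serialization for the same point.
fn round_trips<G: GroupEncoding>(repr: &G::Repr) -> bool {
    match Option::<G>::from(G::from_bytes(repr)) {
        Some(point) => point.to_bytes().as_ref() == repr.as_ref(),
        None => false,
    }
}
```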
This could still be gamed. For [1, 2, 3], the only split options were
([1], [2, 3]) or ([1, 2], [3]). This means 2 would always have the maximum
round count, and thus the result remained gameable. Accordingly, there's no
point in keeping that complexity when the algorithm is as efficient as it is.
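A minimal sketch of the kind of halving search at play (an assumed shape;
names and structure are illustrative, not the actual code):

```rust
// Recursively split a failing batch to locate a faulty item. With
// [1, 2, 3], the only interior split points yield ([1], [2, 3]) or
// ([1, 2], [3]), so the middle item always sees the maximum number of
// rounds regardless of how the split point is randomized.
fn find_fault<T>(items: &[T], valid: &impl Fn(&[T]) -> bool) -> Option<usize> {
    if items.is_empty() || valid(items) {
        return None;
    }
    if items.len() == 1 {
        return Some(0);
    }
    let (left, right) = items.split_at(items.len() / 2);
    find_fault(left, valid)
        .or_else(|| find_fault(right, valid).map(|i| i + left.len()))
}
```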
While proper randomness could be used to satisfy 3.7.2, it'd break the
expected determinism.
Also moves the aggregator over to Digest. While a bit verbose for this
context, as all appended items were fixed-length, its length prefixing is
solid and the API is pleasant. The downside is the additional dependency,
though it's in-tree and quite compact.
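For illustration, length-prefixed appends over the digest crate's Digest
trait can look like the following (a hedged sketch; the helper and its
framing are assumptions, not the aggregator's actual code):

```rust
use digest::Digest;

// Length-prefix each appended item so adjacent items can't be
// reinterpreted across their shared boundary.
fn append<D: Digest>(digest: &mut D, data: &[u8]) {
    digest.update(u64::try_from(data.len()).unwrap().to_le_bytes());
    digest.update(data);
}
```

With fixed-length items the prefixes are redundant bytes, hence the verbosity
noted above, but they keep the encoding unambiguous in general.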
While the prior intent was to avoid zeroizing for vartime verification, which
is assumed not to involve any private data, this simplifies the code and
promotes safety.
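Concretely, the unified shape looks something like this (hypothetical names;
a sketch, not the actual code):

```rust
use zeroize::Zeroizing;

// Even the verifier's intermediates, despite being public, are held in
// Zeroizing, so signing and verification share one handling of scalars.
fn verify_sketch(challenge_bytes: [u8; 32]) {
    let challenge = Zeroizing::new(challenge_bytes);
    // ... use `challenge` in the (variable-time) verification equation ...
    // `challenge` is zeroized on drop
}
```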
Offset signing is now tested. Multi-nonce algorithms are now tested.
Multi-generator nonce algorithms are now tested. More fault cases are now tested
as well.
This wasn't done before in order to remain 'leaderless'; now the participant
with the lowest ID has an extra step, yet that step is still trivial. There
are also notable performance benefits to dropping the previous dividing
approach, which performed an exponentiation.
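The extra step amounts to something like the following (hypothetical names;
a sketch under the assumption the offset is simply added to one share):

```rust
use ff::PrimeField;

// Instead of dividing the offset across all participants (where the
// inversion costs an exponentiation), only the participant with the
// lowest ID adds the offset to their share.
fn apply_offset<F: PrimeField>(share: &mut F, offset: F, my_id: u16, lowest_id: u16) {
    if my_id == lowest_id {
        *share += offset;
    }
}
```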