mirror of https://github.com/serai-dex/serai.git, synced 2024-11-16 17:07:35 +00:00
add polyseed support (#257)
* add polyseed support
* fix pr comments
* fix tests
* Embed the mempool into the Blockchain
* Plan scheduled payments whenever outputs are received
  The scheduler prior waited for the next series of payments to be added.
* Replace Tendermint step with sync_block
  Step moved a step forward after an externally synced/added block. This created a race condition to add the block between the sync process and the Tendermint machine. Now that the block routes through Tendermint, there is no such race condition.
* Finish binding Tendermint into Tributary and define a Tributary master object
* Add a correction the last commit missed
* Add DoS limits to tributary and require provided transactions be ordered
* Fix the scheduler from dropping UTXOs when there weren't any payments
* Documentation and cargo update
* Add a dedicated db crate with a basic DB trait
  It's needed by the processor and tributary (coordinator).
* Add a DB to Tributary
  Adds support for reloading most of the blockchain.
* Reload provided transactions from the disk
  Also resolves a race condition by asserting provided transactions must be unique, allowing them to be safely provided multiple times.
* must_use annotations on DbTxn
* Support reloading the mempool from disk
* Add a NewSet event to validator-sets
  Updates to the latest serai-dex/substrate due to depending on 10ccaca0eb498a2316bbf627d419b29b1a75933a.
* Add basic getters to tributary
* cargo update
* Update to the latest subxt
  Writes a custom unsigned extrinsic creator due to subxt having an internal error with the scale metadata. While the code in our scope increased, it's much more ergonomic for our usage. We may end up rewriting most of subxt, eventually.
* Make unsigned private due to unsafe calling potential
* Start defining the coordinator
* Merge AckBlock with Burns
  Offers greater efficiency while reducing concerns re: atomicity.
* Correct processor flow to have the coordinator decide signing set/re-attempts
  The signing set should be the first group to submit preprocesses to Tributary. Re-attempts shouldn't be once every 30s, yet n blocks since the last relevant message. Removes the use of an async task/channel in the signer (and Substrate signer). Also removes the need to be able to get the time from a coin's block, which was a fragile system marked with a TODO already.
* cargo +nightly fmt
* cargo update
  Since p256 now pulls in an extra crate with this update, the {k,p}256 imports disable default-features to prevent growing the tree.
* Support extracting timestamps from blocks
* Make progress on handling NewSet events
  Further bones out the coordinator.
* Resolve #245
* Have InInstructions track the latest block for a network in storage
* Fill out code for the rest of the Substrate events
* Clean up the Substrate block processing code
* Rename transaction file to tributary, add function for genesis
* Add a processor API to the coordinator
* Add extensive commentary on mutability to the processor's main file
  Clearly establishes why consistency is guaranteed from a Rust borrow-checker mindset. While there are plenty of... 'violations', they're clearly explained. Hopefully, this method of thinking helps promote/ensure consistency in the future.
* Move ConfirmKeyPair from key_gen to substrate
  Clarifies the emitter and accordingly why its mutations are justified.
* Remove BatchSigned
  SubstrateBlock's provision of the most recently acknowledged block has equivalent information with the same latency. Accordingly, there's no need for it.
* Add note to processor_messages
* Use a single txn for an entire coordinator message
  Removes direct DB accesses where possible. Documents the safety of the rest. Does uncover one case of unsafety not previously noted.
* cargo update to remove usage of yanked crate
* Clarify safety of Scanner::block_number and KeyGen::keys
* Tweak ConfirmKeyPair to alleviate database requirements of coordinator
* Use an enum for Coin/NetworkId
  It originally wasn't an enum so software which had yet to update before an integration wouldn't error (as now enums are strictly typed). The strict typing is preferable though.
* Code a method to determine the activation block before any block has consensus
  [0; 32] is a magic for "no block has been set yet" due to this being the first key pair. If [0; 32] is the latest finalized block, the processor determines an activation block based on timestamps. This doesn't use an Option for ergonomic reasons.
* automate whitespace & trimming test cases
* Save keys by their tweaked group_key
  Keys are referred to by their tweaked versions. If a tweak was needed, keys would fail to confirm.
* Use crypto-bigint's reduction in ed448
  Achieves feasible performance in ed448, which makes it potentially viable for real-world usage. Accordingly prepares a new release, updating the README.
* Move the entirety of ed448 to Residue, offering a further 2-4x speedup
* Resolve #68
  Notably speeds up monero-serai's build and CLSAG performance.
* Make MainDB into SubstrateDB
* Initial Tributary handling
* Add additional checks to key_gen/sign
  There is the ability to cause state bloat by flooding Tributary. KeyGen/Sign specifically shouldn't allow bloat since we check the commitments/preprocesses/shares for validity. Accordingly, any invalid data (such as bloat) should be detected. It was possible to place bloat after the valid data. Doing so would be considered a valid KeyGen/Sign message, yet could add up to 50k kB per sign.
* Apply DKG TX handling code to all sign TXs
  The existing code was almost entirely applicable. It just needed to be scoped with an ID. While the handle function is now a bit convoluted, I don't see a better option.
* Split FinalizedBlock into ExternalBlock and SeraiBlock
  Also re-arranges their orders.
* Add support for multiple orderings in Provided
  Necessary as our Tributary chains needed to agree when a Serai block has occurred, and when a Monero block has occurred. Since those could happen at the same time, some validators may put SeraiBlock before ExternalBlock and vice versa, causing a chain halt. Now they can have distinct ordering queues.
* Slash on unrecognized ID
* ExternalBlock handler
* Add a SubstrateBlockAck message to the processor
  When a Substrate block occurs, the coordinator is expected to emit SubstrateBlock. This causes the processor to begin a variety of plans. The processor now emits SubstrateBlockAck, explicitly listing all plan IDs, before starting signing. This lets the coordinator provide a SubstrateBlock transaction, and with it, recognize all plan IDs as valid. Prior, we would've had to have a spotty algorithm based upon the upcoming Preprocess messages, or if we immediately provided the SubstrateBlock transaction, then wait for the processor to inform us of the contained plans. This creates an explicitly proper async flow not reliant on waiting for data availability. Alternatively, we could've replaced Preprocess with (Block, Vec<Preprocess>). This would've been more efficient, yet also clunky due to the multiple usages of the Preprocess message.
* Route the SubstrateBlock message, which is the last Tributary transaction type
* Add the recent bloat checks added to signer to substrate_signer as well
* Add no_std support to transcript, dalek-ff-group, ed448, ciphersuite, multiexp, schnorr, and monero-generators
  transcript, dalek-ff-group, ed448, and ciphersuite are all usable with no_std alone. The rest additionally require alloc. Part of #279.
* Add a test to the coordinator for running a Tributary
  Impls a LocalP2p for testing. Moves rebroadcasting into Tendermint, since it's what knows if a message is fully valid + original. Removes TributarySpec::validators() HashMap, as its non-determinism caused different instances to have different round-robin schedules. It was already prior moved to a Vec for this issue, so I'm unsure why this remnant existed. Also renames the GH no-std workflow from the prior commit.
* Add a test for Tributary
  Further fleshes out the Tributary testing code.
* Test handling of DKG commitments transactions
* Add Transaction::sign
  While I don't love the introduction of empty_signed, it's practically fine.
* Tributary test wait_for_tx_inclusion function
* Additionally test DKGShares
* Handle adding new Tributaries
  Removes last_block as an argument from Tendermint. It now loads from the DB as needed. While slightly less performant, it's easiest and should be fine.
* Reload Tributaries
  add_active_tributary writes the spec to disk before it returns, so even if the VecDeque it pushes to isn't popped, the tributary will still be loaded on boot.
* Start handling P2P messages
  This defines the start of a very complex series of locks I'm really unhappy with. At the same time, there's not immediately a better solution. This also should work without issue.
* Clarify Arc RwLocks and sleeps in coordinator
* Send a heartbeat message when a Tributary falls behind
* cargo fmt
* cargo update
* Move json word lists to rs
  Allows building the seed code without serde_json.
* Break coordinator main into multiple functions
  Also moves from std::sync::RwLock to tokio::sync::RwLock to prevent wasting cycles on spinning.
* Remove reliance on a blockchain read lock from block/commit
* Implement Tributary syncing
  Also adds a forwards-lookup to the Tributary blockchain.
* Don't return from sync_block until the Tendermint machine returns if it's valid or not
  We had a race condition where we'd be informed of blocks 1 .. 3, and immediately add 1 .. 3. Because we immediately tried to add 2 after 1, it'd fail since the tip was still the genesis, yet 2 needs the tip to be 1. Adding a channel, while ugly, was the simplest way to accomplish this. Also has any added block be broadcasted. Else there's a race condition where a node which syncs up to the most recent block does so, yet fails to add the next block when it's committed to.
* Test handle_p2p and Tributary syncing
  Includes bug fixes.
* Tweak tests workflow
* Add a TributaryReader which doesn't require a borrow to operate
  Reduces lock contention. Additionally changes block_key to include the genesis. While not technically needed, the lack of genesis introduced a side effect where any Tributary on the database could return the block of any other Tributary. While that wasn't a security issue, returning it suggested it was on-chain when it wasn't. This may have been usable to create issues.
* Document panic in FROST
* Document a pair of panics requiring 256 GB of RAM/4 GB of a context
* Add a UID function to messages
  When we receive messages, we're provided with a message ID we can use to prevent handling an item multiple times. That doesn't prevent us from *sending* an item multiple times though. Thanks to the UID system, we can now not send if already present. Alternatively, we can remove the ordered message ID for just the UID, allowing duplicates to be sent without issue, and handled on the receiving end.
* Initial code to handle messages from processors
* Document the processor/tributary/coordinator/serai flow
* Have Coordinator MainDb take a mutable borrow
* Update to substrate polkadot-v0.9.42
* Correct error message in ff-group-tests
* Update to May's nightly
  Doesn't use the PR due to the needed changes.
* Support arbitrary RPC providers in monero-serai
  Sets a clean path for no-std premised RPCs (buffers to an external RPC impl) / Tor-based RPCs / client-side load balancing / ...
* Correct processor's handling of the new Monero RPC code
* Correct Serai Dockerfile
* Publish ExternalBlock/SubstrateBlock, delay *Preprocess until ID acknowledged
  Adds a channel for the Tributary scanner to communicate when an ID has been acknowledged.
* Rename uid to intent
* Use U448 for Ed448 instead of U512
* Spawn a new async task for each block message
  This probably should be done with n long-lived tasks, one per Tributary. While this may not be suitably performant long-term (potential DoS vector), this at least resolves the halting concerns.
* Move the coordinator to an n-processor design
* Ensure Tributary commits are minimal
* Properly get genesis for a Processor message
* Create a vote transaction upon GeneratedKeyPair
* Remove TODO about code de-duplication
  It's infeasible to write a macro/function there. Does add a type alias which makes things cleaner.
* Have coordinator publish batches to Substrate
* Implement MuSig key aggregation into DKG
  Isn't spec compliant due to the lack of a spec to be compliant to. Slight deviation from the paper by using a unique list instead of a multiset. Closes #186, progresses #277.
* Correct 2/3rds definitions throughout the codebase
  The prior formula failed for some values, such as 20: 20 / 3 = 6, * 2 = 12, + 1 = 13, and 13 is 65%, not >= 67%. (A worked example follows this commit message.)
* cargo update
  Resolves a yanked crate and removes some duplicated dependencies.
* Add a dedicated function to get a MuSig key
* Do the minimal amount of work for dkg to compile under no-std
  The Substrate runtime requires access to the MuSig key aggregation function. #279 related.
* Use a MuSig signature to publish validator set key pairs to Serai
  The processor/coordinator flow still has to be rewritten.
* Correct various no_std definitions
* Add a context to MuSig key aggregation
* Use proper messages for ValidatorSets/InInstructions pallet
  Provides a DST, and associated metadata as beneficial. Also utilizes MuSig's context to session-bind. Since set_keys_messages also binds to set, this is semi-redundant, yet that's appreciated.
* Remove signed Substrate TXs from Coordinator
* Only scan v2 Monero TXs
* Fix for prior commit
* Ensure canonical points in the cross-group DLEq proof
* Fix incorrect sig_hash generation
  sig_hash was used as a challenge. Challenges should be of the form H(R, A, m). These sig hashes were solely H(A, m), allowing trivial forgeries.
* cargo update
  Resolves an openssl advisory and nets ~-8 crates.
* Build no-std tests with RISC-V 32 IMAC
  Turns out wasm still has std, making it suboptimal to use here.
* Pin setup-protoc to v2.0.0
* Update to substrate polkadot-v0.9.43
* fix tributary sync test
* Slight terminology correction in sync test
  Also corrects a mistake from merging the most recent polkadot version.
* Update nightly
* Replace lazy_static with OnceLock inside monero-serai
  lazy_static, if no_std environments were used, effectively required always using spin locks. This resolves the ergonomics of that while adopting Rust std code. no_std does still use a spin-based solution. Theoretically, we could use atomics, yet writing our own Mutex wasn't a priority.
* no-std support for monero-serai (#311)
* Move monero-serai from std to std-shims, where possible
* no-std fixes
* Make the HttpRpc its own feature, thiserror only on std
* Drop monero-rs's epee for a homegrown one
  We only need it for a single function. While I tried jeffro's, it didn't work out of the box, had three unimplemented!s, and is nowhere near viable for no_std. Fixes #182, though should be further tested.
* no-std monero-serai
* Allow base58-monero via git
* cargo fmt
* Represent RCT amounts with None, not 0
  Fixes #282. Does allow any v1 TXs which exist, and v2 miner-TXs, to specify Some(0). As far as I can tell, both were/are theoretically possible.
* Add a message queue
  This is intended to be a reliable transport between the processors and coordinator. Since it'll be intranet only, it's written as never fail. Primarily needs testing and a proper ID.
* cargo update
  Resolves https://github.com/serai-dex/serai/security/dependabot/29
* Correct deny.toml with inclusion of message-queue
* Update nightly
* std-shims: six `Read` for &[u8]
* Use serai- prefixes on Serai-specific packages
  Fixes deny.toml, also runs a minor cargo update shrinking the tree.
* Update monero-tests workflow to new name for the processor
* Correct depends for processor-messages
* Disable Rust caching
  We hit the cache limit after just one or two builds, making it infeasible.
* cargo update
  Resolves a yanked crate
* Move location of serai-client in Cargo.toml
* Monero: support for legacy transactions (#308)
* add mlsag
* fix last commit
* fix miner v1 txs
* fix non-miner v1 txs
* add borromean + fix mlsag
* add block hash calculations
* fix for the jokester that added unreduced scalars to the borromean signature of 2368d846e671bf79a1f84c6d3af9f0bfe296f043f50cf17ae5e485384a53707b
* Add Borromean range proof verifying functionality
* Add MLSAG verifying functionality
* fmt & clippy :)
* update MLSAG, ss2_elements will always be 2
* Add MgSig proving
* Tidy block.rs
* Tidy Borromean, fix bugs in last commit, replace todo! with unreachable!
* Mark legacy EcdhInfo amount decryption as experimental
* Correct comments
* Write a new impl of the merkle algorithm
  This one tries to be understandable.
* Only pull in things only needed for experimental when experimental
* Stop caching the Monero block hash in the processor now that we have Block::hash
* Corrections for recent processor commit
* Use a clearer algorithm for the merkle
  Should also be more efficient due to not shifting as often.
* Tidy Mlsag
* Remove verify_rct_* from Mlsag
  Both methods were ports from Monero, overtly specific without clear documentation. They need to be added back in, with documentation, or included in a node which provides the necessary further context for them to be naturally understandable.
* Move mlsag/mod.rs to mlsag.rs
  This should only be a folder if it has multiple files.
* Replace EcdhInfo terminology
  The ECDH encrypted the amount, yet this struct contained the encrypted amount, not some ECDH. Also corrects the types on the original EcdhInfo struct.
* Correct handling of commitment masks when scanning
* Route read_array through read_raw_vec
* Misc lint
* Make a proper RctType enum
  No longer caches RctType in the RctSignatures as well.
* Replace Vec<Bulletproofs> with Bulletproofs
  Monero uses aggregated range proofs, so there's only ever one Bulletproof. This is enforced with a consensus rule as well, making this safe. As for why Monero uses a vec, it's probably due to the lack of variadic typing used. It's effectively an Option for them, yet we don't need an Option since we do have variadic typing (enums).
* Add necessary checks to Eventuality re: supported protocols
* Fix for block 202612 and fix merkle root calculations
* MLSAG (de)serialisation fix
  ss_2_elements will not always be 2 as RCT type 1 transactions are not enforced to have one input
* Revert "MLSAG (de)serialisation fix"
  This reverts commit 5e710e0c96. Here it checks number of MGs == number of inputs: 0a1eaf26f9/src/cryptonote_core/tx_verification_utils.cpp (L60-59), and here it checks for RctTypeFull number of MGs == 1: 0a1eaf26f9/src/ringct/rctSigs.cpp (L1325), so number of inputs == 1 so ss_2_elements == 2
* update `MlsagAggregate` comment
* cargo update
  Resolves a yanked crate
* Move location of serai-client in Cargo.toml
---------
Co-authored-by: Luke Parker <lukeparker5132@gmail.com>
* Fix the known issue with the DSA
  I wrote it to only select TXs with a timelock, not only TXs which are unlocked. This most likely explains why it so heavily selected coinbases. Also moves an InternalError which would've never been hit on mainnet, yet technically isn't an invariant, to only exist when cfg(test).
* Add a bin to download a chain, over RPC, reserializing and hashing every item
  Parallelized. Doesn't check the deserialization is correct. Does use distinct, persistent HTTP clients.
* Correct how Monero integration tests are run
* Support multiple RPCs in the reserialize_chain bin
* Don't call get_height every block
* Modify get_transactions to split requests so as to not hit the restricted RPC limits
* Meaningful changes from aggressive-clippy
  I do want to enable a few specific lints, yet aggressive-clippy as a whole isn't worthwhile.
* Extend reserialize_chain with CLSAG/BP(+) verification
* Remove spammy println from reserialize_chain
* Update reserialize_chain for v1 and migration TXs
  Also always marks 0-amount inputs as RCT due to impossibility of non-RCT 0-amount outputs.
* Only deserialize RctSignatures where there's at least one input
  This is only enforced by the Monero protocol due to a single check that the mixRing isn't empty in get_pre_mlsag_hash. The value in ensuring there's at least one input is to ensure the safety of our rct_type functions, which determine the RctType based off structural analysis (specifically, input data if MlsagBorromean). rct_type was technically safe without this. A 0-input transaction would be mis-classified as RctFull/MlsagAggregate, which would then make the RctSignatures invalid for being RctFull (requiring exactly one input) yet not having inputs, meaning an invalid RctSignatures would be mis-classified yet still invalid. This just removes the risk of mis-classification in the first place, tightening the library's safety.
* docs/Getting Started.md: cargo build --release --all-features
* Fix the known instance of #295
* Bind RocksDB into serai-db
* Split up tests in CI to avoid node storage limits
* Corrections to prior commit
* Again I called git commit --amend without calling git add . again :(
* Update the flow for completed signing processes
  Now, an on-chain transaction exists. This resolves some ambiguities and provides greater coordination.
* Clean Polyseed code
* Final tweaks
* Correct no-std builds for Polyseed
* Again correct no-std
---------
Co-authored-by: Luke Parker <lukeparker5132@gmail.com>
Co-authored-by: GitHub Actions <unknown>
Co-authored-by: Boog900 <54e72d8a-345f-4599-bd90-c6b9bc7d0ec5@aleeas.com>
Co-authored-by: Boog900 <108027008+Boog900@users.noreply.github.com>
Co-authored-by: Steven Chang <stevenchang5000@gmail.com>
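The "Correct 2/3rds definitions" item above is worth a worked example. The following is an illustrative sketch, not code from this commit, of an integer threshold that is always strictly greater than 2/3 of n, contrasted with the broken pattern quoted in the message:

// Illustrative only: one way to compute a > 2/3 supermajority without floats.
fn supermajority(n: u64) -> u64 {
  // Smallest integer strictly greater than 2n/3.
  ((2 * n) / 3) + 1
}

fn main() {
  // The pattern described above, ((n / 3) * 2) + 1, under-counts when n isn't a
  // multiple of 3: for n = 20 it yields 13, which is only 65%.
  assert_eq!(((20 / 3) * 2) + 1, 13);
  // The corrected reading requires 14 of 20 (70%).
  assert_eq!(supermajority(20), 14);
  assert_eq!(supermajority(21), 15);
}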
parent 807ec30762
commit 23b9d57305
16 changed files with 21246 additions and 49 deletions
Cargo.lock (generated; 16 changed lines)

@@ -5140,6 +5140,7 @@ dependencies = [
 "monero-generators",
 "monero-rpc",
 "multiexp",
 "pbkdf2 0.12.1",
 "rand 0.8.5",
 "rand_chacha 0.3.1",
 "rand_core 0.6.4",

@@ -6008,6 +6009,17 @@ dependencies = [
 "subtle",
]

[[package]]
name = "password-hash"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "346f04948ba92c43e8469c1ee6736c7563d71012b17d40745260fe106aac2166"
dependencies = [
 "base64ct",
 "rand_core 0.6.4",
 "subtle",
]

[[package]]
name = "pasta_curves"
version = "0.5.1"

@@ -6043,7 +6055,7 @@ checksum = "83a0692ec44e4cf1ef28ca317f14f8f07da2d95ec3fa01f86e4467b725e60917"
dependencies = [
 "digest 0.10.7",
 "hmac 0.12.1",
 "password-hash",
 "password-hash 0.4.2",
 "sha2 0.10.7",
]

@@ -6055,6 +6067,8 @@ checksum = "f0ca0b5a68607598bf3bad68f32227a8164f6254833f84eafaac409cd6746c31"
dependencies = [
 "digest 0.10.7",
 "hmac 0.12.1",
 "password-hash 0.5.0",
 "sha2 0.10.7",
]

[[package]]
@@ -29,6 +29,7 @@ rand_distr = { version = "0.4", default-features = false }

crc = { version = "3", default-features = false }
sha3 = { version = "0.10", default-features = false }
pbkdf2 = { version = "0.12", features = ["simple"], default-features = false }

curve25519-dalek = { version = "^3.2", default-features = false }
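The pbkdf2 dependency added above (with the "simple" feature) is what the new Polyseed code uses to stretch a seed's entropy into a wallet key. A minimal, illustrative sketch of that derivation, using the constants that appear in the new polyseed.rs later in this diff (POLYSEED_SALT, 10000 iterations, SHA3-256); the real code operates on zeroizing buffers:

use pbkdf2::pbkdf2_hmac;
use sha3::Sha3_256;

// Constants as defined in the new coins/monero/src/wallet/seed/polyseed.rs.
const POLYSEED_SALT: &[u8] = b"POLYSEED key";
const POLYSEED_KEYGEN_ITERATIONS: u32 = 10000;

// Sketch of the derivation performed by Polyseed::key().
fn derive_key(entropy: &[u8; 32]) -> [u8; 32] {
  let mut key = [0u8; 32];
  pbkdf2_hmac::<Sha3_256>(entropy, POLYSEED_SALT, POLYSEED_KEYGEN_ITERATIONS, &mut key);
  key
}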
@ -6,13 +6,17 @@ use curve25519_dalek::scalar::Scalar;
|
|||
|
||||
use crate::{
|
||||
hash,
|
||||
wallet::seed::{Seed, Language, classic::trim_by_lang},
|
||||
wallet::seed::{
|
||||
Seed, SeedType,
|
||||
classic::{self, trim_by_lang},
|
||||
polyseed,
|
||||
},
|
||||
};
|
||||
|
||||
#[test]
|
||||
fn test_classic_seed() {
|
||||
struct Vector {
|
||||
language: Language,
|
||||
language: classic::Language,
|
||||
seed: String,
|
||||
spend: String,
|
||||
view: String,
|
||||
|
@ -20,13 +24,13 @@ fn test_classic_seed() {
|
|||
|
||||
let vectors = [
|
||||
Vector {
|
||||
language: Language::Chinese,
|
||||
language: classic::Language::Chinese,
|
||||
seed: "摇 曲 艺 武 滴 然 效 似 赏 式 祥 歌 买 疑 小 碧 堆 博 键 房 鲜 悲 付 喷 武".into(),
|
||||
spend: "a5e4fff1706ef9212993a69f246f5c95ad6d84371692d63e9bb0ea112a58340d".into(),
|
||||
view: "1176c43ce541477ea2f3ef0b49b25112b084e26b8a843e1304ac4677b74cdf02".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::English,
|
||||
language: classic::Language::English,
|
||||
seed: "washing thirsty occur lectures tuesday fainted toxic adapt \
|
||||
abnormal memoir nylon mostly building shrugged online ember northern \
|
||||
ruby woes dauntless boil family illness inroads northern"
|
||||
|
@ -35,7 +39,7 @@ fn test_classic_seed() {
|
|||
view: "513ba91c538a5a9069e0094de90e927c0cd147fa10428ce3ac1afd49f63e3b01".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Dutch,
|
||||
language: classic::Language::Dutch,
|
||||
seed: "setwinst riphagen vimmetje extase blief tuitelig fuiven meifeest \
|
||||
ponywagen zesmaal ripdeal matverf codetaal leut ivoor rotten \
|
||||
wisgerhof winzucht typograaf atrium rein zilt traktaat verzaagd setwinst"
|
||||
|
@ -44,7 +48,7 @@ fn test_classic_seed() {
|
|||
view: "eac30b69477e3f68093d131c7fd961564458401b07f8c87ff8f6030c1a0c7301".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::French,
|
||||
language: classic::Language::French,
|
||||
seed: "poids vaseux tarte bazar poivre effet entier nuance \
|
||||
sensuel ennui pacte osselet poudre battre alibi mouton \
|
||||
stade paquet pliage gibier type question position projet pliage"
|
||||
|
@ -53,7 +57,7 @@ fn test_classic_seed() {
|
|||
view: "6725b32230400a1032f31d622b44c3a227f88258939b14a7c72e00939e7bdf0e".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Spanish,
|
||||
language: classic::Language::Spanish,
|
||||
seed: "minero ocupar mirar evadir octubre cal logro miope \
|
||||
opaco disco ancla litio clase cuello nasal clase \
|
||||
fiar avance deseo mente grumo negro cordón croqueta clase"
|
||||
|
@ -62,7 +66,7 @@ fn test_classic_seed() {
|
|||
view: "18deafb34d55b7a43cae2c1c1c206a3c80c12cc9d1f84640b484b95b7fec3e05".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::German,
|
||||
language: classic::Language::German,
|
||||
seed: "Kaliber Gabelung Tapir Liveband Favorit Specht Enklave Nabel \
|
||||
Jupiter Foliant Chronik nisten löten Vase Aussage Rekord \
|
||||
Yeti Gesetz Eleganz Alraune Künstler Almweide Jahr Kastanie Almweide"
|
||||
|
@ -71,7 +75,7 @@ fn test_classic_seed() {
|
|||
view: "99f0ec556643bd9c038a4ed86edcb9c6c16032c4622ed2e000299d527a792701".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Italian,
|
||||
language: classic::Language::Italian,
|
||||
seed: "cavo pancetta auto fulmine alleanza filmato diavolo prato \
|
||||
forzare meritare litigare lezione segreto evasione votare buio \
|
||||
licenza cliente dorso natale crescere vento tutelare vetta evasione"
|
||||
|
@ -80,7 +84,7 @@ fn test_classic_seed() {
|
|||
view: "698a1dce6018aef5516e82ca0cb3e3ec7778d17dfb41a137567bfa2e55e63a03".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Portuguese,
|
||||
language: classic::Language::Portuguese,
|
||||
seed: "agito eventualidade onus itrio holograma sodomizar objetos dobro \
|
||||
iugoslavo bcrepuscular odalisca abjeto iuane darwinista eczema acetona \
|
||||
cibernetico hoquei gleba driver buffer azoto megera nogueira agito"
|
||||
|
@ -89,7 +93,7 @@ fn test_classic_seed() {
|
|||
view: "ad1b4fd35270f5f36c4da7166672b347e75c3f4d41346ec2a06d1d0193632801".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Japanese,
|
||||
language: classic::Language::Japanese,
|
||||
seed: "ぜんぶ どうぐ おたがい せんきょ おうじ そんちょう じゅしん いろえんぴつ \
|
||||
かほう つかれる えらぶ にちじょう くのう にちようび ぬまえび さんきゃく \
|
||||
おおや ちぬき うすめる いがく せつでん さうな すいえい せつだん おおや"
|
||||
|
@ -98,7 +102,7 @@ fn test_classic_seed() {
|
|||
view: "6c3634a313ec2ee979d565c33888fd7c3502d696ce0134a8bc1a2698c7f2c508".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Russian,
|
||||
language: classic::Language::Russian,
|
||||
seed: "шатер икра нация ехать получать инерция доза реальный \
|
||||
рыжий таможня лопата душа веселый клетка атлас лекция \
|
||||
обгонять паек наивный лыжный дурак стать ежик задача паек"
|
||||
|
@ -107,7 +111,7 @@ fn test_classic_seed() {
|
|||
view: "fcd53e41ec0df995ab43927f7c44bc3359c93523d5009fb3f5ba87431d545a03".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Esperanto,
|
||||
language: classic::Language::Esperanto,
|
||||
seed: "ukazo klini peco etikedo fabriko imitado onklino urino \
|
||||
pudro incidento kumuluso ikono smirgi hirundo uretro krii \
|
||||
sparkado super speciala pupo alpinisto cvana vokegi zombio fabriko"
|
||||
|
@ -116,7 +120,7 @@ fn test_classic_seed() {
|
|||
view: "cd4d120e1ea34360af528f6a3e6156063312d9cefc9aa6b5218d366c0ed6a201".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::Lojban,
|
||||
language: classic::Language::Lojban,
|
||||
seed: "jetnu vensa julne xrotu xamsi julne cutci dakli \
|
||||
mlatu xedja muvgau palpi xindo sfubu ciste cinri \
|
||||
blabi darno dembi janli blabi fenki bukpu burcu blabi"
|
||||
|
@ -125,7 +129,7 @@ fn test_classic_seed() {
|
|||
view: "c806ce62bafaa7b2d597f1a1e2dbe4a2f96bfd804bf6f8420fc7f4a6bd700c00".into(),
|
||||
},
|
||||
Vector {
|
||||
language: Language::EnglishOld,
|
||||
language: classic::Language::EnglishOld,
|
||||
seed: "glorious especially puff son moment add youth nowhere \
|
||||
throw glide grip wrong rhythm consume very swear \
|
||||
bitter heavy eventually begin reason flirt type unable"
|
||||
|
@ -163,15 +167,216 @@ fn test_classic_seed() {
|
|||
Scalar::from_canonical_bytes(view).unwrap()
|
||||
);
|
||||
|
||||
assert_eq!(Seed::from_entropy(vector.language, Zeroizing::new(spend)).unwrap(), seed);
|
||||
assert_eq!(
|
||||
Seed::from_entropy(SeedType::Classic(vector.language), Zeroizing::new(spend), None)
|
||||
.unwrap(),
|
||||
seed
|
||||
);
|
||||
}
|
||||
|
||||
// Test against ourself
|
||||
// Test against ourselves
|
||||
{
|
||||
let seed = Seed::new(&mut OsRng, vector.language);
|
||||
let seed = Seed::new(&mut OsRng, SeedType::Classic(vector.language));
|
||||
assert_eq!(seed, Seed::from_string(Zeroizing::new(trim_seed(&seed.to_string()))).unwrap());
|
||||
assert_eq!(seed, Seed::from_entropy(vector.language, seed.entropy()).unwrap());
|
||||
assert_eq!(
|
||||
seed,
|
||||
Seed::from_entropy(SeedType::Classic(vector.language), seed.entropy(), None).unwrap()
|
||||
);
|
||||
assert_eq!(seed, Seed::from_string(seed.to_string()).unwrap());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_polyseed() {
|
||||
struct Vector {
|
||||
language: polyseed::Language,
|
||||
seed: String,
|
||||
entropy: String,
|
||||
birthday: u64,
|
||||
has_prefix: bool,
|
||||
has_accent: bool,
|
||||
}
|
||||
|
||||
let vectors = [
|
||||
Vector {
|
||||
language: polyseed::Language::English,
|
||||
seed: "raven tail swear infant grief assist regular lamp \
|
||||
duck valid someone little harsh puppy airport language"
|
||||
.into(),
|
||||
entropy: "dd76e7359a0ded37cd0ff0f3c829a5ae01673300000000000000000000000000".into(),
|
||||
birthday: 1638446400,
|
||||
has_prefix: true,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::Spanish,
|
||||
seed: "eje fin parte célebre tabú pestaña lienzo puma \
|
||||
prisión hora regalo lengua existir lápiz lote sonoro"
|
||||
.into(),
|
||||
entropy: "5a2b02df7db21fcbe6ec6df137d54c7b20fd2b00000000000000000000000000".into(),
|
||||
birthday: 3118651200,
|
||||
has_prefix: true,
|
||||
has_accent: true,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::French,
|
||||
seed: "valable arracher décaler jeudi amusant dresser mener épaissir risible \
|
||||
prouesse réserve ampleur ajuster muter caméra enchère"
|
||||
.into(),
|
||||
entropy: "11cfd870324b26657342c37360c424a14a050b00000000000000000000000000".into(),
|
||||
birthday: 1679314966,
|
||||
has_prefix: true,
|
||||
has_accent: true,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::Italian,
|
||||
seed: "caduco midollo copione meninge isotopo illogico riflesso tartaruga fermento \
|
||||
olandese normale tristezza episodio voragine forbito achille"
|
||||
.into(),
|
||||
entropy: "7ecc57c9b4652d4e31428f62bec91cfd55500600000000000000000000000000".into(),
|
||||
birthday: 1679316358,
|
||||
has_prefix: true,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::Portuguese,
|
||||
seed: "caverna custear azedo adeus senador apertada sedoso omitir \
|
||||
sujeito aurora videira molho cartaz gesso dentista tapar"
|
||||
.into(),
|
||||
entropy: "45473063711376cae38f1b3eba18c874124e1d00000000000000000000000000".into(),
|
||||
birthday: 1679316657,
|
||||
has_prefix: true,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::Czech,
|
||||
seed: "usmrtit nora dotaz komunita zavalit funkce mzda sotva akce \
|
||||
vesta kabel herna stodola uvolnit ustrnout email"
|
||||
.into(),
|
||||
entropy: "7ac8a4efd62d9c3c4c02e350d32326df37821c00000000000000000000000000".into(),
|
||||
birthday: 1679316898,
|
||||
has_prefix: true,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::Korean,
|
||||
seed: "전망 선풍기 국제 무궁화 설사 기름 이론적 해안 절망 예선 \
|
||||
지우개 보관 절망 말기 시각 귀신"
|
||||
.into(),
|
||||
entropy: "684663fda420298f42ed94b2c512ed38ddf12b00000000000000000000000000".into(),
|
||||
birthday: 1679317073,
|
||||
has_prefix: false,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::Japanese,
|
||||
seed: "うちあわせ ちつじょ つごう しはい けんこう とおる てみやげ はんとし たんとう \
|
||||
といれ おさない おさえる むかう ぬぐう なふだ せまる"
|
||||
.into(),
|
||||
entropy: "94e6665518a6286c6e3ba508a2279eb62b771f00000000000000000000000000".into(),
|
||||
birthday: 1679318722,
|
||||
has_prefix: false,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::ChineseTraditional,
|
||||
seed: "亂 挖 斤 柄 代 圈 枝 轄 魯 論 函 開 勘 番 榮 壁".into(),
|
||||
entropy: "b1594f585987ab0fd5a31da1f0d377dae5283f00000000000000000000000000".into(),
|
||||
birthday: 1679426433,
|
||||
has_prefix: false,
|
||||
has_accent: false,
|
||||
},
|
||||
Vector {
|
||||
language: polyseed::Language::ChineseSimplified,
|
||||
seed: "啊 百 族 府 票 划 伪 仓 叶 虾 借 溜 晨 左 等 鬼".into(),
|
||||
entropy: "21cdd366f337b89b8d1bc1df9fe73047c22b0300000000000000000000000000".into(),
|
||||
birthday: 1679426817,
|
||||
has_prefix: false,
|
||||
has_accent: false,
|
||||
},
|
||||
];
|
||||
|
||||
for vector in vectors {
|
||||
let add_whitespace = |mut seed: String| {
|
||||
seed.push(' ');
|
||||
seed
|
||||
};
|
||||
|
||||
let seed_without_accents = |seed: &str| {
|
||||
seed
|
||||
.split_whitespace()
|
||||
.map(|w| w.chars().filter(|c| c.is_ascii()).collect::<String>())
|
||||
.collect::<Vec<_>>()
|
||||
.join(" ")
|
||||
};
|
||||
|
||||
let trim_seed = |seed: &str| {
|
||||
let seed_to_trim =
|
||||
if vector.has_accent { seed_without_accents(seed) } else { seed.to_string() };
|
||||
seed_to_trim
|
||||
.split_whitespace()
|
||||
.map(|w| {
|
||||
let mut ascii = 0;
|
||||
let mut to_take = w.len();
|
||||
for (i, char) in w.chars().enumerate() {
|
||||
if char.is_ascii() {
|
||||
ascii += 1;
|
||||
}
|
||||
if ascii == polyseed::PREFIX_LEN {
|
||||
// +1 to include this character, which put us at the prefix length
|
||||
to_take = i + 1;
|
||||
break;
|
||||
}
|
||||
}
|
||||
w.chars().take(to_take).collect::<String>()
|
||||
})
|
||||
.collect::<Vec<_>>()
|
||||
.join(" ")
|
||||
};
|
||||
|
||||
// String -> Seed
|
||||
let seed = Seed::from_string(Zeroizing::new(vector.seed.clone())).unwrap();
|
||||
|
||||
// Make sure a version with added whitespace still works
|
||||
let whitespaced_seed =
|
||||
Seed::from_string(Zeroizing::new(add_whitespace(vector.seed.clone()))).unwrap();
|
||||
assert_eq!(seed, whitespaced_seed);
|
||||
// Check trimmed versions works
|
||||
if vector.has_prefix {
|
||||
let trimmed_seed = Seed::from_string(Zeroizing::new(trim_seed(&vector.seed))).unwrap();
|
||||
assert_eq!(seed, trimmed_seed);
|
||||
}
|
||||
// Check versions without accents work
|
||||
if vector.has_accent {
|
||||
let seed_without_accents =
|
||||
Seed::from_string(Zeroizing::new(seed_without_accents(&vector.seed))).unwrap();
|
||||
assert_eq!(seed, seed_without_accents);
|
||||
}
|
||||
|
||||
let entropy = Zeroizing::new(hex::decode(vector.entropy).unwrap().try_into().unwrap());
|
||||
assert_eq!(seed.entropy(), entropy);
|
||||
assert!(seed.birthday().abs_diff(vector.birthday) < polyseed::TIME_STEP);
|
||||
|
||||
// Entropy -> Seed
|
||||
let from_entropy =
|
||||
Seed::from_entropy(SeedType::Polyseed(vector.language), entropy, Some(seed.birthday()))
|
||||
.unwrap();
|
||||
assert_eq!(seed.to_string(), from_entropy.to_string());
|
||||
|
||||
// Check against ourselves
|
||||
{
|
||||
let seed = Seed::new(&mut OsRng, SeedType::Polyseed(vector.language));
|
||||
assert_eq!(seed, Seed::from_string(seed.to_string()).unwrap());
|
||||
assert_eq!(
|
||||
seed,
|
||||
Seed::from_entropy(
|
||||
SeedType::Polyseed(vector.language),
|
||||
seed.entropy(),
|
||||
Some(seed.birthday())
|
||||
)
|
||||
.unwrap()
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
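The polyseed test above only checks the recovered birthday to within polyseed::TIME_STEP, because Polyseed stores the birthday as a 10-bit count of TIME_STEP-sized intervals since a fixed epoch. The constants below are taken from the new polyseed.rs later in this diff; the round-trip is illustrative:

// Constants from the new polyseed.rs.
const POLYSEED_EPOCH: u64 = 1635768000; // 1st November 2021 12:00 UTC
const TIME_STEP: u64 = 2629746; // 30.436875 days, 1/12 of the Gregorian year
const DATE_MASK: u16 = (1 << 10) - 1; // 10 bits of storage

fn birthday_encode(time: u64) -> u16 {
  u16::try_from((time.saturating_sub(POLYSEED_EPOCH) / TIME_STEP) & u64::from(DATE_MASK)).unwrap()
}

fn birthday_decode(birthday: u16) -> u64 {
  POLYSEED_EPOCH + (u64::from(birthday) * TIME_STEP)
}

fn main() {
  // Quantization loses strictly less than one TIME_STEP, hence the test's
  // abs_diff(vector.birthday) < TIME_STEP rather than exact equality.
  let t: u64 = 1679314966; // the French vector's birthday
  let round_tripped = birthday_decode(birthday_encode(t));
  assert!(t.abs_diff(round_tripped) < TIME_STEP);
  // 1024 steps of ~30.44 days is roughly 85 years before the field rolls over.
  assert_eq!((1024 * TIME_STEP) / (365 * 24 * 60 * 60), 85);
}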
|
@@ -13,14 +13,28 @@ use crc::{Crc, CRC_32_ISO_HDLC};

use curve25519_dalek::scalar::Scalar;

use crate::{
  random_scalar,
  wallet::seed::{SeedError, Language},
};
use crate::{random_scalar, wallet::seed::SeedError};

pub(crate) const CLASSIC_SEED_LENGTH: usize = 24;
pub(crate) const CLASSIC_SEED_LENGTH_WITH_CHECKSUM: usize = 25;

#[derive(Clone, Copy, PartialEq, Eq, Debug, Hash)]
pub enum Language {
  Chinese,
  English,
  Dutch,
  French,
  Spanish,
  German,
  Italian,
  Portuguese,
  Japanese,
  Russian,
  Esperanto,
  Lojban,
  EnglishOld,
}

fn trim(word: &str, len: usize) -> Zeroizing<String> {
  Zeroizing::new(word.chars().take(len).collect())
}
|
||||
|
|
|
@@ -5,7 +5,9 @@ use zeroize::{Zeroize, ZeroizeOnDrop, Zeroizing};
use rand_core::{RngCore, CryptoRng};

pub(crate) mod classic;
pub(crate) mod polyseed;
use classic::{CLASSIC_SEED_LENGTH, CLASSIC_SEED_LENGTH_WITH_CHECKSUM, ClassicSeed};
use polyseed::{POLYSEED_LENGTH, Polyseed};

/// Error when decoding a seed.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]

@@ -19,74 +21,106 @@ pub enum SeedError {
  InvalidChecksum,
  #[cfg_attr(feature = "std", error("english old seeds don't support checksums"))]
  EnglishOldWithChecksum,
  #[cfg_attr(feature = "std", error("provided entropy is not valid"))]
  InvalidEntropy,
  #[cfg_attr(feature = "std", error("invalid seed"))]
  InvalidSeed,
  #[cfg_attr(feature = "std", error("provided features are not supported"))]
  UnsupportedFeatures,
}

#[derive(Clone, Copy, PartialEq, Eq, Debug, Hash)]
pub enum Language {
  Chinese,
  English,
  Dutch,
  French,
  Spanish,
  German,
  Italian,
  Portuguese,
  Japanese,
  Russian,
  Esperanto,
  Lojban,
  EnglishOld,
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub enum SeedType {
  Classic(classic::Language),
  Polyseed(polyseed::Language),
}

/// A Monero seed.
// TODO: Add polyseed to enum
#[derive(Clone, PartialEq, Eq, Zeroize, ZeroizeOnDrop)]
pub enum Seed {
  Classic(ClassicSeed),
  Polyseed(Polyseed),
}

impl fmt::Debug for Seed {
  fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
    match self {
      Seed::Classic(_) => f.debug_struct("Seed::Classic").finish_non_exhaustive(),
      Seed::Polyseed(_) => f.debug_struct("Seed::Polyseed").finish_non_exhaustive(),
    }
  }
}

impl Seed {
  /// Create a new seed.
  pub fn new<R: RngCore + CryptoRng>(rng: &mut R, lang: Language) -> Seed {
    Seed::Classic(ClassicSeed::new(rng, lang))
  /// Creates a new `Seed`.
  pub fn new<R: RngCore + CryptoRng>(rng: &mut R, seed_type: SeedType) -> Seed {
    match seed_type {
      SeedType::Classic(lang) => Seed::Classic(ClassicSeed::new(rng, lang)),
      SeedType::Polyseed(lang) => Seed::Polyseed(Polyseed::new(rng, lang)),
    }
  }

  /// Parse a seed from a String.
  /// Parse a seed from a `String`.
  pub fn from_string(words: Zeroizing<String>) -> Result<Seed, SeedError> {
    match words.split_whitespace().count() {
      CLASSIC_SEED_LENGTH | CLASSIC_SEED_LENGTH_WITH_CHECKSUM => {
        ClassicSeed::from_string(words).map(Seed::Classic)
      }
      POLYSEED_LENGTH => Polyseed::from_string(words).map(Seed::Polyseed),
      _ => Err(SeedError::InvalidSeedLength)?,
    }
  }

  /// Create a Seed from entropy.
  pub fn from_entropy(lang: Language, entropy: Zeroizing<[u8; 32]>) -> Option<Seed> {
    ClassicSeed::from_entropy(lang, entropy).map(Seed::Classic)
  /// Creates a `Seed` from an entropy and an optional birthday (denoted in seconds since the
  /// epoch).
  ///
  /// For `SeedType::Classic`, the birthday is ignored.
  ///
  /// For `SeedType::Polyseed`, the last 13 bytes of `entropy` must be `0`.
  // TODO: Return Result, not Option
  pub fn from_entropy(
    seed_type: SeedType,
    entropy: Zeroizing<[u8; 32]>,
    birthday: Option<u64>,
  ) -> Option<Seed> {
    match seed_type {
      SeedType::Classic(lang) => ClassicSeed::from_entropy(lang, entropy).map(Seed::Classic),
      SeedType::Polyseed(lang) => {
        Polyseed::from(lang, 0, birthday.unwrap_or(0), entropy).map(Seed::Polyseed).ok()
      }
    }
  }

  /// Convert a seed to a String.
  /// Returns seed as `String`.
  pub fn to_string(&self) -> Zeroizing<String> {
    match self {
      Seed::Classic(seed) => seed.to_string(),
      Seed::Polyseed(seed) => seed.to_string(),
    }
  }

  /// Return the entropy for this seed.
  /// Returns the entropy for this seed.
  pub fn entropy(&self) -> Zeroizing<[u8; 32]> {
    match self {
      Seed::Classic(seed) => seed.entropy(),
      Seed::Polyseed(seed) => seed.entropy().clone(),
    }
  }

  /// Returns the key derived from this seed.
  pub fn key(&self) -> Zeroizing<[u8; 32]> {
    match self {
      // Classic does not differentiate between its entropy and its key
      Seed::Classic(seed) => seed.entropy(),
      Seed::Polyseed(seed) => seed.key(),
    }
  }

  /// Returns the birthday of this seed.
  pub fn birthday(&self) -> u64 {
    match self {
      Seed::Classic(_) => 0,
      Seed::Polyseed(seed) => seed.birthday(),
    }
  }
}
|
||||
|
|
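With the reshaped API above, classic and Polyseed mnemonics share one entry point. An illustrative sketch of how the new tests earlier in this diff drive it; paths are crate-internal, as in those tests, since the seed modules are pub(crate):

use rand_core::OsRng;
use crate::wallet::seed::{Seed, SeedType, classic, polyseed};

fn demo() {
  // Classic (25-word) seeds now go through SeedType::Classic.
  let classic_seed = Seed::new(&mut OsRng, SeedType::Classic(classic::Language::English));
  assert_eq!(classic_seed, Seed::from_string(classic_seed.to_string()).unwrap());

  // Polyseed (16-word) seeds carry a birthday alongside their entropy.
  let poly_seed = Seed::new(&mut OsRng, SeedType::Polyseed(polyseed::Language::English));
  let restored = Seed::from_entropy(
    SeedType::Polyseed(polyseed::Language::English),
    poly_seed.entropy(),
    Some(poly_seed.birthday()),
  )
  .unwrap();
  assert_eq!(poly_seed.to_string(), restored.to_string());
}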
coins/monero/src/wallet/seed/polyseed.rs (new file, 429 lines)
@@ -0,0 +1,429 @@
|
|||
use core::fmt;
|
||||
use std_shims::{sync::OnceLock, vec::Vec, string::String, collections::HashMap};
|
||||
#[cfg(feature = "std")]
|
||||
use std::time::{SystemTime, UNIX_EPOCH};
|
||||
|
||||
use subtle::ConstantTimeEq;
|
||||
use zeroize::{Zeroize, Zeroizing, ZeroizeOnDrop};
|
||||
use rand_core::{RngCore, CryptoRng};
|
||||
|
||||
use sha3::Sha3_256;
|
||||
use pbkdf2::pbkdf2_hmac;
|
||||
|
||||
use super::SeedError;
|
||||
|
||||
// Features
|
||||
const FEATURE_BITS: u8 = 5;
|
||||
#[allow(dead_code)]
|
||||
const INTERNAL_FEATURES: u8 = 2;
|
||||
const USER_FEATURES: u8 = 3;
|
||||
|
||||
const USER_FEATURES_MASK: u8 = (1 << USER_FEATURES) - 1;
|
||||
const ENCRYPTED_MASK: u8 = 1 << 4;
|
||||
const RESERVED_FEATURES_MASK: u8 = ((1 << FEATURE_BITS) - 1) ^ ENCRYPTED_MASK;
|
||||
|
||||
fn user_features(features: u8) -> u8 {
|
||||
features & USER_FEATURES_MASK
|
||||
}
|
||||
|
||||
fn polyseed_features_supported(features: u8) -> bool {
|
||||
(features & RESERVED_FEATURES_MASK) == 0
|
||||
}
|
||||
|
||||
// Dates
|
||||
const DATE_BITS: u8 = 10;
|
||||
const DATE_MASK: u16 = (1u16 << DATE_BITS) - 1;
|
||||
const POLYSEED_EPOCH: u64 = 1635768000; // 1st November 2021 12:00 UTC
|
||||
pub(crate) const TIME_STEP: u64 = 2629746; // 30.436875 days = 1/12 of the Gregorian year
|
||||
|
||||
// After ~85 years, this will roll over.
|
||||
fn birthday_encode(time: u64) -> u16 {
|
||||
u16::try_from((time.saturating_sub(POLYSEED_EPOCH) / TIME_STEP) & u64::from(DATE_MASK))
|
||||
.expect("value masked by 2**10 - 1 didn't fit into a u16")
|
||||
}
|
||||
|
||||
fn birthday_decode(birthday: u16) -> u64 {
|
||||
POLYSEED_EPOCH + (u64::from(birthday) * TIME_STEP)
|
||||
}
|
||||
|
||||
// Polyseed parameters
|
||||
const SECRET_BITS: usize = 150;
|
||||
|
||||
const BITS_PER_BYTE: usize = 8;
|
||||
// ceildiv of SECRET_BITS by BITS_PER_BYTE
|
||||
const SECRET_SIZE: usize = (SECRET_BITS + BITS_PER_BYTE - 1) / BITS_PER_BYTE; // 19
|
||||
const CLEAR_BITS: usize = (SECRET_SIZE * BITS_PER_BYTE) - SECRET_BITS; // 2
|
||||
|
||||
// Polyseed calls this CLEAR_MASK and has a very complicated formula for this fundamental
|
||||
// equivalency
|
||||
const LAST_BYTE_SECRET_BITS_MASK: u8 = ((1 << (BITS_PER_BYTE - CLEAR_BITS)) - 1) as u8;
|
||||
|
||||
const SECRET_BITS_PER_WORD: usize = 10;
|
||||
|
||||
// Amount of words in a seed
|
||||
pub(crate) const POLYSEED_LENGTH: usize = 16;
|
||||
// Amount of characters each word must have if trimmed
|
||||
pub(crate) const PREFIX_LEN: usize = 4;
|
||||
|
||||
const POLY_NUM_CHECK_DIGITS: usize = 1;
|
||||
const DATA_WORDS: usize = POLYSEED_LENGTH - POLY_NUM_CHECK_DIGITS;
|
||||
|
||||
// Polynomial
|
||||
const GF_BITS: usize = 11;
|
||||
const POLYSEED_MUL2_TABLE: [u16; 8] = [5, 7, 1, 3, 13, 15, 9, 11];
|
||||
|
||||
type Poly = [u16; POLYSEED_LENGTH];
|
||||
|
||||
fn elem_mul2(x: u16) -> u16 {
|
||||
if x < 1024 {
|
||||
return 2 * x;
|
||||
}
|
||||
POLYSEED_MUL2_TABLE[usize::from(x % 8)] + (16 * ((x - 1024) / 8))
|
||||
}
|
||||
|
||||
fn poly_eval(poly: &Poly) -> u16 {
|
||||
// Horner's method at x = 2
|
||||
let mut result = poly[POLYSEED_LENGTH - 1];
|
||||
for i in (0 .. (POLYSEED_LENGTH - 1)).rev() {
|
||||
result = elem_mul2(result) ^ poly[i];
|
||||
}
|
||||
result
|
||||
}
|
||||
|
||||
// Key gen parameters
|
||||
const POLYSEED_SALT: &[u8] = b"POLYSEED key";
|
||||
const POLYSEED_KEYGEN_ITERATIONS: u32 = 10000;
|
||||
|
||||
// Polyseed technically supports multiple coins, and the value for Monero is 0
|
||||
// See: https://github.com/tevador/polyseed/blob/master/include/polyseed.h#L58
|
||||
const COIN: u16 = 0;
|
||||
|
||||
/// Language options for Polyseed.
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug, Zeroize)]
|
||||
pub enum Language {
|
||||
English,
|
||||
Spanish,
|
||||
French,
|
||||
Italian,
|
||||
Japanese,
|
||||
Korean,
|
||||
Czech,
|
||||
Portuguese,
|
||||
ChineseSimplified,
|
||||
ChineseTraditional,
|
||||
}
|
||||
|
||||
struct WordList {
|
||||
words: Vec<String>,
|
||||
has_prefix: bool,
|
||||
has_accent: bool,
|
||||
}
|
||||
|
||||
impl WordList {
|
||||
fn new(words: &str, has_prefix: bool, has_accent: bool) -> WordList {
|
||||
let res = WordList { words: serde_json::from_str(words).unwrap(), has_prefix, has_accent };
|
||||
// This is needed for a later unwrap to not fails
|
||||
assert!(words.len() < usize::from(u16::MAX));
|
||||
res
|
||||
}
|
||||
}
|
||||
|
||||
static LANGUAGES_CELL: OnceLock<HashMap<Language, WordList>> = OnceLock::new();
|
||||
#[allow(non_snake_case)]
|
||||
fn LANGUAGES() -> &'static HashMap<Language, WordList> {
|
||||
LANGUAGES_CELL.get_or_init(|| {
|
||||
HashMap::from([
|
||||
(Language::Czech, WordList::new(include_str!("./polyseed/cs.json"), true, false)),
|
||||
(Language::French, WordList::new(include_str!("./polyseed/fr.json"), true, true)),
|
||||
(Language::Korean, WordList::new(include_str!("./polyseed/ko.json"), false, false)),
|
||||
(Language::English, WordList::new(include_str!("./polyseed/en.json"), true, false)),
|
||||
(Language::Italian, WordList::new(include_str!("./polyseed/it.json"), true, false)),
|
||||
(Language::Spanish, WordList::new(include_str!("./polyseed/es.json"), true, true)),
|
||||
(Language::Japanese, WordList::new(include_str!("./polyseed/ja.json"), false, false)),
|
||||
(Language::Portuguese, WordList::new(include_str!("./polyseed/pt.json"), true, false)),
|
||||
(
|
||||
Language::ChineseSimplified,
|
||||
WordList::new(include_str!("./polyseed/zh_simplified.json"), false, false),
|
||||
),
|
||||
(
|
||||
Language::ChineseTraditional,
|
||||
WordList::new(include_str!("./polyseed/zh_traditional.json"), false, false),
|
||||
),
|
||||
])
|
||||
})
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Zeroize, ZeroizeOnDrop)]
|
||||
pub struct Polyseed {
|
||||
language: Language,
|
||||
features: u8,
|
||||
birthday: u16,
|
||||
entropy: Zeroizing<[u8; 32]>,
|
||||
checksum: u16,
|
||||
}
|
||||
|
||||
impl fmt::Debug for Polyseed {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
f.debug_struct("Polyseed").finish_non_exhaustive()
|
||||
}
|
||||
}
|
||||
|
||||
fn valid_entropy(entropy: &Zeroizing<[u8; 32]>) -> bool {
|
||||
// Last byte of the entropy should only use certain bits
|
||||
let mut res =
|
||||
entropy[SECRET_SIZE - 1].ct_eq(&(entropy[SECRET_SIZE - 1] & LAST_BYTE_SECRET_BITS_MASK));
|
||||
// Last 13 bytes of the buffer should be unused
|
||||
for b in SECRET_SIZE .. entropy.len() {
|
||||
res &= entropy[b].ct_eq(&0);
|
||||
}
|
||||
res.into()
|
||||
}
|
||||
|
||||
impl Polyseed {
|
||||
// TODO: Clean this
|
||||
fn to_poly(&self) -> Poly {
|
||||
let mut extra_bits = u32::from(FEATURE_BITS + DATE_BITS);
|
||||
let extra_val = (u16::from(self.features) << DATE_BITS) | self.birthday;
|
||||
|
||||
let mut entropy_idx = 0;
|
||||
let mut secret_bits = BITS_PER_BYTE;
|
||||
let mut seed_rem_bits = SECRET_BITS - BITS_PER_BYTE;
|
||||
|
||||
let mut poly = [0; POLYSEED_LENGTH];
|
||||
for i in 0 .. DATA_WORDS {
|
||||
extra_bits -= 1;
|
||||
|
||||
let mut word_bits = 0;
|
||||
let mut word_val = 0;
|
||||
while word_bits < SECRET_BITS_PER_WORD {
|
||||
if secret_bits == 0 {
|
||||
entropy_idx += 1;
|
||||
secret_bits = seed_rem_bits.min(BITS_PER_BYTE);
|
||||
seed_rem_bits -= secret_bits;
|
||||
}
|
||||
let chunk_bits = secret_bits.min(SECRET_BITS_PER_WORD - word_bits);
|
||||
secret_bits -= chunk_bits;
|
||||
word_bits += chunk_bits;
|
||||
word_val <<= chunk_bits;
|
||||
word_val |=
|
||||
(u16::from(self.entropy[entropy_idx]) >> secret_bits) & ((1u16 << chunk_bits) - 1);
|
||||
}
|
||||
|
||||
word_val <<= 1;
|
||||
word_val |= (extra_val >> extra_bits) & 1;
|
||||
poly[POLY_NUM_CHECK_DIGITS + i] = word_val;
|
||||
}
|
||||
|
||||
poly
|
||||
}
|
||||
|
||||
/// Create a new `Polyseed` with specific internals.
|
||||
///
|
||||
/// `birthday` is defined in seconds since the Unix epoch.
|
||||
pub fn from(
|
||||
language: Language,
|
||||
features: u8,
|
||||
birthday: u64,
|
||||
entropy: Zeroizing<[u8; 32]>,
|
||||
) -> Result<Polyseed, SeedError> {
|
||||
let features = user_features(features);
|
||||
if !polyseed_features_supported(features) {
|
||||
Err(SeedError::UnsupportedFeatures)?;
|
||||
}
|
||||
|
||||
let birthday = birthday_encode(birthday);
|
||||
|
||||
if !valid_entropy(&entropy) {
|
||||
Err(SeedError::InvalidEntropy)?;
|
||||
}
|
||||
|
||||
let mut res = Polyseed { language, birthday, features, entropy, checksum: 0 };
|
||||
res.checksum = poly_eval(&res.to_poly());
|
||||
Ok(res)
|
||||
}
|
||||
|
||||
/// Create a new `Polyseed`.
|
||||
///
|
||||
/// This uses the system's time for the birthday, if available.
|
||||
pub fn new<R: RngCore + CryptoRng>(rng: &mut R, language: Language) -> Polyseed {
|
||||
// Get the birthday
|
||||
#[cfg(feature = "std")]
|
||||
let birthday = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_secs();
|
||||
#[cfg(not(feature = "std"))]
|
||||
let birthday = 0;
|
||||
|
||||
// Derive entropy
|
||||
let mut entropy = Zeroizing::new([0; 32]);
|
||||
rng.fill_bytes(entropy.as_mut());
|
||||
entropy[SECRET_SIZE ..].fill(0);
|
||||
entropy[SECRET_SIZE - 1] &= LAST_BYTE_SECRET_BITS_MASK;
|
||||
|
||||
Self::from(language, 0, birthday, entropy).unwrap()
|
||||
}
|
||||
|
||||
/// Create a new `Polyseed` from a String.
|
||||
pub fn from_string(seed: Zeroizing<String>) -> Result<Polyseed, SeedError> {
|
||||
// Decode the seed into its polynomial coefficients
|
||||
let mut poly = [0; POLYSEED_LENGTH];
|
||||
let lang = (|| {
|
||||
'language: for (name, lang) in LANGUAGES().iter() {
|
||||
for (i, word) in seed.split_whitespace().enumerate() {
|
||||
// Find the word's index
|
||||
fn check_if_matches<S: AsRef<str>, I: Iterator<Item = S>>(
|
||||
has_prefix: bool,
|
||||
mut lang_words: I,
|
||||
word: &str,
|
||||
) -> Option<usize> {
|
||||
if has_prefix {
|
||||
// Get the position of the word within the iterator
|
||||
// Doesn't use starts_with and some words are substrs of others, leading to false
|
||||
// positives
|
||||
let mut get_position = || {
|
||||
lang_words.position(|lang_word| {
|
||||
let mut lang_word = lang_word.as_ref().chars();
|
||||
let mut word = word.chars();
|
||||
|
||||
let mut res = true;
|
||||
for _ in 0 .. PREFIX_LEN {
|
||||
res &= lang_word.next() == word.next();
|
||||
}
|
||||
res
|
||||
})
|
||||
};
|
||||
let res = get_position();
|
||||
// If another word has this prefix, don't call it a match
|
||||
if get_position().is_some() {
|
||||
return None;
|
||||
}
|
||||
res
|
||||
} else {
|
||||
lang_words.position(|lang_word| lang_word.as_ref() == word)
|
||||
}
|
||||
}
|
||||
|
||||
let Some(coeff) = (if lang.has_accent {
|
||||
let ascii = |word: &str| word.chars().filter(|c| c.is_ascii()).collect::<String>();
|
||||
check_if_matches(
|
||||
lang.has_prefix,
|
||||
lang.words.iter().map(|lang_word| ascii(lang_word)),
|
||||
&ascii(word)
|
||||
)
|
||||
} else {
|
||||
check_if_matches(lang.has_prefix, lang.words.iter(), word)
|
||||
}) else { continue 'language; };
|
||||
|
||||
// WordList asserts the word list length is less than u16::MAX
|
||||
poly[i] = u16::try_from(coeff).expect("coeff exceeded u16");
|
||||
}
|
||||
|
||||
return Ok(*name);
|
||||
}
|
||||
|
||||
Err(SeedError::UnknownLanguage)
|
||||
})()?;
|
||||
|
||||
// xor out the coin
|
||||
poly[POLY_NUM_CHECK_DIGITS] ^= COIN;
|
||||
|
||||
// Validate the checksum
|
||||
if poly_eval(&poly) != 0 {
|
||||
Err(SeedError::InvalidChecksum)?;
|
||||
}
|
||||
|
||||
// Convert the polynomial into entropy
|
||||
let mut entropy = Zeroizing::new([0; 32]);
|
||||
|
||||
let mut extra = 0;
|
||||
|
||||
let mut entropy_idx = 0;
|
||||
let mut entropy_bits = 0;
|
||||
|
||||
let checksum = poly[0];
|
||||
for mut word_val in poly.into_iter().skip(POLY_NUM_CHECK_DIGITS) {
|
||||
// Parse the bottom bit, which is one of the bits of extra
|
||||
// This iterates for less than 16 iters, meaning this won't drop any bits
|
||||
extra <<= 1;
|
||||
extra |= word_val & 1;
|
||||
word_val >>= 1;
|
||||
|
||||
// 10 bits per word creates a [8, 2], [6, 4], [4, 6], [2, 8] cycle
|
||||
// 15 % 4 is 3, leaving 2 bits off, and 152 (19 * 8) - 2 is 150, the amount of bits in the
|
||||
// secret
|
||||
let mut word_bits = GF_BITS - 1;
|
||||
while word_bits > 0 {
|
||||
if entropy_bits == BITS_PER_BYTE {
|
||||
entropy_idx += 1;
|
||||
entropy_bits = 0;
|
||||
}
|
||||
let chunk_bits = word_bits.min(BITS_PER_BYTE - entropy_bits);
|
||||
word_bits -= chunk_bits;
|
||||
let chunk_mask = (1u16 << chunk_bits) - 1;
|
||||
if chunk_bits < BITS_PER_BYTE {
|
||||
entropy[entropy_idx] <<= chunk_bits;
|
||||
}
|
||||
entropy[entropy_idx] |=
|
||||
u8::try_from((word_val >> word_bits) & chunk_mask).expect("chunk exceeded u8");
|
||||
entropy_bits += chunk_bits;
|
||||
}
|
||||
}
|
||||
|
||||
let birthday = extra & DATE_MASK;
|
||||
// extra is contained to u16, and DATE_BITS > 8
|
||||
let features =
|
||||
u8::try_from(extra >> DATE_BITS).expect("couldn't convert extra >> DATE_BITS to u8");
|
||||
|
||||
let res = Polyseed::from(lang, features, birthday_decode(birthday), entropy);
|
||||
if let Ok(res) = res.as_ref() {
|
||||
debug_assert_eq!(res.checksum, checksum);
|
||||
}
|
||||
res
|
||||
}
|
||||
|
||||
/// When this seed was created, defined in seconds since the epoch.
|
||||
pub fn birthday(&self) -> u64 {
|
||||
birthday_decode(self.birthday)
|
||||
}
|
||||
|
||||
/// This seed's features.
|
||||
pub fn features(&self) -> u8 {
|
||||
self.features
|
||||
}
|
||||
|
||||
/// This seed's entropy.
|
||||
pub fn entropy(&self) -> &Zeroizing<[u8; 32]> {
|
||||
&self.entropy
|
||||
}
|
||||
|
||||
/// The key derived from this seed.
|
||||
pub fn key(&self) -> Zeroizing<[u8; 32]> {
|
||||
let mut key = Zeroizing::new([0; 32]);
|
||||
pbkdf2_hmac::<Sha3_256>(
|
||||
self.entropy.as_slice(),
|
||||
POLYSEED_SALT,
|
||||
POLYSEED_KEYGEN_ITERATIONS,
|
||||
key.as_mut(),
|
||||
);
|
||||
key
|
||||
}
|
||||
|
||||
pub fn to_string(&self) -> Zeroizing<String> {
|
||||
// Encode the polynomial with the existing checksum
|
||||
let mut poly = self.to_poly();
|
||||
poly[0] = self.checksum;
|
||||
|
||||
// Embed the coin
|
||||
poly[POLY_NUM_CHECK_DIGITS] ^= COIN;
|
||||
|
||||
// Output words
|
||||
let mut seed = Zeroizing::new(String::new());
|
||||
let words = &LANGUAGES()[&self.language].words;
|
||||
for i in 0 .. poly.len() {
|
||||
seed.push_str(&words[usize::from(poly[i])]);
|
||||
if i < poly.len() - 1 {
|
||||
seed.push(' ');
|
||||
}
|
||||
}
|
||||
|
||||
seed
|
||||
}
|
||||
}
|
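The bit-packing constants at the top of the new polyseed.rs above all follow from SECRET_BITS = 150; a short illustrative check of the derived values:

const SECRET_BITS: usize = 150;
const BITS_PER_BYTE: usize = 8;

fn main() {
  // ceil(150 / 8) = 19 bytes of the 32-byte buffer actually carry secret bits.
  let secret_size = (SECRET_BITS + BITS_PER_BYTE - 1) / BITS_PER_BYTE;
  assert_eq!(secret_size, 19);
  // 19 * 8 - 150 = 2 bits of the last secret byte stay clear.
  let clear_bits = (secret_size * BITS_PER_BYTE) - SECRET_BITS;
  assert_eq!(clear_bits, 2);
  // So the last secret byte is masked to its low 6 bits.
  let last_byte_mask = (1u8 << (BITS_PER_BYTE - clear_bits)) - 1;
  assert_eq!(last_byte_mask, 0b0011_1111);
  // 15 data words at 10 bits each carry exactly those 150 bits.
  assert_eq!(15 * 10, SECRET_BITS);
}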
coins/monero/src/wallet/seed/polyseed/cs.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/en.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/es.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/fr.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/it.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/ja.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/ko.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/pt.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/zh_simplified.json (new file, 2050 lines; diff suppressed because it is too large)
coins/monero/src/wallet/seed/polyseed/zh_traditional.json (new file, 2050 lines; diff suppressed because it is too large)