Mirror of https://github.com/serai-dex/serai.git
Move in instructions from inherent transactions to unsigned transactions
The original intent was to use inherent transactions to prevent needing to vote on-chain, which would spam the chain with worthless votes. Inherent transactions, and our Tendermint library, would use the BFT process's voting to also vote on all included transactions. This collapsed integrity voting into consensus itself, creating *no additional on-chain costs*.

Unfortunately, this led to issues such as #6, along with questions of validator scalability when all validators are expected to participate in consensus (in order to vote on whether the included instructions are valid). This has been summarized in #241.

With this change, we can remove Tendermint from Substrate, which greatly decreases our complexity. While I'm unhappy with the amount of time spent on it just to reach this conclusion, tendermint-machine itself thankfully remains usable for #163. This also reached a tipping point recently, as the polkadot-v0.9.40 branch of Substrate changed how syncing works, requiring further changes to sc-tendermint. Those changes have no value if we're just going to remove sc-tendermint later due to fundamental design issues, yet I would like to keep Substrate updated.

This should be followed by moving back to GRANDPA, enabling closing most open Tendermint issues.

Please note the current in-instructions-pallet does not actually verify the included signature yet. It's marked TODO, despite being critical.
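For reference, the missing check could eventually take a shape like the sketch below. This is illustrative only: it assumes the set's threshold signature verifies as an ordinary sr25519 signature over the SCALE-encoded batch, and `key_for_network` is a hypothetical lookup (no such getter exists in this commit).

```rust
// Illustrative sketch of the TODO signature check in validate_unsigned.
// The message format (plain SCALE-encoded Batch) and the key lookup are
// assumptions, not what the pallet actually does in this commit.
use scale::Encode;
use sp_core::sr25519::{Public, Signature};

use serai_primitives::NetworkId;
use in_instructions_primitives::Batch;

// Mirrors the pallet's PalletError variants
enum BatchError {
  UnrecognizedNetwork,
  InvalidSignature,
}

// Hypothetical lookup of the network's current key; the validator-sets pallet
// doesn't expose one yet
fn key_for_network(_network: NetworkId) -> Option<Public> {
  unimplemented!()
}

fn check_batch_signature(batch: &Batch, signature: &Signature) -> Result<(), BatchError> {
  let key = key_for_network(batch.network).ok_or(BatchError::UnrecognizedNetwork)?;
  // A single sr25519 verification covers every instruction in the batch
  if !sp_io::crypto::sr25519_verify(signature, &batch.encode(), &key) {
    return Err(BatchError::InvalidSignature);
  }
  Ok(())
}
```

Verifying one aggregate signature per batch is what allows batches to be accepted as plain unsigned transactions instead of consensus-validated inherents.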
This commit is contained in:
parent
9157f8d0a0
commit
c182b804bc
26 changed files with 305 additions and 481 deletions
16
Cargo.lock
generated
|
@ -3739,18 +3739,6 @@ dependencies = [
|
|||
"syn 1.0.109",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "in-instructions-client"
|
||||
version = "0.1.0"
|
||||
dependencies = [
|
||||
"async-trait",
|
||||
"in-instructions-pallet",
|
||||
"jsonrpsee-core",
|
||||
"jsonrpsee-http-client",
|
||||
"parity-scale-codec",
|
||||
"sp-inherents",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "in-instructions-pallet"
|
||||
version = "0.1.0"
|
||||
|
@ -3761,7 +3749,6 @@ dependencies = [
|
|||
"parity-scale-codec",
|
||||
"scale-info",
|
||||
"serai-primitives",
|
||||
"sp-inherents",
|
||||
"sp-runtime",
|
||||
"thiserror",
|
||||
"tokens-pallet",
|
||||
|
@ -3775,6 +3762,7 @@ dependencies = [
|
|||
"scale-info",
|
||||
"serai-primitives",
|
||||
"serde",
|
||||
"sp-application-crypto",
|
||||
"sp-runtime",
|
||||
"sp-std 5.0.0",
|
||||
"tokens-primitives",
|
||||
|
@ -8460,7 +8448,6 @@ version = "0.1.0"
|
|||
dependencies = [
|
||||
"bitcoin",
|
||||
"ciphersuite",
|
||||
"jsonrpsee-server",
|
||||
"lazy_static",
|
||||
"monero-serai",
|
||||
"parity-scale-codec",
|
||||
|
@ -8481,7 +8468,6 @@ dependencies = [
|
|||
"clap 4.1.13",
|
||||
"frame-benchmarking",
|
||||
"frame-benchmarking-cli",
|
||||
"in-instructions-client",
|
||||
"jsonrpsee",
|
||||
"pallet-transaction-payment-rpc",
|
||||
"sc-basic-authorship",
|
||||
|
|
|
@ -31,7 +31,6 @@ members = [
|
|||
|
||||
"substrate/in-instructions/primitives",
|
||||
"substrate/in-instructions/pallet",
|
||||
"substrate/in-instructions/client",
|
||||
|
||||
"substrate/validator-sets/primitives",
|
||||
"substrate/validator-sets/pallet",
|
||||
|
|
|
@ -53,7 +53,6 @@ exceptions = [
|
|||
{ allow = ["AGPL-3.0"], name = "tokens-pallet" },
|
||||
|
||||
{ allow = ["AGPL-3.0"], name = "in-instructions-pallet" },
|
||||
{ allow = ["AGPL-3.0"], name = "in-instructions-client" },
|
||||
|
||||
{ allow = ["AGPL-3.0"], name = "validator-sets-pallet" },
|
||||
|
||||
|
|
|
@@ -1,32 +0,0 @@
# Consensus

### Inherent Transactions

Inherent transactions are a feature of Substrate enabling block producers to
include transactions without overhead. This enables forming a leader protocol
for including various forms of information on chain, such as In Instruction. By
having a single node include the data, we prevent having pointless replicas on
chain.

In order to ensure the validity of the inherent transactions, the consensus
process validates them. Under Substrate, a block with inherents is checked by
all nodes, and independently accepted or rejected. Under Serai, a block with
inherents is checked by the validators, and if a BFT majority of validators
agree it's legitimate, it is, regardless of the node's perception.

### Consensus

Serai uses Tendermint to obtain consensus on its blockchain. Tendermint details
both block production and finalization, finalizing each block as it's produced.

Validators operate contextually. They are expected to know how to create
inherent transactions and actually do so, additionally verifying inherent
transactions proposed by other nodes. Verification comes from ensuring perfect
consistency with what the validator would've proposed themselves.

While Substrate prefers block production and finalization to be distinct, such
a model would allow unchecked inherent transactions to proliferate on Serai.
Since inherent transactions detail the flow of external funds in relation to
Serai, any operations on such blocks would be unsafe to a potentially fatal
degree. Accordingly, re-bundling the two to ensure the only data in the system
is that which has been fully checked was decided as the best move forward.
8
docs/protocol/In Instructions.md
Normal file
|
@@ -0,0 +1,8 @@
# In Instructions

In Instructions are included onto the Serai blockchain via unsigned
transactions. In order to ensure the integrity of the included instructions, the
validator set responsible for the network in question produces a threshold
signature of their authenticity.

This lets all other validators verify the instructions with an O(1) operation.
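As a usage sketch grounded in the new client test added by this commit, a caller could package instructions into a `SignedBatch` and submit it as an unsigned transaction roughly as follows. The node URL, the concrete values, and the all-zero signature are placeholders (a real batch carries the validator set's threshold signature), and `SeraiError` is assumed to be exported from `serai_client` as in the library source.

```rust
// Minimal sketch based on substrate/serai/client/tests/batch.rs in this commit.
use serai_client::{
  primitives::{BITCOIN_NET_ID, BITCOIN, BlockHash, SeraiAddress, Amount, Balance},
  in_instructions::primitives::{InInstruction, InInstructionWithBalance, Batch, SignedBatch},
  Serai, SeraiError,
};
use sp_core::sr25519::Signature;

async fn submit_example_batch() -> Result<(), SeraiError> {
  let serai = Serai::new("ws://127.0.0.1:9944").await?;

  let batch = Batch {
    network: BITCOIN_NET_ID,
    id: 0,
    block: BlockHash([0; 32]),
    instructions: vec![InInstructionWithBalance {
      instruction: InInstruction::Transfer(SeraiAddress::new([0; 32])),
      balance: Balance { coin: BITCOIN, amount: Amount(1) },
    }],
  };

  // Placeholder signature; the pallet's signature verification is still TODO here
  let signed = SignedBatch { batch, signature: Signature::from_raw([0; 64]) };

  // Encode the call as an unsigned extrinsic and publish it
  let tx = serai.execute_batch(signed)?;
  serai.publish(&tx).await?;
  Ok(())
}
```

The pallet's `validate_unsigned` then enforces the sequential batch ID (and, eventually, the signature) before `execute_batch` mints the included balances.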
|
@ -1,24 +0,0 @@
|
|||
[package]
|
||||
name = "in-instructions-client"
|
||||
version = "0.1.0"
|
||||
description = "Package In Instructions into inherent transactions"
|
||||
license = "AGPL-3.0-only"
|
||||
authors = ["Luke Parker <lukeparker5132@gmail.com>"]
|
||||
edition = "2021"
|
||||
publish = false
|
||||
|
||||
[package.metadata.docs.rs]
|
||||
all-features = true
|
||||
rustdoc-args = ["--cfg", "docsrs"]
|
||||
|
||||
[dependencies]
|
||||
async-trait = "0.1"
|
||||
|
||||
scale = { package = "parity-scale-codec", version = "3", features = ["derive", "max-encoded-len"] }
|
||||
|
||||
jsonrpsee-core = "0.16"
|
||||
jsonrpsee-http-client = "0.16"
|
||||
|
||||
sp-inherents = { git = "https://github.com/serai-dex/substrate" }
|
||||
|
||||
in-instructions-pallet = { path = "../pallet" }
|
|
@ -1,15 +0,0 @@
|
|||
AGPL-3.0-only license
|
||||
|
||||
Copyright (c) 2022-2023 Luke Parker
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU Affero General Public License Version 3 as
|
||||
published by the Free Software Foundation.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU Affero General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU Affero General Public License
|
||||
along with this program. If not, see <http://www.gnu.org/licenses/>.
|
|
@ -1,47 +0,0 @@
|
|||
#![cfg_attr(docsrs, feature(doc_cfg))]
|
||||
#![cfg_attr(docsrs, feature(doc_auto_cfg))]
|
||||
|
||||
use scale::Decode;
|
||||
|
||||
use jsonrpsee_core::client::ClientT;
|
||||
use jsonrpsee_http_client::HttpClientBuilder;
|
||||
|
||||
use sp_inherents::{Error, InherentData, InherentIdentifier};
|
||||
|
||||
use in_instructions_pallet::{primitives::Updates, INHERENT_IDENTIFIER, InherentError};
|
||||
|
||||
pub struct InherentDataProvider;
|
||||
impl InherentDataProvider {
|
||||
#[allow(clippy::new_without_default)] // This isn't planned to forever have empty arguments
|
||||
pub fn new() -> InherentDataProvider {
|
||||
InherentDataProvider
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait::async_trait]
|
||||
impl sp_inherents::InherentDataProvider for InherentDataProvider {
|
||||
async fn provide_inherent_data(&self, inherent_data: &mut InherentData) -> Result<(), Error> {
|
||||
let updates: Updates = (|| async {
|
||||
let client = HttpClientBuilder::default().build("http://127.0.0.1:5134")?;
|
||||
client.request("processor_coinUpdates", Vec::<u8>::new()).await
|
||||
})()
|
||||
.await
|
||||
.map_err(|e| {
|
||||
Error::Application(Box::from(format!("couldn't communicate with processor: {e}")))
|
||||
})?;
|
||||
inherent_data.put_data(INHERENT_IDENTIFIER, &updates)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
async fn try_handle_error(
|
||||
&self,
|
||||
identifier: &InherentIdentifier,
|
||||
mut error: &[u8],
|
||||
) -> Option<Result<(), Error>> {
|
||||
if *identifier != INHERENT_IDENTIFIER {
|
||||
return None;
|
||||
}
|
||||
|
||||
Some(Err(Error::Application(Box::from(<InherentError as Decode>::decode(&mut error).ok()?))))
|
||||
}
|
||||
}
|
|
@ -1,7 +1,7 @@
|
|||
[package]
|
||||
name = "in-instructions-pallet"
|
||||
version = "0.1.0"
|
||||
description = "Execute calls via In Instructions from inherent transactions"
|
||||
description = "Execute calls via In Instructions from unsigned transactions"
|
||||
license = "AGPL-3.0-only"
|
||||
authors = ["Luke Parker <lukeparker5132@gmail.com>"]
|
||||
edition = "2021"
|
||||
|
@ -17,7 +17,6 @@ thiserror = { version = "1", optional = true }
|
|||
scale = { package = "parity-scale-codec", version = "3", default-features = false, features = ["derive", "max-encoded-len"] }
|
||||
scale-info = { version = "2", default-features = false, features = ["derive"] }
|
||||
|
||||
sp-inherents = { git = "https://github.com/serai-dex/substrate", default-features = false }
|
||||
sp-runtime = { git = "https://github.com/serai-dex/substrate", default-features = false }
|
||||
|
||||
frame-system = { git = "https://github.com/serai-dex/substrate", default-features = false }
|
||||
|
@ -35,7 +34,6 @@ std = [
|
|||
"scale/std",
|
||||
"scale-info/std",
|
||||
|
||||
"sp-inherents/std",
|
||||
"sp-runtime/std",
|
||||
|
||||
"frame-system/std",
|
||||
|
|
|
@ -4,60 +4,20 @@
|
|||
|
||||
use scale::{Encode, Decode};
|
||||
|
||||
use sp_inherents::{InherentIdentifier, IsFatalError};
|
||||
|
||||
use sp_runtime::RuntimeDebug;
|
||||
|
||||
use serai_primitives::{BlockNumber, BlockHash, Coin, WithAmount, Balance};
|
||||
use serai_primitives::{BlockHash, NetworkId};
|
||||
|
||||
pub use in_instructions_primitives as primitives;
|
||||
use primitives::{InInstruction, Updates};
|
||||
|
||||
pub const INHERENT_IDENTIFIER: InherentIdentifier = *b"ininstrs";
|
||||
use primitives::{InInstruction, InInstructionWithBalance, SignedBatch};
|
||||
|
||||
#[derive(Clone, Copy, Encode, RuntimeDebug)]
|
||||
#[cfg_attr(feature = "std", derive(Decode, thiserror::Error))]
|
||||
pub enum InherentError {
|
||||
#[cfg_attr(feature = "std", error("invalid call"))]
|
||||
InvalidCall,
|
||||
#[cfg_attr(feature = "std", error("inherent has {0} updates despite us having {1} coins"))]
|
||||
InvalidUpdateQuantity(u32, u32),
|
||||
#[cfg_attr(
|
||||
feature = "std",
|
||||
error("inherent for coin {0:?} has block number {1:?} despite us having {2:?}")
|
||||
)]
|
||||
UnrecognizedBlockNumber(Coin, BlockNumber, BlockNumber),
|
||||
#[cfg_attr(
|
||||
feature = "std",
|
||||
error("inherent for coin {0:?} has block number {1:?} which doesn't succeed {2:?}")
|
||||
)]
|
||||
InvalidBlockNumber(Coin, BlockNumber, BlockNumber),
|
||||
#[cfg_attr(feature = "std", error("coin {0:?} has {1} more batches than we do"))]
|
||||
UnrecognizedBatches(Coin, u32),
|
||||
#[cfg_attr(feature = "std", error("coin {0:?} has a different batch (ID {1:?})"))]
|
||||
DifferentBatch(Coin, BlockHash),
|
||||
}
|
||||
|
||||
impl IsFatalError for InherentError {
|
||||
fn is_fatal_error(&self) -> bool {
|
||||
match self {
|
||||
InherentError::InvalidCall | InherentError::InvalidUpdateQuantity(..) => true,
|
||||
InherentError::UnrecognizedBlockNumber(..) => false,
|
||||
InherentError::InvalidBlockNumber(..) => true,
|
||||
InherentError::UnrecognizedBatches(..) => false,
|
||||
// One of our nodes is definitively wrong. If it's ours (signified by it passing consensus),
|
||||
// we should panic. If it's theirs, they should be slashed
|
||||
// Unfortunately, we can't return fatal here to trigger a slash as fatal should only be used
|
||||
// for undeniable, technical invalidity
|
||||
// TODO: Code a way in which this still triggers a slash vote
|
||||
InherentError::DifferentBatch(..) => false,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn coin_from_index(index: usize) -> Coin {
|
||||
// Offset by 1 since Serai is the first coin, yet Serai doesn't have updates
|
||||
Coin::from(1 + u32::try_from(index).unwrap())
|
||||
pub enum PalletError {
|
||||
#[cfg_attr(feature = "std", error("batch for unrecognized network"))]
|
||||
UnrecognizedNetwork,
|
||||
#[cfg_attr(feature = "std", error("invalid signature for batch"))]
|
||||
InvalidSignature,
|
||||
}
|
||||
|
||||
#[frame_support::pallet]
|
||||
|
@ -77,36 +37,23 @@ pub mod pallet {
|
|||
#[pallet::event]
|
||||
#[pallet::generate_deposit(fn deposit_event)]
|
||||
pub enum Event<T: Config> {
|
||||
Batch { coin: Coin, id: BlockHash },
|
||||
Failure { coin: Coin, id: BlockHash, index: u32 },
|
||||
Batch { network: NetworkId, id: u32, block: BlockHash },
|
||||
InstructionFailure { network: NetworkId, id: u32, index: u32 },
|
||||
}
|
||||
|
||||
#[pallet::pallet]
|
||||
#[pallet::generate_store(pub(crate) trait Store)]
|
||||
pub struct Pallet<T>(PhantomData<T>);
|
||||
|
||||
// Used to only allow one set of updates per block, preventing double updating
|
||||
#[pallet::storage]
|
||||
pub(crate) type Once<T: Config> = StorageValue<_, bool, ValueQuery>;
|
||||
// Latest block number agreed upon for a coin
|
||||
#[pallet::storage]
|
||||
#[pallet::getter(fn block_number)]
|
||||
pub(crate) type BlockNumbers<T: Config> =
|
||||
StorageMap<_, Blake2_256, Coin, BlockNumber, ValueQuery>;
|
||||
|
||||
#[pallet::hooks]
|
||||
impl<T: Config> Hooks<BlockNumberFor<T>> for Pallet<T> {
|
||||
fn on_finalize(_: BlockNumberFor<T>) {
|
||||
Once::<T>::take();
|
||||
}
|
||||
}
|
||||
#[pallet::getter(fn batch)]
|
||||
pub(crate) type Batches<T: Config> = StorageMap<_, Blake2_256, NetworkId, u32, OptionQuery>;
|
||||
|
||||
impl<T: Config> Pallet<T> {
|
||||
fn execute(coin: Coin, instruction: WithAmount<InInstruction>) -> Result<(), ()> {
|
||||
match instruction.data {
|
||||
InInstruction::Transfer(address) => {
|
||||
Tokens::<T>::mint(address, Balance { coin, amount: instruction.amount })
|
||||
}
|
||||
fn execute(instruction: InInstructionWithBalance) -> Result<(), ()> {
|
||||
match instruction.instruction {
|
||||
InInstruction::Transfer(address) => Tokens::<T>::mint(address, instruction.balance),
|
||||
_ => panic!("unsupported instruction"),
|
||||
}
|
||||
Ok(())
|
||||
|
@ -117,121 +64,65 @@ pub mod pallet {
|
|||
impl<T: Config> Pallet<T> {
|
||||
#[pallet::call_index(0)]
|
||||
#[pallet::weight((0, DispatchClass::Operational))] // TODO
|
||||
pub fn update(origin: OriginFor<T>, mut updates: Updates) -> DispatchResult {
|
||||
pub fn execute_batch(origin: OriginFor<T>, batch: SignedBatch) -> DispatchResult {
|
||||
ensure_none(origin)?;
|
||||
assert!(!Once::<T>::exists());
|
||||
Once::<T>::put(true);
|
||||
|
||||
for (coin, update) in updates.iter_mut().enumerate() {
|
||||
if let Some(update) = update {
|
||||
let coin = coin_from_index(coin);
|
||||
BlockNumbers::<T>::insert(coin, update.block_number);
|
||||
let mut batch = batch.batch;
|
||||
|
||||
for batch in update.batches.iter_mut() {
|
||||
Self::deposit_event(Event::Batch { coin, id: batch.id });
|
||||
Batches::<T>::insert(batch.network, batch.id);
|
||||
Self::deposit_event(Event::Batch {
|
||||
network: batch.network,
|
||||
id: batch.id,
|
||||
block: batch.block,
|
||||
});
|
||||
for (i, instruction) in batch.instructions.drain(..).enumerate() {
|
||||
if Self::execute(coin, instruction).is_err() {
|
||||
Self::deposit_event(Event::Failure {
|
||||
coin,
|
||||
if Self::execute(instruction).is_err() {
|
||||
Self::deposit_event(Event::InstructionFailure {
|
||||
network: batch.network,
|
||||
id: batch.id,
|
||||
index: u32::try_from(i).unwrap(),
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
#[pallet::inherent]
|
||||
impl<T: Config> ProvideInherent for Pallet<T> {
|
||||
#[pallet::validate_unsigned]
|
||||
impl<T: Config> ValidateUnsigned for Pallet<T> {
|
||||
type Call = Call<T>;
|
||||
type Error = InherentError;
|
||||
const INHERENT_IDENTIFIER: InherentIdentifier = INHERENT_IDENTIFIER;
|
||||
|
||||
fn create_inherent(data: &InherentData) -> Option<Self::Call> {
|
||||
data
|
||||
.get_data::<Updates>(&INHERENT_IDENTIFIER)
|
||||
.unwrap()
|
||||
.map(|updates| Call::update { updates })
|
||||
}
|
||||
|
||||
// Assumes that only not yet handled batches are provided as inherent data
|
||||
fn check_inherent(call: &Self::Call, data: &InherentData) -> Result<(), Self::Error> {
|
||||
// First unwrap is for the Result of fetching/decoding the Updates
|
||||
// Second unwrap is for the Option of if they exist
|
||||
let expected = data.get_data::<Updates>(&INHERENT_IDENTIFIER).unwrap().unwrap();
|
||||
fn validate_unsigned(_: TransactionSource, call: &Self::Call) -> TransactionValidity {
|
||||
// Match to be exhaustive
|
||||
let updates = match call {
|
||||
Call::update { ref updates } => updates,
|
||||
_ => Err(InherentError::InvalidCall)?,
|
||||
let batch = match call {
|
||||
Call::execute_batch { ref batch } => batch,
|
||||
_ => Err(InvalidTransaction::Call)?,
|
||||
};
|
||||
|
||||
// The block producer should've provided one update per coin
|
||||
// We, an honest node, did provide one update per coin
|
||||
// Accordingly, we should have the same amount of updates
|
||||
if updates.len() != expected.len() {
|
||||
Err(InherentError::InvalidUpdateQuantity(
|
||||
updates.len().try_into().unwrap(),
|
||||
expected.len().try_into().unwrap(),
|
||||
))?;
|
||||
let network = batch.batch.network;
|
||||
|
||||
// TODO: Get the key for this network or Err(UnrecognizedNetwork)
|
||||
|
||||
// TODO: Verify the signature or Err(InvalidSignature)
|
||||
|
||||
// Verify the batch is sequential
|
||||
// Batches has the last ID set. The next ID should be it + 1
|
||||
// If there's no ID, the next ID should be 0
|
||||
let expected = Batches::<T>::get(network).map(|prev| prev + 1).unwrap_or(0);
|
||||
if batch.batch.id < expected {
|
||||
Err(InvalidTransaction::Stale)?;
|
||||
}
|
||||
if batch.batch.id > expected {
|
||||
Err(InvalidTransaction::Future)?;
|
||||
}
|
||||
|
||||
// This zip is safe since we verified they're equally sized
|
||||
// This should be written as coins.zip(updates.iter().zip(&expected)), where coins is the
|
||||
// validator set's coins
|
||||
// That'd require having context on the validator set right now which isn't worth pulling in
|
||||
// right now, when we only have one validator set
|
||||
for (coin, both) in updates.iter().zip(&expected).enumerate() {
|
||||
let coin = coin_from_index(coin);
|
||||
match both {
|
||||
// Block producer claims there's an update for this coin, as do we
|
||||
(Some(update), Some(expected)) => {
|
||||
if update.block_number.0 > expected.block_number.0 {
|
||||
Err(InherentError::UnrecognizedBlockNumber(
|
||||
coin,
|
||||
update.block_number,
|
||||
expected.block_number,
|
||||
))?;
|
||||
}
|
||||
|
||||
let prev = BlockNumbers::<T>::get(coin);
|
||||
if update.block_number.0 <= prev.0 {
|
||||
Err(InherentError::InvalidBlockNumber(coin, update.block_number, prev))?;
|
||||
}
|
||||
|
||||
if update.batches.len() > expected.batches.len() {
|
||||
Err(InherentError::UnrecognizedBatches(
|
||||
coin,
|
||||
(update.batches.len() - expected.batches.len()).try_into().unwrap(),
|
||||
))?;
|
||||
}
|
||||
|
||||
for (batch, expected) in update.batches.iter().zip(&expected.batches) {
|
||||
if batch != expected {
|
||||
Err(InherentError::DifferentBatch(coin, batch.id))?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Block producer claims there's an update for this coin, yet we don't
|
||||
(Some(update), None) => {
|
||||
Err(InherentError::UnrecognizedBatches(coin, update.batches.len().try_into().unwrap()))?
|
||||
}
|
||||
|
||||
// Block producer didn't include update for this coin
|
||||
(None, _) => (),
|
||||
};
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn is_inherent(_: &Self::Call) -> bool {
|
||||
true
|
||||
ValidTransaction::with_tag_prefix("in-instructions")
|
||||
.and_provides((batch.batch.network, batch.batch.id))
|
||||
// Set a 10 block longevity, though this should be included in the next block
|
||||
.longevity(10)
|
||||
.propagate(true)
|
||||
.build()
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
|
@ -18,6 +18,7 @@ scale-info = { version = "2", default-features = false, features = ["derive"] }
|
|||
|
||||
serde = { version = "1", features = ["derive"], optional = true }
|
||||
|
||||
sp-application-crypto = { git = "https://github.com/serai-dex/substrate", default-features = false }
|
||||
sp-std = { git = "https://github.com/serai-dex/substrate", default-features = false }
|
||||
sp-runtime = { git = "https://github.com/serai-dex/substrate", default-features = false }
|
||||
|
||||
|
|
|
@ -11,11 +11,13 @@ use scale_info::TypeInfo;
|
|||
#[cfg(feature = "std")]
|
||||
use serde::{Serialize, Deserialize};
|
||||
|
||||
use sp_application_crypto::sr25519::Signature;
|
||||
|
||||
#[cfg(not(feature = "std"))]
|
||||
use sp_std::vec::Vec;
|
||||
use sp_runtime::RuntimeDebug;
|
||||
|
||||
use serai_primitives::{BlockNumber, BlockHash, SeraiAddress, ExternalAddress, Data, WithAmount};
|
||||
use serai_primitives::{BlockHash, Balance, NetworkId, SeraiAddress, ExternalAddress, Data};
|
||||
|
||||
mod shorthand;
|
||||
pub use shorthand::*;
|
||||
|
@ -40,28 +42,40 @@ pub enum InInstruction {
|
|||
Call(ApplicationCall),
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[derive(Clone, PartialEq, Eq, Encode, Decode, TypeInfo, RuntimeDebug)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct RefundableInInstruction {
|
||||
pub origin: Option<ExternalAddress>,
|
||||
pub instruction: InInstruction,
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct InInstructionWithBalance {
|
||||
pub instruction: InInstruction,
|
||||
pub balance: Balance,
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Encode, Decode, TypeInfo, RuntimeDebug)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct Batch {
|
||||
pub id: BlockHash,
|
||||
pub instructions: Vec<WithAmount<InInstruction>>,
|
||||
pub network: NetworkId,
|
||||
pub id: u32,
|
||||
pub block: BlockHash,
|
||||
pub instructions: Vec<InInstructionWithBalance>,
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Encode, Decode, TypeInfo, RuntimeDebug)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct Update {
|
||||
// Coin's latest block number
|
||||
pub block_number: BlockNumber,
|
||||
pub batches: Vec<Batch>,
|
||||
#[cfg_attr(feature = "std", derive(Serialize, Deserialize))]
|
||||
pub struct SignedBatch {
|
||||
pub batch: Batch,
|
||||
pub signature: Signature,
|
||||
}
|
||||
|
||||
// None if the current block producer isn't operating over this coin or otherwise failed to get
|
||||
// data
|
||||
pub type Updates = Vec<Option<Update>>;
|
||||
#[cfg(feature = "std")]
|
||||
impl Zeroize for SignedBatch {
|
||||
fn zeroize(&mut self) {
|
||||
self.batch.zeroize();
|
||||
self.signature.as_mut().zeroize();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -50,8 +50,6 @@ sc-rpc-api = { git = "https://github.com/serai-dex/substrate" }
|
|||
substrate-frame-rpc-system = { git = "https://github.com/serai-dex/substrate" }
|
||||
pallet-transaction-payment-rpc = { git = "https://github.com/serai-dex/substrate" }
|
||||
|
||||
in-instructions-client = { path = "../in-instructions/client" }
|
||||
|
||||
sc-tendermint = { path = "../tendermint/client" }
|
||||
|
||||
[build-dependencies]
|
||||
|
|
|
@ -11,8 +11,6 @@ use sp_inherents::CreateInherentDataProviders;
|
|||
use sp_consensus::DisableProofRecording;
|
||||
use sp_api::ProvideRuntimeApi;
|
||||
|
||||
use in_instructions_client::InherentDataProvider as InstructionsProvider;
|
||||
|
||||
use sc_executor::{NativeVersion, NativeExecutionDispatch, NativeElseWasmExecutor};
|
||||
use sc_transaction_pool::FullPool;
|
||||
use sc_network::NetworkService;
|
||||
|
@ -59,13 +57,13 @@ impl NativeExecutionDispatch for ExecutorDispatch {
|
|||
pub struct Cidp;
|
||||
#[async_trait::async_trait]
|
||||
impl CreateInherentDataProviders<Block, ()> for Cidp {
|
||||
type InherentDataProviders = (InstructionsProvider,);
|
||||
type InherentDataProviders = ();
|
||||
async fn create_inherent_data_providers(
|
||||
&self,
|
||||
_: <Block as BlockTrait>::Hash,
|
||||
_: (),
|
||||
) -> Result<Self::InherentDataProviders, Box<dyn Send + Sync + Error>> {
|
||||
Ok((InstructionsProvider::new(),))
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
@ -147,7 +147,7 @@ impl Contains<RuntimeCall> for CallFilter {
|
|||
return matches!(call, tokens::Call::burn { .. });
|
||||
}
|
||||
if let RuntimeCall::InInstructions(call) = call {
|
||||
return matches!(call, in_instructions::Call::update { .. });
|
||||
return matches!(call, in_instructions::Call::execute_batch { .. });
|
||||
}
|
||||
|
||||
if let RuntimeCall::ValidatorSets(call) = call {
|
||||
|
|
|
@ -45,5 +45,3 @@ lazy_static = "1"
|
|||
rand_core = "0.6"
|
||||
|
||||
tokio = "1"
|
||||
|
||||
jsonrpsee-server = "0.16"
|
||||
|
|
|
@ -1,10 +1,10 @@
|
|||
use serai_runtime::{in_instructions, InInstructions, Runtime};
|
||||
pub use in_instructions::primitives;
|
||||
use primitives::SignedBatch;
|
||||
|
||||
use crate::{
|
||||
primitives::{Coin, BlockNumber},
|
||||
Serai, SeraiError, scale_value,
|
||||
};
|
||||
use subxt::{tx, utils::Encoded};
|
||||
|
||||
use crate::{Serai, SeraiError, scale_composite};
|
||||
|
||||
const PALLET: &str = "InInstructions";
|
||||
|
||||
|
@ -22,16 +22,11 @@ impl Serai {
|
|||
.await
|
||||
}
|
||||
|
||||
pub async fn get_coin_block_number(
|
||||
&self,
|
||||
coin: Coin,
|
||||
block: [u8; 32],
|
||||
) -> Result<BlockNumber, SeraiError> {
|
||||
Ok(
|
||||
self
|
||||
.storage(PALLET, "BlockNumbers", Some(vec![scale_value(coin)]), block)
|
||||
.await?
|
||||
.unwrap_or(BlockNumber(0)),
|
||||
)
|
||||
pub fn execute_batch(&self, batch: SignedBatch) -> Result<Encoded, SeraiError> {
|
||||
self.unsigned(&tx::dynamic(
|
||||
PALLET,
|
||||
"execute_batch",
|
||||
scale_composite(in_instructions::Call::<Runtime>::execute_batch { batch }),
|
||||
))
|
||||
}
|
||||
}
|
||||
|
|
|
@ -6,6 +6,8 @@ pub(crate) use scale_value::{scale_value, scale_composite};
|
|||
use subxt::ext::scale_value::Value;
|
||||
|
||||
use sp_core::{Pair as PairTrait, sr25519::Pair};
|
||||
|
||||
pub use subxt;
|
||||
use subxt::{
|
||||
error::Error as SubxtError,
|
||||
utils::Encoded,
|
||||
|
@ -14,6 +16,7 @@ use subxt::{
|
|||
extrinsic_params::{BaseExtrinsicParams, BaseExtrinsicParamsBuilder},
|
||||
},
|
||||
tx::{Signer, DynamicTxPayload, TxClient},
|
||||
rpc::types::ChainBlock,
|
||||
Config as SubxtConfig, OnlineClient,
|
||||
};
|
||||
|
||||
|
@ -36,6 +39,8 @@ pub struct Tip {
|
|||
pub tip: u64,
|
||||
}
|
||||
|
||||
pub type Header = SubstrateHeader<<Runtime as Config>::BlockNumber, BlakeTwo256>;
|
||||
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
|
||||
pub struct SeraiConfig;
|
||||
impl SubxtConfig for SeraiConfig {
|
||||
|
@ -47,12 +52,14 @@ impl SubxtConfig for SeraiConfig {
|
|||
// TODO: Bech32m
|
||||
type Address = SeraiAddress;
|
||||
|
||||
type Header = SubstrateHeader<<Runtime as Config>::BlockNumber, BlakeTwo256>;
|
||||
type Header = Header;
|
||||
type Signature = Signature;
|
||||
|
||||
type ExtrinsicParams = BaseExtrinsicParams<SeraiConfig, Tip>;
|
||||
}
|
||||
|
||||
pub type Block = ChainBlock<SeraiConfig>;
|
||||
|
||||
#[derive(Error, Debug)]
|
||||
pub enum SeraiError {
|
||||
#[error("failed to communicate with serai: {0}")]
|
||||
|
@ -116,6 +123,41 @@ impl Serai {
|
|||
Ok(self.0.rpc().finalized_head().await.map_err(SeraiError::RpcError)?.into())
|
||||
}
|
||||
|
||||
pub async fn get_block(&self, hash: [u8; 32]) -> Result<Option<Block>, SeraiError> {
|
||||
let Some(res) =
|
||||
self.0.rpc().block(Some(hash.into())).await.map_err(SeraiError::RpcError)? else {
|
||||
return Ok(None);
|
||||
};
|
||||
|
||||
// Only return finalized blocks
|
||||
let Some(justifications) = res.justifications.as_ref() else { return Ok(None); };
|
||||
if justifications.is_empty() {
|
||||
return Ok(None);
|
||||
}
|
||||
|
||||
Ok(Some(res.block))
|
||||
}
|
||||
|
||||
// Ideally, this would be get_block_hash, not get_block_by_number
|
||||
// Unfortunately, in order to only operate over only finalized data, we have to check the
|
||||
// returned hash is for a finalized block. We can only do that by calling subxt's `block`, which
|
||||
// will return the block and any justifications
|
||||
// If we're already putting in all the work to get the block, we may as well just return it here
|
||||
pub async fn get_block_by_number(&self, number: u64) -> Result<Option<Block>, SeraiError> {
|
||||
let Some(hash) =
|
||||
self.0.rpc().block_hash(Some(number.into())).await.map_err(SeraiError::RpcError)? else {
|
||||
return Ok(None);
|
||||
};
|
||||
self.get_block(hash.into()).await
|
||||
}
|
||||
|
||||
pub fn unsigned(&self, payload: &DynamicTxPayload<'static>) -> Result<Encoded, SeraiError> {
|
||||
TxClient::new(self.0.offline())
|
||||
.create_unsigned(payload)
|
||||
.map(|tx| Encoded(tx.into_encoded()))
|
||||
.map_err(|_| SeraiError::InvalidRuntime)
|
||||
}
|
||||
|
||||
pub fn sign<S: Send + Sync + Signer<SeraiConfig>>(
|
||||
&self,
|
||||
signer: &S,
|
||||
|
|
56
substrate/serai/client/tests/batch.rs
Normal file
|
@ -0,0 +1,56 @@
|
|||
use rand_core::{RngCore, OsRng};
|
||||
|
||||
use sp_core::sr25519::Signature;
|
||||
|
||||
use serai_client::{
|
||||
primitives::{BITCOIN_NET_ID, BITCOIN, BlockHash, SeraiAddress, Amount, Balance},
|
||||
tokens::TokensEvent,
|
||||
in_instructions::{
|
||||
primitives::{InInstruction, InInstructionWithBalance, Batch, SignedBatch},
|
||||
InInstructionsEvent,
|
||||
},
|
||||
Serai,
|
||||
};
|
||||
|
||||
mod runner;
|
||||
use runner::{URL, provide_batch};
|
||||
|
||||
serai_test!(
|
||||
async fn publish_batch() {
|
||||
let network = BITCOIN_NET_ID;
|
||||
let id = 0;
|
||||
|
||||
let mut block_hash = BlockHash([0; 32]);
|
||||
OsRng.fill_bytes(&mut block_hash.0);
|
||||
|
||||
let mut address = SeraiAddress::new([0; 32]);
|
||||
OsRng.fill_bytes(&mut address.0);
|
||||
|
||||
let coin = BITCOIN;
|
||||
let amount = Amount(OsRng.next_u64().saturating_add(1));
|
||||
let balance = Balance { coin, amount };
|
||||
|
||||
let batch = Batch {
|
||||
network,
|
||||
id,
|
||||
block: block_hash,
|
||||
instructions: vec![InInstructionWithBalance {
|
||||
instruction: InInstruction::Transfer(address),
|
||||
balance,
|
||||
}],
|
||||
};
|
||||
let signed = SignedBatch { batch, signature: Signature::from_raw([0; 64]) };
|
||||
let block = provide_batch(signed).await;
|
||||
|
||||
let serai = Serai::new(URL).await.unwrap();
|
||||
let batches = serai.get_batch_events(block).await.unwrap();
|
||||
assert_eq!(batches, vec![InInstructionsEvent::Batch { network, id, block: block_hash }]);
|
||||
|
||||
assert_eq!(
|
||||
serai.get_mint_events(block).await.unwrap(),
|
||||
vec![TokensEvent::Mint { address, balance }],
|
||||
);
|
||||
assert_eq!(serai.get_token_supply(block, coin).await.unwrap(), amount);
|
||||
assert_eq!(serai.get_token_balance(block, coin, address).await.unwrap(), amount);
|
||||
}
|
||||
);
|
|
@ -4,36 +4,65 @@ use rand_core::{RngCore, OsRng};
|
|||
|
||||
use tokio::time::sleep;
|
||||
|
||||
use sp_core::Pair;
|
||||
use sp_core::{sr25519::Signature, Pair};
|
||||
use subxt::{config::extrinsic_params::BaseExtrinsicParamsBuilder};
|
||||
|
||||
use serai_client::{
|
||||
primitives::{
|
||||
BITCOIN, BlockNumber, BlockHash, SeraiAddress, Amount, WithAmount, Balance, Data,
|
||||
ExternalAddress, insecure_pair_from_name,
|
||||
BITCOIN_NET_ID, BITCOIN, BlockHash, SeraiAddress, Amount, Balance, Data, ExternalAddress,
|
||||
insecure_pair_from_name,
|
||||
},
|
||||
in_instructions::{
|
||||
InInstructionsEvent,
|
||||
primitives::{InInstruction, InInstructionWithBalance, Batch, SignedBatch},
|
||||
},
|
||||
in_instructions::primitives::{InInstruction, Batch, Update},
|
||||
tokens::{primitives::OutInstruction, TokensEvent},
|
||||
PairSigner, Serai,
|
||||
};
|
||||
|
||||
mod runner;
|
||||
use runner::{URL, provide_updates};
|
||||
use runner::{URL, provide_batch};
|
||||
|
||||
serai_test!(
|
||||
async fn burn() {
|
||||
let coin = BITCOIN;
|
||||
let mut id = BlockHash([0; 32]);
|
||||
OsRng.fill_bytes(&mut id.0);
|
||||
let block_number = BlockNumber(OsRng.next_u64());
|
||||
let network = BITCOIN_NET_ID;
|
||||
let id = 0;
|
||||
|
||||
let mut block_hash = BlockHash([0; 32]);
|
||||
OsRng.fill_bytes(&mut block_hash.0);
|
||||
|
||||
let pair = insecure_pair_from_name("Alice");
|
||||
let public = pair.public();
|
||||
let address = SeraiAddress::from(public);
|
||||
|
||||
let amount = Amount(OsRng.next_u64());
|
||||
let coin = BITCOIN;
|
||||
let amount = Amount(OsRng.next_u64().saturating_add(1));
|
||||
let balance = Balance { coin, amount };
|
||||
|
||||
let batch = Batch {
|
||||
network,
|
||||
id,
|
||||
block: block_hash,
|
||||
instructions: vec![InInstructionWithBalance {
|
||||
instruction: InInstruction::Transfer(address),
|
||||
balance,
|
||||
}],
|
||||
};
|
||||
let signed = SignedBatch { batch, signature: Signature::from_raw([0; 64]) };
|
||||
let block = provide_batch(signed).await;
|
||||
|
||||
let serai = Serai::new(URL).await.unwrap();
|
||||
let batches = serai.get_batch_events(block).await.unwrap();
|
||||
assert_eq!(batches, vec![InInstructionsEvent::Batch { network, id, block: block_hash }]);
|
||||
|
||||
assert_eq!(
|
||||
serai.get_mint_events(block).await.unwrap(),
|
||||
vec![TokensEvent::Mint { address, balance }]
|
||||
);
|
||||
assert_eq!(serai.get_token_supply(block, coin).await.unwrap(), amount);
|
||||
assert_eq!(serai.get_token_balance(block, coin, address).await.unwrap(), amount);
|
||||
|
||||
// Now burn it
|
||||
let mut rand_bytes = vec![0; 32];
|
||||
OsRng.fill_bytes(&mut rand_bytes);
|
||||
let external_address = ExternalAddress::new(rand_bytes).unwrap();
|
||||
|
@ -42,16 +71,6 @@ serai_test!(
|
|||
OsRng.fill_bytes(&mut rand_bytes);
|
||||
let data = Data::new(rand_bytes).unwrap();
|
||||
|
||||
let batch = Batch {
|
||||
id,
|
||||
instructions: vec![WithAmount { data: InInstruction::Transfer(address), amount }],
|
||||
};
|
||||
let update = Update { block_number, batches: vec![batch] };
|
||||
let block = provide_updates(vec![Some(update)]).await;
|
||||
|
||||
let serai = Serai::new(URL).await.unwrap();
|
||||
assert_eq!(serai.get_token_balance(block, coin, address).await.unwrap(), amount);
|
||||
|
||||
let out = OutInstruction { address: external_address, data: Some(data) };
|
||||
let burn = Serai::burn(balance, out.clone());
|
||||
|
||||
|
|
|
@ -1,18 +1,15 @@
|
|||
use core::time::Duration;
|
||||
use std::sync::Arc;
|
||||
|
||||
use lazy_static::lazy_static;
|
||||
|
||||
use tokio::{sync::Mutex, time::sleep};
|
||||
|
||||
use serai_client::{
|
||||
primitives::Coin,
|
||||
in_instructions::{primitives::Updates, InInstructionsEvent},
|
||||
subxt::config::Header,
|
||||
in_instructions::{primitives::SignedBatch, InInstructionsEvent},
|
||||
Serai,
|
||||
};
|
||||
|
||||
use jsonrpsee_server::RpcModule;
|
||||
|
||||
pub const URL: &str = "ws://127.0.0.1:9944";
|
||||
|
||||
lazy_static! {
|
||||
|
@ -20,73 +17,64 @@ lazy_static! {
|
|||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
pub async fn provide_updates(updates: Updates) -> [u8; 32] {
|
||||
let done = Arc::new(Mutex::new(false));
|
||||
let done_clone = done.clone();
|
||||
let updates_clone = updates.clone();
|
||||
pub async fn provide_batch(batch: SignedBatch) -> [u8; 32] {
|
||||
let serai = Serai::new(URL).await.unwrap();
|
||||
|
||||
let mut rpc = RpcModule::new(());
|
||||
rpc
|
||||
.register_async_method("processor_coinUpdates", move |_, _| {
|
||||
let done_clone = done_clone.clone();
|
||||
let updates_clone = updates_clone.clone();
|
||||
async move {
|
||||
// Sleep to prevent a race condition where we submit the inherents for this block and the
|
||||
// next one, then remove them, making them unverifiable, causing the node to panic for
|
||||
// being self-malicious
|
||||
sleep(Duration::from_millis(500)).await;
|
||||
if !*done_clone.lock().await {
|
||||
Ok(updates_clone)
|
||||
} else {
|
||||
Ok(vec![])
|
||||
}
|
||||
}
|
||||
})
|
||||
.unwrap();
|
||||
let handle = jsonrpsee_server::ServerBuilder::default()
|
||||
.build("127.0.0.1:5134")
|
||||
let mut latest = serai
|
||||
.get_block(serai.get_latest_block_hash().await.unwrap())
|
||||
.await
|
||||
.unwrap()
|
||||
.start(rpc)
|
||||
.unwrap();
|
||||
.unwrap()
|
||||
.header()
|
||||
.number();
|
||||
|
||||
let serai = Serai::new(URL).await.unwrap();
|
||||
loop {
|
||||
let latest = serai.get_latest_block_hash().await.unwrap();
|
||||
let mut batches = serai.get_batch_events(latest).await.unwrap();
|
||||
if batches.is_empty() {
|
||||
sleep(Duration::from_millis(50)).await;
|
||||
continue;
|
||||
}
|
||||
*done.lock().await = true;
|
||||
let execution = serai.execute_batch(batch.clone()).unwrap();
|
||||
serai.publish(&execution).await.unwrap();
|
||||
|
||||
for (index, update) in updates.iter().enumerate() {
|
||||
if let Some(update) = update {
|
||||
let coin_by_index = Coin(u32::try_from(index).unwrap() + 1);
|
||||
// Get the block it was included in
|
||||
let mut block;
|
||||
let mut ticks = 0;
|
||||
'get_block: loop {
|
||||
latest += 1;
|
||||
|
||||
for expected in &update.batches {
|
||||
match batches.swap_remove(0) {
|
||||
InInstructionsEvent::Batch { coin, id } => {
|
||||
assert_eq!(coin, coin_by_index);
|
||||
assert_eq!(expected.id, id);
|
||||
}
|
||||
_ => panic!("get_batches returned non-batch"),
|
||||
block = {
|
||||
let mut block;
|
||||
while {
|
||||
block = serai.get_block_by_number(latest).await.unwrap();
|
||||
block.is_none()
|
||||
} {
|
||||
sleep(Duration::from_secs(1)).await;
|
||||
ticks += 1;
|
||||
|
||||
if ticks > 60 {
|
||||
panic!("60 seconds without inclusion in a finalized block");
|
||||
}
|
||||
}
|
||||
block.unwrap()
|
||||
};
|
||||
|
||||
for extrinsic in block.extrinsics {
|
||||
if extrinsic.0 == execution.0[2 ..] {
|
||||
break 'get_block;
|
||||
}
|
||||
}
|
||||
}
|
||||
let block = block.header.hash().into();
|
||||
|
||||
let batches = serai.get_batch_events(block).await.unwrap();
|
||||
// TODO: impl From<Batch> for BatchEvent?
|
||||
assert_eq!(
|
||||
serai.get_coin_block_number(coin_by_index, latest).await.unwrap(),
|
||||
update.block_number
|
||||
batches,
|
||||
vec![InInstructionsEvent::Batch {
|
||||
network: batch.batch.network,
|
||||
id: batch.batch.id,
|
||||
block: batch.batch.block,
|
||||
}],
|
||||
);
|
||||
}
|
||||
}
|
||||
// This will fail if there were more batch events than expected
|
||||
assert!(batches.is_empty());
|
||||
|
||||
handle.stop().unwrap();
|
||||
handle.stopped().await;
|
||||
// TODO: Check the tokens events
|
||||
|
||||
return latest;
|
||||
}
|
||||
block
|
||||
}
|
||||
|
||||
#[macro_export]
|
||||
|
|
|
@ -1,46 +0,0 @@
|
|||
use rand_core::{RngCore, OsRng};
|
||||
|
||||
use serai_client::{
|
||||
primitives::{BITCOIN, BlockNumber, BlockHash, SeraiAddress, Amount, WithAmount, Balance},
|
||||
tokens::TokensEvent,
|
||||
in_instructions::{
|
||||
primitives::{InInstruction, Batch, Update},
|
||||
InInstructionsEvent,
|
||||
},
|
||||
Serai,
|
||||
};
|
||||
|
||||
mod runner;
|
||||
use runner::{URL, provide_updates};
|
||||
|
||||
serai_test!(
|
||||
async fn publish_updates() {
|
||||
let coin = BITCOIN;
|
||||
let mut id = BlockHash([0; 32]);
|
||||
OsRng.fill_bytes(&mut id.0);
|
||||
let block_number = BlockNumber(OsRng.next_u64());
|
||||
|
||||
let mut address = SeraiAddress::new([0; 32]);
|
||||
OsRng.fill_bytes(&mut address.0);
|
||||
let amount = Amount(OsRng.next_u64());
|
||||
|
||||
let batch = Batch {
|
||||
id,
|
||||
instructions: vec![WithAmount { data: InInstruction::Transfer(address), amount }],
|
||||
};
|
||||
let update = Update { block_number, batches: vec![batch] };
|
||||
let block = provide_updates(vec![Some(update)]).await;
|
||||
|
||||
let serai = Serai::new(URL).await.unwrap();
|
||||
let batches = serai.get_batch_events(block).await.unwrap();
|
||||
assert_eq!(batches, vec![InInstructionsEvent::Batch { coin, id }]);
|
||||
assert_eq!(serai.get_coin_block_number(coin, block).await.unwrap(), block_number);
|
||||
|
||||
assert_eq!(
|
||||
serai.get_mint_events(block).await.unwrap(),
|
||||
vec![TokensEvent::Mint { address, balance: Balance { coin, amount } }]
|
||||
);
|
||||
assert_eq!(serai.get_token_supply(block, coin).await.unwrap(), amount);
|
||||
assert_eq!(serai.get_token_balance(block, coin, address).await.unwrap(), amount);
|
||||
}
|
||||
);
|
|
@ -45,12 +45,3 @@ impl Mul for Amount {
|
|||
Amount(self.0.checked_mul(other.0).unwrap())
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct WithAmount<
|
||||
T: Clone + PartialEq + Eq + Debug + Encode + Decode + MaxEncodedLen + TypeInfo,
|
||||
> {
|
||||
pub data: T,
|
||||
pub amount: Amount,
|
||||
}
|
||||
|
|
|
@ -10,7 +10,7 @@ use sp_core::{ConstU32, bounded::BoundedVec};
|
|||
use serde::{Serialize, Deserialize};
|
||||
|
||||
/// The type used to identify networks.
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct NetworkId(pub u16);
|
||||
impl From<u16> for NetworkId {
|
||||
|
@ -24,7 +24,7 @@ pub const ETHEREUM_NET_ID: NetworkId = NetworkId(1);
|
|||
pub const MONERO_NET_ID: NetworkId = NetworkId(2);
|
||||
|
||||
/// The type used to identify coins.
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct Coin(pub u32);
|
||||
impl From<u32> for Coin {
|
||||
|
|
|
@ -11,7 +11,7 @@ use scale_info::TypeInfo;
|
|||
#[cfg(feature = "std")]
|
||||
use serde::{Serialize, Deserialize};
|
||||
|
||||
use serai_primitives::{SeraiAddress, ExternalAddress, Data, pallet_address};
|
||||
use serai_primitives::{Balance, SeraiAddress, ExternalAddress, Data, pallet_address};
|
||||
|
||||
pub const ADDRESS: SeraiAddress = pallet_address(b"Tokens");
|
||||
|
||||
|
@ -22,6 +22,13 @@ pub struct OutInstruction {
|
|||
pub data: Option<Data>,
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct OutInstructionWithBalance {
|
||||
pub instruction: OutInstruction,
|
||||
pub balance: Balance,
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Debug, Encode, Decode, MaxEncodedLen, TypeInfo)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub enum Destination {
|
||||
|
|
|
@ -17,7 +17,7 @@ use serai_primitives::NetworkId;
|
|||
pub struct Session(pub u32);
|
||||
|
||||
/// The type used to identify a specific validator set during a specific session.
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Debug, Encode, Decode, TypeInfo, MaxEncodedLen)]
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug, Encode, Decode, TypeInfo, MaxEncodedLen)]
|
||||
#[cfg_attr(feature = "std", derive(Zeroize, Serialize, Deserialize))]
|
||||
pub struct ValidatorSet {
|
||||
pub session: Session,
|
||||
|
|