
Add event with XCM executor outcome, which includes weight fee #1286

Merged

Conversation

girazoki
Contributor

Emits an event with the outcome of the XCM executor, which in both the Success and Error cases includes the weight used. The weight-used information is useful for monitoring purposes, e.g., when fees are charged. I didn't want to disrupt the existing events, so I maintained them.
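
For context, a minimal sketch of how the weight used falls out of the executor's Outcome on both the success and error paths (assuming the XCM v2 Outcome type quoted further down, where Weight is u64; the helper name here is hypothetical, and Outcome::weight_used() already performs this match):

use xcm::latest::Outcome;

// Illustrative sketch, not the PR's code: both the complete and the
// partially-executed outcomes carry the weight consumed, while a hard
// error consumed none. This is what `Outcome::weight_used()` returns.
fn weight_of(outcome: &Outcome) -> u64 {
	match outcome {
		Outcome::Complete(weight) => *weight,
		Outcome::Incomplete(weight, _error) => *weight,
		Outcome::Error(_error) => 0,
	}
}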

@gilescope
Contributor

Personally I'd prefer altering the existing events to include weight used if we're going to do this.

@girazoki
Contributor Author

I also prefer that. Let me change it to that

@girazoki
Contributor Author

Modified

Contributor

@joepetrowski left a comment

Agree that this is useful, but it does seem strange to change only some of the Event variants from tuples to structs. I think @KiChjang worked on updating events; should we combine that with this?

@girazoki
Contributor Author

Is it this PR: https://github.com/paritytech/cumulus/pull/1260/files? If so, it does not look like it changed those events in xcmp-queue. I can switch back to tuples or change everything to named events; let me know which is preferred.

@joepetrowski
Contributor

Yeah, exactly, it looks like that PR missed this. I think going to all named events makes sense.
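
For reference, the distinction being discussed, sketched with hypothetical field names (not the PR's exact variants): tuple variants expose only positional fields, while named (struct-style) variants are self-describing.

pub enum TupleStyle {
	// Positional fields: consumers must know that .0 is the id and .1 the weight.
	ExecutedDownward([u8; 32], u64),
}

pub enum NamedStyle {
	// Named fields: self-describing in code and in the generated metadata.
	ExecutedDownward { message_id: [u8; 32], weight_used: u64 },
}

Named fields are also emitted into the pallet metadata, so indexers and front-ends can decode events by field name rather than by position.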

@girazoki
Contributor Author

Updated!

Contributor

@joepetrowski left a comment

LGTM. I'd just like @KiChjang or @gavofyork to weigh in on the message_id naming w/r/t paritytech/polkadot#4756 so that XCM v3 doesn't change the meaning of ID.

But based on the definition and usage of id in this section of the UMP queue, I think it is OK (or we should use sp_io::hashing::blake2_256 instead of T::Hashing::hash):

/// Returns a [`MessageId`] for the given upward message payload.
fn upward_message_id(data: &[u8]) -> MessageId {
	sp_io::hashing::blake2_256(data)
}

impl<XcmExecutor: xcm::latest::ExecuteXcm<C::Call>, C: Config> UmpSink for XcmSink<XcmExecutor, C> {
	fn process_upward_message(
		origin: ParaId,
		mut data: &[u8],
		max_weight: Weight,
	) -> Result<Weight, (MessageId, Weight)> {
		/* snip */

		let id = upward_message_id(&data[..]);

		/* snip */

		match maybe_msg_and_weight {
			Err(_) => {
				Pallet::<C>::deposit_event(Event::InvalidFormat(id));
				Ok(0)
			},
			Ok((Err(()), weight_used)) => {
				Pallet::<C>::deposit_event(Event::UnsupportedVersion(id));
				Ok(weight_used)
			},
			Ok((Ok(xcm_message), weight_used)) => {
				let xcm_junction = Junction::Parachain(origin.into());
				let outcome = XcmExecutor::execute_xcm(xcm_junction, xcm_message, id, max_weight);
				match outcome {
					Outcome::Error(XcmError::WeightLimitReached(required)) => Err((id, required)),
					outcome => {
						let outcome_weight = outcome.weight_used();
						Pallet::<C>::deposit_event(Event::ExecutedUpward(id, outcome));
						Ok(weight_used.saturating_add(outcome_weight))
					},
				}
			},
		}
	}
}
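
To make the parenthetical above concrete: sp_io::hashing::blake2_256 is a fixed primitive, while T::Hashing is whatever hasher the runtime configures, so the two derivations only coincide when the configured hasher happens to be BlakeTwo256. A minimal sketch of the comparison (illustrative, not code from either PR):

use sp_io::hashing::blake2_256;
use sp_runtime::traits::{BlakeTwo256, Hash};

// Fixed-algorithm id, matching `upward_message_id` above.
fn id_fixed(payload: &[u8]) -> [u8; 32] {
	blake2_256(payload)
}

// Runtime-configurable id: the digest depends on the chain's `T::Hashing`.
fn id_configurable<H: Hash>(payload: &[u8]) -> H::Output {
	H::hash(payload)
}

fn main() {
	let payload = b"example upward message";
	// The two only agree when the configured hasher is BlakeTwo256.
	assert_eq!(id_fixed(payload), id_configurable::<BlakeTwo256>(payload).0);
}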

@girazoki
Contributor Author

I think #697 already changes it to sp_io::hashing::blake2_256.

@KiChjang
Contributor

Can this merge into the XCMv3 branch instead?

@girazoki
Contributor Author

Yeah, absolutely

@girazoki
Contributor Author

@KiChjang can you merge master into #697?

Contributor

@KiChjang left a comment

Actually, it doesn't matter; XCMv3 can merge master right afterwards and it'll be fine.

@joepetrowski
Contributor

bot merge
