EVM: Avoid memory.read() Memory Copy #2573

Merged (5 commits, Mar 9, 2023)

15 changes: 5 additions & 10 deletions packages/evm/src/memory.ts
@@ -63,21 +63,16 @@ export class Memory {
* It fills up the difference between memory's length and `offset + size` with zeros.
* @param offset - Starting position
* @param size - How many bytes to read
* @param avoidCopy - Avoid memory copy if possible for performance reasons (optional)
*/
read(offset: number, size: number): Buffer {
read(offset: number, size: number, avoidCopy?: boolean): Buffer {
this.extend(offset, size)

const returnBuffer = Buffer.allocUnsafe(size)
// Copy the stored "buffer" from memory into the return Buffer

const loaded = this._store.slice(offset, offset + size)
returnBuffer.fill(loaded, 0, loaded.length)

if (loaded.length < size) {
// fill the remaining part of the Buffer with zeros
returnBuffer.fill(0, loaded.length, size)
if (avoidCopy === true) {
return loaded
}
Member:

I think what we can do here is:

  • Return the slice of the `_store`
  • In case `loaded.length` does not equal `size`, expand the current internal `_store` with extra zeros; use the `extend` method to allocate in 8 KB chunks (see the sketch below)
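
A minimal sketch of that shape (illustrative only; it assumes the existing `_store` Buffer and the `extend()` method that already zero-pads in 8 KB chunks):

    // Hypothetical: hand out a view into the internal store instead of
    // allocating and filling a fresh Buffer on every read.
    read(offset: number, size: number): Buffer {
      // extend() grows this._store (zero-filled, in 8 KB chunks) so that
      // offset + size is always covered.
      this.extend(offset, size)
      // slice() shares memory with this._store, so no bytes are copied here.
      return this._store.slice(offset, offset + size)
    }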

Member Author:

Ah, that's a good idea, I will try that (I have about 3 work hours from now on). I would nevertheless want to keep this PR within its limited scope (so "free" the memory copies one by one or, rather, gradually over time), otherwise this will get too hairy and hard to control.

Within the current scope this already feels pretty good though; I will also test on some mainnet blocks loaded with a couple of hundred txs.

Member Author:

Quick update: ok, tested with `npm run client:start -- --discDns=false --discV4=false --maxPeers=0 --executeBlocks=1400000-1401000` on a couple of thousand txs, and that all went well. Only one fork (Homestead), but already a good sign. 🙂

Member Author:

Hmm. There is actually already a `this.extend(offset, size)` call at the beginning of the method.

I don't think it would have any effect if I just deleted the zero-filling conditional (I don't see how it can ever evaluate to true in the current setup of the method).

    if (loaded.length < size) {
      // fill the remaining part of the Buffer with zeros
      returnBuffer.fill(0, loaded.length, size)
    }

Will do that.

Or maybe, from a procedural point of view: will first look into the current test failures, and then do it. 😋

Member:

Ah right, sorry, there is indeed the `extend` method above.

Somewhat unrelated, but each time we extend and it creates an 8 KB chunk of memory, we `Buffer.concat` the Buffers, which, I think (https://nodejs.org/api/buffer.html#static-method-bufferconcatlist-totallength), copies the previous Buffer as well, which we also might not want. I am not sure how other clients allocate these Buffers; you could split them up into chunks of whatever size without copying them, but then if you want to read across 2+ containers the read logic gets a bit complex (and you have to copy them again). (Think of memory in this "optimized" (?) case as a linked list?)
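
For illustration, a minimal sketch of what such a chunked layout could look like (a hypothetical `ChunkedMemory`, not part of this PR; reads that span chunk boundaries still have to copy):

    const CHUNK_SIZE = 8192 // 8 KB, matching the current extension granularity

    class ChunkedMemory {
      private chunks: Buffer[] = []

      private ensure(offset: number, size: number) {
        // Allocate fresh zero-filled chunks until offset + size is covered,
        // without ever copying an existing chunk.
        while (this.chunks.length * CHUNK_SIZE < offset + size) {
          this.chunks.push(Buffer.alloc(CHUNK_SIZE))
        }
      }

      read(offset: number, size: number): Buffer {
        this.ensure(offset, size)
        const startChunk = Math.floor(offset / CHUNK_SIZE)
        const endChunk = Math.floor((offset + size - 1) / CHUNK_SIZE)
        if (startChunk === endChunk) {
          // Fast path: the range lives in a single chunk, return a zero-copy view.
          const start = offset % CHUNK_SIZE
          return this.chunks[startChunk].slice(start, start + size)
        }
        // Slow path: the range spans 2+ chunks, so the bytes get stitched
        // together into a fresh Buffer (i.e. copied after all).
        const out = Buffer.allocUnsafe(size)
        let written = 0
        for (let i = startChunk; i <= endChunk; i++) {
          const from = i === startChunk ? offset % CHUNK_SIZE : 0
          const to = i === endChunk ? ((offset + size - 1) % CHUNK_SIZE) + 1 : CHUNK_SIZE
          written += this.chunks[i].copy(out, written, from, to)
        }
        return out
      }
    }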

Member Author:

Oh, that's interesting and might also be something worth addressing. Is there any spec-related reason why we start memory at 0? Is this given by the protocol? Otherwise we could at least save this first copy-over step by directly initializing with 96 bytes.

Just ran some mainnet blocks with a `console.log` of `Memory extended by ${sizeDiff} to ${newSize}` injected into the `extend()` method (only when an extension actually takes place), and extending once to 96 is the most common extension operation. 🤔

[attached screenshot: console output of the logged memory extensions]

(I have the impression I am in the range of some attack blocks, which I guess explains the occasional repeated 8768-byte extensions; no confirmation on this though, and in the end this is just a side question anyway.)

Member (@jochem-brouwer), Mar 8, 2023:

This 96 is due to Solidity: it initializes the "free memory pointer" at slot 0x40 (64), which thus writes 32 bytes at offset 64 and extends memory to 96 bytes. I am not sure if I am ok with directly initializing it to 96 bytes, since it is super likely that more memory writes will happen anyway, and if Solidity changes their initial memory layout we would have this artifact lying around.
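
As a quick illustration of where the 96 comes from (simple word-alignment arithmetic, not code from this repo):

    // Solidity's first MSTORE writes the free memory pointer at offset 0x40.
    const offset = 0x40 // 64
    const size = 32 // one EVM word
    // Memory grows to the next 32-byte word boundary covering offset + size:
    const newSize = Math.ceil((offset + size) / 32) * 32
    console.log(newSize) // 96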

Member Author:

I will experiment with it a bit.

I tested 1024; that seemed like a reasonable default after some tests and looking into the numbers. The performance tests do not gain from it though, so it doesn't seem like such a pressing topic to solve. I would still like to do some mainnet block tests as well though.


return returnBuffer
return Buffer.from(loaded)
}
}
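
Putting the hunks above together, the new `read()` roughly ends up like this (reconstructed from the diff for readability; not necessarily the verbatim merged file):

    read(offset: number, size: number, avoidCopy?: boolean): Buffer {
      this.extend(offset, size)

      // A view into the internal store; it shares memory with this._store.
      const loaded = this._store.slice(offset, offset + size)
      if (avoidCopy === true) {
        // The caller opted in to receiving the view directly (no copy).
        return loaded
      }

      // Default behavior: hand out a copy so callers cannot mutate EVM memory.
      return Buffer.from(loaded)
    }
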
12 changes: 6 additions & 6 deletions packages/evm/src/opcodes/functions.ts
@@ -918,7 +918,7 @@ export const handlers: Map<number, OpHandler> = new Map([

let data = Buffer.alloc(0)
if (length !== BigInt(0)) {
data = runState.memory.read(Number(offset), Number(length))
data = runState.memory.read(Number(offset), Number(length), true)
}

const ret = await runState.interpreter.create(gasLimit, value, data)
@@ -940,7 +940,7 @@ export const handlers: Map<number, OpHandler> = new Map([

let data = Buffer.alloc(0)
if (length !== BigInt(0)) {
data = runState.memory.read(Number(offset), Number(length))
data = runState.memory.read(Number(offset), Number(length), true)
}

const ret = await runState.interpreter.create2(
@@ -962,7 +962,7 @@ export const handlers: Map<number, OpHandler> = new Map([

let data = Buffer.alloc(0)
if (inLength !== BigInt(0)) {
data = runState.memory.read(Number(inOffset), Number(inLength))
data = runState.memory.read(Number(inOffset), Number(inLength), true)
}

const gasLimit = runState.messageGasLimit!
@@ -987,7 +987,7 @@ export const handlers: Map<number, OpHandler> = new Map([

let data = Buffer.alloc(0)
if (inLength !== BigInt(0)) {
data = runState.memory.read(Number(inOffset), Number(inLength))
data = runState.memory.read(Number(inOffset), Number(inLength), true)
}

const ret = await runState.interpreter.callCode(gasLimit, toAddress, value, data)
@@ -1007,7 +1007,7 @@ export const handlers: Map<number, OpHandler> = new Map([

let data = Buffer.alloc(0)
if (inLength !== BigInt(0)) {
data = runState.memory.read(Number(inOffset), Number(inLength))
data = runState.memory.read(Number(inOffset), Number(inLength), true)
}

const gasLimit = runState.messageGasLimit!
@@ -1121,7 +1121,7 @@ export const handlers: Map<number, OpHandler> = new Map([

let data = Buffer.alloc(0)
if (inLength !== BigInt(0)) {
data = runState.memory.read(Number(inOffset), Number(inLength))
data = runState.memory.read(Number(inOffset), Number(inLength), true)
}

const ret = await runState.interpreter.callStatic(gasLimit, toAddress, value, data)
2 changes: 1 addition & 1 deletion packages/evm/src/precompiles/04-identity.ts
@@ -31,6 +31,6 @@ export function precompile04(opts: PrecompileInput): ExecResult {

return {
executionGasUsed: gasUsed,
returnValue: data,
returnValue: Buffer.from(data), // Copy the memory (`Buffer.from()`)
}
}
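
A note on why the identity precompile now copies (my reading of the change; it is not spelled out in the diff itself): with `avoidCopy`, the `data` handed to the precompile can be a view into the interpreter's memory, so returning it unchanged would let the return value alias live memory, while `Buffer.from(data)` snapshots the bytes. A toy illustration of the difference between a view and a copy:

    const store = Buffer.from([1, 2, 3, 4])
    const view = store.slice(0, 2)  // shares memory with store
    const copy = Buffer.from(view)  // independent copy of those bytes
    store[0] = 9
    console.log(view[0]) // 9: the view reflects the later mutation
    console.log(copy[0]) // 1: the copy does not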