
feat: add flag to execute scripts in parallel w.r.t. chains #203

Merged: 9 commits, Apr 5, 2024
1 change: 1 addition & 0 deletions .gitignore
@@ -106,6 +106,7 @@ dist
build/
cache/
artifacts/
chains-info/

local.json
keys.json
2 changes: 2 additions & 0 deletions evm/cli-utils.js
@@ -13,6 +13,8 @@ const addBaseOptions = (program, options = {}) => {
.env('ENV'),
);
program.addOption(new Option('-y, --yes', 'skip deployment prompt confirmation').env('YES'));
program.addOption(new Option('--parallel', 'run the script in parallel w.r.t. chains'));
program.addOption(new Option('--saveChainSeparately', 'save chain info separately'));
program.addOption(new Option('--gasOptions <gasOptions>', 'gas options cli override'));

if (!options.ignoreChainNames) {
82 changes: 81 additions & 1 deletion evm/utils.js
@@ -26,6 +26,7 @@ const {
const { CosmWasmClient } = require('@cosmjs/cosmwasm-stargate');
const CreateDeploy = require('@axelar-network/axelar-gmp-sdk-solidity/artifacts/contracts/deploy/CreateDeploy.sol/CreateDeploy.json');
const IDeployer = require('@axelar-network/axelar-gmp-sdk-solidity/interfaces/IDeployer.json');
const { exec } = require('child_process');
const { verifyContract } = require(`${__dirname}/../axelar-chains-config`);

const getSaltFromKey = (key) => {
@@ -668,10 +669,18 @@ function loadConfig(env) {
return require(`${__dirname}/../axelar-chains-config/info/${env}.json`);
}

function loadParallelExecutionConfig(env, chain) {
return require(`${__dirname}/../chains-info/${env}-${chain}.json`);
}

function saveConfig(config, env) {
writeJSON(config, `${__dirname}/../axelar-chains-config/info/${env}.json`);
}

function saveParallelExecutionConfig(config, env, chain) {
@milapsheth (Member) commented on Jun 4, 2024:

Note that the implementation of this function is not tied to a notion of parallel execution, so the naming isn't quite right. Furthermore, it's pretty much the same as the existing functions. So instead of adding a new helper, generalize loadConfig/saveConfig to take a filename/relative path. It can default to null, meaning the current setting, but otherwise allow customizing the name. Using a different info directory is confusing as well, since it's still saving the config.

writeJSON(config, `${__dirname}/../chains-info/${env}-${chain}.json`);
}
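
A minimal sketch of the generalization suggested in the comment above (not what was merged): let loadConfig/saveConfig accept an optional file path that falls back to the current per-environment location. The file parameter name is an illustrative assumption.

// Illustrative sketch only: an optional file path instead of parallel-specific helpers.
function loadConfig(env, file = null) {
    return require(file || `${__dirname}/../axelar-chains-config/info/${env}.json`);
}

function saveConfig(config, env, file = null) {
    writeJSON(config, file || `${__dirname}/../axelar-chains-config/info/${env}.json`);
}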

async function printWalletInfo(wallet, options = {}) {
let balance = 0;
const address = await wallet.getAddress();
@@ -852,6 +861,73 @@ const mainProcessor = async (options, processCommand, save = true, catchErr = fa
}
}

if (options.parallel && chains.length > 1) {
Review comment (Member):

Please avoid making functions so complex. Use clear and concise helper functions to organize the code.

const cmds = process.argv.filter((command) => command);
let chainCommandIndex = -1;
let skipPrompt = false;

for (let commandIndex = 0; commandIndex < cmds.length; commandIndex++) {
@milapsheth (Member) commented on Jun 4, 2024:

This is a weird way to run it in parallel (like what you'd need to do in bash). Why not run processCommand without an await for each chain and wait at the end?

@milapsheth (Member) commented on Jun 4, 2024:

I imagine it was done this way to separate stdout capture between parallel runs. I'm wondering if we can capture the console.log output separately, or reassign the .log field to something that stores the output in a list so we can print it afterwards, or use a process library to do the above more cleanly. If there's no good way to do this in JS, then we can keep the exec approach but refactor it to make it cleaner.
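
One possible shape for the idea floated here (not part of this diff): call processCommand directly for each chain and route its console output into a per-chain buffer via Node's AsyncLocalStorage, printing each buffer when the run finishes. The processCommand signature is assumed to match the one used in the sequential loop below; helper names are illustrative.

// Illustrative sketch only: buffer console output per chain so parallel runs don't interleave.
const { AsyncLocalStorage } = require('async_hooks');

const logStore = new AsyncLocalStorage();
const originalLog = console.log;

console.log = (...args) => {
    const buffer = logStore.getStore();

    if (buffer) {
        buffer.push(args.join(' '));
    } else {
        originalLog(...args);
    }
};

const runChainBuffered = async (config, chainName, options, processCommand) => {
    const chain = config.chains[chainName.toLowerCase()];
    const buffer = [];

    try {
        await logStore.run(buffer, () => processCommand(config, chain, options));
    } finally {
        originalLog(`--- Logs for ${chainName} ---\n${buffer.join('\n')}`);
    }
};

// Kick off every chain without awaiting individually, then wait for all of them:
// await Promise.all(chains.map((chainName) => runChainBuffered(config, chainName, options, processCommand)));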

const cmd = cmds[commandIndex];

if (cmd === '-n' || cmd === '--chainName' || cmd === '--chainNames') {
chainCommandIndex = commandIndex;
} else if (cmd === '--parallel') {
cmds[commandIndex] = '--saveChainSeparately';
} else if (cmd === '-y' || cmd === '--yes') {
skipPrompt = true;
}
}

if (!skipPrompt) {
cmds.push('-y');
}

const successfullChains = [];

const executeChain = (chainName) => {
const chain = config.chains[chainName.toLowerCase()];

if (
chainsToSkip.includes(chain.name.toLowerCase()) ||
chain.status === 'deactive' ||
(chain.contracts && chain.contracts[options.contractName]?.skip)
) {
printWarn('Skipping chain', chain.name);
return Promise.resolve();
}

return new Promise((resolve) => {
cmds[chainCommandIndex + 1] = chainName;

exec(cmds.join(' '), { stdio: 'inherit' }, (error, stdout) => {
printInfo('-------------------------------------------------------');
printInfo(`Logs for ${chainName}`, stdout);

if (error) {
printError(`Error while running script for ${chainName}`, error);
} else {
successfullChains.push(chainName);
printInfo(`Finished running script for chain`, chainName);
}

resolve();
});
});
};

await Promise.all(chains.map(executeChain));

if (save) {
for (const chainName of successfullChains) {
config.chains[chainName.toLowerCase()] = loadParallelExecutionConfig(options.env, chainName);
}

saveConfig(config, options.env);
}

return;
}

@milapsheth (Member) commented on Jun 4, 2024:

I don't see a need for separate code like the above to handle parallel execution. Create a chainProcessor function that handles most of the for-loop body. Depending on sequential/parallel mode, it can either be run with an await or have its promise collected (or, if needed, use an exec inside the chainProcessor function), and we await everything at the end.

To deal with configs in parallel mode, you can give a copy to each call so they don't write to the same object, and then merge the chain-specific changes at the end. No need to save configs separately.

Review comment (Member):

An initial complex solution is usually not the right one. I recommend spending more time to come up with a more elegant approach and discussing it during syncs if needed; otherwise the codebase can easily become unmaintainable.
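
A rough sketch of the refactor these comments suggest (not part of this PR): one chainProcessor shared by both modes, with parallel runs operating on deep copies of the config whose chain entries are merged back before a single save. The names and the deep-copy step are illustrative assumptions.

// Illustrative sketch only: one processor for both sequential and parallel modes.
const chainProcessor = async (config, chainName, options, processCommand) => {
    const chain = config.chains[chainName.toLowerCase()];
    await processCommand(config, chain, options);
    return chain;
};

if (options.parallel) {
    // Give each run its own copy of the config so concurrent runs don't mutate the same object.
    const updatedChains = await Promise.all(
        chains.map((chainName) => chainProcessor(JSON.parse(JSON.stringify(config)), chainName, options, processCommand)),
    );

    // Merge the chain-specific changes back into the shared config, then save once.
    for (const chain of updatedChains) {
        config.chains[chain.name.toLowerCase()] = chain;
    }
} else {
    for (const chainName of chains) {
        await chainProcessor(config, chainName, options, processCommand);
    }
}

if (save) {
    saveConfig(config, options.env);
}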

for (const chainName of chains) {
const chain = config.chains[chainName.toLowerCase()];

@@ -878,7 +954,11 @@ const mainProcessor = async (options, processCommand, save = true, catchErr = fa
}

if (save) {
saveConfig(config, options.env);
if (options.saveChainSeparately) {
saveParallelExecutionConfig(config.chains[chainName.toLowerCase()], options.env, chainName);
} else {
saveConfig(config, options.env);
}
}
}
};