
[Proposal] Endpoint to query using Normalized Transaction IDs #506

Open
eordano opened this issue Apr 18, 2017 · 0 comments
Comments

@eordano
Contributor

eordano commented Apr 18, 2017

Hi everybody,

I was wondering whether a pull request adding a new index for querying transactions by a non-malleable transaction id (like the one described in BIP 140) would be welcome in insight. In particular, it would:

  • Add a /ntx/:hash endpoint that returns the current transaction id for the given normalized transaction
  • Add an index from ntxid -> (txid, block hash) to serve that query
  • Update the index whenever a transaction is accepted into the mempool (only if the block hash stored for that ntxid is null) or confirmed in a block.
    I think this is enough to stay safe from invalid state arising from reorgs.
  • Provide an upgrade plan for already running insight nodes: process all transactions from the genesis block to populate the index initially, keeping a "latest block hash processed" marker in the database
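The index and update rules above can be sketched as follows. This is a minimal in-memory model; the class and method names are hypothetical, not insight's actual API, and a real implementation would persist to insight's database instead of a Map:

```javascript
// Minimal sketch of the proposed ntxid -> (txid, block hash) index.
class NtxIndex {
  constructor() {
    this.map = new Map(); // ntxid -> { txid, blockHash }
  }

  // A transaction entered the mempool: record it with a null block
  // hash, but never overwrite an entry that is already confirmed.
  onMempoolTx(ntxid, txid) {
    const entry = this.map.get(ntxid);
    if (!entry || entry.blockHash === null) {
      this.map.set(ntxid, { txid, blockHash: null });
    }
  }

  // A transaction was confirmed: always overwrite, so the index points
  // at whichever (possibly malleated) version made it into a block.
  onConfirmedTx(ntxid, txid, blockHash) {
    this.map.set(ntxid, { txid, blockHash });
  }

  // Backs the proposed GET /ntx/:hash endpoint.
  query(ntxid) {
    return this.map.get(ntxid) || null;
  }
}
```

The key design point is that a confirmed entry wins over any mempool entry for the same ntxid, which is what makes the answer to /ntx/:hash the "final" transaction id.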

This would let wallets, and any software that checks for transaction inclusion in the blockchain, query the actual "final" transaction id of a transaction created by a non-validating client, without having to work out which malleated version of the transaction spent the previous transaction outputs.

The downside of this approach is having yet another transaction index, which would take quite a bit of disk space: around 16 gigabytes by my estimate (disk usage is currently the biggest pain of running an insight node, in my opinion). Initial indexing would also take longer, but I think roughly 200 million hashes can be computed in about 5 minutes on a relatively modern CPU with an up-to-date Node.js.
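As a rough sanity check on that figure, here is one possible back-of-envelope calculation. The record layout is an assumption (a 32-byte ntxid key plus a 32-byte txid and a 32-byte block hash per transaction), and database overhead and compression are ignored:

```javascript
// Back-of-envelope disk estimate for the proposed index.
const txCount = 200e6;               // assumed: ~200 million transactions
const bytesPerRecord = 32 + 32 + 32; // assumed: ntxid key + txid + block hash
const totalBytes = txCount * bytesPerRecord;
const totalGB = totalBytes / 1e9;
console.log(totalGB); // 19.2, in the same ballpark as the 16 GB estimate
```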

Does this have any chance of getting merged into insight?
