This repository has been archived by the owner on Feb 12, 2024. It is now read-only.

feat: store pins in datastore instead of a DAG #2771

Merged: 47 commits, Aug 25, 2020
Commits (all by achingbrain):
- 582be49 feat: store pins in datastore instead of a DAG (Mar 3, 2020)
- b65c94b chore: revert update of ipfs-repo in mfs dev deps (Mar 4, 2020)
- a8e1a06 chore: fix interop tests (Mar 5, 2020)
- 6ff3a10 chore: dedupe pinning tests (Mar 5, 2020)
- e9ab233 chore: fix up http tests (Mar 5, 2020)
- 68345b2 feat: new api (Mar 6, 2020)
- 1d304db fix: handle invalid paths in http routes (Mar 6, 2020)
- 8aeb819 chore: fix up tests (Mar 6, 2020)
- ab70c18 chore: responding to pr comments (Mar 10, 2020)
- 1185cb9 chore: responding to pr comments (Mar 10, 2020)
- d213521 Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Jun 30, 2020)
- 591b7f2 chore: fix tests (Jun 30, 2020)
- b534a61 Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Jul 2, 2020)
- 96399fd chore: fix tests (Jul 3, 2020)
- af90886 Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Jul 21, 2020)
- d89a330 chore: split pin.add and pin.rm into pin.addAll and pin.rmAll (Jul 21, 2020)
- b8a2640 chore: fix tests (Jul 21, 2020)
- 00e3671 chore: fix more tests (Jul 21, 2020)
- c98b50c chore: make depth infinity for recursive pins (Jul 21, 2020)
- d4f3796 chore: keep multihash prefix for key (Jul 21, 2020)
- 692664d chore: correct multihash (Jul 21, 2020)
- 2864906 Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Aug 5, 2020)
- 94204f8 chore: updating deps (Aug 10, 2020)
- ea82cf9 chore: update deps (Aug 11, 2020)
- 8a1d959 Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Aug 12, 2020)
- e8c2543 chore: replace node buffers with uint8arrays (Aug 14, 2020)
- 0d3a1a1 chore: add missing dep (Aug 15, 2020)
- aa591c7 chore: add missing deps (Aug 15, 2020)
- 86bcd70 chore: fix up normalisation tests (Aug 15, 2020)
- cdd80ef chore: fix typo (Aug 15, 2020)
- bfc74d6 chore: fix failing test (Aug 15, 2020)
- 173905f chore: fix pubsub tests (Aug 15, 2020)
- d072a40 chore: remove unused dep (Aug 15, 2020)
- 5879a49 chore: fix failing test (Aug 15, 2020)
- caf8ba5 chore: use base64pad and print migration progress (Aug 15, 2020)
- 50ad0f2 chore: fix object data (Aug 15, 2020)
- 903e2cf chore: fix base string (Aug 15, 2020)
- ba5c29b chore: fix failing tests (Aug 16, 2020)
- 5ad0906 chore: fix up examples (Aug 16, 2020)
- a73ec06 chore: fix circuit relay test (Aug 16, 2020)
- a069f4c chore: ignore invalid messages (Aug 17, 2020)
- b2a73f7 chore: update ipfs-utils version (Aug 24, 2020)
- 27d1866 chore: update dep versions (Aug 24, 2020)
- ce097df Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Aug 24, 2020)
- 6efcc3a chore: stringify ipns record properly (Aug 24, 2020)
- 2cff1c8 chore: update dep version (Aug 24, 2020)
- 8914194 Merge remote-tracking branch 'origin/master' into refactor/store-pins… (Aug 24, 2020)
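Commit d89a330 above splits the streaming pin calls out of `pin.add` and `pin.rm`. A minimal sketch of how that split might be consumed, going by the commit message and the async-iterable style used throughout this PR (the argument and return shapes here are assumptions, not confirmed by this page):

```js
const all = require('it-all')

// pin.add pins a single root; pin.addAll is assumed to accept many
// sources and yield one result per pinned root as an async iterable
const pinned = await ipfs.pin.add('QmHash')
const allPinned = await all(ipfs.pin.addAll(['QmHash1', 'QmHash2']))

// pin.rm and pin.rmAll are assumed to mirror the same single/many split
await ipfs.pin.rm('QmHash')
for await (const unpinned of ipfs.pin.rmAll(['QmHash1', 'QmHash2'])) {
  console.log(`unpinned ${unpinned}`)
}
```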
2 changes: 1 addition & 1 deletion docs/BROWSERS.md
@@ -66,7 +66,7 @@ document.addEventListener('DOMContentLoaded', async () => {
const cid = results[0].hash
console.log('CID created via ipfs.add:', cid)
const data = await node.cat(cid)
- console.log('Data read back via ipfs.cat:', data.toString())
+ console.log('Data read back via ipfs.cat:', new TextDecoder().decode(data))
})
</script>
```
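The `docs/BROWSERS.md` change above swaps a Buffer-specific `toString()` for the standard `TextDecoder`, which browsers provide natively. A self-contained round-trip sketch (not part of the diff):

```js
// Encode a string to a plain Uint8Array and decode it back; no Buffer
// polyfill is needed in the browser
const bytes = new TextEncoder().encode('hello world')
const text = new TextDecoder().decode(bytes)
console.log(text) // 'hello world'
```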
89 changes: 54 additions & 35 deletions docs/MIGRATION-TO-ASYNC-AWAIT.md
@@ -171,9 +171,10 @@ e.g.

```js
const readable = ipfs.catReadableStream('QmHash')
+ const decoder = new TextDecoder()

readable.on('data', chunk => {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
})

readable.on('end', () => {
@@ -185,9 +186,10 @@ Becomes:

```js
const source = ipfs.cat('QmHash')
+ const decoder = new TextDecoder()

for await (const chunk of source) {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
}

console.log('done')
@@ -201,9 +203,10 @@ e.g.

```js
const readable = ipfs.catReadableStream('QmHash')
+ const decoder = new TextDecoder()

readable.on('data', chunk => {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
})

readable.on('end', () => {
@@ -216,9 +219,10 @@ Becomes:
```js
const toStream = require('it-to-stream')
const readable = toStream.readable(ipfs.cat('QmHash'))
+ const decoder = new TextDecoder()

readable.on('data', chunk => {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
})

readable.on('end', () => {
@@ -238,11 +242,12 @@ e.g.

```js
const { pipeline, Writable } = require('stream')
+ const uint8ArrayConcat = require('uint8arrays/concat')
+ const decoder = new TextDecoder()

- let data = Buffer.alloc(0)
+ let data = new Uint8Array(0)
const concat = new Writable({
write (chunk, enc, cb) {
- data = Buffer.concat([data, chunk])
+ data = uint8ArrayConcat([data, chunk])
cb()
}
})
@@ -251,7 +256,7 @@ pipeline(
ipfs.catReadableStream('QmHash'),
concat,
err => {
- console.log(data.toString())
+ console.log(decoder.decode(data))
}
)
```
@@ -260,11 +265,12 @@ Becomes:

```js
const pipe = require('it-pipe')
+ const uint8ArrayConcat = require('uint8arrays/concat')
+ const decoder = new TextDecoder()

- let data = Buffer.alloc(0)
+ let data = new Uint8Array(0)
const concat = async source => {
for await (const chunk of source) {
- data = Buffer.concat([data, chunk])
+ data = uint8ArrayConcat([data, chunk])
}
}

@@ -273,15 +279,16 @@ const data = await pipe(
concat
)

- console.log(data.toString())
+ console.log(decoder.decode(data))
```

...which, by the way, could more succinctly be written as:

```js
const toBuffer = require('it-to-buffer')
+ const decoder = new TextDecoder()
const data = await toBuffer(ipfs.cat('QmHash'))
- console.log(data.toString())
+ console.log(decoder.decode(data))
```

**Impact 🍏**
@@ -292,11 +299,12 @@ e.g.

```js
const { pipeline, Writable } = require('stream')
+ const uint8ArrayConcat = require('uint8arrays/concat')
+ const decoder = new TextDecoder()

- let data = Buffer.alloc(0)
+ let data = new Uint8Array(0)
const concat = new Writable({
write (chunk, enc, cb) {
- data = Buffer.concat([data, chunk])
+ data = uint8ArrayConcat([data, chunk])
cb()
}
})
@@ -305,7 +313,7 @@ pipeline(
ipfs.catReadableStream('QmHash'),
concat,
err => {
- console.log(data.toString())
+ console.log(decoder.decode(data))
}
)
```
@@ -315,11 +323,12 @@ Becomes:
```js
const toStream = require('it-to-stream')
const { pipeline, Writable } = require('stream')
+ const uint8ArrayConcat = require('uint8arrays/concat')
+ const decoder = new TextDecoder()

- let data = Buffer.alloc(0)
+ let data = new Uint8Array(0)
const concat = new Writable({
write (chunk, enc, cb) {
- data = Buffer.concat([data, chunk])
+ data = uint8ArrayConcat([data, chunk])
cb()
}
})
@@ -328,7 +337,7 @@ pipeline(
toStream.readable(ipfs.cat('QmHash')),
concat,
err => {
- console.log(data.toString())
+ console.log(decoder.decode(data))
}
)
```
@@ -472,10 +481,12 @@ Use a [for/await](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of) loop to consume an async iterable.
e.g.

```js
+ const decoder = new TextDecoder()

pull(
ipfs.catPullStream('QmHash'),
pull.through(chunk => {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
}),
pull.onEnd(err => {
console.log('done')
@@ -486,8 +497,10 @@ pull(
Becomes:

```js
+ const decoder = new TextDecoder()

for await (const chunk of ipfs.cat('QmHash')) {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
}

console.log('done')
@@ -500,10 +513,12 @@ Convert the async iterable to a pull stream.
e.g.

```js
+ const decoder = new TextDecoder()

pull(
ipfs.catPullStream('QmHash'),
pull.through(chunk => {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
}),
pull.onEnd(err => {
console.log('done')
@@ -515,11 +530,12 @@ Becomes:

```js
const toPull = require('async-iterator-to-pull-stream')
+ const decoder = new TextDecoder()

pull(
toPull.source(ipfs.cat('QmHash')),
pull.through(chunk => {
- console.log(chunk.toString())
+ console.log(decoder.decode(chunk))
}),
pull.onEnd(err => {
console.log('done')
@@ -538,10 +554,12 @@ Use `it-pipe` and `it-concat` to concat data from an async iterable.
e.g.

```js
+ const uint8ArrayConcat = require('uint8arrays/concat')
+ const decoder = new TextDecoder()

pull(
ipfs.catPullStream('QmHash'),
pull.collect((err, chunks) => {
- console.log(Buffer.concat(chunks).toString())
+ console.log(decoder.decode(uint8ArrayConcat(chunks)))
})
)
```
@@ -551,13 +569,14 @@ Becomes:
```js
const pipe = require('it-pipe')
const concat = require('it-concat')
+ const decoder = new TextDecoder()

const data = await pipe(
ipfs.cat('QmHash'),
concat
)

- console.log(data.toString())
+ // it-concat resolves to a BufferList, so copy it into a contiguous
+ // Uint8Array with .slice() before decoding
+ console.log(decoder.decode(data.slice()))
```
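Since `it-concat` resolves to a BufferList rather than a plain Uint8Array, an alternative sketch using `it-all` plus the `uint8arrays` module (both appear elsewhere in this guide; the combination here is a suggestion, not part of the diff):

```js
const all = require('it-all')
const uint8ArrayConcat = require('uint8arrays/concat')
const decoder = new TextDecoder()

// collect every chunk, then join them into one contiguous Uint8Array
const chunks = await all(ipfs.cat('QmHash'))

console.log(decoder.decode(uint8ArrayConcat(chunks)))
```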

#### Transform Pull Streams
@@ -640,8 +659,8 @@ e.g.

```js
const results = await ipfs.addAll([
- { path: 'root/1.txt', content: Buffer.from('one') },
- { path: 'root/2.txt', content: Buffer.from('two') }
+ { path: 'root/1.txt', content: 'one' },
+ { path: 'root/2.txt', content: 'two' }
])

// Note that ALL files have already been added to IPFS
@@ -654,8 +673,8 @@ Becomes:

```js
const addSource = ipfs.addAll([
- { path: 'root/1.txt', content: Buffer.from('one') },
- { path: 'root/2.txt', content: Buffer.from('two') }
+ { path: 'root/1.txt', content: 'one' },
+ { path: 'root/2.txt', content: 'two' }
])

for await (const file of addSource) {
@@ -669,8 +688,8 @@ Alternatively you can buffer up the results using the `it-all` utility:
const all = require('it-all')

const results = await all(ipfs.addAll([
- { path: 'root/1.txt', content: Buffer.from('one') },
- { path: 'root/2.txt', content: Buffer.from('two') }
+ { path: 'root/1.txt', content: 'one' },
+ { path: 'root/2.txt', content: 'two' }
]))

results.forEach(file => {
@@ -682,8 +701,8 @@ Often you just want the last item (the root directory entry) when adding multiple files

```js
const results = await ipfs.addAll([
- { path: 'root/1.txt', content: Buffer.from('one') },
- { path: 'root/2.txt', content: Buffer.from('two') }
+ { path: 'root/1.txt', content: 'one' },
+ { path: 'root/2.txt', content: 'two' }
])

const lastResult = results[results.length - 1]
@@ -695,8 +714,8 @@ Becomes:

```js
const addSource = ipfs.addAll([
- { path: 'root/1.txt', content: Buffer.from('one') },
- { path: 'root/2.txt', content: Buffer.from('two') }
+ { path: 'root/1.txt', content: 'one' },
+ { path: 'root/2.txt', content: 'two' }
])

let lastResult
@@ -711,8 +730,8 @@ Alternatively you can use the `it-last` utility:

```js
const last = require('it-last')

const lastResult = await last(ipfs.addAll([
- { path: 'root/1.txt', content: Buffer.from('one') },
- { path: 'root/2.txt', content: Buffer.from('two') }
+ { path: 'root/1.txt', content: 'one' },
+ { path: 'root/2.txt', content: 'two' }
]))

console.log(lastResult)
9 changes: 5 additions & 4 deletions docs/core-api/BLOCK.md
@@ -92,11 +92,12 @@ An optional object which may have the following keys:

```JavaScript
// Defaults
- const buf = Buffer.from('a serialized object')
+ const buf = new TextEncoder().encode('a serialized object')
+ const decoder = new TextDecoder()

const block = await ipfs.block.put(buf)

- console.log(block.data.toString())
+ console.log(decoder.decode(block.data))
// Logs:
// a serialized object
console.log(block.cid.toString())
@@ -105,12 +106,12 @@

// With custom format and hashtype through CID
const CID = require('cids')
- const buf = Buffer.from('another serialized object')
+ const buf = new TextEncoder().encode('another serialized object')
// `multihash` is assumed to be a sha2-256 multihash of the content,
// e.g. produced with the multihashing-async module
const cid = new CID(1, 'dag-pb', multihash)

const block = await ipfs.block.put(buf, cid)

- console.log(block.data.toString())
+ console.log(decoder.decode(block.data))
// Logs:
// another serialized object
console.log(block.cid.toString())
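
// Hedged follow-up, not part of this diff: a block written with
// ipfs.block.put can be read back by CID with ipfs.block.get from the
// same core API
const retrieved = await ipfs.block.get(block.cid)
console.log(decoder.decode(retrieved.data))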