This repository has been archived by the owner on Feb 12, 2024. It is now read-only.

Merge remote-tracking branch 'origin' into chore/add-ipfs-http-response-package
vasco-santos committed Sep 23, 2021
2 parents 9d7dfdf + 5de1b13 commit 8094dfb
Showing 1,031 changed files with 9,425 additions and 8,016 deletions.
4 changes: 1 addition & 3 deletions .github/workflows/release.yml
@@ -21,11 +21,9 @@ jobs:
git config --global user.email "[email protected]"
git config --global user.name "Github Actions"
echo "//registry.npmjs.org/:_authToken=$NPM_TOKEN" > .npmrc
echo "$DOCKER_TOKEN" | docker login -u "$DOCKER_USERNAME" --password-stdin
npm install
npm run build
npm run release:rc
echo "$DOCKER_TOKEN" | docker login -u "$DOCKER_USERNAME" --password-stdin
npm run docker:rc
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
263 changes: 137 additions & 126 deletions .github/workflows/test.yml

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions .gitignore
@@ -33,3 +33,4 @@ tsconfig-check.aegir.json

# Operating system files
.DS_Store
types
2 changes: 1 addition & 1 deletion README.md
@@ -85,7 +85,7 @@ $ npm install ipfs-core
Then start a node in your app:

```javascript
const IPFS = require('ipfs-core')
import * as IPFS from 'ipfs-core'

const ipfs = await IPFS.create()
const { cid } = await ipfs.add('Hello world')
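The README change above replaces `require('ipfs-core')` with an ESM import. A project that still compiles as CommonJS can load an ESM-only dependency with dynamic `import()`. The sketch below shows that pattern; `node:os` stands in for `ipfs-core` so the snippet is self-contained and runnable.

```javascript
// Dynamic-import pattern for consuming an ESM-only package from
// CommonJS code. import() works in both module systems and resolves
// to the module namespace object.
async function loadEsm (name) {
  return import(name)
}

async function main () {
  const os = await loadEsm('node:os')
  // The namespace exposes the module's exports as properties
  return typeof os.platform()
}

main().then((t) => console.log(t)) // prints "string"
```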
2 changes: 1 addition & 1 deletion docs/CONFIG.md
@@ -200,7 +200,7 @@ Options for Multicast DNS peer discovery:

### `webRTCStar`

WebRTCStar is a discovery mechanism prvided by a signalling-star that allows peer-to-peer communications in the browser.
WebRTCStar is a discovery mechanism provided by a signalling-star that allows peer-to-peer communications in the browser.

Options for webRTCstar peer discovery:

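For context on the option documented above, here is a hedged sketch of the shape the discovery section of a js-ipfs config takes. The key names (`Discovery.webRTCStar.Enabled`) follow the js-ipfs default config, but verify them against CONFIG.md before relying on them.

```javascript
// Hedged sketch: the Discovery section of a js-ipfs config object,
// enabling webRTCStar peer discovery. Key names are an assumption
// drawn from the js-ipfs default config.
const config = {
  Discovery: {
    MDNS: { Enabled: false },
    webRTCStar: { Enabled: true }
  }
}

// This object would be passed as `config` to IPFS.create({ config })
console.log(config.Discovery.webRTCStar.Enabled) // prints true
```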
2 changes: 1 addition & 1 deletion docs/DAEMON.md
@@ -23,7 +23,7 @@ The IPFS Daemon exposes the API defined in the [HTTP API spec](https://docs.ipfs
If you want a programmatic way to spawn a IPFS Daemon using JavaScript, check out the [ipfsd-ctl](https://github.com/ipfs/js-ipfsd-ctl) module.

```javascript
const { createFactory } = require('ipfsd-ctl')
import { createFactory } from 'ipfsd-ctl'
const factory = createFactory({
type: 'proc' // or 'js' to run in a separate process
})
4 changes: 2 additions & 2 deletions docs/FAQ.md
@@ -73,8 +73,8 @@ Yes, however, bear in mind that there isn't a 100% stable solution to use WebRTC
To add WebRTC support in a IPFS node instance, do:

```JavaScript
const wrtc = require('wrtc') // or require('electron-webrtc')()
const WebRTCStar = require('libp2p-webrtc-star')
import wrtc from 'wrtc' // or 'electron-webrtc'
import WebRTCStar from 'libp2p-webrtc-star'

const node = await IPFS.create({
repo: 'your-repo-path',
38 changes: 22 additions & 16 deletions docs/IPLD.md
@@ -49,38 +49,41 @@ If your application requires support for extra codecs, you can configure them as
1. Configure the [IPLD layer](https://github.com/ipfs/js-ipfs/blob/master/packages/ipfs/docs/MODULE.md#optionsipld) of your IPFS daemon to support the codec. This step is necessary so the node knows how to prepare data received over HTTP to be passed to IPLD for serialization:

```javascript
const ipfs = require('ipfs')
import { create } from 'ipfs'
import customBlockCodec from 'custom-blockcodec'
import customMultibase from 'custom-multibase'
import customMultihasher from 'custom-multihasher'

const node = await ipfs({
const node = await create({
ipld: {
// either specify BlockCodecs as part of the `codecs` list
codecs: [
require('custom-blockcodec')
customBlockCodec
],

// and/or supply a function to load them dynamically
loadCodec: async (codecNameOrCode) => {
return require(codecNameOrCode)
return import(codecNameOrCode)
},

// either specify Multibase codecs as part of the `bases` list
bases: [
require('custom-multibase')
customMultibase
],

// and/or supply a function to load them dynamically
loadBase: async (baseNameOrCode) => {
return require(baseNameOrCode)
return import(baseNameOrCode)
},

// either specify Multihash hashers as part of the `hashers` list
hashers: [
require('custom-multibase')
customMultihasher
],

// and/or supply a function to load them dynamically
loadHasher: async (hashNameOrCode) => {
return require(hashNameOrCode)
return import(hashNameOrCode)
}
}
})
@@ -89,39 +92,42 @@ If your application requires support for extra codecs, you can configure them as
2. Configure your IPFS HTTP API Client to support the codec. This is necessary so that the client can send the data to the IPFS node over HTTP:

```javascript
const ipfsHttpClient = require('ipfs-http-client')
import { create } from 'ipfs-http-client'
import customBlockCodec from 'custom-blockcodec'
import customMultibase from 'custom-multibase'
import customMultihasher from 'custom-multihasher'
const client = ipfsHttpClient({
const client = create({
url: 'http://127.0.0.1:5002',
ipld: {
// either specify BlockCodecs as part of the `codecs` list
codecs: [
require('custom-blockcodec')
customBlockCodec
],
// and/or supply a function to load them dynamically
loadCodec: async (codecNameOrCode) => {
return require(codecNameOrCode)
return import(codecNameOrCode)
},
// either specify Multibase codecs as part of the `bases` list
bases: [
require('custom-multibase')
customMultibase
],
// and/or supply a function to load them dynamically
loadBase: async (baseNameOrCode) => {
return require(baseNameOrCode)
return import(baseNameOrCode)
},
// either specify Multihash hashers as part of the `hashers` list
hashers: [
require('custom-multibase')
customMultihasher
],
// and/or supply a function to load them dynamically
loadHasher: async (hashNameOrCode) => {
return require(hashNameOrCode)
return import(hashNameOrCode)
}
}
})
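The `codecs` entries configured above are BlockCodec objects. As a self-contained illustration of the shape such an object takes (the `{ name, code, encode, decode }` interface comes from the `multiformats` library; the codec below is a toy, and `0x0200` is the multicodec table entry registered for `json`):

```javascript
// Toy BlockCodec-shaped object: encode must return bytes (Uint8Array),
// decode must reverse it. A real custom codec would register its own
// multicodec code rather than reuse 'json'.
const textEncoder = new TextEncoder()
const textDecoder = new TextDecoder()

const jsonCodec = {
  name: 'json',
  code: 0x0200, // multicodec code for 'json'
  encode: (obj) => textEncoder.encode(JSON.stringify(obj)),
  decode: (bytes) => JSON.parse(textDecoder.decode(bytes))
}

const bytes = jsonCodec.encode({ hello: 'world' })
console.log(jsonCodec.decode(bytes).hello) // prints "world"
```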
50 changes: 25 additions & 25 deletions docs/MIGRATION-TO-ASYNC-AWAIT.md
@@ -100,7 +100,7 @@ const peerId = PeerId.createFromB58String(peerIdStr)
You can get hold of the `PeerId` class using npm or in a script tag:

```js
const PeerId = require('peer-id')
import PeerId from 'peer-id'
const peerId = PeerId.createFromB58String(peerIdStr)
```

Expand Down Expand Up @@ -128,7 +128,7 @@ You can get hold of the `PeerInfo` class using npm or in a script tag:

```js
const PeerInfo = require('peer-info')
const PeerId = require('peer-id')
import PeerId from 'peer-id'
const peerInfo = new PeerInfo(PeerId.createFromB58String(info.id))
info.addrs.forEach(addr => peerInfo.multiaddrs.add(addr))
```
@@ -217,7 +217,7 @@ readable.on('end', () => {
Becomes:

```js
const toStream = require('it-to-stream')
import toStream from 'it-to-stream'
const readable = toStream.readable(ipfs.cat('QmHash'))
const decoder = new TextDecoder()

@@ -285,7 +285,7 @@ console.log(decoder.decode(data))
...which, by the way, could more succinctly be written as:

```js
const toBuffer = require('it-to-buffer')
import toBuffer from 'it-to-buffer'
const decoder = new TextDecoder()
const data = await toBuffer(ipfs.cat('QmHash'))
console.log(decoder.decode(data))
@@ -321,7 +321,7 @@ pipeline(
Becomes:

```js
const toStream = require('it-to-stream')
import toStream from 'it-to-stream'
const { pipeline, Writable } = require('stream')
const decoder = new TextDecoder()

@@ -353,7 +353,7 @@ Use `it-pipe` and a [for/await](https://developer.mozilla.org/en-US/docs/Web/Jav
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -378,7 +378,7 @@ pipeline(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const pipe = require('it-pipe')

const items = []
@@ -400,9 +400,9 @@ console.log(items)
...which, by the way, could more succinctly be written as:

```js
const fs = require('fs')
import fs from 'fs'
const pipe = require('it-pipe')
const all = require('it-all')
import all from 'it-all'

const items = await pipe(
fs.createReadStream('/path/to/file'),
@@ -420,7 +420,7 @@ Convert the async iterable to a readable stream.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -445,8 +445,8 @@ pipeline(
Becomes:

```js
const toStream = require('it-to-stream')
const fs = require('fs')
import toStream from 'it-to-stream'
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -568,7 +568,7 @@ Becomes:

```js
const pipe = require('it-pipe')
const concat = require('it-concat')
import concat from 'it-concat'
const decoder = new TextDecoder()

const data = await pipe(
@@ -590,7 +590,7 @@ Use `it-pipe` and `it-all` to collect all items from an async iterable.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const toPull = require('stream-to-pull-stream')

pull(
@@ -605,7 +605,7 @@ pull(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'

const file = await ipfs.add(fs.createReadStream('/path/to/file'))

@@ -619,7 +619,7 @@ Convert the async iterable to a pull stream.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const toPull = require('stream-to-pull-stream')

pull(
@@ -634,7 +634,7 @@ pull(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const streamToPull = require('stream-to-pull-stream')
const itToPull = require('async-iterator-to-pull-stream')

@@ -685,7 +685,7 @@ for await (const file of addSource) {
Alternatively you can buffer up the results using the `it-all` utility:

```js
const all = require('it-all')
import all from 'it-all'

const results = await all(ipfs.addAll([
{ path: 'root/1.txt', content: 'one' },
@@ -744,7 +744,7 @@ Reading files.
e.g.

```js
const fs = require('fs')
import fs from 'fs'

const data = await ipfs.cat('/ipfs/QmHash')

@@ -759,8 +759,8 @@ Becomes:

```js
const pipe = require('it-pipe')
const toIterable = require('stream-to-it')
const fs = require('fs')
import toIterable from 'stream-to-it'
import fs from 'fs'

// Note that as chunks arrive they are written to the file and memory can be freed and re-used
await pipe(
@@ -774,8 +774,8 @@ console.log('done')
Alternatively you can buffer up the chunks using the `it-concat` utility (not recommended!):

```js
const fs = require('fs')
const concat = require('it-concat')
import fs from 'fs'
import concat from 'it-concat'

const data = await concat(ipfs.cat('/ipfs/QmHash'))

@@ -812,7 +812,7 @@ for await (const file of filesSource) {
Alternatively you can buffer up the directory listing using the `it-all` utility:

```js
const all = require('it-all')
import all from 'it-all'

const results = await all(ipfs.ls('/ipfs/QmHash'))

@@ -905,7 +905,7 @@ files.forEach(file => {
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const ipfs = IpfsHttpClient()

const file = await ipfs.add(fs.createReadStream('/path/to/file.txt'))
