This repository has been archived by the owner on Feb 12, 2024. It is now read-only.

feat: switch to esm #3879

Merged · 29 commits · Sep 22, 2021
Commits (29)
40b4770
feat: switch to esm
achingbrain Sep 19, 2021
4ed46d2
chore: linting and types
achingbrain Sep 19, 2021
146cd2a
chore: build passing, fires dying down
achingbrain Sep 19, 2021
f39c962
chore: linting
achingbrain Sep 19, 2021
4f24dc9
chore: types, linting and building working
achingbrain Sep 19, 2021
36c47c2
Merge remote-tracking branch 'origin/master' into chore/switch-to-esm
achingbrain Sep 19, 2021
90a4c28
chore: unit tests passing
achingbrain Sep 20, 2021
fb6b129
chore: fix interface tests
achingbrain Sep 21, 2021
b0e4412
chore: fix up tests
achingbrain Sep 21, 2021
de09cfd
chore: fix more tests
achingbrain Sep 21, 2021
fbe51ea
chore: fix more tests
achingbrain Sep 21, 2021
fa73359
chore: fix up more tests
achingbrain Sep 21, 2021
851b8e3
chore: fix up more tests
achingbrain Sep 21, 2021
5332e5f
chore: fix up more tests
achingbrain Sep 21, 2021
d0104df
chore: fix up more tests
achingbrain Sep 21, 2021
1a4ddf0
chore: fix up more tests
achingbrain Sep 21, 2021
828823a
chore: move browser overrides to module
achingbrain Sep 21, 2021
5c4b2fb
chore: fix tests
achingbrain Sep 21, 2021
d25eaf5
chore: fix tests
achingbrain Sep 21, 2021
a76b0d9
chore: looks like import.meta.urls are not reliable
achingbrain Sep 21, 2021
cb4dfa8
chore: do not use n for bigints
achingbrain Sep 21, 2021
b66dbb2
chore: update aegir
achingbrain Sep 21, 2021
1c22efa
chore: fix deps
achingbrain Sep 21, 2021
219e778
chore: fix tests
achingbrain Sep 21, 2021
0e03383
chore: fix cli setup
achingbrain Sep 21, 2021
01a7c1f
chore: add ignores for remote pinning tests
achingbrain Sep 21, 2021
834e65e
chore: examples need updating
achingbrain Sep 22, 2021
e2163b1
chore: fix test
achingbrain Sep 22, 2021
71a3ded
chore: update docs
achingbrain Sep 22, 2021
263 changes: 137 additions & 126 deletions .github/workflows/test.yml


1 change: 1 addition & 0 deletions .gitignore
@@ -33,3 +33,4 @@ tsconfig-check.aegir.json

# Operating system files
.DS_Store
types
2 changes: 1 addition & 1 deletion README.md
@@ -85,7 +85,7 @@ $ npm install ipfs-core
Then start a node in your app:

```javascript
const IPFS = require('ipfs-core')
import * as IPFS from 'ipfs-core'

const ipfs = await IPFS.create()
const { cid } = await ipfs.add('Hello world')
2 changes: 1 addition & 1 deletion docs/DAEMON.md
@@ -23,7 +23,7 @@ The IPFS Daemon exposes the API defined in the [HTTP API spec](https://docs.ipfs
If you want a programmatic way to spawn an IPFS Daemon using JavaScript, check out the [ipfsd-ctl](https://github.com/ipfs/js-ipfsd-ctl) module.

```javascript
const { createFactory } = require('ipfsd-ctl')
import { createFactory } from 'ipfsd-ctl'
const factory = createFactory({
type: 'proc' // or 'js' to run in a separate process
})
4 changes: 2 additions & 2 deletions docs/FAQ.md
@@ -73,8 +73,8 @@ Yes, however, bear in mind that there isn't a 100% stable solution to use WebRTC
To add WebRTC support in an IPFS node instance, do:

```JavaScript
const wrtc = require('wrtc') // or require('electron-webrtc')()
const WebRTCStar = require('libp2p-webrtc-star')
import wrtc from 'wrtc' // or 'electron-webrtc'
import WebRTCStar from 'libp2p-webrtc-star'

const node = await IPFS.create({
repo: 'your-repo-path',
38 changes: 22 additions & 16 deletions docs/IPLD.md
@@ -49,38 +49,41 @@ If your application requires support for extra codecs, you can configure them as
1. Configure the [IPLD layer](https://github.com/ipfs/js-ipfs/blob/master/packages/ipfs/docs/MODULE.md#optionsipld) of your IPFS daemon to support the codec. This step is necessary so the node knows how to prepare data received over HTTP to be passed to IPLD for serialization:

```javascript
const ipfs = require('ipfs')
import { create } from 'ipfs'
import customBlockCodec from 'custom-blockcodec'
import customMultibase from 'custom-multibase'
import customMultihasher from 'custom-multihasher'

const node = await ipfs({
const node = await create({
ipld: {
// either specify BlockCodecs as part of the `codecs` list
codecs: [
require('custom-blockcodec')
customBlockCodec
],

// and/or supply a function to load them dynamically
loadCodec: async (codecNameOrCode) => {
return require(codecNameOrCode)
return import(codecNameOrCode)
},

// either specify Multibase codecs as part of the `bases` list
bases: [
require('custom-multibase')
customMultibase
],

// and/or supply a function to load them dynamically
loadBase: async (baseNameOrCode) => {
return require(baseNameOrCode)
return import(baseNameOrCode)
},

// either specify Multihash hashers as part of the `hashers` list
hashers: [
require('custom-multibase')
customMultihasher
],

// and/or supply a function to load them dynamically
loadHasher: async (hashNameOrCode) => {
return require(hashNameOrCode)
return import(hashNameOrCode)
}
}
})
@@ -89,39 +92,42 @@ If your application requires support for extra codecs, you can configure them as
2. Configure your IPFS HTTP API Client to support the codec. This is necessary so that the client can send the data to the IPFS node over HTTP:

```javascript
const ipfsHttpClient = require('ipfs-http-client')
import { create } from 'ipfs-http-client'
import customBlockCodec from 'custom-blockcodec'
import customMultibase from 'custom-multibase'
import customMultihasher from 'custom-multihasher'

const client = ipfsHttpClient({
const client = create({
url: 'http://127.0.0.1:5002',
ipld: {
// either specify BlockCodecs as part of the `codecs` list
codecs: [
require('custom-blockcodec')
customBlockCodec
],

// and/or supply a function to load them dynamically
loadCodec: async (codecNameOrCode) => {
return require(codecNameOrCode)
return import(codecNameOrCode)
},

// either specify Multibase codecs as part of the `bases` list
bases: [
require('custom-multibase')
customMultibase
],

// and/or supply a function to load them dynamically
loadBase: async (baseNameOrCode) => {
return require(baseNameOrCode)
return import(baseNameOrCode)
},

// either specify Multihash hashers as part of the `hashers` list
hashers: [
require('custom-multibase')
customMultihasher
],

// and/or supply a function to load them dynamically
loadHasher: async (hashNameOrCode) => {
return require(hashNameOrCode)
return import(hashNameOrCode)
}
}
})
50 changes: 25 additions & 25 deletions docs/MIGRATION-TO-ASYNC-AWAIT.md
@@ -100,7 +100,7 @@ const peerId = PeerId.createFromB58String(peerIdStr)
You can get hold of the `PeerId` class using npm or in a script tag:

```js
const PeerId = require('peer-id')
import PeerId from 'peer-id'
const peerId = PeerId.createFromB58String(peerIdStr)
```

@@ -128,7 +128,7 @@ You can get hold of the `PeerInfo` class using npm or in a script tag:

```js
const PeerInfo = require('peer-info')
const PeerId = require('peer-id')
import PeerId from 'peer-id'
const peerInfo = new PeerInfo(PeerId.createFromB58String(info.id))
info.addrs.forEach(addr => peerInfo.multiaddrs.add(addr))
```
@@ -217,7 +217,7 @@ readable.on('end', () => {
Becomes:

```js
const toStream = require('it-to-stream')
import toStream from 'it-to-stream'
const readable = toStream.readable(ipfs.cat('QmHash'))
const decoder = new TextDecoder()

@@ -285,7 +285,7 @@ console.log(decoder.decode(data))
...which, by the way, could more succinctly be written as:

```js
const toBuffer = require('it-to-buffer')
import toBuffer from 'it-to-buffer'
const decoder = new TextDecoder()
const data = await toBuffer(ipfs.cat('QmHash'))
console.log(decoder.decode(data))
@@ -321,7 +321,7 @@ pipeline(
Becomes:

```js
const toStream = require('it-to-stream')
import toStream from 'it-to-stream'
const { pipeline, Writable } = require('stream')
const decoder = new TextDecoder()

@@ -353,7 +353,7 @@ Use `it-pipe` and a [for/await](https://developer.mozilla.org/en-US/docs/Web/Jav
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -378,7 +378,7 @@ pipeline(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const pipe = require('it-pipe')

const items = []
@@ -400,9 +400,9 @@ console.log(items)
...which, by the way, could more succinctly be written as:

```js
const fs = require('fs')
import fs from 'fs'
const pipe = require('it-pipe')
const all = require('it-all')
import all from 'it-all'

const items = await pipe(
fs.createReadStream('/path/to/file'),
@@ -420,7 +420,7 @@ Convert the async iterable to a readable stream.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -445,8 +445,8 @@ pipeline(
Becomes:

```js
const toStream = require('it-to-stream')
const fs = require('fs')
import toStream from 'it-to-stream'
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -568,7 +568,7 @@ Becomes:

```js
const pipe = require('it-pipe')
const concat = require('it-concat')
import concat from 'it-concat'
const decoder = new TextDecoder()

const data = await pipe(
@@ -590,7 +590,7 @@ Use `it-pipe` and `it-all` to collect all items from an async iterable.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const toPull = require('stream-to-pull-stream')

pull(
@@ -605,7 +605,7 @@ pull(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'

const file = await ipfs.add(fs.createReadStream('/path/to/file'))

@@ -619,7 +619,7 @@ Convert the async iterable to a pull stream.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const toPull = require('stream-to-pull-stream')

pull(
@@ -634,7 +634,7 @@ pull(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const streamToPull = require('stream-to-pull-stream')
const itToPull = require('async-iterator-to-pull-stream')

@@ -685,7 +685,7 @@ for await (const file of addSource) {
Alternatively you can buffer up the results using the `it-all` utility:

```js
const all = require('it-all')
import all from 'it-all'

const results = await all(ipfs.addAll([
{ path: 'root/1.txt', content: 'one' },
@@ -744,7 +744,7 @@ Reading files.
e.g.

```js
const fs = require('fs')
import fs from 'fs'

const data = await ipfs.cat('/ipfs/QmHash')

@@ -759,8 +759,8 @@ Becomes:

```js
const pipe = require('it-pipe')
const toIterable = require('stream-to-it')
const fs = require('fs')
import toIterable from 'stream-to-it'
import fs from 'fs'

// Note that as chunks arrive they are written to the file and memory can be freed and re-used
await pipe(
@@ -774,8 +774,8 @@ console.log('done')
Alternatively you can buffer up the chunks using the `it-concat` utility (not recommended!):

```js
const fs = require('fs')
const concat = require('it-concat')
import fs from 'fs'
import concat from 'it-concat'

const data = await concat(ipfs.cat('/ipfs/QmHash'))

@@ -812,7 +812,7 @@ for await (const file of filesSource) {
Alternatively you can buffer up the directory listing using the `it-all` utility:

```js
const all = require('it-all')
import all from 'it-all'

const results = await all(ipfs.ls('/ipfs/QmHash'))

@@ -905,7 +905,7 @@ files.forEach(file => {
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const ipfs = IpfsHttpClient()

const file = await ipfs.add(fs.createReadStream('/path/to/file.txt'))