Investigate using native streams #20
👍 This just struck me with my StreamSaver lib, where I try to transfer a `ReadableStream` to a service worker. Native streams work OK, but the polyfilled `ReadableStream` is not transferable with `postMessage`. For me, native `ReadableStream`s are much more important than having a full stream specification (which you'd need just to get BYOB). The BYOB reader isn't available in any browser yet, so my guess is that hardly anyone is using it today. So I'm in favor of removing BYOB mode and using native streams instead, until browsers become more ready for it.

Side note: I'm also wondering if you can't extend `ReadableStream` somehow to add support for BYOB, but still have it be seen as a native stream, still be transferable, and still work with the native `Response` (`new Response(readable).blob()`):

```js
const klass = ReadableStream || NoopClass

window.ReadableStream = class Polyfill extends klass {
  constructor (...args) {
    // check if byob mode and make magic happen
    super(...args)
  }

  someMethod () {
    super.someMethod()
  }
}
```

Maybe this will be more troublesome than actually adding the missing methods to the native `ReadableStream.prototype` 🤔 but maybe it's a way to solve it?
Just tried this:

```js
class Foo extends ReadableStream {
  constructor (...args) {
    super(...args)
    console.log('Hi foo, i fix byob for ya')
  }
}

const rs = new Foo({
  start (ctrl) {
    ctrl.enqueue(new Uint8Array([97]))
    ctrl.close()
  }
})

new Response(rs).text().then(console.log)

const rs2 = new Foo()
postMessage(rs2, '*', [rs2])
```

Works. (The current polyfill version can't do this, since it's not a native stream.) Using a native stream seems like it would be better than having some "convert to/from native stream" helper (#1).
Thanks for your input! 😄
I know at least one person is using BYOB readers with this polyfill, because they found a bug with it: #3. I want to keep BYOB support, but make it an optional feature rather than a default one. I see two options:
I've had the same idea as well! But it won't be easy. If we can make the constructor work, the rest would be fairly straightforward.
It'd be incredible if we could get this to work though! It would mean we could restructure the polyfill to "progressively enhance" native streams:

```js
let PolyfillReadableStream;
if (supportsDefaultSource(ReadableStream)) {
  // native supports default source
  PolyfillReadableStream = class extends ReadableStream {};
} else {
  // no native support
  PolyfillReadableStream = PonyfillReadableStream;
}
if (!supportsByteSource(PolyfillReadableStream)) {
  // polyfill byte stream on top of default stream
  PolyfillReadableStream = class extends PolyfillReadableStream {
    /* ... */
  };
}
if (!PolyfillReadableStream.prototype.pipeTo) {
  // polyfill pipeTo and pipeThrough
}
if (!PolyfillReadableStream.prototype.tee) {
  // polyfill tee
}
// ...
```

We could put these in separate modules, and have users pick only the features they care about (like with core-js):

```js
import { ReadableStream } from 'web-streams-polyfill';
// import 'web-streams-polyfill/feature/readable-byte-stream';
import 'web-streams-polyfill/feature/pipe-to';
// import 'web-streams-polyfill/feature/tee';

const readable = new ReadableStream();
readable.pipeTo(writable);
```

So yeah: it would be cool if we can get it to work. 😛
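The `supportsDefaultSource` / `supportsByteSource` helpers in the sketch above are hypothetical. One way they could be implemented (a sketch, assuming detection by attempting construction) is:

```javascript
// Hedged sketch: feature-detect what the given ReadableStream class supports.
// These helper names mirror the hypothetical ones used in the thread.
function supportsDefaultSource (RS) {
  if (typeof RS !== 'function') return false;
  try {
    // A default (non-byte) underlying source must be accepted.
    new RS({ start () {}, pull () {}, cancel () {} });
    return true;
  } catch {
    return false;
  }
}

function supportsByteSource (RS) {
  if (typeof RS !== 'function') return false;
  try {
    // Byte streams are requested with type: 'bytes';
    // implementations without support throw on construction.
    new RS({ type: 'bytes' });
    return true;
  } catch {
    return false;
  }
}
```

Construction-based detection is coarse (it allocates a throwaway stream), but it avoids relying on any internal state.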
I'm currently working on splitting the `ReadableStream` class into multiple modules. After that, I have to figure out how these dependencies should be implemented so they can work with either native or polyfilled streams, without breaking any (or too many) tests.

I'm also worried that some of these dependencies on abstract operations cannot be implemented using only the public API of native streams. This would mean I'd have to approximate them, or leave them out entirely. That means more trade-offs about which test failures are acceptable and which aren't.

There are still a lot of questions, and I'm figuring them out as I go. I'm doing this in my spare time, so it's going to take a bit of time to get there! 😅
Oh, sounds a bit complex 😅
@jimmywarting |
@bt-88 I think he meant that you can't do:

```js
import { ReadableStream } from 'web-streams-polyfill';

let stream = new ReadableStream({/* ... */});
let response = new Response(stream); // expects a native ReadableStream, but we're passing in a polyfilled stream
```

or

```js
import { WritableStream } from 'web-streams-polyfill';

let stream = new Response(/* ... */).body;
await stream.pipeTo(new WritableStream({/* ... */})); // either pipeTo does not exist on the native ReadableStream, or the native pipeTo expects a native WritableStream but we're passing in a polyfilled stream
```

You can work around this by wrapping the native stream inside a polyfilled stream, or vice versa. I have a library that does exactly this: web-streams-adapter. However, you have to do this everywhere you send or receive a native stream. The goal of this issue is to make the polyfill build on native streams, so that these manual conversions are no longer necessary.
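The essence of such wrapping can be sketched using only the public reader API (this is an illustration, not web-streams-adapter's actual API; `wrapReadable` is a made-up name):

```javascript
// Hedged sketch: wrap a readable stream (native or polyfilled) into a new
// stream of a target class, by pumping chunks through a public reader.
function wrapReadable (source, TargetReadableStream) {
  const reader = source.getReader();
  return new TargetReadableStream({
    async pull (controller) {
      const { done, value } = await reader.read();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
    cancel (reason) {
      // Propagate cancellation back to the original stream.
      return reader.cancel(reason);
    }
  });
}
```

Usage: `const nativeCopy = wrapReadable(polyfilledStream, ReadableStream)` gives you a stream of the target class that relays the original's chunks, at the cost of an extra read/enqueue hop per chunk.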
yea, what he said ☝️
This comment has been minimized.
While this all seems very interesting, it is not very relevant to this issue, or even to the goals of this project. So I'm gonna have to ask you to move all further discussion of this "streams using custom Chromium build" experiment to your own fork. You can drop a link to that issue here, in case others want to follow your progress. 🙂
@MattiasBuelens Marking the comments as "off-topic" really makes no sense. This issue is titled "Investigate using native streams", which the comments directly focus on. The OP places no restrictions on how that must be achieved. The OP is akin to asking a golfing question without including restrictions, then adding restrictions in the comments. It should not matter how the requirement is achieved, since no restrictions on achieving it were stated in the OP.
The goal for this issue is: investigate how the polyfill can leverage the native streams implementation as provided by the browser, such that it can be made compatible with browser-provided APIs that return or accept such native streams. This is explained in my previous comment. Maybe the title of this issue alone doesn't explain that well enough, but there's more to an issue than just its title. I ask that you at least make an effort to read through the previous comments before commenting yourself, to better understand the problem we're trying to solve. You propose to compile a C++ streams implementation (like the one in Chrome) to run inside a web page, either using WebAssembly or through some communication channel with a native application. As explained several times before, this proposal does not achieve the goals set out for this issue. Hence, while it might be an interesting challenge, it is not relevant to the discussion of this issue. Therefore, I marked the comments as off-topic.
In every project, there are many implicit restrictions defined just by the environment in which the solution must operate. I expect some common sense from contributors, so that I don't have to state these in every single issue. For example, this is a polyfill that can be used on any web page running in any supported browser, as detailed in the README. That immediately imposes a bunch of restrictions: it must be written in a programming language that is supported by these platforms (i.e. JavaScript or WebAssembly, or something that compiles to these), and it must obey the rules set out by these platforms (i.e. only use browser-provided APIs, do not require additional extensions, ...). Switching to WebAssembly would mean we'd have to drop support for older browsers, which would be unfortunate but possibly defendable if the performance gains in newer browsers are significant. However, requiring users to install a separate native application would not work at all for usage in a web page. The browser's job is to protect the user from (potentially malicious) websites, and requiring untrusted native code just so a random website can use the streams API is very hard to sell to users. Thus, it's not an option for this polyfill.
Just checking in to see how things have progressed. Have you started somewhere, or is it still in the stage of "splitting the ReadableStream class into multiple modules"?

```js
if (!ReadableStream.prototype[Symbol.asyncIterator]) {
  // polyfill iterator
}
```
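A feature module behind such a check could be sketched like this (an illustration built only on the public `getReader()` API; the real spec's async iterator also supports a `preventCancel` option, which is omitted here):

```javascript
// Hedged sketch: add async iteration on top of the public reader API
// when the native class lacks it.
if (!ReadableStream.prototype[Symbol.asyncIterator]) {
  ReadableStream.prototype[Symbol.asyncIterator] = async function * () {
    const reader = this.getReader();
    try {
      while (true) {
        const { done, value } = await reader.read();
        if (done) return;
        yield value;
      }
    } finally {
      // Release the lock so the stream can be read again afterwards.
      reader.releaseLock();
    }
  };
}
```

With this patch applied (or with native support), `for await (const chunk of stream)` works on any readable stream.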
Sorry, not a whole lot of progress as of late. 😞 I've been busy working on the streams standard itself. Now that Chrome has shipped readable byte streams, the delta between Chrome's streams implementation and this polyfill has become very small. So I can definitely see the appeal of having a minimal polyfill to add the few missing bits.
note from fetch-blob: we could extend this to basically pull the streams from `node:stream/web`.
I was thinking this conditional loading of Node streams could be of any help to you:

```js
let nodeStuff = {}
try {
  // Use new Function so bundlers don't try to resolve the require call
  const load = new Function('x', 'return require(x)')
  const process = load('node:process')
  const { emitWarning } = process
  try {
    // Temporarily silence the experimental warning from node:stream/web
    process.emitWarning = () => {}
    nodeStuff = load('node:stream/web')
    process.emitWarning = emitWarning
  } catch (err) {
    process.emitWarning = emitWarning
  }
} catch (err) {}
module.exports = nodeStuff
```
What is …? Either way, if we go this route (which we might, see #108 (comment)), I'd probably prefer something like:

```js
try {
  module.exports = require("node:stream/web");
} catch {
  module.exports = require("web-streams-polyfill/es2018");
}
```

or for ESM:

```js
let streams;
try {
  streams = await import("node:stream/web");
} catch {
  streams = await import("web-streams-polyfill/es2018");
}
const { ReadableStream, WritableStream, TransformStream } = streams;
export { ReadableStream, WritableStream, TransformStream };
```

But even then, it's not that simple. The polyfill's implementation may be ahead of Node's implementation, so in some cases we may still want to either monkey-patch Node's implementation or fall back to our own implementation anyway. 😕
Also, I adapted it just for web-streams-polyfill, in case you want to patch/extend Node's implementation, seeing that we now have some utilities that actually use Node's streams. I was also thinking that maybe some webpack or Rollup setup would try to polyfill it. Top-level await would require Node v14.8, I think? Quite a lot of people are using fetch, which has an indirect dependency on fetch-blob -> node:stream/web, and it caused frustration among many developers who did not even use WHATWG streams in any way.
Whoops, I'm blind. Thanks. 😅

I see, so it's trying to hide the `require` call from bundlers like webpack and Rollup.

Indeed, so we'd have to wait a bit longer before we can safely drop Node 12 and below. 😢

I understand the frustration, but I don't feel like it's the polyfill's responsibility to hide this warning. The warning is there for a reason, after all... 😕
Correct. When Node 12 goes EOL, node-fetch is going to drop it in v4 and only support something like Node 14.18, I think (where AbortController got introduced).
Somewhere back in my head I was perhaps also hoping that your library would somehow also extend/patch Node's implementation.
The original intention was to extend/patch native streams in the browser. But yes, if at all feasible, we may want to do the same for Node. 🙂

Now that I think about it, it might be good to use subpath imports to import the native implementation (if any). That way, we can avoid importing from `node:stream/web` in browser builds:

```json
{
  "imports": {
    "#native-streams": {
      "node": "./node-stream.js",
      "default": "./global-stream.js"
    }
  }
}
```

```js
// node-stream.js
export let streams;
try {
  streams = await import("node:stream/web");
} catch {
  streams = undefined;
}
```

```js
// global-stream.js
export let streams = globalThis;
```

We could then build the polyfill like this:

```js
// polyfill.js
let { streams } = await import("#native-streams");
if (streams) {
  streams = patchStreamsImplementation(streams);
} else {
  streams = await import("./ponyfill.js");
}
export const ReadableStream = streams.ReadableStream;
export const WritableStream = streams.WritableStream;
export const TransformStream = streams.TransformStream;
```
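The `patchStreamsImplementation` function above is hypothetical; a sketch of what it could do is fill in missing methods on the native classes before re-exporting them. Here `pipeThrough` serves as the example, since it can be expressed in terms of an existing `pipeTo`:

```javascript
// Hedged sketch of the hypothetical patchStreamsImplementation():
// patch missing pieces onto a (native) implementation and return it.
function patchStreamsImplementation (streams) {
  const { ReadableStream } = streams;
  if (ReadableStream && !ReadableStream.prototype.pipeThrough) {
    ReadableStream.prototype.pipeThrough = function (transform, options) {
      // Errors surface through the returned readable side,
      // so the pipe promise itself can be ignored here.
      this.pipeTo(transform.writable, options).catch(() => {});
      return transform.readable;
    };
  }
  return streams;
}
```

A real version would patch every missing method the same way (`tee`, async iteration, ...) and leave already-supported methods untouched.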
Inspired by this tweet from @surma:
I've thought about this previously. Back then, I decided that it was not feasible because readable byte streams are not supported by any browser. A full polyfill would always need to provide its own `ReadableStream` implementation that supports byte streams. By extension, it would also need to provide its own implementations for `WritableStream` (that works with its `ReadableStream.pipeTo()`) and `TransformStream` (that uses its readable and writable streams).

Looking at this again, I think we can do better. If you don't need readable byte streams, then the native `ReadableStream` should be good enough as a starting point for the polyfill. From there, the polyfill could add any missing methods (`pipeTo`, `pipeThrough`, `getIterator`, ...) and implement them using the native reader from `getReader()`.

This approach can never be fully spec-compliant though, since the spec explicitly forbids these methods from using the public API. For example, `pipeTo()` must use `AcquireReadableStreamDefaultReader()` instead of `ReadableStream.getReader()`, so it cannot be affected by user-land JavaScript code making modifications to `ReadableStream.prototype`. I don't think that has to be a problem though: we are already a user-land polyfill written in JavaScript that modifies those prototypes; it would be silly for the polyfill to try and guard itself against other JavaScript code making similar modifications.

Steps in the spec that require inspecting the internal state of the stream or calling into internal methods will need to be replaced by something that emulates the behavior using solely the public API.

Often, this will be easy: e.g. `ReadableStreamDefaultControllerEnqueue()` becomes `controller.enqueue()`.

Sometimes, we have to be a bit more lenient. `ReadableStreamPipeTo()`'s error propagation says:

We can check if the source becomes errored by waiting for the `source.closed` promise to become rejected. However, we can't synchronously check if it is already errored.

In rare cases, this may turn out to be impossible. `TransformStreamDefaultSinkWriteAlgorithm` specifies:

Usually, the writable stream starts erroring because the writable controller has errored, which the transform stream's implementation controls. However, it could also be triggered by `WritableStream.abort()`, which is out of the control of the transform stream implementation. In this case, the controller is only made aware of it after the writable stream finishes erroring (state becomes `"errored"`) through its `abort()` algorithm, which is already too late.

Of course, we can't just flat-out remove byte stream support from the polyfill, just for the sake of using native streams more. The default should still be a full polyfill, but we might want to give users the option to select which features they want polyfilled (as @surma suggested in another tweet).
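The "lenient" error-propagation check can be emulated like this (a sketch; `watchForErrors` is a made-up helper, and unlike the spec's synchronous state inspection it can only observe the error asynchronously):

```javascript
// Hedged sketch: propagate a source error to the destination using only the
// public API. The spec can synchronously inspect internal stream state; here
// we can only wait for the reader's closed promise to reject.
function watchForErrors (reader, writer) {
  reader.closed.catch(reason => {
    // Abort the destination with the same reason, like pipeTo's
    // error propagation would. Ignore failures of the abort itself.
    writer.abort(reason).catch(() => {});
  });
}
```

This is exactly the compromise described above: an already-errored stream is only noticed one microtask later, which some web-platform tests can observe.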
Anyway, I still want to give this a try. It might fail catastrophically, but then at least I'll have a better answer on why we use so little from the native streams implementation. 😅