
Polyfilling modules #2

littledan opened this issue Aug 9, 2018 · 37 comments

@littledan
Member

The presentation in this repository articulates integrity as an important goal of built-in modules. When we discussed this topic in committee, additional, opposing goals were raised around hackability: the ability to do polyfills, bug fixes, and virtualization.

This post focuses on polyfilling new features to look like built-in ones, but the techniques may work as well for the other cases mentioned above.

Feature requests to support polyfilling

When new APIs come into JavaScript and its host environments, they can often be filled in by polyfills: JavaScript code which achieves the same goal. Some cases that polyfills cover, in terms of the evolution of specifications:

  1. A class/library is entirely missing
  2. An individual function/method is missing within a class/library
  3. A function or constructor changes behavior, for example by adding an overload

Some ways polyfills are invoked today:
a. A separate script tag in a page, as recommended by polyfill.io
b. Importing a module which mutates the global object and/or objects within that, e.g., babel-polyfill
c. Importing a module which exports an object that has the same behavior as a built-in, also known as a ponyfill

Option c. above doesn't require any special support/integration from the platform--it's just a module that matches the built-in interface--so the rest of this document focuses on cases a. and b.

How polyfilling could fit into package-name-maps

Native modules are proposed to enable polyfilling on the web with a different mechanism. The Layered APIs proposal suggests a syntax like "std:foo|https://bar.com/baz.js" for a module specifier with a fallback. The package name maps proposal proposes a refinement of this syntax, where the fallback is listed in the map.

Both of these mechanisms only permit case 1. above, not 2. or 3. If the standard library is supported at all, then the fallback will not be used, so it's not possible to add additional methods or cases within methods.
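
For instance, with the fallback-specifier idea an import site would look something like this (a sketch using the Layered APIs strawman syntax; "std:foo" and the export name are placeholders):

// The fallback URL is only consulted when "std:foo" is absent altogether,
// so a partially implemented or buggy built-in still wins.
import { something } from 'std:foo|https://bar.com/baz.js';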

If the package-name-map can remap uses of the standard library, then polyfills could be supported for cases 2. and 3. For example, suppose the syntax "@std/<lib>" is used, for various values of <lib>, and that it is remappable via the package-name-map, with the default mapping being the built-in library. Then a polyfill could wrap the built-in version of a standard library, using the package-name-map feature that lets different paths have different mappings.
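
To make that concrete, here is a rough sketch of what such a wrapping polyfill could look like, assuming a hypothetical "std:temporal" built-in and map syntax modeled on the package-name-maps/import-maps proposals (none of the names or keys below are settled):

// polyfill-temporal.js -- a hypothetical wrapper module.
// The map points "std:temporal" at this file for application code, while a
// scoped entry lets this file itself still reach the real built-in, e.g.:
//
//   {
//     "imports": { "std:temporal": "/polyfill-temporal.js" },
//     "scopes": {
//       "/polyfill-temporal.js": { "std:temporal": "std:temporal" }
//     }
//   }
//
import * as builtin from 'std:temporal'; // resolves to the real built-in via the scope

// Re-export whatever the built-in already provides (case 1. is covered by the
// fallback mechanisms above; this wrapper targets cases 2. and 3.)...
export * from 'std:temporal';

// ...and patch in a missing or broken piece; local exports take precedence
// over the star re-export.
export const Instant = 'Instant' in builtin
  ? builtin.Instant
  : class Instant { /* polyfilled implementation */ };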

Polyfilling across embedding environments

Option b. above, in conjunction with modern tooling, works in practice across Node.js and the Web Platform for polyfilling current JS and Web APIs. Should we have a cross-environment way to invoke polyfills for built-in modules?

@obedm503

obedm503 commented Sep 26, 2018

<script
  type="polyfill"
  module="temporal"
  src="https://example.com/temporal.js"
></script>

if module="temporal" is already supported by the browser, it decides to not load it, making it transparent to the developer. There could be another attribute to let the developer force the browser to load the module in cases where the module is supported but it's missing a method or something.

used as

import { Instant } from temporal;

Notice the lack of quotes on the import. Since this is supposed to be the standard library, URLs don't make sense here; the lack of URLs will set these imports apart from a random module.

There's also the option of developing some API to allow developers to register standard modules. This does not load the module in place. It just adds it as an optional dependency to the dependency graph before the browser starts loading other modules. This means that if no other module uses this native module, the browser will never load it.

// main.js or index.html head script
polyfills.register('temporal', {
  url:'https://example.com/temporal.js',
  force: true, // or falsy
});

or doing it from the perspective of the polyfill

// temporal-polyfill.js
var temporal = { Instant: class Instant {} }
polyfills.fill('temporal', temporal, { force: true })

but this doesn't have the benefit of not loading the script like the other suggestions do.

But above all, I suggest staying away from magic strings like std:temporal|https://example.com/temporal.js; they just make everything confusing.

@hax
Member

hax commented Sep 28, 2018

Note, there are also polyfills which do not introduce new features but just fix bugs, for example: https://github.com/fanmingfei/array-reverse-ios12

There could be another attribute to let the developer force the browser to load the module in cases where the module is supported but it's missing a method or something.

How?

@obedm503

a force boolean attribute that loads and uses the given implementation even if the browser already supports the module

@benjamn
Member

benjamn commented Sep 28, 2018

Given that browser bugs are generally not intentional, the force attribute seems reasonable, since the browser may believe it supports a module, according to some previous specification of what the module should contain and how it should behave, but there may be bugs in its implementation, or the specification may have changed since then. Identifying and fixing those bugs/gaps is ultimately the concern of the application developer.

Related question: should standard library modules have versions, so that the browser can load polyfills conditionally? Perhaps a sophisticated polyfill could fill in the gaps between the latest version known to the browser and the requested version, using HTTP request headers or query parameters? Loading an entire reimplementation of the module just because it's missing a few new features seems wasteful (though that's often what happens today, admittedly).

@ljharb
Member

ljharb commented Sep 28, 2018

Some bugs require full replacement to patch; the best practice today imo will remain so in the future: the polyfill code should be the source of truth for determining if it needs to make any changes.

Maybe there could be a way to load some kind of predicate module that can dynamically determine if it needs to load?
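
A minimal sketch of what such a predicate module might look like, assuming a hypothetical "std:temporal" built-in (the checks are illustrative only):

// needs-temporal-polyfill.js -- hypothetical predicate module
export default async function needsPolyfill() {
  try {
    const temporal = await import('std:temporal');
    // The polyfill's own checks remain the source of truth.
    return !('Instant' in temporal);
  } catch (error) {
    // The module is missing entirely, so the polyfill is definitely needed.
    return true;
  }
}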

@benjamn
Member

benjamn commented Sep 28, 2018

It seems preferable to perform at most one HTTP request, if possible. Including the requested version and the browser's current version of the module in that request should be adequate, especially since the polyfill server can always just serve a complete reimplementation if it likes (or one that dynamically detects what needs patching, but includes all the necessary code).

In the not-so-distant future, when virtually all browsers support virtually everything that a certain standard module is supposed to support, but a few new features have been recently added to the spec, I sincerely hope the polyfill mechanism is capable of providing just those new features. Otherwise our polyfills are just going to keep getting bigger as the standard library grows, rather than eventually shrinking and disappearing.

@obedm503

That would be the best scenario. But it's more difficult to patch a module than to patch the global. I guess it's just a different kind of problem. This might be possible with the third suggestion, of an API that lets the user supply a module implementation programmatically. This might allow the one HTTP request to provide the patch based on the user agent. Slowly but surely that server request would eventually become a no-op.

@littledan
Member Author

Related discussion: drufball/layered-apis#34

@littledan
Member Author

@domenic's import-maps proposal has been evolving nicely into a solution for all of the issues we discussed in this thread. It seems like a technically good solution to me.

From here, I'm wondering: Are we comfortable with the system for polyfilling being determined per-embedder, or should we have a system that works across embedders? Can import-maps be this system that applies across embedders, or would we need some changes, or a different system? Are there different needs in different embedders that motivate separate systems for controlling module polyfills?

I'm content with import-maps being developed as a WICG/WHATWG specification, and in my opinion, it'd be nice if this polyfilling solution could be applied across the JS ecosystem as a whole, not just on the web.

Related discussion on import-maps and Node: nodejs/open-standards#13

@ljharb
Member

ljharb commented Dec 17, 2018

If the standard library is part of the language, then i think the polyfilling method needs to be as well.

@littledan
Member Author

If Node and the Web have different needs here, I think it should be fine to develop the polyfilling solution in hosts, rather than within the language. Maybe we could include a requirement in the specification that the host must include some such mechanism. I don't understand exactly what is different about their needs, though.

@erights

erights commented Dec 18, 2018

I disagree. Shimming an initial primordial state is not a host specific concept, and for many existing shims, is already done in a non-host-specific manner. We should uphold this.

When language-wide issues are left to hosts, we have failed. We failed to codify the host-independent behavior that all platforms must implement and all clients may count on. If the consequence of leaving it to hosts to define is that they gratuitously differ from each other in pointless ways, then we have failed even worse.

@littledan
Member Author

Well, I'm not sure exactly how we should design this virtualization mechanism. import-maps explains this virtualization in terms of where the replacement code comes from. JavaScript itself doesn't have the same concept of resources. How would you design the virtualization mechanism?

Let's be pragmatic here. With the way the JS spec is currently written, embedders would be justified in exposing built-in modules only at the embedder level, with no particular virtualization mechanism. I think we should either propose a practical mechanism here, or leave it to embedders to do so.

@erights

erights commented Dec 18, 2018

we either propose a practical mechanism

@littledan
Member Author

@erights Be my guest!

@thysultan

thysultan commented Dec 19, 2018

@littledan This, in conjunction with inline modules (reference), gives an idea of how this would work.

Requiring a handshake with the host environment is certain to introduce a very sticky situation for library authors importing standard modules/polyfills.

Just as I can use the WeakMap constructor today as if it were native/global, like Object, within a polyfilled runtime, I should be able to import standard libraries without any convoluted handshake with the host environment, because as a library I'm implicitly not in control of that handshake (the role is mostly delegated to the consumer). That is a good model going forward for ESM standard modules.

For example, in the linked reference I mentioned a fictional module "Window" that could be polyfilled, which one could extend to demonstrate how current-day feature detection is done and how it could be preserved going forward in the face of this proposal.

module Window extends Window {
    // could be any keyword, super, this, implements etc...
    if (!implements(Window)) {
        // implement polyfill when the extending module does not exist.
    }
}

// or go full in on syntax
module Window implements Window {
// only if Window is not already implemented
// this might have the benefit that the vm could better optimize not loading/running this if Window is indeed implemented.
}

// or; since the identifier is implicit.
module implements Window {}

@thysultan

thysultan commented Dec 19, 2018

Arguably this is at odds with a "magic string" syntax direction, given that such a direction makes this virtually impossible; I would hope that is a data point against that direction (magic strings).

@zloirock

zloirock commented Dec 19, 2018

Fallbacks like

std:virtual-scroller|https://some-cdn.example/virtual-scroller.mjs

look like an elegant solution, but they will not work for a significant share of use cases. For example, we have problems with helpers shared between polyfills. Polyfills should be able to be bundled into one file.

We need at least an additional way to polyfill from ordinary JS code, by calling ordinary methods; a syntax-based solution will cause problems for old engines.

It should work synchronously. Why synchronously? Today, we load polyfills before all other code. No one will do

waitForLoadingPolyfills.then(() => {
  runAllTheRestCode();
})

We need a registry.

Something like:

STDLibraryRegistry.get(name);
STDLibraryRegistry.setModule(name, value);
STDLibraryRegistry.setModuleProperty(name, key, value);

Also, this approach could help with transpiling built-in modules, because an import from the standard modules namespace could simply be transpiled to a call to this registry:

import { len, map } from "std:builtins";
// =>
const { len, map } = STDLibraryRegistry.get("std:builtins");

Anyone who thinks that a standard library polyfill bundled into one file is something monstrous shouldn't forget about solutions like @babel/preset-env or @babel/runtime, which will bundle only the required parts of the standard library, based on syntax analysis and the requirements of the target engines.
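
As a rough illustration, a bundled polyfill could drive such a registry like this (the registry is only the proposal sketched above, and all other names are placeholders):

// Hypothetical usage of the proposed registry, running synchronously before app code.
class PolyfilledInstant { /* polyfilled implementation */ }

const temporal = STDLibraryRegistry.get('std:temporal');

if (!temporal) {
  // Case 1: the whole module is missing.
  STDLibraryRegistry.setModule('std:temporal', { Instant: PolyfilledInstant });
} else if (!('Instant' in temporal)) {
  // Case 2: a single export is missing.
  STDLibraryRegistry.setModuleProperty('std:temporal', 'Instant', PolyfilledInstant);
}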

@zloirock

@littledan any feedback?

@littledan
Member Author

Thanks for this detailed review.

Polyfills should be able to be bundled to one file.

import-maps is based on different things being in different resources, if I understand correctly. Is your suggestion based on performance, deployability or something else? I believe WebPackage should help with some of those issues.

syntax solution will cause a problem for old engines.

I'm wondering if @Rich-Harris's https://github.com/rich-harris/shimport and @guybedford's shim in https://github.com/guybedford/es-module-shims alleviate these concerns. Old engines don't have built-in modules anyway...

It should work synchronously.

I can imagine usages of your suggestion for synchronous dynamic access, but I don't understand why it's a requirement.

In browser-native implementations, the import-maps solution (including as a way to replace get-originals) should be as synchronous as an import statement. For transpiler output, if built-in modules are being entirely polyfilled on that platform, then there should be no particular barrier to implementing this API as something transpiler-internal; if they are not being entirely polyfilled, then they can be supplied with import-maps and used via import statements. Would this work?

@zloirock

zloirock commented Dec 25, 2018

@littledan

Is your suggestion based on performance, deployability or something else?

Take a look at the architecture of actual standard library polyfills. core-js contains thousands of files, hundreds of them helpers shared between modules. @ljharb's es-shims project also contains many helpers shared between polyfills, like es-abstract. Even with WebPackage, loading all of those files separately and asynchronously would cause a critical slowdown.

I'm wondering...

We can't test the availability of new syntax with Function, as in the first repo example, because at the very least it will cause CSP issues. Without such testing and without non-syntax solutions available, we will be forced to completely replace the module system and the built-in modules even if they are available in the engine. Even if we could detect support for new syntax and standard library features, we would be forced to add one more abstraction level between polyfills and developers, and the JS infrastructure is already too bloated.
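
The detection pattern in question looks roughly like this (a sketch; real polyfills run many more checks):

// Syntax detection via the Function constructor: a CSP without 'unsafe-eval'
// blocks this, which is why it can't be relied on here.
function supportsSyntax(source) {
  try {
    Function(source); // throws SyntaxError if the engine can't parse it
    return true;
  } catch (error) {
    return false;
  }
}

supportsSyntax('async () => {}'); // true in engines with async functions
supportsSyntax('x ** 2');         // true in engines with the exponentiation operator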

Old engines don't have built-in modules anyway...

We are talking about polyfilling. Polyfilling and transpiling should provide it even for old engines.

I can imagine usages of your suggestion for synchronous dynamic access, but I don't understand why it's a requirement.

@ljharb already wrote one part of the answer in another issue; the second part you can find above.

@zloirock

@ljharb wdyt about #2 (comment) ?

@ljharb
Member

ljharb commented Dec 27, 2018

@zloirock I agree completely with the problems you describe; I'm not sure about your suggested solution, but that's worth exploring.

@WebReflection

WebReflection commented Jan 4, 2019

Chiming in with something I haven't found in this issue: broken or partial implementations.

broken implementation

Like anything else in ECMAScript, vendors might ship a broken implementation of a standard.

As an example, there are at least 4 ways the URLSearchParams constructor might need to be fully replaced, so having something like std:foo|https://bar.com/baz.js won't mean much if foo is broken.

partial implementation

Keeping the previous example: even if URLSearchParams is available, some browsers might not have shipped HTMLAnchorElement.prototype.searchParams, and if the standard library ships frozen prototypes and there's no mechanism for better feature and syntax detection, the standard library will only be used indirectly, through modules that provide this feature detection up front.

In a few words, if a developer knows that std:foo could be broken, that developer will require @dev/foo, which internally will establish whether std:foo is broken and, if that's the case, provide a polyfill for it.

This would basically defeat the goal of the standard library, if fewer network requests is one of its goals.
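
The kind of runtime checks such a @dev/foo wrapper has to run looks roughly like this (illustrative only; the real list of known URLSearchParams issues is longer):

// Sketch of the feature/bug detection a wrapper module performs before
// deciding whether to hand back the built-in or a full replacement.
function urlSearchParamsIsUsable() {
  try {
    const a = document.createElement('a');
    a.href = 'https://example.com/?x=1';
    return 'searchParams' in a &&                              // partial implementation check
      new URLSearchParams('a=1&a=2').getAll('a').length === 2; // basic behavior check
  } catch (error) {
    return false;
  }
}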

what do we need

A mechanism to understand, ahead of time, what needs to be downloaded and what doesn't, as an incremental feature/syntax detection check => polyfills flow. This is basically already possible through 12345678 JS loaders, but it's not baked into the core as a standard.

Until we have such a mechanism to disambiguate what a client really needs to load, I believe any attempt to have a standard library will fail in adoption or, in the best-case scenario, will miss the goal of reducing network requests or bundle size.

@bkardell

@WebReflection I've been thinking some similarish thoughts to bits of what you said, but unsure how you would go about it really... Did you have ideas on how you would bake that in? I'd be very curious to hear thoughts on that.

@WebReflection

WebReflection commented Feb 12, 2019

@bkardell the only easy solution I could think of, since it's based on new things that won't break anything we already know, is allowing a do expression to define the static import.

import foo from do {
  let gotcha = false;
  try {
    import foo from 'std:foo';
    foo.gotcha();
  } catch(o_O) {
    gotcha = true;
  }
  gotcha ? '@ungap/foo' : 'std:foo';
};

@bkardell

@WebReflection I'm not sure I understand... you're suggesting you'd do that everywhere, or you'd do that to build some map of exports that your app uses but is otherwise left as an exercise for each thing, or at a higher level through some tooling?

@littledan
Member Author

@zloirock

Without such testing and without non-syntax solutions available, we will be forced to completely replace the module system and the built-in modules even if they are available in the engine.

One thing I think you can count on is that built-in modules would only ship together with import-maps--you won't have to make a built-in module polyfill/wrapper work without import-maps. So you'd have two versions: the 100% polyfill, shipped as a module and bundled as it is today, or the new thing where you use import-maps and modules wrapping other modules. Would this be too complicated an architecture?

@ljharb
Member

ljharb commented Feb 27, 2019

That can’t be counted on unless import maps are part of the language; there are more engines than browsers and node.

@zloirock

@littledan sure, it can simplify some cases, but it's not a solution for the indicated problem.

@cyrilletuzi

cyrilletuzi commented Mar 12, 2019

Hi everyone, I'll share a common use case where the polyfilling fallback option will be a problem.

I'm the author of @ngx-pwa/local-storage, the most popular Angular library for local storage. The main goal of the lib is the same as kv-storage: efficient storage via IndexedDB, but with a simple API like localStorage.

There are dozens of libraries like this one, including the famous localforage.

If the lib wants to use the new kv-storage, it is not the role of the lib to include a polyfill, as it would lead to several issues:

  • duplicate polyfills if other libs do the same,
  • paths could change when the lib is used in the final application.

Asking the lib user to include the polyfill wouldn't be great, as it would add additional and complex setup steps, while the purpose of a lib is to ease things.

You could argue that with the kv-storage API, these kinds of libs won't be useful anymore, but that's not true, as they also do many other things, like:

  • fallback to localStorage in some cases (private mode of Firefox/IE/Edge for example),
  • fallback to memory storage for server-side rendering,
  • workarounds for some browser issues (like IE/Edge not supporting IndexedDB v2),
  • wrapping in other tools (like RxJS for mine),
  • etc.

So currently, to choose the best storage, runtime checks are done, like:

if (window.indexedDB) {}

I don't see any way to do a runtime check with the current proposal.

@WebReflection

@cyrilletuzi I've proposed a solution to the exact issue you're facing here

it's been ignored and it got 2 thumbs down

the issue goes beyond your case, as already written.

the only way to do what you need is to ignore import maps and load dependencies asynchronously through a dynamic import(...) dance that will allow fallbacks.
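
the dance in question looks roughly like this (a sketch; 'std:kv-storage' and the fallback URL are placeholders):

// Async fallback via dynamic import; everything downstream has to await this.
async function loadKvStorage() {
  try {
    const mod = await import('std:kv-storage');
    // extra feature/bug checks could go here before accepting the built-in
    return mod;
  } catch (error) {
    return import('https://example.com/kv-storage-polyfill.js');
  }
}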

I know this is not the answer you were looking for, but I'm also happy you brought here your use case, and I'm pretty sure many others will follow up.

import maps right now are useless for polyfills ... apparently they don't see it, but that's already the case.

@cyrilletuzi

Following @WebReflection's answer, I need to add that dynamic import() is not usable in all cases, as it's asynchronous, which doesn't work in my case.

@thysultan

My thoughts on the design of a JavaScript standard library haven't changed; it shouldn't ship in any form that doesn't allow you to polyfill a standard module without effectively changing consumer code – #2 (comment)

That is, a consumer shouldn't have to do any weird host-dependent handshake, for example:

import foo from "std:something|somethingelse"
// or
import foo from do { /* ... */}

Because, as @WebReflection mentions, sometimes you intentionally don't want to use the standard module if it's defective; additionally, in contrast to the goals of any standard library, this pollutes consumer sites with extraneous data that would almost certainly need to change when something unexpected happens.

This is most certainly a far cry from the current status quo of a clean separation between consumers and providers where:

var map = new WeakMap

Works regardless of how the provider implements the constructor, whether built-in, polyfilled, extended, or replaced.

That is, any standard library, if one were to exist, should preserve the global-ish yet polyfill-able/extend-able/replace-able character of current global JavaScript constructors.

@zloirock

So, looking at the stage 2 presentation slides, the polyfilling issue is completely ignored: "import-maps should handle those".

TC39, please listen to the opinions of polyfill authors! Don't break polyfilling in JavaScript!

@ljharb
Member

ljharb commented May 23, 2019

@mattijs re the slide content, please see #2 (comment)

@mattijs
Member

mattijs commented May 24, 2019

@ljharb I read that explanation a while back, thanks for pointing it out again.

@zloirock I'll think some more on the use case you are laying out.
