Make the WebGL material system more easy to make libraries for #6144
Comments
I agree that this would be great for general shader use and for #6276 |
I've started making a proof-of-concept system for making a shader graph out of snippets. There are some explanations in the readme here: https://github.com/davepagurek/shader-builder The things I'm hoping to address with the prototype:
Some things I still want to think about/tinker with:
Let me know if anyone has thoughts so far! |
I want to review/rework how addon libraries work to a certain extent as part of the investigations I'm doing. Might be worth thinking about how this fits in there as well. |
Definitely! What aspects of library building are you thinking about currently? |
I'm thinking about whether to change how addons are authored for p5.js, where currently the general advice is to attach methods directly to the `p5.prototype`. Some minor pros and cons of the prototype approach: on the pro side, it is easy to write, aligns with how internal modules work, and works with existing addons; a con is that addons need to be loaded before p5 initialization, so features cannot be dynamically added while the runtime is running. I'm also reviewing this as part of the docs review of creating_libraries.md I'm looking at. Some of the things might not be relevant but I'm still thinking. |
Here's an update on this with some new thoughts!
For that reason, I think it makes sense to define a hooks API, inspired by the hooks in Luma's gaussian splat library. From the end user's point of view, you could augment a shader like this:

```javascript
const myShader = p5.RendererGL.defaultFillShader.fillHooks({
  uniforms: `uniform float time;`,
  vertex: {
    getLocalPosition: `(vec3 pos) {
      pos.y += 20.0 * sin(time);
      return pos;
    }`
  },
})
```

Internally, when we define the shader to have a hook, it'd look like this. In the shader source, call the hook where it's needed, and provide a default implementation for it:

```javascript
defaultFillShader = createShader(
  `attribute vec3 aPosition;
  void main() {
    vec3 localPosition = HOOK_getLocalPosition(aPosition);
    // ...etc
  }`,
  fragSrc, // omitted for brevity
  {
    vertexHooks: {
      // For each hook, provide a default value
      getLocalPosition: `(vec3 pos) { return pos; }`
    }
  }
)
```

(Optionally, for performance, we can also add a `#define` per supplied hook so that the shader can use the preprocessor to skip over default implementations.)

The hooks object is stored in the shader, and when a user calls `setHooks`, the provided implementations replace the defaults:

```typescript
type HooksOptions = {
  // A string spliced into both shaders above `main`, e.g. for `uniform`s
  declarations?: string
  // Options for each shader individually
  vertex: SingleHookOptions
  fragment: SingleHookOptions
}
type SingleHookOptions = {
  // A string spliced in before `main`, e.g. for `out` variables in the vertex and `in` variables in the fragment shader
  declarations?: string
  // Implementations of the other hooks defined by the shader
  [hookName: string]: string
}
setHooks(options: HooksOptions): p5.Shader
```

When we compile the shader, we'd splice in a string with the hook definitions:

```javascript
filledVertSrc() {
  const main = 'void main';
  const [preMain, postMain] = this._vertSrc.split(main);
  let hooks = '';
  if (this.hooks.declarations) {
    hooks += this.hooks.declarations + '\n';
  }
  for (const hookName in this.hooks.vertex) {
    if (hookName === 'declarations') {
      hooks += this.hooks.vertex.declarations + '\n';
    } else {
      // Add a #define so that if the shader wants to use preprocessor directives to
      // optimize away the extra function calls in main, it can do so
      hooks += '#define HOOK_' + hookName + '\n';
      hooks += 'HOOK_' + hookName + this.hooks.vertex[hookName] + '\n';
    }
  }
  return preMain + hooks + main + postMain;
}
```

As for the specific hooks to include, I think it'd be something like:
|
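To make the splicing logic in the comment above concrete, here is a minimal self-contained sketch. `spliceHooks` mimics the `filledVertSrc` method but takes plain arguments instead of reading from `this`; the function names, the `vec3` return type, and the sample sources are all illustrative, not p5 API:

```javascript
// Standalone sketch of the hook-splicing step. All names are illustrative.
function spliceHooks(vertSrc, declarations, vertexHooks) {
  const main = 'void main';
  const [preMain, postMain] = vertSrc.split(main);
  let hooks = '';
  if (declarations) {
    hooks += declarations + '\n';
  }
  for (const hookName in vertexHooks) {
    // The #define lets the full shader detect supplied hooks via the
    // preprocessor; the next line splices in the hook's implementation.
    // (Return type handling is simplified to vec3 for this sketch.)
    hooks += '#define HOOK_' + hookName + '\n';
    hooks += 'vec3 HOOK_' + hookName + vertexHooks[hookName] + '\n';
  }
  return preMain + hooks + main + postMain;
}

const spliced = spliceHooks(
  'attribute vec3 aPosition;\nvoid main() { /* uses the hook here */ }',
  'uniform float time;',
  { getLocalPosition: '(vec3 pos) { return pos; }' }
);
console.log(spliced);
```

The declarations and hook definitions end up between the original header and `main`, so the hook call sites inside `main` resolve against whichever implementation was spliced in.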
@davepagurek For the hooks idea, I am exploring standardizing it for 2.0 as well. We currently have some lifecycle hooks defined for addons. For shader hooks, not sure how feasible it is, but would it be possible to have the WebGL renderer module extend this functionality so that additional shader hooks can be defined using similar syntax to the default lifecycle hooks? The main goal is to reduce concept duplication where possible, but if it doesn't make sense in this context we can think about how to prevent confusion instead. |
The main thing unique about this shader scenario is that instead of functions, you'd supply GLSL strings. Other than that though, it seems feasible. For p5 lifecycle hooks, those would look like this, right?

```javascript
p5.registerAddon((p5, fn, lifecycles) => {
  lifecycles.postdraw = function() {
    // Run actions after `draw()` runs
  };
})
```

Previously, I was thinking about that as a method on a default shader. I had initially suggested:

```javascript
const myShader = defaultShader.augment({
  declarations: `uniform float timeScale;`,
  getLocalPosition: `(vec3 pos) {
    pos.y += 20.0 * sin(time * timeScale);
    return pos;
  }`
})
```

But if we use a callback function, we could use an assignment instead:

```javascript
const myShader = defaultShader.augment((lifecycles) => {
  lifecycles.declarations = `uniform float timeScale;`
  lifecycles.getLocalPosition = `(vec3 pos) {
    pos.y += 20.0 * sin(time * timeScale);
    return pos;
  }`
})
```

Do you think that interface gets close enough to the p5 lifecycle hooks for it to feel familiar? |
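As a self-contained illustration of the callback form being discussed, here's a minimal mock of how `augment` could collect those assignments; the `augment` function below is a stand-in, not an existing p5 API:

```javascript
// Mock of a callback-style augment(): pass a plain object to the callback,
// record whatever hook strings it assigns, and let them override the base
// shader's defaults. All names are illustrative.
function augment(baseHooks, callback) {
  const lifecycles = {};
  callback(lifecycles);
  // Later assignments override the base shader's default hook strings.
  return { ...baseHooks, ...lifecycles };
}

const hooks = augment(
  { getLocalPosition: '(vec3 pos) { return pos; }' },
  (lifecycles) => {
    lifecycles.declarations = 'uniform float timeScale;';
    lifecycles.getLocalPosition =
      '(vec3 pos) { pos.y += 20.0 * sin(time * timeScale); return pos; }';
  }
);
console.log(hooks);
```

One design consequence of the callback form: assignment makes it natural to overwrite a hook wholesale, whereas the object-literal form could in principle be merged more granularly.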
Just thinking a bit out loud here, in terms of API:

```javascript
p5.registerAddon((p5, fn, lifecycles) => {
  // `webgl` to namespace things
  lifecycles.webgl.defaultShader.augment = {
    // ....
  };
});
```

Although thinking about it now, is this meant to be a library author facing feature only, or would it also be user facing? If it is user facing then the approach might need rethinking. The idea behind it is that `lifecycles` starts as an empty object (`lifecycles = {};`), and on each call of `registerAddon` the addon attaches its handlers to it. |
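A rough, mocked sketch of the `lifecycles = {}` idea above; the `registerAddon` and `runLifecycle` implementations below are simplified stand-ins, not the real p5 2.0 code (the real setup function also receives the p5 class and its prototype):

```javascript
// Mocked addon registration: each addon's setup function receives a fresh
// lifecycles object and assigns handlers onto it; the runtime later invokes
// whichever handlers were assigned. All names are illustrative.
const addonSetups = [];
function registerAddon(setup) {
  addonSetups.push(setup);
}

function runLifecycle(name) {
  for (const setup of addonSetups) {
    const lifecycles = {};
    setup(lifecycles);
    if (typeof lifecycles[name] === 'function') {
      lifecycles[name]();
    }
  }
}

const log = [];
registerAddon((lifecycles) => {
  lifecycles.postdraw = () => log.push('postdraw ran');
});
runLifecycle('postdraw');
console.log(log);
```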
I think the issue with
Users would be the ones making their own shaders by filling out hooks, with the idea being that rather than writing full vertex + fragment shader source code, they could just write the part that interests them (e.g. just editing position to warp all the points in a mesh, or just editing the color if they want to make a generative texture). Those shaders could be packaged as addons too, like calling |
Increasing Access
There have been a number of requests related to the material system in p5, such as adding fog (#5878) or blend modes for ambient light (#6123). I've been working on a library for shader-based warping (https://github.com/davepagurek/p5.warp), and a GSoC project this year will involve working on image-based lighting as an alternative to point/directional/spot lights.
We intentionally don't add every feature into p5 core in order to keep the codebase maintainable, keep the API simple for beginners, and keep the runtime reasonably fast. It would be great to allow community libraries to fill these needs instead! However, the system is currently very difficult to add to externally; the only viable option right now is to package a p5 shader and distribute that, which means keeping your shader up-to-date with internal changes.
A dedicated way to hook into the material system would help people who are interested in contributing via a library test out their ideas, and would give users a larger variety of tools for different needs as new libraries are added.
Most appropriate sub-area of p5.js?
Feature request details
The main difference between adding a material to p5 and writing a full shader is that for the former, you generally want to keep most of the existing shader. The best way to do that right now is copy-and-pasting, which goes stale over time and requires expertise in p5's internals in the first place. The design goal would be to allow people to replace specific parts of our shaders without needing to do that.
To narrow the scope, I think this only needs to apply to our fill material with lighting, not lines or text for now.
Some potential pieces a library might want to replace are:
Shader snippets
We can maybe think of our shaders as a collection of code snippets for both the fragment and vertex shader, where each snippet has two parts: a header (to specify inputs) and a body (which runs in `main()`), combined into the final shader source. If we break down our current shaders into snippets like that, then we could provide a minimal API for creating a new shader where one could replace just one part. Maybe something like:
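One possible shape for this, as a hedged sketch: the snippet format and the `buildShader` helper below are assumptions for illustration, not an existing or proposed p5 API:

```javascript
// Sketch: each snippet has a header (declarations outside main) and a body
// (statements inside main). Combining concatenates all headers, then wraps
// all bodies, in order, inside a single main(). Names are illustrative.
function buildShader(snippets) {
  const headers = snippets
    .map((s) => s.header)
    .filter(Boolean)
    .join('\n');
  const bodies = snippets.map((s) => s.body).join('\n  ');
  return headers + '\n\nvoid main() {\n  ' + bodies + '\n}\n';
}

const vertSrc = buildShader([
  {
    header: 'attribute vec3 aPosition;\nuniform mat4 uModelViewProjectionMatrix;',
    body: 'vec3 position = aPosition;'
  },
  {
    header: '',
    body: 'gl_Position = uModelViewProjectionMatrix * vec4(position, 1.0);'
  }
]);
console.log(vertSrc);
```

Replacing one part of a shader would then just mean swapping one entry in the snippet array before building.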
Some downsides with this are the fact that it treats all snippets just as strings, so there may be naming collisions or type mismatches when making snippets work together. It would at least require relatively minimal code to implement, though.
Shader graph
There's an existing library for combining shader pieces to make one shader: https://www.npmjs.com/package/@gerhobbelt/shadergraph. This does much of what the above snippet idea does, but in a more complete but heavier way, where one can define snippets for small bits of code and build a complicated dependency graph to compile into a shader.
Using this would save us building the system from scratch, but it also adds a new dependency to p5 and means providing a more complicated API to library builders.
Providing access to default shader source
The barest-bones solution might just involve exposing the source code for our current material shaders via variables that libraries can reference. That way, they could use our existing vertex shader but write their own fragment shader.
This doesn't solve the problem where one wants to use most of our lighting calculations (and therefore be able to integrate with point/directional/spot light calls made in p5) but would still be helpful.
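As a sketch of what that could look like: the `DEFAULT_FILL_VERT_SRC` string and `createShaderStub` below are stand-ins, since p5 doesn't currently export its shader source under any particular name:

```javascript
// Hypothetical: if p5 exposed its default vertex source, a library could
// pair it with its own fragment shader. DEFAULT_FILL_VERT_SRC is a tiny
// stand-in string, not p5's real source, and createShaderStub only mimics
// createShader's (vertSrc, fragSrc) shape for this sketch.
const DEFAULT_FILL_VERT_SRC = `attribute vec3 aPosition;
void main() {
  gl_Position = vec4(aPosition, 1.0);
}`;

const myFragSrc = `precision mediump float;
void main() {
  gl_FragColor = vec4(1.0, 0.0, 0.5, 1.0);
}`;

function createShaderStub(vertSrc, fragSrc) {
  return { vertSrc, fragSrc };
}

const myShader = createShaderStub(DEFAULT_FILL_VERT_SRC, myFragSrc);
console.log(myShader.fragSrc);
```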