# Support per-request cache with SSR #461
Go on, go on.. I'm listening 😉 I definitely want to figure this out. So, are you suggesting something like that? I do want to fix this though.
### API design

You got it exactly right and there would be no breaking changes. You asked for it, so here comes a braindump! 😉 I just typed everything out in one go before heading to dinner, so this is not exhaustive or necessarily super thought-through, but hopefully it's a good starting point. As I said, if there is interest and this seems like a promising direction, I can try to find some time to take a stab at a PoC to look at.

### Example

A possible API could look something like this:

```js
// server.js
async function handleRequest(req, res) {
  const queryCache = makeQueryCache();
  // This would usually happen in a framework or be abstracted somehow
  queryCache.prefetch('something', fetchSomething);
  // Could also use the existing provider
  const markup = ReactDOMServer.renderToString(
    <ReactQueryCacheProvider cache={queryCache}>
      <App />
    </ReactQueryCacheProvider>
  );
  // This should probably be named something else and just return a raw object
  // that the user can choose to serialize themselves however they want,
  // to keep react-query lean
  const serializedCache = queryCache.serialize();
  res.send(`
    ${someHtmlTemplateStart}
    <body>
      <div id="root">${markup}</div>
      <script id="initial_payload" type="application/json" charset="utf-8">
        { "REACT_QUERY_CACHE": ${serializedCache} }
      </script>
    ${someHtmlTemplateEnd}
  `);
}
```

```js
// client.js
import { queryCache } from 'react-query';

const initialPayload = JSON.parse(
  document.getElementById('initial_payload').textContent
);

/* -- Alternative 1: Hydrating to global cache -- */
queryCache.hydrate(initialPayload.REACT_QUERY_CACHE);
ReactDOM.hydrate(<App />, document.getElementById('root'));

/* -- Alternative 2: Using a cache provider -- */
const clientQueryCache = makeQueryCache(initialPayload.REACT_QUERY_CACHE);
ReactDOM.hydrate(
  <ReactQueryCacheProvider cache={clientQueryCache}>
    <App />
  </ReactQueryCacheProvider>,
  document.getElementById('root')
);
```

These are all pretty common patterns when working with SSR (though it's more common today to use a Provider on the client than some module-based global state).

### Server rendering

Usually when you use …
Also, if you have provided a cache via context on the server … This should also work with streaming out of the box. The future streaming Suspense server renderer, combined with progressive hydration, might require figuring out things like streaming serialization and hydration, which should be doable, but who knows? Another unknown is how React Blocks would play into this, but that's also pretty experimental for now as I understand it.

### Client

Providing the … I haven't looked enough at the source, so I imagine supporting multiple caches in different roots or different parts of an application would require more work as well, which might or might not be worth it, but the thing about the "provide cache via provider" part is that you get it for free anyway when building out the SSR support. 😄 Since we can place the …

Serializing the cache to localStorage and hydrating it on reload (for scroll restoration) could be another use case that this would support, btw.

### Supporting different frameworks

#### Next.js

Even though the examples are "custom SSR low level" ones, I'm pretty sure we can make it work with Next.js as well. Return the serialized cache from … I think this should work similarly with the new …

#### Gatsby

I'm not as well versed in Gatsby, so I would need to do some research, but I see no reason the approach shouldn't work there as well since it's a common pattern.

#### React Router v6 (alpha)

RR6 has a new …

#### Remix

I haven't checked out the previews fully yet and there are probably still unknowns, so same as Gatsby.
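As a sketch of what the serialize-then-hydrate round trip proposed above boils down to, here is a minimal plain-JS version. All names (`makeQueryCache`, `serialize`, `hydrate`) are hypothetical, mirroring the proposal, not react-query's actual API:

```javascript
// Minimal sketch of a serializable per-request cache.
// `serialize` and `hydrate` are hypothetical names from the proposal above.
function makeQueryCache(initialQueries = {}) {
  // queries maps a query key to its cached data
  const queries = { ...initialQueries };
  return {
    set(key, data) {
      queries[key] = data;
    },
    get(key) {
      return queries[key];
    },
    // Turn the cache into a plain object the caller can JSON-stringify
    serialize() {
      return { ...queries };
    },
    // Merge previously serialized queries back in (client-side hydration)
    hydrate(serialized) {
      Object.assign(queries, serialized);
    },
  };
}

// Server side: fill the cache, then embed it in the HTML payload
const serverCache = makeQueryCache();
serverCache.set('something', { id: 1 });
const payload = JSON.stringify({ REACT_QUERY_CACHE: serverCache.serialize() });

// Client side: parse the payload and hydrate a fresh cache before rendering
const clientCache = makeQueryCache();
clientCache.hydrate(JSON.parse(payload).REACT_QUERY_CACHE);
console.log(clientCache.get('something')); // → { id: 1 }
```

The important design point is that `serialize` returns plain data: how it gets stringified and embedded in the page stays the user's (or the framework's) choice.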
Add `makeServerQueryCache` as a way to create a queryCache that caches data on the server. Add `queryCache.dehydrate` as a way to dehydrate the cache into a serializeable format. Add `initialQueries` as an option to `makeQueryCache` as a way to rehydrate/create a warm cache. Closes TanStack#461
Just to document the progress: #476 added … #570 was a first attempt to bring the other pieces together; a great discussion led to some insights, as well as to breaking the PR apart into pieces:
Just to be clear, the example API design in the last message is outdated, but the general approach is still valid.
@Ephem I tried to use your new APIs with Next.js + SSG; you might want to take a look: https://github.com/PepijnSenders/react-query-next-ssg-example. I still had to manually hydrate the cache, but otherwise it looks quite amazing!
@PepijnSenders That's awesome! I have limited internet/computer access until next week, so I can't look in detail, but at a glance this looks very close to how I've imagined it, just with a bit more boilerplate that the de/rehydrate APIs are meant to solve. 🎉 If you are up for it, I definitely think that example should become the official one when the APIs are fully there! Btw, just to be clear, #476, which does the heavy lifting here, was created by @jackmellis ❤️
Sure, I can help with that! Enjoy your time away :)
Next should take care of this. Huzzah! |
Should this still be open? Or is there another issue tracking the de/rehydration?
@pseudo-su You could say the title of this issue has been resolved, but not parts of the description of it. No other issue is currently tracking hydration, but a WIP PR is up in #728.
Oh fantastic, very interested in this feature. I'm looking into using … It looks like your PR would implement the main thing I'm looking for 🤩.

There's quite a lot of reference to Next.js here, so in case providing an alternative adds to the discussion, the ideal react-query …

```js
// server.js
const jobContext = createJobContext()

// 👇 Ensure you wrap your application with the provider.
const app = (
  <JobProvider jobContext={jobContext}>
    <MyApp />
  </JobProvider>
)

// 👇 This makes sure we "bootstrap" resolve any jobs prior to rendering
asyncBootstrapper(app).then(() => {
  // We can now render our app 👇
  const appString = renderToString(app)

  // Get the resolved jobs state 👇
  const jobsState = jobContext.getState()

  const html = `
    <html>
      <head>
        <title>Example</title>
      </head>
      <body>
        <div id="app">${appString}</div>
        <script type="text/javascript">
          // Serialise the state into the HTML response
          // 👇
          window.JOBS_STATE = ${serialize(jobsState)}
        </script>
      </body>
    </html>`

  res.send(html)
})
```

```js
// client.js
import React from 'react'
import { render } from 'react-dom'
import { JobProvider } from 'react-jobs'
import MyApp from './shared/components/MyApp'

// Get any "rehydrate" state sent back by the server
// 👇
const rehydrateState = window.JOBS_STATE

// Surround your app with the JobProvider, providing
// the rehydrateState
// 👇
const app = (
  <JobProvider rehydrateState={rehydrateState}>
    <MyApp />
  </JobProvider>
)

// Render 👍
render(app, document.getElementById('app'))
```
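The react-jobs flow above is a two-pass pattern: collect async "jobs" registered during a first pass, await them all, then render for real with the resolved state. A rough plain-JS sketch of that bootstrap step (the job-context shape here is hypothetical, not react-jobs' real internals):

```javascript
// Sketch of the "bootstrap, then render" pattern used above.
// The job context collects async work; the bootstrap step awaits it all
// so the real render can read resolved state synchronously.
function createJobContext() {
  const jobs = [];
  const state = {};
  return {
    // Components register async work under a name during the first pass
    register(name, work) {
      jobs.push(work().then((result) => { state[name] = result; }));
    },
    // After bootstrapping, the resolved state can be read synchronously
    getState() {
      return { ...state };
    },
    // "Bootstrap": wait for every registered job to settle
    resolveAll() {
      return Promise.all(jobs);
    },
  };
}

async function main() {
  const jobContext = createJobContext();
  // First pass: components register their jobs
  jobContext.register('user', async () => ({ name: 'Ada' }));
  jobContext.register('posts', async () => [1, 2, 3]);
  // Bootstrap: resolve everything before the real render
  await jobContext.resolveAll();
  // A second render pass would happen here, with state fully resolved
  console.log(jobContext.getState());
}

main();
```

The same shape underlies the per-request query cache idea: prefetch into the cache, wait, render, then serialize the resolved state into the page.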
I might be getting ahead of myself, and maybe these make sense as separate features after the …

### Setting freshness/staleness settings based on the query response

I might be wrong, but as far as I can tell there doesn't seem to be a way to set the freshness/staleness settings of a query in the … As I understand it, this means that if you make a …

#### Unnecessary re-triggering of the query function

You might decide to make the …

#### Stale data being cached too long

You might decide to make the …

#### Accidental "cache/freshness doubling"

If you have a CDN in front of your API, it's likely that the data you're receiving from the server is already "aged" even if it's not yet considered "stale". For example, if your server returns a response with the headers `Cache-Control: max-age=300` and `Age: 240`, it would indicate that while the document is allowed to live for 300 seconds (5 minutes) before being considered stale, it has already been stored in the CDN for 240 seconds (4 minutes), so in reality, once it gets rendered into the page, it should only be considered "fresh" for 60 more seconds (roughly). If I can only set staleness settings before the queryFn resolves, I might choose to make it …

NOTE: There is inconsistency in which response headers CDNs support/use to inform the client of the age of the response (…).

### Using Cache-Control semantics

Using Cache-Control semantics (or a subset of directives) when storing items in the … For example: if I have a …

```
GET /api/v1/people
Content-Type: application/json
X-Cache: HIT
Cache-Control: max-age=1200, s-maxage=1200
Age: 1

GET /api/v1/pets
Content-Type: application/json
X-Cache: HIT
Cache-Control: max-age=300, s-maxage=300
Age: 0

GET /api/v1/news
Content-Type: application/json
X-Cache: MISS
Cache-Control: max-age=0, s-maxage=1500
```

Technically, if I want to prevent my server-rendered React HTML pages from getting cached if/when they contain stale data, I should make the …

```
GET /dashboard
Content-Type: text/html
Cache-Control: max-age=0, s-maxage=300
```

This would also extend to things like the …

### Having a "private" and "shared" cache provider

By default queries shouldn't be shared across requests, but there are some that are OK to share across multiple users/requests. Any query/request that specifies that it's safe to store in a shared cache, e.g. …

```js
const assets = require(process.env.RAZZLE_ASSETS_MANIFEST);

const server = express();

// This cache is used across requests
const sharedQueryCache = makeQueryCache();

server.get('/*', (req, res) => {
  // This cache is used for only a single request
  const requestQueryCache = makeQueryCache();
  const markup = renderToString(
    <ReactQueryCacheProvider
      cache={requestQueryCache}
      sharedCache={sharedQueryCache}
    >
      <App />
    </ReactQueryCacheProvider>
  );
  const cacheData = requestQueryCache.serialize();
  const sharedCacheData = sharedQueryCache.serialize();
  res.send(
    `<!doctype html>
    <html lang="">
      <head>
        ${
          assets.client.css
            ? `<link rel="stylesheet" href="${assets.client.css}">`
            : ''
        }
      </head>
      <body>
        <div id="root">${markup}</div>
        <script src="${assets.client.js}" defer crossorigin></script>
      </body>
    </html>`
  );
});
```
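The "cache/freshness doubling" arithmetic described in this comment (max-age minus Age) could be computed from response headers roughly like this. `staleTimeFromHeaders` is a hypothetical helper sketched for illustration, not a react-query API:

```javascript
// Derive a remaining freshness window (in ms) from HTTP caching headers,
// as described above: a response allowed to live for max-age=300 seconds
// that has already spent 240 seconds in a CDN (Age: 240) should only be
// considered fresh for ~60 more seconds.
// Hypothetical helper, not part of react-query.
function staleTimeFromHeaders(headers) {
  const cacheControl = headers['cache-control'] || '';
  const match = cacheControl.match(/max-age=(\d+)/);
  if (!match) return 0; // no max-age: treat as immediately stale
  const maxAge = parseInt(match[1], 10);
  const age = parseInt(headers['age'] || '0', 10);
  return Math.max(maxAge - age, 0) * 1000;
}

// The CDN example from above: allowed to live 300s, already aged 240s
console.log(staleTimeFromHeaders({ 'cache-control': 'max-age=300', age: '240' })); // → 60000
```

Such a value could then be fed into a per-query stale time after the query function resolves, which is exactly the capability this comment is asking for.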
I'm glad you are interested in this issue!

### Custom SSR

With custom SSR, the mechanisms to fill the cache will be either to prefetch the data using … Either way, there will definitely be custom SSR examples and docs as well as Next versions. I think the focus on getting it to work with Next is simply because that is more constrained than custom SSR, so it's actually somewhat trickier to get right from a library perspective.

### Having a "private" and "shared" cache provider

I agree this use case is useful, but I would be a bit hesitant about juggling two caches in the library implementation. Another way to go about this would be to pre-seed the …

Btw, there are more things to hammer out around this. Currently no timeouts are scheduled on the server, so data never gets stale there. This is fine for per-request caches, but maybe not optimal for shared ones, or for things like CLI tools that also run in a Node environment.

### Staleness based on headers or response data

While unrelated to this issue, these are very interesting suggestions! I suggest you go ahead and open a new issue/discussion for this so it doesn't go unnoticed in this closed one. 😄
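The pre-seeding alternative mentioned above (seed each per-request cache from a long-lived shared cache instead of having the provider juggle two caches) could be sketched like this. All names (`makeQueryCache`, the `shared` flag, `sharedEntries`) are hypothetical, not react-query APIs:

```javascript
// Sketch: instead of the provider juggling a private and a shared cache,
// seed each fresh per-request cache with the shareable subset of a
// long-lived cache up front. All names here are hypothetical.
function makeQueryCache(initialQueries = {}) {
  const queries = { ...initialQueries };
  return {
    set(key, data, { shared = false } = {}) {
      queries[key] = { data, shared };
    },
    get(key) {
      return queries[key] && queries[key].data;
    },
    // Only entries explicitly marked safe for a shared cache are handed out
    sharedEntries() {
      return Object.fromEntries(
        Object.entries(queries).filter(([, q]) => q.shared)
      );
    },
  };
}

// Long-lived cache, reused across requests
const sharedQueryCache = makeQueryCache();
sharedQueryCache.set('news', ['headline'], { shared: true });
sharedQueryCache.set('me', { id: 7 }); // private: never copied

// Per request: start from the shareable subset only
function makeRequestCache() {
  return makeQueryCache(sharedQueryCache.sharedEntries());
}

const requestCache = makeRequestCache();
console.log(requestCache.get('news')); // → [ 'headline' ]
console.log(requestCache.get('me')); // → undefined
```

This keeps the rendering path single-cache: the sharing policy lives entirely in how the per-request cache is constructed, which avoids cross-request leaks by default.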
This is in a sense a continuation of the now closed #70
First of all, thanks for your hard work on this library, it's great! I'm currently designing a new somewhat complex data fetching solution at work and I would like to build it on top of react-query, but there is some functionality around SSR that I would like to discuss.
I think the call to not cache data at all in #70 was the right one given the circumstances; for security reasons there can not be a cache by default on the server, since it might leak user data across requests. However, this has some cumbersome consequences:

- `queryCache.prefetchQuery` ahead of the server rendering (like in Next.js `getInitialProps`) won't automatically prime the cache
- `useQuery` with `initialData` in one place won't prime the cache for use in a second place, so reading the same data from the cache in multiple places becomes impossible

A big point of react-query is to be the global cache for data. Since this is not true on the server, a lot of nice patterns fall apart, and the only way to fix that is to implement your own cache on the server that wraps react-query, for example using a custom `useQueryWithSSRCache` that passes some `initialData` every time it is used. This seems a bit backwards since it's re-implementing a somewhat big part of react-query.

My suggestion is to make using a cache on the server opt-in and design it around creating a new cache per request, which you place on a context provider. If a cache is available on the context, use that instead of the global one.
The other part of the puzzle is to make that cache serializable (and you probably want to destroy the entire cache when you do so), so you can send it to the client and hydrate it there. On the client you could either hydrate the global cache before rendering, or create a cache and place it on a context like on the server (but then you can't import `queryCache` like normal).

This is also something that will very likely be needed to support Suspense server rendering in the future: if you don't have a cache per request, what do you read from to see if you need to suspend or not?

I'm not sure if this is something you are interested in, or if you deem it out of scope for the library? From the Readme:
(Btw, I'm curious about what you mean by out of sync here?)
I'd love to get your input on this! This is a complex area, so even though this became somewhat long, there is still a lot of nuance here of course; I'm very open to discussing it further and might very well be interested in contributing to something like this as well.