Slow sidebar generation when there are a lot of items #1318
Dumping some rough notes regarding the issue:
Thanks for the issue @stavros-k and the investigations @HiDeoo! Sounds like the problem is not the generation step then, but more what HiDeoo concluded: “the issue is more related to the rendering part”. Subgroups use …
Sharing some preliminary test results. It does look like we hit something exponential even with a flat sidebar, looking at the second benchmark.

Flat sidebar: this test uses the Starlight starter template and adds a number of extra top-level links to the sidebar (controlled via the …).
- 4 pages: default starter template pages (3 pages + 404).
- 103 pages: same as above, but the default …
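For reference, a flat sidebar benchmark like this can be set up in the Starlight config by spreading generated link items into the `sidebar` array. This is only a sketch of the idea: the `SIDEBAR_ITEMS` variable name and the link target are made up, not taken from the test above.

```js
// astro.config.mjs: sketch only; SIDEBAR_ITEMS and the link target are hypothetical.
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';

// Number of extra top-level sidebar links, controlled from the command line.
const extraLinks = parseInt(process.env.SIDEBAR_ITEMS || '0');

export default defineConfig({
  integrations: [
    starlight({
      title: 'My Docs',
      sidebar: [
        { label: 'Example Guide', link: '/guides/example/' },
        // Flat, top-level links only (no groups), so sidebar size is the
        // only thing that varies between benchmark runs.
        ...Array.from({ length: extraLinks }, (_, i) => ({
          label: `Extra link ${i}`,
          link: '/guides/example/',
        })),
      ],
    }),
  ],
});
```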
I do think this is probably something Starlight is doing. I made a quick benchmark using Astro’s empty/minimal template that can generate a different number of pages or list items and didn’t see the same exponential behaviour:
Would you be able to share the repro using Astro’s minimal template or describe it, e.g. what the pages look like, are they physical pages on disk, etc.? I'm asking because with the same template I'm getting a faster time than you for a …

Benchmark 1: ITEMS=1 pnpm build
  Time (mean ± σ):      1.692 s ±  0.126 s    [User: 2.920 s, System: 0.334 s]
  Range (min … max):    1.553 s …  1.998 s    10 runs

Benchmark 2: ITEMS=4000 pnpm build
  Time (mean ± σ):     26.435 s ±  0.962 s    [User: 30.663 s, System: 1.147 s]
  Range (min … max):   24.636 s … 27.282 s    10 runs

Summary
  ITEMS=1 pnpm build ran
    15.62 ± 1.29 times faster than ITEMS=4000 pnpm build
I renamed …

---
import type { GetStaticPaths } from 'astro';

// PAGES controls how many page-N routes are generated at build time.
export const getStaticPaths = (() => {
  return Array.from({ length: parseInt(import.meta.env.PAGES || '1') })
    .fill('')
    .map((_, i) => ({ params: { slug: `page-${i}` } }));
}) satisfies GetStaticPaths;
---

<html lang="en">
  <head>
    <meta charset="utf-8" />
    <link rel="icon" type="image/svg+xml" href="/favicon.svg" />
    <meta name="viewport" content="width=device-width" />
    <meta name="generator" content={Astro.generator} />
    <title>Astro</title>
  </head>
  <body>
    <h1>Astro</h1>
    <!-- ITEMS controls how many list items each page renders. -->
    <ul>
      {
        Array.from({ length: parseInt(import.meta.env.ITEMS || '1') })
          .fill('')
          .map((_, i) => (
            <li>
              <a href={`page-${i}`}>{i}</a>
            </li>
          ))
      }
    </ul>
  </body>
</html>

It is a bit simpler than the Starlight template, obviously. I’ve also tried running … Pretty similar with 2000 sidebar entries, except now weirdly …
I would think this is not directly related to network I/O but due to some streams API being used in the code. I continued exploring a little bit more in some free time with …

Starting from an example similar to yours, using the Starlight starter template, with 103 pages and 4000 sidebar entries, I get the following results, which are close to yours as expected:

Although, if I apply the following change in Astro to …

const response = await runtimeRenderPage(
  result,
  Component,
  renderContext.props,
  {},
- env.streaming,
+ false,
  renderContext.route
);

I get the following results:

Unfortunately, I had to stop here and haven't had the time yet to investigate further.
Very interesting, thanks @HiDeoo! That flag controls whether Astro renders to a stream. Could be that rendering to the format required for stream consumption is much more expensive beyond a certain threshold for some reason 🤔
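To make that a bit more concrete, here is a rough illustration (not Astro's actual code) of the two shapes a render can take: buffering every chunk into one string versus pushing each chunk through a ReadableStream. The function names and the chunk source are invented for the example.

```js
// Illustration only, not Astro internals: the two modes a streaming flag
// typically switches between when rendering a sequence of HTML chunks.

// Buffered: concatenate everything and hand back one string at the end.
async function renderBuffered(chunks) {
  let html = '';
  for await (const chunk of chunks) {
    html += chunk;
  }
  return html;
}

// Streaming: every chunk becomes a separate write into a ReadableStream.
function renderStreaming(chunks) {
  return new ReadableStream({
    async start(controller) {
      for await (const chunk of chunks) {
        controller.enqueue(chunk);
      }
      controller.close();
    },
  });
}
```

With thousands of sidebar entries, the streaming path turns a single page into a very large number of tiny writes, which matches the observation in the next comment that the bigger the stream gets, the worse it gets.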
Yeah, and it looks like the bigger the stream gets, the worse it gets, with most of the time being spent in undici.
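A quick way to poke at this outside Astro is a standalone Node script (Node 18+, run as an ES module) that pushes the same markup once through plain string concatenation and once through a ReadableStream drained via undici's Response. The chunk size and count below are arbitrary; this is not the profiling setup used in the comments above.

```js
// bench.mjs: standalone sketch, not the original profiling setup.
// Compares building a big HTML string directly vs. draining the same data
// through a ReadableStream with Response.text() (backed by undici in Node).
const CHUNKS = 50_000; // arbitrary; raise it to see how each approach scales
const chunk = '<li><a href="page">item</a></li>';

console.time('string concat');
let html = '';
for (let i = 0; i < CHUNKS; i++) html += chunk;
console.timeEnd('string concat');

console.time('stream + Response.text()');
const encoder = new TextEncoder();
const stream = new ReadableStream({
  start(controller) {
    for (let i = 0; i < CHUNKS; i++) controller.enqueue(encoder.encode(chunk));
    controller.close();
  },
});
await new Response(stream).text();
console.timeEnd('stream + Response.text()');
```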
Yeah, from my testing there won’t be much we can do in Starlight. We can try pulling a few bits of logic up to sidebar generation and out of the component to see if that helps performance, but I suspect that would give small incremental benefits rather than the big shift needed to unlock this. I did some tests where I simplified the component a bit and you still hit this inflection point eventually.

I did notice a style hoisting bug with components in MDX when I built with a patch passing …

@matthewp has also been chatting with some of the Node.js folks and they suggested Astro doesn’t need …

FWIW, some more data from my tests with a slightly simpler sidebar component (hence the higher numbers than we saw above where the trouble starts). Percent of the stack consumed by the …
| Command | Mean [s] | Min [s] | Max [s] | Relative |
|---|---|---|---|---|
| `ITEMS=1000 pnpm build` | 7.709 ± 0.325 | 7.386 | 8.564 | 1.00 |
| `ITEMS=2000 pnpm build` | 8.358 ± 0.146 | 8.104 | 8.511 | 1.08 ± 0.05 |
| `ITEMS=3000 pnpm build` | 8.693 ± 0.121 | 8.544 | 8.914 | 1.13 ± 0.05 |
| `ITEMS=4000 pnpm build` | 9.473 ± 0.175 | 9.206 | 9.834 | 1.23 ± 0.06 |
| `ITEMS=5000 pnpm build` | 16.209 ± 0.501 | 15.701 | 16.875 | 2.10 ± 0.11 |
| `ITEMS=6000 pnpm build` | 23.813 ± 1.553 | 20.433 | 25.278 | 3.09 ± 0.24 |
| `ITEMS=7000 pnpm build` | 21.604 ± 1.682 | 18.454 | 24.055 | 2.80 ± 0.25 |
| `ITEMS=8000 pnpm build` | 35.274 ± 3.218 | 32.081 | 39.589 | 4.58 ± 0.46 |
| `ITEMS=9000 pnpm build` | 46.008 ± 2.098 | 42.993 | 49.602 | 5.97 ± 0.37 |
| `ITEMS=10000 pnpm build` | 52.259 ± 2.289 | 48.252 | 55.392 | 6.78 ± 0.41 |
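One generic way to take pressure off the per-chunk cost, regardless of where the upstream fix lands, is to coalesce many tiny chunks into fewer, larger writes before they reach the stream. This is only a sketch of that general idea, not the specific change being discussed in the Astro PRs or the trick mentioned in the next comment.

```js
// Generic sketch: wrap an iterable of small string chunks so the consumer
// sees fewer, larger Uint8Array writes. The threshold is arbitrary.
function batchChunks(chunks, minChars = 16 * 1024) {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      let buffer = '';
      for await (const chunk of chunks) {
        buffer += chunk;
        if (buffer.length >= minChars) {
          controller.enqueue(encoder.encode(buffer));
          buffer = '';
        }
      }
      if (buffer) controller.enqueue(encoder.encode(buffer));
      controller.close();
    },
  });
}
```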
Yeah, this is not something I personally even tried, as I imagined it would only slightly delay the issue. The entire conversation you linked and the trick suggested by Matteo are super interesting. Thanks for sharing all this; it's great to at least already have a lead on what the issue could be and on potential improvements.
Yes, it would be cool if we uncovered some improvements for all Astro users! For the simple change you suggested of disabling streaming in SSR, I'm currently testing the preview release in withastro/astro#9603. In the existing Astro benchmarks it seems to make no difference, but I don't think they have a test like this long-list case.
Thanks again for the issue @stavros-k — it’s led to quite some investigation across the whole Astro team! There are currently two different Astro PRs looking to improve performance here: withastro/astro#9603 and withastro/astro#9614.

To round off the data mentioned above, using the preview release I saw a significant improvement in my benchmark, with the current Astro release in blue and the experimental preview release with streaming disabled (withastro/astro#9603) in red:

I can hopefully run the same benchmark for withastro/astro#9614 soon.

Given these investigations, I hope it’s OK if I close this issue here, as it seems to be something that needs improving upstream. I’ll be monitoring those two PRs and pushing for these improvements there!
What version of starlight are you using? 0.15.1
What version of astro are you using? 4.0.8
What package manager are you using? npm
What operating system are you using? Linux
What browser are you using? Chrome
Describe the Bug
Autogenerating the sidebar with a lot of pages impacts performance by a large amount.
A little backstory:
I'm doing a PoC to convert a Docusaurus site (https://truecharts.org / https://github.com/truecharts/website/) to Starlight. I started by dumping in all the .md files and running a build (fixed a few issues with missing frontmatter, etc.).
I instantly got an insanely fast build, under 40 seconds, compared to the 20+ minutes we had before with the cache prefilled.
Then I enabled sidebar autogeneration and things got weird :D
No sidebar:
Sidebar:
Keep in mind that most of the helm-security.md files are empty, just a title in the frontmatter and nothing else (I had to disable a generation step due to other, unrelated perf issues).
Some more info can be gathered from the Discord chat: https://discord.com/channels/830184174198718474/1070481941863878697/1190075061797924974
I'd love to see some performance improvement regarding this!
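For reference, the sidebar autogeneration described above is enabled through Starlight's sidebar config. The snippet below is a sketch: the group label and directory name are invented, not taken from the TrueCharts repo.

```js
// astro.config.mjs: sketch of an autogenerated sidebar group.
// 'Charts' and the 'charts' directory are placeholder names for illustration.
import { defineConfig } from 'astro/config';
import starlight from '@astrojs/starlight';

export default defineConfig({
  integrations: [
    starlight({
      title: 'TrueCharts',
      sidebar: [
        // Every page under src/content/docs/charts/ becomes a sidebar entry,
        // so thousands of Markdown files mean thousands of sidebar items.
        { label: 'Charts', autogenerate: { directory: 'charts' } },
      ],
    }),
  ],
});
```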
Link to Minimal Reproducible Example
Participation