
[bug] ./gotosocial --help hangs for more than 30s #3122

Closed
xnuk opened this issue Jul 20, 2024 · 6 comments · Fixed by #3185
Labels: bug, performance
Comments

xnuk commented Jul 20, 2024

Describe the bug

The ./gotosocial binary built from the latest main branch (409b398) hangs on amd64; v0.16.0 doesn't hang. I ran git bisect and found that cde2fb6 (#3090) introduced the problem.

What's your GoToSocial Version?

409b398

GoToSocial Arch

amd64, ./script/build.sh

What happened?

./gotosocial --help hits >99% CPU usage and hangs for >30s.

What did you expect to happen?

The help message should be shown within a second.

How to reproduce it?

$ git switch cde2fb6244a791b3c5b746112e3a8be3a79f39a4 --detach
$ ./script/build.sh
$ time ./gotosocial --help  # hangs
GoToSocial - a fediverse social media server

[omitted]

________________________________________________________
Executed in   29.07 secs    fish           external
   usr time   28.26 secs    0.00 millis   28.26 secs
   sys time    0.48 secs    1.13 millis    0.48 secs

Anything else we need to know?

  • go version reports go1.22.5 linux/amd64; the issue is also reproducible with Go 1.22.2.
  • Using Arch Linux x64 (6.9.10-zen1-1-zen).
tsmethurst (Contributor) commented:

Will check it out, probably something to do with our wasm ffmpeg compilation.

xnuk commented Jul 21, 2024

> probably something to do with our wasm ffmpeg compilation.

Yes, I guess so too. Removing {ffmpeg,ffprobe}Pool in internal/media/ffmpeg/{ffmpeg,ffprobe}.go fixes this issue.

The patch
diff --git a/internal/media/ffmpeg/ffmpeg.go b/internal/media/ffmpeg/ffmpeg.go
index 25332398..3fe6a960 100644
--- a/internal/media/ffmpeg/ffmpeg.go
+++ b/internal/media/ffmpeg/ffmpeg.go
@@ -20,64 +20,17 @@ package ffmpeg
 import (
 	"context"
 
-	ffmpeglib "codeberg.org/gruf/go-ffmpreg/embed/ffmpeg"
 	"codeberg.org/gruf/go-ffmpreg/wasm"
-
-	"github.com/tetratelabs/wazero"
-	"github.com/tetratelabs/wazero/imports/wasi_snapshot_preview1"
 )
 
 // InitFfmpeg initializes the ffmpeg WebAssembly instance pool,
 // with given maximum limiting the number of concurrent instances.
 func InitFfmpeg(ctx context.Context, max int) error {
 	initCache() // ensure compilation cache initialized
-	return ffmpegPool.Init(ctx, max)
+	return nil
 }
 
 // Ffmpeg runs the given arguments with an instance of ffmpeg.
 func Ffmpeg(ctx context.Context, args wasm.Args) (uint32, error) {
-	return ffmpegPool.Run(ctx, args)
-}
-
-var ffmpegPool = wasmInstancePool{
-	inst: wasm.Instantiator{
-
-		// WASM module name.
-		Module: "ffmpeg",
-
-		// Per-instance WebAssembly runtime (with shared cache).
-		Runtime: func(ctx context.Context) wazero.Runtime {
-
-			// Prepare config with cache.
-			cfg := wazero.NewRuntimeConfig()
-			cfg = cfg.WithCoreFeatures(ffmpeglib.CoreFeatures)
-			cfg = cfg.WithCompilationCache(cache)
-
-			// Instantiate runtime with our config.
-			rt := wazero.NewRuntimeWithConfig(ctx, cfg)
-
-			// Prepare default "env" host module.
-			env := rt.NewHostModuleBuilder("env")
-
-			// Instantiate "env" module in our runtime.
-			_, err := env.Instantiate(context.Background())
-			if err != nil {
-				panic(err)
-			}
-
-			// Instantiate the wasi snapshot preview 1 in runtime.
-			_, err = wasi_snapshot_preview1.Instantiate(ctx, rt)
-			if err != nil {
-				panic(err)
-			}
-
-			return rt
-		},
-
-		// Per-run module configuration.
-		Config: wazero.NewModuleConfig,
-
-		// Embedded WASM.
-		Source: ffmpeglib.B,
-	},
+	return 0, nil
 }
diff --git a/internal/media/ffmpeg/ffprobe.go b/internal/media/ffmpeg/ffprobe.go
index 19582450..cf2cbf61 100644
--- a/internal/media/ffmpeg/ffprobe.go
+++ b/internal/media/ffmpeg/ffprobe.go
@@ -20,64 +20,17 @@ package ffmpeg
 import (
 	"context"
 
-	ffprobelib "codeberg.org/gruf/go-ffmpreg/embed/ffprobe"
 	"codeberg.org/gruf/go-ffmpreg/wasm"
-
-	"github.com/tetratelabs/wazero"
-	"github.com/tetratelabs/wazero/imports/wasi_snapshot_preview1"
 )
 
 // InitFfprobe initializes the ffprobe WebAssembly instance pool,
 // with given maximum limiting the number of concurrent instances.
 func InitFfprobe(ctx context.Context, max int) error {
 	initCache() // ensure compilation cache initialized
-	return ffprobePool.Init(ctx, max)
+	return nil
 }
 
 // Ffprobe runs the given arguments with an instance of ffprobe.
 func Ffprobe(ctx context.Context, args wasm.Args) (uint32, error) {
-	return ffprobePool.Run(ctx, args)
-}
-
-var ffprobePool = wasmInstancePool{
-	inst: wasm.Instantiator{
-
-		// WASM module name.
-		Module: "ffprobe",
-
-		// Per-instance WebAssembly runtime (with shared cache).
-		Runtime: func(ctx context.Context) wazero.Runtime {
-
-			// Prepare config with cache.
-			cfg := wazero.NewRuntimeConfig()
-			cfg = cfg.WithCoreFeatures(ffprobelib.CoreFeatures)
-			cfg = cfg.WithCompilationCache(cache)
-
-			// Instantiate runtime with our config.
-			rt := wazero.NewRuntimeWithConfig(ctx, cfg)
-
-			// Prepare default "env" host module.
-			env := rt.NewHostModuleBuilder("env")
-
-			// Instantiate "env" module in our runtime.
-			_, err := env.Instantiate(context.Background())
-			if err != nil {
-				panic(err)
-			}
-
-			// Instantiate the wasi snapshot preview 1 in runtime.
-			_, err = wasi_snapshot_preview1.Instantiate(ctx, rt)
-			if err != nil {
-				panic(err)
-			}
-
-			return rt
-		},
-
-		// Per-run module configuration.
-		Config: wazero.NewModuleConfig,
-
-		// Embedded WASM.
-		Source: ffprobelib.B,
-	},
+	return 0, nil
 }

xnuk commented Jul 21, 2024

Importing testrig loads the ffmpeg WASM (before main() is even called), and that makes ./gotosocial --help slower. Clearing out the init() in testrig/config.go also fixes this issue:

func init() {
	ctx := context.Background()

	// Ensure global ffmpeg WASM pool initialized.
	if err := ffmpeg.InitFfmpeg(ctx, 1); err != nil {
		panic(err)
	}

	// Ensure global ffprobe WASM pool initialized.
	if err := ffmpeg.InitFfprobe(ctx, 1); err != nil {
		panic(err)
	}
}

xnuk commented Jul 21, 2024

Compiling ffmpeg.wasm takes around 14±2 s on my machine. There are currently no log messages for this compilation in GoToSocial, so it's easy to be confused about why the binary seems stuck:

package main

import (
	"context"
	"fmt"
	"time"

	"codeberg.org/gruf/go-ffmpreg/embed/ffmpeg"
	"github.com/tetratelabs/wazero"
)

func main() {
	ctx := context.Background()
	rt := wazero.NewRuntime(ctx)
	defer rt.Close(ctx)

	// Measure how long wazero takes to compile the embedded ffmpeg module.
	t := time.Now()
	_, err := rt.CompileModule(ctx, ffmpeg.B)
	fmt.Printf("%.3f seconds\n", time.Since(t).Seconds())
	if err != nil {
		panic(err)
	}
}
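For comparison, wazero can also persist compiled modules to disk so repeat runs skip recompilation. A minimal sketch along the lines of the pool config in the patch above; the cache directory path here is purely illustrative, and this is not necessarily how GoToSocial's initCache() is set up:

package main

import (
	"context"
	"fmt"
	"time"

	"codeberg.org/gruf/go-ffmpreg/embed/ffmpeg"
	"github.com/tetratelabs/wazero"
)

func main() {
	ctx := context.Background()

	// File-backed compilation cache; "/tmp/wazero-cache" is an illustrative path.
	cache, err := wazero.NewCompilationCacheWithDir("/tmp/wazero-cache")
	if err != nil {
		panic(err)
	}

	cfg := wazero.NewRuntimeConfig().
		WithCoreFeatures(ffmpeg.CoreFeatures). // same feature set the pool config above uses
		WithCompilationCache(cache)
	rt := wazero.NewRuntimeWithConfig(ctx, cfg)
	defer rt.Close(ctx)

	// First run pays the full compile cost; subsequent runs hit the on-disk cache.
	t := time.Now()
	if _, err := rt.CompileModule(ctx, ffmpeg.B); err != nil {
		panic(err)
	}
	fmt.Printf("%.3f seconds\n", time.Since(t).Seconds())
}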

daenney commented Aug 2, 2024

This is a bit odd. We only trigger the WASM compilation in the server start action, so it really should not be affecting other subcommands, or -h in general. That's also why there's no log message: it's emitted by server start.

But somehow we still end up calling one of the WASM compilation functions, or at least causing them to trigger.
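One way to see what's triggering it (a generic Go debugging aid, not something used in this thread): the runtime can trace package init durations, so a slow WASM compile that runs before main() shows up under its package name.

$ GODEBUG=inittrace=1 ./gotosocial --help   # prints one timing line per package init() to stderr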

daenney commented Aug 2, 2024

> Importing testrig loads the ffmpeg WASM (before main() is even called), and that makes ./gotosocial --help slower. Clearing out the init() in testrig/config.go also fixes this issue.

Ah, yes. That we can fix. Testrig's init() gets evaluated at import time, so it triggers the WASM compilation.
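One possible shape for a fix (a rough sketch only, not necessarily what the eventual PR does): defer the pool setup until the first actual ffmpeg call, so merely importing testrig stays cheap. ensureFfmpeg below is a hypothetical helper wrapping the ffmpegPool and initCache shown in the patch above.

package ffmpeg

import (
	"context"
	"sync"
)

// Hypothetical lazy wrapper: nothing is compiled at import time; the
// expensive WASM compilation happens on first use instead.
var (
	ffmpegOnce sync.Once
	ffmpegErr  error
)

func ensureFfmpeg(ctx context.Context, max int) error {
	ffmpegOnce.Do(func() {
		initCache()                           // shared compilation cache, as in the patch above
		ffmpegErr = ffmpegPool.Init(ctx, max) // compiles ffmpeg.wasm and fills the instance pool
	})
	return ffmpegErr
}

// Ffmpeg would then call ensureFfmpeg(ctx, ...) before ffmpegPool.Run(ctx, args),
// instead of relying on an init()-time InitFfmpeg call.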
