More than 3x slower download speed with sqlite storage #890

Open
mahbubmaruf178 opened this issue Jan 15, 2024 · 9 comments

Comments


mahbubmaruf178 commented Jan 15, 2024

storage, err := sqliteStorage.NewDirectStorage(sqliteStorage.NewDirectStorageOpts{})
if err != nil {
	panic(err)
}
defer storage.Close()
config := torrent.NewDefaultClientConfig()
config.DefaultStorage = storage

This is my storage setup for reading torrent data through a reader only, without saving it to disk (like streaming).
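For context, a minimal sketch of the streaming side, assuming a *torrent.Client built with the config above; uploadToCloud is a hypothetical stand-in for whatever consumes the reader:

// import "github.com/anacrolix/torrent"
func streamFirstFile(client *torrent.Client, magnetURI string) error {
	t, err := client.AddMagnet(magnetURI)
	if err != nil {
		return err
	}
	<-t.GotInfo() // wait for metadata before listing files
	f := t.Files()[0]
	r := f.NewReader() // pieces flow through the configured storage, never written out as regular files
	defer r.Close()
	r.SetReadahead(f.Length() / 100) // keep a window of pieces downloading ahead of the upload position
	return uploadToCloud(f.DisplayPath(), r) // hypothetical upload helper that takes an io.Reader
}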

Here is the download speed with the sqlite storage vs the default storage:

sqlite storage:
[screenshot: download speed with sqlite storage]

default storage:
[screenshot: download speed with default storage]

So, how can I speed up streaming without saving data to disk?

@anacrolix (Owner)

The defaults for the sqlite storage aren't optimal. You can tune those (and get 10-20x improvement), but additionally you can set sqlite storage to be entirely in memory. I'll provide the details soon.
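In the meantime, a minimal sketch of the fully in-memory variant; the Memory field name here is an assumption, so check squirrel.NewCacheOpts (which the direct storage options appear to mirror) for the exact option in your version:

var opts sqliteStorage.NewDirectStorageOpts
opts.Memory = true      // assumed flag: keep the whole cache in RAM instead of a db file
opts.Capacity = 1 << 30 // bound memory use; older pieces are evicted past ~1 GiB
storage, err := sqliteStorage.NewDirectStorage(opts)
if err != nil {
	panic(err)
}
defer storage.Close()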


mahbubmaruf178 commented Jan 15, 2024

If the torrent is around 5 GB, is the whole thing stored in memory? 🤔
If the data is stored to disk in ~20 MB chunks instead, would that give the best speed?

@anacrolix (Owner)

Sorry for the delay on this: Here are the defaults I use in https://www.coveapp.info/ for a squirrel.Cache, the implementation behind the direct sqlite storage:

cacheOpts.SetAutoVacuum = g.Some("full")
cacheOpts.SetJournalMode = "wal"
cacheOpts.SetSynchronous = 0
cacheOpts.Path = "squirrel2.db"
cacheOpts.Capacity = 9 << 30
cacheOpts.MmapSizeOk = true
cacheOpts.MmapSize = 64 << 20
cacheOpts.CacheSize = g.Some[int64](-32 << 20)
cacheOpts.SetLockingMode = "normal"
cacheOpts.JournalSizeLimit.Set(1 << 30)
cacheOpts.MaxPageCount.Set(15 << 30 >> 12)

This essentially says: Allow concurrent reads and a single writer (with decreased transaction overhead), don't bother to flush to disk on writes (it's a cache), store all the data in a file called squirrel2.db, limit the file to 9 GiB. Memory map the first 64 MiB of data (I think). Keep 32 MiB of the database in memory at most. Allow regular transactions. Don't let the journal get over 1 GiB in size. If the file gets over 15 GiB, return an error.

Many of these settings should be the default. Take a look at squirrel.NewCacheOpts; there's plenty of stuff in there, including exclusive mode and memory mode, which will give you even better performance.
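Putting that together, a sketch of wiring these options into the client; this assumes sqliteStorage.NewDirectStorageOpts exposes the same fields as squirrel.NewCacheOpts (g is github.com/anacrolix/generics), so verify the field names against your version:

import (
	g "github.com/anacrolix/generics"
	"github.com/anacrolix/torrent"
	sqliteStorage "github.com/anacrolix/torrent/storage/sqlite"
)

func newTunedClient() (*torrent.Client, error) {
	var cacheOpts sqliteStorage.NewDirectStorageOpts
	cacheOpts.SetAutoVacuum = g.Some("full")
	cacheOpts.SetJournalMode = "wal" // concurrent reads, single writer
	cacheOpts.SetSynchronous = 0     // don't flush to disk on writes; it's only a cache
	cacheOpts.Path = "squirrel2.db"
	cacheOpts.Capacity = 9 << 30 // limit the cache to 9 GiB
	cacheOpts.MmapSizeOk = true
	cacheOpts.MmapSize = 64 << 20
	cacheOpts.CacheSize = g.Some[int64](-32 << 20) // keep at most ~32 MiB of the db in memory (per the explanation above)
	cacheOpts.SetLockingMode = "normal"
	cacheOpts.JournalSizeLimit.Set(1 << 30)
	cacheOpts.MaxPageCount.Set(15 << 30 >> 12) // error past 15 GiB, assuming 4 KiB pages
	storage, err := sqliteStorage.NewDirectStorage(cacheOpts)
	if err != nil {
		return nil, err
	}
	cfg := torrent.NewDefaultClientConfig()
	cfg.DefaultStorage = storage
	return torrent.NewClient(cfg)
}

The returned storage also has a Close method worth calling on shutdown, as in the snippet at the top of the thread.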


mahbubmaruf178 commented Jan 20, 2024

Yes, download speeds are improved, but still not sufficient.
I noticed that bolt storage has a fast download speed. Does boltdb have a size limit, or is it possible to set one?

@anacrolix (Owner)

No, the bolt DB implementation provided in anacrolix/torrent doesn't include size limits, or any cache eviction.

@anacrolix (Owner)

Did you want to try with https://github.com/anacrolix/possum?

The Go interface is here: https://pkg.go.dev/github.com/anacrolix/possum/go.

You can use the resource.Provider interface in https://pkg.go.dev/github.com/anacrolix/possum/go/resource with https://pkg.go.dev/github.com/anacrolix/torrent/storage#NewResourcePieces. It does require that you compile a Rust library.

In my testing it's not currently faster than using squirrel, but it is heading that way.

I'm not sure in general why you're not happy with the other storage backends; I've not seen them be bottlenecks before. If you have more information you could share, please do (a torrent/magnet link, for example).

@mahbubmaruf178 (Author)

I'm working on a project where users can upload a torrent file to their own cloud storage (OneDrive, pCloud, Storj, Wasabi, etc.). I made an API that requires a file reader to upload the file. In my case I get good speed with everything except sqlite and filecache (maybe).

@anacrolix (Owner)

Can you just pick one storage backend and go with that? Any reason you need the sqlite or filecache ones?

@mahbubmaruf178 (Author)

Because cheap VPSes have little storage, I'm trying to upload as a stream without saving the file to disk.
