Transmission client crashes randomly #4736

Closed

K92Pi opened this issue Sep 15, 2021 · 3 comments

Comments

@K92Pi

K92Pi commented Sep 15, 2021

Creating a bug report/issue

Required Information

  • DietPi version | cat /boot/dietpi/.version

G_DIETPI_VERSION_CORE=7
G_DIETPI_VERSION_SUB=5
G_DIETPI_VERSION_RC=2
G_GITBRANCH='master'
G_GITOWNER='MichaIng'
G_LIVE_PATCH_STATUS[0]='applied'

  • Distro version | echo $G_DISTRO_NAME or cat /etc/debian_version
    11.0

  • Kernel version | uname -a
    Linux MediaServer 5.10.60-v7l+ #1449 SMP Wed Aug 25 15:00:44 BST 2021 armv7l GNU/Linux

  • SBC model | echo $G_HW_MODEL_NAME or (EG: RPi3)
    RPi 4 Model B (armv7l)

  • Power supply used | (EG: 5V 1A RAVpower)

  • SDcard used | (EG: SanDisk ultra)
    Sandisk flash drive

Additional Information (if applicable)

  • Software title | (EG: Nextcloud)
    Transmission

  • Was the software title installed freshly or updated/migrated?
    Fresh OS install

  • Can this issue be replicated on a fresh installation of DietPi?
    Yes

  • Bug report ID | echo $G_HW_UUID
    25767c1a-8ba0-431e-a9f2-b4994f768d64

Hi!
I'm having an issue with Transmission on my Pi. Lately the client crashes all the time with very few active torrents (<10). I've noticed it consumes 800+ MB of memory and all of a sudden becomes unresponsive.
In the log I've found the following:

Sep 14 17:04:17 MediaServer transmission-daemon[3484]: [2021-09-14 17:04:17.344] XXX S01-S09 Piece 389, which was just downloaded, failed its checksum test (torrent.c:3466) - a lot of these
Sep 14 17:01:58 MediaServer transmission-daemon[3412]: [2021-09-14 17:01:58.621] XXX S01-S09 Piece 1574, which was just downloaded, failed its checksum test (torrent.c:3466) - a lot of these

...and when it crashes:

Sep 14 17:06:25 MediaServer systemd[1]: transmission-daemon.service: Main process exited, code=killed, status=9/KILL
Sep 14 17:06:25 MediaServer systemd[1]: transmission-daemon.service: Failed with result 'signal'.
Sep 14 17:06:25 MediaServer systemd[1]: transmission-daemon.service: Consumed 51.992s CPU time. 

dmesg

[ 1930.305925] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/,task=transmission-da,pid=3484,uid=107
[ 1930.306020] Out of memory: Killed process 3484 (transmission-da) total-vm:794436kB, anon-rss:711084kB, file-rss:6992kB, shmem-rss:0kB, UID:107 pgtables:1516kB oom_score_adj:0
[ 1934.807388] oom_reaper: reaped process 3484 (transmission-da), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB

I'm running the latest Transmission client. Could you please help me with this issue?
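Not part of the original report, but as a rough way to capture the growth before the kill, one could log the daemon's resident memory in a loop and compare the timestamps with dmesg; the process-name match and the 30-second interval below are just assumptions:

# Hypothetical monitoring loop: record VmRSS until the daemon dies.
PID=$(pgrep -o transmission-da)   # kernel comm name is truncated to 15 chars
while kill -0 "$PID" 2>/dev/null; do
    printf '%s VmRSS: %s kB\n' "$(date '+%F %T')" \
        "$(awk '/^VmRSS/ {print $2}' "/proc/$PID/status")"
    sleep 30
done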

@Joulinar
Collaborator

Hi,

Do you have an external disk connected to your system? If yes, it could be similar to issue #4622.
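A quick way to check this, assuming the Debian default settings.json location and the usual DietPi userdata path (adjust the paths if your setup differs):

# Where do downloads land, and on which filesystem?
lsblk -o NAME,FSTYPE,SIZE,MOUNTPOINT
grep download-dir /etc/transmission-daemon/settings.json
findmnt -T /mnt/dietpi_userdata/downloads    # assumed default download path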

@MichaIng
Owner

On the other hand, when it consumes 800 MiB+ of RAM, this may be a correct intervention of the OOM killer, which the value "total-vm:794436kB" also matches. Is the total available memory (including the swap file) exceeded when Transmission consumes 800 MiB? Everything else then happens as expected: the OOM killer sends SIGKILL and the systemd service reports this as the exit signal.
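For reference, a few standard commands to check whether RAM plus swap can actually hold ~800 MiB of daemon RSS (nothing DietPi-specific assumed):

free -h                           # total RAM and swap vs. current usage
swapon --show                     # active swap devices/files, if any
dmesg | grep -i 'out of memory'   # earlier OOM kills, if any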

The question is why Transmission consumes that much memory in the first place with <10 torrents. I remember there was a similar issue in the past: it appeared first on Stretch, seemed resolved with Buster, then there were still rare reports on Buster, and now apparently still on Bullseye? Transmission is meant to be a very lightweight BitTorrent daemon, so these memory leaks defeat that aim 🤔.

See: #2413
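Not an official fix, but as a possible stop-gap while the leak is investigated, one could cap the service with a systemd drop-in so that a runaway transmission-daemon is killed and restarted early instead of exhausting the whole system; the limits below are guesses and require systemd memory accounting:

# Hypothetical workaround: contain the leak and auto-restart the daemon.
sudo mkdir -p /etc/systemd/system/transmission-daemon.service.d
sudo tee /etc/systemd/system/transmission-daemon.service.d/memlimit.conf <<'EOF'
[Service]
MemoryHigh=512M
MemoryMax=640M
Restart=on-failure
RestartSec=10
EOF
sudo systemctl daemon-reload
sudo systemctl restart transmission-daemon

With Restart=on-failure, a SIGKILL counts as a failure, so the daemon would come back up on its own after about ten seconds.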

@MichaIng
Owner

I'll mark this as closed since there has been no further reply. Feel free to reopen if the issue persists, especially if the available system memory is actually sufficient. If it is indeed #4622, please continue the discussion there.
