
Possible memory leak since 9.0 #1936

Closed
fcheung opened this issue Apr 11, 2023 · 14 comments
Labels: bug, community

Comments

fcheung commented Apr 11, 2023

Description

After updating from 8.16 to 9.0 we noticed our Ruby processes had steady memory growth. This occurs only in one of our applications, despite all of them having similar environments in terms of Ruby version, Rails version, etc.

This screenshot shows Ruby process memory usage over time on one of our dev servers over a weekend (i.e. no traffic other than load balancer health checks, and maybe ActionCable pings if anyone had left a window open).

[Screenshot 2023-04-11 at 13:06:57: Ruby process memory usage climbing steadily over the weekend]

After we rolled back to 8.16, this issue went away.

One thing that is different about this application that springs to mind is that it uses ActionCable, i.e. it has extra threads running in the background, and 9.0 defaulted thread instrumentation to on, so maybe that is relevant?

Steps to Reproduce

working on it ...

Your Environment

  • alpine linux 3.16
  • ruby 3.2.1
  • rails 7.0.4.2
  • agent 9.0.0
  • server: puma 6.1.1

Additional context

I did find one other person mentioning this: #1842 (comment)


fcheung added the bug label Apr 11, 2023
github-actions bot added the community label Apr 11, 2023
@hannahramadan (Contributor)

Hi @fcheung 👋 Thank you for sharing this issue! You mentioned 9.0.0 enabling thread instrumentation by default and we're wondering if this is related to what you're seeing. To help narrow it down, would you be willing to try disabling thread instrumentation while using Ruby agent 9.0.0?

To do this, add the following to your newrelic.yml:

# newrelic.yml
instrumentation.thread.tracing: false

Alternatively, you can disable thread instrumentation with the following environment variable:

NEW_RELIC_INSTRUMENTATION_THREAD_TRACING=false
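
If you want to double-check which value the agent actually picked up, something like this from a Rails console should show it (this relies on the agent's standard config accessor; treat it as a quick sanity check):

# Rails console, after the agent has loaded
NewRelic::Agent.config[:'instrumentation.thread.tracing']
# => false once the override is in effect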

We're also wondering if you have long-running/indefinite transactions? Thank you for mentioning that you're working on a reproduction. That should help us get closer to a resolution!

cirdes commented Apr 12, 2023

@fcheung I have experienced the same.

fcheung (Author) commented Apr 12, 2023

Hi,

I've set NEW_RELIC_INSTRUMENTATION_THREAD_TRACING=true on our app still running New Relic 8.16 to see whether thread tracing is the culprit - leaving that to settle for a while.

In the meantime I've got this demo app that seems to reproduce the issue: https://github.com/fcheung/leak_app

I've been running this demo app locally, in production mode with a valid license key, and the data is showing up in New Relic.

The app has one page at / that opens an action cable connection. It doesn't do anything with this connection, but rails sends 'ping' messages every 3 seconds to all connected clients. I've been testing this with a dozen or so tabs all open on this page.

If you run the app with the TRACE_ALLOCATIONS environment variable set, it will record object allocations, and visiting /dump will generate a dump in tmp. After running the app for 25 minutes or so, the stats I get out of heapy look like this (output from heapy r path/to/file all --lines 20):

Analyzing Heap (Generation: all)
-------------------------------

allocated by memory (16504705) (in bytes)
==============================
  2097992  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/thread_pool.rb:307
  1059600  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:260
  1049152  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/timeout-0.3.2/lib/timeout.rb:101
  1049016  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/plugin.rb:66
  1049016  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/server.rb:261
  1049016  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/reactor.rb:37
   359440  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/tracer.rb:241
   339920  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:40
   258092  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionpack-7.0.4.3/lib/action_dispatch/routing/mapper.rb:
   252760  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/class/attribute.rb:127
   116780  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/module/delegation.rb:238
    93584  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/bootsnap-1.16.0/lib/bootsnap/compile_cache/iseq.rb:49
    88721  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/rack-2.2.6.4/lib/rack/request.rb:
    88128  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionview-7.0.4.3/lib/action_view/ripper_ast_parser.rb:129
    87040  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/ripper/sexp.rb:117
    84960  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/guid_generator.rb:25
    83366  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/cluster.rb:
    83273  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/json/common.rb:216
    81617  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/binder.rb:
    80920  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/launcher.rb:
    78796  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/dsl.rb:
    72024  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/server.rb:
    70396  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionpack-7.0.4.3/lib/action_controller/metal/request_forgery_protection.rb:
    65576  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/net/http/response.rb:632
    64996  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/request.rb:

object count (73940)
==============================
  26490  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:260
   1125  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/tracer.rb:241
   1123  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:40
   1109  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/rack-2.2.6.4/lib/rack/mime.rb:
   1062  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/guid_generator.rb:25
    982  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/class/attribute.rb:127
    526  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionpack-7.0.4.3/lib/action_dispatch/routing/mapper.rb:
    482  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/delegate.rb:418
    432  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionview-7.0.4.3/lib/action_view/ripper_ast_parser.rb:129
    408  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/ripper/sexp.rb:117
    378  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/module/redefine_method.rb:20
    359  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/x86_64-darwin22/ripper.bundle:
    357  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/client.rb:237
    319  <internal:/Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/rubygems/core_ext/kernel_require.rb>:37
    298  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/module/delegation.rb:238
    289  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/bootsnap-1.16.0/lib/bootsnap/compile_cache/iseq.rb:49
    244  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/bootsnap-1.16.0/lib/bootsnap/compile_cache/yaml.rb:173
    241  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/delegate.rb:347
    235  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/json/common.rb:216
    229  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/i18n-1.12.0/lib/i18n/backend/transliterator.rb:
    210  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/concern.rb:135
    208  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionview-7.0.4.3/lib/action_view/ripper_ast_parser.rb:139
    204  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/ripper/sexp.rb:126
    197  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/launcher.rb:
    193  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/cluster.rb:

High Ref Counts
==============================

  26489  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:40
   5692  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/tracer.rb:241
   2383  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/client.rb:69
   2181  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actioncable-7.0.4.3/lib/action_cable/connection/base.rb:153
   2147  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/class/attribute.rb:127
   2058  <internal:/Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/rubygems/core_ext/kernel_require.rb>:37
   1517  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionpack-7.0.4.3/lib/action_dispatch/routing/mapper.rb:
   1502  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/tracer.rb:415
   1446  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/delegate.rb:418
   1262  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/rack-2.2.6.4/lib/rack/mime.rb:51
   1140  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/module/redefine_method.rb:20
   1123  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction.rb:217
    897  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionpack-7.0.4.3/lib/action_controller/metal/request_forgery_protection.rb:
    853  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/x86_64-darwin22/ripper.bundle:
    832  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/ripper/sexp.rb:117
    781  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionpack-7.0.4.3/lib/action_dispatch/http/content_security_policy.rb:
    772  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionview-7.0.4.3/lib/action_view/ripper_ast_parser.rb:129
    723  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/delegate.rb:347
    702  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/concern.rb:135
    669  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/actionview-7.0.4.3/lib/action_view/helpers/tags/base.rb:
    639  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/railties-7.0.4.3/lib/rails/application/default_middleware_stack.rb:
    628  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/rack-2.2.6.4/lib/rack/request.rb:
    588  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/activesupport-7.0.4.3/lib/active_support/core_ext/module/delegation.rb:238
    568  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/error_logger.rb:
    552  /Users/fred/.rbenv/versions/3.2.2/lib/ruby/3.2.0/ripper/core.rb:49

Checking the earlier dumps shows the counts associated with newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:260 steadily increasing. The heap diff between the dump I took shortly after boot and just now shows (first few entries):

fred@FredericksMBP3-2 leak_app % heapy diff tmp/dump-2023-04-12T17:37:57+01:00 tmp/dump-2023-04-12T18:03:30+01:00 
Allocated ARRAY 25821 objects of size 1032840/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:260
Allocated ARRAY 1040 objects of size 41600/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/transaction/abstract_segment.rb:40
Allocated OBJECT 993 objects of size 317760/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/tracer.rb:241
Allocated STRING 950 objects of size 76000/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/guid_generator.rb:25
Allocated STRING 60 objects of size 4800/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/guid_generator.rb:26
Allocated STRING 32 objects of size 3022/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/client.rb:237
Allocated DATA 23 objects of size 1656/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/stats.rb:18
Allocated OBJECT 23 objects of size 1840/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/stats_engine/stats_hash.rb:38
Allocated SHAPE 22 objects of size 2024/1525236 (in bytes) at: :
Allocated IMEMO 8 objects of size 328/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/agent_helpers/harvest.rb:36
Allocated OBJECT 8 objects of size 720/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/concurrent-ruby-1.2.2/lib/concurrent-ruby/concurrent/synchronization/safe_initialization.rb:30
Allocated DATA 8 objects of size 512/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/concurrent-ruby-1.2.2/lib/concurrent-ruby/concurrent/synchronization/mutex_lockable_object.rb:33
Allocated DATA 8 objects of size 576/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/concurrent-ruby-1.2.2/lib/concurrent-ruby/concurrent/synchronization/mutex_lockable_object.rb:32
Allocated STRING 6 objects of size 720/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/stats_engine/stats_hash.rb:38
Allocated ARRAY 6 objects of size 240/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/event_buffer.rb:30
Allocated IMEMO 6 objects of size 240/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/newrelic_rpm-9.1.0/lib/new_relic/agent/agent_helpers/harvest.rb:46
Allocated ARRAY 4 objects of size 160/1525236 (in bytes) at: /Users/fred/.rbenv/versions/3.2.2/lib/ruby/gems/3.2.0/gems/puma-5.6.5/lib/puma/server.rb:333

I've also run another copy of the app with NEW_RELIC_INSTRUMENTATION_THREAD_TRACING=false. In this copy there is nothing from New Relic in the heapy dump's top allocated/object count/ref count sections, and the heap diff over 10 minutes or so doesn't seem to have much that isn't just normal churn.
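
In case anyone wants to do the same kind of heap inspection in their own app, the TRACE_ALLOCATIONS hook and /dump endpoint boil down to roughly the following (a minimal sketch using Ruby's ObjectSpace; the initializer path and variable names here are illustrative, see the repo for the actual code):

# config/initializers/trace_allocations.rb
if ENV['TRACE_ALLOCATIONS']
  require 'objspace'
  ObjectSpace.trace_object_allocations_start
end

# The /dump action then writes a heap dump into tmp/ for heapy to read:
require 'objspace'
path = Rails.root.join('tmp', "dump-#{Time.now.iso8601}")
File.open(path, 'w') { |f| ObjectSpace.dump_all(output: f) }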

@hannahramadan (Contributor)

Thanks for sharing a reproduction of the issue @fcheung. We're working on setting it up and hope to get back to you soon!
cc @cirdes

fcheung (Author) commented Apr 17, 2023

I can confirm that I get the same behaviour with 8.16.0 and NEW_RELIC_INSTRUMENTATION_THREAD_TRACING=true

@bendilley

I can confirm that we had the same problem with newrelic 9.0.0, ruby 3.2.1, rails 7.0.4.3, puma 6.1.1. We've disabled newrelic to solve the problem.

@hannahramadan (Contributor)

Thanks for confirming @fcheung!

@bendilley - it looks like thread tracing is contributing to this issue. While we work on a fix, you can disable thread tracing and continue to use the agent by adding the following to your newrelic.yml:

# newrelic.yml
instrumentation.thread.tracing: false

Alternatively, you can disable thread instrumentation with the following environment variable:

NEW_RELIC_INSTRUMENTATION_THREAD_TRACING=false

travisbell commented Apr 19, 2023

Hey guys, for what it's worth: after 9.2.0 was released I re-enabled thread tracing, and while my processes don't hang anymore (yay! issue #1910 is fixed), I am now seeing this as well.

@tannalynn (Contributor)

I've prepared a tag that appears to fix the issue for me when using the reproduction @fcheung provided (thank you so much for providing that! It's a huge help and made debugging the issue much easier).

Here are some screenshots of the memory usage:

This one is multiple instances of the repro app, each using a different agent tag. The tags were 9.2.0, 9.2.0 with thread tracing disabled, and a couple of debugging/bugfix attempts. After running for about 24 hours, you can see that the original 9.2.0 memory usage is continuously increasing, while the debug/bugfix attempt tags all show flat memory usage, the same as with thread tracing disabled!
[Screenshot 2023-04-21 at 9:09:37 AM: ~24 hours of memory usage per agent tag; 9.2.0 climbs steadily while the other tags stay flat]

This screenshot is 9.2.0, 9.2.0 with thread tracing disabled, and my final bugfix (the same as the previous tags but without the extra debugging logs/code). It covers a shorter period of time, but you can see that 9.2.0 is beginning to increase while the other two remain flat. This is the tag I have ready for others to try out.
[Screenshot 2023-04-21 at 11:33:24 AM: memory usage for 9.2.0, 9.2.0 with thread tracing disabled, and the bugfix tag over a shorter window]

It would be great to see whether others get the same result, so please try this change out by using the following in your Gemfile:

gem 'newrelic_rpm', git: 'https://github.com/newrelic/newrelic-ruby-agent.git', tag: 'thread_instrumentation_memory_bugfix'

We'd appreciate any feedback on this fix and definitely want to know if anyone is still seeing increasing memory usage with this tag.
Thank you!

fcheung (Author) commented Apr 24, 2023

I've deployed this @tannalynn - will keep you posted with what we see

@kaylareopelle (Contributor)

Hi @fcheung, @bendilley, @travisbell, @cirdes! 👋

A fix for this issue is available in version 9.2.2 of newrelic_rpm. I'm going to close this issue. Please reopen or create a new issue if the problem persists.
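
If you were trying out the bugfix tag from earlier in the thread, switching your Gemfile back to the released gem picks up the fix, for example:

# Gemfile
gem 'newrelic_rpm', '>= 9.2.2'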

fcheung (Author) commented May 2, 2023

@kaylareopelle @tannalynn FYI the fix has been looking good over here for the last few days.

@kaylareopelle (Contributor)

That's great news! We appreciate you circling back to us.
