Adapt signal_fixpeaks to deal with larger gaps in data #650
Fixing peaks is a tricky issue, but there's definitely room for improvement! It would also be nice to add an example / study explaining and benchmarking these methods.
Indeed! I've opened the PR for now with the 3 changes I mentioned above. It's a work in progress, but I thought I should get your input before overcomplicating things: #647 The new version is able to fix ECG peaks for data where the old version got stuck in a loop. I still see a couple of issues, though:
Hi @DominiqueMakowski & @JanCBrammer, I've noticed a couple of problematic parts of the code from the previous PR:

```python
>>> import neurokit2 as nk
>>> peaks = [250, 1250, 1350, 2250, 3250]
>>> interval = nk.signal_period(peaks, sampling_rate=1000, desired_length=None)
>>> interval
array([0.75, 1.  , 0.1 , 0.9 , 1.  ])
>>> nk.standardize(interval)
array([ 0.        ,  0.66226618, -1.72189206,  0.39735971,  0.66226618])
```
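To illustrate why this matters: the spurious 0.1 s interval only reaches |z| ≈ 1.7 under plain standardization, so a typical |z| > 2 outlier threshold would miss it, and a single very long gap interval inflates the mean and standard deviation even further. A robust scaling based on the median and MAD (purely an illustrative sketch, not NeuroKit's implementation) separates the outlier much more clearly:

```python
import numpy as np

# Intervals from the example above (seconds between successive peaks)
intervals = np.array([0.75, 1.0, 0.1, 0.9, 1.0])

# Plain z-scores: the spurious 0.1 s interval only reaches z ~= -1.72,
# below a typical |z| > 2 outlier threshold
z = (intervals - intervals.mean()) / intervals.std(ddof=1)

# Robust alternative (illustrative sketch): deviations from the median,
# scaled by the MAD, are far less affected by one extreme interval
median = np.median(intervals)
mad = np.median(np.abs(intervals - median))
robust_z = 0.6745 * (intervals - median) / mad

print(np.round(z, 2))         # values approximately [0, 0.66, -1.72, 0.4, 0.66]
print(np.round(robust_z, 2))  # the 0.1 s interval now stands out at about -5.4
```

With the robust score the faulty interval exceeds any reasonable threshold, while the regular ~1 s intervals stay close to zero.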
This is probably not ideal. Also, note the difference between truncating with `int()` and rounding with `round()`:

```python
>>> int(1.99)
1
>>> round(1.99)
2
```

Should I open a new PR? Sorry I didn't notice these earlier 🙈
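The practical consequence (a hypothetical example, not taken from the PR itself): when an interpolated peak position is cast back to an integer sample index, `int()` always truncates toward zero and can bias a corrected peak by up to one full sample, while `round()` keeps the error within half a sample:

```python
sampling_rate = 1000  # Hz, assumed for illustration
corrected_position = 1999.9  # hypothetical interpolated peak location, in samples

truncated = int(corrected_position)  # truncates downward
rounded = round(corrected_position)  # nearest sample

# At 1000 Hz, a one-sample difference amounts to 1 ms per corrected peak
error_ms = (rounded - truncated) / sampling_rate * 1000
print(truncated, rounded, error_ms)  # 1999 2000 1.0
```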
No worries, better late than never :) Yes, please do open a PR!
Describe the solution you'd like
I noticed that the "neurokit" method of signal_fixpeaks does not work well with larger gaps in the data, sometimes leading to negative indices or getting stuck in a loop: https://github.com/danibene/fixpeaks_with_large_gaps/blob/main/example_signal_with_large_gaps.ipynb
How could we do it?
I think there are a few possible ways to go about dealing with larger gaps (not mutually exclusive):
If you're interested in any of the above, I'm happy to try my hand & open a PR!
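For instance, one purely illustrative pre-processing step could split the peak sequence at large gaps, so each contiguous segment is corrected independently and the correction never interpolates across missing data. A rough sketch with hypothetical names and thresholds (not part of NeuroKit's API):

```python
import numpy as np

def split_peaks_at_gaps(peaks, sampling_rate=1000, gap_factor=3.0):
    """Hypothetical helper: split a peak sequence wherever the inter-peak
    interval exceeds gap_factor times the median interval."""
    peaks = np.asarray(peaks)
    intervals = np.diff(peaks) / sampling_rate  # seconds
    threshold = gap_factor * np.median(intervals)
    gap_indices = np.where(intervals > threshold)[0]
    return np.split(peaks, gap_indices + 1)

# Two regular ~1 Hz segments separated by a ~7 s gap
peaks = [250, 1250, 2250, 3250, 10250, 11250, 12250]
segments = split_peaks_at_gaps(peaks)
print(segments)  # -> two arrays: [250..3250] and [10250..12250]
```

Each segment could then be passed to the existing correction routine separately, sidestepping the negative indices and infinite loops triggered by the gap itself.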