Crash when marking changepoints in renaissance data #100

Open
vext01 opened this issue Jul 16, 2019 · 6 comments
@vext01
Member

vext01 commented Jul 16, 2019

It could be that the data is incomplete (in terms of pexecs), but I get the following crash:

$ ~/research/warmup_stats/bin/mark_changepoints_in_json renaissance_results_outliers_w200.json.bz2
Marking changepoints and classifications.
Expecting a steady state to be reached before the last 500 iterations.
Using a fixed bound of 0.001s.
Using R version 3.3.3 and changepoint library 2.3.1
Loading: renaissance_results_outliers_w200.json.bz2
/home/vext01/research/warmup_stats/bin/mark_changepoints_in_json:227: RuntimeWarning: divide by zero encountered in log
  pen_value=15.0*numpy.log(len(p_exec)))
/home/vext01/research/warmup_stats/work/pylibs/rpy2/rinterface/__init__.py:186: RRuntimeWarning: Error in multiple.meanvar.norm(data, mul.method = method, penalty, pen.value,  :
  Minimum segment legnth is too large to include a change in this data

  warnings.warn(x, RRuntimeWarning)
Traceback (most recent call last):
  File "/home/vext01/research/warmup_stats/bin/mark_changepoints_in_json", line 298, in <module>
    main(options.json_files[0], options.delta, options.steady_state)
  File "/home/vext01/research/warmup_stats/bin/mark_changepoints_in_json", line 200, in main
    segments = get_segments(cpt, delta, steady_state, p_exec, outliers)
  File "/home/vext01/research/warmup_stats/bin/mark_changepoints_in_json", line 227, in get_segments
    pen_value=15.0*numpy.log(len(p_exec)))
  File "/home/vext01/research/warmup_stats/work/pylibs/rpy2/robjects/functions.py", line 178, in __call__
    return super(SignatureTranslatedFunction, self).__call__(*args, **kwargs)
  File "/home/vext01/research/warmup_stats/work/pylibs/rpy2/robjects/functions.py", line 106, in __call__
    res = super(Function, self).__call__(*new_args, **new_kwargs)
rpy2.rinterface.RRuntimeError: Error in multiple.meanvar.norm(data, mul.method = method, penalty, pen.value,  :
  Minimum segment legnth is too large to include a change in this data
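
For context, the RuntimeWarning at line 227 is just numpy taking the log of a zero length. A minimal sketch, assuming p_exec is an empty list (as Krun records for a crashed process execution):

import numpy

# Minimal sketch of the warning at mark_changepoints_in_json:227, assuming
# p_exec is the empty list Krun records for a crashed process execution.
p_exec = []

# numpy.log(0) emits "RuntimeWarning: divide by zero encountered in log"
# and evaluates to -inf, so the manual penalty handed on to the R
# changepoint call is -inf rather than a finite value.
pen_value = 15.0 * numpy.log(len(p_exec))
print(pen_value)  # -inf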
@ltratt
Member

ltratt commented Jul 16, 2019

I assume this means that you have N iterations and a window size of M, where N should be > M but is actually <= M. I might be very wrong.

@vext01
Member Author

vext01 commented Jul 16, 2019

Well, that's what I thought too, but I have 2000 samples and I'm using the default window size.

I'll investigate when I get a chance. Just wanted to raise the bug so it isn't forgotten.

@Gomezale

Gomezale commented Jul 23, 2019

Hi everyone,
I'm pretty new to R and am trying to solve a similar problem, but using FlowAI.
When I run flowAI in automated mode (over 142 samples), I get this error part-way through the samples:
data <- flow_auto_qc(fs_raw_cd45)

[image: screenshot of the error]

Any suggestions on how to solve it?
Thanks a lot in advance.

@vext01
Member Author

vext01 commented Jul 23, 2019

This happens when the size of your data is less than the window size.

In the case I reported above, this was due to a crashed process execution, which Krun represents as the empty list.

My suggestion is to check the size of your data.
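
If it helps, here's a rough sketch of that check for a Krun results file like the one above. It assumes the usual layout where wallclock_times maps each bench:vm:variant key to a list of per-pexec iteration lists, and a 200-iteration window taken from the "w200" in the filename; adjust both if your setup differs:

import bz2
import json

# Rough sketch: flag empty or short pexecs in a Krun results file.
# Assumes results["wallclock_times"] maps "bench:vm:variant" keys to
# lists of per-pexec iteration lists; adjust if your file differs.
FNAME = "renaissance_results_outliers_w200.json.bz2"
WINDOW_SIZE = 200  # assumed from the "w200" in the filename

with bz2.open(FNAME, "rt") as fh:
    results = json.load(fh)

for key, pexecs in results["wallclock_times"].items():
    for idx, p_exec in enumerate(pexecs):
        if not p_exec:
            print("%s: pexec %d is empty (crashed?)" % (key, idx))
        elif len(p_exec) < WINDOW_SIZE:
            print("%s: pexec %d has only %d iterations" % (key, idx, len(p_exec)))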

@Gomezale

Gomezale commented Jul 23, 2019

Thanks for your quick support.
May I kindly ask how to check the size of my data, and how to change it to avoid the error?

@vext01
Member Author

vext01 commented Jul 23, 2019

It looks like you are using the changepoint library from R(?).

This repo is a Python library that merely consumes the R library in a black-box fashion. As such, I can't really help with your query, but you may have more luck asking upstream?
