
bitarray test slow #23

Closed
jakirkham opened this issue Sep 15, 2017 · 12 comments

Comments

@jakirkham
Member

The bitarray test frequently takes too long to complete on the CIs. A sample build log is linked below. Is there something we can do to fix this, or should we just skip it?

ref: https://circleci.com/gh/conda-forge/julia-feedstock/62

@tkelman
Member

tkelman commented Sep 15, 2017

Shouldn't take much more than a minute on Julia 0.5 - https://travis-ci.org/JuliaLang/julia/jobs/275261992#L1823

Something about the way things are built here may be resulting in a serious performance regression.

@jakirkham
Member Author

Are there any important options that we should be providing? Do we need to explicitly add -O2 or -O3 for instance?

@tkelman
Member

tkelman commented Sep 15, 2017

Does the conda build environment set variables like CFLAGS, CXXFLAGS to any default values? Those might be overriding the settings in the Julia build system.
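To illustrate the failure mode being discussed: if the build environment exports CFLAGS without any -O level and the build system respects the environment value verbatim, the compile runs unoptimized. A minimal sketch (the variable values and the append-in-build.sh approach are hypothetical, not what the feedstock actually does):

```shell
# What a conda build environment might set (no optimization level at all):
export CFLAGS="-m64"

# Hypothetical fix in a recipe's build.sh: append an optimization level
# rather than replacing the variable wholesale, so -m64 is preserved.
export CFLAGS="${CFLAGS} -O3"

echo "CFLAGS=${CFLAGS}"   # prints: CFLAGS=-m64 -O3
```

Appending keeps whatever ABI flags the environment needs while restoring the optimization level the upstream makefiles would otherwise have chosen.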

@jakirkham
Member Author

jakirkham commented Sep 15, 2017

Yes we set these. What should we be looking to add?

@tkelman
Member

tkelman commented Sep 15, 2017

What are they set to? Julia will set some defaults (I forget whether it's -O2 or -O3; I'd have to check the makefiles), but if they are set to something different in the environment, it will use those values instead.

@isuruf
Member

isuruf commented Sep 15, 2017

Just -m64

@jakirkham
Member Author

jakirkham commented Sep 15, 2017

More details are here.

Yes, we are not adding any optimization flags by default; that gets too hairy. It's best dealt with on a per-package basis.

@jakirkham
Member Author

Any more details on what sort of flags should be added?

@jakirkham
Member Author

TBH, if I run this test locally with our julia package, it takes less time than the Travis CI build linked above (results below). So I don't think it is a build-option issue. However, the fact that this test takes wildly different amounts of time to run is disconcerting, to say the least.

$ julia -e 'Base.runtests(["bitarray"]);'
Test (Worker) | Time (s) | GC (s) | GC % | Alloc (MB) | RSS (MB)
bitarray (1)  |   67.55  |  1.26  |  1.9 | 2821.37    | 343.90  

Test Summary: |   Pass   Total
  Overall     | 890969  890969
    SUCCESS

Though I would not be surprised if the test is highly demanding and is causing the CI worker to freeze. Is there some way we can make this test better behaved? If not, I'm leaning towards skipping it as this is slowing down development of the recipe.
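If skipping is the route taken, one hypothetical shape for the recipe's test command is to run a cheaper subset of the suite instead of bitarray. The test names and the exact Base.runtests invocation below are illustrative and may differ between Julia versions:

```shell
# Hypothetical feedstock test command: exercise a small subset of the Julia
# test suite rather than the expensive bitarray test. "core" and "numbers"
# are examples; consult test/choosetests.jl for the names your Julia accepts.
julia -e 'Base.runtests(["core", "numbers"]);'
```

This keeps some smoke-test coverage in the recipe while avoiding the test that freezes the CI workers.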

@tkelman
Member

tkelman commented Sep 21, 2017

It can also depend on which tests run in the same test worker.

@tkelman
Member

tkelman commented Sep 21, 2017

> Any more details on what sort of flags should be added?

Build Julia in a totally clean environment from a clone with plain make, and use the same options that build uses. It probably uses -O2 or -O3 in most places.
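The suggestion above can be sketched as follows: strip any inherited compiler flags so the Julia makefiles fall back to their own defaults, then compare. The exported value and the make line are illustrative (the make line is commented out so the snippet runs standalone):

```shell
# Pretend this leaked in from the build environment:
export CFLAGS="-m64"

# env -u removes the variables from the child's environment, so the Julia
# makefiles would see them unset and apply their own optimization defaults.
env -u CFLAGS -u CXXFLAGS sh -c 'echo "CFLAGS=${CFLAGS:-unset}"'
# prints: CFLAGS=unset

# Illustrative clean build (path and -j count are placeholders):
# env -u CFLAGS -u CXXFLAGS make -C julia -j4
```

Comparing the flags such a build actually passes to the compiler against the feedstock's environment would show whether the environment is overriding the upstream defaults.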

@dfornika dfornika mentioned this issue Nov 8, 2017
@ngam
Contributor

ngam commented Dec 29, 2021

Closing this; reopen if more discussion is needed. bitarray is still a bit slow, but we improved things by forcing the tests to run in parallel in #157.

@ngam ngam closed this as completed Dec 29, 2021
4 participants