
build directly to install space #150

Merged · 4 commits · Apr 7, 2017

Conversation

@mathias-luedtke (Member) commented Mar 20, 2017

This PR cherry-picks some install-space related commits from #137.
The current master (still) builds all packages, then cleans and rebuilds to the install space (unless `NOT_TEST_INSTALL=true`).

These patches avoid the rebuild and fix the retry behavior.

@mathias-luedtke (Member Author):

Since the mockup tests are not considered (fixed in #137), I have prepared a test branch in ros_canopen: https://travis-ci.org/ipa-mdl/ros_canopen/builds/213134116

In addition, there is a cob_robots branch: https://travis-ci.org/ipa320/cob_robots/builds/213121299
This repository is the main reason for fast-forwarding the install-space patches: without them, the Travis build gets stopped because it exceeds 50 minutes.

# Test if the unit tests in the packages in the downstream repo pass.
if [ "$BUILDER" == catkin ]; then
    for pkg in $PKGS_DOWNSTREAM; do
        if [ ! -d "$CATKIN_WORKSPACE/install/share/$pkg" ]; then continue; fi # skip meta-packages
A Member commented on this line:

Is this comment correct? Isn't this line skipping those specified in $PKGS_DOWNSTREAM (which are already tested in catkin_run_tests section)?

mathias-luedtke (Member Author) replied:

No, it just skips packages without a directory in install/share. AFAIK this is only the case for meta-packages.
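The skip condition described here can be demonstrated with a small standalone sketch. The workspace path and package names below are invented for illustration; in the real script the workspace is `$CATKIN_WORKSPACE`:

```shell
#!/bin/sh
# Standalone demo of the skip condition quoted above: a package with no
# directory under install/share (typically a meta-package, which installs
# nothing there) is skipped; every other package gets tested.
WS=$(mktemp -d)                        # stand-in for $CATKIN_WORKSPACE
mkdir -p "$WS/install/share/real_pkg"  # real_pkg installed files; meta_pkg did not

run_downstream_tests() {
    for pkg in $PKGS_DOWNSTREAM; do
        if [ ! -d "$WS/install/share/$pkg" ]; then
            echo "skipping $pkg"       # skip meta-packages
            continue
        fi
        echo "testing $pkg"
    done
}

PKGS_DOWNSTREAM="real_pkg meta_pkg"
run_downstream_tests
```

Running it prints `testing real_pkg` followed by `skipping meta_pkg`, matching the behavior mathias-luedtke describes.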

@130s (Member) commented Mar 22, 2017

Seems like this works great!

I'm not sure if I understand correctly, but looking at https://travis-ci.org/ipa320/cob_robots/jobs/213121300, it seems that all tests are run in the catkin_run_tests section, while catkin_install_run_tests finished within a second (I'm not sure why it is that short. Are all the tests that are run from downstream?). I assume that with the workspace configured for install, catkin run_tests does everything we wanted?

If so, is keeping those 2 sections still worthwhile?
Maybe for other build tools that don't handle tests in install space (e.g. catkin_make) we can still keep the logic in catkin_install_run_tests.

@mathias-luedtke (Member Author) commented Mar 22, 2017

I don't know why, but the catkin_install_run_tests section runs all *.test files from target and downstream packages found in the install space.

We don't install *.test for the cob packages by default, so this block is a no-op.

> Maybe for other build tools that don't handle install space (e.g. catkin_make)

`catkin_make install` would build to the install space as well.

@mathias-luedtke mathias-luedtke mentioned this pull request Mar 22, 2017
@mathias-luedtke (Member Author):

We might consider skipping catkin_install_run_tests if we don't find any *.test files.

@130s (Member) commented Mar 24, 2017

> We might consider skipping catkin_install_run_tests if we don't find any *.test files.

I think that's the way we want to go, as long as the tests still get run in catkin_run_tests.

@mathias-luedtke (Member Author) commented Mar 27, 2017

> We might consider skipping catkin_install_run_tests if we don't find any *.test files.
>
> I think that's the way we want to go, as long as the tests still get run in catkin_run_tests.

Let's create a separate issue/PR for this and not mix it in here.
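The skip discussed here could look something like the following sketch. It assumes the install space lives at `$CATKIN_WORKSPACE/install`; the function name and paths are placeholders, not the merged code:

```shell
#!/bin/sh
# Sketch: bail out of the install-space test section early when no
# *.test files were installed at all (as is the case for the cob
# packages, where this section is currently a no-op).
INSTALL_SPACE=$(mktemp -d)   # stand-in for "$CATKIN_WORKSPACE/install"
mkdir -p "$INSTALL_SPACE/share/some_pkg"

catkin_install_run_tests() {
    # -print -quit stops find at the first match, so the check stays cheap
    if [ -z "$(find "$INSTALL_SPACE" -name '*.test' -print -quit)" ]; then
        echo "no *.test files in install space, skipping"
        return 0
    fi
    echo "running install-space tests"
}

catkin_install_run_tests    # no *.test files yet: section is skipped
touch "$INSTALL_SPACE/share/some_pkg/foo.test"
catkin_install_run_tests    # now a *.test file exists: tests would run
```

The first call prints the skip message; after `touch` creates a `.test` file, the second call takes the test branch.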

@130s (Member) left a review comment:

#54 (comment) helped me a lot to understand what this PR does.

The current behaviour is really odd:

  1. build to build/devel space
  2. run build test
  3. clean
  4. build to build/devel space and install to install space
  5. run rostest on install space

So in step 4 we are rebuilding everything from step 1, which doubles the time the build takes.

Ok, so with this PR we're skipping the build in step 4, is that correct?
What I was afraid of was skipping steps 1 and 2 (and 3, obviously), which could cause errors in repos where no install rules are defined. But it looks like I was wrong and worrying about nothing.
Looking at the example ipa320/cob_robots, it still goes through all the steps; the only change is skipping the rebuild, I think.

Merging. Thanks!

@130s 130s merged commit 7d6e77e into ros-industrial:master Apr 7, 2017
@@ -103,6 +109,8 @@ if [ "${USE_MOCKUP// }" != "" ]; then
ln -s "$TARGET_REPO_PATH/$USE_MOCKUP" $CATKIN_WORKSPACE/src
fi

catkin config --install
@130s (Member) commented May 15, 2017, quoting #150 (review):

> The current behaviour is really odd:
>
>   1. build to build/devel space
>   2. run build test
>   3. clean
>   4. build to build/devel space and install to install space
>   5. run rostest on install space
>
> So in step 4 we are rebuilding everything from step 1, which doubles the time the build takes.
>
> Ok, so with this PR we're skipping the build in step 4, is that correct?
> What I was afraid of was skipping steps 1 and 2 (and 3, obviously), which could cause errors in repos where no install rules are defined. But it looks like I was wrong and worrying about nothing.

I overlooked the addition of this line. With it in place, users have no way to avoid testing against the install space. E.g. ros-industrial-consortium/descartes#198 (comment), where the tests depend on files that are intentionally not installed.
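One way to restore that escape hatch would be to guard the `catkin config --install` call with the `NOT_TEST_INSTALL` variable the PR already mentions. This is a suggestion sketch, not the merged behaviour, and `catkin` is stubbed so the snippet runs anywhere:

```shell
#!/bin/sh
# Hypothetical guard: only configure the workspace for install space
# when install testing is wanted. NOT_TEST_INSTALL comes from this PR's
# discussion; wiring it up here is our assumption, not the merged code.
catkin() { echo "catkin $*"; }   # stub so the sketch is runnable without ROS

configure_workspace() {
    if [ "${NOT_TEST_INSTALL:-false}" = true ]; then
        echo "install space disabled"
    else
        catkin config --install
    fi
}

NOT_TEST_INSTALL=true
configure_workspace   # user opted out: install space is not configured
NOT_TEST_INSTALL=false
configure_workspace   # default path: workspace configured for install
```

With the guard in place, a repo like descartes could set `NOT_TEST_INSTALL=true` and keep its devel-space-only tests working.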
