
default stepsizes primal-dual solver #41

Merged · 22 commits · Jul 29, 2020
Conversation

@pylat (Member) commented Jul 28, 2020

It seems I have somehow closed the previous pull request. I couldn't find a way to reopen it, so I am recreating a pull request with the issues of the previous one fixed (hopefully).

Edit: replaces #40

@lostella (Member) commented Jul 29, 2020

> It seems I have somehow closed the previous pull request. I couldn't find a way to reopen it

I'm not sure why that is the case, but one general comment about the PR (which may or may not be related to the re-opening issue):

It looks like you're opening the PR from the master branch of your fork. This is not ideal in general, since you want your local master branch to be in sync with the main repository's master branch; also, the PR contains 22 commits right at the start, which looks weird.

The workflow you may want to use starts with the following setup:

  • clone your fork ("origin") locally
  • add a remote (call it e.g. "upstream") for kul-forbes/ProximalAlgorithms.jl
  • set your local master branch to track upstream/master

This way you can pull master freely, and you will always get the hottest stuff from the main repository.
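Concretely, that setup might look like this (a sketch: the fork URL is a placeholder, and the remote name "upstream" is just a convention):

```sh
# clone your fork ("origin") locally and enter it
git clone https://github.com/<your-username>/ProximalAlgorithms.jl.git
cd ProximalAlgorithms.jl

# add the main repository as a second remote named "upstream"
git remote add upstream https://github.com/kul-forbes/ProximalAlgorithms.jl.git
git fetch upstream

# set the local master branch to track upstream/master
git branch --set-upstream-to=upstream/master master
```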

When you need to work on something:

  1. pull master
  2. branch out of master
  3. commit your changes
  4. push the branch to origin (i.e. your fork)
  5. open a PR for your branch

You'll see that the PR then only contains your commits from step 3 (ideally a single commit at the beginning of the PR).
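In git terms, again as a sketch with a placeholder branch name:

```sh
git checkout master && git pull        # 1. pull master (from upstream, per the setup above)
git checkout -b my-feature             # 2. branch out of master
# ... edit files, then:
git commit -am "describe the change"   # 3. commit your changes
git push -u origin my-feature          # 4. push the branch to origin (your fork)
# 5. open a PR for the "my-feature" branch on GitHub
```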

@lostella (Member) left a comment

Looks good to me! Never mind the indentation issues, I'll do a pass over the whole code base with JuliaFormatter.

If you're fine with the change, we can merge it.
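(For reference, such a pass boils down to something like the following; format(".") is JuliaFormatter's entry point for formatting a directory in place:)

```julia
using JuliaFormatter

# Format every .jl file under the current directory, in place.
format(".")
```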

src/algorithms/primaldual.jl (resolved)
```julia
    end
elseif theta == 0 && mu == 0.5 # PPDCA
    if beta_l == 0 || beta_f == 0
        if nmL > par * max(beta_l, beta_f)
```
Member:
Still off by one?

@lostella (Member) commented:

@pylat Regarding testing the linearity of functions (or maybe affinity?), we could have a dedicated predicate in ProximalOperators that takes all calculus rules into account.

The only issue is that is_affine is already used to test whether a function is the indicator of an affine set. Maybe we should rename that to is_affine_indicator or something?
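A rough sketch of what such a predicate could look like (purely hypothetical, not the actual ProximalOperators API; the name is_affine_function and the field access are illustrative):

```julia
using ProximalOperators  # for Linear and SeparableSum

# Hypothetical sketch: a predicate for "f is an affine function", kept
# distinct from is_affine, which tests whether f is the indicator of an
# affine set.
is_affine_function(f) = false          # conservative default
is_affine_function(f::Linear) = true   # f(x) = <c, x> is linear, hence affine
# One calculus rule: a separable sum is affine iff every term is.
is_affine_function(f::SeparableSum) = all(is_affine_function, f.fs)  # field name assumed
```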

```julia
)

if isa(h, Zero) # case h(Lx) equiv 0
    L = R(0)*I
    y0 = zero(x0)
```
Member:

> Indeed it won't affect the primal variable. When the relaxation parameter lambda is not equal to 1, we end up with the update u^+ = (1 - lambda) u. That is why I set it to zero; otherwise we can just let it scale the duals throughout the iterations.

So this whole check does not really fix a bug, does it @pylat? Just trying to understand. It really just affects the dual variables, which one doesn't care much about whenever h == 0.

Member Author:

Without L = R(0)*I it was a bug: it was using the default L = identity. With the default L we had barx = prox_{\gamma g}(x - \gamma u - \gamma \nabla f(x)), where the dual variable u was still converging to zero. So it is incorrect without L = 0, but it just happened to converge often.

Member:

Got it! It's just that h(1·x) = h(0·x) for all x whenever h == 0, so the two problems are equivalent in that case (but apparently the iterations are not).
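To make the exchange above concrete, here is a toy illustration (made-up values, not code from the package): when h == 0 the dual update reduces to u^+ = (1 - lambda) u, so the dual decays to zero without ever being exactly zero, and with the default L = I the primal step keeps seeing it.

```julia
# Toy sketch with made-up values. With h == 0 the dual iterate is only
# rescaled, u⁺ = (1 - λ) * u: it converges to zero but is nonzero at every
# finite iteration. With the default L = I, the primal step
#     barx = prox_{γg}(x - γ*u - γ*∇f(x))
# still contains this not-yet-zero u, which is why the zero operator
# L = R(0)*I is needed for the iterations to be exact.
λ = 0.5
u = 1.0
for k in 1:5
    global u = (1 - λ) * u           # dual update when h == 0
    println("iteration $k: u = $u")  # geometric decay, never exactly zero
end
```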

@pylat (Member Author) commented Jul 29, 2020

Thanks Lorenzo, I will keep this in mind. The PR was probably also closed automatically when I pushed to master.

@lostella merged commit 698b4ea into JuliaFirstOrder:master on Jul 29, 2020