simulate! and thus parametricbootstrap for GLMM #418
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #418      +/-   ##
==========================================
- Coverage   94.31%   94.22%   -0.09%
==========================================
  Files          23       23
  Lines        1637     1681      +44
==========================================
+ Hits         1544     1584      +40
- Misses         93       97       +4
Continue to review full report at Codecov.
I need to fix the Gaussian with non-identity link pathway.
I like the idea of using the StableRNGs package in the tests.

I think it will be necessary to consider the scaling in building η in the simulate! method. I put a comment there. I'm pretty sure that using sdest applied to the penalized weighted least squares structure for the working response is not correct.
src/simulate.jl (Outdated)

    end

    # scale by lm.σ and add fixed-effects contribution
    BLAS.gemv!('N', one(T), lm.X, β, lm.σ, η)
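For reference on what the reviewed line computes: `gemv!(tA, α, A, x, β, y)` overwrites `y` with `α*A*x + β*y`, so this call sets η ← X*β + σ*η. A standalone sketch with toy values (not MixedModels code):

```julia
using LinearAlgebra

# gemv!('N', α, A, x, β, y) overwrites y with α*A*x + β*y.
# With α = 1 and β = σ, this is η ← X*β_fixed + σ*η:
# the fixed-effects contribution plus the existing η scaled by σ.
X = [1.0 0.0; 0.0 2.0]
β = [3.0, 4.0]
σ = 0.5
η = [10.0, 10.0]
BLAS.gemv!('N', 1.0, X, β, σ, η)
# η is now X*β + σ*η₀ = [3.0 + 5.0, 8.0 + 5.0] = [8.0, 13.0]
```

This makes the reviewers' concern concrete: whatever multiple is passed as the β argument scales the simulated random component in η, so it must be the right scale parameter.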
Are you sure that lm.σ is the correct multiple here? It just calls sdest(lm), which calls varest(m), which is based on pwrss(m). And that value is not necessarily related to any scale parameter.
Yep, my deep dive into the GLMM code has highlighted this to me. It 'works' here because σ is usually close to 1 for families without a dispersion parameter, and I think unit scaling is correct for those families.
The failure on nightly looks unrelated:

    MethodError: propertynames(::LinearMixedModel{Float64}, ::Bool) is ambiguous. Candidates:
      propertynames(m::LinearMixedModel, private) in MixedModels at /home/runner/work/MixedModels.jl/MixedModels.jl/src/linearmixedmodel.jl:564
      propertynames(x, private::Bool) in Base at reflection.jl:1530
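This kind of ambiguity arises when a package method leaves an argument untyped (here `private`) while Base has a more specific signature (`private::Bool`). A minimal standalone sketch of the usual fix, using a hypothetical `Demo` type rather than the actual MixedModels code:

```julia
struct Demo
    x::Int
end

# An untyped second argument, e.g.
#     Base.propertynames(d::Demo, private) = (:x,)
# would be ambiguous with Base.propertynames(x, private::Bool) when called
# as propertynames(d, true). Annotating the argument removes the ambiguity:
Base.propertynames(d::Demo, private::Bool=false) = (:x,)

propertynames(Demo(1), true)  # (:x,)
```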
@dmbates Care to take a second look?
src/bootstrap.jl (Outdated)

    @@ -290,7 +304,7 @@ function tidyσs(bsamp::MixedModelBootstrap{T}) where {T}
            )
            for (iter, r) in enumerate(bstr)
                setθ!(bsamp, iter) # install r.θ in λ
    -           σ = r.σ
    +           σ = r.σ === missing ? 1 : r.σ
This could be written as σ = coalesce(r.σ, one(T)) to keep type consistency and be more in the spirit of the missing design.
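As a quick illustration of why coalesce is preferable here (a standalone sketch, not code from the PR): the ternary falls back to the Int literal 1 when the value is missing, changing the element type, whereas coalesce returns its first non-missing argument, so the fallback one(T) keeps things type-stable:

```julia
T = Float64
σ_missing = missing
σ_present = 2.5

# Ternary: yields the Int 1 for the missing case, so the type changes.
a = σ_missing === missing ? 1 : σ_missing   # 1 (an Int)

# coalesce: returns its first non-missing argument, so one(T) keeps Float64.
b = coalesce(σ_missing, one(T))             # 1.0
c = coalesce(σ_present, one(T))             # 2.5
```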
Nice, thanks for the tip!
Just one minor suggestion: use coalesce instead of x === missing ? ...
Closes #245.

I had to add a few accessor-to-preallocated-buffer methods for GLMM elsewhere (fixef!, etc.). stderror! actually does a hidden allocation, but I'll fix that as soon as I've looked up StatsBase.stderror. There's also still some debugging code in there, but it should otherwise be in a good state to check for conceptual errors.