Matrix reduction to improve the numerical stability #41
Comments
Hi, I just found that setting the diagonal values to 1 achieves the same condition number. For the PETSc solver, I zeroed the Dirichlet rows and columns and set the corresponding diagonal entries to 1 (a sketch of this is shown below).
With that change, the problem can be solved by "umfpack_solver" (the SciPy direct solver). The PETSc solver still failed (maybe I used it wrongly; I tried the default settings and tfqmr with jacobi), and I am not sure how to make the change for the jax_solver...
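For reference, a minimal petsc4py sketch of that modification, assuming the assembled matrix `A`, right-hand side `b`, Dirichlet row indices `bc_rows`, and prescribed values `bc_vals` are already at hand (these names are illustrative, not JAX-FEM's):

```python
# Illustrative sketch only; A, b, bc_rows, bc_vals are assumed names.
import numpy as np
from petsc4py import PETSc

def apply_dirichlet_symmetric(A: PETSc.Mat, b: PETSc.Vec, bc_rows, bc_vals):
    rows = np.asarray(bc_rows, dtype=PETSc.IntType)
    x = A.createVecRight()                  # vector holding the prescribed values
    x.setValues(rows, np.asarray(bc_vals, dtype=PETSc.ScalarType))
    x.assemble()
    # zeroRowsColumns zeros the selected rows and columns, puts `diag` on the
    # diagonal, and (given x and b) also adjusts b for the eliminated columns.
    A.zeroRowsColumns(rows, diag=1.0, x=x, b=b)
    return A, b
```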
By "matrix reduction", you mean if your original system is NxN, and the number of Dirichlet B.C. related variables is M, then in JAX-FEM we still solve NxN system, but with "matrix reduction" we solve (N-M)x(N-M), right? We are not sure how much gain we will have using "matrix reduction", but it looks like this will generally improve the solver performance. We will try to implement this feature. Can you find a feature in PETSc that can conveniently reduce a NxN system into (N-M)x(N-M) given the M indices? |
Ah yes, that was what I originally thought. I looked through petsc4py's docs and didn't find anything handy; I think it would be inefficient to actually remove the columns and rows. In the second post, I found that setting the diagonal to 1 might achieve the same performance. Sorry I didn't explain in full detail: the original implementation only zeros the corresponding rows of the matrix, whereas what I did is to zero both the rows and the columns and set the diagonal values to 1. I recorded the condition number of A for each approach, and the conclusion is that zeroing the rows and columns and setting the diagonal to 1 can achieve roughly the same conditioning as matrix reduction.
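As a purely illustrative check of that kind of comparison (a random SPD matrix with made-up Dirichlet indices, not the poster's data or numbers), the condition numbers of the three treatments can be inspected like this:

```python
# Purely illustrative: compare cond(A) for the three BC treatments discussed
# above on a small synthetic SPD system.
import numpy as np

rng = np.random.default_rng(0)
N = 50
G = rng.standard_normal((N, N))
A = G @ G.T + N * np.eye(N)              # synthetic SPD "stiffness" matrix
bc = np.arange(5)                        # pretend the first 5 DOFs are Dirichlet
free = np.setdiff1d(np.arange(N), bc)

A_rows = A.copy()                        # zero Dirichlet rows only, unit diagonal
A_rows[bc, :] = 0.0
A_rows[bc, bc] = 1.0

A_sym = A.copy()                         # zero rows AND columns, unit diagonal
A_sym[bc, :] = 0.0
A_sym[:, bc] = 0.0
A_sym[bc, bc] = 1.0

A_red = A[np.ix_(free, free)]            # true (N-M) x (N-M) reduction

for name, mat in [("rows only", A_rows), ("rows+cols, diag=1", A_sym), ("reduced", A_red)]:
    print(f"{name:>18}: cond = {np.linalg.cond(mat):.3e}")
```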
Thanks for the report here. Yeah, it does seem that zeroing both the columns and the rows, instead of doing an actual reduction, is easier to implement, though we need to figure out what to add to the right-hand-side term.
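For what it's worth, the usual fix-up is to lift the known values onto the right-hand side before zeroing the columns. A SciPy-based sketch (function and argument names are illustrative, not JAX-FEM's API):

```python
# Sketch of the RHS adjustment that symmetric elimination needs.
# A: sparse system matrix, b: RHS, bc: Dirichlet DOF indices, u_bc: prescribed values.
import numpy as np
import scipy.sparse as sp

def eliminate_dirichlet(A, b, bc, u_bc):
    A = sp.csr_matrix(A)
    b = np.asarray(b, dtype=float).copy()
    u_bc = np.asarray(u_bc, dtype=float)
    # move the known contribution A[:, bc] @ u_bc to the right-hand side
    b -= A[:, bc] @ u_bc
    # zero the Dirichlet rows and columns, then put 1 on their diagonal
    keep = np.ones(A.shape[0])
    keep[bc] = 0.0
    P = sp.diags(keep)                       # zeros out Dirichlet rows/columns
    A = P @ A @ P + sp.diags(1.0 - keep)     # identity block on the Dirichlet DOFs
    # prescribe the values themselves on the Dirichlet rows
    b[bc] = u_bc
    return A.tocsr(), b
```

Solving the modified system then returns a vector whose Dirichlet entries equal u_bc and whose free entries satisfy the same equations as the fully reduced (N-M)x(N-M) system.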
Hi, I am having some numerical stability issues when solving with some less-than-perfect geometries. I tried the same example in Sfepy and found that the problem can actually be solved there without convergence issues. The problem occurs when applying the Dirichlet BCs: JAX-FEM just zeros the corresponding rows and columns in dres/du (i.e. A), while Sfepy does a matrix reduction.
I wonder if there is a particular reason not to implement matrix reduction? I plan to give it a try, since I do need to improve the numerical stability; it is affecting my performance heavily.