Fieldsplit: update MatNest Jacobian and support bcs on AssembledMatrix #4708
Conversation
```python
# Bump petsc matrix state of each split by assembling them.
# Ensures that if the matrix changed, the preconditioner is
# updated if necessary.
for field, splits in ctx._splits.items():
    for subctx in splits:
        subctx._jac.assemble()
        if subctx.Jp is not None:
            subctx._pjac.assemble()
```
@stefanozampini I think this should be called elsewhere, maybe in pyop2 or PETSc, but so far this fixes bug #4687. This is very similar to what we had to do for MatIS:
firedrake/firedrake/assemble.py, lines 1496 to 1498 in 0ce9ad2:

```python
if op2tensor.handle.getType() == "is":
    # Flag the entire matrix as assembled before indexing the diagonal block
    op2tensor.handle.assemble()
```
```python
if ctx._pre_jacobian_callback is not None:
    ctx._pre_jacobian_callback(X)
ctx._assemble_jac(ctx._jac)
```
Shouldn't the update of the subfield operators be done inside _assemble_jac?
MatAssembly for MATNEST calls the subblocks' assembly (https://gitlab.com/petsc/petsc/-/blob/main/src/mat/impls/nest/matnest.c?ref_type=heads#L481), so maybe you have special code in _assemble_jac that skips calling MatAssembly if the matrix is of type nest?
I think the issue is that the bug involves several MatNests at play. Suppose you split a 3x3 MatNest into 2x2 and 1x1 blocks on the diagonal. Every block of the 2x2 MatNest gets updated (I don't know where, but I think PETSc coordinates this), except that no one seems to be calling assemble() on the 2x2 MatNest itself, hence the preconditioner does not get updated.
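For concreteness, here is a minimal petsc4py sketch of the nesting described above; the sizes, values, and the `diag_aij` helper are made up for illustration and are not from the PR:

```python
from petsc4py import PETSc

def diag_aij(n, v):
    # Hypothetical helper: an n x n AIJ matrix with v on the diagonal.
    A = PETSc.Mat().createAIJ([n, n], nnz=1)
    for i in range(n):
        A[i, i] = v
    A.assemble()
    return A

A00, A11, A22 = (diag_aij(4, v) for v in (1.0, 2.0, 3.0))

# Inner 2x2 MatNest over the first two fields.
inner = PETSc.Mat().createNest([[A00, None], [None, A11]])
# Outer MatNest mirroring a 3x3 system split into fields (0, 1) and (2,).
outer = PETSc.Mat().createNest([[inner, None], [None, A22]])

# Updating a leaf block bumps that block's own object state...
A00.zeroEntries()
A00.assemble()
# ...but nothing here calls inner.assemble(), so a preconditioner keyed
# on inner's object state would not notice that its contents changed.
```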
And to clarify, our implementation of fieldsplit always creates new matrices for the sub-KSPs. This seems a bit wasteful for MatNest, but it does seem to be more performant for a monolithic MatAIJ.
If you call M.assemble() on the outer MATNEST, the subblocks will be assembled. If there is a bug in PETSc, it should be in fieldsplit, not in matnest.
Can you try with a plain petsc4py script that reproduces your solver? It should be relatively simple to implement; you don't need all the FEM machinery. If it works, then the problem is in firedrake; otherwise it is in fieldsplit.
Sure, are there petsc4py examples on how to set up nonlinear problems?
Ah! I think I have found the problem. Can you try increasing the object state of the B matrix in the MAT_REUSE_MATRIX case at https://gitlab.com/petsc/petsc/-/blob/main/src/mat/impls/nest/matnest.c?ref_type=heads#L706?
Just add PetscCall(PetscObjectStateIncrease((PetscObject)*B)) after the PetscCheck(sub == *B, ...).
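A hedged petsc4py sketch of what the suggested fix should guarantee, reusing `inner`/`outer` from the sketch above; the index set is illustrative, and Object.stateGet is assumed to be available in your petsc4py build:

```python
# IS selecting the rows/cols of the first nested block (two 4x4 fields).
iset = PETSc.IS().createStride(8, first=0, step=1)

sub = outer.createSubMatrix(iset, iset)        # MAT_INITIAL_MATRIX path
state0 = sub.stateGet()
outer.createSubMatrix(iset, iset, submat=sub)  # MAT_REUSE_MATRIX path
# With PetscObjectStateIncrease added in matnest.c, the reused matrix's
# state grows, so PCFieldSplit knows to refresh its sub-preconditioner.
assert sub.stateGet() > state0
```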
> Sure, are there petsc4py examples on how to set up nonlinear problems?

https://gitlab.com/petsc/petsc/-/blob/main/src/binding/petsc4py/test/test_snes.py
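For reference, a bare-bones petsc4py SNES setup in the spirit of those tests; the toy 1x1 problem x^2 = 2 below is my own illustration, not taken from the linked file:

```python
from petsc4py import PETSc

def form_function(snes, x, f):
    # Residual F(x) = x^2 - 2.
    f[0] = x[0] * x[0] - 2.0
    f.assemble()

def form_jacobian(snes, x, J, P):
    # Jacobian dF/dx = 2x (assembled into the preconditioning matrix P).
    P[0, 0] = 2.0 * x[0]
    P.assemble()
    if J.handle != P.handle:
        J.assemble()

snes = PETSc.SNES().create()
x = PETSc.Vec().createSeq(1)
f = x.duplicate()
J = PETSc.Mat().createAIJ([1, 1], nnz=1)

snes.setFunction(form_function, f)
snes.setJacobian(form_jacobian, J)
snes.setFromOptions()

x.set(1.0)
snes.solve(None, x)  # converges to sqrt(2)
```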
Your one-line suggestion seems to fix the issue, and we no longer need this block. Thanks!
Great. I will open a PR in PETSc to fix the problem when I have some time, hopefully this week.
Description

Trigger a preconditioner update for a MatNest with more than one field on a block of the fieldsplit.

Fixes #4687

Also fixes fieldsplit for AssembledMatrix with bcs.

Cleanup create_subdm in dmhooks.py