
Conversation


@pbrubeck commented Nov 4, 2025

Description

Trigger the preconditioner update for a MatNest when a fieldsplit block contains more than one field.
Fixes #4687

Also fixes fieldsplit for an AssembledMatrix with bcs.

Clean up create_subdm in dmhooks.py.
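
A minimal sketch of the kind of configuration this targets, assuming a three-field problem solved with a nested matrix and a fieldsplit that groups two fields into one block (the option values below are illustrative, not taken from this PR):

parameters = {
    "mat_type": "nest",
    "snes_type": "newtonls",
    "ksp_type": "gmres",
    "pc_type": "fieldsplit",
    "pc_fieldsplit_0_fields": "0,1",  # first split groups fields 0 and 1 (a 2x2 sub-nest)
    "pc_fieldsplit_1_fields": "2",    # second split is the remaining field
    "fieldsplit_0_pc_type": "lu",
    "fieldsplit_1_pc_type": "lu",
}
# solve(F == 0, u, bcs=bcs, solver_parameters=parameters)
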

@pbrubeck force-pushed the pbrubeck/fix/nest-fieldsplit branch from 7c9dfcc to aaabace on November 4, 2025 at 17:24
@pbrubeck changed the title from "Fieldsplit: update MatNest Jacobian" to "Fieldsplit: update MatNest Jacobian and support bcs on AssembledMatrix" on Nov 4, 2025
Comment on lines 517 to 524
# Bump petsc matrix state of each split by assembling them.
# Ensures that if the matrix changed, the preconditioner is
# updated if necessary.
for field, splits in ctx._splits.items():
    for subctx in splits:
        subctx._jac.assemble()
        if subctx.Jp is not None:
            subctx._pjac.assemble()
Contributor Author

@stefanozampini I think this should be called elsewhere, maybe in pyop2 or PETSc, but for now it fixes bug #4687. This is very similar to what we had to do for MatIS:

if op2tensor.handle.getType() == "is":
    # Flag the entire matrix as assembled before indexing the diagonal block
    op2tensor.handle.assemble()


if ctx._pre_jacobian_callback is not None:
    ctx._pre_jacobian_callback(X)
ctx._assemble_jac(ctx._jac)
Contributor

Shouldn't the update of the subfield operators be done inside _assemble_jac?

Contributor

@stefanozampini Nov 5, 2025

MatAssembly for MATNEST calls assembly on the sub-blocks (https://gitlab.com/petsc/petsc/-/blob/main/src/mat/impls/nest/matnest.c?ref_type=heads#L481), so maybe you have special code in _assemble_jac that skips calling MatAssembly if the matrix is of type nest?
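
(For reference, a minimal petsc4py sketch of that behaviour, with made-up sizes: assembling the outer MATNEST also assembles its sub-blocks.)

from petsc4py import PETSc

# Two small sequential blocks wrapped in a 2x2 MATNEST (illustrative only).
A00 = PETSc.Mat().createAIJ([2, 2], comm=PETSc.COMM_SELF)
A00.setUp()
A00.assemble()
A11 = PETSc.Mat().createAIJ([2, 2], comm=PETSc.COMM_SELF)
A11.setUp()
A11.assemble()
N = PETSc.Mat().createNest([[A00, None], [None, A11]], comm=PETSc.COMM_SELF)

A00.setValue(0, 0, 2.0)  # modify a block without assembling it
N.assemble()             # assembling the nest assembles A00 and A11 as well
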

Contributor Author

I think the issue is that there are several MatNests at play. Suppose you split a 3x3 MatNest into a 2x2 block and a 1x1 block on the diagonal. Every block of the 2x2 MatNest gets updated (I don't know where, but I think PETSc coordinates this), except that nothing seems to call assemble() on the 2x2 MatNest itself, hence the preconditioner does not get updated.
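
(A hypothetical petsc4py sketch of that structure, with invented block sizes: the inner 2x2 nest shares its blocks with the outer 3x3 nest, so the blocks themselves get updated, but nothing bumps the state of the inner nest that the preconditioner is built on.)

from petsc4py import PETSc

def diag_block():
    A = PETSc.Mat().createAIJ([2, 2], comm=PETSc.COMM_SELF)
    A.setUp()
    A.assemble()
    return A

blocks = [diag_block() for _ in range(3)]
# The outer 3x3 MatNest and the 2x2 split share the same diagonal blocks.
outer = PETSc.Mat().createNest(
    [[blocks[0], None, None],
     [None, blocks[1], None],
     [None, None, blocks[2]]], comm=PETSc.COMM_SELF)
inner = PETSc.Mat().createNest(
    [[blocks[0], None],
     [None, blocks[1]]], comm=PETSc.COMM_SELF)

blocks[0].setValue(0, 0, 2.0)  # the Jacobian update touches a shared block
blocks[0].assemble()
outer.assemble()               # the outer nest's state changes
# inner.assemble() is never called, so the PC attached to `inner` sees a stale state
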

Contributor Author

@pbrubeck Nov 5, 2025

And to clarify, our implementation of fieldsplit always creates new matrices for the sub-KSPs. This seems a bit wasteful for MatNest, but it does appear to be more performant for a monolithic MatAIJ.

Contributor

If you call M.assemble() on the outer MATNEST, the sub-blocks will be assembled. If there is a bug in PETSc, it should be in fieldsplit, not in matnest.
Can you try a plain petsc4py script that reproduces your solver? It should be relatively simple to implement; you don't need all the FEM machinery. If it works, then the problem is in Firedrake; otherwise it is in fieldsplit.

Contributor Author

Sure, are there petsc4py examples of how to set up nonlinear problems?

Contributor

Ah! I think I have found the problem. Can you try increasing the object state of the B matrix in the MAT_REUSE_MATRIX case at https://gitlab.com/petsc/petsc/-/blob/main/src/mat/impls/nest/matnest.c?ref_type=heads#L706?
Just add PetscCall(PetscObjectStateIncrease((PetscObject)*B)) after the PetscCheck(sub == *B,...

Contributor

Sure, are there petsc4py examples of how to set up nonlinear problems?

https://gitlab.com/petsc/petsc/-/blob/main/src/binding/petsc4py/test/test_snes.py
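
(For illustration, a hedged skeleton of what such a script might look like; the toy residual, sizes, and split definitions below are invented and do not reproduce #4687.)

from petsc4py import PETSc

n = 2  # rows per field, made up
A00 = PETSc.Mat().createAIJ([n, n], comm=PETSc.COMM_SELF)
A00.setUp()
A11 = PETSc.Mat().createAIJ([n, n], comm=PETSc.COMM_SELF)
A11.setUp()
J = PETSc.Mat().createNest([[A00, None], [None, A11]], comm=PETSc.COMM_SELF)

def residual(snes, x, f):
    # toy nonlinear residual f_i(x) = x_i**3 - 1
    f.array[:] = x.array_r**3 - 1.0

def jacobian(snes, x, Jmat, Pmat):
    xa = x.array_r
    for i in range(n):
        A00.setValue(i, i, 3.0 * xa[i]**2)
        A11.setValue(i, i, 3.0 * xa[n + i]**2)
    Jmat.assemble()  # assembling the nest assembles the sub-blocks

x = PETSc.Vec().createSeq(2 * n)
f = x.duplicate()
snes = PETSc.SNES().create(comm=PETSc.COMM_SELF)
snes.setFunction(residual, f)
snes.setJacobian(jacobian, J)
pc = snes.getKSP().getPC()
pc.setType("fieldsplit")
is0 = PETSc.IS().createStride(n, first=0, step=1, comm=PETSc.COMM_SELF)
is1 = PETSc.IS().createStride(n, first=n, step=1, comm=PETSc.COMM_SELF)
pc.setFieldSplitIS(("0", is0), ("1", is1))
snes.setFromOptions()
x.set(2.0)
snes.solve(None, x)
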

Contributor Author

Your one-line suggestion seems to fix the issue, and we no longer need this block. Thanks!

Contributor

Great. I will open a PR in PETSc to fix the problem when I have some time, hopefully this week.

Comment on lines +526 to +534
# Bump petsc matrix state of each split by assembling them.
# Ensures that if the matrix changed, the preconditioner is
# updated if necessary.
for fields, splits in ctx._splits.items():
    for subctx in splits:
        subctx._jac.assemble()
        if subctx.Jp is not None:
            subctx._pjac.assemble()

Contributor Author

Suggested change: remove this block (lines +526 to +534); the upstream PETSc fix makes the workaround unnecessary.
