> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay <[email protected]> wrote:
>
> Are there any plans to get the missing hook into PETSc for AIR? Just curious if there’s an issue I can subscribe to or anything.
Not that I know of, but it would make for a nice contribution if you feel like creating a PR.

Thanks,
Pierre

> (Independently, I’m excited to test HPDDM out tomorrow)
>
>> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet <[email protected]> wrote:
>>
>>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay <[email protected]> wrote:
>>>
>>> Pierre,
>>>
>>> This is very helpful information. Thank you. Yes, I would appreciate those command line options if you’re willing to share!
>>
>> No problem, I’ll get in touch with you in private first, because it may require some extra work (a couple of extra options in the PETSc ./configure), and this is not very related to the problem at hand, so best not to spam the mailing list.
>>
>> Thanks,
>> Pierre
>>
>>>> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet <[email protected]> wrote:
>>>>
>>>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay <[email protected]> wrote:
>>>>>
>>>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. My options table
>>>>>
>>>>> -dm_moose_fieldsplit_names u,p
>>>>> -dm_moose_nfieldsplits 2
>>>>> -fieldsplit_p_dm_moose_vars pressure
>>>>> -fieldsplit_p_ksp_type preonly
>>>>> -fieldsplit_p_pc_type jacobi
>>>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>>>>> -fieldsplit_u_ksp_type preonly
>>>>> -fieldsplit_u_pc_hypre_type boomeramg
>>>>> -fieldsplit_u_pc_type hypre
>>>>> -pc_fieldsplit_schur_fact_type full
>>>>> -pc_fieldsplit_schur_precondition selfp
>>>>> -pc_fieldsplit_type schur
>>>>> -pc_type fieldsplit
>>>>>
>>>>> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly, this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection-dominated problems.
>>>>> I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence:
>>>>>
>>>>>  0 Nonlinear |R| = 1.033077e+03
>>>>>      0 Linear |R| = 1.033077e+03
>>>>> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>>>>>
>>>>> Does anyone have an idea as to why this might be happening?
>>>>
>>>> Do not use this option, even when not part of PCFIELDSPLIT.
>>>> There is some missing plumbing in PETSc which makes it unusable; see Ben’s comment here:
>>>> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417
>>>> In fact, it’s quite easy to make HYPRE generate NaN with a very simple stabilized convection-diffusion problem near the pure-convection limit (something that ℓAIR is supposed to handle).
>>>> Even worse, you can make HYPRE fill your terminal with printf-style debugging messages
>>>> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>>>> with this option turned on.
>>>> As a result, I have been unable to reproduce any of the ℓAIR results.
>>>> This also explains why I have been using plain BoomerAMG instead of ℓAIR for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you would like to try the PC we are using, I could send you the command line options).
>>>>
>>>> Thanks,
>>>> Pierre
>>>>
>>>>> If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection-dominated problem.
>>>>>
>>>>> Alex
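For readers collecting the settings from this thread: the field-split configuration and the problematic restriction option quoted above can be kept together in a single PETSc options file and passed with `-options_file`. This is only a sketch assembled from the thread; the file name and the comments are illustrative, and the `-dm_moose_*` options are MOOSE-specific and only apply when running through MOOSE.

```
# ns_schur.opts -- hypothetical options file collecting the thread's settings
# (pass with: -options_file ns_schur.opts)
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_fact_type full
-pc_fieldsplit_schur_precondition selfp
# MOOSE-specific split definitions (ignored outside MOOSE):
-dm_moose_nfieldsplits 2
-dm_moose_fieldsplit_names u,p
-fieldsplit_u_dm_moose_vars vel_x,vel_y
-fieldsplit_p_dm_moose_vars pressure
# Velocity block: one BoomerAMG application per outer iteration
-fieldsplit_u_ksp_type preonly
-fieldsplit_u_pc_type hypre
-fieldsplit_u_pc_hypre_type boomeramg
# Pressure (Schur) block: Jacobi on the selfp approximation
-fieldsplit_p_ksp_type preonly
-fieldsplit_p_pc_type jacobi
# The ℓAIR restriction option discussed above; per this thread it is
# currently unusable through PETSc and can produce DIVERGED_NANORINF,
# so it is left commented out:
# -fieldsplit_u_pc_hypre_boomeramg_restriction_type 1
```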
