> Must call DMShellSetGlobalVector() or DMShellSetCreateGlobalVector()
> [0]PETSC ERROR: #1 DMCreateGlobalVector_Shell() at 
> /Users/markadams/Codes/petsc/src/dm/impls/shell/dmshell.c:210

  It looks like you have built a DMSHELL. You need to teach it how to generate 
global vectors, since yours currently cannot.
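  A minimal sketch of what that could look like (the callback name and the local 
size are placeholders; DMShellSetCreateGlobalVector(), VecCreateMPI(), and 
VecSetDM() are the actual PETSc calls):

```c
#include <petscdmshell.h>

/* Hypothetical callback; nlocal is a placeholder for your real local dof count */
static PetscErrorCode MyCreateGlobalVector(DM dm, Vec *v)
{
  PetscInt nlocal = 100; /* assumption: replace with the level's local size */

  PetscFunctionBeginUser;
  PetscCall(VecCreateMPI(PetscObjectComm((PetscObject)dm), nlocal, PETSC_DECIDE, v));
  PetscCall(VecSetDM(*v, dm)); /* associate the vector with the DM */
  PetscFunctionReturn(PETSC_SUCCESS);
}

static PetscErrorCode SetupShell(DM dm)
{
  PetscFunctionBeginUser;
  /* register the callback so DMCreateGlobalVector() on the shell works */
  PetscCall(DMShellSetCreateGlobalVector(dm, MyCreateGlobalVector));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```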

  Barry


> On May 9, 2023, at 5:40 PM, Mark Adams <[email protected]> wrote:
> 
> 
> 
> On Tue, May 9, 2023 at 3:01 PM Barry Smith <[email protected] 
> <mailto:[email protected]>> wrote:
>> 
>> 
>>> On May 9, 2023, at 12:32 PM, Mark Adams <[email protected] 
>>> <mailto:[email protected]>> wrote:
>>> 
>>> I have a MG hierarchy that I construct manually with DMRefine and 
>>> DMPlexExtrude.
>>> 
>>> * The solver works great with cheby/sor but with cheby/sor it converges 
>>> slowly or I get indefinite PC errors from CG. And the eigen estimates in 
>>> cheby are really high, like 10-15. 
>> 
>>    So with Cheby/SOR it works great but with the exact same options 
>> Cheby/SOR it behaves poorly? Are you using some quantum computer at NERSC?
>  
> It turned out that I had the sign wrong on my Laplacian point function, so 
> the matrix was negative definite. I'm not sure exactly what happened, but it 
> is behaving somewhat better now.
> It looks like my prolongation operator is garbage: the coarse grid correction 
> does nothing (cg/jacobi converges in a little less than the number of MG 
> iterations times the sum of the pre- and post-smoothing steps), and the row 
> sums of P are not 1.
> Not sure what is going on there, but it is probably related to the DM 
> hierarchy not being constructed correctly....
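> (A quick way to check the row sums, as a sketch; P is the prolongation 
> matrix from the thread, and MatCreateVecs()/MatGetRowSum() are the real 
> PETSc calls:
> 
> ```c
> /* Verify that each row of the prolongation P sums to 1,
>    i.e. that P reproduces constants. */
> Vec rowsum;
> PetscCall(MatCreateVecs(P, NULL, &rowsum));   /* vector in the range of P */
> PetscCall(MatGetRowSum(P, rowsum));
> PetscCall(VecView(rowsum, PETSC_VIEWER_STDOUT_WORLD)); /* should be all 1s */
> PetscCall(VecDestroy(&rowsum));
> ```
> )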
>>> 
>>> * I tried setting galerkin=none and I got this error. 
>> 
>>   This is because without Galerkin it needs to restrict the current solution 
>> and then compute the coarse grid Jacobian. Since you did not provide a DM 
>> that has the ability to even generate coarse grid vectors, the process 
>> cannot work. You need a DM that can provide the coarse grid vectors and 
>> restrict solutions. Did you forget to pass a DM to the solver?
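>> 
>>   For reference, the two modes on the command line (the executable name 
>> here is just a placeholder):
>> 
>> ```shell
>> # Galerkin coarse operators: A_coarse = R A P, no rediscretization needed
>> ./ex96 -pc_type mg -pc_mg_galerkin both
>> # Rediscretized coarse operators: the DM must be able to create coarse
>> # grid vectors and restrict the current solution
>> ./ex96 -pc_type mg -pc_mg_galerkin none
>> ```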
> 
> The DM does everything. It is similar to many examples, but I've been 
> checking against snes/tutorials/ex12.c today. 
> I do call:
> PetscCall(DMSetCoarseDM(dmhierarchy[r], dmhierarchy[r-1]));
> But I am missing something else that goes on in DMRefineHierarchy, which I 
> can't use because I am semi-coarsening.
> I probably have to build a section on each DM or something, but I have bigger 
> fish to fry at this point.
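> (For the record, the pattern I believe I need is roughly the following 
> sketch; the loop variable and array name mirror my code above, and the 
> single-field P1 setup is an assumption borrowed from ex12.c:
> 
> ```c
> /* Wire up by hand what DMRefineHierarchy would normally do. */
> for (PetscInt r = 1; r < nlevels; ++r) {
>   PetscCall(DMSetCoarseDM(dmhierarchy[r], dmhierarchy[r - 1]));
>   /* each level needs its own discretization/section to create vectors */
>   PetscCall(DMCopyDisc(dmhierarchy[r - 1], dmhierarchy[r]));
> }
> ```
> )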
> 
> (I construct a 2D coarse grid, refine it a number of times, and 
> DMPlexExtrude each one by the same amount (layer count and distance); the 
> extruded direction is wrapped around a torus and made periodic. 
> The fine grid now looks like, and will eventually be, the grids that tokamak 
> codes use.)
> 
> Thanks,
> Mark
>  
>>> 
>>> Any thoughts on either of these issues?
>>> 
>>> Thanks,
>>> Mark
>>> 
>>> [0]PETSC ERROR: --------------------- Error Message 
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: Must call DMShellSetGlobalVector() or 
>>> DMShellSetCreateGlobalVector()
>>> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could 
>>> be the program crashed before they were used or a spelling mistake, etc!
>>> [0]PETSC ERROR:   Option left: name:-ksp_converged_reason (no value) 
>>> source: command line
>>> [0]PETSC ERROR:   Option left: name:-mg_levels_esteig_ksp_type value: cg 
>>> source: command line
>>> [0]PETSC ERROR:   Option left: name:-mg_levels_pc_type value: sor source: 
>>> command line
>>> [0]PETSC ERROR:   Option left: name:-options_left (no value) source: 
>>> command line
>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.19.1-224-g9ed82936d20  
>>> GIT Date: 2023-05-07 12:33:48 -0400
>>> [0]PETSC ERROR: ./ex96 on a arch-macosx-gnu-O named MarksMac-302.local by 
>>> markadams Tue May  9 12:26:52 2023
>>> [0]PETSC ERROR: Configure options CFLAGS="-g  -Wall" CXXFLAGS="-g  -Wall" 
>>> COPTFLAGS=-O CXXOPTFLAGS=-O --with-cc=/usr/local/opt/llvm/bin/clang 
>>> --with-cxx=/usr/local/opt/llvm/bin/clang++ --download-mpich 
>>> --with-strict-petscerrorcode --download-triangle=1 --with-x=0 
>>> --with-debugging=0 --download-hdf5=1 PETSC_ARCH=arch-macosx-gnu-O
>>> [0]PETSC ERROR: #1 DMCreateGlobalVector_Shell() at 
>>> /Users/markadams/Codes/petsc/src/dm/impls/shell/dmshell.c:210
>>> [0]PETSC ERROR: #2 DMCreateGlobalVector() at 
>>> /Users/markadams/Codes/petsc/src/dm/interface/dm.c:1022
>>> [0]PETSC ERROR: #3 DMGetNamedGlobalVector() at 
>>> /Users/markadams/Codes/petsc/src/dm/interface/dmget.c:377
>>> [0]PETSC ERROR: #4 DMRestrictHook_SNESVecSol() at 
>>> /Users/markadams/Codes/petsc/src/snes/interface/snes.c:649
>>> [0]PETSC ERROR: #5 DMRestrict() at 
>>> /Users/markadams/Codes/petsc/src/dm/interface/dm.c:3407
>>> [0]PETSC ERROR: #6 PCSetUp_MG() at 
>>> /Users/markadams/Codes/petsc/src/ksp/pc/impls/mg/mg.c:1074
>>> [0]PETSC ERROR: #7 PCSetUp() at 
>>> /Users/markadams/Codes/petsc/src/ksp/pc/interface/precon.c:994
>>> [0]PETSC ERROR: #8 KSPSetUp() at 
>>> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c:406
>>> [0]PETSC ERROR: #9 KSPSolve_Private() at 
>>> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c:824
>>> [0]PETSC ERROR: #10 KSPSolve() at 
>>> /Users/markadams/Codes/petsc/src/ksp/ksp/interface/itfunc.c:1070
>>> [0]PETSC ERROR: #11 SNESSolve_KSPONLY() at 
>>> /Users/markadams/Codes/petsc/src/snes/impls/ksponly/ksponly.c:48
>>> [0]PETSC ERROR: #12 SNESSolve() at 
>>> /Users/markadams/Codes/petsc/src/snes/interface/snes.c:4663
>>> [0]PETSC ERROR: #13 main() at ex96.c:433
>>> 
>>> 
>>> 
>> 
