Dear petsc-users,

    A student and I are trying to modify the MG example ex65 to use MatShell operators instead of assembled matrices. Our use case is a method that will rely on custom shell operators and shell interpolation/restriction matrices.

To start, we modified ex65 and tried to replace the standard restriction/interpolation with a shell matrix that performs the same operations on the DMDA grids.

I attach our modifications to ex65.
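
In short, instead of calling DMCreateInterpolation on the underlying DMDAs, our interpolation hook now builds a MatShell whose MATOP_MULT / MATOP_MULT_TRANSPOSE point at our own routines. Condensed from the attached file (printouts omitted; da1/da2 are the DMDAs stored in the coarse/fine shell DMs), the hook boils down to:

    PetscInt xLenCoarse, xLenFine;
    PetscCall(DMDAGetCorners(da1, NULL, NULL, NULL, &xLenCoarse, NULL, NULL));
    PetscCall(DMDAGetCorners(da2, NULL, NULL, NULL, &xLenFine, NULL, NULL));
    PetscCall(MatCreateShell(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, xLenFine, xLenCoarse, NULL, mat));
    PetscCall(MatShellSetOperation(*mat, MATOP_MULT, (void (*)(void))MyInterpolation));
    PetscCall(MatShellSetOperation(*mat, MATOP_MULT_TRANSPOSE, (void (*)(void))MyRestriction));
    PetscCall(MatSetUp(*mat));
    /* note: we never touch the Vec* output argument of the hook */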

The problem is that the code sometimes completes execution and sometimes errors out like this:

$ ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1
>> Created DMshell 0x55bbc83c91a0 (0x55bbc8390a80)
Calling KSPSolve from main
computeRHS on grid 513
computeMatrix on grid 513
Inside Coarsen
>> Created DMshell 0x55bbc84c5270 (0x55bbc84b2ff0)
Inside Coarsen
>> Created DMshell 0x55bbc84e0a30 (0x55bbc84bdb60)
>> Create interpolation from 0x55bbc84c5270(0x55bbc84b2ff0) to 0x55bbc83c91a0(0x55bbc8390a80)
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid pointer
[0]PETSC ERROR: Invalid Pointer to PetscObject: Argument 'obj' (parameter # 1)
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.22.0, unknown
[0]PETSC ERROR: ./ex65shell with 1 MPI process(es) and PETSC_ARCH  on signalkuppe by matteo Tue Dec 10 12:55:12 2024
[0]PETSC ERROR: Configure options: --prefix=/home/matteo/software/petscsaved/3.22-opt/ PETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg --with-debugging=1 --with-strict-petscerrorcode --download-hdf5 --download-ml --with-metis --with-parmetis --with-gmsh --with-triangle --with-zlib --with-p4est-dir=~/software/p4est/local/
[0]PETSC ERROR: #1 PetscObjectReference() at /home/matteo/software/petsc/src/sys/objects/inherit.c:620
[0]PETSC ERROR: #2 PCMGSetRScale() at /home/matteo/software/petsc/src/ksp/pc/impls/mg/mgfunc.c:394
[0]PETSC ERROR: #3 PCSetUp_MG() at /home/matteo/software/petsc/src/ksp/pc/impls/mg/mg.c:998
[0]PETSC ERROR: #4 PCSetUp() at /home/matteo/software/petsc/src/ksp/pc/interface/precon.c:1071
[0]PETSC ERROR: #5 KSPSetUp() at /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:415
[0]PETSC ERROR: #6 KSPSolve_Private() at /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:826
[0]PETSC ERROR: #7 KSPSolve() at /home/matteo/software/petsc/src/ksp/ksp/interface/itfunc.c:1075
[0]PETSC ERROR: #8 main() at ../src/ex65shell.c:79
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -da_refine 2 (source: command line)
[0]PETSC ERROR: -ksp_monitor (source: command line)
[0]PETSC ERROR: -ksp_rtol 1e-1 (source: command line)
[0]PETSC ERROR: -pc_type mg (source: command line)
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------

In all cases, valgrind complains like this:

$ valgrind ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1 -int_view ascii::ascii_info
==2130767== Memcheck, a memory error detector
==2130767== Copyright (C) 2002-2022, and GNU GPL'd, by Julian Seward et al.
==2130767== Using Valgrind-3.19.0 and LibVEX; rerun with -h for copyright info
==2130767== Command: ./ex65shell -ksp_monitor -pc_type mg -da_refine 2 -ksp_rtol 1e-1 -int_view ascii::ascii_info
==2130767==
hwloc x86 backend cannot work under Valgrind, disabling.
May be reenabled by dumping CPUIDs with hwloc-gather-cpuid
and reloading them under Valgrind with HWLOC_CPUID_PATH.
>> Created DMshell 0xfbf50f0 (0xfb49480)
Calling KSPSolve from main
computeRHS on grid 513
computeMatrix on grid 513
Inside Coarsen
>> Created DMshell 0xff404a0 (0xff062d0)
Inside Coarsen
>> Created DMshell 0xff78ca0 (0xff4a530)
>> Create interpolation from 0xff404a0(0xff062d0) to 0xfbf50f0(0xfb49480)
Mat Object: 1 MPI process
 type: shell
 rows=513, cols=257
==2130767== Conditional jump or move depends on uninitialised value(s)
==2130767==    at 0x83FDAAE: PCSetUp_MG (mg.c:998)
==2130767==    by 0x8620C04: PCSetUp (precon.c:1071)
==2130767==    by 0x7DA78D5: KSPSetUp (itfunc.c:415)
==2130767==    by 0x7DB2093: KSPSolve_Private (itfunc.c:826)
==2130767==    by 0x7DB7A53: KSPSolve (itfunc.c:1075)
==2130767==    by 0x10D2F7: main (ex65shell.c:79)
==2130767==
==2130767== Conditional jump or move depends on uninitialised value(s)
==2130767==    at 0x5794061: VecDestroy (vector.c:570)
==2130767==    by 0x83FDC36: PCSetUp_MG (mg.c:999)
==2130767==    by 0x8620C04: PCSetUp (precon.c:1071)
==2130767==    by 0x7DA78D5: KSPSetUp (itfunc.c:415)
==2130767==    by 0x7DB2093: KSPSolve_Private (itfunc.c:826)
==2130767==    by 0x7DB7A53: KSPSolve (itfunc.c:1075)
==2130767==    by 0x10D2F7: main (ex65shell.c:79)
==2130767==

We are clearly doing something wrong, since PCSetUp_MG (mg.c:998) is a region of code I would not have expected us to reach.
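
From the stack and the valgrind hits we guess (this is only our reading of the trace, not verbatim PETSc source) that around mg.c:998 PCSetUp_MG does something along these lines:

    Mat P;
    Vec rscale;
    PetscCall(DMCreateInterpolation(dmc, dmf, &P, &rscale)); /* calls our hook, which never sets rscale */
    PetscCall(PCMGSetInterpolation(pc, l, P));
    if (rscale) PetscCall(PCMGSetRScale(pc, l, rscale));     /* would sometimes see a garbage pointer */
    PetscCall(VecDestroy(&rscale));                          /* would match the valgrind hit at mg.c:999 */
    PetscCall(MatDestroy(&P));

so the uninitialised value may be the rscale Vec that our CreateInterpolation hook never fills in, but we are not sure that is the whole story.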

Can you advise on the proper way to achieve our goal?

Best regards

    Matteo
36,38d35
< static PetscErrorCode MyInterpolation(Mat mat, Vec coarse, Vec fine);
< static PetscErrorCode MyRestriction(Mat mat, Vec fine, Vec coarse);
< 
52d48
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD,">> Created DMshell %p (%p)\n",*shell,da));
77d72
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Calling KSPSolve from main\n"));
105c100
<   DM da1, da2;    // dm1 is the coarse DM, dm2 is the fine one
---
>   DM da1, da2;
110,157c105
<   // PetscCall(DMCreateInterpolation(da1, da2, mat, vec));
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD,">> Create interpolation from %p(%p) to %p(%p)\n",dm1,da1,dm2,da2));
< 
<   // getting the corners of the grids
<   PetscInt xLenCoarse, xLenFine;  
<   PetscCall(DMDAGetCorners(da1, NULL, NULL, NULL, &xLenCoarse, NULL, NULL));
<   PetscCall(DMDAGetCorners(da2, NULL, NULL, NULL, &xLenFine, NULL, NULL));
<   
<   PetscCall(MatCreateShell(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, xLenFine, xLenCoarse, NULL, mat));
<   PetscCall(MatShellSetOperation(*mat, MATOP_MULT, (void(*)(void))MyInterpolation));
<   PetscCall(MatShellSetOperation(*mat, MATOP_MULT_TRANSPOSE, (void(*)(void))MyRestriction));
<   PetscCall(MatSetUp(*mat));
< 
< 
<   PetscCall(MatViewFromOptions(*mat, NULL, "-int_view"));
<   //PetscCall(VecViewFromOptions(*vec, NULL, "-int_view"));
<   PetscFunctionReturn(PETSC_SUCCESS);
< }
< 
< static PetscErrorCode MyInterpolation(Mat mat, Vec coarse, Vec fine){
<   PetscFunctionBeginUser;
< 
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Inside myInterpolation\n"));
< 
<   PetscInt size;
< 
<   // get size of the vector
<   PetscCall(VecGetSize(coarse, &size));
< 
<   PetscCall(VecViewFromOptions(coarse, NULL, "-int_view"));
<   PetscCall(VecViewFromOptions(fine, NULL, "-int_view"));
< 
<   // setup the two vectors
<   const PetscScalar *vecCoarse;
<   PetscScalar *vecFine;
<   PetscCall(VecGetArrayRead(coarse, &vecCoarse));
<   PetscCall(VecGetArray(fine, &vecFine));
< 
<   for (PetscInt i = 0; i < size; i++)
<   {
<     if (i%2 == 0) vecFine[i] = vecCoarse[i];
< 
<     else vecFine[i] = 0.5*(vecCoarse[i-1]+vecCoarse[i+1]);
<   }
<   
<   PetscCall(VecRestoreArrayRead(coarse, &vecCoarse)); 
<   PetscCall(VecRestoreArray(fine, &vecFine));
< 
---
>   PetscCall(DMCreateInterpolation(da1, da2, mat, vec));
164,166c112
<   // Mat tmat;
<   
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD,"<< Create restriction\n"));
---
>   Mat tmat;
171,214c117,119
<   // PetscCall(DMCreateInterpolation(da1, da2, &tmat, NULL));
<   // PetscCall(MatTranspose(tmat, MAT_INITIAL_MATRIX, mat));
<   // PetscCall(MatDestroy(&tmat));
< 
<   // getting the corners of the grids
<   PetscInt xLenCoarse, xLenFine;  
<   PetscCall(DMDAGetCorners(da1, NULL, NULL, NULL, &xLenCoarse, NULL, NULL));
<   PetscCall(DMDAGetCorners(da2, NULL, NULL, NULL, &xLenFine, NULL, NULL));
<     
<   PetscCall(MatCreateShell(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, xLenCoarse, xLenFine, NULL, mat));
<   PetscCall(MatShellSetOperation(*mat, MATOP_MULT, (void(*)(void))MyRestriction));
<   PetscCall(MatShellSetOperation(*mat, MATOP_MULT_TRANSPOSE, (void(*)(void))MyInterpolation));
< 
<   PetscCall(MatViewFromOptions(*mat, NULL, "-int_view")); 
<   PetscFunctionReturn(PETSC_SUCCESS);
< }
< 
< static PetscErrorCode MyRestriction(Mat mat, Vec fine, Vec coarse){
<   PetscFunctionBeginUser;
< 
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Inside myRestriction\n"));
< 
<   PetscInt size;
< 
<   // get size of the vector
<   PetscCall(VecGetSize(coarse, &size));
< 
<   PetscCall(VecViewFromOptions(coarse, NULL, "-int_view"));
<   PetscCall(VecViewFromOptions(fine, NULL, "-int_view"));
< 
<   // setup the two vectors
<   const PetscScalar *vecFine;
<   PetscScalar *vecCoarse;
<   PetscCall(VecGetArray(coarse, &vecCoarse));
<   PetscCall(VecGetArrayRead(fine, &vecFine));
< 
<   for (PetscInt i = 0; i < size; i++)
<   {
<     vecCoarse[i] = vecFine[i];
<   }
<   
<   PetscCall(VecRestoreArray(coarse, &vecCoarse)); 
<   PetscCall(VecRestoreArrayRead(fine, &vecFine));
< 
---
>   PetscCall(DMCreateInterpolation(da1, da2, &tmat, NULL));
>   PetscCall(MatTranspose(tmat, MAT_INITIAL_MATRIX, mat));
>   PetscCall(MatDestroy(&tmat));
245d149
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Inside Refine\n"));
257d160
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Inside Coarsen\n"));
274d176
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "computeRHS on grid %d\n",mx));
297d198
<   PetscCall(PetscPrintf(PETSC_COMM_WORLD, "computeMatrix on grid %d\n",mx));
