> On May 17, 2023, at 11:10 AM, Leonardo Mutti 
> <[email protected]> wrote:
> 
> Dear developers, let me kindly ask for your help again. 
> In the following snippet, a block bi-diagonal matrix A is set up. It is 8x8 
> blocks, each block being 2x2 elements, i.e. 16x16 entries overall. I would 
> like to create the correct IS objects for PCGASM. 
> The non-overlapping IS should be: [0,1], [2,3], [4,5], ..., [14,15]. The 
> overlapping IS should be: [0,1], [0,1,2,3], [2,3,4,5], ..., [12,13,14,15].
> I am running the code with 4 processors. For some reason, after calling 
> PCGASMDestroySubdomains the code crashes with severe (157): Program Exception 
> - access violation. A visual inspection of the indices using ISView looks 
> good.

  Likely memory corruption or use of an object or an array that was already 
freed. Best to use Valgrind to find the exact location of the mess.
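
  For an MPI run that is usually something along these lines (the executable 
name and its options here are just placeholders for your own):

   mpiexec -n 4 valgrind -q --tool=memcheck --num-callers=20 --log-file=valgrind.log.%p ./your_app -your_options

  which writes one log file per rank; look for "Invalid read" / "Invalid 
write" entries. Note that Valgrind runs on Linux, so since you are on Windows 
you would need a Linux machine (or WSL) to use it.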


> Thanks again,
> Leonardo
> 
> #include <petsc/finclude/petscmat.h>
> #include <petsc/finclude/petscksp.h>
> #include <petsc/finclude/petscpc.h>
>       USE petscmat
>       USE petscksp
>       USE petscpc
>       USE MPI
> 
>       Mat   :: A
>       Vec   :: b
>       PetscInt :: M,N_blocks,block_size,I,J,NSub,converged_reason,srank,erank,color,subcomm
>       PetscMPIInt :: size
>       PetscErrorCode :: ierr
>       PetscScalar :: v
>       KSP            :: ksp
>       PC             :: pc
>       IS,ALLOCATABLE :: subdomains_IS(:), inflated_IS(:)
>       PetscInt            :: NMPI,MYRANK,IERMPI
>       INTEGER  :: IS_counter, is_start, is_end
>          
>       call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>       call PetscLogDefaultBegin(ierr)
>       call MPI_COMM_SIZE(MPI_COMM_WORLD, NMPI, IERMPI)
>       CALL MPI_COMM_RANK(MPI_COMM_WORLD, MYRANK,IERMPI)   
>             
>       N_blocks = 8
>       block_size = 2
>       M = N_blocks * block_size
>                         
>       ALLOCATE(subdomains_IS(N_blocks))
>       ALLOCATE(inflated_IS(N_blocks))
>       
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! ASSUMPTION: no block spans more than one rank (the inflated blocks can)
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!      
>             
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! INTRO: create matrix and right hand side
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       
>       ! How many inflated blocks span more than one rank? NMPI-1 !
>       
>       call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>      & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
>       call VecCreate(PETSC_COMM_WORLD,b,ierr)
>       call VecSetSizes(b, PETSC_DECIDE, M,ierr)
>       call VecSetFromOptions(b,ierr)
> 
>       DO I=(MYRANK*(M/NMPI)),((MYRANK+1)*(M/NMPI)-1)
>          
>          ! Set matrix
>          v=1
>          call MatSetValue(A, I, I, v, INSERT_VALUES, ierr)
>          IF (I-block_size .GE. 0) THEN
>             v=-1
>             call MatSetValue(A, I, I-block_size, v, INSERT_VALUES, ierr)
>          ENDIF
>          ! Set rhs
>          v = I
>          call VecSetValue(b,I,v, INSERT_VALUES,ierr)
>          
>       END DO       
>                
>       call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>       call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>       call VecAssemblyBegin(b,ierr)
>       call VecAssemblyEnd(b,ierr)
> 
>       
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       ! FIRST KSP/PC SETUP
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       
>       call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>       call KSPSetOperators(ksp, A, A, ierr)
>       call KSPSetType(ksp, 'preonly', ierr)
>       call KSPGetPC(ksp, pc, ierr)
>       call PCSetType(pc, PCGASM, ierr)
> 
> 
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>       !! GASM, SETTING SUBDOMAINS
>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
> 
>       DO IS_COUNTER=1,N_blocks
>          
>          srank = MAX(((IS_COUNTER-2)*block_size)/(M/NMPI),0) ! start rank reached by inflated block
>          erank = MIN(((IS_COUNTER-1)*block_size)/(M/NMPI),NMPI-1) ! end rank reached by inflated block; coincides with rank containing the non-inflated block
> 
>          ! Create subcomms
>          color = MPI_UNDEFINED
>          IF (MYRANK == srank .or. MYRANK == erank) THEN
>             color = 1
>          ENDIF
>          call MPI_Comm_split(MPI_COMM_WORLD,color,MYRANK,subcomm,ierr)
> 
>          ! Create IS
>          IF (srank .EQ. erank) THEN   ! Block and overlap are on the same rank
>             IF (MYRANK .EQ. srank) THEN
>                call ISCreateStride(PETSC_COMM_SELF,block_size,(IS_COUNTER-1)*block_size,1,subdomains_IS(IS_COUNTER),ierr)
>                IF (IS_COUNTER .EQ. 1) THEN  ! the first block is not inflated
>                   call ISCreateStride(PETSC_COMM_SELF,block_size,(IS_COUNTER-1)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>                ELSE
>                   call ISCreateStride(PETSC_COMM_SELF,2*block_size,(IS_COUNTER-2)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>                ENDIF
>             ENDIF
>          ELSE ! Block and overlap not on the same rank
>             IF (MYRANK == erank) THEN  ! the block
>                call ISCreateStride(subcomm,block_size,(IS_COUNTER-1)*block_size,1,subdomains_IS(IS_COUNTER),ierr)
>                call ISCreateStride(subcomm,block_size,(IS_COUNTER-1)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>             ENDIF
>             IF (MYRANK == srank) THEN  ! the overlap
>                call ISCreateStride(subcomm,block_size,(IS_COUNTER-2)*block_size,1,inflated_IS(IS_COUNTER),ierr)
>                call ISCreateStride(subcomm,0,(IS_COUNTER-1)*block_size,1,subdomains_IS(IS_COUNTER),ierr)
>             ENDIF
>          ENDIF
>          
>          call MPI_Comm_free(subcomm, ierr)
>       END DO
>       
>       ! Set the domains/subdomains
>       NSub = N_blocks/NMPI
>       is_start = 1 + myrank * NSub
>       is_end = min(is_start + NSub, N_blocks)
>       if (myrank + 1 < NMPI) then         
>          NSub = NSub + 1
>       endif      
>          
>       call PCGASMSetSubdomains(pc,NSub,subdomains_IS(is_start:is_end),inflated_IS(is_start:is_end),ierr)
>       call PCGASMDestroySubdomains(NSub,subdomains_IS(is_start:is_end),inflated_IS(is_start:is_end),ierr)
>             
>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,"-sub_ksp_type", "gmres", ierr)
>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,"-sub_pc_type", "none", ierr)
>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,"-pc_gasm_view_subdomains", "1", ierr)
>       
>       call KSPSetUp(ksp, ierr)
>       call PCSetUp(pc, ierr)
>       call KSPSetFromOptions(ksp, ierr)
>       call PCSetFromOptions(pc, ierr)
>       
>       call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD, ierr)
>       
> 
On Wed, May 10, 2023 at 3:02 AM Barry Smith <[email protected]> wrote:
>> 
>> 
>>> On May 9, 2023, at 4:58 PM, LEONARDO MUTTI 
>>> <[email protected]> wrote:
>>> 
>>> In my notation diag(1,1) means a diagonal 2x2 matrix with 1,1 on the 
>>> diagonal, a submatrix of the 8x8 diagonal matrix diag(1,1,2,2,...,2). 
>>> Am I then correct that the IS representing diag(1,1) is 0,1, and that 
>>> diag(2,2,...,2) is represented by 2,3,4,5,6,7?
>> 
>> I believe so
>> 
>>> Thanks,
>>> Leonardo
>>> 
>>> On Tue, May 9, 2023 at 8:45 PM Barry Smith <[email protected]> wrote:
>>>> 
>>>>   It is simpler than you are making it out to be. Each IS[] is a list of 
>>>> the rows (and columns) that make up a sub(domain) matrix. In your case, 
>>>> with the 144 by 144 matrix, the indices will go from 0 to 143.
>>>> 
>>>>   In your simple Fortran code you have a completely different problem: a 
>>>> matrix with 8 rows and columns. In that case, if you want the first IS to 
>>>> represent just the first row (and column) of the matrix, then it should 
>>>> contain only 0. The second submatrix, which is all rows (but the first), 
>>>> should have 1,2,3,4,5,6,7.
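>>>> 
>>>>   As an untested sketch (the variable names here are just placeholders), 
>>>> those two index sets could be created with
>>>> 
>>>>       PetscInt :: n1 = 1, n7 = 7
>>>>       PetscInt :: idx1(1), idx2(7)
>>>>       IS       :: is1, is2
>>>>       idx1 = [0]
>>>>       idx2 = [1,2,3,4,5,6,7]
>>>>       ! each IS lists the matrix rows/columns belonging to one subdomain
>>>>       call ISCreateGeneral(PETSC_COMM_SELF,n1,idx1,PETSC_COPY_VALUES,is1,ierr)
>>>>       call ISCreateGeneral(PETSC_COMM_SELF,n7,idx2,PETSC_COPY_VALUES,is2,ierr)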
>>>> 
>>>>   I do not understand why your code has 
>>>> 
>>>>>>>> indices_first_domain = [0,1,8,9] ! corresponds to diag(1,1)
>>>> 
>>>> it should just be 0
>>>> 
>>>> 
>>>> 
>>>> 
>>>> 
>>>>> On May 9, 2023, at 12:44 PM, LEONARDO MUTTI 
>>>>> <[email protected]> wrote:
>>>>> 
>>>>> Correcting a typo in my message below: I expect 9x(16+16) numbers to be 
>>>>> stored in subdomain_IS: # subdomains x (row indices of the submatrix + 
>>>>> col indices of the submatrix).
>>>>> 
>>>>> On Tue, May 9, 2023 at 6:31 PM LEONARDO MUTTI 
>>>>> <[email protected]> wrote:
>>>>>> 
>>>>>> 
>>>>>> ---------- Forwarded message ---------
>>>>>> From: LEONARDO MUTTI <[email protected]>
>>>>>> Date: Tue, May 9, 2023 at 6:29 PM
>>>>>> Subject: Re: [petsc-users] Understanding index sets for PCGASM
>>>>>> To: Matthew Knepley <[email protected]>
>>>>>> 
>>>>>> 
>>>>>> Thank you for your answer, but I am still confused, sorry. 
>>>>>> Consider 
>>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90 
>>>>>> on one processor.
>>>>>> Let M=12 for the sake of simplicity, i.e. we deal with a 12x12 2D grid, 
>>>>>> hence a 144x144 matrix.
>>>>>> Let NSubx = 3, so that we subdivide the grid 3 times vertically and 3 
>>>>>> times horizontally.
>>>>>> We should obtain 9 subdomains that are grids of 4x4 nodes each, thus 
>>>>>> corresponding to 9 submatrices of size 16x16. 
>>>>>> In my run I obtain NSub = 9 (great) and subdomain_IS(i), i=1,...,9, 
>>>>>> reads:
>>>>>> 
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 0
>>>>>> 1 1
>>>>>> 2 2
>>>>>> 3 3
>>>>>> 4 12
>>>>>> 5 13
>>>>>> 6 14
>>>>>> 7 15
>>>>>> 8 24
>>>>>> 9 25
>>>>>> 10 26
>>>>>> 11 27
>>>>>> 12 36
>>>>>> 13 37
>>>>>> 14 38
>>>>>> 15 39
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 4
>>>>>> 1 5
>>>>>> 2 6
>>>>>> 3 7
>>>>>> 4 16
>>>>>> 5 17
>>>>>> 6 18
>>>>>> 7 19
>>>>>> 8 28
>>>>>> 9 29
>>>>>> 10 30
>>>>>> 11 31
>>>>>> 12 40
>>>>>> 13 41
>>>>>> 14 42
>>>>>> 15 43
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 8
>>>>>> 1 9
>>>>>> 2 10
>>>>>> 3 11
>>>>>> 4 20
>>>>>> 5 21
>>>>>> 6 22
>>>>>> 7 23
>>>>>> 8 32
>>>>>> 9 33
>>>>>> 10 34
>>>>>> 11 35
>>>>>> 12 44
>>>>>> 13 45
>>>>>> 14 46
>>>>>> 15 47
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 48
>>>>>> 1 49
>>>>>> 2 50
>>>>>> 3 51
>>>>>> 4 60
>>>>>> 5 61
>>>>>> 6 62
>>>>>> 7 63
>>>>>> 8 72
>>>>>> 9 73
>>>>>> 10 74
>>>>>> 11 75
>>>>>> 12 84
>>>>>> 13 85
>>>>>> 14 86
>>>>>> 15 87
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 52
>>>>>> 1 53
>>>>>> 2 54
>>>>>> 3 55
>>>>>> 4 64
>>>>>> 5 65
>>>>>> 6 66
>>>>>> 7 67
>>>>>> 8 76
>>>>>> 9 77
>>>>>> 10 78
>>>>>> 11 79
>>>>>> 12 88
>>>>>> 13 89
>>>>>> 14 90
>>>>>> 15 91
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 56
>>>>>> 1 57
>>>>>> 2 58
>>>>>> 3 59
>>>>>> 4 68
>>>>>> 5 69
>>>>>> 6 70
>>>>>> 7 71
>>>>>> 8 80
>>>>>> 9 81
>>>>>> 10 82
>>>>>> 11 83
>>>>>> 12 92
>>>>>> 13 93
>>>>>> 14 94
>>>>>> 15 95
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 96
>>>>>> 1 97
>>>>>> 2 98
>>>>>> 3 99
>>>>>> 4 108
>>>>>> 5 109
>>>>>> 6 110
>>>>>> 7 111
>>>>>> 8 120
>>>>>> 9 121
>>>>>> 10 122
>>>>>> 11 123
>>>>>> 12 132
>>>>>> 13 133
>>>>>> 14 134
>>>>>> 15 135
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 100
>>>>>> 1 101
>>>>>> 2 102
>>>>>> 3 103
>>>>>> 4 112
>>>>>> 5 113
>>>>>> 6 114
>>>>>> 7 115
>>>>>> 8 124
>>>>>> 9 125
>>>>>> 10 126
>>>>>> 11 127
>>>>>> 12 136
>>>>>> 13 137
>>>>>> 14 138
>>>>>> 15 139
>>>>>> IS Object: 1 MPI process
>>>>>>   type: general
>>>>>> Number of indices in set 16
>>>>>> 0 104
>>>>>> 1 105
>>>>>> 2 106
>>>>>> 3 107
>>>>>> 4 116
>>>>>> 5 117
>>>>>> 6 118
>>>>>> 7 119
>>>>>> 8 128
>>>>>> 9 129
>>>>>> 10 130
>>>>>> 11 131
>>>>>> 12 140
>>>>>> 13 141
>>>>>> 14 142
>>>>>> 15 143
>>>>>> 
>>>>>> As you said, no number here reaches 144. 
>>>>>> But the numbers stored in subdomain_IS total 9x16 = #subdomains x 16, 
>>>>>> whereas I would expect, also given your latest reply, 
>>>>>> 9x16x16x2 = #subdomains x submatrix height x submatrix width x length of 
>>>>>> a (row,column) pair.
>>>>>> It would really help me if you could briefly explain how the output 
>>>>>> above encodes the subdivision into subdomains.
>>>>>> Many thanks again,
>>>>>> Leonardo
>>>>>> 
>>>>>> 
>>>>>> On Tue, May 9, 2023 at 4:24 PM Matthew Knepley <[email protected]> wrote:
>>>>>>> On Tue, May 9, 2023 at 10:05 AM LEONARDO MUTTI 
>>>>>>> <[email protected]> wrote:
>>>>>>>> Great, thanks! I can now successfully run 
>>>>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/ex71f.F90.
>>>>>>>> 
>>>>>>>> Going forward with my experiments, let me post a new code snippet 
>>>>>>>> (very similar to ex71f.F90) that I cannot get to work; probably I am 
>>>>>>>> setting up the IS objects incorrectly.
>>>>>>>> 
>>>>>>>> I have an 8x8 matrix A=diag(1,1,2,2,...,2) and a vector 
>>>>>>>> b=(0.5,...,0.5). We have only one processor, and I want to solve Ax=b 
>>>>>>>> using GASM. In particular, KSP is set to preonly, GASM is the 
>>>>>>>> preconditioner, and it uses an LU direct solver on each submatrix 
>>>>>>>> (sub_ksp = preonly, sub_pc = lu).
>>>>>>>> 
>>>>>>>> For the GASM algorithm, I divide A into diag(1,1) and diag(2,2,...,2). 
>>>>>>>> For simplicity I set 0 overlap. Now I want to use GASM to solve Ax=b. 
>>>>>>>> The code follows.
>>>>>>>> 
>>>>>>>> #include <petsc/finclude/petscmat.h>
>>>>>>>> #include <petsc/finclude/petscksp.h>
>>>>>>>> #include <petsc/finclude/petscpc.h>
>>>>>>>>       USE petscmat
>>>>>>>>       USE petscksp
>>>>>>>>       USE petscpc
>>>>>>>>       USE MPI
>>>>>>>> 
>>>>>>>>       Mat   :: A
>>>>>>>>       Vec   :: b, x
>>>>>>>>       PetscInt :: M, I, J, ISLen, NSub
>>>>>>>>       PetscMPIInt :: size
>>>>>>>>       PetscErrorCode :: ierr
>>>>>>>>       PetscScalar :: v
>>>>>>>>       KSP            :: ksp
>>>>>>>>       PC             :: pc
>>>>>>>>       IS :: subdomains_IS(2), inflated_IS(2)
>>>>>>>>       PetscInt,DIMENSION(4) :: indices_first_domain
>>>>>>>>       PetscInt,DIMENSION(36) :: indices_second_domain
>>>>>>>>       
>>>>>>>>       call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>>>>>>>>       call MPI_Comm_size(PETSC_COMM_WORLD, size, ierr)
>>>>>>>> 
>>>>>>>> 
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       ! INTRO: create matrix and right hand side
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       
>>>>>>>>       WRITE(*,*) "Assembling A,b"
>>>>>>>> 
>>>>>>>>       M = 8
>>>>>>>>       call MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>>>>>>      & M, M, PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,A, ierr)
>>>>>>>>       DO I=1,M
>>>>>>>>          DO J=1,M
>>>>>>>>             IF ((I .EQ. J) .AND. (I .LE. 2 )) THEN
>>>>>>>>                v = 1
>>>>>>>>             ELSE IF ((I .EQ. J) .AND. (I .GT. 2 )) THEN
>>>>>>>>                v = 2
>>>>>>>>             ELSE
>>>>>>>>                v = 0
>>>>>>>>             ENDIF
>>>>>>>>             call MatSetValue(A, I-1, J-1, v, INSERT_VALUES, ierr)
>>>>>>>>          END DO
>>>>>>>>       END DO
>>>>>>>> 
>>>>>>>>       call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
>>>>>>>>       call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
>>>>>>>>       
>>>>>>>>       call VecCreate(PETSC_COMM_WORLD,b,ierr)
>>>>>>>>       call VecSetSizes(b, PETSC_DECIDE, M,ierr)
>>>>>>>>       call VecSetFromOptions(b,ierr)
>>>>>>>>       
>>>>>>>>       do I=1,M
>>>>>>>>          v = 0.5
>>>>>>>>          call VecSetValue(b,I-1,v, INSERT_VALUES,ierr)
>>>>>>>>       end do
>>>>>>>>       
>>>>>>>>       call VecAssemblyBegin(b,ierr)
>>>>>>>>       call VecAssemblyEnd(b,ierr)
>>>>>>>>                  
>>>>>>>>   
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       ! FIRST KSP/PC SETUP
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       
>>>>>>>>       WRITE(*,*) "KSP/PC first setup"
>>>>>>>>       
>>>>>>>>       call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
>>>>>>>>       call KSPSetOperators(ksp, A, A, ierr)
>>>>>>>>       call KSPSetType(ksp, 'preonly', ierr)
>>>>>>>>       call KSPGetPC(ksp, pc, ierr)
>>>>>>>>       call KSPSetUp(ksp, ierr)
>>>>>>>>       call PCSetType(pc, PCGASM, ierr)
>>>>>>>>      
>>>>>>>>   
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       ! GASM, SETTING SUBDOMAINS
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       
>>>>>>>>       WRITE(*,*) "Setting GASM subdomains"
>>>>>>>>       
>>>>>>>>       ! Let's create the subdomain IS and inflated_IS
>>>>>>>>       ! They are equal if no overlap is present
>>>>>>>>       ! They are 1: 0,1,8,9
>>>>>>>>       !          2: 10,...,15,18,...,23,...,58,...,63  
>>>>>>>>       
>>>>>>>>       indices_first_domain = [0,1,8,9] ! corresponds to diag(1,1)
>>>>>>>>       do I=0,5
>>>>>>>>          do J=0,5
>>>>>>>>             indices_second_domain(I*6+1+J) = 18 + J + 8*I ! corresponds to diag(2,2,...,2)
>>>>>>>>             !WRITE(*,*) I*6+1+J, 18 + J + 8*I
>>>>>>>>          end do
>>>>>>>>       end do
>>>>>>>>       
>>>>>>>>       ! Convert into IS
>>>>>>>>       ISLen = 4
>>>>>>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>>>>>>>>      & PETSC_COPY_VALUES, subdomains_IS(1), ierr)
>>>>>>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_first_domain,
>>>>>>>>      & PETSC_COPY_VALUES, inflated_IS(1), ierr)
>>>>>>>>       ISLen = 36
>>>>>>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>>>>>>>>      & PETSC_COPY_VALUES, subdomains_IS(2), ierr)
>>>>>>>>       call ISCreateGeneral(PETSC_COMM_WORLD,ISLen,indices_second_domain,
>>>>>>>>      & PETSC_COPY_VALUES, inflated_IS(2), ierr)
>>>>>>>> 
>>>>>>>>       NSub = 2
>>>>>>>>       call PCGASMSetSubdomains(pc,NSub,
>>>>>>>>      & subdomains_IS,inflated_IS,ierr)
>>>>>>>>       call PCGASMDestroySubdomains(NSub,
>>>>>>>>      & subdomains_IS,inflated_IS,ierr)
>>>>>>>>       
>>>>>>>>       
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       ! GASM: SET SUBSOLVERS
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!      
>>>>>>>>       
>>>>>>>>       WRITE(*,*) "Setting subsolvers for GASM"
>>>>>>>>       
>>>>>>>>       call PCSetUp(pc, ierr) ! should I add this?
>>>>>>>>       
>>>>>>>>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>>>>>>>>      & "-sub_pc_type", "lu", ierr)
>>>>>>>>       call PetscOptionsSetValue(PETSC_NULL_OPTIONS,
>>>>>>>>      & "-sub_ksp_type", "preonly", ierr)
>>>>>>>>       
>>>>>>>>       call KSPSetFromOptions(ksp, ierr)
>>>>>>>>       call PCSetFromOptions(pc, ierr)
>>>>>>>>      
>>>>>>>>   
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
>>>>>>>>       ! DUMMY SOLUTION: DID IT WORK?
>>>>>>>>       !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 
>>>>>>>>       
>>>>>>>>       WRITE(*,*) "Solve"
>>>>>>>>       
>>>>>>>>       call VecDuplicate(b,x,ierr)
>>>>>>>>       call KSPSolve(ksp,b,x,ierr)
>>>>>>>>         
>>>>>>>>       call MatDestroy(A, ierr)
>>>>>>>>       call KSPDestroy(ksp, ierr)
>>>>>>>>       call PetscFinalize(ierr)
>>>>>>>> 
>>>>>>>> This code fails at multiple points. At call PCSetUp(pc, ierr) it 
>>>>>>>> produces:
>>>>>>>> 
>>>>>>>> [0]PETSC ERROR: Argument out of range
>>>>>>>> [0]PETSC ERROR: Scatter indices in ix are out of range
>>>>>>>> ...
>>>>>>>> [0]PETSC ERROR: #1 VecScatterCreate() at 
>>>>>>>> ***\src\vec\is\sf\INTERF~1\vscat.c:736
>>>>>>>> [0]PETSC ERROR: #2 PCSetUp_GASM() at 
>>>>>>>> ***\src\ksp\pc\impls\gasm\gasm.c:433
>>>>>>>> [0]PETSC ERROR: #3 PCSetUp() at ***\src\ksp\pc\INTERF~1\precon.c:994
>>>>>>>> 
>>>>>>>> And at call KSPSolve(ksp,b,x,ierr) it produces:
>>>>>>>> 
>>>>>>>> forrtl: severe (157): Program Exception - access violation
>>>>>>>> 
>>>>>>>> The index sets are set up consistently with the output of e.g. 
>>>>>>>> https://gitlab.com/petsc/petsc/-/blob/main/src/ksp/ksp/tests/output/ex71f_1.out:
>>>>>>>>  in particular, each element of the matrix A corresponds to a number 
>>>>>>>> from 0 to 63.
>>>>>>> 
>>>>>>> This is not correct, I believe. The indices are row/col indices, not 
>>>>>>> indices into dense blocks, so for
>>>>>>> your example, they are all in [0, 8].
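>>>>>>> 
>>>>>>> Concretely, as an untested sketch for your 8x8 example (reusing the 
>>>>>>> array names from your snippet, with the sizes changed accordingly):
>>>>>>> 
>>>>>>>       PetscInt,DIMENSION(2) :: indices_first_domain
>>>>>>>       PetscInt,DIMENSION(6) :: indices_second_domain
>>>>>>>       indices_first_domain = [0,1]          ! rows/cols of diag(1,1)
>>>>>>>       indices_second_domain = [2,3,4,5,6,7] ! rows/cols of diag(2,...,2)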
>>>>>>> 
>>>>>>>   Thanks,
>>>>>>> 
>>>>>>>      Matt
>>>>>>>  
>>>>>>>> Note that each submatrix does not represent some physical subdomain; 
>>>>>>>> the subdivision is just at the algebraic level.
>>>>>>>> I thus have the following questions:
>>>>>>>> 1) is this the correct way of creating the IS objects, given my 
>>>>>>>> objective at the beginning of the email? Is the ordering correct?
>>>>>>>> 2) what am I doing wrong that is generating the above errors?
>>>>>>>> Thanks for the patience and the time.
>>>>>>>> Best, 
>>>>>>>> Leonardo
>>>>>>>> 
>>>>>>>> On Fri, May 5, 2023 at 6:43 PM Barry Smith <[email protected]> wrote:
>>>>>>>>> 
>>>>>>>>>   Added in barry/2023-05-04/add-pcgasm-set-subdomains; see also 
>>>>>>>>> https://gitlab.com/petsc/petsc/-/merge_requests/6419
>>>>>>>>> 
>>>>>>>>>   Barry
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>>> On May 4, 2023, at 11:23 AM, LEONARDO MUTTI 
>>>>>>>>>> <[email protected]> wrote:
>>>>>>>>>> 
>>>>>>>>>> Thank you for the help. 
>>>>>>>>>> Adding to my example:
>>>>>>>>>>       call PCGASMSetSubdomains(pc,NSub,subdomains_IS,inflated_IS,ierr)
>>>>>>>>>>       call PCGASMDestroySubdomains(NSub,subdomains_IS,inflated_IS,ierr)
>>>>>>>>>> results in:
>>>>>>>>>>       Error LNK2019 unresolved external symbol PCGASMDESTROYSUBDOMAINS referenced in function ...
>>>>>>>>>>       Error LNK2019 unresolved external symbol PCGASMSETSUBDOMAINS referenced in function ...
>>>>>>>>>> I'm not sure if the interfaces are missing or if I have a 
>>>>>>>>>> compilation problem.
>>>>>>>>>> Thank you again. 
>>>>>>>>>> Best,
>>>>>>>>>> Leonardo
>>>>>>>>>> 
>>>>>>>>>> On Sat, Apr 29, 2023 at 8:30 PM Barry Smith <[email protected]> wrote:
>>>>>>>>>>> 
>>>>>>>>>>>    Thank you for the test code. I have a fix in the branch 
>>>>>>>>>>> barry/2023-04-29/fix-pcasmcreatesubdomains2d 
>>>>>>>>>>> (https://gitlab.com/petsc/petsc/-/commits/barry/2023-04-29/fix-pcasmcreatesubdomains2d)
>>>>>>>>>>> with merge request 
>>>>>>>>>>> https://gitlab.com/petsc/petsc/-/merge_requests/6394
>>>>>>>>>>> 
>>>>>>>>>>>    The functions did not have proper Fortran stubs and interfaces 
>>>>>>>>>>> so I had to provide them manually in the new branch.
>>>>>>>>>>> 
>>>>>>>>>>>    Use
>>>>>>>>>>> 
>>>>>>>>>>>    git fetch
>>>>>>>>>>>    git checkout barry/2023-04-29/fix-pcasmcreatesubdomains2d
>>>>>>>>>>>    ./configure etc
>>>>>>>>>>> 
>>>>>>>>>>>    Your now-working test code is in src/ksp/ksp/tests/ex71f.F90. I 
>>>>>>>>>>> had to change things slightly, and I updated the error handling for 
>>>>>>>>>>> the latest version.
>>>>>>>>>>> 
>>>>>>>>>>>    Please let us know if you have any later questions.
>>>>>>>>>>> 
>>>>>>>>>>>   Barry
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>>> On Apr 28, 2023, at 12:07 PM, LEONARDO MUTTI 
>>>>>>>>>>>> <[email protected]> wrote:
>>>>>>>>>>>> 
>>>>>>>>>>>> Hello. I am having a hard time understanding the index sets to 
>>>>>>>>>>>> feed PCGASMSetSubdomains, and I am working in Fortran (as a PETSc 
>>>>>>>>>>>> novice). To get more intuition on how the IS objects behave I 
>>>>>>>>>>>> tried the following minimal (non) working example, which should 
>>>>>>>>>>>> tile a 16x16 matrix into 16 square, non-overlapping submatrices:
>>>>>>>>>>>> 
>>>>>>>>>>>> #include <petsc/finclude/petscmat.h>
>>>>>>>>>>>> #include <petsc/finclude/petscksp.h>
>>>>>>>>>>>> #include <petsc/finclude/petscpc.h>
>>>>>>>>>>>>       USE petscmat
>>>>>>>>>>>>       USE petscksp
>>>>>>>>>>>>       USE petscpc
>>>>>>>>>>>>       
>>>>>>>>>>>>       Mat   :: A
>>>>>>>>>>>>       PetscInt :: M, NSubx, dof, overlap, NSub
>>>>>>>>>>>>       INTEGER :: I,J
>>>>>>>>>>>>       PetscErrorCode      :: ierr
>>>>>>>>>>>>       PetscScalar :: v
>>>>>>>>>>>>       KSP            :: ksp
>>>>>>>>>>>>       PC             :: pc
>>>>>>>>>>>>       IS :: subdomains_IS, inflated_IS
>>>>>>>>>>>>       
>>>>>>>>>>>>       call PetscInitialize(PETSC_NULL_CHARACTER , ierr)
>>>>>>>>>>>>       
>>>>>>>>>>>> !-----Create a dummy matrix
>>>>>>>>>>>>       M = 16
>>>>>>>>>>>>       call MatCreateAIJ(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>>>>>>>>>>>>      & M, M,
>>>>>>>>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>>>>>>>>      & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
>>>>>>>>>>>>      & A, ierr)
>>>>>>>>>>>>       
>>>>>>>>>>>>       DO I=1,M
>>>>>>>>>>>>          DO J=1,M
>>>>>>>>>>>>             v = I*J
>>>>>>>>>>>>             CALL MatSetValue (A,I-1,J-1,v,
>>>>>>>>>>>>      &                     INSERT_VALUES , ierr)
>>>>>>>>>>>>          END DO
>>>>>>>>>>>>       END DO
>>>>>>>>>>>>       
>>>>>>>>>>>>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>>>>>>>>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY , ierr)
>>>>>>>>>>>>       
>>>>>>>>>>>> !-----Create KSP and PC
>>>>>>>>>>>>       call KSPCreate(PETSC_COMM_WORLD,ksp, ierr)
>>>>>>>>>>>>       call KSPSetOperators(ksp,A,A, ierr)
>>>>>>>>>>>>       call KSPSetType(ksp,"bcgs",ierr)
>>>>>>>>>>>>       call KSPGetPC(ksp,pc,ierr)
>>>>>>>>>>>>       call KSPSetUp(ksp, ierr)
>>>>>>>>>>>>       call PCSetType(pc,PCGASM, ierr)
>>>>>>>>>>>>       call PCSetUp(pc , ierr)
>>>>>>>>>>>>       
>>>>>>>>>>>> !-----GASM setup
>>>>>>>>>>>>       NSubx = 4
>>>>>>>>>>>>       dof = 1
>>>>>>>>>>>>       overlap = 0
>>>>>>>>>>>>       
>>>>>>>>>>>>       call PCGASMCreateSubdomains2D(pc, 
>>>>>>>>>>>>      &      M, M,
>>>>>>>>>>>>      &      NSubx, NSubx,
>>>>>>>>>>>>      &      dof, overlap,
>>>>>>>>>>>>      &      NSub, subdomains_IS, inflated_IS, ierr)
>>>>>>>>>>>> 
>>>>>>>>>>>>       call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr)
>>>>>>>>>>>>       
>>>>>>>>>>>>       call KSPDestroy(ksp, ierr)  
>>>>>>>>>>>>       call PetscFinalize(ierr)
>>>>>>>>>>>> 
>>>>>>>>>>>> Running this on one processor, I get NSub = 4.
>>>>>>>>>>>> If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub 
>>>>>>>>>>>> = 16 as expected.
>>>>>>>>>>>> Moreover, I get in the end "forrtl: severe (157): Program 
>>>>>>>>>>>> Exception - access violation". So:
>>>>>>>>>>>> 1) why do I get two different results with ASM, and GASM?
>>>>>>>>>>>> 2) why do I get access violation and how can I solve this?
>>>>>>>>>>>> In fact, in C, subdomains_IS and inflated_IS should be pointers to 
>>>>>>>>>>>> IS objects. As I see in the Fortran interface, the arguments to 
>>>>>>>>>>>> PCGASMCreateSubdomains2D are IS objects:
>>>>>>>>>>>> 
>>>>>>>>>>>>        subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)
>>>>>>>>>>>>        import tPC,tIS
>>>>>>>>>>>>        PC a ! PC
>>>>>>>>>>>>        PetscInt b ! PetscInt
>>>>>>>>>>>>        PetscInt c ! PetscInt
>>>>>>>>>>>>        PetscInt d ! PetscInt
>>>>>>>>>>>>        PetscInt e ! PetscInt
>>>>>>>>>>>>        PetscInt f ! PetscInt
>>>>>>>>>>>>        PetscInt g ! PetscInt
>>>>>>>>>>>>        PetscInt h ! PetscInt
>>>>>>>>>>>>        IS i ! IS
>>>>>>>>>>>>        IS j ! IS
>>>>>>>>>>>>        PetscErrorCode z
>>>>>>>>>>>>        end subroutine PCGASMCreateSubdomains2D
>>>>>>>>>>>> Thus:
>>>>>>>>>>>> 3) what should be inside e.g., subdomains_IS? I expect it to 
>>>>>>>>>>>> contain, for every created subdomain, the list of rows and columns 
>>>>>>>>>>>> defining the subblock in the matrix, am I right?
>>>>>>>>>>>> 
>>>>>>>>>>>> Context: I have a block-tridiagonal system arising from space-time 
>>>>>>>>>>>> finite elements, and I want to solve it with GMRES and a PCGASM 
>>>>>>>>>>>> preconditioner, where each overlapping submatrix lies on the 
>>>>>>>>>>>> diagonal, is 3x3 blocks in size, and may span multiple processes. 
>>>>>>>>>>>> This is PETSc 3.17.1 on Windows.
>>>>>>>>>>>> 
>>>>>>>>>>>> Thanks in advance,
>>>>>>>>>>>> Leonardo
>>>>>>>>>>> 
>>>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> -- 
>>>>>>> What most experimenters take for granted before they begin their 
>>>>>>> experiments is infinitely more interesting than any results to which 
>>>>>>> their experiments lead.
>>>>>>> -- Norbert Wiener
>>>>>>> 
>>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>> 
>> 
