I have removed an unnecessary PetscMPIIntCast() on MPI rank zero that was
causing your test code to fail. See
https://gitlab.com/petsc/petsc/-/merge_requests/7747
 

   Thanks for reporting the problem.

   Barry

   BTW: I don't think we have code to distribute a dense matrix that has
values only on one rank over all the ranks. The needed code would essentially
be the combination of MatView_Dense_Binary/MatLoad_Dense_Binary with
PetscViewerBinaryWriteReadAll, without the saving to and reading from disk.
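
   In the meantime, one workaround that uses only existing calls is to create
a conventionally distributed dense matrix and let MatSetValues() stash the
off-process entries and move them during assembly. A minimal sketch (the
function name is mine; note the entire matrix passes through the assembly
stash, and each per-rank message must still fit MPI's 32-bit count limits, so
this helps only when the target layout is spread over enough ranks):

   #include <petscmat.h>

   /* Spread a dense matrix whose entries live on one rank (or any skewed
      layout) into a dense matrix with the default row distribution. */
   static PetscErrorCode DistributeDense(Mat A, Mat *B)
   {
     PetscInt           rstart, rend, M, N, j, lda, *rows;
     const PetscScalar *a;

     PetscFunctionBeginUser;
     PetscCall(MatGetSize(A, &M, &N));
     PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
     PetscCall(MatCreateDense(PetscObjectComm((PetscObject)A), PETSC_DECIDE, PETSC_DECIDE, M, N, NULL, B));
     PetscCall(MatDenseGetArrayRead(A, &a));
     PetscCall(MatDenseGetLDA(A, &lda));
     PetscCall(PetscMalloc1(rend - rstart, &rows));
     for (j = rstart; j < rend; j++) rows[j - rstart] = j;
     /* insert one (column-contiguous) column at a time; off-process entries
        are communicated by MatAssemblyBegin/End() */
     if (rend > rstart) {
       for (j = 0; j < N; j++) PetscCall(MatSetValues(*B, rend - rstart, rows, 1, &j, a + j * lda, INSERT_VALUES));
     }
     PetscCall(PetscFree(rows));
     PetscCall(MatDenseRestoreArrayRead(A, &a));
     PetscCall(MatAssemblyBegin(*B, MAT_FINAL_ASSEMBLY));
     PetscCall(MatAssemblyEnd(*B, MAT_FINAL_ASSEMBLY));
     PetscFunctionReturn(PETSC_SUCCESS);
   }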

   It is likely relatively easy to fix the dense-matrix view/load in the
native format so that it does not need 64-bit indices to work with your test
code.



> On Aug 5, 2024, at 9:19 PM, Sreeram R Venkat <[email protected]> wrote:
> 
> Here's an example code that should replicate the error:
> https://github.com/s769/petsc-test/tree/master
> 
> I tried using PETSC_VIEWER_NATIVE, but I still get the error. I have a
> situation where the matrix is created on PETSC_COMM_WORLD but only has
> entries on the first process due to some layout constraints elsewhere in the
> program. The nodes I'm running on should have more than enough memory to
> hold the entire matrix on one process, and the error I get is not an
> out-of-memory error anyway.
> 
> Let me know if you aren't able to build the example.
> 
> I noticed that if I fully distribute the matrix over all processes, the
> save works fine. Is there some way to do that after I create the matrix but
> before saving it?
> 
> On Mon, Aug 5, 2024 at 1:19 PM Barry Smith <[email protected]> wrote:
>> 
>>    By default, PETSc MatView() to a binary viewer uses the "standard"
>> compressed sparse storage format. That format is neither efficient nor
>> reasonable for dense matrices, and it overflows 32-bit integer counts at
>> your matrix sizes.
>> 
>>    To store a dense matrix as dense on disk, use the PetscViewerFormat
>> PETSC_VIEWER_NATIVE. For example:
>> 
>>    PetscViewerPushFormat(viewer,PETSC_VIEWER_NATIVE);
>>    MatView(mat, viewer);
>>    PetscViewerPopFormat(viewer);
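>> 
>>    To read the matrix back later, something along these lines should work
>> (the file name is illustrative; MatLoad() on a MATDENSE matrix understands
>> the native dense format):
>> 
>>    PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat.dat", FILE_MODE_READ, &viewer);
>>    MatCreate(PETSC_COMM_WORLD, &mat);
>>    MatSetType(mat, MATDENSE);
>>    MatLoad(mat, viewer);
>>    PetscViewerDestroy(&viewer);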
>> 
>> 
>>> On Aug 5, 2024, at 1:10 PM, Sreeram R Venkat <[email protected]> wrote:
>>> 
>>> I have a large dense matrix (dimension ranging from 5e4 to 1e5) that
>>> arises as the result of calling MatComputeOperator() on a MatShell. When
>>> the total number of nonzeros exceeds the 32-bit integer limit, I get an
>>> error (MPI buffer size too big) when trying to do MatView() to save it to
>>> binary. Is there a way I can save this matrix to load again for later use?
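>>> 
>>> The conversion is along these lines (a sketch; the actual variable names
>>> and setup differ):
>>> 
>>>    Mat dense;
>>>    MatComputeOperator(shell, MATDENSE, &dense);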
>>> 
>>> The other thing I tried was to save each column as a separate dataset in
>>> an HDF5 file. Then I tried to load those in Python, combine them into a
>>> NumPy array, and create/save a dense matrix with petsc4py. I was able to
>>> create the dense Mat, but MatView() once again resulted in an error (out
>>> of memory).
>>> 
>>> Thanks,
>>> Sreeram
>> 
