[deal.II] Parallelization of step-2: DynamicSparsityPattern to SparsityPattern

2017-02-10 Thread Kartik Jujare
Hello everyone, this question is regarding DynamicSparsityPattern and SparsityPattern. As a small exercise, I am trying to parallelize the tutorial programs and observe the output. In step-2, I am not able to use the copy_from() function when I run it in parallel. Could anyone please suggest a workaround?
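For reference, the serial idiom that step-2 uses is sketched below (a minimal sketch, assuming a DoFHandler named dof_handler; in a parallel program one usually keeps the DynamicSparsityPattern, or a Trilinos/PETSc sparsity pattern, sized by the locally relevant dofs and never converts to a plain SparsityPattern at all):

    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/dofs/dof_tools.h>
    #include <deal.II/lac/dynamic_sparsity_pattern.h>
    #include <deal.II/lac/sparsity_pattern.h>

    // Build the couplings into a DynamicSparsityPattern, then compress them
    // into the static SparsityPattern with copy_from(), as step-2 does.
    template <int dim>
    void build_pattern(const dealii::DoFHandler<dim> &dof_handler,
                       dealii::SparsityPattern       &sparsity_pattern)
    {
      dealii::DynamicSparsityPattern dsp(dof_handler.n_dofs());
      dealii::DoFTools::make_sparsity_pattern(dof_handler, dsp);
      sparsity_pattern.copy_from(dsp);
    }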

Re: [deal.II] Renumbering dofs with PETSc + block + MPI + Direct solver workaround

2017-02-10 Thread Spencer Patty
On Friday, February 10, 2017 at 12:15:10 PM UTC-6, Timo Heister wrote: > I wanted to give you some info on your original question in case you want to still use PETSc. > It appears that for PETSc, the assumption that the locally owned dofs IndexSets are contiguous is really throwing a wrench in our plans for the non-block-system approach.

Re: [deal.II] Looking for a deal.II example using Trilinos's Amesos direct solver in a parallel way

2017-02-10 Thread Praveen C
Hello Lailai, here is an example from my code where I use MUMPS via Trilinos: static TrilinosWrappers::SolverDirect::AdditionalData data (false, "Amesos_Mumps"); static SolverControl solver_control (1, 0); // solve for x { TrilinosWrappers::MPI::Vector tmp (locally_owne
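For context, a fuller, self-contained sketch of that pattern might look as follows (hypothetical names locally_owned_dofs, mpi_communicator, system_matrix, system_rhs, solution; this is not the poster's complete code, just one way the truncated snippet could continue):

    #include <deal.II/base/index_set.h>
    #include <deal.II/lac/solver_control.h>
    #include <deal.II/lac/trilinos_solver.h>
    #include <deal.II/lac/trilinos_sparse_matrix.h>
    #include <deal.II/lac/trilinos_vector.h>

    using namespace dealii;

    void solve_direct(const TrilinosWrappers::SparseMatrix &system_matrix,
                      const TrilinosWrappers::MPI::Vector  &system_rhs,
                      TrilinosWrappers::MPI::Vector        &solution,
                      const IndexSet                       &locally_owned_dofs,
                      const MPI_Comm                        mpi_communicator)
    {
      // Select the MUMPS backend of Trilinos' Amesos interface; this requires
      // Amesos_Mumps to be enabled in the Trilinos installation.
      TrilinosWrappers::SolverDirect::AdditionalData data(
        /*output_solver_details=*/false, "Amesos_Mumps");
      SolverControl solver_control(1, 0);
      TrilinosWrappers::SolverDirect direct_solver(solver_control, data);

      // Solve into a vector holding only the locally owned entries, then copy
      // the result into the (possibly ghosted) solution vector.
      TrilinosWrappers::MPI::Vector tmp(locally_owned_dofs, mpi_communicator);
      direct_solver.solve(system_matrix, tmp, system_rhs);
      solution = tmp;
    }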

[deal.II] Looking for a deal.II example using Trilinos's Amesos direct solver in a parallel way

2017-02-10 Thread Lailai Zhu
Hi, dear all, I am looking for a deal.II example that uses Trilinos's Amesos direct solver with MPI; more specifically, I mean how to call the Amesos_Mumps solver based on MPI. I have installed them correctly with deal.II. I think I need to use the subroutine void solve

Re: [deal.II] Re: Renumbering dofs with PETSc + block + MPI + Direct solver workaround

2017-02-10 Thread Spencer Patty
Thanks Bruno and Daniel for pointing to the other options. I can report that superlu_dist does indeed work through Trilinos for my problem. And you are absolutely correct that it is not the easiest to install. It took me a few hours to work out the details for installing on my Mac based on t

Re: [deal.II] Renumbering dofs with PETSc + block + MPI + Direct solver workaround

2017-02-10 Thread Timo Heister
I wanted to give you some info on your original question in case you want to still use PETSc. > It appears that for PETSc, the assumption that the locally owned dofs IndexSets are contiguous is really throwing a wrench in our plans for the non-block-system approach. I have seen the other di
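For reference, the standard way to satisfy that contiguity requirement is to renumber block-wise and then slice each processor's owned IndexSet into per-block views, as step-55 does. A rough sketch, under the assumption of a Stokes-like system with dim velocity components and one pressure component, and with the global block sizes n_u and n_p computed elsewhere (e.g. via DoFTools::count_dofs_per_block):

    #include <deal.II/base/index_set.h>
    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/dofs/dof_renumbering.h>
    #include <vector>

    // Renumber so that all velocity dofs precede all pressure dofs, then split
    // the locally owned range into one contiguous IndexSet per block.
    template <int dim>
    std::vector<dealii::IndexSet>
    make_block_partitioning(dealii::DoFHandler<dim>               &dof_handler,
                            const dealii::types::global_dof_index n_u,
                            const dealii::types::global_dof_index n_p)
    {
      std::vector<unsigned int> block_component(dim + 1, 0);
      block_component[dim] = 1;
      dealii::DoFRenumbering::component_wise(dof_handler, block_component);

      const dealii::IndexSet locally_owned = dof_handler.locally_owned_dofs();
      std::vector<dealii::IndexSet> owned_partitioning;
      owned_partitioning.push_back(locally_owned.get_view(0, n_u));
      owned_partitioning.push_back(locally_owned.get_view(n_u, n_u + n_p));
      return owned_partitioning;
    }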

[deal.II] New: step-57

2017-02-10 Thread Timo Heister
I am happy to announce that we added a new tutorial program: The new tutorial program step-57 (contributed by Liang Zhao and Timo Heister) shows how to solve the stationary, incompressible Navier-Stokes equations using Newton's method. The program features block linear solvers for the saddle point

[deal.II] Re: More questions about FE_enriched & PUM FEM

2017-02-10 Thread Edith Sotelo
A follow-up about this post: this is the way I went about implementing the strong BC: I assigned the boundary value to the non-enriched dofs and assigned zero to the enriched dofs; it worked for me. I used system_to_base_table() as suggested; though this function is protected, I just changed the f
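For readers who come across this later, one way to do the classification described above without touching the protected table is the public system_to_base_index(). A minimal sketch (hypothetical function name, and assuming the FE_Enriched element was built with the non-enriched element as base element 0 and the enrichment as base element 1):

    #include <deal.II/fe/fe.h>
    #include <vector>

    // Mark which local shape functions belong to the enrichment: those dofs get
    // the value zero on the boundary, while the remaining (non-enriched) dofs
    // receive the actual Dirichlet value.
    template <int dim>
    std::vector<bool> enriched_dof_mask(const dealii::FiniteElement<dim> &fe)
    {
      std::vector<bool> is_enriched(fe.dofs_per_cell, false);
      for (unsigned int i = 0; i < fe.dofs_per_cell; ++i)
        {
          // system_to_base_index(i) returns ((base element, multiplicity),
          // index within the base element); .first.first is the base element.
          const unsigned int base_element = fe.system_to_base_index(i).first.first;
          is_enriched[i] = (base_element != 0);
        }
      return is_enriched;
    }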

Re: [deal.II] Fully distributed triangulation (level 0)

2017-02-10 Thread Timo Heister
> I'm going to reconfigure my deal.II to this. Hopefully I can migrate all my design to this ASAP. I'm sure I can report some performance analysis etc. to you later on. That would be interesting to see. Keep in mind that this is still quite far from being usable, though. Right now I can create mesh

Re: [deal.II] Fully distributed triangulation (level 0)

2017-02-10 Thread Yi-Chung Chen
Thank you, Timo. I'm going to reconfigure my deal.II to this. Hopefully I can migrate all my design to this ASAP. I'm sure I can report some performance analysis etc. to you later on. Regards, YC Chen On Thursday, February 9, 2017 at 9:33:05 PM UTC, Timo Heister wrote: > see https://urldefense.

Re: [deal.II] Fully distributed triangulation (level 0)

2017-02-10 Thread Yi-Chung Chen
Hi Wolfgang, Thank you for your reply. Unfortunately, Google does not allow me to attach large meshes here (the file is too big: > 300 MB for a 3D mesh with 2.5M dofs in .msh format). Is there any other way? I would like to share it with you. I attach the website of our project here, MTA