Many thanks for such a quick response. I'll adopt these ideas and try to
eliminate the DoFs related to the bubble support following the workflow in
the tutorial.
Regards,
Lixing
On Tuesday, January 5, 2021 at 1:21:56 PM UTC+8 Wolfgang Bangerth wrote:
>
> Lixing,
>
> > I am trying to implement
Lixing,
I am trying to implement a stabilized weak form (e.g. advection-diffusion)
where the stabilization tensor is computed element-wise through a standard
bubble: \prod_i (1 - x_i^2). It seems that FE_Q_Bubbles should provide all I need,
but there are two things I am not quite clear about:
1.
Dear all,
I am trying to implement a stabilized weak form (e.g. advection-diffusion)
where the stabilization tensor is computed element-wise through a standard
bubble: \prod_i (1 - x_i^2). It seems that FE_Q_Bubbles should provide all I
need, but there are two things I am not quite clear about:
1. I
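A minimal sketch (not from the thread itself) of constructing an FE_Q_Bubbles
element and counting how many bubble shape functions it adds per cell on top of
a plain FE_Q of the same degree; the mesh and the degree are only illustrative,
and on older deal.II releases you may need fe.dofs_per_cell instead of
fe.n_dofs_per_cell():

#include <deal.II/dofs/dof_handler.h>
#include <deal.II/fe/fe_q.h>
#include <deal.II/fe/fe_q_bubbles.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/grid/tria.h>

#include <iostream>

using namespace dealii;

int main()
{
  constexpr int      dim    = 2;
  const unsigned int degree = 1;

  Triangulation<dim> triangulation;
  GridGenerator::hyper_cube(triangulation);
  triangulation.refine_global(3);

  // Q_degree space enriched by cell-wise bubble functions.
  FE_Q_Bubbles<dim> fe(degree);

  // Number of bubble shape functions per cell, computed as the difference
  // to a plain FE_Q of the same degree.
  const unsigned int n_bubbles =
    fe.n_dofs_per_cell() - FE_Q<dim>(degree).n_dofs_per_cell();
  std::cout << "bubble shape functions per cell: " << n_bubbles << '\n';

  DoFHandler<dim> dof_handler(triangulation);
  dof_handler.distribute_dofs(fe);
  std::cout << "total DoFs: " << dof_handler.n_dofs() << '\n';
}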
My project is in quantum scattering and I would like to have some operators
be distributed PETSc objects. So inside my OneBodyHamiltonianOperator
class (for example), I would like to create a
PETScWrappers::MPI::SparseMatrix and then use SLEPc to solve for the ground
state and excited states.
Kaushik
Marc and others have already answered the technical details, so just one
overall comment:
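A minimal sketch, loosely following step-36, of how the eigenvalue solve could
look once the Hamiltonian matrix H, the mass matrix M, and the locally owned
index set have been assembled elsewhere (e.g. inside a class like the
OneBodyHamiltonianOperator mentioned above); the function name
solve_lowest_states and the solver settings are placeholders, not part of the
original question:

#include <deal.II/base/index_set.h>
#include <deal.II/lac/petsc_sparse_matrix.h>
#include <deal.II/lac/petsc_vector.h>
#include <deal.II/lac/slepc_solver.h>
#include <deal.II/lac/solver_control.h>

#include <vector>

using namespace dealii;

// Ask SLEPc for the n_states smallest eigenpairs of H x = lambda M x,
// i.e. the ground state and the first few excited states.
void solve_lowest_states(const PETScWrappers::MPI::SparseMatrix &H,
                         const PETScWrappers::MPI::SparseMatrix &M,
                         const IndexSet &   locally_owned_dofs,
                         const MPI_Comm     mpi_communicator,
                         const unsigned int n_states)
{
  std::vector<double>                     eigenvalues(n_states);
  std::vector<PETScWrappers::MPI::Vector> eigenfunctions(n_states);
  for (auto &v : eigenfunctions)
    v.reinit(locally_owned_dofs, mpi_communicator);

  SolverControl                    solver_control(1000, 1e-9);
  SLEPcWrappers::SolverKrylovSchur eigensolver(solver_control,
                                               mpi_communicator);

  // Generalized Hermitian eigenproblem, smallest eigenvalues first.
  eigensolver.set_problem_type(EPS_GHEP);
  eigensolver.set_which_eigenpairs(EPS_SMALLEST_REAL);

  eigensolver.solve(H, M, eigenvalues, eigenfunctions, n_states);
}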
Let me explain what I am trying to do and why.
I want to solve a transient heat transfer problem of the additive
manufacturing (AM) process. In AM processes, metal powder is deposited in
layer
Zachary,
I am trying to debug this strange behavior. I am trying to build a PETSc
sparse parallel matrix using 4 processors. This gives me 32 locally owned
rows per process (so 128 rows globally). But when I pass the local_num_of_rows
variable into the reinit function, this is the PETSc err
Dear all,
I wish you all a happy new year!
One problem we always end up facing with FEM codes is that, as a program
grows, more and more features are added to the equations. This leads to
multiple variations of the same equations (for example, Navier-Stokes with
Newtonian and non-Newtonian visc
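One possible pattern for keeping a single assembly routine while the equation
variants multiply (purely illustrative; the class names below are made up and
not from the thread) is to hide the varying physics behind a small interface
and select the concrete model at run time, e.g. from a parameter file:

#include <cmath>
#include <memory>

// Interface for the part of the physics that varies between model variants.
class ViscosityModel
{
public:
  virtual ~ViscosityModel() = default;
  // Viscosity as a function of the local shear rate.
  virtual double value(const double shear_rate) const = 0;
};

// Constant (Newtonian) viscosity.
class NewtonianViscosity : public ViscosityModel
{
public:
  explicit NewtonianViscosity(const double mu)
    : mu(mu)
  {}
  double value(const double /*shear_rate*/) const override
  {
    return mu;
  }

private:
  const double mu;
};

// A simple non-Newtonian (power-law) viscosity.
class PowerLawViscosity : public ViscosityModel
{
public:
  PowerLawViscosity(const double k, const double n)
    : k(k)
    , n(n)
  {}
  double value(const double shear_rate) const override
  {
    return k * std::pow(shear_rate, n - 1.0);
  }

private:
  const double k, n;
};

// The assembly loop then only ever calls model->value(shear_rate), no matter
// which variant was selected:
//   std::unique_ptr<ViscosityModel> model =
//     std::make_unique<PowerLawViscosity>(1.0, 0.5);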
Hi everyone,
I am trying to debug this strange behavior. I am trying to build a PETSc
sparse parallel matrix using 4 processors. This gives me 32 locally owned
rows per process (so 128 rows globally). But when I pass the local_num_of_rows
variable into the reinit function, this is the PETSc er
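For reference, a minimal sketch of one way to partition 32 rows per rank and
call reinit() on a PETScWrappers::MPI::SparseMatrix; the trivial diagonal-only
sparsity pattern and the variable names are just placeholders to keep the
example self-contained:

#include <deal.II/base/index_set.h>
#include <deal.II/base/mpi.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>
#include <deal.II/lac/petsc_sparse_matrix.h>

using namespace dealii;

int main(int argc, char **argv)
{
  Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv, 1);

  const unsigned int n_local_rows = 32;
  const unsigned int n_ranks =
    Utilities::MPI::n_mpi_processes(MPI_COMM_WORLD);
  const unsigned int my_rank =
    Utilities::MPI::this_mpi_process(MPI_COMM_WORLD);
  const unsigned int n_global_rows = n_local_rows * n_ranks; // 128 on 4 ranks

  // Contiguous row partition: rank r owns rows [32*r, 32*(r+1)).
  IndexSet locally_owned(n_global_rows);
  locally_owned.add_range(my_rank * n_local_rows,
                          (my_rank + 1) * n_local_rows);

  // A diagonal-only sparsity pattern for the locally owned rows, just so
  // reinit() has something to preallocate.
  DynamicSparsityPattern dsp(n_global_rows, n_global_rows);
  for (const auto row : locally_owned)
    dsp.add(row, row);

  PETScWrappers::MPI::SparseMatrix matrix;
  matrix.reinit(locally_owned, locally_owned, dsp, MPI_COMM_WORLD);
}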
Hi Corbin,
> Is there a better way I could go about mapping the 3D volume data and
> depth-averaged data to the surface mesh?
This is a tough question - other deal.II developers are trying to figure out
how to do this in parallel here:
https://github.com/dealii/dealii/issues/10037
for now I'l
Hi Romin,
That error means that the quicktest for gmsh failed - in particular, deal.II
was not able to successfully run your gmsh executable. This usually means that
there is something wrong with your gmsh installation.
Unless you plan on using gmsh from inside deal.II (i.e., calling gmsh to
g
Kaushik,
Oh wow, this is a small world :D Unfortunately, the PETSc solver requires a
PETSc vector, but I think it should be straightforward to add
compress(min) to the PETSc vector. So that's a possibility if copying
the solution takes too much time.
Best,
Bruno
On Sun, Jan 3, 2021 at 21:42, Kaushik
Dear Chris,
I’m forwarding this mail to the deal.II user group as well, as many others may
find it useful.
> On 4 Jan 2021, at 10:20, Christopher Ham wrote:
>
> Dear Both,
>
> I wonder if you might be able to help me. I would really like to try
> out deal.II. I am a bit of a novice with the comp