Hi Denis,
> I wonder what is the best workaround given the fact that I have quite
> a number of such vectors?
You only see this for KellyErrorEstimator, right? Given that the error
estimation is much more expensive than the operator evaluation anyway, I
would suggest you do the copy exactly the way you describe here:
> Probably the only way is to copy the content into a temporary vector,
> then call reinit(locally_owned, locally_relevant, mpi) and assign the
> content back.
> Then I would use such vectors in Kelly and solution transfer.
You would of course also invoke vector.update_ghost_values().
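To make that concrete, a minimal sketch of the copy/reinit/copy-back
approach could look like the following. This is only an illustration:
add_ghost_set is a made-up helper name, locally_owned_dofs,
locally_relevant_dofs and mpi_communicator stand for whatever you already
have in your program, and copy_locally_owned_data_from() is just one way
to move the owned values back.

  #include <deal.II/base/index_set.h>
  #include <deal.II/lac/la_parallel_vector.h>

  using VectorType = dealii::LinearAlgebra::distributed::Vector<double>;

  // Re-create 'vec' with the given ghost set, keeping its owned values.
  void add_ghost_set(VectorType             &vec,
                     const dealii::IndexSet &locally_owned_dofs,
                     const dealii::IndexSet &locally_relevant_dofs,
                     const MPI_Comm          mpi_communicator)
  {
    VectorType tmp(vec);                      // save the owned values
    vec.reinit(locally_owned_dofs,
               locally_relevant_dofs,
               mpi_communicator);             // new layout with ghosts
    vec.copy_locally_owned_data_from(tmp);    // owned values back
    vec.update_ghost_values();                // import the ghost entries
  }

After this, the vector can be handed to the error estimator or solution
transfer as a ghosted (read-only) vector.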
> Any better ways to accomplish this? Maybe implement/add
> LA::distributed::Vector::set_ghost_set(const IndexSet &) but not sure
> it's worth the effort.
I think the current interface is pretty good after all. Doing one copy
is not that bad. Trilinos copies the vector to be ghosted for the
parallel sparse matrix-vector product in every operator application. If
you want to change the ghost set, you need to throw away the vector, as
you note in the next post; there is no other way to change the ghost
range of the vector. In addition, changing just the ghost set inside the
vector is extremely dangerous. It took me a few iterations to get the
`partitioner_is_compatible` check to work somewhat reliably. The main
problem is that local index spaces with more than one set of ghosts are
ambiguous by definition. Besides deal.II, I know two more big projects
that have local and global MPI indices, and everyone has the same
struggle: you want local indices for performance, but it is so easy to
get things wrong.
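As a toy illustration of that ambiguity (made-up indices, meant to be run
on two MPI ranks): the same global dof ends up at a different local index
depending on which ghost set the partitioner was built with, so ghost
data written under one ghost set cannot simply be reinterpreted under
another.

  #include <deal.II/base/index_set.h>
  #include <deal.II/base/mpi.h>
  #include <deal.II/base/partitioner.h>
  #include <iostream>

  int main(int argc, char **argv)
  {
    using namespace dealii;
    Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv, 1);
    const unsigned int rank =
      Utilities::MPI::this_mpi_process(MPI_COMM_WORLD);

    // 20 dofs in total, each of the two ranks owns a block of 10.
    IndexSet owned(20);
    owned.add_range(10 * rank, 10 * rank + 10);

    // Two different ghost sets on rank 0 (both dofs owned by rank 1).
    IndexSet ghosts_a(20), ghosts_b(20);
    if (rank == 0)
      {
        ghosts_a.add_index(15);
        ghosts_b.add_index(12);
        ghosts_b.add_index(15);
      }

    const Utilities::MPI::Partitioner part_a(owned, ghosts_a, MPI_COMM_WORLD);
    const Utilities::MPI::Partitioner part_b(owned, ghosts_b, MPI_COMM_WORLD);

    if (rank == 0)
      // Global dof 15 sits at local index 10 under ghosts_a, but at 11
      // under ghosts_b (dof 12 is stored first): the same local index
      // means different things for different ghost sets.
      std::cout << part_a.global_to_local(15) << " vs. "
                << part_b.global_to_local(15) << std::endl;
  }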
Best,
Martin