Public bug reported:

ParaView volume rendering crashes on NVIDIA cards (using the proprietary driver) for large datasets (larger than 96 MB). This happens when using the GPU Based renderer or the Smart renderer (which I suppose falls back to the GPU Based renderer).
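The 96 MB threshold is suggestive: VTK's vtkGPUVolumeRayCastMapper documents a default MaxMemoryInBytes of 128 MB and a MaxMemoryFraction of 0.75, and 0.75 × 128 MB = 96 MB, matching the size at which the crash appears. A minimal sketch of that fit check follows; the helper name and structure are my own illustration of the arithmetic, not ParaView code:

```python
# Sketch of the GPU-memory fit decision, assuming VTK's documented
# defaults for vtkGPUVolumeRayCastMapper: MaxMemoryInBytes = 128 MB
# (the fallback when the real GPU memory cannot be queried) and
# MaxMemoryFraction = 0.75. Helper name is illustrative only.

MAX_MEMORY_IN_BYTES = 128 * 1024 * 1024   # fallback GPU memory value
MAX_MEMORY_FRACTION = 0.75                # fraction the mapper allows itself

def fits_in_gpu_memory(dataset_bytes,
                       gpu_bytes=MAX_MEMORY_IN_BYTES,
                       fraction=MAX_MEMORY_FRACTION):
    """Return True if the dataset fits in the usable slice of GPU memory."""
    return dataset_bytes <= gpu_bytes * fraction

# With the 128 MB fallback, the usable budget is 96 MB, so datasets
# just above 96 MB take the "does not fit" path where the crash occurs.
print(fits_in_gpu_memory(95 * 1024 * 1024))   # below the threshold
print(fits_in_gpu_memory(97 * 1024 * 1024))   # above the threshold
```

If libXNVCtrl reports the card's real memory (say 1 GB), `gpu_bytes` is much larger and the same dataset fits, which would explain why rebuilding with nvidia-settings installed makes the crash disappear.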
I've attached a testcase for the problem, which can be run with "pvpython testcase.py". Output on my system is:

  ERROR: In /build/buildd/paraview-4.0.1/VTK/Common/ExecutionModel/vtkAlgorithm.cxx, line 1387
  vtkImageResample (0x27c36f0): Attempt to get connection index 0 for input port 0, which has 0 connections.

  Segmentation fault (core dumped)

When recompiling the paraview package with "nvidia-settings" installed, the problem disappears. As far as I can see, this is because paraview then uses libXNVCtrl to determine the available GPU memory, while otherwise it falls back to a default value of 128 MB. When the GPU renderer then decides that the data won't fit into memory, something goes wrong and paraview crashes. So one way of fixing this would be to add nvidia-settings to Build-Depends.

** Affects: paraview (Ubuntu)
   Importance: Undecided
       Status: New

** Attachment added: "Testcase"
   https://bugs.launchpad.net/bugs/1355683/+attachment/4175238/+files/testcase.py

-- 
https://bugs.launchpad.net/bugs/1355683

Title:
  Paraview Volume rendering crashes on NVIDIA cards for large datasets