Hello,

I'm sending this question to the development group because it seems that only 
developers would be able to answer it. Moreover, it involves a possible change 
request.

I work for Red Hat as a software maintenance engineer. Recently one of our 
customers reported a situation in which a very large server ran out of 
memory due to Vim's undo feature. A user in their environment ran a global 
search and replace ( :%s/\n/,/g ) against a file that had 59 million lines. The 
operation consumed more than 100 GB of memory and they had to reboot the 
machine in order to restore normal functionality.

The problem is quite easy to reproduce. Even a small-scale reproduction with a 
46 MB file of 6 million lines consumed all available memory (12 GB) of our 
test machine, until the oom-killer killed it.

We explained to the customer that they should not use Vim for such a task. 
Still, there are two questions that I would like to ask.

1. Is there a formula for how much memory a single search-and-replace 
consumes? Is this documented anywhere?

2. Would it be reasonable to open a request to implement some kind of maximum 
cap as a safeguard?

Thanks in advance,

Carlos Santos (casantos)
Senior *Software* Maintenance Engineer
(no, I'm not going to fix your roof)
Red Hat, Inc


-- 
You received this message from the "vim_dev" maillist.
Do not top-post! Type your reply below the text you are replying to.
For more information, visit http://www.vim.org/maillist.php

--- 
You received this message because you are subscribed to the Google Groups 
"vim_dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
