Just to follow up: I did some tests with a simple project that had
core.async, component, and tools.namespace. I ran a REPL and in it had a
doseq loop calling (refresh) over and over. In another terminal, I had
watch -n 1 'touch src/test/core.clj' repeating to update the file. The
core fi
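(For the record, the reload loop was along these lines; a minimal sketch,
assuming clojure.tools.namespace on the classpath, and the 1-second pause
is an arbitrary choice:)

(require '[clojure.tools.namespace.repl :refer [refresh]])

;; Re-run refresh forever; each touch of src/test/core.clj makes the
;; next (refresh) reload the changed namespace.
(doseq [_ (range)]
  (refresh)
  (Thread/sleep 1000))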
I received a message from Moritz that everyone is on OpenJDK 1.7. I was
curious and looked into this a little bit. I found one older thread from
this forum [1] that mentioned PermGen and OutOfMemoryError. The following
are some notes that will hopefully help here.
Regarding the exception:
Caused by: java.lang.OutOfMemoryError: PermGen space
Out of curiosity, are your colleagues perhaps using Java 8 JDKs? I saw
your message mentions JDK 7, and PermGen was removed in Java 8 [1].
[1] - http://www.infoq.com/articles/Java-PERMGEN-Removed
On Wednesday, December 17, 2014 12:41:29 PM UTC-5, Moritz Ulrich wrote:
>
>
> Hello,
>
> I'm getting the following error while working on a Clojure(Script) REPL
You might try experimenting with JVM options to increase
PermGen: https://github.com/boot-clj/boot/wiki/JVM-Options#permgen-errors
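For example, with Boot you can put something like this in the environment
(a sketch for JDK 7, where PermGen still exists; 256m is an arbitrary number,
and class unloading needs the CMS collector enabled alongside it):

export BOOT_JVM_OPTIONS="-XX:MaxPermSize=256m -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled"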
On Wednesday, December 17, 2014 12:41:29 PM UTC-5, Moritz Ulrich wrote:
>
>
> Hello,
>
> I'm getting the following error while working on a Clojure(Script) REPL
> o
Thank you very much, that worked splendidly.
On Friday, April 5, 2013 5:14:30 PM UTC+2, Alex Nixon wrote:
>
> Java substrings prevent the original string from being garbage collected;
> perhaps this also happens with regex matches?
>
> You can test the theory by surrounding the values in your map with
> (String. ) and seeing if the problem goes away.
Alex Nixon managed to figure it out (further down)
Off-topic: I live in the same city as you and I'm also interested in
Clojure; email me if you want to have a coffee.
Adrian
On Friday, April 5, 2013 5:08:04 PM UTC+2, Laurent PETIT wrote:
>
> You should show us the calling code, I guess ...
>
>
Java substrings prevent the original string from being garbage collected;
perhaps this also happens with regex matches?
You can test the theory by surrounding the values in your map with (String.
) and seeing if the problem goes away.
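(In other words, something like the sketch below; detach is a hypothetical
helper name. The background: on JVMs before 1.7.0_06, String.substring
shared the parent string's char[], and regex match groups are produced via
substring, so a small match could pin a large input in memory.)

;; Copy the characters so the small string no longer pins the large one.
(defn detach [s]
  (String. ^String s))

;; e.g. use {:id (detach (re-find #"\d+" line))}
;; instead of keeping the raw match from the large line.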
On 5 April 2013 15:57, Adrian Muresan wrote:
> Hello everyo
slurp reads the entire file into memory. Maybe it is a combination of (a)
the program taking up more of the heap in other parts as it runs, and then
(b) a particularly large file?
Is there a reason you can't process the files as a line-seq, so you don't
have to load the entire thing into memory all at once?
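Something along these lines (a sketch; parse-line is a hypothetical
stand-in for whatever the per-report parsing actually is):

(require '[clojure.java.io :as io])

(defn process-file [file parse-line]
  (with-open [r (io/reader file)]
    ;; doall forces the work while the reader is still open; the raw
    ;; lines can be collected as we go, only the parsed results are kept.
    (doall (map parse-line (line-seq r)))))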
You should show us the calling code, I guess ...
2013/4/5 Adrian Muresan :
> Hello everyone,
>
> I'm trying to parse a large number of small reports for some data and I'm
> doing this by repeatedly calling the following function (with a for) on each
> of the files:
>
> (defn get-rep [file]
> (
Limiting *print-length* keeps the OutOfMemoryError away, but I guess
it would leave me unsure, when testing more complicated and obscure
functions, whether the returned sequence really is a lazy one or will
blow up the memory instead. But good to know anyway ...
I guess the println function is
On Sat, Jul 30, 2011 at 2:25 PM, Ben wrote:
> Hi,
>
> I'm new in the community and new to Clojure, and both are really great. A
> big thanks to Rich Hickey and everyone else involved.
>
> I have a question that refers to an example Stuart Halloway used in
> "Programming Clojure". There he defines the
Ben writes:
> (defn whole-numbers [] (iterate inc 1))
>
> If I use it like this at the REPL
>
> (take (whole-numbers))
>
> I get:
> Java heap space [Thrown class java.lang.OutOfMemoryError]
>
> This unexpectedly is the same result that I expectedly get when
> binding whole-numbers to a t
When I do that, the REPL starts printing the sequence, filling screens
after screens with numbers.
By doing that, it realizes the printed part of the sequence, which
will eventually lead to an OOM error, since it probably holds on to
the reference to the start of the sequence.
Doing (set! clojure.core/*print-length* ...) avoids that.
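Concretely (the value 100 is an arbitrary choice):

(set! clojure.core/*print-length* 100)
(iterate inc 1) ; the REPL now prints the first 100 elements then "..."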
So in case anyone else stumbles across this topic, I thought I'd share
what little I have learned about the laziness of concat, and by
extension mapcat, as used in this function.
(defn json-seq [dir-name]
  (mapcat #(do (print "f") (str/split (slurp %) #"\nStatusJSONImpl"))
          (out-files dir-name)))
>> the entire sequence being in memory. However, if you retain the head
>> of the sequence elsewhere, you will see the same effect.
>
> I don't think my function retains the head? Please correct me if I am
> wrong.
Not that I can see, but I don't have the full context. I tried
reproducing just now
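For comparison, the canonical way head retention looks (an illustrative
sketch, not your code):

;; t and d share structure; realizing d while t's head is still
;; referenced keeps all ~1e8 elements reachable.
(let [[t d] (split-with #(< % 12) (range 1e8))]
  [(count d) (count t)])  ; OOM-prone: t retains the head during (count d)

(let [[t d] (split-with #(< % 12) (range 1e8))]
  [(count t) (count d)])  ; fine: t (12 elements) is realized first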
Cheers. But tests suggest to me that (for ...) has the same laziness
characteristics (or lack thereof) as (map ...).
On Jul 26, 6:56 pm, Randy Hudson wrote:
> You can get a lazy sequence of all the lines in all the files by
> something like:
>
> (for [file out-files
>       line (with-open [r (io/reader file)] (line-seq r))]
>   line)
Thanks! This is still driving me mad though.
On Jul 27, 5:11 pm, Peter Schuller wrote:
> The observations that the data structures are non-lazy still apply,
> even if you could postpone the problem by increasing the heap size.
Yes, I can see that the sequence returned from str/split is not lazy.
> I am getting a lot further now, but still running into OutOfMemory
> errors sometimes. And it is still the case that once I have suffered
> an OutOfMemoryError, they keep coming. It does feel as if there must
> be some large memory leak in the emacs/lein swank repl. Is this a
> recognised issue?
Thanks Sean, your first suggestion was a very good one :)
Tweaking JVM settings feels like advanced magic, and I am a little
surprised that it is necessary at such an early stage in my Clojure
journey. But googling confirms that the default JVM settings are
miserly to an extreme, and I need at le
Thanks! But I'm not entirely convinced. At my REPL:
user> (repeatedly 10 #(do (print "f") [(rand-int 10)]))
(ff[0] f[8] f[5] f[7] f[1] f[6] f[7] f[3] f[3] [0])
user> (take 5 (apply concat (repeatedly 10 #(do (print "f") [(rand-int 10)]))))
(7 1 f6 f5 8)
Only six "f"s... so doesn't that mean th
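(A way to separate apply's up-front realization from consumption-driven
realization; a sketch, and the exact "f" counts can vary by Clojure version:)

(def chunks (repeatedly 10 #(do (print "f") [(rand-int 10)])))
(def s (apply concat chunks)) ; prints a few "f"s immediately: apply must
                              ; realize concat's leading fixed args
(take 5 s)                    ; prints only the additional "f"s needed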
>> Here is my function:
>> (defn json-seq []
>>   (apply concat
>>          (map #(do (print "f") (str/split (slurp %) #"\nStatusJSONImpl"))
>>               out-files)))
>
> Try removing the "apply concat" at the front; I'm pretty sure that's
> making your sequence non-lazy.
Correct me if I'm wrong
On Mon, Jul 26, 2010 at 9:53 AM, atucker wrote:
> Here is my function:
> (defn json-seq []
>   (apply concat
>          (map #(do (print "f") (str/split (slurp %) #"\nStatusJSONImpl"))
>               out-files)))
Try removing the "apply concat" at the front; I'm pretty sure that's
making your sequence non-lazy.
You can get a lazy sequence of all the lines in all the files by
something like:
(for [file out-files
      line (with-open [r (io/reader file)] (line-seq r))]
  line)
If "StatusJSONImpl" is on a separate line, you can throw in a :when
clause to filter them out:
(for [file out-files
line (
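The truncated :when version presumably ends like the sketch below; the
exact predicate is my assumption. One caveat worth noting either way:
with-open closes the reader as soon as its body returns, so the line-seq
has to be realized inside it (hence the doall), otherwise the lazy reads
hit a closed stream:

(for [file out-files
      line (with-open [r (io/reader file)] (doall (line-seq r)))
      :when (not= line "StatusJSONImpl")]
  line)

This does realize one whole file at a time, which is still far better
than holding all of them at once.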
> Here is my function:
>
> (defn json-seq []
> (apply concat
> (map #(do (print "f") (str/split (slurp %) #"\nStatusJSONImpl"))
> out-files)))
Assuming the str namespace is clojure.contrib.string, (str/split ..)
won't be lazy. Currently it's implemented as:
(defn split
"Sp
My first thought is that you need to tweak your JVM settings. Try
allocating a minimum of 512MB to the total.
My second thought is that you need to use laziness to your advantage.
Remove the print expression from the mapping operation. It's useful
for debugging/prototyping, but shouldn't be in t
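For the heap setting, something like this in project.clj should do it
(assuming a Leiningen project here; on older lein versions the JAVA_OPTS
environment variable serves the same purpose, and the exact values are
arbitrary):

  :jvm-opts ["-Xms512m" "-Xmx512m"]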