That assumption needs checking - first rule of performance analysis: check,
don't guess :)

For example, is the Java code reusing an existing connection while the Clojure
code creates one?  I would also time the cost of creating 100,000 Clojure maps
of a similar structure.  Finally, 100,000 rows is big enough to cause trouble
with a small heap size.  Are the JVM settings the same?
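Something like this would give a rough number for the map-creation cost alone; the column keywords and values here are made up for illustration, not taken from the actual schema:

```clojure
;; Rough micro-benchmark: how long does it take to build 100,000
;; maps with 25 keys each, roughly the shape of one result row?
(let [cols (mapv #(keyword (str "col" %)) (range 25))  ; :col0 .. :col24
      row  (vec (range 25))]                           ; 25 dummy values
  (time
    (dotimes [_ 100000]
      (zipmap cols row))))
```

If that time is a small fraction of the total, the slowdown is probably elsewhere (driver settings, connection setup, or GC pressure) rather than in the map construction itself.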

Sent from my iPad

On 6 Aug 2011, at 19:11, Shoeb Bhinderwala <shoeb.bhinderw...@gmail.com>
wrote:

I am loading about 100,000 records from the database with
clojure.contrib.sql, using a simple query that pulls in 25 attributes
(columns) per row. Most of the columns are of type NUMBER, so they get loaded
as BigDecimals. I am using an Oracle database and the JDBC 6 driver (
com.oracle/ojdbc6 "11.1.0.7.0").

I am using Clojure 1.2.1. The code is about 10 times slower than the same
code written in Java using the JDBC API.

Is there any way to speed this up? Type hints? Move to Clojure 1.3?

I am assuming that most of the extra time is spent converting the results
into Clojure maps.
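For reference, the load looks roughly like this; the db spec and table name below are placeholders, not my actual configuration:

```clojure
;; Hypothetical sketch of the load using clojure.contrib.sql.
;; Connection details and table name are placeholders.
(require '[clojure.contrib.sql :as sql])

(def db {:classname   "oracle.jdbc.OracleDriver"
         :subprotocol "oracle:thin"
         :subname     "@//localhost:1521/XE"
         :user        "scott"
         :password    "tiger"})

(defn load-rows []
  (sql/with-connection db
    (sql/with-query-results rows ["SELECT * FROM my_table"]
      ;; realize the full (lazy) seq of row maps before the
      ;; connection is closed
      (doall rows))))
```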

Does anybody have experience optimizing code to load data from the database?

-- Shoeb

-- 
You received this message because you are subscribed to the Google
Groups "Clojure" group.
To post to this group, send email to clojure@googlegroups.com
Note that posts from new members are moderated - please be patient with your
first post.
To unsubscribe from this group, send email to
clojure+unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/clojure?hl=en
