Sorry, I got the conditions backwards, it was late :) And I also
didn't replace my test data with your erb...

> FasterCSV.foreach('CIS.csv') do |row|
>   port,ip,circuitid,node,eqpt,org = row
>   if connection.select_value("select count(1) from configuration_items
> ci where ci.name='#{port}'").to_i > 0
>     connection.execute("update configuration_items ci set ci.name='#{port}',
> ci.org_id=(select id from organizations o where o.name='#{org}'),
> ci.equipment_id=(select id from equipments e where e.name='#{eqpt}'),
> ci.ip_id=(select id from ips i where i.name='#{ip}'),
> ci.circuitid='#{circuitid}',
> ci.site_id=(select id from sites s where s.name='#{node}')
> where ci.name='#{port}'")
>   else
>     connection.execute("insert into configuration_items (name, org_id,
> equipment_id, ip_id, circuitid, site_id) select '#{port}',
> (select id from organizations o where o.name='#{org}'),
> (select id from equipments e where e.name='#{eqpt}'),
> (select id from ips i where i.name='#{ip}'),
> '#{circuitid}',
> (select id from sites s where s.name='#{node}')")
>   end
> end
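One caveat with interpolating CSV fields straight into SQL like that: any value containing a quote character (say, O'Brien) breaks the statement. Inside Rails you would normally pass each value through connection.quote; here is a standalone sketch of the problem and the fix, where sql_quote is just an illustrative stand-in for connection.quote, not a Rails method:

```ruby
# Minimal sketch of why raw interpolation is fragile and how quoting fixes it.
# `sql_quote` is an illustrative stand-in for ActiveRecord's connection.quote.
def sql_quote(value)
  "'" + value.to_s.gsub("'", "''") + "'"  # double any embedded single quote
end

port = "O'Brien-1"  # a value with an apostrophe in it

broken = "select * from configuration_items where name='#{port}'"
safe   = "select * from configuration_items where name=#{sql_quote(port)}"

puts broken  # unbalanced quotes -- this would be a SQL syntax error
puts safe    # name='O''Brien-1' -- parses correctly
```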

On Jun 1, 12:06 am, nodoubtarockstar <jenniferwendl...@gmail.com>
wrote:
> First, your find_or_creates are just a hack to avoid throwing a runtime
> error if you were to call id on nil, while also setting conditions
> for your finder... Either way,
>
> Honestly, in my opinion, you'd be better off doing this in pure SQL --
> using Rails for this type of stuff is sometimes too slow :-/
> Especially since you're not really using Rails for anything except the
> getters/setters, it's not like you're performing operations on any of
> your queries; you're just finding records and setting field values.
>
> This is just an "initial" stab at what you're looking at (in MySQL) --
> could probably be beefed up a ton more, who knows:
> FasterCSV.foreach('CIS.csv') do |row|
>   port,ip,circuitid,node,eqpt,org = row
>   if connection.select_count("select count(1) from configuration_items
> ci where ci.name='#{port}'") > 0
>     insert into configuration_items (name, org_id, equipment_id,
> ip_id, circuitid, site_id) select '#{port}', (select id from
> organizations o where o.name='#{org}'), (select id from equipments e
> where e.name='#{eqpt}'), (select id from ips i where i.name='#{ip}'),
> (select '#{circuit}'), (select id from sites s where s.name='#{node}')
>   else
>     update configuration_items ci set ci.name='1234', ci.org_id=
> (select id from organizations o where o.name='test'), ci.equipment_id=
> (select id from equipments e where e.name='test'), ci.ip_id=(select id
> from ips i where i.name='test 4'), ci.circuitid=4, ci.site_id=(select
> id from sites s where s.name='test 2') where ci.name='test'
>   end
> end
>
> This could *possibly* be done in one query with an on duplicate key
> update, but I'm honestly not quite familiar enough with MySQL to do
> this complex of a query with DKU, I'm used to just doing things like
> insert or increment with DKU :).
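For what it's worth, the single-statement version would look roughly like the sketch below. This assumes configuration_items.name has a UNIQUE index (MySQL's ON DUPLICATE KEY UPDATE only fires on a key collision); table and column names are taken from the thread, and the values should still be escaped before interpolation:

```ruby
# Build one MySQL upsert per CSV row. Assumes a UNIQUE index on
# configuration_items.name so the duplicate-key branch can trigger.
def upsert_sql(port, ip, circuitid, node, eqpt, org)
  <<-SQL
    insert into configuration_items (name, org_id, equipment_id, ip_id, circuitid, site_id)
    select '#{port}',
           (select id from organizations o where o.name='#{org}'),
           (select id from equipments e where e.name='#{eqpt}'),
           (select id from ips i where i.name='#{ip}'),
           '#{circuitid}',
           (select id from sites s where s.name='#{node}')
    on duplicate key update
      org_id=values(org_id), equipment_id=values(equipment_id),
      ip_id=values(ip_id), circuitid=values(circuitid), site_id=values(site_id)
  SQL
end

sql = upsert_sql('p1', '10.0.0.1', '42', 'node1', 'eq1', 'org1')
puts sql
```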
>
> Anyway, run that in query browser or something and see how the times
> compare when you're not doing any Rails lookups...
>
> Cheers!
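To actually put numbers on that comparison, Ruby's stdlib Benchmark works outside of Rails too. A minimal sketch -- the two lambdas here are placeholders standing in for the ActiveRecord loop and the raw-SQL loop from the thread:

```ruby
require 'benchmark'

# Placeholder workloads standing in for the two import strategies;
# swap in the real FasterCSV loops to compare them.
ar_style  = lambda { 10_000.times { |i| i.to_s } }
raw_style = lambda { 10_000.times { |i| i } }

ar_time  = Benchmark.realtime { ar_style.call }
raw_time = Benchmark.realtime { raw_style.call }

puts "ActiveRecord-style: #{ar_time}s, raw-SQL-style: #{raw_time}s"
```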
>
> On May 31, 12:09 pm, Chris Habgood <chabg...@gmail.com> wrote:
>
> > Well this is crazy.  I do most of my dev work on the computer I use as I
> > work from home.  Running Vista I was able to insert 388,000 records in 2:40
> > mins.  I hesitated trying to do it on my laptop.  BUT!!  It is running WinXP
> > and I inserted 389,000 records in 40 mins.  WOW!  Also, the autocomplete
> > lookup runs much faster in XP than in Vista.  Also it involves
> > putting repetitive data in their own tables for better normalization.
> > This is really more of a test to see how it compares to another system we
> > use (Java) for doing data uploads.
>
> > FasterCSV.foreach('CIS.csv') do |row|
> >   port,ip,circuitid,node,eqpt,org = row
> >
> >   ip1   = Ip.find_or_create_by_name(ip).id
> >   node1 = Site.find_or_create_by_name(node).id
> >   ci    = ConfigurationItem.find_by_name(port)
> >   eq    = Equipment.find_or_create_by_name(eqpt).id
> >   organ = Organization.find_or_create_by_name(org).id
> >   if ci.nil?
> >     ConfigurationItem.create(:name => port, :org_id => organ,
> >       :equipment_id => eq, :ip_id => ip1, :circuitid => circuitid,
> >       :site_id => node1)
> >   else
> >     ci.update_attributes(:name => port, :org_id => organ,
> >       :equipment_id => eq, :ip_id => ip1, :circuitid => circuitid,
> >       :site_id => node1)
> >   end
> > end
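One cheap speedup for a loop like the one above: most CSV rows repeat the same org/site/ip/equipment names, so you can memoize each (model, name) -> id pair and hit the database only once per distinct name. A sketch -- FakeModel below is a hypothetical stand-in for the real AR classes (Ip, Site, Equipment, Organization), just to keep the example runnable:

```ruby
# FakeModel simulates an ActiveRecord class so the caching pattern can be
# demonstrated without a database; it counts how often it is "queried".
class FakeModel
  Rec = Struct.new(:id)
  @@calls = 0
  def self.find_or_create_by_name(name)
    @@calls += 1
    Rec.new(name.length)  # pretend the record's id is the name length
  end
  def self.calls
    @@calls
  end
end

# The Hash default block runs only on a cache miss, so each distinct
# (model, name) pair triggers exactly one find_or_create call.
id_cache = Hash.new do |cache, (model, name)|
  cache[[model, name]] = model.find_or_create_by_name(name).id
end

3.times { id_cache[[FakeModel, 'org1']] }
puts FakeModel.calls  # => 1
```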
>
> > On Sat, May 30, 2009 at 5:12 PM, Conrad Taylor <conra...@gmail.com> wrote:
> > > 2009/5/30 Maurício Linhares <mauricio.linha...@gmail.com>
>
> > >> Have you tried adding a :limit clause to your select statement?
>
> > >> Maybe show only the first 15 results, and make sure there's an index on
> > >> the column being searched.
>
> > >> -
> > >> Maurício Linhares
> > >> http://alinhavado.wordpress.com/ (pt-br) | http://blog.codevader.com/ (en)
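On the index point: without an index on name, every lookup in the import loop is a full table scan over 380,000 rows. In a Rails migration that would look roughly like this -- a sketch using the table names from the thread, not tested against this particular schema:

```ruby
# Hypothetical migration adding indexes on the columns the import
# loop searches by; table names are taken from the thread's code.
class AddNameIndexes < ActiveRecord::Migration
  def self.up
    add_index :configuration_items, :name
    add_index :organizations, :name
    add_index :sites, :name
    add_index :ips, :name
    add_index :equipments, :name
  end

  def self.down
    remove_index :configuration_items, :name
    remove_index :organizations, :name
    remove_index :sites, :name
    remove_index :ips, :name
    remove_index :equipments, :name
  end
end
```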
>
> > >> On Sat, May 30, 2009 at 3:08 PM, Me <chabg...@gmail.com> wrote:
>
> > >> > Is there anything I can do to speed up the query of a table that has
> > >> > 380,000 entries and will only get bigger?
>
> > >> > I tried the recipe in the rails recipe book but the page takes a long
> > >> > time just for the page to load due to the array being generated.
>
> > > Also, you may try using :select to limit which fields you fetch from
> > > your database.  If you can provide sample code, we can better assist you.
> > > Other than that, I would recommend using the
>
> > > #{RAILS_ROOT}/script/performance/benchmarker
>
> > > to profile your queries to make sure that your optimizations are truly
> > > working
> > > for the query in question.
>
> > > Good luck,
>
> > > -Conrad
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Ruby 
on Rails: Talk" group.
To post to this group, send email to rubyonrails-talk@googlegroups.com
To unsubscribe from this group, send email to 
rubyonrails-talk+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/rubyonrails-talk?hl=en
-~----------~----~----~----~------~----~------~--~---