Hi,

I have a query like the one below, which is used to pull out all the URLs
accessed within our app.

select name, id, received_at
from users
where (url LIKE '%abc%' or url LIKE '%def%' or url LIKE '%ghi%'
    or url LIKE '%jkl%' or url LIKE '%mno%' or url LIKE '%pqr%'
    or url LIKE '%stu%' or url LIKE '%vwx%' or url LIKE '%yza%'
    or url LIKE '%bcd%')
order by received_at;

Please note I have put dummy names in here; in reality we have about 1000
URLs, each of which is unique.

I have run this query and it takes about 10-15 minutes against our remote
Redshift database via a Go program.

If I use a prepared statement and query each URL one at a time in a for
loop, the time drops to about 5 minutes. (Currently all the URLs are stored
in a string slice.)

The ultimate goal is to create a CSV file from the output of the above SQL.
We want to use Go for this so we can build an automated service that runs
and generates the CSV at regular intervals. As of now, using Gorm and a for
loop, I am able to collect everything in a slice.
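
For the CSV part, encoding/csv seems to be enough once everything is in the
slice; a minimal sketch reusing the row struct from above (the output path
and the RFC3339 timestamp format are just examples):

package main

import (
	"encoding/csv"
	"os"
	"time"
)

// writeCSV dumps the collected rows to a CSV file with a header row.
func writeCSV(path string, rows []row) error {
	f, err := os.Create(path)
	if err != nil {
		return err
	}
	defer f.Close()

	w := csv.NewWriter(f)
	if err := w.Write([]string{"name", "id", "received_at"}); err != nil {
		return err
	}
	for _, r := range rows {
		rec := []string{r.Name, r.ID, r.ReceivedAt.Format(time.RFC3339)}
		if err := w.Write(rec); err != nil {
			return err
		}
	}
	w.Flush()
	return w.Error()
}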

I believe there should be an easier way to do this; any suggestions would be
helpful. I was wondering about making use of goroutines, but I have not
explored that option yet.
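
To make that last part concrete, this is the rough shape I was imagining: a
small fixed pool of goroutines working through the URL slice (an untested
sketch; queryOne stands in for whatever per-URL query function we already
have, e.g. a closure around the prepared statement above, which should be
fine since a *sql.Stmt is safe for concurrent use):

package main

import "sync"

// fetchConcurrent fans the URL patterns out over a fixed number of
// workers and merges the results. A mutex guards the shared slice and
// the first error seen.
func fetchConcurrent(urls []string, workers int, queryOne func(string) ([]row, error)) ([]row, error) {
	var (
		mu       sync.Mutex
		out      []row
		firstErr error
		wg       sync.WaitGroup
	)
	jobs := make(chan string)

	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for u := range jobs {
				rs, err := queryOne(u)
				mu.Lock()
				if err != nil && firstErr == nil {
					firstErr = err
				}
				out = append(out, rs...)
				mu.Unlock()
			}
		}()
	}
	for _, u := range urls {
		jobs <- u
	}
	close(jobs)
	wg.Wait()
	return out, firstErr
}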
