Re: Problem of results ordering

2010-12-26 Thread Grijesh.singh
Your problem is term frequency. If you do not want term frequency to be considered, try omitting it. - Grijesh -- View this message in context: http://lucene.472066.n3.nabble.com/Problem-of-results-ordering-tp2139314p2150722.html Sent from the Solr - User mailing list archive at Nabble.com.
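[Editor's sketch] For illustration, a minimal example of what omitting term frequency can look like in schema.xml; the field name and type here are made up, not from the thread. With omitTermFreqAndPositions set, repeated occurrences of a term in a document no longer raise its score (and, since positions are dropped too, phrase queries stop working on that field):

  <!-- Sketch only: a hypothetical field whose scoring ignores term frequency. -->
  <field name="title" type="text" indexed="true" stored="true"
         omitTermFreqAndPositions="true"/>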

Re: Optimizing to only 1 segment

2010-12-26 Thread Rok Rejc
Hi, there is nothing in the log, and the optimize finishes successfully (the response reports status 0, QTime 17). I run the optimize through the browser by entering the URL http://localhost:8080/myindex/update?optimize=true or http://localhost:8080/myindex/update?stream.body=<optimize/> Thanks. On Mon, Dec 27, 2010 at 7:12 AM, Li Li wrote: > mayb

Re: Optimizing to only 1 segment

2010-12-26 Thread Li Li
Maybe you can consult the log files; they may show you something. By the way, how do you post your command? Do you use curl 'http://localhost:8983/solr/update?optimize=true', or do you post an XML file? 2010/12/27 Rok Rejc : > On Mon, Dec 27, 2010 at 3:26 AM, Li Li wrote: > >> see maxMergeDocs(maxMergeSize) in s

Re: Optimizing to only 1 segment

2010-12-26 Thread Rok Rejc
On Mon, Dec 27, 2010 at 3:26 AM, Li Li wrote: > see maxMergeDocs(maxMergeSize) in solrconfig.xml. if the segment's > documents size is larger than this value, it will not be merged. > I see that in my solrconfig.xml, but it is commented out and marked as deprecated. I have uncommented this setting (
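[Editor's sketch] For reference, this is roughly where the setting lives in a 1.4/3.x-era solrconfig.xml; the values below are only illustrative, and in branch_3x the merge behaviour is increasingly configured through the mergePolicy element rather than this deprecated top-level knob:

  <indexDefaults>
    <mergeFactor>10</mergeFactor>
    <!-- Deprecated: a segment with more documents than this is never merged.
         The default is effectively unlimited (Integer.MAX_VALUE). -->
    <maxMergeDocs>2147483647</maxMergeDocs>
  </indexDefaults>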

exception with xml file processing

2010-12-26 Thread xu cheng
Hi all: I use Solr to index my documents, and I put my text in a CDATA section. However, Solr always throws an exception complaining about the XML file processing. It seems that I can still index the documents successfully (actually, I'm not sure, because there are far too many documents to check!)
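[Editor's sketch] For reference, a well-formed update message with the text wrapped in CDATA looks roughly like the example below (the field names are hypothetical, not from the original post). If an exception still appears with documents shaped like this, the usual suspects are a stray "]]>" sequence inside the CDATA block or characters that are illegal in XML (e.g. most control characters):

  <add>
    <doc>
      <field name="id">doc-1</field>
      <!-- CDATA lets markup-like characters pass through untouched;
           the sequence ]]> must not appear inside it. -->
      <field name="body"><![CDATA[Some text with <b>markup</b> & special characters]]></field>
    </doc>
  </add>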

Re: Optimizing to only 1 segment

2010-12-26 Thread Lance Norskog
Is the optimize actually finished? By default, the optimize command is accepted and the HTTP request returns; you have to add attributes to the command to control how it waits. On Sun, Dec 26, 2010 at 9:23 AM, Rok Rejc wrote: > Hi all, > > I have created an index, committed the data and after that I had run the > optimize with defau
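[Editor's sketch] As an example of making those attributes explicit (the values here are just illustrations, not the thread's actual command): the waitFlush/waitSearcher attributes make the request block until the work is visible, and maxSegments asks for the index to be collapsed down to a single segment:

  <!-- Posted as the body of an update request, e.g. via stream.body or an XML file. -->
  <optimize waitFlush="true" waitSearcher="true" maxSegments="1"/>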

Re: Increase Search Speed for multiple solr request for different time range query

2010-12-26 Thread Lance Norskog
500 rows can be a lot of rows. A filter query is a normal query the first time it is run, and it is cached thereafter. If you do a sequence of different time ranges, each one will be slow the first time. So, if you just do a query for each time range and use the query and filter-query caches, they might be faster. On Sat,
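[Editor's sketch] For context, the caches Lance mentions are configured in the query section of solrconfig.xml; the sizes below are examples only, not recommendations:

  <!-- Example sizes; tune for your own index and query mix. -->
  <filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="128"/>
  <queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="32"/>

The time restriction would then be sent as an fq parameter (for instance fq=timestamp:[2010-12-01T00:00:00Z TO 2010-12-26T00:00:00Z], with a hypothetical field name) so repeated queries over the same range can be answered from the filterCache.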

Re: Optimizing to only 1 segment

2010-12-26 Thread Li Li
See maxMergeDocs (maxMergeSize) in solrconfig.xml. If a segment's document count is larger than this value, it will not be merged. 2010/12/27 Rok Rejc : > Hi all, > > I have created an index, committed the data and after that I had run the > optimize with default parameters: > > http://localhost:8

Re: Solr branch_3x problems

2010-12-26 Thread Lance Norskog
> I did a heap dump + heap histogram before killing the JVM today, and the only really suspicious thing was the top line in the histogram: class [B, 81883 instances, 3,974,092,842 bytes > Most of the instances (actually all of the roughly one hundred I checked with jhat) look almost the same in term

Optimizing to only 1 segment

2010-12-26 Thread Rok Rejc
Hi all, I have created an index, committed the data, and after that I ran the optimize with default parameters: http://localhost:8080/myindex/update?stream.body=<optimize/> I was surprised that after the optimize finished there were 21 segments in the index: reader : SolrIndexReader{this=724a2dd4,r