Hypothetically, I have 100 million records, each with 100+ fields. Only 20 of those fields need to be searched on; all of the fields (including those 20) are needed for display purposes. Would it be best to index just the 20 searchable fields and keep the rest in a relational database?

What effect does all that fluff data have on index size and search speed? Does it matter that some of the fluff data is repeated a lot? (Certain fields might just contain the state a person lives in, their hair color, the number of fingers they have, etc.)

Our indexes are going to be very big; 100 million+ is not an exaggeration. Will Lucene handle this OK? I have created indexes in the 8-30 million document range, but never this big in either the number of documents or the number of fields.
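In case it helps clarify what I mean by searchable vs. display-only fields, here is a rough sketch of how I was planning to build each document (assuming the 2.x-era Field constructor; the field names and values are made up for illustration):

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;

    public class RecordDocumentSketch {

        // Rough sketch only: builds one document from a couple of example fields.
        public static Document buildDocument(String state, String hairColor) {
            Document doc = new Document();

            // One of the ~20 searchable fields: indexed so it can be queried,
            // and stored so it can also be shown in results.
            doc.add(new Field("state", state,
                    Field.Store.YES, Field.Index.UN_TOKENIZED));

            // One of the 80+ display-only "fluff" fields: stored for display,
            // but not indexed, so it never enters the inverted index.
            doc.add(new Field("hairColor", hairColor,
                    Field.Store.YES, Field.Index.NO));

            return doc;
        }
    }

My open question is whether storing those display-only fields in the index this way is reasonable at this scale, or whether I should keep them in the relational database and only store a primary key in the index for lookups.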
Thanks for any info you can provide.