Queueing/too many files

Queueing/too many files

junklight
Hi,

I want to index a very large number of documents. I can POST
these to Solr very quickly, but I rapidly run into the 'too many
open files' problem.

Is there any way to avoid this - either by Solr blocking *before* the
error, or by it queueing the incoming documents until it can cope with
them? I have tried raising the limit with ulimit and lowering the merge
factor, but that just pushes the problem back slightly.
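For anyone debugging the same symptom, a quick way to see how much descriptor headroom the process actually has (a sketch for Unix-like systems; the 8192 value and the lsof check are illustrative, not recommendations):

```shell
# Show the per-process open-file limit for the current shell; anything
# started from this shell (e.g. a Solr/Jetty start script) inherits it.
ulimit -n

# Raise it for this session before starting Solr (assumes the hard
# limit permits it; otherwise root must raise the hard limit first):
# ulimit -n 8192

# Count files actually held open by a running Solr process
# (substitute the real pid; requires lsof):
# lsof -p <solr-pid> | wc -l
```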

cheers

mark

Re: Queueing/too many files

Chris Hostetter-3
: these to solr very quickly but rapidly run into the 'too many files
: open problem'.

Too many filehandles may be because you aren't closing the connections in
your update client ... or it could be a result of *not* using the compound
file format (controlled via solrconfig.xml).
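On the client side, the point is to reuse one connection for the whole batch and close it deterministically rather than leaking a socket per document. A minimal sketch with the standard library (the endpoint path and XML body shape are illustrative, not Solr's actual update API):

```python
import http.client

def post_documents(host, port, path, docs):
    """POST each document over one persistent connection, then close it."""
    conn = http.client.HTTPConnection(host, port)
    statuses = []
    try:
        for doc in docs:
            conn.request("POST", path, body=doc,
                         headers={"Content-Type": "text/xml"})
            resp = conn.getresponse()
            resp.read()  # drain the body so the connection can be reused
            statuses.append(resp.status)
    finally:
        conn.close()  # one socket total, released even on error
    return statuses
```

The `finally: conn.close()` is what keeps a fast indexing loop from accumulating half-closed sockets that count against the filehandle limit.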

Solr itself really doesn't use a lot of open files -- just the Lucene
index files -- and if you use the compound file format that really shouldn't
be a problem (unless your file limit is really, really tiny).
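For reference, the setting lives in the index section of solrconfig.xml; a sketch (element placement follows the example config of Solr's 1.x era -- check your own solrconfig.xml for the exact section names):

```xml
<!-- Write each new segment as a single compound (.cfs) file, so a
     segment holds roughly one open file instead of one per index
     file type. -->
<mainIndex>
  <useCompoundFile>true</useCompoundFile>
</mainIndex>
```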



-Hoss


Re: Queueing/too many files

junklight
>  or it could be a result of *not* using the compound
> file format (controlled via solrconfig.xml)
>


Ah yes - this was set to false by default; turning it on seems to have
done the trick.

cheers

mark