Can't add new data or optimize

4 messages

Can't add new data or optimize

Kevin Lewandowski
I'm no longer able to add new data or optimize my index. There are
currently 1600 files in the index directory and it's about 1.1gb. I've
tried changing solrconfig.xml to use the compound file format and that
didn't make a difference. My ulimit is "unlimited" but I've tried
setting it at 1000000 and restarting tomcat but tomcat wouldn't start
after doing that.

I'm not sure what else can be done. Any ideas?

thanks,
Kevin



INFO: start commit(optimize=true,waitFlush=false,waitSearcher=true)
Oct 30, 2006 2:08:33 PM org.apache.solr.core.SolrException log
SEVERE: Exception during
commit/optimize:java.io.FileNotFoundException:
/root/solr-tomcat/solr/data/index/_2e5ht.f8 (Too many open files)
        at java.io.RandomAccessFile.open(Native Method)
        at java.io.RandomAccessFile.<init>(RandomAccessFile.java:212)
        at org.apache.lucene.store.FSIndexInput$Descriptor.<init>(FSDirectory.java:481)
        at org.apache.lucene.store.FSIndexInput.<init>(FSDirectory.java:490)
        at org.apache.lucene.store.FSDirectory.openInput(FSDirectory.java:423)
        at org.apache.lucene.index.SegmentReader.openNorms(SegmentReader.java:445)
        at org.apache.lucene.index.SegmentReader.initialize(SegmentReader.java:157)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:129)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:110)
        at org.apache.lucene.index.IndexWriter.mergeSegments(IndexWriter.java:743)
        at org.apache.lucene.index.IndexWriter.mergeSegments(IndexWriter.java:727)
        at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:580)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:459)
        at org.apache.solr.core.SolrCore.update(SolrCore.java:755)
        at org.apache.solr.servlet.SolrUpdateServlet.doPost(SolrUpdateServlet.java:52)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:709)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:869)
        at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:664)
        at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
        at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
        at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
        at java.lang.Thread.run(Thread.java:595)

Oct 30, 2006 2:08:33 PM org.apache.solr.core.SolrException log
SEVERE: java.lang.NoClassDefFoundError: org/apache/solr/util/XML
        at org.apache.solr.core.SolrCore.writeResult(SolrCore.java:943)
        at org.apache.solr.core.SolrCore.update(SolrCore.java:780)
        at org.apache.solr.servlet.SolrUpdateServlet.doPost(SolrUpdateServlet.java:52)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:709)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:869)
        at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:664)
        at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
        at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
        at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
        at java.lang.Thread.run(Thread.java:595)
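[Editor's note: the "Too many open files" IOException in the first trace means the Tomcat JVM has exhausted its file-descriptor limit, so Lucene cannot open the next segment file. A quick Linux sketch for seeing how close a process is to that limit; it is demonstrated on the current shell's own PID, and in practice you would substitute the Tomcat JVM's PID:]

```shell
# Inspect the current shell ($$) as a stand-in for the Tomcat JVM's PID.
PID=$$

# Number of file descriptors the process currently holds open.
open_now=$(ls /proc/$PID/fd | wc -l)

# The soft open-file limit actually enforced on this shell.
limit=$(ulimit -n)

echo "open=$open_now limit=$limit"
```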

Re: Can't add new data or optimize

Mike Klaas
On 10/30/06, Kevin Lewandowski <[hidden email]> wrote:
> I'm no longer able to add new data or optimize my index. There are
> currently 1600 files in the index directory and it's about 1.1gb. I've
> tried changing solrconfig.xml to use the compound file format and that
> didn't make a difference. My ulimit is "unlimited" but I've tried
> setting it at 1000000 and restarting tomcat but tomcat wouldn't start
> after doing that.

The ulimit setting that matters here is "ulimit -n".  Try raising that.

-Mike
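
[Editor's note: plain "ulimit" with no option reports the maximum file size (-f) in most shells, which is how it can read "unlimited" while the open-file limit is still small. A sketch of checking and raising the right limit; the 8192 value and the startup path are illustrative, not from this thread:]

```shell
# "ulimit" alone reports max file size (-f); the open-file limit is -n.
n=$(ulimit -n)
echo "open-file limit: $n"

# To raise it, do so in the shell that launches Tomcat so the JVM
# inherits the new value (8192 and the path are illustrative):
#   ulimit -n 8192
#   $CATALINA_HOME/bin/startup.sh
```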

Re: Can't add new data or optimize

Yonik Seeley
In reply to this post by Kevin Lewandowski
On 10/30/06, Kevin Lewandowski <[hidden email]> wrote:
> I'm no longer able to add new data or optimize my index. There are
> currently 1600 files in the index directory and it's about 1.1gb. I've
> tried changing solrconfig.xml to use the compound file format and that
> didn't make a difference. My ulimit is "unlimited" but I've tried
> setting it at 1000000 and restarting tomcat but tomcat wouldn't start
> after doing that.
>
> I'm not sure what else can be done. Any ideas?

What are your index stats? number of indexed fields, maxBufferedDocs,
maxMergeDocs, mergeFactor, etc?

1600 files sounds like too many, unless they are no longer part of the
active index and are just left over from previous merge failures.
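
[Editor's note: those knobs live in the <indexDefaults> section of solrconfig.xml in Solr 1.x. A sketch with illustrative values, not recommendations; with norms stored per field (the _2e5ht.f8 file in the trace is one), a non-compound segment holds roughly seven files plus one per indexed field, so many segments times many fields can plausibly reach 1600 files:]

```xml
<!-- solrconfig.xml (Solr 1.x): settings that govern how many files the
     index holds; values are illustrative -->
<indexDefaults>
  <!-- Pack each segment into a single .cfs file instead of ~7 files
       plus one norms file (.fN) per indexed field -->
  <useCompoundFile>true</useCompoundFile>
  <!-- Segments that accumulate before a merge; lower means fewer files
       on disk but more frequent merging -->
  <mergeFactor>10</mergeFactor>
  <maxBufferedDocs>1000</maxBufferedDocs>
  <maxMergeDocs>2147483647</maxMergeDocs>
</indexDefaults>
```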


A ulimit of 8192 is normally more than enough. Depending on the OS,
raising the limit can consume extra resources, so don't set it higher
than you need.
Be aware that using "ulimit" to change the number of open files or
file descriptors probably only sets the "soft" limit, not the
system-wide hard limit (which may be 1024 unless you change it).

http://www.google.com/search?q=linux+ulimit+hard+limit

-Yonik
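
[Editor's note: the soft/hard distinction above can be probed directly. A sketch; the limits.conf lines are an assumption typical of Linux PAM setups, not something from this thread:]

```shell
# Soft limit: the value actually enforced on the process.
soft=$(ulimit -Sn)
# Hard limit: the ceiling a non-root user may raise the soft limit to.
hard=$(ulimit -Hn)
echo "soft=$soft hard=$hard"

# Plain "ulimit -n <value>" tries to set both, and fails for a non-root
# user when <value> exceeds the hard limit -- a plausible reason setting
# it to 1000000 kept Tomcat from starting. To raise the hard limit
# persistently on Linux, the usual place is /etc/security/limits.conf:
#   tomcat  soft  nofile  8192
#   tomcat  hard  nofile  8192
```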

Re: Can't add new data or optimize

Kevin Lewandowski
Thanks for the help! The problem was that I wasn't using "ulimit -n".
It's back to normal now.

thanks,
Kevin

On 10/30/06, Yonik Seeley <[hidden email]> wrote:

> On 10/30/06, Kevin Lewandowski <[hidden email]> wrote:
> > I'm no longer able to add new data or optimize my index. There are
> > currently 1600 files in the index directory and it's about 1.1gb. I've
> > tried changing solrconfig.xml to use the compound file format and that
> > didn't make a difference. My ulimit is "unlimited" but I've tried
> > setting it at 1000000 and restarting tomcat but tomcat wouldn't start
> > after doing that.
> >
> > I'm not sure what else can be done. Any ideas?
>
> What are your index stats? number of indexed fields, maxBufferedDocs,
> maxMergeDocs, mergeFactor, etc?
>
> 1600 files sounds like too many, unless they are no longer part of the
> active index and are just left over from previous merge failures.
>
>
> A ulimit of 8192 is normally more than enough. Depending on the OS,
> raising the limit can consume extra resources, so don't set it higher
> than you need.
> Be aware that using "ulimit" to change the number of open files or
> file descriptors probably only sets the "soft" limit, not the
> system-wide hard limit (which may be 1024 unless you change it).
>
> http://www.google.com/search?q=linux+ulimit+hard+limit
>
> -Yonik
>