[JENKINS-EA] Lucene-Solr-master-Windows (64bit/jdk-9-ea+173) - Build # 6748 - Unstable!

Policeman Jenkins Server-2
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/6748/
Java: 64bit/jdk-9-ea+173 -XX:+UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED:  org.apache.solr.cloud.ShardSplitTest.testSplitWithChaosMonkey

Error Message:
There are still nodes recoverying - waited for 330 seconds

Stack Trace:
java.lang.AssertionError: There are still nodes recoverying - waited for 330 seconds
        at __randomizedtesting.SeedInfo.seed([293125BE5DA80AC6:A216F66F1CAEA142]:0)
        at org.junit.Assert.fail(Assert.java:93)
        at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:183)
        at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:140)
        at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:135)
        at org.apache.solr.cloud.AbstractFullDistribZkTestBase.waitForRecoveriesToFinish(AbstractFullDistribZkTestBase.java:907)
        at org.apache.solr.cloud.ShardSplitTest.testSplitWithChaosMonkey(ShardSplitTest.java:436)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:564)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
        at java.base/java.lang.Thread.run(Thread.java:844)
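
For reference, the assertion above is thrown by AbstractDistribZkTestBase.waitForRecoveriesToFinish once its wait budget (330 seconds here) runs out while some replica still reports a recovering or down state in the cluster state. A rough sketch of that kind of polling loop is below; it is illustrative only (simplified, hypothetical names), not the actual Solr test-framework source:

    import java.util.concurrent.TimeUnit;
    import org.apache.solr.common.cloud.DocCollection;
    import org.apache.solr.common.cloud.Replica;
    import org.apache.solr.common.cloud.ZkStateReader;
    import static org.junit.Assert.fail;

    // Illustrative, simplified version of the wait loop that produced the failure above.
    static void waitForRecoveries(ZkStateReader reader, String collection,
                                  int timeoutSeconds) throws InterruptedException {
      long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(timeoutSeconds);
      while (true) {
        DocCollection coll = reader.getClusterState().getCollection(collection);
        boolean stillRecovering = coll.getReplicas().stream().anyMatch(
            r -> r.getState() == Replica.State.RECOVERING
              || r.getState() == Replica.State.DOWN);
        if (!stillRecovering) {
          return;                     // every replica is active again
        }
        if (System.nanoTime() > deadline) {
          // This is the branch that fired in the build above.
          fail("There are still nodes recoverying - waited for " + timeoutSeconds + " seconds");
        }
        Thread.sleep(1000);           // poll roughly once per second
      }
    }

In this run the chaos-monkey split left at least one replica stuck in recovery for the full 330 seconds, so the timeout branch fired.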




Build Log:
[...truncated 11568 lines...]
   [junit4] Suite: org.apache.solr.cloud.ShardSplitTest
   [junit4]   2> Creating dataDir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\init-core-data-001
   [junit4]   2> 503128 INFO  (SUITE-ShardSplitTest-seed#[293125BE5DA80AC6]-worker) [    ] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 503128 INFO  (SUITE-ShardSplitTest-seed#[293125BE5DA80AC6]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.SolrTestCaseJ4$SuppressSSL(bugUrl="https://issues.apache.org/jira/browse/SOLR-5776")
   [junit4]   2> 503128 INFO  (SUITE-ShardSplitTest-seed#[293125BE5DA80AC6]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   2> 503148 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 503149 INFO  (Thread-1837) [    ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 503149 INFO  (Thread-1837) [    ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 503157 ERROR (Thread-1837) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 503250 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkTestServer start zk server on port:51082
   [junit4]   2> 503293 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d70000, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 503381 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 503387 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\schema15.xml to /configs/conf1/schema.xml
   [junit4]   2> 503390 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 503396 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 503402 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 503405 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 503410 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 503413 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 503416 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 503431 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 503463 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 503467 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d70001, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 503468 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractFullDistribZkTestBase Will use TLOG replicas unless explicitly asked otherwise
   [junit4]   2> 504229 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 504230 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2ec56163{/,null,AVAILABLE}
   [junit4]   2> 504234 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@72d9b547{HTTP/1.1,[http/1.1]}{127.0.0.1:51089}
   [junit4]   2> 504234 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server Started @505615ms
   [junit4]   2> 504234 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\tempDir-001/control/data, hostPort=51089, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\control-001\cores, replicaType=NRT}
   [junit4]   2> 504234 ERROR (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 504234 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 504235 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 504235 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 504235 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T12:34:21.328740Z
   [junit4]   2> 504240 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d70002, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 504240 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 504240 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\control-001\solr.xml
   [junit4]   2> 504243 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 504251 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51082/solr
   [junit4]   2> 504255 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d70003, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 504855 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 504857 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:51089_
   [junit4]   2> 504864 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=98311999340347396-127.0.0.1:51089_-n_0000000000) starting
   [junit4]   2> 504874 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51089_
   [junit4]   2> 504876 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 504992 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 504998 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 504998 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 505001 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\control-001\cores
   [junit4]   2> 505059 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 505067 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:51082/solr ready
   [junit4]   2> 505079 INFO  (qtp1887831414-6193) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params replicationFactor=1&collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:51089_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 505122 INFO  (OverseerThreadFactory-3178-thread-1) [    ] o.a.s.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 505250 INFO  (qtp1887831414-6195) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 505255 INFO  (qtp1887831414-6195) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 505396 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 506286 INFO  (qtp1887831414-6195) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 506301 INFO  (qtp1887831414-6195) [    ] o.a.s.s.IndexSchema [control_collection_shard1_replica_n1] Schema name=test
   [junit4]   2> 506365 INFO  (qtp1887831414-6195) [    ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 506403 INFO  (qtp1887831414-6195) [    ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from collection control_collection, trusted=true
   [junit4]   2> 506404 INFO  (qtp1887831414-6195) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 506404 INFO  (qtp1887831414-6195) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 506404 INFO  (qtp1887831414-6195) [    ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\control-001\cores\control_collection_shard1_replica_n1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\control-001\cores\control_collection_shard1_replica_n1\data\]
   [junit4]   2> 506415 INFO  (qtp1887831414-6195) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=32, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=90.5986328125, floorSegmentMB=0.8515625, forceMergeDeletesPctAllowed=29.909096492685613, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 506452 WARN  (qtp1887831414-6195) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 506564 INFO  (qtp1887831414-6195) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 506564 INFO  (qtp1887831414-6195) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 506566 INFO  (qtp1887831414-6195) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 506566 INFO  (qtp1887831414-6195) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 506569 INFO  (qtp1887831414-6195) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=13, maxMergeAtOnceExplicit=50, maxMergedSegmentMB=87.9814453125, floorSegmentMB=0.9130859375, forceMergeDeletesPctAllowed=28.713299927934685, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.17693655700016617
   [junit4]   2> 506570 INFO  (qtp1887831414-6195) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@72d6098e[control_collection_shard1_replica_n1] main]
   [junit4]   2> 506580 INFO  (qtp1887831414-6195) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 506583 INFO  (qtp1887831414-6195) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 506587 INFO  (qtp1887831414-6195) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 506587 INFO  (qtp1887831414-6195) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1572991993045319680
   [junit4]   2> 506590 INFO  (searcherExecutor-3181-thread-1) [    ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@72d6098e[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 506607 INFO  (qtp1887831414-6195) [    ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 506607 INFO  (qtp1887831414-6195) [    ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 506607 INFO  (qtp1887831414-6195) [    ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:51089/control_collection_shard1_replica_n1/
   [junit4]   2> 506607 INFO  (qtp1887831414-6195) [    ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 506607 INFO  (qtp1887831414-6195) [    ] o.a.s.c.SyncStrategy http://127.0.0.1:51089/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 506607 INFO  (qtp1887831414-6195) [    ] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 506625 INFO  (qtp1887831414-6195) [    ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:51089/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 506735 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 506782 INFO  (qtp1887831414-6195) [    ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 506798 INFO  (qtp1887831414-6195) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1547
   [junit4]   2> 506833 INFO  (qtp1887831414-6193) [    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 506903 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 507142 INFO  (OverseerCollectionConfigSetProcessor-98311999340347396-127.0.0.1:51089_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 507839 INFO  (qtp1887831414-6193) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={replicationFactor=1&collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:51089_&wt=javabin&version=2} status=0 QTime=2759
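   [junit4]   2> (Editor's note: the CREATE call logged above corresponds roughly to the following SolrJ request. This is a hedged sketch using the ZK address and parameters from the log, not code taken from the test itself; the Builder/withZkHost API shown is assumed from SolrJ of this era.)
   [junit4]   2>
   [junit4]   2>   import org.apache.solr.client.solrj.impl.CloudSolrClient;
   [junit4]   2>   import org.apache.solr.client.solrj.request.CollectionAdminRequest;
   [junit4]   2>
   [junit4]   2>   public class CreateControlCollection {
   [junit4]   2>     public static void main(String[] args) throws Exception {
   [junit4]   2>       try (CloudSolrClient client = new CloudSolrClient.Builder()
   [junit4]   2>               .withZkHost("127.0.0.1:51082/solr")   // ZK connect string from the log
   [junit4]   2>               .build()) {
   [junit4]   2>         // Mirrors: action=CREATE&name=control_collection&collection.configName=conf1
   [junit4]   2>         //          &numShards=1&nrtReplicas=1&createNodeSet=127.0.0.1:51089_
   [junit4]   2>         CollectionAdminRequest.createCollection("control_collection", "conf1", 1, 1)
   [junit4]   2>             .setCreateNodeSet("127.0.0.1:51089_")
   [junit4]   2>             .process(client);
   [junit4]   2>       }
   [junit4]   2>     }
   [junit4]   2>   }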
   [junit4]   2> 507866 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 507873 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:51082/solr ready
   [junit4]   2> 507873 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 507874 INFO  (qtp1887831414-6198) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params replicationFactor=1&collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 507884 INFO  (OverseerThreadFactory-3178-thread-2) [    ] o.a.s.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 507885 WARN  (OverseerThreadFactory-3178-thread-2) [    ] o.a.s.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 508105 INFO  (qtp1887831414-6198) [    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 508105 INFO  (qtp1887831414-6198) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={replicationFactor=1&collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=230
   [junit4]   2> 508646 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-1-001 of type TLOG
   [junit4]   2> 508647 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 508648 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@8bec6fe{/,null,AVAILABLE}
   [junit4]   2> 508649 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@4dda8659{HTTP/1.1,[http/1.1]}{127.0.0.1:51110}
   [junit4]   2> 508649 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server Started @510030ms
   [junit4]   2> 508649 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\tempDir-001/jetty1, hostPort=51110, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-1-001\cores, replicaType=TLOG}
   [junit4]   2> 508649 ERROR (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 508650 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 508650 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 508650 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 508650 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T12:34:25.743608300Z
   [junit4]   2> 508656 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 508656 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-1-001\solr.xml
   [junit4]   2> 508661 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 508666 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51082/solr
   [junit4]   2> 508695 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 508697 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 508700 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51110_
   [junit4]   2> 508750 INFO  (zkCallback-991-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 508750 INFO  (zkCallback-986-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 508750 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 508814 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 508825 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 508825 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 508827 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-1-001\cores
   [junit4]   2> 508888 INFO  (qtp1541802432-6232) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:51110_&action=ADDREPLICA&collection=collection1&shard=shard2&type=TLOG&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 508904 INFO  (OverseerCollectionConfigSetProcessor-98311999340347396-127.0.0.1:51089_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 508920 INFO  (OverseerThreadFactory-3178-thread-3) [    ] o.a.s.c.AddReplicaCmd Node Identified 127.0.0.1:51110_ for creating new replica
   [junit4]   2> 508936 INFO  (qtp1541802432-6235) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_t0&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=TLOG
   [junit4]   2> 508936 INFO  (qtp1541802432-6235) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 509957 INFO  (qtp1541802432-6235) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 509979 INFO  (qtp1541802432-6235) [    ] o.a.s.s.IndexSchema [collection1_shard2_replica_t0] Schema name=test
   [junit4]   2> 510081 INFO  (qtp1541802432-6235) [    ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 510230 INFO  (qtp1541802432-6235) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_t0' using configuration from collection collection1, trusted=true
   [junit4]   2> 510231 INFO  (qtp1541802432-6235) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard2.replica_t0' (registry 'solr.core.collection1.shard2.replica_t0') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 510231 INFO  (qtp1541802432-6235) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 510231 INFO  (qtp1541802432-6235) [    ] o.a.s.c.SolrCore [[collection1_shard2_replica_t0] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-1-001\cores\collection1_shard2_replica_t0], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-1-001\cores\collection1_shard2_replica_t0\data\]
   [junit4]   2> 510258 INFO  (qtp1541802432-6235) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=32, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=90.5986328125, floorSegmentMB=0.8515625, forceMergeDeletesPctAllowed=29.909096492685613, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 510267 WARN  (qtp1541802432-6235) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 510405 INFO  (qtp1541802432-6235) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 510405 INFO  (qtp1541802432-6235) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 510406 INFO  (qtp1541802432-6235) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 510407 INFO  (qtp1541802432-6235) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 510409 INFO  (qtp1541802432-6235) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=13, maxMergeAtOnceExplicit=50, maxMergedSegmentMB=87.9814453125, floorSegmentMB=0.9130859375, forceMergeDeletesPctAllowed=28.713299927934685, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.17693655700016617
   [junit4]   2> 510409 INFO  (qtp1541802432-6235) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@425cc3a6[collection1_shard2_replica_t0] main]
   [junit4]   2> 510416 INFO  (qtp1541802432-6235) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 510418 INFO  (qtp1541802432-6235) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 510419 INFO  (qtp1541802432-6235) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 510420 INFO  (qtp1541802432-6235) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1572991997064511488
   [junit4]   2> 510421 INFO  (searcherExecutor-3192-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard2_replica_t0] Registered new searcher Searcher@425cc3a6[collection1_shard2_replica_t0] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 510432 INFO  (qtp1541802432-6235) [    ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 510432 INFO  (qtp1541802432-6235) [    ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 510433 INFO  (qtp1541802432-6235) [    ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:51110/collection1_shard2_replica_t0/
   [junit4]   2> 510433 INFO  (qtp1541802432-6235) [    ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 510433 INFO  (qtp1541802432-6235) [    ] o.a.s.c.SyncStrategy http://127.0.0.1:51110/collection1_shard2_replica_t0/ has no replicas
   [junit4]   2> 510433 INFO  (qtp1541802432-6235) [    ] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 510434 INFO  (qtp1541802432-6235) [    ] o.a.s.c.ZkController collection1_shard2_replica_t0 stopping background replication from leader
   [junit4]   2> 510441 INFO  (qtp1541802432-6235) [    ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:51110/collection1_shard2_replica_t0/ shard2
   [junit4]   2> 510598 INFO  (qtp1541802432-6235) [    ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 510605 INFO  (qtp1541802432-6235) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_t0&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=TLOG} status=0 QTime=1666
   [junit4]   2> 510614 INFO  (qtp1541802432-6232) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:51110_&action=ADDREPLICA&collection=collection1&shard=shard2&type=TLOG&wt=javabin&version=2} status=0 QTime=1726
   [junit4]   2> 510922 INFO  (OverseerCollectionConfigSetProcessor-98311999340347396-127.0.0.1:51089_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 511459 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-2-001 of type TLOG
   [junit4]   2> 511460 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 511460 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@5e02edcc{/,null,AVAILABLE}
   [junit4]   2> 511462 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@61a6bffb{HTTP/1.1,[http/1.1]}{127.0.0.1:51124}
   [junit4]   2> 511463 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server Started @512844ms
   [junit4]   2> 511463 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\tempDir-001/jetty2, hostPort=51124, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-2-001\cores, replicaType=TLOG}
   [junit4]   2> 511464 ERROR (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 511464 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 511464 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 511464 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 511464 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T12:34:28.557947600Z
   [junit4]   2> 511474 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 511474 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-2-001\solr.xml
   [junit4]   2> 511480 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 511485 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51082/solr
   [junit4]   2> 511493 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d7000b, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 511533 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 511536 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 511539 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51124_
   [junit4]   2> 511542 INFO  (zkCallback-997-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 511544 INFO  (zkCallback-991-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 511544 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 511545 INFO  (zkCallback-986-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 511641 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 511648 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 511649 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 511651 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-2-001\cores
   [junit4]   2> 511683 INFO  (qtp1541802432-6236) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:51124_&action=ADDREPLICA&collection=collection1&shard=shard1&type=TLOG&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 511687 INFO  (OverseerThreadFactory-3178-thread-4) [    ] o.a.s.c.AddReplicaCmd Node Identified 127.0.0.1:51124_ for creating new replica
   [junit4]   2> 511689 INFO  (qtp243838020-6263) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_t0&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=TLOG
   [junit4]   2> 511690 INFO  (qtp243838020-6263) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 512705 INFO  (qtp243838020-6263) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 512716 INFO  (qtp243838020-6263) [    ] o.a.s.s.IndexSchema [collection1_shard1_replica_t0] Schema name=test
   [junit4]   2> 512765 INFO  (qtp243838020-6263) [    ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 512784 INFO  (qtp243838020-6263) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_t0' using configuration from collection collection1, trusted=true
   [junit4]   2> 512785 INFO  (qtp243838020-6263) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_t0' (registry 'solr.core.collection1.shard1.replica_t0') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 512785 INFO  (qtp243838020-6263) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 512785 INFO  (qtp243838020-6263) [    ] o.a.s.c.SolrCore [[collection1_shard1_replica_t0] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-2-001\cores\collection1_shard1_replica_t0], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-2-001\cores\collection1_shard1_replica_t0\data\]
   [junit4]   2> 512794 INFO  (qtp243838020-6263) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=32, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=90.5986328125, floorSegmentMB=0.8515625, forceMergeDeletesPctAllowed=29.909096492685613, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 512800 WARN  (qtp243838020-6263) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 512884 INFO  (qtp243838020-6263) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 512884 INFO  (qtp243838020-6263) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 512884 INFO  (qtp243838020-6263) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 512884 INFO  (qtp243838020-6263) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 512884 INFO  (qtp243838020-6263) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=13, maxMergeAtOnceExplicit=50, maxMergedSegmentMB=87.9814453125, floorSegmentMB=0.9130859375, forceMergeDeletesPctAllowed=28.713299927934685, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.17693655700016617
   [junit4]   2> 512891 INFO  (qtp243838020-6263) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@19a4b4d1[collection1_shard1_replica_t0] main]
   [junit4]   2> 512909 INFO  (qtp243838020-6263) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 512915 INFO  (qtp243838020-6263) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 512916 INFO  (qtp243838020-6263) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 512916 INFO  (searcherExecutor-3203-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard1_replica_t0] Registered new searcher Searcher@19a4b4d1[collection1_shard1_replica_t0] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 512918 INFO  (qtp243838020-6263) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1572991999683854336
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:51124/collection1_shard1_replica_t0/
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.SyncStrategy http://127.0.0.1:51124/collection1_shard1_replica_t0/ has no replicas
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 512959 INFO  (qtp243838020-6263) [    ] o.a.s.c.ZkController collection1_shard1_replica_t0 stopping background replication from leader
   [junit4]   2> 512971 INFO  (qtp243838020-6263) [    ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:51124/collection1_shard1_replica_t0/ shard1
   [junit4]   2> 513135 INFO  (qtp243838020-6263) [    ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 513135 INFO  (qtp243838020-6263) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_t0&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=TLOG} status=0 QTime=1447
   [junit4]   2> 513140 INFO  (qtp1541802432-6236) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:51124_&action=ADDREPLICA&collection=collection1&shard=shard1&type=TLOG&wt=javabin&version=2} status=0 QTime=1457
   [junit4]   2> 513696 INFO  (OverseerCollectionConfigSetProcessor-98311999340347396-127.0.0.1:51089_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000006 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 513977 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-3-001 of type TLOG
   [junit4]   2> 513979 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 513980 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@4606b894{/,null,AVAILABLE}
   [junit4]   2> 513981 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@6c2651df{HTTP/1.1,[http/1.1]}{127.0.0.1:51137}
   [junit4]   2> 513982 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server Started @515363ms
   [junit4]   2> 513982 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\tempDir-001/jetty3, hostPort=51137, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-3-001\cores, replicaType=TLOG}
   [junit4]   2> 513982 ERROR (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 513982 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 513982 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 513982 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 513982 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T12:34:31.075496100Z
   [junit4]   2> 513993 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d7000d, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 513994 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 513994 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-3-001\solr.xml
   [junit4]   2> 514037 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 514042 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51082/solr
   [junit4]   2> 514047 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d7000e, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 514067 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 514070 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 514073 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51137_
   [junit4]   2> 514076 INFO  (zkCallback-991-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 514076 INFO  (zkCallback-997-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 514077 INFO  (zkCallback-1003-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 514077 INFO  (zkCallback-979-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 514077 INFO  (zkCallback-986-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 514172 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 514180 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 514180 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 514183 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-3-001\cores
   [junit4]   2> 514234 INFO  (qtp1541802432-6237) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:51137_&action=ADDREPLICA&collection=collection1&shard=shard2&type=TLOG&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 514254 INFO  (OverseerThreadFactory-3178-thread-5) [    ] o.a.s.c.AddReplicaCmd Node Identified 127.0.0.1:51137_ for creating new replica
   [junit4]   2> 514259 INFO  (qtp1503993007-6301) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_t1&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=TLOG
   [junit4]   2> 514260 INFO  (qtp1503993007-6301) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 515277 INFO  (qtp1503993007-6301) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 515306 INFO  (qtp1503993007-6301) [    ] o.a.s.s.IndexSchema [collection1_shard2_replica_t1] Schema name=test
   [junit4]   2> 515386 INFO  (qtp1503993007-6301) [    ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 515428 INFO  (qtp1503993007-6301) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_t1' using configuration from collection collection1, trusted=true
   [junit4]   2> 515429 INFO  (qtp1503993007-6301) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard2.replica_t1' (registry 'solr.core.collection1.shard2.replica_t1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 515429 INFO  (qtp1503993007-6301) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 515430 INFO  (qtp1503993007-6301) [    ] o.a.s.c.SolrCore [[collection1_shard2_replica_t1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-3-001\cores\collection1_shard2_replica_t1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-3-001\cores\collection1_shard2_replica_t1\data\]
   [junit4]   2> 515444 INFO  (qtp1503993007-6301) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=32, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=90.5986328125, floorSegmentMB=0.8515625, forceMergeDeletesPctAllowed=29.909096492685613, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 515451 WARN  (qtp1503993007-6301) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 515543 INFO  (qtp1503993007-6301) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 515543 INFO  (qtp1503993007-6301) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 515544 INFO  (qtp1503993007-6301) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 515545 INFO  (qtp1503993007-6301) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 515549 INFO  (qtp1503993007-6301) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=13, maxMergeAtOnceExplicit=50, maxMergedSegmentMB=87.9814453125, floorSegmentMB=0.9130859375, forceMergeDeletesPctAllowed=28.713299927934685, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.17693655700016617
   [junit4]   2> 515550 INFO  (qtp1503993007-6301) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@25479248[collection1_shard2_replica_t1] main]
   [junit4]   2> 515552 INFO  (qtp1503993007-6301) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 515553 INFO  (qtp1503993007-6301) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 515553 INFO  (qtp1503993007-6301) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 515553 INFO  (qtp1503993007-6301) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1572992002446852096
   [junit4]   2> 515557 INFO  (searcherExecutor-3214-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard2_replica_t1] Registered new searcher Searcher@25479248[collection1_shard2_replica_t1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 515560 INFO  (qtp1503993007-6301) [    ] o.a.s.c.ZkController Core needs to recover:collection1_shard2_replica_t1
   [junit4]   2> 515561 INFO  (updateExecutor-1000-thread-1) [    ] o.a.s.u.DefaultSolrCoreState Running recovery
   [junit4]   2> 515562 INFO  (qtp1503993007-6301) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_t1&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=TLOG} status=0 QTime=1304
   [junit4]   2> 515563 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
   [junit4]   2> 515565 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
   [junit4]   2> 515565 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.ZkController collection1_shard2_replica_t1 stopping background replication from leader
   [junit4]   2> 515565 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard2_replica_t1]
   [junit4]   2> 515565 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 515565 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard2_replica_t1] as recovering, leader is [http://127.0.0.1:51110/collection1_shard2_replica_t0/] and I am [http://127.0.0.1:51137/collection1_shard2_replica_t1/]
   [junit4]   2> 515566 INFO  (qtp1541802432-6237) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:51137_&action=ADDREPLICA&collection=collection1&shard=shard2&type=TLOG&wt=javabin&version=2} status=0 QTime=1331
   [junit4]   2> 515571 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:51110]; [WaitForState: action=PREPRECOVERY&core=collection1_shard2_replica_t0&nodeName=127.0.0.1:51137_&coreNodeName=core_node3&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 515581 INFO  (qtp1541802432-6239) [    ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node3, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true, maxTime: 183 s
   [junit4]   2> 515581 INFO  (qtp1541802432-6239) [    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_t0, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:51137_, coreNodeName=core_node3, onlyIfActiveCheckResult=false, nodeProps: core_node3:{"core":"collection1_shard2_replica_t1","base_url":"http://127.0.0.1:51137","node_name":"127.0.0.1:51137_","state":"down","type":"TLOG"}
   [junit4]   2> 516277 INFO  (OverseerCollectionConfigSetProcessor-98311999340347396-127.0.0.1:51089_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000008 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 516372 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 4 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-4-001 of type TLOG
   [junit4]   2> 516373 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 516374 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3e933b2d{/,null,AVAILABLE}
   [junit4]   2> 516377 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@17431c58{HTTP/1.1,[http/1.1]}{127.0.0.1:51151}
   [junit4]   2> 516378 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.Server Started @517758ms
   [junit4]   2> 516378 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\tempDir-001/jetty4, hostPort=51151, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-4-001\cores, replicaType=TLOG}
   [junit4]   2> 516378 ERROR (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 516378 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 516378 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 516378 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 516378 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T12:34:33.471957400Z
   [junit4]   2> 516402 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d70010, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 516402 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 516402 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-4-001\solr.xml
   [junit4]   2> 516407 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 516411 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51082/solr
   [junit4]   2> 516420 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d463e15d70011, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 516431 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4)
   [junit4]   2> 516433 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 516435 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51151_
   [junit4]   2> 516438 INFO  (zkCallback-991-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 516439 INFO  (zkCallback-997-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 516440 INFO  (zkCallback-1010-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 516440 INFO  (zkCallback-979-thread-2) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 516440 INFO  (zkCallback-986-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 516440 INFO  (zkCallback-1003-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 516524 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 516529 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 516529 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 516533 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-4-001\cores
   [junit4]   2> 516576 INFO  (qtp1541802432-6234) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:51151_&action=ADDREPLICA&collection=collection1&shard=shard1&type=TLOG&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 516580 INFO  (OverseerThreadFactory-3178-thread-5) [    ] o.a.s.c.AddReplicaCmd Node Identified 127.0.0.1:51151_ for creating new replica
   [junit4]   2> 516582 INFO  (qtp1541802432-6239) [    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_t0, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:51137_, coreNodeName=core_node3, onlyIfActiveCheckResult=false, nodeProps: core_node3:{"core":"collection1_shard2_replica_t1","base_url":"http://127.0.0.1:51137","node_name":"127.0.0.1:51137_","state":"recovering","type":"TLOG"}
   [junit4]   2> 516582 INFO  (qtp1541802432-6239) [    ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node3, state: recovering, checkLive: true, onlyIfLeader: true for: 1 seconds.
   [junit4]   2> 516582 INFO  (qtp1541802432-6239) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:51137_&onlyIfLeaderActive=true&core=collection1_shard2_replica_t0&coreNodeName=core_node3&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=1000
   [junit4]   2> 516582 INFO  (qtp1619708446-6326) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_t1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=TLOG
   [junit4]   2> 516584 INFO  (qtp1619708446-6326) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 517108 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Starting Replication Recovery.
   [junit4]   2> 517108 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:51110/collection1_shard2_replica_t0/].
   [junit4]   2> 517111 INFO  (qtp1541802432-6236) [    ] o.a.s.u.DirectUpdateHandler2 start commit{_version_=1572992004080533504,optimize=false,openSearcher=false,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 517116 INFO  (qtp1541802432-6236) [    ] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 517118 INFO  (qtp1541802432-6236) [    ] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 517118 INFO  (qtp1541802432-6236) [    ] o.a.s.u.p.LogUpdateProcessorFactory [collection1_shard2_replica_t0]  webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&commit_end_point=true&wt=javabin&version=2}{commit=} 0 6
   [junit4]   2> 517124 INFO  (qtp1541802432-6237) [    ] o.a.s.c.S.Request [collection1_shard2_replica_t0]  webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=0
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.h.IndexFetcher Master's generation: 1
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.h.IndexFetcher Master's version: 0
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.h.IndexFetcher Slave's generation: 1
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.h.IndexFetcher Slave's version: 0
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Replication Recovery was successful.
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.ZkController collection1_shard2_replica_t1 starting background replication from leader
   [junit4]   2> 517125 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.ReplicateFromLeader Will start replication from leader with poll interval: 00:00:03
   [junit4]   2> 517126 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.h.ReplicationHandler Poll scheduled at an interval of 3000ms
   [junit4]   2> 517139 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery.
   [junit4]   2> 517139 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1572992004109893632
   [junit4]   2> 517139 INFO  (recoveryExecutor-1001-thread-1) [    ] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true]
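   The sequence above shows a TLOG replica completing recovery: buffer updates, publish the core as recovering, send a prep-recovery request to the leader, replicate the index, register as active, and resume background replication from the leader on a 3-second poll. The sketch below restates that sequence as plain Java for readability; the interfaces and method names (Leader, LocalCore, prepRecovery, replicateFrom, and so on) are illustrative placeholders, not Solr's actual RecoveryStrategy API.

   // Hedged sketch of the TLOG-replica recovery sequence visible in the log above
   // (buffer updates -> publish RECOVERING -> prep-recovery on leader ->
   //  replicate index -> register ACTIVE -> resume polling the leader).
   // All names here are illustrative, not Solr's real classes or methods.
   public class TlogRecoverySketch {
     interface Leader {
       void prepRecovery(String coreNodeName);    // leader waits until it sees this core as RECOVERING
       long indexVersion();                        // leader's index version, used to decide whether to fetch
     }
     interface LocalCore {
       void bufferUpdates();                       // transaction log starts buffering incoming updates
       void publishState(String state);            // write the replica state to ZooKeeper
       void replicateFrom(Leader leader);          // full index fetch if the versions differ
       void startBackgroundReplication(int pollMs);// resume periodic polling of the leader
     }

     static void recover(LocalCore core, Leader leader, String coreNodeName) {
       core.bufferUpdates();
       core.publishState("recovering");
       leader.prepRecovery(coreNodeName);
       core.replicateFrom(leader);
       core.publishState("active");
       core.startBackgroundReplication(3000);      // 3s poll interval, matching the log above
     }
   }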
   [junit4]   2> 517638 INFO  (qtp1619708446-6326) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 517656 INFO  (qtp1619708446-6326) [    ] o.a.s.s.IndexSchema [collection1_shard1_replica_t1] Schema name=test
   [junit4]   2> 517719 INFO  (qtp1619708446-6326) [    ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 517931 INFO  (qtp1619708446-6326) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_t1' using configuration from collection collection1, trusted=true
   [junit4]   2> 517932 INFO  (qtp1619708446-6326) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_t1' (registry 'solr.core.collection1.shard1.replica_t1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@68288224
   [junit4]   2> 517932 INFO  (qtp1619708446-6326) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 517933 INFO  (qtp1619708446-6326) [    ] o.a.s.c.SolrCore [[collection1_shard1_replica_t1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-4-001\cores\collection1_shard1_replica_t1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001\shard-4-001\cores\collection1_shard1_replica_t1\data\]
   [junit4]   2> 517945 INFO  (qtp1619708446-6326) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=32, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=90.5986328125, floorSegmentMB=0.8515625, forceMergeDeletesPctAllowed=29.909096492685613, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 517974 WARN  (qtp1619708446-6326) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 518104 INFO  (qtp1619708446-6326) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 518104 INFO  (qtp1619708446-6326) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 518106 INFO  (qtp1619708446-6326) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 518106 INFO  (qtp1619708446-6326) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 518109 INFO  (qtp1619708446-6326) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=13, maxMergeAtOnceExplicit=50, maxMergedSegmentMB=87.9814453125, floorSegmentMB=0.9130859375, forceMergeDeletesPctAllowed=28.713299927934685, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.17693655700016617
   [junit4]   2> 518688 INFO  (qtp1619708446-6326) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@68e817a8[collection1_shard1_replica_t1] main]
   [junit4]   2> 518691 INFO  (qtp1619708446-6326) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 518691 INFO  (qtp1619708446-6326) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 518692 INFO  (qtp1619708446-6326) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 518694 INFO  (searcherExecutor-3227-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard1_replica_t1] Registered new searcher Searcher@68e817a8[collection1_shard1_replica_t1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 518694 INFO  (qtp1619708446-6326) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1572992005740429312
   [junit4]   2> 518699 INFO  (qtp1619708446-6326) [    ] o.a.s.c.ZkController Core needs to recover:collection1_shard1_replica_t1
   [junit4]   2> 518702 INFO  (qtp1619708446-6326) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_t1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=TLOG} status=0 QTime=2120
   [junit4]   2> 518705 INFO  (qtp1541802432-6234) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:51151_&action=ADDREPLICA&collection=collection1&shard=shard1&type=TLOG&wt=javabin&version=2} status=0 QTime=2128
   [junit4]   2> 518709 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.SolrTestCaseJ4 ###Starting testSplitShardWithRule
   [junit4]   2> 518709 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractFullDistribZkTestBase Wait for recoveries to finish - wait 15 for each attempt
   [junit4]   2> 518710 INFO  (TEST-ShardSplitTest.testSplitShardWithRule-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection: collection1 failOnTimeout:true timeout (sec):15
   [junit4]   2> 518710 INFO  (updateExecutor-1007-thread-1) [    ] o.a.s.u.DefaultSolrCoreState Running recovery
   [junit4]   2> 518711 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
   [junit4]   2> 518712 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
   [junit4]   2> 518712 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.c.ZkController collection1_shard1_replica_t1 stopping background replication from leader
   [junit4]   2> 518712 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard1_replica_t1]
   [junit4]   2> 518712 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 518712 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard1_replica_t1] as recovering, leader is [http://127.0.0.1:51124/collection1_shard1_replica_t0/] and I am [http://127.0.0.1:51151/collection1_shard1_replica_t1/]
   [junit4]   2> 518716 INFO  (recoveryExecutor-1008-thread-1) [    ] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:51124]; [WaitForState: action=PREPRECOVERY&core=collection1_shard1_replica_t0&nodeName=127.0.0.1:51151_&coreNodeName=core_node4&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 518723 INFO  (qtp243838020-6265) [    ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node4, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true, maxTime: 183 s
   [junit4]   2> 518724 INFO  (qtp243838020-6265) [    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_t0, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:51151_, coreNodeName=core_node4, onlyIfActiveCheckResult=false, nodeProps: co

[...truncated too long message...]

plica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@76f7c843
   [junit4]   2> 1162669 INFO  (zkCallback-1195-thread-4) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard2.replica_n1, tag=1995950147
   [junit4]   2> 1162672 INFO  (zkCallback-1195-thread-4) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard2.leader, tag=1995950147
   [junit4]   2> 1162673 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Stopped ServerConnector@31cee8e3{HTTP/1.1,[http/1.1]}{127.0.0.1:0}
   [junit4]   2> 1162674 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@16d39673{/,null,UNAVAILABLE}
   [junit4]   2> 1162674 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ChaosMonkey monkey: stop jetty! 52441
   [junit4]   2> 1162675 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1681861885
   [junit4]   2> 1162675 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node, tag=null
   [junit4]   2> 1162675 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm, tag=null
   [junit4]   2> 1162675 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty, tag=null
   [junit4]   2> 1162677 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 1162679 INFO  (coreCloseExecutor-3858-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@b770abb
   [junit4]   2> 1162682 INFO  (zkCallback-1202-thread-5) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1162684 INFO  (coreCloseExecutor-3858-thread-1) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1, tag=192350907
   [junit4]   2> 1162686 INFO  (coreCloseExecutor-3858-thread-1) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader, tag=192350907
   [junit4]   2> 1162687 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.Overseer Overseer (id=98312011496751122-127.0.0.1:52441_-n_0000000004) closing
   [junit4]   2> 1162687 INFO  (OverseerStateUpdate-98312011496751122-127.0.0.1:52441_-n_0000000004) [    ] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:52441_
   [junit4]   2> 1162692 WARN  (zkCallback-1202-thread-5) [    ] o.a.s.c.c.ZkStateReader ZooKeeper watch triggered, but Solr cannot talk to ZK: [KeeperErrorCode = Session expired for /live_nodes]
   [junit4]   2> 1162695 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.AbstractConnector Stopped ServerConnector@6ec4e51f{HTTP/1.1,[http/1.1]}{127.0.0.1:0}
   [junit4]   2> 1162695 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@7be149e{/,null,UNAVAILABLE}
   [junit4]   2> 1162697 ERROR (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 1162699 INFO  (TEST-ShardSplitTest.testSplitWithChaosMonkey-seed#[293125BE5DA80AC6]) [    ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:52314 52314
   [junit4]   2> 1162700 INFO  (Thread-2089) [    ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:52314 52314
   [junit4]   2> 1162700 WARN  (Thread-2089) [    ] o.a.s.c.ZkTestServer Watch limit violations:
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2>
   [junit4]   2> 8 /solr/aliases.json
   [junit4]   2> 6 /solr/security.json
   [junit4]   2> 6 /solr/configs/conf1
   [junit4]   2>
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2>
   [junit4]   2> 8 /solr/clusterprops.json
   [junit4]   2> 8 /solr/clusterstate.json
   [junit4]   2> 6 /solr/collections/collection1/state.json
   [junit4]   2> 2 /solr/overseer_elect/election/98312011496751113-127.0.0.1:52367_-n_0000000001
   [junit4]   2>
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2>
   [junit4]   2> 8 /solr/collections
   [junit4]   2> 7 /solr/live_nodes
   [junit4]   2> 4 /solr/overseer/collection-queue-work
   [junit4]   2> 3 /solr/overseer/queue
   [junit4]   2> 2 /solr/overseer/queue-work
   [junit4]   2>
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=ShardSplitTest -Dtests.method=testSplitWithChaosMonkey -Dtests.seed=293125BE5DA80AC6 -Dtests.slow=true -Dtests.locale=ksh -Dtests.timezone=America/Bahia -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [junit4] FAILURE  474s J0 | ShardSplitTest.testSplitWithChaosMonkey <<<
   [junit4]    > Throwable #1: java.lang.AssertionError: There are still nodes recoverying - waited for 330 seconds
   [junit4]    > at __randomizedtesting.SeedInfo.seed([293125BE5DA80AC6:A216F66F1CAEA142]:0)
   [junit4]    > at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:183)
   [junit4]    > at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:140)
   [junit4]    > at org.apache.solr.cloud.AbstractDistribZkTestBase.waitForRecoveriesToFinish(AbstractDistribZkTestBase.java:135)
   [junit4]    > at org.apache.solr.cloud.AbstractFullDistribZkTestBase.waitForRecoveriesToFinish(AbstractFullDistribZkTestBase.java:907)
   [junit4]    > at org.apache.solr.cloud.ShardSplitTest.testSplitWithChaosMonkey(ShardSplitTest.java:436)
   [junit4]    > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]    > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   [junit4]    > at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]    > at java.base/java.lang.reflect.Method.invoke(Method.java:564)
   [junit4]    > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
   [junit4]    > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
   [junit4]    > at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> NOTE: leaving temporary files on disk at: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.ShardSplitTest_293125BE5DA80AC6-001
   [junit4]   2> Jul 15, 2017 12:45:19 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 2 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=CheapBastard, sim=RandomSimilarity(queryNorm=false): {}, locale=ksh, timezone=America/Bahia
   [junit4]   2> NOTE: Windows 10 10.0 amd64/Oracle Corporation 9-ea (64-bit)/cpus=3,threads=1,free=163577864,total=536870912
   [junit4]   2> NOTE: All tests run in this JVM: [SolrShardReporterTest, TestXIncludeConfig, TestHighFrequencyDictionaryFactory, TestDFRSimilarityFactory, TestTrieFacet, TestStreamBody, QueryElevationComponentTest, XmlUpdateRequestHandlerTest, QueryParsingTest, AliasIntegrationTest, CdcrBootstrapTest, TestRawResponseWriter, TestCSVResponseWriter, DistributedFacetPivotLongTailTest, MinimalSchemaTest, LeaderElectionContextKeyTest, TestSolrCoreProperties, CursorPagingTest, TestDeleteCollectionOnDownNodes, TestLocalFSCloudBackupRestore, DistributedFacetExistsSmallTest, DistributedSuggestComponentTest, TestClassicSimilarityFactory, SegmentsInfoRequestHandlerTest, TestCorePropertiesReload, TestSchemalessBufferedUpdates, AnalyticsQueryTest, PropertiesRequestHandlerTest, ReplaceNodeTest, TestSolrCoreParser, TestComplexPhraseLeadingWildcard, TestFilteredDocIdSet, ConnectionReuseTest, DistribCursorPagingTest, TestReqParamsAPI, TestExceedMaxTermLength, SpatialRPTFieldTypeTest, TestDynamicFieldResource, TestConfigOverlay, TestApiFramework, TestHashPartitioner, InfoHandlerTest, TestDownShardTolerantSearch, TestCloudPseudoReturnFields, ShufflingReplicaListTransformerTest, CSVRequestHandlerTest, SolrGraphiteReporterTest, SharedFSAutoReplicaFailoverUtilsTest, TestDistributedStatsComponentCardinality, IndexSchemaTest, TestPseudoReturnFields, RecoveryZkTest, SolrIndexMetricsTest, TestNestedDocsSort, TestSolrCloudWithKerberosAlt, TestSolrDeletionPolicy2, ClusterStateUpdateTest, TestCodecSupport, PeerSyncTest, TestSmileRequest, V2StandaloneTest, TestStressCloudBlindAtomicUpdates, AddSchemaFieldsUpdateProcessorFactoryTest, TestSearcherReuse, CdcrVersionReplicationTest, SliceStateTest, TestFieldTypeResource, TestRandomRequestDistribution, TestReplicationHandler, HighlighterTest, PathHierarchyTokenizerFactoryTest, HdfsWriteToMultipleCollectionsTest, ClusterStateTest, ShardSplitTest]
   [junit4] Completed [231/728 (1!)] on J0 in 659.93s, 5 tests, 1 failure <<< FAILURES!
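   For context on the failure itself: the assertion originates in AbstractDistribZkTestBase.waitForRecoveriesToFinish, which polls the collection's replica states and fails once the timeout (330 seconds here) elapses while any replica is still recovering. Below is a minimal sketch of that wait pattern, assuming a hypothetical ReplicaStateSource instead of Solr's real ZkStateReader; it is an illustration, not the test framework's actual code.

   // Simplified polling loop behind "There are still nodes recoverying - waited for N seconds".
   // ReplicaStateSource is a hypothetical stand-in for reading replica states from ZooKeeper.
   import java.util.concurrent.TimeUnit;

   public class RecoveryWaitSketch {
     interface ReplicaStateSource {
       /** Returns true if any replica of the collection is still recovering or down. */
       boolean hasRecoveringReplicas(String collection);
     }

     static void waitForRecoveriesToFinish(ReplicaStateSource source, String collection,
                                           long timeoutSeconds) throws InterruptedException {
       long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(timeoutSeconds);
       while (source.hasRecoveringReplicas(collection)) {
         if (System.nanoTime() - deadline > 0) {
           throw new AssertionError(
               "There are still nodes recoverying - waited for " + timeoutSeconds + " seconds");
         }
         Thread.sleep(1000); // poll once per second until recovery finishes or the deadline passes
       }
     }
   }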

[...truncated 38087 lines...]


---------------------------------------------------------------------
To unsubscribe, e-mail: [hidden email]
For additional commands, e-mail: [hidden email]

[JENKINS-EA] Lucene-Solr-master-Windows (32bit/jdk-9-ea+173) - Build # 6749 - Still Unstable!

Policeman Jenkins Server-2
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/6749/
Java: 32bit/jdk-9-ea+173 -client -XX:+UseConcMarkSweepGC

1 tests failed.
FAILED:  org.apache.solr.cloud.TestStressInPlaceUpdates.stressTest

Error Message:
Captured an uncaught exception in thread: Thread[id=24987, name=WRITER11, state=RUNNABLE, group=TGRP-TestStressInPlaceUpdates]

Stack Trace:
com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught exception in thread: Thread[id=24987, name=WRITER11, state=RUNNABLE, group=TGRP-TestStressInPlaceUpdates]
        at __randomizedtesting.SeedInfo.seed([309721A0D7F0E5CA:5BF1FE0DE9253130]:0)
Caused by: java.lang.RuntimeException: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:55947/lk_n/m/collection1: java.base/java.util.ArrayList cannot be cast to java.base/java.lang.Integer
        at __randomizedtesting.SeedInfo.seed([309721A0D7F0E5CA]:0)
        at org.apache.solr.cloud.TestStressInPlaceUpdates$1.run(TestStressInPlaceUpdates.java:306)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:55947/lk_n/m/collection1: java.base/java.util.ArrayList cannot be cast to java.base/java.lang.Integer
        at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:626)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:252)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:241)
        at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:178)
        at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:195)
        at org.apache.solr.cloud.TestStressInPlaceUpdates.addDocAndGetVersion(TestStressInPlaceUpdates.java:549)
        at org.apache.solr.cloud.TestStressInPlaceUpdates$1.run(TestStressInPlaceUpdates.java:271)
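The root cause reported here is a ClassCastException while the server processed an in-place update: a field value arrived as a java.util.ArrayList where the update path expected a single Integer. The snippet below is only an illustration of that failure shape with a hypothetical field value; it is not Solr's update code, it simply reproduces the same exception message.

// Illustrative only: an unchecked cast of a multi-valued field value to Integer,
// which throws the same "ArrayList cannot be cast to Integer" error seen above.
import java.util.ArrayList;
import java.util.List;

public class CastFailureSketch {
  public static void main(String[] args) {
    // The value unexpectedly arrives as a list instead of a single int.
    Object fieldValue = new ArrayList<>(List.of(1, 2, 3));
    // Casting the whole list to Integer fails at runtime with ClassCastException.
    Integer singleValue = (Integer) fieldValue;
    System.out.println(singleValue);
  }
}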




Build Log:
[...truncated 1716 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\test\temp\junit4-J1-20170715_201031_5893507443802137470312.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\test\temp\junit4-J0-20170715_201031_5893093277807200172212.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 301 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\test\temp\junit4-J1-20170715_201604_2034294026956627605545.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\test\temp\junit4-J0-20170715_201604_20313595729875291667234.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 1052 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\common\test\temp\junit4-J1-20170715_201815_9986062790494088235262.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\common\test\temp\junit4-J0-20170715_201815_99818137005487821060652.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 217 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\icu\test\temp\junit4-J1-20170715_202001_82716880766855695045527.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\icu\test\temp\junit4-J0-20170715_202001_82713399871641057378996.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 238 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\kuromoji\test\temp\junit4-J0-20170715_202008_728426467011020226901.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\kuromoji\test\temp\junit4-J1-20170715_202008_7299981007964455211026.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 147 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\morfologik\test\temp\junit4-J0-20170715_202023_50616453256801897077901.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\morfologik\test\temp\junit4-J1-20170715_202023_506164038952126146201.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 162 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\phonetic\test\temp\junit4-J1-20170715_202026_97518225682932860458129.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\phonetic\test\temp\junit4-J0-20170715_202026_97510846752765692610696.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 144 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\smartcn\test\temp\junit4-J0-20170715_202032_10612173316303545547671.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\smartcn\test\temp\junit4-J1-20170715_202032_1069892858632538813187.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 152 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\stempel\test\temp\junit4-J0-20170715_202037_24216635139566120587854.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\stempel\test\temp\junit4-J1-20170715_202037_24217172800474804144017.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 160 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\uima\test\temp\junit4-J0-20170715_202040_00710604243154260749456.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 145 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\backward-codecs\test\temp\junit4-J1-20170715_202049_8444920726036940831327.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 25 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\backward-codecs\test\temp\junit4-J0-20170715_202049_8443883564308943072712.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 1217 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\benchmark\test\temp\junit4-J0-20170715_202103_38413658780739317383813.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\benchmark\test\temp\junit4-J1-20170715_202103_3844276146624010943232.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 219 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\classification\test\temp\junit4-J1-20170715_202110_9103910743929157635376.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\classification\test\temp\junit4-J0-20170715_202110_91012215693120724190408.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 255 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\test\temp\junit4-J0-20170715_202124_3408953906728363918362.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\test\temp\junit4-J1-20170715_202124_34011373062485686991172.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 229 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\demo\test\temp\junit4-J0-20170715_202232_9621372558479829400588.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\demo\test\temp\junit4-J1-20170715_202232_96212509572700468867110.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 166 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\expressions\test\temp\junit4-J1-20170715_202236_1047312812536963157465.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\expressions\test\temp\junit4-J0-20170715_202236_10416445739766300224071.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 208 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\facet\test\temp\junit4-J0-20170715_202240_36613628423705790339747.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\facet\test\temp\junit4-J1-20170715_202240_3665925298353027102384.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 172 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\grouping\test\temp\junit4-J0-20170715_202255_5558614440164011135560.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\grouping\test\temp\junit4-J1-20170715_202255_55510424778486053216299.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 236 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\highlighter\test\temp\junit4-J0-20170715_202303_0111717591186612250604.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\highlighter\test\temp\junit4-J1-20170715_202303_01117335135393516063339.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 156 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\join\test\temp\junit4-J1-20170715_202317_3646067205623075883149.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\join\test\temp\junit4-J0-20170715_202317_36415639031247850397757.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 140 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\memory\test\temp\junit4-J0-20170715_202327_6458248370651415886591.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\memory\test\temp\junit4-J1-20170715_202327_6455065115070244545497.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 177 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\misc\test\temp\junit4-J0-20170715_202333_06914377909398969716485.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\misc\test\temp\junit4-J1-20170715_202333_0693304627653466783799.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 305 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queries\test\temp\junit4-J0-20170715_202347_14911249229859142247364.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queries\test\temp\junit4-J1-20170715_202347_14917970799031352936107.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 217 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queryparser\test\temp\junit4-J1-20170715_202353_58112831788134454314206.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queryparser\test\temp\junit4-J0-20170715_202353_58115604987974422177645.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 200 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\replicator\test\temp\junit4-J1-20170715_202359_64416813263542594399145.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\replicator\test\temp\junit4-J0-20170715_202359_64410376823427639009094.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 202 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\sandbox\test\temp\junit4-J0-20170715_202408_38616163593345240649641.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\sandbox\test\temp\junit4-J1-20170715_202408_38616605905142345796811.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 208 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial-extras\test\temp\junit4-J0-20170715_202426_80310874081242848164337.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial-extras\test\temp\junit4-J1-20170715_202426_8037412766426566788035.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 160 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial3d\test\temp\junit4-J0-20170715_202435_6179697482287094947163.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 8 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial3d\test\temp\junit4-J1-20170715_202435_6179078283849850372067.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 139 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial\test\temp\junit4-J0-20170715_202452_57011778006336986064297.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 298 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\suggest\test\temp\junit4-J0-20170715_202456_3504705360942547121266.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\suggest\test\temp\junit4-J1-20170715_202456_35017851334189112345754.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3145 lines...]
   [junit4] Suite: org.apache.solr.cloud.TestStressInPlaceUpdates
   [junit4]   2> Creating dataDir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\init-core-data-001
   [junit4]   2> 2714363 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 2714366 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 2714367 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /lk_n/m
   [junit4]   2> 2714367 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.SolrTestCaseJ4 ####initCore
   [junit4]   2> 2714369 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.c.SolrResourceLoader [null] Added 2 libs to classloader, from paths: [/C:/Users/jenkins/workspace/Lucene-Solr-master-Windows/solr/core/src/test-files/solr/collection1/lib, /C:/Users/jenkins/workspace/Lucene-Solr-master-Windows/solr/core/src/test-files/solr/collection1/lib/classes]
   [junit4]   2> 2714421 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 2714438 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.s.IndexSchema [null] Schema name=inplace-updates
   [junit4]   2> 2714444 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 2714511 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2714513 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2714513 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2714518 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.c.SolrResourceLoader [null] Added 2 libs to classloader, from paths: [/C:/Users/jenkins/workspace/Lucene-Solr-master-Windows/solr/core/src/test-files/solr/collection1/lib, /C:/Users/jenkins/workspace/Lucene-Solr-master-Windows/solr/core/src/test-files/solr/collection1/lib/classes]
   [junit4]   2> 2714565 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 2714574 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.s.IndexSchema [collection1] Schema name=inplace-updates
   [junit4]   2> 2714577 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 2714577 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from instancedir C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1, trusted=true
   [junit4]   2> 2714578 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1' (registry 'solr.core.collection1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2714578 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 2714578 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\init-core-data-001\]
   [junit4]   2> 2714580 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=15, maxMergeAtOnceExplicit=13, maxMergedSegmentMB=50.208984375, floorSegmentMB=1.875, forceMergeDeletesPctAllowed=15.371526113683228, segmentsPerTier=43.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 2714583 WARN  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 2714664 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 2714664 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 2714665 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 2714665 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 2714666 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=49, maxMergeAtOnceExplicit=24, maxMergedSegmentMB=48.8349609375, floorSegmentMB=1.1123046875, forceMergeDeletesPctAllowed=18.409218279675667, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 2714666 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@1337a01[collection1] main]
   [junit4]   2> 2714667 WARN  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.r.ManagedResourceStorage Cannot write to config directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf; switching to use InMemory storage instead.
   [junit4]   2> 2714668 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 2714670 INFO  (searcherExecutor-9577-thread-1) [    ] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@1337a01[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 2714670 INFO  (coreLoadExecutor-9576-thread-1) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1573024506484621312
   [junit4]   2> 2714673 INFO  (SUITE-TestStressInPlaceUpdates-seed#[309721A0D7F0E5CA]-worker) [    ] o.a.s.SolrTestCaseJ4 ####initCore end
   [junit4]   2> 2714675 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 2714675 INFO  (Thread-4822) [    ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 2714676 INFO  (Thread-4822) [    ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 2714680 ERROR (Thread-4822) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 2714776 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkTestServer start zk server on port:55919
   [junit4]   2> 2714793 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 2714797 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\schema-inplace-updates.xml to /configs/conf1/schema.xml
   [junit4]   2> 2714799 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 2714801 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 2714803 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 2714805 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 2714806 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 2714808 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 2714810 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 2714812 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 2714813 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractZkTestCase put C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\src\test-files\solr\collection1\conf\synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 2714813 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 2715269 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 2715269 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1c2bc79{/lk_n/m,null,AVAILABLE}
   [junit4]   2> 2715273 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@dee16c{HTTP/1.1,[http/1.1]}{127.0.0.1:55926}
   [junit4]   2> 2715273 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server Started @2716045ms
   [junit4]   2> 2715273 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lk_n/m, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\tempDir-001/control/data, hostPort=55926, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\control-001\cores, replicaType=NRT}
   [junit4]   2> 2715275 ERROR (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 2715275 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 2715275 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 2715275 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 2715275 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T21:11:11.517983300Z
   [junit4]   2> 2715279 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 2715279 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\control-001\solr.xml
   [junit4]   2> 2715283 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 2715286 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:55919/solr
   [junit4]   2> 2715403 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 2715404 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:55926_lk_n%2Fm
   [junit4]   2> 2715405 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.Overseer Overseer (id=98314031655944196-127.0.0.1:55926_lk_n%2Fm-n_0000000000) starting
   [junit4]   2> 2715412 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:55926_lk_n%2Fm
   [junit4]   2> 2715413 INFO  (OverseerStateUpdate-98314031655944196-127.0.0.1:55926_lk_n%2Fm-n_0000000000) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 2715509 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2715514 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2715515 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2715517 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\control-001\cores
   [junit4]   2> 2715545 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 2715549 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:55919/solr ready
   [junit4]   2> 2715553 INFO  (qtp10320802-24831) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params replicationFactor=1&collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:55926_lk_n%252Fm&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 2715559 INFO  (OverseerThreadFactory-9587-thread-1) [    ] o.a.s.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 2715671 INFO  (qtp10320802-24832) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 2715671 INFO  (qtp10320802-24832) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 2715778 INFO  (zkCallback-4637-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 2716686 INFO  (qtp10320802-24832) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 2716696 INFO  (qtp10320802-24832) [    ] o.a.s.s.IndexSchema [control_collection_shard1_replica_n1] Schema name=inplace-updates
   [junit4]   2> 2716699 INFO  (qtp10320802-24832) [    ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 2716699 INFO  (qtp10320802-24832) [    ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from collection control_collection, trusted=true
   [junit4]   2> 2716699 INFO  (qtp10320802-24832) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2716700 INFO  (qtp10320802-24832) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 2716700 INFO  (qtp10320802-24832) [    ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\control-001\cores\control_collection_shard1_replica_n1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\control-001\cores\control_collection_shard1_replica_n1\data\]
   [junit4]   2> 2716704 INFO  (qtp10320802-24832) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=15, maxMergeAtOnceExplicit=13, maxMergedSegmentMB=50.208984375, floorSegmentMB=1.875, forceMergeDeletesPctAllowed=15.371526113683228, segmentsPerTier=43.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 2716707 WARN  (qtp10320802-24832) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 2716793 INFO  (qtp10320802-24832) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 2716793 INFO  (qtp10320802-24832) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 2716797 INFO  (qtp10320802-24832) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 2716797 INFO  (qtp10320802-24832) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 2716799 INFO  (qtp10320802-24832) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=49, maxMergeAtOnceExplicit=24, maxMergedSegmentMB=48.8349609375, floorSegmentMB=1.1123046875, forceMergeDeletesPctAllowed=18.409218279675667, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 2716799 INFO  (qtp10320802-24832) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@1df4080[control_collection_shard1_replica_n1] main]
   [junit4]   2> 2716801 INFO  (qtp10320802-24832) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 2716801 INFO  (qtp10320802-24832) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 2716802 INFO  (qtp10320802-24832) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 2716804 INFO  (searcherExecutor-9590-thread-1) [    ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@1df4080[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 2716804 INFO  (qtp10320802-24832) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1573024508722282496
   [junit4]   2> 2716810 INFO  (qtp10320802-24832) [    ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 2716810 INFO  (qtp10320802-24832) [    ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 2716810 INFO  (qtp10320802-24832) [    ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:55926/lk_n/m/control_collection_shard1_replica_n1/
   [junit4]   2> 2716810 INFO  (qtp10320802-24832) [    ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 2716810 INFO  (qtp10320802-24832) [    ] o.a.s.c.SyncStrategy http://127.0.0.1:55926/lk_n/m/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 2716810 INFO  (qtp10320802-24832) [    ] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 2716814 INFO  (qtp10320802-24832) [    ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:55926/lk_n/m/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 2716917 INFO  (zkCallback-4637-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 2716967 INFO  (qtp10320802-24832) [    ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 2716970 INFO  (qtp10320802-24832) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1300
   [junit4]   2> 2716974 INFO  (qtp10320802-24831) [    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 2717071 INFO  (zkCallback-4637-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 2717559 INFO  (OverseerCollectionConfigSetProcessor-98314031655944196-127.0.0.1:55926_lk_n%2Fm-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 2717974 INFO  (qtp10320802-24831) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={replicationFactor=1&collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:55926_lk_n%252Fm&wt=javabin&version=2} status=0 QTime=2420
   [junit4]   2> 2717984 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 2717985 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:55919/solr ready
   [junit4]   2> 2717985 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 2717987 INFO  (qtp10320802-24830) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params replicationFactor=1&collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 2717991 INFO  (OverseerThreadFactory-9587-thread-2) [    ] o.a.s.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 2717992 WARN  (OverseerThreadFactory-9587-thread-2) [    ] o.a.s.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 2718200 INFO  (qtp10320802-24830) [    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 2718200 INFO  (qtp10320802-24830) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={replicationFactor=1&collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=212
   [junit4]   2> 2719092 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-1-001 of type NRT
   [junit4]   2> 2719097 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 2719099 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@b57e09{/lk_n/m,null,AVAILABLE}
   [junit4]   2> 2719100 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@1e2606c{HTTP/1.1,[http/1.1]}{127.0.0.1:55947}
   [junit4]   2> 2719100 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server Started @2719872ms
   [junit4]   2> 2719100 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lk_n/m, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\tempDir-001/jetty1, hostPort=55947, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-1-001\cores}
   [junit4]   2> 2719101 ERROR (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 2719102 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 2719102 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 2719102 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 2719103 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T21:11:15.345799800Z
   [junit4]   2> 2719116 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d4817454a0007, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 2719116 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 2719117 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-1-001\solr.xml
   [junit4]   2> 2719121 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 2719129 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:55919/solr
   [junit4]   2> 2719135 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d4817454a0008, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 2719149 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 2719153 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 2719156 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:55947_lk_n%2Fm
   [junit4]   2> 2719159 INFO  (zkCallback-4649-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 2719159 INFO  (zkCallback-4637-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 2719160 INFO  (zkCallback-4644-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 2719495 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2719506 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2719506 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2719509 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-1-001\cores
   [junit4]   2> 2719569 INFO  (qtp10320802-24836) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:55947_lk_n%252Fm&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 2719573 INFO  (OverseerCollectionConfigSetProcessor-98314031655944196-127.0.0.1:55926_lk_n%2Fm-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 2719575 INFO  (OverseerThreadFactory-9587-thread-3) [    ] o.a.s.c.AddReplicaCmd Node Identified 127.0.0.1:55947_lk_n%2Fm for creating new replica
   [junit4]   2> 2719581 INFO  (qtp2504807-24880) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n0&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 2719582 INFO  (qtp2504807-24880) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 2720605 INFO  (qtp2504807-24880) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 2720628 INFO  (qtp2504807-24880) [    ] o.a.s.s.IndexSchema [collection1_shard1_replica_n0] Schema name=inplace-updates
   [junit4]   2> 2720633 INFO  (qtp2504807-24880) [    ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 2720633 INFO  (qtp2504807-24880) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n0' using configuration from collection collection1, trusted=true
   [junit4]   2> 2720633 INFO  (qtp2504807-24880) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n0' (registry 'solr.core.collection1.shard1.replica_n0') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2720633 INFO  (qtp2504807-24880) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 2720633 INFO  (qtp2504807-24880) [    ] o.a.s.c.SolrCore [[collection1_shard1_replica_n0] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-1-001\cores\collection1_shard1_replica_n0], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-1-001\cores\collection1_shard1_replica_n0\data\]
   [junit4]   2> 2720640 INFO  (qtp2504807-24880) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=15, maxMergeAtOnceExplicit=13, maxMergedSegmentMB=50.208984375, floorSegmentMB=1.875, forceMergeDeletesPctAllowed=15.371526113683228, segmentsPerTier=43.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 2720646 WARN  (qtp2504807-24880) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 2720811 INFO  (qtp2504807-24880) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 2720811 INFO  (qtp2504807-24880) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 2720817 INFO  (qtp2504807-24880) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 2720817 INFO  (qtp2504807-24880) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 2720820 INFO  (qtp2504807-24880) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=49, maxMergeAtOnceExplicit=24, maxMergedSegmentMB=48.8349609375, floorSegmentMB=1.1123046875, forceMergeDeletesPctAllowed=18.409218279675667, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 2720821 INFO  (qtp2504807-24880) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@48e267[collection1_shard1_replica_n0] main]
   [junit4]   2> 2720821 INFO  (qtp2504807-24880) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 2720821 INFO  (qtp2504807-24880) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 2720825 INFO  (qtp2504807-24880) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 2720825 INFO  (searcherExecutor-9601-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard1_replica_n0] Registered new searcher Searcher@48e267[collection1_shard1_replica_n0] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 2720825 INFO  (qtp2504807-24880) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1573024512938606592
   [junit4]   2> 2720840 INFO  (qtp2504807-24880) [    ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 2720841 INFO  (qtp2504807-24880) [    ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 2720841 INFO  (qtp2504807-24880) [    ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:55947/lk_n/m/collection1_shard1_replica_n0/
   [junit4]   2> 2720841 INFO  (qtp2504807-24880) [    ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 2720841 INFO  (qtp2504807-24880) [    ] o.a.s.c.SyncStrategy http://127.0.0.1:55947/lk_n/m/collection1_shard1_replica_n0/ has no replicas
   [junit4]   2> 2720841 INFO  (qtp2504807-24880) [    ] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 2720847 INFO  (qtp2504807-24880) [    ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:55947/lk_n/m/collection1_shard1_replica_n0/ shard1
   [junit4]   2> 2721001 INFO  (qtp2504807-24880) [    ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 2721010 INFO  (qtp2504807-24880) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n0&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1428
   [junit4]   2> 2721017 INFO  (qtp10320802-24836) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:55947_lk_n%252Fm&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1449
   [junit4]   2> 2721581 INFO  (OverseerCollectionConfigSetProcessor-98314031655944196-127.0.0.1:55926_lk_n%2Fm-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 2721898 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-2-001 of type NRT
   [junit4]   2> 2721898 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 2721900 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1948bbd{/lk_n/m,null,AVAILABLE}
   [junit4]   2> 2721901 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@1acd106{HTTP/1.1,[http/1.1]}{127.0.0.1:55960}
   [junit4]   2> 2721901 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server Started @2722673ms
   [junit4]   2> 2721901 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lk_n/m, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\tempDir-001/jetty2, hostPort=55960, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-2-001\cores}
   [junit4]   2> 2721902 ERROR (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 2721902 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 2721902 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 2721902 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 2721902 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T21:11:18.144910900Z
   [junit4]   2> 2721912 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d4817454a000a, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 2721912 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 2721913 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-2-001\solr.xml
   [junit4]   2> 2721920 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 2721927 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:55919/solr
   [junit4]   2> 2721957 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 2721963 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 2721965 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:55960_lk_n%2Fm
   [junit4]   2> 2721971 INFO  (zkCallback-4649-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 2721977 INFO  (zkCallback-4637-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 2721974 INFO  (zkCallback-4644-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 2721974 INFO  (zkCallback-4655-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 2722226 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2722240 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2722241 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2722245 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-2-001\cores
   [junit4]   2> 2722340 INFO  (qtp10320802-24828) [    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:55960_lk_n%252Fm&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 2722345 INFO  (OverseerThreadFactory-9587-thread-4) [    ] o.a.s.c.AddReplicaCmd Node Identified 127.0.0.1:55960_lk_n%2Fm for creating new replica
   [junit4]   2> 2722351 INFO  (qtp27177995-24910) [    ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 2722352 INFO  (qtp27177995-24910) [    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 2723386 INFO  (qtp27177995-24910) [    ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 2723406 INFO  (qtp27177995-24910) [    ] o.a.s.s.IndexSchema [collection1_shard1_replica_n1] Schema name=inplace-updates
   [junit4]   2> 2723412 INFO  (qtp27177995-24910) [    ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 2723412 INFO  (qtp27177995-24910) [    ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from collection collection1, trusted=true
   [junit4]   2> 2723413 INFO  (qtp27177995-24910) [    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca
   [junit4]   2> 2723414 INFO  (qtp27177995-24910) [    ] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 2723414 INFO  (qtp27177995-24910) [    ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-2-001\cores\collection1_shard1_replica_n1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-2-001\cores\collection1_shard1_replica_n1\data\]
   [junit4]   2> 2723420 INFO  (qtp27177995-24910) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=15, maxMergeAtOnceExplicit=13, maxMergedSegmentMB=50.208984375, floorSegmentMB=1.875, forceMergeDeletesPctAllowed=15.371526113683228, segmentsPerTier=43.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0
   [junit4]   2> 2723425 WARN  (qtp27177995-24910) [    ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 2723654 INFO  (qtp27177995-24910) [    ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 2723654 INFO  (qtp27177995-24910) [    ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 2723675 INFO  (qtp27177995-24910) [    ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 2723675 INFO  (qtp27177995-24910) [    ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 2723681 INFO  (qtp27177995-24910) [    ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=49, maxMergeAtOnceExplicit=24, maxMergedSegmentMB=48.8349609375, floorSegmentMB=1.1123046875, forceMergeDeletesPctAllowed=18.409218279675667, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
   [junit4]   2> 2723681 INFO  (qtp27177995-24910) [    ] o.a.s.s.SolrIndexSearcher Opening [Searcher@4aa92b[collection1_shard1_replica_n1] main]
   [junit4]   2> 2723685 INFO  (qtp27177995-24910) [    ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 2723686 INFO  (qtp27177995-24910) [    ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 2723688 INFO  (qtp27177995-24910) [    ] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 2723692 INFO  (searcherExecutor-9612-thread-1) [    ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@4aa92b[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 2723692 INFO  (qtp27177995-24910) [    ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1573024515944873984
   [junit4]   2> 2723698 INFO  (qtp27177995-24910) [    ] o.a.s.c.ZkController Core needs to recover:collection1_shard1_replica_n1
   [junit4]   2> 2723701 INFO  (updateExecutor-4652-thread-1) [    ] o.a.s.u.DefaultSolrCoreState Running recovery
   [junit4]   2> 2723702 INFO  (qtp27177995-24910) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1352
   [junit4]   2> 2723706 INFO  (recoveryExecutor-4653-thread-1) [    ] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
   [junit4]   2> 2723708 INFO  (recoveryExecutor-4653-thread-1) [    ] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
   [junit4]   2> 2723709 INFO  (qtp10320802-24828) [    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:55960_lk_n%252Fm&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1368
   [junit4]   2> 2723709 INFO  (recoveryExecutor-4653-thread-1) [    ] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard1_replica_n1]
   [junit4]   2> 2723709 INFO  (recoveryExecutor-4653-thread-1) [    ] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 2723709 INFO  (recoveryExecutor-4653-thread-1) [    ] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard1_replica_n1] as recovering, leader is [http://127.0.0.1:55947/lk_n/m/collection1_shard1_replica_n0/] and I am [http://127.0.0.1:55960/lk_n/m/collection1_shard1_replica_n1/]
   [junit4]   2> 2723730 INFO  (recoveryExecutor-4653-thread-1) [    ] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:55947/lk_n/m]; [WaitForState: action=PREPRECOVERY&core=collection1_shard1_replica_n0&nodeName=127.0.0.1:55960_lk_n%252Fm&coreNodeName=core_node2&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 2723732 INFO  (qtp2504807-24875) [    ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true, maxTime: 183 s
   [junit4]   2> 2723733 INFO  (qtp2504807-24875) [    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n0, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:55960_lk_n%2Fm, coreNodeName=core_node2, onlyIfActiveCheckResult=false, nodeProps: core_node2:{"core":"collection1_shard1_replica_n1","base_url":"http://127.0.0.1:55960/lk_n/m","node_name":"127.0.0.1:55960_lk_n%2Fm","state":"down","type":"NRT"}
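
(Side note: the sequence from "Core needs to recover" through the PrepRecoveryOp WaitForState line above is the usual recovery handshake — the new replica publishes itself as "recovering", asks the leader to wait for that state, then syncs. Conceptually, the harness then polls cluster state until every replica is active again. A hedged SolrJ-style sketch of such a wait loop follows; the ZooKeeper address/chroot, collection name and 330 s budget are assumptions for illustration, not the test's code:)

import java.util.Collections;
import java.util.Optional;
import java.util.concurrent.TimeUnit;

import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.common.cloud.DocCollection;
import org.apache.solr.common.cloud.Replica;
import org.apache.solr.common.cloud.ZkStateReader;

public class WaitForRecoverySketch {
    public static void main(String[] args) throws Exception {
        try (CloudSolrClient client = new CloudSolrClient.Builder(
                Collections.singletonList("127.0.0.1:55919"), Optional.of("/solr")).build()) {
            client.connect();
            ZkStateReader reader = client.getZkStateReader();

            long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(330);
            while (System.nanoTime() < deadline) {
                reader.forceUpdateCollection("collection1");          // refresh cached cluster state
                DocCollection coll = reader.getClusterState().getCollection("collection1");
                boolean allActive = coll.getReplicas().stream().allMatch(r ->
                        r.getState() == Replica.State.ACTIVE
                        && reader.getClusterState().liveNodesContain(r.getNodeName()));
                if (allActive) {
                    System.out.println("all replicas active");
                    return;                                           // recovery finished
                }
                Thread.sleep(1000);                                   // poll roughly once per second
            }
            throw new IllegalStateException("replicas still recovering after timeout");
        }
    }
}
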
   [junit4]   2> 2724346 INFO  (OverseerCollectionConfigSetProcessor-98314031655944196-127.0.0.1:55926_lk_n%2Fm-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000006 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 2724552 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-3-001 of type NRT
   [junit4]   2> 2724553 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 2724554 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@74ab8{/lk_n/m,null,AVAILABLE}
   [junit4]   2> 2724555 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@c0b7af{HTTP/1.1,[http/1.1]}{127.0.0.1:55974}
   [junit4]   2> 2724555 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.e.j.s.Server Started @2725327ms
   [junit4]   2> 2724555 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lk_n/m, solrconfig=solrconfig.xml, solr.data.dir=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\tempDir-001/jetty3, hostPort=55974, coreRootDirectory=C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\..\..\..\..\..\..\..\..\..\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-3-001\cores}
   [junit4]   2> 2724556 ERROR (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 2724557 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 2724560 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 2724560 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null, Default config dir: null
   [junit4]   2> 2724560 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-07-15T21:11:20.802485600Z
   [junit4]   2> 2724567 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 2724567 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.TestStressInPlaceUpdates_309721A0D7F0E5CA-001\shard-3-001\solr.xml
   [junit4]   2> 2724575 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@132f4ca, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 2724581 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[309721A0D7F0E5CA]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:55919/solr
   [junit4]   2> 2724587 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15d4817454a000e, likely client has closed socket
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:239)
   [junit4]   2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:203)
   [junit4]   2> at java.base/java.lang.Thread.run(Thread.java:844)
   [junit4]   2> 2724600 INFO  (TEST-TestStressInPlaceUpdates.stressTes

[...truncated too long message...]

emp\junit4-J1-20170715_211820_74117065087316815002371.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 21 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-solrj\test\temp\junit4-J0-20170715_211820_74113700317394745395551.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 1238 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-analysis-extras\test\temp\junit4-J1-20170715_212234_42218229878380768201947.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-analysis-extras\test\temp\junit4-J0-20170715_212234_42266620653479825185.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 548 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-analytics\test\temp\junit4-J0-20170715_212256_12417030303352755179659.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-analytics\test\temp\junit4-J1-20170715_212256_124841023994557483305.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 500 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-clustering\test\temp\junit4-J1-20170715_212342_37313934719078663679668.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-clustering\test\temp\junit4-J0-20170715_212342_3732408214448563833971.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 1130 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-dataimporthandler-extras\test\temp\junit4-J1-20170715_212424_3985193057736528463449.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-dataimporthandler-extras\test\temp\junit4-J0-20170715_212424_39813255418296709831037.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 566 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-dataimporthandler\test\temp\junit4-J0-20170715_212442_1881683042630193012065.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 30 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-dataimporthandler\test\temp\junit4-J1-20170715_212442_18816836152995873499234.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 501 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-cell\test\temp\junit4-J1-20170715_212528_7328387799060317838522.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-cell\test\temp\junit4-J0-20170715_212528_73215299499716560452460.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 506 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-langid\test\temp\junit4-J1-20170715_212548_9881560258791007127899.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-langid\test\temp\junit4-J0-20170715_212548_98810738797572536524330.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 593 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-ltr\test\temp\junit4-J1-20170715_212600_69812399207025089824751.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-ltr\test\temp\junit4-J0-20170715_212600_69816282121009833049688.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 538 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-uima\test\temp\junit4-J1-20170715_212650_40516221586818721441022.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-uima\test\temp\junit4-J0-20170715_212650_405694048745513377394.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 495 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-velocity\test\temp\junit4-J0-20170715_212712_72616155928455756340649.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 28933 lines...]



[JENKINS-EA] Lucene-Solr-master-Windows (32bit/jdk-9-ea+173) - Build # 6750 - Still Unstable!

Policeman Jenkins Server-2
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/6750/
Java: 32bit/jdk-9-ea+173 -client -XX:+UseSerialGC

3 tests failed.
FAILED:  org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest.test

Error Message:
Timeout occured while waiting response from server at: http://127.0.0.1:57317/collection1

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: Timeout occured while waiting response from server at: http://127.0.0.1:57317/collection1
        at __randomizedtesting.SeedInfo.seed([1D4C02BAFFE2FCAF:95183D60511E9157]:0)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:637)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:252)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:241)
        at org.apache.solr.client.solrj.impl.LBHttpSolrClient.doRequest(LBHttpSolrClient.java:483)
        at org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:413)
        at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1121)
        at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:862)
        at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:793)
        at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:178)
        at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:484)
        at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:463)
        at org.apache.solr.cloud.AbstractFullDistribZkTestBase.commit(AbstractFullDistribZkTestBase.java:1581)
        at org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest.test(ChaosMonkeyNothingIsSafeTest.java:213)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:564)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
        at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
        at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
        at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
        at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
        at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
        at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
        at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
        at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
        at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
        at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.net.SocketTimeoutException: Read timed out
        at java.base/java.net.SocketInputStream.socketRead0(Native Method)
        at java.base/java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
        at java.base/java.net.SocketInputStream.read(SocketInputStream.java:171)
        at java.base/java.net.SocketInputStream.read(SocketInputStream.java:141)
        at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:139)
        at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:155)
        at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:284)
        at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:140)
        at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:57)
        at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:261)
        at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:165)
        at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:167)
        at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:272)
        at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:124)
        at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:271)
        at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
        at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
        at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
        at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:523)
        ... 52 more
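
(Side note: the root cause above is a plain SocketTimeoutException inside SolrJ's HttpSolrClient while waiting for the commit response. As a hedged sketch only — the URL and timeout values below are assumptions, not what the test used — the client-side knobs involved are the connection and socket timeouts on the builder:)

import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class TimeoutSketch {
    public static void main(String[] args) throws Exception {
        // Illustrative values only; the test's actual timeouts are configured elsewhere.
        try (HttpSolrClient client = new HttpSolrClient.Builder("http://127.0.0.1:57317/collection1")
                .withConnectionTimeout(15000)   // ms allowed to establish the TCP connection
                .withSocketTimeout(120000)      // ms to wait for a response before "Read timed out"
                .build()) {
            client.commit();                    // the call that timed out in the trace above
        }
    }
}
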


FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest

Error Message:
9 threads leaked from SUITE scope at org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest:     1) Thread[id=17191, name=TEST-ChaosMonkeyNothingIsSafeTest.test-seed#[1D4C02BAFFE2FCAF]-SendThread(127.0.0.1:57246), state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/java.lang.Thread.sleep(Native Method)         at app//org.apache.zookeeper.ClientCnxnSocketNIO.cleanup(ClientCnxnSocketNIO.java:230)         at app//org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:1246)         at app//org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1170)    2) Thread[id=17193, name=zkCallback-2754-thread-1, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    3) Thread[id=17335, name=zkCallback-2754-thread-3, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    4) Thread[id=17336, name=zkCallback-2754-thread-4, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at 
java.base@9-ea/java.lang.Thread.run(Thread.java:844)    5) Thread[id=17192, name=TEST-ChaosMonkeyNothingIsSafeTest.test-seed#[1D4C02BAFFE2FCAF]-EventThread, state=WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.park(LockSupport.java:194)         at java.base@9-ea/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2062)         at java.base@9-ea/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:435)         at app//org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:501)    6) Thread[id=17190, name=Connection evictor, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/java.lang.Thread.sleep(Native Method)         at app//org.apache.http.impl.client.IdleConnectionEvictor$1.run(IdleConnectionEvictor.java:66)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    7) Thread[id=17334, name=zkCallback-2754-thread-2, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    8) Thread[id=17362, name=zkCallback-2754-thread-6, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    9) Thread[id=17338, name=zkCallback-2754-thread-5, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at 
java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 9 threads leaked from SUITE scope at org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest:
   1) Thread[id=17191, name=TEST-ChaosMonkeyNothingIsSafeTest.test-seed#[1D4C02BAFFE2FCAF]-SendThread(127.0.0.1:57246), state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/java.lang.Thread.sleep(Native Method)
        at app//org.apache.zookeeper.ClientCnxnSocketNIO.cleanup(ClientCnxnSocketNIO.java:230)
        at app//org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:1246)
        at app//org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1170)
   2) Thread[id=17193, name=zkCallback-2754-thread-1, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   3) Thread[id=17335, name=zkCallback-2754-thread-3, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   4) Thread[id=17336, name=zkCallback-2754-thread-4, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   5) Thread[id=17192, name=TEST-ChaosMonkeyNothingIsSafeTest.test-seed#[1D4C02BAFFE2FCAF]-EventThread, state=WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.park(LockSupport.java:194)
        at java.base@9-ea/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2062)
        at java.base@9-ea/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:435)
        at app//org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:501)
   6) Thread[id=17190, name=Connection evictor, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/java.lang.Thread.sleep(Native Method)
        at app//org.apache.http.impl.client.IdleConnectionEvictor$1.run(IdleConnectionEvictor.java:66)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   7) Thread[id=17334, name=zkCallback-2754-thread-2, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   8) Thread[id=17362, name=zkCallback-2754-thread-6, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   9) Thread[id=17338, name=zkCallback-2754-thread-5, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
        at __randomizedtesting.SeedInfo.seed([1D4C02BAFFE2FCAF]:0)


FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.ChaosMonkeyNothingIsSafeTest

Error Message:
There are still zombie threads that couldn't be terminated:    1) Thread[id=17191, name=TEST-ChaosMonkeyNothingIsSafeTest.test-seed#[1D4C02BAFFE2FCAF]-SendThread(127.0.0.1:57246), state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/java.lang.Thread.sleep(Native Method)         at app//org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1051)    2) Thread[id=17193, name=zkCallback-2754-thread-1, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    3) Thread[id=17335, name=zkCallback-2754-thread-3, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    4) Thread[id=17336, name=zkCallback-2754-thread-4, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    5) Thread[id=17334, name=zkCallback-2754-thread-2, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)  
       at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    6) Thread[id=17362, name=zkCallback-2754-thread-6, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)    7) Thread[id=17338, name=zkCallback-2754-thread-5, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]         at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)         at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)         at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)         at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)         at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)         at java.base@9-ea/java.lang.Thread.run(Thread.java:844)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: There are still zombie threads that couldn't be terminated:
   1) Thread[id=17191, name=TEST-ChaosMonkeyNothingIsSafeTest.test-seed#[1D4C02BAFFE2FCAF]-SendThread(127.0.0.1:57246), state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/java.lang.Thread.sleep(Native Method)
        at app//org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1051)
   2) Thread[id=17193, name=zkCallback-2754-thread-1, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   3) Thread[id=17335, name=zkCallback-2754-thread-3, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   4) Thread[id=17336, name=zkCallback-2754-thread-4, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   5) Thread[id=17334, name=zkCallback-2754-thread-2, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   6) Thread[id=17362, name=zkCallback-2754-thread-6, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
   7) Thread[id=17338, name=zkCallback-2754-thread-5, state=TIMED_WAITING, group=TGRP-ChaosMonkeyNothingIsSafeTest]
        at java.base@9-ea/jdk.internal.misc.Unsafe.park(Native Method)
        at java.base@9-ea/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:462)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:361)
        at java.base@9-ea/java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:937)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1085)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
        at java.base@9-ea/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base@9-ea/java.lang.Thread.run(Thread.java:844)
        at __randomizedtesting.SeedInfo.seed([1D4C02BAFFE2FCAF]:0)
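
For context on the frames above: the zkCallback-2754-* workers are all parked in ThreadPoolExecutor.getTask() -> SynchronousQueue.poll(timeout), which is simply what an idle worker of a cached-style executor looks like while it waits out its keep-alive. A minimal sketch of that state follows; the pool shape and timings are assumptions for illustration, not Solr's actual zkCallback executor configuration.

    import java.util.concurrent.SynchronousQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class ZkCallbackLeakSketch {
        public static void main(String[] args) throws InterruptedException {
            // Cached-style pool: unbounded max size, SynchronousQueue hand-off,
            // non-zero keep-alive (values here are assumed, not Solr's).
            ThreadPoolExecutor pool = new ThreadPoolExecutor(
                0, Integer.MAX_VALUE,
                60L, TimeUnit.SECONDS,
                new SynchronousQueue<Runnable>());

            pool.execute(() -> { /* a ZooKeeper watch callback would run here */ });
            Thread.sleep(100);

            // The finished worker now waits in SynchronousQueue.poll(keepAlive) inside
            // ThreadPoolExecutor.getTask() -- the same TIMED_WAITING frames reported above --
            // until the keep-alive expires or the pool is shut down.
            System.out.println("active=" + pool.getActiveCount() + " poolSize=" + pool.getPoolSize());

            // Shutting the executor down releases the idle worker promptly, which is what a
            // test teardown has to do before the thread-leak checker runs.
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.SECONDS);
        }
    }

The leak checker flags these threads only because they outlive the test; they are not deadlocked, just idle workers that have not yet hit their keep-alive timeout or been shut down.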




Build Log:
[...truncated 11 lines...]
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from git://git.apache.org/lucene-solr.git
        at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:812)
        at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1079)
        at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1110)
        at hudson.scm.SCM.checkout(SCM.java:495)
        at hudson.model.AbstractProject.checkout(AbstractProject.java:1212)
        at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:560)
        at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
        at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:485)
        at hudson.model.Run.execute(Run.java:1735)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:97)
        at hudson.model.Executor.run(Executor.java:415)
Caused by: hudson.plugins.git.GitException: org.eclipse.jgit.api.errors.TransportException: Software caused connection abort: socket write error
        at org.jenkinsci.plugins.gitclient.JGitAPIImpl$2.execute(JGitAPIImpl.java:619)
        at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:153)
        at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:146)
        at hudson.remoting.UserRequest.perform(UserRequest.java:181)
        at hudson.remoting.UserRequest.perform(UserRequest.java:52)
        at hudson.remoting.Request$2.run(Request.java:336)
        at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
        at ......remote call to Windows VBOX(Native Method)
        at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1554)
        at hudson.remoting.UserResponse.retrieve(UserRequest.java:281)
        at hudson.remoting.Channel.call(Channel.java:839)
        at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.execute(RemoteGitImpl.java:146)
        at sun.reflect.GeneratedMethodAccessor720.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.invoke(RemoteGitImpl.java:132)
        at com.sun.proxy.$Proxy57.execute(Unknown Source)
        at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:810)
        ... 11 more
Caused by: org.eclipse.jgit.api.errors.TransportException: Software caused connection abort: socket write error
        at org.eclipse.jgit.api.FetchCommand.call(FetchCommand.java:135)
        at org.jenkinsci.plugins.gitclient.JGitAPIImpl$2.execute(JGitAPIImpl.java:617)
        at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:153)
        at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:146)
        at hudson.remoting.UserRequest.perform(UserRequest.java:181)
        at hudson.remoting.UserRequest.perform(UserRequest.java:52)
        at hudson.remoting.Request$2.run(Request.java:336)
        at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
Caused by: org.eclipse.jgit.errors.TransportException: Software caused connection abort: socket write error
        at org.eclipse.jgit.transport.BasePackFetchConnection.doFetch(BasePackFetchConnection.java:377)
        at org.eclipse.jgit.transport.BasePackFetchConnection.fetch(BasePackFetchConnection.java:308)
        at org.eclipse.jgit.transport.BasePackFetchConnection.fetch(BasePackFetchConnection.java:298)
        at org.eclipse.jgit.transport.FetchProcess.fetchObjects(FetchProcess.java:245)
        at org.eclipse.jgit.transport.FetchProcess.executeImp(FetchProcess.java:161)
        at org.eclipse.jgit.transport.FetchProcess.execute(FetchProcess.java:122)
        at org.eclipse.jgit.transport.Transport.fetch(Transport.java:1201)
        at org.eclipse.jgit.api.FetchCommand.call(FetchCommand.java:128)
        ... 11 more
Caused by: java.net.SocketException: Software caused connection abort: socket write error
        at java.net.SocketOutputStream.socketWrite0(Native Method)
        at java.net.SocketOutputStream.socketWrite(Unknown Source)
        at java.net.SocketOutputStream.write(Unknown Source)
        at java.io.BufferedOutputStream.flushBuffer(Unknown Source)
        at java.io.BufferedOutputStream.flush(Unknown Source)
        at org.eclipse.jgit.transport.PacketLineOut.flush(PacketLineOut.java:180)
        at org.eclipse.jgit.transport.PacketLineOut.end(PacketLineOut.java:167)
        at org.eclipse.jgit.transport.BasePackFetchConnection.negotiate(BasePackFetchConnection.java:573)
        at org.eclipse.jgit.transport.BasePackFetchConnection.doFetch(BasePackFetchConnection.java:363)
        ... 18 more
ERROR: Error fetching remote repo 'origin'
Retrying after 10 seconds
Fetching changes from the remote Git repository
Cleaning workspace
Checking out Revision bab1731b23feb1a85c4258d26913e668d6c18397 (refs/remotes/origin/master)
Commit message: "SOLR-11088: Fix sporadic failures of MetricsHandlerTest.testPropertyFilter on jenkins"
No emails were triggered.
[description-setter] Description set: Java: 32bit/jdk-9-ea+173 -client -XX:+UseSerialGC
[Lucene-Solr-master-Windows] $ cmd.exe /C "C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\bin\ant.bat '"-Dargs=-client -XX:+UseSerialGC"' jenkins-hourly && exit %%ERRORLEVEL%%"
Buildfile: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\build.xml

jenkins-hourly:

-print-java-info:
[java-info] java version "9-ea"
[java-info] Java(TM) SE Runtime Environment (9-ea+173, Oracle Corporation)
[java-info] Java HotSpot(TM) Server VM (9-ea+173, Oracle Corporation)
[java-info] Test args: [-client -XX:+UseSerialGC]

clean:

clean:

clean:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: Apache Ivy 2.3.0 - 20130110142753 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve-groovy:
[ivy:cachepath] :: resolving dependencies :: org.codehaus.groovy#groovy-all-caller;working
[ivy:cachepath] confs: [default]
[ivy:cachepath] found org.codehaus.groovy#groovy-all;2.4.8 in public
[ivy:cachepath] :: resolution report :: resolve 123ms :: artifacts dl 2ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------

-test-with-heapdumps-enabled:
     [echo] Java HotSpot(TM) Server VM: Enabling heap dumps on OutOfMemoryError to dir 'C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\heapdumps'.
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\heapdumps

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve-groovy:

test:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

-clover.load:

resolve-groovy:

-init-totals:

test-core:

-clover.disable:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

-clover.load:

-clover.classpath:

-clover.setup:

clover:

-check-git-state:

-git-cleanroot:

-copy-git-state:
     [copy] Copying 1 file to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build

git-autoclean:

resolve:

init:

compile-core:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java
    [javac] Compiling 823 source files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\graph\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\document\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\bkd\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\compressing\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\lucene60\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\index\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\fst\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\analysis\tokenattributes\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\search\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\lucene62\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\packed\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\mutable\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\lucene70\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\perfield\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\analysis\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\analysis\standard\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\store\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\blocktree\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\search\spans\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\search\similarities\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\util\automaton\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\codecs\lucene50\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java\org\apache\lucene\geo\package-info.class
     [copy] Copying 3 files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\java

compile-test-framework:

-check-git-state:

-git-cleanroot:

-copy-git-state:

git-autoclean:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

init:

compile-lucene-core:

-check-git-state:

-git-cleanroot:

-copy-git-state:

git-autoclean:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

init:

-clover.disable:

-clover.load:

-clover.classpath:

-clover.setup:

clover:

compile-core:

compile-codecs:

-check-git-state:

-git-cleanroot:

-copy-git-state:

git-autoclean:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

common.init:

compile-lucene-core:

init:

-clover.disable:

-clover.load:

-clover.classpath:

-clover.setup:

clover:

compile-core:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java
    [javac] Compiling 67 source files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java\org\apache\lucene\codecs\bloom\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java\org\apache\lucene\codecs\memory\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java\org\apache\lucene\codecs\simpletext\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java\org\apache\lucene\codecs\blockterms\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java\org\apache\lucene\codecs\blocktreeords\package-info.class
     [copy] Copying 3 files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\classes\java

-clover.disable:

-clover.load:

-clover.classpath:

-clover.setup:

clover:

common.compile-core:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java
    [javac] Compiling 188 source files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java
    [javac] Note: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\test-framework\src\java\org\apache\lucene\store\BaseDirectoryTestCase.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\codecs\asserting\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\codecs\ramonly\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\mockfile\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\codecs\cheapbastard\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\codecs\mockrandom\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\codecs\cranky\package-info.class
    [javac] Creating empty C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java\org\apache\lucene\codecs\compressing\dummy\package-info.class
     [copy] Copying 4 files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\classes\java

compile-core:

compile-test:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\test
    [javac] Compiling 461 source files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\test
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
     [copy] Copying 3 files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\classes\test

install-junit4-taskdef:

validate:

resolve-groovy:

-init-totals:

-test:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\test
[loadresource] Do not set property tests.explicitclass as its length is 0.
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\test\temp
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\.caches\test-stats\core
   [junit4] <JUnit4> says kaixo! Master seed: 61E609E248577339
   [junit4] Your default console's encoding may not display certain unicode glyphs: windows-1252
   [junit4] Executing 454 suites with 2 JVMs.
   [junit4]
   [junit4] Started J1 PID(3268@localhost).
   [junit4] Started J0 PID(1552@localhost).
   [junit4] Suite: org.apache.lucene.search.TestEarlyTermination
   [junit4] Completed [1/454] on J1 in 0.40s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.store.TestFilterDirectory
   [junit4] IGNOR/A 0.01s J0 | TestFilterDirectory.testPendingDeletions
   [junit4]    > Assumption #1: we can only install VirusCheckingFS on an FSDirectory
   [junit4] IGNOR/A 0.00s J0 | TestFilterDirectory.testFsyncDoesntCreateNewFiles
   [junit4]    > Assumption #1: test only works for FSDirectory subclasses
   [junit4] Completed [2/454] on J0 in 1.09s, 44 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.Test2BSortedDocValuesOrds
   [junit4] IGNOR/A 0.00s J0 | Test2BSortedDocValuesOrds.test2BOrds
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="Takes ~ 6 hours if the heap is 5gb"))
   [junit4] Completed [3/454] on J0 in 0.00s, 1 test, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDocsWithFieldSet
   [junit4] Completed [4/454] on J0 in 0.02s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestFieldsReader
   [junit4] Completed [5/454] on J0 in 0.60s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterOutOfFileDescriptors
   [junit4] Completed [6/454] on J0 in 1.37s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestMatchAllDocsQuery
   [junit4] Completed [7/454] on J0 in 0.05s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestDeterminizeLexicon
   [junit4] Completed [8/454] on J0 in 0.57s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestNeverDelete
   [junit4] Completed [9/454] on J0 in 1.35s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriter
   [junit4] IGNOR/A 0.00s J0 | TestIndexWriter.testWithPendingDeletions
   [junit4]    > Assumption #1: windows is not supported
   [junit4] Completed [10/454] on J0 in 3.41s, 72 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestSparseFixedBitDocIdSet
   [junit4] Completed [11/454] on J0 in 0.83s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestRegExp
   [junit4] Completed [12/454] on J0 in 0.07s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSort
   [junit4] Completed [13/454] on J0 in 0.13s, 25 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestFieldCacheRewriteMethod
   [junit4] Completed [14/454] on J0 in 0.51s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSameScoresWithThreads
   [junit4] Completed [15/454] on J0 in 0.20s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestFloatRangeFieldQueries
   [junit4] IGNOR/A 0.00s J0 | TestFloatRangeFieldQueries.testRandomBig
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [16/454] on J0 in 0.69s, 5 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestFilterIterator
   [junit4] Completed [17/454] on J0 in 0.02s, 8 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestConsistentFieldNumbers
   [junit4] Completed [18/454] on J0 in 0.23s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestAddIndexes
   [junit4] Completed [19/454] on J0 in 1.50s, 25 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestPrefixQuery
   [junit4] Completed [20/454] on J0 in 0.64s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.TestAssertions
   [junit4] Completed [21/454] on J0 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestNamedSPILoader
   [junit4] Completed [22/454] on J0 in 0.01s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestSegmentTermEnum
   [junit4] Completed [23/454] on J0 in 0.03s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestCollectionUtil
   [junit4] Completed [24/454] on J0 in 2.01s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.Test2BPoints
   [junit4] IGNOR/A 0.00s J0 | Test2BPoints.test1D
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes at least 4 hours and consumes many GB of temp disk space"))
   [junit4] IGNOR/A 0.00s J0 | Test2BPoints.test2D
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes at least 4 hours and consumes many GB of temp disk space"))
   [junit4] Completed [25/454] on J0 in 0.00s, 2 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestTimeLimitingCollector
   [junit4] Completed [26/454] on J0 in 1.47s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.store.TestMultiMMap
   [junit4] Completed [27/454] on J0 in 1.92s, 54 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestIntroSorter
   [junit4] Completed [28/454] on J0 in 0.04s, 9 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexInput
   [junit4] Completed [29/454] on J0 in 0.28s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestFrequencyTrackingRingBuffer
   [junit4] Completed [30/454] on J0 in 0.08s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestPhrasePrefixQuery
   [junit4] Completed [31/454] on J0 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSortedNumericSortField
   [junit4] Completed [32/454] on J0 in 0.05s, 9 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestVirtualMethod
   [junit4] Completed [33/454] on J0 in 0.01s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestArrayUtil
   [junit4] Completed [34/454] on J0 in 0.84s, 14 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestRamUsageEstimator
   [junit4] IGNOR/A 0.00s J0 | TestRamUsageEstimator.testPrintValues
   [junit4]    > Assumption #1: Specify -Dtests.verbose=true to print constants of RamUsageEstimator.
   [junit4] IGNOR/A 0.00s J0 | TestRamUsageEstimator.testHotspotBean
   [junit4]    > Assumption #1: testHotspotBean only works on 64bit JVMs.
   [junit4] Completed [35/454] on J0 in 0.01s, 5 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.TestExternalCodecs
   [junit4] Completed [36/454] on J0 in 0.08s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestRegexpRandom2
   [junit4] Completed [37/454] on J0 in 0.38s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.tokenattributes.TestSimpleAttributeImpl
   [junit4] Completed [38/454] on J0 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestTermScorer
   [junit4] Completed [39/454] on J0 in 0.04s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestQueryRescorer
   [junit4] Completed [40/454] on J0 in 0.15s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestRegexpQuery
   [junit4] Completed [41/454] on J0 in 0.07s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestPriorityQueue
   [junit4] Completed [42/454] on J0 in 0.44s, 9 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.compressing.TestHighCompressionMode
   [junit4] Completed [43/454] on J0 in 0.51s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.document.TestBinaryDocument
   [junit4] Completed [44/454] on J0 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestPayloadsOnVectors
   [junit4] Completed [45/454] on J0 in 0.04s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.store.TestBufferedIndexInput
   [junit4] Completed [46/454] on J0 in 0.36s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestParallelCompositeReader
   [junit4] Completed [47/454] on J0 in 0.21s, 11 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestCodecHoldsOpenFiles
   [junit4] Completed [48/454] on J0 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestSpansEnum
   [junit4] Completed [49/454] on J0 in 0.06s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.packed.TestDirectPacked
   [junit4] Completed [50/454] on J0 in 0.69s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestAutomaton
   [junit4] Completed [51/454] on J0 in 0.56s, 56 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestIndexSearcher
   [junit4] Completed [52/454] on J0 in 0.08s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestRecyclingByteBlockAllocator
   [junit4] Completed [53/454] on J0 in 0.01s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexingSequenceNumbers
   [junit4] Completed [54/454] on J0 in 7.70s, 8 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.TestSearch
   [junit4] Completed [55/454] on J0 in 0.03s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.bkd.TestMutablePointsReaderUtils
   [junit4] Completed [56/454] on J0 in 0.83s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestExceedMaxTermLength
   [junit4] Completed [57/454] on J0 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestNoMergePolicy
   [junit4] Completed [58/454] on J0 in 0.02s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestTopDocsMerge
   [junit4] Completed [59/454] on J0 in 0.23s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDirectoryReader
   [junit4] Completed [60/454] on J0 in 0.58s, 24 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestTerm
   [junit4] Completed [61/454] on J0 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexSorting
   [junit4] Completed [62/454] on J0 in 2.97s, 50 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestCustomNorms
   [junit4] Completed [63/454] on J0 in 0.24s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestSpanExplanations
   [junit4] Completed [64/454] on J0 in 0.26s, 30 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestIntArrayDocIdSet
   [junit4] Completed [65/454] on J0 in 0.29s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestSpanExplanationsOfNonMatches
   [junit4] Completed [66/454] on J0 in 0.04s, 30 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestParallelLeafReader
   [junit4] Completed [67/454] on J0 in 0.06s, 9 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestDocValuesQueries
   [junit4] Completed [68/454] on J0 in 0.93s, 10 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestOmitPositions
   [junit4] Completed [69/454] on J0 in 0.08s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestPrefixRandom
   [junit4] Completed [70/454] on J0 in 0.18s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestTimSorterWorstCase
   [junit4] Completed [71/454] on J1 in 39.32s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.TestGraphTokenizers
   [junit4] Completed [72/454] on J1 in 1.01s, 23 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestRollingUpdates
   [junit4] Completed [73/454] on J1 in 0.76s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestParallelTermEnum
   [junit4] Completed [74/454] on J1 in 0.03s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.TestMergeSchedulerExternal
   [junit4] Completed [75/454] on J1 in 0.47s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestIntsRef
   [junit4] Completed [76/454] on J1 in 0.01s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.store.TestDirectory
   [junit4] IGNOR/A 0.00s J1 | TestDirectory.testListAll
   [junit4]    > Assumption #1: this test does not expect extra files
   [junit4] Completed [77/454] on J1 in 0.32s, 4 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.TestStopFilter
   [junit4] Completed [78/454] on J1 in 0.02s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestMultiTermQueryRewrites
   [junit4] Completed [79/454] on J1 in 0.09s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.lucene50.TestBlockPostingsFormat
   [junit4] Completed [80/454] on J1 in 2.53s, 27 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestFixedBitSet
   [junit4] Completed [81/454] on J1 in 1.84s, 19 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestPersistentSnapshotDeletionPolicy
   [junit4] Completed [82/454] on J1 in 1.36s, 13 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.TestCharArraySet
   [junit4] Completed [83/454] on J1 in 0.03s, 15 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestSetOnce
   [junit4] Completed [84/454] on J1 in 0.03s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.TestDelegatingAnalyzerWrapper
   [junit4] Completed [85/454] on J1 in 0.02s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestWeakIdentityMap
   [junit4] Completed [86/454] on J1 in 0.89s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestSwappedIndexFiles
   [junit4] Completed [87/454] on J1 in 0.04s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestTermdocPerf
   [junit4] Completed [88/454] on J1 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDocumentsWriterStallControl
   [junit4] Completed [89/454] on J1 in 2.48s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSortedSetSelector
   [junit4] Completed [90/454] on J1 in 0.12s, 15 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestStringMSBRadixSorter
   [junit4] Completed [91/454] on J1 in 2.00s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestSentinelIntSet
   [junit4] Completed [92/454] on J1 in 0.05s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterDelete
   [junit4] IGNOR/A 0.00s J1 | TestIndexWriterDelete.testApplyDeletesOnFlush
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [93/454] on J1 in 2.75s, 25 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestPointQueries
   [junit4] IGNOR/A 0.00s J1 | TestPointQueries.testRandomLongsBig
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J1 | TestPointQueries.testRandomBinaryBig
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [94/454] on J1 in 3.80s, 49 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestConcurrentMergeScheduler
   [junit4] Completed [95/454] on J1 in 2.81s, 16 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestSpanContainQuery
   [junit4] Completed [96/454] on J1 in 0.05s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestElevationComparator
   [junit4] Completed [97/454] on J1 in 0.03s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestBytesRefArray
   [junit4] Completed [98/454] on J1 in 0.04s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.store.TestSimpleFSLockFactory
   [junit4] Completed [99/454] on J1 in 9.92s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestApproximationSearchEquivalence
   [junit4] Completed [100/454] on J1 in 0.25s, 10 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDirectoryReaderReopen
   [junit4] Completed [101/454] on J1 in 1.34s, 14 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestTermsEnum2
   [junit4] Completed [102/454] on J1 in 0.17s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.graph.TestGraphTokenStreamFiniteStrings
   [junit4] Completed [103/454] on J1 in 0.03s, 12 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestSegmentTermDocs
   [junit4] Completed [104/454] on J1 in 0.29s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.tokenattributes.TestBytesRefAttImpl
   [junit4] Completed [105/454] on J1 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestMultiDocValues
   [junit4] Completed [106/454] on J1 in 0.13s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestPrefixInBooleanQuery
   [junit4] Completed [107/454] on J1 in 0.24s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.Test2BPostingsBytes
   [junit4] IGNOR/A 0.00s J1 | Test2BPostingsBytes.test
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes ~20GB-30GB of space and 10 minutes"))
   [junit4] Completed [108/454] on J1 in 0.00s, 1 test, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterReader
   [junit4] IGNOR/A 0.00s J1 | TestIndexWriterReader.testDuringAddIndexes
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [109/454] on J1 in 2.70s, 23 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestMultiCollector
   [junit4] Completed [111/454] on J1 in 0.07s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestNRTReaderCleanup
   [junit4] IGNOR/A 0.00s J1 | TestNRTReaderCleanup.testClosingNRTReaderDoesNotCorruptYourIndex
   [junit4]    > Assumption #1: this test can't run on Windows
   [junit4] Completed [112/454] on J1 in 0.01s, 1 test, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestReaderClosed
   [junit4] Completed [113/454] on J1 in 0.02s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterOnDiskFull
   [junit4] Completed [114/454] on J1 in 0.20s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestDeterminism
   [junit4] Completed [115/454] on J1 in 0.19s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestIntroSelector
   [junit4] Completed [116/454] on J1 in 0.46s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.bkd.Test2BBKDPoints
   [junit4] IGNOR/A 0.00s J1 | Test2BBKDPoints.test1D
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes at least 4 hours and consumes many GB of temp disk space"))
   [junit4] IGNOR/A 0.00s J1 | Test2BBKDPoints.test2D
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes at least 4 hours and consumes many GB of temp disk space"))
   [junit4] Completed [117/454] on J1 in 0.00s, 2 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestPagedBytes
   [junit4] Completed [118/454] on J1 in 5.40s, 4 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestOmitNorms
   [junit4] Completed [119/454] on J1 in 0.40s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestAllFilesCheckIndexHeader
   [junit4] Completed [120/454] on J1 in 0.21s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestCheckIndex
   [junit4] Completed [121/454] on J1 in 0.27s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.lucene70.TestLucene70NormsFormat
   [junit4] IGNOR/A 0.00s J1 | TestLucene70NormsFormat.testNCommonBig
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J1 | TestLucene70NormsFormat.testSparseNCommonBig
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J1 | TestLucene70NormsFormat.testMergeStability
   [junit4]    > Assumption #1: The MockRandom PF randomizes content on the fly, so we can't check it
   [junit4] Completed [122/454] on J1 in 1.45s, 29 tests, 3 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.perfield.TestPerFieldPostingsFormat2
   [junit4] Completed [123/454] on J1 in 1.37s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestBasics
   [junit4] Completed [124/454] on J1 in 2.12s, 26 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestFlex
   [junit4] Completed [125/454] on J1 in 0.07s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestCachingCollector
   [junit4] Completed [126/454] on J1 in 0.02s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestRollingBuffer
   [junit4] Completed [127/454] on J1 in 0.08s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.packed.TestPackedInts
   [junit4] IGNOR/A 0.00s J1 | TestPackedInts.testBlockReaderOverflow
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [128/454] on J1 in 5.78s, 28 tests, 3 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestPayloads
   [junit4] Completed [129/454] on J1 in 0.06s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.geo.TestGeoUtils
   [junit4] Completed [130/454] on J1 in 6.65s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.perfield.TestPerFieldDocValuesFormat
   [junit4] Completed [131/454] on J1 in 9.57s, 110 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterCommit
   [junit4] Completed [132/454] on J1 in 1.23s, 14 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestStressAdvance
   [junit4] Completed [133/454] on J1 in 0.57s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterOnVMError
   [junit4] IGNOR/A 0.00s J1 | TestIndexWriterOnVMError.testCheckpoint
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [134/454] on J1 in 0.07s, 3 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestMaxTermFrequency
   [junit4] Completed [135/454] on J1 in 0.10s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestSpanFirstQuery
   [junit4] Completed [136/454] on J1 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestPrefixCodedTerms
   [junit4] Completed [137/454] on J1 in 0.05s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestTransactionRollback
   [junit4] Completed [138/454] on J1 in 0.08s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestNoMergeScheduler
   [junit4] Completed [139/454] on J1 in 0.01s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestReaderWrapperDVTypeCheck
   [junit4] Completed [140/454] on J1 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestPointValues
   [junit4] Completed [141/454] on J1 in 10.98s, 32 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDocIDMerger
   [junit4] Completed [142/454] on J1 in 0.02s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestIndexOrDocValuesQuery
   [junit4] Completed [143/454] on J1 in 0.04s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestMSBRadixSorter
   [junit4] Completed [144/454] on J1 in 1.70s, 8 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestNearSpansOrdered
   [junit4] Completed [145/454] on J1 in 0.11s, 19 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestMultiTermConstantScore
   [junit4] Completed [146/454] on J1 in 0.21s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSimpleSearchEquivalence
   [junit4] Completed [147/454] on J1 in 0.23s, 17 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.store.TestBufferedChecksum
   [junit4] Completed [148/454] on J1 in 0.04s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.compressing.TestFastCompressionMode
   [junit4] Completed [149/454] on J1 in 0.44s, 11 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestDocValuesScoring
   [junit4] Completed [150/454] on J1 in 0.04s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterConfig
   [junit4] Completed [151/454] on J1 in 0.03s, 8 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.spans.TestSpanNotQuery
   [junit4] Completed [152/454] on J1 in 0.02s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestOneMergeWrappingMergePolicy
   [junit4] Completed [153/454] on J1 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestNoDeletionPolicy
   [junit4] Completed [154/454] on J1 in 0.03s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIsCurrent
   [junit4] Completed [155/454] on J1 in 0.02s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestMultiset
   [junit4] Completed [156/454] on J1 in 0.01s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestRollback
   [junit4] Completed [157/454] on J1 in 0.03s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestNot
   [junit4] Completed [158/454] on J1 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDocumentWriter
   [junit4] Completed [159/454] on J1 in 0.10s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestInfoStream
   [junit4] Completed [160/454] on J1 in 0.01s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDuelingCodecsAtNight
   [junit4] IGNOR/A 0.00s J1 | TestDuelingCodecsAtNight.testBigEquals
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J1 | TestDuelingCodecsAtNight.testEquals
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J1 | TestDuelingCodecsAtNight.testCrazyReaderEquals
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [161/454] on J1 in 0.00s, 3 tests, 3 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestBoolean2
   [junit4] Completed [162/454] on J1 in 8.30s, 10 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSynonymQuery
   [junit4] Completed [163/454] on J1 in 0.03s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestQueryBuilder
   [junit4] Completed [164/454] on J1 in 0.03s, 23 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.tokenattributes.TestCharTermAttributeImpl
   [junit4] Completed [165/454] on J1 in 0.77s, 12 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestUniqueTermCount
   [junit4] Completed [166/454] on J1 in 0.03s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.Test2BBinaryDocValues
   [junit4] IGNOR/A 0.00s J1 | Test2BBinaryDocValues.testFixedBinary
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes ~ 6 hours if the heap is 5gb"))
   [junit4] IGNOR/A 0.00s J1 | Test2BBinaryDocValues.testVariableBinary
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="takes ~ 6 hours if the heap is 5gb"))
   [junit4] Completed [167/454] on J1 in 0.00s, 2 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestConstantScoreQuery
   [junit4] Completed [168/454] on J1 in 0.02s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestDoubleValuesSource
   [junit4] Completed [169/454] on J1 in 1.09s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.lucene70.TestIndexedDISI
   [junit4] Completed [170/454] on J1 in 5.62s, 10 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestSimilarity
   [junit4] Completed [171/454] on J1 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestMultiFields
   [junit4] Completed [172/454] on J1 in 0.11s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.TestCodecLoadingDeadlock
   [junit4] Completed [173/454] on J1 in 0.25s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestPositiveScoresOnlyCollector
   [junit4] Completed [174/454] on J1 in 0.02s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestBoolean2ScorerSupplier
   [junit4] Completed [175/454] on J1 in 0.04s, 10 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestStressIndexing
   [junit4] Completed [176/454] on J1 in 1.64s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestStressNRT
   [junit4] Completed [177/454] on J1 in 0.07s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestMergePolicyWrapper
   [junit4] Completed [178/454] on J1 in 0.01s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.lucene70.TestLucene70DocValuesFormat
   [junit4] IGNOR/A 0.00s J0 | TestLucene70DocValuesFormat.testSortedVariableLengthManyVsStoredFields
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J0 | TestLucene70DocValuesFormat.testSortedSetVariableLengthManyVsStoredFields
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J0 | TestLucene70DocValuesFormat.testTermsEnumRandomMany
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [179/454] on J0 in 110.62s, 126 tests, 3 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.perfield.TestPerFieldPostingsFormat
   [junit4] IGNOR/A 0.00s J1 | TestPerFieldPostingsFormat.testMergeStability
   [junit4]    > Assumption #1: The MockRandom PF randomizes content on the fly, so we can't check it
   [junit4] IGNOR/A 0.00s J1 | TestPerFieldPostingsFormat.testPostingsEnumReuse
   [junit4]    > Assumption #1: The MockRandom PF randomizes content on the fly, so we can't check it
   [junit4] Completed [180/454] on J1 in 2.26s, 25 tests, 2 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestBooleanMinShouldMatch
   [junit4] Completed [181/454] on J0 in 0.69s, 17 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.TestWordlistLoader
   [junit4] Completed [182/454] on J0 in 0.02s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.mutable.TestMutableValues
   [junit4] Completed [183/454] on J0 in 0.02s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestCrashCausesCorruptIndex
   [junit4] Completed [184/454] on J0 in 0.10s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestOperations
   [junit4] Completed [185/454] on J1 in 0.17s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.Test2BDocs
   [junit4] IGNOR/A 0.00s J1 | Test2BDocs.test2BDocs
   [junit4]    > Assumption #1: 'monster' test group is disabled (@Monster(value="Takes ~30min"))
   [junit4] Completed [186/454] on J1 in 0.00s, 1 test, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexFileDeleter
   [junit4] Completed [187/454] on J0 in 0.25s, 10 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestRoaringDocIdSet
   [junit4] Completed [188/454] on J1 in 0.32s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestAtomicUpdate
   [junit4] Completed [189/454] on J0 in 1.95s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestFieldInvertState
   [junit4] Completed [190/454] on J0 in 0.04s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestSparseFixedBitSet
   [junit4] Completed [191/454] on J1 in 1.90s, 11 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestMinimize
   [junit4] Completed [192/454] on J0 in 1.00s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDocsAndPositions
   [junit4] Completed [193/454] on J0 in 0.14s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.tokenattributes.TestPackedTokenAttributeImpl
   [junit4] Completed [194/454] on J0 in 0.02s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.TestUnicodeUtil
   [junit4] Completed [195/454] on J0 in 0.20s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestFlushByRamOrCountsPolicy
   [junit4] Completed [196/454] on J0 in 0.77s, 5 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestMultiPhraseEnum
   [junit4] Completed [197/454] on J0 in 0.02s, 2 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterMerging
   [junit4] Completed [198/454] on J1 in 2.75s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestTieredMergePolicy
   [junit4] Completed [199/454] on J0 in 0.91s, 6 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.automaton.TestLevenshteinAutomata
   [junit4] Completed [200/454] on J0 in 1.00s, 4 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestCodecs
   [junit4] Completed [201/454] on J0 in 0.19s, 3 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.util.fst.TestFSTs
   [junit4] IGNOR/A 0.00s J1 | TestFSTs.testBigSet
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [202/454] on J1 in 1.16s, 20 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.util.fst.TestBytesStore
   [junit4] Completed [203/454] on J1 in 0.41s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterWithThreads
   [junit4] IGNOR/A 0.01s J1 | TestIndexWriterWithThreads.testOpenTwoIndexWritersOnDifferentThreads
   [junit4]    > Assumption #1: aborting test: timeout obtaining lock
   [junit4] Completed [204/454] on J1 in 0.88s, 12 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.search.TestShardSearching
   [junit4] Completed [205/454] on J1 in 4.38s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.util.bkd.TestBKD
   [junit4] IGNOR/A 0.00s J0 | TestBKD.testRandomBinaryBig
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] Completed [206/454] on J0 in 6.68s, 19 tests, 1 skipped
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestSegmentReader
   [junit4] Completed [207/454] on J0 in 0.48s, 7 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.codecs.lucene70.TestLucene70SegmentInfoFormat
   [junit4] Completed [208/454] on J0 in 0.13s, 16 tests
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestSumDocFreq
   [junit4] Completed [209/454] on J0 in 0.06s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.index.TestDoc
   [junit4] Completed [210/454] on J0 in 0.20s, 1 test
   [junit4]
   [junit4] Suite: org.apache.lucene.analysis.TestCharFilter
   [junit4] Completed [211/454] on J0 in 0.01s, 4 tests
 

[...truncated too long message...]

[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

common.resolve:

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

jar-checksums:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\null250989699
     [copy] Copying 35 files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\null250989699
   [delete] Deleting directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\null250989699

resolve-example:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

resolve-server:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

ivy-availability-check:

ivy-fail:

ivy-fail:

ivy-configure:
[ivy:configure] :: loading settings :: file = C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\top-level-ivy-settings.xml

resolve:

jar-checksums:
    [mkdir] Created dir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\null985851151
     [copy] Copying 214 files to C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\null985851151
   [delete] Deleting directory C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\null985851151

check-working-copy:
[ivy:cachepath] :: resolving dependencies :: org.eclipse.jgit#org.eclipse.jgit-caller;working
[ivy:cachepath] confs: [default]
[ivy:cachepath] found org.eclipse.jgit#org.eclipse.jgit;4.6.0.201612231935-r in public
[ivy:cachepath] found com.jcraft#jsch;0.1.53 in public
[ivy:cachepath] found com.googlecode.javaewah#JavaEWAH;1.1.6 in public
[ivy:cachepath] found org.apache.httpcomponents#httpclient;4.3.6 in public
[ivy:cachepath] found org.apache.httpcomponents#httpcore;4.3.3 in public
[ivy:cachepath] found commons-logging#commons-logging;1.1.3 in public
[ivy:cachepath] found commons-codec#commons-codec;1.6 in public
[ivy:cachepath] found org.slf4j#slf4j-api;1.7.2 in public
[ivy:cachepath] :: resolution report :: resolve 49ms :: artifacts dl 5ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   8   |   0   |   0   |   0   ||   8   |   0   |
        ---------------------------------------------------------------------
[wc-checker] Initializing working copy...
[wc-checker] SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
[wc-checker] SLF4J: Defaulting to no-operation (NOP) logger implementation
[wc-checker] SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[wc-checker] Checking working copy status...

-jenkins-base:

BUILD SUCCESSFUL
Total time: 68 minutes 31 seconds
Archiving artifacts
WARN: No artifacts found that match the file pattern "**/*.events,heapdumps/**,**/*_pid*.log". Configuration error?
WARN: java.lang.InterruptedException: no matches found within 10000
[WARNINGS] Parsing warnings in console log with parser Java Compiler (javac)
[WARNINGS] Computing warning deltas based on reference build #6749
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Email was triggered for: Unstable (Test Failures)
Sending email for trigger: Unstable (Test Failures)


---------------------------------------------------------------------
To unsubscribe, e-mail: [hidden email]
For additional commands, e-mail: [hidden email]