Lucene/Solr 7.6

37 messages

Re: Lucene/Solr 7.6

Cassandra Targett

On Sat, Nov 10, 2018 at 9:50 AM Steve Rowe <[hidden email]> wrote:
Hi Cassandra,

> On Nov 9, 2018, at 3:47 PM, Cassandra Targett <[hidden email]> wrote:
>
> I don't know if it's on the Release ToDo list, but we need a Jenkins job for the Ref Guide to be built from branch_7x  also.

I assume you mean a branch_7_6 ref guide job, since there already is one for branch_7x; I created it along with the others.

 
Right, I meant branch_7_6! Thanks Steve.


Re: Lucene/Solr 7.6

Nicholas Knize
In reply to this post by caomanhdat
Hi Đạt

Sorry for the delay.  I'm okay with you backporting SOLR-12969 to branch_7_6. 
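For readers following along, a backport like this is normally a single cherry-pick onto the release branch. Below is a minimal sketch using a throwaway repository; only the branch name branch_7_6 and the issue key come from the thread, everything else (paths, file, commit contents) is illustrative.

```shell
# Throwaway repo standing in for a real lucene-solr checkout.
set -e
rm -rf /tmp/backport-demo
git init -q /tmp/backport-demo
cd /tmp/backport-demo
git config user.email dev@example.com
git config user.name dev
echo base > f.txt
git add f.txt
git commit -qm 'base'
git branch branch_7_6                 # release branch cut at this point
echo fix >> f.txt
git commit -qam 'SOLR-12969: fix'     # fix lands on the development branch
FIX=$(git rev-parse HEAD)
git checkout -q branch_7_6
git cherry-pick "$FIX"                # backport just that one commit
git log --oneline -1
```

After the cherry-pick, branch_7_6 carries only the single fix commit on top of the cut point, which is why release managers prefer it over merging a whole branch.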



On Mon, Nov 12, 2018, 4:09 AM Đạt Cao Mạnh <[hidden email]> wrote:
Hi guys,

Is it OK to backport SOLR-12969 to branch_7_6?

On Sat, Nov 10, 2018 at 3:50 PM Steve Rowe <[hidden email]> wrote:
Hi Cassandra,

> On Nov 9, 2018, at 3:47 PM, Cassandra Targett <[hidden email]> wrote:
>
> I don't know if it's on the Release ToDo list, but we need a Jenkins job for the Ref Guide to be built from branch_7x  also.

I assume you mean a branch_7_6 ref guide job, since there already is one for branch_7x; I created it along with the others.

FYI the ref guide job is listed among those to create on https://wiki.apache.org/lucene-java/JenkinsReleaseBuilds, which is linked from https://wiki.apache.org/lucene-java/ReleaseTodo#Jenkins_Release_Builds.

Steve


---------------------------------------------------------------------
To unsubscribe, e-mail: [hidden email]
For additional commands, e-mail: [hidden email]

--

Nicholas Knize, Ph.D., GISP
Geospatial Software Guy  |  Elasticsearch
Apache Lucene Committer
[hidden email]  


Re: Lucene/Solr 7.6

Nicholas Knize
In reply to this post by sarowe
Thank you for setting up the Jenkins jobs Steve! Sorry for missing that note during the 7.5 release. I'll ask Adrien to get me admin access for the future. 

On Sat, Nov 10, 2018, 9:41 AM Steve Rowe <[hidden email]> wrote:
Hi Nick,

> On Nov 9, 2018, at 3:24 PM, Nicholas Knize <[hidden email]> wrote:
>
> branch_7_6 has been cut and versions updated to 7.7 on branch_7x.
> [...]
> Also, a quick request. It doesn't look like I have Jenkins admin rights. Would someone (with the power) either mind granting me admin privileges or add/update the Jenkins build tasks for the new branch?

Repeating myself from the 7.5 release thread:

> Each project's PMC Chair is responsible for granting job creation/modification permissions on Jenkins: https://cwiki.apache.org/confluence/display/INFRA/Jenkins#Jenkins-HowdoIgetanaccount

In case it's not clear, only Adrien can do this ^^.

I have created the 7.6 jobs on ASF Jenkins.

Steve




RE: Lucene/Solr 7.6

Uwe Schindler
In reply to this post by sarowe
With a short delay, I also created the jobs on Policeman Jenkins.

Uwe

-----
Uwe Schindler
Achterdiek 19, D-28357 Bremen
http://www.thetaphi.de
eMail: [hidden email]



Re: Lucene/Solr 7.6

Andrzej Białecki
In reply to this post by Nicholas Knize
Hi Nicholas,

If it’s OK, I would like to merge a small fix to the Ref Guide, spotted by Christine in SOLR-9856.

On 1 Nov 2018, at 21:38, Nicholas Knize <[hidden email]> wrote:

Hi all,

To follow up from our discussion on the 8.0 thread, I would like to cut the 7.6 branch on either Tuesday or Wednesday of next week. Since this implies a feature freeze, I went ahead and had a look at some of the issues that are labeled for 7.6.

It looks like we only have one active issue listed as a blocker for Solr: the upgrade notes in SOLR-12927.

For Lucene we have five active issues (each with a patch provided) listed as blockers targeted for 7.6.

If there are any other issues that need to land before cutting the branch, and they are not already labeled, please either mark them as blockers accordingly or let me know prior to cutting the branch next Tuesday or Wednesday.

Thank you!

- Nick
--

Nicholas Knize, Ph.D., GISP
Geospatial Software Guy  |  Elasticsearch
Apache Lucene Committer
[hidden email]  





Andrzej Białecki


Re: Lucene/Solr 7.6

Cassandra Targett
Doc changes are still fine, Andrzej. I still have a couple of things to do for the Ref Guide also.



Re: Lucene/Solr 7.6

Nicholas Knize
Hello all. A quick update on 7.6: it looks like all blockers have been resolved. Let me know if I missed any; if there aren't any objections, I'll plan to build a 7.6 RC on Thursday of this week.

Thanks!





Re: Lucene/Solr 7.6

Adrien Grand
+1 to building an RC on Thursday this week. Thanks, Nick.



--
Adrien



Re: Lucene/Solr 7.6

Alan Woodward
Hi Nick,

I found a nasty bug in the intervals code (LUCENE-8586); is there still time to get that in for 7.6? At the moment, somebody with a perfectly valid proximity query can hit an infinite loop.

Thanks, Alan



Re: Lucene/Solr 7.6

Nicholas Knize
Hey Alan,

If you have a fix ready, I'm okay with you committing it. So far I've been unsuccessful in building an RC anyway; it appears I'll need to BadApple some more tests before I can move forward with the RC build. I'm cross-referencing my list of consistent failures with the Jenkins builds to verify there aren't any new failures we haven't seen before.

- Nick



Re: Lucene/Solr 7.6

Alan Woodward
I’ve backported LUCENE-8586 to the 7.6 branch.




Re: Lucene/Solr 7.6

jim ferenczi
We found another bad bug in 7.x that affects multi-phrase queries: https://issues.apache.org/jira/browse/LUCENE-8589
I'd like to backport the fix to the 7.6 branch if there are no objections.




Re: Lucene/Solr 7.6

Adrien Grand
+1


Re: Lucene/Solr 7.6

Nicholas Knize
In reply to this post by Nicholas Knize
Hi All,

https://issues.apache.org/jira/browse/SOLR-13039 contains a patch that marks a list of commonly failing tests as BadApple. As mentioned above, I cross-referenced our CI builds to make sure there aren't any new test failures that we haven't seen before. Let me know if any of these come as a surprise. I'll plan to commit this change to the 7.6 branch only, to continue along in the release process. Once 7.6 is released I can revert the change to continue CI testing on the bug-fix branch.

Thanks for the patience on this.
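For context, BadApple-ing a test in Lucene/Solr means adding a test annotation so CI can skip known-flaky tests. One quick way to review what a patch like this disables is to grep the tree for the annotation after applying it. A self-contained sketch follows; the directory, class names, and bug URL are stand-ins, not taken from SOLR-13039.

```shell
# Create a stand-in test source tree; the annotated class and bug URL
# below are illustrative only.
mkdir -p demo/src/test
cat > demo/src/test/FlakyTest.java <<'EOF'
@BadApple(bugUrl = "https://issues.apache.org/jira/browse/SOLR-0000")
public class FlakyTest {}
EOF
cat > demo/src/test/StableTest.java <<'EOF'
public class StableTest {}
EOF
# List every test source file carrying the annotation.
grep -rl '@BadApple' demo/src/test
```

Diffing that list before and after applying the patch shows exactly which tests the release build will stop gating on.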



recent failures on fucit - how to de-dupe...

Gus Heck
Note: by the time I got to the end of writing this email I had come to strongly suspect that the duplication relates to how the filtering in the table works; but how one easily gets a non-duplicative set when investigating failures is still a question I don't have an answer to.

Seeing TimeRoutedAliasUpdateProcessorTest on the 7.6 bad apple list, having recently been looking at that test, and waiting on a long build for other work, I went to http://fucit.org/solr-jenkins-reports/failure-report.html to gather recent failures. When I started looking I began to suspect there were duplicates... so I downloaded/extracted everything that comes up when I filter on TimeR, fiddled about a bit with grepping/etc., and got this result that I admit I don't understand...

Filtered results on fucit.org looked like:

NS2-MacBook-Pro:TRA gus$ column -ts, failure-rates-data.csv 

"Suite?"  "Class"                                                                "Method"                  "Rate"              "Runs"  "Fails"

"false"   "org.apache.solr.update.processor.TimeRoutedAliasUpdateProcessorTest"  "test"                    "4.76190476190476"  "42"    "2"

"false"   "org.apache.solr.update.processor.TimeRoutedAliasUpdateProcessorTest"  "testSliceRouting"        "2.27272727272727"  "308"   "7"

"true"    "org.apache.solr.update.processor.TimeRoutedAliasUpdateProcessorTest"  ""                        "1.3986013986014"   "286"   "4"

"false"   "org.apache.solr.update.processor.TimeRoutedAliasUpdateProcessorTest"  "testPreemptiveCreation"  "1.2987012987013"   "308"   "4"


I clicked on each line, opened a tab for each entry in the modal dialog, and then from each tab downloaded jenkins.log.txt.gz into a folder corresponding to the day in the file timestamp.

gus$ find . -name '*.gz' -print0 | xargs -0 gunzip

NS2-MacBook-Pro:testfails gus$ find . -name '*.txt'

./TRA/2018-12-01/jenkins.log.txt

./TRA/2018-12-01/jenkins.log (2).txt

./TRA/2018-12-01/jenkins.log (3).txt

./TRA/2018-12-01/jenkins.log (1).txt

./TRA/2018-11-30/jenkins.log (4).txt

./TRA/2018-11-30/jenkins.log.txt

./TRA/2018-11-30/jenkins.log (2).txt

./TRA/2018-11-30/jenkins.log (3).txt

./TRA/2018-11-30/jenkins.log (1).txt

./TRA/2018-12-02/jenkins.log.txt

./TRA/2018-12-02/jenkins.log (2).txt

./TRA/2018-12-02/jenkins.log (3).txt

./TRA/2018-12-02/jenkins.log (1).txt

./TRA/2018-12-03/jenkins.log.txt


gus$ grep -r 'reprod' * | grep TimeRouted | perl -pe 's/(^[^[]*).*reproduce with:(.*Dtests\.seed=(\w+)\s.*)/\3 \1 \2/' | sort

2E743D2D45BF625E TRA/2018-12-02/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=test -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log (2).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=test -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log (2).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log (2).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=test -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

2E743D2D45BF625E TRA/2018-12-02/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=2E743D2D45BF625E -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=guz-KE -Dtests.timezone=Etc/GMT-5 -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

653A3F94747D4B6C TRA/2018-12-02/jenkins.log (3).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=653A3F94747D4B6C -Dtests.slow=true -Dtests.locale=is -Dtests.timezone=Pacific/Kiritimati -Dtests.asserts=true -Dtests.file.encoding=UTF-8

85F52ED219B35581 TRA/2018-11-30/jenkins.log (2).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=85F52ED219B35581 -Dtests.slow=true -Dtests.locale=lv-LV -Dtests.timezone=America/Resolute -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

85F52ED219B35581 TRA/2018-11-30/jenkins.log (2).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=85F52ED219B35581 -Dtests.slow=true -Dtests.locale=lv-LV -Dtests.timezone=America/Resolute -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

85F52ED219B35581 TRA/2018-11-30/jenkins.log (4).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=85F52ED219B35581 -Dtests.slow=true -Dtests.locale=lv-LV -Dtests.timezone=America/Resolute -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

85F52ED219B35581 TRA/2018-11-30/jenkins.log (4).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=85F52ED219B35581 -Dtests.slow=true -Dtests.locale=lv-LV -Dtests.timezone=America/Resolute -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

87AA84094394A25D TRA/2018-11-30/jenkins.log (3).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=87AA84094394A25D -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=fr-FR -Dtests.timezone=Pacific/Efate -Dtests.asserts=true -Dtests.file.encoding=US-ASCII

8DF7794AB2D01C00 TRA/2018-12-01/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=8DF7794AB2D01C00 -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=en-IO -Dtests.timezone=America/Belize -Dtests.asserts=true -Dtests.file.encoding=UTF-8

953C910946955E70 TRA/2018-12-01/jenkins.log (2).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=953C910946955E70 -Dtests.slow=true -Dtests.locale=uk -Dtests.timezone=Indian/Christmas -Dtests.asserts=true -Dtests.file.encoding=UTF-8

96F44DBF886ECD38 TRA/2018-12-01/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=96F44DBF886ECD38 -Dtests.slow=true -Dtests.locale=de-LU -Dtests.timezone=Etc/GMT+2 -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1

B72E2FF953D47986 TRA/2018-12-03/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=test -Dtests.seed=B72E2FF953D47986 -Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=sl-SI -Dtests.timezone=Asia/Famagusta -Dtests.asserts=true -Dtests.file.encoding=UTF-8

B9DA44E3F56E5A8C TRA/2018-12-01/jenkins.log (3).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=B9DA44E3F56E5A8C -Dtests.slow=true -Dtests.locale=hr -Dtests.timezone=Mexico/BajaSur -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1

C3EC920833C32D9F TRA/2018-11-30/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log (1).txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252

C3EC920833C32D9F TRA/2018-11-30/jenkins.log.txt:     ant test  -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=C3EC920833C32D9F -Dtests.slow=true -Dtests.locale=be-BY -Dtests.timezone=Asia/Dubai -Dtests.asserts=true -Dtests.file.encoding=Cp1252


What I've done is sort by seed, and found a LOT of duplication, even across files, and some apparent running of specific test methods. I'd like to understand what's happening with the build servers here... why are we seeing so many duplicates? I would guess that this really boils down to 1 fail per seed value seen? I'm trying to figure out how many and which of these I need to consider, and I'm interested in the frequency of different failure scenarios, which is hard to gauge if there's duplication.

Other interesting stuff...
gus$ grep -r 'reprod' * | grep Time | perl -pe 's/(^[^[]*).*reproduce with:(.*Dtests\.seed=(\w+)\s.*)/\1/' | sort | uniq -c

   4 TRA/2018-11-30/jenkins.log (1).txt:   

   2 TRA/2018-11-30/jenkins.log (2).txt:   

   1 TRA/2018-11-30/jenkins.log (3).txt:   

   2 TRA/2018-11-30/jenkins.log (4).txt:   

   4 TRA/2018-11-30/jenkins.log.txt:   

   1 TRA/2018-12-01/jenkins.log (1).txt:   

   1 TRA/2018-12-01/jenkins.log (2).txt:   

   1 TRA/2018-12-01/jenkins.log (3).txt:   

   1 TRA/2018-12-01/jenkins.log.txt:   

   3 TRA/2018-12-02/jenkins.log (1).txt:   

   3 TRA/2018-12-02/jenkins.log (2).txt:   

   1 TRA/2018-12-02/jenkins.log (3).txt:   

   3 TRA/2018-12-02/jenkins.log.txt:   

   1 TRA/2018-12-03/jenkins.log.txt:   

gus$ grep -r 'reprod' * | grep Time | perl -pe 's/.*reproduce with:(.*Dtests\.seed=(\w+)\s.*)/\2/' | sort | uniq -c

   9 2E743D2D45BF625E

   1 653A3F94747D4B6C

   4 85F52ED219B35581

   1 87AA84094394A25D

   1 8DF7794AB2D01C00

   1 953C910946955E70

   1 96F44DBF886ECD38

   1 B72E2FF953D47986

   1 B9DA44E3F56E5A8C

   8 C3EC920833C32D9F

 gus$ grep -r 'reprod' * | grep Time | perl -pe 's/.*reproduce with:(.*Dtests\.seed=(\w+)\s.*)/\2/' | wc -l

      28


So 17 failures listed on fucit.org led me to find 14 files containing 10 distinct seeds and 28 lines that contain "reproduce with:".

Not quite sure how to interpret that. Even if I say each seed is a unique fail, I have no idea how many total builds that relates to...
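In case it helps show what I mean by de-duping, here's a minimal sketch of collapsing the reproduce lines down to distinct seeds (the sample file and lines below are made up for illustration; the real input would be the concatenated jenkins logs):

```shell
# Hypothetical sample of "reproduce with" lines (seeds borrowed from the
# output above; the file path is made up for illustration).
cat > /tmp/tra-repro.txt <<'EOF'
ant test -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=test -Dtests.seed=2E743D2D45BF625E
ant test -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=2E743D2D45BF625E
ant test -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testPreemptiveCreation -Dtests.seed=C3EC920833C32D9F
ant test -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=653A3F94747D4B6C
EOF

# Collapse to one entry per seed: each distinct seed is at most one
# independent failure, no matter how many times it was re-run.
grep -o 'Dtests\.seed=[0-9A-F]*' /tmp/tra-repro.txt | sort -u | wc -l   # 3 distinct seeds
```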

-Gus

On Wed, Dec 5, 2018 at 12:02 PM Nicholas Knize <[hidden email]> wrote:
Hi All,

https://issues.apache.org/jira/browse/SOLR-13039 contains a patch that sets a list of commonly failing tests to BadApple. As mentioned above, I cross-referenced our CI builds to make sure there aren't any new test failures that we haven't seen before. Let me know if any of these come as a surprise. I'll plan to commit this change to the 7.6 branch only, to continue along in the release process. Once 7.6 is released I can revert the change to continue CI testing on the bug-fix branch.

Thanks for the patience on this.


Re: recent failures on fucit - how to de-dupe...

Chris Hostetter-3

: Seeing TimeRoutedAliasUpdateProcessorTest on the 7.6 bad apple list, having
: recently been looking at that test, and waiting on a long build for other
: work, I went to http://fucit.org/solr-jenkins-reports/failure-report.html
: to gather recent failures, and when I started looking I began to suspect
: there were duplicates... So I downloaded/extracted everything that comes up

Duplicates of what exactly?

It's not 100% clear what subject/object you're asking about in
regard to "de-dupe"-ing ... it seems like maybe you are worried that
individual failures are being "duplicate counted", but I suspect what
you're actually seeing/confused by is either:

1) That a single failure / reproduce line might exist multiple times in
the logs & test reports for a single jenkins build ID.  This is absolutely
possible because of how our jenkins jobs run.  Depending on what jenkins
server & what target it invokes, some jenkins jobs try to "repro" any
failure that existed during the main test run.

2) That you might see the same exact reproduce line / seed in multiple
jenkins build logs & test reports ... similar to #1, we have other jenkins
jobs with "repro" in their name that only run that bit of logic: looking
at some recent jenkins failures (or other jobs) and running the "reproduce
with" lines from the logs.

...or the combination of both.

: I clicked on each line and opened a tab fore each line in the modal dialog
: and then from each tab downloaded jenkins.log.txt.gz into a folder
: corresponding to the day on the file timestamp

FYI: if that modal dialog box has (XN) next to a jenkins build ID, that
means that within that single jenkins build that test failed multiple times.

: gus$ grep -r 'reprod' * | grep TimeRouted | perl -pe 's/(^[^[]*).*reproduce
: with:(.*Dtests\.seed=(\w+)\s.*)/\3 \1 \2/' | sort
        ....
: What I've done is sort by seed, and found a LOT of duplication, even across
: files and some apparent running of specific test methods. I'd like to
: understand what's happening with the build servers here... why am we seeing
: so many duplicates? I would guess that this really boils down to 1 fail per
: seed value seen? I'm trying to figure out how many and which of these I
: need to consider, and I'm interested in the frequency of different failure
: scenarios which is hard to gauge if there's duplication.

The duplicated "failures" (with identical seeds) in the logs come from
duplicated "runs" (with identical seeds) for the express purpose of trying to figure
out if a given failure is reliably reproducible.

Ie: don't assume that because you see the same "reproduce with" line
duplicated multiple times that the failure stats are "wrong" and the test
isn't failing as often as it seems -- quite the opposite is true: if you
see the "reproduce with" line failing multiple times (either in the same
build, or in two diff builds) then that just re-iterates that the failure
is very easy to reproduce.

        ...
:  gus$ grep -r 'reprod' * | grep Time | perl -pe 's/.*reproduce
: with:(.*Dtests\.seed=(\w+)\s.*)/\2/' | wc -l
:
:       28
:
: So 17 failures listed in fucit.org lead me to find 14 files containing 10
: distinct seeds seeds and 28 lines that contain "reproduce with:"
:
: Not quite sure how to interpret that. Even if I say each seed is a unique
: fail, I have no idea how many total builds that relates to...

I think you need to look more closely at what lines your regexes are
matching -- that "28" is also going to include other failures in the same
job that have nothing to do with the class you are focused on (example:
failures for unrelated tests like TestTlogReplica.testRealTimeGet will
match your grep for "Time")


-Hoss
http://www.lucidworks.com/

---------------------------------------------------------------------
To unsubscribe, e-mail: [hidden email]
For additional commands, e-mail: [hidden email]


Re: recent failures on fucit - how to de-dupe...

Gus Heck
Hi Hoss, 

Thanks, this starts to clear some things up for me. Please let me be clear that I am in no way complaining or requesting changes. The builds page on fucit.org is quite cool. It's very likely that I am confused because I'm lacking knowledge about what builds are doing what. Is there anywhere the various Jenkins builds are listed, maybe with some high-level description?

I had guessed that there was some re-running of individual tests going on based on the repro line options, and it's definitely clear that more than one match is occurring per file, but I wasn't seeing a consistent pattern. What I'm trying to understand is how to pick apart what I see. It's great that we are doing automatic "repro" runs. Till today I didn't realize that was happening automatically. That's awesome :).

I guess I'd ideally want to be able to come up with the following numbers:
  1. How many fresh, clean, completely independent runs were conducted.
  2. How many fresh, clean, completely independent runs failed.
  3. With failures on independent runs identified, I intend to look for and count different types of failures (SSL, ZK session loss, assertion failures, Carrot complaining about thread leaks, etc.). I don't want to spend time sorting through reproductions for that.
Once I have that I want to understand
  1. Which repro runs tie back to the fresh runs (seed probably tells me that)
  2. How many repro runs were attempted pursuant to each fresh run
  3. Failure rate for repro runs attempted pursuant to a fresh run.
  4. Finally, is the failure the same cause in the reproduction as in the initial run (flaky-test bad luck vs. real reproduction)?
So I guess what I'm asking is: what should I be looking at to weed out and quantify the reproduction and duplicative runs vs. fresh runs, so I can understand the base failure rate?

I'd like to venture a guess that a good first pass is that anything with tests.method in the repro line is a reproduction re-run. Do any of the builds re-run at the top level? It appears so based on what I see for seed 2E743D2D45BF625E, but I'm not necessarily convinced that in going through the modal dialogs I didn't wind up grabbing the same build more than once. Do the links in the modal dialogs potentially overlap?
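If that guess holds, a rough split is a one-liner; the sample lines below are made up to illustrate:

```shell
# Hypothetical reproduce lines: the first names a specific method, the
# second is a whole-suite run.
cat > /tmp/repro-lines.txt <<'EOF'
ant test -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.method=testSliceRouting -Dtests.seed=85F52ED219B35581
ant test -Dtestcase=TimeRoutedAliasUpdateProcessorTest -Dtests.seed=C3EC920833C32D9F
EOF

# Under the guess above, lines carrying -Dtests.method= would be
# method-level reproduction re-runs; the rest are suite-level (fresh) runs.
grep -c 'Dtests\.method=' /tmp/repro-lines.txt    # method-level: 1
grep -vc 'Dtests\.method=' /tmp/repro-lines.txt   # suite-level: 1
```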

-Gus



