Help needed in breaking large index file into smaller ones

Help needed in breaking large index file into smaller ones

Narsimha Reddy CHALLA
Hi All,

      My Solr server has a few large index files (say ~10G). I am looking
for some help on breaking them into smaller ones (each < 4G) to satisfy
my application requirements. Are there any such tools available?

Appreciate your help.

Thanks
NRC

RE: Help needed in breaking large index file into smaller ones

Moenieb Davids
Hi,

Try the split command on Linux or Unix:

split -l 100 originalfile.csv

This splits the file into pieces of 100 lines each.

See the other options, such as splitting by size.
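For instance, a size-based split might look like this (the 3G chunk size and the part_ prefix are just illustrative):

split -b 3G originalfile.csv part_

This writes byte-sized pieces named part_aa, part_ab, and so on. Note that -b cuts at byte boundaries, so it can split mid-line.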


Re: Help needed in breaking large index file into smaller ones

Manan Sheth
Does this really work for Lucene index files?

Thanks,
Manan Sheth

Re: Help needed in breaking large index file into smaller ones

Narsimha Reddy CHALLA
No, it does not work by splitting. First of all, Lucene index files are not
text files. There is a segments_NN file which refers to the index files in a
commit. So, when we split a large index file into smaller ones, the
corresponding segments_NN file also needs to be updated to reference the new
files, or a new segments_NN file probably needs to be created.
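(For reference, a typical Lucene index directory looks something like this, though the exact file names vary with segment count and codec:

_0.cfs  _0.cfe  _0.si  _1.cfs  _1.cfe  _1.si  segments_2

where segments_2 is the commit point listing the live segments.)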

Can someone who is familiar with Lucene index files please help us in this
regard?

Thanks
NRC


Re: Help needed in breaking large index file into smaller ones

Yago Riveiro
You can try to reindex your data into another collection with more shards.
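For example, the new collection could be created with more shards via the Collections API and then reindexed into (the names and counts below are placeholders):

curl 'http://localhost:8983/solr/admin/collections?action=CREATE&name=newcollection&numShards=8&replicationFactor=1&collection.configName=myconfig'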

--

/Yago Riveiro


Re: Help needed in breaking large index file into smaller ones

Billnbell
In reply to this post by Narsimha Reddy CHALLA
Can you set the segments setting in your Solr config to a higher number and skip optimizing? You will get smaller files after a new index is created.
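Presumably this means the merge settings in solrconfig.xml; something along these lines (the older <mergePolicy> syntax, values purely illustrative) lets more, smaller segments accumulate per tier before they are merged:

  <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
    <int name="segmentsPerTier">20</int>
  </mergePolicy>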

Can you reindex?

Bill Bell
Sent from mobile



RE: Help needed in breaking large index file into smaller ones

Moenieb Davids
Hi,

Apologies for my response; I did not read the question properly.
I was speaking about splitting files for import.


Re: Help needed in breaking large index file into smaller ones

Mikhail Khludnev-2
In reply to this post by Narsimha Reddy CHALLA
Perhaps you can copy this index into two separate locations, remove the odd
docs from one copy and the even docs from the other, and then force merge
each location to a single segment separately.
Shard splitting in SolrCloud probably does something like that.
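As a rough sketch of that idea in plain Lucene (not the actual SolrCloud mechanism): assume every document has a unique sequential numeric id stored in a string field named "id"; the field name, paths, and id range below are hypothetical.

import java.nio.file.Paths;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.index.Term;
import org.apache.lucene.store.FSDirectory;

public class SplitIndexByParity {

  // Delete every doc whose id has the unwanted parity, then merge the
  // remaining half of the index down to a single segment.
  static void keepHalf(String indexPath, long maxId, boolean deleteOdd)
      throws Exception {
    try (IndexWriter writer = new IndexWriter(
        FSDirectory.open(Paths.get(indexPath)),
        new IndexWriterConfig(new StandardAnalyzer()))) {
      for (long id = deleteOdd ? 1 : 0; id <= maxId; id += 2) {
        writer.deleteDocuments(new Term("id", Long.toString(id)));
      }
      writer.forceMerge(1); // rewrite this copy as one smaller segment
    }
  }

  public static void main(String[] args) throws Exception {
    // copyA and copyB are two plain file-system copies of the original index
    keepHalf("/indexes/copyA", 1000000L, true);  // keeps the even ids
    keepHalf("/indexes/copyB", 1000000L, false); // keeps the odd ids
  }
}

Each copy ends up with roughly half the documents, so after the force merge each index should come out at roughly half the original size.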

--
Sincerely yours
Mikhail Khludnev

Re: Help needed in breaking large index file into smaller ones

Anshum Gupta
Can you provide more information about:
- Are you using Solr in standalone or SolrCloud mode? What version of Solr?
- Why do you want this? Lack of disk space? Uneven distribution of data on
shards?
- Do you want this data together i.e. as part of a single collection?

You can check out the following APIs:
SPLITSHARD:
https://cwiki.apache.org/confluence/display/solr/Collections+API#CollectionsAPI-api3
MIGRATE:
https://cwiki.apache.org/confluence/display/solr/Collections+API#CollectionsAPI-api12
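For example, a shard split can be triggered with a single HTTP call (the collection and shard names here are placeholders):

curl 'http://localhost:8983/solr/admin/collections?action=SPLITSHARD&collection=mycollection&shard=shard1'

SPLITSHARD divides the parent shard's hash range in two and creates both sub-shards on the same node as the parent.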

Among other things, make sure you have enough spare disk-space before
trying out the SPLITSHARD API in particular.

-Anshum




Re: Help needed in breaking large index file into smaller ones

Erick Erickson
Why do you have a requirement that the indexes be < 4G? If it's
arbitrarily imposed, why bother?

Or is it a non-negotiable requirement imposed by the platform you're on?

Because just splitting the files into a smaller set won't help you: if
you then start to index into it, the merge process will just recreate
them.

You might be able to do something with the settings in
TieredMergePolicy in the first place to stop generating files > 4G.
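For instance, something like this in the <indexConfig> section of solrconfig.xml caps the size of merged segments (shown in the Solr 6 mergePolicyFactory form; the 4096 MB value is illustrative, and an explicit optimize/forceMerge still ignores the cap):

  <mergePolicyFactory class="org.apache.solr.index.TieredMergePolicyFactory">
    <double name="maxMergedSegmentMB">4096</double>
  </mergePolicyFactory>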

Best,
Erick

On Mon, Jan 9, 2017 at 3:27 PM, Anshum Gupta <[hidden email]> wrote:

> Can you provide more information about:
> - Are you using Solr in standalone or SolrCloud mode? What version of Solr?
> - Why do you want this? Lack of disk space? Uneven distribution of data on
> shards?
> - Do you want this data together i.e. as part of a single collection?
>
> You can check out the following APIs:
> SPLITSHARD:
> https://cwiki.apache.org/confluence/display/solr/Collections+API#CollectionsAPI-api3
> MIGRATE:
> https://cwiki.apache.org/confluence/display/solr/Collections+API#CollectionsAPI-api12
>
> Among other things, make sure you have enough spare disk-space before
> trying out the SPLITSHARD API in particular.
>
> -Anshum
>
>
>
> On Mon, Jan 9, 2017 at 12:08 PM Mikhail Khludnev <[hidden email]> wrote:
>
>> Perhaps you can copy this index into a separate location. Remove odd and
>> even docs into former and later indexes consequently, and then force merge
>> to single segment in both locations separately.
>> Perhaps shard splitting in SolrCloud does something like that.
>>
>> On Mon, Jan 9, 2017 at 1:12 PM, Narsimha Reddy CHALLA <
>> [hidden email]>
>> wrote:
>>
>> > Hi All,
>> >
>> >       My solr server has a few large index files (say ~10G). I am looking
>> > for some help on breaking them it into smaller ones (each < 4G) to
>> satisfy
>> > my application requirements. Are there any such tools available?
>> >
>> > Appreciate your help.
>> >
>> > Thanks
>> > NRC
>> >
>>
>>
>>
>> --
>> Sincerely yours
>> Mikhail Khludnev
>>
Reply | Threaded
Open this post in threaded view
|

Re: Help needed in breaking large index file into smaller ones

Manan Sheth
Hi Erick,

It's due to some past issues observed with joins on Solr 4, which hit OutOfMemory errors when joining large indexes after optimization/compaction; if the indexes are stored as smaller files, they fit into memory and the operations perform properly. Also, slow writes/commits/updates were observed with large files. Thus, to minimize this risk while upgrading to Solr 6, we wanted to store the indexes in smaller files.

Thanks,
Manan Sheth

Re: Help needed in breaking large index file into smaller ones

Manan Sheth
Additionally, to answer Anshum's queries:

We are currently using Solr 4.10 and planning to upgrade to Solr 6.2.1; the upgrade process is creating the current problem.

We are using SolrCloud with 8-10 shards split across different nodes; segment size is ~30 GB for some collections, and ranges from 10-12 GB across the board.

This is due to performance concerns and the partial lack of large RAM (currently ~32 GB/node).

Yes, we want the data together in a single collection.

Thanks,
Manan Sheth