Teradata to Hadoop Migration


Teradata to Hadoop Migration

Bhagaban Khatai
Hi guys,

I need some quick help from anyone who has done a Teradata (TD) to Hadoop migration project.
We have a very tight deadline, and I am trying to find a tool (online or paid) to speed up the development.

Please help us out here, and let me know if there is any other way to get the development done quickly.

Bhagaban

Re: Teradata to Hadoop Migration

Rakesh Radhakrishnan-2
Hi Bhagaban,

Perhaps you can try Apache Sqoop to transfer the data from Teradata to Hadoop. Sqoop provides an efficient way of transferring bulk data between Hadoop and structured data stores. Support for a particular data store is added in the form of a so-called connector, and it can connect to various databases, including Oracle and Teradata.

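Just to give you a rough feel for it, a basic import could look something like the sketch below. The host, database, table and user names are only placeholders, and you would also need the Teradata JDBC driver jars on Sqoop's classpath (e.g. in Sqoop's lib directory).

  # Minimal sketch (placeholder names): pull one Teradata table into a Hive staging
  # table over plain JDBC, splitting the work across 8 mappers on a numeric key.
  sqoop import \
    --connect jdbc:teradata://td-host/DATABASE=sales_db \
    --driver com.teradata.jdbc.TeraDriver \
    --username etl_user -P \
    --table ORDERS \
    --split-by ORDER_ID \
    --num-mappers 8 \
    --hive-import \
    --hive-table staging.orders
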
I hope the links below will be helpful to you.

Below are also a few data ingestion tools; you can probably dig into them further.

Thanks,
Rakesh



Re: Teradata to Hadoop Migration

Sudhir.Kumar

Hi Bhagaban,

Data migration can be achieved with Sqoop. However, if you are also looking to migrate the ETL/ELT application layer, then you will have to look into converting the ELT SQL into code for a MapReduce-based engine such as Hive or Pig. You could also build a conversion tool for this; a small illustration of what such a conversion involves is below.

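As an example of the kind of rewrite involved (the table and column names here are invented), a Teradata QUALIFY clause has no direct Hive equivalent and is usually turned into a windowed subquery:

  # Teradata original (keep the latest order per customer):
  #   SELECT cust_id, order_ts, amount
  #   FROM   sales_db.orders
  #   QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY order_ts DESC) = 1;
  #
  # Hive rewrite of the same logic, run through the CLI:
  hive -e "
    SELECT cust_id, order_ts, amount
    FROM (
      SELECT cust_id, order_ts, amount,
             ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY order_ts DESC) AS rn
      FROM staging.orders
    ) t
    WHERE t.rn = 1
  "
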
Thanks,

Sudhir Kumar

“Your present circumstances don’t determine where you can go; they merely determine where you start”. — Nido Qubein


Re: Teradata to Hadoop Migration

Gmail-2
In reply to this post by Rakesh Radhakrishnan-2
Hi Bhagaban

I have seen a more efficient way of transferring the data: FastExport it out of Teradata and put the files directly into HDFS, then create the matching structures on top of them in Hive or Pig. We developed a DDL transformation script in shell that converts the Teradata DDL to Hive DDL; that automation effort does not take much time. We did face some challenges converting the timestamp and date data types to the Parquet format, but those were resolved with a later version of Hive. I would also suggest checking the Teradata Connector for Hadoop, a tool developed by Teradata that can choose between different data transfer strategies on its own. A very rough sketch of the flow is below.

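The paths, file names and sed rules below are only illustrative (a real converter needs many more rules, e.g. dropping PRIMARY INDEX clauses), and the sketch assumes a FastExport/TPT job has already landed a delimited extract on the edge node.

  # 1) push the delimited extract into HDFS
  EXPORT_FILE=/data/exports/orders.tsv
  HDFS_DIR=/warehouse/staging/orders
  hdfs dfs -mkdir -p "$HDFS_DIR"
  hdfs dfs -put -f "$EXPORT_FILE" "$HDFS_DIR/"

  # 2) naively rewrite the Teradata DDL into Hive DDL
  sed -e 's/CREATE MULTISET TABLE/CREATE EXTERNAL TABLE/' \
      -e 's/CREATE SET TABLE/CREATE EXTERNAL TABLE/' \
      -e 's/BYTEINT/TINYINT/g' \
      orders_td.ddl > orders_hive.ddl

  # 3) create the table (the generated DDL still needs a LOCATION pointing at $HDFS_DIR)
  hive -f orders_hive.ddl
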
Thanks
Asim



Re: Teradata to Hadoop Migration

Sandeep Khurana

I would suggest also looking at TPT (Teradata Parallel Transporter) from Teradata, which is much faster than Sqoop.

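For example (the script and file names below are made up, and the .tpt script itself is not shown), a TPT export job is usually kicked off with tbuild, and its output can then be pushed into HDFS just like a FastExport file:

  # run a TPT job defined in export_orders.tpt (Export operator feeding a DataConnector
  # writer), with connection details kept in a separate job variables file
  tbuild -f export_orders.tpt -v job_vars.txt -j export_orders_job

  # then land the resulting delimited file in HDFS
  hdfs dfs -put -f /data/exports/orders.tsv /warehouse/staging/orders/
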


Re: Teradata to Hadoop Migration

Bhagaban Khatai
In reply to this post by Rakesh Radhakrishnan-2
Thanks, Rakesh, for the useful information. We are already using Sqoop for the data transfer, and we are implementing all the TD logic through Hive.
But it is taking time, because we are re-implementing the same logic from the mappings provided by the TD team.

What I want is some tool or ready-made framework so that the development effort would be less.

Thanks in advance for your help.

Bhagaban 


Re: Teradata to Hadoop Migration

Rakesh Radhakrishnan-2
Sorry, I don't have much insight into this beyond basic Sqoop. AFAIK it is more vendor-specific, so you may need to dig further in that direction.

Thanks,
Rakesh


Re: Teradata to Hadoop Migration

Wei-Chiu Chuang
Hi,

I think Cloudera Navigator Optimizer is the tool you are looking for. It helps you transform your (TD) SQL queries into Impala and Hive.
I hope this doesn't sound like a sales pitch. If you're a paying Cloudera customer, you should reach out to your account/support team for more information.

*disclaimer: I work for Cloudera

Wei-Chiu Chuang
A very happy Clouderan


Re: Teradata to Hadoop Migration

praveenesh kumar
From the TD perspective, have a look at this: https://youtu.be/NTTQdAfZMJA. They are planning to open-source it. Perhaps you can get in touch with the team; let me know if you are interested. If you have TD contacts, ask them about this, and they should be able to point you to the right people.

Again, this is not a sales pitch. This tool looks like what you are looking for and will be open source soon. Let me know if you want to get in touch with the folks who are working on it.

Regards
Prav


Re: Teradata to Hadoop Migration

Arun Natva
Bhagaban,
The first step is to ingest the data into Hadoop using Sqoop.
Teradata has powerful connectors for Hadoop; the connectors are installed on all the data nodes, and the imports then run using FastExport and similar utilities.

The challenge will be recreating in Hadoop the same workflows that you had in Teradata.

Teradata is rich in features compared to Hive and Impala.

Data in Teradata is usually encrypted, so please make sure you have HDFS encryption at rest enabled.

You can use Oozie to create a chain of SQL steps that mimics your ETL jobs written in DataStage, Informatica, or TD itself, as roughly sketched below.

Please note that TD may still perform better than Hadoop, since its proprietary hardware and software are highly optimized; Hadoop, on the other hand, can save you money.

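As a simplified stand-in for the Oozie idea (in production you would express this as an Oozie workflow with Hive actions; the script names here are invented), even a plain shell wrapper shows the shape of such a chain:

  #!/bin/bash
  # run the converted Hive scripts in dependency order and stop on the first failure
  set -euo pipefail
  for script in 01_stage_orders.hql 02_transform_orders.hql 03_load_mart.hql; do
    echo "running $script"
    hive -f "$script"
  done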