way to add custom udf jar in hadoop 2.x version

way to add custom udf jar in hadoop 2.x version

reena upadhyay-2
Hi,

I am using Hadoop 2.4.0 and have created a custom UDF jar. I am trying to execute a simple SELECT query that calls the UDF from a Java Hive JDBC client program. When Hive runs the query as a MapReduce job, the execution fails because the mapper cannot locate the UDF class.
So I would like to add the UDF jar to the Hadoop environment permanently. Please suggest a way to add this external jar on both single-node and multi-node Hadoop clusters.

PS: I am using Hive 0.13.1 and have already added this custom UDF jar to the HIVE_HOME/lib directory.
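
For context, a minimal sketch of the kind of JDBC client involved, and of the session-scoped ADD JAR / CREATE TEMPORARY FUNCTION workaround I would like to avoid (the jar path, class, function, and table names below are placeholders, not my real ones):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch only: my-udf.jar, com.example.hive.udf.MyUpper, my_upper and the
    // employees table are illustrative placeholders.
    public class UdfJdbcExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "hive", "");
                 Statement stmt = conn.createStatement()) {
                // Session-scoped setup: both statements have to be repeated in every
                // new session, and the jar path must be visible to the HiveServer2 host.
                stmt.execute("ADD JAR /tmp/my-udf.jar");
                stmt.execute("CREATE TEMPORARY FUNCTION my_upper AS 'com.example.hive.udf.MyUpper'");
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT my_upper(name) FROM employees LIMIT 10")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }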


Thanks

Re: way to add custom udf jar in hadoop 2.x version

Ted Yu-3
Have you seen this thread?


Re: way to add custom udf jar in hadoop 2.x version

Alexander Pivovarov
In reply to this post by reena upadhyay-2

I found that the easiest way is to put the UDF jar into /usr/lib/hadoop-mapred on every machine in the cluster. The Hive CLI, HiveServer2, the Oozie launcher, Oozie Hive actions, and MapReduce jobs will then all see the jar. I'm using HDP 2.1.5.


Re: way to add custom udf jar in hadoop 2.x version

Niels Basjes
In reply to this post by Ted Yu-3

Thanks for the pointer.
This seems to work for functions. Is there something similar for CREATE EXTERNAL TABLE?

Niels


Re: way to add custom udf jar in hadoop 2.x version

Ted Yu-3
In reply to this post by Alexander Pivovarov
The location of the lib jars can change between releases.
In HDP 2.2, for example, /usr/lib/hadoop-mapred no longer exists.

FYI


Re: way to add custom udf jar in hadoop 2.x version

Binglin Chang
On the HiveServer2 machine, create a directory HIVE_HOME/auxlib and put all the extra jars there. When HiveServer2 starts, it automatically picks up every jar in that directory and sets hive.aux.jars.path accordingly.
Every new session then gets those jars added automatically, so you don't need to add them yourself.
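
For example, once the jar is sitting in HIVE_HOME/auxlib, a JDBC client no longer needs ADD JAR, and with Hive 0.13+ the UDF can even be registered once as a permanent function. A rough sketch, with placeholder class, function, and table names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch only: assumes the UDF jar is already in HIVE_HOME/auxlib on the
    // HiveServer2 host; com.example.hive.udf.MyUpper, my_upper and employees
    // are placeholders.
    public class AuxlibUdfExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "hive", "");
                 Statement stmt = conn.createStatement()) {
                // No ADD JAR needed: HiveServer2 already has the jar on hive.aux.jars.path.
                // Hive 0.13+ supports permanent functions, so this only has to run once.
                stmt.execute("CREATE FUNCTION my_upper AS 'com.example.hive.udf.MyUpper'");
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT my_upper(name) FROM employees LIMIT 10")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }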
