Minimal requirements for building libhdfs

8 messages
Minimal requirements for building libhdfs

alakshman
Hi All

What are the minimal requirements on my Linux machine for building libhdfs? On
my Linux box I do not seem to have jni.h. What other binaries do I need for
this to work? Could someone please tell me the easiest way to get this done?

Thanks
Avinash

Re: Minimal requirements for building libhdfs

Doug Cutting
Phantom wrote:
> What are minimal requirements on my Linux machine for building libhdfs? On
> my Linux box I do not seem to have jni.h and what are the other binaries I
> need for this to work? Could someone please tell me what is the easiest way
> to get this done?

The following wiki page should help:

http://wiki.apache.org/lucene-hadoop/NativeHadoop

Doug

Re: Minimal requirements for building libhdfs

alakshman
I am running into a weird build problem. I am building this on a Fedora Linux
x86 64-bit machine, but the build is producing an AMD64 library. How can I fix
this? Here is the error from the build:

[exec] then mv -f ".deps/ZlibDecompressor.Tpo" ".deps/ZlibDecompressor.Plo";
else rm -f ".deps/ZlibDecompressor.Tpo"; exit 1; fi
     [exec]  gcc -DHAVE_CONFIG_H -I. -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib
-I../../../../../../.. -I/usr/local/jdk1.5.0_07/include
-I/usr/local/jdk1.5.0_07/include/linux -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src -g -Wall -fPIC -O2 -m64 -g -O2 -MT
ZlibDecompressor.lo-MD -MP -MF .deps/ZlibDecompressor.Tpo -c
/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c
-fPIC -DPIC -o .libs/ZlibDecompressor.o
     [exec]  gcc -DHAVE_CONFIG_H -I. -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib
-I../../../../../../.. -I/usr/local/jdk1.5.0_07/include
-I/usr/local/jdk1.5.0_07/include/linux -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src -g -Wall -fPIC -O2 -m64 -g -O2 -MT
ZlibDecompressor.lo-MD -MP -MF .deps/ZlibDecompressor.Tpo -c
/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c
-o ZlibDecompressor.o >/dev/null 2>&1
     [exec] /bin/sh ../../../../../../../libtool --tag=CC --mode=link gcc -g
-Wall -fPIC -O2 -m64 -g -O2 -L/usr/local/jdk1.5.0_07/jre/lib/amd64/server
-o libnativezlib.la   ZlibCompressor.lo ZlibDecompressor.lo -ldl -ljvm -ljvm
-ldl
     [exec] ar cru .libs/libnativezlib.a .libs/ZlibCompressor.o
.libs/ZlibDecompressor.o
     [exec] ranlib .libs/libnativezlib.a
     [exec] creating libnativezlib.la
     [exec] (cd .libs && rm -f libnativezlib.la && ln -s ../libnativezlib.la
libnativezlib.la)
     [exec] make[2]: Leaving directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-amd64-64/src/org/apache/hadoop/io/compress/zlib'
     [exec] Making all in src/org/apache/hadoop/io/compress/lzo
     [exec] make[2]: Entering directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-amd64-64/src/org/apache/hadoop/io/compress/lzo'
     [exec] if /bin/sh ../../../../../../../libtool --tag=CC --mode=compile
gcc -DHAVE_CONFIG_H -I.
-I/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/lzo
-I../../../../../../..  -I/usr/local/jdk1.5.0_07/include
-I/usr/local/jdk1.5.0_07/include/linux -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src  -g -Wall -fPIC -O2 -m64 -g -O2 -MT
LzoCompressor.lo-MD -MP -MF ".deps/LzoCompressor.Tpo" -c -o
LzoCompressor.lo
/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c;
\
     [exec] then mv -f ".deps/LzoCompressor.Tpo" ".deps/LzoCompressor.Plo";
else rm -f ".deps/LzoCompressor.Tpo"; exit 1; fi
     [exec] mkdir .libs
     [exec]  gcc -DHAVE_CONFIG_H -I. -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src/org/apache/hadoop/io/compress/lzo
-I../../../../../../.. -I/usr/local/jdk1.5.0_07/include
-I/usr/local/jdk1.5.0_07/include/linux -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src -g -Wall -fPIC -O2 -m64 -g -O2 -MT
LzoCompressor.lo-MD -MP -MF .deps/LzoCompressor.Tpo -c
/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c
-fPIC -DPIC -o .libs/LzoCompressor.o
     [exec] /home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:
In function 'Java_org_apache_hadoop_io_compress_lzo_LzoCompressor_initIDs':
     [exec] /home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:116:
error: syntax error before ',' token
     [exec] make[2]: *** [LzoCompressor.lo] Error 1
     [exec] make[2]: Leaving directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-amd64-64/src/org/apache/hadoop/io/compress/lzo'
     [exec] make[1]: *** [all-recursive] Error 1
     [exec] make[1]: Leaving directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-amd64-64'
     [exec] make: *** [all] Error 2

BUILD FAILED
/home/alakshman/FB-Projects/hadoop-0.13.0/build.xml:285: exec returned: 2



Re: Minimal requirements for building libhdfs

Arun C Murthy-2
There are two distinct native components to Hadoop:
a) libhdfs - JNI-based C bindings for the Hadoop DFS Java libraries.
b) libhadoop - Native libraries for core Hadoop (for now we have the lzo and zlib compression libraries: http://wiki.apache.org/lucene-hadoop/NativeHadoop, as Doug pointed out).

On Fri, Jun 08, 2007 at 04:14:04PM -0700, Phantom wrote:
>I am running into this wierd build problem - I am building this on Fedora
>Linux x86 64 bit machine but the build is spitting out AMD64 library. How
>can I fix this ? Here is the error from the build :
>

Looking at the errors, two data points stand out:

>    [exec] /bin/sh ../../../../../../../libtool --tag=CC --mode=link gcc -g
>-Wall -fPIC -O2 -m64 -g -O2 -L/usr/local/jdk1.5.0_07/jre/lib/amd64/server
>-o libnativezlib.la   ZlibCompressor.lo ZlibDecompressor.lo -ldl -ljvm -ljvm

and

>    [exec] Making all in src/org/apache/hadoop/io/compress/lzo
>    [exec] make[2]: Entering directory `/home/alakshman/FB-Projects/hadoop-
>0.13.0/build/native/Linux-amd64-64/src/org/apache/hadoop/io/compress/lzo'

(emphasis on -L/usr/local/jdk1.5.0_07/jre/lib/amd64/server and 0.13.0/build/native/Linux-amd64-64/src/org/apache/hadoop/io/compress/lzo)

These lead me to suspect that you have an amd64 build of the JVM installed... could you run org.apache.hadoop.util.PlatformName and check? Also double-check the 'JAVA_HOME' env. variable...

Arun
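
The check Arun describes can be sketched as follows (assumptions: JAVA_HOME points at the JDK in question and the compiled Hadoop classes live in build/classes under the source tree):

```shell
# Compare what the OS and the JVM each report for the platform.
uname -m                      # x86_64 here means a 64-bit (amd64) machine
if [ -n "$JAVA_HOME" ]; then
  "$JAVA_HOME/bin/java" -version 2>&1 | head -1
  # PlatformName prints the <os>-<arch>-<bits> string Hadoop uses
  # for the build/native/<name> output directory.
  "$JAVA_HOME/bin/java" -classpath build/classes org.apache.hadoop.util.PlatformName
fi
```

If uname reports x86_64 but you want 32-bit output, the JVM on JAVA_HOME, not the machine, decides which library the build produces.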


Re: Minimal requirements for building libhdfs

alakshman
I ran org.apache.hadoop.util.PlatformName and it says the platform is AMD64.
Your guess that I have an AMD64 build of Java is correct. Now, is there a
64-bit install for x86 Linux? I don't seem to find one on the Sun website.
Does that mean I have to run this with 32-bit binaries for Java, and for lzo
and zlib too? Please advise.



Re: Minimal requirements for building libhdfs

alakshman
I tried building with a 32-bit install of Java. Now I get the following error
w.r.t. zlib. What is the workaround?

[exec] /usr/bin/make  all-recursive
     [exec] make[1]: Entering directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-i386-32'
     [exec] Making all in src/org/apache/hadoop/io/compress/zlib
     [exec] make[2]: Entering directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib'
     [exec] if /bin/sh ../../../../../../../libtool --tag=CC --mode=compile
gcc -DHAVE_CONFIG_H -I.
-I/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib
-I../../../../../../..  -I/home/alakshman/jdk1.5.0_06/include
-I/home/alakshman/jdk1.5.0_06/include/linux
-I/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src  -g -Wall -fPIC
-O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF ".deps/ZlibCompressor.Tpo"
-c -o ZlibCompressor.lo
/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c;
\
     [exec] then mv -f ".deps/ZlibCompressor.Tpo"
".deps/ZlibCompressor.Plo"; else rm -f ".deps/ZlibCompressor.Tpo"; exit 1;
fi
     [exec] mkdir .libs
     [exec]  gcc -DHAVE_CONFIG_H -I. -I/home/alakshman/FB-Projects/hadoop-
0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib
-I../../../../../../.. -I/home/alakshman/jdk1.5.0_06/include
-I/home/alakshman/jdk1.5.0_06/include/linux
-I/home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src -g -Wall -fPIC
-O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo
-c /home/alakshman/FB-Projects/hadoop-0.13.0/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
-fPIC -DPIC -o .libs/ZlibCompressor.o
     [exec] In file included from /usr/include/features.h:337,
     [exec]                  from /usr/include/stdio.h:28,
     [exec]                  from /home/alakshman/FB-Projects/hadoop-0.13.0
/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c:24:
     [exec] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such
file or directory
     [exec] make[2]: *** [ZlibCompressor.lo] Error 1
     [exec] make[2]: Leaving directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib'
     [exec] make[1]: *** [all-recursive] Error 1
     [exec] make[1]: Leaving directory `/home/alakshman/FB-Projects/hadoop-
0.13.0/build/native/Linux-i386-32'
     [exec] make: *** [all] Error 2

Thanks
A


Re: Minimal requirements for building libhdfs

Arun C Murthy-2
On Mon, Jun 11, 2007 at 11:16:55AM -0700, Phantom wrote:
>I tried building with a 32 bit install of Java. Now I get the following
>error w.r.t zlib. What is the workaround for me ?
>

So, I assume you want libhadoop rather than libhdfs...

Sorry if my previous reply was vague: amd64 *is* x86-64 (aka x64/IA-32e/EM64T, etc.), so that isn't the problem.

If you are not particular about using a 64-bit jvm, and Linux is your OS, I'd suggest you first grab the Linux-i386-32 pre-built libhadoop from the 0.13.0 release and try it...

If that doesn't work for you please double-check your (preferred) 64/32-bit platform and ensure you have the relevant jvm and zlib/lzo libs.

Arun
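
As for the gnu/stubs-32.h error itself: it usually means the 32-bit glibc development headers are missing on a 64-bit host, so a -m32 build cannot find its C library stubs. On Fedora, something like the following should pull them in (the exact package names are an assumption and vary by distro release):

```shell
# Install 32-bit glibc headers/runtime for -m32 builds on a 64-bit Fedora box.
# On newer Fedora releases the suffix is .i686 rather than .i386.
yum install glibc-devel.i386 libgcc.i386
```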


Re: Minimal requirements for building libhdfs

alakshman
So what I want is a C/C++ interface for writing to HDFS. What do I have to do
to achieve this?

Thanks
A
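
For reference, writing to HDFS from C via libhdfs looks roughly like this. This is a sketch, not a tested program: it assumes libhdfs is built, hdfs.h (from src/c++/libhdfs) is on the include path, a namenode is reachable via the configured default filesystem, and /tmp/testfile.txt is just an example path:

```c
#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include <stdio.h>
#include <string.h>
#include "hdfs.h"    /* libhdfs header from the Hadoop source tree */

int main(void) {
    /* "default" with port 0 uses fs.default.name from the Hadoop config */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) { fprintf(stderr, "hdfsConnect failed\n"); return 1; }

    const char *path = "/tmp/testfile.txt";  /* example path */
    /* 0s request default buffer size, replication, and block size */
    hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!out) { fprintf(stderr, "hdfsOpenFile failed\n"); return 1; }

    const char *msg = "hello, hdfs\n";
    if (hdfsWrite(fs, out, (void *)msg, strlen(msg)) < 0 ||
        hdfsFlush(fs, out) != 0) {
        fprintf(stderr, "write/flush failed\n");
        return 1;
    }
    hdfsCloseFile(fs, out);
    hdfsDisconnect(fs);
    return 0;
}
```

At build time you compile against hdfs.h and the JNI headers and link with -lhdfs -ljvm; at run time the CLASSPATH must contain the Hadoop jars and config directory so the embedded JVM can find the DFS client classes.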
