HDFS Installation

HDFS Installation

Ekta Agrawal
Can anybody suggest a good tutorial for installing and working with HDFS?

I installed Hadoop on Ubuntu as a single node. I can see the services running.

But how do I install and work with HDFS? Please give some guidance.

Re: HDFS Installation

Mahesh Khandewal
I think HDFS already comes with the Hadoop installation itself.
You just need to run the start script, i.e.
bin/start-dfs.sh under your $HADOOP_HOME path.
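A sketch of how this typically looks on a Hadoop 1.x single-node install; the /usr/local/hadoop location is an assumption taken from the transcripts later in this thread, so adjust it to your own layout:

```shell
# Assumes a Hadoop 1.x tarball install at /usr/local/hadoop (adjust to your layout).
export HADOOP_HOME=/usr/local/hadoop

# One-time only, BEFORE the very first start: format the NameNode.
# (Re-running this wipes all existing HDFS metadata.)
$HADOOP_HOME/bin/hadoop namenode -format

# Start just the HDFS daemons: NameNode, DataNode, SecondaryNameNode.
$HADOOP_HOME/bin/start-dfs.sh

# List the running Java daemons to confirm they came up.
jps
```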




Re: HDFS Installation

Sergey Murylev
In reply to this post by Ekta Agrawal
Hi Ekta,

You can have a look at the following instructions:
single node cluster
multi node cluster

Actually, I recommend using an automated configuration/deployment tool such as Cloudera Manager or Ambari.

--
Thanks,
Sergey





Re: HDFS Installation

Ekta Agrawal
In reply to this post by Mahesh Khandewal
I have already used the same guide to install Hadoop.

If HDFS does not require anything beyond the Hadoop single-node installation, then the installation part is complete.

I tried running:
bin/hadoop dfs -mkdir /foodir
bin/hadoop dfsadmin -safemode enter

These commands give the following exception:

14/04/07 00:23:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).
Bad connection to FS. command aborted. exception: Call to localhost/127.0.0.1:54310 failed on connection exception:
java.net.ConnectException: Connection refused

Can somebody help me understand why this is happening?
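"Connection refused" on port 54310 usually means nothing is listening on the NameNode RPC port. A few hedged checks (the log path assumes the /usr/local/hadoop install shown later in this thread):

```shell
# 1. Is a NameNode JVM running at all?
jps | grep -i NameNode

# 2. Is anything listening on the RPC port configured in core-site.xml (fs.default.name)?
netstat -tln | grep 54310

# 3. If not, the NameNode log usually explains why it died on startup.
tail -n 50 /usr/local/hadoop/logs/hadoop-*-namenode-*.log
```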








Re: HDFS Installation

Mingjiang Shi
Was the NameNode started? Also check out this Stack Overflow question to see if the solution works for you: http://stackoverflow.com/questions/8872807/hadoop-datanodes-cannot-find-namenode
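Checking whether the NameNode actually started could look like this on a Hadoop 1.x single-node box (port 50070 is the 1.x default for the NameNode web UI):

```shell
# Run as the user that started Hadoop; the output should list
# NameNode, DataNode and SecondaryNameNode among the Java processes.
jps

# The Hadoop 1.x NameNode web UI listens on port 50070 by default.
curl -s http://localhost:50070/ | head
```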


--
Cheers
-MJ

Re: HDFS Installation

Mahesh Khandewal
In reply to this post by Ekta Agrawal
Ekta, it may be an SSH problem. First check SSH.
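The Hadoop start scripts launch each daemon over SSH, even on a single node, so a quick sanity check might be:

```shell
# Must run a remote command without prompting for a password;
# if it asks for one, the Hadoop start scripts will stumble too.
ssh localhost 'echo ssh ok'

# Is the SSH server running at all? (Ubuntu)
service ssh status
```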





Re: HDFS Installation

Ekta Agrawal
Hi,

I started with the "ssh localhost" command.
Is anything else needed to check SSH?

Then I stopped all the running services with "stop-all.sh"
and started them again with "start-all.sh".

I have copied the terminal output for some of the commands.

I don't know why, after start-all.sh, it says "starting namenode" and shows no failure, but
when I check with jps it does not list the NameNode.

I also tried opening the NameNode web UI in a browser; it does not open either.

----------------------------------------------------------------------------------------------------------------------------------------

This is how it executed on the terminal:

hduser@ubuntu:~$ ssh localhost
hduser@localhost's password:
Welcome to Ubuntu 12.04.2 LTS

 * Documentation:  https://help.ubuntu.com/

459 packages can be updated.
209 updates are security updates.

Last login: Sun Feb  2 00:28:46 2014 from localhost




hduser@ubuntu:~$ /usr/local/hadoop/bin/hadoop namenode -format
14/04/07 01:44:20 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = ubuntu/127.0.0.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.0.3
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1335192; compiled by 'hortonfo' on Tue May  8 20:31:25 UTC 2012
************************************************************/
Re-format filesystem in /app/hadoop/tmp/dfs/name ? (Y or N) y
Format aborted in /app/hadoop/tmp/dfs/name
14/04/07 01:44:27 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.0.1
************************************************************/


hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-ubuntu.out
ehduser@localhost's password:
hduser@localhost's password: localhost: Permission denied, please try again.
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-ubuntu.out
hduser@
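One detail worth noting in the transcript above: the re-format prompt was answered with a lowercase "y" and Hadoop immediately reported "Format aborted". The Hadoop 1.x format prompt appears to be case-sensitive, so the NameNode directory may never have been (re)formatted; a hedged fix:

```shell
# The "(Y or N)" prompt in Hadoop 1.x only accepts a capital "Y";
# a lowercase "y" aborts, as the transcript shows.
# WARNING: re-formatting erases all existing HDFS metadata under dfs.name.dir.
/usr/local/hadoop/bin/hadoop namenode -format
# At the prompt, answer with an uppercase Y:
# Re-format filesystem in /app/hadoop/tmp/dfs/name ? (Y or N) Y
```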


 





Re: HDFS Installation

Dejan Menges

Your output says "Permission denied" for SSH to localhost. Try to fix that first (there are a bunch of tutorials on passwordless SSH setup).
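A minimal sketch of passwordless SSH to localhost for the hduser account (standard OpenSSH steps, not specific to this thread):

```shell
# Generate a key pair with an empty passphrase (skip if ~/.ssh/id_rsa already exists).
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

# Authorize the key for logins to this same machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys

# Should now log in without asking for a password.
ssh localhost 'echo ok'
```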




Re: HDFS Installation

Mohammad Tariq
Hi Ekta,

Could you please show me the log files?

Also, you might find this helpful in case you are still in the setup phase: http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.U0rbYeaSyUd

Let me know if you face any issue.

Warm Regards,
Tariq
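For reference, on the tarball install shown earlier in the thread the daemon logs would live under $HADOOP_HOME/logs; the NameNode log is the one to inspect when jps shows no NameNode (the file name below is inferred from the .out paths in the transcript):

```shell
# List the daemon logs.
ls /usr/local/hadoop/logs/

# Read the end of the NameNode log; a daemon that dies right after
# "starting namenode" almost always leaves a FATAL or ERROR line here.
tail -n 100 /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.log
grep -E 'FATAL|ERROR' /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.log | tail
```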





Re: HDFS Installation

Mahesh Khandewal
In reply to this post by Ekta Agrawal
Hi Ekta,

When I had an SSH connection problem, I updated and then upgraded Ubuntu (there are two commands for this; check Google), and after that tried the SSH commands again. Also, on YouTube there is a video for installing a single-node setup along with a Pig setup; in that video you will get somewhat more accurate single-node setup instructions. Moreover, I found that watching videos and then working on the installation works much better than reading and installing.
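The two Ubuntu commands alluded to above are presumably the standard package refresh and upgrade; checking that the SSH server is actually installed is worth doing at the same time:

```shell
# Refresh the package index, then upgrade installed packages.
sudo apt-get update
sudo apt-get upgrade

# Make sure the SSH server itself is present before retrying "ssh localhost".
sudo apt-get install openssh-server
```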

