Configuring SSH - is it required for pseudo-distributed mode?


Raj Hadoop-2
 Hi,
 
I have a dedicated user for Hadoop on a Linux server, and I am installing Hadoop in pseudo-distributed mode on this box to test my programs. But I see that the installation steps say SSH needs to be configured. If it is a single node, I don't require it, right? Please advise.
 
I was looking at this site
 
It mentioned the following:
"
Hadoop requires SSH access to manage its nodes, i.e. remote machines plus your local machine if you want to use Hadoop on it (which is what we want to do in this short tutorial). For our single-node setup of Hadoop, we therefore need to configure SSH access to localhost for the hduser user we created in the previous section.
"
 
Thanks,
Raj
 

Re: Configuring SSH - is it required for pseudo-distributed mode?

kishore alajangi
When you start the Hadoop processes, each process asks for a password before it starts; we configure SSH to avoid this, whether you run a single node or multiple nodes. If you are willing to type the password for each process, it is not mandatory, even across multiple systems.
 
Thanks,
Kishore.




Re: Configuring SSH - is it required for pseudo-distributed mode?

jay vyas
Yes, it is required -- in pseudo-distributed mode the JobTracker is not necessarily aware that the TaskTrackers / DataNodes are on the same machine, and will therefore attempt to SSH into them when starting the respective daemons (e.g. via start-all.sh).



--
Jay Vyas
http://jayunit100.blogspot.com

Re: Configuring SSH - is it required for pseudo-distributed mode?

jay vyas
Actually, I should amend my statement -- SSH is required, but passwordless SSH you can, I suppose, live without if you are willing to enter your password for each process that gets started.

But why wouldn't you want passwordless SSH on a pseudo-distributed cluster? It's very easy to set up on a single node:

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
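For reference, a minimal sketch of the full passwordless-SSH-to-localhost setup for the dedicated hadoop user (assumes OpenSSH with default paths; run as that user, not as root):

```shell
# Create ~/.ssh if missing and generate an RSA key pair with an empty
# passphrase (skip generation if a key already exists)
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# Append (note: >>) the public key to this same account's authorized_keys
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Afterwards, `ssh localhost exit` should succeed without a password prompt.
```

The very first `ssh localhost` will still ask you to confirm the host key; answer yes once, and the start scripts run unattended after that.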







--
Jay Vyas
http://jayunit100.blogspot.com

Re: Configuring SSH - is it required for pseudo-distributed mode?

Raj Hadoop-2
Hi,
 
I am a bit confused here. I am planning to run on a single machine.
 
So what should I do to start the Hadoop processes? How should I set up SSH? Can you please briefly explain what SSH is?
 
Thanks,
Raj



Re: Configuring SSH - is it required for pseudo-distributed mode?

Mohammad Tariq
Hello Raj,

     SSH is actually two things:
1- ssh : the command we use to connect to remote machines - the client.
2- sshd : the daemon that runs on the server and lets clients connect to it.
The ssh client is usually pre-installed on Linux, but to run the sshd daemon you need to install the SSH server package first.

To start the Hadoop daemons, make SSH passwordless and run bin/start-dfs.sh and bin/start-mapred.sh.
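A quick way to check both halves on your box (a sketch; package and service names vary by distro -- e.g. openssh-clients / openssh-server on RHEL-family systems):

```shell
# Check for the OpenSSH client binary and for a running sshd daemon
if command -v ssh >/dev/null; then client="installed"; else client="missing"; fi
if pgrep -x sshd >/dev/null; then server="running"; else server="not running"; fi
echo "ssh client: $client"
echo "sshd: $server"
```

If sshd is not running, install the server package and start the service before attempting the Hadoop start scripts.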

You might find this link useful.

Warm Regards,
Tariq






Error while extracting hadoop tar

Raj Hadoop-2
I am getting the following error. Does this mean the tar file is corrupted? Do I need to download it again? Please advise.
$ tar xzf hadoop-1.1.2.tar.gz
gzip: stdin: unexpected end of file
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now







Re: Error while extracting hadoop tar

Mohammad Tariq
It means the file is either truncated or was not downloaded completely.

Warm Regards,
Tariq
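You can confirm this without extracting anything: `gzip -t` reads the whole compressed stream and reports truncation. A small self-contained demonstration (the file names here are made up for the demo):

```shell
# Build a tiny valid archive, then simulate an interrupted download by truncating it
printf 'hello' > payload.txt
tar czf good.tar.gz payload.txt
head -c 20 good.tar.gz > truncated.tar.gz
# gzip -t exits 0 for a complete stream and non-zero for a truncated one
gzip -t good.tar.gz && echo "good.tar.gz: OK"
gzip -t truncated.tar.gz 2>/dev/null || echo "truncated.tar.gz: corrupt"
```

Running `gzip -t hadoop-1.1.2.tar.gz` on the real download tells you the same thing; if it fails, re-download the archive.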









Re: Configuring SSH - is it required for pseudo-distributed mode?

Bertrand Dechoux
In reply to this post by Mohammad Tariq
The scripts themselves will use SSH to connect to every machine (even localhost).
It's up to you whether you want to type the password every time; for a pseudo-distributed setup, I don't see a problem with configuring local SSH access.

BUT Hadoop itself does not require SSH. If you have a more appropriate way to start/stop the processes, you can write your own scripts.

Regards

Bertrand







Re: Configuring SSH - is it required for pseudo-distributed mode?

Amal G Jose
Without passwordless SSH you can still start it in pseudo-distributed mode.
Instead of start-all.sh, start-dfs.sh, etc., use:

hadoop-daemon.sh start/stop <daemon-name>

e.g. to start the JobTracker:

hadoop-daemon.sh start jobtracker
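The same per-daemon command covers the whole stack. A sketch of a Hadoop 1.x session with no SSH involved (assumes the Hadoop bin directory is on your PATH; daemon names per the 1.x scripts):

```shell
hadoop-daemon.sh start namenode
hadoop-daemon.sh start datanode
hadoop-daemon.sh start jobtracker
hadoop-daemon.sh start tasktracker
jps   # should list NameNode, DataNode, JobTracker, TaskTracker
```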








Re: Configuring SSH - is it required for pseudo-distributed mode?

Niels Basjes
In reply to this post by Raj Hadoop-2

I never configure the SSH feature, neither for running on a single node nor for a full-size cluster.
I simply start all the required daemons (name/data/job/task) and configure the ports on which each can be reached.

Niels Basjes
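For reference, the ports Niels mentions live in the Hadoop 1.x config files. A minimal sketch using the conventional localhost values (adjust hosts and ports to your setup):

```xml
<!-- conf/core-site.xml: where the NameNode listens -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/mapred-site.xml: where the JobTracker listens -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```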
