Error While copying file from local to dfs

4 messages

Error While copying file from local to dfs

Vinodh Nagaraj
Hi All,

I am a newbie to Hadoop.

I installed Hadoop 2.7.1 on a 32-bit Windows 7 machine for learning purposes.

I can execute start-all.cmd successfully.

When I execute jps, I get the output below.
28544 NameNode
35728
36308 DataNode
43828 Jps
40688 NodeManager
33820 ResourceManager

My configuration files are:

core-site.xml
---------------------
<configuration>
 <property>       
       <name>fs.defaultFS</name>
       <value>hdfs://10.219.149.100:50075/</value>   
       <description>NameNode URI</description>
 </property>
</configuration>



hdfs-site.xml
---------------------
<configuration>
   <property>  
     <name>dfs.replication</name>    
     <value>2</value>
    </property>
    <property>
      <name>dfs.namenode.name.dir</name>
      <value>D:\Hadoop_TEST\Hadoop\Data</value>
    </property>
    <property>
     <name>dfs.datanode.data.dir</name>
      <value>D:\Hadoop_TEST\Hadoop\Secondary</value>   
   </property>
  
  <property>   
      <name>dfs.namenode.datanode.registration.ip-hostname-check</name>     
     <value>false</value>
   </property>
</configuration>

I tried to copy a text file from my local drive to the HDFS file system, but I got the error below.

D:\Hadoop_TEST\Hadoop\ts>hadoop fs -copyFromLocal 4300.txt hdfs://10.219.149.100:50010/a.txt
copyFromLocal: End of File Exception between local host is: "PC205172/10.219.149.100"; destination host is: "PC205172.cts.com":50010; : java.io.EOFException; For more details see:  http://wiki.apache.org/hadoop/EOFException


Please share your suggestions.

How can I verify whether I have installed Hadoop properly?
How can I find the DataNode location, DataNode port, and other details with an hdfs or hadoop command?
How can I find the NameNode location, NameNode port, and its configuration details (such as the replication factor) with an hdfs or hadoop command?

Thanks & Regards,
Vinodh.N



Re: Error While copying file from local to dfs

Vinodh Nagaraj
Hi All,

Please help me.

Thanks & Regards,
Vinodh.N

On Fri, Mar 4, 2016 at 6:29 PM, Vinodh Nagaraj <[hidden email]> wrote:




Re: Error While copying file from local to dfs

Mallanagouda Patil

Hi Vinod,

Can you try this:
1. In core-site.xml, set fs.defaultFS to hdfs://localhost.
2. Restart Hadoop (stop-dfs.sh and start-dfs.sh; on Windows, use the equivalent .cmd scripts).
3. Try this command:
hadoop fs -copyFromLocal sourcefile /
It copies the source file to the HDFS root directory.
I hope it helps.

Thanks
Mallan

On Mar 5, 2016 11:11 AM, "Vinodh Nagaraj" <[hidden email]> wrote:




Re: Error While copying file from local to dfs

Naresh Jangra
Hi Vinodh,

After looking at your configs, below are my answers:

1. Use a different port in the fs.defaultFS property, because 50075 is the port used by the DataNode's web UI. The port in this property is meant for metadata (client-to-NameNode RPC) traffic, which is 8020 by default. You can use 8020 or any other port except those listed in the links below -
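For reference, a minimal core-site.xml along those lines might look like this (a sketch assuming the default NameNode RPC port 8020; keep your own IP, or use localhost for a single-node setup):

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- NameNode RPC port (8020 by default), not the DataNode web UI port 50075 -->
    <value>hdfs://10.219.149.100:8020/</value>
    <description>NameNode URI</description>
  </property>
</configuration>
```

After changing this, reformat is not required, but the HDFS daemons must be restarted for the new port to take effect.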


2. You have set the replication factor to 2 but have only ONE DataNode; change it to 1.

3. Lastly, when you run the copyFromLocal command, you can do it like this:

hadoop fs -copyFromLocal filename hdfs_path_name

Or, if you want to use the NameNode address and port:

hadoop fs -copyFromLocal 4300.txt hdfs://10.219.149.100:<PORT>/a.txt

Note that the port should be the same one you used in fs.defaultFS.

Let me know if it helped.
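As for the earlier questions about inspecting the installation: these standard commands report the NameNode URI, the effective replication factor, and the DataNode details (a sketch; it assumes hdfs is on your PATH, and the same commands work from the Windows command prompt):

```shell
# Print the effective NameNode URI (host and port) from the loaded configuration
hdfs getconf -confKey fs.defaultFS

# Print the configured replication factor
hdfs getconf -confKey dfs.replication

# List live/dead DataNodes with their hostnames, ports, and capacity
# (requires the cluster to be running)
hdfs dfsadmin -report
```

If `dfsadmin -report` shows your DataNode as live, the daemons are wired up correctly and a plain `hadoop fs -copyFromLocal file /` should work without specifying any port by hand.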

Regards,
Naresh Jangra

On Sun, Mar 6, 2016 at 2:05 PM, Mallanagouda Patil <[hidden email]> wrote:
