Showing posts with label Cluster. Show all posts

Saturday, 15 October 2016

How to Copy Data From One Machine to Another Machine

---------------------------------------------------------------------------------------------------

How to install SSH using command line

http://kalyanbigdatatraining.blogspot.in/2016/09/how-to-install-ssh-using-command-line.html



How to disable the password using SSH

http://kalyanbigdatatraining.blogspot.in/2016/09/how-to-disable-password-using-ssh.html


---------------------------------------------------------------------------------------------------

How to Copy Data From One Machine (kalyan@orienit1) to Another Machine (kalyan@orienit2)

scp kalyan@orienit1:<source path> kalyan@orienit2:<destination path>



How to Copy Data From One Machine (kalyan@192.168.0.111) to Another Machine (kalyan@192.168.0.112)

scp kalyan@192.168.0.111:<source path> kalyan@192.168.0.112:<destination path>


---------------------------------------------------------------------------------------------------

How to Copy the sample.txt file From One Machine (kalyan@orienit1) to Another Machine (kalyan@orienit2)

scp kalyan@orienit1:~/sample.txt kalyan@orienit2:~/sample.txt



How to Copy the sample.txt file From One Machine (kalyan@192.168.0.111) to Another Machine (kalyan@192.168.0.112)

scp kalyan@192.168.0.111:~/sample.txt kalyan@192.168.0.112:~/sample.txt
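Two flags worth knowing for these remote-to-remote copies: `-r` copies a directory recursively, and `-3` routes the transfer through the local machine (useful when orienit1 cannot log in to orienit2 directly; without it, scp connects the two remote hosts to each other). A dry-run sketch using the same hypothetical hosts as above — it only prints the command, since the orienit machines are placeholders:

```shell
# Dry run: build and print the command rather than executing it,
# since orienit1/orienit2 are placeholder hosts.
SRC="kalyan@orienit1:~/data"
DST="kalyan@orienit2:~/data"

# -r : copy the directory recursively
# -3 : relay the transfer through this machine instead of
#      connecting the two remote hosts to each other
echo scp -r -3 "$SRC" "$DST"
```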


---------------------------------------------------------------------------------------------------



Sunday, 18 September 2016

How to disable the password using SSH

1. Using the RSA algorithm

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys


2. Using the DSA algorithm (note: DSA keys are disabled by default in newer OpenSSH releases, so prefer RSA)

ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys


3. Verify with the command below

ssh localhost


4. If SSH still prompts for a password, remove the existing key data

rm -r ~/.ssh

5. After removing it, repeat step 1 or step 2
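The chmod 600 step above matters because sshd silently ignores an authorized_keys file that is group- or world-writable, which is a common cause of step 4's repeated password prompts. A minimal local sketch of the expected layout and modes, using a temporary directory in place of the real ~/.ssh:

```shell
# Recreate the ~/.ssh layout in a temp dir to show the permissions
# sshd expects: 700 on the directory, 600 on authorized_keys.
tmp=$(mktemp -d)
mkdir -p "$tmp/.ssh"
chmod 700 "$tmp/.ssh"
touch "$tmp/.ssh/authorized_keys"
chmod 600 "$tmp/.ssh/authorized_keys"

# Print the resulting octal modes (GNU stat)
stat -c '%a %n' "$tmp/.ssh" "$tmp/.ssh/authorized_keys"
```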






Hadoop Cluster Practice Commands

How to Start / Stop Hadoop Processes:

Name Node: 

hadoop-daemon.sh start namenode
hadoop-daemon.sh stop namenode

Data Node: 

hadoop-daemon.sh start datanode
hadoop-daemon.sh stop datanode

Secondary Name Node: 

hadoop-daemon.sh start secondarynamenode
hadoop-daemon.sh stop secondarynamenode

Job Tracker (Hadoop 1.x):

hadoop-daemon.sh start jobtracker
hadoop-daemon.sh stop jobtracker

Task Tracker (Hadoop 1.x):

hadoop-daemon.sh start tasktracker
hadoop-daemon.sh stop tasktracker

Resource Manager: 

yarn-daemon.sh start resourcemanager
yarn-daemon.sh stop resourcemanager

Node Manager: 

yarn-daemon.sh start nodemanager
yarn-daemon.sh stop nodemanager


Job History Server:
mr-jobhistory-daemon.sh start historyserver
mr-jobhistory-daemon.sh stop historyserver
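Taken together, the daemons above are usually brought up in a fixed order: HDFS first (NameNode before DataNodes), then YARN, then the history server. A dry-run sketch of that order — it only prints the commands, so it also runs on a machine without Hadoop; when run for real, it assumes the Hadoop 2.x scripts listed above are on the PATH:

```shell
# Dry run: print each start command in the usual order
# instead of executing it.
hdfs_daemons="namenode datanode secondarynamenode"
yarn_daemons="resourcemanager nodemanager"

for d in $hdfs_daemons; do
    echo "hadoop-daemon.sh start $d"
done
for d in $yarn_daemons; do
    echo "yarn-daemon.sh start $d"
done
echo "mr-jobhistory-daemon.sh start historyserver"
```

Stopping follows the reverse order, replacing `start` with `stop`.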






Hadoop Required URLs

Name Node:        http://orienit1:50070
Resource Manager: http://orienit2:8088


1. Create a new folder named after your hostname (e.g. orienit1 / orienit2 / orienit3) using the command below.

hadoop fs -mkdir /orienit


2. Put some files from the local file system into HDFS using the command below

hadoop fs -put <local file system path> <hdfs path>

hadoop fs -put /etc/hosts /orienit/hosts


3. Read the data from HDFS using the command below

hadoop fs -cat /orienit/hosts


4. Change the replication factor using the commands below

Increase the replication number:

hadoop fs -setrep 5 /orienit/hosts


Decrease the replication number:

hadoop fs -setrep 3 /orienit/hosts


5. Transfer data from one cluster to another using the command below

hadoop distcp hdfs://nn1:8020/<src path> hdfs://nn2:8020/<dst path>

where nn1 is the first cluster's NameNode hostname or IP, and nn2 is the second cluster's NameNode hostname or IP


6. Commissioning and Decommissioning Nodes in a Hadoop Cluster

Make the following changes on the NameNode machine:


1. Create an include file in the /home/kalyan/work folder

2. Create an exclude file in the /home/kalyan/work folder

3. Update hdfs-site.xml with the configuration below


<property>
  <name>dfs.hosts</name>
  <value>/home/kalyan/work/include</value>
</property>

<property>
  <name>dfs.hosts.exclude</name>
  <value>/home/kalyan/work/exclude</value>
</property>

4. Execute the command below to apply the HDFS changes

hadoop dfsadmin -refreshNodes

5. Update yarn-site.xml with the configuration below



<property>
  <name>yarn.resourcemanager.nodes.include-path</name>
  <value>/home/kalyan/work/include</value>
</property>

<property>
  <name>yarn.resourcemanager.nodes.exclude-path</name>
  <value>/home/kalyan/work/exclude</value>
</property>

6. Execute the command below to apply the YARN changes

yarn rmadmin -refreshNodes

7. Verify the changes in the NameNode and ResourceManager web UIs
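Steps 1 and 2 above only create two plain text files, one hostname per line: include lists every node allowed to join the cluster, exclude lists nodes being decommissioned. A sketch using a temporary directory in place of /home/kalyan/work, with the example orienit hostnames:

```shell
# Build the include/exclude files in a temp dir; in the real setup
# they live in /home/kalyan/work, as referenced by hdfs-site.xml.
work=$(mktemp -d)

# include: every node allowed to join the cluster
printf 'orienit1\norienit2\norienit3\n' > "$work/include"

# exclude: nodes being decommissioned
printf 'orienit3\n' > "$work/exclude"

cat "$work/include" "$work/exclude"
```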





Wednesday, 14 September 2016

How to change the HOSTNAME using command line

How to modify the hostname (hadoop to kalyan)

1. Open the /etc/hostname file using the command below

sudo gedit /etc/hostname

2. Modify the /etc/hostname file with the new hostname

change hadoop to kalyan

3. Save the /etc/hostname file

4. Open the /etc/hosts file using the command below

sudo gedit /etc/hosts

5. Modify the /etc/hosts file with the new hostname

change hadoop to kalyan

127.0.0.1    localhost
127.0.0.1    kalyan

6. Save the /etc/hosts file

7. Execute the command below to apply the hostname change

sudo service hostname restart

(On newer systemd-based systems this service does not exist; sudo hostnamectl set-hostname kalyan is the equivalent.)

8. Re-open the terminal

9. Verify the new hostname using the command below

hostname
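Steps 1-6 above amount to replacing the old name in two files. A sketch that performs the same edit with sed on temporary copies, so it never touches the real /etc/hostname and /etc/hosts:

```shell
# Work on copies so the sketch never modifies real system files.
tmpdir=$(mktemp -d)
echo "hadoop" > "$tmpdir/hostname"
printf '127.0.0.1\tlocalhost\n127.0.0.1\thadoop\n' > "$tmpdir/hosts"

# The same rename the gedit steps perform by hand (GNU sed)
sed -i 's/hadoop/kalyan/' "$tmpdir/hostname" "$tmpdir/hosts"

cat "$tmpdir/hostname"
cat "$tmpdir/hosts"
```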




