Delete this line. If it is not removed, then when the HBase Master tries to connect to HDFS it will find two IP addresses mapped to your system and will most likely pick the second one, 127.0.1.1. HDFS is not listening on that address, so the connection fails with a ConnectException.
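For illustration, a problematic /etc/hosts on a typical Ubuntu install looks like the sketch below (the hostname `myhost` and the address 192.168.1.10 are placeholders); commenting out or deleting the 127.0.1.1 line avoids the mismatch:

```
127.0.0.1     localhost
# 127.0.1.1   myhost        <-- delete or comment out this line
192.168.1.10  myhost        # map the hostname to the machine's real IP instead
```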
If a specific directory is mentioned as the root directory (in our example, the Hbase directory), then we need to create this directory in HDFS. Run the command below from the Hadoop installation's bin directory and give the newly created folder the appropriate permissions.
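A sketch of the commands, assuming the HBase root directory is `/Hbase` (the path and the permission mode are assumptions; adjust them to match your hbase.rootdir and security policy):

```shell
# from the Hadoop installation directory:
# create the HBase root directory in HDFS (path is an assumption)
bin/hadoop fs -mkdir /Hbase
# grant the user running HBase rights on it (mode is an assumption)
bin/hadoop fs -chmod 775 /Hbase
```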
Also set hbase.cluster.distributed to true in the configuration (hbase-site.xml).
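A minimal hbase-site.xml sketch reflecting the two settings above (the namenode address `localhost:9000` and the `/Hbase` path are assumptions; they must match your HDFS configuration):

```xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <!-- assumed namenode host:port and root path -->
    <value>hdfs://localhost:9000/Hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
```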
Update the regionservers file (in the conf directory): set your IP address or system name, one host per line.
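For a single-node setup, conf/regionservers might contain just the local machine (the hostname below is a placeholder):

```
myhost
```

Each line names one host on which a region server will be started.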
dfs.data.dir: a comma-separated list of paths on the local filesystem of a DataNode where it should store its blocks.
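In hdfs-site.xml the property might look as follows (the two paths are assumptions; in newer Hadoop versions the property is named dfs.datanode.data.dir instead):

```xml
<property>
  <name>dfs.data.dir</name>
  <!-- assumed example paths; blocks are spread across these directories -->
  <value>/data/hdfs/data1,/data/hdfs/data2</value>
</property>
```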
Tips and Tricks

If, while starting, you encounter the error "org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch", the solution is to put the Hadoop core jar in the hbase/lib directory. Sometimes the master will not start, throwing an "Unable to connect" exception. To resolve this, make the entry below in the /etc/hosts file.
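Such an entry maps the master's hostname to its real, routable IP address so that other processes can reach it (the IP and hostname below are placeholders; substitute your own):

```
192.168.1.10   hbase-master
```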