
Scala HBase connection problem on the Cloudera QuickStart VM CDH 5.8.0: I am trying to connect to HBase from Scala code, but I get the following error.

17/03/28 11:40:53 INFO client.RpcRetryingCaller: Call exception, tries=30, retries=35, started=450502 ms ago, cancelled=false, msg= 
17/03/28 11:41:13 INFO client.RpcRetryingCaller: Call exception, tries=31, retries=35, started=470659 ms ago, cancelled=false, msg= 
17/03/28 11:41:33 INFO client.RpcRetryingCaller: Call exception, tries=32, retries=35, started=490824 ms ago, cancelled=false, msg= 
17/03/28 11:41:53 INFO client.RpcRetryingCaller: Call exception, tries=33, retries=35, started=510834 ms ago, cancelled=false, msg= 
17/03/28 11:42:13 INFO client.RpcRetryingCaller: Call exception, tries=34, retries=35, started=530956 ms ago, cancelled=false, msg= 
[error] (run-main-0) org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=35, exceptions: 
[error] Tue Mar 28 11:33:22 PDT 2017, RpcRetryingCaller{globalStartTime=1490726002560, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
[error] Tue Mar 28 11:33:23 PDT 2017, RpcRetryingCaller{globalStartTime=1490726002560, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
[error] Tue Mar 28 11:33:23 PDT 2017, RpcRetryingCaller{globalStartTime=1490726002560, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
[error] Tue Mar 28 11:33:24 PDT 2017, RpcRetryingCaller{globalStartTime=1490726002560, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
...
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411) 
    at Hi$.main(hw.scala:12) 
    at Hi.main(hw.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1560) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737) 
    at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38) 
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411) 
    at Hi$.main(hw.scala:12) 
    at Hi.main(hw.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
Caused by: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:239) 
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331) 
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737) 
    at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38) 
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411) 
    at Hi$.main(hw.scala:12) 
    at Hi.main(hw.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper 
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138) 
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316) 
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224) 
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226) 
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331) 
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737) 
    at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38) 
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411) 
    at Hi$.main(hw.scala:12) 
    at Hi.main(hw.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.SocketInputWrapper 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.createConnection(RpcClientImpl.java:138) 
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.getConnection(RpcClientImpl.java:1316) 
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1224) 
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226) 
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331) 
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:58383) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1591) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1529) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1551) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1580) 
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1737) 
    at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38) 
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4117) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4110) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:427) 
    at org.apache.hadoop.hbase.client.HBaseAdmin.listTables(HBaseAdmin.java:411) 
    at Hi$.main(hw.scala:12) 
    at Hi.main(hw.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
[trace] Stack trace suppressed: run last compile:run for the full output. 
17/03/28 07:56:55 ERROR zookeeper.ClientCnxn: Event thread exiting due to interruption 
java.lang.InterruptedException 
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2017) 
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2052) 
    at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442) 
    at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:494) 
17/03/28 07:56:55 INFO zookeeper.ClientCnxn: EventThread shut down 
java.lang.RuntimeException: Nonzero exit code: 1 
    at scala.sys.package$.error(package.scala:27) 
[trace] Stack trace suppressed: run last compile:run for the full output. 
[error] (compile:run) Nonzero exit code: 1 
[error] Total time: 544 s, completed Mar 28, 2017 7:56:56 AM 

• The host OS is Windows 7 with 8 GB RAM and a 64-bit Intel Core i5.
• I am using the Cloudera QuickStart VM CDH 5.8.0 on that Windows machine. The VM is allocated 6 GB RAM, 2 processors and a 64 GB hard disk.
• Services running in Cloudera Manager:

    HBase 
    HDFS 
    YARN 
    Zookeeper 
    Key-Value Indexer 

• Services stopped in Cloudera Manager:

Hive 
    Hue 
    Impala 
    Oozie 
    Solr 
    Spark 
    Sqoop 1 Client 
    Sqoop 2 

• My client code runs on the VM itself; the VM has only HBase version 1.2.0-cdh5.8.0.
• I created an sbt project.
• For connecting Scala to HBase I followed https://hbase.apache.org/book.html#scala.
• I set CLASSPATH as shown below. I did not include "/path/to/scala-library.jar" in the CLASSPATH.

$ export CLASSPATH=$CLASSPATH:/usr/lib/hadoop/lib/native:/usr/lib/hbase/lib/native/Linux-amd64-64 
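A `NoClassDefFoundError` for `org/apache/hadoop/net/SocketInputWrapper` means that no jar on the runtime classpath contains that class; it exists in some Hadoop releases and not in others, which is exactly what happens when an HBase client built against one Hadoop line runs against jars from another. One way to check whether any jar under a directory actually provides a class is to scan the jar listings. A minimal sketch, with the directory and class taken from this question (`scan_jars` is an illustrative helper name, and the path is the QuickStart VM layout):

```shell
# Scan every jar under a directory for a class entry and print the jar(s)
# that contain it. Arguments: <directory> <class-file path inside the jar>.
scan_jars() {
  local dir="$1" cls="$2"
  find "$dir" -name '*.jar' 2>/dev/null | while read -r j; do
    if unzip -l "$j" 2>/dev/null | grep -q "$cls"; then
      echo "found in $j"
    fi
  done
}

scan_jars /usr/lib/hadoop 'org/apache/hadoop/net/SocketInputWrapper.class'
```

If the scan prints nothing, the dependency that is supposed to provide the class is missing from the classpath or is the wrong version.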

• The project root directory is /home/cloudera/Desktop/play-sbt-project.
• My /home/cloudera/Desktop/play-sbt-project/build.sbt is shown below. I changed the dependency library versions to match my environment, and while trying to resolve the error I added dependencies such as "hbase-client", "hbase-common" and "hbase-server", still without success.

name := "play-sbt-project" 
version := "1.0" 
scalaVersion := "2.10.2" 
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases" 
resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/" 
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-core" % "1.2.1", 
"org.apache.hbase" % "hbase" % "1.2.0", 
"org.apache.hbase" % "hbase-client" % "1.2.0", 
"org.apache.hbase" % "hbase-common" % "1.2.0", 
"org.apache.hbase" % "hbase-server" % "1.2.0" 
) 

• My main code for connecting to HBase, /home/cloudera/Desktop/play-sbt-project/src/main/scala/pw.scala, is:

import org.apache.hadoop.hbase.HBaseConfiguration 
import org.apache.hadoop.hbase.client.{ConnectionFactory,HBaseAdmin,HTable,Put,Get} 
import org.apache.hadoop.hbase.util.Bytes 

object Hi {
  def main(args: Array[String]) = {
    println("Hi!")
    val conf = new HBaseConfiguration()
    val connection = ConnectionFactory.createConnection(conf)
    val admin = connection.getAdmin()

    // list the tables
    val listtables = admin.listTables()
    listtables.foreach(println)
  }
}

• /hbase/conf/hbase-site.xml looks like this:

<?xml version="1.0" encoding="UTF-8"?> 

<!--Autogenerated by Cloudera Manager--> 
<configuration> 
    <property> 
    <name>hbase.rootdir</name> 
    <value>hdfs://quickstart.cloudera:8020/hbase</value> 
    </property> 
    <property> 
    <name>hbase.replication</name> 
    <value>true</value> 
    </property> 
    <property> 
    <name>hbase.client.write.buffer</name> 
    <value>2097152</value> 
    </property> 
    <property> 
    <name>hbase.client.pause</name> 
    <value>100</value> 
    </property> 
    <property> 
    <name>hbase.client.retries.number</name> 
    <value>35</value> 
    </property> 
    <property> 
    <name>hbase.client.scanner.caching</name> 
    <value>100</value> 
    </property> 
    <property> 
    <name>hbase.client.keyvalue.maxsize</name> 
    <value>10485760</value> 
    </property> 
    <property> 
    <name>hbase.ipc.client.allowsInterrupt</name> 
    <value>true</value> 
    </property> 
    <property> 
    <name>hbase.client.primaryCallTimeout.get</name> 
    <value>10</value> 
    </property> 
    <property> 
    <name>hbase.client.primaryCallTimeout.multiget</name> 
    <value>10</value> 
    </property> 
    <property> 
    <name>hbase.coprocessor.region.classes</name> 
    <value>org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint</value> 
    </property> 
    <property> 
    <name>hbase.regionserver.thrift.http</name> 
    <value>false</value> 
    </property> 
    <property> 
    <name>hbase.thrift.support.proxyuser</name> 
    <value>false</value> 
    </property> 
    <property> 
    <name>hbase.rpc.timeout</name> 
    <value>60000</value> 
    </property> 
    <property> 
    <name>hbase.snapshot.enabled</name> 
    <value>true</value> 
    </property> 
    <property> 
    <name>hbase.snapshot.master.timeoutMillis</name> 
    <value>60000</value> 
    </property> 
    <property> 
    <name>hbase.snapshot.region.timeout</name> 
    <value>60000</value> 
    </property> 
    <property> 
    <name>hbase.snapshot.master.timeout.millis</name> 
    <value>60000</value> 
    </property> 
    <property> 
    <name>hbase.security.authentication</name> 
    <value>simple</value> 
    </property> 
    <property> 
    <name>hbase.rpc.protection</name> 
    <value>authentication</value> 
    </property> 
    <property> 
    <name>zookeeper.session.timeout</name> 
    <value>60000</value> 
    </property> 
    <property> 
    <name>zookeeper.znode.parent</name> 
    <value>/hbase</value> 
    </property> 
    <property> 
    <name>zookeeper.znode.rootserver</name> 
    <value>root-region-server</value> 
    </property> 
    <property> 
    <name>hbase.zookeeper.quorum</name> 
    <!-- <value>quickstart.cloudera</value> --> 
    <value>127.0.0.1</value> 
    </property> 
    <property> 
    <name>hbase.zookeeper.property.clientPort</name> 
    <value>2181</value> 
    </property> 
    <property> 
    <name>hbase.rest.ssl.enabled</name> 
    <value>false</value> 
    </property> 
</configuration> 

I have googled a lot to resolve this problem, but without success.
• I changed the dependency library versions in the build.sbt file to match my environment.
• I added the dependency libraries "hbase-client", "hbase-common" and "hbase-server".
• In the "hbase-site.xml" file I changed the value of "hbase.zookeeper.quorum" from "quickstart.cloudera" to "127.0.0.1".

Please help me resolve this problem. Thank you.


Is that really the version of 'hadoop-core' you are using? The Hadoop version is not the same as the HBase version; CDH 5.8 is based on Hadoop 2.6. –


@Joe Pallas, thanks for the comment. I have solved the problem. Moving to jar dependency versions compatible with the cluster, instead of 'hadoop-core', was one of the changes I made, along with a few code changes. I am posting my solution. – kumarhimanshu449

Answer


The problem is solved. The following changes were needed:

  1. Change "hadoop-core" to "hadoop-common" in the build.sbt file. In recent CDH releases, 'hadoop-core' exists only for MapReduce 1 code.
  2. Change all dependency versions in build.sbt to ones compatible with CDH 5.8.0. The updated build.sbt looks like this:

    name := "play-sbt-project" 
    version := "1.0" 
    scalaVersion := "2.10.2" 
    resolvers += "Thrift" at "http://people.apache.org/~rawson/repo/" 
    resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/" 
    
    libraryDependencies ++= Seq( 
    "org.apache.hadoop" % "hadoop-common" % "2.6.0-cdh5.8.0", 
    "org.apache.hbase" % "hbase" % "1.2.0-cdh5.8.0", 
    "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.8.0", 
    "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.8.0", 
    "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.8.0" 
    ) 
    
  3. The HBaseConfiguration() constructor is deprecated; use the create() method instead. I also changed some logic in my main code. Previously I was listing the tables in HBase; I dropped that because it caused a few problems, but I will try it again later. Since the goal for now is just to get Scala connected to HBase, the code instead inserts a new row into an HBase table that already exists. The new code is:

    package main.scala 
    
    import org.apache.hadoop.conf.Configuration 
    import org.apache.hadoop.hbase.HBaseConfiguration 
    import org.apache.hadoop.hbase.client.{ConnectionFactory,HTable,Put} 
    import org.apache.hadoop.hbase.util.Bytes 
    
    object Hi {

      def main(args: Array[String]) = {
        println("Hi!")
        val conf: Configuration = HBaseConfiguration.create()
        val table: HTable = new HTable(conf, "emp1")
        val put1: Put = new Put(Bytes.toBytes("row1"))
        put1.add(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
        table.put(put1)
        println("Success")
      }
    }
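As a side note, `HTable` is itself deprecated in the HBase 1.x client in favor of obtaining a `Table` from a shared `Connection`. A sketch of the same insert using the Connection/Table API, with the same "emp1" table and column names as above; this is an untested adaptation that assumes a reachable cluster and an hbase-site.xml on the classpath:

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes

object HiConnection {
  def main(args: Array[String]): Unit = {
    // create() reads hbase-site.xml from the classpath, as before
    val conf = HBaseConfiguration.create()
    // A Connection is heavyweight and thread-safe: create one and share it.
    val connection = ConnectionFactory.createConnection(conf)
    try {
      // A Table is lightweight and NOT thread-safe: get one per use, close it after.
      val table = connection.getTable(TableName.valueOf("emp1"))
      try {
        val put = new Put(Bytes.toBytes("row1"))
        // addColumn replaces the deprecated Put.add(family, qualifier, value)
        put.addColumn(Bytes.toBytes("personal_data"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"))
        table.put(put)
        println("Success")
      } finally table.close()
    } finally connection.close()
  }
}
```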