Fetching multiple rows' column values with a Get list using the new API in HBase 0.98.4

Published: 2020-07-24 06:37:49  Source: Web  Views: 13852  Author: quenlang  Column: Relational Databases

    In HBase 0.98.4 I needed to fetch the column values of several rows at once by submitting a list of Get objects. I originally planned to use the KeyValue class to read the results, but found that its getRow() and getValue() methods are deprecated, so the example below uses the Cell interface from the new API instead:
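    Before the full program, here is a minimal, self-contained sketch of the new Cell-style access pattern used throughout this post (the class name CellToString and its describe() helper are illustrative and not part of the original code):

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative helper: turns a single Cell into a readable "row / value" string.
// The offset/length arguments are needed because getRowArray() and getValueArray()
// return the whole backing byte array, not just the row key or the value.
public class CellToString {
    public static String describe(Cell cell) {
        String row = Bytes.toString(cell.getRowArray(), cell.getRowOffset(), cell.getRowLength());
        String value = Bytes.toString(cell.getValueArray(), cell.getValueOffset(), cell.getValueLength());
        // CellUtil.cloneRow(cell) / CellUtil.cloneValue(cell) are equivalent, more concise shortcuts
        return "Row: " + row + ", Value: " + value;
    }
}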

    The test table in HBase contains the following data:

hbase(main):005:0> scan 'testtable'
ROW                      COLUMN+CELL                                                         
 row1                    column=cf:col1, timestamp=1414705620904, value=val11                
 row1                    column=cf:col2, timestamp=1414681969095, value=val2                 
1 row(s) in 0.0150 seconds
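
    For reference, the test data above can be created with a short Put program along the following lines. This is only a sketch: it assumes 'testtable' with column family 'cf' already exists, and the class name PutTestData is illustrative.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PutTestData {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "testtable");
        // Write both cells of row1 shown in the scan output above
        Put put = new Put(Bytes.toBytes("row1"));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("col1"), Bytes.toBytes("val11"));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("col2"), Bytes.toBytes("val2"));
        table.put(put);
        table.close();
    }
}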

    The code in MyEclipse is as follows:

package com.apache.hbase.kora.get;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;


public class GetList {

    /**
     * @param args
     * @throws IOException 
     */
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "testtable");
        // Create a list to hold the Get objects
        List<Get> gets = new ArrayList<Get>();
        
        // Instantiate a Get object and add it to the list
        Get get1 = new Get(Bytes.toBytes("row1"));
        get1.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col1"));
        gets.add(get1);
        
        // Instantiate a second Get object and add it to the list
        Get get2 = new Get(Bytes.toBytes("row1"));
        get2.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col2"));
        gets.add(get2);
        
        // Submit all Get objects in the list to the server in a single batch; a Result array is returned
        Result[] results = table.get(gets);
        
        /*
         * Two ways to iterate over the Result array:
         * 1. via Result's getRow() and getValue() methods
         * 2. via Result's rawCells() method, which returns a Cell array
         */
        System.out.println("First iteration...");
        for (Result result : results) {
            // Get the row key and convert it to a String
            String row = Bytes.toString(result.getRow());
            System.out.println("Row: " + row + " ");
            // If the Result contains the specified column, read its value and convert it to a String
            if ( result.containsColumn(Bytes.toBytes("cf"), Bytes.toBytes("col1")) ) {
                String val = Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col1")));
                System.out.println("Value: " + val + " ");
            }
            
            // If the Result contains the specified column, read its value and convert it to a String
            if ( result.containsColumn(Bytes.toBytes("cf"), Bytes.toBytes("col2")) ) {
                String val = Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col2")));
                System.out.println("Value: " + val + " ");
            }
            
        }
        
        System.out.println("Second iteration...");
        for ( Result result : results ) {
            // result.rawCells() returns a Cell[]; it can hold multiple versions of a column, but by default only the latest version is returned. Iterate over each Cell.
            for( Cell cell : result.rawCells() ) {
                // In earlier versions this was done with the KeyValue class, but KeyValue is deprecated in 0.98, so Cell is used instead
                // cell.getRowArray()  - the backing byte array holding the data
                // cell.getRowOffset() - the offset of the row key within that array
                // cell.getRowLength() - the length of the row key
                // Extract the row key from the array and convert it to a String
                String row = Bytes.toString(cell.getRowArray(), cell.getRowOffset(), cell.getRowLength());
                
                // The column value is extracted the same way as the row key
                String value = Bytes.toString(cell.getValueArray(), cell.getValueOffset(), cell.getValueLength());
                System.out.println("Row: " + row + " ");
                System.out.println("Value: " + value + " ");
            }
        }
        
        // Release the client-side resources held by the table
        table.close();
    }

}

    The output of the run:

2014-11-05 10:50:19,129 INFO  [main] zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2014-11-05 10:50:19,130 INFO  [main] zookeeper.ZooKeeper: Client environment:host.name=MBETUKPOUEDZLGC
2014-11-05 10:50:19,130 INFO  [main] zookeeper.ZooKeeper: Client environment:java.version=1.7.0_67
2014-11-05 10:50:19,130 INFO  [main] zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
2014-11-05 10:50:19,130 INFO  [main] zookeeper.ZooKeeper: Client environment:java.home=C:\Java\jdk1.7.0_67\jre
2014-11-05 10:50:19,130 INFO  [main] zookeeper.ZooKeeper: Client environment:java.class.path=F:\项目目录\JavaProject\HbasePut\bin;C:\hbase-0.98.4-hadoop2\lib\activation-1.1.jar;C:\hbase-0.98.4-hadoop2\lib\aopalliance-1.0.jar;C:\hbase-0.98.4-hadoop2\lib\asm-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\avro-1.7.4.jar;C:\hbase-0.98.4-hadoop2\lib\commons-beanutils-1.7.0.jar;C:\hbase-0.98.4-hadoop2\lib\commons-beanutils-core-1.8.0.jar;C:\hbase-0.98.4-hadoop2\lib\commons-cli-1.2.jar;C:\hbase-0.98.4-hadoop2\lib\commons-codec-1.7.jar;C:\hbase-0.98.4-hadoop2\lib\commons-collections-3.2.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-compress-1.4.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-configuration-1.6.jar;C:\hbase-0.98.4-hadoop2\lib\commons-daemon-1.0.13.jar;C:\hbase-0.98.4-hadoop2\lib\commons-digester-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\commons-el-1.0.jar;C:\hbase-0.98.4-hadoop2\lib\commons-httpclient-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-io-2.4.jar;C:\hbase-0.98.4-hadoop2\lib\commons-lang-2.6.jar;C:\hbase-0.98.4-hadoop2\lib\commons-logging-1.1.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-math-2.1.jar;C:\hbase-0.98.4-hadoop2\lib\commons-net-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\findbugs-annotations-1.3.9-1.jar;C:\hbase-0.98.4-hadoop2\lib\gmbal-api-only-3.0.0-b023.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-framework-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-http-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-http-server-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-http-servlet-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\grizzly-rcm-2.1.2.jar;C:\hbase-0.98.4-hadoop2\lib\guava-12.0.1.jar;C:\hbase-0.98.4-hadoop2\lib\guice-3.0.jar;C:\hbase-0.98.4-hadoop2\lib\guice-servlet-3.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-annotations-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-auth-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-client-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-hdfs-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-app-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-core-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-api-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-client-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-server-common-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hadoop-yarn-server-nodemanager-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\hamcrest-core-1.3.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-client-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-common-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-common-0.98.4-hadoop2-tests.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-examples-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-hadoop2-compat-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-hadoop-compat-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-it-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-it-0.98.4-hadoop2-tests.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-prefix-tree-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-protocol-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-server-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-server-0.98.4-hadoop2-tests.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-shell-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-testing-util-0.98.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\hbase-thrift-0.9
8.4-hadoop2.jar;C:\hbase-0.98.4-hadoop2\lib\high-scale-lib-1.1.1.jar;C:\hbase-0.98.4-hadoop2\lib\htrace-core-2.04.jar;C:\hbase-0.98.4-hadoop2\lib\httpclient-4.1.3.jar;C:\hbase-0.98.4-hadoop2\lib\httpcore-4.1.3.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-core-asl-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-jaxrs-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-mapper-asl-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jackson-xc-1.8.8.jar;C:\hbase-0.98.4-hadoop2\lib\jamon-runtime-2.3.1.jar;C:\hbase-0.98.4-hadoop2\lib\jasper-compiler-5.5.23.jar;C:\hbase-0.98.4-hadoop2\lib\jasper-runtime-5.5.23.jar;C:\hbase-0.98.4-hadoop2\lib\javax.inject-1.jar;C:\hbase-0.98.4-hadoop2\lib\javax.servlet-3.1.jar;C:\hbase-0.98.4-hadoop2\lib\javax.servlet-api-3.0.1.jar;C:\hbase-0.98.4-hadoop2\lib\jaxb-api-2.2.2.jar;C:\hbase-0.98.4-hadoop2\lib\jaxb-impl-2.2.3-1.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-client-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-core-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-grizzly2-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-guice-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-json-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-server-1.8.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-test-framework-core-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jersey-test-framework-grizzly2-1.9.jar;C:\hbase-0.98.4-hadoop2\lib\jets3t-0.6.1.jar;C:\hbase-0.98.4-hadoop2\lib\jettison-1.3.1.jar;C:\hbase-0.98.4-hadoop2\lib\jetty-6.1.26.jar;C:\hbase-0.98.4-hadoop2\lib\jetty-sslengine-6.1.26.jar;C:\hbase-0.98.4-hadoop2\lib\jetty-util-6.1.26.jar;C:\hbase-0.98.4-hadoop2\lib\jruby-complete-1.6.8.jar;C:\hbase-0.98.4-hadoop2\lib\jsch-0.1.42.jar;C:\hbase-0.98.4-hadoop2\lib\jsp-2.1-6.1.14.jar;C:\hbase-0.98.4-hadoop2\lib\jsp-api-2.1-6.1.14.jar;C:\hbase-0.98.4-hadoop2\lib\jsr305-1.3.9.jar;C:\hbase-0.98.4-hadoop2\lib\junit-4.11.jar;C:\hbase-0.98.4-hadoop2\lib\libthrift-0.9.0.jar;C:\hbase-0.98.4-hadoop2\lib\log4j-1.2.17.jar;C:\hbase-0.98.4-hadoop2\lib\management-api-3.0.0-b012.jar;C:\hbase-0.98.4-hadoop2\lib\metrics-core-2.2.0.jar;C:\hbase-0.98.4-hadoop2\lib\netty-3.6.6.Final.jar;C:\hbase-0.98.4-hadoop2\lib\paranamer-2.3.jar;C:\hbase-0.98.4-hadoop2\lib\protobuf-java-2.5.0.jar;C:\hbase-0.98.4-hadoop2\lib\servlet-api-2.5-6.1.14.jar;C:\hbase-0.98.4-hadoop2\lib\slf4j-api-1.6.4.jar;C:\hbase-0.98.4-hadoop2\lib\slf4j-log4j12-1.6.4.jar;C:\hbase-0.98.4-hadoop2\lib\snappy-java-1.0.4.1.jar;C:\hbase-0.98.4-hadoop2\lib\xmlenc-0.52.jar;C:\hbase-0.98.4-hadoop2\lib\xz-1.0.jar;C:\hbase-0.98.4-hadoop2\lib\zookeeper-3.4.6.jar
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:java.library.path=C:\Java\jdk1.7.0_67\jre\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:/Users/Administrator/AppData/Local/Genuitec/Common/binary/com.sun.java.jdk.win32.x86_1.6.0.013/jre/bin/client;C:/Users/Administrator/AppData/Local/Genuitec/Common/binary/com.sun.java.jdk.win32.x86_1.6.0.013/jre/bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;d:\oracle;C:\Program Files\ibm\gsk8\lib64;C:\Program Files (x86)\ibm\gsk8\lib;D:\IBM\SQLLIB\BIN;D:\IBM\SQLLIB\FUNCTION;D:\IBM\SQLLIB\SAMPLES\REPL;C:\Program Files\MIT\Kerberos\bin;C:\strawberry\c\bin;C:\strawberry\perl\bin;C:\Java\jdk1.7.0_67\bin;C:\hadoop-2.4.1\bin;C:\apache-maven-3.2.3\bin;C:\hbase-0.98.4-hadoop2\bin;D:\UltraEdit\;.
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:java.io.tmpdir=C:\Users\ADMINI~1\AppData\Local\Temp\
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:os.name=Windows 7
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:os.arch=amd64
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:os.version=6.1
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:user.name=Administrator
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:user.home=C:\Users\Administrator
2014-11-05 10:50:19,131 INFO  [main] zookeeper.ZooKeeper: Client environment:user.dir=F:\项目目录\JavaProject\HbasePut
2014-11-05 10:50:19,132 INFO  [main] zookeeper.ZooKeeper: Initiating client connection, connectString=hadoop4:2181,hadoop3:2181,hadoop2:2181,hadoop1:2181,hadoop5:2181 sessionTimeout=90000 watcher=hconnection-0x14332f47, quorum=hadoop4:2181,hadoop3:2181,hadoop2:2181,hadoop1:2181,hadoop5:2181, baseZNode=/hbase
2014-11-05 10:50:19,174 INFO  [main] zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x14332f47 connecting to ZooKeeper ensemble=hadoop4:2181,hadoop3:2181,hadoop2:2181,hadoop1:2181,hadoop5:2181
2014-11-05 10:50:19,176 INFO  [main-SendThread(hadoop4.updb.com:2181)] zookeeper.ClientCnxn: Opening socket connection to server hadoop4.updb.com/192.168.0.104:2181. Will not attempt to authenticate using SASL (unknown error)
2014-11-05 10:50:19,177 INFO  [main-SendThread(hadoop4.updb.com:2181)] zookeeper.ClientCnxn: Socket connection established to hadoop4.updb.com/192.168.0.104:2181, initiating session
2014-11-05 10:50:19,188 INFO  [main-SendThread(hadoop4.updb.com:2181)] zookeeper.ClientCnxn: Session establishment complete on server hadoop4.updb.com/192.168.0.104:2181, sessionid = 0x44960c88a5f0048, negotiated timeout = 40000
2014-11-05 10:50:20,160 DEBUG [main] client.ClientSmallScanner: Finished with small scan at {ENCODED => 1588230740, NAME => 'hbase:meta,,1', STARTKEY => '', ENDKEY => ''}
First iteration...
Row: row1 
Value: val11 
Row: row1 
Value: val2 
Second iteration...
Row: row1 
Value: val11 
Row: row1 
Value: val2

    As you can see, the data was fetched successfully.
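
    One detail worth noting when batching Gets: table.get(List<Get>) returns one Result per Get, in the same order, and a Result for a row or column that does not exist generally comes back empty rather than null. A defensive iteration could look like the sketch below (the class name PrintResults is illustrative; it reuses the Result[] shape from the example above):

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class PrintResults {
    // Prints every cell of a batch-get result, skipping Gets that matched nothing
    public static void print(Result[] results) {
        for (Result result : results) {
            if (result == null || result.isEmpty()) {
                continue; // nothing was found for this Get
            }
            for (Cell cell : result.rawCells()) {
                String row = Bytes.toString(CellUtil.cloneRow(cell));
                String value = Bytes.toString(CellUtil.cloneValue(cell));
                System.out.println("Row: " + row + ", Value: " + value);
            }
        }
    }
}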
