[HADOOP] java.lang.ClassCastException: org.apache.hadoop.hbase.client.Result cannot be cast to org.apache.hadoop.hbase.client.Mutation
I get an error while transferring values from one HBase table to another:
INFO mapreduce.Job: Task Id : attempt_1410946588060_0019_r_000000_2, Status : FAILED
Error: java.lang.ClassCastException: org.apache.hadoop.hbase.client.Result cannot be cast to org.apache.hadoop.hbase.client.Mutation
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:87)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:576)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:105)
at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:150)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:645)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:405)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
My driver class:
Configuration conf = HBaseConfiguration.create();
// define scan and define column families to scan
Scan scan = new Scan();
scan.addFamily(Bytes.toBytes("cf1"));
// Job job = new Job(conf,"ExampleSummary");
Job job =Job.getInstance(conf);
job.setJarByClass(HBaseDriver.class);
// define input HBase table
TableMapReduceUtil.initTableMapperJob(
"test1",
scan,
HBaseMapper.class,
ImmutableBytesWritable.class,
Result.class,
job);
// define output table
TableMapReduceUtil.initTableReducerJob(
"test2",
HBaseReducer.class,
job);
job.waitForCompletion(true);
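For reference, a driver like this needs roughly the following imports; they are not shown in the post, so treat the exact list as an assumption:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable; // needed once the fix below is applied
import org.apache.hadoop.mapreduce.Job;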
My mapper:
public void map(ImmutableBytesWritable rowKey, Result columns, Context context)
throws IOException, InterruptedException {
try {
// get rowKey and convert it to string
String inKey = new String(rowKey.get());
// set new key having only date
String oKey = inKey.split("#")[0];
// get sales column in byte format first and then convert it to string (as it is stored as string from hbase shell)
byte[] bSales = columns.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("sales"));
String sSales = new String(bSales);
Integer sales = new Integer(sSales);
// emit date and sales values
context.write(new ImmutableBytesWritable(oKey.getBytes()), new IntWritable(sales));
} catch (RuntimeException e) {
e.printStackTrace();
}
}
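The class declaration around this map method is not shown in the post. For the wiring above (and the fix below) to work, it would presumably extend HBase's TableMapper, whose two type parameters are the key and value types the mapper emits; this is a sketch using the class name from the driver:
public class HBaseMapper extends TableMapper<ImmutableBytesWritable, IntWritable> {
// TableMapper already fixes the input types to ImmutableBytesWritable (row key) and Result (row contents),
// so only the output key and value types are declared here
// ... map method from above ...
}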
My reducer:
public void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values, Context context)
throws IOException, InterruptedException {
try {
int sum = 0;
// loop through the different sales values and add them to sum
for (IntWritable sales : values) {
Integer intSales = new Integer(sales.toString());
sum += intSales;
}
// create hbase put with rowkey as date
Put insHBase = new Put(key.get());
// insert sum value to hbase
insHBase.add(Bytes.toBytes("cf1"), Bytes.toBytes("sum"), Bytes.toBytes(sum));
// write data to Hbase table
context.write(null, insHBase);
} catch (Exception e) {
e.printStackTrace();
}
}
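Likewise, the surrounding reducer class is not shown. A declaration consistent with this code would extend TableReducer, which fixes the reducer's output value type to Mutation and is what makes writing a Put to the context legal; again a sketch using the class name from the driver:
public class HBaseReducer extends TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {
// type parameters: input key, input value, output key; the output value type is always Mutation
// ... reduce method from above ...
}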
Solution
1. I found the solution; I just had to change
this:
TableMapReduceUtil.initTableMapperJob( "test1", scan, HBaseMapper.class, ImmutableBytesWritable.class, Result.class, job);
to this:
TableMapReduceUtil.initTableMapperJob( "test1", scan, HBaseMapper.class, ImmutableBytesWritable.class, IntWritable.class, job);
from https://stackoverflow.com/questions/25906753/java-lang-classcastexception-org-apache-hadoop-hbase-client-result-cannot-be-ca by cc-by-sa and MIT license