Why recommended: the highest Hive version the Kylin website currently supports is 1.2.1, while the lowest Hive version on Aliyun EMR is 2.x, so following the official instructions alone will not succeed. The article below is an excellent summary of how to install Kylin on an Aliyun EMR machine; it is a very good guide and also helps in understanding how Kylin works.
I installed successfully by following it.
****NOTE****
Be sure to substitute your own Kylin and EMR version numbers. Also check whether the EMR environment defines the JAVA_LIBRARY_PATH environment variable: it loads the jars under /usr/lib/hive-current/lib/ and causes jar version conflicts (this took a long time to track down). Use
export JAVA_LIBRARY_PATH=:
to neutralize the variable.
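For reference, a minimal shell sketch of checking for and then neutralizing the variable in the current session (on EMR it may be set by a login profile, so also check files such as /etc/profile.d/*):

```shell
# Show whether the variable is set (on EMR it may point at /usr/lib/hive-current/lib/)
echo "JAVA_LIBRARY_PATH=${JAVA_LIBRARY_PATH:-<unset>}"

# Neutralize it so child JVMs stop picking up the EMR Hive jars
export JAVA_LIBRARY_PATH=:
echo "JAVA_LIBRARY_PATH=${JAVA_LIBRARY_PATH}"
```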
Appendix:
1. Is recompiling necessary? Judging from the errors described later, the root cause was that the Hive 1.2.1 jars were not being used (the jars under ****/usr/lib/hive-current/lib/**** were picked up instead); for lack of time I have not tried it.
2. Since dedicated environment variables are involved, it is cleaner to create a separate OS user (e.g. kylin) to run Kylin.
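A minimal sketch of what such a dedicated account's profile could contain (the useradd step itself needs root; kylin_bashrc_demo below is a hypothetical stand-in for /home/kylin/.bashrc):

```shell
# On the real host, first create the account as root:
#   sudo useradd -m kylin
# Here we only demo writing the isolated profile snippet; kylin_bashrc_demo
# is a stand-in for /home/kylin/.bashrc.
cat > kylin_bashrc_demo <<'EOF'
export KYLIN_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x
export HIVE_CONF=/etc/ecm/hive-conf
export JAVA_LIBRARY_PATH=:
EOF
grep -c '^export' kylin_bashrc_demo
```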
The content below is adapted from: 在阿里云EMR環(huán)境下部署Kylin (Deploying Kylin on an Aliyun EMR Cluster)
It took me about one day.
1 Download Kylin
tar zxvf apache-kylin-2.6.3-bin-hbase1x.tar.gz
cd apache-kylin-2.6.3-bin-hbase1x
The rest of this guide assumes extraction into /home/hadoop.
2 Set environment variables
export KYLIN_HOME=$(pwd)
export HIVE_CONF=/etc/ecm/hive-conf (your EMR Hive configuration directory)
export HADOOP_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0
export HIVE_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin
export SPARK_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8
3 Prepare the required files:
3.1 Copy your EMR Hadoop directory /opt/apps/ecm/service/hadoop/2.8.5-1.1.0/package/hadoop-2.8.5-1.1.0 to /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0 (the exact path may differ)
3.2 Download Hive 1.2.1:
wget https://archive.apache.org/dist/hive/hive-1.2.1/apache-hive-1.2.1-bin.tar.gz
and extract it to /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin
3.3 Copy your EMR Spark directory /opt/apps/ecm/service/spark/2.3.2-1.2.0/package/spark-2.3.2-1.2.0-bin-hadoop2.8 to /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8 (the exact path may differ)
3.4 Create a hadoop-conf directory (under $KYLIN_HOME) and run:
ln -s /etc/ecm/hadoop-conf/core-site.xml $KYLIN_HOME/hadoop-conf/core-site.xml
ln -s /etc/ecm/hadoop-conf/hdfs-site.xml $KYLIN_HOME/hadoop-conf/hdfs-site.xml
ln -s /etc/ecm/hadoop-conf/yarn-site.xml $KYLIN_HOME/hadoop-conf/yarn-site.xml
ln -s /etc/ecm/hbase-conf/hbase-site.xml $KYLIN_HOME/hadoop-conf/hbase-site.xml
ln -s /etc/ecm/hive-conf/hive-site.xml $KYLIN_HOME/hadoop-conf/hive-site.xml
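The five links above can also be created in a loop; a sketch, with ./hadoop-conf-demo standing in for $KYLIN_HOME/hadoop-conf:

```shell
# dest is a local stand-in for $KYLIN_HOME/hadoop-conf; on the real host,
# use that path instead.
dest=./hadoop-conf-demo
mkdir -p "$dest"
for f in /etc/ecm/hadoop-conf/core-site.xml \
         /etc/ecm/hadoop-conf/hdfs-site.xml \
         /etc/ecm/hadoop-conf/yarn-site.xml \
         /etc/ecm/hbase-conf/hbase-site.xml \
         /etc/ecm/hive-conf/hive-site.xml; do
  # -sf: symlink, replacing any stale link from a previous attempt
  ln -sf "$f" "$dest/$(basename "$f")"
done
ls "$dest"
```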
Then edit conf/kylin.properties and change one line:
kylin.env.hadoop-conf-dir=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-conf
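The edit can be scripted with sed; a sketch against a scratch copy (on the real host, point it at $KYLIN_HOME/conf/kylin.properties instead):

```shell
# Scratch copy with a placeholder value, standing in for conf/kylin.properties
echo 'kylin.env.hadoop-conf-dir=/etc/hadoop/conf' > kylin.properties.demo

# Point Kylin at the hand-assembled hadoop-conf directory
sed -i 's|^kylin.env.hadoop-conf-dir=.*|kylin.env.hadoop-conf-dir=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-conf|' kylin.properties.demo
cat kylin.properties.demo
```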
4 Replace Tomcat
The bundled Tomcat is broken. Copy its webapps/kylin.war out as a backup, delete the tomcat directory, then download Tomcat 8:
wget http://mirrors.tuna.tsinghua.edu.cn/apache/tomcat/tomcat-8/v8.5.43/bin/apache-tomcat-8.5.43.tar.gz
Extract it, rename the directory to tomcat, and copy kylin.war back into webapps.
Change the web port in conf/server.xml to 7070.
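The port change can likewise be scripted; a sketch against a one-line scratch copy of the HTTP Connector element (the real file is tomcat/conf/server.xml, whose connector ships on port 8080):

```shell
# Scratch copy of the relevant Connector line from tomcat/conf/server.xml
cat > server.xml.demo <<'EOF'
<Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" />
EOF

# Kylin serves its web UI on 7070, so the connector must match
sed -i 's/port="8080"/port="7070"/' server.xml.demo
grep 'Connector' server.xml.demo
```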
Then run:
rm ./tomcat/webapps/kylin/WEB-INF/lib/slf4j-api-1.7.21.jar
cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar ./tomcat/webapps/kylin/WEB-INF/lib/
5 Hack Hive
The commands and the Java edits are as follows:
cd /tmp/
mkdir hive-jdbc-1.2.1-standalone
cd hive-jdbc-1.2.1-standalone/
cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-jdbc-1.2.1-standalone.jar .
unzip hive-jdbc-1.2.1-standalone.jar
rm *.jar
cd /tmp/
mkdir hive-metastore-1.2.1.spark2
cd hive-metastore-1.2.1.spark2/
cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-metastore-1.2.1.spark2.jar .
unzip hive-metastore-1.2.1.spark2.jar
rm *.jar
cd /tmp/
mkdir hive-metastore-1.2.1
cd hive-metastore-1.2.1
cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar .
unzip hive-metastore-1.2.1.jar
rm *.jar
cd /tmp/
mkdir hive-exec-1.2.1
cd hive-exec-1.2.1/
cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar .
unzip hive-exec-1.2.1.jar
rm *.jar
cd /tmp/
mkdir hive-common-1.2.1
cd hive-common-1.2.1/
cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar .
unzip hive-common-1.2.1.jar
rm *.jar
cd /tmp/
mkdir hive-exec-1.2.1.spark2
cd hive-exec-1.2.1.spark2/
cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-exec-1.2.1.spark2.jar .
unzip hive-exec-1.2.1.spark2.jar
rm *.jar
cd /tmp/
wget https://archive.apache.org/dist/hive/hive-1.2.1/apache-hive-1.2.1-src.tar.gz
tar zxvf apache-hive-1.2.1-src.tar.gz
cd /tmp/apache-hive-1.2.1-src
cd metastore/src/java/
cp org/apache/hadoop/hive/metastore/MetaStoreUtils.java /tmp/
cp org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java /tmp/
cp org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java /tmp/
rm -r org
mkdir -p org/apache/hadoop/hive/metastore/utils
cp /tmp/MetaStoreUtils.java org/apache/hadoop/hive/metastore/utils/
cp /tmp/HiveMetaStoreClient.java org/apache/hadoop/hive/metastore/
cp /tmp/RetryingMetaStoreClient.java org/apache/hadoop/hive/metastore/
vi org/apache/hadoop/hive/metastore/utils/MetaStoreUtils.java
vi org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java
vi org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java
javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libfb303-0.9.2.jar org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java
javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/ org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java
javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libfb303-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/jsr305-3.0.0.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-jdbc-1.2.1-standalone.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.8.5.jar org/apache/hadoop/hive/metastore/utils/MetaStoreUtils.java
cd /tmp/apache-hive-1.2.1-src
cd ql/src/java/
cp org/apache/hadoop/hive/ql/io/AcidUtils.java /tmp/
rm -r org/apache/hadoop/hive/ql/*
mkdir -p org/apache/hadoop/hive/ql/io
cp /tmp/AcidUtils.java org/apache/hadoop/hive/ql/io/
vi org/apache/hadoop/hive/ql/io/AcidUtils.java
javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libfb303-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar org/apache/hadoop/hive/ql/io/AcidUtils.java
cd /tmp/apache-hive-1.2.1-src
cd common/src/java/
cp org/apache/hive/common/util/ShutdownHookManager.java /tmp/
rm -r org/
cp /tmp/ShutdownHookManager.java org/apache/hive/common/util/
vi org/apache/hive/common/util/ShutdownHookManager.java
javac -cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar org/apache/hive/common/util/ShutdownHookManager.java
cd /tmp/hive-common-1.2.1/
cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/
zip -r /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar *
cd ../hive-exec-1.2.1/
cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/
cp /tmp/apache-hive-1.2.1-src/ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils* org/apache/hadoop/hive/ql/io/
zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar *
zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-exec-1.2.1.jar *
cd ../hive-exec-1.2.1.spark2/
cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/
cp /tmp/apache-hive-1.2.1-src/ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils* org/apache/hadoop/hive/ql/io/
zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-exec-1.2.1.spark2.jar *
cd /tmp/hive-jdbc-1.2.1-standalone/
cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/
cp /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient* org/apache/hadoop/hive/metastore/
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/utils org/apache/hadoop/hive/metastore/
zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-jdbc-1.2.1-standalone.jar *
cd /tmp/hive-metastore-1.2.1
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/utils org/apache/hadoop/hive/metastore/
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.* org/apache/hadoop/hive/metastore/
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient* org/apache/hadoop/hive/metastore/
zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar *
cd ../hive-metastore-1.2.1.spark2/
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/utils org/apache/hadoop/hive/metastore/
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.* org/apache/hadoop/hive/metastore/
cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient* org/apache/hadoop/hive/metastore/
zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-metastore-1.2.1.spark2.jar *
Java file modifications:
5.1 AcidUtils.java
Add:
public static class AcidOperationalProperties {
private int description = 0x00;
public static final int SPLIT_UPDATE_BIT = 0x01;
public static final String SPLIT_UPDATE_STRING = "split_update";
public static final int HASH_BASED_MERGE_BIT = 0x02;
public static final String HASH_BASED_MERGE_STRING = "hash_merge";
public static final int INSERT_ONLY_BIT = 0x04;
public static final String INSERT_ONLY_STRING = "insert_only";
public static final String DEFAULT_VALUE_STRING = "default";
public static final String INSERTONLY_VALUE_STRING = "insert_only";
private AcidOperationalProperties() {
}
/**
* Returns an acidOperationalProperties object that represents default ACID behavior for tables
* that do not explicitly specify/override the default behavior.
* @return the acidOperationalProperties object.
*/
public static AcidOperationalProperties getDefault() {
AcidOperationalProperties obj = new AcidOperationalProperties();
obj.setSplitUpdate(true);
obj.setHashBasedMerge(false);
obj.setInsertOnly(false);
return obj;
}
/**
* Returns an acidOperationalProperties object for tables that uses ACID framework but only
* supports INSERT operation and does not require ORC or bucketing
* @return the acidOperationalProperties object
*/
public static AcidOperationalProperties getInsertOnly() {
AcidOperationalProperties obj = new AcidOperationalProperties();
obj.setInsertOnly(true);
return obj;
}
/**
* Returns an acidOperationalProperties object that is represented by an encoded string.
* @param propertiesStr an encoded string representing the acidOperationalProperties.
* @return the acidOperationalProperties object.
*/
public static AcidOperationalProperties parseString(String propertiesStr) {
if (propertiesStr == null) {
  return AcidOperationalProperties.getDefault();
}
if (propertiesStr.equalsIgnoreCase(DEFAULT_VALUE_STRING)) {
  return AcidOperationalProperties.getDefault();
}
if (propertiesStr.equalsIgnoreCase(INSERTONLY_VALUE_STRING)) {
  return AcidOperationalProperties.getInsertOnly();
}
AcidOperationalProperties obj = new AcidOperationalProperties();
String[] options = propertiesStr.split("\\|");
for (String option : options) {
  if (option.trim().length() == 0) continue; // ignore empty strings
  switch (option) {
    case SPLIT_UPDATE_STRING:
      obj.setSplitUpdate(true);
      break;
    case HASH_BASED_MERGE_STRING:
      obj.setHashBasedMerge(true);
      break;
    default:
      throw new IllegalArgumentException(
          "Unexpected value " + option + " for ACID operational properties!");
  }
}
return obj;
}
/**
* Returns an acidOperationalProperties object that is represented by an encoded 32-bit integer.
* @param properties an encoded 32-bit representing the acidOperationalProperties.
* @return the acidOperationalProperties object.
*/
public static AcidOperationalProperties parseInt(int properties) {
AcidOperationalProperties obj = new AcidOperationalProperties();
if ((properties & SPLIT_UPDATE_BIT) > 0) {
  obj.setSplitUpdate(true);
}
if ((properties & HASH_BASED_MERGE_BIT) > 0) {
  obj.setHashBasedMerge(true);
}
if ((properties & INSERT_ONLY_BIT) > 0) {
  obj.setInsertOnly(true);
}
return obj;
}
/**
* Sets the split update property for ACID operations based on the boolean argument.
* When split update is turned on, an update ACID event is interpreted as a combination of
* delete event followed by an update event.
* @param isSplitUpdate a boolean property that turns on split update when true.
* @return the acidOperationalProperties object.
*/
public AcidOperationalProperties setSplitUpdate(boolean isSplitUpdate) {
description = (isSplitUpdate
    ? (description | SPLIT_UPDATE_BIT)
    : (description & ~SPLIT_UPDATE_BIT));
return this;
}
/**
* Sets the hash-based merge property for ACID operations that combines delta files using
* GRACE hash join based approach, when turned on. (Currently unimplemented!)
* @param isHashBasedMerge a boolean property that turns on hash-based merge when true.
* @return the acidOperationalProperties object.
*/
public AcidOperationalProperties setHashBasedMerge(boolean isHashBasedMerge) {
description = (isHashBasedMerge
    ? (description | HASH_BASED_MERGE_BIT)
    : (description & ~HASH_BASED_MERGE_BIT));
return this;
}
public AcidOperationalProperties setInsertOnly(boolean isInsertOnly) {
description = (isInsertOnly
    ? (description | INSERT_ONLY_BIT)
    : (description & ~INSERT_ONLY_BIT));
return this;
}
public boolean isSplitUpdate() {
return (description & SPLIT_UPDATE_BIT) > 0;
}
public boolean isHashBasedMerge() {
return (description & HASH_BASED_MERGE_BIT) > 0;
}
public boolean isInsertOnly() {
return (description & INSERT_ONLY_BIT) > 0;
}
public int toInt() {
return description;
}
@Override
public String toString() {
StringBuilder str = new StringBuilder();
if (isSplitUpdate()) {
  str.append("|" + SPLIT_UPDATE_STRING);
}
if (isHashBasedMerge()) {
  str.append("|" + HASH_BASED_MERGE_STRING);
}
if (isInsertOnly()) {
  str.append("|" + INSERT_ONLY_STRING);
}
return str.toString();
}
}
public static AcidOperationalProperties getAcidOperationalProperties(
    java.util.Map<String, String> parameters) {
  return AcidOperationalProperties.getDefault();
}
public static void setAcidOperationalProperties(java.util.Map<String, String> parameters,
boolean isTxnTable, AcidOperationalProperties properties) {
}
public static boolean isTablePropertyTransactional(java.util.Map m) { return false; }
5.2 HiveMetaStoreClient.java
Add:
public HiveMetaStoreClient(org.apache.hadoop.conf.Configuration conf, HiveMetaHookLoader hookLoader, Boolean b) throws MetaException {
this((HiveConf) conf, hookLoader);
}
5.3 RetryingMetaStoreClient.java
Add:
public static IMetaStoreClient getProxy(org.apache.hadoop.conf.Configuration hiveConf, Class<?>[] constructorArgTypes,
Object[] constructorArgs, String mscClassName) throws MetaException {
return getProxy((HiveConf) hiveConf, constructorArgTypes, constructorArgs, mscClassName);
}
5.4 ShutdownHookManager.java
Add:
public static void addShutdownHook(Runnable shutdownHook) {
addShutdownHook(shutdownHook, 1);
}
5.5 MetaStoreUtils.java
Add:
package org.apache.hadoop.hive.metastore.utils; (append utils to the package declaration and move the file into the matching directory; the rest of the original Java file is unchanged)
import org.apache.hadoop.hive.metastore.*;
public static String getColumnNameDelimiter(List<FieldSchema> fieldSchemas) {
// we first take a look if any fieldSchemas contain COMMA
for (int i = 0; i < fieldSchemas.size(); i++) {
if (fieldSchemas.get(i).getName().contains(",")) {
return String.valueOf('\0');
}
}
return String.valueOf(',');
}
6 Start Kylin
$KYLIN_HOME/bin/kylin.sh start
Log in with admin/KYLIN. A 404 means something went wrong; the logs are under $KYLIN_HOME/logs.
Fixes for some common errors:
6.1 Complains that .keystore cannot be found
Just mkdir conf/.keystore
6.2 Complains that contrib/capacity-scheduler/*.jar cannot be found
Just mkdir -p hadoop-2.8.5-1.1.0/contrib/capacity-scheduler/
then, in that directory, touch dummy followed by zip -r dummy.jar dummy
6.3 Complains "More than one fragment with the name org_apache_tomcat_websocket"
Delete tomcat/webapps/kylin/WEB-INF/lib/jul-to-slf4j-1.7.5.jar and jcl-over-slf4j-1.7.21.jar
6.4 Complains that HiveHook or similar classes cannot be found
cp /usr/lib/hive-current/lib/meta-hive-hook-1.0.1.jar apache-hive-1.2.1-bin/lib/
6.5 Complains about a loader constraint violation
rm tomcat/webapps/kylin/WEB-INF/lib/slf4j-api-1.7.21.jar
cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar ./tomcat/webapps/kylin/WEB-INF/lib/
rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/slf4j-api-1.7.16.jar
cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/
rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/jul-to-slf4j-1.7.16.jar
rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/jcl-over-slf4j-1.7.16.jar
rm ./apache-hive-1.2.1-bin/hcatalog/share/webhcat/svr/lib/jul-to-slf4j-1.7.5.jar
rm ./hadoop-2.8.5-1.1.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/jul-to-slf4j-1.7.10.jar
rm ./hadoop-2.8.5-1.1.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/slf4j-log4j12-1.7.10.jar
rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/slf4j-log4j12-1.7.16.jar
cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/
cp hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar tomcat/webapps/kylin/WEB-INF/lib/
6.6 Complains about missing derbyLocale jars
Harmless; these can be ignored.
7 Prepare the sample data
${KYLIN_HOME}/bin/sample.sh
If it does not succeed, log in again and adjust the environment variables:
export KYLIN_HOME=$(pwd)
export HIVE_CONF=/etc/ecm/hive-conf (your EMR Hive configuration directory)
export HADOOP_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0
(i.e. drop the last two lines from step 2)
In the web UI, open the /System/ page, run Reload Metadata, and refresh.
The Model page should now show kylin_sales_cube with status DISABLED.
8 Build the sample cube
On the /Model/ page choose Build from the Actions menu, select the range January 1, 2012 to January 2, 2012, confirm, and then watch the result on the Monitor page.
It should succeed after roughly 10 minutes. Then open the Insight page, pick learn_kylin in the -- Choose Project -- dropdown at the top, and select count(*) from KYLIN_SALES should return a correct count.
If the build fails, check the Yarn error logs or contact me on DingTalk: 13699124376.