Deploying Kylin on Alibaba Cloud EMR

Why this writeup: the highest Hive version the Kylin official docs currently support is 1.2.1, while the lowest Hive version on Alibaba Cloud EMR is 2.x, so installing strictly per the official docs will not work. The article below is an excellent summary of how to install Kylin on Alibaba Cloud EMR machines; it is a very useful guide and also helps in understanding how Kylin works.

I installed Kylin successfully by following the article below.

NOTE

Be sure to substitute your own Kylin and EMR version numbers. Also check whether the EMR environment sets the JAVA_LIBRARY_PATH environment variable: it causes the jars under /usr/lib/hive-current/lib/ to be loaded, producing jar version conflicts (this took a long time to track down). Use

export JAVA_LIBRARY_PATH=:

to neutralize this variable.
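
An illustrative before/after check:

env | grep '^JAVA_LIBRARY_PATH='                 # before: shows whatever the EMR image injected
echo "JAVA_LIBRARY_PATH=[$JAVA_LIBRARY_PATH]"    # after the export above: should print [:]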

Notes:

1. Is building from source necessary? Judging from the errors described later, the root cause was that the hive-1.2.1 jars were not used (the jars under /usr/lib/hive-current/lib/ were loaded instead); for lack of time I never tried it.

2. Since dedicated environment variables are involved, it is cleaner to create a separate OS user (e.g. kylin) to run Kylin; a sketch follows.
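
A minimal sketch (names and paths are illustrative):

sudo useradd -m kylin                                                  # dedicated account for the Kylin service
echo 'export JAVA_LIBRARY_PATH=:' | sudo tee -a /home/kylin/.bashrc    # scope the override to this user only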

The following is reproduced from: 在阿里云EMR環(huán)境下部署Kylin (Deploying Kylin on Alibaba Cloud EMR).

It took me about a day.

1 Download Kylin

wget http://mirrors.tuna.tsinghua.edu.cn/apache/kylin/apache-kylin-2.6.3/apache-kylin-2.6.3-bin-hbase1x.tar.gz

tar zxvf apache-kylin-2.6.3-bin-hbase1x.tar.gz

cd apache-kylin-2.6.3-bin-hbase1x

The rest of this guide assumes the archive is extracted under /home/hadoop.

2 Set environment variables

export KYLIN_HOME=`pwd`

export HIVE_CONF=/etc/ecm/hive-conf (your EMR Hive configuration path)

export HADOOP_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0

export HIVE_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin

export SPARK_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8
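
These exports only last for the current shell; collecting them in a script you can source is convenient (a sketch; adjust paths to your layout, and note that step 7 below drops the last two lines again before running sample.sh):

cat > /home/hadoop/kylin-env.sh <<'EOF'
export KYLIN_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x
export HIVE_CONF=/etc/ecm/hive-conf
export HADOOP_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0
export HIVE_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin
export SPARK_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8
EOF
source /home/hadoop/kylin-env.sh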

3 Prepare the required files:

3.1 Copy your EMR Hadoop directory /opt/apps/ecm/service/hadoop/2.8.5-1.1.0/package/hadoop-2.8.5-1.1.0 to /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0 (the exact path may differ)

3.2 Download Hive 1.2.1

wget https://archive.apache.org/dist/hive/hive-1.2.1/apache-hive-1.2.1-bin.tar.gz

and extract it to /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin

3.3 Copy your EMR Spark directory /opt/apps/ecm/service/spark/2.3.2-1.2.0/package/spark-2.3.2-1.2.0-bin-hadoop2.8 to /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8 (the exact path may differ)
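
For reference, the two copies (3.1 and 3.3) as commands, using the example paths from this walkthrough (yours may differ):

cp -r /opt/apps/ecm/service/hadoop/2.8.5-1.1.0/package/hadoop-2.8.5-1.1.0 /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0
cp -r /opt/apps/ecm/service/spark/2.3.2-1.2.0/package/spark-2.3.2-1.2.0-bin-hadoop2.8 /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8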

3.4 Create the directory hadoop-conf (mkdir -p $KYLIN_HOME/hadoop-conf) and run:

ln -s /etc/ecm/hadoop-conf/core-site.xml $KYLIN_HOME/hadoop-conf/core-site.xml

ln -s /etc/ecm/hadoop-conf/hdfs-site.xml $KYLIN_HOME/hadoop-conf/hdfs-site.xml

ln -s /etc/ecm/hadoop-conf/yarn-site.xml $KYLIN_HOME/hadoop-conf/yarn-site.xml

ln -s /etc/ecm/hbase-conf/hbase-site.xml $KYLIN_HOME/hadoop-conf/hbase-site.xml

ln -s /etc/ecm/hive-conf/hive-site.xml $KYLIN_HOME/hadoop-conf/hive-site.xml

Then edit conf/kylin.properties

and change one line: kylin.env.hadoop-conf-dir=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-conf
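
A quick check that all five links resolve to real files (illustrative):

ls -lL $KYLIN_HOME/hadoop-conf/*.xml    # an error here means a symlink target is missing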

4 Replace Tomcat

The bundled Tomcat is broken. Copy webapps/kylin.war out as a backup, delete the tomcat directory, and download Tomcat 8:

wget http://mirrors.tuna.tsinghua.edu.cn/apache/tomcat/tomcat-8/v8.5.43/bin/apache-tomcat-8.5.43.tar.gz

Extract it, rename the directory to tomcat, and copy kylin.war back into webapps, as sketched below.
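
As commands (a sketch; file names match the wget above):

cp tomcat/webapps/kylin.war /tmp/kylin.war.bak    # back up the war first
rm -rf tomcat
tar zxf apache-tomcat-8.5.43.tar.gz
mv apache-tomcat-8.5.43 tomcat
cp /tmp/kylin.war.bak tomcat/webapps/kylin.war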

Change the web port in conf/server.xml to 7070.
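
Tomcat 8.5 listens on 8080 by default, so a one-liner does it (a sketch):

sed -i 's/port="8080"/port="7070"/' tomcat/conf/server.xml    # flip the HTTP connector port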

Run:

rm ./tomcat/webapps/kylin/WEB-INF/lib/slf4j-api-1.7.21.jar

cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar ./tomcat/webapps/kylin/WEB-INF/lib/

5 Hack Hive

The commands and the Java modifications are as follows:

cd /tmp/

mkdir hive-jdbc-1.2.1-standalone

cd hive-jdbc-1.2.1-standalone/

cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-jdbc-1.2.1-standalone.jar .

unzip hive-jdbc-1.2.1-standalone.jar

rm *.jar

cd /tmp/

mkdir hive-metastore-1.2.1.spark2

cd hive-metastore-1.2.1.spark2/

cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-metastore-1.2.1.spark2.jar .

unzip hive-metastore-1.2.1.spark2.jar

rm *.jar

cd /tmp/

mkdir hive-metastore-1.2.1

cd hive-metastore-1.2.1

cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar .

unzip hive-metastore-1.2.1.jar

rm *.jar

cd /tmp/

mkdir hive-exec-1.2.1

cd hive-exec-1.2.1/

cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar .

unzip hive-exec-1.2.1.jar

rm *.jar

cd /tmp/

mkdir hive-common-1.2.1

cd hive-common-1.2.1/

cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar .

unzip hive-common-1.2.1.jar

rm *.jar

cd /tmp/

mkdir hive-exec-1.2.1.spark2

cd hive-exec-1.2.1.spark2/

cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-exec-1.2.1.spark2.jar .

unzip hive-exec-1.2.1.spark2.jar

rm *.jar
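
The six unpack steps above all follow the same pattern, so a short loop is equivalent (a sketch using the same paths):

cd /tmp
KYLIN_DIR=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x
for jar in $KYLIN_DIR/apache-hive-1.2.1-bin/lib/hive-{jdbc-1.2.1-standalone,metastore-1.2.1,exec-1.2.1,common-1.2.1}.jar \
           $KYLIN_DIR/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-{metastore,exec}-1.2.1.spark2.jar; do
  dir=$(basename "$jar" .jar)                      # e.g. hive-common-1.2.1
  mkdir -p "$dir"
  (cd "$dir" && unzip -q "$jar" && rm -f *.jar)    # unpack the classes, drop any nested jars
done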

cd /tmp/

wget https://archive.apache.org/dist/hive/hive-1.2.1/apache-hive-1.2.1-src.tar.gz

tar zxvf apache-hive-1.2.1-src.tar.gz

cd /tmp/apache-hive-1.2.1-src

cd metastore/src/java/

cp org/apache/hadoop/hive/metastore/MetaStoreUtils.java /tmp/

cp org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java /tmp/

cp org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java /tmp/

rm -r org

mkdir -p org/apache/hadoop/hive/metastore/utils

cp /tmp/MetaStoreUtils.java org/apache/hadoop/hive/metastore/utils/

cp /tmp/HiveMetaStoreClient.java org/apache/hadoop/hive/metastore/

cp /tmp/RetryingMetaStoreClient.java org/apache/hadoop/hive/metastore/

vi org/apache/hadoop/hive/metastore/utils/MetaStoreUtils.java

vi org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java

vi org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java

javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libfb303-0.9.2.jar org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java

javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/ org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.java

javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libfb303-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/jsr305-3.0.0.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-jdbc-1.2.1-standalone.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.8.5.jar org/apache/hadoop/hive/metastore/utils/MetaStoreUtils.java

cd /tmp/apache-hive-1.2.1-src

cd ql/src/java/

cp org/apache/hadoop/hive/ql/io/AcidUtils.java /tmp/

rm -r org/apache/hadoop/hive/ql/*

mkdir -p org/apache/hadoop/hive/ql/io

cp /tmp/AcidUtils.java org/apache/hadoop/hive/ql/io/

vi org/apache/hadoop/hive/ql/io/AcidUtils.java

javac -cp .:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libthrift-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-lang-2.6.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/guava-14.0.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0/share/hadoop/common/hadoop-common-2.8.5.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-cli-1.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-shims-common-1.2.1.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/libfb303-0.9.2.jar:/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar org/apache/hadoop/hive/ql/io/AcidUtils.java

cd /tmp/apache-hive-1.2.1-src

cd common/src/java/

cp org/apache/hive/common/util/ShutdownHookManager.java /tmp/

rm -r org/

cp /tmp/ShutdownHookManager.java org/apache/hive/common/util/

vi org/apache/hive/common/util/ShutdownHookManager.java

javac -cp /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/commons-logging-1.1.3.jar org/apache/hive/common/util/ShutdownHookManager.java

cd /tmp/hive-common-1.2.1/

cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/

zip -r /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar *

cd ../hive-exec-1.2.1/

cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/

cp /tmp/apache-hive-1.2.1-src/ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils* org/apache/hadoop/hive/ql/io/

zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar *

zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-exec-1.2.1.jar *

cd ../hive-exec-1.2.1.spark2/

cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/

cp /tmp/apache-hive-1.2.1-src/ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils* org/apache/hadoop/hive/ql/io/

zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-exec-1.2.1.spark2.jar *

cd /tmp/hive-jdbc-1.2.1-standalone/

cp /tmp/apache-hive-1.2.1-src/common/src/java/org/apache/hive/common/util/ShutdownHookManager* org/apache/hive/common/util/

cp /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient* org/apache/hadoop/hive/metastore/

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/utils org/apache/hadoop/hive/metastore/

zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-jdbc-1.2.1-standalone.jar *

cd /tmp/hive-metastore-1.2.1

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/utils org/apache/hadoop/hive/metastore/

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.* org/apache/hadoop/hive/metastore/

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient* org/apache/hadoop/hive/metastore/

zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-metastore-1.2.1.jar *

cd ../hive-metastore-1.2.1.spark2/

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/utils org/apache/hadoop/hive/metastore/

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/RetryingMetaStoreClient.* org/apache/hadoop/hive/metastore/

cp -r /tmp/apache-hive-1.2.1-src/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient* org/apache/hadoop/hive/metastore/

zip -r -q /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/spark-2.3.2-1.2.0-bin-hadoop2.8/jars/hive-metastore-1.2.1.spark2.jar *
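
To confirm a repack actually took effect, listing the archive is a quick check (illustrative; the copied-in .java source only appears after the zip step):

unzip -l /home/hadoop/apache-kylin-2.6.3-bin-hbase1x/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar | grep ShutdownHookManager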

Java source modifications:

5.1 AcidUtils.java

Add the following (it backports the AcidOperationalProperties API that newer Hive exposes but 1.2.1 lacks):


public static class AcidOperationalProperties {

  private int description = 0x00;

  public static final int SPLIT_UPDATE_BIT = 0x01;
  public static final String SPLIT_UPDATE_STRING = "split_update";
  public static final int HASH_BASED_MERGE_BIT = 0x02;
  public static final String HASH_BASED_MERGE_STRING = "hash_merge";
  public static final int INSERT_ONLY_BIT = 0x04;
  public static final String INSERT_ONLY_STRING = "insert_only";
  public static final String DEFAULT_VALUE_STRING = "default";
  public static final String INSERTONLY_VALUE_STRING = "insert_only";

  private AcidOperationalProperties() {
  }

  /**
   * Returns an acidOperationalProperties object that represents default ACID behavior for tables
   * that do not explicitly specify/override the default behavior.
   * @return the acidOperationalProperties object.
   */
  public static AcidOperationalProperties getDefault() {
    AcidOperationalProperties obj = new AcidOperationalProperties();
    obj.setSplitUpdate(true);
    obj.setHashBasedMerge(false);
    obj.setInsertOnly(false);
    return obj;
  }

  /**
   * Returns an acidOperationalProperties object for tables that use the ACID framework but only
   * support the INSERT operation and do not require ORC or bucketing.
   * @return the acidOperationalProperties object.
   */
  public static AcidOperationalProperties getInsertOnly() {
    AcidOperationalProperties obj = new AcidOperationalProperties();
    obj.setInsertOnly(true);
    return obj;
  }

  /**
   * Returns an acidOperationalProperties object that is represented by an encoded string.
   * @param propertiesStr an encoded string representing the acidOperationalProperties.
   * @return the acidOperationalProperties object.
   */
  public static AcidOperationalProperties parseString(String propertiesStr) {
    if (propertiesStr == null) {
      return AcidOperationalProperties.getDefault();
    }
    if (propertiesStr.equalsIgnoreCase(DEFAULT_VALUE_STRING)) {
      return AcidOperationalProperties.getDefault();
    }
    if (propertiesStr.equalsIgnoreCase(INSERTONLY_VALUE_STRING)) {
      return AcidOperationalProperties.getInsertOnly();
    }
    AcidOperationalProperties obj = new AcidOperationalProperties();
    String[] options = propertiesStr.split("\\|");
    for (String option : options) {
      if (option.trim().length() == 0) continue; // ignore empty strings
      switch (option) {
        case SPLIT_UPDATE_STRING:
          obj.setSplitUpdate(true);
          break;
        case HASH_BASED_MERGE_STRING:
          obj.setHashBasedMerge(true);
          break;
        default:
          throw new IllegalArgumentException(
              "Unexpected value " + option + " for ACID operational properties!");
      }
    }
    return obj;
  }

  /**
   * Returns an acidOperationalProperties object that is represented by an encoded 32-bit integer.
   * @param properties an encoded 32-bit integer representing the acidOperationalProperties.
   * @return the acidOperationalProperties object.
   */
  public static AcidOperationalProperties parseInt(int properties) {
    AcidOperationalProperties obj = new AcidOperationalProperties();
    if ((properties & SPLIT_UPDATE_BIT) > 0) {
      obj.setSplitUpdate(true);
    }
    if ((properties & HASH_BASED_MERGE_BIT) > 0) {
      obj.setHashBasedMerge(true);
    }
    if ((properties & INSERT_ONLY_BIT) > 0) {
      obj.setInsertOnly(true);
    }
    return obj;
  }

  /**
   * Sets the split update property for ACID operations based on the boolean argument.
   * When split update is turned on, an update ACID event is interpreted as a combination of
   * a delete event followed by an update event.
   * @param isSplitUpdate a boolean property that turns on split update when true.
   * @return the acidOperationalProperties object.
   */
  public AcidOperationalProperties setSplitUpdate(boolean isSplitUpdate) {
    description = (isSplitUpdate
        ? (description | SPLIT_UPDATE_BIT) : (description & ~SPLIT_UPDATE_BIT));
    return this;
  }

  /**
   * Sets the hash-based merge property for ACID operations that combines delta files using
   * a GRACE hash join based approach, when turned on. (Currently unimplemented!)
   * @param isHashBasedMerge a boolean property that turns on hash-based merge when true.
   * @return the acidOperationalProperties object.
   */
  public AcidOperationalProperties setHashBasedMerge(boolean isHashBasedMerge) {
    description = (isHashBasedMerge
        ? (description | HASH_BASED_MERGE_BIT) : (description & ~HASH_BASED_MERGE_BIT));
    return this;
  }

  public AcidOperationalProperties setInsertOnly(boolean isInsertOnly) {
    description = (isInsertOnly
        ? (description | INSERT_ONLY_BIT) : (description & ~INSERT_ONLY_BIT));
    return this;
  }

  public boolean isSplitUpdate() {
    return (description & SPLIT_UPDATE_BIT) > 0;
  }

  public boolean isHashBasedMerge() {
    return (description & HASH_BASED_MERGE_BIT) > 0;
  }

  public boolean isInsertOnly() {
    return (description & INSERT_ONLY_BIT) > 0;
  }

  public int toInt() {
    return description;
  }

  @Override
  public String toString() {
    StringBuilder str = new StringBuilder();
    if (isSplitUpdate()) {
      str.append("|" + SPLIT_UPDATE_STRING);
    }
    if (isHashBasedMerge()) {
      str.append("|" + HASH_BASED_MERGE_STRING);
    }
    if (isInsertOnly()) {
      str.append("|" + INSERT_ONLY_STRING);
    }
    return str.toString();
  }
}

// Top-level stubs on AcidUtils: this setup does not use ACID tables, so they
// always report the default / non-transactional case.
public static AcidOperationalProperties getAcidOperationalProperties(
    java.util.Map<String, String> parameters) {
  return AcidOperationalProperties.getDefault();
}

public static void setAcidOperationalProperties(java.util.Map<String, String> parameters,
    boolean isTxnTable, AcidOperationalProperties properties) {
}

public static boolean isTablePropertyTransactional(java.util.Map m) { return false; }

5.2 HiveMetaStoreClient.java

Add the following bridge constructor (callers compiled against a newer Hive pass a Hadoop Configuration plus an extra Boolean, a signature Hive 1.2.1 lacks; it simply delegates):

public HiveMetaStoreClient(org.apache.hadoop.conf.Configuration conf, HiveMetaHookLoader hookLoader, Boolean b) throws MetaException {

this((HiveConf) conf, hookLoader);

}

5.3 RetryingMetaStoreClient.java

Add the following overload (same idea: it accepts a plain Hadoop Configuration and delegates):

public static IMetaStoreClient getProxy(org.apache.hadoop.conf.Configuration hiveConf, Class<?>[] constructorArgTypes,

  Object[] constructorArgs, String mscClassName) throws MetaException {

return getProxy((HiveConf) hiveConf, constructorArgTypes, constructorArgs, mscClassName);

}

5.4 ShutdownHookManager.java

Add the following overload (for callers that pass only a Runnable; it supplies a priority of 1):

public static void addShutdownHook(Runnable shutdownHook) {

addShutdownHook(shutdownHook, 1);

}

5.5 MetaStoreUtils.java

Add the following (a copy of MetaStoreUtils relocated to the org.apache.hadoop.hive.metastore.utils package that newer callers expect):


package org.apache.hadoop.hive.metastore.utils; (append "utils" to the package declaration and place the file in the matching new directory; the original Java file stays unchanged)

import org.apache.hadoop.hive.metastore.*;

public static String getColumnNameDelimiter(List<FieldSchema> fieldSchemas) {

// we first take a look if any fieldSchemas contain COMMA

for (int i = 0; i < fieldSchemas.size(); i++) {

  if (fieldSchemas.get(i).getName().contains(",")) {

    return String.valueOf('\0');

  }

}

return String.valueOf(',');

}


6 Start Kylin

$KYLIN_HOME/bin/kylin.sh start

Visit http://<hostname>:7070/kylin/

Log in with admin/KYLIN. A 404 means something went wrong; the logs are under $KYLIN_HOME/logs.
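
A quick reachability check from the node itself (illustrative):

curl -s -o /dev/null -w '%{http_code}\n' http://localhost:7070/kylin/    # 200 (or a redirect) is good; 404 means check the logs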

Fixes for some errors:

6.1 Error: .keystore not found

Simply run mkdir conf/.keystore

6.2 Error: contrib/capacity-scheduler/*.jar not found

Simply run mkdir -p hadoop-2.8.5-1.1.0/contrib/capacity-scheduler/

then, in that directory, touch dummy and zip -r dummy.jar dummy (copy-pasteable version below).
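
mkdir -p hadoop-2.8.5-1.1.0/contrib/capacity-scheduler/
cd hadoop-2.8.5-1.1.0/contrib/capacity-scheduler/
touch dummy
zip -r dummy.jar dummy    # an empty placeholder jar just satisfies the classpath glob
cd -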

6.3 Error: More than one fragment with the name org_apache_tomcat_websocket

Delete tomcat/webapps/kylin/WEB-INF/lib/jul-to-slf4j-1.7.5.jar and jcl-over-slf4j-1.7.21.jar

6.4 Error: HiveHook (or similar) class not found

cp /usr/lib/hive-current/lib/meta-hive-hook-1.0.1.jar apache-hive-1.2.1-bin/lib/

6.5 Error: loader constraint violation

rm tomcat/webapps/kylin/WEB-INF/lib/slf4j-api-1.7.21.jar

cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar ./tomcat/webapps/kylin/WEB-INF/lib/

rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/slf4j-api-1.7.16.jar

cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/

rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/jul-to-slf4j-1.7.16.jar

rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/jcl-over-slf4j-1.7.16.jar

rm ./apache-hive-1.2.1-bin/hcatalog/share/webhcat/svr/lib/jul-to-slf4j-1.7.5.jar

rm ./hadoop-2.8.5-1.1.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/jul-to-slf4j-1.7.10.jar

rm ./hadoop-2.8.5-1.1.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/slf4j-log4j12-1.7.10.jar

rm ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/slf4j-log4j12-1.7.16.jar

cp ./hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar ./spark-2.3.2-1.2.0-bin-hadoop2.8/jars/

cp hadoop-2.8.5-1.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar tomcat/webapps/kylin/WEB-INF/lib/

6.6 Error: derbyLocale jars not found.

Harmless; just ignore it.

7 Prepare the sample data

${KYLIN_HOME}/bin/sample.sh

If it does not succeed, log in again and reset the environment variables:

export KYLIN_HOME=`pwd`

export HIVE_CONF=/etc/ecm/hive-conf (your EMR Hive configuration path)

export HADOOP_HOME=/home/hadoop/apache-kylin-2.6.3-bin-hbase1x/hadoop-2.8.5-1.1.0

(i.e. drop the last two exports from step 2)

In the web UI, open the /System page, run Reload Metadata, and refresh the page.

The Model page should now show kylin_sales_cube with status DISABLED.

8 Build the sample cube

On the /Model page, choose Build from the Actions menu, select the range January 1, 2012 to January 2, 2012, confirm, then watch the job on the Monitor page.

It should succeed in about 10 minutes. Then open the Insight page, pick learn_kylin in the -- Choose Project -- dropdown at the top, and run select count(*) from KYLIN_SALES; it should return a correct result.

If it fails, check the YARN error logs or contact the author via DingTalk at 13699124376.
