Setting Up an ELK 6.5 (Elasticsearch + Logstash + Kibana) Environment on macOS

ELK is a data-analysis stack that can ingest data from almost any source and search, analyze, and visualize it in near real time. This post is a short record of an ELK deployment. The Linux and Mac procedures are nearly identical; the steps below are performed on macOS.

Base environment: Java 1.8.0, Node 8.14.0, Maven 3.5.0
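A quick way to confirm the base environment is in place before starting (exact output varies by installation):

java -version

node -v

mvn -v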

Target files (a network-drive link is at the end of the post; they can also be downloaded directly from the URLs below):

************************

ES 6.5.0: https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.5.0.tar.gz

Logstash 6.5.0: https://artifacts.elastic.co/downloads/logstash/logstash-6.5.0.tar.gz

Kibana 6.5.0: https://artifacts.elastic.co/downloads/kibana/kibana-6.5.0-darwin-x86_64.tar.gz

IK: https://github.com/medcl/elasticsearch-analysis-ik/archive/master.zip

***********************

As of January 18, 2019, the latest ES release (https://www.elastic.co/downloads/elasticsearch) was 6.5.4, but the IK analysis plugin only supported up to 6.5.0 at the time, so the whole ELK stack is pinned to 6.5.0.

1. ES + IK

About Elasticsearch: Elasticsearch is an open-source distributed search engine that provides data collection, analysis, and storage. Its main characteristics are: distributed operation, zero configuration, automatic discovery, automatic index sharding, index replicas, a RESTful interface, multiple data sources, and automatic search load balancing.

The download URLs are already listed in the target-files section above. ES is used as the example of how to find them; the steps for Logstash and Kibana are the same and are not repeated.

1.1 Get the ES download URL

Go to the Elastic site https://www.elastic.co/, click the 「downloads」 button at the top right to reach the download page, follow Elasticsearch -> Download -> past releases to the version-selection page (https://www.elastic.co/downloads/past-releases), choose version 6.5.0, and copy the download URL.

1.2 Download and install ES

Create a folder named 「ELK」 and download the ES archive into it:

mkdir /opt/soft/ELK

cd /opt/soft/ELK/

wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.5.0.tar.gz

After the download completes, extract the archive in place, then enter the directory and start ES to test it:

tar -zxvf elasticsearch-6.5.0.tar.gz

rm elasticsearch-6.5.0.tar.gz

cd elasticsearch-6.5.0

./bin/elasticsearch

When 「started」 appears in the log, ES is running. The default ES port is 9200; open http://localhost:9200/ in a browser to check.
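The same check works from the terminal with curl. The response is a small JSON document reporting the node name, cluster name, and version; the values below are illustrative and trimmed, and will differ per machine:

curl http://localhost:9200/

{
  "name" : "...",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "6.5.0",
    ...
  },
  "tagline" : "You Know, for Search"
}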

1.3 Install the IK analysis plugin

Stop ES with Control+C, then create an ik folder under plugins:

cd plugins

mkdir ik

Go back up to the ELK directory and download IK:

cd ../..

wget https://github.com/medcl/elasticsearch-analysis-ik/archive/master.zip

Unzip the archive. If the unzip command is missing, install it with brew on macOS, or with yum/apt-get on Linux:

unzip master.zip

cd elasticsearch-analysis-ik-master

With the IK source unpacked and the directory entered, build the plugin with Maven:

mvn clean package

The build takes a while. Once it succeeds, move 「elasticsearch-analysis-ik-6.5.0.zip」 from target/releases/ into the ik folder under the ES plugins directory and unzip it:

mv target/releases/elasticsearch-analysis-ik-6.5.0.zip /opt/soft/ELK/elasticsearch-6.5.0/plugins/ik/

cd /opt/soft/ELK/elasticsearch-6.5.0/plugins/ik/

unzip elasticsearch-analysis-ik-6.5.0.zip

After unzipping, start ES again and check the status of the IK plugin.

If the log shows "loaded plugin [analysis-ik]" and ES reaches started, the IK plugin is installed correctly.
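As an extra, optional check (not part of the original steps), the Elasticsearch plugin CLI can list installed plugins; the IK plugin should show up in its output (typically as analysis-ik):

cd /opt/soft/ELK/elasticsearch-6.5.0

./bin/elasticsearch-plugin list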

1.4 Test ES + IK

Open a new terminal, create an index named index, then run a segmentation test on the phrase "你果然后面有戲份":

curl -XPUT http://localhost:9200/index

curl -H "Content-Type: application/json" -XGET 'http://localhost:9200/index/_analyze?pretty=true' -d '
{
  "analyzer" : "ik_max_word",
  "text" : "你果然后面有戲份"
}'

The terminal output:

{
  "tokens" : [
    {
      "token" : "你",
      "start_offset" : 0,
      "end_offset" : 1,
      "type" : "CN_CHAR",
      "position" : 0
    },
    {
      "token" : "果然",
      "start_offset" : 1,
      "end_offset" : 3,
      "type" : "CN_WORD",
      "position" : 1
    },
    {
      "token" : "然后",
      "start_offset" : 2,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 2
    },
    {
      "token" : "后面",
      "start_offset" : 3,
      "end_offset" : 5,
      "type" : "CN_WORD",
      "position" : 3
    },
    {
      "token" : "面有",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "有戲",
      "start_offset" : 5,
      "end_offset" : 7,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "戲份",
      "start_offset" : 6,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}

Test another phrase, "落花有意流水無情":

curl -H "Content-Type: application/json" -XGET 'http://localhost:9200/index/_analyze?pretty=true' -d '
{
  "analyzer" : "ik_max_word",
  "text" : "落花有意流水無情"
}'

Output:

{
  "tokens" : [
    {
      "token" : "落花有意流水無情",
      "start_offset" : 0,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 0
    },
    {
      "token" : "落花有意",
      "start_offset" : 0,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 1
    },
    {
      "token" : "落花",
      "start_offset" : 0,
      "end_offset" : 2,
      "type" : "CN_WORD",
      "position" : 2
    },
    {
      "token" : "有意",
      "start_offset" : 2,
      "end_offset" : 4,
      "type" : "CN_WORD",
      "position" : 3
    },
    {
      "token" : "流水無情",
      "start_offset" : 4,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 4
    },
    {
      "token" : "流水",
      "start_offset" : 4,
      "end_offset" : 6,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "無情",
      "start_offset" : 6,
      "end_offset" : 8,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}
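For comparison, the IK plugin also registers a coarser-grained analyzer named ik_smart, which generally produces fewer, longer tokens; the same request can be rerun with only the analyzer swapped (output omitted here):

curl -H "Content-Type: application/json" -XGET 'http://localhost:9200/index/_analyze?pretty=true' -d '
{
  "analyzer" : "ik_smart",
  "text" : "你果然后面有戲份"
}'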

The Chinese segmentation works as expected. At this point ES + IK is installed and verified.

2. Logstash + input_jdbc

About Logstash: Logstash is a tool for collecting, parsing, and filtering logs, and it supports a large number of input types. It typically runs in a client/server fashion: the client side runs on the hosts whose logs need to be collected, while the server side filters and transforms the events from each node and forwards them to Elasticsearch.

2.1 Install Logstash

Download and extract Logstash in the ELK folder:

cd /opt/soft/ELK

wget https://artifacts.elastic.co/downloads/logstash/logstash-6.5.0.tar.gz

tar -zxvf logstash-6.5.0.tar.gz

2.2 Test Logstash

Run Logstash from the extracted directory:

cd logstash-6.5.0/

./bin/logstash -e 'input { stdin { } } output { stdout {} }'

Once Logstash is running and the log panel shows 「Successfully started」, type hello world to test the pipeline:

[2019-01-18T17:07:16,543][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-18T17:07:16,789][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
hello world
{
       "message" => "hello world",
    "@timestamp" => 2019-01-18T09:07:23.194Z,
          "host" => "zhangyanandeMacBook-Pro.local",
      "@version" => "1"
}

2.3 Install the logstash-input-jdbc plugin

Install it with the CLI bundled with Logstash:

./bin/logstash-plugin install logstash-input-jdbc

「Installation successful」 in the output means the plugin is installed.
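As an optional double check (not in the original write-up), the same CLI can list installed plugins; logstash-input-jdbc should appear in the output:

./bin/logstash-plugin list | grep jdbc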

2.4 Sync a MySQL database to ES with Logstash

Create a sqlconfig.conf file:

vim sqlconfig.conf

input {
  jdbc {
    # path to the MySQL driver jar
    jdbc_driver_library => "/opt/soft/ELK/mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # connection string for the target database
    jdbc_connection_string => "jdbc:mysql://localhost:3306/infodata?useUnicode=true&characterEncoding=utf8&useSSL=true"
    # database user
    jdbc_user => "root"
    # database password
    jdbc_password => "Ac123456"
    # schedule (cron-style; here, every minute)
    schedule => "* * * * *"
    # SQL whose results are sent to ES (a statement file can be used instead)
    statement => "SELECT id,name,content,crtime from zztest"
  }
}

filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    # target index
    index => "sqlindex"
    # use the id column from the query as the document id
    document_id => "%{id}"
  }
  stdout {
    codec => json_lines
  }
}
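The post does not show the zztest table itself. For reference, a minimal hypothetical table matching the SELECT above might look like the following (the column names come from the config; the types are assumptions):

CREATE TABLE zztest (
  id INT PRIMARY KEY AUTO_INCREMENT,  -- used as the ES document_id
  name VARCHAR(64),
  content TEXT,
  crtime DATETIME                     -- record creation time
);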

After saving the file, run Logstash again, this time loading sqlconfig.conf:

./bin/logstash -f sqlconfig.conf
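If the pipeline fails to start, the config file can first be validated on its own with Logstash's test flag (an optional step, not in the original post); it should report that the configuration is OK when the syntax is valid:

./bin/logstash -f sqlconfig.conf --config.test_and_exit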

Check the log to verify that the database rows are synced into ES:

Sending Logstash logs to /opt/soft/ELK/logstash-6.5.0/logs which is now configured via log4j2.properties
[2019-01-18T17:26:41,208][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-18T17:26:41,227][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.0"}
[2019-01-18T17:26:44,471][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-01-18T17:26:45,001][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-01-18T17:26:45,010][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2019-01-18T17:26:45,289][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-01-18T17:26:45,388][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-01-18T17:26:45,392][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-01-18T17:26:45,430][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-01-18T17:26:45,456][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-01-18T17:26:45,474][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-01-18T17:26:45,598][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2019-01-18T17:26:45,964][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2b242f8a run>"}
[2019-01-18T17:26:46,011][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-18T17:26:46,326][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-18T17:27:01,682][INFO ][logstash.inputs.jdbc     ] (0.027233s) SELECT id,name,content,crtime from zz
{"@timestamp":"2019-01-18T09:27:01.820Z","name":"zhyn","content":"elktest","crtime":"2019-01-17T08:03:00.000Z","@version":"1","id":1}
{"@timestamp":"2019-01-18T09:27:01.841Z","name":"張哈哈","content":"同步成功","crtime":"2019-01-17T08:31:08.000Z","@version":"1","id":3}
{"@timestamp":"2019-01-18T09:27:01.840Z","name":"zhyn4j","content":"eltest333","crtime":"2019-01-17T08:16:35.000Z","@version":"1","id":2}

Use the ES search API to list everything under the "sqlindex" index:

curl -H "Content-Type: application/json" -XPOST 'localhost:9200/sqlindex/_search?pretty' -d '
{
  "query": { "match_all": {} }
}'

The response:

{
  "took" : 62,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 3,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "sqlindex",
        "_type" : "doc",
        "_id" : "2",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-01-18T09:30:00.157Z",
          "name" : "zhyn4j",
          "content" : "eltest333",
          "crtime" : "2019-01-17T08:16:35.000Z",
          "@version" : "1",
          "id" : 2
        }
      },
      {
        "_index" : "sqlindex",
        "_type" : "doc",
        "_id" : "1",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-01-18T09:30:00.156Z",
          "name" : "zhyn",
          "content" : "elktest",
          "crtime" : "2019-01-17T08:03:00.000Z",
          "@version" : "1",
          "id" : 1
        }
      },
      {
        "_index" : "sqlindex",
        "_type" : "doc",
        "_id" : "3",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-01-18T09:30:00.157Z",
          "name" : "張哈哈",
          "content" : "同步成功",
          "crtime" : "2019-01-17T08:31:08.000Z",
          "@version" : "1",
          "id" : 3
        }
      }
    ]
  }
}
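Beyond match_all, ordinary full-text queries against the synced fields work too. For example, searching the content field for one of the values shown above should return the document with id 1:

curl -H "Content-Type: application/json" -XPOST 'localhost:9200/sqlindex/_search?pretty' -d '
{
  "query": { "match": { "content": "elktest" } }
}'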

At this point Logstash is successfully syncing the database.

3. Kibana

About Kibana: Kibana is a free, open-source tool that provides a friendly web interface for analyzing logs from Logstash and Elasticsearch, helping to aggregate, analyze, and search important log data.

3.1 Install Kibana

Download and extract Kibana in the ELK directory:

cd /opt/soft/ELK

wget https://artifacts.elastic.co/downloads/kibana/kibana-6.5.0-darwin-x86_64.tar.gz

tar -zxvf kibana-6.5.0-darwin-x86_64.tar.gz

3.2 Run Kibana

cd kibana-6.5.0-darwin-x86_64

./bin/kibana

Kibana listens on port 5601 by default. Open http://localhost:5601 in a browser to confirm it is working.
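Kibana connects to Elasticsearch at http://localhost:9200 by default. If ES runs elsewhere, the connection can be changed in config/kibana.yml; for the 6.x series the relevant settings are roughly the following (shown here with their default values):

server.port: 5601

elasticsearch.url: "http://localhost:9200"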

The commonly used sections in the Kibana sidebar are:

Discover: search and browse data

Visualize: build charts

Dashboard: assemble dashboards

Timelion: advanced visualization and analysis of time-series data

DevTools: developer tools

Management: configuration


Go to the Discover tab and check the index list (if nothing shows up, you may first need to create an index pattern for sqlindex under Management). If sqlindex appears, Kibana is connected to ES and the installation succeeded.

4. Summary

The ELK environment is now fully deployed. I hope to cover hands-on big-data collection and analysis in later posts, to help newcomers avoid some detours. The files mentioned in this post, including the MySQL JDBC driver jar, are on the network drive; grab them if needed:

https://pan.baidu.com/s/1VVs667OiDvZph8oq9yBk3Q (password: br7n)
