ELK Study Notes 1
Deployment: ELK (Elasticsearch, Logstash, Kibana) runs on a single Linux server.
Access: everything goes through an Apache reverse proxy, unified on port 80 (most ports are blocked on the intranet, so there is no other option).
What is monitored: the Apache log files on the same server / all issues of one JIRA project / ...
Not covered here: downloading and installing ELK (plenty of search results exist; the official documentation is the recommended reference).
Topics covered:
1. Starting/stopping ELK
2. Reverse proxy setup for ELK (Apache)
3. Logstash configuration (file/http) - TODO
a. Monitoring the Apache logs the default way - TODO
b. All issues of a JIRA project - TODO
c. Market data in JSON format - TODO (the original plan was actually to scrape WooYun's security data, but then their announcement came out...)
Starting/Stopping ELK
[elk@host ~]$ cat run.sh
#!/bin/sh
### stop all: kill any running logstash, elasticsearch and kibana (node) processes
ps -ef | grep logstash | grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | grep elasticsearch | grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | grep '.*node/bin/node.*src/cli' | grep -v grep | awk '{print $2}' | xargs kill -9
### start all: logstash runs as the current (root) user, elasticsearch and kibana as the elk user
/usr/local/logstash-2.3.4/bin/logstash -f /usr/local/logstash-2.3.4/logstash.conf web &
su - elk -c "/usr/local/elasticsearch-2.3.4/bin/elasticsearch" &
su - elk -c "/usr/local/kibana-4.5.3-linux-x64/bin/kibana" &
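For completeness, roughly how the script is used and how to check that everything came up; the curl probe of port 9200 is my own sanity check, not part of the original notes:
chmod +x run.sh
./run.sh
# confirm the three processes are alive
ps -ef | grep -E 'logstash|elasticsearch|node/bin/node' | grep -v grep
# elasticsearch should answer locally on 9200 once it has finished starting
curl http://127.0.0.1:9200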
Pit 1
Whoa, so it turns out elasticsearch refuses to start as the root user.
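One way out (a rough sketch, assuming the install layout used above; adjust paths to your own) is to create a dedicated elk user, hand the install directories over to it, and then start via su as run.sh does:
# create the elk user (skip if it already exists)
useradd elk
# give elk ownership of the elasticsearch and kibana trees
chown -R elk:elk /usr/local/elasticsearch-2.3.4 /usr/local/kibana-4.5.3-linux-x64
# start elasticsearch as elk, same as in run.sh
su - elk -c "/usr/local/elasticsearch-2.3.4/bin/elasticsearch" &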
Pit 2, a suspected logstash bug: as a non-root user, reading the Apache access_log via +rx permissions throws an error.
Apache itself is started as root. Following search results, I initially only granted the elk user read permission on the logs directory and the access_log file; logstash reported no error at all, it simply would not come up. So I added execute permission on top of that.
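Concretely it was something like the following (a sketch only: the log location /etc/httpd/logs is an assumption, and whether ACLs or plain chmod were used originally is not recorded here):
# assumed paths: Apache logs under /etc/httpd/logs, access log named access_log
# step 1: read permission for the elk user on the directory and the log file
setfacl -m u:elk:r /etc/httpd/logs
setfacl -m u:elk:r /etc/httpd/logs/access_log
# step 2: additionally allow elk to traverse the directory
setfacl -m u:elk:rx /etc/httpd/logs
Then, starting logstash as the elk user, the tragedy arrived: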
[elk@host logstash-2.3.4]$ Settings: Default pipeline workers: 2
NotImplementedError: stat.st_dev unsupported or native support failed to load
dev_major at org/jruby/RubyFileStat.java:205
nix_inode at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:28
inode at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:32
inode at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:106
watch at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:96
_discover_file at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:313
each at org/jruby/RubyArray.java:1613
each at org/jruby/RubyEnumerator.java:274
_discover_file at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:304
watch at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:95
call at org/jruby/RubyProc.java:281
synchronized at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:357
synchronize at org/jruby/ext/thread/Mutex.java:149
synchronized at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:357
watch at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/watch.rb:92
tail at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/filewatch-0.8.1/lib/filewatch/tail_base.rb:73
tail at /usr/local/logstash-2.3.4/vendor/jruby/lib/ruby/1.9/forwardable.rb:201
begin_tailing at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb:288
each at org/jruby/RubyArray.java:1613
begin_tailing at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb:288
run at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb:292
inputworker at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:342
start_input at /usr/local/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:336
I seem to remember finding a similar bug report in the ES JIRA. The workaround was to start logstash as root instead (which is why run.sh above launches logstash directly while elasticsearch and kibana go through su - elk), and after that everything ran smoothly.
ELK Study Notes 2 - Reverse Proxy Setup for ELK (Apache)
1. Under Apache's conf.d directory, add a new conf file, e.g. kibana.conf.
2. Its main content is as follows:
/etc/httpd/conf.d# cat kibana.conf
ProxyRequests On
ProxyPass /kibana http://127.0.0.1:5601/app/kibana
ProxyPassReverse /kibana http://127.0.0.1:5601/app/kibana
ProxyPass /bundles http://127.0.0.1:5601/bundles
ProxyPassReverse /bundles http://127.0.0.1:5601/bundles
ProxyPass /elasticsearch http://127.0.0.1:9200/
ProxyPassReverse /elasticsearch http://127.0.0.1:9200/
Alias /bundles/ /usr/local/kibana-4.5.3-linux-x64/optimize/bundles/
Require all granted
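To sanity-check the proxy after dropping in the conf file, something along these lines works (the module check and curl targets are my own additions, not from the original notes):
# make sure mod_proxy / mod_proxy_http are loaded
apachectl -M | grep proxy
# reload Apache so conf.d/kibana.conf takes effect
apachectl graceful
# Kibana through the proxy on port 80
curl -I http://127.0.0.1/kibana
# Elasticsearch through the proxy
curl http://127.0.0.1/elasticsearch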
3. Pit:
At first, following search results, I had only configured the reverse proxy entries for kibana and elasticsearch. Access kept failing after startup, and digging through the Apache access log I found entries like
"GET /bundles/kibana.style.css?v=9910 HTTP/1.1" 200
for the requests that were going wrong.
A search on stackoverflow turned up this post:
http://stackoverflow.com/questions/32862053/apache-reverse-proxy-for-kibana
and with that the problem was solved: Kibana 4 serves its static assets under /bundles, so those requests have to be mapped through as well, which is what the /bundles ProxyPass and Alias lines in the config above take care of.