
Installing the ELK stack on a Mac. [On OS X 10.11.2 (15C50)]

Installing the JDK

First, install the JDK: JDK download URL

I chose jdk-8u66-macosx-x64.dmg.

After installing, run java -version in a terminal; it prints:

$ java -version
java version "1.8.0_66"
Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)

Set the environment variables:

$ vim ~/.bashrc

# add the following lines
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_66.jdk/Contents/Home
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib  
export PATH=${JAVA_HOME}/bin:$PATH

$ source ~/.bashrc
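As an optional sanity check, the variables can be verified in a fresh terminal; /usr/libexec/java_home is a helper that ships with OS X. (If the new values do not show up, note that Terminal opens login shells, which read ~/.bash_profile rather than ~/.bashrc.)

$ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.8.0_66.jdk/Contents/Home
$ /usr/libexec/java_home -v 1.8
/Library/Java/JavaVirtualMachines/jdk1.8.0_66.jdk/Contents/Home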

Installing Redis
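A minimal sketch, assuming Homebrew is available; Redis is commonly used as a buffer between Logstash shippers and the Elasticsearch indexer, which is presumably why it appears in this setup:

$ brew install redis
$ redis-server          # starts in the foreground with the default config
$ redis-cli ping        # in another terminal; should answer PONG
PONG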

Installing Elasticsearch

Install version 2.1.1 from the official site: download link

I unpacked it under /data/lek/e and ran sudo ./bin/elasticsearch, which failed with:

Exception in thread "main" java.lang.RuntimeException: don't run elasticsearch as root.
at org.elasticsearch.bootstrap.Bootstrap.initializeNatives(Bootstrap.java:93)
at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:144)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:285)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
Refer to the log for complete error details.

This is a permissions issue: Elasticsearch refuses to run as root. Running it again as a normal user, $ ./bin/elasticsearch, failed with a different error:

log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /data/lek/e/elasticsearch-2.1.1/logs/elasticsearch.log (Permission denied)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
    at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:128)
    at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:204)
    at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:258)
    at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
log4j:ERROR Either File or DatePattern options are not set for appender [file].
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /data/lek/e/elasticsearch-2.1.1/logs/elasticsearch_deprecation.log (Permission denied)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
    at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
    at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:128)
    at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:204)
    at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:258)
    at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
log4j:ERROR Either File or DatePattern options are not set for appender [deprecation_log_file].
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /data/lek/e/elasticsearch-2.1.1/logs/elasticsearch_index_indexing_slowlog.log (Permission denied)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
    at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
    at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:128)
    at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:204)
    at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:258)
    at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
log4j:ERROR Either File or DatePattern options are not set for appender [index_indexing_slow_log_file].
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /data/lek/e/elasticsearch-2.1.1/logs/elasticsearch_index_search_slowlog.log (Permission denied)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.parseCatsAndRenderers(PropertyConfigurator.java:672)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:516)
    at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:440)
    at org.elasticsearch.common.logging.log4j.LogConfigurator.configure(LogConfigurator.java:128)
    at org.elasticsearch.bootstrap.Bootstrap.setupLogging(Bootstrap.java:204)
    at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:258)
    at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:35)
log4j:ERROR Either File or DatePattern options are not set for appender [index_search_slow_log_file].
[2016-01-03 16:57:38,427][INFO ][node                     ] [Hammerhead] version[2.1.1], pid[6224], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-03 16:57:38,428][INFO ][node                     ] [Hammerhead] initializing ...
[2016-01-03 16:57:38,532][INFO ][plugins                  ] [Hammerhead] loaded [], sites []
[2016-01-03 16:57:38,557][INFO ][env                      ] [Hammerhead] using [1] data paths, mounts [[/ (/dev/disk1)]], net usable_space [148.9gb], net total_space [232.6gb], spins? [unknown], types [hfs]
[2016-01-03 16:57:41,580][INFO ][node                     ] [Hammerhead] initialized
[2016-01-03 16:57:41,580][INFO ][node                     ] [Hammerhead] starting ...
[2016-01-03 16:57:41,728][INFO ][transport                ] [Hammerhead] publish_address {127.0.0.1:9301}, bound_addresses {127.0.0.1:9301}, {[fe80::1]:9301}, {[::1]:9301}
[2016-01-03 16:57:41,739][INFO ][discovery                ] [Hammerhead] elasticsearch/CckNREUXT6OsKQlcTBBNeQ
[2016-01-03 16:57:45,591][WARN ][discovery.zen.ping.unicast] [Hammerhead] failed to send ping to [{#zen_unicast_1#}{127.0.0.1}{127.0.0.1:9300}]
ReceiveTimeoutTransportException[[][127.0.0.1:9300][internal:discovery/zen/unicast] request_id [2] timed out after [3753ms]]
    at org.elasticsearch.transport.TransportService$TimeoutHandler.run(TransportService.java:645)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2016-01-03 16:57:45,597][WARN ][discovery.zen.ping.unicast] [Hammerhead] failed to send ping to [{#zen_unicast_6#}{::1}{[::1]:9300}]
ReceiveTimeoutTransportException[[][[::1]:9300][internal:discovery/zen/unicast] request_id [4] timed out after [3761ms]]
    at org.elasticsearch.transport.TransportService$TimeoutHandler.run(TransportService.java:645)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2016-01-03 16:57:46,276][INFO ][cluster.service          ] [Hammerhead] new_master {Hammerhead}{CckNREUXT6OsKQlcTBBNeQ}{127.0.0.1}{127.0.0.1:9301}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-03 16:57:46,314][INFO ][http                     ] [Hammerhead] publish_address {127.0.0.1:9201}, bound_addresses {127.0.0.1:9201}, {[fe80::1]:9201}, {[::1]:9201}
[2016-01-03 16:57:46,314][INFO ][node                     ] [Hammerhead] started
[2016-01-03 16:57:46,351][INFO ][gateway                  ] [Hammerhead] recovered [0] indices into cluster_state

Judging from the first Permission denied error, the current user could not write the log files under the installation directory, so I changed the directory permissions to 777 and ran $ ./bin/elasticsearch again.
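The exact invocation isn't recorded above; something like the following is what it amounts to, with chown to the current user being the less blunt alternative:

$ sudo chmod -R 777 /data/lek/e/elasticsearch-2.1.1
# or, preferably, transfer ownership instead of opening permissions:
$ sudo chown -R $(whoami) /data/lek/e/elasticsearch-2.1.1

With that in place, the second run produced the following output.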

[2016-01-03 17:01:54,518][INFO ][node                     ] [Pisces] version[2.1.1], pid[6275], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-03 17:01:54,519][INFO ][node                     ] [Pisces] initializing ...
[2016-01-03 17:01:54,618][INFO ][plugins                  ] [Pisces] loaded [], sites []
[2016-01-03 17:01:54,644][INFO ][env                      ] [Pisces] using [1] data paths, mounts [[/ (/dev/disk1)]], net usable_space [148.9gb], net total_space [232.6gb], spins? [unknown], types [hfs]
[2016-01-03 17:01:56,963][INFO ][node                     ] [Pisces] initialized
[2016-01-03 17:01:56,963][INFO ][node                     ] [Pisces] starting ...
[2016-01-03 17:01:57,111][INFO ][transport                ] [Pisces] publish_address {127.0.0.1:9301}, bound_addresses {127.0.0.1:9301}, {[fe80::1]:9301}, {[::1]:9301}
[2016-01-03 17:01:57,123][INFO ][discovery                ] [Pisces] elasticsearch/OqIO479gTKSY31UfJuZy5w
[2016-01-03 17:02:00,993][WARN ][discovery.zen.ping.unicast] [Pisces] failed to send ping to [{#zen_unicast_1#}{127.0.0.1}{127.0.0.1:9300}]
ReceiveTimeoutTransportException[[][127.0.0.1:9300][internal:discovery/zen/unicast] request_id [4] timed out after [3751ms]]
    at org.elasticsearch.transport.TransportService$TimeoutHandler.run(TransportService.java:645)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2016-01-03 17:02:00,993][WARN ][discovery.zen.ping.unicast] [Pisces] failed to send ping to [{#zen_unicast_6#}{::1}{[::1]:9300}]
ReceiveTimeoutTransportException[[][[::1]:9300][internal:discovery/zen/unicast] request_id [3] timed out after [3752ms]]
    at org.elasticsearch.transport.TransportService$TimeoutHandler.run(TransportService.java:645)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2016-01-03 17:02:01,655][INFO ][cluster.service          ] [Pisces] new_master {Pisces}{OqIO479gTKSY31UfJuZy5w}{127.0.0.1}{127.0.0.1:9301}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-03 17:02:01,686][INFO ][http                     ] [Pisces] publish_address {127.0.0.1:9201}, bound_addresses {127.0.0.1:9201}, {[fe80::1]:9201}, {[::1]:9201}
[2016-01-03 17:02:01,686][INFO ][node                     ] [Pisces] started
[2016-01-03 17:02:01,711][INFO ][gateway                  ] [Pisces] recovered [0] indices into cluster_state

There are still some unicast ping timeout warnings, but they do not seem to affect functionality. Based on the startup log, visit

http://localhost:9201/

which returns:

{
    "name" : "Pisces",
    "cluster_name" : "elasticsearch",
    "version" : {
        "number" : "2.1.1",
        "build_hash" : "40e2c53a6b6c2972b3d13846e450e66f4375bd71",
        "build_timestamp" : "2015-12-15T13:05:55Z",
        "build_snapshot" : false,
        "lucene_version" : "5.3.1"
    },
    "tagline" : "You Know, for Search"
}
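The same check can be done from the terminal with curl; note that this node serves HTTP on 9201 rather than the default 9200, as reported in the startup log:

$ curl http://localhost:9201/
$ curl 'http://localhost:9201/_cluster/health?pretty'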

Installing Logstash

I downloaded Logstash 2.1.1.

After unpacking, run the following from the Logstash directory:

./bin/logstash -e 'input { stdin { } } output { stdout {} }'

Type hello world; it returns:

2016-01-03T09:57:15.372Z niceforbear.local hello world

As you can see, for each log line typed into the terminal, Logstash emits it back with a timestamp and the host name, followed by the original message. Shipping the same events into Elasticsearch only takes a slightly larger config, sketched below.
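This is only a sketch: the file name logstash-es.conf is arbitrary, and the port assumes the Elasticsearch instance listening on 9201 above.

# logstash-es.conf
input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9201"] }
  stdout { codec => rubydebug }
}

Run it with ./bin/logstash -f logstash-es.conf; each line typed on stdin is then indexed into Elasticsearch as well as printed to the console.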

Installing Kibana

Download Kibana.
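Because the Elasticsearch node above ended up listening on 9201 rather than the default 9200, Kibana may need to be pointed at it before starting; assuming Kibana 4.3 (the release that pairs with Elasticsearch 2.1), the relevant line lives in config/kibana.yml:

# config/kibana.yml
elasticsearch.url: "http://localhost:9201"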

Then run from the command line:

sudo ./bin/kibana

The output shows [info][listening] Server running at http://0.0.0.0:5601.

Visit http://localhost:5601 and the Kibana dashboard comes up.

With that, the ELK stack is installed and running.


