Start Zipkin

Start with Elasticsearch as the storage backend

QUERY_PORT=9411 STORAGE_TYPE=elasticsearch ES_HOSTS=http://localhost:9200 java -jar zipkin.jar

Start in the background

QUERY_PORT=9411 STORAGE_TYPE=elasticsearch ES_HOSTS=http://localhost:9200 nohup java -jar zipkin.jar &
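
A quick way to verify the server came up (the Zipkin UI and API are served on the query port; recent zipkin-server builds also expose a health endpoint there):

curl -s http://localhost:9411/health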

Start Elasticsearch

Start in the background (daemon mode)

./elasticsearch -d
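
To confirm Elasticsearch is running, the root endpoint returns the cluster name and version:

curl -s http://localhost:9200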

Query Elasticsearch data (Zipkin indexes tag key=value pairs in the span document's _q field, so _q:error=500 matches spans tagged with error=500)

curl -s 'localhost:9200/zipkin*span-2017-08-11/_search?q=_q:error=500'

curl -s 'localhost:9200/zipkin*span-2020-05-21/_search'

curl -s 'localhost:9200/zipkin*/_search'

List all Elasticsearch indices
curl http://localhost:9200/_cat/indices?v
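
Before deciding on a retention policy, it helps to know how many span documents a single day's index holds; the _count API gives that directly (the date below is just an example):

curl -s 'localhost:9200/zipkin*span-2020-05-21/_count'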

Scheduled deletion of Elasticsearch data

After going live the data volume grew very quickly, roughly 30 million documents per day, which made Zipkin queries slow and sometimes caused errors.
Solutions:

  1. Periodically delete data that is no longer needed
  2. Lower the Zipkin sampling rate to 0.1 (see the sketch below)
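
Where the sampling rate is set depends on how the services report spans; the post does not say which client library is used. As a minimal sketch, assuming the services report via Spring Cloud Sleuth 2.x, the rate can be lowered to 10% in each service's application.yml (Sleuth 1.x used spring.sleuth.sampler.percentage instead):

    # assumption: services report spans via Spring Cloud Sleuth 2.x; adjust for your tracer
    spring:
      sleuth:
        sampler:
          probability: 0.1   # keep roughly 10% of traces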

Elasticsearch data can be deleted with elasticsearch-curator

  1. Installation instructions can be found online
  2. Configure the curator action file
    (reference: https://www.cnblogs.com/cuishuai/p/10009091.html)

    actions:
      1:
        action: delete_indices
        description: >-
          Delete indices older than 1 day (based on index name) for zipkin-
          prefixed indices. Ignore the error if the filter does not result in an
          actionable list of indices (ignore_empty_list) and exit cleanly.
        options:
          ignore_empty_list: True
          timeout_override:
          continue_if_exception: False
          disable_action: False
        filters:
        - filtertype: pattern
          kind: prefix
          value: zipkin
          exclude:
        - filtertype: age
          source: name
          direction: older
          timestring: '%Y-%m-%d'
          unit: days
          unit_count: 1
          exclude:
  3. Run curator with the following command:
    curator --config /etc/curator/config_file.yml /etc/curator/action_file.yml
  4. Configure a Linux cron job (see the sketch after this list)

    crontab -e
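
The --config file referenced in step 3 is curator's client configuration, which is not shown in the original post; a minimal sketch (localhost cluster, default port, basic logging only) could look like:

    client:
      hosts:
        - 127.0.0.1
      port: 9200
      timeout: 30

    logging:
      loglevel: INFO
      logformat: default

For the cron job itself, an example entry that runs the cleanup every day at 01:00 (the curator binary path is an assumption; check it with which curator):

    # assumed path to curator; runs daily at 01:00 and appends output to a log file
    0 1 * * * /usr/local/bin/curator --config /etc/curator/config_file.yml /etc/curator/action_file.yml >> /var/log/curator.log 2>&1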
