You can use the Logstash jdbc input plugin to import data from a database into Elasticsearch. The config looks roughly like this:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.12.jar"
    # with Connector/J 8.x the driver class was renamed; the old com.mysql.jdbc.Driver still loads but logs a deprecation warning
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/dbname"
    jdbc_user => "user"
    jdbc_password => "passwd"
    statement => "select xxx from xxx"
    # cron-style schedule (five fields); this one polls every minute
    schedule => "* * * * *"
    # remember the last value of this column between runs, for incremental sync
    tracking_column => "xxx"
    tracking_column_type => "numeric"
    use_column_value => true
    record_last_run => true
    last_run_metadata_path => "xxx.txt"
    jdbc_paging_enabled => true
  }
}

output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "indexName"
    # use the row's id as the ES _id, so re-running the query updates documents instead of duplicating them
    document_id => "%{id}"
  }
}
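Note that tracking_column/record_last_run only make the sync incremental if the query itself filters on the saved value, which the plugin exposes to the statement as :sql_last_value. A minimal sketch, assuming a hypothetical table t whose numeric auto-increment column id serves as the tracking column:

statement => "SELECT * FROM t WHERE id > :sql_last_value ORDER BY id ASC"
tracking_column => "id"
tracking_column_type => "numeric"
use_column_value => true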
Then start the sync with:

bin/logstash -f configFileName
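To confirm documents are arriving, you can query Elasticsearch directly; a quick check, assuming the index name indexName from the config above:

# count the documents in the target index
curl "http://localhost:9200/indexName/_count?pretty"
# inspect one document to verify the fields look right
curl "http://localhost:9200/indexName/_search?size=1&pretty"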