Preparation
Create a working directory
```shell
mkdir -p /home/docker/elk_1
cd /home/docker/elk_1/
```
Create the docker-compose file
Write the following into docker-compose.yml:
```yaml
version: '3.7'
services:
  elasticsearch:
    image: elasticsearch:7.6.2
    container_name: elasticsearch
    privileged: true
    user: root
    environment:
      - cluster.name=elasticsearch
      - discovery.type=single-node
      - ES_JAVA_OPTS=-Xms512m -Xmx512m
    volumes:
      - ./elasticsearch/plugins:/usr/share/elasticsearch/plugins
      - ./elasticsearch/data:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
      - 9300:9300
  logstash:
    image: logstash:7.6.2
    container_name: logstash
    ports:
      - 4560:4560
    privileged: true
    environment:
      - TZ=Asia/Shanghai
    volumes:
      - ./logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on:
      - elasticsearch
    links:
      - elasticsearch:es
  kibana:
    image: kibana:7.6.2
    container_name: kibana
    ports:
      - 5601:5601
    privileged: true
    links:
      - elasticsearch:es
    depends_on:
      - elasticsearch
    environment:
      - elasticsearch.hosts=http://es:9200
```
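The compose file above mounts ./logstash/logstash.conf into the Logstash container, but its contents are not shown in this article. A minimal pipeline consistent with the TCP port 4560 exposed above and the logstash-%{+YYYY.MM.dd} index name referenced later might look like this (a sketch, not necessarily the author's exact file):

```conf
input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 4560
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => "es:9200"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```

The `es` hostname works because the compose file links the elasticsearch container into logstash under that alias.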
Grant permissions

```shell
chmod -R 777 /home/docker/elk_1/
```
Run the containers
From the directory containing docker-compose.yml, start the stack with `docker-compose up -d`.
Access test
Kibana
Visit Kibana at http://192.168.0.150:5601 (replace the IP with your own host's address).
ElasticSearch
Visit ES at http://192.168.2.21:9200/
If both pages come up, the deployment succeeded.
Collecting logs
Introduction
We will create a new Spring Boot project and use its built-in logback logging framework to ship logs to ELK.
Create a Spring Boot project
Add dependencies
```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- Logstash integration -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.6</version>
</dependency>
```
Create a test method
```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;

@SpringBootTest
public class AppTest {

    Logger logger = LogManager.getLogger(this.getClass());

    @Test
    public void logback() {
        logger.info("logback info message coming through");
        logger.error("logback error message coming through");
    }
}
```
Configuration
Save the following logback configuration (by Spring Boot convention, logback-spring.xml under src/main/resources), changing 192.168.2.21:4560 to the address and port of your own Logstash container.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE configuration>
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
    <include resource="org/springframework/boot/logging/logback/console-appender.xml"/>

    <property name="APP_NAME" value="springboot-logback-elk-demo"/>
    <property name="LOG_FILE_PATH" value="${LOG_FILE:-${LOG_PATH:-${LOG_TEMP:-${java.io.tmpdir:-/tmp}}}/logs}"/>
    <contextName>${APP_NAME}</contextName>

    <!-- Daily rolling file appender, keeping 30 days of history -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${LOG_FILE_PATH}/${APP_NAME}-%d{yyyy-MM-dd}.log</fileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>${FILE_LOG_PATTERN}</pattern>
        </encoder>
    </appender>

    <!-- Ship log events to Logstash over TCP -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>192.168.2.21:4560</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="FILE"/>
        <appender-ref ref="LOGSTASH"/>
    </root>
</configuration>
```
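The LOGSTASH appender with LogstashEncoder sends each log event to Logstash as a single line of JSON over TCP, which is why the tcp input on the Logstash side uses a JSON-lines codec. As a rough illustration of that wire format (the field names below follow LogstashEncoder's documented defaults such as @timestamp, message, and level, but treat the exact field set as an assumption), an event can be modeled like this:

```python
import json


def encode_log_event(message: str, level: str, logger_name: str, timestamp: str) -> bytes:
    """Build a newline-delimited JSON log event, roughly as LogstashEncoder does."""
    event = {
        "@timestamp": timestamp,     # ISO-8601 timestamp of the event
        "@version": "1",
        "message": message,          # the formatted log message
        "logger_name": logger_name,  # fully qualified logger name
        "level": level,              # INFO, ERROR, ...
    }
    # A JSON-lines consumer expects exactly one JSON object per line
    return (json.dumps(event) + "\n").encode("utf-8")


payload = encode_log_event(
    message="logback info message coming through",
    level="INFO",
    logger_name="com.example.AppTest",  # hypothetical logger name
    timestamp="2024-01-01T00:00:00.000Z",
)
# To actually ship it over TCP you would open a socket to the Logstash
# host and port, e.g. socket.create_connection(("192.168.2.21", 4560))
print(payload.decode("utf-8"), end="")
```

In the real setup none of this is written by hand; the appender serializes and ships every event automatically.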
After adding the configuration, run the test method; if you can see the log output in the console as usual, everything is working.
Creating an index
Build the index
Run the test method, then click Index Patterns and click Create index pattern.
Create the index pattern
Create an index pattern named logstash-*. This name comes from the logstash.conf configuration file: the index configured there is logstash-%{+YYYY.MM.dd}, so * stands in for the date part.
Choose a filter
Select @timestamp as the time filter field.
Index created successfully
A successfully created index pattern looks like this.
Viewing the collected logs
You can see that ELK has collected the log messages we emitted in the test method.
Lanzou Cloud download: https://rookie1679.lanzouq.com/i2yQy1g7qroh
This article originally appeared on my blog: https://blog.hikki.site