Mainly fix Kafka dual-NIC configuration support #649

Open · wants to merge 16 commits into master
5 changes: 5 additions & 0 deletions README.md
@@ -1,3 +1,8 @@
# efak
This fork fixes several efak bugs and adds the following features:
* 1. Export monitoring data to Kafka
* 2. Add a Kafka alert channel

[![Build Status](https://app.travis-ci.com/smartloli/EFAK.svg?branch=master)](https://app.travis-ci.com/smartloli/EFAK)
![](https://img.shields.io/badge/language-java-orange.svg)
[![codebeat badge](https://codebeat.co/badges/4c141093-e55d-464d-87ce-7431cde81398)](https://codebeat.co/projects/github-com-smartloli-efak-master)
33 changes: 33 additions & 0 deletions docker/Dockerfile
@@ -0,0 +1,33 @@
# docker build -t seawenc/efak:3.0.9 .
FROM seawenc/centos7fat:1.0
MAINTAINER chengsheng(seawenc)

#rm -rf /etc/yum.repos.d/* && \
#curl -o /etc/yum.repos.d/CentOS-Base.repo https://mirrors.aliyun.com/repo/Centos-7.repo && \


#RUN wget -c http://192.168.56.1:8000/efak-web-${VERSION}-bin.tar.gz (original source: https://github.com/smartloli/kafka-eagle-bin/archive/v3.0.1.tar.gz)
# This package is built from the master branch of [email protected]:sewenc/efak-src.git
ENV VERSION=3.0.9
COPY /efak-web-${VERSION}-bin.tar.gz /opt/app/
#RUN cd /opt/app && curl -O http://192.168.56.1:8000/efak-web-${VERSION}-bin.tar.gz && \
RUN cd /opt/app && tar -xzf efak-web-${VERSION}-bin.tar.gz && \
rm -rf efak-web-${VERSION}-bin.tar.gz && \
mv efak-web-${VERSION} efak

# Replace the jar to fix two bugs: 1. dual-IP setups are not supported, 2. misleading message when JMX fails
# A merge request has been submitted (https://github.com/smartloli/EFAK/pull/649); once it is merged and a newer release is used, the two lines below are no longer needed
#COPY /efak-common-${VERSION}.jar /
#RUN sed -i '/ke.war/a\ \\cp -f /efak-common-3.0.1.jar /opt/app/efak/kms/webapps/ke/WEB-INF/lib/' /opt/app/efak/bin/ke.sh

ENV KE_HOME=/opt/app/efak
ENV PATH=$KE_HOME/bin:$PATH
WORKDIR /opt/app/efak
COPY /ke-start.sh /
COPY /healthcheck.sh /
# Add a health check
HEALTHCHECK --start-period=300s --retries=1 --interval=10s --timeout=5s CMD sh /healthcheck.sh

EXPOSE 8048

CMD ["sh","/ke-start.sh"]
35 changes: 35 additions & 0 deletions docker/README.md
@@ -0,0 +1,35 @@
## Overview
This directory is used to build the EFAK image published on hub.docker.com (EFAK does not provide an official Docker image, so we build our own).

## Building the image
```bash
VERSION=3.0.9
mvn versions:set -DnewVersion=${VERSION} -DgenerateBackupPoms=false
rm -rf docker/*.tar.gz
rm -rf efak-web/target
mvn -pl efak-web package -Dmaven.test.skip=true
mv efak-web/target/efak-web-${VERSION}-bin.tar.gz docker/

cd docker
docker build -t seawenc/efak:${VERSION} .
docker push seawenc/efak:${VERSION}
rm -rf /data/share/efak.gz
docker save seawenc/efak:${VERSION} | gzip > /data/share/efak.gz
```
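
On an offline host, the gzipped archive written by `docker save` above can be loaded back into Docker like this (a usage example, assuming the same `/data/share/efak.gz` path):

```bash
# Restore the image from the gzipped archive produced by `docker save ... | gzip`.
gzip -dc /data/share/efak.gz | docker load
```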

## Usage
```bash
docker run -d -p 8048:8048 \
--restart=always --name efak \
-v /tmp/logs:/opt/app/efak/logs \
-v /data/workspace/my/efak-src/efak-web/src/main/resources/conf/system-config.properties:/opt/app/efak/conf/system-config.properties \
-v /tmp/db:/opt/app/efak/db \
seawenc/efak:3.0.6
```
Then visit http://localhost:8048.

For usage documentation, see https://gitee.com/seawenc/kafka-ha-installer
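
To check the container after it starts, the health status reported by the HEALTHCHECK and the application log can be inspected with standard Docker commands (an illustration, not part of this repository):

```bash
# Current health status of the container (starting / healthy / unhealthy).
docker inspect --format '{{.State.Health.Status}}' efak
# Follow the application log that ke-start.sh tails as the main process.
docker logs -f efak
```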
Binary file added docker/efak-common-3.0.1.jar
Binary file not shown.
20 changes: 20 additions & 0 deletions docker/healthcheck.sh
@@ -0,0 +1,20 @@
## Decide whether the application is healthy by probing its HTTP endpoint.
STATUS=`curl -i --connect-timeout 3 http://127.0.0.1:8048`
_RET=$?
if [ $_RET != 0 ]; then
    # The endpoint is unreachable: truncate the log and kill the tail process
    # (the container's main process) so that Docker restarts the container.
    echo '' > /opt/app/efak/logs/log.log
    ps -ef | grep tail | head -1 | awk '{print $2}' | xargs -I {} kill -9 {}
    exit 1
fi

# Take the HTTP status code (second field of the response status line).
STATUS=`echo $STATUS | head -1 | awk '{print $2}'`
if [ "$STATUS" == '404' ]; then
    echo 'Status is 404: unhealthy'
    echo '' > /opt/app/efak/logs/log.log
    ps -ef | grep tail | head -1 | awk '{print $2}' | xargs -I {} kill -9 {}
    exit 1
fi
echo "healthy"
echo "status=$STATUS"
exit 0
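
For reference, the HTTP status code could also be obtained directly with curl's `-w` option; this is only a sketch, not part of the script above:

```bash
# Prints just the status code (e.g. 200 or 404); prints 000 if the connection fails.
STATUS=$(curl -s -o /dev/null -w '%{http_code}' --connect-timeout 3 http://127.0.0.1:8048)
```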
3 changes: 3 additions & 0 deletions docker/ke-start.sh
@@ -0,0 +1,3 @@
sh /opt/app/efak/bin/ke.sh restart
# The upstream ke.sh leaves no foreground process running, so tail the log to keep the container alive; the Docker HEALTHCHECK handles failure detection.
tail -100f /opt/app/efak/logs/log.log
138 changes: 138 additions & 0 deletions docker/system-config.properties
@@ -0,0 +1,138 @@
######################################
# multi zookeeper & kafka cluster list
# Settings prefixed with 'kafka.eagle.' will be deprecated, use 'efak.' instead
######################################
efak.zk.cluster.alias=
cluster1.zk.list=192.168.56.11:2181,192.168.56.13:2181,192.168.56.12:2181
cluster2.zk.list=xdn10:2181,xdn11:2181,xdn12:2181

######################################
# zookeeper enable acl
######################################
cluster1.zk.acl.enable=true
cluster1.zk.acl.schema=digest
cluster1.zk.acl.username=admin
cluster1.zk.acl.password=aaBB@1122

######################################
# broker size online list
######################################
cluster1.efak.broker.size=20

######################################
# zk client thread limit
######################################
kafka.zk.limit.size=16

######################################
# EFAK webui port
######################################
efak.webui.port=8048

######################################
# EFAK enable distributed
######################################
efak.distributed.enable=false
efak.cluster.mode.status=master
efak.worknode.master.host=localhost
efak.worknode.port=8085

######################################
# kafka jmx acl and ssl authenticate
######################################
cluster1.efak.jmx.acl=false
cluster1.efak.jmx.user=keadmin
cluster1.efak.jmx.password=keadmin123
cluster1.efak.jmx.ssl=false
cluster1.efak.jmx.truststore.location=/data/ssl/certificates/kafka.truststore
cluster1.efak.jmx.truststore.password=ke123456

######################################
# kafka offset storage
######################################
cluster1.efak.offset.storage=kafka
cluster2.efak.offset.storage=zk

######################################
# kafka jmx uri
######################################
cluster1.efak.jmx.uri=service:jmx:rmi:///jndi/rmi://%s/jmxrmi

######################################
# kafka metrics, 15 days by default
######################################
efak.metrics.charts=true
efak.metrics.retain=15

######################################
# kafka sql topic records max
######################################
efak.sql.topic.records.max=5000
efak.sql.topic.preview.records.max=10

######################################
# delete kafka topic token
######################################
efak.topic.token=keadmin

######################################
# kafka sasl authenticate
######################################
cluster1.efak.sasl.enable=true
cluster1.efak.sasl.protocol=SASL_PLAINTEXT
cluster1.efak.sasl.mechanism=PLAIN
cluster1.efak.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="aaBB@1122";
cluster1.efak.sasl.client.id=
cluster1.efak.blacklist.topics=
cluster1.efak.sasl.cgroup.enable=false
cluster1.efak.sasl.cgroup.topics=
cluster2.efak.sasl.enable=false
cluster2.efak.sasl.protocol=SASL_PLAINTEXT
cluster2.efak.sasl.mechanism=PLAIN
cluster2.efak.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="kafka" password="kafka-eagle";
cluster2.efak.sasl.client.id=
cluster2.efak.blacklist.topics=
cluster2.efak.sasl.cgroup.enable=false
cluster2.efak.sasl.cgroup.topics=

######################################
# kafka ssl authenticate
######################################
cluster3.efak.ssl.enable=false
cluster3.efak.ssl.protocol=SSL
cluster3.efak.ssl.truststore.location=
cluster3.efak.ssl.truststore.password=
cluster3.efak.ssl.keystore.location=
cluster3.efak.ssl.keystore.password=
cluster3.efak.ssl.key.password=
cluster3.efak.ssl.endpoint.identification.algorithm=https
cluster3.efak.blacklist.topics=
cluster3.efak.ssl.cgroup.enable=false
cluster3.efak.ssl.cgroup.topics=

######################################
# kafka sqlite jdbc driver address
######################################
#efak.driver=org.sqlite.JDBC
#efak.url=jdbc:sqlite:/hadoop/kafka-eagle/db/ke.db
#efak.username=root
#efak.password=www.kafka-eagle.org

######################################
# kafka sqlite jdbc driver address
######################################
efak.driver=org.sqlite.JDBC
efak.url=jdbc:sqlite:/tmp/ke.db
efak.username=root
efak.password=www.kafka-eagle.org

######################################
# kafka mysql jdbc driver address
######################################
#efak.driver=com.mysql.cj.jdbc.Driver
#efak.url=jdbc:mysql://127.0.0.1:3306/ke?useUnicode=true&characterEncoding=UTF-8&zeroDateTimeBehavior=convertToNull
#efak.username=root
#efak.password=123456
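
This PR targets Kafka brokers with two network interfaces; when EFAK cannot reach a broker's JMX endpoint, a quick reachability check from the EFAK host against the address the broker actually advertises can help narrow the problem down (a sketch; the address and port below are placeholders):

```bash
# Placeholders: substitute a real broker address and its JMX port.
nc -vz 192.168.56.11 9999
```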
4 changes: 2 additions & 2 deletions efak-api/pom.xml
@@ -4,14 +4,14 @@
<parent>
<groupId>org.smartloli.kafka.eagle</groupId>
<artifactId>efak</artifactId>
<version>${project.version}</version>
<version>3.0.9</version>
</parent>
<artifactId>efak-api</artifactId>
<dependencies>
<dependency>
<groupId>org.smartloli.kafka.eagle</groupId>
<artifactId>efak-common</artifactId>
<version>${project.version}</version>
<version>3.0.9</version>
</dependency>
<dependency>
<groupId>commons-io</groupId>
@@ -35,4 +35,5 @@ public interface IMService {
/** Send alert message by mail. */
public void sendPostMsgByMail(String data, String url);

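/** Send alert message by kafka. */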
public void sendPostMsgByKafka(String data, String topic);
}
@@ -20,6 +20,7 @@
import java.util.Date;

import org.smartloli.kafka.eagle.api.im.queue.DingDingJob;
import org.smartloli.kafka.eagle.api.im.queue.KafkaJob;
import org.smartloli.kafka.eagle.api.im.queue.MailJob;
import org.smartloli.kafka.eagle.api.im.queue.WeChatJob;
import org.smartloli.kafka.eagle.common.protocol.alarm.queue.BaseJobContext;
@@ -61,4 +62,11 @@ public void sendPostMsgByMail(String data, String url) {
QuartzManagerUtils.addJob(jobContext, KE_JOB_ID + new Date().getTime(), MailJob.class, QuartzManagerUtils.getCron(new Date(), 5));
}

@Override
public void sendPostMsgByKafka(String data, String topic) {
BaseJobContext jobContext = new BaseJobContext();
jobContext.setData(data);
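// The target Kafka topic is carried in the url field of BaseJobContext.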
jobContext.setUrl(topic);
QuartzManagerUtils.addJob(jobContext, KE_JOB_ID + new Date().getTime(), KafkaJob.class, QuartzManagerUtils.getCron(new Date(), 5));
}
}