"问题:解决办法:
在日志里发现每一个(SOH)处,都显示为\u0001,这是16进制的ascll code 1。和控制字符的code1也是对应的,断定(SOH)被解析成\u0001这个字符。
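Why this breaks JSON parsing can be reproduced outside Logstash. A minimal Python sketch, using a hypothetical payload with SOH embedded in a field value (the JSON spec forbids raw control characters inside strings, so a strict parser rejects it):

```python
import json

# Hypothetical message: the producer uses SOH (ASCII 1, "\x01") as a
# separator inside what is otherwise a JSON document.
raw = '{"msgid": "001", "detail": "a\x01b"}'

try:
    json.loads(raw)  # strict parsing rejects raw control characters
except json.JSONDecodeError as e:
    print("parse failed:", e)
```

This is the same class of failure the Logstash json codec hits when it sees \u0001 in the message.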
The Kafka input previously used codec => json { charset => ["GBK"] } directly, i.e. JSON parsing was done at consume time; this never worked and kept throwing errors.
A different approach:
1. Consume with the plain codec first, with no JSON parsing.
2. Replace SOH (\u0001) with a space (SOH carries no meaning, so replacing it with a space does not alter the source information): gsub => ["message", "\u0001", " "]
3. Then parse with json { source => "message" }. This works.
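The replace-then-parse steps above can be sketched in Python (the payload is the same hypothetical SOH-tainted message as before):

```python
import json

raw = '{"msgid": "001", "detail": "a\x01b"}'  # hypothetical message containing SOH

cleaned = raw.replace("\u0001", " ")  # same idea as gsub => ["message","\u0001"," "]
event = json.loads(cleaned)           # now parses cleanly
print(event["detail"])                # prints: a b
```

Because SOH was only a separator, the substituted space leaves every field value intact.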
--------------------------Configuration---------------------------------
input {
  kafka {
    bootstrap_servers => "ZBSZ1-LOG-KFK01:9092,ZBSZ1-LOG-KFK02:9092,ZBSZ1-LOG-KFK03:9092"
    group_id => "es-rzrqbp02"
    topics_pattern => "rzrqbp-C010001"
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"  # raw-byte deserializer, because the source is transmitted in GBK
    codec => plain { charset => "GBK" }
    #codec => json
  }
}
filter {
  mutate {
    gsub => ["message", "\u0001", " "]
  }
  json {
    source => "message"
  }
  mutate {
    # convert after the json filter, once [indicator][usedtime] exists
    convert => { "[indicator][usedtime]" => "integer" }
  }
  date {
    match => ["[time_stamp]", "UNIX_MS"]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["ZBSZ1-LOG-ES01:9200", "ZBSZ1-LOG-ES02:9200", "ZBSZ1-LOG-ES03:9200"]
    index => "app-rzrqbp-%{+YYYY.MM.dd}"
    document_id => "%{[indicator][msgid]}"
  }
}
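End to end, the config does three things: decode raw GBK bytes to text, strip SOH, then parse JSON. A minimal Python sketch of that flow, using a hypothetical GBK-encoded payload (field names are illustrative):

```python
import json

# Hypothetical GBK-encoded bytes as they would arrive from Kafka
# via the ByteArrayDeserializer.
gbk_bytes = '{"msgid": "001", "name": "测试"}'.encode("gbk")

text = gbk_bytes.decode("gbk")      # plain codec with charset => "GBK"
text = text.replace("\u0001", " ")  # mutate gsub
event = json.loads(text)            # json filter
```

Decoding before JSON parsing matters: GBK multi-byte sequences are not valid UTF-8, so skipping the charset step would corrupt the non-ASCII field values.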
"
Logstash JSON parsing failure: ctrl-code 1
Original article: http://blog.51cto.com/11209357/2059041