Notes | Kafka | kafka-dump-log.sh: Parsing Kafka Log Files

This tool helps to parse a log file and dump its contents to the console, useful for debugging a seemingly corrupt log segment.

The kafka-dump-log.sh script can be used to parse Kafka's various log files (for example .timeindex, .index, and .log files).
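
For example, a minimal invocation on a data segment might look like this (the path below is a placeholder; substitute your own log directory, topic, and partition):

  > bin/kafka-dump-log.sh --files /var/kafka-logs/my-topic-0/00000000000000000000.log --print-data-log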

The official Kafka documentation describes kafka-dump-log as follows:

Dump Log Tool

The kafka-dump-log tool can be used to debug the log segments and snapshots for the cluster metadata directory. The tool will scan the provided files and decode the metadata records. For example, this command decodes and prints the records in the first log segment:

  > bin/kafka-dump-log.sh --cluster-metadata-decoder --files metadata_log_dir/__cluster_metadata-0/00000000000000000000.log

This command decodes and prints the records in a cluster metadata snapshot:

  > bin/kafka-dump-log.sh --cluster-metadata-decoder --files metadata_log_dir/__cluster_metadata-0/00000000000000000100-0000000001.checkpoint
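
The same tool can also decode the offset index and time index files that sit next to each data segment. A minimal sketch, again with placeholder paths:

  > bin/kafka-dump-log.sh --files /var/kafka-logs/my-topic-0/00000000000000000000.index
  > bin/kafka-dump-log.sh --files /var/kafka-logs/my-topic-0/00000000000000000000.timeindex

The .index dump shows offset/position pairs, and the .timeindex dump shows timestamp/offset pairs.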

The options of kafka-dump-log.sh are as follows:

Option Description
--deep-iteration if set, uses deep instead of shallow iteration. Automatically set if print-data-log is enabled.
--files <String: file1, file2, ...> REQUIRED: The comma separated list of data and index log files to be dumped.
--help Print usage information.
--index-sanity-check if set, just checks the index sanity without printing its content. This is the same check that is executed on broker startup to determine if an index needs rebuilding or not.
--key-decoder-class [String] if set, used to deserialize the keys. This class should implement the kafka.serializer.Decoder trait. Custom jar should be available in kafka/libs directory. (default: kafka.serializer.StringDecoder)
--max-message-size <Integer: size> Size of largest message. (default: 5242880)
--offsets-decoder if set, log data will be parsed as offset data from the __consumer_offsets topic.
--print-data-log if set, printing the messages content when dumping data logs. Automatically set if any decoder option is specified.
--transaction-log-decoder if set, log data will be parsed as transaction metadata from the __transaction_state topic.
--value-decoder-class [String] if set, used to deserialize the messages. This class should implement the kafka.serializer.Decoder trait. Custom jar should be available in kafka/libs directory. (default: kafka.serializer.StringDecoder)
--verify-index-only if set, just verify the index log without printing its content.
--version Display Kafka version.
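
For example, the decoder options can be combined with --files to inspect Kafka's internal topics, and --index-sanity-check validates an index without printing its content. A sketch with placeholder paths:

  > bin/kafka-dump-log.sh --offsets-decoder --files /var/kafka-logs/__consumer_offsets-0/00000000000000000000.log
  > bin/kafka-dump-log.sh --index-sanity-check --files /var/kafka-logs/my-topic-0/00000000000000000000.index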

A shell script for dumping Kafka log files in batch (/home/myself/src-logs is the source log directory and /home/myself/dump-logs is the output directory):

#!/bin/bash

# Source directory containing the Kafka log segments to dump
src_dir="/home/myself/src-logs"
# Output directory for the dump results
out_dir="/home/myself/dump-logs"

# Exit with an error if the source path is missing or is not a directory
if [ ! -d "$src_dir" ]; then
  echo "Error: $src_dir is not a directory"
  exit 1
fi

# Make sure the output directory exists before writing dump files
mkdir -p "$out_dir"

# Dump every file with a .log suffix in the source directory,
# writing the output to a file of the same name in the output directory
for file in "$src_dir"/*.log; do
  if [ -f "$file" ]; then
    bash /opt/kafka/bin/kafka-dump-log.sh --files "$file" --print-data-log > "$out_dir/${file##*/}"
  fi
done
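
To use it, save the script under a name of your choice (dump-kafka-logs.sh is used here only as an example), make it executable, and run it; each .log segment in /home/myself/src-logs then gets a dump file of the same name in /home/myself/dump-logs:

  > chmod +x dump-kafka-logs.sh
  > ./dump-kafka-logs.sh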

Reposted from blog.csdn.net/Changxing_J/article/details/130330236