A Collection of Shell Commands for Analyzing Server Logs


 

Author: Panda

Link: https://reurl.cc/kVrNXb

 

My small website runs on an Alibaba Cloud ECS instance, and I occasionally analyze its server logs to check the traffic and see whether anyone from the dark side has been causing damage. So I collected and organized some server log analysis commands that you can try!

1. Count how many unique IPs have accessed the site:

 

awk '{print $1}' log_file|sort|uniq|wc -l

 

2. View the number of times a certain page has been visited:

 

grep "/index.php" log_file | wc -l

 

3. Check how many pages each IP has visited:

 

awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file > log.txt
sort -n -k 2 log.txt   # further sort the result by page count

 

4. Sort the number of pages visited by each IP from small to large:

 

awk '{++S[$1]} END {for (a in S) print S[a],a}' log_file | sort -n

 

5. Check which pages a certain IP has visited:

 

grep ^111.111.111.111 log_file| awk '{print $1,$7}'

 

6. Count unique IPs while excluding search-engine spiders (the user-agent field must start with "Mozilla):

 

awk '{print $12,$1}' log_file | grep '^"Mozilla' | awk '{print $2}' | sort | uniq | wc -l

 

7. Check how many unique IPs accessed during the 14:00 hour on 16 August 2015:

 

awk '{print $4,$1}' log_file | grep 16/Aug/2015:14 | awk '{print $2}'| sort | uniq | wc -l

 

8. Find the top ten IPs by number of accesses

 

awk '{print $1}' access_log | sort | uniq -c | sort -nr | head -10

uniq -c groups identical lines and prefixes each one with its count (like a GROUP BY with COUNT)
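
For example, a quick illustration of what uniq -c does on a toy input (not from the log):

printf 'a\nb\na\na\n' | sort | uniq -c
#      3 a
#      1 b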

 

cat access.log|awk '{print $1}'|sort|uniq -c|sort -nr|head -10
cat access.log|awk '{counts[$11]+=1}; END {for(url in counts) print counts[url], url}'

 

9. The 10 most visited files or pages

 

cat log_file|awk '{print $11}'|sort|uniq -c|sort -nr | head -10
cat log_file|awk '{print $11}'|sort|uniq -c|sort -nr | head -20
awk '{print $1}' log_file | sort -n -r | uniq -c | sort -n -r | head -20   # the top 20 most visited IPs

 

10. Count visits per subdomain, based on the referer (slightly inaccurate):

 

cat access.log | awk '{print $11}' | sed -e 's/http:\/\///' -e 's/\/.*//' | sort | uniq -c | sort -rn | head -20

 

11. List the files with the largest transfer size

 

cat www.access.log |awk '($7~/.php/){print $10 " " $1 " " $4 " " $7}'|sort -nr|head -100

 

12. List the pages whose response is larger than 200,000 bytes (about 200 KB) and how many times each occurs

 

cat www.access.log |awk '($10 > 200000 && $7~/.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100

 

13. If the last column of the log records the page transfer time, list the pages that take the longest to deliver to the client

 

cat www.access.log |awk '($7~/.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100

 

14. List the most time-consuming pages (more than 60 seconds) and the number of occurrences of the corresponding pages

 

cat www.access.log |awk '($NF > 60 && $7~/.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100

 

15. List files whose transmission time exceeds 30 seconds

 

cat www.access.log |awk '($NF > 30){print $7}'|sort -n|uniq -c|sort -nr|head -20

16. List the number of running instances of each process on the server, in descending order

 

ps -ef | awk -F ' ' '{print $8 " " $9}' |sort | uniq -c |sort -nr |head -20

 

17. View Apache's current number of concurrent connections

How does it compare with the MaxClients setting in httpd.conf?

 

netstat -an | grep ESTABLISHED | wc -l
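
To compare against the configured ceiling, you can grep the limit out of the Apache config; the path below is a common default and may differ on your system (MaxRequestWorkers is the Apache 2.4 name for MaxClients):

grep -iE 'MaxClients|MaxRequestWorkers' /etc/httpd/conf/httpd.conf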

 

18. You can inspect the numbers with the following commands

 

ps -ef|grep httpd|wc -l
1388

This counts the httpd processes. With Apache's process-per-request model, each request is served by its own process, so this means Apache is currently handling 1388 concurrent requests; Apache adjusts the number of processes automatically based on load.

 

netstat -nat|grep -i "80"|wc -l
4341

netstat -an prints the system's current network connection status, grep -i "80" extracts the connections related to port 80, and wc -l counts them.
The final number is the total number of connections involving port 80.

 

netstat -na|grep ESTABLISHED|wc -l
376

netstat -an prints the current network connection status, grep ESTABLISHED extracts the connections that have been established, and wc -l counts them; the final number is the total number of currently established connections.

 

netstat -nat | grep ESTABLISHED

Lists the detailed records of all established connections (pipe to wc -l to count them instead)

 

19. Output the number of connections for each ip, and the total number of connections in each state

 

netstat -n | awk '/^tcp/ {n=split($(NF-1),array,":");if(n<=2)++S[array[1]];else++S[array[4]];++s[$NF];++N} END {for(a in S){printf("%-20s %s\n", a, S[a]);++I}printf("%-20s %s\n","TOTAL_IP",I);for(a in s) printf("%-20s %s\n",a, s[a]);printf("%-20s %s\n","TOTAL_LINK",N);}'

 

20. Other collections

List the top 20 most-visited URLs on 2012-05-04 in the log file

 

cat access.log |grep '04/May/2012'| awk '{print $11}'|sort|uniq -c|sort -nr|head -20

Find the IPs that visited pages whose URL contains www.abc.com

 

cat access_log | awk '($11~/www.abc.com/){print $1}'|sort|uniq -c|sort -nr

Get the 10 most active IPs (you can also filter by time)

 

cat linewow-access.log|awk '{print $1}'|sort|uniq -c|sort -nr|head -10

Query the log within a given time period

 

cat log_file | egrep '15/Aug/2015|16/Aug/2015' |awk '{print $1}'|sort|uniq -c|sort -nr|head -10

From 2015/8/15 to 2015/8/16, list the IPs that accessed "/index.php?g=Member&m=Public&a=sendValidCode", sorted in descending order of count

 

cat log_file | egrep '15/Aug/2015|16/Aug/2015' | awk '{if($7 == "/index.php?g=Member&m=Public&a=sendValidCode") print $1,$7}'|sort|uniq -c|sort -nr

 

($7~/.php/) outputs the line only if $7 contains .php; the command below lists the one hundred most time-consuming PHP pages

 

 

cat log_file |awk '($7~/.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100

List the most time-consuming pages (more than 60 seconds) and the number of occurrences of the corresponding pages

 

cat access.log |awk '($NF > 60 && $7~/.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100

Total website traffic (in GB)

 

cat access.log |awk '{sum+=$10} END {print sum/1024/1024/1024}'

Count 404 responses

 

awk '($9 ~/404/)' access.log | awk '{print $9,$7}' | sort

HTTP status code statistics

 

cat access.log |awk '{counts[$(9)]+=1}; END {for(code in counts) print code, counts[code]}' 
cat access.log |awk '{print $9}'|sort|uniq -c|sort -rn

Concurrency per second

 

watch "awk '{if($9~/200|30|404/)COUNT[$4]++}END{for( a in COUNT) print a,COUNT[a]}' log_file|sort -k 2 -nr|head -n10"

Bandwidth statistics

 

cat apache.log |awk '{if($7~/GET/) count++}END{print "client_request="count}' 
cat apache.log |awk '{BYTE+=$11}END{print "client_kbyte_out="BYTE/1024"KB"}'

Find the 10 most visited IPs on a certain day

 

cat /tmp/access.log | grep "20/Mar/2011" |awk '{print $3}'|sort |uniq -c|sort -nr|head

See what the IP with the most connections was doing that day

 

cat access.log | grep "10.0.21.17" | awk '{print $8}' | sort | uniq -c | sort -nr | head -n 10

The 10 one-hour periods with the most IP connections

 

awk -vFS="[:]" '{gsub("-.*","",$1);num[$2" "$1]++}END{for(i in num)print i,num[i]}' log_file | sort -n -k 3 -r | head -10

Find out the most visited minutes

 

awk '{print $4}' access.log | grep "20/Mar/2011" |cut -c 14-18|sort|uniq -c|sort -nr|head

Extract the last 5 minutes of the log

 

if [ "$DATE_MINUTE" != "$DATE_END_MINUTE" ]; then   # check whether the start and end minute stamps differ
    START_LINE=$(sed -n "/$DATE_MINUTE/=" "$APACHE_LOG" | head -n1)   # if they differ, get the line number of the start stamp (and likewise for the end stamp)
fi
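
A minimal end-to-end sketch of the same idea, assuming GNU date and an example log path (both are assumptions, adjust for your setup); grep -n is used instead of sed so the slashes in the timestamp need no escaping:

APACHE_LOG=/var/log/httpd/access_log                       # assumed path
DATE_MINUTE=$(date -d '-5 minutes' '+%d/%b/%Y:%H:%M')      # start stamp, e.g. 16/Aug/2015:14:00
DATE_END_MINUTE=$(date '+%d/%b/%Y:%H:%M')                  # end stamp (current minute)
START_LINE=$(grep -n "$DATE_MINUTE" "$APACHE_LOG" | head -n1 | cut -d: -f1)    # first line of the window
END_LINE=$(grep -n "$DATE_END_MINUTE" "$APACHE_LOG" | tail -n1 | cut -d: -f1)  # last line of the window
[ -n "$START_LINE" ] && [ -n "$END_LINE" ] && sed -n "${START_LINE},${END_LINE}p" "$APACHE_LOG"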

View TCP connection states

 

netstat -nat |awk '{print $6}'|sort|uniq -c|sort -rn 

netstat -n | awk '/^tcp/ {++S[$NF]};END {for(a in S) print a, S[a]}' 

netstat -n | awk '/^tcp/ {++state[$NF]}; END {for(key in state) print key,"",state[key]}' 

netstat -n | awk '/^tcp/ {++arr[$NF]};END {for(k in arr) print k,"",arr[k]}' 

netstat -n |awk '/^tcp/ {print $NF}'|sort|uniq -c|sort -rn 

netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
netstat -ant|awk '/ip:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' |sort -n   # replace "ip" with your server's IP address

netstat -ant|awk '/:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' |sort -rn|head -n 10 

awk 'BEGIN{printf("http_code\tcount_num\n")}{COUNT[$10]++}END{for (a in COUNT) printf("%s\t%s\n", a, COUNT[a])}' log_file

Find the top 20 IPs by number of requests (often used to find the source of an attack):

 

netstat -anlp|grep 80|grep tcp|awk '{print $5}'|awk -F: '{print $1}'|sort|uniq -c|sort -nr|head -n20 
netstat -ant |awk '/:80/{split($5,ip,":");++A[ip[1]]}END{for(i in A) print A[i],i}' |sort -rn|head -n20

Use tcpdump to sniff port-80 traffic and see which IPs are most active

 

tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr |head -20

Find the IPs with the most TIME_WAIT connections

 

netstat -n|grep TIME_WAIT|awk '{print $5}'|sort|uniq -c|sort -rn|head -n20

Find the IPs with the most SYN connections

 

netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more

List processes by port

 

netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1

View the total number of connections and the number of currently established connections (for a given $ip)

 

netstat -ant | grep "$ip:80" | wc -l
netstat -ant | grep "$ip:80" | grep EST | wc -l

View the number of IP visits

 

netstat -nat|grep ":80"|awk '{print $5}' |awk -F: '{print $1}' | sort| uniq -c|sort -n

Linux commands to analyze the current connection states

 

netstat -n | awk '/^tcp/ {++S[$NF]} END {for(a in S) print a, S[a]}'
watch "netstat -n | awk '/^tcp/ {++S[\$NF]} END {for(a in S) print a, S[a]}'"   # monitor continuously with watch

 

LAST_ACK 5 # Closing a TCP connection requires closing both directions: each side sends a FIN to shut down its half of the data flow. After a side sends its final FIN it enters LAST_ACK; once it receives the peer's ACK of that FIN, the TCP connection is fully closed;

SYN_RECV 30 # indicates the number of requests waiting to be processed;

ESTABLISHED 1597 # indicates the normal data transmission status; 

FIN_WAIT1 51 # indicates the server actively requested closing the TCP connection;

FIN_WAIT2 504 # indicates that the client interrupts the connection; 

TIME_WAIT 1057 # indicates the number of requests that have been processed and waited for the timeout to end; 



Origin blog.csdn.net/wzlsunice88/article/details/114261076