Say you want to find out how many hits a specific page is getting from each source IP. You can use this quick chain of standard Linux tools to pull that data out of your Apache access log:
[code]grep -i "URL_TO_CHECK" PATH_TO_APACHE_ACCESS_LOG | cut -d' ' -f 1 | sort | uniq -c | sort -rn > ~/ip_report.txt[/code]
You are using
- grep to filter the log down to just the requests for the page you want the report on
- cut to pull the client IP address out of each matching line
- sort and uniq -c to count how many times each unique IP appears
- and finally sort -rn to order the results by hit count, highest first (see the stage-by-stage sketch below)
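To make the flow easier to follow, here is the same pipeline written out one stage per line with comments. The layout and the sample counts/IPs in the trailing comment are only illustrative; the commands themselves are unchanged:
[code]# The same pipeline, one stage per line (placeholders as above; sample IPs are illustrative)
grep -i "URL_TO_CHECK" PATH_TO_APACHE_ACCESS_LOG |  # keep only requests for the page of interest
  cut -d' ' -f 1 |                                  # field 1 of each log line is the client IP
  sort |                                            # group identical IPs together...
  uniq -c |                                         # ...so uniq -c can count them as "count IP" pairs
  sort -rn > ~/ip_report.txt                        # order by hit count, highest first

# ~/ip_report.txt then contains lines like:
#   152 192.0.2.10
#    87 198.51.100.4[/code]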
Example:
[code]grep -i "GET /" /opt/apache/logs/access_log | cut -d' ' -f 1 | sort | uniq -c | sort -rn > ~/ip_report.txt[/code]
gets you the report of hits to the index page.
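If you are after one particular source IP rather than the whole breakdown, a second grep on the cut output does it, and head will trim the report to the heaviest hitters. The IP 192.0.2.10 below is just a placeholder:
[code]# Count hits to the index page from a single client IP (192.0.2.10 is a placeholder)
grep -i "GET /" /opt/apache/logs/access_log | cut -d' ' -f 1 | grep -c "^192\.0\.2\.10$"

# Show only the 10 busiest IPs from the report generated above
head -n 10 ~/ip_report.txt[/code]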