Analyzing access logs using only bash commands

Gökhan Aydın
3 min read · Nov 24, 2021


Last week we were doing incident response on an Exchange server. We were analyzing a ProxyShell exploit (surprise, surprise) and had to go through the IIS logs to detect when the exploitation happened. I used just bash commands for it, and then I decided to write a blog post about that.

My main purpose in this post is not to show how this server was hacked, just to show how I analyzed the logs. Because of that, I had to redact some parts; I don't want to reveal any information about the case.

After obtaining the IIS logs from the server, let's start digging.

We have around 5.5 GB of logs to look at.
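You can check this quickly with du (a minimal sketch, assuming the logs were copied into a local W3SVC1/ directory, as in the output below):

# How much log data are we dealing with?
du -sh W3SVC1/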

I’m gonna use this Sigma rule for searching the logs.

https://github.com/SigmaHQ/sigma/blob/master/rules/web/web_exchange_proxyshell_successful.yml

Searching Logs

We need a search like this:

(search '/autodiscover.json' AND search (powershell OR 'mapi/nspi' OR 'mapi/emsmdb' OR '/EWS' OR 'X-Rps-CAT')) AND search ('- 200' OR '- 302')

I’m gonna use grep to search for these keywords.

grep -r '/autodiscover.json' * | grep -e 'powershell' -e 'mapi/nspi' -e 'mapi/emsmdb' -e '/EWS' -e 'X-Rps-CAT' | grep -e '- 200' -e '- 302'

Maybe there is a better way to do this; I’m not a bash expert, I just use it for a few handy features. If you have any recommendations, I’d like to hear them.
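For example, the three chained greps could be collapsed into a single awk pass. This is just a sketch (it assumes the logs sit one directory deep, e.g. W3SVC1/*.log):

# One pass instead of three chained greps; prints filename:line like grep -r does
awk '/\/autodiscover\.json/ && /powershell|mapi\/(nspi|emsmdb)|\/EWS|X-Rps-CAT/ && /- (200|302)/ {print FILENAME ":" $0}' */*.log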

This command prints the entire matching line, with every field in the log.

W3SVC1/u_ex210930.log:2021-09-30 04:55:24 REDACTED POST /autodiscover/autodiscover.json @evil.corp/powershell/?X-Rps-CAT=VgEAVAdXaW5kb3dzQwBBCEtlcmJlcm9zTB1BZG1pbmlzdHJhdG9yQREDACTEDUsUy0xLTUtMjEtMTczMDg0ODkwLTI2Mjk5Mzk3MzktMTA1MjE1NTI0Ny01MDBHAQAAAAcAAAAMUy0xLTUtMzItNTQ0RQAAAAA=&Email=autodiscover/autodiscover.json%3F@evil.corp&CorrelationID=<empty>;&cafeReqId=b5b2ed94-bd5c-4b03-8ca3-5cff3dbe3101; 443 REDACTED python-requests/2.26.0 200 0 0 89

Even in these ten lines of output, we have interesting keywords. It looks like the attackers used some kind of PoC Python script for this exploit, but that is out of scope right now.

Now I’m gonna use awk to print only the fields we need.

First, we have to know the order of the fields.

[Screenshot: IIS log fields order]
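For reference, the W3C header at the top of each IIS log file typically looks like this (the exact #Fields line depends on the server's logging configuration):

#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) sc-status sc-substatus sc-win32-status time-taken

One thing to keep in mind: grep -r prefixes each match with the file name, which gets glued to the date with a colon, so in awk terms $1 is filename:date, $2 the time, $6 the query string, $9 the client IP, and $10 the user agent.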

Which fields do we need? Let’s decide:

· Date and time
· Source IP
· Target URI
· User-agent

grep -r '/autodiscover.json' * | grep -e 'powershell' -e 'mapi/nspi' -e 'mapi/emsmdb' -e '/EWS' -e 'X-Rps-CAT' | grep -e '- 200' -e '- 302' | awk '{print $1,$2,$6,$9,$10}'
W3SVC1/u_ex210930.log:2021-09-30 04:55:28 @evil.corp/powershell/?X-Rps-CAT=VgEAVAdXaW5kb3dzQwBBCEtlcmJlcm9zTB1BZG1pbmlzdHJhdG9yQREDACTEDUsUy0xLTUtMjEtMTczMDg0ODkwLTI2Mjk5Mzk3MzktMTA1MjE1NTI0Ny01MDBHAQAAAAcAAAAMUy0xLTUtMzItNTQ0RQAAAAA=&Email=autodiscover/autodiscover.json%3F@evil.corp&CorrelationID=<empty>;&cafeReqId=e644525c-609e-4037-ae7c-41ce2b450a3a; REDACTED python-requests/2.26.0

Let’s look at how many IPs tried this exploit against this server.

I’ll use sort -u to make the IPs unique and wc -l to count the lines.

grep -r '/autodiscover.json' * | grep -e 'powershell' -e 'mapi/nspi' -e 'mapi/emsmdb' -e '/EWS' -e 'X-Rps-CAT' | grep -e '- 200' -e '- 302' | awk '{print $9}' | sort -u | wc -l
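A small variation of the same pipeline also shows how many attempts came from each of those IPs, busiest first (just swap the last two commands for uniq -c and a numeric sort):

grep -r '/autodiscover.json' * | grep -e 'powershell' -e 'mapi/nspi' -e 'mapi/emsmdb' -e '/EWS' -e 'X-Rps-CAT' | grep -e '- 200' -e '- 302' | awk '{print $9}' | sort | uniq -c | sort -rn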

This is how I analyzed IIS logs using bash commands. In this case there weren’t that many logs, but I have used the same technique on 35 GB of logs before. Of course, it takes a bit longer, but it still works.
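One tip for data sets that size: forcing the byte-oriented C locale usually speeds grep up noticeably, e.g.:

# Byte-wise matching instead of locale-aware matching is faster on large scans
LC_ALL=C grep -r '/autodiscover.json' *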
