How to use the analyzeAuditLog tool


Our support bundle provides the analyzeAuditLog tool. It is a simple bash script that requires awk and grep.
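
Since the script only relies on standard tools, you can quickly confirm they are available with a generic shell check (not part of the tool itself):

command -v bash awk grep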

The tool expects you to pass an uncompressed gateway (8.1.0 or later) audit log. Make sure the log is sorted in time: if you use cat to combine multiple rotated audit logs, combine them in the correct order, as shown below. The order only matters for the log date range detection.
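
For example, assuming the rotated files follow the usual oldest-has-the-highest-number naming scheme (the file names here are hypothetical), an oldest-to-newest combine could look like this:

cat audit.log.2 audit.log.1 audit.log > combined_audit.log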

Tool syntax:

analyzeAuditLogs.sh <gateway_audit.log> <customername>

If you don’t specify a customername, the tool will prompt you for one.
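
A typical run might look like this (the customer name is only a placeholder):

analyzeAuditLogs.sh combined_audit.log AcmeCorp

The analysis produces the loganalysis-xxxx.tgz bundle that is used by the report generator described next.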

I have also added a tool that converts the analysis output locally into an HTML report, for customer consumption.

Tool syntax:

generateLocalAuditReport.sh <loganalysis-xxxx.tgz>

This produces a local OUTPUT directory with a main index HTML file named loganalysis-xxx.html; the per-domain HTML files have base64-encoded unique names and are referenced by the index page. It is all a set of static web pages, so you can upload it to the web server of your choice or host it locally with Python, for example:

cd OUTPUT
python3 -m http.server 5555

Remarks

  • This tool needs about 20 MB of RAM per 1 million lines in the input audit log file.

  • The analysis part benefits from a higher CPU clock frequency, more cores and faster disk speed ( it was developed on SSD/NVMe ).

  • This tool supports both the local gateway audit log format and the syslog audit log format, in the US locale.

    Local format it expects to see:
    2024-11-05 00:15:19,940 INFO [3FB8F3E69971A52E-826]

    Syslog format it expects to see:
    10.11.200.237 <166>2024-11-22T00:22:46,255 INFO [AD43E7C90B774E76]
    or
    10.11.200.237 2024-11-22T00:22:46,255 INFO [AD43E7C90B774E76]

    Example syslog format it does not accept:
    Jul 29 10:49:19 192.168.50.6 2024-07-29T08:49:19,732 INFO [AD43E7C90B774E76]

  • This tool consumes all the CPU cores it can find, so I suggest running it on SCS rather than on the gateway itself. The advantage of running it on SCS is that you get an overall view of all gateways via syslog.

  • This tool needs up to 2x the size of the input audit log file for temporary files in the current working directory. For example, if you pass the tool a 4 GB file, it will need up to another 4 GB of disk space for temp files.

  • This tool expects to see a gateway audit log with query statistics in column 22. If you have an older gateway log, you will need to manually add an empty column 22 using the following command:

sed -i 's/$/ []/g' YOURAUDITLOG.log
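
If you prefer to keep a copy of the original file before this in-place edit, GNU sed can write a backup next to it (the .bak suffix is just an example):

sed -i.bak 's/$/ []/' YOURAUDITLOG.log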

Global parameters

Parameter Name   Default   Comment
DECIMAL          .         Decimal separator character to use; depends on your locale settings.
DELOUTPUTDIR     true      Delete the output directory after the final tarball has been created.
DELDOMAINLIST    false     Delete the per-domain raw list files.
SKIPWORKLOAD     false     Disable workload detection ( expensive greps ).
POSTTOSUPPORT    false     Flag to allow/disallow posting the bundle as a comment to the ticket.
STRIPSCSP        true      Filter out SCSP traffic from the analysis.
BLACKLIST        qualys    Blacklist security scanners.
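
This document does not state how these parameters are set; assuming they are plain shell variables that can be overridden from the environment (an assumption; they may instead need to be edited at the top of the script), an invocation could look like this:

SKIPWORKLOAD=true DECIMAL=',' analyzeAuditLogs.sh combined_audit.log AcmeCorp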

What does the tool generate?

Global

  1. total operations per domain

  2. total operations per gateway ( if syslog format is detected )

  3. count of list operations by type

  4. log date range detected

  5. per domain average and maximum object size from PUT operations

  6. count of operations by verb and status

  7. total operations per source IP

Per Domain

  1. raw all-list-operations file

  2. total operations per bucket

  3. total operations per protocol

  4. total operations per gateway ( if syslog format is detected )

  5. count of list operations per type

  6. per bucket average and maximum object size from PUT operations

  7. count of operations by verb and status

  8. total operations per source IP

  9. operation latency info per verb

  10. PUT latency per hour

  11. PUT ObjectRetention latency per hour

  12. Top 10 Pseudo folders ( 2 levels deep )

  13. For each detected workload keyword it creates a HASWORKLOAD file with an occurrence count inside of it. Supported workloads are: Veeam, Multipart, Backup365, Commvault, FileFly, ObjectLocking and Rubrik.
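
As a rough illustration only (this is not the tool's actual implementation, and the output file name below is made up), such a workload keyword count boils down to something like counting the lines that mention the keyword:

grep -ci 'veeam' combined_audit.log > HASWORKLOAD.Veeam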

© DataCore Software Corporation. · https://www.datacore.com · All rights reserved.