---
-[tag-1]: https://img.shields.io/github/downloads/Yamato-Security/hayabusa/total?style=plastic&label=GitHub%F0%9F%A6%85DownLoads
+[tag-1]: https://img.shields.io/github/downloads/Yamato-Security/hayabusa/total?style=plastic&label=GitHub%F0%9F%A6%85Downloads
[tag-2]: https://img.shields.io/github/stars/Yamato-Security/hayabusa?style=plastic&label=GitHub%F0%9F%A6%85Stars
[tag-3]: https://img.shields.io/github/v/release/Yamato-Security/hayabusa?display_name=tag&label=latest-version&style=plastic
-[tag-4]: https://img.shields.io/badge/Black%20Hat%20Arsenal-Asia%202022-blue
+[tag-4]: https://github.com/toolswatch/badges/blob/master/arsenal/asia/2022.svg
[tag-5]: https://rust-reportcard.xuri.me/badge/github.com/Yamato-Security/hayabusa
[tag-6]: https://img.shields.io/badge/Maintenance%20Level-Actively%20Developed-brightgreen.svg
[tag-7]: https://img.shields.io/badge/Twitter-00acee?logo=twitter&logoColor=white
@@ -20,14 +20,14 @@
# About Hayabusa
-Hayabusa is a **Windows event log fast forensics timeline generator** and **threat hunting tool** created by the [Yamato Security](https://yamatosecurity.connpass.com/) group in Japan. Hayabusa means ["peregrine falcon"](https://en.wikipedia.org/wiki/Peregrine_falcon") in Japanese and was chosen as peregrine falcons are the fastest animal in the world, great at hunting and highly trainable. It is written in [Rust](https://www.rust-lang.org/) and supports multi-threading in order to be as fast as possible. We have provided a [tool](https://github.com/Yamato-Security/hayabusa-rules/tree/main/tools/sigmac) to convert [sigma](https://github.com/SigmaHQ/sigma) rules into hayabusa rule format. The hayabusa detection rules are based on sigma rules, written in YML in order to be as easily customizable and extensible as possible. It can be run either on running systems for live analysis or by gathering logs from multiple systems for offline analysis. (At the moment, it does not support real-time alerting or periodic scans.) The output will be consolidated into a single CSV timeline for easy analysis in Excel, [Timeline Explorer](https://ericzimmerman.github.io/#!index.md), or [Elastic Stack](doc/ElasticStackImport/ElasticStackImport-English.md).
+Hayabusa is a **Windows event log fast forensics timeline generator** and **threat hunting tool** created by the [Yamato Security](https://yamatosecurity.connpass.com/) group in Japan. Hayabusa means ["peregrine falcon"](https://en.wikipedia.org/wiki/Peregrine_falcon) in Japanese and was chosen because peregrine falcons are the fastest animals in the world, great at hunting, and highly trainable. It is written in [Rust](https://www.rust-lang.org/) and supports multi-threading in order to be as fast as possible. We have provided a [tool](https://github.com/Yamato-Security/hayabusa-rules/tree/main/tools/sigmac) to convert [Sigma](https://github.com/SigmaHQ/sigma) rules into Hayabusa rule format. The Sigma-compatible Hayabusa detection rules are written in YML in order to be as easily customizable and extensible as possible. Hayabusa can be run either on single running systems for live analysis, by gathering logs from single or multiple systems for offline analysis, or by running the [Hayabusa artifact](https://docs.velociraptor.app/exchange/artifacts/pages/windows.eventlogs.hayabusa/) with [Velociraptor](https://docs.velociraptor.app/) for enterprise-wide threat hunting and incident response. The output is consolidated into a single CSV timeline for easy analysis in Excel, [Timeline Explorer](https://ericzimmerman.github.io/#!index.md), [Elastic Stack](doc/ElasticStackImport/ElasticStackImport-English.md), [Timesketch](https://timesketch.org/), etc...
## Table of Contents
- [About Hayabusa](#about-hayabusa)
- [Table of Contents](#table-of-contents)
- [Main Goals](#main-goals)
- - [Threat Hunting](#threat-hunting)
+ - [Threat Hunting and Enterprise-wide DFIR](#threat-hunting-and-enterprise-wide-dfir)
- [Fast Forensics Timeline Generation](#fast-forensics-timeline-generation)
- [Screenshots](#screenshots)
- [Startup](#startup)
@@ -38,9 +38,9 @@ Hayabusa is a **Windows event log fast forensics timeline generator** and **thre
- [Analysis in Timeline Explorer](#analysis-in-timeline-explorer)
- [Critical Alert Filtering and Computer Grouping in Timeline Explorer](#critical-alert-filtering-and-computer-grouping-in-timeline-explorer)
- [Analysis with the Elastic Stack Dashboard](#analysis-with-the-elastic-stack-dashboard)
+ - [Analysis in Timesketch](#analysis-in-timesketch)
- [Analyzing Sample Timeline Results](#analyzing-sample-timeline-results)
- [Features](#features)
-- [Planned Features](#planned-features)
- [Downloads](#downloads)
- [Git cloning](#git-cloning)
- [Advanced: Compiling From Source (Optional)](#advanced-compiling-from-source-optional)
@@ -48,26 +48,38 @@ Hayabusa is a **Windows event log fast forensics timeline generator** and **thre
- [Cross-compiling 32-bit Windows Binaries](#cross-compiling-32-bit-windows-binaries)
- [macOS Compiling Notes](#macos-compiling-notes)
- [Linux Compiling Notes](#linux-compiling-notes)
+ - [Cross-compiling Linux MUSL Binaries](#cross-compiling-linux-musl-binaries)
- [Running Hayabusa](#running-hayabusa)
- - [Caution: Anti-Virus/EDR Warnings](#caution-anti-virusedr-warnings)
+ - [Caution: Anti-Virus/EDR Warnings and Slow Runtimes](#caution-anti-virusedr-warnings-and-slow-runtimes)
- [Windows](#windows)
- [Linux](#linux)
- [macOS](#macos)
- [Usage](#usage)
+ - [Main commands](#main-commands)
- [Command Line Options](#command-line-options)
- [Usage Examples](#usage-examples)
- [Pivot Keyword Generator](#pivot-keyword-generator)
- [Logon Summary Generator](#logon-summary-generator)
- [Testing Hayabusa on Sample Evtx Files](#testing-hayabusa-on-sample-evtx-files)
- [Hayabusa Output](#hayabusa-output)
+ - [Profiles](#profiles)
+ - [1. `minimal` profile output](#1-minimal-profile-output)
+ - [2. `standard` profile output](#2-standard-profile-output)
+ - [3. `verbose` profile output](#3-verbose-profile-output)
+ - [4. `verbose-all-field-info` profile output](#4-verbose-all-field-info-profile-output)
+ - [5. `verbose-details-and-all-field-info` profile output](#5-verbose-details-and-all-field-info-profile-output)
+ - [6. `timesketch` profile output](#6-timesketch-profile-output)
+ - [Profile Comparison](#profile-comparison)
+ - [Profile Field Aliases](#profile-field-aliases)
  - [Level Abbreviations](#level-abbreviations)
- [MITRE ATT&CK Tactics Abbreviations](#mitre-attck-tactics-abbreviations)
- [Channel Abbreviations](#channel-abbreviations)
- [Progress Bar](#progress-bar)
- [Color Output](#color-output)
- - [Event Fequency Timeline](#event-fequency-timeline)
- - [Dates with most total detections](#dates-with-most-total-detections)
- - [Top 5 computers with most unique detections](#top-5-computers-with-most-unique-detections)
+ - [Results Summary](#results-summary-1)
+    - [Event Frequency Timeline](#event-fequency-timeline)
+ - [Dates with most total detections](#dates-with-most-total-detections)
+ - [Top 5 computers with most unique detections](#top-5-computers-with-most-unique-detections)
- [Hayabusa Rules](#hayabusa-rules)
- [Hayabusa v.s. Converted Sigma Rules](#hayabusa-vs-converted-sigma-rules)
- [Detection Rule Tuning](#detection-rule-tuning)
@@ -86,36 +98,36 @@ Hayabusa is a **Windows event log fast forensics timeline generator** and **thre
## Main Goals
-### Threat Hunting
+### Threat Hunting and Enterprise-wide DFIR
-Hayabusa currently has over 2300 sigma rules and over 130 hayabusa rules with more rules being added regularly. The ultimate goal is to be able to push out hayabusa agents to all Windows endpoints after an incident or for periodic threat hunting and have them alert back to a central server.
+Hayabusa currently has over 2600 Sigma rules and over 130 Hayabusa built-in detection rules, with more rules being added regularly. It can be used for enterprise-wide proactive threat hunting as well as DFIR (Digital Forensics and Incident Response) for free with [Velociraptor](https://docs.velociraptor.app/)'s [Hayabusa artifact](https://docs.velociraptor.app/exchange/artifacts/pages/windows.eventlogs.hayabusa/). By combining these two open-source tools, you can essentially reproduce a SIEM retroactively when there is no SIEM set up in the environment. You can learn how to do this by watching [Eric Capuano](https://twitter.com/eric_capuano)'s Velociraptor walkthrough [here](https://www.youtube.com/watch?v=Q1IoGX--814).
### Fast Forensics Timeline Generation
-Windows event log analysis has traditionally been a very long and tedious process because Windows event logs are 1) in a data format that is hard to analyze and 2) the majority of data is noise and not useful for investigations. Hayabusa's main goal is to extract out only useful data and present it in an easy-to-read format that is usable not only by professionally trained analysts but any Windows system administrator.
-Hayabusa is not intended to be a replacement for tools like [Evtx Explorer](https://ericzimmerman.github.io/#!index.md) or [Event Log Explorer](https://eventlogxp.com/) for more deep-dive analysis but is intended for letting analysts get 80% of their work done in 20% of the time.
+Windows event log analysis has traditionally been a very long and tedious process because Windows event logs are 1) in a data format that is hard to analyze and 2) the majority of data is noise and not useful for investigations. Hayabusa's goal is to extract out only useful data and present it in an easy-to-read format that is as concise as possible, usable not only by professionally trained analysts but any Windows system administrator.
+Hayabusa hopes to let analysts get 80% of their work done in 20% of the time when compared to traditional Windows event log analysis.
# Screenshots
## Startup
-
+
## Terminal Output
-
+
## Event Frequency Timeline (`-V` option)
-
+
## Results Summary
-
+
## Analysis in Excel
-
+
## Analysis in Timeline Explorer
@@ -131,6 +143,10 @@ Hayabusa is not intended to be a replacement for tools like [Evtx Explorer](http

+## Analysis in Timesketch
+
+
+
# Analyzing Sample Timeline Results
You can check out a sample CSV timeline [here](https://github.com/Yamato-Security/hayabusa/tree/main/sample-results).
@@ -139,6 +155,8 @@ You can learn how to analyze CSV timelines in Excel and Timeline Explorer [here]
You can learn how to import CSV files into Elastic Stack [here](doc/ElasticStackImport/ElasticStackImport-English.md).
+You can learn how to import CSV files into Timesketch [here](doc/TimesketchImport/TimesketchImport-English.md).
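+
+The CSV timelines can also be analyzed programmatically. The following minimal Python sketch filters critical alerts out of a timeline; the sample rows and column names below are hypothetical (they mirror the default `standard` profile columns described later), so adjust them to match your actual CSV header:
+
+```python
+import csv
+import io
+
+# Hypothetical sample rows shaped like a Hayabusa `standard` profile CSV.
+SAMPLE_CSV = """\
+Timestamp,Computer,Channel,EventID,Level,MitreTactics,RecordID,RuleTitle,Details
+2022-02-22 22:00:00.123 +09:00,DC01,Sec,4625,med,CredAccess,1001,Logon Failure,User: admin
+2022-02-22 22:00:05.456 +09:00,DC01,Sec,1102,crit,DefEvasion,1002,Security Log Cleared,n/a
+"""
+
+def filter_by_level(rows, wanted_levels):
+    """Keep only rows whose Level column is in wanted_levels."""
+    return [row for row in rows if row["Level"] in wanted_levels]
+
+with io.StringIO(SAMPLE_CSV) as f:  # replace with open("results.csv", newline="")
+    rows = list(csv.DictReader(f))
+
+critical_alerts = filter_by_level(rows, {"crit"})
+for row in critical_alerts:
+    print(row["Timestamp"], row["Computer"], row["RuleTitle"])
+```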
+
# Features
* Cross-platform support: Windows, Linux, macOS.
@@ -155,15 +173,11 @@ You can learn how to import CSV files into Elastic Stack [here](doc/ElasticStack
* Create a list of unique pivot keywords to quickly identify abnormal users, hostnames, processes, etc... as well as correlate events.
* Output all fields for more thorough investigations.
* Successful and failed logon summary.
-
-# Planned Features
-
-* Enterprise-wide hunting on all endpoints.
-* MITRE ATT&CK heatmap generation.
+* Enterprise-wide threat hunting and DFIR on all endpoints with [Velociraptor](https://docs.velociraptor.app/).
# Downloads
-Please download the latest stable version of hayabusa with compiled binaries or the source code from the [Releases](https://github.com/Yamato-Security/hayabusa/releases) page.
+Please download the latest stable version of Hayabusa with compiled binaries or compile the source code from the [Releases](https://github.com/Yamato-Security/hayabusa/releases) page.
# Git cloning
@@ -180,7 +194,7 @@ Note: If you forget to use --recursive option, the `rules` folder, which is mana
You can sync the `rules` folder and get the latest Hayabusa rules with `git pull --recurse-submodules` or use the following command:
```bash
-hayabusa-1.3.2-win-x64.exe -u
+hayabusa-1.5.1-win-x64.exe -u
```
If the update fails, you may need to rename the `rules` folder and try again.
@@ -188,14 +202,13 @@ If the update fails, you may need to rename the `rules` folder and try again.
>> Caution: When updating, rules and config files in the `rules` folder are replaced with the latest rules and config files in the [hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules) repository.
>> Any changes you make to existing files will be overwritten, so we recommend that you make backups of any files that you edit before updating.
>> If you are performing level tuning with `--level-tuning`, please re-tune your rule files after each update.
->> If you add new rules inside of the `rules` folder, they will **not** be overwritten or deleted when updating.
+>> If you add **new** rules inside of the `rules` folder, they will **not** be overwritten or deleted when updating.
# Advanced: Compiling From Source (Optional)
If you have Rust installed, you can compile from source with the following command:
```bash
-cargo clean
cargo build --release
```
@@ -207,7 +220,7 @@ Be sure to periodically update Rust with:
rustup update stable
```
-The compiled binary will be outputted in the `target/release` folder.
+The compiled binary will be outputted in the `./target/release` folder.
## Updating Rust Packages
@@ -254,31 +267,52 @@ Fedora-based distros:
sudo yum install openssl-devel
```
+## Cross-compiling Linux MUSL Binaries
+
+On a Linux OS, first install the target.
+
+```bash
+rustup install stable-x86_64-unknown-linux-musl
+rustup target add x86_64-unknown-linux-musl
+```
+
+Compile with:
+
+```bash
+cargo build --release --target=x86_64-unknown-linux-musl
+```
+
+The MUSL binary will be created in the `./target/x86_64-unknown-linux-musl/release/` directory.
+MUSL binaries are about 15% slower than the GNU binaries.
+
# Running Hayabusa
-## Caution: Anti-Virus/EDR Warnings
+## Caution: Anti-Virus/EDR Warnings and Slow Runtimes
You may receive an alert from anti-virus or EDR products when trying to run hayabusa or even just when downloading the `.yml` rules as there will be keywords like `mimikatz` and suspicious PowerShell commands in the detection signature.
These are false positives, so you will need to configure exclusions in your security products to allow Hayabusa to run.
If you are worried about malware or supply chain attacks, please check the hayabusa source code and compile the binaries yourself.
+You may experience slow runtimes, especially on the first run after a reboot, due to Windows Defender's real-time protection. You can avoid this by temporarily turning real-time protection off or by adding an exclusion for the Hayabusa runtime directory. (Please weigh the security risks before doing either.)
+
## Windows
-In Command Prompt or Windows Terminal, just run the 32-bit or 64-bit Windows binary from the hayabusa root directory.
-Example: `hayabusa-1.3.2-windows-x64.exe`
+In Command Prompt, PowerShell, or Windows Terminal, just run the appropriate 32-bit or 64-bit Windows binary.
+
+Example: `hayabusa-1.5.1-win-x64.exe`
## Linux
You first need to make the binary executable.
```bash
-chmod +x ./hayabusa-1.3.2-linux-x64-gnu
+chmod +x ./hayabusa-1.5.1-linux-x64-gnu
```
Then run it from the Hayabusa root directory:
```bash
-./hayabusa-1.3.2-linux-x64-gnu
+./hayabusa-1.5.1-linux-x64-gnu
```
## macOS
@@ -286,159 +320,186 @@ Then run it from the Hayabusa root directory:
From Terminal or iTerm2, you first need to make the binary executable.
```bash
-chmod +x ./hayabusa-1.3.2-mac-intel
+chmod +x ./hayabusa-1.5.1-mac-intel
```
Then, try to run it from the Hayabusa root directory:
```bash
-./hayabusa-1.3.2-mac-intel
+./hayabusa-1.5.1-mac-intel
```
On the latest version of macOS, you may receive the following security error when you try to run it:
-
+
Click "Cancel" and then from System Preferences, open "Security & Privacy" and from the General tab, click "Allow Anyway".
-
+
After that, try to run it again.
```bash
-./hayabusa-1.3.2-mac-intel
+./hayabusa-1.5.1-mac-intel
```
The following warning will pop up, so please click "Open".
-
+
You should now be able to run hayabusa.
# Usage
+## Main commands
+
+* default: Create a fast forensics timeline.
+* `--level-tuning`: Custom tune the alerts' `level`.
+* `-L, --logon-summary`: Print a summary of logon events.
+* `-p, --pivot-keywords-list`: Print a list of suspicious keywords to pivot on.
+* `-s, --statistics`: Print metrics of the count and percentage of events based on Event ID.
+* `--set-default-profile`: Change the default profile.
+* `-u, --update-rules`: Sync the rules to the latest rules in the [hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules) GitHub repository.
## Command Line Options
```
USAGE:
- hayabusa.exe -f file.evtx [OPTIONS] / hayabusa.exe -d evtx-directory [OPTIONS]
+ hayabusa.exe [OTHER-ACTIONS] [OPTIONS]
-OPTIONS:
- --European-time Output timestamp in European time format (ex: 22-02-2022 22:00:00.123 +02:00)
- --RFC-2822 Output timestamp in RFC 2822 format (ex: Fri, 22 Feb 2022 22:00:00 -0600)
- --RFC-3339 Output timestamp in RFC 3339 format (ex: 2022-02-22 22:00:00.123456-06:00)
- --US-military-time Output timestamp in US military time format (ex: 02-22-2022 22:00:00.123 -06:00)
- --US-time Output timestamp in US time format (ex: 02-22-2022 10:00:00.123 PM -06:00)
- --target-file-ext ... Specify additional target file extensions (ex: evtx_data) (ex: evtx1 evtx2)
- --all-tags Output all tags when saving to a CSV file
- -c, --config Specify custom rule config folder (default: ./rules/config)
- --contributors Print the list of contributors
- -d, --directory Directory of multiple .evtx files
- -D, --enable-deprecated-rules Enable rules marked as deprecated
- --end-timeline End time of the event logs to load (ex: "2022-02-22 23:59:59 +09:00")
- -f, --filepath File path to one .evtx file
- -F, --full-data Print all field information
- -h, --help Print help information
- -l, --live-analysis Analyze the local C:\Windows\System32\winevt\Logs folder
- -L, --logon-summary Print a summary of successful and failed logons
- --level-tuning Tune alert levels (default: ./rules/config/level_tuning.txt)
- -m, --min-level Minimum level for rules (default: informational)
- -n, --enable-noisy-rules Enable rules marked as noisy
- --no-color Disable color output
- -o, --output Save the timeline in CSV format (ex: results.csv)
- -p, --pivot-keywords-list Create a list of pivot keywords
- -q, --quiet Quiet mode: do not display the launch banner
- -Q, --quiet-errors Quiet errors mode: do not save error logs
- -r, --rules Specify a rule directory or file (default: ./rules)
- -R, --hide-record-ID Do not display EventRecordID numbers
- -s, --statistics Print statistics of event IDs
- --start-timeline Start time of the event logs to load (ex: "2020-02-22 00:00:00 +09:00")
- -t, --thread-number Thread number (default: optimal number for performance)
- -u, --update-rules Update to the latest rules in the hayabusa-rules github repository
- -U, --UTC Output time in UTC format (default: local time)
- -v, --verbose Output verbose information
- -V, --visualize-timeline Output event frequency timeline
- --version Print version information
+INPUT:
+ -d, --directory Directory of multiple .evtx files
+ -f, --file File path to one .evtx file
+ -l, --live-analysis Analyze the local C:\Windows\System32\winevt\Logs folder
+
+ADVANCED:
+ -c, --rules-config Specify custom rule config directory (default: ./rules/config)
+ -Q, --quiet-errors Quiet errors mode: do not save error logs
+ -r, --rules Specify a custom rule directory or file (default: ./rules)
+ -t, --thread-number Thread number (default: optimal number for performance)
+ --target-file-ext ... Specify additional target file extensions (ex: evtx_data) (ex: evtx1 evtx2)
+
+OUTPUT:
+ -o, --output Save the timeline in CSV format (ex: results.csv)
+ -P, --profile Specify output profile (minimal, standard, verbose, verbose-all-field-info, verbose-details-and-all-field-info)
+
+DISPLAY-SETTINGS:
+ --no-color Disable color output
+ --no-summary Do not display result summary
+ -q, --quiet Quiet mode: do not display the launch banner
+ -v, --verbose Output verbose information
+ -V, --visualize-timeline Output event frequency timeline
+
+FILTERING:
+ -D, --deep-scan Disable event ID filter to scan all events (slower)
+ --enable-deprecated-rules Enable rules marked as deprecated
+ --exclude-status ... Ignore rules according to status (ex: experimental) (ex: stable test)
+ -m, --min-level Minimum level for rules (default: informational)
+ -n, --enable-noisy-rules Enable rules marked as noisy
+ --timeline-end End time of the event logs to load (ex: "2022-02-22 23:59:59 +09:00")
+ --timeline-start Start time of the event logs to load (ex: "2020-02-22 00:00:00 +09:00")
+
+OTHER-ACTIONS:
+ --contributors Print the list of contributors
+ -L, --logon-summary Print a summary of successful and failed logons
+ --level-tuning [] Tune alert levels (default: ./rules/config/level_tuning.txt)
+ -p, --pivot-keywords-list Create a list of pivot keywords
+ -s, --statistics Print statistics of event IDs
+ --set-default-profile Set default output profile
+ -u, --update-rules Update to the latest rules in the hayabusa-rules github repository
+
+TIME-FORMAT:
+ --European-time Output timestamp in European time format (ex: 22-02-2022 22:00:00.123 +02:00)
+ --RFC-2822 Output timestamp in RFC 2822 format (ex: Fri, 22 Feb 2022 22:00:00 -0600)
+ --RFC-3339 Output timestamp in RFC 3339 format (ex: 2022-02-22 22:00:00.123456-06:00)
+ --US-military-time Output timestamp in US military time format (ex: 02-22-2022 22:00:00.123 -06:00)
+ --US-time Output timestamp in US time format (ex: 02-22-2022 10:00:00.123 PM -06:00)
+ -U, --UTC Output time in UTC format (default: local time)
```
## Usage Examples
-* Run hayabusa against one Windows event log file:
+* Run Hayabusa against one Windows event log file with the default `standard` profile:
```bash
-hayabusa-1.3.2-win-x64.exe -f eventlog.evtx
+hayabusa-1.5.1-win-x64.exe -f eventlog.evtx
```
-* Run hayabusa against the sample-evtx directory with multiple Windows event log files:
+* Run Hayabusa against the sample-evtx directory containing multiple Windows event log files, using the `verbose` profile:
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -P verbose
```
-* Export to a single CSV file for further analysis with excel, timeline explorer, elastic stack, etc... and include all field information:
+* Export to a single CSV file for further analysis with Excel, Timeline Explorer, Elastic Stack, etc... and include all field information (Warning: your output file size will become much larger with the `verbose-details-and-all-field-info` profile!):
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -o results.csv -F
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -o results.csv -P verbose-details-and-all-field-info
```
* Only run hayabusa rules (the default is to run all the rules in `-r .\rules`):
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa -o results.csv
```
* Only run hayabusa rules for logs that are enabled by default on Windows:
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default -o results.csv
```
* Only run hayabusa rules for sysmon logs:
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\sysmon -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\sysmon -o results.csv
```
* Only run sigma rules:
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\sigma -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\sigma -o results.csv
```
* Enable deprecated rules (those with `status` marked as `deprecated`) and noisy rules (those whose rule ID is listed in `.\rules\config\noisy_rules.txt`):
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx --enable-noisy-rules --enable-deprecated-rules -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx --enable-noisy-rules --enable-deprecated-rules -o results.csv
```
* Only run rules to analyze logons and output in the UTC timezone:
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default\events\Security\Logons -U -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default\events\Security\Logons -U -o results.csv
```
* Run on a live Windows machine (requires Administrator privileges) and only detect alerts (potentially malicious behavior):
```bash
-hayabusa-1.3.2-win-x64.exe -l -m low
+hayabusa-1.5.1-win-x64.exe -l -m low
```
* Create a list of pivot keywords from critical alerts and save the results. (Results will be saved to `keywords-Ip Addresses.txt`, `keywords-Users.txt`, etc...):
```bash
-hayabusa-1.3.2-win-x64.exe -l -m critical -p -o keywords
+hayabusa-1.5.1-win-x64.exe -l -m critical -p -o keywords
```
* Print Event ID statistics:
```bash
-hayabusa-1.3.2-win-x64.exe -f Security.evtx -s
+hayabusa-1.5.1-win-x64.exe -f Security.evtx -s
+```
+
+* Print logon summary:
+
+```bash
+hayabusa-1.5.1-win-x64.exe -L -f Security.evtx
```
* Print verbose information (useful for determining which files take long to process, parsing errors, etc...):
```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -v
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -v
```
* Verbose output example:
@@ -456,13 +517,19 @@ Checking target evtx FilePath: "./hayabusa-sample-evtx/YamatoSecurity/T1218.004_
5 / 509 [=>------------------------------------------------------------------------------------------------------------------------------------------] 0.98 % 1s
```
+* Output to a CSV format compatible with importing into [Timesketch](https://timesketch.org/):
+
+```bash
+hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U
+```
+
* Quiet error mode:
By default, hayabusa will save error messages to error log files.
If you do not want to save error messages, please add `-Q`.
## Pivot Keyword Generator
-You can use the `-p` or `--pivot-keywords-list` option to create a list of unique pivot keywords to quickly identify abnormal users, hostnames, processes, etc... as well as correlate events. You can customize what keywords you want to search for by editing `config/pivot_keywords.txt`.
+You can use the `-p` or `--pivot-keywords-list` option to create a list of unique pivot keywords to quickly identify abnormal users, hostnames, processes, etc... as well as correlate events. You can customize what keywords you want to search for by editing `./config/pivot_keywords.txt`.
This is the default setting:
```
@@ -493,28 +560,84 @@ You can download the sample evtx files to a new `hayabusa-sample-evtx` sub-direc
git clone https://github.com/Yamato-Security/hayabusa-sample-evtx.git
```
-> Note: You need to run the binary from the Hayabusa root directory.
-
# Hayabusa Output
-When hayabusa output is being displayed to the screen (the default), it will display the following information:
+## Profiles
-* `Timestamp`: Default is `YYYY-MM-DD HH:mm:ss.sss +hh:mm` format. This comes from the `` field in the event log. The default timezone will be the local timezone but you can change the timezone to UTC with the `--utc` option.
-* `Computer`: This comes from the `` field in the event log.
-* `Channel`: The name of log. This comes from the `` field in the event log.
-* `Event ID`: This comes from the `` field in the event log.
-* `Level`: This comes from the `level` field in the YML detection rule. (`informational`, `low`, `medium`, `high`, `critical`) By default, all level alerts will be displayed but you can set the minimum level with `-m`. For example, you can set `-m high`) in order to only scan for and display high and critical alerts.
-* `RecordID`: This comes from the `` field in the event log. You can hidde this output with the `-R` or `--hide-record-id` option.
-* `Title`: This comes from the `title` field in the YML detection rule.
-* `Details`: This comes from the `details` field in the YML detection rule, however, only hayabusa rules have this field. This field gives extra information about the alert or event and can extract useful data from the fields in event logs. For example, usernames, command line information, process information, etc... When a placeholder points to a field that does not exist or there is an incorrect alias mapping, it will be outputted as `n/a` (not available). If the `details` field is not specified (i.e. sigma rules), default `details` messages to extract fields defined in `./rules/config/default_details.txt` will be outputted. You can add more default `details` messages by adding the `Provider Name`, `EventID` and `details` message you want to output in `default_details.txt`.
+Hayabusa has 6 pre-defined profiles to use in `config/profiles.yaml`:
-The following additional columns will be added to the output when saving to a CSV file:
+1. `minimal`
+2. `standard` (default)
+3. `verbose`
+4. `verbose-all-field-info`
+5. `verbose-details-and-all-field-info`
+6. `timesketch`
-* `MitreAttack`: MITRE ATT&CK tactics.
-* `Rule Path`: The path to the detection rule that generated the alert or event.
-* `File Path`: The path to the evtx file that caused the alert or event.
+You can easily customize or add your own profiles by editing this file.
+You can also easily change the default profile with `--set-default-profile <profile>`.
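+
+For example, a custom profile could be added by appending an entry like the following to `config/profiles.yaml`. This is only a rough sketch: the profile name and column set here are made up, so mirror the structure of the existing entries in the bundled file rather than this exact snippet.
+
+```yaml
+# Hypothetical custom profile (output column header: field alias).
+# Mirror the layout of the built-in profiles in config/profiles.yaml.
+my-triage:
+    Timestamp: '%Timestamp%'
+    Computer: '%Computer%'
+    EventID: '%EventID%'
+    Level: '%Level%'
+    RuleTitle: '%RuleTitle%'
+    Details: '%Details%'
+```
+
+You could then select it with `-P my-triage` or make it the default with `--set-default-profile my-triage`.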
-If you add the `-F` or `--full-data` option, a `RecordInformation` column with all field information will also be added.
+### 1. `minimal` profile output
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%RuleTitle%`, `%Details%`
+
+### 2. `standard` profile output
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%RecordID%`, `%RuleTitle%`, `%Details%`
+
+### 3. `verbose` profile output
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`
+
+### 4. `verbose-all-field-info` profile output
+
+Instead of outputting the minimal `details` information, all field information in the `EventData` section will be outputted.
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%AllFieldInfo%`, `%RuleFile%`, `%EvtxFile%`
+
+### 5. `verbose-details-and-all-field-info` profile output
+
+`verbose` profile plus all field information. (Warning: this will usually double the output file size!)
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`, `%AllFieldInfo%`
+
+### 6. `timesketch` profile output
+
+The `verbose` profile in a format compatible with importing into [Timesketch](https://timesketch.org/).
+
+`%Timestamp%`, `hayabusa`, `%RuleTitle%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`
+
+### Profile Comparison
+
+The following benchmarks were conducted on a 2018 MacBook Pro against 7.5GB of evtx data.
+
+| Profile | Processing Time | Output Filesize |
+| :---: | :---: | :---: |
+| minimal | 16 minutes 18 seconds | 690 MB |
+| standard | 16 minutes 23 seconds | 710 MB |
+| verbose | 17 minutes | 990 MB |
+| timesketch | 17 minutes | 1015 MB |
+| verbose-all-field-info | 16 minutes 50 seconds | 1.6 GB |
+| verbose-details-and-all-field-info | 17 minutes 12 seconds | 2.1 GB |
+
+### Profile Field Aliases
+
+| Alias name | Hayabusa output information |
+| :--- | :--- |
+|%Timestamp% | Default is `YYYY-MM-DD HH:mm:ss.sss +hh:mm` format. The `<Event><System><TimeCreated SystemTime>` field in the event log. The default timezone will be the local timezone but you can change the timezone to UTC with the `--UTC` option. |
+|%Computer% | The `<Event><System><Computer>` field. |
+|%Channel% | The name of the log. The `<Event><System><Channel>` field. |
+|%EventID% | The `<Event><System><EventID>` field. |
+|%Level% | The `level` field in the YML detection rule. (`informational`, `low`, `medium`, `high`, `critical`) |
+|%MitreTactics% | MITRE ATT&CK [tactics](https://attack.mitre.org/tactics/enterprise/) (Ex: Initial Access, Lateral Movement, etc...). |
+|%MitreTags% | MITRE ATT&CK Group ID, Technique ID and Software ID. |
+|%OtherTags% | Any keyword in the `tags` field in a YML detection rule which is not included in `MitreTactics` or `MitreTags`. |
+|%RecordID% | The event record ID from the `<Event><System><EventRecordID>` field. |
+|%RuleTitle% | The `title` field in the YML detection rule. |
+|%Details% | The `details` field in the YML detection rule. (Only hayabusa rules have this field.) It gives extra information about the alert or event and can extract useful data from the event log fields, such as usernames, command line information, process information, etc... When a placeholder points to a field that does not exist or an alias mapping is incorrect, it will be outputted as `n/a` (not available). If the `details` field is not specified (i.e. sigma rules), the default `details` messages defined in `./rules/config/default_details.txt` will be outputted. You can add more default `details` messages by adding the `Provider Name`, `EventID` and the `details` message you want to output to `default_details.txt`. When no `details` field is defined in either the rule or `default_details.txt`, all fields will be outputted to the `Details` column. |
+|%AllFieldInfo% | All field information. |
+|%RuleFile% | The filename of the detection rule that generated the alert or event. |
+|%EvtxFile% | The evtx filename that caused the alert or event. |
+
+You can use these aliases in your output profiles, as well as define other [event key aliases](https://github.com/Yamato-Security/hayabusa-rules/blob/main/README.md#eventkey-aliases) to output other fields.
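+
+For example, a hypothetical custom profile (the profile name and column selection here are just for illustration) could be added to `./config/profiles.yaml` like this:
+
+```yaml
+my-timeline:
+    Timestamp: "%Timestamp%"
+    Computer: "%Computer%"
+    RuleTitle: "%RuleTitle%"
+    Details: "%Details%"
+```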
## Level Abbreviations
@@ -529,7 +652,7 @@ In order to save space, we use the following abbreviations when displaying the al
## MITRE ATT&CK Tactics Abbreviations
In order to save space, we use the following abbreviations when displaying MITRE ATT&CK tactic tags.
-You can freely edit these abbreviations in the `config/output_tag.txt` configuration file.
+You can freely edit these abbreviations in the `./config/output_tag.txt` configuration file.
If you want to output all the tags defined in a rule, please specify the `--all-tags` option.
* `Recon` : Reconnaissance
@@ -550,7 +673,7 @@ If you want to output all the tags defined in a rule, please specify the `--all-
## Channel Abbreviations
In order to save space, we use the following abbreviations when displaying Channel.
-You can freely edit these abbreviations in the `config/channel_abbreviations.txt` configuration file.
+You can freely edit these abbreviations in the `./rules/config/channel_abbreviations.txt` configuration file.
* `App` : `Application`
* `AppLocker` : `Microsoft-Windows-AppLocker/*`
@@ -592,16 +715,18 @@ The alerts will be outputted in color based on the alert `level`.
You can change the default colors in the config file at `./config/level_color.txt` in the format of `level,(RGB 6-digit ColorHex)`.
If you want to disable color output, you can use `--no-color` option.
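
For illustration, a `./config/level_color.txt` could look like the following. (The hex values below are placeholders, not the shipped defaults.)

```
critical,FF0000
high,FF8800
medium,FFFF00
low,00FF00
informational,00FFFF
```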
-## Event Fequency Timeline
+## Results Summary
+
+### Event Frequency Timeline
If you add `-V` or `--visualize-timeline` option, the Event Frequency Timeline feature displays a sparkline frequency timeline of detected events.
Note: There needs to be more than 5 events. Also, the characters will not render correctly on the default Command Prompt or PowerShell Prompt, so please use a terminal like Windows Terminal, iTerm2, etc...
-## Dates with most total detections
+### Dates with most total detections
A summary of the dates with the most total detections categorized by level (`critical`, `high`, etc...).
-## Top 5 computers with most unique detections
+### Top 5 computers with most unique detections
The top 5 computers with the most unique detections categorized by level (`critical`, `high`, etc...).
@@ -651,15 +776,15 @@ Hayabusa rules are designed solely for Windows event log analysis and have the f
Like firewalls and IDSes, any signature-based tool will require some tuning to fit your environment so you may need to permanently or temporarily exclude certain rules.
-You can add a rule ID (Example: `4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6`) to `rules/config/exclude_rules.txt` in order to ignore any rule that you do not need or cannot be used.
+You can add a rule ID (Example: `4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6`) to `./rules/config/exclude_rules.txt` in order to ignore any rule that you do not need or that cannot be used.
-You can also add a rule ID to `rules/config/noisy_rules.txt` in order to ignore the rule by default but still be able to use the rule with the `-n` or `--enable-noisy-rules` option.
+You can also add a rule ID to `./rules/config/noisy_rules.txt` in order to ignore the rule by default but still be able to use the rule with the `-n` or `--enable-noisy-rules` option.
## Detection Level Tuning
Hayabusa and Sigma rule authors will determine the risk level of the alert when writing their rules.
However, the actual risk level will differ between environments.
-You can tune the risk level of the rules by adding them to `./rules/config/level_tuning.txt` and executing `hayabusa-1.3.2-win-x64.exe --level-tuning` which will update the `level` line in the rule file.
+You can tune the risk level of the rules by adding them to `./rules/config/level_tuning.txt` and executing `hayabusa-1.5.1-win-x64.exe --level-tuning` which will update the `level` line in the rule file.
Please note that the rule file will be updated directly.
`./rules/config/level_tuning.txt` sample line:
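
For illustration, each line pairs a rule `id` with the new level to assign. (The values below are placeholders.)

```
00000000-0000-0000-0000-000000000000,informational
```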
@@ -673,12 +798,9 @@ In this case, the risk level of the rule with an `id` of `00000000-0000-0000-000
## Event ID Filtering
-You can filter on event IDs by placing event ID numbers in `config/target_eventids.txt`.
-This will increase performance so it is recommended if you only need to search for certain IDs.
-
-We have provided a sample ID filter list at [`config/target_eventids_sample.txt`](https://github.com/Yamato-Security/hayabusa/blob/main/config/target_eventids_sample.txt) created from the `EventID` fields in all of the rules as well as IDs seen in actual results.
-
-Please use this list if you want the best performance but be aware that there is a slight possibility for missing events (false negatives).
+By default, events are filtered by ID to improve performance by ignoring events that have no detection rules.
+The IDs defined in `./rules/config/target_event_IDs.txt` will be scanned.
+If you want to scan all events, please use the `-D, --deep-scan` option.
# Other Windows Event Log Analyzers and Related Resources
@@ -686,7 +808,7 @@ There is no "one tool to rule them all" and we have found that each has its own
* [APT-Hunter](https://github.com/ahmedkhlief/APT-Hunter) - Attack detection tool written in Python.
* [Awesome Event IDs](https://github.com/stuhli/awesome-event-ids) - Collection of Event ID resources useful for Digital Forensics and Incident Response
-* [Chainsaw](https://github.com/countercept/chainsaw) - A similar sigma-based attack detection tool written in Rust.
+* [Chainsaw](https://github.com/countercept/chainsaw) - Another sigma-based attack detection tool written in Rust.
* [DeepBlueCLI](https://github.com/sans-blue-team/DeepBlueCLI) - Attack detection tool written in Powershell by [Eric Conrad](https://twitter.com/eric_conrad).
* [Epagneul](https://github.com/jurelou/epagneul) - Graph visualization for Windows event logs.
* [EventList](https://github.com/miriamxyra/EventList/) - Map security baseline event IDs to MITRE ATT&CK by [Miriam Wiesner](https://github.com/miriamxyra).
@@ -726,6 +848,7 @@ To create the most forensic evidence and detect with the highest accuracy, you n
## English
+* 2022/06/19 [Velociraptor Walkthrough and Hayabusa Integration](https://www.youtube.com/watch?v=Q1IoGX--814) by [Eric Capuano](https://twitter.com/eric_capuano)
* 2022/01/24 [Graphing Hayabusa results in neo4j](https://www.youtube.com/watch?v=7sQqz2ek-ko) by Matthew Seyer ([@forensic_matt](https://twitter.com/forensic_matt))
## Japanese
@@ -755,4 +878,4 @@ Hayabusa is released under [GPLv3](https://www.gnu.org/licenses/gpl-3.0.en.html)
# Twitter
-You can recieve the latest news about Hayabusa, rule updates, other Yamato Security tools, etc... by following us on Twitter at [@SecurityYamato](https://twitter.com/SecurityYamato).
\ No newline at end of file
+You can receive the latest news about Hayabusa, rule updates, other Yamato Security tools, etc... by following us on Twitter at [@SecurityYamato](https://twitter.com/SecurityYamato).
diff --git a/build.rs b/build.rs
new file mode 100644
index 00000000..7c051a1c
--- /dev/null
+++ b/build.rs
@@ -0,0 +1,4 @@
+fn main() {
+ #[cfg(target_os = "windows")]
+ static_vcruntime::metabuild();
+}
diff --git a/config/channel_abbreviations.txt b/config/channel_abbreviations.txt
deleted file mode 100644
index 3ef8affd..00000000
--- a/config/channel_abbreviations.txt
+++ /dev/null
@@ -1,33 +0,0 @@
-Channel,Abbreviation
-Application,App
-DNS Server,DNS-Svr
-Key Management Service,KeyMgtSvc
-Microsoft-ServiceBus-Client,SvcBusCli
-Microsoft-Windows-CodeIntegrity/Operational,CodeInteg
-Microsoft-Windows-LDAP-Client/Debug,LDAP-Cli
-Microsoft-Windows-AppLocker/MSI and Script,AppLocker
-Microsoft-Windows-AppLocker/EXE and DLL,AppLocker
-Microsoft-Windows-AppLocker/Packaged app-Deployment,AppLocker
-Microsoft-Windows-AppLocker/Packaged app-Execution,AppLocker
-Microsoft-Windows-Bits-Client/Operational,BitsCli
-Microsoft-Windows-DHCP-Server/Operational,DHCP-Svr
-Microsoft-Windows-DriverFrameworks-UserMode/Operational,DvrFmwk
-Microsoft-Windows-NTLM/Operational,NTLM
-Microsoft-Windows-Security-Mitigations/KernelMode,SecMitig
-Microsoft-Windows-Security-Mitigations/UserMode,SecMitig
-Microsoft-Windows-SmbClient/Security,SmbCliSec
-Microsoft-Windows-Sysmon/Operational,Sysmon
-Microsoft-Windows-TaskScheduler/Operational,TaskSch
-Microsoft-Windows-TerminalServices-RDPClient/Operational,RDP-Client
-Microsoft-Windows-PrintService/Admin,PrintAdm
-Microsoft-Windows-PrintService/Operational,PrintOp
-Microsoft-Windows-PowerShell/Operational,PwSh
-Microsoft-Windows-Windows Defender/Operational,Defender
-Microsoft-Windows-Windows Firewall With Advanced Security/Firewall,Firewall
-Microsoft-Windows-WinRM/Operational,WinRM
-Microsoft-Windows-WMI-Activity/Operational,WMI
-MSExchange Management,Exchange
-OpenSSH/Operational,OpenSSH
-Security,Sec
-System,Sys
-Windows PowerShell,PwShClassic
\ No newline at end of file
diff --git a/config/default_profile.yaml b/config/default_profile.yaml
new file mode 100644
index 00000000..394b6546
--- /dev/null
+++ b/config/default_profile.yaml
@@ -0,0 +1,10 @@
+---
+Timestamp: "%Timestamp%"
+Computer: "%Computer%"
+Channel: "%Channel%"
+EventID: "%EventID%"
+Level: "%Level%"
+MitreTactics: "%MitreTactics%"
+RecordID: "%RecordID%"
+RuleTitle: "%RuleTitle%"
+Details: "%Details%"
\ No newline at end of file
diff --git a/config/output_tag.txt b/config/mitre_tactics.txt
similarity index 100%
rename from config/output_tag.txt
rename to config/mitre_tactics.txt
diff --git a/config/profiles.yaml b/config/profiles.yaml
new file mode 100644
index 00000000..de51b7f8
--- /dev/null
+++ b/config/profiles.yaml
@@ -0,0 +1,87 @@
+#Standard profile minus MITRE ATT&CK Tactics and Record ID.
+minimal:
+ Timestamp: "%Timestamp%"
+ Computer: "%Computer%"
+ Channel: "%Channel%"
+ EventID: "%EventID%"
+ Level: "%Level%"
+ RuleTitle: "%RuleTitle%"
+ Details: "%Details%"
+
+standard:
+ Timestamp: "%Timestamp%"
+ Computer: "%Computer%"
+ Channel: "%Channel%"
+ EventID: "%EventID%"
+ Level: "%Level%"
+ MitreTactics: "%MitreTactics%"
+ RecordID: "%RecordID%"
+ RuleTitle: "%RuleTitle%"
+ Details: "%Details%"
+
+#Standard profile plus MitreTags(MITRE techniques, software and groups), rule filename and EVTX filename.
+verbose:
+ Timestamp: "%Timestamp%"
+ Computer: "%Computer%"
+ Channel: "%Channel%"
+ EventID: "%EventID%"
+ Level: "%Level%"
+ MitreTactics: "%MitreTactics%"
+ MitreTags: "%MitreTags%"
+ OtherTags: "%OtherTags%"
+ RecordID: "%RecordID%"
+ RuleTitle: "%RuleTitle%"
+ Details: "%Details%"
+ RuleFile: "%RuleFile%"
+ EvtxFile: "%EvtxFile%"
+
+#Verbose profile with all field information instead of the minimal fields defined in Details.
+verbose-all-field-info:
+ Timestamp: "%Timestamp%"
+ Computer: "%Computer%"
+ Channel: "%Channel%"
+ EventID: "%EventID%"
+ Level: "%Level%"
+ MitreTactics: "%MitreTactics%"
+ MitreTags: "%MitreTags%"
+ OtherTags: "%OtherTags%"
+ RecordID: "%RecordID%"
+ RuleTitle: "%RuleTitle%"
+ AllFieldInfo: "%RecordInformation%"
+ RuleFile: "%RuleFile%"
+ EvtxFile: "%EvtxFile%"
+
+#Verbose profile plus all field information. (Warning: this will more than double the output file size!)
+verbose-details-and-all-field-info:
+ Timestamp: "%Timestamp%"
+ Computer: "%Computer%"
+ Channel: "%Channel%"
+ EventID: "%EventID%"
+ Level: "%Level%"
+ MitreTactics: "%MitreTactics%"
+ MitreTags: "%MitreTags%"
+ OtherTags: "%OtherTags%"
+ RecordID: "%RecordID%"
+ RuleTitle: "%RuleTitle%"
+ Details: "%Details%"
+ RuleFile: "%RuleFile%"
+ EvtxFile: "%EvtxFile%"
+ AllFieldInfo: "%RecordInformation%"
+
+#Output that is compatible to import the CSV into Timesketch
+timesketch:
+ datetime: "%Timestamp%"
+ timestamp_desc: "hayabusa"
+ message: "%RuleTitle%"
+ Computer: "%Computer%"
+ Channel: "%Channel%"
+ EventID: "%EventID%"
+ Level: "%Level%"
+ MitreTactics: "%MitreTactics%"
+ MitreTags: "%MitreTags%"
+ OtherTags: "%OtherTags%"
+ RecordID: "%RecordID%"
+ Details: "%Details%"
+ RuleFile: "%RuleFile%"
+ EvtxFile: "%EvtxFile%"
+ AllFieldInfo: "%RecordInformation%"
\ No newline at end of file
diff --git a/config/statistics_event_info.txt b/config/statistics_event_info.txt
deleted file mode 100644
index 705aa564..00000000
--- a/config/statistics_event_info.txt
+++ /dev/null
@@ -1,496 +0,0 @@
-eventid,event_title
-6406,%1 registered to Windows Firewall to control filtering for the following: %2
-1,Process Creation.
-2,File Creation Timestamp Changed. (Possible Timestomping)
-3,Network Connection.
-4,Sysmon Service State Changed.
-5,Process Terminated.
-6,Driver Loaded.
-7,Image Loaded.
-8,Remote Thread Created. (Possible Code Injection)
-9,Raw Access Read.
-10,Process Access.
-11,File Creation or Overwrite.
-12,Registry Object Created/Deletion.
-13,Registry Value Set.
-14,Registry Key or Value Rename.
-15,Alternate Data Stream Created.
-16,Sysmon Service Configuration Changed.
-17,Named Pipe Created.
-18,Named Pipe Connection.
-19,WmiEventFilter Activity.
-20,WmiEventConsumer Activity.
-21,WmiEventConsumerToFilter Activity.
-22,DNS Query.
-23,Deleted File Archived.
-24,Clipboard Changed.
-25,Process Tampering. (Possible Process Hollowing or Herpaderping)
-26,File Deleted.
-27,KDC Encryption Type Configuration
-31,Windows Update Failed
-34,Windows Update Failed
-35,Windows Update Failed
-43,New Device Information
-81,Processing client request for operation CreateShell
-82,Entering the plugin for operation CreateShell with a ResourceURI
-104,Event Log was Cleared
-106,A task has been scheduled
-134,Sending response for operation CreateShell
-169,Creating WSMan Session (on Server)
-255,Sysmon Error.
-400,New Mass Storage Installation
-410,New Mass Storage Installation
-800,Summary of Software Activities
-903,New Application Installation
-904,New Application Installation
-905,Updated Application
-906,Updated Application
-907,Removed Application
-908,Removed Application
-1001,BSOD
-1005,Scan Failed
-1006,Detected Malware
-1008,Action on Malware Failed
-1009,Hotpatching Failed
-1010,Failed to remove item from quarantine
-1022,New MSI File Installed
-1033,New MSI File Installed
-1100,The event logging service has shut down
-1101,Audit events have been dropped by the transport.
-1102,The audit log was cleared
-1104,The security Log is now full
-1105,Event log automatic backup
-1108,The event logging service encountered an error
-1125,Group Policy: Internal Error
-1127,Group Policy: Generic Internal Error
-1129,Group Policy: Group Policy Application Failed due to Connectivity
-1149,User authentication succeeded
-2001,Failed to update signatures
-2003,Failed to update engine
-2004,Firewall Rule Add
-2004,Reverting to last known good set of signatures
-2005,Firewall Rule Change
-2006,Firewall Rule Deleted
-2009,Firewall Failed to load Group Policy
-2033,Firewall Rule Deleted
-3001,Code Integrity Check Warning
-3002,Code Integrity Check Warning
-3002,Real-Time Protection failed
-3003,Code Integrity Check Warning
-3004,Code Integrity Check Warning
-3010,Code Integrity Check Warning
-3023,Code Integrity Check Warning
-4103,Module logging. Executing Pipeline.
-4104,Script Block Logging.
-4105,CommandStart - Started
-4106,CommandStart - Stoppeed
-4608,Windows is starting up
-4609,Windows is shutting down
-4610,An authentication package has been loaded by the Local Security Authority
-4611,A trusted logon process has been registered with the Local Security Authority
-4612,"Internal resources allocated for the queuing of audit messages have been exhausted, leading to the loss of some audits."
-4614,A notification package has been loaded by the Security Account Manager.
-4615,Invalid use of LPC port
-4616,The system time was changed.
-4618,A monitored security event pattern has occurred
-4621,Administrator recovered system from CrashOnAuditFail
-4622,A security package has been loaded by the Local Security Authority.
-4624,Logon Success
-4625,Logon Failure
-4627,Group Membership Information
-4634,Account Logoff
-4646,IKE DoS-prevention mode started
-4647,User initiated logoff
-4648,Explicit Logon
-4649,A replay attack was detected
-4650,An IPsec Main Mode security association was established
-4651,An IPsec Main Mode security association was established
-4652,An IPsec Main Mode negotiation failed
-4653,An IPsec Main Mode negotiation failed
-4654,An IPsec Quick Mode negotiation failed
-4655,An IPsec Main Mode security association ended
-4656,A handle to an object was requested
-4657,A registry value was modified
-4658,The handle to an object was closed
-4659,A handle to an object was requested with intent to delete
-4660,An object was deleted
-4661,A handle to an object was requested
-4662,An operation was performed on an object
-4663,An attempt was made to access an object
-4664,An attempt was made to create a hard link
-4665,An attempt was made to create an application client context.
-4666,An application attempted an operation
-4667,An application client context was deleted
-4668,An application was initialized
-4670,Permissions on an object were changed
-4671,An application attempted to access a blocked ordinal through the TBS
-4672,Admin Logon
-4673,A privileged service was called
-4674,An operation was attempted on a privileged object
-4675,SIDs were filtered
-4685,The state of a transaction has changed
-4688,Process Creation.
-4689,A process has exited
-4690,An attempt was made to duplicate a handle to an object
-4691,Indirect access to an object was requested
-4692,Backup of data protection master key was attempted
-4693,Recovery of data protection master key was attempted
-4694,Protection of auditable protected data was attempted
-4695,Unprotection of auditable protected data was attempted
-4696,A primary token was assigned to process
-4697,A service was installed in the system
-4698,A scheduled task was created
-4699,A scheduled task was deleted
-4700,A scheduled task was enabled
-4701,A scheduled task was disabled
-4702,A scheduled task was updated
-4704,A user right was assigned
-4705,A user right was removed
-4706,A new trust was created to a domain
-4707,A trust to a domain was removed
-4709,IPsec Services was started
-4710,IPsec Services was disabled
-4711,PAStore Engine
-4712,IPsec Services encountered a potentially serious failure
-4713,Kerberos policy was changed
-4714,Encrypted data recovery policy was changed
-4715,The audit policy (SACL) on an object was changed
-4716,Trusted domain information was modified
-4717,System security access was granted to an account
-4718,System security access was removed from an account
-4719,System audit policy was changed
-4720,A user account was created
-4722,A user account was enabled
-4723,An attempt was made to change an account's password
-4724,An attempt was made to reset an accounts password
-4725,A user account was disabled
-4726,A user account was deleted
-4727,A security-enabled global group was created
-4728,A member was added to a security-enabled global group
-4729,A member was removed from a security-enabled global group
-4730,A security-enabled global group was deleted
-4731,A security-enabled local group was created
-4732,A member was added to a security-enabled local group
-4733,A member was removed from a security-enabled local group
-4734,A security-enabled local group was deleted
-4735,A security-enabled local group was changed
-4737,A security-enabled global group was changed
-4738,A user account was changed
-4739,Domain Policy was changed
-4740,A user account was locked out
-4741,A computer account was created
-4742,A computer account was changed
-4743,A computer account was deleted
-4744,A security-disabled local group was created
-4745,A security-disabled local group was changed
-4746,A member was added to a security-disabled local group
-4747,A member was removed from a security-disabled local group
-4748,A security-disabled local group was deleted
-4749,A security-disabled global group was created
-4750,A security-disabled global group was changed
-4751,A member was added to a security-disabled global group
-4752,A member was removed from a security-disabled global group
-4753,A security-disabled global group was deleted
-4754,A security-enabled universal group was created
-4755,A security-enabled universal group was changed
-4756,A member was added to a security-enabled universal group
-4757,A member was removed from a security-enabled universal group
-4758,A security-enabled universal group was deleted
-4759,A security-disabled universal group was created
-4760,A security-disabled universal group was changed
-4761,A member was added to a security-disabled universal group
-4762,A member was removed from a security-disabled universal group
-4763,A security-disabled universal group was deleted
-4764,A groups type was changed
-4765,SID History was added to an account
-4766,An attempt to add SID History to an account failed
-4767,A user account was unlocked
-4768,A Kerberos authentication ticket (TGT) was requested
-4769,A Kerberos service ticket was requested
-4770,A Kerberos service ticket was renewed
-4771,Kerberos pre-authentication failed
-4772,A Kerberos authentication ticket request failed
-4773,A Kerberos service ticket request failed
-4774,An account was mapped for logon
-4775,An account could not be mapped for logon
-4776,The domain controller attempted to validate the credentials for an account
-4777,The domain controller failed to validate the credentials for an account
-4778,A session was reconnected to a Window Station
-4779,A session was disconnected from a Window Station
-4780,The ACL was set on accounts which are members of administrators groups
-4781,The name of an account was changed
-4782,The password hash an account was accessed
-4783,A basic application group was created
-4784,A basic application group was changed
-4785,A member was added to a basic application group
-4786,A member was removed from a basic application group
-4787,A non-member was added to a basic application group
-4788,A non-member was removed from a basic application group..
-4789,A basic application group was deleted
-4790,An LDAP query group was created
-4791,A basic application group was changed
-4792,An LDAP query group was deleted
-4793,The Password Policy Checking API was called
-4794,An attempt was made to set the Directory Services Restore Mode administrator password
-4800,The workstation was locked
-4801,The workstation was unlocked
-4802,The screen saver was invoked
-4803,The screen saver was dismissed
-4816,RPC detected an integrity violation while decrypting an incoming message
-4817,Auditing settings on object were changed.
-4864,A namespace collision was detected
-4865,A trusted forest information entry was added
-4866,A trusted forest information entry was removed
-4867,A trusted forest information entry was modified
-4868,The certificate manager denied a pending certificate request
-4869,Certificate Services received a resubmitted certificate request
-4870,Certificate Services revoked a certificate
-4871,Certificate Services received a request to publish the certificate revocation list (CRL)
-4872,Certificate Services published the certificate revocation list (CRL)
-4873,A certificate request extension changed
-4874,One or more certificate request attributes changed.
-4875,Certificate Services received a request to shut down
-4876,Certificate Services backup started
-4877,Certificate Services backup completed
-4878,Certificate Services restore started
-4879,Certificate Services restore completed
-4880,Certificate Services started
-4881,Certificate Services stopped
-4882,The security permissions for Certificate Services changed
-4883,Certificate Services retrieved an archived key
-4884,Certificate Services imported a certificate into its database
-4885,The audit filter for Certificate Services changed
-4886,Certificate Services received a certificate request
-4887,Certificate Services approved a certificate request and issued a certificate
-4888,Certificate Services denied a certificate request
-4889,Certificate Services set the status of a certificate request to pending
-4890,The certificate manager settings for Certificate Services changed.
-4891,A configuration entry changed in Certificate Services
-4892,A property of Certificate Services changed
-4893,Certificate Services archived a key
-4894,Certificate Services imported and archived a key
-4895,Certificate Services published the CA certificate to Active Directory Domain Services
-4896,One or more rows have been deleted from the certificate database
-4897,Role separation enabled
-4898,Certificate Services loaded a template
-4899,A Certificate Services template was updated
-4900,Certificate Services template security was updated
-4902,The Per-user audit policy table was created
-4904,An attempt was made to register a security event source
-4905,An attempt was made to unregister a security event source
-4906,The CrashOnAuditFail value has changed
-4907,Auditing settings on object were changed
-4908,Special Groups Logon table modified
-4909,The local policy settings for the TBS were changed
-4910,The group policy settings for the TBS were changed
-4912,Per User Audit Policy was changed
-4928,An Active Directory replica source naming context was established
-4929,An Active Directory replica source naming context was removed
-4930,An Active Directory replica source naming context was modified
-4931,An Active Directory replica destination naming context was modified
-4932,Synchronization of a replica of an Active Directory naming context has begun
-4933,Synchronization of a replica of an Active Directory naming context has ended
-4934,Attributes of an Active Directory object were replicated
-4935,Replication failure begins
-4936,Replication failure ends
-4937,A lingering object was removed from a replica
-4944,The following policy was active when the Windows Firewall started
-4945,A rule was listed when the Windows Firewall started
-4946,A change has been made to Windows Firewall exception list. A rule was added
-4947,A change has been made to Windows Firewall exception list. A rule was modified
-4948,A change has been made to Windows Firewall exception list. A rule was deleted
-4949,Windows Firewall settings were restored to the default values
-4950,A Windows Firewall setting has changed
-4951,A rule has been ignored because its major version number was not recognized by Windows Firewall
-4952,Parts of a rule have been ignored because its minor version number was not recognized by Windows Firewall
-4953,A rule has been ignored by Windows Firewall because it could not parse the rule
-4954,Windows Firewall Group Policy settings has changed. The new settings have been applied
-4956,Windows Firewall has changed the active profile
-4957,Windows Firewall did not apply the following rule
-4958,Windows Firewall did not apply the following rule because the rule referred to items not configured on this computer
-4960,IPsec dropped an inbound packet that failed an integrity check
-4961,IPsec dropped an inbound packet that failed a replay check
-4962,IPsec dropped an inbound packet that failed a replay check
-4963,IPsec dropped an inbound clear text packet that should have been secured
-4964,Special groups have been assigned to a new logon
-4965,IPsec received a packet from a remote computer with an incorrect Security Parameter Index (SPI).
-4976,"During Main Mode negotiation, IPsec received an invalid negotiation packet."
-4977,"During Quick Mode negotiation, IPsec received an invalid negotiation packet."
-4978,"During Extended Mode negotiation, IPsec received an invalid negotiation packet."
-4979,IPsec Main Mode and Extended Mode security associations were established
-4980,IPsec Main Mode and Extended Mode security associations were established
-4981,IPsec Main Mode and Extended Mode security associations were established
-4982,IPsec Main Mode and Extended Mode security associations were established
-4983,An IPsec Extended Mode negotiation failed
-4984,An IPsec Extended Mode negotiation failed
-4985,The state of a transaction has changed
-5008,Unexpected Error
-5024,The Windows Firewall Service has started successfully
-5025,The Windows Firewall Service has been stopped
-5027,The Windows Firewall Service was unable to retrieve the security policy from the local storage
-5028,The Windows Firewall Service was unable to parse the new security policy.
-5029,The Windows Firewall Service failed to initialize the driver
-5030,The Windows Firewall Service failed to start
-5031,The Windows Firewall Service blocked an application from accepting incoming connections on the network.
-5032,Windows Firewall was unable to notify the user that it blocked an application from accepting incoming connections on the network
-5033,The Windows Firewall Driver has started successfully
-5034,The Windows Firewall Driver has been stopped
-5035,The Windows Firewall Driver failed to start
-5037,The Windows Firewall Driver detected critical runtime error. Terminating
-5038,Code integrity determined that the image hash of a file is not valid
-5039,A registry key was virtualized.
-5040,A change has been made to IPsec settings. An Authentication Set was added.
-5041,A change has been made to IPsec settings. An Authentication Set was modified
-5042,A change has been made to IPsec settings. An Authentication Set was deleted
-5043,A change has been made to IPsec settings. A Connection Security Rule was added
-5044,A change has been made to IPsec settings. A Connection Security Rule was modified
-5045,A change has been made to IPsec settings. A Connection Security Rule was deleted
-5046,A change has been made to IPsec settings. A Crypto Set was added
-5047,A change has been made to IPsec settings. A Crypto Set was modified
-5048,A change has been made to IPsec settings. A Crypto Set was deleted
-5049,An IPsec Security Association was deleted
-5050,An attempt to programmatically disable the Windows Firewall using a call to INetFwProfile
-5051,A file was virtualized
-5056,A cryptographic self test was performed
-5057,A cryptographic primitive operation failed
-5058,Key file operation
-5059,Key migration operation
-5060,Verification operation failed
-5061,Cryptographic operation
-5062,A kernel-mode cryptographic self test was performed
-5063,A cryptographic provider operation was attempted
-5064,A cryptographic context operation was attempted
-5065,A cryptographic context modification was attempted
-5066,A cryptographic function operation was attempted
-5067,A cryptographic function modification was attempted
-5068,A cryptographic function provider operation was attempted
-5069,A cryptographic function property operation was attempted
-5070,A cryptographic function property operation was attempted
-5120,OCSP Responder Service Started
-5121,OCSP Responder Service Stopped
-5122,A Configuration entry changed in the OCSP Responder Service
-5123,A configuration entry changed in the OCSP Responder Service
-5124,A security setting was updated on OCSP Responder Service
-5125,A request was submitted to OCSP Responder Service
-5126,Signing Certificate was automatically updated by the OCSP Responder Service
-5127,The OCSP Revocation Provider successfully updated the revocation information
-5136,A directory service object was modified
-5137,A directory service object was created
-5138,A directory service object was undeleted
-5139,A directory service object was moved
-5140,A network share object was accessed
-5141,A directory service object was deleted
-5142,A network share object was added.
-5143,A network share object was modified
-5144,A network share object was deleted.
-5145,A network share object was checked to see whether client can be granted desired access
-5148,The Windows Filtering Platform has detected a DoS attack and entered a defensive mode; packets associated with this attack will be discarded.
-5149,The DoS attack has subsided and normal processing is being resumed.
-5150,The Windows Filtering Platform has blocked a packet.
-5151,A more restrictive Windows Filtering Platform filter has blocked a packet.
-5152,The Windows Filtering Platform blocked a packet
-5153,A more restrictive Windows Filtering Platform filter has blocked a packet
-5154,The Windows Filtering Platform has permitted an application or service to listen on a port for incoming connections
-5155,The Windows Filtering Platform has blocked an application or service from listening on a port for incoming connections
-5156,The Windows Filtering Platform has allowed a connection
-5157,The Windows Filtering Platform has blocked a connection
-5158,The Windows Filtering Platform has permitted a bind to a local port
-5159,The Windows Filtering Platform has blocked a bind to a local port
-5168,Spn check for SMB/SMB2 fails.
-5376,Credential Manager credentials were backed up
-5377,Credential Manager credentials were restored from a backup
-5378,The requested credentials delegation was disallowed by policy
-5440,The following callout was present when the Windows Filtering Platform Base Filtering Engine started
-5441,The following filter was present when the Windows Filtering Platform Base Filtering Engine started
-5442,The following provider was present when the Windows Filtering Platform Base Filtering Engine started
-5443,The following provider context was present when the Windows Filtering Platform Base Filtering Engine started
-5444,The following sub-layer was present when the Windows Filtering Platform Base Filtering Engine started
-5446,A Windows Filtering Platform callout has been changed
-5447,A Windows Filtering Platform filter has been changed
-5448,A Windows Filtering Platform provider has been changed
-5449,A Windows Filtering Platform provider context has been changed
-5450,A Windows Filtering Platform sub-layer has been changed
-5451,An IPsec Quick Mode security association was established
-5452,An IPsec Quick Mode security association ended
-5453,An IPsec negotiation with a remote computer failed because the IKE and AuthIP IPsec Keying Modules (IKEEXT) service is not started
-5456,PAStore Engine applied Active Directory storage IPsec policy on the computer
-5457,PAStore Engine failed to apply Active Directory storage IPsec policy on the computer
-5458,PAStore Engine applied locally cached copy of Active Directory storage IPsec policy on the computer
-5459,PAStore Engine failed to apply locally cached copy of Active Directory storage IPsec policy on the computer
-5460,PAStore Engine applied local registry storage IPsec policy on the computer
-5461,PAStore Engine failed to apply local registry storage IPsec policy on the computer
-5462,PAStore Engine failed to apply some rules of the active IPsec policy on the computer
-5463,PAStore Engine polled for changes to the active IPsec policy and detected no changes
-5464,"PAStore Engine polled for changes to the active IPsec policy, detected changes, and applied them to IPsec Services"
-5465,PAStore Engine received a control for forced reloading of IPsec policy and processed the control successfully
-5466,"PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory cannot be reached, and will use the cached copy of the Active Directory IPsec policy instead"
-5467,"PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory can be reached, and found no changes to the policy"
-5468,"PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory can be reached, found changes to the policy, and applied those changes"
-5471,PAStore Engine loaded local storage IPsec policy on the computer
-5472,PAStore Engine failed to load local storage IPsec policy on the computer
-5473,PAStore Engine loaded directory storage IPsec policy on the computer
-5474,PAStore Engine failed to load directory storage IPsec policy on the computer
-5477,PAStore Engine failed to add quick mode filter
-5478,IPsec Services has started successfully
-5479,IPsec Services has been shut down successfully
-5480,IPsec Services failed to get the complete list of network interfaces on the computer
-5483,IPsec Services failed to initialize RPC server. IPsec Services could not be started
-5484,IPsec Services has experienced a critical failure and has been shut down
-5485,IPsec Services failed to process some IPsec filters on a plug-and-play event for network interfaces
-6144,Security policy in the group policy objects has been applied successfully
-6145,One or more errors occured while processing security policy in the group policy objects
-6272,Network Policy Server granted access to a user
-6273,Network Policy Server denied access to a user
-6274,Network Policy Server discarded the request for a user
-6275,Network Policy Server discarded the accounting request for a user
-6276,Network Policy Server quarantined a user
-6277,Network Policy Server granted access to a user but put it on probation because the host did not meet the defined health policy
-6278,Network Policy Server granted full access to a user because the host met the defined health policy
-6279,Network Policy Server locked the user account due to repeated failed authentication attempts
-6280,Network Policy Server unlocked the user account
-6281,Code Integrity determined that the page hashes of an image file are not valid...
-6400,BranchCache: Received an incorrectly formatted response while discovering availability of content.
-6401,BranchCache: Received invalid data from a peer. Data discarded.
-6402,BranchCache: The message to the hosted cache offering it data is incorrectly formatted.
-6403,BranchCache: The hosted cache sent an incorrectly formatted response to the client.
-6404,BranchCache: Hosted cache could not be authenticated using the provisioned SSL certificate.
-6405,BranchCache: %2 instance(s) of event id %1 occurred.
-6407,1% (no more info in MSDN)
-6408,Registered product %1 failed and Windows Firewall is now controlling the filtering for %2
-6410,Code integrity determined that a file does not meet the security requirements to load into a process.
-7022,Windows Service Fail or Crash
-7023,The %1 service terminated with the following error: %2
-7023,Windows Service Fail or Crash
-7024,Windows Service Fail or Crash
-7026,Windows Service Fail or Crash
-7030,"The service is marked as an interactive service. However, the system is configured to not allow interactive services. This service may not function properly."
-7031,Windows Service Fail or Crash
-7032,Windows Service Fail or Crash
-7034,Windows Service Fail or Crash
-7035,The %1 service was successfully sent a %2 control.
-7036,The service entered the running/stopped state
-7040,The start type of the %1 service was changed from %2 to %3.
-7045,New Windows Service
-8000,Starting a Wireless Connection
-8001,Successfully connected to Wireless connection
-8002,Wireless Connection Failed
-8003,AppLocker Block Error
-8003,Disconnected from Wireless connection
-8004,AppLocker Block Warning
-8005,AppLocker permitted the execution of a PowerShell script
-8006,AppLocker Warning Error
-8007,AppLocker Warning
-8011,Starting a Wireless Connection
-10000,Network Connection and Disconnection Status (Wired and Wireless)
-10001,Network Connection and Disconnection Status (Wired and Wireless)
-11000,Wireless Association Status
-11001,Wireless Association Status
-11002,Wireless Association Status
-11004,"Wireless Security Started, Stopped, Successful, or Failed"
-11005,"Wireless Security Started, Stopped, Successful, or Failed"
-11006,"Wireless Security Started, Stopped, Successful, or Failed"
-11010,"Wireless Security Started, Stopped, Successful, or Failed"
-12011,Wireless Authentication Started and Failed
-12012,Wireless Authentication Started and Failed
-12013,Wireless Authentication Started and Failed
-unregistered_event_id,Unknown
diff --git a/config/target_eventids.txt b/config/target_eventids.txt
deleted file mode 100644
index e69de29b..00000000
diff --git a/config/target_eventids_sample.txt b/config/target_eventids_sample.txt
deleted file mode 100644
index f703021e..00000000
--- a/config/target_eventids_sample.txt
+++ /dev/null
@@ -1,154 +0,0 @@
-1
-10
-1000
-1001
-1006
-1013
-1015
-1031
-1032
-1033
-1034
-104
-106
-11
-1102
-1116
-1116
-1117
-1121
-12
-13
-14
-15
-150
-16
-17
-18
-19
-20
-2003
-21
-2100
-2102
-213
-217
-22
-23
-24
-255
-257
-26
-3
-30
-300
-301
-302
-316
-31017
-354
-4
-400
-400
-403
-40300
-40301
-40302
-4100
-4103
-4104
-4611
-4616
-4624
-4625
-4634
-4647
-4648
-4656
-4657
-4658
-4660
-4661
-4662
-4663
-4672
-4673
-4674
-4688
-4689
-4692
-4697
-4698
-4699
-4701
-4703
-4704
-4706
-4719
-4720
-4728
-4732
-4738
-4742
-4765
-4766
-4768
-4769
-4771
-4776
-4781
-4794
-4799
-4825
-4898
-4899
-4904
-4905
-4909
-5
-50
-5001
-5007
-5010
-5012
-5013
-5038
-5101
-5136
-5140
-5142
-5145
-5156
-517
-524
-528
-529
-55
-56
-5829
-5859
-5861
-59
-6
-600
-6281
-6416
-675
-7
-70
-7036
-7040
-7045
-770
-8
-800
-8001
-8002
-8004
-8007
-808
-823
-848
-849
-9
-98
diff --git a/contributors.txt b/contributors.txt
index 53b594ea..dd3e8a57 100644
--- a/contributors.txt
+++ b/contributors.txt
@@ -1,6 +1,7 @@
Hayabusa was possible thanks to the following people (in alphabetical order):
Akira Nishikawa (@nishikawaakira): Previous lead developer, core hayabusa rule support, etc...
+Fukusuke Takahashi (fukuseket): Static compiling for Windows, race condition and other bug fixes.
Garigariganzy (@garigariganzy31): Developer, event ID statistics implementation, etc...
ItiB (@itiB_S144): Core developer, sigmac hayabusa backend, rule creation, etc...
James Takai / hachiyone(@hach1yon): Current lead developer, tokio multi-threading, sigma aggregation logic, sigmac backend, rule creation, sigma count implementation etc…
diff --git a/doc/TimesketchImport/01-TimesketchLogin.png b/doc/TimesketchImport/01-TimesketchLogin.png
new file mode 100644
index 00000000..1c86ca6d
Binary files /dev/null and b/doc/TimesketchImport/01-TimesketchLogin.png differ
diff --git a/doc/TimesketchImport/02-NewInvestigation.png b/doc/TimesketchImport/02-NewInvestigation.png
new file mode 100644
index 00000000..2e8d9eb3
Binary files /dev/null and b/doc/TimesketchImport/02-NewInvestigation.png differ
diff --git a/doc/TimesketchImport/03-TimesketchTimeline.png b/doc/TimesketchImport/03-TimesketchTimeline.png
new file mode 100644
index 00000000..d20cda5d
Binary files /dev/null and b/doc/TimesketchImport/03-TimesketchTimeline.png differ
diff --git a/doc/TimesketchImport/04-TimelineWithColumns.png b/doc/TimesketchImport/04-TimelineWithColumns.png
new file mode 100644
index 00000000..5e740554
Binary files /dev/null and b/doc/TimesketchImport/04-TimelineWithColumns.png differ
diff --git a/doc/TimesketchImport/05-FieldInformation.png b/doc/TimesketchImport/05-FieldInformation.png
new file mode 100644
index 00000000..56ca50f4
Binary files /dev/null and b/doc/TimesketchImport/05-FieldInformation.png differ
diff --git a/doc/TimesketchImport/06-MarkingEvents.png b/doc/TimesketchImport/06-MarkingEvents.png
new file mode 100644
index 00000000..3dd4656a
Binary files /dev/null and b/doc/TimesketchImport/06-MarkingEvents.png differ
diff --git a/doc/TimesketchImport/TimesketchImport-English.md b/doc/TimesketchImport/TimesketchImport-English.md
new file mode 100644
index 00000000..d2805231
--- /dev/null
+++ b/doc/TimesketchImport/TimesketchImport-English.md
@@ -0,0 +1,80 @@
+# Importing Hayabusa Results Into Timesketch
+
+## About
+
+"[Timesketch](https://timesketch.org/) is an open-source tool for collaborative forensic timeline analysis. Using sketches you and your collaborators can easily organize your timelines and analyze them all at the same time. Add meaning to your raw data with rich annotations, comments, tags and stars."
+
+
+## Installing
+
+We recommend using the Ubuntu 22.04 LTS Server edition.
+You can download it [here](https://ubuntu.com/download/server).
+Choose the minimal install when setting it up.
+You won't have `ifconfig` available, so install it with `sudo apt install net-tools`.
+
+After that, follow the install instructions [here](https://timesketch.org/guides/admin/install/):
+
+``` bash
+sudo apt install docker-compose
+curl -s -O https://raw.githubusercontent.com/google/timesketch/master/contrib/deploy_timesketch.sh
+chmod 755 deploy_timesketch.sh
+cd /opt
+sudo ~/deploy_timesketch.sh
+cd timesketch
+sudo docker-compose up -d
+sudo docker-compose exec timesketch-web tsctl create-user
+```
+
+## Prepared VM
+
+We have pre-built a demo VM that you can use to analyze the evidence from the 2022 DEF CON 30 [OpenSOC](https://opensoc.io/) DFIR Challenge hosted by [Recon InfoSec](https://www.reconinfosec.com/). (The evidence has already been imported.)
+You can download it [here](https://www.dropbox.com/s/3be3s5c2r22ux2z/Prebuilt-Timesketch.ova?dl=0).
+You can find the other evidence for this challenge [here](https://docs.google.com/document/d/1XM4Gfdojt8fCn_9B8JKk9bcUTXZc0_hzWRUH4mEr7dw/mobilebasic) and questions [here](https://docs.google.com/spreadsheets/d/1vKn8BgABuJsqH5WhhS9ebIGTBG4aoP-StINRi18abo4/htmlview).
+
+The username for the VM is `user` and password is `password`.
+
+## Logging in
+
+Find out the IP address with `ifconfig` and open it with a web browser.
+You will be redirected to a login page as shown below:
+
+
+
+Log in with the credentials of the user you created earlier with the `tsctl create-user` command.
+
+## Create a new sketch
+
+Click on `New investigation` and enter a name for the new sketch:
+
+
+
+## Upload timeline
+
+Click `Upload timeline` and upload a CSV file that you created with the following command:
+
+`hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U`
+
+You can add `-m low` if you only want alerts (level `low` and above) and do not want to include informational Windows events.
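+
+For example, an alerts-only timeline can be created with the same command as above, with `-m low` appended (a sketch; adjust the binary name and paths for your environment):
+
+``` bash
+hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U -m low
+```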
+
+## Analyzing results
+
+You should get the following screen:
+
+
+
+By default, only the UTC timestamp and alert rule title are displayed, so click `Customize columns` to add more fields.
+
+> Warning: In the current version, there is a bug where a newly added column will be displayed blank. Please add another column (and delete it afterwards if not needed) to force the new columns to display.
+
+You can also filter on fields in the search box; for example, `Level: crit` will show only critical alerts.
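+
+A few more example filters, assuming the default column names (`crit`, `high`, `med`, `low`, and `info` are the level abbreviations that Hayabusa writes to the `Level` column):
+
+```
+Level: crit
+Level: high
+```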
+
+
+
+
+If you click on an event, you can see all of the field information:
+
+
+
+With the three icons to the left of the alert title, you can star events of interest, search ±5 minutes around an event to see its context, and add labels.
+
+
\ No newline at end of file
diff --git a/doc/TimesketchImport/TimesketchImport-Japanese.md b/doc/TimesketchImport/TimesketchImport-Japanese.md
new file mode 100644
index 00000000..1247a631
--- /dev/null
+++ b/doc/TimesketchImport/TimesketchImport-Japanese.md
@@ -0,0 +1,80 @@
+# TimesketchにHayabusa結果をインポートする方法
+
+## Timesketchについて
+
+"[Timesketch](https://timesketch.org/)は、フォレンジックタイムラインの共同解析のためのオープンソースツールです。スケッチを使うことで、あなたとあなたの共同作業者は、簡単にタイムラインを整理し、同時に分析することができます。リッチなアノテーション、コメント、タグ、スターで生データに意味を持たせることができます。"
+
+
+## インストール
+
+Ubuntu 22.04 LTS Serverエディションの使用を推奨します。
+[こちら](https://ubuntu.com/download/server)からダウンロードできます。
+セットアップ時にミニマルインストールを選択してください。
+`ifconfig`はインストールされていないので、`sudo apt install net-tools`でインストールしてください。
+
+その後、インストール手順[こちら](https://timesketch.org/guides/admin/install/)に従ってください:
+
+``` bash
+sudo apt install docker-compose
+curl -s -O https://raw.githubusercontent.com/google/timesketch/master/contrib/deploy_timesketch.sh
+chmod 755 deploy_timesketch.sh
+cd /opt
+sudo ~/deploy_timesketch.sh
+cd timesketch
+sudo docker-compose up -d
+sudo docker-compose exec timesketch-web tsctl create-user
+```
+
+## 準備されたVM
+
+[Recon InfoSec](https://www.reconinfosec.com/)主催の2022年のDEF CON 30 [OpenSOC](https://opensoc.io/) DFIR Challengeのエビデンスに対して使用できるデモ用VMを事前に構築しています。 (エビデンスは既にインポート済み。)
+[こちら](https://www.dropbox.com/s/3be3s5c2r22ux2z/Prebuilt-Timesketch.ova?dl=0)からダウンロードできます。
+このチャレンジの他のエビデンスは[こちら](https://docs.google.com/document/d/1XM4Gfdojt8fCn_9B8JKk9bcUTXZc0_hzWRUH4mEr7dw/mobilebasic)からダウンロードできます。
+問題は[こちら](https://docs.google.com/spreadsheets/d/1vKn8BgABuJsqH5WhhS9ebIGTBG4aoP-StINRi18abo4/htmlview)からダウンロードできます。
+
+VMのユーザ名は`user`。パスワードは`password`。
+
+## ログイン
+
+`ifconfig`でIPアドレスを調べ、Webブラウザで開いてください。
+以下のようなログインページに移動されます:
+
+
+
+docker-composeコマンドで作成したユーザの認証情報でログインしてください。
+
+## 新しいsketch作成
+
+`New investigation`をクリックし、新しいスケッチに名前を付けます。
+
+
+
+## タイムラインのアップロード
+
+`Upload timeline`をクリックし、以下のコマンドで作成したCSVファイルをアップロードします:
+
+`hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U`
+
+Windowsのイベントを含めず、アラートだけでよい場合は、`-m low`を追加することができます。
+
+## 結果の解析
+
+以下のような画面が表示されるはずです:
+
+
+
+デフォルトでは、UTCタイムスタンプとアラートルールのタイトル名のみが表示されますので、`Customize columns`をクリックし、他のフィールドを追加してください。
+
+> 注意: 現在のバージョンでは、新しいカラムが空白になってしまうというバグがあります。新しいカラムを表示するには、別のカラムをまず追加してください(必要なければ後で削除してください。)
+
+以下のように検索ボックスで`Level: crit`等を入力することで、クリティカルなアラートのみを表示させるようにフィルタリングできます。
+
+
+
+イベントをクリックすると、すべてのフィールド情報を見ることができます:
+
+
+
+アラートタイトルの左側にある3つのアイコンを使って、興味のあるイベントにスターをつけたり、イベントの文脈を見るために+-5分検索したり、ラベルを追加したりすることが可能です。
+
+
\ No newline at end of file
diff --git a/hayabusa-logo.png b/logo.png
similarity index 100%
rename from hayabusa-logo.png
rename to logo.png
diff --git a/rules b/rules
index 8c14d12b..85631637 160000
--- a/rules
+++ b/rules
@@ -1 +1 @@
-Subproject commit 8c14d12be3f2d08721eee6db7238058fdaca3ce6
+Subproject commit 856316374ca52ce01123c2078c7af294d29df546
diff --git a/screenshots/Hayabusa-Results.png b/screenshots/Hayabusa-Results.png
index 026b686d..61c23587 100644
Binary files a/screenshots/Hayabusa-Results.png and b/screenshots/Hayabusa-Results.png differ
diff --git a/screenshots/Hayabusa-Startup.png b/screenshots/Hayabusa-Startup.png
index 72b5284b..ec3849cc 100644
Binary files a/screenshots/Hayabusa-Startup.png and b/screenshots/Hayabusa-Startup.png differ
diff --git a/screenshots/HayabusaResultsSummary.png b/screenshots/HayabusaResultsSummary.png
index 1efd8ec9..b9deea82 100644
Binary files a/screenshots/HayabusaResultsSummary.png and b/screenshots/HayabusaResultsSummary.png differ
diff --git a/screenshots/TimesketchAnalysis.png b/screenshots/TimesketchAnalysis.png
new file mode 100644
index 00000000..e6e99eda
Binary files /dev/null and b/screenshots/TimesketchAnalysis.png differ
diff --git a/src/afterfact.rs b/src/afterfact.rs
index a6edaa50..cfc64c1c 100644
--- a/src/afterfact.rs
+++ b/src/afterfact.rs
@@ -1,69 +1,53 @@
use crate::detections::configs;
-use crate::detections::configs::TERM_SIZE;
-use crate::detections::print;
-use crate::detections::print::{AlertMessage, IS_HIDE_RECORD_ID};
-use crate::detections::utils;
-use crate::detections::utils::write_color_buffer;
+use crate::detections::configs::{CURRENT_EXE_PATH, TERM_SIZE};
+use crate::detections::message::{self, LEVEL_ABBR};
+use crate::detections::message::{AlertMessage, LEVEL_FULL};
+use crate::detections::utils::{self, format_time};
+use crate::detections::utils::{get_writable_color, write_color_buffer};
+use crate::options::profile::PROFILES;
+use bytesize::ByteSize;
use chrono::{DateTime, Local, TimeZone, Utc};
+use comfy_table::modifiers::UTF8_ROUND_CORNERS;
+use comfy_table::presets::UTF8_FULL;
use csv::QuoteStyle;
-use hashbrown::HashMap;
-use hashbrown::HashSet;
+use itertools::Itertools;
use krapslog::{build_sparkline, build_time_markers};
use lazy_static::lazy_static;
-use serde::Serialize;
+use linked_hash_map::LinkedHashMap;
+
+use comfy_table::*;
+use hashbrown::{HashMap, HashSet};
+use num_format::{Locale, ToFormattedString};
use std::cmp::min;
use std::error::Error;
+
use std::fs::File;
use std::io;
use std::io::BufWriter;
use std::io::Write;
+
+use std::fs;
use std::process;
use termcolor::{BufferWriter, Color, ColorChoice, ColorSpec, WriteColor};
use terminal_size::Width;
-#[derive(Debug, Serialize)]
-#[serde(rename_all = "PascalCase")]
-pub struct CsvFormat<'a> {
- timestamp: &'a str,
- computer: &'a str,
- channel: &'a str,
- event_i_d: &'a str,
- level: &'a str,
- mitre_attack: &'a str,
- #[serde(skip_serializing_if = "Option::is_none")]
- record_i_d: Option<&'a str>,
- rule_title: &'a str,
- details: &'a str,
- #[serde(skip_serializing_if = "Option::is_none")]
- record_information: Option<&'a str>,
- rule_path: &'a str,
- file_path: &'a str,
-}
-
-#[derive(Debug, Serialize)]
-#[serde(rename_all = "PascalCase")]
-pub struct DisplayFormat<'a> {
- timestamp: &'a str,
- pub computer: &'a str,
- pub channel: &'a str,
- pub event_i_d: &'a str,
- pub level: &'a str,
- #[serde(skip_serializing_if = "Option::is_none")]
- record_i_d: Option<&'a str>,
- pub rule_title: &'a str,
- pub details: &'a str,
- #[serde(skip_serializing_if = "Option::is_none")]
- pub record_information: Option<&'a str>,
-}
-
lazy_static! {
- pub static ref OUTPUT_COLOR: HashMap<String, Color> = set_output_color();
+ pub static ref OUTPUT_COLOR: HashMap<String, Colors> = set_output_color();
+}
+
+pub struct Colors {
+ pub output_color: termcolor::Color,
+ pub table_color: comfy_table::Color,
}
/// level_color.txtファイルを読み込み対応する文字色のマッピングを返却する関数
-pub fn set_output_color() -> HashMap<String, Color> {
- let read_result = utils::read_csv("config/level_color.txt");
- let mut color_map: HashMap<String, Color> = HashMap::new();
+pub fn set_output_color() -> HashMap<String, Colors> {
+ let read_result = utils::read_csv(
+ utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "config/level_color.txt")
+ .to_str()
+ .unwrap(),
+ );
+ let mut color_map: HashMap<String, Colors> = HashMap::new();
if configs::CONFIG.read().unwrap().args.no_color {
return color_map;
}
@@ -93,16 +77,34 @@ pub fn set_output_color() -> HashMap<String, Colors> {
}
color_map.insert(
level.to_lowercase(),
- Color::Rgb(color_code[0], color_code[1], color_code[2]),
+ Colors {
+ output_color: termcolor::Color::Rgb(color_code[0], color_code[1], color_code[2]),
+ table_color: comfy_table::Color::Rgb {
+ r: color_code[0],
+ g: color_code[1],
+ b: color_code[2],
+ },
+ },
);
});
color_map
}
-fn _get_output_color(color_map: &HashMap<String, Color>, level: &str) -> Option<Color> {
+fn _get_output_color(color_map: &HashMap<String, Colors>, level: &str) -> Option<Color> {
let mut color = None;
if let Some(c) = color_map.get(&level.to_lowercase()) {
- color = Some(c.to_owned());
+ color = Some(c.output_color.to_owned());
+ }
+ color
+}
+
+fn _get_table_color(
+ color_map: &HashMap<String, Colors>,
+ level: &str,
+) -> Option<comfy_table::Color> {
+ let mut color = None;
+ if let Some(c) = color_map.get(&level.to_lowercase()) {
+ color = Some(c.table_color.to_owned());
}
color
}
@@ -190,7 +192,7 @@ pub fn after_fact(all_record_cnt: usize) {
fn emit_csv(
writer: &mut W,
displayflag: bool,
- color_map: HashMap<String, Color>,
+ color_map: HashMap<String, Colors>,
all_record_cnt: u128,
) -> io::Result<()> {
let disp_wtr = BufferWriter::stdout(ColorChoice::Always);
@@ -199,7 +201,6 @@ fn emit_csv(
disp_wtr_buf.set_color(ColorSpec::new().set_fg(None)).ok();
- let messages = print::MESSAGES.lock().unwrap();
// level is devided by "Critical","High","Medium","Low","Informational","Undefined".
let mut total_detect_counts_by_level: Vec<u128> = vec![0; 6];
let mut unique_detect_counts_by_level: Vec<u128> = vec![0; 6];
@@ -209,26 +210,15 @@ fn emit_csv(
HashMap::new();
let mut detect_counts_by_computer_and_level: HashMap<String, HashMap<String, i128>> =
HashMap::new();
+ let mut detect_counts_by_rule_and_level: HashMap<String, HashMap<String, i128>> =
+ HashMap::new();
- let levels = Vec::from([
- "critical",
- "high",
- "medium",
- "low",
- "informational",
- "undefined",
- ]);
- let level_abbr: HashMap<String, String> = HashMap::from([
- (String::from("cruitical"), String::from("crit")),
- (String::from("high"), String::from("high")),
- (String::from("medium"), String::from("med ")),
- (String::from("low"), String::from("low ")),
- (String::from("informational"), String::from("info")),
- ]);
+ let levels = Vec::from(["crit", "high", "med ", "low ", "info", "undefined"]);
// レベル別、日ごとの集計用変数の初期化
for level_init in levels {
detect_counts_by_date_and_level.insert(level_init.to_string(), HashMap::new());
detect_counts_by_computer_and_level.insert(level_init.to_string(), HashMap::new());
+ detect_counts_by_rule_and_level.insert(level_init.to_string(), HashMap::new());
}
if displayflag {
println!();
@@ -236,86 +226,57 @@ fn emit_csv(
let mut timestamps: Vec<i64> = Vec::new();
let mut plus_header = true;
let mut detected_record_idset: HashSet<String> = HashSet::new();
- let detect_union = messages.iter();
- for (time, detect_infos) in detect_union {
+ for time in message::MESSAGES.clone().into_read_only().keys().sorted() {
+ let multi = message::MESSAGES.get(time).unwrap();
+ let (_, detect_infos) = multi.pair();
timestamps.push(_get_timestamp(time));
for detect_info in detect_infos {
- detected_record_idset.insert(format!("{}_{}", time, detect_info.eventid));
- let level = detect_info.level.to_string();
- let time_str = format_time(time, false);
+ if !detect_info.detail.starts_with("[condition]") {
+ detected_record_idset.insert(format!("{}_{}", time, detect_info.eventid));
+ }
if displayflag {
- let record_id = detect_info
- .record_id
- .as_ref()
- .map(|recinfo| _format_cellpos(recinfo, ColPos::Other));
- let recinfo = detect_info
- .record_information
- .as_ref()
- .map(|recinfo| _format_cellpos(recinfo, ColPos::Last));
- let ctr_char_exclude_details = detect_info
- .detail
- .chars()
- .filter(|&c| !c.is_control())
- .collect::<String>();
-
- let details = if ctr_char_exclude_details.is_empty() {
- "-".to_string()
- } else {
- ctr_char_exclude_details
- };
-
- let dispformat: _ = DisplayFormat {
- timestamp: &_format_cellpos(&time_str, ColPos::First),
- level: &_format_cellpos(
- level_abbr.get(&level).unwrap_or(&level),
- ColPos::Other,
- ),
- computer: &_format_cellpos(&detect_info.computername, ColPos::Other),
- event_i_d: &_format_cellpos(&detect_info.eventid, ColPos::Other),
- channel: &_format_cellpos(&detect_info.channel, ColPos::Other),
- rule_title: &_format_cellpos(&detect_info.alert, ColPos::Other),
- details: &_format_cellpos(&details, ColPos::Other),
- record_information: recinfo.as_deref(),
- record_i_d: record_id.as_deref(),
- };
-
//ヘッダーのみを出力
if plus_header {
- write!(disp_wtr_buf, "{}", _get_serialized_disp_output(None)).ok();
- plus_header = false;
- }
- disp_wtr_buf
- .set_color(
- ColorSpec::new().set_fg(_get_output_color(&color_map, &detect_info.level)),
+ write_color_buffer(
+ &disp_wtr,
+ get_writable_color(None),
+ &_get_serialized_disp_output(PROFILES.as_ref().unwrap(), true),
+ false,
)
.ok();
- write!(
- disp_wtr_buf,
- "{}",
- _get_serialized_disp_output(Some(dispformat))
+ plus_header = false;
+ }
+ write_color_buffer(
+ &disp_wtr,
+ get_writable_color(_get_output_color(
+ &color_map,
+ LEVEL_FULL
+ .get(&detect_info.level)
+ .unwrap_or(&String::default()),
+ )),
+ &_get_serialized_disp_output(&detect_info.ext_field, false),
+ false,
)
.ok();
} else {
// csv output format
- wtr.serialize(CsvFormat {
- timestamp: &time_str,
- level: level_abbr.get(&level).unwrap_or(&level).trim(),
- computer: &detect_info.computername,
- event_i_d: &detect_info.eventid,
- channel: &detect_info.channel,
- mitre_attack: &detect_info.tag_info,
- rule_title: &detect_info.alert,
- details: &detect_info.detail,
- record_information: detect_info.record_information.as_deref(),
- file_path: &detect_info.filepath,
- rule_path: &detect_info.rulepath,
- record_i_d: detect_info.record_id.as_deref(),
- })?;
+ if plus_header {
+ wtr.write_record(detect_info.ext_field.keys().map(|x| x.trim()))?;
+ plus_header = false;
+ }
+ wtr.write_record(detect_info.ext_field.values().map(|x| x.trim()))?;
}
+
let level_suffix = *configs::LEVELMAP
- .get(&detect_info.level.to_uppercase())
+ .get(
+ &LEVEL_FULL
+ .get(&detect_info.level)
+ .unwrap_or(&"undefined".to_string())
+ .to_uppercase(),
+ )
.unwrap_or(&0) as usize;
let time_str_date = format_time(time, true);
+
let mut detect_counts_by_date = detect_counts_by_date_and_level
.get(&detect_info.level.to_lowercase())
.unwrap_or_else(|| detect_counts_by_date_and_level.get("undefined").unwrap())
@@ -327,6 +288,7 @@ fn emit_csv(
detected_rule_files.insert(detect_info.rulepath.clone());
unique_detect_counts_by_level[level_suffix] += 1;
}
+
let computer_rule_check_key =
format!("{}|{}", &detect_info.computername, &detect_info.rulepath);
if !detected_computer_and_rule_names.contains(&computer_rule_check_key) {
@@ -346,66 +308,110 @@ fn emit_csv(
.insert(detect_info.level.to_lowercase(), detect_counts_by_computer);
}
+ let mut detect_counts_by_rules = detect_counts_by_rule_and_level
+ .get(&detect_info.level.to_lowercase())
+ .unwrap_or_else(|| {
+ detect_counts_by_computer_and_level
+ .get("undefined")
+ .unwrap()
+ })
+ .clone();
+ *detect_counts_by_rules
+ .entry(Clone::clone(&detect_info.ruletitle))
+ .or_insert(0) += 1;
+ detect_counts_by_rule_and_level
+ .insert(detect_info.level.to_lowercase(), detect_counts_by_rules);
+
total_detect_counts_by_level[level_suffix] += 1;
detect_counts_by_date_and_level
.insert(detect_info.level.to_lowercase(), detect_counts_by_date);
}
}
if displayflag {
- disp_wtr.print(&disp_wtr_buf)?;
println!();
} else {
wtr.flush()?;
}
- disp_wtr_buf.clear();
- disp_wtr_buf.set_color(ColorSpec::new().set_fg(None)).ok();
- writeln!(disp_wtr_buf, "Results Summary:").ok();
- disp_wtr.print(&disp_wtr_buf).ok();
-
- let terminal_width = match *TERM_SIZE {
- Some((Width(w), _)) => w as usize,
- None => 100,
+ let output_path = &configs::CONFIG.read().unwrap().args.output;
+ if let Some(path) = output_path {
+ if let Ok(metadata) = fs::metadata(path) {
+ println!(
+ "Saved file: {} ({})",
+ configs::CONFIG
+ .read()
+ .unwrap()
+ .args
+ .output
+ .as_ref()
+ .unwrap()
+ .display(),
+ ByteSize::b(metadata.len()).to_string_as(false)
+ );
+ println!();
+ }
};
- println!();
- if configs::CONFIG.read().unwrap().args.visualize_timeline {
- _print_timeline_hist(timestamps, terminal_width, 3);
+ if !configs::CONFIG.read().unwrap().args.no_summary {
+ disp_wtr_buf.clear();
+ write_color_buffer(
+ &disp_wtr,
+ get_writable_color(Some(Color::Rgb(0, 255, 0))),
+ "Results Summary:",
+ true,
+ )
+ .ok();
+
+ let terminal_width = match *TERM_SIZE {
+ Some((Width(w), _)) => w as usize,
+ None => 100,
+ };
+ println!();
+
+ if configs::CONFIG.read().unwrap().args.visualize_timeline {
+ _print_timeline_hist(timestamps, terminal_width, 3);
+ println!();
+ }
+ let reducted_record_cnt: u128 = all_record_cnt - detected_record_idset.len() as u128;
+ let reducted_percent = if all_record_cnt == 0 {
+ 0 as f64
+ } else {
+ (reducted_record_cnt as f64) / (all_record_cnt as f64) * 100.0
+ };
+ write_color_buffer(
+ &disp_wtr,
+ get_writable_color(None),
+ &format!(
+ "Detected events / Total events: {} / {} (reduced {} events ({:.2}%))",
+ (all_record_cnt - reducted_record_cnt).to_formatted_string(&Locale::en),
+ all_record_cnt.to_formatted_string(&Locale::en),
+ reducted_record_cnt.to_formatted_string(&Locale::en),
+ reducted_percent
+ ),
+ true,
+ )
+ .ok();
+ println!();
+
+ _print_unique_results(
+ total_detect_counts_by_level,
+ unique_detect_counts_by_level,
+ "Total | Unique".to_string(),
+ "detections".to_string(),
+ &color_map,
+ );
+ println!();
+
+ _print_detection_summary_by_date(detect_counts_by_date_and_level, &color_map);
+ println!();
+ println!();
+
+ _print_detection_summary_by_computer(detect_counts_by_computer_and_level, &color_map);
+ println!();
+
+ _print_detection_summary_tables(detect_counts_by_rule_and_level, &color_map);
println!();
}
- let reducted_record_cnt: u128 = all_record_cnt - detected_record_idset.len() as u128;
- let reducted_percent = if all_record_cnt == 0 {
- 0 as f64
- } else {
- (reducted_record_cnt as f64) / (all_record_cnt as f64) * 100.0
- };
- println!("Total events: {}", all_record_cnt);
- println!(
- "Data reduction: {} events ({:.2}%)",
- reducted_record_cnt, reducted_percent
- );
- println!();
-
- _print_unique_results(
- total_detect_counts_by_level,
- "Total".to_string(),
- "detections".to_string(),
- &color_map,
- );
- println!();
-
- _print_unique_results(
- unique_detect_counts_by_level,
- "Unique".to_string(),
- "detections".to_string(),
- &color_map,
- );
- println!();
-
- _print_detection_summary_by_date(detect_counts_by_date_and_level, &color_map);
- println!();
-
- _print_detection_summary_by_computer(detect_counts_by_computer_and_level, &color_map);
Ok(())
}
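For context on the per-level bookkeeping in `emit_csv` above (level → rule title → count), here is a std-only sketch of the `entry().or_insert(0) += 1` pattern; the helper name and keys are illustrative, not part of the patch:

```rust
use std::collections::HashMap;

// Nested counting: the outer key is the level, the inner key the rule title.
// entry().or_insert(0) creates a missing counter before incrementing,
// mirroring the bookkeeping done in emit_csv.
fn count_detection(
    counts: &mut HashMap<String, HashMap<String, u128>>,
    level: &str,
    rule_title: &str,
) {
    *counts
        .entry(level.to_lowercase())
        .or_insert_with(HashMap::new)
        .entry(rule_title.to_string())
        .or_insert(0) += 1;
}

fn main() {
    let mut counts = HashMap::new();
    count_detection(&mut counts, "High", "Suspicious Logon");
    count_detection(&mut counts, "high", "Suspicious Logon");
    println!("{}", counts["high"]["Suspicious Logon"]); // 2
}
```

Lowercasing the level on every insert is what lets `"High"` and `"high"` land in the same bucket, matching the `to_lowercase()` calls in the patch.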
@@ -420,24 +426,23 @@ enum ColPos {
Other,
}
-fn _get_serialized_disp_output(dispformat: Option<DisplayFormat>) -> String {
- if dispformat.is_none() {
- let mut titles = vec![
- "Timestamp",
- "Computer",
- "Channel",
- "EventID",
- "Level",
- "RuleTitle",
- "Details",
- ];
- if !*IS_HIDE_RECORD_ID {
- titles.insert(5, "RecordID");
+fn _get_serialized_disp_output(data: &LinkedHashMap<String, String>, header: bool) -> String {
+ let data_length = &data.len();
+ let mut ret: Vec<String> = vec![];
+ if header {
+ for k in data.keys() {
+ ret.push(k.to_owned());
}
- if configs::CONFIG.read().unwrap().args.full_data {
- titles.push("RecordInformation");
+ } else {
+ for (i, (_, v)) in data.iter().enumerate() {
+ if i == 0 {
+ ret.push(_format_cellpos(v, ColPos::First))
+ } else if i == data_length - 1 {
+ ret.push(_format_cellpos(v, ColPos::Last))
+ } else {
+ ret.push(_format_cellpos(v, ColPos::Other))
+ }
}
- return format!("{}\n", titles.join("|"));
}
let mut disp_serializer = csv::WriterBuilder::new()
.double_quote(false)
@@ -446,8 +451,7 @@ fn _get_serialized_disp_output(dispformat: Option<DisplayFormat>) -> String {
.has_headers(false)
.from_writer(vec![]);
- disp_serializer.serialize(dispformat.unwrap()).ok();
-
+ disp_serializer.write_record(ret).ok();
String::from_utf8(disp_serializer.into_inner().unwrap_or_default()).unwrap_or_default()
}
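The rewritten `_get_serialized_disp_output` pads every middle cell with spaces so the pipe-joined row reads `a | b | c` rather than `a|b|c` (headers stay unpadded). A minimal std-only sketch of that idea, with illustrative helper names and plain slices standing in for the `LinkedHashMap`:

```rust
// First cell gets a trailing space, last a leading space, middle cells both,
// so joining with '|' yields "a | b | c". This mirrors the ColPos handling above.
fn format_cell(val: &str, first: bool, last: bool) -> String {
    match (first, last) {
        (true, true) => val.to_string(),
        (true, false) => format!("{} ", val),
        (false, true) => format!(" {}", val),
        (false, false) => format!(" {} ", val),
    }
}

// header=true joins the keys verbatim; header=false joins the padded values.
fn serialize_row(cells: &[(&str, &str)], header: bool) -> String {
    let n = cells.len();
    let parts: Vec<String> = if header {
        cells.iter().map(|(k, _)| k.to_string()).collect()
    } else {
        cells
            .iter()
            .enumerate()
            .map(|(i, (_, v))| format_cell(v, i == 0, i == n - 1))
            .collect()
    };
    parts.join("|")
}

fn main() {
    let row = [("Computer", "PC01"), ("Level", "high"), ("RuleTitle", "test")];
    println!("{}", serialize_row(&row, true));  // Computer|Level|RuleTitle
    println!("{}", serialize_row(&row, false)); // PC01 | high | test
}
```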
@@ -460,50 +464,64 @@ fn _format_cellpos(colval: &str, column: ColPos) -> String {
}
}
-/// output info which unique detection count and all detection count information(devided by level and total) to stdout.
+/// Print the total and unique detection counts (broken down by level and in total) to stdout.
fn _print_unique_results(
mut counts_by_level: Vec<u128>,
+ mut unique_counts_by_level: Vec<u128>,
head_word: String,
tail_word: String,
- color_map: &HashMap<String, Color>,
+ color_map: &HashMap<String, Colors>,
) {
- let levels = Vec::from([
- "critical",
- "high",
- "medium",
- "low",
- "informational",
- "undefined",
- ]);
-
// the order in which are registered and the order of levels to be displayed are reversed
counts_by_level.reverse();
+ unique_counts_by_level.reverse();
+ let total_count = counts_by_level.iter().sum::<u128>();
+ let unique_total_count = unique_counts_by_level.iter().sum::<u128>();
// output total results
write_color_buffer(
- BufferWriter::stdout(ColorChoice::Always),
+ &BufferWriter::stdout(ColorChoice::Always),
None,
&format!(
- "{} {}: {}",
+ "{} {}: {} | {}",
head_word,
tail_word,
- counts_by_level.iter().sum::<u128>()
+ total_count.to_formatted_string(&Locale::en),
+ unique_total_count.to_formatted_string(&Locale::en)
),
+ true,
)
.ok();
- for (i, level_name) in levels.iter().enumerate() {
+ for (i, level_name) in LEVEL_ABBR.keys().enumerate() {
if "undefined" == *level_name {
continue;
}
+ let percent = if total_count == 0 {
+ 0 as f64
+ } else {
+ (counts_by_level[i] as f64) / (total_count as f64) * 100.0
+ };
+ let unique_percent = if unique_total_count == 0 {
+ 0 as f64
+ } else {
+ (unique_counts_by_level[i] as f64) / (unique_total_count as f64) * 100.0
+ };
let output_raw_str = format!(
- "{} {} {}: {}",
- head_word, level_name, tail_word, counts_by_level[i]
+ "{} {} {}: {} ({:.2}%) | {} ({:.2}%)",
+ head_word,
+ level_name,
+ tail_word,
+ counts_by_level[i].to_formatted_string(&Locale::en),
+ percent,
+ unique_counts_by_level[i].to_formatted_string(&Locale::en),
+ unique_percent
);
write_color_buffer(
- BufferWriter::stdout(ColorChoice::Always),
+ &BufferWriter::stdout(ColorChoice::Always),
_get_output_color(color_map, level_name),
&output_raw_str,
+ true,
)
.ok();
}
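Both the reduction figure and the per-level columns above guard their percentage math against a zero denominator. The pattern boils down to this sketch (the helper name is illustrative):

```rust
// Percentage of `part` within `total`, returning 0.0 when `total` is 0 so
// empty inputs never divide by zero — the same guard used in the summary code.
fn percent(part: u128, total: u128) -> f64 {
    if total == 0 {
        0.0
    } else {
        (part as f64) / (total as f64) * 100.0
    }
}

fn main() {
    println!("{:.2}%", percent(25, 200)); // 12.50%
    println!("{:.2}%", percent(5, 0));    // 0.00%
}
```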
@@ -512,38 +530,47 @@ fn _print_unique_results(
/// Print the date with the highest detection count for each level
fn _print_detection_summary_by_date(
detect_counts_by_date: HashMap<String, HashMap<String, u128>>,
- color_map: &HashMap<String, Color>,
+ color_map: &HashMap<String, Colors>,
) {
let buf_wtr = BufferWriter::stdout(ColorChoice::Always);
let mut wtr = buf_wtr.buffer();
wtr.set_color(ColorSpec::new().set_fg(None)).ok();
- let output_levels = Vec::from(["critical", "high", "medium", "low", "informational"]);
+ writeln!(wtr, "Dates with most total detections:").ok();
- for level in output_levels {
+ for (idx, level) in LEVEL_ABBR.values().enumerate() {
// Every level key here is always initialized (undefined is excluded), so get() is guaranteed to return Some and unwrap is safe
let detections_by_day = detect_counts_by_date.get(level).unwrap();
let mut max_detect_str = String::default();
let mut tmp_cnt: u128 = 0;
- let mut date_str = String::default();
+ let mut exist_max_data = false;
for (date, cnt) in detections_by_day {
if cnt > &tmp_cnt {
- date_str = date.clone();
- max_detect_str = format!("{} ({})", date, cnt);
+ exist_max_data = true;
+ max_detect_str = format!("{} ({})", date, cnt.to_formatted_string(&Locale::en));
tmp_cnt = *cnt;
}
}
- wtr.set_color(ColorSpec::new().set_fg(_get_output_color(color_map, level)))
- .ok();
- if date_str == String::default() {
+ wtr.set_color(ColorSpec::new().set_fg(_get_output_color(
+ color_map,
+ LEVEL_FULL.get(level.as_str()).unwrap(),
+ )))
+ .ok();
+ if !exist_max_data {
max_detect_str = "n/a".to_string();
}
- writeln!(
+ write!(
wtr,
- "Date with most total {} detections: {}",
- level, &max_detect_str
+ "{}: {}",
+ LEVEL_FULL.get(level.as_str()).unwrap(),
+ &max_detect_str
)
.ok();
+ if idx != LEVEL_ABBR.len() - 1 {
+ wtr.set_color(ColorSpec::new().set_fg(None)).ok();
+
+ write!(wtr, ", ").ok();
+ }
}
buf_wtr.print(&wtr).ok();
}
@@ -551,15 +578,14 @@ fn _print_detection_summary_by_date(
/// Print the computers with the most unique detections for each level
fn _print_detection_summary_by_computer(
detect_counts_by_computer: HashMap<String, HashMap<String, i128>>,
- color_map: &HashMap<String, Color>,
+ color_map: &HashMap<String, Colors>,
) {
let buf_wtr = BufferWriter::stdout(ColorChoice::Always);
let mut wtr = buf_wtr.buffer();
wtr.set_color(ColorSpec::new().set_fg(None)).ok();
- let output_levels = Vec::from(["critical", "high", "medium", "low", "informational"]);
-
- for level in output_levels {
+ writeln!(wtr, "Top 5 computers with most unique detections:").ok();
+ for level in LEVEL_ABBR.values() {
// Every level key here is always initialized (undefined is excluded), so get() is guaranteed to return Some and unwrap is safe
let detections_by_computer = detect_counts_by_computer.get(level).unwrap();
let mut result_vec: Vec<String> = Vec::new();
@@ -572,7 +598,11 @@ fn _print_detection_summary_by_computer(
sorted_detections.sort_by(|a, b| (-a.1).cmp(&(-b.1)));
for x in sorted_detections.iter().take(5) {
- result_vec.push(format!("{} ({})", x.0, x.1));
+ result_vec.push(format!(
+ "{} ({})",
+ x.0,
+ x.1.to_formatted_string(&Locale::en)
+ ));
}
let result_str = if result_vec.is_empty() {
"n/a".to_string()
@@ -580,24 +610,94 @@ fn _print_detection_summary_by_computer(
result_vec.join(", ")
};
- wtr.set_color(ColorSpec::new().set_fg(_get_output_color(color_map, level)))
- .ok();
+ wtr.set_color(ColorSpec::new().set_fg(_get_output_color(
+ color_map,
+ LEVEL_FULL.get(level.as_str()).unwrap(),
+ )))
+ .ok();
writeln!(
wtr,
- "Top 5 computers with most unique {} detections: {}",
- level, &result_str
+ "{}: {}",
+ LEVEL_FULL.get(level.as_str()).unwrap(),
+ &result_str
)
.ok();
}
buf_wtr.print(&wtr).ok();
}
-fn format_time(time: &DateTime<Utc>, date_only: bool) -> String {
- if configs::CONFIG.read().unwrap().args.utc {
- format_rfc(time, date_only)
- } else {
- format_rfc(&time.with_timezone(&Local), date_only)
+/// Print tables of the rules with the most detections for each level
+fn _print_detection_summary_tables(
+ detect_counts_by_rule_and_level: HashMap<String, HashMap<String, i128>>,
+ color_map: &HashMap<String, Colors>,
+) {
+ let buf_wtr = BufferWriter::stdout(ColorChoice::Always);
+ let mut wtr = buf_wtr.buffer();
+ wtr.set_color(ColorSpec::new().set_fg(None)).ok();
+ let mut output = vec![];
+ let mut col_color = vec![];
+ for level in LEVEL_ABBR.values() {
+ let mut col_output: Vec = vec![];
+ col_output.push(format!(
+ "Top {} alerts:",
+ LEVEL_FULL.get(level.as_str()).unwrap()
+ ));
+
+ col_color.push(_get_table_color(
+ color_map,
+ LEVEL_FULL.get(level.as_str()).unwrap(),
+ ));
+
+ // Every level key here is always initialized (undefined is excluded), so get() is guaranteed to return Some and unwrap is safe
+ let detections_by_computer = detect_counts_by_rule_and_level.get(level).unwrap();
+ let mut sorted_detections: Vec<(&String, &i128)> = detections_by_computer.iter().collect();
+
+ sorted_detections.sort_by(|a, b| (-a.1).cmp(&(-b.1)));
+
+ for x in sorted_detections.iter().take(5) {
+ col_output.push(format!(
+ "{} ({})",
+ x.0,
+ x.1.to_formatted_string(&Locale::en)
+ ));
+ }
+ let na_cnt = if sorted_detections.len() > 5 {
+ 0
+ } else {
+ 5 - sorted_detections.len()
+ };
+ for _x in 0..na_cnt {
+ col_output.push("N/A".to_string());
+ }
+ output.push(col_output);
}
+
+ let mut tb = Table::new();
+ tb.load_preset(UTF8_FULL)
+ .apply_modifier(UTF8_ROUND_CORNERS)
+ .set_content_arrangement(ContentArrangement::Dynamic)
+ .set_width(500);
+ for x in 0..2 {
+ tb.add_row(vec![
+ Cell::new(&output[2 * x][0]).fg(col_color[2 * x].unwrap_or(comfy_table::Color::Reset)),
+ Cell::new(&output[2 * x + 1][0])
+ .fg(col_color[2 * x + 1].unwrap_or(comfy_table::Color::Reset)),
+ ]);
+
+ tb.add_row(vec![
+ Cell::new(&output[2 * x][1..].join("\n"))
+ .fg(col_color[2 * x].unwrap_or(comfy_table::Color::Reset)),
+ Cell::new(&output[2 * x + 1][1..].join("\n"))
+ .fg(col_color[2 * x + 1].unwrap_or(comfy_table::Color::Reset)),
+ ]);
+ }
+ tb.add_row(vec![
+ Cell::new(&output[4][0]).fg(col_color[4].unwrap_or(comfy_table::Color::Reset))
+ ]);
+ tb.add_row(vec![
+ Cell::new(&output[4][1..].join("\n")).fg(col_color[4].unwrap_or(comfy_table::Color::Reset))
+ ]);
+ println!("{tb}");
}
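`_print_detection_summary_tables` lays its five per-level "Top alerts" columns out as two side-by-side pairs plus one final full-width cell, via hard-coded `2 * x` indexing. The pairing itself can be expressed with `chunks(2)`, sketched here std-only (the `comfy_table` rendering is omitted; the helper name is illustrative):

```rust
// Group the per-level columns into rows of two; the odd fifth column
// naturally becomes a final one-element row, matching the table layout above.
fn pair_rows(columns: &[String]) -> Vec<Vec<String>> {
    columns.chunks(2).map(|c| c.to_vec()).collect()
}

fn main() {
    let cols: Vec<String> = ["critical", "high", "medium", "low", "informational"]
        .iter()
        .map(|s| s.to_string())
        .collect();
    for row in pair_rows(&cols) {
        println!("{}", row.join(" | "));
    }
}
```

Incidentally, the `na_cnt` padding above is equivalent to `5usize.saturating_sub(sorted_detections.len())`, which avoids the explicit length check.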
/// Convert the given datetime into a timestamp.
@@ -610,73 +710,26 @@ fn _get_timestamp(time: &DateTime<Utc>) -> i64 {
}
}
-/// return rfc time format string by option
-fn format_rfc<Tz: TimeZone>(time: &DateTime<Tz>, date_only: bool) -> String
-where
- Tz::Offset: std::fmt::Display,
-{
- let time_args = &configs::CONFIG.read().unwrap().args;
- if time_args.rfc_2822 {
- if date_only {
- time.format("%a, %e %b %Y").to_string()
- } else {
- time.format("%a, %e %b %Y %H:%M:%S %:z").to_string()
- }
- } else if time_args.rfc_3339 {
- if date_only {
- time.format("%Y-%m-%d").to_string()
- } else {
- time.format("%Y-%m-%d %H:%M:%S%.6f%:z").to_string()
- }
- } else if time_args.us_time {
- if date_only {
- time.format("%m-%d-%Y").to_string()
- } else {
- time.format("%m-%d-%Y %I:%M:%S%.3f %p %:z").to_string()
- }
- } else if time_args.us_military_time {
- if date_only {
- time.format("%m-%d-%Y").to_string()
- } else {
- time.format("%m-%d-%Y %H:%M:%S%.3f %:z").to_string()
- }
- } else if time_args.european_time {
- if date_only {
- time.format("%d-%m-%Y").to_string()
- } else {
- time.format("%d-%m-%Y %H:%M:%S%.3f %:z").to_string()
- }
- } else if date_only {
- time.format("%Y-%m-%d").to_string()
- } else {
- time.format("%Y-%m-%d %H:%M:%S%.3f %:z").to_string()
- }
-}
-
#[cfg(test)]
mod tests {
- use crate::afterfact::DisplayFormat;
use crate::afterfact::_get_serialized_disp_output;
use crate::afterfact::emit_csv;
use crate::afterfact::format_time;
- use crate::detections::print;
- use crate::detections::print::DetectInfo;
- use crate::detections::print::CH_CONFIG;
+ use crate::detections::message;
+ use crate::detections::message::DetectInfo;
+ use crate::options::profile::load_profile;
use chrono::{Local, TimeZone, Utc};
use hashbrown::HashMap;
+ use linked_hash_map::LinkedHashMap;
use serde_json::Value;
use std::fs::File;
use std::fs::{read_to_string, remove_file};
use std::io;
#[test]
- fn test_emit_csv() {
- // Because tests run in parallel, load order and thus the contents of static variables cannot be guaranteed, so run these tests sequentially
- test_emit_csv_output();
- test_emit_csv_output();
- }
-
fn test_emit_csv_output() {
+ let mock_ch_filter =
+ message::create_output_filter_config("test_files/config/channel_abbreviations.txt");
let test_filepath: &str = "test.evtx";
let test_rulepath: &str = "test-rule.yml";
let test_title = "test_title";
@@ -688,8 +741,17 @@ mod tests {
let test_attack = "execution/txxxx.yyy";
let test_recinfo = "record_infoinfo11";
let test_record_id = "11111";
+ let expect_time = Utc
+ .datetime_from_str("1996-02-27T01:05:01Z", "%Y-%m-%dT%H:%M:%SZ")
+ .unwrap();
+ let expect_tz = expect_time.with_timezone(&Local);
+ let output_profile: LinkedHashMap<String, String> = load_profile(
+ "test_files/config/default_profile.yaml",
+ "test_files/config/profiles.yaml",
+ )
+ .unwrap();
{
- let mut messages = print::MESSAGES.lock().unwrap();
+ let messages = &message::MESSAGES;
messages.clear();
let val = r##"
{
@@ -706,33 +768,46 @@ mod tests {
}
"##;
let event: Value = serde_json::from_str(val).unwrap();
- messages.insert(
- &event,
- output.to_string(),
- DetectInfo {
- filepath: test_filepath.to_string(),
- rulepath: test_rulepath.to_string(),
- level: test_level.to_string(),
- computername: test_computername.to_string(),
- eventid: test_eventid.to_string(),
- channel: CH_CONFIG
+ let mut profile_converter: HashMap<String, String> = HashMap::from([
+ ("%Timestamp%".to_owned(), format_time(&expect_time, false)),
+ ("%Computer%".to_owned(), test_computername.to_string()),
+ (
+ "%Channel%".to_owned(),
+ mock_ch_filter
.get("Security")
.unwrap_or(&String::default())
.to_string(),
- alert: test_title.to_string(),
+ ),
+ ("%Level%".to_owned(), test_level.to_string()),
+ ("%EventID%".to_owned(), test_eventid.to_string()),
+ ("%MitreAttack%".to_owned(), test_attack.to_string()),
+ ("%RecordID%".to_owned(), test_record_id.to_string()),
+ ("%RuleTitle%".to_owned(), test_title.to_owned()),
+ ("%RecordInformation%".to_owned(), test_recinfo.to_owned()),
+ ("%RuleFile%".to_owned(), test_rulepath.to_string()),
+ ("%EvtxFile%".to_owned(), test_filepath.to_string()),
+ ("%Tags%".to_owned(), test_attack.to_string()),
+ ]);
+ message::insert(
+ &event,
+ output.to_string(),
+ DetectInfo {
+ rulepath: test_rulepath.to_string(),
+ ruletitle: test_title.to_string(),
+ level: test_level.to_string(),
+ computername: test_computername.to_string(),
+ eventid: test_eventid.to_string(),
detail: String::default(),
- tag_info: test_attack.to_string(),
record_information: Option::Some(test_recinfo.to_string()),
- record_id: Option::Some(test_record_id.to_string()),
+ ext_field: output_profile,
},
+ expect_time,
+ &mut profile_converter,
+ false,
);
}
- let expect_time = Utc
- .datetime_from_str("1996-02-27T01:05:01Z", "%Y-%m-%dT%H:%M:%SZ")
- .unwrap();
- let expect_tz = expect_time.with_timezone(&Local);
let expect =
- "Timestamp,Computer,Channel,EventID,Level,MitreAttack,RecordID,RuleTitle,Details,RecordInformation,RulePath,FilePath\n"
+ "Timestamp,Computer,Channel,Level,EventID,MitreAttack,RecordID,RuleTitle,Details,RecordInformation,RuleFile,EvtxFile,Tags\n"
.to_string()
+ &expect_tz
.clone()
@@ -743,10 +818,10 @@ mod tests {
+ ","
+ test_channel
+ ","
- + test_eventid
- + ","
+ test_level
+ ","
+ + test_eventid
+ + ","
+ test_attack
+ ","
+ test_record_id
@@ -760,9 +835,11 @@ mod tests {
+ test_rulepath
+ ","
+ test_filepath
+ + ","
+ + test_attack
+ "\n";
let mut file: Box<dyn io::Write> = Box::new(File::create("./test_emit_csv.csv").unwrap());
- assert!(emit_csv(&mut file, false, HashMap::default(), 1).is_ok());
+ assert!(emit_csv(&mut file, false, HashMap::new(), 1).is_ok());
match read_to_string("./test_emit_csv.csv") {
Err(_) => panic!("Failed to open file."),
Ok(s) => {
@@ -770,10 +847,10 @@ mod tests {
}
};
assert!(remove_file("./test_emit_csv.csv").is_ok());
- check_emit_csv_display();
}
- fn check_emit_csv_display() {
+ #[test]
+ fn test_emit_csv_display() {
let test_title = "test_title2";
let test_level = "medium";
let test_computername = "testcomputer2";
@@ -786,44 +863,42 @@ mod tests {
let test_timestamp = Utc
.datetime_from_str("1996-02-27T01:05:01Z", "%Y-%m-%dT%H:%M:%SZ")
.unwrap();
- let expect_header = "Timestamp|Computer|Channel|EventID|Level|RecordID|RuleTitle|Details\n";
+ let expect_header = "Timestamp|Computer|Channel|EventID|Level|RecordID|RuleTitle|Details|RecordInformation\n";
let expect_tz = test_timestamp.with_timezone(&Local);
let expect_no_header = expect_tz
.clone()
.format("%Y-%m-%d %H:%M:%S%.3f %:z")
.to_string()
- + "|"
+ + " | "
+ test_computername
- + "|"
+ + " | "
+ test_channel
- + "|"
+ + " | "
+ test_eventid
- + "|"
+ + " | "
+ test_level
- + "|"
+ + " | "
+ test_recid
- + "|"
+ + " | "
+ test_title
- + "|"
+ + " | "
+ output
- + "|"
+ + " | "
+ test_recinfo
+ "\n";
- assert_eq!(_get_serialized_disp_output(None,), expect_header);
- assert_eq!(
- _get_serialized_disp_output(Some(DisplayFormat {
- timestamp: &format_time(&test_timestamp, false),
- level: test_level,
- computer: test_computername,
- event_i_d: test_eventid,
- channel: test_channel,
- rule_title: test_title,
- details: output,
- record_information: Some(test_recinfo),
- record_i_d: Some(test_recid),
- })),
- expect_no_header
- );
+ let mut data: LinkedHashMap<String, String> = LinkedHashMap::new();
+ data.insert("Timestamp".to_owned(), format_time(&test_timestamp, false));
+ data.insert("Computer".to_owned(), test_computername.to_owned());
+ data.insert("Channel".to_owned(), test_channel.to_owned());
+ data.insert("EventID".to_owned(), test_eventid.to_owned());
+ data.insert("Level".to_owned(), test_level.to_owned());
+ data.insert("RecordID".to_owned(), test_recid.to_owned());
+ data.insert("RuleTitle".to_owned(), test_title.to_owned());
+ data.insert("Details".to_owned(), output.to_owned());
+ data.insert("RecordInformation".to_owned(), test_recinfo.to_owned());
+
+ assert_eq!(_get_serialized_disp_output(&data, true), expect_header);
+ assert_eq!(_get_serialized_disp_output(&data, false), expect_no_header);
}
}
diff --git a/src/detections/configs.rs b/src/detections/configs.rs
index 883d7858..feda62a4 100644
--- a/src/detections/configs.rs
+++ b/src/detections/configs.rs
@@ -1,13 +1,13 @@
+use crate::detections::message::AlertMessage;
use crate::detections::pivot::PivotKeyword;
use crate::detections::pivot::PIVOT_KEYWORD;
-use crate::detections::print::AlertMessage;
use crate::detections::utils;
use chrono::{DateTime, Utc};
use clap::{App, CommandFactory, Parser};
-use hashbrown::HashMap;
-use hashbrown::HashSet;
+use hashbrown::{HashMap, HashSet};
use lazy_static::lazy_static;
use regex::Regex;
+use std::env::current_exe;
use std::path::PathBuf;
use std::sync::RwLock;
use terminal_size::{terminal_size, Height, Width};
@@ -32,6 +32,10 @@ lazy_static! {
pub static ref TERM_SIZE: Option<(Width, Height)> = terminal_size();
pub static ref TARGET_EXTENSIONS: HashSet<String> =
get_target_extensions(CONFIG.read().unwrap().args.evtx_file_ext.as_ref());
+ pub static ref CURRENT_EXE_PATH: PathBuf =
+ current_exe().unwrap().parent().unwrap().to_path_buf();
+ pub static ref EXCLUDE_STATUS: HashSet<String> =
+ convert_option_vecs_to_hs(CONFIG.read().unwrap().args.exclude_status.as_ref());
}
pub struct ConfigReader<'a> {
@@ -51,78 +55,74 @@ impl Default for ConfigReader<'_> {
#[derive(Parser)]
#[clap(
name = "Hayabusa",
- usage = "hayabusa.exe -f file.evtx [OPTIONS] / hayabusa.exe -d evtx-directory [OPTIONS]",
+ usage = "hayabusa.exe [OTHER-ACTIONS] [OPTIONS]",
author = "Yamato Security (https://github.com/Yamato-Security/hayabusa) @SecurityYamato)",
+ help_template = "\n{name} {version}\n{author}\n\n{usage-heading}\n {usage}\n\n{all-args}\n",
version,
term_width = 400
)]
pub struct Config {
/// Directory of multiple .evtx files
- #[clap(short = 'd', long, value_name = "DIRECTORY")]
+ #[clap(help_heading = Some("INPUT"), short = 'd', long, value_name = "DIRECTORY")]
pub directory: Option<PathBuf>,
/// File path to one .evtx file
- #[clap(short = 'f', long, value_name = "FILE_PATH")]
+ #[clap(help_heading = Some("INPUT"), short = 'f', long = "file", value_name = "FILE")]
pub filepath: Option<PathBuf>,
- /// Print all field information
- #[clap(short = 'F', long = "full-data")]
- pub full_data: bool,
-
- /// Specify a rule directory or file (default: ./rules)
+ /// Specify a custom rule directory or file (default: ./rules)
#[clap(
+ help_heading = Some("ADVANCED"),
short = 'r',
long,
default_value = "./rules",
hide_default_value = true,
- value_name = "RULE_DIRECTORY/RULE_FILE"
+ value_name = "DIRECTORY/FILE"
)]
pub rules: PathBuf,
- /// Specify custom rule config folder (default: ./rules/config)
+ /// Specify custom rule config directory (default: ./rules/config)
#[clap(
+ help_heading = Some("ADVANCED"),
short = 'c',
- long,
+ long = "rules-config",
default_value = "./rules/config",
hide_default_value = true,
- value_name = "RULE_CONFIG_DIRECTORY"
+ value_name = "DIRECTORY"
)]
pub config: PathBuf,
/// Save the timeline in CSV format (ex: results.csv)
- #[clap(short = 'o', long, value_name = "CSV_TIMELINE")]
+ #[clap(help_heading = Some("OUTPUT"), short = 'o', long, value_name = "FILE")]
pub output: Option<PathBuf>,
- /// Output all tags when saving to a CSV file
- #[clap(long = "all-tags")]
- pub all_tags: bool,
-
- /// Do not display EventRecordID numbers
- #[clap(short = 'R', long = "hide-record-id")]
- pub hide_record_id: bool,
-
/// Output verbose information
- #[clap(short = 'v', long)]
+ #[clap(help_heading = Some("DISPLAY-SETTINGS"), short = 'v', long)]
pub verbose: bool,
/// Output event frequency timeline
- #[clap(short = 'V', long = "visualize-timeline")]
+ #[clap(help_heading = Some("DISPLAY-SETTINGS"), short = 'V', long = "visualize-timeline")]
pub visualize_timeline: bool,
/// Enable rules marked as deprecated
- #[clap(short = 'D', long = "enable-deprecated-rules")]
+ #[clap(help_heading = Some("FILTERING"), long = "enable-deprecated-rules")]
pub enable_deprecated_rules: bool,
+ /// Disable event ID filter to scan all events
+ #[clap(help_heading = Some("FILTERING"), short = 'D', long = "deep-scan")]
+ pub deep_scan: bool,
+
/// Enable rules marked as noisy
- #[clap(short = 'n', long = "enable-noisy-rules")]
+ #[clap(help_heading = Some("FILTERING"), short = 'n', long = "enable-noisy-rules")]
pub enable_noisy_rules: bool,
/// Update to the latest rules in the hayabusa-rules github repository
- #[clap(short = 'u', long = "update-rules")]
+ #[clap(help_heading = Some("OTHER-ACTIONS"), short = 'u', long = "update-rules")]
pub update_rules: bool,
/// Minimum level for rules (default: informational)
#[clap(
+ help_heading = Some("FILTERING"),
short = 'm',
long = "min-level",
default_value = "informational",
@@ -132,85 +132,101 @@ pub struct Config {
pub min_level: String,
/// Analyze the local C:\Windows\System32\winevt\Logs folder
- #[clap(short = 'l', long = "live-analysis")]
+ #[clap(help_heading = Some("INPUT"), short = 'l', long = "live-analysis")]
pub live_analysis: bool,
/// Start time of the event logs to load (ex: "2020-02-22 00:00:00 +09:00")
- #[clap(long = "start-timeline", value_name = "START_TIMELINE")]
+ #[clap(help_heading = Some("FILTERING"), long = "timeline-start", value_name = "DATE")]
pub start_timeline: Option<String>,
/// End time of the event logs to load (ex: "2022-02-22 23:59:59 +09:00")
- #[clap(long = "end-timeline", value_name = "END_TIMELINE")]
+ #[clap(help_heading = Some("FILTERING"), long = "timeline-end", value_name = "DATE")]
pub end_timeline: Option<String>,
/// Output timestamp in RFC 2822 format (ex: Fri, 22 Feb 2022 22:00:00 -0600)
- #[clap(long = "RFC-2822")]
+ #[clap(help_heading = Some("TIME-FORMAT"), long = "RFC-2822")]
pub rfc_2822: bool,
/// Output timestamp in RFC 3339 format (ex: 2022-02-22 22:00:00.123456-06:00)
- #[clap(long = "RFC-3339")]
+ #[clap(help_heading = Some("TIME-FORMAT"), long = "RFC-3339")]
pub rfc_3339: bool,
/// Output timestamp in US time format (ex: 02-22-2022 10:00:00.123 PM -06:00)
- #[clap(long = "US-time")]
+ #[clap(help_heading = Some("TIME-FORMAT"), long = "US-time")]
pub us_time: bool,
/// Output timestamp in US military time format (ex: 02-22-2022 22:00:00.123 -06:00)
- #[clap(long = "US-military-time")]
+ #[clap(help_heading = Some("TIME-FORMAT"), long = "US-military-time")]
pub us_military_time: bool,
/// Output timestamp in European time format (ex: 22-02-2022 22:00:00.123 +02:00)
- #[clap(long = "European-time")]
+ #[clap(help_heading = Some("TIME-FORMAT"), long = "European-time")]
pub european_time: bool,
/// Output time in UTC format (default: local time)
- #[clap(short = 'U', long = "UTC")]
+ #[clap(help_heading = Some("TIME-FORMAT"), short = 'U', long = "UTC")]
pub utc: bool,
/// Disable color output
- #[clap(long = "no-color")]
+ #[clap(help_heading = Some("DISPLAY-SETTINGS"), long = "no-color")]
pub no_color: bool,
/// Thread number (default: optimal number for performance)
- #[clap(short, long = "thread-number", value_name = "NUMBER")]
+ #[clap(help_heading = Some("ADVANCED"), short, long = "thread-number", value_name = "NUMBER")]
pub thread_number: Option<usize>,
/// Print statistics of event IDs
- #[clap(short, long)]
+ #[clap(help_heading = Some("OTHER-ACTIONS"), short, long)]
pub statistics: bool,
/// Print a summary of successful and failed logons
- #[clap(short = 'L', long = "logon-summary")]
+ #[clap(help_heading = Some("OTHER-ACTIONS"), short = 'L', long = "logon-summary")]
pub logon_summary: bool,
/// Tune alert levels (default: ./rules/config/level_tuning.txt)
#[clap(
+ help_heading = Some("OTHER-ACTIONS"),
long = "level-tuning",
- default_value = "./rules/config/level_tuning.txt",
hide_default_value = true,
- value_name = "LEVEL_TUNING_FILE"
+ value_name = "FILE"
)]
- pub level_tuning: PathBuf,
+ pub level_tuning: Option<PathBuf>