diff --git a/.gitignore b/.gitignore
index d7c5ba26..eec35107 100644
--- a/.gitignore
+++ b/.gitignore
@@ -5,4 +5,6 @@
 .DS_Store
 test_*
 .env
-/logs
\ No newline at end of file
+/logs
+*.csv
+hayabusa*
\ No newline at end of file
diff --git a/CHANGELOG-Japanese.md b/CHANGELOG-Japanese.md
index b8e75cb8..de5ea40e 100644
--- a/CHANGELOG-Japanese.md
+++ b/CHANGELOG-Japanese.md
@@ -1,20 +1,99 @@
 # 変更点
 
-## v1.4 [2022/XX/XX]
+## v1.6.0 [2022/XX/XX]
+
+**新機能:**
+
+- XXX
+
+**改善:**
+
+- 結果概要に各レベルで検知した上位5つのルールを表示するようにした。 (#667) (@hitenkoku)
+- 結果概要を出力しないようにするために `--no-summary` オプションを追加した。 (#672) (@hitenkoku)
+- 結果概要の表示を短縮させた。 (#675) (@hitenkoku)
+
+**バグ修正:**
+
+- ログオン情報の要約オプションを指定した場合に、Hayabusaがクラッシュしていたのを修正した。 (#674) (@hitenkoku)
+
+## v1.5.1 [2022/08/20]
+
+**改善:**
+
+- TimesketchにインポートできるCSV形式を出力するプロファイルを追加して、v1.5.1を再リリースした。 (#668) (@YamatoSecurity)
+
+## v1.5.1 [2022/08/19]
+
+**バグ修正:**
+
+- Critical、medium、lowレベルのアラートはカラーで出力されていなかった。 (#663) (@fukusuket)
+- `-f`で存在しないevtxファイルが指定された場合は、Hayabusaがクラッシュしていた。 (#664) (@fukusuket)
+
+## v1.5.0 [2022/08/18]
+
+**新機能:**
+
+- `config/profiles.yaml`と`config/default_profile.yaml`の設定ファイルで、出力内容をカスタマイズできる。 (#165) (@hitenkoku)
+- 対象のフィールドがレコード内に存在しないことを確認する `null` キーワードに対応した。 (#643) (@hitenkoku)
+
+**改善:**
+
+- ルールのアップデート機能のルールパスの出力から`./`を削除した。 (#642) (@hitenkoku)
+- MITRE ATT&CK関連のタグとその他タグを出力するための出力用のエイリアスを追加した。 (#637) (@hitenkoku)
+- 結果概要の数値にカンマをつけて見やすくした。 (#649) (@hitenkoku)
+- `-h`オプションでメニューを使いやすいようにグループ化した。 (#651) (@YamatoSecurity and @hitenkoku)
+- 結果概要内の検知数にパーセント表示を追加した。 (#658) (@hitenkoku)
+
+**バグ修正:**
+
+- aggregation conditionのルール検知が原因で検知しなかったイベント数の集計に誤りがあったので修正した。 (#640) (@hitenkoku)
+- 一部のイベント(0.01%程度)が検出されないレースコンディションの不具合を修正した。 (#639 #660) (@fukusuket)
+
+## v1.4.3 [2022/08/03]
+
+**バグ修正:**
+
+- VC再頒布パッケージがインストールされていない環境でエラーが発生する問題を修正した。 (#635) (@fukusuket)
+
+## v1.4.2 [2022/07/24]
+
+**改善:**
+
+- `--update-rules` オプションを利用する時に、更新対象のレポジトリを`--rules`オプションで指定できるようにした。 (#615) (@hitenkoku)
+- 並列処理の改善による高速化。 (#479) (@kazuminn)
+- `--output`オプションを利用したときのRulePathをRuleFileに変更した。RuleFileは出力するファイルの容量を低減させるためにファイル名のみを出力するようにした。 (#623) (@hitenkoku)
+
+**バグ修正:**
+
+- `cargo run`コマンドでhayabusaを実行するとconfigフォルダの読み込みエラーが発生する問題を修正した。 (#618) (@hitenkoku)
+
+## v1.4.1 [2022/06/30]
+
+**改善:**
+
+- ルールや`./rules/config/default_details.txt` に対応する`details`の記載がない場合、すべてのフィールド情報を結果の`Details`列に出力するようにした。 (#606) (@hitenkoku)
+- `--deep-scan`オプションの追加。 このオプションがない場合、`config/target_event_ids.txt`で指定されたイベントIDのみをスキャン対象とします。 このオプションをつけることですべてのイベントIDをスキャン対象とします。(#608) (@hitenkoku)
+- `-U, --update-rules`オプションで`channel_abbreviations.txt`、`statistics_event_info.txt`、`target_event_IDs.txt`を更新できるように、`config`ディレクトリから`rules/config`ディレクトリに移動した。
+
+## v1.4.0 [2022/06/26]
 
 **新機能:**
 
 - `--target-file-ext` オプションの追加。evtx以外の拡張子を指定する事ができます。ただし、ファイルの中身の形式はevtxファイル形式である必要があります。 (#586) (@hitenkoku)
+- `--exclude-status` オプションの追加。ルール内の`status`フィールドをもとに、読み込み対象から除外するフィルタを利用することができます。 (#596) (@hitenkoku)
 
 **改善:**
 
+- ルール内に`details`フィールドがないときに、`rules/config/default_details.txt`に設定されたデフォルトの出力を行えるようにした。 (#359) (@hitenkoku)
 - Clap Crateパッケージの更新 (#413) (@hitenkoku)
 - オプションの指定がないときに、`--help`と同じ画面出力を行うように変更した。(#387) (@hitenkoku)
-- ルール内に`details`フィールドがないときに、`rules/config/default_details.txt`に設定されたデフォルトの出力を行えるようにした。 (#359) (@hitenkoku)
+- hayabusa.exeをカレントワーキングディレクトリ以外から動作できるようにした。 (#592) (@hitenkoku)
+- `output` オプションで指定されたファイルのサイズを出力するようにした。 (#595) (@hitenkoku)
 
 **バグ修正:**
 
-- XXX
+- カラー出力で長い出力があった場合にエラーが出て終了する問題を修正した。 (#603) (@hitenkoku)
+- `Excluded rules`の合計で`rules/tools/sigmac/testfiles`配下のテストルールも入っていたので、無視するようにした。 (#602) (@hitenkoku)
 
 ## v1.3.2 [2022/06/13]
 
@@ -35,6 +114,7 @@
 - `--rfc-3339` オプションの時刻表示形式を変更した。 (#574) (@hitenkoku)
 - `-R/ --display-record-id`オプションを`-R/ --hide-record-id`に変更。レコードIDはデフォルトで出力するようにして`-R`オプションを付けた際に表示しないように変更した。(#579) (@hitenkoku)
 - ルール読み込み時のメッセージを追加した。 (#583) (@hitenkoku)
+- `rules/tools/sigmac/testfiles`内のテスト用のymlファイルを読み込まないようにした。
(#602) (@hitenkoku) **バグ修正:** @@ -97,7 +177,7 @@ **新機能:** -- `-C / --config` オプションの追加。検知ルールのコンフィグを指定することが可能。(Windowsでのライブ調査に便利) (@hitenkoku) +- `-C / --config` オプションの追加。検知ルールのコンフィグを指定することが可能。(Windowsでのライブ調査に便利) (@hitenkoku) - `|equalsfield` と記載することでルール内で二つのフィールドの値が一致するかを記載に対応。 (@hach1yon) - `-p / --pivot-keywords-list` オプションの追加。攻撃されたマシン名や疑わしいユーザ名などの情報をピボットキーワードリストとして出力する。 (@kazuminn) - `-F / --full-data`オプションの追加。ルールの`details`で指定されたフィールドだけではなく、全フィールド情報を出力する。(@hach1yon) @@ -128,7 +208,7 @@ - `-r / --rules`オプションで一つのルール指定が可能。(ルールをテストする際に便利!) (@kazuminn) - ルール更新オプション (`-u / --update-rules`): [hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules)レポジトリにある最新のルールに更新できる。 (@hitenkoku) -- ライブ調査オプション (`-l / --live-analysis`): Windowsイベントログディレクトリを指定しないで、楽にWindows端末でライブ調査ができる。(@hitenkoku) +- ライブ調査オプション (`-l / --live-analysis`): Windowsイベントログディレクトリを指定しないで、楽にWindows端末でライブ調査ができる。(@hitenkoku) **改善:** diff --git a/CHANGELOG.md b/CHANGELOG.md index 6f8a06e6..1310b7bd 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,20 +1,99 @@ # Changes -## v1.4 [2022/XX/XX] +## v1.6.0 [2022/XX/XX] + +**New Features:** + +- XXX + +**Enhancements:** + +- Added top alerts to results summary. (#667) (@hitenkoku) +- Added `--no-summary` option to not display the results summary. (#672) (@hitenkoku) +- Made the results summary more compact. (#675) (@hitenkoku) + +**Bug Fixes:** + +- Hayabusa would crash with `-L` option (logon summary option). (#674) (@hitenkoku) + +## v1.5.1 [2022/08/20] + +**Enhancements:** + +- Re-released v1.5.1 with an updated output profile that is compatible with Timesketch. (#668) (@YamatoSecurity) + +## v1.5.1 [2022/08/19] + +**Bug Fixes:** + +- Critical, medium and low level alerts were not being displayed in color. (#663) (@fukusuket) +- Hayabusa would crash when an evtx file specified with `-f` did not exist. 
(#664) (@fukusuket)
+
+## v1.5.0 [2022/08/18]
+
+**New Features:**
+
+- Customizable output of fields defined in `config/profiles.yaml` and `config/default_profile.yaml`. (#165) (@hitenkoku)
+- Implemented the `null` keyword for rule detection. It is used to check if a target field exists or not. (#643) (@hitenkoku)
+
+**Enhancements:**
+
+- Trimmed `./` from the rule path when updating. (#642) (@hitenkoku)
+- Added new output aliases for MITRE ATT&CK tags and other tags. (#637) (@hitenkoku)
+- Organized the menu output when `-h` is used. (#651) (@YamatoSecurity and @hitenkoku)
+- Added commas to summary numbers to make them easier to read. (#649) (@hitenkoku)
+- Added the percentage of detections to the results summary. (#658) (@hitenkoku)
+
+**Bug Fixes:**
+
+- Fixed miscalculation of Data Reduction due to aggregation condition rule detection. (#640) (@hitenkoku)
+- Fixed a race condition bug where a few events (around 0.01%) would not be detected. (#639 #660) (@fukusuket)
+
+## v1.4.3 [2022/08/03]
+
+**Bug Fixes:**
+
+- Hayabusa would not run on Windows 11 when the VC redistributable package was not installed, but now everything is compiled statically. (#635) (@fukusuket)
+
+## v1.4.2 [2022/07/24]
+
+**Enhancements:**
+
+- You can now update rules to a custom directory by combining the `--update-rules` and `--rules` options. (#615) (@hitenkoku)
+- Improved speed with parallel processing by up to 20% with large files. (#479) (@kazuminn)
+- When saving files with `-o`, the `.yml` detection rule path column changed from `RulePath` to `RuleFile` and only the rule file name will be saved in order to decrease file size. (#623) (@hitenkoku)
+
+**Bug Fixes:**
+
+- Fixed a runtime error when hayabusa is run from a different path than the current directory. (#618) (@hitenkoku)
+
+## v1.4.1 [2022/06/30]
+
+**Enhancements:**
+
+- When a `details` field is defined neither in a rule nor in `./rules/config/default_details.txt`, all fields will be output to the `details` column. (#606) (@hitenkoku)
+- Added the `-D, --deep-scan` option. Now by default, events are filtered to the Event IDs for which detection rules are defined in `./rules/config/target_event_IDs.txt`. This should improve performance by 25~55% while still detecting almost everything. If you want to do a thorough scan on all events, you can disable the event ID filter with `-D, --deep-scan`. (#608) (@hitenkoku)
+- `channel_abbreviations.txt`, `statistics_event_info.txt` and `target_event_IDs.txt` have been moved from the `config` directory to the `rules/config` directory in order to provide updates with `-U, --update-rules`.
+
+## v1.4.0 [2022/06/26]
 
 **New Features:**
 
 - Added `--target-file-ext` option. You can specify additional file extensions to scan in addition to the default `.evtx` files. For example, `--target-file-ext evtx_data` or multiple extensions with `--target-file-ext evtx1 evtx2`. (#586) (@hitenkoku)
+- Added `--exclude-status` option: You can ignore rules based on their `status`. (#596) (@hitenkoku)
 
 **Enhancements:**
 
+- Added default details output based on `rules/config/default_details.txt` when no `details` field in a rule is specified. (i.e. Sigma rules) (#359) (@hitenkoku)
 - Updated clap crate package to version 3. (#413) (@hitenkoku)
 - Updated the default usage and help menu. (#387) (@hitenkoku)
-- Added default details output based on `rules/config/default_details.txt` when no `details` field in a rule is specified. (i.e. Sigma rules) (#359) (@hitenkoku)
+- Hayabusa can be run from any directory, not just from the current directory. (#592) (@hitenkoku)
+- Added saved file size output when `output` is specified. (#595) (@hitenkoku)
 
 **Bug Fixes:**
 
-- XXX
+- Fixed output error and program termination when long output is displayed with color. (#603) (@hitenkoku)
+- Ignore loading yml files in `rules/tools/sigmac/testfiles` to fix the `Excluded rules` count.
(#602) (@hitenkoku) ## v1.3.2 [2022/06/13] @@ -99,7 +178,7 @@ **New Features:** -- Specify config directory (`-C / --config`): When specifying a different rules directory, the rules config directory will still be the default `rules/config`, so this option is useful when you want to test rules and their config files in a different directory. (@hitenkoku) +- Specify config directory (`-C / --config`): When specifying a different rules directory, the rules config directory will still be the default `rules/config`, so this option is useful when you want to test rules and their config files in a different directory. (@hitenkoku) - `|equalsfield` aggregator: In order to write rules that compare if two fields are equal or not. (@hach1yon) - Pivot keyword list generator feature (`-p / --pivot-keywords-list`): Will generate a list of keywords to grep for to quickly identify compromised machines, suspicious usernames, files, etc... (@kazuminn) - `-F / --full-data` option: Will output all field information in addition to the fields defined in the rule’s `details`. (@hach1yon) @@ -130,7 +209,7 @@ - Can specify a single rule with the `-r / --rules` option. (Great for testing rules!) (@kazuminn) - Rule update option (`-u / --update-rules`): Update to the latest rules in the [hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules) repository. (@hitenkoku) -- Live analysis option (`-l / --live-analysis`): Can easily perform live analysis on Windows machines without specifying the Windows event log directory. (@hitenkoku) +- Live analysis option (`-l / --live-analysis`): Can easily perform live analysis on Windows machines without specifying the Windows event log directory. 
(@hitenkoku) **Enhancements:** diff --git a/Cargo.lock b/Cargo.lock index f9856a8b..95d822b4 100644 --- a/Cargo.lock +++ b/Cargo.lock @@ -14,7 +14,7 @@ version = "0.7.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "fcb51a0695d8f838b1ee009b3fbf66bda078cd64590202a864a8f3e8c4315c47" dependencies = [ - "getrandom 0.2.7", + "getrandom", "once_cell", "version_check", ] @@ -28,6 +28,15 @@ dependencies = [ "memchr", ] +[[package]] +name = "android_system_properties" +version = "0.1.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d7ed72e1635e121ca3e79420540282af22da58be50de153d36f81ddc6b83aa9e" +dependencies = [ + "libc", +] + [[package]] name = "ansi_term" version = "0.12.1" @@ -39,21 +48,18 @@ dependencies = [ [[package]] name = "anyhow" -version = "1.0.58" +version = "1.0.62" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bb07d2053ccdbe10e2af2995a2f116c1330396493dc1269f6a91d0ae82e19704" - -[[package]] -name = "arrayref" -version = "0.3.6" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a4c527152e37cf757a3f78aae5a06fbeefdb07ccc535c980a3208ee3060dd544" +checksum = "1485d4d2cc45e7b201ee3767015c96faa5904387c9d87c6efdd0fb511f12d305" [[package]] name = "arrayvec" -version = "0.5.2" +version = "0.4.12" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "23b62fc65de8e4e7f52534fb52b0f3ed04746ae267519eef2a83941e8085068b" +checksum = "cd9fd44efafa8690358b7408d253adf110036b88f55672a933f01d616ad9b1b9" +dependencies = [ + "nodrop", +] [[package]] name = "atty" @@ -74,9 +80,9 @@ checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa" [[package]] name = "base-x" -version = "0.2.10" +version = "0.2.11" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "dc19a4937b4fbd3fe3379793130e42060d10627a360f2127802b10b87e7baf74" +checksum = 
"4cbbc9d0964165b47557570cce6c952866c2678457aca742aafc9fb771d30270" [[package]] name = "base64" @@ -90,17 +96,6 @@ version = "1.3.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a" -[[package]] -name = "blake2b_simd" -version = "0.5.11" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "afa748e348ad3be8263be728124b24a24f268266f6f5d58af9d75f6a40b5c587" -dependencies = [ - "arrayref", - "arrayvec", - "constant_time_eq", -] - [[package]] name = "bstr" version = "0.2.17" @@ -115,9 +110,9 @@ dependencies = [ [[package]] name = "bumpalo" -version = "3.10.0" +version = "3.11.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "37ccbd214614c6783386c1af30caf03192f17891059cecc394b4fb119e363de3" +checksum = "c1ad822118d20d2c234f427000d5acc36eabe1e29a348c89b63dd60b13f28e5d" [[package]] name = "bytecount" @@ -133,9 +128,9 @@ checksum = "14c189c53d098945499cdfa7ecc63567cf3886b3332b312a5b4585d8d3a6a610" [[package]] name = "bytes" -version = "1.1.0" +version = "1.2.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c4872d67bab6358e59559027aa3b9157c53d9358c51423c17554809a8858e0f8" +checksum = "ec8a7b6a70fde80372154c65702f00a0f56f3e1c36abbc6c440484be248856db" [[package]] name = "bytesize" @@ -145,9 +140,9 @@ checksum = "6c58ec36aac5066d5ca17df51b3e70279f5670a72102f5752cb7e7c856adfc70" [[package]] name = "camino" -version = "1.0.9" +version = "1.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "869119e97797867fd90f5e22af7d0bd274bd4635ebb9eb68c04f3f513ae6c412" +checksum = "88ad0e1e3e88dd237a156ab9f571021b8a158caa0ae44b1968a241efb5144c1e" dependencies = [ "serde", ] @@ -169,7 +164,7 @@ checksum = "4acbb09d9ee8e23699b9634375c72795d095bf268439da88562cf9b501f181fa" dependencies = [ "camino", "cargo-platform", - "semver 1.0.10", + "semver 1.0.13", "serde", "serde_json", ] @@ 
-191,15 +186,17 @@ checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd" [[package]] name = "chrono" -version = "0.4.19" +version = "0.4.22" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "670ad68c9088c2a963aaa298cb369688cf3f9465ce5e2d4ca10e6e0098a1ce73" +checksum = "bfd4d1b31faaa3a89d7934dbded3111da0d2ef28e3ebccdb4f0179f5929d1ef1" dependencies = [ - "libc", + "iana-time-zone", + "js-sys", "num-integer", "num-traits", "serde", "time 0.1.44", + "wasm-bindgen", "winapi", ] @@ -220,9 +217,9 @@ dependencies = [ [[package]] name = "clap" -version = "3.2.5" +version = "3.2.17" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d53da17d37dba964b9b3ecb5c5a1f193a2762c700e6829201e645b9381c99dc7" +checksum = "29e724a68d9319343bb3328c9cc2dfde263f4b3142ee1059a9980580171c954b" dependencies = [ "atty", "bitflags", @@ -237,9 +234,9 @@ dependencies = [ [[package]] name = "clap_derive" -version = "3.2.5" +version = "3.2.17" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c11d40217d16aee8508cc8e5fde8b4ff24639758608e5374e731b53f85749fb9" +checksum = "13547f7012c01ab4a0e8f8967730ada8f9fdf419e8b6c792788f39cf4e46eefa" dependencies = [ "heck", "proc-macro-error", @@ -250,24 +247,35 @@ dependencies = [ [[package]] name = "clap_lex" -version = "0.2.2" +version = "0.2.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5538cd660450ebeb4234cfecf8f2284b844ffc4c50531e66d584ad5b91293613" +checksum = "2850f2f5a82cbf437dd5af4d49848fbdfc27c157c3d010345776f952765261c5" dependencies = [ "os_str_bytes", ] [[package]] -name = "console" -version = "0.15.0" +name = "comfy-table" +version = "6.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a28b32d32ca44b70c3e4acd7db1babf555fa026e385fb95f18028f88848b3c31" +checksum = "85914173c2f558d61613bfbbf1911f14e630895087a7ed2fafc0f5319e1536e7" dependencies = [ - "encode_unicode", + "crossterm", 
+ "strum", + "strum_macros", + "unicode-width", +] + +[[package]] +name = "console" +version = "0.15.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "89eab4d20ce20cea182308bca13088fecea9c05f6776cf287205d41a0ed3c847" +dependencies = [ + "encode_unicode 0.3.6", "libc", "once_cell", - "regex", - "terminal_size", + "terminal_size 0.1.17", "unicode-width", "winapi", ] @@ -279,10 +287,10 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "fbdcdcb6d86f71c5e97409ad45898af11cbc995b4ee8112d59095a28d376c935" [[package]] -name = "constant_time_eq" -version = "0.1.5" +name = "core-foundation-sys" +version = "0.8.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "245097e9a4535ee1e3e3931fcfcd55a796a44c643e8596ff6566d68f09b87bbc" +checksum = "5827cebf4670468b8772dd191856768aedcb1b0278a04f989f7766351917b9dc" [[package]] name = "crc32fast" @@ -295,9 +303,9 @@ dependencies = [ [[package]] name = "crossbeam-channel" -version = "0.5.5" +version = "0.5.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4c02a4d71819009c192cf4872265391563fd6a84c81ff2c0f2a7026ca4c1d85c" +checksum = "c2dd04ddaf88237dc3b8d8f9a3c1004b506b54b3313403944054d23c0870c521" dependencies = [ "cfg-if", "crossbeam-utils", @@ -305,9 +313,9 @@ dependencies = [ [[package]] name = "crossbeam-deque" -version = "0.8.1" +version = "0.8.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6455c0ca19f0d2fbf751b908d5c55c1f5cbc65e03c4225427254b46890bdde1e" +checksum = "715e8152b692bba2d374b53d4875445368fdf21a94751410af607a5ac677d1fc" dependencies = [ "cfg-if", "crossbeam-epoch", @@ -316,9 +324,9 @@ dependencies = [ [[package]] name = "crossbeam-epoch" -version = "0.9.9" +version = "0.9.10" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "07db9d94cbd326813772c968ccd25999e5f8ae22f4f8d1b11effa37ef6ce281d" +checksum = 
"045ebe27666471bb549370b4b0b3e51b07f56325befa4284db65fc89c02511b1" dependencies = [ "autocfg", "cfg-if", @@ -330,14 +338,39 @@ dependencies = [ [[package]] name = "crossbeam-utils" -version = "0.8.9" +version = "0.8.11" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "8ff1f980957787286a554052d03c7aee98d99cc32e09f6d45f0a814133c87978" +checksum = "51887d4adc7b564537b15adcfb307936f8075dfcd5f00dde9a9f1d29383682bc" dependencies = [ "cfg-if", "once_cell", ] +[[package]] +name = "crossterm" +version = "0.25.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e64e6c0fbe2c17357405f7c758c1ef960fce08bdfb2c03d88d2a18d7e09c4b67" +dependencies = [ + "bitflags", + "crossterm_winapi", + "libc", + "mio", + "parking_lot", + "signal-hook", + "signal-hook-mio", + "winapi", +] + +[[package]] +name = "crossterm_winapi" +version = "0.9.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2ae1b35a484aa10e07fe0638d02301c5ad24de82d310ccbd2f3693da5f09bf1c" +dependencies = [ + "winapi", +] + [[package]] name = "csv" version = "1.1.6" @@ -360,6 +393,18 @@ dependencies = [ "memchr", ] +[[package]] +name = "dashmap" +version = "5.3.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3495912c9c1ccf2e18976439f4443f3fee0fd61f424ff99fde6a66b15ecb448f" +dependencies = [ + "cfg-if", + "hashbrown", + "lock_api", + "parking_lot_core", +] + [[package]] name = "dialoguer" version = "0.9.0" @@ -373,10 +418,20 @@ dependencies = [ ] [[package]] -name = "dirs" -version = "1.0.5" +name = "dirs-next" +version = "2.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "3fd78930633bd1c6e35c4b42b1df7b0cbc6bc191146e512bb3bedf243fcc3901" +checksum = "b98cf8ebf19c3d1b223e151f99a4f9f0690dca41414773390fc824184ac833e1" +dependencies = [ + "cfg-if", + "dirs-sys-next", +] + +[[package]] +name = "dirs-sys-next" +version = "0.1.2" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "4ebda144c4fe02d1f7ea1a7d9641b6fc6b580adcfa024ae48797ecdeb6825b4d" dependencies = [ "libc", "redox_users", @@ -397,9 +452,9 @@ checksum = "9ea835d29036a4087793836fa931b08837ad5e957da9e23886b29586fb9b6650" [[package]] name = "either" -version = "1.6.1" +version = "1.8.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e78d4f1cc4ae33bbfc157ed5d5a5ef3bc29227303d595861deb238fcec4e9457" +checksum = "90e5c1c8368803113bf0c9584fc495a58b86dc8a29edbf8fe877d21d9507e797" [[package]] name = "encode_unicode" @@ -407,6 +462,12 @@ version = "0.3.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a357d28ed41a50f9c765dbfe56cbc04a64e53e5fc58ba79fbc34c10ef3df831f" +[[package]] +name = "encode_unicode" +version = "1.0.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "34aa73646ffb006b8f5147f3dc182bd4bcb190227ce861fc4a4844bf8e3cb2c0" + [[package]] name = "encoding" version = "0.2.33" @@ -484,6 +545,27 @@ dependencies = [ "termcolor", ] +[[package]] +name = "errno" +version = "0.2.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f639046355ee4f37944e44f60642c6f3a7efa3cf6b78c78a0d989a8ce6c396a1" +dependencies = [ + "errno-dragonfly", + "libc", + "winapi", +] + +[[package]] +name = "errno-dragonfly" +version = "0.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "aa68f1b12764fab894d2755d2518754e71b4fd80ecfb822714a1206c2aab39bf" +dependencies = [ + "cc", + "libc", +] + [[package]] name = "error-chain" version = "0.12.4" @@ -496,7 +578,7 @@ dependencies = [ [[package]] name = "evtx" version = "0.7.3" -source = "git+https://github.com/Yamato-Security/hayabusa-evtx.git?rev=158d496#158d496e6f40a036fa30b35e245683c3f7981df6" +source = "git+https://github.com/Yamato-Security/hayabusa-evtx.git#f2689c0343d0487521b9572dc3b9e4c179bcc5c9" dependencies = [ "anyhow", "bitflags", @@ 
-509,7 +591,7 @@ dependencies = [ "indoc", "jemallocator", "log", - "quick-xml", + "quick-xml 0.23.0", "rayon", "rpmalloc", "serde", @@ -522,9 +604,9 @@ dependencies = [ [[package]] name = "fastrand" -version = "1.7.0" +version = "1.8.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c3fcf0cee53519c866c09b5de1f6c56ff9d647101f81c1964fa632e148896cdf" +checksum = "a7a407cfaa3385c4ae6b23e84623d48c2798d06e3e6a1878f7f59f17b3f86499" dependencies = [ "instant", ] @@ -588,30 +670,30 @@ checksum = "2022715d62ab30faffd124d40b76f4134a550a87792276512b18d63272333394" [[package]] name = "futures-channel" -version = "0.3.21" +version = "0.3.23" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c3083ce4b914124575708913bca19bfe887522d6e2e6d0952943f5eac4a74010" +checksum = "2bfc52cbddcfd745bf1740338492bb0bd83d76c67b445f91c5fb29fae29ecaa1" dependencies = [ "futures-core", ] [[package]] name = "futures-core" -version = "0.3.21" +version = "0.3.23" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0c09fd04b7e4073ac7156a9539b57a484a8ea920f79c7c675d05d289ab6110d3" +checksum = "d2acedae88d38235936c3922476b10fced7b2b68136f5e3c03c2d5be348a1115" [[package]] name = "futures-task" -version = "0.3.21" +version = "0.3.23" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "57c66a976bf5909d801bbef33416c41372779507e7a6b3a5e25e4749c58f776a" +checksum = "842fc63b931f4056a24d59de13fb1272134ce261816e063e634ad0c15cdc5306" [[package]] name = "futures-util" -version = "0.3.21" +version = "0.3.23" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d8b7abd5d659d9b90c8cba917f6ec750a74e2dc23902ef9cd4cc8c8b22e6036a" +checksum = "f0828a5471e340229c11c77ca80017937ce3c58cb788a17e5f1c2d5c485a9577" dependencies = [ "futures-core", "futures-task", @@ -619,17 +701,6 @@ dependencies = [ "pin-utils", ] -[[package]] -name = "getrandom" -version = "0.1.16" -source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "8fc3cb4d91f53b50155bdcfd23f6a4c39ae1969c2ae85982b135750cccaf5fce" -dependencies = [ - "cfg-if", - "libc", - "wasi 0.9.0+wasi-snapshot-preview1", -] - [[package]] name = "getrandom" version = "0.2.7" @@ -643,9 +714,9 @@ dependencies = [ [[package]] name = "git2" -version = "0.14.4" +version = "0.15.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d0155506aab710a86160ddb504a480d2964d7ab5b9e62419be69e0032bc5931c" +checksum = "2994bee4a3a6a51eb90c218523be382fd7ea09b16380b9312e9dbe955ff7c7d1" dependencies = [ "bitflags", "libc", @@ -664,23 +735,25 @@ checksum = "9b919933a397b79c37e33b77bb2aa3dc8eb6e165ad809e58ff75bc7db2e34574" [[package]] name = "hashbrown" -version = "0.12.1" +version = "0.12.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "db0d4cf898abf0081f964436dc980e96670a0f36863e4b83aaacdb65c9d7ccc3" +checksum = "8a9ee70c43aaf417c914396645a0fa852624801b24ebb7ae78fe8272889ac888" dependencies = [ "ahash", ] [[package]] name = "hayabusa" -version = "1.4.0-dev" +version = "1.6.0-dev" dependencies = [ "base64", "bytesize", "chrono", - "clap 3.2.5", + "clap 3.2.17", + "comfy-table", "crossbeam-utils", "csv", + "dashmap", "downcast-rs", "evtx", "flate2", @@ -690,22 +763,25 @@ dependencies = [ "hhmmss", "hyper", "is_elevated", + "itertools", "krapslog", "lazy_static", "linked-hash-map", "lock_api", + "num-format", "num_cpus", "openssl", "pbr", "prettytable-rs", - "quick-xml", + "quick-xml 0.24.0", + "rand", "regex", "serde", "serde_derive", "serde_json", "static_vcruntime", "termcolor", - "terminal_size", + "terminal_size 0.2.1", "tokio", "yaml-rust", ] @@ -749,7 +825,7 @@ checksum = "75f43d41e26995c17e71ee126451dd3941010b0514a81a9d11f3b341debc2399" dependencies = [ "bytes", "fnv", - "itoa 1.0.2", + "itoa 1.0.3", ] [[package]] @@ -786,9 +862,9 @@ dependencies = [ [[package]] name = "hyper" -version = "0.14.19" +version = "0.14.20" source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "42dc3c131584288d375f2d07f822b0cb012d8c6fb899a5b9fdb3cb7eb9b6004f" +checksum = "02c929dc5c39e335a03c405292728118860721b10190d98c2a0f0efd5baafbac" dependencies = [ "bytes", "futures-channel", @@ -798,7 +874,7 @@ dependencies = [ "http-body", "httparse", "httpdate", - "itoa 1.0.2", + "itoa 1.0.3", "pin-project-lite", "tokio", "tower-service", @@ -806,6 +882,19 @@ dependencies = [ "want", ] +[[package]] +name = "iana-time-zone" +version = "0.1.46" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ad2bfd338099682614d3ee3fe0cd72e0b6a41ca6a87f6a74a3bd593c91650501" +dependencies = [ + "android_system_properties", + "core-foundation-sys", + "js-sys", + "wasm-bindgen", + "winapi", +] + [[package]] name = "idna" version = "0.2.3" @@ -819,9 +908,9 @@ dependencies = [ [[package]] name = "indexmap" -version = "1.9.0" +version = "1.9.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6c6392766afd7964e2531940894cffe4bd8d7d17dbc3c1c4857040fd4b33bdb3" +checksum = "10a35a97730320ffe8e2d410b5d3b69279b98d2c14bdb8b70ea89ecf7888d41e" dependencies = [ "autocfg", "hashbrown", @@ -841,9 +930,9 @@ dependencies = [ [[package]] name = "indoc" -version = "1.0.6" +version = "1.0.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "05a0bd019339e5d968b37855180087b7b9d512c5046fbd244cf8c95687927d6e" +checksum = "adab1eaa3408fb7f0c777a73e7465fd5656136fc93b670eb6df3c88c2c1344e3" [[package]] name = "instant" @@ -854,6 +943,12 @@ dependencies = [ "cfg-if", ] +[[package]] +name = "io-lifetimes" +version = "0.7.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1ea37f355c05dde75b84bba2d767906ad522e97cd9e2eef2be7a4ab7fb442c06" + [[package]] name = "is_elevated" version = "0.1.2" @@ -863,6 +958,15 @@ dependencies = [ "winapi", ] +[[package]] +name = "itertools" +version = "0.10.3" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "a9a9d19fa1e79b6215ff29b9d6880b706147f16e9b1dbb1e4e5947b5b02bc5e3" +dependencies = [ + "either", +] + [[package]] name = "itoa" version = "0.4.8" @@ -871,15 +975,15 @@ checksum = "b71991ff56294aa922b450139ee08b3bfc70982c6b2c7562771375cf73542dd4" [[package]] name = "itoa" -version = "1.0.2" +version = "1.0.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "112c678d4050afce233f4f2852bb2eb519230b3cf12f33585275537d7e41578d" +checksum = "6c8af84674fe1f223a982c933a0ee1086ac4d4052aa0fb8060c12c6ad838e754" [[package]] name = "jemalloc-sys" -version = "0.5.0+5.3.0" +version = "0.5.1+5.3.0-patched" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f655c3ecfa6b0d03634595b4b54551d4bd5ac208b9e0124873949a7ab168f70b" +checksum = "b7c2b313609b95939cb0c5a5c6917fb9b7c9394562aa3ef44eb66ffa51736432" dependencies = [ "cc", "fs_extra", @@ -905,6 +1009,15 @@ dependencies = [ "libc", ] +[[package]] +name = "js-sys" +version = "0.3.59" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "258451ab10b34f8af53416d1fdab72c22e805f0c92a1136d59470ec0b11138b2" +dependencies = [ + "wasm-bindgen", +] + [[package]] name = "krapslog" version = "0.4.0" @@ -914,7 +1027,7 @@ dependencies = [ "anyhow", "atty", "chrono", - "clap 3.2.5", + "clap 3.2.17", "file-chunker", "indicatif", "memmap2", @@ -923,7 +1036,7 @@ dependencies = [ "rayon", "regex", "tempfile", - "terminal_size", + "terminal_size 0.1.17", ] [[package]] @@ -934,15 +1047,15 @@ checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646" [[package]] name = "libc" -version = "0.2.126" +version = "0.2.132" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "349d5a591cd28b49e1d1037471617a32ddcda5731b99419008085f72d5a53836" +checksum = "8371e4e5341c3a96db127eb2465ac681ced4c433e01dd0e938adbef26ba93ba5" [[package]] name = "libgit2-sys" -version = 
"0.13.4+1.4.2" +version = "0.14.0+1.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d0fa6563431ede25f5cc7f6d803c6afbc1c5d3ad3d4925d12c882bf2b526f5d1" +checksum = "47a00859c70c8a4f7218e6d1cc32875c4b55f6799445b842b0d8ed5e4c3d959b" dependencies = [ "cc", "libc", @@ -980,15 +1093,21 @@ dependencies = [ [[package]] name = "linked-hash-map" -version = "0.5.4" +version = "0.5.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7fb9b38af92608140b86b693604b9ffcc5824240a484d1ecd4795bacb2fe88f3" +checksum = "0717cef1bc8b636c6e1c1bbdefc09e6322da8a9321966e8928ef80d20f7f770f" + +[[package]] +name = "linux-raw-sys" +version = "0.0.46" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d4d2456c373231a208ad294c33dc5bff30051eafd954cd4caae83a712b12854d" [[package]] name = "lock_api" -version = "0.4.7" +version = "0.4.8" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "327fa5b6a6940e4699ec49a9beae1ea4845c6bab9314e4f84ac68742139d8c53" +checksum = "9f80bf5aacaf25cbfc8210d1cfb718f2bf3b11c4c54e5afe36c236853a8ec390" dependencies = [ "autocfg", "scopeguard", @@ -1017,9 +1136,9 @@ checksum = "2dffe52ecf27772e601905b7522cb4ef790d2cc203488bbd0e2fe85fcb74566d" [[package]] name = "memmap2" -version = "0.5.4" +version = "0.5.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d5172b50c23043ff43dd53e51392f36519d9b35a8f3a410d30ece5d1aedd58ae" +checksum = "95af15f345b17af2efc8ead6080fb8bc376f8cec1b35277b935637595fe77498" dependencies = [ "libc", ] @@ -1044,9 +1163,9 @@ dependencies = [ [[package]] name = "mio" -version = "0.8.3" +version = "0.8.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "713d550d9b44d89174e066b7a6217ae06234c10cb47819a88290d2b353c31799" +checksum = "57ee1c23c7c63b0c9250c339ffdc69255f110b298b901b9f6c82547b7b87caaf" dependencies = [ "libc", "log", @@ -1054,6 +1173,12 @@ dependencies = [ "windows-sys", 
] +[[package]] +name = "nodrop" +version = "0.1.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "72ef4a56884ca558e5ddb05a1d1e7e1bfd9a68d9ed024c21704cc98872dae1bb" + [[package]] name = "num-derive" version = "0.3.3" @@ -1065,6 +1190,16 @@ dependencies = [ "syn", ] +[[package]] +name = "num-format" +version = "0.4.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bafe4179722c2894288ee77a9f044f02811c86af699344c498b0840c698a2465" +dependencies = [ + "arrayvec", + "itoa 0.4.8", +] + [[package]] name = "num-integer" version = "0.1.45" @@ -1102,15 +1237,15 @@ checksum = "830b246a0e5f20af87141b25c173cd1b609bd7779a4617d6ec582abaf90870f3" [[package]] name = "once_cell" -version = "1.12.0" +version = "1.13.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7709cef83f0c1f58f666e746a08b21e0085f7440fa6a29cc194d68aac97a4225" +checksum = "074864da206b4973b84eb91683020dbefd6a8c3f0f38e054d93954e891935e4e" [[package]] name = "openssl" -version = "0.10.40" +version = "0.10.41" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "fb81a6430ac911acb25fe5ac8f1d2af1b4ea8a4fdfda0f1ee4292af2e2d8eb0e" +checksum = "618febf65336490dfcf20b73f885f5651a0c89c64c2d4a8c3662585a70bf5bd0" dependencies = [ "bitflags", "cfg-if", @@ -1140,18 +1275,18 @@ checksum = "ff011a302c396a5197692431fc1948019154afc178baf7d8e37367442a4601cf" [[package]] name = "openssl-src" -version = "111.20.0+1.1.1o" +version = "111.22.0+1.1.1q" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "92892c4f87d56e376e469ace79f1128fdaded07646ddf73aa0be4706ff712dec" +checksum = "8f31f0d509d1c1ae9cada2f9539ff8f37933831fd5098879e482aa687d659853" dependencies = [ "cc", ] [[package]] name = "openssl-sys" -version = "0.9.74" +version = "0.9.75" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "835363342df5fba8354c5b453325b110ffd54044e588c539cf2f20a8014e4cb1" +checksum = 
"e5f9bd0c2710541a3cda73d6f9ac4f1b240de4ae261065d309dbe73d9dceb42f" dependencies = [ "autocfg", "cc", @@ -1163,9 +1298,9 @@ dependencies = [ [[package]] name = "os_str_bytes" -version = "6.1.0" +version = "6.3.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "21326818e99cfe6ce1e524c2a805c189a99b5ae555a35d19f9a284b427d86afa" +checksum = "9ff7415e9ae3fff1225851df9e0d9e4e5479f947619774677a63572e55e80eff" [[package]] name = "parking_lot" @@ -1185,7 +1320,7 @@ checksum = "09a279cbf25cb0757810394fbc1e359949b59e348145c643a939a525692e6929" dependencies = [ "cfg-if", "libc", - "redox_syscall 0.2.13", + "redox_syscall", "smallvec", "windows-sys", ] @@ -1227,14 +1362,20 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "1df8c4ec4b0627e53bdf214615ad287367e482558cf84b109250b37464dc03ae" [[package]] -name = "prettytable-rs" -version = "0.8.0" +name = "ppv-lite86" +version = "0.2.16" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0fd04b170004fa2daccf418a7f8253aaf033c27760b5f225889024cf66d7ac2e" +checksum = "eb9f9e6e233e5c4a35559a617bf40a4ec447db2e84c20b55a6f83167b7e57872" + +[[package]] +name = "prettytable-rs" +version = "0.9.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5f375cb74c23b51d23937ffdeb48b1fbf5b6409d4b9979c1418c1de58bc8f801" dependencies = [ "atty", "csv", - "encode_unicode", + "encode_unicode 1.0.0", "lazy_static", "term", "unicode-width", @@ -1272,9 +1413,9 @@ checksum = "dbf0c48bc1d91375ae5c3cd81e3722dff1abcf81a30960240640d223f59fe0e5" [[package]] name = "proc-macro2" -version = "1.0.39" +version = "1.0.43" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c54b25569025b7fc9651de43004ae593a75ad88543b17178aa5e1b9c4f15f56f" +checksum = "0a2ca2c61bc9f3d74d2886294ab7b9853abd9c1ad903a3ac7815c58989bb7bab" dependencies = [ "unicode-ident", ] @@ -1287,9 +1428,9 @@ checksum = 
"e965d96c8162c607b0cd8d66047ad3c9fd35273c134d994327882c6e47f986a7" [[package]] name = "pulldown-cmark" -version = "0.9.1" +version = "0.9.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "34f197a544b0c9ab3ae46c359a7ec9cbbb5c7bf97054266fecb7ead794a181d6" +checksum = "2d9cc634bc78768157b5cbfe988ffcd1dcba95cd2b2f03a88316c08c6d00ed63" dependencies = [ "bitflags", "memchr", @@ -1307,6 +1448,15 @@ name = "quick-xml" version = "0.23.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9279fbdacaad3baf559d8cabe0acc3d06e30ea14931af31af79578ac0946decc" +dependencies = [ + "memchr", +] + +[[package]] +name = "quick-xml" +version = "0.24.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "678404d55890514fa1c01fe98cf280b674db93944fdcb70310dd3be1d0d63be7" dependencies = [ "memchr", "serde", @@ -1314,13 +1464,43 @@ dependencies = [ [[package]] name = "quote" -version = "1.0.19" +version = "1.0.21" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f53dc8cf16a769a6f677e09e7ff2cd4be1ea0f48754aac39520536962011de0d" +checksum = "bbe448f377a7d6961e30f5955f9b8d106c3f5e449d493ee1b125c1d43c2b5179" dependencies = [ "proc-macro2", ] +[[package]] +name = "rand" +version = "0.8.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404" +dependencies = [ + "libc", + "rand_chacha", + "rand_core", +] + +[[package]] +name = "rand_chacha" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88" +dependencies = [ + "ppv-lite86", + "rand_core", +] + +[[package]] +name = "rand_core" +version = "0.6.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d34f1408f55294453790c48b2f1ebbb1c5b4b7563eb1f418bcfcfdbb06ebb4e7" +dependencies = [ + "getrandom", +] + [[package]] 
name = "rayon" version = "1.5.3" @@ -1347,35 +1527,29 @@ dependencies = [ [[package]] name = "redox_syscall" -version = "0.1.57" +version = "0.2.16" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "41cc0f7e4d5d4544e8861606a285bb08d3e70712ccc7d2b84d7c0ccfaf4b05ce" - -[[package]] -name = "redox_syscall" -version = "0.2.13" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "62f25bc4c7e55e0b0b7a1d43fb893f4fa1361d0abe38b9ce4f323c2adfe6ef42" +checksum = "fb5a58c1855b4b6819d59012155603f0b22ad30cad752600aadfcb695265519a" dependencies = [ "bitflags", ] [[package]] name = "redox_users" -version = "0.3.5" +version = "0.4.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "de0737333e7a9502c789a36d7c7fa6092a49895d4faa31ca5df163857ded2e9d" +checksum = "b033d837a7cf162d7993aded9304e30a83213c648b6e389db233191f891e5c2b" dependencies = [ - "getrandom 0.1.16", - "redox_syscall 0.1.57", - "rust-argon2", + "getrandom", + "redox_syscall", + "thiserror", ] [[package]] name = "regex" -version = "1.5.6" +version = "1.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d83f127d94bdbcda4c8cc2e50f6f84f4b611f69c902699ca385a39c3a75f9ff1" +checksum = "4c4eb3267174b8c6c2f654116623910a0fef09c4753f8dd83db29c48a0df988b" dependencies = [ "aho-corasick", "memchr", @@ -1390,9 +1564,9 @@ checksum = "6c230d73fb8d8c1b9c0b3135c5142a8acee3a0558fb8db5cf1cb65f8d7862132" [[package]] name = "regex-syntax" -version = "0.6.26" +version = "0.6.27" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "49b3de9ec5dc0a3417da371aab17d729997c15010e7fd24ff707773a33bddb64" +checksum = "a3f87b73ce11b1619a3c6332f45341e0047173771e8b8b73f87bfeefb7b56244" [[package]] name = "remove_dir_all" @@ -1423,18 +1597,6 @@ dependencies = [ "pkg-config", ] -[[package]] -name = "rust-argon2" -version = "0.8.3" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = 
"4b18820d944b33caa75a71378964ac46f58517c92b6ae5f762636247c09e78fb" -dependencies = [ - "base64", - "blake2b_simd", - "constant_time_eq", - "crossbeam-utils", -] - [[package]] name = "rustc_version" version = "0.2.3" @@ -1445,10 +1607,30 @@ dependencies = [ ] [[package]] -name = "ryu" -version = "1.0.10" +name = "rustix" +version = "0.35.9" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f3f6f92acf49d1b98f7a81226834412ada05458b7364277387724a237f062695" +checksum = "72c825b8aa8010eb9ee99b75f05e10180b9278d161583034d7574c9d617aeada" +dependencies = [ + "bitflags", + "errno", + "io-lifetimes", + "libc", + "linux-raw-sys", + "windows-sys", +] + +[[package]] +name = "rustversion" +version = "1.0.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "97477e48b4cf8603ad5f7aaf897467cf42ab4218a38ef76fb14c2d6773a6d6a8" + +[[package]] +name = "ryu" +version = "1.0.11" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4501abdff3ae82a1c1b477a17252eb69cee9e66eb915c1abaa4f44d873df9f09" [[package]] name = "same-file" @@ -1476,9 +1658,9 @@ dependencies = [ [[package]] name = "semver" -version = "1.0.10" +version = "1.0.13" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a41d061efea015927ac527063765e73601444cdc344ba855bc7bd44578b25e1c" +checksum = "93f6841e709003d68bb2deee8c343572bf446003ec20a583e76f7b15cebf3711" dependencies = [ "serde", ] @@ -1491,18 +1673,18 @@ checksum = "388a1df253eca08550bef6c72392cfe7c30914bf41df5269b68cbd6ff8f570a3" [[package]] name = "serde" -version = "1.0.137" +version = "1.0.144" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "61ea8d54c77f8315140a05f4c7237403bf38b72704d031543aa1d16abbf517d1" +checksum = "0f747710de3dcd43b88c9168773254e809d8ddbdf9653b84e2554ab219f17860" dependencies = [ "serde_derive", ] [[package]] name = "serde_derive" -version = "1.0.137" +version = "1.0.144" source = 
"registry+https://github.com/rust-lang/crates.io-index" -checksum = "1f26faba0c3959972377d3b2d306ee9f71faee9714294e41bb777f83f88578be" +checksum = "94ed3a816fb1d101812f83e789f888322c34e291f894f19590dc310963e87a00" dependencies = [ "proc-macro2", "quote", @@ -1511,11 +1693,11 @@ dependencies = [ [[package]] name = "serde_json" -version = "1.0.81" +version = "1.0.85" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "9b7ce2b32a1aed03c558dc61a5cd328f15aff2dbc17daad8fb8af04d2100e15c" +checksum = "e55a28e3aaef9d5ce0506d0a14dbba8054ddc7e499ef522dd8b26859ec9d4a44" dependencies = [ - "itoa 1.0.2", + "itoa 1.0.3", "ryu", "serde", ] @@ -1535,6 +1717,27 @@ version = "1.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ae1a47186c03a32177042e55dbc5fd5aee900b8e0069a8d70fba96a9375cd012" +[[package]] +name = "signal-hook" +version = "0.3.14" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a253b5e89e2698464fc26b545c9edceb338e18a89effeeecfea192c3025be29d" +dependencies = [ + "libc", + "signal-hook-registry", +] + +[[package]] +name = "signal-hook-mio" +version = "0.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "29ad2e15f37ec9a6cc544097b78a1ec90001e9f71b81338ca39f430adaca99af" +dependencies = [ + "libc", + "mio", + "signal-hook", +] + [[package]] name = "signal-hook-registry" version = "1.4.0" @@ -1572,15 +1775,15 @@ dependencies = [ [[package]] name = "smallvec" -version = "1.8.0" +version = "1.9.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f2dd574626839106c320a323308629dcb1acfc96e32a8cba364ddc61ac23ee83" +checksum = "2fd0db749597d91ff862fd1d55ea87f7855a744a8425a64695b6fca237d1dad1" [[package]] name = "socket2" -version = "0.4.4" +version = "0.4.6" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "66d72b759436ae32898a2af0a14218dbf55efde3feeb170eb623637db85ee1e0" +checksum = 
"10c98bba371b9b22a71a9414e420f92ddeb2369239af08200816169d5e2dd7aa" dependencies = [ "libc", "winapi", @@ -1663,10 +1866,29 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "73473c0e59e6d5812c5dfe2a064a6444949f089e20eec9a2e5506596494e4623" [[package]] -name = "syn" -version = "1.0.98" +name = "strum" +version = "0.24.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c50aef8a904de4c23c788f104b7dddc7d6f79c647c7c8ce4cc8f73eb0ca773dd" +checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f" + +[[package]] +name = "strum_macros" +version = "0.24.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59" +dependencies = [ + "heck", + "proc-macro2", + "quote", + "rustversion", + "syn", +] + +[[package]] +name = "syn" +version = "1.0.99" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "58dbef6ec655055e20b86b15a8cc6d439cca19b667537ac6a1369572d151ab13" dependencies = [ "proc-macro2", "quote", @@ -1682,19 +1904,19 @@ dependencies = [ "cfg-if", "fastrand", "libc", - "redox_syscall 0.2.13", + "redox_syscall", "remove_dir_all", "winapi", ] [[package]] name = "term" -version = "0.5.2" +version = "0.7.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "edd106a334b7657c10b7c540a0106114feadeb4dc314513e97df481d5d966f42" +checksum = "c59df8ac95d96ff9bede18eb7300b0fda5e5d8d90960e76f8e14ae765eedbf1f" dependencies = [ - "byteorder", - "dirs", + "dirs-next", + "rustversion", "winapi", ] @@ -1717,6 +1939,16 @@ dependencies = [ "winapi", ] +[[package]] +name = "terminal_size" +version = "0.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8440c860cf79def6164e4a0a983bcc2305d82419177a0e0c71930d049e3ac5a1" +dependencies = [ + "rustix", + "windows-sys", +] + [[package]] name = "textwrap" version = "0.11.0" @@ -1734,18 +1966,18 @@ 
checksum = "b1141d4d61095b28419e22cb0bbf02755f5e54e0526f97f1e3d1d160e60885fb" [[package]] name = "thiserror" -version = "1.0.31" +version = "1.0.32" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "bd829fe32373d27f76265620b5309d0340cb8550f523c1dda251d6298069069a" +checksum = "f5f6586b7f764adc0231f4c79be7b920e766bb2f3e51b3661cdb263828f19994" dependencies = [ "thiserror-impl", ] [[package]] name = "thiserror-impl" -version = "1.0.31" +version = "1.0.32" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "0396bc89e626244658bef819e22d0cc459e795a5ebe878e6ec336d1674a8d79a" +checksum = "12bafc5b54507e0149cdf1b145a5d80ab80a90bcd9275df43d4fff68460f6c21" dependencies = [ "proc-macro2", "quote", @@ -1818,10 +2050,11 @@ checksum = "cda74da7e1a664f795bb1f8a87ec406fb89a02522cf6e50620d016add6dbbf5c" [[package]] name = "tokio" -version = "1.19.2" +version = "1.20.1" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c51a52ed6686dd62c320f9b89299e9dfb46f730c7a48e635c19f21d116cb1439" +checksum = "7a8325f63a7d4774dd041e363b2409ed1c5cbbd0f867795e661df066b2b0a581" dependencies = [ + "autocfg", "bytes", "libc", "memchr", @@ -1855,9 +2088,9 @@ checksum = "b6bc1c9ce2b5135ac7f93c72918fc37feb872bdc6a5533a8b85eb4b86bfdae52" [[package]] name = "tracing" -version = "0.1.35" +version = "0.1.36" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a400e31aa60b9d44a52a8ee0343b5b18566b03a8321e0d321f695cf56e940160" +checksum = "2fce9567bd60a67d08a16488756721ba392f24f29006402881e43b19aac64307" dependencies = [ "cfg-if", "pin-project-lite", @@ -1866,9 +2099,9 @@ dependencies = [ [[package]] name = "tracing-core" -version = "0.1.27" +version = "0.1.29" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7709595b8878a4965ce5e87ebf880a7d39c9afc6837721b21a5a816a8117d921" +checksum = "5aeea4303076558a00714b823f9ad67d58a3bbda1df83d8827d21193156e22f7" dependencies = [ 
"once_cell", ] @@ -1896,15 +2129,15 @@ checksum = "099b7128301d285f79ddd55b9a83d5e6b9e97c92e0ea0daebee7263e932de992" [[package]] name = "unicode-ident" -version = "1.0.1" +version = "1.0.3" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5bd2fe26506023ed7b5e1e315add59d6f584c621d037f9368fea9cfb988f368c" +checksum = "c4f5b37a154999a8f3f98cc23a628d850e154479cd94decf3414696e12e31aaf" [[package]] name = "unicode-normalization" -version = "0.1.19" +version = "0.1.21" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "d54590932941a9e9266f0832deed84ebe1bf2e4c9e4a3554d393d18f5e854bf9" +checksum = "854cbdc4f7bc6ae19c820d44abdc3277ac3e1b2b93db20a636825d9322fb60e6" dependencies = [ "tinyvec", ] @@ -1966,12 +2199,6 @@ dependencies = [ "try-lock", ] -[[package]] -name = "wasi" -version = "0.9.0+wasi-snapshot-preview1" -source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "cccddf32554fecc6acb585f82a32a72e28b48f8c4c1883ddfeeeaa96f7d8e519" - [[package]] name = "wasi" version = "0.10.0+wasi-snapshot-preview1" @@ -1986,9 +2213,9 @@ checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423" [[package]] name = "wasm-bindgen" -version = "0.2.81" +version = "0.2.82" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7c53b543413a17a202f4be280a7e5c62a1c69345f5de525ee64f8cfdbc954994" +checksum = "fc7652e3f6c4706c8d9cd54832c4a4ccb9b5336e2c3bd154d5cccfbf1c1f5f7d" dependencies = [ "cfg-if", "wasm-bindgen-macro", @@ -1996,13 +2223,13 @@ dependencies = [ [[package]] name = "wasm-bindgen-backend" -version = "0.2.81" +version = "0.2.82" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5491a68ab4500fa6b4d726bd67408630c3dbe9c4fe7bda16d5c82a1fd8c7340a" +checksum = "662cd44805586bd52971b9586b1df85cdbbd9112e4ef4d8f41559c334dc6ac3f" dependencies = [ "bumpalo", - "lazy_static", "log", + "once_cell", "proc-macro2", "quote", "syn", @@ -2011,9 +2238,9 @@ 
dependencies = [ [[package]] name = "wasm-bindgen-macro" -version = "0.2.81" +version = "0.2.82" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c441e177922bc58f1e12c022624b6216378e5febc2f0533e41ba443d505b80aa" +checksum = "b260f13d3012071dfb1512849c033b1925038373aea48ced3012c09df952c602" dependencies = [ "quote", "wasm-bindgen-macro-support", @@ -2021,9 +2248,9 @@ dependencies = [ [[package]] name = "wasm-bindgen-macro-support" -version = "0.2.81" +version = "0.2.82" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "7d94ac45fcf608c1f45ef53e748d35660f168490c10b23704c7779ab8f5c3048" +checksum = "5be8e654bdd9b79216c2929ab90721aa82faf65c48cdf08bdc4e7f51357b80da" dependencies = [ "proc-macro2", "quote", @@ -2034,9 +2261,9 @@ dependencies = [ [[package]] name = "wasm-bindgen-shared" -version = "0.2.81" +version = "0.2.82" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "6a89911bd99e5f3659ec4acf9c4d93b0a90fe4a2a11f15328472058edc5261be" +checksum = "6598dd0bd3c7d51095ff6531a5b23e02acdc81804e30d8f07afb77b7215a140a" [[package]] name = "winapi" @@ -2141,6 +2368,6 @@ dependencies = [ [[package]] name = "zeroize" -version = "1.5.5" +version = "1.5.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "94693807d016b2f2d2e14420eb3bfcca689311ff775dcf113d74ea624b7cdf07" +checksum = "c394b5bd0c6f669e7275d9c20aa90ae064cb22e75a1cad54e1b34088034b149f" diff --git a/Cargo.toml b/Cargo.toml index a99e1da3..bbe08346 100644 --- a/Cargo.toml +++ b/Cargo.toml @@ -1,19 +1,19 @@ [package] name = "hayabusa" -version = "1.4.0-dev" +version = "1.6.0-dev" authors = ["Yamato Security @SecurityYamato"] edition = "2021" -# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html - [dependencies] +itertools = "*" +dashmap = "*" clap = { version = "3.*", features = ["derive", "cargo"]} -evtx = { git = 
"https://github.com/Yamato-Security/hayabusa-evtx.git" , rev = "158d496" , features = ["fast-alloc"]} +evtx = { git = "https://github.com/Yamato-Security/hayabusa-evtx.git" , features = ["fast-alloc"]} quick-xml = {version = "0.*", features = ["serialize"] } serde = { version = "1.*", features = ["derive"] } serde_json = { version = "1.0"} serde_derive = "1.*" -regex = "1.5.*" +regex = "1" csv = "1.1.*" base64 = "*" flate2 = "1.*" @@ -37,10 +37,17 @@ bytesize = "1.*" hyper = "0.14.*" lock_api = "0.4.*" crossbeam-utils = "0.8.*" +num-format = "*" +comfy-table = "6.*" + +[build-dependencies] +static_vcruntime = "2.*" + +[dev-dependencies] +rand = "0.8.*" [target.'cfg(windows)'.dependencies] is_elevated = "0.1.*" -static_vcruntime = "2.*" [target.'cfg(unix)'.dependencies] #Mac and Linux openssl = { version = "*", features = ["vendored"] } #vendored is needed to compile statically. diff --git a/README-1.5.1-Japanese.pdf b/README-1.5.1-Japanese.pdf new file mode 100644 index 00000000..76f89c18 Binary files /dev/null and b/README-1.5.1-Japanese.pdf differ diff --git a/README-1.5.1.pdf b/README-1.5.1.pdf new file mode 100644 index 00000000..16c4800a Binary files /dev/null and b/README-1.5.1.pdf differ diff --git a/README-Japanese.md b/README-Japanese.md index 33f4be56..f3f19cb7 100644 --- a/README-Japanese.md +++ b/README-Japanese.md @@ -1,16 +1,16 @@

- Hayabusa Logo + Hayabusa Logo

[English] | [日本語]
--- -[tag-1]: https://img.shields.io/github/downloads/Yamato-Security/hayabusa/total?style=plastic&label=GitHub%F0%9F%A6%85DownLoads +[tag-1]: https://img.shields.io/github/downloads/Yamato-Security/hayabusa/total?style=plastic&label=GitHub%F0%9F%A6%85Downloads [tag-2]: https://img.shields.io/github/stars/Yamato-Security/hayabusa?style=plastic&label=GitHub%F0%9F%A6%85Stars [tag-3]: https://img.shields.io/github/v/release/Yamato-Security/hayabusa?display_name=tag&label=latest-version&style=plastic -[tag-4]: https://img.shields.io/badge/Black%20Hat%20Arsenal-Asia%202022-blue +[tag-4]: https://github.com/toolswatch/badges/blob/master/arsenal/asia/2022.svg [tag-5]: https://rust-reportcard.xuri.me/badge/github.com/Yamato-Security/hayabusa [tag-6]: https://img.shields.io/badge/Maintenance%20Level-Actively%20Developed-brightgreen.svg [tag-7]: https://img.shields.io/badge/Twitter-00acee?logo=twitter&logoColor=white @@ -21,14 +21,14 @@ # Hayabusa について -Hayabusaは、日本の[Yamato Security](https://yamatosecurity.connpass.com/)グループによって作られた**Windowsイベントログのファストフォレンジックタイムライン生成**および**スレットハンティングツール**です。 Hayabusaは日本語で[「ハヤブサ」](https://en.wikipedia.org/wiki/Peregrine_falcon)を意味し、ハヤブサが世界で最も速く、狩猟(hunting)に優れ、とても訓練しやすい動物であることから選ばれました。[Rust](https://www.rust-lang.org/) で開発され、マルチスレッドに対応し、可能な限り高速に動作するよう配慮されています。[Sigma](https://github.com/SigmaHQ/Sigma)ルールをHayabusaルール形式に変換する[ツール](https://github.com/Yamato-Security/hayabusa-rules/tree/main/tools/sigmac)も提供しています。Hayabusaの検知ルールもSigmaと同様にYML形式であり、カスタマイズ性や拡張性に優れます。稼働中のシステムで実行してライブ調査することも、複数のシステムからログを収集してオフライン調査することも可能です。(※現時点では、リアルタイムアラートや定期的なスキャンには対応していません。) 出力は一つのCSVタイムラインにまとめられ、Excel、[Timeline Explorer](https://ericzimmerman.github.io/#!index.md)、[Elastic Stack](doc/ElasticStackImport/ElasticStackImport-Japanese.md)等で簡単に分析できるようになります。 +Hayabusaは、日本の[Yamato Security](https://yamatosecurity.connpass.com/)グループによって作られた**Windowsイベントログのファストフォレンジックタイムライン生成**および**スレットハンティングツール**です。 
Hayabusaは日本語で[「ハヤブサ」](https://en.wikipedia.org/wiki/Peregrine_falcon)を意味し、ハヤブサが世界で最も速く、狩猟(hunting)に優れ、とても訓練しやすい動物であることから選ばれました。[Rust](https://www.rust-lang.org/) で開発され、マルチスレッドに対応し、可能な限り高速に動作するよう配慮されています。[Sigma](https://github.com/SigmaHQ/Sigma)ルールをHayabusaルール形式に変換する[ツール](https://github.com/Yamato-Security/hayabusa-rules/tree/main/tools/sigmac)も提供しています。Hayabusaの検知ルールもSigmaと同様にYML形式であり、カスタマイズ性や拡張性に優れます。稼働中のシステムで実行してライブ調査することも、複数のシステムからログを収集してオフライン調査することも可能です。また、 [Velociraptor](https://docs.velociraptor.app/)と[Hayabusa artifact](https://docs.velociraptor.app/exchange/artifacts/pages/windows.eventlogs.hayabusa/)を用いることで企業向けの広範囲なスレットハンティングとインシデントレスポンスにも活用できます。出力は一つのCSVタイムラインにまとめられ、Excel、[Timeline Explorer](https://ericzimmerman.github.io/#!index.md)、[Elastic Stack](doc/ElasticStackImport/ElasticStackImport-Japanese.md)、[Timesketch](https://timesketch.org/)等で簡単に分析できるようになります。 ## 目次 - [Hayabusa について](#hayabusa-について) - [目次](#目次) - [主な目的](#主な目的) - - [スレット(脅威)ハンティング](#スレット脅威ハンティング) + - [スレット(脅威)ハンティングと企業向けの広範囲なDFIR](#スレット脅威ハンティングと企業向けの広範囲なdfir) - [フォレンジックタイムラインの高速生成](#フォレンジックタイムラインの高速生成) - [スクリーンショット](#スクリーンショット) - [起動画面](#起動画面) @@ -39,9 +39,9 @@ Hayabusaは、日本の[Yamato Security](https://yamatosecurity.connpass.com/) - [Timeline Explorerでの解析](#timeline-explorerでの解析) - [Criticalアラートのフィルタリングとコンピュータごとのグルーピング](#criticalアラートのフィルタリングとコンピュータごとのグルーピング) - [Elastic Stackダッシュボードでの解析](#elastic-stackダッシュボードでの解析) + - [Timesketchでの解析](#timesketchでの解析) - [タイムラインのサンプル結果](#タイムラインのサンプル結果) - [特徴&機能](#特徴機能) -- [予定されている機能](#予定されている機能) - [ダウンロード](#ダウンロード) - [Gitクローン](#gitクローン) - [アドバンス: ソースコードからのコンパイル(任意)](#アドバンス-ソースコードからのコンパイル任意) @@ -49,26 +49,39 @@ Hayabusaは、日本の[Yamato Security](https://yamatosecurity.connpass.com/) - [32ビットWindowsバイナリのクロスコンパイル](#32ビットwindowsバイナリのクロスコンパイル) - [macOSでのコンパイルの注意点](#macosでのコンパイルの注意点) - [Linuxでのコンパイルの注意点](#linuxでのコンパイルの注意点) + - [LinuxのMUSLバイナリのクロスコンパイル](#linuxのmuslバイナリのクロスコンパイル) + - [Linuxでのコンパイルの注意点](#linuxでのコンパイルの注意点-1) - [Hayabusaの実行](#hayabusaの実行) - - [注意: 
アンチウィルス/EDRの誤検知](#注意-アンチウィルスedrの誤検知) + - [注意: アンチウィルス/EDRの誤検知と遅い初回実行](#注意-アンチウィルスedrの誤検知と遅い初回実行) - [Windows](#windows) - [Linux](#linux) - [macOS](#macos) - [使用方法](#使用方法) + - [主なコマンド](#主なコマンド) - [コマンドラインオプション](#コマンドラインオプション) - [使用例](#使用例) - [ピボットキーワードの作成](#ピボットキーワードの作成) - [ログオン情報の要約](#ログオン情報の要約) - [サンプルevtxファイルでHayabusaをテストする](#サンプルevtxファイルでhayabusaをテストする) - [Hayabusaの出力](#hayabusaの出力) + - [プロファイル](#プロファイル) + - [1. `minimal`プロファイルの出力](#1-minimalプロファイルの出力) + - [2. `standard`プロファイルの出力](#2-standardプロファイルの出力) + - [3. `verbose`プロファイルの出力](#3-verboseプロファイルの出力) + - [4. `verbose-all-field-info`プロファイルの出力](#4-verbose-all-field-infoプロファイルの出力) + - [5. `verbose-details-and-all-field-info`プロファイルの出力](#5-verbose-details-and-all-field-infoプロファイルの出力) + - [6. `timesketch`プロファイルの出力](#6-timesketchプロファイルの出力) + - [プロファイルの比較](#プロファイルの比較) + - [Profile Field Aliases](#profile-field-aliases) - [Levelの省略](#levelの省略) - [MITRE ATT&CK戦術の省略](#mitre-attck戦術の省略) - [Channel情報の省略](#channel情報の省略) - [プログレスバー](#プログレスバー) - [標準出力へのカラー設定](#標準出力へのカラー設定) - - [イベント頻度タイムライン](#イベント頻度タイムライン) - - [最多検知日の出力](#最多検知日の出力) - - [最多検知端末名の出力](#最多検知端末名の出力) + - [結果のサマリ](#結果のサマリ) + - [イベント頻度タイムライン](#イベント頻度タイムライン) + - [最多検知日の出力](#最多検知日の出力) + - [最多検知端末名の出力](#最多検知端末名の出力) - [Hayabusaルール](#hayabusaルール) - [Hayabusa v.s. 
変換されたSigmaルール](#hayabusa-vs-変換されたsigmaルール) - [検知ルールのチューニング](#検知ルールのチューニング) @@ -87,9 +100,11 @@ Hayabusaは、日本の[Yamato Security](https://yamatosecurity.connpass.com/) ## 主な目的 -### スレット(脅威)ハンティング +### スレット(脅威)ハンティングと企業向けの広範囲なDFIR -Hayabusaには現在、2300以上のSigmaルールと130以上のHayabusa検知ルールがあり、定期的にルールが追加されています。 最終的な目標はインシデントレスポンスや定期的なスレットハンティングのために、HayabusaエージェントをすべてのWindows端末にインストールして、中央サーバーにアラートを返す仕組みを作ることです。 +Hayabusaには現在、2600以上のSigmaルールと130以上のHayabusa検知ルールがあり、定期的にルールが追加されています。 +[Velociraptor](https://docs.velociraptor.app/)の[Hayabusa artifact](https://docs.velociraptor.app/exchange/artifacts/pages/windows.eventlogs.hayabusa/)を用いることで企業向けの広範囲なスレットハンティングだけでなくDFIR(デジタルフォレンジックとインシデントレスポンス)にも無料で利用することが可能です。この2つのオープンソースを組み合わせることで、SIEMが設定されていない環境でも実質的に遡及してSIEMを再現することができます。具体的な方法は[Eric Capuano](https://twitter.com/eric_capuano)の[こちら](https://www.youtube.com/watch?v=Q1IoGX--814)の動画で学ぶことができます。 + 最終的な目標はインシデントレスポンスや定期的なスレットハンティングのために、HayabusaエージェントをすべてのWindows端末にインストールして、中央サーバーにアラートを返す仕組みを作ることです。 ### フォレンジックタイムラインの高速生成 @@ -97,29 +112,29 @@ Windowsのイベントログは、 1)解析が困難なデータ形式であること 2)データの大半がノイズであり調査に有用でないこと から、従来は非常に長い時間と手間がかかる解析作業となっていました。 Hayabusa は、有用なデータのみを抽出し、専門的なトレーニングを受けた分析者だけでなく、Windowsのシステム管理者であれば誰でも利用できる読みやすい形式で提示することを主な目的としています。 -[Evtx Explorer](https://ericzimmerman.github.io/#!index.md)や[Event Log Explorer](https://eventlogxp.com/)のような深掘り分析を行うツールの代替ではなく、分析者が20%の時間で80%の作業を行えるようにすることを目的としています。 +Hayabusaは従来のWindowsイベントログ分析と比較して、分析者が20%の時間で80%の作業を行えるようにすることを目指しています。 # スクリーンショット ## 起動画面 -![Hayabusa 起動画面](/screenshots/Hayabusa-Startup.png) +![Hayabusa 起動画面](screenshots/Hayabusa-Startup.png) ## ターミナル出力画面 -![Hayabusa ターミナル出力画面](/screenshots/Hayabusa-Results.png) +![Hayabusa ターミナル出力画面](screenshots/Hayabusa-Results.png) ## イベント頻度タイムライン出力画面 (`-V`オプション) -![Hayabusa イベント頻度タイムライン出力画面](/screenshots/HayabusaEventFrequencyTimeline.png) +![Hayabusa イベント頻度タイムライン出力画面](screenshots/HayabusaEventFrequencyTimeline.png) ## 結果サマリ画面 -![Hayabusa 結果サマリ画面](/screenshots/HayabusaResultsSummary.png) +![Hayabusa
結果サマリ画面](screenshots/HayabusaResultsSummary.png) ## Excelでの解析 -![Hayabusa Excelでの解析](/screenshots/ExcelScreenshot.png) +![Hayabusa Excelでの解析](screenshots/ExcelScreenshot.png) ## Timeline Explorerでの解析 @@ -136,6 +151,10 @@ Windowsのイベントログは、 ![Elastic Stack Dashboard 2](doc/ElasticStackImport/18-HayabusaDashboard-2.png) +## Timesketchでの解析 + +![Timesketch](screenshots/TimesketchAnalysis.png) + # タイムラインのサンプル結果 CSVのタイムライン結果のサンプルは[こちら](https://github.com/Yamato-Security/hayabusa/tree/main/sample-results)で確認できます。 @@ -144,6 +163,8 @@ CSVのタイムラインをExcelやTimeline Explorerで分析する方法は[こ CSVのタイムラインをElastic Stackにインポートする方法は[こちら](doc/ElasticStackImport/ElasticStackImport-Japanese.md)で紹介しています。 +CSVのタイムラインをTimesketchにインポートする方法は[こちら](doc/TimesketchImport/TimesketchImport-Japanese.md)で紹介しています。 + # 特徴&機能 * クロスプラットフォーム対応: Windows, Linux, macOS。 @@ -160,11 +181,7 @@ CSVのタイムラインをElastic Stackにインポートする方法は[こち * イベントログから不審なユーザやファイルを素早く特定するためのピボットキーワードの一覧作成。 * 詳細な調査のために全フィールド情報の出力。 * 成功と失敗したユーザログオンの要約。 - -# 予定されている機能 - -* すべてのエンドポイントでの企業全体のスレットハンティング。 -* MITRE ATT&CKのヒートマップ生成機能。 +* [Velociraptor](https://docs.velociraptor.app/)と組み合わせた企業向けの広範囲なすべてのエンドポイントに対するスレットハンティングとDFIR。 # ダウンロード @@ -185,7 +202,7 @@ git clone https://github.com/Yamato-Security/hayabusa.git --recursive `git pull --recurse-submodules`コマンド、もしくは以下のコマンドで`rules`フォルダを同期し、Hayabusaの最新のルールを更新することができます: ```bash -hayabusa-1.3.2-win-x64.exe -u +hayabusa-1.5.1-win-x64.exe -u ``` アップデートが失敗した場合は、`rules`フォルダの名前を変更してから、もう一回アップデートしてみて下さい。 @@ -200,7 +217,6 @@ hayabusa-1.3.2-win-x64.exe -u Rustがインストールされている場合、以下のコマンドでソースコードからコンパイルすることができます: ```bash -cargo clean cargo build --release ``` @@ -256,31 +272,55 @@ Fedora系のディストロ: sudo yum install openssl-devel ``` +## LinuxのMUSLバイナリのクロスコンパイル + +まず、Linux OSでターゲットをインストールします。 + +```bash +rustup install stable-x86_64-unknown-linux-musl +rustup target add x86_64-unknown-linux-musl +``` + +以下のようにコンパイルします: + +``` +cargo build --release --target=x86_64-unknown-linux-musl +``` + 
+MUSLバイナリは`./target/x86_64-unknown-linux-musl/release/`ディレクトリ配下に作成されます。 +MUSLバイナリはGNUバイナリより約15%遅いです。 + +## Linuxでのコンパイルの注意点 + + # Hayabusaの実行 -## 注意: アンチウィルス/EDRの誤検知 +## 注意: アンチウィルス/EDRの誤検知と遅い初回実行 Hayabusa実行する際や、`.yml`ルールのダウンロードや実行時にルール内でdetectionに不審なPowerShellコマンドや`mimikatz`のようなキーワードが書かれている際に、アンチウィルスやEDRにブロックされる可能性があります。 誤検知のため、セキュリティ対策の製品がHayabusaを許可するように設定する必要があります。 マルウェア感染が心配であれば、ソースコードを確認した上で、自分でバイナリをコンパイルして下さい。 +Windows PC起動後の初回実行時に時間がかかる場合があります。これはWindows Defenderのリアルタイムスキャンが行われていることが原因です。リアルタイムスキャンを無効にするかHayabusaのディレクトリをアンチウィルススキャンから除外することでこの現象は解消しますが、設定を変える前にセキュリティリスクを十分ご考慮ください。 + ## Windows コマンドプロンプトやWindows Terminalから32ビットもしくは64ビットのWindowsバイナリをHayabusaのルートディレクトリから実行します。 -例: `hayabusa-1.3.2-windows-x64.exe` + +例: `hayabusa-1.5.1-windows-x64.exe` ## Linux まず、バイナリに実行権限を与える必要があります。 ```bash -chmod +x ./hayabusa-1.3.2-linux-x64-gnu +chmod +x ./hayabusa-1.5.1-linux-x64-gnu ``` 次に、Hayabusaのルートディレクトリから実行します: ```bash -./hayabusa-1.3.2-linux-x64-gnu +./hayabusa-1.5.1-linux-x64-gnu ``` ## macOS @@ -288,159 +328,185 @@ chmod +x ./hayabusa-1.3.2-linux-x64-gnu まず、ターミナルやiTerm2からバイナリに実行権限を与える必要があります。 ```bash -chmod +x ./hayabusa-1.3.2-mac-intel +chmod +x ./hayabusa-1.5.1-mac-intel ``` 次に、Hayabusaのルートディレクトリから実行してみてください: ```bash -./hayabusa-1.3.2-mac-intel +./hayabusa-1.5.1-mac-intel ``` macOSの最新版では、以下のセキュリティ警告が出る可能性があります: -![Mac Error 1 JP](/screenshots/MacOS-RunError-1-JP.png) +![Mac Error 1 JP](screenshots/MacOS-RunError-1-JP.png) macOSの環境設定から「セキュリティとプライバシー」を開き、「一般」タブから「このまま許可」ボタンをクリックしてください。 -![Mac Error 2 JP](/screenshots/MacOS-RunError-2-JP.png) +![Mac Error 2 JP](screenshots/MacOS-RunError-2-JP.png) その後、ターミナルからもう一回実行してみてください: ```bash -./hayabusa-1.3.2-mac-intel +./hayabusa-1.5.1-mac-intel ``` 以下の警告が出るので、「開く」をクリックしてください。 -![Mac Error 3 JP](/screenshots/MacOS-RunError-3-JP.png) +![Mac Error 3 JP](screenshots/MacOS-RunError-3-JP.png) これで実行できるようになります。 # 使用方法 +## 主なコマンド + +* デフォルト: ファストフォレンジックタイムラインの作成。 +* `--level-tuning`: アラート`level`のカスタムチューニング +* `-L, 
--logon-summary`: ログオンイベントのサマリを出力する。 +* `-P, --pivot-keywords-list`: ピボットする不審なキーワードのリスト作成。 +* `-s, --statistics`: イベントIDに基づくイベントの合計と割合の集計を出力する。 +* `--set-default-profile`: デフォルトプロファイルを変更する。 +* `-u, --update`: GitHubの[hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules)リポジトリにある最新のルールに同期させる。 ## コマンドラインオプション ``` USAGE: - hayabusa.exe -f file.evtx [OPTIONS] / hayabusa.exe -d evtx-directory [OPTIONS] + hayabusa.exe [OTHER-ACTIONS] [OPTIONS] -OPTIONS: - --European-time ヨーロッパ形式で日付と時刻を出力する (例: 22-02-2022 22:00:00.123 +02:00) - --RFC-2822 RFC 2822形式で日付と時刻を出力する (例: Fri, 22 Feb 2022 22:00:00 -0600) - --RFC-3339 RFC 3339形式で日付と時刻を出力する (例: 2022-02-22 22:00:00.123456-06:00) - --US-military-time 24時間制(ミリタリータイム)のアメリカ形式で日付と時刻を出力する (例: 02-22-2022 22:00:00.123 -06:00) - --US-time アメリカ形式で日付と時刻を出力する (例: 02-22-2022 10:00:00.123 PM -06:00) - --target-file-ext ... evtx以外の拡張子を解析対象に追加する。 (例1: evtx_data 例2:evtx1 evtx2) - --all-tags 出力したCSVファイルにルール内のタグ情報を全て出力する - -c, --config ルールフォルダのコンフィグディレクトリ (デフォルト: ./rules/config) - --contributors コントリビュータの一覧表示 - -d, --directory .evtxファイルを持つディレクトリのパス - -D, --enable-deprecated-rules Deprecatedルールを有効にする - --end-timeline 解析対象とするイベントログの終了時刻 (例: "2022-02-22 23:59:59 +09:00") - -f, --filepath 1つの.evtxファイルに対して解析を行う - -F, --full-data 全てのフィールド情報を出力する - -h, --help ヘルプ情報を表示する - -l, --live-analysis ローカル端末のC:\Windows\System32\winevt\Logsフォルダを解析する - -L, --logon-summary 成功と失敗したログオン情報の要約を出力する - --level-tuning ルールlevelのチューニング (デフォルト: ./rules/config/level_tuning.txt) - -m, --min-level 結果出力をするルールの最低レベル (デフォルト: informational) - -n, --enable-noisy-rules Noisyルールを有効にする - --no_color カラー出力を無効にする - -o, --output タイムラインをCSV形式で保存する (例: results.csv) - -p, --pivot-keywords-list ピボットキーワードの一覧作成 - -q, --quiet Quietモード: 起動バナーを表示しない - -Q, --quiet-errors Quiet errorsモード: エラーログを保存しない - -r, --rules ルールファイルまたはルールファイルを持つディレクトリ (デフォルト: ./rules) - -R, --hide-record-id イベントレコードIDを表示しない - -s, --statistics イベントIDの統計情報を表示する - --start-timeline 解析対象とするイベントログの開始時刻 (例: "2020-02-22 00:00:00 
+09:00") - -t, --thread-number スレッド数 (デフォルト: パフォーマンスに最適な数値) - -u, --update-rules rulesフォルダをhayabusa-rulesのgithubリポジトリの最新版に更新する - -U, --UTC UTC形式で日付と時刻を出力する (デフォルト: 現地時間) - -v, --verbose 詳細な情報を出力する - -V, --visualize-timeline イベント頻度タイムラインを出力する - --version バージョン情報を表示する +INPUT: + -d, --directory .evtxファイルを持つディレクトリのパス + -f, --file 1つの.evtxファイルに対して解析を行う + -l, --live-analysis ローカル端末のC:\Windows\System32\winevt\Logsフォルダを解析する + +ADVANCED: + -c, --rules-config ルールフォルダのコンフィグディレクトリ (デフォルト: ./rules/config) + -Q, --quiet-errors Quiet errorsモード: エラーログを保存しない + -r, --rules ルールファイルまたはルールファイルを持つディレクトリ (デフォルト: ./rules) + -t, --thread-number スレッド数 (デフォルト: パフォーマンスに最適な数値) + --target-file-ext ... evtx以外の拡張子を解析対象に追加する。 (例1: evtx_data 例2:evtx1 evtx2) + +OUTPUT: + -o, --output タイムラインをCSV形式で保存する (例: results.csv) + -P, --profile 利用する出力プロファイル名を指定する (minimal, standard, verbose, verbose-all-field-info, verbose-details-and-all-field-info) + +DISPLAY-SETTINGS: + --no-color カラー出力を無効にする + --no-summary 結果概要を出力しない + -q, --quiet Quietモード: 起動バナーを表示しない + -v, --verbose 詳細な情報を出力する + -V, --visualize-timeline イベント頻度タイムラインを出力する + +FILTERING: + -D, --deep-scan すべてのイベントIDを対象にしたスキャンを行う(遅くなる) + --enable-deprecated-rules Deprecatedルールを有効にする + --exclude-status ... 
読み込み対象外とするルール内でのステータス (ex: experimental) (ex: stable test)
+ -m, --min-level 結果出力をするルールの最低レベル (デフォルト: informational)
+ -n, --enable-noisy-rules Noisyルールを有効にする
+ --timeline-end 解析対象とするイベントログの終了時刻 (例: "2022-02-22 23:59:59 +09:00")
+ --timeline-start 解析対象とするイベントログの開始時刻 (例: "2020-02-22 00:00:00 +09:00")
+
+OTHER-ACTIONS:
+ --contributors コントリビュータの一覧表示
+ -L, --logon-summary 成功と失敗したログオン情報の要約を出力する
+ --level-tuning [] ルールlevelのチューニング (デフォルト: ./rules/config/level_tuning.txt)
+ -p, --pivot-keywords-list ピボットキーワードの一覧作成
+ -s, --statistics イベントIDの統計情報を表示する
+ --set-default-profile デフォルトの出力コンフィグを設定する
+ -u, --update-rules rulesフォルダをhayabusa-rulesのgithubリポジトリの最新版に更新する
+
+TIME-FORMAT:
+ --European-time ヨーロッパ形式で日付と時刻を出力する (例: 22-02-2022 22:00:00.123 +02:00)
+ --RFC-2822 RFC 2822形式で日付と時刻を出力する (例: Fri, 22 Feb 2022 22:00:00 -0600)
+ --RFC-3339 RFC 3339形式で日付と時刻を出力する (例: 2022-02-22 22:00:00.123456-06:00)
+ --US-military-time 24時間制(ミリタリータイム)のアメリカ形式で日付と時刻を出力する (例: 02-22-2022 22:00:00.123 -06:00)
+ --US-time アメリカ形式で日付と時刻を出力する (例: 02-22-2022 10:00:00.123 PM -06:00)
+ -U, --UTC UTC形式で日付と時刻を出力する (デフォルト: 現地時間)
```

## 使用例

-* 1つのWindowsイベントログファイルに対してHayabusaを実行します:
+* 1つのWindowsイベントログファイルに対してHayabusaを実行する:

```bash
-hayabusa-1.3.2-win-x64.exe -f eventlog.evtx
+hayabusa-1.5.1-win-x64.exe -f eventlog.evtx
```

-* 複数のWindowsイベントログファイルのあるsample-evtxディレクトリに対して、Hayabusaを実行します:
+* `verbose`プロファイルで複数のWindowsイベントログファイルのあるsample-evtxディレクトリに対して、Hayabusaを実行する:

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -P verbose
```

-* 全てのフィールド情報も含めて1つのCSVファイルにエクスポートして、Excel、Timeline Explorer、Elastic Stack等でさらに分析することができます:
+* 全てのフィールド情報も含めて1つのCSVファイルにエクスポートして、Excel、Timeline Explorer、Elastic Stack等でさらに分析することができる(注意: `verbose-details-and-all-field-info`プロファイルを使用すると、出力するファイルのサイズがとても大きくなる!):

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -o results.csv -F
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -o results.csv -P 
verbose-details-and-all-field-info
```

-* Hayabusaルールのみを実行します(デフォルトでは `-r .\rules` にあるすべてのルールが利用されます):
+* Hayabusaルールのみを実行する(デフォルトでは`-r .\rules`にあるすべてのルールが利用される):

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa -o results.csv
```

-* Windowsでデフォルトで有効になっているログに対してのみ、Hayabusaルールを実行します:
+* Windowsでデフォルトで有効になっているログに対してのみ、Hayabusaルールを実行する:

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default -o results.csv
```

-* Sysmonログに対してのみHayabusaルールを実行します:
+* Sysmonログに対してのみHayabusaルールを実行する:

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\sysmon -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\sysmon -o results.csv
```

-* Sigmaルールのみを実行します:
+* Sigmaルールのみを実行する:

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\sigma -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\sigma -o results.csv
```

-* 廃棄(deprecated)されたルール(`status`が`deprecated`になっているルール)とノイジールール(`.\rules\config\noisy_rules.txt`にルールIDが書かれているルール)を有効にします:
+* 廃棄(deprecated)されたルール(`status`が`deprecated`になっているルール)とノイジールール(`.\rules\config\noisy_rules.txt`にルールIDが書かれているルール)を有効にする:

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx --enable-deprecated-rules --enable-noisy-rules -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx --enable-deprecated-rules --enable-noisy-rules -o results.csv
```

-* ログオン情報を分析するルールのみを実行し、UTCタイムゾーンで出力します:
+* ログオン情報を分析するルールのみを実行し、UTCタイムゾーンで出力する:

```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default\events\Security\Logons -U -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default\events\Security\Logons -U -o results.csv
```

-* 
起動中のWindows端末上で実行し(Administrator権限が必要)、アラート(悪意のある可能性のある動作)のみを検知します: +* 起動中のWindows端末上で実行し(Administrator権限が必要)、アラート(悪意のある可能性のある動作)のみを検知する: ```bash -hayabusa-1.3.2-win-x64.exe -l -m low +hayabusa-1.5.1-win-x64.exe -l -m low ``` -* criticalレベルのアラートからピボットキーワードの一覧を作成します(結果は結果毎に`keywords-Ip Address.txt`や`keyworss-Users.txt`等に出力されます): +* criticalレベルのアラートからピボットキーワードの一覧を作成する(結果は結果毎に`keywords-Ip Address.txt`や`keywords-Users.txt`等に出力される): ```bash -hayabusa-1.3.2-win-x64.exe -l -m critical -p -o keywords +hayabusa-1.5.1-win-x64.exe -l -m critical -p -o keywords ``` -* イベントIDの統計情報を取得します: +* イベントIDの統計情報を出力する: ```bash -hayabusa-1.3.2-win-x64.exe -f Security.evtx -s +hayabusa-1.5.1-win-x64.exe -f Security.evtx -s +``` +* ログオンサマリを出力する: + +```bash +hayabusa-1.5.1-win-x64.exe -L -f Security.evtx -s ``` -* 詳細なメッセージを出力します(処理に時間がかかるファイル、パースエラー等を特定するのに便利): +* 詳細なメッセージを出力する(処理に時間がかかるファイル、パースエラー等を特定するのに便利): ```bash -hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -v +hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -v ``` * Verbose出力の例: @@ -458,6 +524,12 @@ Checking target evtx FilePath: "./hayabusa-sample-evtx/YamatoSecurity/T1218.004_ 5 / 509 [=>------------------------------------------------------------------------------------------------------------------------------------------] 0.98 % 1s ``` +* 結果を[Timesketch](https://timesketch.org/)にインポートできるCSV形式に保存する: + +```bash +hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U +``` + * エラーログの出力をさせないようにする: デフォルトでは、Hayabusaはエラーメッセージをエラーログに保存します。 エラーメッセージを保存したくない場合は、`-Q`を追加してください。 @@ -465,7 +537,7 @@ Checking target evtx FilePath: "./hayabusa-sample-evtx/YamatoSecurity/T1218.004_ ## ピボットキーワードの作成 `-p`もしくは`--pivot-keywords-list`オプションを使うことで不審なユーザやホスト名、プロセスなどを一覧で出力することができ、イベントログから素早く特定することができます。 -ピボットキーワードのカスタマイズは`config/pivot_keywords.txt`を変更することで行うことができます。以下はデフォルトの設定になります: +ピボットキーワードのカスタマイズは`./config/pivot_keywords.txt`を変更することで行うことができます。以下はデフォルトの設定になります: ``` Users.SubjectUserName @@ 
-494,29 +566,86 @@ Hayabusaをテストしたり、新しいルールを作成したりするため
git clone https://github.com/Yamato-Security/hayabusa-sample-evtx.git
```

-> ※ 以下の例でHayabusaを試したい方は、上記コマンドをhayabusaのルートフォルダから実行してください。
-
# Hayabusaの出力
+## プロファイル

-Hayabusaの結果を標準出力に表示しているとき(デフォルト)は、以下の情報を表示します:
+Hayabusaの`config/profiles.yaml`設定ファイルでは、6つのプロファイルが定義されています:

-* `Timestamp`: デフォルトでは`YYYY-MM-DD HH:mm:ss.sss +hh:mm`形式になっています。イベントログの``フィールドから来ています。デフォルトのタイムゾーンはローカルのタイムゾーンになりますが、`--utc` オプションで UTC に変更することができます。
-* `Computer`: イベントログの``フィールドから来ています。
-* `Channel`: ログ名です。イベントログの``フィールドから来ています。
-* `Event ID`: イベントログの``フィールドから来ています。
-* `Level`: YML検知ルールの`level`フィールドから来ています。(例:`informational`, `low`, `medium`, `high`, `critical`) デフォルトでは、すべてのレベルのアラートとイベントが出力されますが、`-m`オプションで最低のレベルを指定することができます。例えば`-m high`オプションを付けると、`high`と`critical`アラートしか出力されません。
-* `Title`: YML検知ルールの`title`フィールドから来ています。
-* `RecordID`: イベントレコードIDです。``フィールドから来ています。`-R`もしくは`--hide-record-id`オプションを付けると表示されません。
-* `Details`: YML検知ルールの`details`フィールドから来ていますが、このフィールドはHayabusaルールにしかありません。このフィールドはアラートとイベントに関する追加情報を提供し、ログのフィールドから有用なデータを抽出することができます。イベントキーのマッピングが間違っている場合、もしくはフィールドが存在しない場合で抽出ができなかった箇所は`n/a` (not available)と記載されます。YML検知ルールに`details`フィールドが存在しない時のdetailsのメッセージを`./rules/config/default_details.txt`で設定できます。`default_details.txt`では`Provider Name`、`EventID`、`details`の組み合わせで設定することができます。
+1. `minimal`
+2. `standard` (デフォルト)
+3. `verbose`
+4. `verbose-all-field-info`
+5. `verbose-details-and-all-field-info`
+6. `timesketch`

-CSVファイルとして保存する場合、以下の列が追加されます:
+このファイルを編集することで、簡単に独自のプロファイルをカスタマイズしたり、追加したりすることができます。
+`--set-default-profile `オプションでデフォルトのプロファイルを変更することもできます。

-* `MitreAttack`: MITRE ATT&CKの戦術。
-* `Rule Path`: アラートまたはイベントを生成した検知ルールへのパス。
-* `File Path`: アラートまたはイベントを起こしたevtxファイルへのパス。
+### 1. `minimal`プロファイルの出力

-`-F`もしくは`--full-data`オプションを指定した場合、全てのフィールド情報が`RecordInformation`カラムに出力されます。
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%RuleTitle%`, `%Details%`
+### 2. 
`standard`プロファイルの出力
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%RecordID%`, `%RuleTitle%`, `%Details%`
+
+### 3. `verbose`プロファイルの出力
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`
+
+### 4. `verbose-all-field-info`プロファイルの出力
+
+最小限の`details`情報を出力する代わりに、イベントにあるすべての`EventData`フィールド情報が出力されます。
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%AllFieldInfo%`, `%RuleFile%`, `%EvtxFile%`
+
+### 5. `verbose-details-and-all-field-info`プロファイルの出力
+
+`verbose`プロファイルで出力される情報とイベントにあるすべての`EventData`フィールド情報が出力されます。
+(注意: 出力ファイルサイズは2倍になります!)
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`, `%AllFieldInfo%`
+
+### 6. 
`timesketch`プロファイルの出力
+
+[Timesketch](https://timesketch.org/)にインポートできる`verbose`プロファイル。
+
+`%Timestamp%`, `hayabusa`, `%RuleTitle%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`
+
+### プロファイルの比較
+
+以下のベンチマークは、2018年製のマックブックプロ上で7.5GBのEVTXデータに対して実施されました。
+
+| プロファイル | 処理時間 | 結果のファイルサイズ |
+| :---: | :---: | :---: |
+| minimal | 16分18秒 | 690 MB |
+| standard | 16分23秒 | 710 MB |
+| verbose | 17分 | 990 MB |
+| timesketch | 17分 | 1015 MB |
+| verbose-all-field-info | 16分50秒 | 1.6 GB |
+| verbose-details-and-all-field-info | 17分12秒 | 2.1 GB |
+
+### Profile Field Aliases
+
+| エイリアス名 | Hayabusaの出力情報 |
+| :--- | :--- |
+|%Timestamp% | デフォルトでは`YYYY-MM-DD HH:mm:ss.sss +hh:mm`形式になっている。イベントログの``フィールドから来ている。デフォルトのタイムゾーンはローカルのタイムゾーンになるが、`--UTC`オプションでUTCに変更することができる。 |
+|%Computer% | イベントログの``フィールド。 |
+|%Channel% | ログ名。イベントログの``フィールド。 |
+|%EventID% | イベントログの``フィールド。 |
+|%Level% | YML検知ルールの`level`フィールド。(例:`informational`、`low`、`medium`、`high`、`critical`) |
+|%MitreTactics% | MITRE ATT&CKの[戦術](https://attack.mitre.org/tactics/enterprise/) (例: Initial Access、Lateral Movement等々) |
+|%MitreTags% | MITRE ATT&CKの戦術以外の情報。attack.g(グループ)、attack.t(技術)、attack.s(ソフトウェア)の情報を出力する。 |
+|%OtherTags% | YML検知ルールの`tags`フィールドから`MitreTactics`、`MitreTags`以外のキーワードを出力する。|
+|%RecordID% | ``フィールドのイベントレコードID。 |
+|%RuleTitle% | YML検知ルールの`title`フィールド。 |
+|%Details% | YML検知ルールの`details`フィールドから来ているが、このフィールドはHayabusaルールにしかない。このフィールドはアラートとイベントに関する追加情報を提供し、ログのフィールドから有用なデータを抽出することができる。イベントキーのマッピングが間違っている場合、もしくはフィールドが存在しない場合で抽出ができなかった箇所は`n/a` (not available)と記載される。YML検知ルールに`details`フィールドが存在しない時のdetailsのメッセージを`./rules/config/default_details.txt`で設定できる。`default_details.txt`では`Provider Name`、`EventID`、`details`の組み合わせで設定することができる。`default_details.txt`やYML検知ルールに対応するルールが記載されていない場合はすべてのフィールド情報を出力する。 |
+|%AllFieldInfo% | すべてのフィールド情報。 |
+|%RuleFile% | アラートまたはイベントを生成した検知ルールのファイル名。 |
+|%EvtxFile% | 
アラートまたはイベントを起こしたevtxファイルへのパス。 |
+
+これらのエイリアスは、出力プロファイルで使用することができます。また、他の[イベントキーエイリアス](https://github.com/Yamato-Security/hayabusa-rules/blob/main/README-Japanese.md#%E3%82%A4%E3%83%99%E3%83%B3%E3%83%88%E3%82%AD%E3%83%BC%E3%82%A8%E3%82%A4%E3%83%AA%E3%82%A2%E3%82%B9)を定義し、他のフィールドを出力することもできます。
+
## Levelの省略

簡潔に出力するためにLevelを以下のように省略し出力しています。

@@ -530,7 +658,7 @@ CSVファイルとして保存する場合、以下の列が追加されます:
## MITRE ATT&CK戦術の省略

簡潔に出力するためにMITRE ATT&CKの戦術を以下のように省略しています。
-`config/output_tag.txt`の設定ファイルで自由に編集できます。
+`./config/output_tag.txt`の設定ファイルで自由に編集できます。
検知したデータの戦術を全て出力したい場合は、`--all-tags`オプションをつけてください。

* `Recon` : Reconnaissance (偵察)
@@ -551,7 +679,7 @@ CSVファイルとして保存する場合、以下の列が追加されます:
## Channel情報の省略

簡潔に出力するためにChannelの表示を以下のように省略しています。
-`config/channel_abbreviations.txt`の設定ファイルで自由に編集できます。
+`./rules/config/channel_abbreviations.txt`の設定ファイルで自由に編集できます。

* `App` : `Application`
* `AppLocker` : `Microsoft-Windows-AppLocker/*`
@@ -594,16 +722,18 @@ Hayabusaの結果は`level`毎に文字色が変わります。
形式は`level名,(6桁のRGBのカラーhex)`です。
カラー出力をしないようにしたい場合は`--no-color`オプションをご利用ください。

-## イベント頻度タイムライン
+## 結果のサマリ
+
+### イベント頻度タイムライン

`-V`または`--visualize-timeline`オプションを使うことで、検知したイベントの数が5以上の時、頻度のタイムライン(スパークライン)を画面に出力します。
マーカーの数は最大10個です。デフォルトのCommand PromptとPowerShell Promptでは文字化けがでるので、Windows TerminalやiTerm2等のターミナルをご利用ください。

-## 最多検知日の出力
+### 最多検知日の出力

各レベルで最も検知された日付を画面に出力します。

-## 最多検知端末名の出力
+### 最多検知端末名の出力

各レベルで多く検知されたユニークなイベントが多い端末名上位5つを画面に出力します。

@@ -654,14 +784,14 @@ Hayabusaルールは、Windowsのイベントログ解析専用に設計され

ファイアウォールやIDSと同様に、シグネチャベースのツールは、環境に合わせて調整が必要になるため、特定のルールを永続的または一時的に除外する必要がある場合があります。

-ルールID(例: `4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6`) を `rules/config/exclude_rules.txt`に追加すると、不要なルールや利用できないルールを無視することができます。
+ルールID(例: `4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6`) を `./rules/config/exclude_rules.txt`に追加すると、不要なルールや利用できないルールを無視することができます。

-ルールIDを `rules/config/noisy_rules.txt`に追加して、デフォルトでルールを無視することもできますが、`-n`または `--enable-noisy-rules`オプションを指定してルールを使用することもできます。
+ルールIDを `./rules/config/noisy_rules.txt`に追加して、デフォルトでルールを無視することもできますが、`-n`または 
`--enable-noisy-rules`オプションを指定してルールを使用することもできます。 ## 検知レベルのlevelチューニング Hayabusaルール、Sigmaルールはそれぞれの作者が検知した際のリスクレベルを決めています。 -ユーザが独自のリスクレベルに設定するには`./rules/config/level_tuning.txt`に変換情報を書き、`hayabusa-1.3.2-win-x64.exe --level-tuning`を実行することでルールファイルが書き換えられます。 +ユーザが独自のリスクレベルに設定するには`./rules/config/level_tuning.txt`に変換情報を書き、`hayabusa-1.5.1-win-x64.exe --level-tuning`を実行することでルールファイルが書き換えられます。 ルールファイルが直接書き換えられることに注意して使用してください。 `./rules/config/level_tuning.txt`の例: @@ -674,12 +804,9 @@ id,new_level ## イベントIDフィルタリング -`config/target_eventids.txt`にイベントID番号を追加することで、イベントIDでフィルタリングすることができます。 -これはパフォーマンスを向上させるので、特定のIDだけを検索したい場合に推奨されます。 - -すべてのルールの`EventID`フィールドと実際のスキャン結果で見られるIDから作成したIDフィルタリストのサンプルを[`config/target_eventids_sample.txt`](https://github.com/Yamato-Security/hayabusa/blob/main/config/target_eventids_sample.txt)で提供しています。 - -最高のパフォーマンスを得たい場合はこのリストを使用してください。ただし、検出漏れの可能性が若干あることにご注意ください。 +デフォルトではパフォーマンスを上げるために、検知ルールでイベントIDが定義されていないイベントを無視しています。 +`./rules/config/target_event_IDs.txt`で定義されたIDがスキャンされます。 +すべてのイベントをスキャンしたい場合は、`-D, --deep-scan`オプションを使用してください。 # その他のWindowsイベントログ解析ツールおよび関連リソース @@ -687,7 +814,7 @@ id,new_level * [APT-Hunter](https://github.com/ahmedkhlief/APT-Hunter) - Pythonで開発された攻撃検知ツール。 * [Awesome Event IDs](https://github.com/stuhli/awesome-event-ids) - フォレンジック調査とインシデント対応に役立つイベントIDのリソース。 -* [Chainsaw](https://github.com/countercept/chainsaw) - Rustで開発された同様のSigmaベースの攻撃検知ツール。 +* [Chainsaw](https://github.com/countercept/chainsaw) - Rustで開発されたSigmaベースの攻撃検知ツール。 * [DeepBlueCLI](https://github.com/sans-blue-team/DeepBlueCLI) - [Eric Conrad](https://twitter.com/eric_conrad) によってPowershellで開発された攻撃検知ツール。 * [Epagneul](https://github.com/jurelou/epagneul) - Windowsイベントログの可視化ツール。 * [EventList](https://github.com/miriamxyra/EventList/) - [Miriam Wiesner](https://github.com/miriamxyra)によるセキュリティベースラインの有効なイベントIDをMITRE ATT&CKにマッピングするPowerShellツール。 @@ -728,6 +855,7 @@ Windows機での悪性な活動を検知する為には、デフォルトのロ ## 英語 +* 2022/06/19 
[VelociraptorチュートリアルとHayabusaの統合方法](https://www.youtube.com/watch?v=Q1IoGX--814) by [Eric Capuano](https://twitter.com/eric_capuano)
* 2022/01/24 [Hayabusa結果をneo4jで可視化する方法](https://www.youtube.com/watch?v=7sQqz2ek-ko) by Matthew Seyer ([@forensic_matt](https://twitter.com/forensic_matt))

## 日本語

diff --git a/README.md b/README.md
index efff6bb9..2f2231c1 100644
--- a/README.md
+++ b/README.md
@@ -1,16 +1,16 @@

- Hayabusa Logo + Hayabusa Logo

[ English ] | [日本語]
--- -[tag-1]: https://img.shields.io/github/downloads/Yamato-Security/hayabusa/total?style=plastic&label=GitHub%F0%9F%A6%85DownLoads +[tag-1]: https://img.shields.io/github/downloads/Yamato-Security/hayabusa/total?style=plastic&label=GitHub%F0%9F%A6%85Downloads [tag-2]: https://img.shields.io/github/stars/Yamato-Security/hayabusa?style=plastic&label=GitHub%F0%9F%A6%85Stars [tag-3]: https://img.shields.io/github/v/release/Yamato-Security/hayabusa?display_name=tag&label=latest-version&style=plastic -[tag-4]: https://img.shields.io/badge/Black%20Hat%20Arsenal-Asia%202022-blue +[tag-4]: https://github.com/toolswatch/badges/blob/master/arsenal/asia/2022.svg [tag-5]: https://rust-reportcard.xuri.me/badge/github.com/Yamato-Security/hayabusa [tag-6]: https://img.shields.io/badge/Maintenance%20Level-Actively%20Developed-brightgreen.svg [tag-7]: https://img.shields.io/badge/Twitter-00acee?logo=twitter&logoColor=white @@ -20,14 +20,14 @@ # About Hayabusa -Hayabusa is a **Windows event log fast forensics timeline generator** and **threat hunting tool** created by the [Yamato Security](https://yamatosecurity.connpass.com/) group in Japan. Hayabusa means ["peregrine falcon"](https://en.wikipedia.org/wiki/Peregrine_falcon") in Japanese and was chosen as peregrine falcons are the fastest animal in the world, great at hunting and highly trainable. It is written in [Rust](https://www.rust-lang.org/) and supports multi-threading in order to be as fast as possible. We have provided a [tool](https://github.com/Yamato-Security/hayabusa-rules/tree/main/tools/sigmac) to convert [sigma](https://github.com/SigmaHQ/sigma) rules into hayabusa rule format. The hayabusa detection rules are based on sigma rules, written in YML in order to be as easily customizable and extensible as possible. It can be run either on running systems for live analysis or by gathering logs from multiple systems for offline analysis. (At the moment, it does not support real-time alerting or periodic scans.) 
The output will be consolidated into a single CSV timeline for easy analysis in Excel, [Timeline Explorer](https://ericzimmerman.github.io/#!index.md), or [Elastic Stack](doc/ElasticStackImport/ElasticStackImport-English.md).
+Hayabusa is a **Windows event log fast forensics timeline generator** and **threat hunting tool** created by the [Yamato Security](https://yamatosecurity.connpass.com/) group in Japan. Hayabusa means ["peregrine falcon"](https://en.wikipedia.org/wiki/Peregrine_falcon) in Japanese and was chosen as peregrine falcons are the fastest animal in the world, great at hunting and highly trainable. It is written in [Rust](https://www.rust-lang.org/) and supports multi-threading in order to be as fast as possible. We have provided a [tool](https://github.com/Yamato-Security/hayabusa-rules/tree/main/tools/sigmac) to convert [Sigma](https://github.com/SigmaHQ/sigma) rules into Hayabusa rule format. The Sigma-compatible Hayabusa detection rules are written in YML in order to be as easily customizable and extensible as possible. Hayabusa can be run either on single running systems for live analysis, by gathering logs from single or multiple systems for offline analysis, or by running the [Hayabusa artifact](https://docs.velociraptor.app/exchange/artifacts/pages/windows.eventlogs.hayabusa/) with [Velociraptor](https://docs.velociraptor.app/) for enterprise-wide threat hunting and incident response. The output will be consolidated into a single CSV timeline for easy analysis in Excel, [Timeline Explorer](https://ericzimmerman.github.io/#!index.md), [Elastic Stack](doc/ElasticStackImport/ElasticStackImport-English.md), [Timesketch](https://timesketch.org/), etc...

## Table of Contents - [About Hayabusa](#about-hayabusa) - [Table of Contents](#table-of-contents) - [Main Goals](#main-goals) - - [Threat Hunting](#threat-hunting) + - [Threat Hunting and Enterprise-wide DFIR](#threat-hunting-and-enterprise-wide-dfir) - [Fast Forensics Timeline Generation](#fast-forensics-timeline-generation) - [Screenshots](#screenshots) - [Startup](#startup) @@ -38,9 +38,9 @@ Hayabusa is a **Windows event log fast forensics timeline generator** and **thre - [Analysis in Timeline Explorer](#analysis-in-timeline-explorer) - [Critical Alert Filtering and Computer Grouping in Timeline Explorer](#critical-alert-filtering-and-computer-grouping-in-timeline-explorer) - [Analysis with the Elastic Stack Dashboard](#analysis-with-the-elastic-stack-dashboard) + - [Analysis in Timesketch](#analysis-in-timesketch) - [Analyzing Sample Timeline Results](#analyzing-sample-timeline-results) - [Features](#features) -- [Planned Features](#planned-features) - [Downloads](#downloads) - [Git cloning](#git-cloning) - [Advanced: Compiling From Source (Optional)](#advanced-compiling-from-source-optional) @@ -48,26 +48,38 @@ Hayabusa is a **Windows event log fast forensics timeline generator** and **thre - [Cross-compiling 32-bit Windows Binaries](#cross-compiling-32-bit-windows-binaries) - [macOS Compiling Notes](#macos-compiling-notes) - [Linux Compiling Notes](#linux-compiling-notes) + - [Cross-compiling Linux MUSL Binaries](#cross-compiling-linux-musl-binaries) - [Running Hayabusa](#running-hayabusa) - - [Caution: Anti-Virus/EDR Warnings](#caution-anti-virusedr-warnings) + - [Caution: Anti-Virus/EDR Warnings and Slow Runtimes](#caution-anti-virusedr-warnings-and-slow-runtimes) - [Windows](#windows) - [Linux](#linux) - [macOS](#macos) - [Usage](#usage) + - [Main commands](#main-commands) - [Command Line Options](#command-line-options) - [Usage Examples](#usage-examples) - [Pivot Keyword Generator](#pivot-keyword-generator) - [Logon Summary 
Generator](#logon-summary-generator) - [Testing Hayabusa on Sample Evtx Files](#testing-hayabusa-on-sample-evtx-files) - [Hayabusa Output](#hayabusa-output) + - [Profiles](#profiles) + - [1. `minimal` profile output](#1-minimal-profile-output) + - [2. `standard` profile output](#2-standard-profile-output) + - [3. `verbose` profile output](#3-verbose-profile-output) + - [4. `verbose-all-field-info` profile output](#4-verbose-all-field-info-profile-output) + - [5. `verbose-details-and-all-field-info` profile output](#5-verbose-details-and-all-field-info-profile-output) + - [6. `timesketch` profile output](#6-timesketch-profile-output) + - [Profile Comparison](#profile-comparison) + - [Profile Field Aliases](#profile-field-aliases) - [Level Abbrevations](#level-abbrevations) - [MITRE ATT&CK Tactics Abbreviations](#mitre-attck-tactics-abbreviations) - [Channel Abbreviations](#channel-abbreviations) - [Progress Bar](#progress-bar) - [Color Output](#color-output) - - [Event Fequency Timeline](#event-fequency-timeline) - - [Dates with most total detections](#dates-with-most-total-detections) - - [Top 5 computers with most unique detections](#top-5-computers-with-most-unique-detections) + - [Results Summary](#results-summary-1) + - [Event Fequency Timeline](#event-fequency-timeline) + - [Dates with most total detections](#dates-with-most-total-detections) + - [Top 5 computers with most unique detections](#top-5-computers-with-most-unique-detections) - [Hayabusa Rules](#hayabusa-rules) - [Hayabusa v.s. Converted Sigma Rules](#hayabusa-vs-converted-sigma-rules) - [Detection Rule Tuning](#detection-rule-tuning) @@ -86,36 +98,36 @@ Hayabusa is a **Windows event log fast forensics timeline generator** and **thre ## Main Goals -### Threat Hunting +### Threat Hunting and Enterprise-wide DFIR -Hayabusa currently has over 2300 sigma rules and over 130 hayabusa rules with more rules being added regularly. 
The ultimate goal is to be able to push out hayabusa agents to all Windows endpoints after an incident or for periodic threat hunting and have them alert back to a central server. +Hayabusa currently has over 2600 Sigma rules and over 130 Hayabusa built-in detection rules with more rules being added regularly. It can be used for enterprise-wide proactive threat hunting as well as DFIR (Digital Forensics and Incident Response) for free with [Velociraptor](https://docs.velociraptor.app/)'s [Hayabusa artifact](https://docs.velociraptor.app/exchange/artifacts/pages/windows.eventlogs.hayabusa/). By combining these two open-source tools, you can essentially retroactively reproduce a SIEM when there is no SIEM setup in the environment. You can learn about how to do this by watching [Eric Capuano](https://twitter.com/eric_capuano)'s Velociraptor walkthrough [here](https://www.youtube.com/watch?v=Q1IoGX--814). ### Fast Forensics Timeline Generation -Windows event log analysis has traditionally been a very long and tedious process because Windows event logs are 1) in a data format that is hard to analyze and 2) the majority of data is noise and not useful for investigations. Hayabusa's main goal is to extract out only useful data and present it in an easy-to-read format that is usable not only by professionally trained analysts but any Windows system administrator. -Hayabusa is not intended to be a replacement for tools like [Evtx Explorer](https://ericzimmerman.github.io/#!index.md) or [Event Log Explorer](https://eventlogxp.com/) for more deep-dive analysis but is intended for letting analysts get 80% of their work done in 20% of the time. +Windows event log analysis has traditionally been a very long and tedious process because Windows event logs are 1) in a data format that is hard to analyze and 2) the majority of data is noise and not useful for investigations. 
Hayabusa's goal is to extract out only useful data and present it in a format that is as concise and easy to read as possible, usable not only by professionally trained analysts but also by any Windows system administrator.
+Hayabusa hopes to let analysts get 80% of their work done in 20% of the time when compared to traditional Windows event log analysis.

# Screenshots

## Startup

-![Hayabusa Startup](/screenshots/Hayabusa-Startup.png)
+![Hayabusa Startup](screenshots/Hayabusa-Startup.png)

## Terminal Output

-![Hayabusa terminal output](/screenshots/Hayabusa-Results.png)
+![Hayabusa terminal output](screenshots/Hayabusa-Results.png)

## Event Fequency Timeline (`-V` option)

-![Hayabusa Event Frequency Timeline](/screenshots/HayabusaEventFrequencyTimeline.png)
+![Hayabusa Event Frequency Timeline](screenshots/HayabusaEventFrequencyTimeline.png)

## Results Summary

-![Hayabusa results summary](/screenshots/HayabusaResultsSummary.png)
+![Hayabusa results summary](screenshots/HayabusaResultsSummary.png)

## Analysis in Excel

-![Hayabusa analysis in Excel](/screenshots/ExcelScreenshot.png)
+![Hayabusa analysis in Excel](screenshots/ExcelScreenshot.png)

## Analysis in Timeline Explorer

@@ -131,6 +143,10 @@ Hayabusa is not intended to be a replacement for tools like [Evtx Explorer](http

![Elastic Stack Dashboard 2](doc/ElasticStackImport/18-HayabusaDashboard-2.png)

+## Analysis in Timesketch
+
+![Timesketch](screenshots/TimesketchAnalysis.png)
+
# Analyzing Sample Timeline Results

You can check out a sample CSV timeline [here](https://github.com/Yamato-Security/hayabusa/tree/main/sample-results).

@@ -139,6 +155,8 @@ You can learn how to analyze CSV timelines in Excel and Timeline Explorer [here]

You can learn how to import CSV files into Elastic Stack [here](doc/ElasticStackImport/ElasticStackImport-English.md).

+You can learn how to import CSV files into Timesketch [here](doc/TimesketchImport/TimesketchImport-English.md).
+
# Features

* Cross-platform support: Windows, Linux, macOS. 
@@ -155,15 +173,11 @@ You can learn how to import CSV files into Elastic Stack [here](doc/ElasticStack * Create a list of unique pivot keywords to quickly identify abnormal users, hostnames, processes, etc... as well as correlate events. * Output all fields for more thorough investigations. * Successful and failed logon summary. - -# Planned Features - -* Enterprise-wide hunting on all endpoints. -* MITRE ATT&CK heatmap generation. +* Enterprise-wide threat hunting and DFIR on all endpoints with [Velociraptor](https://docs.velociraptor.app/). # Downloads -Please download the latest stable version of hayabusa with compiled binaries or the source code from the [Releases](https://github.com/Yamato-Security/hayabusa/releases) page. +Please download the latest stable version of Hayabusa with compiled binaries or compile the source code from the [Releases](https://github.com/Yamato-Security/hayabusa/releases) page. # Git cloning @@ -180,7 +194,7 @@ Note: If you forget to use --recursive option, the `rules` folder, which is mana You can sync the `rules` folder and get latest Hayabusa rules with `git pull --recurse-submodules` or use the following command: ```bash -hayabusa-1.3.2-win-x64.exe -u +hayabusa-1.5.1-win-x64.exe -u ``` If the update fails, you may need to rename the `rules` folder and try again. @@ -188,14 +202,13 @@ If the update fails, you may need to rename the `rules` folder and try again. >> Caution: When updating, rules and config files in the `rules` folder are replaced with the latest rules and config files in the [hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules) repository. >> Any changes you make to existing files will be overwritten, so we recommend that you make backups of any files that you edit before updating. >> If you are performing level tuning with `--level-tuning`, please re-tune your rule files after each update. ->> If you add new rules inside of the `rules` folder, they will **not** be overwritten or deleted when updating. 
+>> If you add **new** rules inside of the `rules` folder, they will **not** be overwritten or deleted when updating.

# Advanced: Compiling From Source (Optional)

If you have Rust installed, you can compile from source with the following command:

```bash
-cargo clean
cargo build --release
```

@@ -207,7 +220,7 @@ Be sure to periodically update Rust with:

rustup update stable
```

-The compiled binary will be outputted in the `target/release` folder.
+The compiled binary will be outputted in the `./target/release` folder.

## Updating Rust Packages

@@ -254,31 +267,52 @@ Fedora-based distros:

sudo yum install openssl-devel
```

+## Cross-compiling Linux MUSL Binaries
+
+On a Linux OS, first install the target.
+
+```bash
+rustup install stable-x86_64-unknown-linux-musl
+rustup target add x86_64-unknown-linux-musl
+```
+
+Compile with:
+
+```
+cargo build --release --target=x86_64-unknown-linux-musl
+```
+
+The MUSL binary will be created in the `./target/x86_64-unknown-linux-musl/release/` directory.
+MUSL binaries are about 15% slower than the GNU binaries.
+
# Running Hayabusa

-## Caution: Anti-Virus/EDR Warnings
+## Caution: Anti-Virus/EDR Warnings and Slow Runtimes

You may receive an alert from anti-virus or EDR products when trying to run hayabusa or even just when downloading the `.yml` rules as there will be keywords like `mimikatz` and suspicious PowerShell commands in the detection signature. These are false positives so will need to configure exclusions in your security products to allow hayabusa to run. If you are worried about malware or supply chain attacks, please check the hayabusa source code and compile the binaries yourself.

+You may experience slow runtime especially on the first run after a reboot due to the real-time protection of Windows Defender. You can avoid this by temporarily turning real-time protection off or adding an exclusion to the hayabusa runtime directory. (Please take into consideration the security risks before doing these.) 
+
 ## Windows
 
-In Command Prompt or Windows Terminal, just run the 32-bit or 64-bit Windows binary from the hayabusa root directory.
-Example: `hayabusa-1.3.2-windows-x64.exe`
+In a Command Prompt, PowerShell prompt or Windows Terminal, just run the appropriate 32-bit or 64-bit Windows binary.
+
+Example: `hayabusa-1.5.1-windows-x64.exe`
 
 ## Linux
 
 You first need to make the binary executable.
 
 ```bash
-chmod +x ./hayabusa-1.3.2-linux-x64-gnu
+chmod +x ./hayabusa-1.5.1-linux-x64-gnu
 ```
 
 Then run it from the Hayabusa root directory:
 
 ```bash
-./hayabusa-1.3.2-linux-x64-gnu
+./hayabusa-1.5.1-linux-x64-gnu
 ```
 
 ## macOS
 
@@ -286,159 +320,186 @@ Then run it from the Hayabusa root directory:
 From Terminal or iTerm2, you first need to make the binary executable.
 
 ```bash
-chmod +x ./hayabusa-1.3.2-mac-intel
+chmod +x ./hayabusa-1.5.1-mac-intel
 ```
 
 Then, try to run it from the Hayabusa root directory:
 
 ```bash
-./hayabusa-1.3.2-mac-intel
+./hayabusa-1.5.1-mac-intel
 ```
 
 On the latest version of macOS, you may receive the following security error when you try to run it:
 
-![Mac Error 1 EN](/screenshots/MacOS-RunError-1-EN.png)
+![Mac Error 1 EN](screenshots/MacOS-RunError-1-EN.png)
 
 Click "Cancel" and then from System Preferences, open "Security & Privacy" and from the General tab, click "Allow Anyway".
 
-![Mac Error 2 EN](/screenshots/MacOS-RunError-2-EN.png)
+![Mac Error 2 EN](screenshots/MacOS-RunError-2-EN.png)
 
 After that, try to run it again.
 
 ```bash
-./hayabusa-1.3.2-mac-intel
+./hayabusa-1.5.1-mac-intel
 ```
 
 The following warning will pop up, so please click "Open".
 
-![Mac Error 3 EN](/screenshots/MacOS-RunError-3-EN.png)
+![Mac Error 3 EN](screenshots/MacOS-RunError-3-EN.png)
 
 You should now be able to run hayabusa.
 
 # Usage
 
+## Main commands
+
+* default: Create a fast forensics timeline.
+* `--level-tuning`: Custom tune the alerts' `level`.
+* `-L, --logon-summary`: Print a summary of logon events.
+* `-p, --pivot-keywords-list`: Print a list of suspicious keywords to pivot on.
+* `-s, --statistics`: Print metrics of the count and percentage of events based on Event ID. +* `--set-default-profile`: Change the default profile. +* `-u, --update`: Sync the rules to the latest rules in the [hayabusa-rules](https://github.com/Yamato-Security/hayabusa-rules) GitHub repository. ## Command Line Options ``` USAGE: - hayabusa.exe -f file.evtx [OPTIONS] / hayabusa.exe -d evtx-directory [OPTIONS] + hayabusa.exe [OTHER-ACTIONS] [OPTIONS] -OPTIONS: - --European-time Output timestamp in European time format (ex: 22-02-2022 22:00:00.123 +02:00) - --RFC-2822 Output timestamp in RFC 2822 format (ex: Fri, 22 Feb 2022 22:00:00 -0600) - --RFC-3339 Output timestamp in RFC 3339 format (ex: 2022-02-22 22:00:00.123456-06:00) - --US-military-time Output timestamp in US military time format (ex: 02-22-2022 22:00:00.123 -06:00) - --US-time Output timestamp in US time format (ex: 02-22-2022 10:00:00.123 PM -06:00) - --target-file-ext ... Specify additional target file extensions (ex: evtx_data) (ex: evtx1 evtx2) - --all-tags Output all tags when saving to a CSV file - -c, --config Specify custom rule config folder (default: ./rules/config) - --contributors Print the list of contributors - -d, --directory Directory of multiple .evtx files - -D, --enable-deprecated-rules Enable rules marked as deprecated - --end-timeline End time of the event logs to load (ex: "2022-02-22 23:59:59 +09:00") - -f, --filepath File path to one .evtx file - -F, --full-data Print all field information - -h, --help Print help information - -l, --live-analysis Analyze the local C:\Windows\System32\winevt\Logs folder - -L, --logon-summary Print a summary of successful and failed logons - --level-tuning Tune alert levels (default: ./rules/config/level_tuning.txt) - -m, --min-level Minimum level for rules (default: informational) - -n, --enable-noisy-rules Enable rules marked as noisy - --no-color Disable color output - -o, --output Save the timeline in CSV format (ex: results.csv) - -p, 
--pivot-keywords-list Create a list of pivot keywords - -q, --quiet Quiet mode: do not display the launch banner - -Q, --quiet-errors Quiet errors mode: do not save error logs - -r, --rules Specify a rule directory or file (default: ./rules) - -R, --hide-record-ID Do not display EventRecordID numbers - -s, --statistics Print statistics of event IDs - --start-timeline Start time of the event logs to load (ex: "2020-02-22 00:00:00 +09:00") - -t, --thread-number Thread number (default: optimal number for performance) - -u, --update-rules Update to the latest rules in the hayabusa-rules github repository - -U, --UTC Output time in UTC format (default: local time) - -v, --verbose Output verbose information - -V, --visualize-timeline Output event frequency timeline - --version Print version information +INPUT: + -d, --directory Directory of multiple .evtx files + -f, --file File path to one .evtx file + -l, --live-analysis Analyze the local C:\Windows\System32\winevt\Logs folder + +ADVANCED: + -c, --rules-config Specify custom rule config directory (default: ./rules/config) + -Q, --quiet-errors Quiet errors mode: do not save error logs + -r, --rules Specify a custom rule directory or file (default: ./rules) + -t, --thread-number Thread number (default: optimal number for performance) + --target-file-ext ... 
Specify additional target file extensions (ex: evtx_data) (ex: evtx1 evtx2) + +OUTPUT: + -o, --output Save the timeline in CSV format (ex: results.csv) + -P, --profile Specify output profile (minimal, standard, verbose, verbose-all-field-info, verbose-details-and-all-field-info) + +DISPLAY-SETTINGS: + --no-color Disable color output + --no-summary Do not display result summary + -q, --quiet Quiet mode: do not display the launch banner + -v, --verbose Output verbose information + -V, --visualize-timeline Output event frequency timeline + +FILTERING: + -D, --deep-scan Disable event ID filter to scan all events (slower) + --enable-deprecated-rules Enable rules marked as deprecated + --exclude-status ... Ignore rules according to status (ex: experimental) (ex: stable test) + -m, --min-level Minimum level for rules (default: informational) + -n, --enable-noisy-rules Enable rules marked as noisy + --timeline-end End time of the event logs to load (ex: "2022-02-22 23:59:59 +09:00") + --timeline-start Start time of the event logs to load (ex: "2020-02-22 00:00:00 +09:00") + +OTHER-ACTIONS: + --contributors Print the list of contributors + -L, --logon-summary Print a summary of successful and failed logons + --level-tuning [] Tune alert levels (default: ./rules/config/level_tuning.txt) + -p, --pivot-keywords-list Create a list of pivot keywords + -s, --statistics Print statistics of event IDs + --set-default-profile Set default output profile + -u, --update-rules Update to the latest rules in the hayabusa-rules github repository + +TIME-FORMAT: + --European-time Output timestamp in European time format (ex: 22-02-2022 22:00:00.123 +02:00) + --RFC-2822 Output timestamp in RFC 2822 format (ex: Fri, 22 Feb 2022 22:00:00 -0600) + --RFC-3339 Output timestamp in RFC 3339 format (ex: 2022-02-22 22:00:00.123456-06:00) + --US-military-time Output timestamp in US military time format (ex: 02-22-2022 22:00:00.123 -06:00) + --US-time Output timestamp in US time format (ex: 02-22-2022 
10:00:00.123 PM -06:00)
+    -U, --UTC                  Output time in UTC format (default: local time)
 ```
 
 ## Usage Examples
 
-* Run hayabusa against one Windows event log file:
+* Run hayabusa against one Windows event log file with the default `standard` profile:
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -f eventlog.evtx
+hayabusa-1.5.1-win-x64.exe -f eventlog.evtx
 ```
 
-* Run hayabusa against the sample-evtx directory with multiple Windows event log files:
+* Run hayabusa against the sample-evtx directory with multiple Windows event log files with the `verbose` profile:
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -P verbose
 ```
 
-* Export to a single CSV file for further analysis with excel, timeline explorer, elastic stack, etc... and include all field information:
+* Export to a single CSV file for further analysis with Excel, Timeline Explorer, Elastic Stack, etc... and include all field information (Warning: your file output size will become much larger with the `verbose-details-and-all-field-info` profile!):
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -o results.csv -F
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -o results.csv -P verbose-details-and-all-field-info
 ```
 
 * Only run hayabusa rules (the default is to run all the rules in `-r .\rules`):
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa -o results.csv
 ```
 
 * Only run hayabusa rules for logs that are enabled by default on Windows:
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default -o results.csv
 ```
 
 * Only run hayabusa rules for sysmon logs:
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\sysmon -o results.csv
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r 
.\rules\hayabusa\sysmon -o results.csv ``` * Only run sigma rules: ```bash -hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\sigma -o results.csv +hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\sigma -o results.csv ``` * Enable deprecated rules (those with `status` marked as `deprecated`) and noisy rules (those whose rule ID is listed in `.\rules\config\noisy_rules.txt`): ```bash -hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx --enable-noisy-rules --enable-deprecated-rules -o results.csv +hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx --enable-noisy-rules --enable-deprecated-rules -o results.csv ``` * Only run rules to analyze logons and output in the UTC timezone: ```bash -hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default\events\Security\Logons -U -o results.csv +hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -r .\rules\hayabusa\default\events\Security\Logons -U -o results.csv ``` * Run on a live Windows machine (requires Administrator privileges) and only detect alerts (potentially malicious behavior): ```bash -hayabusa-1.3.2-win-x64.exe -l -m low +hayabusa-1.5.1-win-x64.exe -l -m low ``` * Create a list of pivot keywords from critical alerts and save the results. 
(Results will be saved to `keywords-Ip Addresses.txt`, `keywords-Users.txt`, etc...):
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -l -m critical -p -o keywords
+hayabusa-1.5.1-win-x64.exe -l -m critical -p -o keywords
 ```
 
 * Print Event ID statistics:
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -f Security.evtx -s
+hayabusa-1.5.1-win-x64.exe -f Security.evtx -s
+```
+
+* Print logon summary:
+
+```bash
+hayabusa-1.5.1-win-x64.exe -L -f Security.evtx
 ```
 
 * Print verbose information (useful for determining which files take a long time to process, parsing errors, etc...):
 
 ```bash
-hayabusa-1.3.2-win-x64.exe -d .\hayabusa-sample-evtx -v
+hayabusa-1.5.1-win-x64.exe -d .\hayabusa-sample-evtx -v
 ```
 
 * Verbose output example:
 
@@ -456,13 +517,19 @@ Checking target evtx FilePath: "./hayabusa-sample-evtx/YamatoSecurity/T1218.004_
 5 / 509 [=>------------------------------------------------------------------------------------------------------------------------------------------] 0.98 % 1s
 ```
 
+* Output to a CSV format compatible with importing into [Timesketch](https://timesketch.org/):
+
+```bash
+hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U
+```
+
 * Quiet error mode:
 By default, hayabusa will save error messages to error log files.
 If you do not want to save error messages, please add `-Q`.
 
 ## Pivot Keyword Generator
 
-You can use the `-p` or `--pivot-keywords-list` option to create a list of unique pivot keywords to quickly identify abnormal users, hostnames, processes, etc... as well as correlate events. You can customize what keywords you want to search for by editing `config/pivot_keywords.txt`.
+You can use the `-p` or `--pivot-keywords-list` option to create a list of unique pivot keywords to quickly identify abnormal users, hostnames, processes, etc... as well as correlate events. You can customize what keywords you want to search for by editing `./config/pivot_keywords.txt`.
This is the default setting: ``` @@ -493,28 +560,84 @@ You can download the sample evtx files to a new `hayabusa-sample-evtx` sub-direc git clone https://github.com/Yamato-Security/hayabusa-sample-evtx.git ``` -> Note: You need to run the binary from the Hayabusa root directory. - # Hayabusa Output -When hayabusa output is being displayed to the screen (the default), it will display the following information: +## Profiles -* `Timestamp`: Default is `YYYY-MM-DD HH:mm:ss.sss +hh:mm` format. This comes from the `` field in the event log. The default timezone will be the local timezone but you can change the timezone to UTC with the `--utc` option. -* `Computer`: This comes from the `` field in the event log. -* `Channel`: The name of log. This comes from the `` field in the event log. -* `Event ID`: This comes from the `` field in the event log. -* `Level`: This comes from the `level` field in the YML detection rule. (`informational`, `low`, `medium`, `high`, `critical`) By default, all level alerts will be displayed but you can set the minimum level with `-m`. For example, you can set `-m high`) in order to only scan for and display high and critical alerts. -* `RecordID`: This comes from the `` field in the event log. You can hidde this output with the `-R` or `--hide-record-id` option. -* `Title`: This comes from the `title` field in the YML detection rule. -* `Details`: This comes from the `details` field in the YML detection rule, however, only hayabusa rules have this field. This field gives extra information about the alert or event and can extract useful data from the fields in event logs. For example, usernames, command line information, process information, etc... When a placeholder points to a field that does not exist or there is an incorrect alias mapping, it will be outputted as `n/a` (not available). If the `details` field is not specified (i.e. 
sigma rules), default `details` messages to extract fields defined in `./rules/config/default_details.txt` will be outputted. You can add more default `details` messages by adding the `Provider Name`, `EventID` and `details` message you want to output in `default_details.txt`.
+Hayabusa has 6 pre-defined profiles to use in `config/profiles.yaml`:
 
-The following additional columns will be added to the output when saving to a CSV file:
+1. `minimal`
+2. `standard` (default)
+3. `verbose`
+4. `verbose-all-field-info`
+5. `verbose-details-and-all-field-info`
+6. `timesketch`
 
-* `MitreAttack`: MITRE ATT&CK tactics.
-* `Rule Path`: The path to the detection rule that generated the alert or event.
-* `File Path`: The path to the evtx file that caused the alert or event.
+You can easily customize or add your own profiles by editing this file.
+You can also easily change the default profile with `--set-default-profile <profile name>`.
 
-If you add the `-F` or `--full-data` option, a `RecordInformation` column with all field information will also be added.
+### 1. `minimal` profile output
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%RuleTitle%`, `%Details%`
+
+### 2. `standard` profile output
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%RecordID%`, `%RuleTitle%`, `%Details%`
+
+### 3. `verbose` profile output
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`
+
+### 4. `verbose-all-field-info` profile output
+
+Instead of outputting the minimal `details` information, all field information in the `EventData` section will be outputted.
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%AllFieldInfo%`, `%RuleFile%`, `%EvtxFile%`
+
+### 5. 
`verbose-details-and-all-field-info` profile output
+
+`verbose` profile plus all field information. (Warning: this will usually double the output file size!)
+
+`%Timestamp%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%RuleTitle%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`, `%AllFieldInfo%`
+
+### 6. `timesketch` profile output
+
+The `verbose` profile that is compatible with importing into [Timesketch](https://timesketch.org/).
+
+`%Timestamp%`, `hayabusa`, `%RuleTitle%`, `%Computer%`, `%Channel%`, `%EventID%`, `%Level%`, `%MitreTactics%`, `%MitreTags%`, `%OtherTags%`, `%RecordID%`, `%Details%`, `%RuleFile%`, `%EvtxFile%`
+
+### Profile Comparison
+
+The following benchmarks were conducted on a 2018 MacBook Pro with 7.5GB of evtx data.
+
+| Profile | Processing Time | Output Filesize |
+| :---: | :---: | :---: |
+| minimal | 16 minutes 18 seconds | 690 MB |
+| standard | 16 minutes 23 seconds | 710 MB |
+| verbose | 17 minutes | 990 MB |
+| timesketch | 17 minutes | 1015 MB |
+| verbose-all-field-info | 16 minutes 50 seconds | 1.6 GB |
+| verbose-details-and-all-field-info | 17 minutes 12 seconds | 2.1 GB |
+
+### Profile Field Aliases
+
+| Alias name | Hayabusa output information |
+| :--- | :--- |
+|%Timestamp% | Default is `YYYY-MM-DD HH:mm:ss.sss +hh:mm` format. Comes from the `<Event><System><TimeCreated SystemTime>` field in the event log. The default timezone will be the local timezone but you can change the timezone to UTC with the `--UTC` option. |
+|%Computer% | The `<Event><System><Computer>` field. |
+|%Channel% | The name of the log. The `<Event><System><Channel>` field. |
+|%EventID% | The `<Event><System><EventID>` field. |
+|%Level% | The `level` field in the YML detection rule. (`informational`, `low`, `medium`, `high`, `critical`) |
+|%MitreTactics% | MITRE ATT&CK [tactics](https://attack.mitre.org/tactics/enterprise/) (Ex: Initial Access, Lateral Movement, etc...). |
+|%MitreTags% | MITRE ATT&CK Group ID, Technique ID and Software ID. 
|
+|%OtherTags% | Any keyword in the `tags` field in a YML detection rule which is not included in `MitreTactics` or `MitreTags`. |
+|%RecordID% | The Event Record ID from the `<Event><System><EventRecordID>` field. |
+|%RuleTitle% | The `title` field in the YML detection rule. |
+|%Details% | The `details` field in the YML detection rule, however, only hayabusa rules have this field. This field gives extra information about the alert or event and can extract useful data from the fields in event logs. For example, usernames, command line information, process information, etc... When a placeholder points to a field that does not exist or there is an incorrect alias mapping, it will be outputted as `n/a` (not available). If the `details` field is not specified (i.e. sigma rules), default `details` messages to extract fields defined in `./rules/config/default_details.txt` will be outputted. You can add more default `details` messages by adding the `Provider Name`, `EventID` and `details` message you want to output in `default_details.txt`. When no `details` field is defined in a rule nor in `default_details.txt`, all fields will be outputted to the `details` column. |
+|%AllFieldInfo% | All field information. |
+|%RuleFile% | The filename of the detection rule that generated the alert or event. |
+|%EvtxFile% | The evtx filename that caused the alert or event. |
+
+You can use these aliases in your output profiles, as well as define other [event key aliases](https://github.com/Yamato-Security/hayabusa-rules/blob/main/README.md#eventkey-aliases) to output other fields.
 
 ## Level Abbrevations
 
@@ -529,7 +652,7 @@ In order to save space, we use the following abbrevations when displaying the al
 
 ## MITRE ATT&CK Tactics Abbreviations
 
 In order to save space, we use the following abbreviations when displaying MITRE ATT&CK tactic tags.
-You can freely edit these abbreviations in the `config/output_tag.txt` configuration file. 
+You can freely edit these abbreviations in the `./config/mitre_tactics.txt` configuration file.
 If you want to output all the tags defined in a rule, please specify the `--all-tags` option.
 
 * `Recon` : Reconnaissance
 
@@ -550,7 +673,7 @@ If you want to output all the tags defined in a rule, please specify the `--all-
 ## Channel Abbreviations
 
 In order to save space, we use the following abbreviations when displaying Channel.
-You can freely edit these abbreviations in the `config/channel_abbreviations.txt` configuration file.
+You can freely edit these abbreviations in the `./rules/config/channel_abbreviations.txt` configuration file.
 
 * `App` : `Application`
 * `AppLocker` : `Microsoft-Windows-AppLocker/*`
 
@@ -592,16 +715,18 @@ The alerts will be outputted in color based on the alert `level`.
 You can change the default colors in the config file at `./config/level_color.txt` in the format of `level,(RGB 6-digit ColorHex)`.
 If you want to disable color output, you can use the `--no-color` option.
 
-## Event Fequency Timeline
+## Results Summary
+
+### Event Frequency Timeline
 
 If you add the `-V` or `--visualize-timeline` option, the Event Frequency Timeline feature displays a sparkline frequency timeline of detected events.
 
 Note: There needs to be more than 5 events. Also, the characters will not render correctly on the default Command Prompt or PowerShell Prompt, so please use a terminal like Windows Terminal, iTerm2, etc...
 
-## Dates with most total detections
+### Dates with most total detections
 
 A summary of the dates with the most total detections categorized by level (`critical`, `high`, etc...).
 
-## Top 5 computers with most unique detections
+### Top 5 computers with most unique detections
 
 The top 5 computers with the most unique detections categorized by level (`critical`, `high`, etc...).
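Similar per-level tallies can also be reproduced offline from a saved CSV. A minimal `awk` sketch, assuming the `standard` profile column order (`Level` is the 5th column) and a hypothetical `sample.csv` standing in for a real `results.csv` (note: a naive comma split will break on quoted fields that themselves contain commas, which real output can have):

```bash
# Build a tiny stand-in for a results.csv saved with the standard profile.
printf '%s\n' \
  'Timestamp,Computer,Channel,EventID,Level,MitreTactics,RecordID,RuleTitle,Details' \
  '2022-02-22 22:00:00.123 +09:00,DC01,Sec,4625,high,CredAccess,1234,Logon Failure,TgtUser: admin' \
  '2022-02-22 22:01:00.789 +09:00,WS02,Sysmon,1,crit,Execution,99,Suspicious Process,Cmdline: mimikatz.exe' \
  > sample.csv

# Tally detections per level (skip the header row).
awk -F, 'NR > 1 { count[$5]++ } END { for (l in count) print l, count[l] }' sample.csv | sort
```

This prints one `level count` pair per line, e.g. `crit 1` and `high 1` for the sample above.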
@@ -651,15 +776,15 @@ Hayabusa rules are designed solely for Windows event log analysis and have the f
 
 Like firewalls and IDSes, any signature-based tool will require some tuning to fit your environment, so you may need to permanently or temporarily exclude certain rules.
 
-You can add a rule ID (Example: `4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6`) to `rules/config/exclude_rules.txt` in order to ignore any rule that you do not need or cannot be used.
+You can add a rule ID (Example: `4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6`) to `./rules/config/exclude_rules.txt` in order to ignore any rule that you do not need or that cannot be used.
 
-You can also add a rule ID to `rules/config/noisy_rules.txt` in order to ignore the rule by default but still be able to use the rule with the `-n` or `--enable-noisy-rules` option.
+You can also add a rule ID to `./rules/config/noisy_rules.txt` in order to ignore the rule by default but still be able to use the rule with the `-n` or `--enable-noisy-rules` option.
 
 ## Detection Level Tuning
 
 Hayabusa and Sigma rule authors will determine the risk level of the alert when writing their rules.
 However, the actual risk level will differ between environments.
-You can tune the risk level of the rules by adding them to `./rules/config/level_tuning.txt` and executing `hayabusa-1.3.2-win-x64.exe --level-tuning` which will update the `level` line in the rule file.
+You can tune the risk level of the rules by adding them to `./rules/config/level_tuning.txt` and executing `hayabusa-1.5.1-win-x64.exe --level-tuning`, which will update the `level` line in the rule file.
 Please note that the rule file will be updated directly.
 
 `./rules/config/level_tuning.txt` sample line:
 
@@ -673,12 +798,9 @@ In this case, the risk level of the rule with an `id` of `00000000-0000-0000-000
 ## Event ID Filtering
 
-You can filter on event IDs by placing event ID numbers in `config/target_eventids.txt`.
-This will increase performance so it is recommended if you only need to search for certain IDs.
-
-We have provided a sample ID filter list at [`config/target_eventids_sample.txt`](https://github.com/Yamato-Security/hayabusa/blob/main/config/target_eventids_sample.txt) created from the `EventID` fields in all of the rules as well as IDs seen in actual results.
-
-Please use this list if you want the best performance but be aware that there is a slight possibility for missing events (false negatives).
+By default, events are filtered by ID to improve performance by ignoring events that have no detection rules.
+Only the event IDs defined in `./rules/config/target_event_IDs.txt` will be scanned.
+If you want to scan all events, please use the `-D, --deep-scan` option.
 
 # Other Windows Event Log Analyzers and Related Resources
 
@@ -686,7 +808,7 @@ There is no "one tool to rule them all" and we have found that each has its own
 * [APT-Hunter](https://github.com/ahmedkhlief/APT-Hunter) - Attack detection tool written in Python.
 * [Awesome Event IDs](https://github.com/stuhli/awesome-event-ids) - Collection of Event ID resources useful for Digital Forensics and Incident Response.
-* [Chainsaw](https://github.com/countercept/chainsaw) - A similar sigma-based attack detection tool written in Rust.
+* [Chainsaw](https://github.com/countercept/chainsaw) - Another sigma-based attack detection tool written in Rust.
 * [DeepBlueCLI](https://github.com/sans-blue-team/DeepBlueCLI) - Attack detection tool written in Powershell by [Eric Conrad](https://twitter.com/eric_conrad).
 * [Epagneul](https://github.com/jurelou/epagneul) - Graph visualization for Windows event logs.
 * [EventList](https://github.com/miriamxyra/EventList/) - Map security baseline event IDs to MITRE ATT&CK by [Miriam Wiesner](https://github.com/miriamxyra).
@@ -726,6 +848,7 @@ To create the most forensic evidence and detect with the highest accuracy, you n
 
 ## English
 
+* 2022/06/19 [Velociraptor Walkthrough and Hayabusa Integration](https://www.youtube.com/watch?v=Q1IoGX--814) by [Eric Capuano](https://twitter.com/eric_capuano)
 * 2022/01/24 [Graphing Hayabusa results in neo4j](https://www.youtube.com/watch?v=7sQqz2ek-ko) by Matthew Seyer ([@forensic_matt](https://twitter.com/forensic_matt))
 
 ## Japanese
 
@@ -755,4 +878,4 @@ Hayabusa is released under [GPLv3](https://www.gnu.org/licenses/gpl-3.0.en.html)
 
 # Twitter
 
-You can recieve the latest news about Hayabusa, rule updates, other Yamato Security tools, etc... by following us on Twitter at [@SecurityYamato](https://twitter.com/SecurityYamato).
\ No newline at end of file
+You can receive the latest news about Hayabusa, rule updates, other Yamato Security tools, etc... by following us on Twitter at [@SecurityYamato](https://twitter.com/SecurityYamato).
diff --git a/build.rs b/build.rs
new file mode 100644
index 00000000..7c051a1c
--- /dev/null
+++ b/build.rs
@@ -0,0 +1,4 @@
+fn main() {
+    #[cfg(target_os = "windows")]
+    static_vcruntime::metabuild();
+}
diff --git a/config/channel_abbreviations.txt b/config/channel_abbreviations.txt
deleted file mode 100644
index 3ef8affd..00000000
--- a/config/channel_abbreviations.txt
+++ /dev/null
@@ -1,33 +0,0 @@
-Channel,Abbreviation
-Application,App
-DNS Server,DNS-Svr
-Key Management Service,KeyMgtSvc
-Microsoft-ServiceBus-Client,SvcBusCli
-Microsoft-Windows-CodeIntegrity/Operational,CodeInteg
-Microsoft-Windows-LDAP-Client/Debug,LDAP-Cli
-Microsoft-Windows-AppLocker/MSI and Script,AppLocker
-Microsoft-Windows-AppLocker/EXE and DLL,AppLocker
-Microsoft-Windows-AppLocker/Packaged app-Deployment,AppLocker
-Microsoft-Windows-AppLocker/Packaged app-Execution,AppLocker
-Microsoft-Windows-Bits-Client/Operational,BitsCli
-Microsoft-Windows-DHCP-Server/Operational,DHCP-Svr
-Microsoft-Windows-DriverFrameworks-UserMode/Operational,DvrFmwk
-Microsoft-Windows-NTLM/Operational,NTLM -Microsoft-Windows-Security-Mitigations/KernelMode,SecMitig -Microsoft-Windows-Security-Mitigations/UserMode,SecMitig -Microsoft-Windows-SmbClient/Security,SmbCliSec -Microsoft-Windows-Sysmon/Operational,Sysmon -Microsoft-Windows-TaskScheduler/Operational,TaskSch -Microsoft-Windows-TerminalServices-RDPClient/Operational,RDP-Client -Microsoft-Windows-PrintService/Admin,PrintAdm -Microsoft-Windows-PrintService/Operational,PrintOp -Microsoft-Windows-PowerShell/Operational,PwSh -Microsoft-Windows-Windows Defender/Operational,Defender -Microsoft-Windows-Windows Firewall With Advanced Security/Firewall,Firewall -Microsoft-Windows-WinRM/Operational,WinRM -Microsoft-Windows-WMI-Activity/Operational,WMI -MSExchange Management,Exchange -OpenSSH/Operational,OpenSSH -Security,Sec -System,Sys -Windows PowerShell,PwShClassic \ No newline at end of file diff --git a/config/default_profile.yaml b/config/default_profile.yaml new file mode 100644 index 00000000..394b6546 --- /dev/null +++ b/config/default_profile.yaml @@ -0,0 +1,10 @@ +--- +Timestamp: "%Timestamp%" +Computer: "%Computer%" +Channel: "%Channel%" +EventID: "%EventID%" +Level: "%Level%" +MitreTactics: "%MitreTactics%" +RecordID: "%RecordID%" +RuleTitle: "%RuleTitle%" +Details: "%Details%" \ No newline at end of file diff --git a/config/output_tag.txt b/config/mitre_tactics.txt similarity index 100% rename from config/output_tag.txt rename to config/mitre_tactics.txt diff --git a/config/profiles.yaml b/config/profiles.yaml new file mode 100644 index 00000000..de51b7f8 --- /dev/null +++ b/config/profiles.yaml @@ -0,0 +1,87 @@ +#Standard profile minus MITRE ATT&CK Tactics and Record ID. 
+minimal: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + +standard: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + MitreTactics: "%MitreTactics%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + +#Standard profile plus MitreTags(MITRE techniques, software and groups), rule filename and EVTX filename. +verbose: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + MitreTactics: "%MitreTactics%" + MitreTags: "%MitreTags%" + OtherTags: "%OtherTags%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + RuleFile: "%RuleFile%" + EvtxFile: "%EvtxFile%" + +#Verbose profile with all field information instead of the minimal fields defined in Details. +verbose-all-field-info: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + MitreTactics: "%MitreTactics%" + MitreTags: "%MitreTags%" + OtherTags: "%OtherTags%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + AllFieldInfo: "%RecordInformation%" + RuleFile: "%RuleFile%" + EvtxFile: "%EvtxFile%" + +#Verbose profile plus all field information. (Warning: this will more than double the output file size!) 
+verbose-details-and-all-field-info: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + MitreTactics: "%MitreTactics%" + MitreTags: "%MitreTags%" + OtherTags: "%OtherTags%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + RuleFile: "%RuleFile%" + EvtxFile: "%EvtxFile%" + AllFieldInfo: "%RecordInformation%" + +#Output that is compatible to import the CSV into Timesketch +timesketch: + datetime: "%Timestamp%" + timestamp_desc: "hayabusa" + message: "%RuleTitle%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + MitreTactics: "%MitreTactics%" + MitreTags: "%MitreTags%" + OtherTags: "%OtherTags%" + RecordID: "%RecordID%" + Details: "%Details%" + RuleFile: "%RuleFile%" + EvtxFile: "%EvtxFile%" + AllFieldInfo: "%RecordInformation%" \ No newline at end of file diff --git a/config/statistics_event_info.txt b/config/statistics_event_info.txt deleted file mode 100644 index 705aa564..00000000 --- a/config/statistics_event_info.txt +++ /dev/null @@ -1,496 +0,0 @@ -eventid,event_title -6406,%1 registered to Windows Firewall to control filtering for the following: %2 -1,Process Creation. -2,File Creation Timestamp Changed. (Possible Timestomping) -3,Network Connection. -4,Sysmon Service State Changed. -5,Process Terminated. -6,Driver Loaded. -7,Image Loaded. -8,Remote Thread Created. (Possible Code Injection) -9,Raw Access Read. -10,Process Access. -11,File Creation or Overwrite. -12,Registry Object Created/Deletion. -13,Registry Value Set. -14,Registry Key or Value Rename. -15,Alternate Data Stream Created. -16,Sysmon Service Configuration Changed. -17,Named Pipe Created. -18,Named Pipe Connection. -19,WmiEventFilter Activity. -20,WmiEventConsumer Activity. -21,WmiEventConsumerToFilter Activity. -22,DNS Query. -23,Deleted File Archived. -24,Clipboard Changed. -25,Process Tampering. 
(Possible Process Hollowing or Herpaderping) -26,File Deleted. -27,KDC Encryption Type Configuration -31,Windows Update Failed -34,Windows Update Failed -35,Windows Update Failed -43,New Device Information -81,Processing client request for operation CreateShell -82,Entering the plugin for operation CreateShell with a ResourceURI -104,Event Log was Cleared -106,A task has been scheduled -134,Sending response for operation CreateShell -169,Creating WSMan Session (on Server) -255,Sysmon Error. -400,New Mass Storage Installation -410,New Mass Storage Installation -800,Summary of Software Activities -903,New Application Installation -904,New Application Installation -905,Updated Application -906,Updated Application -907,Removed Application -908,Removed Application -1001,BSOD -1005,Scan Failed -1006,Detected Malware -1008,Action on Malware Failed -1009,Hotpatching Failed -1010,Failed to remove item from quarantine -1022,New MSI File Installed -1033,New MSI File Installed -1100,The event logging service has shut down -1101,Audit events have been dropped by the transport. -1102,The audit log was cleared -1104,The security Log is now full -1105,Event log automatic backup -1108,The event logging service encountered an error -1125,Group Policy: Internal Error -1127,Group Policy: Generic Internal Error -1129,Group Policy: Group Policy Application Failed due to Connectivity -1149,User authentication succeeded -2001,Failed to update signatures -2003,Failed to update engine -2004,Firewall Rule Add -2004,Reverting to last known good set of signatures -2005,Firewall Rule Change -2006,Firewall Rule Deleted -2009,Firewall Failed to load Group Policy -2033,Firewall Rule Deleted -3001,Code Integrity Check Warning -3002,Code Integrity Check Warning -3002,Real-Time Protection failed -3003,Code Integrity Check Warning -3004,Code Integrity Check Warning -3010,Code Integrity Check Warning -3023,Code Integrity Check Warning -4103,Module logging. Executing Pipeline. 
-4104,Script Block Logging. -4105,CommandStart - Started -4106,CommandStart - Stoppeed -4608,Windows is starting up -4609,Windows is shutting down -4610,An authentication package has been loaded by the Local Security Authority -4611,A trusted logon process has been registered with the Local Security Authority -4612,"Internal resources allocated for the queuing of audit messages have been exhausted, leading to the loss of some audits." -4614,A notification package has been loaded by the Security Account Manager. -4615,Invalid use of LPC port -4616,The system time was changed. -4618,A monitored security event pattern has occurred -4621,Administrator recovered system from CrashOnAuditFail -4622,A security package has been loaded by the Local Security Authority. -4624,Logon Success -4625,Logon Failure -4627,Group Membership Information -4634,Account Logoff -4646,IKE DoS-prevention mode started -4647,User initiated logoff -4648,Explicit Logon -4649,A replay attack was detected -4650,An IPsec Main Mode security association was established -4651,An IPsec Main Mode security association was established -4652,An IPsec Main Mode negotiation failed -4653,An IPsec Main Mode negotiation failed -4654,An IPsec Quick Mode negotiation failed -4655,An IPsec Main Mode security association ended -4656,A handle to an object was requested -4657,A registry value was modified -4658,The handle to an object was closed -4659,A handle to an object was requested with intent to delete -4660,An object was deleted -4661,A handle to an object was requested -4662,An operation was performed on an object -4663,An attempt was made to access an object -4664,An attempt was made to create a hard link -4665,An attempt was made to create an application client context. 
-4666,An application attempted an operation -4667,An application client context was deleted -4668,An application was initialized -4670,Permissions on an object were changed -4671,An application attempted to access a blocked ordinal through the TBS -4672,Admin Logon -4673,A privileged service was called -4674,An operation was attempted on a privileged object -4675,SIDs were filtered -4685,The state of a transaction has changed -4688,Process Creation. -4689,A process has exited -4690,An attempt was made to duplicate a handle to an object -4691,Indirect access to an object was requested -4692,Backup of data protection master key was attempted -4693,Recovery of data protection master key was attempted -4694,Protection of auditable protected data was attempted -4695,Unprotection of auditable protected data was attempted -4696,A primary token was assigned to process -4697,A service was installed in the system -4698,A scheduled task was created -4699,A scheduled task was deleted -4700,A scheduled task was enabled -4701,A scheduled task was disabled -4702,A scheduled task was updated -4704,A user right was assigned -4705,A user right was removed -4706,A new trust was created to a domain -4707,A trust to a domain was removed -4709,IPsec Services was started -4710,IPsec Services was disabled -4711,PAStore Engine -4712,IPsec Services encountered a potentially serious failure -4713,Kerberos policy was changed -4714,Encrypted data recovery policy was changed -4715,The audit policy (SACL) on an object was changed -4716,Trusted domain information was modified -4717,System security access was granted to an account -4718,System security access was removed from an account -4719,System audit policy was changed -4720,A user account was created -4722,A user account was enabled -4723,An attempt was made to change an account's password -4724,An attempt was made to reset an accounts password -4725,A user account was disabled -4726,A user account was deleted -4727,A security-enabled global 
group was created -4728,A member was added to a security-enabled global group -4729,A member was removed from a security-enabled global group -4730,A security-enabled global group was deleted -4731,A security-enabled local group was created -4732,A member was added to a security-enabled local group -4733,A member was removed from a security-enabled local group -4734,A security-enabled local group was deleted -4735,A security-enabled local group was changed -4737,A security-enabled global group was changed -4738,A user account was changed -4739,Domain Policy was changed -4740,A user account was locked out -4741,A computer account was created -4742,A computer account was changed -4743,A computer account was deleted -4744,A security-disabled local group was created -4745,A security-disabled local group was changed -4746,A member was added to a security-disabled local group -4747,A member was removed from a security-disabled local group -4748,A security-disabled local group was deleted -4749,A security-disabled global group was created -4750,A security-disabled global group was changed -4751,A member was added to a security-disabled global group -4752,A member was removed from a security-disabled global group -4753,A security-disabled global group was deleted -4754,A security-enabled universal group was created -4755,A security-enabled universal group was changed -4756,A member was added to a security-enabled universal group -4757,A member was removed from a security-enabled universal group -4758,A security-enabled universal group was deleted -4759,A security-disabled universal group was created -4760,A security-disabled universal group was changed -4761,A member was added to a security-disabled universal group -4762,A member was removed from a security-disabled universal group -4763,A security-disabled universal group was deleted -4764,A groups type was changed -4765,SID History was added to an account -4766,An attempt to add SID History to an account failed -4767,A 
user account was unlocked -4768,A Kerberos authentication ticket (TGT) was requested -4769,A Kerberos service ticket was requested -4770,A Kerberos service ticket was renewed -4771,Kerberos pre-authentication failed -4772,A Kerberos authentication ticket request failed -4773,A Kerberos service ticket request failed -4774,An account was mapped for logon -4775,An account could not be mapped for logon -4776,The domain controller attempted to validate the credentials for an account -4777,The domain controller failed to validate the credentials for an account -4778,A session was reconnected to a Window Station -4779,A session was disconnected from a Window Station -4780,The ACL was set on accounts which are members of administrators groups -4781,The name of an account was changed -4782,The password hash an account was accessed -4783,A basic application group was created -4784,A basic application group was changed -4785,A member was added to a basic application group -4786,A member was removed from a basic application group -4787,A non-member was added to a basic application group -4788,A non-member was removed from a basic application group.. -4789,A basic application group was deleted -4790,An LDAP query group was created -4791,A basic application group was changed -4792,An LDAP query group was deleted -4793,The Password Policy Checking API was called -4794,An attempt was made to set the Directory Services Restore Mode administrator password -4800,The workstation was locked -4801,The workstation was unlocked -4802,The screen saver was invoked -4803,The screen saver was dismissed -4816,RPC detected an integrity violation while decrypting an incoming message -4817,Auditing settings on object were changed. 
-4864,A namespace collision was detected -4865,A trusted forest information entry was added -4866,A trusted forest information entry was removed -4867,A trusted forest information entry was modified -4868,The certificate manager denied a pending certificate request -4869,Certificate Services received a resubmitted certificate request -4870,Certificate Services revoked a certificate -4871,Certificate Services received a request to publish the certificate revocation list (CRL) -4872,Certificate Services published the certificate revocation list (CRL) -4873,A certificate request extension changed -4874,One or more certificate request attributes changed. -4875,Certificate Services received a request to shut down -4876,Certificate Services backup started -4877,Certificate Services backup completed -4878,Certificate Services restore started -4879,Certificate Services restore completed -4880,Certificate Services started -4881,Certificate Services stopped -4882,The security permissions for Certificate Services changed -4883,Certificate Services retrieved an archived key -4884,Certificate Services imported a certificate into its database -4885,The audit filter for Certificate Services changed -4886,Certificate Services received a certificate request -4887,Certificate Services approved a certificate request and issued a certificate -4888,Certificate Services denied a certificate request -4889,Certificate Services set the status of a certificate request to pending -4890,The certificate manager settings for Certificate Services changed. 
-4891,A configuration entry changed in Certificate Services -4892,A property of Certificate Services changed -4893,Certificate Services archived a key -4894,Certificate Services imported and archived a key -4895,Certificate Services published the CA certificate to Active Directory Domain Services -4896,One or more rows have been deleted from the certificate database -4897,Role separation enabled -4898,Certificate Services loaded a template -4899,A Certificate Services template was updated -4900,Certificate Services template security was updated -4902,The Per-user audit policy table was created -4904,An attempt was made to register a security event source -4905,An attempt was made to unregister a security event source -4906,The CrashOnAuditFail value has changed -4907,Auditing settings on object were changed -4908,Special Groups Logon table modified -4909,The local policy settings for the TBS were changed -4910,The group policy settings for the TBS were changed -4912,Per User Audit Policy was changed -4928,An Active Directory replica source naming context was established -4929,An Active Directory replica source naming context was removed -4930,An Active Directory replica source naming context was modified -4931,An Active Directory replica destination naming context was modified -4932,Synchronization of a replica of an Active Directory naming context has begun -4933,Synchronization of a replica of an Active Directory naming context has ended -4934,Attributes of an Active Directory object were replicated -4935,Replication failure begins -4936,Replication failure ends -4937,A lingering object was removed from a replica -4944,The following policy was active when the Windows Firewall started -4945,A rule was listed when the Windows Firewall started -4946,A change has been made to Windows Firewall exception list. A rule was added -4947,A change has been made to Windows Firewall exception list. 
A rule was modified -4948,A change has been made to Windows Firewall exception list. A rule was deleted -4949,Windows Firewall settings were restored to the default values -4950,A Windows Firewall setting has changed -4951,A rule has been ignored because its major version number was not recognized by Windows Firewall -4952,Parts of a rule have been ignored because its minor version number was not recognized by Windows Firewall -4953,A rule has been ignored by Windows Firewall because it could not parse the rule -4954,Windows Firewall Group Policy settings has changed. The new settings have been applied -4956,Windows Firewall has changed the active profile -4957,Windows Firewall did not apply the following rule -4958,Windows Firewall did not apply the following rule because the rule referred to items not configured on this computer -4960,IPsec dropped an inbound packet that failed an integrity check -4961,IPsec dropped an inbound packet that failed a replay check -4962,IPsec dropped an inbound packet that failed a replay check -4963,IPsec dropped an inbound clear text packet that should have been secured -4964,Special groups have been assigned to a new logon -4965,IPsec received a packet from a remote computer with an incorrect Security Parameter Index (SPI). -4976,"During Main Mode negotiation, IPsec received an invalid negotiation packet." -4977,"During Quick Mode negotiation, IPsec received an invalid negotiation packet." -4978,"During Extended Mode negotiation, IPsec received an invalid negotiation packet." 
-4979,IPsec Main Mode and Extended Mode security associations were established -4980,IPsec Main Mode and Extended Mode security associations were established -4981,IPsec Main Mode and Extended Mode security associations were established -4982,IPsec Main Mode and Extended Mode security associations were established -4983,An IPsec Extended Mode negotiation failed -4984,An IPsec Extended Mode negotiation failed -4985,The state of a transaction has changed -5008,Unexpected Error -5024,The Windows Firewall Service has started successfully -5025,The Windows Firewall Service has been stopped -5027,The Windows Firewall Service was unable to retrieve the security policy from the local storage -5028,The Windows Firewall Service was unable to parse the new security policy. -5029,The Windows Firewall Service failed to initialize the driver -5030,The Windows Firewall Service failed to start -5031,The Windows Firewall Service blocked an application from accepting incoming connections on the network. -5032,Windows Firewall was unable to notify the user that it blocked an application from accepting incoming connections on the network -5033,The Windows Firewall Driver has started successfully -5034,The Windows Firewall Driver has been stopped -5035,The Windows Firewall Driver failed to start -5037,The Windows Firewall Driver detected critical runtime error. Terminating -5038,Code integrity determined that the image hash of a file is not valid -5039,A registry key was virtualized. -5040,A change has been made to IPsec settings. An Authentication Set was added. -5041,A change has been made to IPsec settings. An Authentication Set was modified -5042,A change has been made to IPsec settings. An Authentication Set was deleted -5043,A change has been made to IPsec settings. A Connection Security Rule was added -5044,A change has been made to IPsec settings. A Connection Security Rule was modified -5045,A change has been made to IPsec settings. 
A Connection Security Rule was deleted -5046,A change has been made to IPsec settings. A Crypto Set was added -5047,A change has been made to IPsec settings. A Crypto Set was modified -5048,A change has been made to IPsec settings. A Crypto Set was deleted -5049,An IPsec Security Association was deleted -5050,An attempt to programmatically disable the Windows Firewall using a call to INetFwProfile -5051,A file was virtualized -5056,A cryptographic self test was performed -5057,A cryptographic primitive operation failed -5058,Key file operation -5059,Key migration operation -5060,Verification operation failed -5061,Cryptographic operation -5062,A kernel-mode cryptographic self test was performed -5063,A cryptographic provider operation was attempted -5064,A cryptographic context operation was attempted -5065,A cryptographic context modification was attempted -5066,A cryptographic function operation was attempted -5067,A cryptographic function modification was attempted -5068,A cryptographic function provider operation was attempted -5069,A cryptographic function property operation was attempted -5070,A cryptographic function property operation was attempted -5120,OCSP Responder Service Started -5121,OCSP Responder Service Stopped -5122,A Configuration entry changed in the OCSP Responder Service -5123,A configuration entry changed in the OCSP Responder Service -5124,A security setting was updated on OCSP Responder Service -5125,A request was submitted to OCSP Responder Service -5126,Signing Certificate was automatically updated by the OCSP Responder Service -5127,The OCSP Revocation Provider successfully updated the revocation information -5136,A directory service object was modified -5137,A directory service object was created -5138,A directory service object was undeleted -5139,A directory service object was moved -5140,A network share object was accessed -5141,A directory service object was deleted -5142,A network share object was added. 
-5143,A network share object was modified -5144,A network share object was deleted. -5145,A network share object was checked to see whether client can be granted desired access -5148,The Windows Filtering Platform has detected a DoS attack and entered a defensive mode; packets associated with this attack will be discarded. -5149,The DoS attack has subsided and normal processing is being resumed. -5150,The Windows Filtering Platform has blocked a packet. -5151,A more restrictive Windows Filtering Platform filter has blocked a packet. -5152,The Windows Filtering Platform blocked a packet -5153,A more restrictive Windows Filtering Platform filter has blocked a packet -5154,The Windows Filtering Platform has permitted an application or service to listen on a port for incoming connections -5155,The Windows Filtering Platform has blocked an application or service from listening on a port for incoming connections -5156,The Windows Filtering Platform has allowed a connection -5157,The Windows Filtering Platform has blocked a connection -5158,The Windows Filtering Platform has permitted a bind to a local port -5159,The Windows Filtering Platform has blocked a bind to a local port -5168,Spn check for SMB/SMB2 fails. 
-5376,Credential Manager credentials were backed up -5377,Credential Manager credentials were restored from a backup -5378,The requested credentials delegation was disallowed by policy -5440,The following callout was present when the Windows Filtering Platform Base Filtering Engine started -5441,The following filter was present when the Windows Filtering Platform Base Filtering Engine started -5442,The following provider was present when the Windows Filtering Platform Base Filtering Engine started -5443,The following provider context was present when the Windows Filtering Platform Base Filtering Engine started -5444,The following sub-layer was present when the Windows Filtering Platform Base Filtering Engine started -5446,A Windows Filtering Platform callout has been changed -5447,A Windows Filtering Platform filter has been changed -5448,A Windows Filtering Platform provider has been changed -5449,A Windows Filtering Platform provider context has been changed -5450,A Windows Filtering Platform sub-layer has been changed -5451,An IPsec Quick Mode security association was established -5452,An IPsec Quick Mode security association ended -5453,An IPsec negotiation with a remote computer failed because the IKE and AuthIP IPsec Keying Modules (IKEEXT) service is not started -5456,PAStore Engine applied Active Directory storage IPsec policy on the computer -5457,PAStore Engine failed to apply Active Directory storage IPsec policy on the computer -5458,PAStore Engine applied locally cached copy of Active Directory storage IPsec policy on the computer -5459,PAStore Engine failed to apply locally cached copy of Active Directory storage IPsec policy on the computer -5460,PAStore Engine applied local registry storage IPsec policy on the computer -5461,PAStore Engine failed to apply local registry storage IPsec policy on the computer -5462,PAStore Engine failed to apply some rules of the active IPsec policy on the computer -5463,PAStore Engine polled for changes to the active 
IPsec policy and detected no changes -5464,"PAStore Engine polled for changes to the active IPsec policy, detected changes, and applied them to IPsec Services" -5465,PAStore Engine received a control for forced reloading of IPsec policy and processed the control successfully -5466,"PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory cannot be reached, and will use the cached copy of the Active Directory IPsec policy instead" -5467,"PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory can be reached, and found no changes to the policy" -5468,"PAStore Engine polled for changes to the Active Directory IPsec policy, determined that Active Directory can be reached, found changes to the policy, and applied those changes" -5471,PAStore Engine loaded local storage IPsec policy on the computer -5472,PAStore Engine failed to load local storage IPsec policy on the computer -5473,PAStore Engine loaded directory storage IPsec policy on the computer -5474,PAStore Engine failed to load directory storage IPsec policy on the computer -5477,PAStore Engine failed to add quick mode filter -5478,IPsec Services has started successfully -5479,IPsec Services has been shut down successfully -5480,IPsec Services failed to get the complete list of network interfaces on the computer -5483,IPsec Services failed to initialize RPC server. 
IPsec Services could not be started -5484,IPsec Services has experienced a critical failure and has been shut down -5485,IPsec Services failed to process some IPsec filters on a plug-and-play event for network interfaces -6144,Security policy in the group policy objects has been applied successfully -6145,One or more errors occured while processing security policy in the group policy objects -6272,Network Policy Server granted access to a user -6273,Network Policy Server denied access to a user -6274,Network Policy Server discarded the request for a user -6275,Network Policy Server discarded the accounting request for a user -6276,Network Policy Server quarantined a user -6277,Network Policy Server granted access to a user but put it on probation because the host did not meet the defined health policy -6278,Network Policy Server granted full access to a user because the host met the defined health policy -6279,Network Policy Server locked the user account due to repeated failed authentication attempts -6280,Network Policy Server unlocked the user account -6281,Code Integrity determined that the page hashes of an image file are not valid... -6400,BranchCache: Received an incorrectly formatted response while discovering availability of content. -6401,BranchCache: Received invalid data from a peer. Data discarded. -6402,BranchCache: The message to the hosted cache offering it data is incorrectly formatted. -6403,BranchCache: The hosted cache sent an incorrectly formatted response to the client. -6404,BranchCache: Hosted cache could not be authenticated using the provisioned SSL certificate. -6405,BranchCache: %2 instance(s) of event id %1 occurred. -6407,1% (no more info in MSDN) -6408,Registered product %1 failed and Windows Firewall is now controlling the filtering for %2 -6410,Code integrity determined that a file does not meet the security requirements to load into a process. 
-7022,Windows Service Fail or Crash -7023,The %1 service terminated with the following error: %2 -7023,Windows Service Fail or Crash -7024,Windows Service Fail or Crash -7026,Windows Service Fail or Crash -7030,"The service is marked as an interactive service. However, the system is configured to not allow interactive services. This service may not function properly." -7031,Windows Service Fail or Crash -7032,Windows Service Fail or Crash -7034,Windows Service Fail or Crash -7035,The %1 service was successfully sent a %2 control. -7036,The service entered the running/stopped state -7040,The start type of the %1 service was changed from %2 to %3. -7045,New Windows Service -8000,Starting a Wireless Connection -8001,Successfully connected to Wireless connection -8002,Wireless Connection Failed -8003,AppLocker Block Error -8003,Disconnected from Wireless connection -8004,AppLocker Block Warning -8005,AppLocker permitted the execution of a PowerShell script -8006,AppLocker Warning Error -8007,AppLocker Warning -8011,Starting a Wireless Connection -10000,Network Connection and Disconnection Status (Wired and Wireless) -10001,Network Connection and Disconnection Status (Wired and Wireless) -11000,Wireless Association Status -11001,Wireless Association Status -11002,Wireless Association Status -11004,"Wireless Security Started, Stopped, Successful, or Failed" -11005,"Wireless Security Started, Stopped, Successful, or Failed" -11006,"Wireless Security Started, Stopped, Successful, or Failed" -11010,"Wireless Security Started, Stopped, Successful, or Failed" -12011,Wireless Authentication Started and Failed -12012,Wireless Authentication Started and Failed -12013,Wireless Authentication Started and Failed -unregistered_event_id,Unknown diff --git a/config/target_eventids.txt b/config/target_eventids.txt deleted file mode 100644 index e69de29b..00000000 diff --git a/config/target_eventids_sample.txt b/config/target_eventids_sample.txt deleted file mode 100644 index 
f703021e..00000000 --- a/config/target_eventids_sample.txt +++ /dev/null @@ -1,154 +0,0 @@ -1 -10 -1000 -1001 -1006 -1013 -1015 -1031 -1032 -1033 -1034 -104 -106 -11 -1102 -1116 -1116 -1117 -1121 -12 -13 -14 -15 -150 -16 -17 -18 -19 -20 -2003 -21 -2100 -2102 -213 -217 -22 -23 -24 -255 -257 -26 -3 -30 -300 -301 -302 -316 -31017 -354 -4 -400 -400 -403 -40300 -40301 -40302 -4100 -4103 -4104 -4611 -4616 -4624 -4625 -4634 -4647 -4648 -4656 -4657 -4658 -4660 -4661 -4662 -4663 -4672 -4673 -4674 -4688 -4689 -4692 -4697 -4698 -4699 -4701 -4703 -4704 -4706 -4719 -4720 -4728 -4732 -4738 -4742 -4765 -4766 -4768 -4769 -4771 -4776 -4781 -4794 -4799 -4825 -4898 -4899 -4904 -4905 -4909 -5 -50 -5001 -5007 -5010 -5012 -5013 -5038 -5101 -5136 -5140 -5142 -5145 -5156 -517 -524 -528 -529 -55 -56 -5829 -5859 -5861 -59 -6 -600 -6281 -6416 -675 -7 -70 -7036 -7040 -7045 -770 -8 -800 -8001 -8002 -8004 -8007 -808 -823 -848 -849 -9 -98 diff --git a/contributors.txt b/contributors.txt index 53b594ea..dd3e8a57 100644 --- a/contributors.txt +++ b/contributors.txt @@ -1,6 +1,7 @@ Hayabusa was possible thanks to the following people (in alphabetical order): Akira Nishikawa (@nishikawaakira): Previous lead developer, core hayabusa rule support, etc... +Fukusuke Takahashi (fukuseket): Static compiling for Windows, race condition and other bug fixes. Garigariganzy (@garigariganzy31): Developer, event ID statistics implementation, etc... ItiB (@itiB_S144) : Core developer, sigmac hayabusa backend, rule creation, etc... 
James Takai / hachiyone(@hach1yon): Current lead developer, tokio multi-threading, sigma aggregation logic, sigmac backend, rule creation, sigma count implementation etc… diff --git a/doc/TimesketchImport/01-TimesketchLogin.png b/doc/TimesketchImport/01-TimesketchLogin.png new file mode 100644 index 00000000..1c86ca6d Binary files /dev/null and b/doc/TimesketchImport/01-TimesketchLogin.png differ diff --git a/doc/TimesketchImport/02-NewInvestigation.png b/doc/TimesketchImport/02-NewInvestigation.png new file mode 100644 index 00000000..2e8d9eb3 Binary files /dev/null and b/doc/TimesketchImport/02-NewInvestigation.png differ diff --git a/doc/TimesketchImport/03-TimesketchTimeline.png b/doc/TimesketchImport/03-TimesketchTimeline.png new file mode 100644 index 00000000..d20cda5d Binary files /dev/null and b/doc/TimesketchImport/03-TimesketchTimeline.png differ diff --git a/doc/TimesketchImport/04-TimelineWithColumns.png b/doc/TimesketchImport/04-TimelineWithColumns.png new file mode 100644 index 00000000..5e740554 Binary files /dev/null and b/doc/TimesketchImport/04-TimelineWithColumns.png differ diff --git a/doc/TimesketchImport/05-FieldInformation.png b/doc/TimesketchImport/05-FieldInformation.png new file mode 100644 index 00000000..56ca50f4 Binary files /dev/null and b/doc/TimesketchImport/05-FieldInformation.png differ diff --git a/doc/TimesketchImport/06-MarkingEvents.png b/doc/TimesketchImport/06-MarkingEvents.png new file mode 100644 index 00000000..3dd4656a Binary files /dev/null and b/doc/TimesketchImport/06-MarkingEvents.png differ diff --git a/doc/TimesketchImport/TimesketchImport-English.md b/doc/TimesketchImport/TimesketchImport-English.md new file mode 100644 index 00000000..d2805231 --- /dev/null +++ b/doc/TimesketchImport/TimesketchImport-English.md @@ -0,0 +1,80 @@ +# Importing Hayabusa Results Into Timesketch + +## About + +"[Timesketch](https://timesketch.org/) is an open-source tool for collaborative forensic timeline analysis. 
Using sketches you and your collaborators can easily organize your timelines and analyze them all at the same time. Add meaning to your raw data with rich annotations, comments, tags and stars." + + +## Installing + +We recommend using the Ubuntu 22.04 LTS Server edition. +You can download it [here](https://ubuntu.com/download/server). +Choose the minimal install when setting it up. +You won't have `ifconfig` available, so install it with `sudo apt install net-tools`. + +After that, follow the install instructions [here](https://timesketch.org/guides/admin/install/): + +``` bash +sudo apt install docker-compose +curl -s -O https://raw.githubusercontent.com/google/timesketch/master/contrib/deploy_timesketch.sh +chmod 755 deploy_timesketch.sh +cd /opt +sudo ~/deploy_timesketch.sh +cd timesketch +sudo docker-compose up -d +sudo docker-compose exec timesketch-web tsctl create-user +``` + +## Prepared VM + +We have pre-built a demo VM that you can use against the 2022 DEF CON 30 [OpenSOC](https://opensoc.io/) DFIR Challenge evidence hosted by [Recon InfoSec](https://www.reconinfosec.com/). (The evidence has already been imported.) +You can download it [here](https://www.dropbox.com/s/3be3s5c2r22ux2z/Prebuilt-Timesketch.ova?dl=0). +You can find the other evidence for this challenge [here](https://docs.google.com/document/d/1XM4Gfdojt8fCn_9B8JKk9bcUTXZc0_hzWRUH4mEr7dw/mobilebasic) and the questions [here](https://docs.google.com/spreadsheets/d/1vKn8BgABuJsqH5WhhS9ebIGTBG4aoP-StINRi18abo4/htmlview). + +The username for the VM is `user` and the password is `password`. + +## Logging in + +Find out the IP address with `ifconfig` and open it with a web browser. +You will be redirected to a login page as shown below: + +![Timesketch Login](01-TimesketchLogin.png) + +Log in with the user credentials you created earlier with the `tsctl create-user` command.
+
+## Create a new sketch
+
+Click on `New investigation` and create a name for the new sketch:
+
+![New Investigation](02-NewInvestigation.png)
+
+## Upload timeline
+
+Click `Upload timeline` and upload a CSV file that you created with the following command:
+
+`hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U`
+
+You can add `-m low` if you only want alerts and do not want to include informational Windows events.
+
+## Analyzing results
+
+You should get the following screen:
+
+![Timesketch timeline](03-TimesketchTimeline.png)
+
+By default, only the UTC timestamp and alert rule title will be displayed, so click `Customize columns` to add more fields.
+
+> Warning: In the current version, there is a bug where a newly added column will initially appear blank. Add another column (and then delete it afterwards if not needed) to make the new columns display.
+
+You can also filter on fields in the search box, for example `Level: crit` to show only critical alerts.
+
+
+![Timeline with columns](04-TimelineWithColumns.png)
+
+If you click on an event, you can see all of the field information:
+
+![Field Information](05-FieldInformation.png)
+
+With the three icons to the left of the alert title, you can star events of interest,
+search +/- 5 minutes to see the context of an event, and add labels.
+
+![Marking Events](06-MarkingEvents.png) \ No newline at end of file diff --git a/doc/TimesketchImport/TimesketchImport-Japanese.md b/doc/TimesketchImport/TimesketchImport-Japanese.md new file mode 100644 index 00000000..1247a631 --- /dev/null +++ b/doc/TimesketchImport/TimesketchImport-Japanese.md @@ -0,0 +1,80 @@ +# TimesketchにHayabusa結果をインポートする方法
+
+## Timesketchについて
+
+"[Timesketch](https://timesketch.org/)は、フォレンジックタイムラインの共同解析のためのオープンソースツールです。スケッチを使うことで、あなたとあなたの共同作業者は、簡単にタイムラインを整理し、同時に分析することができます。リッチなアノテーション、コメント、タグ、スターで生データに意味を持たせることができます。"
+
+
+## インストール
+
+Ubuntu 22.04 LTS Serverエディションの使用を推奨します。
+[こちら](https://ubuntu.com/download/server)からダウンロードできます。
+セットアップ時にミニマルインストールを選択してください。
+`ifconfig`はインストールされていないので、`sudo apt install net-tools`でインストールしてください。
+
+その後、[こちら](https://timesketch.org/guides/admin/install/)のインストール手順に従ってください:
+
+``` bash
+sudo apt install docker-compose
+curl -s -O https://raw.githubusercontent.com/google/timesketch/master/contrib/deploy_timesketch.sh
+chmod 755 deploy_timesketch.sh
+cd /opt
+sudo ~/deploy_timesketch.sh
+cd timesketch
+sudo docker-compose up -d
+sudo docker-compose exec timesketch-web tsctl create-user
+```
+
+## 準備されたVM
+
+[Recon InfoSec](https://www.reconinfosec.com/)主催の2022年のDEF CON 30 [OpenSOC](https://opensoc.io/) DFIR Challengeのエビデンスに対して使用できるデモ用VMを事前に構築しています。 (エビデンスは既にインポート済み。)
+[こちら](https://www.dropbox.com/s/3be3s5c2r22ux2z/Prebuilt-Timesketch.ova?dl=0)からダウンロードできます。
+このチャレンジの他のエビデンスは[こちら](https://docs.google.com/document/d/1XM4Gfdojt8fCn_9B8JKk9bcUTXZc0_hzWRUH4mEr7dw/mobilebasic)からダウンロードできます。
+問題は[こちら](https://docs.google.com/spreadsheets/d/1vKn8BgABuJsqH5WhhS9ebIGTBG4aoP-StINRi18abo4/htmlview)からダウンロードできます。
+
+VMのユーザ名は`user`。パスワードは`password`。
+
+## ログイン
+
+`ifconfig`でIPアドレスを調べ、Webブラウザで開いてください。
+以下のようなログインページにリダイレクトされます:
+
+![Timesketch Login](01-TimesketchLogin.png)
+
+`tsctl create-user`コマンドで作成したユーザの認証情報でログインしてください。
+
+## 新しいsketch作成
+
+`New investigation`をクリックし、新しいスケッチに名前を付けます。
+
+![New Investigation](02-NewInvestigation.png)
+ +## タイムラインのアップロード + +`Upload timeline`をクリックし、以下のコマンドで作成したCSVファイルをアップロードします: + +`hayabusa-1.5.1-win-x64.exe -d ../hayabusa-sample-evtx --RFC-3339 -o timesketch-import.csv -P timesketch -U` + +Windowsのイベントを含めず、アラートだけでよい場合は、`-m low`を追加することができます。 + +## 結果の解析 + +以下のような画面が表示されるはずです: + +![Timesketch timeline](03-TimesketchTimeline.png) + +デフォルトでは、UTCタイムスタンプとアラートルールのタイトル名のみが表示されますので、`Customize columns`をクリックし、他のフィールドを追加してください。 + +> 注意: 現在のバージョンでは、新しいカラムが空白になってしまうというバグがあります。新しいカラムを表示するには、別のカラムをまず追加してください(必要なければ後で削除してください。) + +以下のように検索ボックスで`Level: crit`等を入力することで、クリティカルなアラートのみを表示させるようにフィルタリングできます。 + +![Timeline with columns](04-TimelineWithColumns.png) + +イベントをクリックすると、すべてのフィールド情報を見ることができます: + +![Field Information](05-FieldInformation.png) + +アラートタイトルの左側にある3つのアイコンを使って、興味のあるイベントにスターをつけたり、イベントの文脈を見るために+-5分検索したり、ラベルを追加したりすることが可能です。 + +![Marking Events](06-MarkingEvents.png) \ No newline at end of file diff --git a/hayabusa-logo.png b/logo.png similarity index 100% rename from hayabusa-logo.png rename to logo.png diff --git a/rules b/rules index 8c14d12b..85631637 160000 --- a/rules +++ b/rules @@ -1 +1 @@ -Subproject commit 8c14d12be3f2d08721eee6db7238058fdaca3ce6 +Subproject commit 856316374ca52ce01123c2078c7af294d29df546 diff --git a/screenshots/Hayabusa-Results.png b/screenshots/Hayabusa-Results.png index 026b686d..61c23587 100644 Binary files a/screenshots/Hayabusa-Results.png and b/screenshots/Hayabusa-Results.png differ diff --git a/screenshots/Hayabusa-Startup.png b/screenshots/Hayabusa-Startup.png index 72b5284b..ec3849cc 100644 Binary files a/screenshots/Hayabusa-Startup.png and b/screenshots/Hayabusa-Startup.png differ diff --git a/screenshots/HayabusaResultsSummary.png b/screenshots/HayabusaResultsSummary.png index 1efd8ec9..b9deea82 100644 Binary files a/screenshots/HayabusaResultsSummary.png and b/screenshots/HayabusaResultsSummary.png differ diff --git a/screenshots/TimesketchAnalysis.png b/screenshots/TimesketchAnalysis.png new file mode 100644 index 
00000000..e6e99eda Binary files /dev/null and b/screenshots/TimesketchAnalysis.png differ diff --git a/src/afterfact.rs b/src/afterfact.rs index a6edaa50..cfc64c1c 100644 --- a/src/afterfact.rs +++ b/src/afterfact.rs @@ -1,69 +1,53 @@ use crate::detections::configs; -use crate::detections::configs::TERM_SIZE; -use crate::detections::print; -use crate::detections::print::{AlertMessage, IS_HIDE_RECORD_ID}; -use crate::detections::utils; -use crate::detections::utils::write_color_buffer; +use crate::detections::configs::{CURRENT_EXE_PATH, TERM_SIZE}; +use crate::detections::message::{self, LEVEL_ABBR}; +use crate::detections::message::{AlertMessage, LEVEL_FULL}; +use crate::detections::utils::{self, format_time}; +use crate::detections::utils::{get_writable_color, write_color_buffer}; +use crate::options::profile::PROFILES; +use bytesize::ByteSize; use chrono::{DateTime, Local, TimeZone, Utc}; +use comfy_table::modifiers::UTF8_ROUND_CORNERS; +use comfy_table::presets::UTF8_FULL; use csv::QuoteStyle; -use hashbrown::HashMap; -use hashbrown::HashSet; +use itertools::Itertools; use krapslog::{build_sparkline, build_time_markers}; use lazy_static::lazy_static; -use serde::Serialize; +use linked_hash_map::LinkedHashMap; + +use comfy_table::*; +use hashbrown::{HashMap, HashSet}; +use num_format::{Locale, ToFormattedString}; use std::cmp::min; use std::error::Error; + use std::fs::File; use std::io; use std::io::BufWriter; use std::io::Write; + +use std::fs; use std::process; use termcolor::{BufferWriter, Color, ColorChoice, ColorSpec, WriteColor}; use terminal_size::Width; -#[derive(Debug, Serialize)] -#[serde(rename_all = "PascalCase")] -pub struct CsvFormat<'a> { - timestamp: &'a str, - computer: &'a str, - channel: &'a str, - event_i_d: &'a str, - level: &'a str, - mitre_attack: &'a str, - #[serde(skip_serializing_if = "Option::is_none")] - record_i_d: Option<&'a str>, - rule_title: &'a str, - details: &'a str, - #[serde(skip_serializing_if = "Option::is_none")] - 
record_information: Option<&'a str>, - rule_path: &'a str, - file_path: &'a str, -} - -#[derive(Debug, Serialize)] -#[serde(rename_all = "PascalCase")] -pub struct DisplayFormat<'a> { - timestamp: &'a str, - pub computer: &'a str, - pub channel: &'a str, - pub event_i_d: &'a str, - pub level: &'a str, - #[serde(skip_serializing_if = "Option::is_none")] - record_i_d: Option<&'a str>, - pub rule_title: &'a str, - pub details: &'a str, - #[serde(skip_serializing_if = "Option::is_none")] - pub record_information: Option<&'a str>, -} - lazy_static! { - pub static ref OUTPUT_COLOR: HashMap = set_output_color(); + pub static ref OUTPUT_COLOR: HashMap = set_output_color(); +} + +pub struct Colors { + pub output_color: termcolor::Color, + pub table_color: comfy_table::Color, } /// level_color.txtファイルを読み込み対応する文字色のマッピングを返却する関数 -pub fn set_output_color() -> HashMap { - let read_result = utils::read_csv("config/level_color.txt"); - let mut color_map: HashMap = HashMap::new(); +pub fn set_output_color() -> HashMap { + let read_result = utils::read_csv( + utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "config/level_color.txt") + .to_str() + .unwrap(), + ); + let mut color_map: HashMap = HashMap::new(); if configs::CONFIG.read().unwrap().args.no_color { return color_map; } @@ -93,16 +77,34 @@ pub fn set_output_color() -> HashMap { } color_map.insert( level.to_lowercase(), - Color::Rgb(color_code[0], color_code[1], color_code[2]), + Colors { + output_color: termcolor::Color::Rgb(color_code[0], color_code[1], color_code[2]), + table_color: comfy_table::Color::Rgb { + r: color_code[0], + g: color_code[1], + b: color_code[2], + }, + }, ); }); color_map } -fn _get_output_color(color_map: &HashMap, level: &str) -> Option { +fn _get_output_color(color_map: &HashMap, level: &str) -> Option { let mut color = None; if let Some(c) = color_map.get(&level.to_lowercase()) { - color = Some(c.to_owned()); + color = Some(c.output_color.to_owned()); + } + color +} + +fn 
_get_table_color( + color_map: &HashMap, + level: &str, +) -> Option { + let mut color = None; + if let Some(c) = color_map.get(&level.to_lowercase()) { + color = Some(c.table_color.to_owned()); } color } @@ -190,7 +192,7 @@ pub fn after_fact(all_record_cnt: usize) { fn emit_csv( writer: &mut W, displayflag: bool, - color_map: HashMap, + color_map: HashMap, all_record_cnt: u128, ) -> io::Result<()> { let disp_wtr = BufferWriter::stdout(ColorChoice::Always); @@ -199,7 +201,6 @@ fn emit_csv( disp_wtr_buf.set_color(ColorSpec::new().set_fg(None)).ok(); - let messages = print::MESSAGES.lock().unwrap(); // level is devided by "Critical","High","Medium","Low","Informational","Undefined". let mut total_detect_counts_by_level: Vec = vec![0; 6]; let mut unique_detect_counts_by_level: Vec = vec![0; 6]; @@ -209,26 +210,15 @@ fn emit_csv( HashMap::new(); let mut detect_counts_by_computer_and_level: HashMap> = HashMap::new(); + let mut detect_counts_by_rule_and_level: HashMap> = + HashMap::new(); - let levels = Vec::from([ - "critical", - "high", - "medium", - "low", - "informational", - "undefined", - ]); - let level_abbr: HashMap = HashMap::from([ - (String::from("cruitical"), String::from("crit")), - (String::from("high"), String::from("high")), - (String::from("medium"), String::from("med ")), - (String::from("low"), String::from("low ")), - (String::from("informational"), String::from("info")), - ]); + let levels = Vec::from(["crit", "high", "med ", "low ", "info", "undefined"]); // レベル別、日ごとの集計用変数の初期化 for level_init in levels { detect_counts_by_date_and_level.insert(level_init.to_string(), HashMap::new()); detect_counts_by_computer_and_level.insert(level_init.to_string(), HashMap::new()); + detect_counts_by_rule_and_level.insert(level_init.to_string(), HashMap::new()); } if displayflag { println!(); @@ -236,86 +226,57 @@ fn emit_csv( let mut timestamps: Vec = Vec::new(); let mut plus_header = true; let mut detected_record_idset: HashSet = HashSet::new(); - let detect_union 
= messages.iter(); - for (time, detect_infos) in detect_union { + for time in message::MESSAGES.clone().into_read_only().keys().sorted() { + let multi = message::MESSAGES.get(time).unwrap(); + let (_, detect_infos) = multi.pair(); timestamps.push(_get_timestamp(time)); for detect_info in detect_infos { - detected_record_idset.insert(format!("{}_{}", time, detect_info.eventid)); - let level = detect_info.level.to_string(); - let time_str = format_time(time, false); + if !detect_info.detail.starts_with("[condition]") { + detected_record_idset.insert(format!("{}_{}", time, detect_info.eventid)); + } if displayflag { - let record_id = detect_info - .record_id - .as_ref() - .map(|recinfo| _format_cellpos(recinfo, ColPos::Other)); - let recinfo = detect_info - .record_information - .as_ref() - .map(|recinfo| _format_cellpos(recinfo, ColPos::Last)); - let ctr_char_exclude_details = detect_info - .detail - .chars() - .filter(|&c| !c.is_control()) - .collect::(); - - let details = if ctr_char_exclude_details.is_empty() { - "-".to_string() - } else { - ctr_char_exclude_details - }; - - let dispformat: _ = DisplayFormat { - timestamp: &_format_cellpos(&time_str, ColPos::First), - level: &_format_cellpos( - level_abbr.get(&level).unwrap_or(&level), - ColPos::Other, - ), - computer: &_format_cellpos(&detect_info.computername, ColPos::Other), - event_i_d: &_format_cellpos(&detect_info.eventid, ColPos::Other), - channel: &_format_cellpos(&detect_info.channel, ColPos::Other), - rule_title: &_format_cellpos(&detect_info.alert, ColPos::Other), - details: &_format_cellpos(&details, ColPos::Other), - record_information: recinfo.as_deref(), - record_i_d: record_id.as_deref(), - }; - //ヘッダーのみを出力 if plus_header { - write!(disp_wtr_buf, "{}", _get_serialized_disp_output(None)).ok(); - plus_header = false; - } - disp_wtr_buf - .set_color( - ColorSpec::new().set_fg(_get_output_color(&color_map, &detect_info.level)), + write_color_buffer( + &disp_wtr, + get_writable_color(None), + 
&_get_serialized_disp_output(PROFILES.as_ref().unwrap(), true), + false, ) .ok(); - write!( - disp_wtr_buf, - "{}", - _get_serialized_disp_output(Some(dispformat)) + plus_header = false; + } + write_color_buffer( + &disp_wtr, + get_writable_color(_get_output_color( + &color_map, + LEVEL_FULL + .get(&detect_info.level) + .unwrap_or(&String::default()), + )), + &_get_serialized_disp_output(&detect_info.ext_field, false), + false, ) .ok(); } else { // csv output format - wtr.serialize(CsvFormat { - timestamp: &time_str, - level: level_abbr.get(&level).unwrap_or(&level).trim(), - computer: &detect_info.computername, - event_i_d: &detect_info.eventid, - channel: &detect_info.channel, - mitre_attack: &detect_info.tag_info, - rule_title: &detect_info.alert, - details: &detect_info.detail, - record_information: detect_info.record_information.as_deref(), - file_path: &detect_info.filepath, - rule_path: &detect_info.rulepath, - record_i_d: detect_info.record_id.as_deref(), - })?; + if plus_header { + wtr.write_record(detect_info.ext_field.keys().map(|x| x.trim()))?; + plus_header = false; + } + wtr.write_record(detect_info.ext_field.values().map(|x| x.trim()))?; } + let level_suffix = *configs::LEVELMAP - .get(&detect_info.level.to_uppercase()) + .get( + &LEVEL_FULL + .get(&detect_info.level) + .unwrap_or(&"undefined".to_string()) + .to_uppercase(), + ) .unwrap_or(&0) as usize; let time_str_date = format_time(time, true); + let mut detect_counts_by_date = detect_counts_by_date_and_level .get(&detect_info.level.to_lowercase()) .unwrap_or_else(|| detect_counts_by_date_and_level.get("undefined").unwrap()) @@ -327,6 +288,7 @@ fn emit_csv( detected_rule_files.insert(detect_info.rulepath.clone()); unique_detect_counts_by_level[level_suffix] += 1; } + let computer_rule_check_key = format!("{}|{}", &detect_info.computername, &detect_info.rulepath); if !detected_computer_and_rule_names.contains(&computer_rule_check_key) { @@ -346,66 +308,110 @@ fn emit_csv( 
.insert(detect_info.level.to_lowercase(), detect_counts_by_computer); } + let mut detect_counts_by_rules = detect_counts_by_rule_and_level + .get(&detect_info.level.to_lowercase()) + .unwrap_or_else(|| { + detect_counts_by_computer_and_level + .get("undefined") + .unwrap() + }) + .clone(); + *detect_counts_by_rules + .entry(Clone::clone(&detect_info.ruletitle)) + .or_insert(0) += 1; + detect_counts_by_rule_and_level + .insert(detect_info.level.to_lowercase(), detect_counts_by_rules); + total_detect_counts_by_level[level_suffix] += 1; detect_counts_by_date_and_level .insert(detect_info.level.to_lowercase(), detect_counts_by_date); } } if displayflag { - disp_wtr.print(&disp_wtr_buf)?; println!(); } else { wtr.flush()?; } - disp_wtr_buf.clear(); - disp_wtr_buf.set_color(ColorSpec::new().set_fg(None)).ok(); - writeln!(disp_wtr_buf, "Results Summary:").ok(); - disp_wtr.print(&disp_wtr_buf).ok(); - - let terminal_width = match *TERM_SIZE { - Some((Width(w), _)) => w as usize, - None => 100, + let output_path = &configs::CONFIG.read().unwrap().args.output; + if let Some(path) = output_path { + if let Ok(metadata) = fs::metadata(path) { + println!( + "Saved file: {} ({})", + configs::CONFIG + .read() + .unwrap() + .args + .output + .as_ref() + .unwrap() + .display(), + ByteSize::b(metadata.len()).to_string_as(false) + ); + println!(); + } }; - println!(); - if configs::CONFIG.read().unwrap().args.visualize_timeline { - _print_timeline_hist(timestamps, terminal_width, 3); + if !configs::CONFIG.read().unwrap().args.no_summary { + disp_wtr_buf.clear(); + write_color_buffer( + &disp_wtr, + get_writable_color(Some(Color::Rgb(0, 255, 0))), + "Results Summary:", + true, + ) + .ok(); + + let terminal_width = match *TERM_SIZE { + Some((Width(w), _)) => w as usize, + None => 100, + }; + println!(); + + if configs::CONFIG.read().unwrap().args.visualize_timeline { + _print_timeline_hist(timestamps, terminal_width, 3); + println!(); + } + let reducted_record_cnt: u128 = all_record_cnt 
- detected_record_idset.len() as u128; + let reducted_percent = if all_record_cnt == 0 { + 0 as f64 + } else { + (reducted_record_cnt as f64) / (all_record_cnt as f64) * 100.0 + }; + write_color_buffer( + &disp_wtr, + get_writable_color(None), + &format!( + "Detected events / Total events: {} / {} (reduced {} events ({:.2}%))", + (all_record_cnt - reducted_record_cnt).to_formatted_string(&Locale::en), + all_record_cnt.to_formatted_string(&Locale::en), + reducted_record_cnt.to_formatted_string(&Locale::en), + reducted_percent + ), + true, + ) + .ok(); + println!(); + + _print_unique_results( + total_detect_counts_by_level, + unique_detect_counts_by_level, + "Total | Unique".to_string(), + "detections".to_string(), + &color_map, + ); + println!(); + + _print_detection_summary_by_date(detect_counts_by_date_and_level, &color_map); + println!(); + println!(); + + _print_detection_summary_by_computer(detect_counts_by_computer_and_level, &color_map); + println!(); + + _print_detection_summary_tables(detect_counts_by_rule_and_level, &color_map); println!(); } - let reducted_record_cnt: u128 = all_record_cnt - detected_record_idset.len() as u128; - let reducted_percent = if all_record_cnt == 0 { - 0 as f64 - } else { - (reducted_record_cnt as f64) / (all_record_cnt as f64) * 100.0 - }; - println!("Total events: {}", all_record_cnt); - println!( - "Data reduction: {} events ({:.2}%)", - reducted_record_cnt, reducted_percent - ); - println!(); - - _print_unique_results( - total_detect_counts_by_level, - "Total".to_string(), - "detections".to_string(), - &color_map, - ); - println!(); - - _print_unique_results( - unique_detect_counts_by_level, - "Unique".to_string(), - "detections".to_string(), - &color_map, - ); - println!(); - - _print_detection_summary_by_date(detect_counts_by_date_and_level, &color_map); - println!(); - - _print_detection_summary_by_computer(detect_counts_by_computer_and_level, &color_map); Ok(()) } @@ -420,24 +426,23 @@ enum ColPos { Other, } -fn 
_get_serialized_disp_output(dispformat: Option) -> String { - if dispformat.is_none() { - let mut titles = vec![ - "Timestamp", - "Computer", - "Channel", - "EventID", - "Level", - "RuleTitle", - "Details", - ]; - if !*IS_HIDE_RECORD_ID { - titles.insert(5, "RecordID"); +fn _get_serialized_disp_output(data: &LinkedHashMap, header: bool) -> String { + let data_length = &data.len(); + let mut ret: Vec = vec![]; + if header { + for k in data.keys() { + ret.push(k.to_owned()); } - if configs::CONFIG.read().unwrap().args.full_data { - titles.push("RecordInformation"); + } else { + for (i, (_, v)) in data.iter().enumerate() { + if i == 0 { + ret.push(_format_cellpos(v, ColPos::First)) + } else if i == data_length - 1 { + ret.push(_format_cellpos(v, ColPos::Last)) + } else { + ret.push(_format_cellpos(v, ColPos::Other)) + } } - return format!("{}\n", titles.join("|")); } let mut disp_serializer = csv::WriterBuilder::new() .double_quote(false) @@ -446,8 +451,7 @@ fn _get_serialized_disp_output(dispformat: Option) -> String { .has_headers(false) .from_writer(vec![]); - disp_serializer.serialize(dispformat.unwrap()).ok(); - + disp_serializer.write_record(ret).ok(); String::from_utf8(disp_serializer.into_inner().unwrap_or_default()).unwrap_or_default() } @@ -460,50 +464,64 @@ fn _format_cellpos(colval: &str, column: ColPos) -> String { } } -/// output info which unique detection count and all detection count information(devided by level and total) to stdout. +/// output info which unique detection count and all detection count information(separated by level and total) to stdout. 
fn _print_unique_results( mut counts_by_level: Vec, + mut unique_counts_by_level: Vec, head_word: String, tail_word: String, - color_map: &HashMap, + color_map: &HashMap, ) { - let levels = Vec::from([ - "critical", - "high", - "medium", - "low", - "informational", - "undefined", - ]); - // the order in which are registered and the order of levels to be displayed are reversed counts_by_level.reverse(); + unique_counts_by_level.reverse(); + let total_count = counts_by_level.iter().sum::(); + let unique_total_count = unique_counts_by_level.iter().sum::(); // output total results write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &format!( - "{} {}: {}", + "{} {}: {} | {}", head_word, tail_word, - counts_by_level.iter().sum::() + total_count.to_formatted_string(&Locale::en), + unique_total_count.to_formatted_string(&Locale::en) ), + true, ) .ok(); - for (i, level_name) in levels.iter().enumerate() { + for (i, level_name) in LEVEL_ABBR.keys().enumerate() { if "undefined" == *level_name { continue; } + let percent = if total_count == 0 { + 0 as f64 + } else { + (counts_by_level[i] as f64) / (total_count as f64) * 100.0 + }; + let unique_percent = if unique_total_count == 0 { + 0 as f64 + } else { + (unique_counts_by_level[i] as f64) / (unique_total_count as f64) * 100.0 + }; let output_raw_str = format!( - "{} {} {}: {}", - head_word, level_name, tail_word, counts_by_level[i] + "{} {} {}: {} ({:.2}%) | {} ({:.2}%)", + head_word, + level_name, + tail_word, + counts_by_level[i].to_formatted_string(&Locale::en), + percent, + unique_counts_by_level[i].to_formatted_string(&Locale::en), + unique_percent ); write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), _get_output_color(color_map, level_name), &output_raw_str, + true, ) .ok(); } @@ -512,38 +530,47 @@ fn _print_unique_results( /// 各レベル毎で最も高い検知数を出した日付を出力する fn _print_detection_summary_by_date( 
detect_counts_by_date: HashMap>, - color_map: &HashMap, + color_map: &HashMap, ) { let buf_wtr = BufferWriter::stdout(ColorChoice::Always); let mut wtr = buf_wtr.buffer(); wtr.set_color(ColorSpec::new().set_fg(None)).ok(); - let output_levels = Vec::from(["critical", "high", "medium", "low", "informational"]); + writeln!(wtr, "Dates with most total detections:").ok(); - for level in output_levels { + for (idx, level) in LEVEL_ABBR.values().enumerate() { // output_levelsはlevelsからundefinedを除外した配列であり、各要素は必ず初期化されているのでSomeであることが保証されているのでunwrapをそのまま実施 let detections_by_day = detect_counts_by_date.get(level).unwrap(); let mut max_detect_str = String::default(); let mut tmp_cnt: u128 = 0; - let mut date_str = String::default(); + let mut exist_max_data = false; for (date, cnt) in detections_by_day { if cnt > &tmp_cnt { - date_str = date.clone(); - max_detect_str = format!("{} ({})", date, cnt); + exist_max_data = true; + max_detect_str = format!("{} ({})", date, cnt.to_formatted_string(&Locale::en)); tmp_cnt = *cnt; } } - wtr.set_color(ColorSpec::new().set_fg(_get_output_color(color_map, level))) - .ok(); - if date_str == String::default() { + wtr.set_color(ColorSpec::new().set_fg(_get_output_color( + color_map, + LEVEL_FULL.get(level.as_str()).unwrap(), + ))) + .ok(); + if !exist_max_data { max_detect_str = "n/a".to_string(); } - writeln!( + write!( wtr, - "Date with most total {} detections: {}", - level, &max_detect_str + "{}: {}", + LEVEL_FULL.get(level.as_str()).unwrap(), + &max_detect_str ) .ok(); + if idx != LEVEL_ABBR.len() - 1 { + wtr.set_color(ColorSpec::new().set_fg(None)).ok(); + + write!(wtr, ", ").ok(); + } } buf_wtr.print(&wtr).ok(); } @@ -551,15 +578,14 @@ fn _print_detection_summary_by_date( /// 各レベル毎で最も高い検知数を出した日付を出力する fn _print_detection_summary_by_computer( detect_counts_by_computer: HashMap>, - color_map: &HashMap, + color_map: &HashMap, ) { let buf_wtr = BufferWriter::stdout(ColorChoice::Always); let mut wtr = buf_wtr.buffer(); 
wtr.set_color(ColorSpec::new().set_fg(None)).ok(); - let output_levels = Vec::from(["critical", "high", "medium", "low", "informational"]); - - for level in output_levels { + writeln!(wtr, "Top 5 computers with most unique detections:").ok(); + for level in LEVEL_ABBR.values() { // output_levelsはlevelsからundefinedを除外した配列であり、各要素は必ず初期化されているのでSomeであることが保証されているのでunwrapをそのまま実施 let detections_by_computer = detect_counts_by_computer.get(level).unwrap(); let mut result_vec: Vec = Vec::new(); @@ -572,7 +598,11 @@ fn _print_detection_summary_by_computer( sorted_detections.sort_by(|a, b| (-a.1).cmp(&(-b.1))); for x in sorted_detections.iter().take(5) { - result_vec.push(format!("{} ({})", x.0, x.1)); + result_vec.push(format!( + "{} ({})", + x.0, + x.1.to_formatted_string(&Locale::en) + )); } let result_str = if result_vec.is_empty() { "n/a".to_string() @@ -580,24 +610,94 @@ fn _print_detection_summary_by_computer( result_vec.join(", ") }; - wtr.set_color(ColorSpec::new().set_fg(_get_output_color(color_map, level))) - .ok(); + wtr.set_color(ColorSpec::new().set_fg(_get_output_color( + color_map, + LEVEL_FULL.get(level.as_str()).unwrap(), + ))) + .ok(); writeln!( wtr, - "Top 5 computers with most unique {} detections: {}", - level, &result_str + "{}: {}", + LEVEL_FULL.get(level.as_str()).unwrap(), + &result_str ) .ok(); } buf_wtr.print(&wtr).ok(); } -fn format_time(time: &DateTime, date_only: bool) -> String { - if configs::CONFIG.read().unwrap().args.utc { - format_rfc(time, date_only) - } else { - format_rfc(&time.with_timezone(&Local), date_only) +/// 各レベルごとで検出数が多かったルールと日ごとの検知数を表形式で出力する関数 +fn _print_detection_summary_tables( + detect_counts_by_rule_and_level: HashMap>, + color_map: &HashMap, +) { + let buf_wtr = BufferWriter::stdout(ColorChoice::Always); + let mut wtr = buf_wtr.buffer(); + wtr.set_color(ColorSpec::new().set_fg(None)).ok(); + let mut output = vec![]; + let mut col_color = vec![]; + for level in LEVEL_ABBR.values() { + let mut col_output: Vec = vec![]; + 
col_output.push(format!( + "Top {} alerts:", + LEVEL_FULL.get(level.as_str()).unwrap() + )); + + col_color.push(_get_table_color( + color_map, + LEVEL_FULL.get(level.as_str()).unwrap(), + )); + + // output_levelsはlevelsからundefinedを除外した配列であり、各要素は必ず初期化されているのでSomeであることが保証されているのでunwrapをそのまま実施 + let detections_by_computer = detect_counts_by_rule_and_level.get(level).unwrap(); + let mut sorted_detections: Vec<(&String, &i128)> = detections_by_computer.iter().collect(); + + sorted_detections.sort_by(|a, b| (-a.1).cmp(&(-b.1))); + + for x in sorted_detections.iter().take(5) { + col_output.push(format!( + "{} ({})", + x.0, + x.1.to_formatted_string(&Locale::en) + )); + } + let na_cnt = if sorted_detections.len() > 5 { + 0 + } else { + 5 - sorted_detections.len() + }; + for _x in 0..na_cnt { + col_output.push("N/A".to_string()); + } + output.push(col_output); } + + let mut tb = Table::new(); + tb.load_preset(UTF8_FULL) + .apply_modifier(UTF8_ROUND_CORNERS) + .set_content_arrangement(ContentArrangement::Dynamic) + .set_width(500); + for x in 0..2 { + tb.add_row(vec![ + Cell::new(&output[2 * x][0]).fg(col_color[2 * x].unwrap_or(comfy_table::Color::Reset)), + Cell::new(&output[2 * x + 1][0]) + .fg(col_color[2 * x + 1].unwrap_or(comfy_table::Color::Reset)), + ]); + + tb.add_row(vec![ + Cell::new(&output[2 * x][1..].join("\n")) + .fg(col_color[2 * x].unwrap_or(comfy_table::Color::Reset)), + Cell::new(&output[2 * x + 1][1..].join("\n")) + .fg(col_color[2 * x + 1].unwrap_or(comfy_table::Color::Reset)), + ]); + } + tb.add_row(vec![ + Cell::new(&output[4][0]).fg(col_color[4].unwrap_or(comfy_table::Color::Reset)) + ]); + tb.add_row(vec![ + Cell::new(&output[4][1..].join("\n")).fg(col_color[4].unwrap_or(comfy_table::Color::Reset)) + ]); + println!("{tb}"); } /// get timestamp to input datetime. 
@@ -610,73 +710,26 @@ fn _get_timestamp(time: &DateTime) -> i64 { } } -/// return rfc time format string by option -fn format_rfc(time: &DateTime, date_only: bool) -> String -where - Tz::Offset: std::fmt::Display, -{ - let time_args = &configs::CONFIG.read().unwrap().args; - if time_args.rfc_2822 { - if date_only { - time.format("%a, %e %b %Y").to_string() - } else { - time.format("%a, %e %b %Y %H:%M:%S %:z").to_string() - } - } else if time_args.rfc_3339 { - if date_only { - time.format("%Y-%m-%d").to_string() - } else { - time.format("%Y-%m-%d %H:%M:%S%.6f%:z").to_string() - } - } else if time_args.us_time { - if date_only { - time.format("%m-%d-%Y").to_string() - } else { - time.format("%m-%d-%Y %I:%M:%S%.3f %p %:z").to_string() - } - } else if time_args.us_military_time { - if date_only { - time.format("%m-%d-%Y").to_string() - } else { - time.format("%m-%d-%Y %H:%M:%S%.3f %:z").to_string() - } - } else if time_args.european_time { - if date_only { - time.format("%d-%m-%Y").to_string() - } else { - time.format("%d-%m-%Y %H:%M:%S%.3f %:z").to_string() - } - } else if date_only { - time.format("%Y-%m-%d").to_string() - } else { - time.format("%Y-%m-%d %H:%M:%S%.3f %:z").to_string() - } -} - #[cfg(test)] mod tests { - use crate::afterfact::DisplayFormat; use crate::afterfact::_get_serialized_disp_output; use crate::afterfact::emit_csv; use crate::afterfact::format_time; - use crate::detections::print; - use crate::detections::print::DetectInfo; - use crate::detections::print::CH_CONFIG; + use crate::detections::message; + use crate::detections::message::DetectInfo; + use crate::options::profile::load_profile; use chrono::{Local, TimeZone, Utc}; use hashbrown::HashMap; + use linked_hash_map::LinkedHashMap; use serde_json::Value; use std::fs::File; use std::fs::{read_to_string, remove_file}; use std::io; #[test] - fn test_emit_csv() { - //テストの並列処理によって読み込みの順序が担保できずstatic変数の内容が担保が取れない為、このテストはシーケンシャルで行う - test_emit_csv_output(); - test_emit_csv_output(); - } - fn 
test_emit_csv_output() { + let mock_ch_filter = + message::create_output_filter_config("test_files/config/channel_abbreviations.txt"); let test_filepath: &str = "test.evtx"; let test_rulepath: &str = "test-rule.yml"; let test_title = "test_title"; @@ -688,8 +741,17 @@ mod tests { let test_attack = "execution/txxxx.yyy"; let test_recinfo = "record_infoinfo11"; let test_record_id = "11111"; + let expect_time = Utc + .datetime_from_str("1996-02-27T01:05:01Z", "%Y-%m-%dT%H:%M:%SZ") + .unwrap(); + let expect_tz = expect_time.with_timezone(&Local); + let output_profile: LinkedHashMap = load_profile( + "test_files/config/default_profile.yaml", + "test_files/config/profiles.yaml", + ) + .unwrap(); { - let mut messages = print::MESSAGES.lock().unwrap(); + let messages = &message::MESSAGES; messages.clear(); let val = r##" { @@ -706,33 +768,46 @@ mod tests { } "##; let event: Value = serde_json::from_str(val).unwrap(); - messages.insert( - &event, - output.to_string(), - DetectInfo { - filepath: test_filepath.to_string(), - rulepath: test_rulepath.to_string(), - level: test_level.to_string(), - computername: test_computername.to_string(), - eventid: test_eventid.to_string(), - channel: CH_CONFIG + let mut profile_converter: HashMap = HashMap::from([ + ("%Timestamp%".to_owned(), format_time(&expect_time, false)), + ("%Computer%".to_owned(), test_computername.to_string()), + ( + "%Channel%".to_owned(), + mock_ch_filter .get("Security") .unwrap_or(&String::default()) .to_string(), - alert: test_title.to_string(), + ), + ("%Level%".to_owned(), test_level.to_string()), + ("%EventID%".to_owned(), test_eventid.to_string()), + ("%MitreAttack%".to_owned(), test_attack.to_string()), + ("%RecordID%".to_owned(), test_record_id.to_string()), + ("%RuleTitle%".to_owned(), test_title.to_owned()), + ("%RecordInformation%".to_owned(), test_recinfo.to_owned()), + ("%RuleFile%".to_owned(), test_rulepath.to_string()), + ("%EvtxFile%".to_owned(), test_filepath.to_string()), + 
("%Tags%".to_owned(), test_attack.to_string()), + ]); + message::insert( + &event, + output.to_string(), + DetectInfo { + rulepath: test_rulepath.to_string(), + ruletitle: test_title.to_string(), + level: test_level.to_string(), + computername: test_computername.to_string(), + eventid: test_eventid.to_string(), detail: String::default(), - tag_info: test_attack.to_string(), record_information: Option::Some(test_recinfo.to_string()), - record_id: Option::Some(test_record_id.to_string()), + ext_field: output_profile, }, + expect_time, + &mut profile_converter, + false, ); } - let expect_time = Utc - .datetime_from_str("1996-02-27T01:05:01Z", "%Y-%m-%dT%H:%M:%SZ") - .unwrap(); - let expect_tz = expect_time.with_timezone(&Local); let expect = - "Timestamp,Computer,Channel,EventID,Level,MitreAttack,RecordID,RuleTitle,Details,RecordInformation,RulePath,FilePath\n" + "Timestamp,Computer,Channel,Level,EventID,MitreAttack,RecordID,RuleTitle,Details,RecordInformation,RuleFile,EvtxFile,Tags\n" .to_string() + &expect_tz .clone() @@ -743,10 +818,10 @@ mod tests { + "," + test_channel + "," - + test_eventid - + "," + test_level + "," + + test_eventid + + "," + test_attack + "," + test_record_id @@ -760,9 +835,11 @@ mod tests { + test_rulepath + "," + test_filepath + + "," + + test_attack + "\n"; let mut file: Box = Box::new(File::create("./test_emit_csv.csv").unwrap()); - assert!(emit_csv(&mut file, false, HashMap::default(), 1).is_ok()); + assert!(emit_csv(&mut file, false, HashMap::new(), 1).is_ok()); match read_to_string("./test_emit_csv.csv") { Err(_) => panic!("Failed to open file."), Ok(s) => { @@ -770,10 +847,10 @@ mod tests { } }; assert!(remove_file("./test_emit_csv.csv").is_ok()); - check_emit_csv_display(); } - fn check_emit_csv_display() { + #[test] + fn test_emit_csv_display() { let test_title = "test_title2"; let test_level = "medium"; let test_computername = "testcomputer2"; @@ -786,44 +863,42 @@ mod tests { let test_timestamp = Utc 
.datetime_from_str("1996-02-27T01:05:01Z", "%Y-%m-%dT%H:%M:%SZ") .unwrap(); - let expect_header = "Timestamp|Computer|Channel|EventID|Level|RecordID|RuleTitle|Details\n"; + let expect_header = "Timestamp|Computer|Channel|EventID|Level|RecordID|RuleTitle|Details|RecordInformation\n"; let expect_tz = test_timestamp.with_timezone(&Local); let expect_no_header = expect_tz .clone() .format("%Y-%m-%d %H:%M:%S%.3f %:z") .to_string() - + "|" + + " | " + test_computername - + "|" + + " | " + test_channel - + "|" + + " | " + test_eventid - + "|" + + " | " + test_level - + "|" + + " | " + test_recid - + "|" + + " | " + test_title - + "|" + + " | " + output - + "|" + + " | " + test_recinfo + "\n"; - assert_eq!(_get_serialized_disp_output(None,), expect_header); - assert_eq!( - _get_serialized_disp_output(Some(DisplayFormat { - timestamp: &format_time(&test_timestamp, false), - level: test_level, - computer: test_computername, - event_i_d: test_eventid, - channel: test_channel, - rule_title: test_title, - details: output, - record_information: Some(test_recinfo), - record_i_d: Some(test_recid), - })), - expect_no_header - ); + let mut data: LinkedHashMap = LinkedHashMap::new(); + data.insert("Timestamp".to_owned(), format_time(&test_timestamp, false)); + data.insert("Computer".to_owned(), test_computername.to_owned()); + data.insert("Channel".to_owned(), test_channel.to_owned()); + data.insert("EventID".to_owned(), test_eventid.to_owned()); + data.insert("Level".to_owned(), test_level.to_owned()); + data.insert("RecordID".to_owned(), test_recid.to_owned()); + data.insert("RuleTitle".to_owned(), test_title.to_owned()); + data.insert("Details".to_owned(), output.to_owned()); + data.insert("RecordInformation".to_owned(), test_recinfo.to_owned()); + + assert_eq!(_get_serialized_disp_output(&data, true), expect_header); + assert_eq!(_get_serialized_disp_output(&data, false), expect_no_header); } } diff --git a/src/detections/configs.rs b/src/detections/configs.rs index 
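The rewritten `test_emit_csv_display` above builds an insertion-ordered map and expects `_get_serialized_disp_output` to render either a header row (keys joined with `|`) or a value row (values joined with `" | "`). A minimal std-only sketch of that serialization — hypothetical helper name and flat slice input; the real function takes a `LinkedHashMap<String, String>`:

```rust
/// Sketch: render ordered (column, value) pairs either as a header row
/// (keys joined with "|") or a data row (values joined with " | "),
/// matching the expected strings in the test above.
fn serialize_disp_output(data: &[(&str, &str)], header: bool) -> String {
    if header {
        let keys: Vec<&str> = data.iter().map(|(k, _)| *k).collect();
        format!("{}\n", keys.join("|"))
    } else {
        let values: Vec<&str> = data.iter().map(|(_, v)| *v).collect();
        format!("{}\n", values.join(" | "))
    }
}

fn main() {
    let data = [("Computer", "testcomputer2"), ("EventID", "7777")];
    print!("{}", serialize_disp_output(&data, true)); // Computer|EventID
    print!("{}", serialize_disp_output(&data, false)); // testcomputer2 | 7777
}
```

Note the asymmetry the test encodes: the header uses a bare `|` separator while data rows pad it with spaces.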
883d7858..feda62a4 100644 --- a/src/detections/configs.rs +++ b/src/detections/configs.rs @@ -1,13 +1,13 @@ +use crate::detections::message::AlertMessage; use crate::detections::pivot::PivotKeyword; use crate::detections::pivot::PIVOT_KEYWORD; -use crate::detections::print::AlertMessage; use crate::detections::utils; use chrono::{DateTime, Utc}; use clap::{App, CommandFactory, Parser}; -use hashbrown::HashMap; -use hashbrown::HashSet; +use hashbrown::{HashMap, HashSet}; use lazy_static::lazy_static; use regex::Regex; +use std::env::current_exe; use std::path::PathBuf; use std::sync::RwLock; use terminal_size::{terminal_size, Height, Width}; @@ -32,6 +32,10 @@ lazy_static! { pub static ref TERM_SIZE: Option<(Width, Height)> = terminal_size(); pub static ref TARGET_EXTENSIONS: HashSet = get_target_extensions(CONFIG.read().unwrap().args.evtx_file_ext.as_ref()); + pub static ref CURRENT_EXE_PATH: PathBuf = + current_exe().unwrap().parent().unwrap().to_path_buf(); + pub static ref EXCLUDE_STATUS: HashSet = + convert_option_vecs_to_hs(CONFIG.read().unwrap().args.exclude_status.as_ref()); } pub struct ConfigReader<'a> { @@ -51,78 +55,74 @@ impl Default for ConfigReader<'_> { #[derive(Parser)] #[clap( name = "Hayabusa", - usage = "hayabusa.exe -f file.evtx [OPTIONS] / hayabusa.exe -d evtx-directory [OPTIONS]", + usage = "hayabusa.exe [OTHER-ACTIONS] [OPTIONS]", author = "Yamato Security (https://github.com/Yamato-Security/hayabusa) @SecurityYamato)", + help_template = "\n{name} {version}\n{author}\n\n{usage-heading}\n {usage}\n\n{all-args}\n", version, term_width = 400 )] pub struct Config { /// Directory of multiple .evtx files - #[clap(short = 'd', long, value_name = "DIRECTORY")] + #[clap(help_heading = Some("INPUT"), short = 'd', long, value_name = "DIRECTORY")] pub directory: Option, /// File path to one .evtx file - #[clap(short = 'f', long, value_name = "FILE_PATH")] + #[clap(help_heading = Some("INPUT"), short = 'f', long = "file", value_name = "FILE")] pub 
filepath: Option, - /// Print all field information - #[clap(short = 'F', long = "full-data")] - pub full_data: bool, - - /// Specify a rule directory or file (default: ./rules) + /// Specify a custom rule directory or file (default: ./rules) #[clap( + help_heading = Some("ADVANCED"), short = 'r', long, default_value = "./rules", hide_default_value = true, - value_name = "RULE_DIRECTORY/RULE_FILE" + value_name = "DIRECTORY/FILE" )] pub rules: PathBuf, - /// Specify custom rule config folder (default: ./rules/config) + /// Specify custom rule config directory (default: ./rules/config) #[clap( + help_heading = Some("ADVANCED"), short = 'c', - long, + long = "rules-config", default_value = "./rules/config", hide_default_value = true, - value_name = "RULE_CONFIG_DIRECTORY" + value_name = "DIRECTORY" )] pub config: PathBuf, /// Save the timeline in CSV format (ex: results.csv) - #[clap(short = 'o', long, value_name = "CSV_TIMELINE")] + #[clap(help_heading = Some("OUTPUT"), short = 'o', long, value_name = "FILE")] pub output: Option, - /// Output all tags when saving to a CSV file - #[clap(long = "all-tags")] - pub all_tags: bool, - - /// Do not display EventRecordID numbers - #[clap(short = 'R', long = "hide-record-id")] - pub hide_record_id: bool, - /// Output verbose information - #[clap(short = 'v', long)] + #[clap(help_heading = Some("DISPLAY-SETTINGS"), short = 'v', long)] pub verbose: bool, /// Output event frequency timeline - #[clap(short = 'V', long = "visualize-timeline")] + #[clap(help_heading = Some("DISPLAY-SETTINGS"), short = 'V', long = "visualize-timeline")] pub visualize_timeline: bool, /// Enable rules marked as deprecated - #[clap(short = 'D', long = "enable-deprecated-rules")] + #[clap(help_heading = Some("FILTERING"), long = "enable-deprecated-rules")] pub enable_deprecated_rules: bool, + /// Disable event ID filter to scan all events + #[clap(help_heading = Some("FILTERING"), short = 'D', long = "deep-scan")] + pub deep_scan: bool, + /// Enable 
rules marked as noisy - #[clap(short = 'n', long = "enable-noisy-rules")] + #[clap(help_heading = Some("FILTERING"), short = 'n', long = "enable-noisy-rules")] pub enable_noisy_rules: bool, /// Update to the latest rules in the hayabusa-rules github repository - #[clap(short = 'u', long = "update-rules")] + #[clap(help_heading = Some("OTHER-ACTIONS"), short = 'u', long = "update-rules")] pub update_rules: bool, /// Minimum level for rules (default: informational) #[clap( + help_heading = Some("FILTERING"), short = 'm', long = "min-level", default_value = "informational", @@ -132,85 +132,101 @@ pub struct Config { pub min_level: String, /// Analyze the local C:\Windows\System32\winevt\Logs folder - #[clap(short = 'l', long = "live-analysis")] + #[clap(help_heading = Some("INPUT"), short = 'l', long = "live-analysis")] pub live_analysis: bool, /// Start time of the event logs to load (ex: "2020-02-22 00:00:00 +09:00") - #[clap(long = "start-timeline", value_name = "START_TIMELINE")] + #[clap(help_heading = Some("FILTERING"), long = "timeline-start", value_name = "DATE")] pub start_timeline: Option, /// End time of the event logs to load (ex: "2022-02-22 23:59:59 +09:00") - #[clap(long = "end-timeline", value_name = "END_TIMELINE")] + #[clap(help_heading = Some("FILTERING"), long = "timeline-end", value_name = "DATE")] pub end_timeline: Option, /// Output timestamp in RFC 2822 format (ex: Fri, 22 Feb 2022 22:00:00 -0600) - #[clap(long = "RFC-2822")] + #[clap(help_heading = Some("TIME-FORMAT"), long = "RFC-2822")] pub rfc_2822: bool, /// Output timestamp in RFC 3339 format (ex: 2022-02-22 22:00:00.123456-06:00) - #[clap(long = "RFC-3339")] + #[clap(help_heading = Some("TIME-FORMAT"), long = "RFC-3339")] pub rfc_3339: bool, /// Output timestamp in US time format (ex: 02-22-2022 10:00:00.123 PM -06:00) - #[clap(long = "US-time")] + #[clap(help_heading = Some("TIME-FORMAT"), long = "US-time")] pub us_time: bool, /// Output timestamp in US military time format (ex: 
02-22-2022 22:00:00.123 -06:00) - #[clap(long = "US-military-time")] + #[clap(help_heading = Some("TIME-FORMAT"), long = "US-military-time")] pub us_military_time: bool, /// Output timestamp in European time format (ex: 22-02-2022 22:00:00.123 +02:00) - #[clap(long = "European-time")] + #[clap(help_heading = Some("TIME-FORMAT"), long = "European-time")] pub european_time: bool, /// Output time in UTC format (default: local time) - #[clap(short = 'U', long = "UTC")] + #[clap(help_heading = Some("TIME-FORMAT"), short = 'U', long = "UTC")] pub utc: bool, /// Disable color output - #[clap(long = "no-color")] + #[clap(help_heading = Some("DISPLAY-SETTINGS"), long = "no-color")] pub no_color: bool, /// Thread number (default: optimal number for performance) - #[clap(short, long = "thread-number", value_name = "NUMBER")] + #[clap(help_heading = Some("ADVANCED"), short, long = "thread-number", value_name = "NUMBER")] pub thread_number: Option, /// Print statistics of event IDs - #[clap(short, long)] + #[clap(help_heading = Some("OTHER-ACTIONS"), short, long)] pub statistics: bool, /// Print a summary of successful and failed logons - #[clap(short = 'L', long = "logon-summary")] + #[clap(help_heading = Some("OTHER-ACTIONS"), short = 'L', long = "logon-summary")] pub logon_summary: bool, /// Tune alert levels (default: ./rules/config/level_tuning.txt) #[clap( + help_heading = Some("OTHER-ACTIONS"), long = "level-tuning", - default_value = "./rules/config/level_tuning.txt", hide_default_value = true, - value_name = "LEVEL_TUNING_FILE" + value_name = "FILE" )] - pub level_tuning: PathBuf, + pub level_tuning: Option>, /// Quiet mode: do not display the launch banner - #[clap(short, long)] + #[clap(help_heading = Some("DISPLAY-SETTINGS"), short, long)] pub quiet: bool, /// Quiet errors mode: do not save error logs - #[clap(short = 'Q', long = "quiet-errors")] + #[clap(help_heading = Some("ADVANCED"), short = 'Q', long = "quiet-errors")] pub quiet_errors: bool, /// Create a list 
of pivot keywords - #[clap(short = 'p', long = "pivot-keywords-list")] + #[clap(help_heading = Some("OTHER-ACTIONS"), short = 'p', long = "pivot-keywords-list")] pub pivot_keywords_list: bool, /// Print the list of contributors - #[clap(long)] + #[clap(help_heading = Some("OTHER-ACTIONS"), long)] pub contributors: bool, /// Specify additional target file extensions (ex: evtx_data) (ex: evtx1 evtx2) - #[clap(long = "target-file-ext", multiple_values = true)] + #[clap(help_heading = Some("ADVANCED"), long = "target-file-ext", multiple_values = true)] pub evtx_file_ext: Option>, + + /// Ignore rules according to status (ex: experimental) (ex: stable test) + #[clap(help_heading = Some("FILTERING"), long = "exclude-status", multiple_values = true, value_name = "STATUS")] + pub exclude_status: Option>, + + /// Specify output profile (minimal, standard, verbose, verbose-all-field-info, verbose-details-and-all-field-info) + #[clap(help_heading = Some("OUTPUT"), short = 'P', long = "profile")] + pub profile: Option, + + /// Set default output profile + #[clap(help_heading = Some("OTHER-ACTIONS"), long = "set-default-profile", value_name = "PROFILE")] + pub set_default_profile: Option, + + /// Do not display result summary + #[clap(help_heading = Some("DISPLAY-SETTINGS"), long = "no-summary")] + pub no_summary: bool, } impl ConfigReader<'_> { @@ -228,8 +244,22 @@ impl ConfigReader<'_> { app: build_cmd, args: parse, headless_help: String::default(), - event_timeline_config: load_eventcode_info("config/statistics_event_info.txt"), - target_eventids: load_target_ids("config/target_eventids.txt"), + event_timeline_config: load_eventcode_info( + utils::check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "rules/config/statistics_event_info.txt", + ) + .to_str() + .unwrap(), + ), + target_eventids: load_target_ids( + utils::check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "rules/config/target_event_IDs.txt", + ) + .to_str() + .unwrap(), + ), } } } @@ -447,7 +477,7 @@ 
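The `ConfigReader` changes above resolve `statistics_event_info.txt` and `target_event_IDs.txt` through `utils::check_setting_path` and the new `CURRENT_EXE_PATH` static, so hayabusa.exe works outside its own directory (#592). A std-only sketch of that resolution idea — the fallback logic shown here is hypothetical; the real `utils::check_setting_path` may differ:

```rust
use std::env::current_exe;
use std::path::{Path, PathBuf};

// Sketch: prefer a config file relative to the current working directory,
// and fall back to the directory containing the executable so the tool can
// be run from anywhere (#592). Hypothetical logic for illustration.
fn check_setting_path(exe_dir: &Path, relative: &str) -> PathBuf {
    let cwd_candidate = PathBuf::from(relative);
    if cwd_candidate.exists() {
        cwd_candidate
    } else {
        exe_dir.join(relative)
    }
}

fn main() {
    // Mirrors the CURRENT_EXE_PATH lazy_static in configs.rs.
    let exe_dir = current_exe().unwrap().parent().unwrap().to_path_buf();
    println!(
        "{}",
        check_setting_path(&exe_dir, "rules/config/target_event_IDs.txt").display()
    );
}
```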
pub fn load_pivot_keywords(path: &str) {
             .write()
             .unwrap()
             .entry(map[0].to_string())
-            .or_insert(PivotKeyword::new());
+            .or_insert_with(PivotKeyword::new);
 
         PIVOT_KEYWORD
             .write()
@@ -461,12 +491,17 @@ pub fn load_pivot_keywords(path: &str) {
 
 /// Returns the set of file extensions to scan, built from any extensions added with --target-file-ext
 pub fn get_target_extensions(arg: Option<&Vec<String>>) -> HashSet<String> {
-    let mut target_file_extensions: HashSet<String> =
-        arg.unwrap_or(&Vec::new()).iter().cloned().collect();
+    let mut target_file_extensions: HashSet<String> = convert_option_vecs_to_hs(arg);
     target_file_extensions.insert(String::from("evtx"));
     target_file_extensions
 }
 
+/// Converts the contents of an Option<&Vec<String>> into a HashSet<String>
+pub fn convert_option_vecs_to_hs(arg: Option<&Vec<String>>) -> HashSet<String> {
+    let ret: HashSet<String> = arg.unwrap_or(&Vec::new()).iter().cloned().collect();
+    ret
+}
+
 #[derive(Debug, Clone)]
 pub struct EventInfo {
     pub evttitle: String,
diff --git a/src/detections/detection.rs b/src/detections/detection.rs
index dfb01167..2f4e6207 100644
--- a/src/detections/detection.rs
+++ b/src/detections/detection.rs
@@ -1,36 +1,46 @@
 extern crate csv;
 
 use crate::detections::configs;
-use crate::detections::pivot::insert_pivot_keyword;
-use crate::detections::print::AlertMessage;
-use crate::detections::print::DetectInfo;
-use crate::detections::print::ERROR_LOG_STACK;
-use crate::detections::print::MESSAGES;
-use crate::detections::print::{CH_CONFIG, DEFAULT_DETAILS, IS_HIDE_RECORD_ID, TAGS_CONFIG};
-use crate::detections::print::{
+use crate::detections::utils::{format_time, write_color_buffer};
+use crate::options::profile::{
+    LOAEDED_PROFILE_ALIAS, PRELOAD_PROFILE, PRELOAD_PROFILE_REGEX, PROFILES,
+};
+use chrono::{TimeZone, Utc};
+use itertools::Itertools;
+use termcolor::{BufferWriter, Color, ColorChoice};
+
+use crate::detections::message::AlertMessage;
+use crate::detections::message::DetectInfo;
+use crate::detections::message::ERROR_LOG_STACK;
+use crate::detections::message::{CH_CONFIG, DEFAULT_DETAILS, TAGS_CONFIG};
+use
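The new `convert_option_vecs_to_hs` above factors the extension-collection logic out of `get_target_extensions` so the `EXCLUDE_STATUS` static can reuse it for `--exclude-status`. A sketch with std's `HashSet` (the real code uses `hashbrown`):

```rust
use std::collections::HashSet;

// Sketch of convert_option_vecs_to_hs using std's HashSet instead of
// hashbrown: an absent CLI option becomes an empty set, duplicates collapse.
fn convert_option_vecs_to_hs(arg: Option<&Vec<String>>) -> HashSet<String> {
    arg.map(|v| v.iter().cloned().collect()).unwrap_or_default()
}

fn main() {
    let args = vec!["evtx1".to_string(), "evtx2".to_string(), "evtx1".to_string()];
    let mut exts = convert_option_vecs_to_hs(Some(&args));
    // get_target_extensions always adds the default extension afterwards.
    exts.insert(String::from("evtx"));
    println!("{}", exts.len()); // evtx1, evtx2, evtx
}
```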
crate::detections::message::{ LOGONSUMMARY_FLAG, PIVOT_KEYWORD_LIST_FLAG, QUIET_ERRORS_FLAG, STATISTICS_FLAG, }; +use crate::detections::pivot::insert_pivot_keyword; use crate::detections::rule; use crate::detections::rule::AggResult; use crate::detections::rule::RuleNode; use crate::detections::utils::{get_serde_number_to_string, make_ascii_titlecase}; use crate::filter; use crate::yaml::ParseYaml; -use hashbrown; use hashbrown::HashMap; use serde_json::Value; use std::fmt::Write; use std::path::Path; + use std::sync::Arc; use tokio::{runtime::Runtime, spawn, task::JoinHandle}; +use super::message; +use super::message::LEVEL_ABBR; + // イベントファイルの1レコード分の情報を保持する構造体 #[derive(Clone, Debug)] pub struct EvtxRecordInfo { - pub evtx_filepath: String, // イベントファイルのファイルパス ログで出力するときに使う + pub evtx_filepath: String, // イベントファイルのファイルパス ログで出力するときに使う pub record: Value, // 1レコード分のデータをJSON形式にシリアライズしたもの pub data_string: String, - pub key_2_value: hashbrown::HashMap, + pub key_2_value: HashMap, pub record_information: Option, } @@ -119,13 +129,11 @@ impl Detection { .filter_map(return_if_success) .collect(); if !*LOGONSUMMARY_FLAG { - let _ = &rulefile_loader - .rule_load_cnt - .insert(String::from("rule parsing error"), parseerror_count); Detection::print_rule_load_info( &rulefile_loader.rulecounter, &rulefile_loader.rule_load_cnt, &rulefile_loader.rule_status_cnt, + &parseerror_count, ); } ret @@ -199,34 +207,14 @@ impl Detection { rule } - /// 条件に合致したレコードを表示するための関数 + /// 条件に合致したレコードを格納するための関数 fn insert_message(rule: &RuleNode, record_info: &EvtxRecordInfo) { - let tag_info: Vec = match TAGS_CONFIG.is_empty() { - false => rule.yaml["tags"] - .as_vec() - .unwrap_or(&Vec::default()) - .iter() - .filter_map(|info| TAGS_CONFIG.get(info.as_str().unwrap_or(&String::default()))) - .map(|str| str.to_owned()) - .collect(), - true => rule.yaml["tags"] - .as_vec() - .unwrap_or(&Vec::default()) - .iter() - .map( - |info| match TAGS_CONFIG.get(info.as_str().unwrap_or(&String::default())) { - 
Some(s) => s.to_owned(), - _ => info.as_str().unwrap_or("").replace("attack.", ""), - }, - ) - .collect(), - }; - + let tag_info: &Vec = &Detection::get_tag_info(rule); let recinfo = record_info .record_information .as_ref() .map(|recinfo| recinfo.to_string()); - let rec_id = if !*IS_HIDE_RECORD_ID { + let rec_id = if LOAEDED_PROFILE_ALIAS.contains("%RecordID%") { Some( get_serde_number_to_string(&record_info.record["Event"]["System"]["EventRecordID"]) .unwrap_or_default(), @@ -242,73 +230,316 @@ impl Detection { .unwrap_or_default(); let eid = get_serde_number_to_string(&record_info.record["Event"]["System"]["EventID"]) .unwrap_or_else(|| "-".to_owned()); - let default_output = DEFAULT_DETAILS - .get(&format!("{}_{}", provider, &eid)) - .unwrap_or(&"-".to_string()) - .to_string(); + let default_output = match DEFAULT_DETAILS.get(&format!("{}_{}", provider, &eid)) { + Some(str) => str.to_owned(), + None => recinfo.as_ref().unwrap_or(&"-".to_string()).to_string(), + }; + let opt_record_info = if LOAEDED_PROFILE_ALIAS.contains("%RecordInformation%") { + recinfo + } else { + None + }; + + let default_time = Utc.ymd(1970, 1, 1).and_hms(0, 0, 0); + let time = message::get_event_time(&record_info.record).unwrap_or(default_time); + let level = rule.yaml["level"].as_str().unwrap_or("-").to_string(); + + let mut profile_converter: HashMap = HashMap::new(); + for (_k, v) in PROFILES.as_ref().unwrap().iter() { + let tmp = v.as_str(); + for target_profile in PRELOAD_PROFILE_REGEX.matches(tmp).into_iter() { + match PRELOAD_PROFILE[target_profile] { + "%Timestamp%" => { + profile_converter + .insert("%Timestamp%".to_string(), format_time(&time, false)); + } + "%Computer%" => { + profile_converter.insert( + "%Computer%".to_string(), + record_info.record["Event"]["System"]["Computer"] + .to_string() + .replace('\"', ""), + ); + } + "%Channel%" => { + profile_converter.insert( + "%Channel%".to_string(), + CH_CONFIG.get(ch_str).unwrap_or(ch_str).to_string(), + ); + } + "%Level%" => 
{ + profile_converter.insert( + "%Level%".to_string(), + LEVEL_ABBR.get(&level).unwrap_or(&level).to_string(), + ); + } + "%EventID%" => { + profile_converter.insert("%EventID%".to_string(), eid.to_owned()); + } + "%RecordID%" => { + profile_converter.insert( + "%RecordID%".to_string(), + rec_id.as_ref().unwrap_or(&"".to_string()).to_owned(), + ); + } + "%RuleTitle%" => { + profile_converter.insert( + "%RuleTitle%".to_string(), + rule.yaml["title"].as_str().unwrap_or("").to_string(), + ); + } + "%RecordInformation%" => { + profile_converter.insert( + "%RecordInformation%".to_string(), + opt_record_info + .as_ref() + .unwrap_or(&"-".to_string()) + .to_owned(), + ); + } + "%RuleFile%" => { + profile_converter.insert( + "%RuleFile%".to_string(), + Path::new(&rule.rulepath) + .file_name() + .unwrap_or_default() + .to_str() + .unwrap_or_default() + .to_string(), + ); + } + "%EvtxFile%" => { + profile_converter.insert( + "%EvtxFile%".to_string(), + Path::new(&record_info.evtx_filepath) + .to_str() + .unwrap_or_default() + .to_string(), + ); + } + "%MitreTactics%" => { + let tactics: &Vec = &tag_info + .iter() + .filter(|x| TAGS_CONFIG.values().contains(x)) + .map(|y| y.to_owned()) + .collect(); + profile_converter.insert("%MitreTactics%".to_string(), tactics.join(" : ")); + } + "%MitreTags%" => { + let techniques: &Vec = &tag_info + .iter() + .filter(|x| { + !TAGS_CONFIG.values().contains(x) + && (x.starts_with("attack.t") + || x.starts_with("attack.g") + || x.starts_with("attack.s")) + }) + .map(|y| { + let mut replaced_tag = y.replace("attack.", ""); + make_ascii_titlecase(&mut replaced_tag) + }) + .collect(); + profile_converter.insert("%MitreTags%".to_string(), techniques.join(" : ")); + } + "%OtherTags%" => { + let tags: &Vec = &tag_info + .iter() + .filter(|x| { + !(TAGS_CONFIG.values().contains(x) + || x.starts_with("attack.t") + || x.starts_with("attack.g") + || x.starts_with("attack.s")) + }) + .map(|y| y.to_owned()) + .collect(); + 
profile_converter.insert("%OtherTags%".to_string(), tags.join(" : ")); + } + + _ => {} + } + } + } + let detect_info = DetectInfo { - filepath: record_info.evtx_filepath.to_string(), - rulepath: rule.rulepath.to_string(), - level: rule.yaml["level"].as_str().unwrap_or("-").to_string(), + rulepath: (&rule.rulepath).to_owned(), + ruletitle: rule.yaml["title"].as_str().unwrap_or("-").to_string(), + level: LEVEL_ABBR.get(&level).unwrap_or(&level).to_string(), computername: record_info.record["Event"]["System"]["Computer"] .to_string() .replace('\"', ""), eventid: eid, - channel: CH_CONFIG.get(ch_str).unwrap_or(ch_str).to_string(), - alert: rule.yaml["title"].as_str().unwrap_or("").to_string(), detail: String::default(), - tag_info: tag_info.join(" | "), - record_information: recinfo, - record_id: rec_id, + record_information: opt_record_info, + ext_field: PROFILES.as_ref().unwrap().to_owned(), }; - MESSAGES.lock().unwrap().insert( + message::insert( &record_info.record, rule.yaml["details"] .as_str() .unwrap_or(&default_output) .to_string(), detect_info, + time, + &mut profile_converter, + false, ); } /// insert aggregation condition detection message to output stack fn insert_agg_message(rule: &RuleNode, agg_result: AggResult) { - let tag_info: Vec = rule.yaml["tags"] - .as_vec() - .unwrap_or(&Vec::default()) - .iter() - .filter_map(|info| TAGS_CONFIG.get(info.as_str().unwrap_or(&String::default()))) - .map(|str| str.to_owned()) - .collect(); + let tag_info: &Vec = &Detection::get_tag_info(rule); let output = Detection::create_count_output(rule, &agg_result); - let rec_info = if configs::CONFIG.read().unwrap().args.full_data { + let rec_info = if LOAEDED_PROFILE_ALIAS.contains("%RecordInformation%") { Option::Some(String::default()) } else { Option::None }; - let rec_id = if !*IS_HIDE_RECORD_ID { - Some(String::default()) - } else { - None - }; + + let mut profile_converter: HashMap = HashMap::new(); + let level = 
rule.yaml["level"].as_str().unwrap_or("-").to_string(); + + for (_k, v) in PROFILES.as_ref().unwrap().iter() { + let tmp = v.as_str(); + for target_profile in PRELOAD_PROFILE_REGEX.matches(tmp).into_iter() { + match PRELOAD_PROFILE[target_profile] { + "%Timestamp%" => { + profile_converter.insert( + "%Timestamp%".to_string(), + format_time(&agg_result.start_timedate, false), + ); + } + "%Computer%" => { + profile_converter.insert("%Computer%".to_string(), "-".to_owned()); + } + "%Channel%" => { + profile_converter.insert("%Channel%".to_string(), "-".to_owned()); + } + "%Level%" => { + profile_converter.insert( + "%Level%".to_string(), + LEVEL_ABBR.get(&level).unwrap_or(&level).to_string(), + ); + } + "%EventID%" => { + profile_converter.insert("%EventID%".to_string(), "-".to_owned()); + } + "%RecordID%" => { + profile_converter.insert("%RecordID%".to_string(), "".to_owned()); + } + "%RuleTitle%" => { + profile_converter.insert( + "%RuleTitle%".to_string(), + rule.yaml["title"].as_str().unwrap_or("").to_string(), + ); + } + "%RecordInformation%" => { + profile_converter.insert("%RecordInformation%".to_string(), "-".to_owned()); + } + "%RuleFile%" => { + profile_converter.insert( + "%RuleFile%".to_string(), + Path::new(&rule.rulepath) + .file_name() + .unwrap_or_default() + .to_str() + .unwrap_or_default() + .to_string(), + ); + } + "%EvtxFile%" => { + profile_converter.insert("%EvtxFile%".to_string(), "-".to_owned()); + } + "%MitreTactics%" => { + let tactics: &Vec = &tag_info + .iter() + .filter(|x| TAGS_CONFIG.values().contains(x)) + .map(|y| y.to_owned()) + .collect(); + profile_converter.insert("%MitreTactics%".to_string(), tactics.join(" : ")); + } + "%MitreTags%" => { + let techniques: &Vec = &tag_info + .iter() + .filter(|x| { + !TAGS_CONFIG.values().contains(x) + && (x.starts_with("attack.t") + || x.starts_with("attack.g") + || x.starts_with("attack.s")) + }) + .map(|y| { + let mut replaced_tag = y.replace("attack.", ""); + make_ascii_titlecase(&mut 
replaced_tag) + }) + .collect(); + profile_converter.insert("%MitreTags%".to_string(), techniques.join(" : ")); + } + "%OtherTags%" => { + let tags: &Vec = &tag_info + .iter() + .filter(|x| { + !(TAGS_CONFIG.values().contains(x) + || x.starts_with("attack.t") + || x.starts_with("attack.g") + || x.starts_with("attack.s")) + }) + .map(|y| y.to_owned()) + .collect(); + profile_converter.insert("%OtherTags%".to_string(), tags.join(" : ")); + } + _ => {} + } + } + } + let detect_info = DetectInfo { - filepath: "-".to_owned(), - rulepath: rule.rulepath.to_owned(), - level: rule.yaml["level"].as_str().unwrap_or("").to_owned(), + rulepath: (&rule.rulepath).to_owned(), + ruletitle: rule.yaml["title"].as_str().unwrap_or("-").to_string(), + level: LEVEL_ABBR.get(&level).unwrap_or(&level).to_string(), computername: "-".to_owned(), eventid: "-".to_owned(), - channel: "-".to_owned(), - alert: rule.yaml["title"].as_str().unwrap_or("").to_owned(), detail: output, record_information: rec_info, - tag_info: tag_info.join(" : "), - record_id: rec_id, + ext_field: PROFILES.as_ref().unwrap().to_owned(), }; - MESSAGES - .lock() - .unwrap() - .insert_message(detect_info, agg_result.start_timedate) + message::insert( + &Value::default(), + rule.yaml["details"].as_str().unwrap_or("-").to_string(), + detect_info, + agg_result.start_timedate, + &mut profile_converter, + true, + ) + } + + /// rule内のtagsの内容を配列として返却する関数 + fn get_tag_info(rule: &RuleNode) -> Vec { + match TAGS_CONFIG.is_empty() { + false => rule.yaml["tags"] + .as_vec() + .unwrap_or(&Vec::default()) + .iter() + .map(|info| { + if let Some(tag) = TAGS_CONFIG.get(info.as_str().unwrap_or(&String::default())) + { + tag.to_owned() + } else { + info.as_str().unwrap_or(&String::default()).to_owned() + } + }) + .collect(), + true => rule.yaml["tags"] + .as_vec() + .unwrap_or(&Vec::default()) + .iter() + .map( + |info| match TAGS_CONFIG.get(info.as_str().unwrap_or(&String::default())) { + Some(s) => s.to_owned(), + _ => 
info.as_str().unwrap_or("").to_string(), + }, + ) + .collect(), + } } ///aggregation conditionのcount部分の検知出力文の文字列を返す関数 @@ -363,46 +594,83 @@ impl Detection { rc: &HashMap, ld_rc: &HashMap, st_rc: &HashMap, + err_rc: &u128, ) { if *STATISTICS_FLAG { return; } let mut sorted_ld_rc: Vec<(&String, &u128)> = ld_rc.iter().collect(); sorted_ld_rc.sort_by(|a, b| a.0.cmp(b.0)); + let args = &configs::CONFIG.read().unwrap().args; + sorted_ld_rc.into_iter().for_each(|(key, value)| { - //タイトルに利用するものはascii文字であることを前提として1文字目を大文字にするように変更する - println!( - "{} rules: {}", - make_ascii_titlecase(key.clone().as_mut()), - value, - ); + if value != &0_u128 { + let disable_flag = if key == "noisy" && !args.enable_noisy_rules { + " (Disabled)" + } else { + "" + }; + //タイトルに利用するものはascii文字であることを前提として1文字目を大文字にするように変更する + println!( + "{} rules: {}{}", + make_ascii_titlecase(key.clone().as_mut()), + value, + disable_flag, + ); + } }); + if err_rc != &0_u128 { + write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + Some(Color::Red), + &format!("Rule parsing errors: {}", err_rc), + true, + ) + .ok(); + } println!(); let mut sorted_st_rc: Vec<(&String, &u128)> = st_rc.iter().collect(); let total_loaded_rule_cnt: u128 = sorted_st_rc.iter().map(|(_, v)| v.to_owned()).sum(); sorted_st_rc.sort_by(|a, b| a.0.cmp(b.0)); sorted_st_rc.into_iter().for_each(|(key, value)| { - let rate = if value == &0_u128 { - 0 as f64 - } else { - (*value as f64) / (total_loaded_rule_cnt as f64) * 100.0 - }; - //タイトルに利用するものはascii文字であることを前提として1文字目を大文字にするように変更する - println!( - "{} rules: {} ({:.2}%)", - make_ascii_titlecase(key.clone().as_mut()), - value, - rate - ); + if value != &0_u128 { + let rate = (*value as f64) / (total_loaded_rule_cnt as f64) * 100.0; + let deprecated_flag = if key == "deprecated" && !args.enable_deprecated_rules { + " (Disabled)" + } else { + "" + }; + //タイトルに利用するものはascii文字であることを前提として1文字目を大文字にするように変更する + write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + None, + 
&format!( + "{} rules: {} ({:.2}%){}", + make_ascii_titlecase(key.clone().as_mut()), + value, + rate, + deprecated_flag + ), + true, + ) + .ok(); + } }); println!(); let mut sorted_rc: Vec<(&String, &u128)> = rc.iter().collect(); sorted_rc.sort_by(|a, b| a.0.cmp(b.0)); sorted_rc.into_iter().for_each(|(key, value)| { - println!("{} rules: {}", key, value); + write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + None, + &format!("{} rules: {}", key, value), + true, + ) + .ok(); }); + println!("Total enabled detection rules: {}", total_loaded_rule_cnt); println!(); } diff --git a/src/detections/message.rs b/src/detections/message.rs new file mode 100644 index 00000000..9aff48c5 --- /dev/null +++ b/src/detections/message.rs @@ -0,0 +1,667 @@ +extern crate lazy_static; +use crate::detections::configs; +use crate::detections::configs::CURRENT_EXE_PATH; +use crate::detections::utils; +use crate::detections::utils::get_serde_number_to_string; +use crate::detections::utils::write_color_buffer; +use crate::options::profile::PROFILES; +use chrono::{DateTime, Local, Utc}; +use dashmap::DashMap; +use hashbrown::HashMap; +use lazy_static::lazy_static; +use linked_hash_map::LinkedHashMap; +use regex::Regex; +use serde_json::Value; +use std::env; +use std::fs::create_dir; +use std::fs::File; +use std::io::BufWriter; +use std::io::{self, Write}; +use std::path::Path; +use std::sync::Mutex; +use termcolor::{BufferWriter, ColorChoice}; + +#[derive(Debug, Clone)] +pub struct DetectInfo { + pub rulepath: String, + pub ruletitle: String, + pub level: String, + pub computername: String, + pub eventid: String, + pub detail: String, + pub record_information: Option, + pub ext_field: LinkedHashMap, +} + +pub struct AlertMessage {} + +lazy_static! 
{ + #[derive(Debug,PartialEq, Eq, Ord, PartialOrd)] + pub static ref MESSAGES: DashMap, Vec> = DashMap::new(); + pub static ref ALIASREGEX: Regex = Regex::new(r"%[a-zA-Z0-9-_\[\]]+%").unwrap(); + pub static ref SUFFIXREGEX: Regex = Regex::new(r"\[([0-9]+)\]").unwrap(); + pub static ref ERROR_LOG_PATH: String = format!( + "./logs/errorlog-{}.log", + Local::now().format("%Y%m%d_%H%M%S") + ); + pub static ref QUIET_ERRORS_FLAG: bool = configs::CONFIG.read().unwrap().args.quiet_errors; + pub static ref ERROR_LOG_STACK: Mutex> = Mutex::new(Vec::new()); + pub static ref STATISTICS_FLAG: bool = configs::CONFIG.read().unwrap().args.statistics; + pub static ref LOGONSUMMARY_FLAG: bool = configs::CONFIG.read().unwrap().args.logon_summary; + pub static ref TAGS_CONFIG: HashMap = create_output_filter_config( + utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "config/mitre_tactics.txt") + .to_str() + .unwrap(), + ); + pub static ref CH_CONFIG: HashMap = create_output_filter_config( + utils::check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "rules/config/channel_abbreviations.txt" + ) + .to_str() + .unwrap(), + ); + pub static ref PIVOT_KEYWORD_LIST_FLAG: bool = + configs::CONFIG.read().unwrap().args.pivot_keywords_list; + pub static ref DEFAULT_DETAILS: HashMap = get_default_details(&format!( + "{}/default_details.txt", + configs::CONFIG + .read() + .unwrap() + .args + .config + .as_path() + .display() + )); + pub static ref LEVEL_ABBR: LinkedHashMap = LinkedHashMap::from_iter([ + ("critical".to_string(), "crit".to_string()), + ("high".to_string(), "high".to_string()), + ("medium".to_string(), "med ".to_string()), + ("low".to_string(), "low ".to_string()), + ("informational".to_string(), "info".to_string()), + ]); + pub static ref LEVEL_FULL: HashMap = HashMap::from([ + ("crit".to_string(), "critical".to_string()), + ("high".to_string(), "high".to_string()), + ("med ".to_string(), "medium".to_string()), + ("low ".to_string(), "low".to_string()), + 
+        ("info".to_string(), "informational".to_string())
+    ]);
+}
+
+/// Builds a HashMap that maps the full tag names listed in the file to the
+/// strings they are replaced with for display.
+/// e.g. attack.impact,Impact
+pub fn create_output_filter_config(path: &str) -> HashMap<String, String> {
+    let mut ret: HashMap<String, String> = HashMap::new();
+    let read_result = utils::read_csv(path);
+    if read_result.is_err() {
+        AlertMessage::alert(read_result.as_ref().unwrap_err()).ok();
+        return HashMap::default();
+    }
+    read_result.unwrap().into_iter().for_each(|line| {
+        if line.len() != 2 {
+            return;
+        }
+
+        let tag_full_str = line[0].trim();
+        let tag_replace_str = line[1].trim();
+
+        ret.insert(tag_full_str.to_owned(), tag_replace_str.to_owned());
+    });
+    ret
+}
+
+/// Registers a message. To support aggregation conditions, the output time is
+/// taken as a DateTime argument instead of being read from the record.
+pub fn insert_message(detect_info: DetectInfo, event_time: DateTime<Utc>) {
+    let mut v = MESSAGES.entry(event_time).or_default();
+    let (_, info) = v.pair_mut();
+    info.push(detect_info);
+}
+
+/// Sets a message
+pub fn insert(
+    event_record: &Value,
+    output: String,
+    mut detect_info: DetectInfo,
+    time: DateTime<Utc>,
+    profile_converter: &mut HashMap<String, String>,
+    is_agg: bool,
+) {
+    if !is_agg {
+        let parsed_detail = parse_message(event_record, &output)
+            .chars()
+            .filter(|&c| !c.is_control())
+            .collect::<String>();
+        detect_info.detail = if parsed_detail.is_empty() {
+            "-".to_string()
+        } else {
+            parsed_detail
+        };
+    }
+    let mut exist_detail = false;
+    PROFILES.as_ref().unwrap().iter().for_each(|(_k, v)| {
+        if v.contains("%Details%") {
+            exist_detail = true;
+        }
+    });
+    if exist_detail {
+        profile_converter.insert("%Details%".to_string(), detect_info.detail.to_owned());
+    }
+    let mut tmp_converted_info: LinkedHashMap<String, String> = LinkedHashMap::new();
+    for (k, v) in &detect_info.ext_field {
+        let converted_reserve_info = convert_profile_reserved_info(v, profile_converter);
+        if v.contains("%RecordInformation%") || v.contains("%Details%") {
+            tmp_converted_info.insert(k.to_owned(), converted_reserve_info);
+        } else {
+            tmp_converted_info.insert(
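The filter-config loader above keeps only well-formed two-column CSV rows, trimming both fields and silently skipping anything else. A minimal std-only sketch of that behavior (hypothetical function name; it splits naively on commas with no quoting, whereas the real code goes through `utils::read_csv`):

```rust
use std::collections::HashMap;

// Sketch of the two-column config parsing: rows without exactly two fields
// are skipped (mirroring the `line.len() != 2` check), both fields are
// trimmed, and later duplicate keys overwrite earlier ones.
fn parse_filter_config(csv_text: &str) -> HashMap<String, String> {
    let mut ret = HashMap::new();
    for line in csv_text.lines() {
        let fields: Vec<&str> = line.split(',').collect();
        if fields.len() != 2 {
            continue; // malformed row: ignore it
        }
        ret.insert(fields[0].trim().to_owned(), fields[1].trim().to_owned());
    }
    ret
}

fn main() {
    let cfg = parse_filter_config("attack.impact, Impact\nbadline\nSecurity, Sec\n");
    assert_eq!(cfg.get("attack.impact").map(String::as_str), Some("Impact"));
    assert_eq!(cfg.get("Security").map(String::as_str), Some("Sec"));
    assert_eq!(cfg.len(), 2); // "badline" was skipped
}
```

The same shape serves both `TAGS_CONFIG` (`mitre_tactics.txt`) and `CH_CONFIG` (`channel_abbreviations.txt`), which is why the loader takes only a path and no format flags.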
+                k.to_owned(),
+                parse_message(event_record, &converted_reserve_info),
+            );
+        }
+    }
+    for (k, v) in tmp_converted_info {
+        detect_info.ext_field.insert(k, v);
+    }
+    insert_message(detect_info, time)
+}
+
+/// Converts the reserved words used in profiles
+fn convert_profile_reserved_info(
+    output: &String,
+    config_reserved_info: &HashMap<String, String>,
+) -> String {
+    let mut ret = output.to_owned();
+    config_reserved_info.iter().for_each(|(k, v)| {
+        ret = ret.replace(k, v);
+    });
+    ret
+}
+
+/// Replaces the %-delimited parts of a message, treating them as aliases and
+/// resolving them against the record's fields
+fn parse_message(event_record: &Value, output: &String) -> String {
+    let mut return_message = output.to_owned();
+    let mut hash_map: HashMap<String, String> = HashMap::new();
+    for caps in ALIASREGEX.captures_iter(&return_message) {
+        let full_target_str = &caps[0];
+        let target_length = full_target_str.chars().count() - 2; // 2 is the two percent signs
+        let target_str = full_target_str
+            .chars()
+            .skip(1)
+            .take(target_length)
+            .collect::<String>();
+
+        let array_str = if let Some(_array_str) = configs::EVENTKEY_ALIAS.get_event_key(&target_str)
+        {
+            _array_str.to_string()
+        } else {
+            format!("Event.EventData.{}", target_str)
+        };
+
+        let split: Vec<&str> = array_str.split('.').collect();
+        let mut tmp_event_record: &Value = event_record;
+        for s in &split {
+            if let Some(record) = tmp_event_record.get(s) {
+                tmp_event_record = record;
+            }
+        }
+        let suffix_match = SUFFIXREGEX.captures(&target_str);
+        let suffix: i64 = match suffix_match {
+            Some(cap) => cap.get(1).map_or(-1, |a| a.as_str().parse().unwrap_or(-1)),
+            None => -1,
+        };
+        if suffix >= 1 {
+            tmp_event_record = tmp_event_record
+                .get("Data")
+                .unwrap()
+                .get((suffix - 1) as usize)
+                .unwrap_or(tmp_event_record);
+        }
+        let hash_value = get_serde_number_to_string(tmp_event_record);
+        if hash_value.is_some() {
+            if let Some(hash_value) = hash_value {
+                // Unicode whitespace characters are hard to read when emitted
+                // to the CSV as-is, so convert them to plain spaces; leading
+                // and trailing whitespace is simply removed.
+                let hash_value: Vec<&str> =
+                    hash_value.split_whitespace().collect();
+                let hash_value = hash_value.join(" ");
+                hash_map.insert(full_target_str.to_string(), hash_value);
+            }
+        } else {
+            hash_map.insert(full_target_str.to_string(), "n/a".to_string());
+        }
+    }
+
+    for (k, v) in &hash_map {
+        return_message = return_message.replace(k, v);
+    }
+    return_message
+}
+
+/// Returns the messages for the given time
+pub fn get(time: DateTime<Utc>) -> Vec<DetectInfo> {
+    match MESSAGES.get(&time) {
+        Some(v) => v.to_vec(),
+        None => Vec::new(),
+    }
+}
+
+pub fn get_event_time(event_record: &Value) -> Option<DateTime<Utc>> {
+    let system_time = &event_record["Event"]["System"]["TimeCreated_attributes"]["SystemTime"];
+    return utils::str_time_to_datetime(system_time.as_str().unwrap_or(""));
+}
+
+/// Reads the default values for `details` from a file
+pub fn get_default_details(filepath: &str) -> HashMap<String, String> {
+    let read_result = utils::read_csv(filepath);
+    match read_result {
+        Err(_e) => {
+            AlertMessage::alert(&_e).ok();
+            HashMap::new()
+        }
+        Ok(lines) => {
+            let mut ret: HashMap<String, String> = HashMap::new();
+            lines
+                .into_iter()
+                .try_for_each(|line| -> Result<(), String> {
+                    let provider = match line.get(0) {
+                        Some(_provider) => _provider.trim(),
+                        _ => {
+                            return Result::Err(
+                                "Failed to read provider in default_details.txt.".to_string(),
+                            )
+                        }
+                    };
+                    let eid = match line.get(1) {
+                        Some(eid_str) => match eid_str.trim().parse::<i64>() {
+                            Ok(_eid) => _eid,
+                            _ => {
+                                return Result::Err(
+                                    "Parse Error EventID in default_details.txt.".to_string(),
+                                )
+                            }
+                        },
+                        _ => {
+                            return Result::Err(
+                                "Failed to read EventID in default_details.txt.".to_string(),
+                            )
+                        }
+                    };
+                    let details = match line.get(2) {
+                        Some(detail) => detail.trim(),
+                        _ => {
+                            return Result::Err(
+                                "Failed to read details in default_details.txt.".to_string(),
+                            )
+                        }
+                    };
+                    ret.insert(format!("{}_{}", provider, eid), details.to_string());
+                    Ok(())
+                })
+                .ok();
+            ret
+        }
+    }
+}
+
+impl AlertMessage {
+    /// After confirming that the target directory exists, writes the standard
+    /// opening line and the stacked error logs to the file
+    pub fn create_error_log(path_str: String) {
+        if
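The `%Alias[N]%` suffix handling in `parse_message` can be isolated into a small std-only sketch (hypothetical helper names; the real code uses `SUFFIXREGEX` and `serde_json::Value`): a trailing `[N]` selects the N-th element, 1-origin, of the record's `Data` array, and `-1` is the sentinel for "no suffix".

```rust
// Split "Data[2]" into ("Data", 2); anything without a valid "[N]" suffix
// keeps the -1 sentinel, mirroring the SUFFIXREGEX miss case.
fn split_suffix(target: &str) -> (&str, i64) {
    if let Some(open) = target.rfind('[') {
        if target.ends_with(']') {
            if let Ok(n) = target[open + 1..target.len() - 1].parse::<i64>() {
                return (&target[..open], n);
            }
        }
    }
    (target, -1)
}

// Resolve a 1-origin suffix against a Data array; out-of-range or
// suffix < 1 yields None (rendered as "n/a" in the real output).
fn pick<'a>(data: &'a [&'a str], suffix: i64) -> Option<&'a str> {
    if suffix >= 1 {
        data.get((suffix - 1) as usize).copied()
    } else {
        None
    }
}

fn main() {
    let data = ["data1", "data2", "data3"];
    assert_eq!(split_suffix("Data[2]"), ("Data", 2));
    assert_eq!(pick(&data, 2), Some("data2")); // %Data[2]% -> second element
    assert_eq!(pick(&data, 0), None);          // %Data[0]% -> no valid index
}
```

This matches the behavior exercised by the `test_parse_message_multiple_*` tests below: `%Data%` with no suffix prints the whole array, `%Data[2]%` prints `data2`, and `%Data[0]%` falls through to `n/a`.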
*QUIET_ERRORS_FLAG { + return; + } + let path = Path::new(&path_str); + if !path.parent().unwrap().exists() { + create_dir(path.parent().unwrap()).ok(); + } + let mut error_log_writer = BufWriter::new(File::create(path).unwrap()); + error_log_writer + .write_all( + format!( + "user input: {:?}\n", + format_args!("{}", env::args().collect::>().join(" ")) + ) + .as_bytes(), + ) + .ok(); + let error_logs = ERROR_LOG_STACK.lock().unwrap(); + error_logs.iter().for_each(|error_log| { + writeln!(error_log_writer, "{}", error_log).ok(); + }); + println!( + "Errors were generated. Please check {} for details.", + *ERROR_LOG_PATH + ); + println!(); + } + + /// ERRORメッセージを表示する関数 + pub fn alert(contents: &str) -> io::Result<()> { + write_color_buffer( + &BufferWriter::stderr(ColorChoice::Always), + None, + &format!("[ERROR] {}", contents), + true, + ) + } + + /// WARNメッセージを表示する関数 + pub fn warn(contents: &str) -> io::Result<()> { + write_color_buffer( + &BufferWriter::stderr(ColorChoice::Always), + None, + &format!("[WARN] {}", contents), + true, + ) + } +} + +#[cfg(test)] +mod tests { + use crate::detections::message::{get, insert_message, AlertMessage, DetectInfo}; + use crate::detections::message::{parse_message, MESSAGES}; + use chrono::Utc; + use hashbrown::HashMap; + use rand::Rng; + use serde_json::Value; + use std::thread; + use std::time::Duration; + + use super::{create_output_filter_config, get_default_details}; + + #[test] + fn test_error_message() { + let input = "TEST!"; + AlertMessage::alert(input).expect("[ERROR] TEST!"); + } + + #[test] + fn test_warn_message() { + let input = "TESTWarn!"; + AlertMessage::warn(input).expect("[WARN] TESTWarn!"); + } + + #[test] + /// outputで指定されているキー(eventkey_alias.txt内で設定済み)から対象のレコード内の情報でメッセージをパースしているか確認する関数 + fn test_parse_message() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "CommandLine": "parsetest1" + }, + "System": { + "Computer": "testcomputer1", + "TimeCreated_attributes": { + 
"SystemTime": "1996-02-27T01:05:01Z" + } + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "commandline:parsetest1 computername:testcomputer1"; + assert_eq!( + parse_message( + &event_record, + &"commandline:%CommandLine% computername:%ComputerName%".to_owned() + ), + expected, + ); + } + + #[test] + fn test_parse_message_auto_search() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "NoAlias": "no_alias" + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "alias:no_alias"; + assert_eq!( + parse_message(&event_record, &"alias:%NoAlias%".to_owned()), + expected, + ); + } + + #[test] + /// outputで指定されているキーが、eventkey_alias.txt内で設定されていない場合の出力テスト + fn test_parse_message_not_exist_key_in_output() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "CommandLine": "parsetest2" + }, + "System": { + "TimeCreated_attributes": { + "SystemTime": "1996-02-27T01:05:01Z" + } + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "NoExistAlias:n/a"; + assert_eq!( + parse_message(&event_record, &"NoExistAlias:%NoAliasNoHit%".to_owned()), + expected, + ); + } + #[test] + /// output test when no exist info in target record output and described key-value data in eventkey_alias.txt + fn test_parse_message_not_exist_value_in_record() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "CommandLine": "parsetest3" + }, + "System": { + "TimeCreated_attributes": { + "SystemTime": "1996-02-27T01:05:01Z" + } + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "commandline:parsetest3 computername:n/a"; + assert_eq!( + parse_message( + &event_record, + &"commandline:%CommandLine% computername:%ComputerName%".to_owned() + ), + expected, + ); + } + #[test] + /// output test when no exist info in target 
record output and described key-value data in eventkey_alias.txt + fn test_parse_message_multiple_no_suffix_in_record() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "CommandLine": "parsetest3", + "Data": [ + "data1", + "data2", + "data3" + ] + }, + "System": { + "TimeCreated_attributes": { + "SystemTime": "1996-02-27T01:05:01Z" + } + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "commandline:parsetest3 data:[\"data1\",\"data2\",\"data3\"]"; + assert_eq!( + parse_message( + &event_record, + &"commandline:%CommandLine% data:%Data%".to_owned() + ), + expected, + ); + } + #[test] + /// output test when no exist info in target record output and described key-value data in eventkey_alias.txt + fn test_parse_message_multiple_with_suffix_in_record() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "CommandLine": "parsetest3", + "Data": [ + "data1", + "data2", + "data3" + ] + }, + "System": { + "TimeCreated_attributes": { + "SystemTime": "1996-02-27T01:05:01Z" + } + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "commandline:parsetest3 data:data2"; + assert_eq!( + parse_message( + &event_record, + &"commandline:%CommandLine% data:%Data[2]%".to_owned() + ), + expected, + ); + } + #[test] + /// output test when no exist info in target record output and described key-value data in eventkey_alias.txt + fn test_parse_message_multiple_no_exist_in_record() { + MESSAGES.clear(); + let json_str = r##" + { + "Event": { + "EventData": { + "CommandLine": "parsetest3", + "Data": [ + "data1", + "data2", + "data3" + ] + }, + "System": { + "TimeCreated_attributes": { + "SystemTime": "1996-02-27T01:05:01Z" + } + } + } + } + "##; + let event_record: Value = serde_json::from_str(json_str).unwrap(); + let expected = "commandline:parsetest3 data:n/a"; + assert_eq!( + parse_message( + &event_record, + 
&"commandline:%CommandLine% data:%Data[0]%".to_owned() + ), + expected, + ); + } + #[test] + /// test of loading output filter config by mitre_tactics.txt + fn test_load_mitre_tactics_log() { + let actual = create_output_filter_config("test_files/config/mitre_tactics.txt"); + let expected: HashMap = HashMap::from([ + ("attack.impact".to_string(), "Impact".to_string()), + ("xxx".to_string(), "yyy".to_string()), + ]); + _check_hashmap_element(&expected, actual); + } + + #[test] + /// loading test to channel_abbrevations.txt + fn test_load_abbrevations() { + let actual = create_output_filter_config("test_files/config/channel_abbreviations.txt"); + let actual2 = create_output_filter_config("test_files/config/channel_abbreviations.txt"); + let expected: HashMap = HashMap::from([ + ("Security".to_string(), "Sec".to_string()), + ("xxx".to_string(), "yyy".to_string()), + ]); + _check_hashmap_element(&expected, actual); + _check_hashmap_element(&expected, actual2); + } + + #[test] + fn _get_default_defails() { + let expected: HashMap = HashMap::from([ + ("Microsoft-Windows-PowerShell_4104".to_string(),"%ScriptBlockText%".to_string()),("Microsoft-Windows-Security-Auditing_4624".to_string(), "User: %TargetUserName% | Comp: %WorkstationName% | IP Addr: %IpAddress% | LID: %TargetLogonId% | Process: %ProcessName%".to_string()), + ("Microsoft-Windows-Sysmon_1".to_string(), "Cmd: %CommandLine% | Process: %Image% | User: %User% | Parent Cmd: %ParentCommandLine% | LID: %LogonId% | PID: %ProcessId% | PGUID: %ProcessGuid%".to_string()), + ("Service Control Manager_7031".to_string(), "Svc: %param1% | Crash Count: %param2% | Action: %param5%".to_string()), + ]); + let actual = get_default_details("test_files/config/default_details.txt"); + _check_hashmap_element(&expected, actual); + } + + /// check two HashMap element length and value + fn _check_hashmap_element(expected: &HashMap, actual: HashMap) { + assert_eq!(expected.len(), actual.len()); + for (k, v) in expected.iter() { + 
assert!(actual.get(k).unwrap_or(&String::default()) == v); + } + } + + #[ignore] + #[test] + fn test_insert_message_race_condition() { + MESSAGES.clear(); + + // Setup test detect_info before starting threads. + let mut sample_detects = vec![]; + let mut rng = rand::thread_rng(); + let sample_event_time = Utc::now(); + for i in 1..2001 { + let detect_info = DetectInfo { + rulepath: "".to_string(), + ruletitle: "".to_string(), + level: "".to_string(), + computername: "".to_string(), + eventid: i.to_string(), + detail: "".to_string(), + record_information: None, + ext_field: Default::default(), + }; + sample_detects.push((sample_event_time, detect_info, rng.gen_range(0..10))); + } + + // Starting threads and randomly insert_message in parallel. + let mut handles = vec![]; + for (event_time, detect_info, random_num) in sample_detects { + let handle = thread::spawn(move || { + thread::sleep(Duration::from_micros(random_num)); + insert_message(detect_info, event_time); + }); + handles.push(handle); + } + + // Wait for all threads execution completion. 
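The race-condition test above (#639, #660) asserts that parallel calls to `insert_message` lose no detections. A std-only sketch of the same invariant, using `Arc<Mutex<HashMap>>` in place of the `DashMap` the real code adopted (smaller thread count to keep it quick):

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn n threads that all insert under the same event-time key, then
// return how many entries survived. The fixed code must retain all n;
// the old unsynchronized read-modify-write could drop some.
fn concurrent_insert(n: u32) -> usize {
    let map: Arc<Mutex<HashMap<u64, Vec<u32>>>> = Arc::new(Mutex::new(HashMap::new()));
    let mut handles = vec![];
    for i in 0..n {
        let map = Arc::clone(&map);
        handles.push(thread::spawn(move || {
            map.lock().unwrap().entry(0).or_default().push(i);
        }));
    }
    for h in handles {
        h.join().unwrap();
    }
    let guard = map.lock().unwrap();
    guard.get(&0).map(|v| v.len()).unwrap_or(0)
}

fn main() {
    // Mirrors `assert_eq!(get(sample_event_time).len(), 2000)` in the test.
    assert_eq!(concurrent_insert(200), 200);
}
```

`DashMap`'s `entry(...).or_default()` provides the same atomic insert-or-append per key without a single global lock, which is why the migration from `Mutex<Message>` to `DashMap` fixed the ~0.01% of dropped events.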
+ for handle in handles { + handle.join().unwrap(); + } + + // Expect all sample_detects to be included, but the len() size will be different each time I run it + assert_eq!(get(sample_event_time).len(), 2000) + } +} diff --git a/src/detections/mod.rs b/src/detections/mod.rs index 5a081dff..58bbafeb 100644 --- a/src/detections/mod.rs +++ b/src/detections/mod.rs @@ -1,6 +1,6 @@ pub mod configs; pub mod detection; +pub mod message; pub mod pivot; -pub mod print; pub mod rule; pub mod utils; diff --git a/src/detections/pivot.rs b/src/detections/pivot.rs index 040ab57e..af2b0f59 100644 --- a/src/detections/pivot.rs +++ b/src/detections/pivot.rs @@ -1,5 +1,4 @@ -use hashbrown::HashMap; -use hashbrown::HashSet; +use hashbrown::{HashMap, HashSet}; use lazy_static::lazy_static; use serde_json::Value; use std::sync::RwLock; diff --git a/src/detections/print.rs b/src/detections/print.rs deleted file mode 100644 index d74875d7..00000000 --- a/src/detections/print.rs +++ /dev/null @@ -1,760 +0,0 @@ -extern crate lazy_static; -use crate::detections::configs; -use crate::detections::utils; -use crate::detections::utils::get_serde_number_to_string; -use crate::detections::utils::write_color_buffer; -use chrono::{DateTime, Local, TimeZone, Utc}; -use hashbrown::HashMap; -use lazy_static::lazy_static; -use regex::Regex; -use serde_json::Value; -use std::collections::BTreeMap; -use std::env; -use std::fs::create_dir; -use std::fs::File; -use std::io::BufWriter; -use std::io::{self, Write}; -use std::path::Path; -use std::sync::Mutex; -use termcolor::{BufferWriter, ColorChoice}; - -#[derive(Debug)] -pub struct Message { - map: BTreeMap, Vec>, -} - -#[derive(Debug, Clone)] -pub struct DetectInfo { - pub filepath: String, - pub rulepath: String, - pub level: String, - pub computername: String, - pub eventid: String, - pub channel: String, - pub alert: String, - pub detail: String, - pub tag_info: String, - pub record_information: Option, - pub record_id: Option, -} - -pub struct 
AlertMessage {} - -lazy_static! { - pub static ref MESSAGES: Mutex = Mutex::new(Message::new()); - pub static ref ALIASREGEX: Regex = Regex::new(r"%[a-zA-Z0-9-_\[\]]+%").unwrap(); - pub static ref SUFFIXREGEX: Regex = Regex::new(r"\[([0-9]+)\]").unwrap(); - pub static ref ERROR_LOG_PATH: String = format!( - "./logs/errorlog-{}.log", - Local::now().format("%Y%m%d_%H%M%S") - ); - pub static ref QUIET_ERRORS_FLAG: bool = configs::CONFIG.read().unwrap().args.quiet_errors; - pub static ref ERROR_LOG_STACK: Mutex> = Mutex::new(Vec::new()); - pub static ref STATISTICS_FLAG: bool = configs::CONFIG.read().unwrap().args.statistics; - pub static ref LOGONSUMMARY_FLAG: bool = configs::CONFIG.read().unwrap().args.logon_summary; - pub static ref TAGS_CONFIG: HashMap = Message::create_output_filter_config( - "config/output_tag.txt", - true, - configs::CONFIG.read().unwrap().args.all_tags - ); - pub static ref CH_CONFIG: HashMap = Message::create_output_filter_config( - "config/channel_abbreviations.txt", - false, - configs::CONFIG.read().unwrap().args.all_tags - ); - pub static ref PIVOT_KEYWORD_LIST_FLAG: bool = - configs::CONFIG.read().unwrap().args.pivot_keywords_list; - pub static ref IS_HIDE_RECORD_ID: bool = configs::CONFIG.read().unwrap().args.hide_record_id; - pub static ref DEFAULT_DETAILS: HashMap = - Message::get_default_details(&format!( - "{}/default_details.txt", - configs::CONFIG - .read() - .unwrap() - .args - .config - .as_path() - .display() - )); -} - -impl Default for Message { - fn default() -> Self { - Self::new() - } -} - -impl Message { - pub fn new() -> Self { - let messages: BTreeMap, Vec> = BTreeMap::new(); - Message { map: messages } - } - - /// ファイルパスで記載されたtagでのフル名、表示の際に置き換えられる文字列のHashMapを作成する関数。 - /// ex. 
attack.impact,Impact - pub fn create_output_filter_config( - path: &str, - read_tags: bool, - pass_flag: bool, - ) -> HashMap { - let mut ret: HashMap = HashMap::new(); - if read_tags && pass_flag { - return ret; - } - let read_result = utils::read_csv(path); - if read_result.is_err() { - AlertMessage::alert(read_result.as_ref().unwrap_err()).ok(); - return HashMap::default(); - } - read_result.unwrap().into_iter().for_each(|line| { - if line.len() != 2 { - return; - } - - let empty = &"".to_string(); - let tag_full_str = line.get(0).unwrap_or(empty).trim(); - let tag_replace_str = line.get(1).unwrap_or(empty).trim(); - - ret.insert(tag_full_str.to_owned(), tag_replace_str.to_owned()); - }); - ret - } - - /// メッセージの設定を行う関数。aggcondition対応のためrecordではなく出力をする対象時間がDatetime形式での入力としている - pub fn insert_message(&mut self, detect_info: DetectInfo, event_time: DateTime) { - if let Some(v) = self.map.get_mut(&event_time) { - v.push(detect_info); - } else { - let m = vec![detect_info; 1]; - self.map.insert(event_time, m); - } - } - - /// メッセージを設定 - pub fn insert(&mut self, event_record: &Value, output: String, mut detect_info: DetectInfo) { - detect_info.detail = self.parse_message(event_record, output); - let default_time = Utc.ymd(1970, 1, 1).and_hms(0, 0, 0); - let time = Message::get_event_time(event_record).unwrap_or(default_time); - self.insert_message(detect_info, time) - } - - fn parse_message(&mut self, event_record: &Value, output: String) -> String { - let mut return_message: String = output; - let mut hash_map: HashMap = HashMap::new(); - for caps in ALIASREGEX.captures_iter(&return_message) { - let full_target_str = &caps[0]; - let target_length = full_target_str.chars().count() - 2; // The meaning of 2 is two percent - let target_str = full_target_str - .chars() - .skip(1) - .take(target_length) - .collect::(); - - let array_str = - if let Some(_array_str) = configs::EVENTKEY_ALIAS.get_event_key(&target_str) { - _array_str.to_string() - } else { - 
"Event.EventData.".to_owned() + &target_str - }; - - let split: Vec<&str> = array_str.split('.').collect(); - let mut tmp_event_record: &Value = event_record; - for s in &split { - if let Some(record) = tmp_event_record.get(s) { - tmp_event_record = record; - } - } - let suffix_match = SUFFIXREGEX.captures(&target_str); - let suffix: i64 = match suffix_match { - Some(cap) => cap.get(1).map_or(-1, |a| a.as_str().parse().unwrap_or(-1)), - None => -1, - }; - if suffix >= 1 { - tmp_event_record = tmp_event_record - .get("Data") - .unwrap() - .get((suffix - 1) as usize) - .unwrap_or(tmp_event_record); - } - let hash_value = get_serde_number_to_string(tmp_event_record); - if hash_value.is_some() { - if let Some(hash_value) = hash_value { - // UnicodeのWhitespace characterをそのままCSVに出力すると見難いので、スペースに変換する。なお、先頭と最後のWhitespace characterは単に削除される。 - let hash_value: Vec<&str> = hash_value.split_whitespace().collect(); - let hash_value = hash_value.join(" "); - hash_map.insert(full_target_str.to_string(), hash_value); - } - } else { - hash_map.insert(full_target_str.to_string(), "n/a".to_string()); - } - } - - for (k, v) in &hash_map { - return_message = return_message.replace(k, v); - } - - return_message - } - - /// メッセージを返す - pub fn get(&self, time: DateTime) -> Vec { - match self.map.get(&time) { - Some(v) => v.to_vec(), - None => Vec::new(), - } - } - - /// Messageのなかに入っているメッセージすべてを表示する - pub fn debug(&self) { - println!("{:?}", self.map); - } - - /// 最後に表示を行う - pub fn print(&self) { - let mut detect_count = 0; - for (key, detect_infos) in self.map.iter() { - for detect_info in detect_infos.iter() { - println!("{} <{}> {}", key, detect_info.alert, detect_info.detail); - } - detect_count += detect_infos.len(); - } - println!(); - println!("Total events:{:?}", detect_count); - } - - pub fn iter(&self) -> &BTreeMap, Vec> { - &self.map - } - - pub fn get_event_time(event_record: &Value) -> Option> { - let system_time = 
&event_record["Event"]["System"]["TimeCreated_attributes"]["SystemTime"]; - return utils::str_time_to_datetime(system_time.as_str().unwrap_or("")); - } - - /// message内のマップをクリアする。テストする際の冪等性の担保のため作成。 - pub fn clear(&mut self) { - self.map.clear(); - } - - /// detailsのdefault値をファイルから読み取る関数 - pub fn get_default_details(filepath: &str) -> HashMap { - let read_result = utils::read_csv(filepath); - match read_result { - Err(_e) => { - AlertMessage::alert(&_e).ok(); - HashMap::new() - } - Ok(lines) => { - let mut ret: HashMap = HashMap::new(); - lines - .into_iter() - .try_for_each(|line| -> Result<(), String> { - let provider = match line.get(0) { - Some(_provider) => _provider.trim(), - _ => { - return Result::Err( - "Failed to read provider in default_details.txt.".to_string(), - ) - } - }; - let eid = match line.get(1) { - Some(eid_str) => match eid_str.trim().parse::() { - Ok(_eid) => _eid, - _ => { - return Result::Err( - "Parse Error EventID in default_details.txt.".to_string(), - ) - } - }, - _ => { - return Result::Err( - "Failed to read EventID in default_details.txt.".to_string(), - ) - } - }; - let details = match line.get(2) { - Some(detail) => detail.trim(), - _ => { - return Result::Err( - "Failed to read details in default_details.txt.".to_string(), - ) - } - }; - ret.insert(format!("{}_{}", provider, eid), details.to_string()); - Ok(()) - }) - .ok(); - ret - } - } - } -} - -impl AlertMessage { - ///対象のディレクトリが存在することを確認後、最初の定型文を追加して、ファイルのbufwriterを返す関数 - pub fn create_error_log(path_str: String) { - if *QUIET_ERRORS_FLAG { - return; - } - let path = Path::new(&path_str); - if !path.parent().unwrap().exists() { - create_dir(path.parent().unwrap()).ok(); - } - let mut error_log_writer = BufWriter::new(File::create(path).unwrap()); - error_log_writer - .write_all( - format!( - "user input: {:?}\n", - format_args!("{}", env::args().collect::>().join(" ")) - ) - .as_bytes(), - ) - .ok(); - let error_logs = ERROR_LOG_STACK.lock().unwrap(); - 
error_logs.iter().for_each(|error_log| { - writeln!(error_log_writer, "{}", error_log).ok(); - }); - println!( - "Errors were generated. Please check {} for details.", - *ERROR_LOG_PATH - ); - println!(); - } - - /// ERRORメッセージを表示する関数 - pub fn alert(contents: &str) -> io::Result<()> { - write_color_buffer( - BufferWriter::stderr(ColorChoice::Always), - None, - &format!("[ERROR] {}", contents), - ) - } - - /// WARNメッセージを表示する関数 - pub fn warn(contents: &str) -> io::Result<()> { - write_color_buffer( - BufferWriter::stderr(ColorChoice::Always), - None, - &format!("[WARN] {}", contents), - ) - } -} - -#[cfg(test)] -mod tests { - use crate::detections::print::DetectInfo; - use crate::detections::print::{AlertMessage, Message}; - use hashbrown::HashMap; - use serde_json::Value; - - #[test] - fn test_create_and_append_message() { - let mut message = Message::new(); - let json_str_1 = r##" - { - "Event": { - "EventData": { - "CommandLine": "hoge" - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record_1: Value = serde_json::from_str(json_str_1).unwrap(); - message.insert( - &event_record_1, - "CommandLine1: %CommandLine%".to_string(), - DetectInfo { - filepath: "a".to_string(), - rulepath: "test_rule".to_string(), - level: "high".to_string(), - computername: "testcomputer1".to_string(), - eventid: "1".to_string(), - channel: String::default(), - alert: "test1".to_string(), - detail: String::default(), - tag_info: "txxx.001".to_string(), - record_information: Option::Some("record_information1".to_string()), - record_id: Option::Some("11111".to_string()), - }, - ); - - let json_str_2 = r##" - { - "Event": { - "EventData": { - "CommandLine": "hoge" - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record_2: Value = serde_json::from_str(json_str_2).unwrap(); - message.insert( - &event_record_2, - "CommandLine2: 
%CommandLine%".to_string(), - DetectInfo { - filepath: "a".to_string(), - rulepath: "test_rule2".to_string(), - level: "high".to_string(), - computername: "testcomputer2".to_string(), - eventid: "2".to_string(), - channel: String::default(), - alert: "test2".to_string(), - detail: String::default(), - tag_info: "txxx.002".to_string(), - record_information: Option::Some("record_information2".to_string()), - record_id: Option::Some("22222".to_string()), - }, - ); - - let json_str_3 = r##" - { - "Event": { - "EventData": { - "CommandLine": "hoge" - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "2000-01-21T09:06:01Z" - } - } - } - } - "##; - let event_record_3: Value = serde_json::from_str(json_str_3).unwrap(); - message.insert( - &event_record_3, - "CommandLine3: %CommandLine%".to_string(), - DetectInfo { - filepath: "a".to_string(), - rulepath: "test_rule3".to_string(), - level: "high".to_string(), - computername: "testcomputer3".to_string(), - eventid: "3".to_string(), - channel: String::default(), - alert: "test3".to_string(), - detail: String::default(), - tag_info: "txxx.003".to_string(), - record_information: Option::Some("record_information3".to_string()), - record_id: Option::Some("33333".to_string()), - }, - ); - - let json_str_4 = r##" - { - "Event": { - "EventData": { - "CommandLine": "hoge" - } - } - } - "##; - let event_record_4: Value = serde_json::from_str(json_str_4).unwrap(); - message.insert( - &event_record_4, - "CommandLine4: %CommandLine%".to_string(), - DetectInfo { - filepath: "a".to_string(), - rulepath: "test_rule4".to_string(), - level: "medium".to_string(), - computername: "testcomputer4".to_string(), - eventid: "4".to_string(), - channel: String::default(), - alert: "test4".to_string(), - detail: String::default(), - tag_info: "txxx.004".to_string(), - record_information: Option::Some("record_information4".to_string()), - record_id: Option::None, - }, - ); - - let display = format!("{}", format_args!("{:?}", message)); - 
println!("display::::{}", display); - let expect = "Message { map: {1970-01-01T00:00:00Z: [DetectInfo { filepath: \"a\", rulepath: \"test_rule4\", level: \"medium\", computername: \"testcomputer4\", eventid: \"4\", channel: \"\", alert: \"test4\", detail: \"CommandLine4: hoge\", tag_info: \"txxx.004\", record_information: Some(\"record_information4\"), record_id: None }], 1996-02-27T01:05:01Z: [DetectInfo { filepath: \"a\", rulepath: \"test_rule\", level: \"high\", computername: \"testcomputer1\", eventid: \"1\", channel: \"\", alert: \"test1\", detail: \"CommandLine1: hoge\", tag_info: \"txxx.001\", record_information: Some(\"record_information1\"), record_id: Some(\"11111\") }, DetectInfo { filepath: \"a\", rulepath: \"test_rule2\", level: \"high\", computername: \"testcomputer2\", eventid: \"2\", channel: \"\", alert: \"test2\", detail: \"CommandLine2: hoge\", tag_info: \"txxx.002\", record_information: Some(\"record_information2\"), record_id: Some(\"22222\") }], 2000-01-21T09:06:01Z: [DetectInfo { filepath: \"a\", rulepath: \"test_rule3\", level: \"high\", computername: \"testcomputer3\", eventid: \"3\", channel: \"\", alert: \"test3\", detail: \"CommandLine3: hoge\", tag_info: \"txxx.003\", record_information: Some(\"record_information3\"), record_id: Some(\"33333\") }]} }"; - assert_eq!(display, expect); - } - - #[test] - fn test_error_message() { - let input = "TEST!"; - AlertMessage::alert(input).expect("[ERROR] TEST!"); - } - - #[test] - fn test_warn_message() { - let input = "TESTWarn!"; - AlertMessage::warn(input).expect("[WARN] TESTWarn!"); - } - - #[test] - /// outputで指定されているキー(eventkey_alias.txt内で設定済み)から対象のレコード内の情報でメッセージをパースしているか確認する関数 - fn test_parse_message() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "CommandLine": "parsetest1" - }, - "System": { - "Computer": "testcomputer1", - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record: Value = 
serde_json::from_str(json_str).unwrap(); - let expected = "commandline:parsetest1 computername:testcomputer1"; - assert_eq!( - message.parse_message( - &event_record, - "commandline:%CommandLine% computername:%ComputerName%".to_owned() - ), - expected, - ); - } - - #[test] - fn test_parse_message_auto_search() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "NoAlias": "no_alias" - } - } - } - "##; - let event_record: Value = serde_json::from_str(json_str).unwrap(); - let expected = "alias:no_alias"; - assert_eq!( - message.parse_message(&event_record, "alias:%NoAlias%".to_owned()), - expected, - ); - } - - #[test] - /// outputで指定されているキーが、eventkey_alias.txt内で設定されていない場合の出力テスト - fn test_parse_message_not_exist_key_in_output() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "CommandLine": "parsetest2" - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record: Value = serde_json::from_str(json_str).unwrap(); - let expected = "NoExistAlias:n/a"; - assert_eq!( - message.parse_message(&event_record, "NoExistAlias:%NoAliasNoHit%".to_owned()), - expected, - ); - } - #[test] - /// output test when no exist info in target record output and described key-value data in eventkey_alias.txt - fn test_parse_message_not_exist_value_in_record() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "CommandLine": "parsetest3" - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record: Value = serde_json::from_str(json_str).unwrap(); - let expected = "commandline:parsetest3 computername:n/a"; - assert_eq!( - message.parse_message( - &event_record, - "commandline:%CommandLine% computername:%ComputerName%".to_owned() - ), - expected, - ); - } - #[test] - /// output test when no exist info in target record output 
and described key-value data in eventkey_alias.txt - fn test_parse_message_multiple_no_suffix_in_record() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "CommandLine": "parsetest3", - "Data": [ - "data1", - "data2", - "data3" - ] - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record: Value = serde_json::from_str(json_str).unwrap(); - let expected = "commandline:parsetest3 data:[\"data1\",\"data2\",\"data3\"]"; - assert_eq!( - message.parse_message( - &event_record, - "commandline:%CommandLine% data:%Data%".to_owned() - ), - expected, - ); - } - #[test] - /// output test when no exist info in target record output and described key-value data in eventkey_alias.txt - fn test_parse_message_multiple_with_suffix_in_record() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "CommandLine": "parsetest3", - "Data": [ - "data1", - "data2", - "data3" - ] - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record: Value = serde_json::from_str(json_str).unwrap(); - let expected = "commandline:parsetest3 data:data2"; - assert_eq!( - message.parse_message( - &event_record, - "commandline:%CommandLine% data:%Data[2]%".to_owned() - ), - expected, - ); - } - #[test] - /// output test when no exist info in target record output and described key-value data in eventkey_alias.txt - fn test_parse_message_multiple_no_exist_in_record() { - let mut message = Message::new(); - let json_str = r##" - { - "Event": { - "EventData": { - "CommandLine": "parsetest3", - "Data": [ - "data1", - "data2", - "data3" - ] - }, - "System": { - "TimeCreated_attributes": { - "SystemTime": "1996-02-27T01:05:01Z" - } - } - } - } - "##; - let event_record: Value = serde_json::from_str(json_str).unwrap(); - let expected = "commandline:parsetest3 data:n/a"; - assert_eq!( - 
message.parse_message( - &event_record, - "commandline:%CommandLine% data:%Data[0]%".to_owned() - ), - expected, - ); - } - #[test] - /// test of loading output filter config by output_tag.txt - fn test_load_output_tag() { - let actual = - Message::create_output_filter_config("test_files/config/output_tag.txt", true, false); - let expected: HashMap = HashMap::from([ - ("attack.impact".to_string(), "Impact".to_string()), - ("xxx".to_string(), "yyy".to_string()), - ]); - _check_hashmap_element(&expected, actual); - } - - #[test] - /// test of loading pass by output_tag.txt - fn test_no_load_output_tag() { - let actual = - Message::create_output_filter_config("test_files/config/output_tag.txt", true, true); - let expected: HashMap = HashMap::new(); - _check_hashmap_element(&expected, actual); - } - - #[test] - /// loading test to channel_abbrevations.txt - fn test_load_abbrevations() { - let actual = Message::create_output_filter_config( - "test_files/config/channel_abbreviations.txt", - false, - true, - ); - let actual2 = Message::create_output_filter_config( - "test_files/config/channel_abbreviations.txt", - false, - false, - ); - let expected: HashMap = HashMap::from([ - ("Security".to_string(), "Sec".to_string()), - ("xxx".to_string(), "yyy".to_string()), - ]); - _check_hashmap_element(&expected, actual); - _check_hashmap_element(&expected, actual2); - } - - #[test] - fn _get_default_defails() { - let expected: HashMap = HashMap::from([ - ("Microsoft-Windows-PowerShell_4104".to_string(),"%ScriptBlockText%".to_string()),("Microsoft-Windows-Security-Auditing_4624".to_string(), "User: %TargetUserName% | Comp: %WorkstationName% | IP Addr: %IpAddress% | LID: %TargetLogonId% | Process: %ProcessName%".to_string()), - ("Microsoft-Windows-Sysmon_1".to_string(), "Cmd: %CommandLine% | Process: %Image% | User: %User% | Parent Cmd: %ParentCommandLine% | LID: %LogonId% | PID: %ProcessId% | PGUID: %ProcessGuid%".to_string()), - ("Service Control Manager_7031".to_string(), "Svc: 
%param1% | Crash Count: %param2% | Action: %param5%".to_string()), - ]); - let actual = Message::get_default_details("test_files/config/default_details.txt"); - _check_hashmap_element(&expected, actual); - } - - /// check two HashMap element length and value - fn _check_hashmap_element(expected: &HashMap, actual: HashMap) { - assert_eq!(expected.len(), actual.len()); - for (k, v) in expected.iter() { - assert!(actual.get(k).unwrap_or(&String::default()) == v); - } - } -} diff --git a/src/detections/rule/count.rs b/src/detections/rule/count.rs index 3f32f028..c6778934 100644 --- a/src/detections/rule/count.rs +++ b/src/detections/rule/count.rs @@ -1,9 +1,9 @@ use crate::detections::configs; -use crate::detections::print::AlertMessage; -use crate::detections::print::ERROR_LOG_STACK; -use crate::detections::print::QUIET_ERRORS_FLAG; +use crate::detections::message; +use crate::detections::message::AlertMessage; +use crate::detections::message::ERROR_LOG_STACK; +use crate::detections::message::QUIET_ERRORS_FLAG; use crate::detections::rule::AggResult; -use crate::detections::rule::Message; use crate::detections::rule::RuleNode; use chrono::{DateTime, TimeZone, Utc}; use hashbrown::HashMap; @@ -33,7 +33,7 @@ pub fn count(rule: &mut RuleNode, record: &Value) { rule, key, field_value, - Message::get_event_time(record).unwrap_or(default_time), + message::get_event_time(record).unwrap_or(default_time), ); } diff --git a/src/detections/rule/matchers.rs b/src/detections/rule/matchers.rs index 5ed1a8c8..8dc94bc1 100644 --- a/src/detections/rule/matchers.rs +++ b/src/detections/rule/matchers.rs @@ -218,7 +218,7 @@ impl DefaultMatcher { }); } - /// YEAのルールファイルのフィールド名とそれに続いて指定されるパイプを、正規表現形式の文字列に変換します。 + /// Hayabusaのルールファイルのフィールド名とそれに続いて指定されるパイプを、正規表現形式の文字列に変換します。 /// ワイルドカードの文字列を正規表現にする処理もこのメソッドに実装されています。patternにワイルドカードの文字列を指定して、pipesにPipeElement::Wildcardを指定すればOK!! 
fn from_pattern_to_regex_str(pattern: String, pipes: &[PipeElement]) -> String { // パターンをPipeで処理する。 @@ -346,6 +346,17 @@ impl LeafMatcher for DefaultMatcher { return false; } + // yamlにnullが設定されていた場合 + if self.re.is_none() { + // レコード内に対象のフィールドが存在しなければ検知したものとして扱う + for v in self.key_list.iter() { + if recinfo.get_value(v).is_none() { + return true; + } + } + return false; + } + if event_value.is_none() { return false; } @@ -353,7 +364,7 @@ impl LeafMatcher for DefaultMatcher { let event_value_str = event_value.unwrap(); if self.key_list.is_empty() { // この場合ただのgrep検索なので、ただ正規表現に一致するかどうか調べればよいだけ - return self.re.as_ref().unwrap().is_match(event_value_str); + self.re.as_ref().unwrap().is_match(event_value_str) } else { // 通常の検索はこっち self.is_regex_fullmatch(event_value_str) @@ -523,8 +534,8 @@ mod tests { - ホスト アプリケーション ImagePath: min_length: 1234321 - regexes: ./rules/config/regex/detectlist_suspicous_services.txt - allowlist: ./rules/config/regex/allowlist_legitimate_services.txt + regexes: ./../../../rules/config/regex/detectlist_suspicous_services.txt + allowlist: ./../../../rules/config/regex/allowlist_legitimate_services.txt falsepositives: - unknown level: medium @@ -1111,7 +1122,7 @@ mod tests { selection: EventID: 4103 Channel: - - allowlist: ./rules/config/regex/allowlist_legitimate_services.txt + - allowlist: ./../../../rules/config/regex/allowlist_legitimate_services.txt details: 'command=%CommandLine%' "#; @@ -1145,7 +1156,7 @@ mod tests { selection: EventID: 4103 Channel: - - allowlist: ./rules/config/regex/allowlist_legitimate_services.txt + - allowlist: ./../../../rules/config/regex/allowlist_legitimate_services.txt details: 'command=%CommandLine%' "#; @@ -1179,7 +1190,7 @@ mod tests { selection: EventID: 4103 Channel: - - allowlist: ./rules/config/regex/allowlist_legitimate_services.txt + - allowlist: ./../../../rules/config/regex/allowlist_legitimate_services.txt details: 'command=%CommandLine%' "#; @@ -1286,6 +1297,48 @@ mod tests { } } + #[test] + fn 
test_detect_startswith_case_insensitive() { + // startswithが大文字小文字を区別しないことを確認 + let rule_str = r#" + enabled: true + detection: + selection: + Channel: Security + EventID: 4732 + TargetUserName|startswith: "ADMINISTRATORS" + details: 'user added to local Administrators UserName: %MemberName% SID: %MemberSid%' + "#; + + let record_json_str = r#" + { + "Event": { + "System": { + "EventID": 4732, + "Channel": "Security" + }, + "EventData": { + "TargetUserName": "TestAdministrators" + } + }, + "Event_attributes": { + "xmlns": "http://schemas.microsoft.com/win/2004/08/events/event" + } + }"#; + + let mut rule_node = parse_rule_from_str(rule_str); + match serde_json::from_str(record_json_str) { + Ok(record) => { + let keys = detections::rule::get_detection_keys(&rule_node); + let recinfo = utils::create_rec_info(record, "testpath".to_owned(), &keys); + assert!(!rule_node.select(&recinfo)); + } + Err(_rec) => { + panic!("Failed to parse json record."); + } + } + } + #[test] fn test_detect_endswith1() { // endswithが正しく検知できることを確認 @@ -1370,6 +1423,48 @@ mod tests { } } + #[test] + fn test_detect_endswith_case_insensitive() { + // endswithが大文字小文字を区別せず検知するかを確認するテスト + let rule_str = r#" + enabled: true + detection: + selection: + Channel: Security + EventID: 4732 + TargetUserName|endswith: "ADministRATORS" + details: 'user added to local Administrators UserName: %MemberName% SID: %MemberSid%' + "#; + + let record_json_str = r#" + { + "Event": { + "System": { + "EventID": 4732, + "Channel": "Security" + }, + "EventData": { + "TargetUserName": "AdministratorsTest" + } + }, + "Event_attributes": { + "xmlns": "http://schemas.microsoft.com/win/2004/08/events/event" + } + }"#; + + let mut rule_node = parse_rule_from_str(rule_str); + match serde_json::from_str(record_json_str) { + Ok(record) => { + let keys = detections::rule::get_detection_keys(&rule_node); + let recinfo = utils::create_rec_info(record, "testpath".to_owned(), &keys); + assert!(!rule_node.select(&recinfo)); + } + 
Err(_rec) => { + panic!("Failed to parse json record."); + } + } + } + #[test] fn test_detect_contains1() { // containsが正しく検知できることを確認 @@ -1454,6 +1549,48 @@ mod tests { } } + #[test] + fn test_detect_contains_case_insensitive() { + // containsが大文字小文字を区別せずに検知することを確認するテスト + let rule_str = r#" + enabled: true + detection: + selection: + Channel: Security + EventID: 4732 + TargetUserName|contains: "ADminIstraTOrS" + details: 'user added to local Administrators UserName: %MemberName% SID: %MemberSid%' + "#; + + let record_json_str = r#" + { + "Event": { + "System": { + "EventID": 4732, + "Channel": "Security" + }, + "EventData": { + "TargetUserName": "Testministrators" + } + }, + "Event_attributes": { + "xmlns": "http://schemas.microsoft.com/win/2004/08/events/event" + } + }"#; + + let mut rule_node = parse_rule_from_str(rule_str); + match serde_json::from_str(record_json_str) { + Ok(record) => { + let keys = detections::rule::get_detection_keys(&rule_node); + let recinfo = utils::create_rec_info(record, "testpath".to_owned(), &keys); + assert!(!rule_node.select(&recinfo)); + } + Err(_rec) => { + panic!("Failed to parse json record."); + } + } + } + #[test] fn test_detect_wildcard_multibyte() { // multi byteの確認 @@ -1858,4 +1995,65 @@ mod tests { } } } + + #[test] + fn test_eq_field_null() { + // 値でnullであった場合に対象のフィールドが存在しないことを確認 + let rule_str = r#" + enabled: true + detection: + selection: + Channel: + value: Security + Takoyaki: + value: null + details: 'command=%CommandLine%' + "#; + + let record_json_str = r#" + { + "Event": {"System": {"EventID": 4103, "Channel": "Security", "Computer": "Powershell" }}, + "Event_attributes": {"xmlns": "http://schemas.microsoft.com/win/2004/08/events/event"} + }"#; + + let mut rule_node = parse_rule_from_str(rule_str); + match serde_json::from_str(record_json_str) { + Ok(record) => { + let keys = detections::rule::get_detection_keys(&rule_node); + let recinfo = utils::create_rec_info(record, "testpath".to_owned(), &keys); + 
assert!(rule_node.select(&recinfo)); + } + Err(_) => { + panic!("Failed to parse json record."); + } + } + } + #[test] + fn test_eq_field_null_not_detect() { + // 値でnullであった場合に対象のフィールドが存在しないことを確認するテスト + let rule_str = r#" + enabled: true + detection: + selection: + EventID: null + details: 'command=%CommandLine%' + "#; + + let record_json_str = r#"{ + "Event": {"System": {"EventID": 4103, "Channel": "Security", "Computer": "Powershell"}}, + "Event_attributes": {"xmlns": "http://schemas.microsoft.com/win/2004/08/events/event"} + }"#; + + let mut rule_node = parse_rule_from_str(rule_str); + match serde_json::from_str(record_json_str) { + Ok(record) => { + let keys = detections::rule::get_detection_keys(&rule_node); + let recinfo = utils::create_rec_info(record, "testpath".to_owned(), &keys); + assert!(!rule_node.select(&recinfo)); + } + Err(e) => { + panic!("Failed to parse json record.{:?}", e); + } + } + } } diff --git a/src/detections/rule/mod.rs b/src/detections/rule/mod.rs index cfa1173b..60f55011 100644 --- a/src/detections/rule/mod.rs +++ b/src/detections/rule/mod.rs @@ -1,5 +1,4 @@ extern crate regex; -use crate::detections::print::Message; use chrono::{DateTime, Utc}; diff --git a/src/detections/utils.rs b/src/detections/utils.rs index 7f20781b..d678bb07 100644 --- a/src/detections/utils.rs +++ b/src/detections/utils.rs @@ -3,6 +3,12 @@ extern crate csv; extern crate regex; use crate::detections::configs; +use crate::detections::configs::CURRENT_EXE_PATH; +use hashbrown::HashMap; +use std::path::Path; +use std::path::PathBuf; + +use chrono::Local; use termcolor::Color; use tokio::runtime::Builder; @@ -66,7 +72,15 @@ pub fn value_to_string(value: &Value) -> Option { } pub fn read_txt(filename: &str) -> Result, String> { - let f = File::open(filename); + let filepath = if filename.starts_with("./") { + check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), filename) + .to_str() + .unwrap() + .to_string() + } else { + filename.to_string() + }; + let f = 
File::open(filepath);
     if f.is_err() {
         let errmsg = format!("Cannot open file. [file:{}]", filename);
         return Result::Err(errmsg);
     }
@@ -206,8 +220,8 @@ pub fn create_rec_info(data: Value, path: String, keys: &[String]) -> EvtxRecord
     // この処理を高速化するため、rec.key_2_valueというhashmapに"Event.System.EventID"というキーで値を設定しておく。
     // これなら、"Event.System.EventID"というキーを1回指定するだけで値を取得できるようになるので、高速化されるはず。
     // あと、serde_jsonのValueからvalue["Event"]みたいな感じで値を取得する処理がなんか遅いので、そういう意味でも早くなるかも
-    // それと、serde_jsonでは内部的に標準ライブラリのhashmapを使用しているが、hashbrownを使った方が早くなるらしい。
-    let mut key_2_values = hashbrown::HashMap::new();
+    // それと、serde_jsonでは内部的に標準ライブラリのhashmapを使用しているが、hashbrownを使った方が早くなるらしい。標準ライブラリがhashbrownを採用したためserde_jsonについても高速化した。
+    let mut key_2_values = HashMap::new();
     for key in keys {
         let val = get_event_value(key, &data);
         if val.is_none() {
@@ -224,11 +238,8 @@ pub fn create_rec_info(data: Value, path: String, keys: &[String]) -> EvtxRecord
     // EvtxRecordInfoを作る
     let data_str = data.to_string();
-    let rec_info = if configs::CONFIG.read().unwrap().args.full_data {
-        Option::Some(create_recordinfos(&data))
-    } else {
-        Option::None
-    };
+    let rec_info = Option::Some(create_recordinfos(&data));
+
     EvtxRecordInfo {
         evtx_filepath: path,
         record: data,
@@ -242,16 +253,30 @@ pub fn create_rec_info(data: Value, path: String, keys: &[String]) -> EvtxRecord
  * 標準出力のカラー出力設定を指定した値に変更し画面出力を行う関数
  */
 pub fn write_color_buffer(
-    wtr: BufferWriter,
+    wtr: &BufferWriter,
     color: Option<Color>,
     output_str: &str,
+    newline_flag: bool,
 ) -> io::Result<()> {
     let mut buf = wtr.buffer();
     buf.set_color(ColorSpec::new().set_fg(color)).ok();
-    writeln!(buf, "{}", output_str).ok();
+    if newline_flag {
+        writeln!(buf, "{}", output_str).ok();
+    } else {
+        write!(buf, "{}", output_str).ok();
+    }
     wtr.print(&buf)
 }
 
+/// no-colorのオプションの指定があるかを確認し、指定されている場合はNoneをかえし、指定されていない場合は引数で指定されたColorをSomeでラップして返す関数
+pub fn get_writable_color(color: Option<Color>) -> Option<Color> {
+    if configs::CONFIG.read().unwrap().args.no_color {
+        None
+    } else {
+        color
+    }
+}
+
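The pattern in this hunk is a two-step pipeline: `get_writable_color` maps a requested color to `None` whenever `--no-color` is set, and `write_color_buffer` honors the resulting `Option` plus the new `newline_flag`. A minimal std-only sketch of that pattern, assuming raw ANSI escape strings stand in for the termcolor `Color`/`BufferWriter` types and a plain `bool` for the global `--no-color` flag (both are assumptions for illustration, not Hayabusa's actual API):

```rust
// Stand-in for get_writable_color: --no-color wins over any requested color.
fn get_writable_color(color: Option<&'static str>, no_color: bool) -> Option<&'static str> {
    if no_color {
        None
    } else {
        color
    }
}

// Stand-in for write_color_buffer: wraps the text in an ANSI escape when a
// color is supplied, and appends a newline only when newline_flag is set.
fn write_color_buffer(buf: &mut String, color: Option<&'static str>, output_str: &str, newline_flag: bool) {
    match color {
        Some(c) => {
            buf.push_str(c);
            buf.push_str(output_str);
            buf.push_str("\x1b[0m"); // reset attributes after the colored span
        }
        None => buf.push_str(output_str),
    }
    if newline_flag {
        buf.push('\n');
    }
}

fn main() {
    // Color requested but suppressed by --no-color: plain text is emitted.
    let mut out = String::new();
    let color = get_writable_color(Some("\x1b[32m"), true);
    write_color_buffer(&mut out, color, "Elapsed Time: 00:00:01", true);
    assert_eq!(out, "Elapsed Time: 00:00:01\n");

    // Color allowed and newline_flag false: the escape wraps the bare text.
    let mut colored = String::new();
    write_color_buffer(&mut colored, get_writable_color(Some("\x1b[32m"), false), "ok", false);
    assert_eq!(colored, "\x1b[32mok\x1b[0m");
}
```

Centralizing the `--no-color` check in one helper is what lets every call site in main.rs pass a desired color unconditionally, as the later hunks in this diff do.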
 /**
  * CSVのrecord infoカラムに出力する文字列を作る
  */
@@ -354,9 +379,72 @@ pub fn make_ascii_titlecase(s: &mut str) -> String {
     }
 }
 
+/// base_path/path が存在するかを確認し、存在しなければカレントディレクトリを参照するpathを返す関数
+pub fn check_setting_path(base_path: &Path, path: &str) -> PathBuf {
+    if base_path.join(path).exists() {
+        base_path.join(path)
+    } else {
+        Path::new(path).to_path_buf()
+    }
+}
+
+/// タイムゾーンに合わせた時刻情報を取得する関数
+pub fn format_time(time: &DateTime<Utc>, date_only: bool) -> String {
+    if configs::CONFIG.read().unwrap().args.utc {
+        format_rfc(time, date_only)
+    } else {
+        format_rfc(&time.with_timezone(&Local), date_only)
+    }
+}
+
+/// return an RFC-style time format string according to the time-format options
+fn format_rfc<Tz: TimeZone>(time: &DateTime<Tz>, date_only: bool) -> String
+where
+    Tz::Offset: std::fmt::Display,
+{
+    let time_args = &configs::CONFIG.read().unwrap().args;
+    if time_args.rfc_2822 {
+        if date_only {
+            time.format("%a, %e %b %Y").to_string()
+        } else {
+            time.format("%a, %e %b %Y %H:%M:%S %:z").to_string()
+        }
+    } else if time_args.rfc_3339 {
+        if date_only {
+            time.format("%Y-%m-%d").to_string()
+        } else {
+            time.format("%Y-%m-%d %H:%M:%S%.6f%:z").to_string()
+        }
+    } else if time_args.us_time {
+        if date_only {
+            time.format("%m-%d-%Y").to_string()
+        } else {
+            time.format("%m-%d-%Y %I:%M:%S%.3f %p %:z").to_string()
+        }
+    } else if time_args.us_military_time {
+        if date_only {
+            time.format("%m-%d-%Y").to_string()
+        } else {
+            time.format("%m-%d-%Y %H:%M:%S%.3f %:z").to_string()
+        }
+    } else if time_args.european_time {
+        if date_only {
+            time.format("%d-%m-%Y").to_string()
+        } else {
+            time.format("%d-%m-%Y %H:%M:%S%.3f %:z").to_string()
+        }
+    } else if date_only {
+        time.format("%Y-%m-%d").to_string()
+    } else {
+        time.format("%Y-%m-%d %H:%M:%S%.3f %:z").to_string()
+    }
+}
+
 #[cfg(test)]
 mod tests {
-    use crate::detections::utils::{self, make_ascii_titlecase};
+    use std::path::Path;
+
+    use crate::detections::utils::{self, check_setting_path, make_ascii_titlecase};
 
     use regex::Regex;
     use serde_json::Value;
@@ -423,7
+511,7 @@ mod tests { #[test] fn test_check_regex() { let regexes: Vec = - utils::read_txt("./rules/config/regex/detectlist_suspicous_services.txt") + utils::read_txt("./../../../rules/config/regex/detectlist_suspicous_services.txt") .unwrap() .into_iter() .map(|regex_str| Regex::new(®ex_str).unwrap()) @@ -439,7 +527,7 @@ mod tests { fn test_check_allowlist() { let commandline = "\"C:\\Program Files\\Google\\Update\\GoogleUpdate.exe\""; let allowlist: Vec = - utils::read_txt("./rules/config/regex/allowlist_legitimate_services.txt") + utils::read_txt("./../../../rules/config/regex/allowlist_legitimate_services.txt") .unwrap() .into_iter() .map(|allow_str| Regex::new(&allow_str).unwrap()) @@ -518,4 +606,31 @@ mod tests { ); assert_eq!(make_ascii_titlecase("β".to_string().as_mut()), "β"); } + + #[test] + /// 与えられたパスからファイルの存在確認ができているかのテスト + fn test_check_setting_path() { + let exist_path = Path::new("./test_files").to_path_buf(); + let not_exist_path = Path::new("not_exist_path").to_path_buf(); + assert_eq!( + check_setting_path(¬_exist_path, "rules") + .to_str() + .unwrap(), + "rules" + ); + assert_eq!( + check_setting_path(¬_exist_path, "fake") + .to_str() + .unwrap(), + "fake" + ); + assert_eq!( + check_setting_path(&exist_path, "rules").to_str().unwrap(), + exist_path.join("rules").to_str().unwrap() + ); + assert_eq!( + check_setting_path(&exist_path, "fake").to_str().unwrap(), + "fake" + ); + } } diff --git a/src/filter.rs b/src/filter.rs index 425829e3..c78b7880 100644 --- a/src/filter.rs +++ b/src/filter.rs @@ -1,7 +1,7 @@ use crate::detections::configs; -use crate::detections::print::AlertMessage; -use crate::detections::print::ERROR_LOG_STACK; -use crate::detections::print::QUIET_ERRORS_FLAG; +use crate::detections::message::AlertMessage; +use crate::detections::message::ERROR_LOG_STACK; +use crate::detections::message::QUIET_ERRORS_FLAG; use hashbrown::HashMap; use regex::Regex; use std::fs::File; @@ -29,18 +29,16 @@ impl RuleExclude { pub fn exclude_ids() -> 
RuleExclude { let mut exclude_ids = RuleExclude::default(); - if !configs::CONFIG.read().unwrap().args.enable_noisy_rules { - exclude_ids.insert_ids(&format!( - "{}/noisy_rules.txt", - configs::CONFIG - .read() - .unwrap() - .args - .config - .as_path() - .display() - )); - }; + exclude_ids.insert_ids(&format!( + "{}/noisy_rules.txt", + configs::CONFIG + .read() + .unwrap() + .args + .config + .as_path() + .display() + )); exclude_ids.insert_ids(&format!( "{}/exclude_rules.txt", diff --git a/src/main.rs b/src/main.rs index bf6a4797..539b8973 100644 --- a/src/main.rs +++ b/src/main.rs @@ -3,41 +3,35 @@ extern crate downcast_rs; extern crate serde; extern crate serde_derive; -#[cfg(target_os = "windows")] -extern crate static_vcruntime; - use bytesize::ByteSize; -use chrono::{DateTime, Datelike, Local, TimeZone}; +use chrono::{DateTime, Datelike, Local}; use evtx::{EvtxParser, ParserSettings}; -use git2::Repository; use hashbrown::{HashMap, HashSet}; +use hayabusa::detections::configs::CURRENT_EXE_PATH; use hayabusa::detections::configs::{load_pivot_keywords, TargetEventTime, TARGET_EXTENSIONS}; use hayabusa::detections::detection::{self, EvtxRecordInfo}; -use hayabusa::detections::pivot::PivotKeyword; -use hayabusa::detections::pivot::PIVOT_KEYWORD; -use hayabusa::detections::print::{ +use hayabusa::detections::message::{ AlertMessage, ERROR_LOG_PATH, ERROR_LOG_STACK, LOGONSUMMARY_FLAG, PIVOT_KEYWORD_LIST_FLAG, QUIET_ERRORS_FLAG, STATISTICS_FLAG, }; +use hayabusa::detections::pivot::PivotKeyword; +use hayabusa::detections::pivot::PIVOT_KEYWORD; use hayabusa::detections::rule::{get_detection_keys, RuleNode}; use hayabusa::omikuji::Omikuji; -use hayabusa::options::level_tuning::LevelTuning; -use hayabusa::yaml::ParseYaml; +use hayabusa::options::profile::PROFILES; +use hayabusa::options::{level_tuning::LevelTuning, update_rules::UpdateRules}; use hayabusa::{afterfact::after_fact, detections::utils}; use hayabusa::{detections::configs, timeline::timelines::Timeline}; 
use hayabusa::{detections::utils::write_color_buffer, filter}; use hhmmss::Hhmmss; use pbr::ProgressBar; use serde_json::Value; -use std::cmp::Ordering; use std::ffi::{OsStr, OsString}; use std::fmt::Display; use std::fmt::Write as _; -use std::fs::create_dir; use std::io::{BufWriter, Write}; use std::path::Path; use std::sync::Arc; -use std::time::SystemTime; use std::{ env, fs::{self, File}, @@ -82,9 +76,18 @@ impl App { fn exec(&mut self) { if *PIVOT_KEYWORD_LIST_FLAG { - load_pivot_keywords("config/pivot_keywords.txt"); + load_pivot_keywords( + utils::check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "config/pivot_keywords.txt", + ) + .to_str() + .unwrap(), + ); + } + if PROFILES.is_none() { + return; } - let analysis_start_time: DateTime = Local::now(); // Show usage when no arguments. if std::env::args().len() == 1 { @@ -113,13 +116,16 @@ impl App { } if configs::CONFIG.read().unwrap().args.update_rules { - match self.update_rules() { + match UpdateRules::update_rules( + configs::CONFIG.read().unwrap().args.rules.to_str().unwrap(), + ) { Ok(output) => { if output != "You currently have the latest rules." { write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, "Rules updated successfully.", + true, ) .ok(); } @@ -131,14 +137,25 @@ impl App { println!(); return; } - - if !Path::new("./config").exists() { + // 実行時のexeファイルのパスをベースに変更する必要があるためデフォルトの値であった場合はそのexeファイルと同一階層を探すようにする + if !CURRENT_EXE_PATH.join("config").exists() && !Path::new("./config").exists() { AlertMessage::alert( - "Hayabusa could not find the config directory.\nPlease run it from the Hayabusa root directory.\nExample: ./hayabusa-1.0.0-windows-x64.exe" + "Hayabusa could not find the config directory.\nPlease make sure that it is in the same directory as the hayabusa executable." 
) .ok(); return; } + // カレントディレクトリ以外からの実行の際にrules-configオプションの指定がないとエラーが発生することを防ぐための処理 + if configs::CONFIG.read().unwrap().args.config == Path::new("./rules/config") { + configs::CONFIG.write().unwrap().args.config = + utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "./rules/config"); + } + + // カレントディレクトリ以外からの実行の際にrulesオプションの指定がないとエラーが発生することを防ぐための処理 + if configs::CONFIG.read().unwrap().args.rules == Path::new("./rules") { + configs::CONFIG.write().unwrap().args.rules = + utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "./rules"); + } if let Some(csv_path) = &configs::CONFIG.read().unwrap().args.output { let pivot_key_unions = PIVOT_KEYWORD.read().unwrap(); @@ -170,18 +187,20 @@ impl App { if *STATISTICS_FLAG { write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, "Generating Event ID Statistics", + true, ) .ok(); println!(); } if *LOGONSUMMARY_FLAG { write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, "Generating Logons Summary", + true, ) .ok(); println!(); @@ -194,6 +213,14 @@ impl App { } self.analysis_files(live_analysis_list.unwrap(), &time_filter); } else if let Some(filepath) = &configs::CONFIG.read().unwrap().args.filepath { + if !filepath.exists() { + AlertMessage::alert(&format!( + " The file {} does not exist. 
Please specify a valid file path.", + filepath.as_os_str().to_str().unwrap() + )) + .ok(); + return; + } if !TARGET_EXTENSIONS.contains( filepath .extension() @@ -226,18 +253,23 @@ impl App { } else if configs::CONFIG.read().unwrap().args.contributors { self.print_contributors(); return; - } else if std::env::args() - .into_iter() - .any(|arg| arg.contains("level-tuning")) - { - let level_tuning_config_path = configs::CONFIG + } else if configs::CONFIG.read().unwrap().args.level_tuning.is_some() { + let level_tuning_val = &configs::CONFIG .read() .unwrap() .args .level_tuning - .as_path() + .clone() + .unwrap(); + let level_tuning_config_path = match level_tuning_val { + Some(path) => path.to_owned(), + _ => utils::check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "./rules/config/level_tuning.txt", + ) .display() - .to_string(); + .to_string(), + }; if Path::new(&level_tuning_config_path).exists() { if let Err(err) = LevelTuning::run( @@ -262,9 +294,10 @@ impl App { return; } else { write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &configs::CONFIG.read().unwrap().headless_help, + true, ) .ok(); return; @@ -272,11 +305,11 @@ impl App { let analysis_end_time: DateTime = Local::now(); let analysis_duration = analysis_end_time.signed_duration_since(analysis_start_time); - println!(); write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &format!("Elapsed Time: {}", &analysis_duration.hhmmssxxx()), + true, ) .ok(); println!(); @@ -329,17 +362,30 @@ impl App { ) .ok(); }); - write_color_buffer(BufferWriter::stdout(ColorChoice::Always), None, &output).ok(); + write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + None, + &output, + true, + ) + .ok(); } else { //標準出力の場合 let output = "The following pivot keywords were found:".to_string(); - write_color_buffer(BufferWriter::stdout(ColorChoice::Always), None, &output).ok(); + 
write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + None, + &output, + true, + ) + .ok(); pivot_key_unions.iter().for_each(|(key, pivot_keyword)| { write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &create_output(String::default(), key, pivot_keyword), + true, ) .ok(); }); @@ -423,9 +469,18 @@ impl App { } fn print_contributors(&self) { - match fs::read_to_string("./contributors.txt") { + match fs::read_to_string(utils::check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "contributors.txt", + )) { Ok(contents) => { - write_color_buffer(BufferWriter::stdout(ColorChoice::Always), None, &contents).ok(); + write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + None, + &contents, + true, + ) + .ok(); } Err(err) => { AlertMessage::alert(&format!("{}", err)).ok(); @@ -441,9 +496,10 @@ impl App { .min_level .to_uppercase(); write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &format!("Analyzing event files: {:?}", evtx_files.len()), + true, ) .ok(); @@ -543,11 +599,17 @@ impl App { continue; } - // target_eventids.txtでフィルタする。 + // target_eventids.txtでイベントIDベースでフィルタする。 let data = record_result.as_ref().unwrap().data.clone(); - let timestamp = record_result.unwrap().timestamp; + if !self._is_target_event_id(&data) + && !configs::CONFIG.read().unwrap().args.deep_scan + { + continue; + } - if !self._is_target_event_id(&data) || !time_filter.is_target(&Some(timestamp)) { + // EventID側の条件との条件の混同を防ぐため時間でのフィルタリングの条件分岐を分離した + let timestamp = record_result.unwrap().timestamp; + if !time_filter.is_target(&Some(timestamp)) { continue; } @@ -659,7 +721,7 @@ impl App { /// output logo fn output_logo(&self) { - let fp = &"art/logo.txt".to_string(); + let fp = utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "art/logo.txt"); let content = fs::read_to_string(fp).unwrap_or_default(); let output_color = if 
configs::CONFIG.read().unwrap().args.no_color { None @@ -667,9 +729,10 @@ impl App { Some(Color::Green) }; write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), output_color, &content, + true, ) .ok(); } @@ -685,226 +748,19 @@ impl App { match eggs.get(exec_datestr) { None => {} Some(path) => { - let content = fs::read_to_string(path).unwrap_or_default(); - write_color_buffer(BufferWriter::stdout(ColorChoice::Always), None, &content).ok(); - } - } - } - - /// update rules(hayabusa-rules subrepository) - fn update_rules(&self) -> Result { - let mut result; - let mut prev_modified_time: SystemTime = SystemTime::UNIX_EPOCH; - let mut prev_modified_rules: HashSet = HashSet::default(); - let hayabusa_repo = Repository::open(Path::new(".")); - let hayabusa_rule_repo = Repository::open(Path::new("rules")); - if hayabusa_repo.is_err() && hayabusa_rule_repo.is_err() { - write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), - None, - "Attempting to git clone the hayabusa-rules repository into the rules folder.", - ) - .ok(); - // execution git clone of hayabusa-rules repository when failed open hayabusa repository. - result = self.clone_rules(); - } else if hayabusa_rule_repo.is_ok() { - // case of exist hayabusa-rules repository - self._repo_main_reset_hard(hayabusa_rule_repo.as_ref().unwrap())?; - // case of failed fetching origin/main, git clone is not executed so network error has occurred possibly. - prev_modified_rules = self.get_updated_rules("rules", &prev_modified_time); - prev_modified_time = fs::metadata("rules").unwrap().modified().unwrap(); - result = self.pull_repository(&hayabusa_rule_repo.unwrap()); - } else { - // case of no exist hayabusa-rules repository in rules. - // execute update because submodule information exists if hayabusa repository exists submodule information. 
- - prev_modified_time = fs::metadata("rules").unwrap().modified().unwrap(); - let rules_path = Path::new("rules"); - if !rules_path.exists() { - create_dir(rules_path).ok(); - } - let hayabusa_repo = hayabusa_repo.unwrap(); - let submodules = hayabusa_repo.submodules()?; - let mut is_success_submodule_update = true; - // submodule rules erase path is hard coding to avoid unintentional remove folder. - fs::remove_dir_all(".git/.submodule/rules").ok(); - for mut submodule in submodules { - submodule.update(true, None)?; - let submodule_repo = submodule.open()?; - if let Err(e) = self.pull_repository(&submodule_repo) { - AlertMessage::alert(&format!("Failed submodule update. {}", e)).ok(); - is_success_submodule_update = false; - } - } - if is_success_submodule_update { - result = Ok("Successed submodule update".to_string()); - } else { - result = Err(git2::Error::from_str(&String::default())); - } - } - if result.is_ok() { - let updated_modified_rules = self.get_updated_rules("rules", &prev_modified_time); - result = - self.print_diff_modified_rule_dates(prev_modified_rules, updated_modified_rules); - } - result - } - - /// hard reset in main branch - fn _repo_main_reset_hard(&self, input_repo: &Repository) -> Result<(), git2::Error> { - let branch = input_repo - .find_branch("main", git2::BranchType::Local) - .unwrap(); - let local_head = branch.get().target().unwrap(); - let object = input_repo.find_object(local_head, None).unwrap(); - match input_repo.reset(&object, git2::ResetType::Hard, None) { - Ok(()) => Ok(()), - _ => Err(git2::Error::from_str("Failed reset main branch in rules")), - } - } - - /// Pull(fetch and fast-forward merge) repositoryto input_repo. - fn pull_repository(&self, input_repo: &Repository) -> Result { - match input_repo - .find_remote("origin")? - .fetch(&["main"], None, None) - .map_err(|e| { - AlertMessage::alert(&format!("Failed git fetch to rules folder. 
{}", e)).ok(); - }) { - Ok(it) => it, - Err(_err) => return Err(git2::Error::from_str(&String::default())), - }; - let fetch_head = input_repo.find_reference("FETCH_HEAD")?; - let fetch_commit = input_repo.reference_to_annotated_commit(&fetch_head)?; - let analysis = input_repo.merge_analysis(&[&fetch_commit])?; - if analysis.0.is_up_to_date() { - Ok("Already up to date".to_string()) - } else if analysis.0.is_fast_forward() { - let mut reference = input_repo.find_reference("refs/heads/main")?; - reference.set_target(fetch_commit.id(), "Fast-Forward")?; - input_repo.set_head("refs/heads/main")?; - input_repo.checkout_head(Some(git2::build::CheckoutBuilder::default().force()))?; - Ok("Finished fast forward merge.".to_string()) - } else if analysis.0.is_normal() { - AlertMessage::alert( - "update-rules option is git Fast-Forward merge only. please check your rules folder." - , - ).ok(); - Err(git2::Error::from_str(&String::default())) - } else { - Err(git2::Error::from_str(&String::default())) - } - } - - /// git clone でhauyabusa-rules レポジトリをrulesフォルダにgit cloneする関数 - fn clone_rules(&self) -> Result { - match Repository::clone( - "https://github.com/Yamato-Security/hayabusa-rules.git", - "rules", - ) { - Ok(_repo) => { - println!("Finished cloning the hayabusa-rules repository."); - Ok("Finished clone".to_string()) - } - Err(e) => { - AlertMessage::alert( - &format!( - "Failed to git clone into the rules folder. Please rename your rules folder name. {}", - e - ), + let egg_path = utils::check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), path); + let content = fs::read_to_string(egg_path).unwrap_or_default(); + write_color_buffer( + &BufferWriter::stdout(ColorChoice::Always), + None, + &content, + true, ) .ok(); - Err(git2::Error::from_str(&String::default())) } } } - /// Create rules folder files Hashset. 
Format is "[rule title in yaml]|[filepath]|[filemodified date]|[rule type in yaml]" - fn get_updated_rules( - &self, - rule_folder_path: &str, - target_date: &SystemTime, - ) -> HashSet { - let mut rulefile_loader = ParseYaml::new(); - // level in read_dir is hard code to check all rules. - rulefile_loader - .read_dir( - rule_folder_path, - "INFORMATIONAL", - &filter::RuleExclude::default(), - ) - .ok(); - - let hash_set_keys: HashSet = rulefile_loader - .files - .into_iter() - .filter_map(|(filepath, yaml)| { - let file_modified_date = fs::metadata(&filepath).unwrap().modified().unwrap(); - - if file_modified_date.cmp(target_date).is_gt() { - let yaml_date = yaml["date"].as_str().unwrap_or("-"); - return Option::Some(format!( - "{}|{}|{}|{}", - yaml["title"].as_str().unwrap_or(&String::default()), - yaml["modified"].as_str().unwrap_or(yaml_date), - &filepath, - yaml["ruletype"].as_str().unwrap_or("Other") - )); - } - Option::None - }) - .collect(); - hash_set_keys - } - - /// print updated rule files. 
- fn print_diff_modified_rule_dates( - &self, - prev_sets: HashSet, - updated_sets: HashSet, - ) -> Result { - let diff = updated_sets.difference(&prev_sets); - let mut update_count_by_rule_type: HashMap = HashMap::new(); - let mut latest_update_date = Local.timestamp(0, 0); - for diff_key in diff { - let tmp: Vec<&str> = diff_key.split('|').collect(); - let file_modified_date = fs::metadata(&tmp[2]).unwrap().modified().unwrap(); - - let dt_local: DateTime = file_modified_date.into(); - - if latest_update_date.cmp(&dt_local) == Ordering::Less { - latest_update_date = dt_local; - } - *update_count_by_rule_type - .entry(tmp[3].to_string()) - .or_insert(0b0) += 1; - write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), - None, - &format!( - "[Updated] {} (Modified: {} | Path: {})", - tmp[0], tmp[1], tmp[2] - ), - ) - .ok(); - } - println!(); - for (key, value) in &update_count_by_rule_type { - println!("Updated {} rules: {}", key, value); - } - if !&update_count_by_rule_type.is_empty() { - Ok("Rule updated".to_string()) - } else { - write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), - None, - "You currently have the latest rules.", - ) - .ok(); - Ok("You currently have the latest rules.".to_string()) - } - } - /// check architecture fn is_matched_architecture_and_binary(&self) -> bool { if cfg!(target_os = "windows") { @@ -925,7 +781,6 @@ impl App { #[cfg(test)] mod tests { use crate::App; - use std::time::SystemTime; #[test] fn test_collect_evtxfiles() { @@ -942,20 +797,4 @@ mod tests { assert_eq!(is_contains, &true); }) } - - #[test] - fn test_get_updated_rules() { - let app = App::new(); - - let prev_modified_time: SystemTime = SystemTime::UNIX_EPOCH; - - let prev_modified_rules = - app.get_updated_rules("test_files/rules/level_yaml", &prev_modified_time); - assert_eq!(prev_modified_rules.len(), 5); - - let target_time: SystemTime = SystemTime::now(); - let prev_modified_rules2 = - app.get_updated_rules("test_files/rules/level_yaml", 
&target_time); - assert_eq!(prev_modified_rules2.len(), 0); - } } diff --git a/src/options/level_tuning.rs b/src/options/level_tuning.rs index f378ec1f..42f7576c 100644 --- a/src/options/level_tuning.rs +++ b/src/options/level_tuning.rs @@ -2,7 +2,7 @@ use crate::detections::utils::write_color_buffer; use crate::detections::{configs, utils}; use crate::filter::RuleExclude; use crate::yaml::ParseYaml; -use std::collections::HashMap; +use hashbrown::HashMap; use std::fs::{self, File}; use std::io::Write; use termcolor::{BufferWriter, ColorChoice}; @@ -59,9 +59,10 @@ impl LevelTuning { for (path, rule) in rulefile_loader.files { if let Some(new_level) = tuning_map.get(rule["id"].as_str().unwrap()) { write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &format!("path: {}", path), + true, ) .ok(); let mut content = match fs::read_to_string(&path) { @@ -94,13 +95,14 @@ impl LevelTuning { file.write_all(content.as_bytes()).unwrap(); file.flush().unwrap(); write_color_buffer( - BufferWriter::stdout(ColorChoice::Always), + &BufferWriter::stdout(ColorChoice::Always), None, &format!( "level: {} -> {}", rule["level"].as_str().unwrap(), new_level ), + true, ) .ok(); } diff --git a/src/options/mod.rs b/src/options/mod.rs index 1f3c32b6..f63dd2b9 100644 --- a/src/options/mod.rs +++ b/src/options/mod.rs @@ -1 +1,3 @@ pub mod level_tuning; +pub mod profile; +pub mod update_rules; diff --git a/src/options/profile.rs b/src/options/profile.rs new file mode 100644 index 00000000..70e0e9cf --- /dev/null +++ b/src/options/profile.rs @@ -0,0 +1,309 @@ +use crate::detections::configs::{self, CURRENT_EXE_PATH}; +use crate::detections::message::AlertMessage; +use crate::detections::utils::check_setting_path; +use crate::yaml; +use hashbrown::HashSet; +use lazy_static::lazy_static; +use linked_hash_map::LinkedHashMap; +use regex::RegexSet; +use std::fs::OpenOptions; +use std::io::{BufWriter, Write}; +use std::path::Path; +use 
yaml_rust::{Yaml, YamlEmitter, YamlLoader}; + +lazy_static! { + pub static ref PROFILES: Option> = load_profile( + check_setting_path( + &CURRENT_EXE_PATH.to_path_buf(), + "config/default_profile.yaml" + ) + .to_str() + .unwrap(), + check_setting_path(&CURRENT_EXE_PATH.to_path_buf(), "config/profiles.yaml") + .to_str() + .unwrap() + ); + pub static ref LOAEDED_PROFILE_ALIAS: HashSet = HashSet::from_iter( + PROFILES + .as_ref() + .unwrap_or(&LinkedHashMap::default()) + .values() + .cloned() + ); + pub static ref PRELOAD_PROFILE: Vec<&'static str> = vec![ + "%Timestamp%", + "%Computer%", + "%Channel%", + "%Level%", + "%EventID%", + "%RecordID%", + "%RuleTitle%", + "%RecordInformation%", + "%RuleFile%", + "%EvtxFile%", + "%MitreTactics%", + "%MitreTags%", + "%OtherTags%" + ]; + pub static ref PRELOAD_PROFILE_REGEX: RegexSet = RegexSet::new(&*PRELOAD_PROFILE).unwrap(); +} + +// 指定されたパスのprofileを読み込む処理 +fn read_profile_data(profile_path: &str) -> Result, String> { + let yml = yaml::ParseYaml::new(); + if let Ok(loaded_profile) = yml.read_file(Path::new(profile_path).to_path_buf()) { + match YamlLoader::load_from_str(&loaded_profile) { + Ok(profile_yml) => Ok(profile_yml), + Err(e) => Err(format!("Parse error: {}. {}", profile_path, e)), + } + } else { + Err(format!( + "The profile file({}) does not exist. 
Please check your default profile.", + profile_path + )) + } +} + +/// プロファイル情報`を読み込む関数 +pub fn load_profile( + default_profile_path: &str, + profile_path: &str, +) -> Option> { + let conf = &configs::CONFIG.read().unwrap().args; + if conf.set_default_profile.is_some() { + if let Err(e) = set_default_profile(default_profile_path, profile_path) { + AlertMessage::alert(&e).ok(); + } else { + println!("Successfully updated the default profile."); + }; + } + let profile_all: Vec = if conf.profile.is_none() { + match read_profile_data(default_profile_path) { + Ok(data) => data, + Err(e) => { + AlertMessage::alert(&e).ok(); + vec![] + } + } + } else { + match read_profile_data(profile_path) { + Ok(data) => data, + Err(e) => { + AlertMessage::alert(&e).ok(); + vec![] + } + } + }; + + // profileを読み込んで何も結果がない場合はAlert出しているためプログラム終了のためにNoneを出力する。 + if profile_all.is_empty() { + return None; + } + let profile_data = &profile_all[0]; + let mut ret: LinkedHashMap = LinkedHashMap::new(); + if let Some(profile_name) = &conf.profile { + let target_data = &profile_data[profile_name.as_str()]; + if !target_data.is_badvalue() { + target_data + .as_hash() + .unwrap() + .into_iter() + .for_each(|(k, v)| { + ret.insert( + k.as_str().unwrap().to_string(), + v.as_str().unwrap().to_string(), + ); + }); + Some(ret) + } else { + let profile_names: Vec<&str> = profile_data + .as_hash() + .unwrap() + .keys() + .map(|k| k.as_str().unwrap()) + .collect(); + AlertMessage::alert(&format!( + "Invalid profile specified: {}\nPlease specify one of the following profiles:\n {}", + profile_name, + profile_names.join(", ") + )) + .ok(); + None + } + } else { + profile_data + .as_hash() + .unwrap() + .into_iter() + .for_each(|(k, v)| { + ret.insert( + k.as_str().unwrap().to_string(), + v.as_str().unwrap().to_string(), + ); + }); + Some(ret) + } +} + +/// デフォルトプロファイルを設定する関数 +pub fn set_default_profile(default_profile_path: &str, profile_path: &str) -> Result<(), String> { + let profile_data: Vec = match 
read_profile_data(profile_path) { + Ok(data) => data, + Err(e) => { + AlertMessage::alert(&e).ok(); + return Err("Failed to set the default profile.".to_string()); + } + }; + + // デフォルトプロファイルを設定する処理 + if let Some(profile_name) = &configs::CONFIG.read().unwrap().args.set_default_profile { + if let Ok(mut buf_wtr) = OpenOptions::new() + .write(true) + .truncate(true) + .open(default_profile_path) + .map(BufWriter::new) + { + let prof_all_data = &profile_data[0]; + let overwrite_default_data = &prof_all_data[profile_name.as_str()]; + if !overwrite_default_data.is_badvalue() { + let mut out_str = String::default(); + let mut yml_writer = YamlEmitter::new(&mut out_str); + let dump_result = yml_writer.dump(overwrite_default_data); + match dump_result { + Ok(_) => match buf_wtr.write_all(out_str.as_bytes()) { + Err(e) => Err(format!( + "Failed to set the default profile file({}). {}", + profile_path, e + )), + _ => { + buf_wtr.flush().ok(); + Ok(()) + } + }, + Err(e) => Err(format!( + "Failed to set the default profile file({}). 
{}", + profile_path, e + )), + } + } else { + let profile_names: Vec<&str> = prof_all_data + .as_hash() + .unwrap() + .keys() + .map(|k| k.as_str().unwrap()) + .collect(); + Err(format!( + "Invalid profile specified: {}\nPlease specify one of the following profiles:\n{}", + profile_name, + profile_names.join(", ") + )) + } + } else { + Err(format!( + "Failed to set the default profile file({}).", + profile_path + )) + } + } else { + Err("Not specified: --set-default-profile".to_string()) + } +} + +#[cfg(test)] +mod tests { + use linked_hash_map::LinkedHashMap; + + use crate::detections::configs; + use crate::options::profile::load_profile; + + #[test] + ///オプションの設定が入ると値の冪等性が担保できないためテストを逐次的に処理する + fn test_load_profile() { + test_load_profile_without_profile_option(); + test_load_profile_no_exist_profile_files(); + test_load_profile_with_profile_option(); + } + + /// プロファイルオプションが設定されていないときにロードをした場合のテスト + fn test_load_profile_without_profile_option() { + configs::CONFIG.write().unwrap().args.profile = None; + let mut expect: LinkedHashMap = LinkedHashMap::new(); + expect.insert("Timestamp".to_owned(), "%Timestamp%".to_owned()); + expect.insert("Computer".to_owned(), "%Computer%".to_owned()); + expect.insert("Channel".to_owned(), "%Channel%".to_owned()); + expect.insert("Level".to_owned(), "%Level%".to_owned()); + expect.insert("EventID".to_owned(), "%EventID%".to_owned()); + expect.insert("MitreAttack".to_owned(), "%MitreAttack%".to_owned()); + expect.insert("RecordID".to_owned(), "%RecordID%".to_owned()); + expect.insert("RuleTitle".to_owned(), "%RuleTitle%".to_owned()); + expect.insert("Details".to_owned(), "%Details%".to_owned()); + expect.insert( + "RecordInformation".to_owned(), + "%RecordInformation%".to_owned(), + ); + expect.insert("RuleFile".to_owned(), "%RuleFile%".to_owned()); + expect.insert("EvtxFile".to_owned(), "%EvtxFile%".to_owned()); + expect.insert("Tags".to_owned(), "%MitreAttack%".to_owned()); + + assert_eq!( + Some(expect), + load_profile( + 
"test_files/config/default_profile.yaml", + "test_files/config/profiles.yaml" + ) + ); + } + + /// プロファイルオプションが設定されて`おり、そのオプションに該当するプロファイルが存在する場合のテスト + fn test_load_profile_with_profile_option() { + configs::CONFIG.write().unwrap().args.profile = Some("minimal".to_string()); + let mut expect: LinkedHashMap = LinkedHashMap::new(); + expect.insert("Timestamp".to_owned(), "%Timestamp%".to_owned()); + expect.insert("Computer".to_owned(), "%Computer%".to_owned()); + expect.insert("Channel".to_owned(), "%Channel%".to_owned()); + expect.insert("EventID".to_owned(), "%EventID%".to_owned()); + expect.insert("Level".to_owned(), "%Level%".to_owned()); + expect.insert("RuleTitle".to_owned(), "%RuleTitle%".to_owned()); + expect.insert("Details".to_owned(), "%Details%".to_owned()); + + assert_eq!( + Some(expect), + load_profile( + "test_files/config/default_profile.yaml", + "test_files/config/profiles.yaml" + ) + ); + } + + /// プロファイルオプションが設定されているが、対象のオプションが存在しない場合のテスト + fn test_load_profile_no_exist_profile_files() { + configs::CONFIG.write().unwrap().args.profile = Some("not_exist".to_string()); + + //両方のファイルが存在しない場合 + assert_eq!( + None, + load_profile( + "test_files/config/no_exist_default_profile.yaml", + "test_files/config/no_exist_profiles.yaml" + ) + ); + + //デフォルトプロファイルは存在しているがprofileオプションが指定されているため読み込み失敗の場合 + assert_eq!( + None, + load_profile( + "test_files/config/profile/default_profile.yaml", + "test_files/config/profile/no_exist_profiles.yaml" + ) + ); + + //オプション先のターゲットのプロファイルファイルが存在しているが、profileオプションで指定されたオプションが存在しない場合 + assert_eq!( + None, + load_profile( + "test_files/config/no_exist_default_profile.yaml", + "test_files/config/profiles.yaml" + ) + ); + } +} diff --git a/src/options/update_rules.rs b/src/options/update_rules.rs new file mode 100644 index 00000000..6d501777 --- /dev/null +++ b/src/options/update_rules.rs @@ -0,0 +1,273 @@ +use crate::detections::message::AlertMessage; +use crate::detections::utils::write_color_buffer; +use crate::filter; +use 
crate::yaml::ParseYaml;
+use chrono::{DateTime, Local, TimeZone};
+use git2::Repository;
+use std::fs::{self};
+use std::path::Path;
+
+use hashbrown::{HashMap, HashSet};
+use std::cmp::Ordering;
+
+use std::time::SystemTime;
+
+use std::fs::create_dir;
+
+use termcolor::{BufferWriter, ColorChoice};
+
+pub struct UpdateRules {}
+
+impl UpdateRules {
+    /// Update rules (hayabusa-rules subrepository).
+    pub fn update_rules(rule_path: &str) -> Result<String, git2::Error> {
+        let mut result;
+        let mut prev_modified_time: SystemTime = SystemTime::UNIX_EPOCH;
+        let mut prev_modified_rules: HashSet<String> = HashSet::default();
+        let hayabusa_repo = Repository::open(Path::new("."));
+        let hayabusa_rule_repo = Repository::open(Path::new(rule_path));
+        if hayabusa_repo.is_err() && hayabusa_rule_repo.is_err() {
+            write_color_buffer(
+                &BufferWriter::stdout(ColorChoice::Always),
+                None,
+                "Attempting to git clone the hayabusa-rules repository into the rules folder.",
+                true,
+            )
+            .ok();
+            // Run git clone of the hayabusa-rules repository when the hayabusa repository could not be opened.
+            result = UpdateRules::clone_rules(Path::new(rule_path));
+        } else if hayabusa_rule_repo.is_ok() {
+            // The hayabusa-rules repository already exists.
+            UpdateRules::_repo_main_reset_hard(hayabusa_rule_repo.as_ref().unwrap())?;
+            // If fetching origin/main fails, git clone is not executed, so a network error has possibly occurred.
+            prev_modified_rules = UpdateRules::get_updated_rules(rule_path, &prev_modified_time);
+            prev_modified_time = fs::metadata(rule_path).unwrap().modified().unwrap();
+            result = UpdateRules::pull_repository(&hayabusa_rule_repo.unwrap());
+        } else {
+            // The hayabusa-rules repository does not exist in rules.
+            // Update via the submodule information if the hayabusa repository has it.
+
+            prev_modified_time = fs::metadata(rule_path).unwrap().modified().unwrap();
+            let rules_path = Path::new(rule_path);
+            if !rules_path.exists() {
+                create_dir(rules_path).ok();
+            }
+            if rule_path == "./rules" {
+                let hayabusa_repo = hayabusa_repo.unwrap();
+                let submodules = hayabusa_repo.submodules()?;
+                let mut is_success_submodule_update = true;
+                // The submodule rules removal path is hard-coded to avoid unintentionally removing other folders.
+                fs::remove_dir_all(".git/.submodule/rules").ok();
+                for mut submodule in submodules {
+                    submodule.update(true, None)?;
+                    let submodule_repo = submodule.open()?;
+                    if let Err(e) = UpdateRules::pull_repository(&submodule_repo) {
+                        AlertMessage::alert(&format!("Failed submodule update. {}", e)).ok();
+                        is_success_submodule_update = false;
+                    }
+                }
+                if is_success_submodule_update {
+                    result = Ok("Succeeded submodule update".to_string());
+                } else {
+                    result = Err(git2::Error::from_str(&String::default()));
+                }
+            } else {
+                write_color_buffer(
+                    &BufferWriter::stdout(ColorChoice::Always),
+                    None,
+                    "Attempting to git clone the hayabusa-rules repository into the rules folder.",
+                    true,
+                )
+                .ok();
+                // Run git clone of the hayabusa-rules repository when the hayabusa repository could not be opened.
+                result = UpdateRules::clone_rules(rules_path);
+            }
+        }
+        if result.is_ok() {
+            let updated_modified_rules =
+                UpdateRules::get_updated_rules(rule_path, &prev_modified_time);
+            result = UpdateRules::print_diff_modified_rule_dates(
+                prev_modified_rules,
+                updated_modified_rules,
+            );
+        }
+        result
+    }
+
+    /// Hard reset to the main branch.
+    fn _repo_main_reset_hard(input_repo: &Repository) -> Result<(), git2::Error> {
+        let branch = input_repo
+            .find_branch("main", git2::BranchType::Local)
+            .unwrap();
+        let local_head = branch.get().target().unwrap();
+        let object = input_repo.find_object(local_head, None).unwrap();
+        match input_repo.reset(&object, git2::ResetType::Hard, None) {
+            Ok(()) => Ok(()),
+            _ => Err(git2::Error::from_str("Failed to reset main branch in rules")),
+        }
+    }
+
+    /// Pull (fetch and fast-forward merge) the repository given as input_repo.
+    fn pull_repository(input_repo: &Repository) -> Result<String, git2::Error> {
+        match input_repo
+            .find_remote("origin")?
+            .fetch(&["main"], None, None)
+            .map_err(|e| {
+                AlertMessage::alert(&format!("Failed git fetch to rules folder. {}", e)).ok();
+            }) {
+            Ok(it) => it,
+            Err(_err) => return Err(git2::Error::from_str(&String::default())),
+        };
+        let fetch_head = input_repo.find_reference("FETCH_HEAD")?;
+        let fetch_commit = input_repo.reference_to_annotated_commit(&fetch_head)?;
+        let analysis = input_repo.merge_analysis(&[&fetch_commit])?;
+        if analysis.0.is_up_to_date() {
+            Ok("Already up to date".to_string())
+        } else if analysis.0.is_fast_forward() {
+            let mut reference = input_repo.find_reference("refs/heads/main")?;
+            reference.set_target(fetch_commit.id(), "Fast-Forward")?;
+            input_repo.set_head("refs/heads/main")?;
+            input_repo.checkout_head(Some(git2::build::CheckoutBuilder::default().force()))?;
+            Ok("Finished fast forward merge.".to_string())
+        } else if analysis.0.is_normal() {
+            AlertMessage::alert(
+                "The update-rules option only supports git fast-forward merges. Please check your rules folder."
+                ,
+            ).ok();
+            Err(git2::Error::from_str(&String::default()))
+        } else {
+            Err(git2::Error::from_str(&String::default()))
+        }
+    }
+
+    /// Function that git clones the hayabusa-rules repository into the rules folder.
+    fn clone_rules(rules_path: &Path) -> Result<String, git2::Error> {
+        match Repository::clone(
+            "https://github.com/Yamato-Security/hayabusa-rules.git",
+            rules_path,
+        ) {
+            Ok(_repo) => {
+                println!("Finished cloning the hayabusa-rules repository.");
+                Ok("Finished clone".to_string())
+            }
+            Err(e) => {
+                AlertMessage::alert(
+                    &format!(
+                        "Failed to git clone into the rules folder. Please rename your rules folder name. {}",
+                        e
+                    ),
+                )
+                .ok();
+                Err(git2::Error::from_str(&String::default()))
+            }
+        }
+    }
+
+    /// Create a HashSet of rules folder files. Format: "[rule title in yaml]|[rule modified date in yaml]|[filepath]|[rule type in yaml]"
+    fn get_updated_rules(rule_folder_path: &str, target_date: &SystemTime) -> HashSet<String> {
+        let mut rulefile_loader = ParseYaml::new();
+        // The level passed to read_dir is hard-coded so that all rules are checked.
+        rulefile_loader
+            .read_dir(
+                rule_folder_path,
+                "INFORMATIONAL",
+                &filter::RuleExclude::default(),
+            )
+            .ok();
+
+        let hash_set_keys: HashSet<String> = rulefile_loader
+            .files
+            .into_iter()
+            .filter_map(|(filepath, yaml)| {
+                let file_modified_date = fs::metadata(&filepath).unwrap().modified().unwrap();
+
+                if file_modified_date.cmp(target_date).is_gt() {
+                    let yaml_date = yaml["date"].as_str().unwrap_or("-");
+                    return Option::Some(format!(
+                        "{}|{}|{}|{}",
+                        yaml["title"].as_str().unwrap_or(&String::default()),
+                        yaml["modified"].as_str().unwrap_or(yaml_date),
+                        &filepath,
+                        yaml["ruletype"].as_str().unwrap_or("Other")
+                    ));
+                }
+                Option::None
+            })
+            .collect();
+        hash_set_keys
+    }
+
+    /// Print updated rule files.
+    fn print_diff_modified_rule_dates(
+        prev_sets: HashSet<String>,
+        updated_sets: HashSet<String>,
+    ) -> Result<String, git2::Error> {
+        let diff = updated_sets.difference(&prev_sets);
+        let mut update_count_by_rule_type: HashMap<String, u128> = HashMap::new();
+        let mut latest_update_date = Local.timestamp(0, 0);
+        for diff_key in diff {
+            let tmp: Vec<&str> = diff_key.split('|').collect();
+            let file_modified_date = fs::metadata(&tmp[2]).unwrap().modified().unwrap();
+
+            let dt_local: DateTime<Local> = file_modified_date.into();
+
+            if latest_update_date.cmp(&dt_local) == Ordering::Less {
+                latest_update_date = dt_local;
+            }
+            *update_count_by_rule_type
+                .entry(tmp[3].to_string())
+                .or_insert(0b0) += 1;
+            let path_str: &str = if tmp[2].starts_with("./") {
+                tmp[2].strip_prefix("./").unwrap()
+            } else {
+                tmp[2]
+            };
+            write_color_buffer(
+                &BufferWriter::stdout(ColorChoice::Always),
+                None,
+                &format!(
+                    "[Updated] {} (Modified: {} | Path: {})",
+                    tmp[0], tmp[1], path_str
+                ),
+                true,
+            )
+            .ok();
+        }
+        println!();
+        for (key, value) in &update_count_by_rule_type {
+            println!("Updated {} rules: {}", key, value);
+        }
+        if !&update_count_by_rule_type.is_empty() {
+            Ok("Rule updated".to_string())
+        } else {
+            write_color_buffer(
+                &BufferWriter::stdout(ColorChoice::Always),
+                None,
+                "You currently have the latest rules.",
+                true,
+            )
+            .ok();
+            Ok("You currently have the latest rules.".to_string())
+        }
+    }
+}
+
+#[cfg(test)]
+mod tests {
+    use crate::options::update_rules::UpdateRules;
+    use std::time::SystemTime;
+
+    #[test]
+    fn test_get_updated_rules() {
+        let prev_modified_time: SystemTime = SystemTime::UNIX_EPOCH;
+
+        let prev_modified_rules =
+            UpdateRules::get_updated_rules("test_files/rules/level_yaml", &prev_modified_time);
+        assert_eq!(prev_modified_rules.len(), 5);
+
+        let target_time: SystemTime = SystemTime::now();
+        let prev_modified_rules2 =
+            UpdateRules::get_updated_rules("test_files/rules/level_yaml", &target_time);
+        assert_eq!(prev_modified_rules2.len(), 0);
+    }
+}
diff --git
a/src/timeline/statistics.rs b/src/timeline/statistics.rs
index d487a0a4..2ab6d912 100644
--- a/src/timeline/statistics.rs
+++ b/src/timeline/statistics.rs
@@ -1,4 +1,4 @@
-use crate::detections::print::{LOGONSUMMARY_FLAG, STATISTICS_FLAG};
+use crate::detections::message::{LOGONSUMMARY_FLAG, STATISTICS_FLAG};
 use crate::detections::{detection::EvtxRecordInfo, utils};
 use hashbrown::HashMap;
@@ -129,8 +129,21 @@ impl EventStatistics {
             if evtid.is_none() {
                 continue;
             }
+            let idnum: i64 = if evtid.unwrap().is_number() {
+                evtid.unwrap().as_i64().unwrap()
+            } else {
+                evtid
+                    .unwrap()
+                    .as_str()
+                    .unwrap()
+                    .parse::<i64>()
+                    .unwrap_or_default()
+            };
+            if !(idnum == 4624 || idnum == 4625) {
+                continue;
+            }
+
             let username = utils::get_event_value("TargetUserName", &record.record);
-            let idnum = evtid.unwrap();
             let countlist: [usize; 2] = [0, 0];
             if idnum == 4624 {
                 let count: &mut [usize; 2] = self
diff --git a/src/timeline/timelines.rs b/src/timeline/timelines.rs
index 34f8bc8f..a0cad83a 100644
--- a/src/timeline/timelines.rs
+++ b/src/timeline/timelines.rs
@@ -1,4 +1,4 @@
-use crate::detections::print::{LOGONSUMMARY_FLAG, STATISTICS_FLAG};
+use crate::detections::message::{LOGONSUMMARY_FLAG, STATISTICS_FLAG};
 use crate::detections::{configs::CONFIG, detection::EvtxRecordInfo};
 use prettytable::{Cell, Row, Table};
diff --git a/src/yaml.rs b/src/yaml.rs
index 49c1ba12..19ce2eef 100644
--- a/src/yaml.rs
+++ b/src/yaml.rs
@@ -2,9 +2,9 @@ extern crate serde_derive;
 extern crate yaml_rust;
 use crate::detections::configs;
-use crate::detections::print::AlertMessage;
-use crate::detections::print::ERROR_LOG_STACK;
-use crate::detections::print::QUIET_ERRORS_FLAG;
+use crate::detections::configs::EXCLUDE_STATUS;
+use crate::detections::message::AlertMessage;
+use crate::detections::message::{ERROR_LOG_STACK, QUIET_ERRORS_FLAG};
 use crate::filter::RuleExclude;
 use hashbrown::HashMap;
 use std::ffi::OsStr;
@@ -165,6 +165,19 @@ impl ParseYaml {
             return io::Result::Ok(ret);
         }
+        //
ignore if tool test yml file in hayabusa-rules. + if path + .to_str() + .unwrap() + .contains("rules/tools/sigmac/test_files") + || path + .to_str() + .unwrap() + .contains("rules\\tools\\sigmac\\test_files") + { + return io::Result::Ok(ret); + } + // 個別のファイルの読み込みは即終了としない。 let read_content = self.read_file(path); if read_content.is_err() { @@ -231,7 +244,28 @@ impl ParseYaml { } else { "noisy" }; - let entry = self.rule_load_cnt.entry(entry_key.to_string()).or_insert(0); + // テスト用のルール(ID:000...0)の場合はexcluded ruleのカウントから除外するようにする + if v != "00000000-0000-0000-0000-000000000000" { + let entry = + self.rule_load_cnt.entry(entry_key.to_string()).or_insert(0); + *entry += 1; + } + if entry_key == "excluded" + || (entry_key == "noisy" + && !configs::CONFIG.read().unwrap().args.enable_noisy_rules) + { + return Option::None; + } + } + } + + let status = &yaml_doc["status"].as_str(); + if let Some(s) = status { + if EXCLUDE_STATUS.contains(&s.to_string()) { + let entry = self + .rule_load_cnt + .entry("excluded".to_string()) + .or_insert(0); *entry += 1; return Option::None; } @@ -271,19 +305,6 @@ impl ParseYaml { if doc_level_num < args_level_num { return Option::None; } - - if !configs::CONFIG.read().unwrap().args.enable_deprecated_rules { - let rule_status = &yaml_doc["status"].as_str().unwrap_or_default(); - if *rule_status == "deprecated" { - let entry = self - .rule_status_cnt - .entry(rule_status.to_string()) - .or_insert(0); - *entry += 1; - return Option::None; - } - } - Option::Some((filepath, yaml_doc)) }) .collect(); @@ -295,8 +316,8 @@ impl ParseYaml { #[cfg(test)] mod tests { - use crate::detections::print::AlertMessage; - use crate::detections::print::ERROR_LOG_PATH; + use crate::detections::message::AlertMessage; + use crate::detections::message::ERROR_LOG_PATH; use crate::filter; use crate::yaml; use crate::yaml::RuleExclude; @@ -439,7 +460,7 @@ mod tests { yaml.read_dir(path, "", &exclude_ids).unwrap(); assert_eq!( 
yaml.rule_status_cnt.get("deprecated").unwrap().to_owned(), - 2 + 1 ); } } diff --git a/test_files/config/default_profile.yaml b/test_files/config/default_profile.yaml new file mode 100644 index 00000000..a643554a --- /dev/null +++ b/test_files/config/default_profile.yaml @@ -0,0 +1,13 @@ +Timestamp: "%Timestamp%" +Computer: "%Computer%" +Channel: "%Channel%" +Level: "%Level%" +EventID: "%EventID%" +MitreAttack: "%MitreAttack%" +RecordID: "%RecordID%" +RuleTitle: "%RuleTitle%" +Details: "%Details%" +RecordInformation: "%RecordInformation%" +RuleFile: "%RuleFile%" +EvtxFile: "%EvtxFile%" +Tags: "%MitreAttack%" diff --git a/test_files/config/output_tag.txt b/test_files/config/mitre_tactics.txt similarity index 100% rename from test_files/config/output_tag.txt rename to test_files/config/mitre_tactics.txt diff --git a/test_files/config/profiles.yaml b/test_files/config/profiles.yaml new file mode 100644 index 00000000..78348ee2 --- /dev/null +++ b/test_files/config/profiles.yaml @@ -0,0 +1,44 @@ +minimal: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + +standard: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + Tags: "%MitreAttack%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + +verbose-1: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + Tags: "%MitreAttack%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + RuleFile: "%RuleFile%" + EvtxFile: "%EvtxFile%" + +verbose-2: + Timestamp: "%Timestamp%" + Computer: "%Computer%" + Channel: "%Channel%" + EventID: "%EventID%" + Level: "%Level%" + Tags: "%MitreAttack%" + RecordID: "%RecordID%" + RuleTitle: "%RuleTitle%" + Details: "%Details%" + AllFieldInfo: "%RecordInformation%" \ No newline at end of file diff 
--git a/test_files/rules/yaml/exclude1.yml b/test_files/rules/yaml/exclude1.yml index 76e3e73d..7fd19c8d 100644 --- a/test_files/rules/yaml/exclude1.yml +++ b/test_files/rules/yaml/exclude1.yml @@ -1,5 +1,5 @@ -title: Sysmon Check command lines -id : 4fe151c2-ecf9-4fae-95ae-b88ec9c2fca6 +title: Excluded Rule Test 1 +id : 00000000-0000-0000-0000-000000000000 description: hogehoge enabled: true author: Yea diff --git a/test_files/rules/yaml/exclude2.yml b/test_files/rules/yaml/exclude2.yml index e17e37cf..89214921 100644 --- a/test_files/rules/yaml/exclude2.yml +++ b/test_files/rules/yaml/exclude2.yml @@ -1,13 +1,10 @@ -title: Possible Exploitation of Exchange RCE CVE-2021-42321 -author: Florian Roth, @testanull +title: Excluded Rule 2 date: 2021/11/18 -description: Detects log entries that appear in exploitation attempts against MS Exchange - RCE CVE-2021-42321 detection: condition: 'Cmdlet failed. Cmdlet Get-App, ' falsepositives: - Unknown, please report false positives via https://github.com/SigmaHQ/sigma/issues -id: c92f1896-d1d2-43c3-92d5-7a5b35c217bb +id: 00000000-0000-0000-0000-000000000000 level: critical logsource: product: windows @@ -15,7 +12,4 @@ logsource: references: - https://msrc.microsoft.com/update-guide/vulnerability/CVE-2021-42321 status: experimental -tags: -- attack.lateral_movement -- attack.t1210 ruletype: SIGMA diff --git a/test_files/rules/yaml/exclude3.yml b/test_files/rules/yaml/exclude3.yml index 45f43c4a..e5b79e6d 100644 --- a/test_files/rules/yaml/exclude3.yml +++ b/test_files/rules/yaml/exclude3.yml @@ -1,8 +1,5 @@ -title: Hidden Local User Creation -author: Christian Burkard +title: Excluded Rule 3 date: 2021/05/03 -description: Detects the creation of a local hidden user account which should not - happen for event ID 4720. 
detection: SELECTION_1: EventID: 4720 @@ -14,7 +11,7 @@ falsepositives: fields: - EventCode - AccountName -id: 7b449a5e-1db5-4dd0-a2dc-4e3a67282538 +id: 00000000-0000-0000-0000-000000000000 level: high logsource: product: windows @@ -22,7 +19,4 @@ logsource: references: - https://twitter.com/SBousseaden/status/1387743867663958021 status: experimental -tags: -- attack.persistence -- attack.t1136.001 ruletype: SIGMA diff --git a/test_files/rules/yaml/exclude4.yml b/test_files/rules/yaml/exclude4.yml index 06b76c48..95fe7061 100644 --- a/test_files/rules/yaml/exclude4.yml +++ b/test_files/rules/yaml/exclude4.yml @@ -1,8 +1,5 @@ -title: User Added to Local Administrators -author: Florian Roth +title: Excluded Rule 4 date: 2017/03/14 -description: This rule triggers on user accounts that are added to the local Administrators - group, which could be legitimate activity or a sign of privilege escalation activity detection: SELECTION_1: EventID: 4732 @@ -13,18 +10,11 @@ detection: SELECTION_4: SubjectUserName: '*$' condition: ((SELECTION_1 and (SELECTION_2 or SELECTION_3)) and not (SELECTION_4)) -falsepositives: -- Legitimate administrative activity -id: c265cf08-3f99-46c1-8d59-328247057d57 +id: 00000000-0000-0000-0000-000000000000 level: medium logsource: product: windows service: security modified: 2021/07/07 status: stable -tags: -- attack.privilege_escalation -- attack.t1078 -- attack.persistence -- attack.t1098 ruletype: SIGMA diff --git a/test_files/rules/yaml/exclude5.yml b/test_files/rules/yaml/exclude5.yml index 27ec53cc..b54b5eab 100644 --- a/test_files/rules/yaml/exclude5.yml +++ b/test_files/rules/yaml/exclude5.yml @@ -1,9 +1,5 @@ -title: Local User Creation -author: Patrick Bareiss +title: Excluded Rule 5 date: 2019/04/18 -description: Detects local user creation on windows servers, which shouldn't happen - in an Active Directory environment. Apply this Sigma Use Case on your windows server - logs and not on your DC logs. 
detection: SELECTION_1: EventID: 4720 @@ -15,7 +11,7 @@ fields: - EventCode - AccountName - AccountDomain -id: 66b6be3d-55d0-4f47-9855-d69df21740ea +id: 00000000-0000-0000-0000-000000000000 level: low logsource: product: windows @@ -24,8 +20,4 @@ modified: 2020/08/23 references: - https://patrick-bareiss.com/detecting-local-user-creation-in-ad-with-sigma/ status: experimental -tags: -- attack.persistence -- attack.t1136 -- attack.t1136.001 ruletype: SIGMA diff --git a/test_files/rules/yaml/noisy1.yml b/test_files/rules/yaml/noisy1.yml index 6ea217b6..eab1c29a 100644 --- a/test_files/rules/yaml/noisy1.yml +++ b/test_files/rules/yaml/noisy1.yml @@ -1,7 +1,5 @@ -title: WMI Event Subscription -author: Tom Ueltschi (@c_APT_ure) +title: Noisy Rule Test1 date: 2019/01/12 -description: Detects creation of WMI event subscription persistence method detection: SELECTION_1: EventID: 19 @@ -12,7 +10,7 @@ detection: condition: (SELECTION_1 or SELECTION_2 or SELECTION_3) falsepositives: - exclude legitimate (vetted) use of WMI event subscription in your network -id: 0f06a3a5-6a09-413f-8743-e6cf35561297 +id: 0090ea60-f4a2-43a8-8657-3a9a4ddcf547 level: high logsource: category: wmi_event diff --git a/test_files/rules/yaml/noisy2.yml b/test_files/rules/yaml/noisy2.yml index 2296fba4..20b18825 100644 --- a/test_files/rules/yaml/noisy2.yml +++ b/test_files/rules/yaml/noisy2.yml @@ -1,9 +1,6 @@ -title: Rare Schtasks Creations -author: Florian Roth +title: Noisy Rule Test2 date: 2017/03/23 -description: Detects rare scheduled tasks creations that only appear a few times per - time frame and could reveal password dumpers, backdoor installs or other types of - malicious code +description: excluded rule detection: SELECTION_1: EventID: 4698 @@ -11,21 +8,6 @@ detection: falsepositives: - Software installation - Software updates -id: b0d77106-7bb0-41fe-bd94-d1752164d066 +id: 8b8db936-172e-4bb7-9f84-ccc954d51d93 level: low -logsource: - definition: The Advanced Audit Policy setting Object 
Access > Audit Other Object - Access Events has to be configured to allow this detection (not in the baseline - recommendations by Microsoft). We also recommend extracting the Command field - from the embedded XML in the event data. - product: windows - service: security -status: experimental -tags: -- attack.execution -- attack.privilege_escalation -- attack.persistence -- attack.t1053 -- car.2013-08-001 -- attack.t1053.005 ruletype: SIGMA diff --git a/test_files/rules/yaml/noisy3.yml b/test_files/rules/yaml/noisy3.yml index 7e2071a0..8b4f209d 100644 --- a/test_files/rules/yaml/noisy3.yml +++ b/test_files/rules/yaml/noisy3.yml @@ -1,26 +1,13 @@ -title: Rare Service Installs -author: Florian Roth +title: Noisy Rule Test 3 date: 2017/03/08 -description: Detects rare service installs that only appear a few times per time frame - and could reveal password dumpers, backdoor installs or other types of malicious - services detection: SELECTION_1: EventID: 7045 condition: SELECTION_1 | count() by ServiceFileName < 5 -falsepositives: -- Software installation -- Software updates -id: 66bfef30-22a5-4fcd-ad44-8d81e60922ae +id: 1703ba97-b2c2-4071-a241-a16d017d25d3 level: low logsource: product: windows service: system status: experimental -tags: -- attack.persistence -- attack.privilege_escalation -- attack.t1050 -- car.2013-09-005 -- attack.t1543.003 ruletype: SIGMA diff --git a/test_files/rules/yaml/noisy4.yml b/test_files/rules/yaml/noisy4.yml index 39bbd1a3..5157c38a 100644 --- a/test_files/rules/yaml/noisy4.yml +++ b/test_files/rules/yaml/noisy4.yml @@ -1,8 +1,5 @@ -title: Failed Logins with Different Accounts from Single Source System -author: Florian Roth +title: Noisy Rule Test 4 date: 2017/01/10 -description: Detects suspicious failed logins with different user accounts from a - single source system detection: SELECTION_1: EventID: 529 @@ -14,20 +11,11 @@ detection: WorkstationName: '*' condition: ((SELECTION_1 or SELECTION_2) and SELECTION_3 and SELECTION_4) | 
count(TargetUserName) by WorkstationName > 3 -falsepositives: -- Terminal servers -- Jump servers -- Other multiuser systems like Citrix server farms -- Workstations with frequently changing users -id: e98374a6-e2d9-4076-9b5c-11bdb2569995 +id: 9f5663ce-6205-4753-b486-fb8498d1fae5 level: medium logsource: product: windows service: security modified: 2021/09/21 status: experimental -tags: -- attack.persistence -- attack.privilege_escalation -- attack.t1078 ruletype: SIGMA diff --git a/test_files/rules/yaml/noisy5.yml b/test_files/rules/yaml/noisy5.yml index ddfc134a..7a4b62d2 100644 --- a/test_files/rules/yaml/noisy5.yml +++ b/test_files/rules/yaml/noisy5.yml @@ -1,8 +1,5 @@ -title: Failed Logins with Different Accounts from Single Source System -author: Florian Roth +title: Noisy Rule Test 5 date: 2017/01/10 -description: Detects suspicious failed logins with different user accounts from a - single source system detection: SELECTION_1: EventID: 4776 @@ -12,23 +9,11 @@ detection: Workstation: '*' condition: (SELECTION_1 and SELECTION_2 and SELECTION_3) | count(TargetUserName) by Workstation > 3 -falsepositives: -- Terminal servers -- Jump servers -- Other multiuser systems like Citrix server farms -- Workstations with frequently changing users -id: 6309ffc4-8fa2-47cf-96b8-a2f72e58e538 +id: 3546ce10-19b4-4c4c-9658-f4f3b5d27ae9 level: medium logsource: product: windows service: security modified: 2021/09/21 -related: -- id: e98374a6-e2d9-4076-9b5c-11bdb2569995 - type: derived status: experimental -tags: -- attack.persistence -- attack.privilege_escalation -- attack.t1078 ruletype: SIGMA