Mirror of https://github.com/mvt-project/mvt.git (synced 2026-02-17 19:02:48 +00:00)

Compare commits: v2.2.3...feature/ne (23 commits)
| SHA1 |
|---|
| ccca58de63 |
| 3787dc48cd |
| f814244ff8 |
| 11730f164f |
| 912fb060cb |
| a9edf4a9fe |
| ea7b9066ba |
| fd81e3aa13 |
| 15477cc187 |
| 551b95b38b |
| d767abb912 |
| 8a507b0a0b |
| 63b95ee6a5 |
| c8ae495971 |
| 33d092692e |
| b1e5dc715f |
| 1dc1ee2238 |
| a2cbaacfce |
| 801fe367ac |
| 0d653be4dd |
| 179b6976fa |
| 577fcf752d |
| 2942209f62 |
````diff
@@ -24,7 +24,7 @@ Some recent phones will enforce the utilisation of a password to encrypt the bac
 
 ## Unpack and check the backup
 
-MVT includes a partial implementation of the Android Backup parsing, because of the implementation difference in the compression algorithm between Java and Python. The `-nocompress` option passed to adb in the section above allows to avoid this issue. You can analyse and extract SMSs containing links from the backup directly with MVT:
+MVT includes a partial implementation of the Android Backup parsing, because of the implementation difference in the compression algorithm between Java and Python. The `-nocompress` option passed to adb in the section above allows to avoid this issue. You can analyse and extract SMSs from the backup directly with MVT:
 
 ```bash
 $ mvt-android check-backup --output /path/to/results/ /path/to/backup.ab
@@ -32,7 +32,7 @@ $ mvt-android check-backup --output /path/to/results/ /path/to/backup.ab
 INFO     [mvt.android.modules.backup.sms] Running module SMS...
 INFO     [mvt.android.modules.backup.sms] Processing SMS backup file at
          apps/com.android.providers.telephony/d_f/000000_sms_backup
-INFO     [mvt.android.modules.backup.sms] Extracted a total of 64 SMS messages containing links
+INFO     [mvt.android.modules.backup.sms] Extracted a total of 64 SMS messages
 ```
 
 If the backup is encrypted, MVT will prompt you to enter the password.
@@ -52,4 +52,4 @@ If the backup is encrypted, ABE will prompt you to enter the password.
 
 Alternatively, [ab-decrypt](https://github.com/joernheissler/ab-decrypt) can be used for that purpose.
 
-You can then extract SMSs containing links with MVT by passing the folder path as parameter instead of the `.ab` file: `mvt-android check-backup --output /path/to/results/ /path/to/backup/` (the path to backup given should be the folder containing the `apps` folder).
+You can then extract SMSs with MVT by passing the folder path as parameter instead of the `.ab` file: `mvt-android check-backup --output /path/to/results/ /path/to/backup/` (the path to backup given should be the folder containing the `apps` folder).
````
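The `-nocompress` workaround exists because MVT's pure-Python parser cannot reproduce Java's compression stream exactly. Whether a given `.ab` file is compressed or encrypted can be read off its short text header. A minimal sketch of such an inspection (this is an illustration, not MVT's actual parser):

```python
import io

def read_ab_header(data: bytes) -> dict:
    """Parse the text header of an Android Backup (.ab) file.

    The header is four newline-separated lines: the magic string
    "ANDROID BACKUP", a format version, a compression flag (0 or 1),
    and the encryption algorithm ("none" or "AES-256").
    """
    stream = io.BytesIO(data)
    magic = stream.readline().strip()
    if magic != b"ANDROID BACKUP":
        raise ValueError("not an Android Backup file")
    version = int(stream.readline().strip())
    compressed = stream.readline().strip() == b"1"
    encryption = stream.readline().strip().decode()
    return {
        "version": version,
        "compressed": compressed,
        "encrypted": encryption != "none",
    }

# A backup produced with `adb backup -nocompress` has the flag set to 0.
header = read_ab_header(b"ANDROID BACKUP\n5\n0\nnone\n")
```

A backup showing `compressed: True` here is the case MVT's partial parser cannot handle.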
```diff
@@ -39,7 +39,9 @@ export MVT_STIX2="/home/user/IOC1.stix2:/home/user/IOC2.stix2"
 - The [Amnesty International investigations repository](https://github.com/AmnestyTech/investigations) contains STIX-formatted IOCs for:
     - [Pegasus](https://en.wikipedia.org/wiki/Pegasus_(spyware)) ([STIX2](https://raw.githubusercontent.com/AmnestyTech/investigations/master/2021-07-18_nso/pegasus.stix2))
     - [Predator from Cytrox](https://citizenlab.ca/2021/12/pegasus-vs-predator-dissidents-doubly-infected-iphone-reveals-cytrox-mercenary-spyware/) ([STIX2](https://raw.githubusercontent.com/AmnestyTech/investigations/master/2021-12-16_cytrox/cytrox.stix2))
+    - [An Android Spyware Campaign Linked to a Mercenary Company](https://github.com/AmnestyTech/investigations/tree/master/2023-03-29_android_campaign) ([STIX2](https://github.com/AmnestyTech/investigations/blob/master/2023-03-29_android_campaign/malware.stix2))
 - [This repository](https://github.com/Te-k/stalkerware-indicators) contains IOCs for Android stalkerware including [a STIX MVT-compatible file](https://raw.githubusercontent.com/Te-k/stalkerware-indicators/master/generated/stalkerware.stix2).
+- We are also maintaining [a list of IOCs](https://github.com/mvt-project/mvt-indicators) in STIX format from public spyware campaigns.
 
 You can automaticallly download the latest public indicator files with the command `mvt-ios download-iocs` or `mvt-android download-iocs`. These commands download the list of indicators listed [here](https://github.com/mvt-project/mvt/blob/main/public_indicators.json) and store them in the [appdir](https://pypi.org/project/appdirs/) folder. They are then loaded automatically by MVT.
```
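MVT loads STIX2 files like the ones listed above and matches their indicators against extracted artifacts. A minimal sketch of pulling domain-name indicators out of a STIX2 bundle (the `[domain-name:value = '...']` pattern syntax is standard STIX; the tiny inline bundle is a made-up example):

```python
import json
import re

def extract_domains(stix2_json: str) -> list:
    """Collect domain-name values from indicator patterns in a STIX2 bundle."""
    bundle = json.loads(stix2_json)
    domains = []
    for obj in bundle.get("objects", []):
        if obj.get("type") != "indicator":
            continue
        # Patterns look like: [domain-name:value = 'example.com']
        for value in re.findall(r"domain-name:value\s*=\s*'([^']+)'",
                                obj.get("pattern", "")):
            domains.append(value)
    return domains

sample = json.dumps({
    "type": "bundle",
    "objects": [
        {"type": "indicator", "pattern": "[domain-name:value = 'bad.example.com']"},
        {"type": "malware", "name": "sample"},
    ],
})
```

MVT's real loader also handles process names, file paths, emails and other indicator types; this sketch covers domains only.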
```diff
@@ -16,6 +16,18 @@ If indicators are provided through the command-line, processes and domains are c
 
 ---
 
+### `applications.json`
+
+!!! info "Availability"
+
+    Backup: :material-check:
+    Full filesystem dump: :material-check:
+
+This JSON file is created by mvt-ios' `Applications` module. The module extracts the list of applications installed on the device from the `Info.plist` file in backup, or from the `iTunesMetadata.plist` files in a file system dump. These records contains detailed information on the source and installation of the app.
+
+If indicators are provided through the command-line, processes and application ids are checked against the app name of each application. It also flags any applications not installed from the AppStore. Any matches are stored in *applications_detected.json*.
+
+---
+
 ### `backup_info.json`
 
 !!! info "Availability"
```
```diff
@@ -38,6 +50,18 @@ If indicators are provided through the command-line, they are checked against th
 
 ---
 
+### `calendar.json`
+
+!!! info "Availability"
+
+    Backup: :material-check:
+    Full filesystem dump: :material-check:
+
+This JSON file is created by mvt-ios' `Calendar` module. This module extracts all CalendarItems from the `Calendar.sqlitedb` database. This database contains all calendar entries from the different calendars installed on the phone.
+
+If indicators are provided through the command-line, email addresses are checked against the inviter's email of the different events. Any matches are stored in *calendar_detected.json*.
+
+---
+
 ### `calls.json`
 
 !!! info "Availability"
```
```diff
@@ -272,7 +296,7 @@ If indicators are provided through the command-line, they are checked against th
     Backup: :material-check:
     Full filesystem dump: :material-check:
 
-This JSON file is created by mvt-ios' `SMS` module. The module extracts a list of SMS messages containing HTTP links from the SQLite database located at */private/var/mobile/Library/SMS/sms.db*.
+This JSON file is created by mvt-ios' `SMS` module. The module extracts a list of SMS messages from the SQLite database located at */private/var/mobile/Library/SMS/sms.db*.
 
 If indicators are provided through the command-line, they are checked against the extracted HTTP links. Any matches are stored in *sms_detected.json*.
```
```diff
@@ -374,7 +398,7 @@ If indicators are provided through the command-line, they are checked against th
     Backup: :material-check:
     Full filesystem dump: :material-check:
 
-This JSON file is created by mvt-ios' `WhatsApp` module. The module extracts a list of WhatsApp messages containing HTTP links from the SQLite database located at *private/var/mobile/Containers/Shared/AppGroup/\*/ChatStorage.sqlite*.
+This JSON file is created by mvt-ios' `WhatsApp` module. The module extracts a list of WhatsApp messages from the SQLite database located at *private/var/mobile/Containers/Shared/AppGroup/\*/ChatStorage.sqlite*.
 
 If indicators are provided through the command-line, they are checked against the extracted HTTP links. Any matches are stored in *whatsapp_detected.json*.
```
```diff
@@ -6,14 +6,15 @@
 import logging
 
 import click
-from rich.logging import RichHandler
 
 from mvt.common.cmd_check_iocs import CmdCheckIOCS
 from mvt.common.help import (HELP_MSG_FAST, HELP_MSG_HASHES, HELP_MSG_IOC,
                              HELP_MSG_LIST_MODULES, HELP_MSG_MODULE,
-                             HELP_MSG_OUTPUT, HELP_MSG_SERIAL)
+                             HELP_MSG_OUTPUT, HELP_MSG_SERIAL,
+                             HELP_MSG_VERBOSE)
 from mvt.common.logo import logo
 from mvt.common.updates import IndicatorsUpdates
+from mvt.common.utils import init_logging, set_verbose_logging
 
 from .cmd_check_adb import CmdAndroidCheckADB
 from .cmd_check_androidqf import CmdAndroidCheckAndroidQF
@@ -25,11 +26,8 @@ from .modules.adb.packages import Packages
 from .modules.backup import BACKUP_MODULES
 from .modules.bugreport import BUGREPORT_MODULES
 
-# Setup logging using Rich.
-LOG_FORMAT = "[%(name)s] %(message)s"
-logging.basicConfig(level="INFO", format=LOG_FORMAT, handlers=[
-    RichHandler(show_path=False, log_time_format="%X")])
-log = logging.getLogger(__name__)
+init_logging()
+log = logging.getLogger("mvt")
 
 CONTEXT_SETTINGS = dict(help_option_names=['-h', '--help'])
 
```
```diff
@@ -63,8 +61,10 @@ def version():
 @click.option("--from-file", "-f", type=click.Path(exists=True),
               help="Instead of acquiring from phone, load an existing packages.json file for "
                    "lookups (mainly for debug purposes)")
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.pass_context
-def download_apks(ctx, all_apks, virustotal, output, from_file, serial):
+def download_apks(ctx, all_apks, virustotal, output, from_file, serial, verbose):
+    set_verbose_logging(verbose)
     try:
         if from_file:
             download = DownloadAPKs.from_json(from_file)
```
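Each command now accepts `--verbose` and calls `set_verbose_logging` before doing any work. The mechanism is just raising the level of the package-wide logger. A standalone sketch of what such a helper plausibly does (the real one lives in `mvt.common.utils`; this version is an assumption, not MVT's exact code):

```python
import logging

def set_verbose_logging(verbose: bool = False) -> None:
    """Switch the shared "mvt" logger between INFO and DEBUG output."""
    level = logging.DEBUG if verbose else logging.INFO
    logging.getLogger("mvt").setLevel(level)

# Toggle verbosity and record the resulting effective levels.
set_verbose_logging(verbose=True)
level_verbose = logging.getLogger("mvt").getEffectiveLevel()
set_verbose_logging(verbose=False)
level_default = logging.getLogger("mvt").getEffectiveLevel()
```

Because all MVT modules log through child loggers of `"mvt"`, one `setLevel` call affects every module at once.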
```diff
@@ -112,8 +112,10 @@ def download_apks(ctx, all_apks, virustotal, output, from_file, serial):
 @click.option("--fast", "-f", is_flag=True, help=HELP_MSG_FAST)
 @click.option("--list-modules", "-l", is_flag=True, help=HELP_MSG_LIST_MODULES)
 @click.option("--module", "-m", help=HELP_MSG_MODULE)
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.pass_context
-def check_adb(ctx, serial, iocs, output, fast, list_modules, module):
+def check_adb(ctx, serial, iocs, output, fast, list_modules, module, verbose):
+    set_verbose_logging(verbose)
     cmd = CmdAndroidCheckADB(results_path=output, ioc_files=iocs,
                              module_name=module, serial=serial, fast_mode=fast)
 
```
```diff
@@ -141,9 +143,11 @@ def check_adb(ctx, serial, iocs, output, fast, list_modules, module):
               help=HELP_MSG_OUTPUT)
 @click.option("--list-modules", "-l", is_flag=True, help=HELP_MSG_LIST_MODULES)
 @click.option("--module", "-m", help=HELP_MSG_MODULE)
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.argument("BUGREPORT_PATH", type=click.Path(exists=True))
 @click.pass_context
-def check_bugreport(ctx, iocs, output, list_modules, module, bugreport_path):
+def check_bugreport(ctx, iocs, output, list_modules, module, verbose, bugreport_path):
+    set_verbose_logging(verbose)
     # Always generate hashes as bug reports are small.
     cmd = CmdAndroidCheckBugreport(target_path=bugreport_path,
                                    results_path=output, ioc_files=iocs,
```
```diff
@@ -172,9 +176,11 @@ def check_bugreport(ctx, iocs, output, list_modules, module, bugreport_path):
 @click.option("--output", "-o", type=click.Path(exists=False),
               help=HELP_MSG_OUTPUT)
 @click.option("--list-modules", "-l", is_flag=True, help=HELP_MSG_LIST_MODULES)
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.argument("BACKUP_PATH", type=click.Path(exists=True))
 @click.pass_context
-def check_backup(ctx, iocs, output, list_modules, backup_path):
+def check_backup(ctx, iocs, output, list_modules, verbose, backup_path):
+    set_verbose_logging(verbose)
     # Always generate hashes as backups are generally small.
     cmd = CmdAndroidCheckBackup(target_path=backup_path, results_path=output,
                                 ioc_files=iocs, hashes=True)
```
```diff
@@ -204,9 +210,11 @@ def check_backup(ctx, iocs, output, list_modules, backup_path):
 @click.option("--list-modules", "-l", is_flag=True, help=HELP_MSG_LIST_MODULES)
 @click.option("--module", "-m", help=HELP_MSG_MODULE)
 @click.option("--hashes", "-H", is_flag=True, help=HELP_MSG_HASHES)
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.argument("ANDROIDQF_PATH", type=click.Path(exists=True))
 @click.pass_context
-def check_androidqf(ctx, iocs, output, list_modules, module, hashes, androidqf_path):
+def check_androidqf(ctx, iocs, output, list_modules, module, hashes, verbose, androidqf_path):
+    set_verbose_logging(verbose)
     cmd = CmdAndroidCheckAndroidQF(target_path=androidqf_path,
                                    results_path=output, ioc_files=iocs,
                                    module_name=module, hashes=hashes)
```
```diff
@@ -21,15 +21,13 @@ log = logging.getLogger(__name__)
 class DownloadAPKs(AndroidExtraction):
     """DownloadAPKs is the main class operating the download of APKs
     from the device.
-
-
     """
 
     def __init__(
         self,
         results_path: Optional[str] = None,
         all_apks: Optional[bool] = False,
-        packages: Optional[list] = None
+        packages: Optional[list] = None,
     ) -> None:
         """Initialize module.
         :param results_path: Path to the folder where data should be stored
```
```diff
@@ -43,7 +43,7 @@ FROM sms;
 
 
 class SMS(AndroidExtraction):
-    """This module extracts all SMS messages containing links."""
+    """This module extracts all SMS messages."""
 
     def __init__(
         self,
```
```diff
@@ -77,8 +77,10 @@ class SMS(AndroidExtraction):
             if "body" not in message:
                 continue
 
-            # TODO: check links exported from the body previously.
-            message_links = check_for_links(message["body"])
+            message_links = message.get("links", [])
+            if message_links == []:
+                message_links = check_for_links(message["body"])
+
             if self.indicators.check_domains(message_links):
                 self.detected.append(message)
```
```diff
@@ -106,15 +108,16 @@ class SMS(AndroidExtraction):
             message["direction"] = ("received" if message["incoming"] == 1 else "sent")
             message["isodate"] = convert_unix_to_iso(message["timestamp"])
 
-            # If we find links in the messages or if they are empty we add
-            # them to the list of results.
-            if check_for_links(message["body"]) or message["body"].strip() == "":
-                self.results.append(message)
+            # Extract links in the message body
+            links = check_for_links(message["body"])
+            message["links"] = links
+
+            self.results.append(message)
 
         cur.close()
         conn.close()
 
-        self.log.info("Extracted a total of %d SMS messages containing links",
+        self.log.info("Extracted a total of %d SMS messages",
                       len(self.results))
 
     def _extract_sms_adb(self) -> None:
```
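The refactor above changes the module from keeping only link-bearing or empty messages to keeping every message, with any URLs found in the body stored under a `links` key. `check_for_links` itself is, roughly, a URL regex. A sketch of that pattern (an approximation, not MVT's exact implementation):

```python
import re

URL_RXP = re.compile(r"https?://[^\s\"'<>]+", re.IGNORECASE)

def check_for_links(text: str) -> list:
    """Return all HTTP/HTTPS URLs found in a message body."""
    return URL_RXP.findall(text)

# Mirror the new logic: store the message together with its extracted links.
message = {"body": "Click https://bad.example.com/track now"}
message["links"] = check_for_links(message["body"])
```

Storing the links on the record lets `check_indicators` reuse them later without re-scanning the body.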
```diff
@@ -137,7 +140,7 @@ class SMS(AndroidExtraction):
                            "Android Backup Extractor")
             return
 
-        self.log.info("Extracted a total of %d SMS messages containing links",
+        self.log.info("Extracted a total of %d SMS messages",
                       len(self.results))
 
     def run(self) -> None:
```
```diff
@@ -38,7 +38,7 @@ class SMS(AndroidQFModule):
             if "body" not in message:
                 continue
 
-            if self.indicators.check_domains(message["links"]):
+            if self.indicators.check_domains(message.get("links", [])):
                 self.detected.append(message)
 
     def parse_backup(self, data):
```
```diff
@@ -8,6 +8,7 @@ from typing import Optional
 
 from mvt.android.modules.backup.base import BackupExtraction
 from mvt.android.parsers.backup import parse_sms_file
+from mvt.common.utils import check_for_links
 
 
 class SMS(BackupExtraction):
```
```diff
@@ -34,7 +35,11 @@ class SMS(BackupExtraction):
             if "body" not in message:
                 continue
 
-            if self.indicators.check_domains(message["links"]):
+            message_links = message.get("links", [])
+            if message_links == []:
+                message_links = check_for_links(message.get("text", ""))
+
+            if self.indicators.check_domains(message_links):
                 self.detected.append(message)
 
     def run(self) -> None:
```
```diff
@@ -50,5 +55,5 @@ class SMS(BackupExtraction):
             data = self._get_file_content(file)
             self.results.extend(parse_sms_file(data))
 
-        self.log.info("Extracted a total of %d SMS & MMS messages containing links",
+        self.log.info("Extracted a total of %d SMS & MMS messages",
                       len(self.results))
```
```diff
@@ -12,6 +12,8 @@ from .dbinfo import DBInfo
 from .getprop import Getprop
 from .packages import Packages
 from .receivers import Receivers
+from .network_interfaces import NetworkInterfaces
 
 BUGREPORT_MODULES = [Accessibility, Activities, Appops, BatteryDaily,
-                     BatteryHistory, DBInfo, Getprop, Packages, Receivers]
+                     BatteryHistory, DBInfo, Getprop, Packages, Receivers,
+                     NetworkInterfaces]
```
```diff
@@ -39,8 +39,9 @@ class Getprop(BugReportModule):
 
         lines = []
         in_getprop = False
 
         for line in content.decode(errors="ignore").splitlines():
-            if line.strip() == "------ SYSTEM PROPERTIES (getprop) ------":
+            if line.strip().startswith("------ SYSTEM PROPERTIES"):
                 in_getprop = True
                 continue
```
```diff
@@ -55,13 +56,14 @@ class Getprop(BugReportModule):
         self.results = parse_getprop("\n".join(lines))
 
         # Alert if phone is outdated.
-        security_patch = self.results.get("ro.build.version.security_patch", "")
-        if security_patch:
-            patch_date = datetime.strptime(security_patch, "%Y-%m-%d")
-            if (datetime.now() - patch_date) > timedelta(days=6*30):
-                self.log.warning("This phone has not received security updates "
-                                 "for more than six months (last update: %s)",
-                                 security_patch)
+        for entry in self.results:
+            if entry["name"] == "ro.build.version.security_patch":
+                security_patch = entry["value"]
+                patch_date = datetime.strptime(security_patch, "%Y-%m-%d")
+                if (datetime.now() - patch_date) > timedelta(days=6*30):
+                    self.log.warning("This phone has not received security updates "
+                                     "for more than six months (last update: %s)",
+                                     security_patch)
 
         self.log.info("Extracted %d Android system properties",
                       len(self.results))
```
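The loop rewrite reflects that `parse_getprop` now returns a list of name/value entries rather than a dict. The outdated-patch alert itself reduces to a date comparison, which can be sketched standalone:

```python
from datetime import datetime, timedelta

def patch_is_outdated(security_patch: str, now: datetime) -> bool:
    """True if an Android security patch level (YYYY-MM-DD) is older
    than roughly six months, matching the 6*30-day window above."""
    patch_date = datetime.strptime(security_patch, "%Y-%m-%d")
    return (now - patch_date) > timedelta(days=6 * 30)

# A patch level a full year old clearly exceeds the six-month window.
outdated = patch_is_outdated("2022-01-01", now=datetime(2023, 1, 1))
```

The fixed `now` argument is only there to make the sketch deterministic; the module uses `datetime.now()` directly.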
`mvt/android/modules/bugreport/network_interfaces.py` (new file, 58 lines)

```diff
@@ -0,0 +1,58 @@
+# Mobile Verification Toolkit (MVT)
+# Copyright (c) 2021-2023 Claudio Guarnieri.
+# Use of this software is governed by the MVT License 1.1 that can be found at
+# https://license.mvt.re/1.1/
+
+import logging
+from typing import Optional
+
+from mvt.android.parsers import parse_dumpsys_network_interfaces
+
+from .base import BugReportModule
+
+
+class NetworkInterfaces(BugReportModule):
+    """This module extracts network interfaces from 'ip link' command."""
+
+    def __init__(
+        self,
+        file_path: Optional[str] = None,
+        target_path: Optional[str] = None,
+        results_path: Optional[str] = None,
+        fast_mode: Optional[bool] = False,
+        log: logging.Logger = logging.getLogger(__name__),
+        results: Optional[list] = None
+    ) -> None:
+        super().__init__(file_path=file_path, target_path=target_path,
+                         results_path=results_path, fast_mode=fast_mode,
+                         log=log, results=results)
+
+        self.results = {} if not results else results
+
+    def run(self) -> None:
+        content = self._get_dumpstate_file()
+        if not content:
+            self.log.error("Unable to find dumpstate file. "
+                           "Did you provide a valid bug report archive?")
+            return
+
+        lines = []
+        in_getprop = False
+
+        for line in content.decode(errors="ignore").splitlines():
+            if line.strip().startswith("------ NETWORK INTERFACES"):
+                in_getprop = True
+                continue
+
+            if not in_getprop:
+                continue
+
+            if line.strip().startswith("------"):
+                break
+
+            lines.append(line)
+
+        self.results = parse_dumpsys_network_interfaces("\n".join(lines))
+
+        self.log.info("Extracted information about %d Android network interfaces",
+                      len(self.results))
```
```diff
@@ -7,5 +7,7 @@ from .dumpsys import (parse_dumpsys_accessibility,
                       parse_dumpsys_activity_resolver_table,
                       parse_dumpsys_appops, parse_dumpsys_battery_daily,
                       parse_dumpsys_battery_history, parse_dumpsys_dbinfo,
-                      parse_dumpsys_receiver_resolver_table)
+                      parse_dumpsys_receiver_resolver_table,
+                      parse_dumpsys_network_interfaces,
+                      )
 from .getprop import parse_getprop
```
```diff
@@ -218,10 +218,9 @@ def parse_sms_file(data):
         entry["isodate"] = convert_unix_to_iso(int(entry["date"]) / 1000)
         entry["direction"] = ("sent" if int(entry["date_sent"]) else "received")
 
-        # If we find links in the messages or if they are empty we add them to
-        # the list.
+        # Extract links from the body
         if message_links or entry["body"].strip() == "":
             entry["links"] = message_links
             res.append(entry)
 
     return res
```
```diff
@@ -519,3 +519,39 @@ def parse_dumpsys_packages(output: str) -> List[Dict[str, Any]]:
         results.append(package)
 
     return results
+
+
+def parse_dumpsys_network_interfaces(output: str) -> List[Dict[str, str]]:
+    """
+    Parse network interfaces (output of the 'ip link' command)
+    """
+    results = []
+    interface_rxp = re.compile(r"(?P<if_number>\d+): (?P<if_name>[\S\d]+): (?P<if_options>\<.*)")
+    mac_or_ip_line_rxp = re.compile(r"\W+ (?P<link_type>[\S]+) (?P<mac_or_ip_address>[a-f0-9\:\.\/]+) (.*)")
+
+    interface = None
+    for line in output.splitlines():
+
+        interface_match = re.match(interface_rxp, line)
+        if interface_match:
+            interface = {
+                "interface_number": interface_match.group("if_number"),
+                "name": interface_match.group("if_name"),
+                "options": interface_match.group("if_options"),
+            }
+            continue
+
+        elif interface:
+            mac_line_match = re.match(mac_or_ip_line_rxp, line)
+            mac_or_ip_address = mac_line_match.group("mac_or_ip_address")
+            if len(mac_or_ip_address) == 17:
+                interface["mac_address"] = mac_or_ip_address
+            else:
+                interface["address"] = mac_or_ip_address
+            interface["link_type"] = mac_line_match.group("link_type")
+            interface["link_line"] = line
+
+            results.append(interface)
+            interface = None
+
+    return results
```
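To see what the new parser's interface-header regex captures, here it is applied to a typical `ip link` line (the sample line is made up, not taken from a real device):

```python
import re

# Same pattern as interface_rxp in the new parser above.
interface_rxp = re.compile(
    r"(?P<if_number>\d+): (?P<if_name>[\S\d]+): (?P<if_options>\<.*)")

line = "2: wlan0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 state UP"
match = interface_rxp.match(line)
interface = {
    "interface_number": match.group("if_number"),
    "name": match.group("if_name"),
    "options": match.group("if_options"),
}
```

The greedy `[\S\d]+` backtracks at the `": "` separator, so the interface name is captured without its trailing colon.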
```diff
@@ -42,20 +42,21 @@ class Command:
         self.fast_mode = fast_mode
         self.log = log
 
-        self.iocs = Indicators(log=log)
-        self.iocs.load_indicators_files(self.ioc_files)
-
         # This list will contain all executed modules.
         # We can use this to reference e.g. self.executed[0].results.
         self.executed = []
 
         self.detected_count = 0
 
         self.hashes = hashes
         self.hash_values = []
         self.timeline = []
         self.timeline_detected = []
 
+        # Load IOCs
+        self._create_storage()
+        self._setup_logging()
+        self.iocs = Indicators(log=log)
+        self.iocs.load_indicators_files(self.ioc_files)
+
     def _create_storage(self) -> None:
         if self.results_path and not os.path.exists(self.results_path):
             try:
```
@@ -65,10 +66,11 @@ class Command:
                                self.results_path, exc)
                 sys.exit(1)

-    def _add_log_file_handler(self, logger: logging.Logger) -> None:
+    def _setup_logging(self):
         if not self.results_path:
             return

+        logger = logging.getLogger("mvt")
         file_handler = logging.FileHandler(os.path.join(self.results_path,
                                                         "command.log"))
         formatter = logging.Formatter("%(asctime)s - %(name)s - "
@@ -150,8 +152,6 @@ class Command:
         raise NotImplementedError

     def run(self) -> None:
-        self._create_storage()
-        self._add_log_file_handler(self.log)

         try:
             self.init()
@@ -162,8 +162,8 @@ class Command:
                 if self.module_name and module.__name__ != self.module_name:
                     continue

+                # FIXME: do we need the logger here
                 module_logger = logging.getLogger(module.__module__)
-                self._add_log_file_handler(module_logger)

                 m = module(target_path=self.target_path,
                            results_path=self.results_path,
@@ -10,6 +10,7 @@ HELP_MSG_FAST = "Avoid running time/resource consuming features"
 HELP_MSG_LIST_MODULES = "Print list of available modules and exit"
 HELP_MSG_MODULE = "Name of a single module you would like to run instead of all"
 HELP_MSG_HASHES = "Generate hashes of all the files analyzed"
+HELP_MSG_VERBOSE = "Verbose mode"

 # Android-specific.
 HELP_MSG_SERIAL = "Specify a device serial number or HOST:PORT connection string"
@@ -15,13 +15,15 @@ from .url import URL
 MVT_DATA_FOLDER = user_data_dir("mvt")
 MVT_INDICATORS_FOLDER = os.path.join(MVT_DATA_FOLDER, "indicators")

+logger = logging.getLogger(__name__)
+

 class Indicators:
     """This class is used to parse indicators from a STIX2 file and provide
     functions to compare extracted artifacts to the indicators.
     """

-    def __init__(self, log=logging.Logger) -> None:
+    def __init__(self, log=logger) -> None:
         self.log = log
         self.ioc_collections: List[Dict[str, Any]] = []
         self.total_ioc_count = 0
@@ -215,7 +217,7 @@ class Indicators:
         self.log.info("Loaded a total of %d unique indicators",
                       self.total_ioc_count)

-    def get_iocs(self, ioc_type: str) -> Union[Iterator[Dict[str, Any]], None]:
+    def get_iocs(self, ioc_type: str) -> Iterator[Dict[str, Any]]:
         for ioc_collection in self.ioc_collections:
             for ioc in ioc_collection.get(ioc_type, []):
                 yield {
@@ -233,10 +235,10 @@
         :returns: Indicator details if matched, otherwise None

         """
-        # TODO: If the IOC domain contains a subdomain, it is not currently
-        # being matched.
         if not url:
             return None
+        if not isinstance(url, str):
+            return None

         try:
             # First we use the provided URL.
@@ -247,15 +249,17 @@
             # HTTP HEAD request.
             unshortened = orig_url.unshorten()

-            # self.log.info("Found a shortened URL %s -> %s",
-            #               url, unshortened)
+            self.log.debug("Found a shortened URL %s -> %s",
+                           url, unshortened)
+            if unshortened is None:
+                return None

             # Now we check for any nested URL shorteners.
             dest_url = URL(unshortened)
             if dest_url.check_if_shortened():
-                # self.log.info("Original URL %s appears to shorten another "
-                #               "shortened URL %s ... checking!",
-                #               orig_url.url, dest_url.url)
+                self.log.debug("Original URL %s appears to shorten another "
+                               "shortened URL %s ... checking!",
+                               orig_url.url, dest_url.url)
                 return self.check_domain(dest_url.url)

             final_url = dest_url
@@ -442,7 +446,7 @@

         return None

-    def check_file_path_process(self, file_path: str) -> Union[dict, None]:
+    def check_file_path_process(self, file_path: str) -> Optional[Dict[str, Any]]:
         """Check the provided file path contains a process name from the
         list of indicators

@@ -463,6 +467,8 @@
                                  file_path, ioc["name"])
                 return ioc

+        return None
+
     def check_profile(self, profile_uuid: str) -> Union[dict, None]:
         """Check the provided configuration profile UUID against the list of
         indicators.
@@ -185,6 +185,10 @@ def run_module(module: MVTModule) -> None:
     except NotImplementedError:
         module.log.info("The %s module does not support checking for indicators",
                         module.__class__.__name__)
+    except Exception as exc:
+        module.log.exception("Error when checking indicators from module %s: %s",
+                             module.__class__.__name__, exc)
+
     else:
         if module.indicators and not module.detected:
             module.log.info("The %s module produced no detections!",
@@ -194,6 +198,9 @@ def run_module(module: MVTModule) -> None:
         module.to_timeline()
     except NotImplementedError:
         pass
+    except Exception as exc:
+        module.log.exception("Error when serializing data from module %s: %s",
+                             module.__class__.__name__, exc)

     module.save_to_json()

@@ -207,7 +214,7 @@ def save_timeline(timeline: list, timeline_path: str) -> None:
     """
     with open(timeline_path, "a+", encoding="utf-8") as handle:
         csvoutput = csv.writer(handle, delimiter=",", quotechar="\"",
-                               quoting=csv.QUOTE_ALL)
+                               quoting=csv.QUOTE_ALL, escapechar='\\')
         csvoutput.writerow(["UTC Timestamp", "Plugin", "Event", "Description"])

         for event in sorted(timeline, key=lambda x: x["timestamp"]
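For context on the `save_timeline()` CSV settings: with `QUOTE_ALL`, every field is quoted, so delimiters, quote characters, and even newlines inside an event description survive a round trip. A minimal sketch (not MVT code; the sample row is invented):

```python
import csv
import io

# Hypothetical timeline row; the Description field deliberately contains a
# comma, an embedded quote, and a newline that the quoting must protect.
event_row = ["2023-01-01 00:00:00.000000", "SMS", "sms_received",
             'Link: "https://example.com/a,b"\nsecond line']

buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar="\"",
                    quoting=csv.QUOTE_ALL, escapechar="\\")
writer.writerow(["UTC Timestamp", "Plugin", "Event", "Description"])
writer.writerow(event_row)

# Reading back with the same dialect settings recovers the row intact.
buf.seek(0)
rows = list(csv.reader(buf, delimiter=",", quotechar="\"", escapechar="\\"))
print(rows[1] == event_row)   # True
```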
@@ -5,10 +5,13 @@

 import datetime
 import hashlib
+import logging
 import os
 import re
 from typing import Any, Iterator, Union

+from rich.logging import RichHandler
+

 def convert_chrometime_to_datetime(timestamp: int) -> datetime.datetime:
     """Converts Chrome timestamp to a datetime.
@@ -161,9 +164,12 @@ def get_sha256_from_file_path(file_path: str) -> str:

     """
     sha256_hash = hashlib.sha256()
-    with open(file_path, "rb") as handle:
-        for byte_block in iter(lambda: handle.read(4096), b""):
-            sha256_hash.update(byte_block)
+    try:
+        with open(file_path, "rb") as handle:
+            for byte_block in iter(lambda: handle.read(4096), b""):
+                sha256_hash.update(byte_block)
+    except OSError:
+        return ""

     return sha256_hash.hexdigest()

@@ -194,3 +200,28 @@ def generate_hashes_from_path(path: str, log) -> Iterator[dict]:
             continue

         yield {"file_path": file_path, "sha256": sha256}
+
+
+def init_logging(verbose: bool = False):
+    """
+    Initialise logging for the MVT module
+    """
+    # Setup logging using Rich.
+    log = logging.getLogger("mvt")
+    log.setLevel(logging.DEBUG)
+    consoleHandler = RichHandler(show_path=False, log_time_format="%X")
+    consoleHandler.setFormatter(logging.Formatter("[%(name)s] %(message)s"))
+    if verbose:
+        consoleHandler.setLevel(logging.DEBUG)
+    else:
+        consoleHandler.setLevel(logging.INFO)
+    log.addHandler(consoleHandler)
+
+
+def set_verbose_logging(verbose: bool = False):
+    log = logging.getLogger("mvt")
+    handler = log.handlers[0]
+    if verbose:
+        handler.setLevel(logging.DEBUG)
+    else:
+        handler.setLevel(logging.INFO)
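The `init_logging()`/`set_verbose_logging()` split works because the level filter lives on the handler while the `mvt` logger itself stays at `DEBUG`. A stdlib-only sketch of the same pattern (using `StreamHandler` in place of Rich's `RichHandler`; the logger name here is made up):

```python
import logging

log = logging.getLogger("demo-mvt")   # stand-in for the "mvt" logger
log.setLevel(logging.DEBUG)           # the logger itself passes everything through

handler = logging.StreamHandler()
handler.setLevel(logging.INFO)        # the handler decides what is actually shown
log.addHandler(handler)

def set_verbose(verbose: bool = False) -> None:
    # Mirrors set_verbose_logging(): flip the first handler's level.
    log.handlers[0].setLevel(logging.DEBUG if verbose else logging.INFO)

set_verbose(True)
print(log.handlers[0].level == logging.DEBUG)   # True
```

Because later calls only touch `log.handlers[0]`, the console handler installed first keeps being the one toggled.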
@@ -3,4 +3,4 @@
 # Use of this software is governed by the MVT License 1.1 that can be found at
 # https://license.mvt.re/1.1/

-MVT_VERSION = "2.2.3"
+MVT_VERSION = "2.2.6"
@@ -8,17 +8,17 @@ import logging
 import os

 import click
-from rich.logging import RichHandler
 from rich.prompt import Prompt

 from mvt.common.cmd_check_iocs import CmdCheckIOCS
 from mvt.common.help import (HELP_MSG_FAST, HELP_MSG_HASHES, HELP_MSG_IOC,
                              HELP_MSG_LIST_MODULES, HELP_MSG_MODULE,
-                             HELP_MSG_OUTPUT)
+                             HELP_MSG_OUTPUT, HELP_MSG_VERBOSE)
 from mvt.common.logo import logo
 from mvt.common.options import MutuallyExclusiveOption
 from mvt.common.updates import IndicatorsUpdates
-from mvt.common.utils import generate_hashes_from_path
+from mvt.common.utils import (generate_hashes_from_path, init_logging,
+                              set_verbose_logging)

 from .cmd_check_backup import CmdIOSCheckBackup
 from .cmd_check_fs import CmdIOSCheckFS
@@ -27,11 +27,8 @@ from .modules.backup import BACKUP_MODULES
 from .modules.fs import FS_MODULES
 from .modules.mixed import MIXED_MODULES

-# Setup logging using Rich.
-LOG_FORMAT = "[%(name)s] %(message)s"
-logging.basicConfig(level="INFO", format=LOG_FORMAT, handlers=[
-    RichHandler(show_path=False, log_time_format="%X")])
-log = logging.getLogger(__name__)
+init_logging()
+log = logging.getLogger("mvt")

 # Set this environment variable to a password if needed.
 MVT_IOS_BACKUP_PASSWORD = "MVT_IOS_BACKUP_PASSWORD"
@@ -166,9 +163,12 @@ def extract_key(password, key_file, backup_path):
 @click.option("--list-modules", "-l", is_flag=True, help=HELP_MSG_LIST_MODULES)
 @click.option("--module", "-m", help=HELP_MSG_MODULE)
 @click.option("--hashes", "-H", is_flag=True, help=HELP_MSG_HASHES)
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.argument("BACKUP_PATH", type=click.Path(exists=True))
 @click.pass_context
-def check_backup(ctx, iocs, output, fast, list_modules, module, hashes, backup_path):
+def check_backup(ctx, iocs, output, fast, list_modules, module, hashes, verbose, backup_path):
+    set_verbose_logging(verbose)
+
     cmd = CmdIOSCheckBackup(target_path=backup_path, results_path=output,
                             ioc_files=iocs, module_name=module, fast_mode=fast,
                             hashes=hashes)
@@ -199,9 +199,11 @@ def check_backup(ctx, iocs, output, fast, list_modules, module, hashes, backup_p
 @click.option("--list-modules", "-l", is_flag=True, help=HELP_MSG_LIST_MODULES)
 @click.option("--module", "-m", help=HELP_MSG_MODULE)
 @click.option("--hashes", "-H", is_flag=True, help=HELP_MSG_HASHES)
+@click.option("--verbose", "-v", is_flag=True, help=HELP_MSG_VERBOSE)
 @click.argument("DUMP_PATH", type=click.Path(exists=True))
 @click.pass_context
-def check_fs(ctx, iocs, output, fast, list_modules, module, hashes, dump_path):
+def check_fs(ctx, iocs, output, fast, list_modules, module, hashes, verbose, dump_path):
+    set_verbose_logging(verbose)
     cmd = CmdIOSCheckFS(target_path=dump_path, results_path=output,
                         ioc_files=iocs, module_name=module, fast_mode=fast,
                         hashes=hashes)
@@ -34,7 +34,6 @@ class IOSExtraction(MVTModule):

         self.is_backup = False
         self.is_fs_dump = False
-        self.is_sysdiagnose = False

     def _recover_sqlite_db_if_needed(self, file_path: str,
                                      forced: Optional[bool] = False) -> None:
@@ -3,6 +3,7 @@
 # Use of this software is governed by the MVT License 1.1 that can be found at
 # https://license.mvt.re/1.1/

+import copy
 import logging
 import plistlib
 import sqlite3
@@ -55,18 +56,20 @@ class Analytics(IOSExtraction):
                 if ioc:
                     self.log.warning("Found mention of a malicious process \"%s\" in %s file at %s",
                                      value, result["artifact"],
-                                     result["timestamp"])
-                    result["matched_indicator"] = ioc
-                    self.detected.append(result)
+                                     result["isodate"])
+                    new_result = copy.copy(result)
+                    new_result["matched_indicator"] = ioc
+                    self.detected.append(new_result)
                     continue

                 ioc = self.indicators.check_domain(value)
                 if ioc:
                     self.log.warning("Found mention of a malicious domain \"%s\" in %s file at %s",
                                      value, result["artifact"],
-                                     result["timestamp"])
-                    result["matched_indicator"] = ioc
-                    self.detected.append(result)
+                                     result["isodate"])
+                    new_result = copy.copy(result)
+                    new_result["matched_indicator"] = ioc
+                    self.detected.append(new_result)

     def _extract_analytics_data(self):
         artifact = self.file_path.split("/")[-1]
@@ -15,8 +15,6 @@ from ..base import IOSExtraction
 class Filesystem(IOSExtraction):
     """This module extracts creation and modification date of files from a
     full file-system dump.
-
-
     """

     def __init__(
@@ -3,6 +3,8 @@
 # Use of this software is governed by the MVT License 1.1 that can be found at
 # https://license.mvt.re/1.1/

+from .applications import Applications
+from .calendar import Calendar
 from .calls import Calls
 from .chrome_favicon import ChromeFavicon
 from .chrome_history import ChromeHistory
@@ -28,4 +30,5 @@ MIXED_MODULES = [Calls, ChromeFavicon, ChromeHistory, Contacts, FirefoxFavicon,
                  FirefoxHistory, IDStatusCache, InteractionC, LocationdClients,
                  OSAnalyticsADDaily, Datausage, SafariBrowserState, SafariHistory,
                  TCC, SMS, SMSAttachments, WebkitResourceLoadStatistics,
-                 WebkitSessionResourceLog, Whatsapp, Shortcuts]
+                 WebkitSessionResourceLog, Whatsapp, Shortcuts, Applications,
+                 Calendar]
mvt/ios/modules/mixed/applications.py (new file, 123 lines)
@@ -0,0 +1,123 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 Claudio Guarnieri.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import hashlib
import logging
import os
import plistlib
from datetime import datetime, timezone
from typing import Any, Dict, Optional, Union

from mvt.common.module import DatabaseNotFoundError
from mvt.common.utils import convert_datetime_to_iso
from mvt.ios.modules.base import IOSExtraction

APPLICATIONS_DB_PATH = [
    "private/var/containers/Bundle/Application/*/iTunesMetadata.plist"
]


class Applications(IOSExtraction):
    """Extract information from accounts installed on the phone."""
    def __init__(
        self,
        file_path: Optional[str] = None,
        target_path: Optional[str] = None,
        results_path: Optional[str] = None,
        fast_mode: Optional[bool] = False,
        log: logging.Logger = logging.getLogger(__name__),
        results: Optional[list] = None
    ) -> None:
        super().__init__(file_path=file_path, target_path=target_path,
                         results_path=results_path, fast_mode=fast_mode,
                         log=log, results=results)

    def serialize(self, record: dict) -> Union[dict, list]:
        if "isodate" in record:
            return {
                "timestamp": record["isodate"],
                "module": self.__class__.__name__,
                "event": "app_installed",
                "data": f"App {record.get('name', '')} version {record.get('bundleShortVersionString', '')} from {record.get('artistName', '')} installed from {record.get('sourceApp', '')}"
            }
        return []

    def check_indicators(self) -> None:
        for result in self.results:
            if self.indicators:
                ioc = self.indicators.check_process(result["softwareVersionBundleId"])
                if ioc:
                    self.log.warning("Malicious application %s identified", result["softwareVersionBundleId"])
                    result["matched_indicator"] = ioc
                    self.detected.append(result)
                    continue

                ioc = self.indicators.check_app_id(result["softwareVersionBundleId"])
                if ioc:
                    self.log.warning("Malicious application %s identified", result["softwareVersionBundleId"])
                    result["matched_indicator"] = ioc
                    self.detected.append(result)
                    continue

            if result.get("sourceApp", "com.apple.AppStore") not in ["com.apple.AppStore", "com.apple.dmd", "dmd"]:
                self.log.warning("Suspicious app not installed from the App Store or MDM: %s", result["softwareVersionBundleId"])
                self.detected.append(result)

    def _parse_itunes_timestamp(self, entry: Dict[str, Any]) -> None:
        """
        Parse the iTunes metadata info
        """
        if entry.get("com.apple.iTunesStore.downloadInfo", {}).get("purchaseDate", None):
            timestamp = datetime.strptime(
                entry["com.apple.iTunesStore.downloadInfo"]["purchaseDate"],
                "%Y-%m-%dT%H:%M:%SZ")
            timestamp_utc = timestamp.astimezone(timezone.utc)
            entry["isodate"] = convert_datetime_to_iso(timestamp_utc)

    def _parse_itunes_metadata(self, plist_path: str) -> None:
        """
        Parse iTunesMetadata.plist file from an application in fs dump
        """
        with open(plist_path, "rb") as f:
            entry = plistlib.load(f)

        entry["file_path"] = plist_path
        self._parse_itunes_timestamp(entry)
        self.results.append(entry)

    def _parse_info_plist(self, plist_path: str) -> None:
        """
        Parse Info.plist file from backup
        """
        with open(plist_path, "rb") as f:
            data = plistlib.load(f)

        for app in data.get("Applications", {}):
            app_data = data["Applications"][app]
            entry = {"name": app}
            metadata = plistlib.loads(app_data["iTunesMetadata"])
            entry.update(metadata)

            self._parse_itunes_timestamp(entry)

            if "PlaceholderIcon" in app_data:
                sha256_hash = hashlib.sha256()
                sha256_hash.update(app_data["PlaceholderIcon"])
                entry["icon_sha256"] = sha256_hash.hexdigest()

            self.results.append(entry)

    def run(self) -> None:
        if self.is_backup:
            plist_path = os.path.join(self.target_path, "Info.plist")
            if not os.path.isfile(plist_path):
                raise DatabaseNotFoundError("Impossible to find Info.plist file")
            self._parse_info_plist(plist_path)
        elif self.is_fs_dump:
            for file_path in self._get_fs_files_from_patterns(APPLICATIONS_DB_PATH):
                self._parse_itunes_metadata(file_path)

        self.log.info("Extracted a total of %d applications",
                      len(self.results))
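To see what the new `Applications` module reads out of an `iTunesMetadata.plist`, here is a self-contained sketch that builds a hypothetical plist in memory and parses it the same way (`convert_datetime_to_iso` is left out, and every key value below is invented for illustration):

```python
import plistlib
from datetime import datetime

# Hypothetical iTunesMetadata.plist payload with the keys the module reads.
metadata = {
    "softwareVersionBundleId": "com.example.app",
    "bundleShortVersionString": "1.2.3",
    "artistName": "Example Corp",
    "sourceApp": "com.apple.AppStore",
    "com.apple.iTunesStore.downloadInfo": {
        "purchaseDate": "2023-01-15T10:30:00Z",
    },
}

# Round-trip through the binary plist format, as plistlib.load() would do.
entry = plistlib.loads(plistlib.dumps(metadata))

# Same strptime format as _parse_itunes_timestamp() (the module then
# converts the result to an ISO string in UTC).
purchase = datetime.strptime(
    entry["com.apple.iTunesStore.downloadInfo"]["purchaseDate"],
    "%Y-%m-%dT%H:%M:%SZ")
print(entry["softwareVersionBundleId"], purchase.year)   # com.example.app 2023
```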
mvt/ios/modules/mixed/calendar.py (new file, 136 lines)
@@ -0,0 +1,136 @@
|
|||||||
|
# Mobile Verification Toolkit (MVT)
|
||||||
|
# Copyright (c) 2021-2023 Claudio Guarnieri.
|
||||||
|
# Use of this software is governed by the MVT License 1.1 that can be found at
|
||||||
|
# https://license.mvt.re/1.1/
|
||||||
|
|
||||||
|
import logging
|
||||||
|
import sqlite3
|
||||||
|
from typing import Optional, Union
|
||||||
|
|
||||||
|
from mvt.common.utils import convert_mactime_to_iso
|
||||||
|
|
||||||
|
from ..base import IOSExtraction
|
||||||
|
|
||||||
|
CALENDAR_BACKUP_IDS = [
|
||||||
|
"2041457d5fe04d39d0ab481178355df6781e6858",
|
||||||
|
]
|
||||||
|
CALENDAR_ROOT_PATHS = [
|
||||||
|
"private/var/mobile/Library/Calendar/Calendar.sqlitedb"
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
class Calendar(IOSExtraction):
|
||||||
|
"""This module extracts all calendar entries."""
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
file_path: Optional[str] = None,
|
||||||
|
target_path: Optional[str] = None,
|
||||||
|
results_path: Optional[str] = None,
|
||||||
|
fast_mode: Optional[bool] = False,
|
||||||
|
log: logging.Logger = logging.getLogger(__name__),
|
||||||
|
results: Optional[list] = None
|
||||||
|
) -> None:
|
||||||
|
super().__init__(file_path=file_path, target_path=target_path,
|
||||||
|
results_path=results_path, fast_mode=fast_mode,
|
||||||
|
log=log, results=results)
|
||||||
|
self.timestamps = [
|
||||||
|
"start_date",
|
||||||
|
"end_date",
|
||||||
|
"last_modified",
|
||||||
|
"creation_date",
|
||||||
|
"participant_last_modified"
|
||||||
|
]
|
||||||
|
|
||||||
|
def serialize(self, record: dict) -> Union[dict, list]:
|
||||||
|
records = []
|
||||||
|
for timestamp in self.timestamps:
|
||||||
|
if timestamp not in record or not record[timestamp]:
|
||||||
|
continue
|
||||||
|
|
||||||
|
records.append({
|
||||||
|
"timestamp": record[timestamp],
|
||||||
|
"module": self.__class__.__name__,
|
||||||
|
"event": timestamp,
|
||||||
|
"data": f"Calendar event {record['summary']} ({record['description']}) "
|
||||||
|
f"(invitation by {record['participant_email']})"
|
||||||
|
})
|
||||||
|
return records
|
||||||
|
|
||||||
|
def check_indicators(self) -> None:
|
||||||
|
for result in self.results:
|
||||||
|
if result["participant_email"] and self.indicators:
|
||||||
|
ioc = self.indicators.check_email(result["participant_email"])
|
||||||
|
if ioc:
|
||||||
|
result["matched_indicator"] = ioc
|
||||||
|
self.detected.append(result)
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Custom check for Quadream exploit
|
||||||
|
if result["summary"] == "Meeting" and result["description"] == "Notes":
|
||||||
|
self.log.warning("Potential Quadream exploit event identified: %s", result["uuid"])
|
||||||
|
self.detected.append(result)
|
||||||
|
|
||||||
|
def _parse_calendar_db(self):
|
||||||
|
"""
|
||||||
|
Parse the calendar database
|
||||||
|
"""
|
||||||
|
conn = sqlite3.connect(self.file_path)
|
||||||
|
cur = conn.cursor()
|
||||||
|
|
||||||
|
cur.execute("""
|
||||||
|
SELECT
|
||||||
|
CalendarItem.ROWID as "id",
|
||||||
|
CalendarItem.summary as "summary",
|
||||||
|
CalendarItem.description as "description",
|
||||||
|
CalendarItem.start_date as "start_date",
|
+        CalendarItem.end_date as "end_date",
+        CalendarItem.all_day as "all_day",
+        CalendarItem.calendar_id as "calendar_id",
+        CalendarItem.organizer_id as "organizer_id",
+        CalendarItem.url as "url",
+        CalendarItem.last_modified as "last_modified",
+        CalendarItem.external_id as "external_id",
+        CalendarItem.external_mod_tag as "external_mod_tag",
+        CalendarItem.unique_identifier as "unique_identifier",
+        CalendarItem.hidden as "hidden",
+        CalendarItem.UUID as "uuid",
+        CalendarItem.creation_date as "creation_date",
+        CalendarItem.action as "action",
+        CalendarItem.created_by_id as "created_by_id",
+        Participant.UUID as "participant_uuid",
+        Participant.email as "participant_email",
+        Participant.phone_number as "participant_phone",
+        Participant.comment as "participant_comment",
+        Participant.last_modified as "participant_last_modified"
+    FROM CalendarItem
+    LEFT JOIN Participant ON Participant.ROWID = CalendarItem.organizer_id;
+        """)
+
+        names = [description[0] for description in cur.description]
+        for item in cur:
+            entry = {}
+            for index, value in enumerate(item):
+                if names[index] in self.timestamps:
+                    if value is None or isinstance(value, str):
+                        entry[names[index]] = value
+                    else:
+                        entry[names[index]] = convert_mactime_to_iso(value)
+                else:
+                    entry[names[index]] = value
+
+            self.results.append(entry)
+
+        cur.close()
+        conn.close()
+
+    def run(self) -> None:
+        self._find_ios_database(backup_ids=CALENDAR_BACKUP_IDS,
+                                root_paths=CALENDAR_ROOT_PATHS)
+        self.log.info("Found calendar database at path: %s", self.file_path)
+
+        self._parse_calendar_db()
+
+        self.log.info("Extracted a total of %d calendar items",
+                      len(self.results))
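The parsing loop above keys each result dict by the column aliases that sqlite3 exposes through `cur.description`, and converts only the values of known timestamp columns. A minimal standalone sketch of that pattern (the table, the `TIMESTAMP_COLUMNS` list, and the `mac_to_iso()` helper are illustrative stand-ins, not MVT's actual API):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical stand-in for mvt's convert_mactime_to_iso(): Apple Core Data
# timestamps count seconds from 2001-01-01 00:00:00 UTC.
def mac_to_iso(value):
    epoch = datetime(2001, 1, 1, tzinfo=timezone.utc)
    return (epoch + timedelta(seconds=value)).strftime("%Y-%m-%d %H:%M:%S.%f")

TIMESTAMP_COLUMNS = ["creation_date"]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CalendarItem (summary TEXT, creation_date REAL)")
conn.execute("INSERT INTO CalendarItem VALUES ('meeting', 700000000.0)")

cur = conn.cursor()
cur.execute('SELECT summary, creation_date AS "creation_date" FROM CalendarItem')

# Same pattern as the module: use cursor metadata for the keys, and only
# convert numeric values of known timestamp columns.
names = [description[0] for description in cur.description]
results = []
for item in cur:
    entry = {}
    for index, value in enumerate(item):
        if names[index] in TIMESTAMP_COLUMNS and not (value is None or isinstance(value, str)):
            entry[names[index]] = mac_to_iso(value)
        else:
            entry[names[index]] = value
    results.append(entry)

print(results[0])
```

Keying by `cur.description` means the dict layout follows the SQL aliases, so the query and the record schema cannot drift apart.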
@@ -17,6 +17,200 @@ INTERACTIONC_BACKUP_IDS = [
 INTERACTIONC_ROOT_PATHS = [
     "private/var/mobile/Library/CoreDuet/People/interactionC.db",
 ]
+
+# Taken from APOLLO
+# https://github.com/mac4n6/APOLLO/blob/master/modules/interaction_contact_interactions.txt
+QUERIES = [
+    """SELECT
+        ZINTERACTIONS.ZSTARTDATE AS "start_date",
+        ZINTERACTIONS.ZENDDATE AS "end_date",
+        ZINTERACTIONS.ZBUNDLEID AS "bundle_id",
+        ZINTERACTIONS.ZACCOUNT AS "account",
+        ZINTERACTIONS.ZTARGETBUNDLEID AS "target_bundle_id",
+        CASE ZINTERACTIONS.ZDIRECTION
+            WHEN '0' THEN 'INCOMING'
+            WHEN '1' THEN 'OUTGOING'
+        END 'DIRECTION' AS "direction",
+        ZCONTACTS.ZDISPLAYNAME AS "sender_display_name",
+        ZCONTACTS.ZIDENTIFIER AS "sender_identifier",
+        ZCONTACTS.ZPERSONID AS "sender_personid",
+        RECEIPIENTCONACT.ZDISPLAYNAME AS "recipient_display_name",
+        RECEIPIENTCONACT.ZIDENTIFIER AS "recipient_identifier",
+        RECEIPIENTCONACT.ZPERSONID AS "recipient_personid",
+        ZINTERACTIONS.ZRECIPIENTCOUNT AS "recipient_count",
+        ZINTERACTIONS.ZDOMAINIDENTIFIER AS "domain_identifier",
+        ZINTERACTIONS.ZISRESPONSE AS "is_response",
+        ZATTACHMENT.ZCONTENTTEXT AS "content",
+        ZATTACHMENT.ZUTI AS "uti",
+        ZATTACHMENT.ZCONTENTURL AS "attachment_content_url",
+        ZATTACHMENT.ZSIZEINBYTES AS "size",
+        ZATTACHMENT.ZPHOTOLOCALIDENTIFIER AS "photo_local_id",
+        HEX(ZATTACHMENT.ZIDENTIFIER) AS "attachment_id",
+        ZATTACHMENT.ZCLOUDIDENTIFIER AS "cloud_id",
+        ZCONTACTS.ZINCOMINGRECIPIENTCOUNT AS "incoming_recipient_count",
+        ZCONTACTS.ZINCOMINGSENDERCOUNT AS "incoming_sender_count",
+        ZCONTACTS.ZOUTGOINGRECIPIENTCOUNT AS "outgoing_recipient_count",
+        ZINTERACTIONS.ZCREATIONDATE AS "interactions_creation_date",
+        ZCONTACTS.ZCREATIONDATE AS "contacts_creation_date",
+        ZCONTACTS.ZFIRSTINCOMINGRECIPIENTDATE AS "first_incoming_recipient_date",
+        ZCONTACTS.ZFIRSTINCOMINGSENDERDATE AS "first_incoming_sender_date",
+        ZCONTACTS.ZFIRSTOUTGOINGRECIPIENTDATE AS "first_outgoing_recipient_date",
+        ZCONTACTS.ZLASTINCOMINGSENDERDATE AS "last_incoming_sender_date",
+        ZCONTACTS.ZLASTINCOMINGRECIPIENTDATE AS "last_incoming_recipient_date",
+        ZCONTACTS.ZLASTOUTGOINGRECIPIENTDATE AS "last_outgoing_recipient_date",
+        ZCONTACTS.ZCUSTOMIDENTIFIER AS "custom_id",
+        ZINTERACTIONS.ZCONTENTURL AS "interaction_content_url",
+        ZINTERACTIONS.ZLOCATIONUUID AS "location_uuid",
+        ZINTERACTIONS.ZGROUPNAME AS "group_name",
+        ZINTERACTIONS.ZDERIVEDINTENTIDENTIFIER AS "derivied_intent_id",
+        ZINTERACTIONS.Z_PK AS "table_id"
+    FROM ZINTERACTIONS
+    LEFT JOIN ZCONTACTS
+        ON ZINTERACTIONS.ZSENDER = ZCONTACTS.Z_PK
+    LEFT JOIN Z_1INTERACTIONS
+        ON ZINTERACTIONS.Z_PK == Z_1INTERACTIONS.Z_3INTERACTIONS
+    LEFT JOIN ZATTACHMENT
+        ON Z_1INTERACTIONS.Z_1ATTACHMENTS == ZATTACHMENT.Z_PK
+    LEFT JOIN Z_2INTERACTIONRECIPIENT
+        ON ZINTERACTIONS.Z_PK == Z_2INTERACTIONRECIPIENT.Z_3INTERACTIONRECIPIENT
+    LEFT JOIN ZCONTACTS RECEIPIENTCONACT
+        ON Z_2INTERACTIONRECIPIENT.Z_2RECIPIENTS == RECEIPIENTCONACT.Z_PK;
+    """,
+    """ SELECT
+        ZINTERACTIONS.ZSTARTDATE AS "start_date",
+        ZINTERACTIONS.ZENDDATE AS "end_date",
+        ZINTERACTIONS.ZBUNDLEID AS "bundle_id",
+        ZINTERACTIONS.ZACCOUNT AS "account",
+        ZINTERACTIONS.ZTARGETBUNDLEID AS "target_bundle_id",
+        CASE ZINTERACTIONS.ZDIRECTION
+            WHEN '0' THEN 'INCOMING'
+            WHEN '1' THEN 'OUTGOING'
+        END 'DIRECTION' AS "direction",
+        ZCONTACTS.ZDISPLAYNAME AS "sender_display_name",
+        ZCONTACTS.ZIDENTIFIER AS "sender_identifier",
+        ZCONTACTS.ZPERSONID AS "sender_personid",
+        RECEIPIENTCONACT.ZDISPLAYNAME AS "recipient_display_name",
+        RECEIPIENTCONACT.ZIDENTIFIER AS "recipient_identifier",
+        RECEIPIENTCONACT.ZPERSONID AS "recipient_personid",
+        ZINTERACTIONS.ZRECIPIENTCOUNT AS "recipient_count",
+        ZINTERACTIONS.ZDOMAINIDENTIFIER AS "domain_identifier",
+        ZINTERACTIONS.ZISRESPONSE AS "is_response",
+        ZATTACHMENT.ZCONTENTTEXT AS "content",
+        ZATTACHMENT.ZUTI AS "uti",
+        ZATTACHMENT.ZCONTENTURL AS "attachment_content_url",
+        ZATTACHMENT.ZSIZEINBYTES AS "size",
+        HEX(ZATTACHMENT.ZIDENTIFIER) AS "attachment_id",
+        ZATTACHMENT.ZCLOUDIDENTIFIER AS "cloud_id",
+        ZCONTACTS.ZINCOMINGRECIPIENTCOUNT AS "incoming_recipient_count",
+        ZCONTACTS.ZINCOMINGSENDERCOUNT AS "incoming_sender_count",
+        ZCONTACTS.ZOUTGOINGRECIPIENTCOUNT AS "outgoing_recipient_count",
+        ZINTERACTIONS.ZCREATIONDATE AS "interactions_creation_date",
+        ZCONTACTS.ZCREATIONDATE AS "contacts_creation_date",
+        ZCONTACTS.ZFIRSTINCOMINGRECIPIENTDATE AS "first_incoming_recipient_date",
+        ZCONTACTS.ZFIRSTINCOMINGSENDERDATE AS "first_incoming_sender_date",
+        ZCONTACTS.ZFIRSTOUTGOINGRECIPIENTDATE AS "first_outgoing_recipient_date",
+        ZCONTACTS.ZLASTINCOMINGSENDERDATE AS "last_incoming_sender_date",
+        CASE ZCONTACTS.ZLASTINCOMINGRECIPIENTDATE
+            WHEN '0' THEN '0'
+            ELSE ZCONTACTS.ZLASTINCOMINGRECIPIENTDATE
+        END 'LAST INCOMING RECIPIENT DATE' AS "last_incoming_recipient_date",
+        ZCONTACTS.ZLASTOUTGOINGRECIPIENTDATE AS "last_outgoing_recipient_date",
+        ZCONTACTS.ZCUSTOMIDENTIFIER AS "custom_id",
+        ZINTERACTIONS.ZCONTENTURL AS "interaction_content_url",
+        ZINTERACTIONS.ZLOCATIONUUID AS "location_uuid",
+        ZINTERACTIONS.Z_PK AS "table_id"
+    FROM
+        ZINTERACTIONS
+    LEFT JOIN
+        ZCONTACTS
+        ON ZINTERACTIONS.ZSENDER = ZCONTACTS.Z_PK
+    LEFT JOIN Z_1INTERACTIONS ON ZINTERACTIONS.Z_PK == Z_1INTERACTIONS.Z_3INTERACTIONS
+    LEFT JOIN ZATTACHMENT ON Z_1INTERACTIONS.Z_1ATTACHMENTS == ZATTACHMENT.Z_PK
+    LEFT JOIN Z_2INTERACTIONRECIPIENT ON ZINTERACTIONS.Z_PK == Z_2INTERACTIONRECIPIENT.Z_3INTERACTIONRECIPIENT
+    LEFT JOIN ZCONTACTS RECEIPIENTCONACT ON Z_2INTERACTIONRECIPIENT.Z_2RECIPIENTS == RECEIPIENTCONACT.Z_PK
+    """,
+    """ SELECT
+        ZINTERACTIONS.ZSTARTDATE AS "start_date",
+        ZINTERACTIONS.ZENDDATE AS "end_date",
+        ZINTERACTIONS.ZBUNDLEID AS "bundle_id",
+        ZCONTACTS.ZDISPLAYNAME AS "sender_display_name",
+        ZCONTACTS.ZIDENTIFIER AS "sender_identifier",
+        ZCONTACTS.ZPERSONID AS "sender_personid",
+        ZINTERACTIONS.ZDIRECTION AS "direction",
+        ZINTERACTIONS.ZISRESPONSE AS "is_response",
+        ZINTERACTIONS.ZMECHANISM AS "mechanism",
+        ZINTERACTIONS.ZRECIPIENTCOUNT AS "recipient_count",
+        ZINTERACTIONS.ZCREATIONDATE AS "interactions_creation_date",
+        ZCONTACTS.ZCREATIONDATE AS "contacts_creation_date",
+        ZCONTACTS.ZFIRSTINCOMINGRECIPIENTDATE AS "first_incoming_recipient_date",
+        ZCONTACTS.ZFIRSTINCOMINGSENDERDATE AS "first_incoming_sender_date",
+        ZCONTACTS.ZFIRSTOUTGOINGRECIPIENTDATE AS "first_outgoing_recipient_date",
+        ZCONTACTS.ZLASTINCOMINGSENDERDATE AS "last_incoming_sender_date",
+        CASE ZLASTINCOMINGRECIPIENTDATE
+            WHEN '0' THEN '0'
+            ELSE ZCONTACTS.ZLASTINCOMINGRECIPIENTDATE
+        END AS "last_incoming_recipient_date",
+        ZCONTACTS.ZLASTOUTGOINGRECIPIENTDATE AS "last_outgoing_recipient_date",
+        ZINTERACTIONS.ZACCOUNT AS 'account',
+        ZINTERACTIONS.ZDOMAINIDENTIFIER AS "domain_identifier",
+        ZCONTACTS.ZINCOMINGRECIPIENTCOUNT AS "incoming_recipient_count",
+        ZCONTACTS.ZINCOMINGSENDERCOUNT AS "incoming_sender_count",
+        ZCONTACTS.ZOUTGOINGRECIPIENTCOUNT AS "outgoing_recipient_count",
+        ZCONTACTS.ZCUSTOMIDENTIFIER AS "custom_id",
+        ZINTERACTIONS.ZCONTENTURL AS "interaction_content_url",
+        ZINTERACTIONS.ZLOCATIONUUID AS "location_uuid",
+        ZINTERACTIONS.Z_PK AS "table_id"
+    FROM
+        ZINTERACTIONS
+    LEFT JOIN
+        ZCONTACTS
+        ON ZINTERACTIONS.ZSENDER = ZCONTACTS.Z_PK
+    """,
+    """ SELECT
+        ZINTERACTIONS.ZSTARTDATE AS "start_date",
+        ZINTERACTIONS.ZENDDATE AS "end_date",
+        ZINTERACTIONS.ZCREATIONDATE AS "interactions_creation_date",
+        ZINTERACTIONS.ZBUNDLEID AS "bundle_id",
+        ZCONTACTS.ZDISPLAYNAME AS "sender_display_name",
+        ZCONTACTS.ZIDENTIFIER AS "sender_identifier",
+        ZCONTACTS.ZPERSONID AS "sender_personid",
+        ZINTERACTIONS.ZDIRECTION AS "direction",
+        ZINTERACTIONS.ZISRESPONSE AS "is_response",
+        ZINTERACTIONS.ZMECHANISM AS "mechanism",
+        ZCONTACTS.ZCREATIONDATE AS "contacts_creation_date",
+        ZCONTACTS.ZFIRSTINCOMINGRECIPIENTDATE AS "first_incoming_recipient_date",
+        ZCONTACTS.ZFIRSTINCOMINGSENDERDATE AS "first_incoming_sender_date",
+        ZCONTACTS.ZFIRSTOUTGOINGRECIPIENTDATE AS "first_outgoing_recipient_date",
+        ZCONTACTS.ZLASTINCOMINGSENDERDATE AS "last_incoming_sender_date",
+        CASE ZLASTINCOMINGRECIPIENTDATE
+            WHEN '0' THEN '0'
+            ELSE ZCONTACTS.ZLASTINCOMINGRECIPIENTDATE
+        END AS "last_incoming_recipient_date",
+        ZCONTACTS.ZLASTOUTGOINGRECIPIENTDATE AS "last_outgoing_recipient_date",
+        ZINTERACTIONS.ZACCOUNT AS "account",
+        ZINTERACTIONS.ZDOMAINIDENTIFIER AS "domain_identifier",
+        ZCONTACTS.ZINCOMINGRECIPIENTCOUNT AS "incoming_recipient_count",
+        ZCONTACTS.ZINCOMINGSENDERCOUNT AS "incoming_sender_count",
+        ZCONTACTS.ZOUTGOINGRECIPIENTCOUNT AS "outgoing_recipient_count",
+        ZINTERACTIONS.ZCONTENTURL AS "interaction_content_url",
+        ZINTERACTIONS.ZLOCATIONUUID AS "location_uuid",
+        ZINTERACTIONS.Z_PK AS "table_id"
+    FROM
+        ZINTERACTIONS
+    LEFT JOIN
+        ZCONTACTS
+        ON ZINTERACTIONS.ZSENDER = ZCONTACTS.Z_PK
+    """
+]


 class InteractionC(IOSExtraction):
@@ -66,8 +260,8 @@ class InteractionC(IOSExtraction):
                 "event": timestamp,
                 "data": f"[{record['bundle_id']}] {record['account']} - "
                         f"from {record['sender_display_name']} ({record['sender_identifier']}) "
-                        f"to {record['recipient_display_name']} ({record['recipient_identifier']}):"
-                        f" {record['content']}"
+                        f"to {record.get('recipient_display_name', '')} ({record.get('recipient_identifier', '')}):"
+                        f" {record.get('content', '')}"
             })
             processed.append(record[timestamp])

@@ -81,108 +275,30 @@ class InteractionC(IOSExtraction):
         conn = sqlite3.connect(self.file_path)
         cur = conn.cursor()

-        # TODO: Support all versions.
-        # Taken from:
-        # https://github.com/mac4n6/APOLLO/blob/master/modules/interaction_contact_interactions.txt
-        cur.execute("""
-            SELECT
-                ZINTERACTIONS.ZSTARTDATE,
-                ZINTERACTIONS.ZENDDATE,
-                ZINTERACTIONS.ZBUNDLEID,
-                ZINTERACTIONS.ZACCOUNT,
-                ZINTERACTIONS.ZTARGETBUNDLEID,
-                CASE ZINTERACTIONS.ZDIRECTION
-                    WHEN '0' THEN 'INCOMING'
-                    WHEN '1' THEN 'OUTGOING'
-                END 'DIRECTION',
-                ZCONTACTS.ZDISPLAYNAME,
-                ZCONTACTS.ZIDENTIFIER,
-                ZCONTACTS.ZPERSONID,
-                RECEIPIENTCONACT.ZDISPLAYNAME,
-                RECEIPIENTCONACT.ZIDENTIFIER,
-                RECEIPIENTCONACT.ZPERSONID,
-                ZINTERACTIONS.ZRECIPIENTCOUNT,
-                ZINTERACTIONS.ZDOMAINIDENTIFIER,
-                ZINTERACTIONS.ZISRESPONSE,
-                ZATTACHMENT.ZCONTENTTEXT,
-                ZATTACHMENT.ZUTI,
-                ZATTACHMENT.ZCONTENTURL,
-                ZATTACHMENT.ZSIZEINBYTES,
-                ZATTACHMENT.ZPHOTOLOCALIDENTIFIER,
-                HEX(ZATTACHMENT.ZIDENTIFIER),
-                ZATTACHMENT.ZCLOUDIDENTIFIER,
-                ZCONTACTS.ZINCOMINGRECIPIENTCOUNT,
-                ZCONTACTS.ZINCOMINGSENDERCOUNT,
-                ZCONTACTS.ZOUTGOINGRECIPIENTCOUNT,
-                ZINTERACTIONS.ZCREATIONDATE,
-                ZCONTACTS.ZCREATIONDATE,
-                ZCONTACTS.ZFIRSTINCOMINGRECIPIENTDATE,
-                ZCONTACTS.ZFIRSTINCOMINGSENDERDATE,
-                ZCONTACTS.ZFIRSTOUTGOINGRECIPIENTDATE,
-                ZCONTACTS.ZLASTINCOMINGSENDERDATE,
-                ZCONTACTS.ZLASTINCOMINGRECIPIENTDATE,
-                ZCONTACTS.ZLASTOUTGOINGRECIPIENTDATE,
-                ZCONTACTS.ZCUSTOMIDENTIFIER,
-                ZINTERACTIONS.ZCONTENTURL,
-                ZINTERACTIONS.ZLOCATIONUUID,
-                ZINTERACTIONS.ZGROUPNAME,
-                ZINTERACTIONS.ZDERIVEDINTENTIDENTIFIER,
-                ZINTERACTIONS.Z_PK
-            FROM ZINTERACTIONS
-            LEFT JOIN ZCONTACTS
-                ON ZINTERACTIONS.ZSENDER = ZCONTACTS.Z_PK
-            LEFT JOIN Z_1INTERACTIONS
-                ON ZINTERACTIONS.Z_PK == Z_1INTERACTIONS.Z_3INTERACTIONS
-            LEFT JOIN ZATTACHMENT
-                ON Z_1INTERACTIONS.Z_1ATTACHMENTS == ZATTACHMENT.Z_PK
-            LEFT JOIN Z_2INTERACTIONRECIPIENT
-                ON ZINTERACTIONS.Z_PK == Z_2INTERACTIONRECIPIENT.Z_3INTERACTIONRECIPIENT
-            LEFT JOIN ZCONTACTS RECEIPIENTCONACT
-                ON Z_2INTERACTIONRECIPIENT.Z_2RECIPIENTS == RECEIPIENTCONACT.Z_PK;
-        """)
-        # names = [description[0] for description in cur.description]
-
-        for row in cur:
-            self.results.append({
-                "start_date": convert_mactime_to_iso(row[0]),
-                "end_date": convert_mactime_to_iso(row[1]),
-                "bundle_id": row[2],
-                "account": row[3],
-                "target_bundle_id": row[4],
-                "direction": row[5],
-                "sender_display_name": row[6],
-                "sender_identifier": row[7],
-                "sender_personid": row[8],
-                "recipient_display_name": row[9],
-                "recipient_identifier": row[10],
-                "recipient_personid": row[11],
-                "recipient_count": row[12],
-                "domain_identifier": row[13],
-                "is_response": row[14],
-                "content": row[15],
-                "uti": row[16],
-                "content_url": row[17],
-                "size": row[18],
-                "photo_local_id": row[19],
-                "attachment_id": row[20],
-                "cloud_id": row[21],
-                "incoming_recipient_count": row[22],
-                "incoming_sender_count": row[23],
-                "outgoing_recipient_count": row[24],
-                "interactions_creation_date": convert_mactime_to_iso(row[25]) if row[25] else None,
-                "contacts_creation_date": convert_mactime_to_iso(row[26]) if row[26] else None,
-                "first_incoming_recipient_date": convert_mactime_to_iso(row[27]) if row[27] else None,
-                "first_incoming_sender_date": convert_mactime_to_iso(row[28]) if row[28] else None,
-                "first_outgoing_recipient_date": convert_mactime_to_iso(row[29]) if row[29] else None,
-                "last_incoming_sender_date": convert_mactime_to_iso(row[30]) if row[30] else None,
-                "last_incoming_recipient_date": convert_mactime_to_iso(row[31]) if row[31] else None,
-                "last_outgoing_recipient_date": convert_mactime_to_iso(row[32]) if row[32] else None,
-                "custom_id": row[33],
-                "location_uuid": row[35],
-                "group_name": row[36],
-                "derivied_intent_id": row[37],
-                "table_id": row[38]
-            })
+        try:
+            cur.execute(QUERIES[0])
+        except sqlite3.OperationalError:
+            try:
+                cur.execute(QUERIES[1])
+            except sqlite3.OperationalError:
+                try:
+                    cur.execute(QUERIES[2])
+                except sqlite3.OperationalError:
+                    cur.execute(QUERIES[3])
+
+        names = [description[0] for description in cur.description]
+        for item in cur:
+            entry = {}
+            for index, value in enumerate(item):
+                if names[index] in self.timestamps:
+                    if value is None or isinstance(value, str):
+                        entry[names[index]] = value
+                    else:
+                        entry[names[index]] = convert_mactime_to_iso(value)
+                else:
+                    entry[names[index]] = value
+
+            self.results.append(entry)

         cur.close()
         conn.close()
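The nested try/except in the InteractionC changes above walks a list of queries from newest to oldest schema, letting `sqlite3.OperationalError` (raised for a missing table or column) select the variant that matches the database at hand. The same idea as a generic helper, sketched here with illustrative table names rather than MVT's actual schemas:

```python
import sqlite3

def execute_first_matching(cur, queries):
    """Try each query in order; return the cursor of the first one whose
    tables and columns exist in this database's schema version."""
    last_error = None
    for query in queries:
        try:
            cur.execute(query)
            return cur
        except sqlite3.OperationalError as exc:
            last_error = exc
    # No candidate matched: re-raise the last schema error.
    raise last_error

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE old_schema (value TEXT)")
conn.execute("INSERT INTO old_schema VALUES ('hit')")

cur = execute_first_matching(conn.cursor(), [
    "SELECT value FROM new_schema",   # fails: table does not exist here
    "SELECT value FROM old_schema",   # matches this database
])
row = cur.fetchone()[0]
print(row)
```

A flat loop like this avoids the growing indentation of nested try/except blocks as more schema versions accumulate.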
@@ -51,7 +51,10 @@ class SMS(IOSExtraction):
             return

         for result in self.results:
-            message_links = check_for_links(result.get("text", ""))
+            message_links = result.get("links", [])
+            # Making sure no link was ignored.
+            if message_links == []:
+                message_links = check_for_links(result.get("text", ""))
             ioc = self.indicators.check_domains(message_links)
             if ioc:
                 result["matched_indicator"] = ioc
@@ -118,18 +121,15 @@ class SMS(IOSExtraction):
             if message.get("text", "").startswith(alert):
                 self.log.warning("Apple warning about state-sponsored attack received on the %s",
                                  message["isodate"])
-                self.results.append(message)
             else:
                 # Extract links from the SMS message.
                 message_links = check_for_links(message.get("text", ""))
+                message["links"] = message_links

-                # If we find links in the messages or if they are empty we add
-                # them to the list.
-                if message_links or message.get("text", "").strip() == "":
-                    self.results.append(message)
+            self.results.append(message)

         cur.close()
         conn.close()

-        self.log.info("Extracted a total of %d SMS messages containing links",
+        self.log.info("Extracted a total of %d SMS messages",
                       len(self.results))
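Taken together, the SMS hunks store extracted links on each record at parse time and fall back to re-scanning the message text during indicator checks, so result files produced before this change are still checked. A rough standalone sketch of that check-time fallback (the regex and record layout are assumptions, not MVT's `check_for_links()` implementation):

```python
import re

URL_RE = re.compile(r"https?://\S+")

def check_for_links(text):
    # Simplified stand-in for mvt's link extractor.
    return URL_RE.findall(text)

def links_for_record(record):
    # Prefer links captured at extraction time; fall back to re-scanning
    # the message text so no link is ignored for older result files.
    links = record.get("links", [])
    if not links:
        links = check_for_links(record.get("text", ""))
    return links

old_record = {"text": "visit https://example.org/x now"}           # no "links" key
new_record = {"text": "same", "links": ["https://example.org/y"]}  # links stored

print(links_for_record(old_record), links_for_record(new_record))
```

Storing the links on the record makes the IOC comparison independent of the extraction regex used when the backup was first parsed.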
@@ -70,7 +70,15 @@ class WebkitResourceLoadStatistics(IOSExtraction):
         cur = conn.cursor()

         try:
-            cur.execute("SELECT * from ObservedDomains;")
+            # FIXME: table contains extra fields with timestamp here
+            cur.execute("""
+                SELECT
+                    domainID,
+                    registrableDomain,
+                    lastSeen,
+                    hadUserInteraction
+                from ObservedDomains;
+            """)
         except sqlite3.OperationalError:
             return

@@ -38,7 +38,7 @@ class Whatsapp(IOSExtraction):
     def serialize(self, record: dict) -> Union[dict, list]:
         text = record.get("ZTEXT", "").replace("\n", "\\n")
         links_text = ""
-        if record["links"]:
+        if record.get("links"):
             links_text = " - Embedded links: " + ", ".join(record["links"])

         return {
@@ -112,14 +112,13 @@ class Whatsapp(IOSExtraction):
                     or link.startswith("https://mmg.whatsapp.net/")):
                 filtered_links.append(link)

-            # If we find messages with links, or if there's an empty message
-            # we add it to the results list.
+            # Add all the links found to the record.
             if filtered_links or (message.get("ZTEXT") or "").strip() == "":
                 message["links"] = list(set(filtered_links))
                 self.results.append(message)

         cur.close()
         conn.close()

-        self.log.info("Extracted a total of %d WhatsApp messages containing links",
+        self.log.info("Extracted a total of %d WhatsApp messages",
                       len(self.results))
@@ -272,7 +272,8 @@ IPHONE_IOS_VERSIONS = [
     {"build": "20C65", "version": "16.2"},
     {"build": "20D47", "version": "16.3"},
     {"build": "20D67", "version": "16.3.1"},
-    {"build": "20E247", "version": "16.4"}
+    {"build": "20E247", "version": "16.4"},
+    {"build": "20E252", "version": "16.4.1"}
 ]
tests/android_bugreport/__init__.py (new file, 0 lines)
@@ -8,6 +8,7 @@ from pathlib import Path

 from mvt.android.modules.bugreport.appops import Appops
 from mvt.android.modules.bugreport.packages import Packages
+from mvt.android.modules.bugreport.getprop import Getprop
 from mvt.common.module import run_module

 from ..utils import get_artifact_folder
@@ -40,3 +41,7 @@ class TestBugreportAnalysis:
         assert m.results[1]["package_name"] == "com.instagram.android"
         assert len(m.results[0]["permissions"]) == 4
         assert len(m.results[1]["permissions"]) == 32
+
+    def test_getprop_module(self):
+        m = self.launch_bug_report_module(Getprop)
+        assert len(m.results) == 0
@@ -24,8 +24,10 @@ class TestIndicators:
     def test_check_domain(self, indicator_file):
         ind = Indicators(log=logging)
         ind.load_indicators_files([indicator_file], load_default=False)
+        assert ind.check_domain(42) is None
         assert ind.check_domain("https://www.example.org/foobar")
         assert ind.check_domain("http://example.org:8080/toto")
+        assert ind.check_domain("https://github.com") is None

     def test_check_android_property(self, indicator_file):
         ind = Indicators(log=logging)
tests/ios_backup/test_calendar.py (new file, 34 lines)
@@ -0,0 +1,34 @@
+# Mobile Verification Toolkit (MVT)
+# Copyright (c) 2021-2023 Claudio Guarnieri.
+# Use of this software is governed by the MVT License 1.1 that can be found at
+# https://license.mvt.re/1.1/
+
+import logging
+
+from mvt.common.indicators import Indicators
+from mvt.common.module import run_module
+from mvt.ios.modules.mixed.calendar import Calendar
+
+from ..utils import get_ios_backup_folder
+
+
+class TestCalendarModule:
+
+    def test_calendar(self):
+        m = Calendar(target_path=get_ios_backup_folder())
+        run_module(m)
+        assert len(m.results) == 1
+        assert len(m.timeline) == 4
+        assert len(m.detected) == 0
+        assert m.results[0]["summary"] == "Super interesting meeting"
+
+    def test_calendar_detection(self, indicator_file):
+        m = Calendar(target_path=get_ios_backup_folder())
+        ind = Indicators(log=logging.getLogger())
+        ind.parse_stix2(indicator_file)
+        ind.ioc_collections[0]["emails"].append("user@example.org")
+        m.indicators = ind
+        run_module(m)
+        assert len(m.results) == 1
+        assert len(m.timeline) == 4
+        assert len(m.detected) == 1
@@ -17,8 +17,8 @@ class TestFilesystem:
     def test_filesystem(self):
        m = Filesystem(target_path=get_ios_backup_folder())
        run_module(m)
-        assert len(m.results) == 12
-        assert len(m.timeline) == 12
+        assert len(m.results) == 14
+        assert len(m.timeline) == 14
         assert len(m.detected) == 0

     def test_detection(self, indicator_file):
@@ -29,6 +29,6 @@ class TestFilesystem:
         ind.ioc_collections[0]["processes"].append("64d0019cb3d46bfc8cce545a8ba54b93e7ea9347")
         m.indicators = ind
         run_module(m)
-        assert len(m.results) == 12
-        assert len(m.timeline) == 12
+        assert len(m.results) == 14
+        assert len(m.timeline) == 14
         assert len(m.detected) == 1