Mirror of https://github.com/mvt-project/mvt.git, synced 2026-02-14 17:42:46 +00:00

Compare commits (77 commits)
| SHA1 |
|---|
| 02bf903411 |
| 7019375767 |
| 34dd27c5d2 |
| a4d6a08a8b |
| 635d3a392d |
| 2d78bddbba |
| c1938d2ead |
| 104b01e5cd |
| 7087e8adb2 |
| 67608ac02b |
| 6d8de5b461 |
| b0177d6104 |
| e0c9a44b10 |
| ef8c1ae895 |
| 3165801e2b |
| 1aa371a398 |
| f8e380baa1 |
| 35559b09a8 |
| daf5c1f3de |
| f601db2174 |
| 3ce9641c23 |
| 9be393e3f6 |
| 5f125974b8 |
| aa0f152ba1 |
| 169f5fbc26 |
| 5ea3460c09 |
| c38df37967 |
| 7f29b522fa |
| 40b0da9885 |
| 94a8d9dd91 |
| 963d3db51a |
| 660e208473 |
| 01e68ccc6a |
| fba0fa1f2c |
| 1cbf55e50e |
| 8fcc79ebfa |
| 423462395a |
| 1f08572a6a |
| 94e3c0ce7b |
| 904daad935 |
| eb2a8b8b41 |
| 60a17381a2 |
| ef2bb93dc4 |
| f68b7e7089 |
| a22241ec32 |
| 8ad1bc7a2b |
| c6b3509ed4 |
| 75b5b296a5 |
| 2d62e31eaa |
| 1bfc683e4b |
| 7ab09669b5 |
| 757bd8618e |
| f1d039346d |
| ccdfd92d4a |
| 032b229eb8 |
| 93936976c7 |
| f3a4e9d108 |
| 93a9735b5e |
| 7b0e2d4564 |
| 725a99bcd5 |
| 35a6f6ec9a |
| f4ba29f1ef |
| 3f9809f36c |
| 6da6595108 |
| 35dfeaccee |
| e5f2aa3c3d |
| 3236c1b390 |
| 80a670273d |
| 969b5cc506 |
| ef8622d4c3 |
| e39e9e6f92 |
| 7b32ed3179 |
| 315317863e |
| 08d35b056a |
| 3e679312d1 |
| be4f1afed6 |
| 0dea25d86e |
@@ -38,12 +38,15 @@ RUN apt update \
# Build libimobiledevice
# ----------------------
RUN git clone https://github.com/libimobiledevice/libplist \
    && git clone https://github.com/libimobiledevice/libimobiledevice-glue \
    && git clone https://github.com/libimobiledevice/libusbmuxd \
    && git clone https://github.com/libimobiledevice/libimobiledevice \
    && git clone https://github.com/libimobiledevice/usbmuxd \
    && cd libplist && ./autogen.sh && make && make install && ldconfig \
    && cd ../libimobiledevice-glue && PKG_CONFIG_PATH=/usr/local/lib/pkgconfig ./autogen.sh --prefix=/usr && make && make install && ldconfig \
    && cd ../libusbmuxd && PKG_CONFIG_PATH=/usr/local/lib/pkgconfig ./autogen.sh && make && make install && ldconfig \
    && cd ../libimobiledevice && PKG_CONFIG_PATH=/usr/local/lib/pkgconfig ./autogen.sh --enable-debug && make && make install && ldconfig \
@@ -51,7 +54,7 @@ RUN git clone https://github.com/libimobiledevice/libplist \
    && cd ../usbmuxd && PKG_CONFIG_PATH=/usr/local/lib/pkgconfig ./autogen.sh --prefix=/usr --sysconfdir=/etc --localstatedir=/var --runstatedir=/run && make && make install \
# Clean up.
    && cd .. && rm -rf libplist libusbmuxd libimobiledevice usbmuxd
    && cd .. && rm -rf libplist libimobiledevice-glue libusbmuxd libimobiledevice usbmuxd

# Installing MVT
# --------------
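For context, a common way to build and run the resulting image is sketched below; the image tag and the privileged USB passthrough flags are indicative rather than prescriptive (see the Docker documentation page referenced in the README diff that follows):

```bash
# Build the image from the repository root and run it with USB access,
# so usbmuxd inside the container can talk to a connected iOS device.
docker build -t mvt .
docker run -it --rm --privileged -v /dev/bus/usb:/dev/bus/usb mvt
```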
@@ -15,15 +15,15 @@ It has been developed and released by the [Amnesty International Security Lab](h

## Installation

MVT can be installed from sources or from [PyPi](https://pypi.org/project/mvt/) (you will need some dependencies, check the [documentation](https://docs.mvt.re/en/latest/install.html)):
MVT can be installed from sources or from [PyPi](https://pypi.org/project/mvt/) (you will need some dependencies, check the [documentation](https://docs.mvt.re/en/latest/install/)):

```
pip3 install mvt
```

Alternatively, you can decide to run MVT and all relevant tools through a [Docker container](https://docs.mvt.re/en/latest/docker.html).
Alternatively, you can decide to run MVT and all relevant tools through a [Docker container](https://docs.mvt.re/en/latest/docker/).

**Please note:** MVT is best run on Linux or Mac systems. [It does not currently support running natively on Windows.](https://docs.mvt.re/en/latest/install.html#mvt-on-windows)
**Please note:** MVT is best run on Linux or Mac systems. [It does not currently support running natively on Windows.](https://docs.mvt.re/en/latest/install/#mvt-on-windows)

## Usage

@@ -31,4 +31,4 @@ MVT provides two commands `mvt-ios` and `mvt-android`. [Check out the documentat

## License

The purpose of MVT is to facilitate the ***consensual forensic analysis*** of devices of those who might be targets of sophisticated mobile spyware attacks, especially members of civil society and marginalized communities. We do not want MVT to enable privacy violations of non-consenting individuals. In order to achieve this, MVT is released under its own license. [Read more here.](https://docs.mvt.re/en/latest/license.html)
The purpose of MVT is to facilitate the ***consensual forensic analysis*** of devices of those who might be targets of sophisticated mobile spyware attacks, especially members of civil society and marginalized communities. We do not want MVT to enable privacy violations of non-consenting individuals. In order to achieve this, MVT is released under its own license. [Read more here.](https://docs.mvt.re/en/latest/license/)
@@ -22,7 +22,7 @@ adb backup -all

## Unpack the backup

In order to reliable unpack th [Android Backup Extractor (ABE)](https://github.com/nelenkov/android-backup-extractor) to convert it to a readable file format. Make sure that java is installed on your system and use the following command:
In order to unpack the backup, use [Android Backup Extractor (ABE)](https://github.com/nelenkov/android-backup-extractor) to convert it to a readable file format. Make sure that java is installed on your system and use the following command:

```bash
java -jar ~/path/to/abe.jar unpack backup.ab backup.tar
@@ -31,6 +31,8 @@ tar xvf backup.tar

If the backup is encrypted, ABE will prompt you to enter the password.

Alternatively, [ab-decrypt](https://github.com/joernheissler/ab-decrypt) can be used for that purpose.

## Check the backup

You can then extract SMSs containing links with MVT:
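The command following the colon above is cut off by the diff context. As a hedged illustration (the paths are placeholders and the exact flags may differ between MVT versions), checking the unpacked backup for SMS messages containing links typically looks like:

```bash
# Hypothetical invocation: scan the unpacked backup and write JSON results.
mvt-android check-backup --output /path/to/results/ /path/to/backup/
```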
@@ -8,7 +8,7 @@ However, not all is lost.

Because malware attacks over Android typically take the form of malicious or backdoored apps, the very first thing you might want to do is to extract and verify all installed Android packages and triage quickly if there are any which stand out as malicious or which might be atypical.

While it is out of the scope of this documentation to dwell into details on how to analyze Android apps, MVT does allow to easily and automatically extract information about installed apps, download copies of them, and quickly lookup services such as [VirusTotal](https://www.virustotal.com) or [Koodous](https://www.koodous.com) which might quickly indicate known bad apps.
While it is out of the scope of this documentation to dwell into details on how to analyze Android apps, MVT does allow to easily and automatically extract information about installed apps, download copies of them, and quickly lookup services such as [VirusTotal](https://www.virustotal.com) or [Koodous](https://koodous.com) which might quickly indicate known bad apps.
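A rough sketch of that workflow follows; the option names are assumptions based on this changeset's `download_apks.py` and lookup modules, so verify them with `mvt-android download-apks --help`:

```bash
# Pull copies of installed APKs from the connected device over adb.
mvt-android download-apks --output /path/to/apks/ --all-apks

# Optionally look the collected packages up on VirusTotal and Koodous.
mvt-android download-apks --output /path/to/apks/ --all-checks
```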
## Check the device over Android Debug Bridge

@@ -32,5 +32,6 @@ mvt-ios check-backup --iocs ~/iocs/malware1.stix --iocs ~/iocs/malware2.stix2 /p

- The [Amnesty International investigations repository](https://github.com/AmnestyTech/investigations) contains STIX-formatted IOCs for:
    - [Pegasus](https://en.wikipedia.org/wiki/Pegasus_(spyware)) ([STIX2](https://raw.githubusercontent.com/AmnestyTech/investigations/master/2021-07-18_nso/pegasus.stix2))
- [This repository](https://github.com/Te-k/stalkerware-indicators) contains IOCs for Android stalkerware including [a STIX MVT-compatible file](https://github.com/Te-k/stalkerware-indicators/blob/master/stalkerware.stix2).

Please [open an issue](https://github.com/mvt-project/mvt/issues/) to suggest new sources of STIX-formatted IOCs.
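Putting this together, a typical run against an iOS backup with the Pegasus indicators listed above might look like the following; the `--output` folder and backup path are placeholders:

```bash
# Fetch the STIX2 indicators and pass them to the backup check.
wget https://raw.githubusercontent.com/AmnestyTech/investigations/master/2021-07-18_nso/pegasus.stix2
mvt-ios check-backup --iocs pegasus.stix2 --output /path/to/results/ /path/to/backup/
```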
@@ -1,6 +1,6 @@
# Install libimobiledevice

Before proceeding with doing any acquisition of iOS devices we recommend installing [libimobiledevice](https://www.libimobiledevice.org/) utilities. These utilities will become useful when extracting crash logs and generating iTunes backups. Because the utilities and its libraries are subject to frequent changes in response to new versions of iOS, you might want to consider compiling libimobiledevice utilities from sources. Otherwise, if available, you can try installing packages available in your distribution:
Before proceeding with doing any acquisition of iOS devices we recommend installing [libimobiledevice](https://libimobiledevice.org/) utilities. These utilities will become useful when extracting crash logs and generating iTunes backups. Because the utilities and its libraries are subject to frequent changes in response to new versions of iOS, you might want to consider compiling libimobiledevice utilities from sources. Otherwise, if available, you can try installing packages available in your distribution:

```bash
sudo apt install libimobiledevice-utils
```
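Once installed, the utilities mentioned above are typically used along these lines; the exact flags are indicative and worth checking against each tool's `--help`:

```bash
# Pair the computer with the iOS device, pull crash logs, and create a backup.
idevicepair pair
idevicecrashreport -e /path/to/crashlogs/
idevicebackup2 backup --full /path/to/backup/
```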
@@ -148,6 +148,18 @@ If indicators are provided through the command-line, they are checked against th

---

### `os_analytics_ad_daily.json`

!!! info "Availability"
    Backup: :material-check:
    Full filesystem dump: :material-check:

This JSON file is created by mvt-ios' `OSAnalyticsADDaily` module. The module extracts records from a plist located at *private/var/mobile/Library/Preferences/com.apple.osanalytics.addaily.plist*, which contains a history of data usage by processes running on the system. Besides the network statistics, these records are particularly important because they might show traces of malicious process executions and the relevant timeframe.

If indicators are provided through the command-line, they are checked against the process names. Any matches are stored in *os_analytics_ad_daily_detected.json*.
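As a rough illustration of what this record contains, the plist can be inspected directly with Python's `plistlib`. This is a sketch, not MVT's implementation; the `netUsageBaseline` key (a dictionary keyed by process name) is an assumption and may differ across iOS versions:

```python
import plistlib

# Hypothetical sketch: list per-process entries from the addaily plist.
# "netUsageBaseline" (keyed by process name) is an assumed key name.
with open("com.apple.osanalytics.addaily.plist", "rb") as handle:
    data = plistlib.load(handle)

for process_name, entry in data.get("netUsageBaseline", {}).items():
    # Each entry is expected to hold a last-seen timestamp and traffic counters.
    print(process_name, entry)
```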
---

### `datausage.json`

!!! info "Availability"
@@ -218,6 +230,18 @@ If indicators are provided through the command-line, they are checked against th

---

### `shutdown_log.json`

!!! info "Availability"
    Backup (if encrypted): :material-close:
    Full filesystem dump: :material-check:

This JSON file is created by mvt-ios' `ShutdownLog` module. The module extracts records from the shutdown log located at *private/var/db/diagnostics/shutdown.log*. When shutting down an iPhone, a SIGTERM will be sent to all running processes. The `shutdown.log` file will log any process (with its pid and path) that did not shut down after the SIGTERM was sent.

If indicators are provided through the command-line, they are checked against the paths. Any matches are stored in *shutdown_log_detected.json*.
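A minimal sketch of pulling those records out of a copy of the log follows; the `remaining client pid:` line format is an assumption about how `shutdown.log` entries look and may vary between iOS versions:

```python
import re

# Hypothetical sketch: extract (pid, path) pairs from a shutdown.log copy.
# Assumes entries of the form "remaining client pid: <pid> (<path>)".
pattern = re.compile(r"remaining client pid:\s*(\d+)\s*\((.+?)\)")

with open("shutdown.log", "r", errors="replace") as handle:
    for line in handle:
        match = pattern.search(line)
        if match:
            pid, path = match.groups()
            print(pid, path)
```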
---

### `sms.json`

!!! info "Availability"
@@ -240,6 +264,16 @@ This JSON file is created by mvt-ios' `SMSAttachments` module. The module extrac

---

### `tcc.json`

!!! info "Availability"
    Backup: :material-check:
    Full filesystem dump: :material-check:

This JSON file is created by mvt-ios' `TCC` module. The module extracts records from a SQLite database located at */private/var/mobile/Library/TCC/TCC.db*, which contains a list of which services (such as microphone, camera, or location) apps have been granted or denied access to.
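For reference, such a database can be queried directly with Python's `sqlite3`; this is only a sketch, and the `access` table and its column names (`service`, `client`, `auth_value`) are assumptions that change across iOS versions:

```python
import sqlite3

# Hypothetical sketch: dump service grants from a local copy of TCC.db.
# Table/column names are assumptions; the meaning of auth_value varies by iOS version.
conn = sqlite3.connect("TCC.db")
for service, client, auth_value in conn.execute("SELECT service, client, auth_value FROM access;"):
    print(service, client, auth_value)
conn.close()
```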
---

### `version_history.json`

!!! info "Availability"
@@ -1,4 +1,4 @@
mkdocs==1.2.1
mkdocs==1.2.3
mkdocs-autorefs
mkdocs-material
mkdocs-material-extensions
@@ -34,6 +34,14 @@ def cli():
    logo()


#==============================================================================
# Command: version
#==============================================================================
@cli.command("version", help="Show the currently installed version of MVT")
def version():
    return


#==============================================================================
# Download APKs
#==============================================================================
@@ -11,7 +11,6 @@ import pkg_resources
|
||||
from tqdm import tqdm
|
||||
|
||||
from mvt.common.module import InsufficientPrivileges
|
||||
from mvt.common.utils import get_sha256_from_file_path
|
||||
|
||||
from .modules.adb.base import AndroidExtraction
|
||||
from .modules.adb.packages import Packages
|
||||
@@ -32,7 +31,10 @@ class PullProgress(tqdm):
|
||||
|
||||
class DownloadAPKs(AndroidExtraction):
|
||||
"""DownloadAPKs is the main class operating the download of APKs
|
||||
from the device."""
|
||||
from the device.
|
||||
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, output_folder=None, all_apks=False, log=None,
|
||||
packages=None):
|
||||
@@ -51,7 +53,9 @@ class DownloadAPKs(AndroidExtraction):
|
||||
@classmethod
|
||||
def from_json(cls, json_path):
|
||||
"""Initialize this class from an existing apks.json file.
|
||||
|
||||
:param json_path: Path to the apks.json file to parse.
|
||||
|
||||
"""
|
||||
with open(json_path, "r") as handle:
|
||||
packages = json.load(handle)
|
||||
@@ -59,9 +63,11 @@ class DownloadAPKs(AndroidExtraction):
|
||||
|
||||
def pull_package_file(self, package_name, remote_path):
|
||||
"""Pull files related to specific package from the device.
|
||||
|
||||
:param package_name: Name of the package to download
|
||||
:param remote_path: Path to the file to download
|
||||
:returns: Path to the local copy
|
||||
|
||||
"""
|
||||
log.info("Downloading %s ...", remote_path)
|
||||
|
||||
@@ -101,6 +107,8 @@ class DownloadAPKs(AndroidExtraction):
|
||||
def get_packages(self):
|
||||
"""Use the Packages adb module to retrieve the list of packages.
|
||||
We reuse the same extraction logic to then download the APKs.
|
||||
|
||||
|
||||
"""
|
||||
self.log.info("Retrieving list of installed packages...")
|
||||
|
||||
@@ -111,8 +119,7 @@ class DownloadAPKs(AndroidExtraction):
|
||||
self.packages = m.results
|
||||
|
||||
def pull_packages(self):
|
||||
"""Download all files of all selected packages from the device.
|
||||
"""
|
||||
"""Download all files of all selected packages from the device."""
|
||||
log.info("Starting extraction of installed APKs at folder %s", self.output_folder)
|
||||
|
||||
if not os.path.exists(self.output_folder):
|
||||
@@ -150,50 +157,27 @@ class DownloadAPKs(AndroidExtraction):
|
||||
log.info("[%d/%d] Package: %s", counter, len(packages_selection),
|
||||
package["package_name"])
|
||||
|
||||
# Get the file path for the specific package.
|
||||
try:
|
||||
output = self._adb_command(f"pm path {package['package_name']}")
|
||||
output = output.strip().replace("package:", "")
|
||||
if not output:
|
||||
continue
|
||||
except Exception as e:
|
||||
log.exception("Failed to get path of package %s: %s",
|
||||
package["package_name"], e)
|
||||
self._adb_reconnect()
|
||||
continue
|
||||
|
||||
# Sometimes the package path contains multiple lines for multiple apks.
|
||||
# We loop through each line and download each file.
|
||||
for path in output.split("\n"):
|
||||
device_path = path.strip()
|
||||
file_path = self.pull_package_file(package["package_name"],
|
||||
device_path)
|
||||
if not file_path:
|
||||
for package_file in package["files"]:
|
||||
device_path = package_file["path"]
|
||||
local_path = self.pull_package_file(package["package_name"],
|
||||
device_path)
|
||||
if not local_path:
|
||||
continue
|
||||
|
||||
file_info = {
|
||||
"path": device_path,
|
||||
"local_name": file_path,
|
||||
"sha256": get_sha256_from_file_path(file_path),
|
||||
}
|
||||
|
||||
if "files" not in package:
|
||||
package["files"] = [file_info,]
|
||||
else:
|
||||
package["files"].append(file_info)
|
||||
package_file["local_path"] = local_path
|
||||
|
||||
log.info("Download of selected packages completed")
|
||||
|
||||
def save_json(self):
|
||||
"""Save the results to the package.json file.
|
||||
"""
|
||||
"""Save the results to the package.json file."""
|
||||
json_path = os.path.join(self.output_folder, "apks.json")
|
||||
with open(json_path, "w") as handle:
|
||||
json.dump(self.packages, handle, indent=4)
|
||||
|
||||
def run(self):
|
||||
"""Run all steps of fetch-apk.
|
||||
"""
|
||||
"""Run all steps of fetch-apk."""
|
||||
self.get_packages()
|
||||
self._adb_connect()
|
||||
self.pull_packages()
|
||||
|
||||
@@ -32,7 +32,7 @@ def koodous_lookup(packages):
|
||||
res = requests.get(url)
|
||||
report = res.json()
|
||||
|
||||
row = [package["package_name"], file["local_name"]]
|
||||
row = [package["package_name"], file["path"]]
|
||||
|
||||
if "package_name" in report:
|
||||
trusted = "no"
|
||||
|
||||
@@ -75,7 +75,7 @@ def virustotal_lookup(packages):
|
||||
|
||||
for package in packages:
|
||||
for file in package.get("files", []):
|
||||
row = [package["package_name"], file["local_name"]]
|
||||
row = [package["package_name"], file["path"]]
|
||||
|
||||
if file["sha256"] in detections:
|
||||
detection = detections[file["sha256"]]
|
||||
|
||||
@@ -39,10 +39,9 @@ class AndroidExtraction(MVTModule):
|
||||
|
||||
@staticmethod
|
||||
def _adb_check_keys():
|
||||
"""Make sure Android adb keys exist.
|
||||
"""
|
||||
"""Make sure Android adb keys exist."""
|
||||
if not os.path.isdir(os.path.dirname(ADB_KEY_PATH)):
|
||||
os.path.makedirs(os.path.dirname(ADB_KEY_PATH))
|
||||
os.makedirs(os.path.dirname(ADB_KEY_PATH))
|
||||
|
||||
if not os.path.exists(ADB_KEY_PATH):
|
||||
keygen(ADB_KEY_PATH)
|
||||
@@ -51,14 +50,16 @@ class AndroidExtraction(MVTModule):
|
||||
write_public_keyfile(ADB_KEY_PATH, ADB_PUB_KEY_PATH)
|
||||
|
||||
def _adb_connect(self):
|
||||
"""Connect to the device over adb.
|
||||
"""
|
||||
"""Connect to the device over adb."""
|
||||
self._adb_check_keys()
|
||||
|
||||
with open(ADB_KEY_PATH, "rb") as handle:
|
||||
priv_key = handle.read()
|
||||
|
||||
signer = PythonRSASigner("", priv_key)
|
||||
with open(ADB_PUB_KEY_PATH, "rb") as handle:
|
||||
pub_key = handle.read()
|
||||
|
||||
signer = PythonRSASigner(pub_key, priv_key)
|
||||
|
||||
# If no serial was specified or if the serial does not seem to be
|
||||
# a HOST:PORT definition, we use the USB transport.
|
||||
@@ -94,47 +95,53 @@ class AndroidExtraction(MVTModule):
|
||||
break
|
||||
|
||||
def _adb_disconnect(self):
|
||||
"""Close adb connection to the device.
|
||||
"""
|
||||
"""Close adb connection to the device."""
|
||||
self.device.close()
|
||||
|
||||
def _adb_reconnect(self):
|
||||
"""Reconnect to device using adb.
|
||||
"""
|
||||
"""Reconnect to device using adb."""
|
||||
log.info("Reconnecting ...")
|
||||
self._adb_disconnect()
|
||||
self._adb_connect()
|
||||
|
||||
def _adb_command(self, command):
|
||||
"""Execute an adb shell command.
|
||||
|
||||
:param command: Shell command to execute
|
||||
:returns: Output of command
|
||||
|
||||
"""
|
||||
return self.device.shell(command)
|
||||
|
||||
def _adb_check_if_root(self):
|
||||
"""Check if we have a `su` binary on the Android device.
|
||||
|
||||
|
||||
:returns: Boolean indicating whether a `su` binary is present or not
|
||||
|
||||
"""
|
||||
return bool(self._adb_command("command -v su"))
|
||||
|
||||
def _adb_root_or_die(self):
|
||||
"""Check if we have a `su` binary, otherwise raise an Exception.
|
||||
"""
|
||||
"""Check if we have a `su` binary, otherwise raise an Exception."""
|
||||
if not self._adb_check_if_root():
|
||||
raise InsufficientPrivileges("This module is optionally available in case the device is already rooted. Do NOT root your own device!")
|
||||
|
||||
def _adb_command_as_root(self, command):
|
||||
"""Execute an adb shell command.
|
||||
|
||||
:param command: Shell command to execute as root
|
||||
:returns: Output of command
|
||||
|
||||
"""
|
||||
return self._adb_command(f"su -c {command}")
|
||||
|
||||
|
||||
def _adb_check_file_exists(self, file):
|
||||
"""Verify that a file exists.
|
||||
|
||||
:param file: Path of the file
|
||||
:returns: Boolean indicating whether the file exists or not
|
||||
|
||||
"""
|
||||
|
||||
# TODO: Need to support checking files without root privileges as well.
|
||||
@@ -148,9 +155,12 @@ class AndroidExtraction(MVTModule):
|
||||
|
||||
def _adb_download(self, remote_path, local_path, progress_callback=None, retry_root=True):
|
||||
"""Download a file form the device.
|
||||
|
||||
:param remote_path: Path to download from the device
|
||||
:param local_path: Path to where to locally store the copy of the file
|
||||
:param progress_callback: Callback for download progress bar
|
||||
:param progress_callback: Callback for download progress bar (Default value = None)
|
||||
:param retry_root: Default value = True)
|
||||
|
||||
"""
|
||||
try:
|
||||
self.device.pull(remote_path, local_path, progress_callback)
|
||||
@@ -159,7 +169,7 @@ class AndroidExtraction(MVTModule):
|
||||
self._adb_download_root(remote_path, local_path, progress_callback)
|
||||
else:
|
||||
raise Exception(f"Unable to download file {remote_path}: {e}")
|
||||
|
||||
|
||||
def _adb_download_root(self, remote_path, local_path, progress_callback=None):
|
||||
try:
|
||||
# Check if we have root, if not raise an Exception.
|
||||
@@ -184,16 +194,18 @@ class AndroidExtraction(MVTModule):
|
||||
|
||||
# Delete the copy on /sdcard/.
|
||||
self._adb_command(f"rm -rf {new_remote_path}")
|
||||
|
||||
|
||||
except AdbCommandFailureException as e:
|
||||
raise Exception(f"Unable to download file {remote_path}: {e}")
|
||||
|
||||
def _adb_process_file(self, remote_path, process_routine):
|
||||
"""Download a local copy of a file which is only accessible as root.
|
||||
This is a wrapper around process_routine.
|
||||
|
||||
:param remote_path: Path of the file on the device to process
|
||||
:param process_routine: Function to be called on the local copy of the
|
||||
downloaded file
|
||||
|
||||
"""
|
||||
# Connect to the device over adb.
|
||||
self._adb_connect()
|
||||
@@ -227,6 +239,5 @@ class AndroidExtraction(MVTModule):
|
||||
self._adb_disconnect()
|
||||
|
||||
def run(self):
|
||||
"""Run the main procedure.
|
||||
"""
|
||||
"""Run the main procedure."""
|
||||
raise NotImplementedError
|
||||
|
||||
@@ -33,9 +33,19 @@ class ChromeHistory(AndroidExtraction):
|
||||
"data": f"{record['id']} - {record['url']} (visit ID: {record['visit_id']}, redirect source: {record['redirect_source']})"
|
||||
}
|
||||
|
||||
def check_indicators(self):
|
||||
if not self.indicators:
|
||||
return
|
||||
|
||||
for result in self.results:
|
||||
if self.indicators.check_domain(result["url"]):
|
||||
self.detected.append(result)
|
||||
|
||||
def _parse_db(self, db_path):
|
||||
"""Parse a Chrome History database file.
|
||||
|
||||
:param db_path: Path to the History database to process.
|
||||
|
||||
"""
|
||||
conn = sqlite3.connect(db_path)
|
||||
cur = conn.cursor()
|
||||
|
||||
@@ -41,19 +41,55 @@ class Packages(AndroidExtraction):
|
||||
return records
|
||||
|
||||
def check_indicators(self):
|
||||
if not self.indicators:
|
||||
return
|
||||
|
||||
root_packages_path = os.path.join("..", "..", "data", "root_packages.txt")
|
||||
root_packages_string = pkg_resources.resource_string(__name__, root_packages_path)
|
||||
root_packages = root_packages_string.decode("utf-8").split("\n")
|
||||
root_packages = [rp.strip() for rp in root_packages]
|
||||
|
||||
for root_package in root_packages:
|
||||
root_package = root_package.strip()
|
||||
if not root_package:
|
||||
continue
|
||||
|
||||
if root_package in self.results:
|
||||
for result in self.results:
|
||||
if result["package_name"] in root_packages:
|
||||
self.log.warning("Found an installed package related to rooting/jailbreaking: \"%s\"",
|
||||
root_package)
|
||||
self.detected.append(root_package)
|
||||
result["package_name"])
|
||||
self.detected.append(result)
|
||||
if result["package_name"] in self.indicators.ioc_app_ids:
|
||||
self.log.warning("Found a malicious package name: \"%s\"",
|
||||
result["package_name"])
|
||||
self.detected.append(result)
|
||||
for file in result["files"]:
|
||||
if file["sha256"] in self.indicators.ioc_files_sha256:
|
||||
self.log.warning("Found a malicious APK: \"%s\" %s",
|
||||
result["package_name"],
|
||||
file["sha256"])
|
||||
self.detected.append(result)
|
||||
|
||||
def _get_files_for_package(self, package_name):
|
||||
output = self._adb_command(f"pm path {package_name}")
|
||||
output = output.strip().replace("package:", "")
|
||||
if not output:
|
||||
return []
|
||||
|
||||
package_files = []
|
||||
for file_path in output.split("\n"):
|
||||
file_path = file_path.strip()
|
||||
|
||||
md5 = self._adb_command(f"md5sum {file_path}").split(" ")[0]
|
||||
sha1 = self._adb_command(f"sha1sum {file_path}").split(" ")[0]
|
||||
sha256 = self._adb_command(f"sha256sum {file_path}").split(" ")[0]
|
||||
sha512 = self._adb_command(f"sha512sum {file_path}").split(" ")[0]
|
||||
|
||||
package_files.append({
|
||||
"path": file_path,
|
||||
"md5": md5,
|
||||
"sha1": sha1,
|
||||
"sha256": sha256,
|
||||
"sha512": sha512,
|
||||
})
|
||||
|
||||
return package_files
|
||||
|
||||
def run(self):
|
||||
self._adb_connect()
|
||||
@@ -85,6 +121,8 @@ class Packages(AndroidExtraction):
|
||||
first_install = dumpsys[1].split("=")[1].strip()
|
||||
last_update = dumpsys[2].split("=")[1].strip()
|
||||
|
||||
package_files = self._get_files_for_package(package_name)
|
||||
|
||||
self.results.append({
|
||||
"package_name": package_name,
|
||||
"file_name": file_name,
|
||||
@@ -96,6 +134,7 @@ class Packages(AndroidExtraction):
|
||||
"disabled": False,
|
||||
"system": False,
|
||||
"third_party": False,
|
||||
"files": package_files,
|
||||
})
|
||||
|
||||
cmds = [
|
||||
|
||||
@@ -71,7 +71,9 @@ class SMS(AndroidExtraction):
|
||||
|
||||
def _parse_db(self, db_path):
|
||||
"""Parse an Android bugle_db SMS database file.
|
||||
|
||||
:param db_path: Path to the Android SMS database file to process
|
||||
|
||||
"""
|
||||
conn = sqlite3.connect(db_path)
|
||||
cur = conn.cursor()
|
||||
|
||||
@@ -48,7 +48,9 @@ class Whatsapp(AndroidExtraction):
|
||||
|
||||
def _parse_db(self, db_path):
|
||||
"""Parse an Android msgstore.db WhatsApp database file.
|
||||
|
||||
:param db_path: Path to the Android WhatsApp database file to process
|
||||
|
||||
"""
|
||||
conn = sqlite3.connect(db_path)
|
||||
cur = conn.cursor()
|
||||
|
||||
@@ -15,6 +15,8 @@ class IndicatorsFileBadFormat(Exception):
|
||||
class Indicators:
|
||||
"""This class is used to parse indicators from a STIX2 file and provide
|
||||
functions to compare extracted artifacts to the indicators.
|
||||
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, log=None):
|
||||
@@ -23,6 +25,8 @@ class Indicators:
|
||||
self.ioc_processes = []
|
||||
self.ioc_emails = []
|
||||
self.ioc_files = []
|
||||
self.ioc_files_sha256 = []
|
||||
self.ioc_app_ids = []
|
||||
self.ioc_count = 0
|
||||
|
||||
def _add_indicator(self, ioc, iocs_list):
|
||||
@@ -32,6 +36,10 @@ class Indicators:
|
||||
|
||||
def parse_stix2(self, file_path):
|
||||
"""Extract indicators from a STIX2 file.
|
||||
|
||||
:param file_path: Path to the STIX2 file to parse
|
||||
:type file_path: str
|
||||
|
||||
"""
|
||||
self.log.info("Parsing STIX2 indicators file at path %s",
|
||||
file_path)
|
||||
@@ -63,10 +71,26 @@ class Indicators:
|
||||
elif key == "file:name":
|
||||
self._add_indicator(ioc=value,
|
||||
iocs_list=self.ioc_files)
|
||||
elif key == "app:id":
|
||||
self._add_indicator(ioc=value,
|
||||
iocs_list=self.ioc_app_ids)
|
||||
elif key == "file:hashes.sha256":
|
||||
self._add_indicator(ioc=value,
|
||||
iocs_list=self.ioc_files_sha256)
|
||||
|
||||
def check_domain(self, url):
|
||||
def check_domain(self, url) -> bool:
|
||||
"""Check if a given URL matches any of the provided domain indicators.
|
||||
|
||||
:param url: URL to match against domain indicators
|
||||
:type url: str
|
||||
:returns: True if the URL matched an indicator, otherwise False
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# TODO: If the IOC domain contains a subdomain, it is not currently
|
||||
# being matched.
|
||||
if not url:
|
||||
return False
|
||||
|
||||
try:
|
||||
# First we use the provided URL.
|
||||
@@ -124,18 +148,35 @@ class Indicators:
|
||||
|
||||
return True
|
||||
|
||||
def check_domains(self, urls):
|
||||
"""Check the provided list of (suspicious) domains against a list of URLs.
|
||||
:param urls: List of URLs to check
|
||||
return False
|
||||
|
||||
def check_domains(self, urls) -> bool:
|
||||
"""Check a list of URLs against the provided list of domain indicators.
|
||||
|
||||
:param urls: List of URLs to check against domain indicators
|
||||
:type urls: list
|
||||
:returns: True if any URL matched an indicator, otherwise False
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not urls:
|
||||
return False
|
||||
|
||||
for url in urls:
|
||||
if self.check_domain(url):
|
||||
return True
|
||||
|
||||
def check_process(self, process):
|
||||
return False
|
||||
|
||||
def check_process(self, process) -> bool:
|
||||
"""Check the provided process name against the list of process
|
||||
indicators.
|
||||
:param process: Process name to check
|
||||
|
||||
:param process: Process name to check against process indicators
|
||||
:type process: str
|
||||
:returns: True if process matched an indicator, otherwise False
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not process:
|
||||
return False
|
||||
@@ -151,18 +192,35 @@ class Indicators:
|
||||
self.log.warning("Found a truncated known suspicious process name \"%s\"", process)
|
||||
return True
|
||||
|
||||
def check_processes(self, processes):
|
||||
return False
|
||||
|
||||
def check_processes(self, processes) -> bool:
|
||||
"""Check the provided list of processes against the list of
|
||||
process indicators.
|
||||
:param processes: List of processes to check
|
||||
|
||||
:param processes: List of processes to check against process indicators
|
||||
:type processes: list
|
||||
:returns: True if process matched an indicator, otherwise False
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not processes:
|
||||
return False
|
||||
|
||||
for process in processes:
|
||||
if self.check_process(process):
|
||||
return True
|
||||
|
||||
def check_email(self, email):
|
||||
return False
|
||||
|
||||
def check_email(self, email) -> bool:
|
||||
"""Check the provided email against the list of email indicators.
|
||||
:param email: Suspicious email to check
|
||||
|
||||
:param email: Email address to check against email indicators
|
||||
:type email: str
|
||||
:returns: True if email address matched an indicator, otherwise False
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not email:
|
||||
return False
|
||||
@@ -171,9 +229,17 @@ class Indicators:
|
||||
self.log.warning("Found a known suspicious email address: \"%s\"", email)
|
||||
return True
|
||||
|
||||
def check_file(self, file_path):
|
||||
return False
|
||||
|
||||
def check_file(self, file_path) -> bool:
|
||||
"""Check the provided file path against the list of file indicators.
|
||||
:param file_path: Path or name of the file to check
|
||||
|
||||
:param file_path: File path or file name to check against file
|
||||
indicators
|
||||
:type file_path: str
|
||||
:returns: True if the file path matched an indicator, otherwise False
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not file_path:
|
||||
return False
|
||||
@@ -182,3 +248,5 @@ class Indicators:
|
||||
if file_name in self.ioc_files:
|
||||
self.log.warning("Found a known suspicious file: \"%s\"", file_path)
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
@@ -31,12 +31,18 @@ class MVTModule(object):
|
||||
def __init__(self, file_path=None, base_folder=None, output_folder=None,
|
||||
fast_mode=False, log=None, results=[]):
|
||||
"""Initialize module.
|
||||
:param file_path: Path to the module's database file, if there is any.
|
||||
|
||||
:param file_path: Path to the module's database file, if there is any
|
||||
:type file_path: str
|
||||
:param base_folder: Path to the base folder (backup or filesystem dump)
|
||||
:type file_path: str
|
||||
:param output_folder: Folder where results will be stored
|
||||
:type output_folder: str
|
||||
:param fast_mode: Flag to enable or disable slow modules
|
||||
:type fast_mode: bool
|
||||
:param log: Handle to logger
|
||||
:param results: Provided list of results entries
|
||||
:type results: list
|
||||
"""
|
||||
self.file_path = file_path
|
||||
self.base_folder = base_folder
|
||||
@@ -59,23 +65,23 @@ class MVTModule(object):
|
||||
return cls(results=results, log=log)
|
||||
|
||||
def get_slug(self):
|
||||
"""Use the module's class name to retrieve a slug"""
|
||||
if self.slug:
|
||||
return self.slug
|
||||
|
||||
sub = re.sub("(.)([A-Z][a-z]+)", r"\1_\2", self.__class__.__name__)
|
||||
return re.sub("([a-z0-9])([A-Z])", r"\1_\2", sub).lower()
|
||||
|
||||
def load_indicators(self, file_path):
|
||||
self.indicators = Indicators(file_path, self.log)
|
||||
|
||||
def check_indicators(self):
|
||||
"""Check the results of this module against a provided list of
|
||||
indicators."""
|
||||
indicators.
|
||||
|
||||
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def save_to_json(self):
|
||||
"""Save the collected results to a json file.
|
||||
"""
|
||||
"""Save the collected results to a json file."""
|
||||
if not self.output_folder:
|
||||
return
|
||||
|
||||
@@ -102,15 +108,18 @@ class MVTModule(object):
|
||||
|
||||
@staticmethod
|
||||
def _deduplicate_timeline(timeline):
|
||||
"""Serialize entry as JSON to deduplicate repeated entries"""
|
||||
"""Serialize entry as JSON to deduplicate repeated entries
|
||||
|
||||
:param timeline: List of entries from timeline to deduplicate
|
||||
|
||||
"""
|
||||
timeline_set = set()
|
||||
for record in timeline:
|
||||
timeline_set.add(json.dumps(record, sort_keys=True))
|
||||
return [json.loads(record) for record in timeline_set]
|
||||
|
||||
def to_timeline(self):
|
||||
"""Convert results into a timeline.
|
||||
"""
|
||||
"""Convert results into a timeline."""
|
||||
for result in self.results:
|
||||
record = self.serialize(result)
|
||||
if record:
|
||||
@@ -132,8 +141,7 @@ class MVTModule(object):
|
||||
self.timeline_detected = self._deduplicate_timeline(self.timeline_detected)
|
||||
|
||||
def run(self):
|
||||
"""Run the main module procedure.
|
||||
"""
|
||||
"""Run the main module procedure."""
|
||||
raise NotImplementedError
|
||||
|
||||
|
||||
@@ -178,8 +186,10 @@ def run_module(module):
|
||||
|
||||
def save_timeline(timeline, timeline_path):
|
||||
"""Save the timeline in a csv file.
|
||||
:param timeline: List of records to order and store.
|
||||
:param timeline_path: Path to the csv file to store the timeline to.
|
||||
|
||||
:param timeline: List of records to order and store
|
||||
:param timeline_path: Path to the csv file to store the timeline to
|
||||
|
||||
"""
|
||||
with io.open(timeline_path, "a+", encoding="utf-8") as handle:
|
||||
csvoutput = csv.writer(handle, delimiter=",", quotechar="\"")
|
||||
|
||||
@@ -9,8 +9,7 @@ from click import Option, UsageError
|
||||
|
||||
|
||||
class MutuallyExclusiveOption(Option):
|
||||
"""This class extends click to support mutually exclusive options.
|
||||
"""
|
||||
"""This class extends click to support mutually exclusive options."""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.mutually_exclusive = set(kwargs.pop("mutually_exclusive", []))
|
||||
|
||||
@@ -7,6 +7,7 @@ import requests
|
||||
from tld import get_tld
|
||||
|
||||
SHORTENER_DOMAINS = [
|
||||
"1drv.ms",
|
||||
"1link.in",
|
||||
"1url.com",
|
||||
"2big.at",
|
||||
@@ -15,29 +16,29 @@ SHORTENER_DOMAINS = [
|
||||
"2ya.com",
|
||||
"4url.cc",
|
||||
"6url.com",
|
||||
"a.gg",
|
||||
"a.nf",
|
||||
"a2a.me",
|
||||
"abbrr.com",
|
||||
"adf.ly",
|
||||
"adjix.com",
|
||||
"a.gg",
|
||||
"alturl.com",
|
||||
"a.nf",
|
||||
"atu.ca",
|
||||
"b23.ru",
|
||||
"bacn.me",
|
||||
"bit.ly",
|
||||
"bit.do",
|
||||
"bit.ly",
|
||||
"bkite.com",
|
||||
"bloat.me",
|
||||
"budurl.com",
|
||||
"buff.ly",
|
||||
"buk.me",
|
||||
"burnurl.com",
|
||||
"c-o.in",
|
||||
"chilp.it",
|
||||
"clck.ru",
|
||||
"clickmeter.com",
|
||||
"cli.gs",
|
||||
"c-o.in",
|
||||
"clickmeter.com",
|
||||
"cort.as",
|
||||
"cut.ly",
|
||||
"cuturl.com",
|
||||
@@ -55,19 +56,20 @@ SHORTENER_DOMAINS = [
|
||||
"esyurl.com",
|
||||
"ewerl.com",
|
||||
"fa.b",
|
||||
"fff.to",
|
||||
"ff.im",
|
||||
"fff.to",
|
||||
"fhurl.com",
|
||||
"fire.to",
|
||||
"firsturl.de",
|
||||
"flic.kr",
|
||||
"fly2.ws",
|
||||
"fon.gs",
|
||||
"forms.gle",
|
||||
"fwd4.me",
|
||||
"gl.am",
|
||||
"go2cut.com",
|
||||
"go2.me",
|
||||
"go.9nl.com",
|
||||
"go2.me",
|
||||
"go2cut.com",
|
||||
"goo.gl",
|
||||
"goshrink.com",
|
||||
"gowat.ch",
|
||||
@@ -77,6 +79,7 @@ SHORTENER_DOMAINS = [
|
||||
"hex.io",
|
||||
"hover.com",
|
||||
"href.in",
|
||||
"ht.ly",
|
||||
"htxt.it",
|
||||
"hugeurl.com",
|
||||
"hurl.it",
|
||||
@@ -85,8 +88,8 @@ SHORTENER_DOMAINS = [
|
||||
"icanhaz.com",
|
||||
"idek.net",
|
||||
"inreply.to",
|
||||
"iscool.net",
|
||||
"is.gd",
|
||||
"iscool.net",
|
||||
"iterasi.net",
|
||||
"jijr.com",
|
||||
"jmp2.net",
|
||||
@@ -102,10 +105,11 @@ SHORTENER_DOMAINS = [
|
||||
"linkbee.com",
|
||||
"linkbun.ch",
|
||||
"liurl.cn",
|
||||
"lnk.gd",
|
||||
"lnk.in",
|
||||
"ln-s.net",
|
||||
"ln-s.ru",
|
||||
"lnk.gd",
|
||||
"lnk.in",
|
||||
"lnkd.in",
|
||||
"loopt.us",
|
||||
"lru.jp",
|
||||
"lt.tl",
|
||||
@@ -122,44 +126,44 @@ SHORTENER_DOMAINS = [
|
||||
"nn.nf",
|
||||
"notlong.com",
|
||||
"nsfw.in",
|
||||
"o-x.fr",
|
||||
"om.ly",
|
||||
"ow.ly",
|
||||
"o-x.fr",
|
||||
"pd.am",
|
||||
"pic.gd",
|
||||
"ping.fm",
|
||||
"piurl.com",
|
||||
"pnt.me",
|
||||
"poprl.com",
|
||||
"posted.at",
|
||||
"post.ly",
|
||||
"posted.at",
|
||||
"profile.to",
|
||||
"qicute.com",
|
||||
"qlnk.net",
|
||||
"quip-art.com",
|
||||
"rb6.me",
|
||||
"redirx.com",
|
||||
"rickroll.it",
|
||||
"ri.ms",
|
||||
"rickroll.it",
|
||||
"riz.gd",
|
||||
"rsmonkey.com",
|
||||
"rubyurl.com",
|
||||
"ru.ly",
|
||||
"rubyurl.com",
|
||||
"s7y.us",
|
||||
"safe.mn",
|
||||
"sharein.com",
|
||||
"sharetabs.com",
|
||||
"shorl.com",
|
||||
"short.ie",
|
||||
"short.to",
|
||||
"shortlinks.co.uk",
|
||||
"shortna.me",
|
||||
"short.to",
|
||||
"shorturl.com",
|
||||
"shoturl.us",
|
||||
"shrinkify.com",
|
||||
"shrinkster.com",
|
||||
"shrten.com",
|
||||
"shrt.st",
|
||||
"shrten.com",
|
||||
"shrunkin.com",
|
||||
"shw.me",
|
||||
"simurl.com",
|
||||
@@ -177,20 +181,20 @@ SHORTENER_DOMAINS = [
|
||||
"tcrn.ch",
|
||||
"thrdl.es",
|
||||
"tighturl.com",
|
||||
"tiny123.com",
|
||||
"tinyarro.ws",
|
||||
"tiny.cc",
|
||||
"tiny.pl",
|
||||
"tiny123.com",
|
||||
"tinyarro.ws",
|
||||
"tinytw.it",
|
||||
"tinyuri.ca",
|
||||
"tinyurl.com",
|
||||
"tinyvid.io",
|
||||
"tnij.org",
|
||||
"togoto.us",
|
||||
"to.ly",
|
||||
"traceurl.com",
|
||||
"togoto.us",
|
||||
"tr.im",
|
||||
"tr.my",
|
||||
"traceurl.com",
|
||||
"turo.us",
|
||||
"tweetburner.com",
|
||||
"twirl.at",
|
||||
@@ -200,21 +204,21 @@ SHORTENER_DOMAINS = [
|
||||
"twiturl.de",
|
||||
"twurl.cc",
|
||||
"twurl.nl",
|
||||
"u6e.de",
|
||||
"ub0.cc",
|
||||
"u.mavrev.com",
|
||||
"u.nu",
|
||||
"u6e.de",
|
||||
"ub0.cc",
|
||||
"updating.me",
|
||||
"ur1.ca",
|
||||
"url.co.uk",
|
||||
"url.ie",
|
||||
"url4.eu",
|
||||
"urlao.com",
|
||||
"urlbrief.com",
|
||||
"url.co.uk",
|
||||
"urlcover.com",
|
||||
"urlcut.com",
|
||||
"urlenco.de",
|
||||
"urlhawk.com",
|
||||
"url.ie",
|
||||
"urlkiss.com",
|
||||
"urlot.com",
|
||||
"urlpire.com",
|
||||
@@ -227,27 +231,23 @@ SHORTENER_DOMAINS = [
|
||||
"wapurl.co.uk",
|
||||
"wipi.es",
|
||||
"wp.me",
|
||||
"xaddr.com",
|
||||
"x.co",
|
||||
"x.se",
|
||||
"xaddr.com",
|
||||
"xeeurl.com",
|
||||
"xr.com",
|
||||
"xrl.in",
|
||||
"xrl.us",
|
||||
"x.se",
|
||||
"xurl.jp",
|
||||
"xzb.cc",
|
||||
"yep.it",
|
||||
"yfrog.com",
|
||||
"ymlp.com",
|
||||
"yweb.com",
|
||||
"zi.ma",
|
||||
"zi.pe",
|
||||
"zipmyurl.com",
|
||||
"zz.gd",
|
||||
"ymlp.com",
|
||||
"forms.gle",
|
||||
"ht.ly",
|
||||
"lnkd.in",
|
||||
"1drv.ms",
|
||||
]
|
||||
|
||||
class URL:
|
||||
@@ -263,8 +263,12 @@ class URL:
|
||||
|
||||
def get_domain(self):
|
||||
"""Get the domain from a URL.
|
||||
|
||||
:param url: URL to parse
|
||||
:returns: Just the domain name extracted from the URL
|
||||
:type url: str
|
||||
:returns: Domain name extracted from URL
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
# TODO: Properly handle exception.
|
||||
try:
|
||||
@@ -273,9 +277,13 @@ class URL:
|
||||
return None
|
||||
|
||||
def get_top_level(self):
|
||||
"""Get only the top level domain from a URL.
|
||||
"""Get only the top-level domain from a URL.
|
||||
|
||||
:param url: URL to parse
|
||||
:returns: The top level domain extracted from the URL
|
||||
:type url: str
|
||||
:returns: Top-level domain name extracted from URL
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
# TODO: Properly handle exception.
|
||||
try:
|
||||
@@ -283,13 +291,22 @@ class URL:
|
||||
except:
|
||||
return None
|
||||
|
||||
def check_if_shortened(self):
|
||||
def check_if_shortened(self) -> bool:
|
||||
"""Check if the URL is among list of shortener services.
|
||||
|
||||
|
||||
:returns: True if the URL is shortened, otherwise False
|
||||
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if self.domain.lower() in SHORTENER_DOMAINS:
|
||||
self.is_shortened = True
|
||||
|
||||
return self.is_shortened
|
||||
|
||||
def unshorten(self):
|
||||
"""Unshorten the URL by requesting an HTTP HEAD response."""
|
||||
res = requests.head(self.url)
|
||||
if str(res.status_code).startswith("30"):
|
||||
return res.headers["Location"]
|
||||
|
||||
@@ -10,8 +10,13 @@ import re
|
||||
|
||||
def convert_mactime_to_unix(timestamp, from_2001=True):
|
||||
"""Converts Mac Standard Time to a Unix timestamp.
|
||||
:param timestamp: MacTime timestamp (either int or float)
|
||||
:returns: Unix epoch timestamp
|
||||
|
||||
:param timestamp: MacTime timestamp (either int or float).
|
||||
:type timestamp: int
|
||||
:param from_2001: bool: Whether to (Default value = True)
|
||||
:param from_2001: Default value = True)
|
||||
:returns: Unix epoch timestamp.
|
||||
|
||||
"""
|
||||
if not timestamp:
|
||||
return None
|
||||
@@ -34,8 +39,11 @@ def convert_mactime_to_unix(timestamp, from_2001=True):
|
||||
|
||||
def convert_chrometime_to_unix(timestamp):
|
||||
"""Converts Chrome timestamp to a Unix timestamp.
|
||||
:param timestamp: Chrome timestamp as int
|
||||
:returns: Unix epoch timestamp
|
||||
|
||||
:param timestamp: Chrome timestamp as int.
|
||||
:type timestamp: int
|
||||
:returns: Unix epoch timestamp.
|
||||
|
||||
"""
|
||||
epoch_start = datetime.datetime(1601, 1 , 1)
|
||||
delta = datetime.timedelta(microseconds=timestamp)
|
||||
@@ -44,8 +52,12 @@ def convert_chrometime_to_unix(timestamp):
|
||||
|
||||
def convert_timestamp_to_iso(timestamp):
|
||||
"""Converts Unix timestamp to ISO string.
|
||||
:param timestamp: Unix timestamp
|
||||
:returns: ISO timestamp string in YYYY-mm-dd HH:MM:SS.ms format
|
||||
|
||||
:param timestamp: Unix timestamp.
|
||||
:type timestamp: int
|
||||
:returns: ISO timestamp string in YYYY-mm-dd HH:MM:SS.ms format.
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
try:
|
||||
return timestamp.strftime("%Y-%m-%d %H:%M:%S.%f")
|
||||
@@ -54,15 +66,20 @@ def convert_timestamp_to_iso(timestamp):
|
||||
|
||||
def check_for_links(text):
|
||||
"""Checks if a given text contains HTTP links.
|
||||
:param text: Any provided text
|
||||
:returns: Search results
|
||||
|
||||
:param text: Any provided text.
|
||||
:type text: str
|
||||
:returns: Search results.
|
||||
|
||||
"""
|
||||
return re.findall("(?P<url>https?://[^\s]+)", text, re.IGNORECASE)
|
||||
|
||||
def get_sha256_from_file_path(file_path):
|
||||
"""Calculate the SHA256 hash of a file from a file path.
|
||||
|
||||
:param file_path: Path to the file to hash
|
||||
:returns: The SHA256 hash string
|
||||
|
||||
"""
|
||||
sha256_hash = hashlib.sha256()
|
||||
with open(file_path, "rb") as handle:
|
||||
@@ -75,8 +92,11 @@ def get_sha256_from_file_path(file_path):
|
||||
# https://stackoverflow.com/questions/57014259/json-dumps-on-dictionary-with-bytes-for-keys
|
||||
def keys_bytes_to_string(obj):
|
||||
"""Convert object keys from bytes to string.
|
||||
|
||||
:param obj: Object to convert from bytes to string.
|
||||
:returns: Converted object.
|
||||
:returns: Object converted to string.
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
new_obj = {}
|
||||
if not isinstance(obj, dict):
|
||||
|
||||
@@ -6,7 +6,7 @@
import requests
from packaging import version

MVT_VERSION = "1.3"
MVT_VERSION = "1.2.14"


def check_for_updates():
    res = requests.get("https://pypi.org/pypi/mvt/json")
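A hedged sketch of what this update check boils down to: fetch the latest release metadata from PyPI's JSON API and compare it with the pinned `MVT_VERSION` using `packaging.version`:

```python
import requests
from packaging import version

MVT_VERSION = "1.2.14"

# Sketch of the update check: PyPI's JSON API exposes the latest release
# under info.version; compare it against the locally pinned version.
latest = requests.get("https://pypi.org/pypi/mvt/json").json()["info"]["version"]
if version.parse(latest) > version.parse(MVT_VERSION):
    print(f"A newer MVT version ({latest}) is available on PyPI")
```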
@@ -38,6 +38,14 @@ def cli():
    logo()


#==============================================================================
# Command: version
#==============================================================================
@cli.command("version", help="Show the currently installed version of MVT")
def version():
    return


#==============================================================================
# Command: decrypt-backup
#==============================================================================
@@ -17,6 +17,8 @@ log = logging.getLogger(__name__)
|
||||
class DecryptBackup:
|
||||
"""This class provides functions to decrypt an encrypted iTunes backup
|
||||
using either a password or a key file.
|
||||
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, backup_path, dest_path=None):
|
||||
@@ -35,7 +37,9 @@ class DecryptBackup:
|
||||
@staticmethod
|
||||
def is_encrypted(backup_path) -> bool:
|
||||
"""Query Manifest.db file to see if it's encrypted or not.
|
||||
|
||||
:param backup_path: Path to the backup to decrypt
|
||||
|
||||
"""
|
||||
conn = sqlite3.connect(os.path.join(backup_path, "Manifest.db"))
|
||||
cur = conn.cursor()
|
||||
@@ -95,7 +99,9 @@ class DecryptBackup:
|
||||
|
||||
def decrypt_with_password(self, password):
|
||||
"""Decrypts an encrypted iOS backup.
|
||||
|
||||
:param password: Password to use to decrypt the original backup
|
||||
|
||||
"""
|
||||
log.info("Decrypting iOS backup at path %s with password", self.backup_path)
|
||||
|
||||
@@ -131,7 +137,9 @@ class DecryptBackup:
|
||||
|
||||
def decrypt_with_key_file(self, key_file):
|
||||
"""Decrypts an encrypted iOS backup using a key file.
|
||||
|
||||
:param key_file: File to read the key bytes to decrypt the backup
|
||||
|
||||
"""
|
||||
log.info("Decrypting iOS backup at path %s with key file %s",
|
||||
self.backup_path, key_file)
|
||||
@@ -158,8 +166,7 @@ class DecryptBackup:
|
||||
log.critical("Failed to decrypt backup. Did you provide the correct key file?")
|
||||
|
||||
def get_key(self):
|
||||
"""Retrieve and prints the encryption key.
|
||||
"""
|
||||
"""Retrieve and prints the encryption key."""
|
||||
if not self._backup:
|
||||
return
|
||||
|
||||
@@ -169,7 +176,9 @@ class DecryptBackup:
|
||||
|
||||
def write_key(self, key_path):
|
||||
"""Save extracted key to file.
|
||||
|
||||
:param key_path: Path to the file where to write the derived decryption key.
|
||||
|
||||
"""
|
||||
if not self._decryption_key:
|
||||
return
|
||||
|
||||
@@ -11,8 +11,7 @@ from ..base import IOSExtraction
|
||||
CONF_PROFILES_DOMAIN = "SysSharedContainerDomain-systemgroup.com.apple.configurationprofiles"
|
||||
|
||||
class ConfigurationProfiles(IOSExtraction):
|
||||
"""This module extracts the full plist data from configuration profiles.
|
||||
"""
|
||||
"""This module extracts the full plist data from configuration profiles."""
|
||||
|
||||
def __init__(self, file_path=None, base_folder=None, output_folder=None,
|
||||
fast_mode=False, log=None, results=[]):
|
||||
|
||||
@@ -27,12 +27,19 @@ class Manifest(IOSExtraction):
|
||||
def _get_key(self, dictionary, key):
|
||||
"""Unserialized plist objects can have keys which are str or byte types
|
||||
This is a helper to try fetch a key as both a byte or string type.
|
||||
|
||||
:param dictionary: param key:
|
||||
:param key:
|
||||
|
||||
"""
|
||||
return dictionary.get(key.encode("utf-8"), None) or dictionary.get(key, None)
|
||||
|
||||
@staticmethod
|
||||
def _convert_timestamp(timestamp_or_unix_time_int):
|
||||
"""Older iOS versions stored the manifest times as unix timestamps.
|
||||
|
||||
:param timestamp_or_unix_time_int:
|
||||
|
||||
"""
|
||||
if isinstance(timestamp_or_unix_time_int, datetime.datetime):
|
||||
return convert_timestamp_to_iso(timestamp_or_unix_time_int)
|
||||
|
||||
@@ -14,6 +14,8 @@ CONF_PROFILES_EVENTS_RELPATH = "Library/ConfigurationProfiles/MCProfileEvents.pl
|
||||
class ProfileEvents(IOSExtraction):
|
||||
"""This module extracts events related to the installation of configuration
|
||||
profiles.
|
||||
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, file_path=None, base_folder=None, output_folder=None,
|
||||
|
||||
@@ -28,7 +28,9 @@ class IOSExtraction(MVTModule):
|
||||
|
||||
def _recover_sqlite_db_if_needed(self, file_path):
|
||||
"""Tries to recover a malformed database by running a .clone command.
|
||||
|
||||
:param file_path: Path to the malformed database file.
|
||||
|
||||
"""
|
||||
# TODO: Find a better solution.
|
||||
conn = sqlite3.connect(file_path)
|
||||
@@ -65,8 +67,10 @@ class IOSExtraction(MVTModule):
|
||||
|
||||
def _get_backup_files_from_manifest(self, relative_path=None, domain=None):
|
||||
"""Locate files from Manifest.db.
|
||||
:param relative_path: Relative path to use as filter from Manifest.db.
|
||||
:param domain: Domain to use as filter from Manifest.db.
|
||||
|
||||
:param relative_path: Relative path to use as filter from Manifest.db. (Default value = None)
|
||||
:param domain: Domain to use as filter from Manifest.db. (Default value = None)
|
||||
|
||||
"""
|
||||
manifest_db_path = os.path.join(self.base_folder, "Manifest.db")
|
||||
if not os.path.exists(manifest_db_path):
|
||||
@@ -116,8 +120,11 @@ class IOSExtraction(MVTModule):
|
||||
modules that expect to work with a single SQLite database.
|
||||
If a module requires to process multiple databases or files,
|
||||
you should use the helper functions above.
|
||||
|
||||
:param backup_id: iTunes backup database file's ID (or hash).
|
||||
:param root_paths: Glob patterns for files to seek in filesystem dump.
|
||||
:param root_paths: Glob patterns for files to seek in filesystem dump. (Default value = [])
|
||||
:param backup_ids: Default value = None)
|
||||
|
||||
"""
|
||||
file_path = None
|
||||
# First we check if the was an explicit file path specified.
|
||||
|
||||
@@ -6,11 +6,14 @@
|
||||
from .cache_files import CacheFiles
|
||||
from .filesystem import Filesystem
|
||||
from .net_netusage import Netusage
|
||||
from .networking_analytics import NetworkingAnalytics
|
||||
from .safari_favicon import SafariFavicon
|
||||
from .shutdownlog import ShutdownLog
|
||||
from .version_history import IOSVersionHistory
|
||||
from .webkit_indexeddb import WebkitIndexedDB
|
||||
from .webkit_localstorage import WebkitLocalStorage
|
||||
from .webkit_safariviewservice import WebkitSafariViewService
|
||||
|
||||
FS_MODULES = [CacheFiles, Filesystem, Netusage, SafariFavicon, IOSVersionHistory,
|
||||
WebkitIndexedDB, WebkitLocalStorage, WebkitSafariViewService,]
|
||||
FS_MODULES = [CacheFiles, Filesystem, Netusage, NetworkingAnalytics, SafariFavicon, ShutdownLog,
|
||||
IOSVersionHistory, WebkitIndexedDB, WebkitLocalStorage,
|
||||
WebkitSafariViewService,]
|
||||
|
||||
@@ -13,7 +13,10 @@ from ..base import IOSExtraction
|
||||
|
||||
class Filesystem(IOSExtraction):
|
||||
"""This module extracts creation and modification date of files from a
|
||||
full file-system dump."""
|
||||
full file-system dump.
|
||||
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, file_path=None, base_folder=None, output_folder=None,
|
||||
fast_mode=False, log=None, results=[]):
|
||||
@@ -25,8 +28,8 @@ class Filesystem(IOSExtraction):
|
||||
return {
|
||||
"timestamp": record["modified"],
|
||||
"module": self.__class__.__name__,
|
||||
"event": "file_modified",
|
||||
"data": record["file_path"],
|
||||
"event": "entry_modified",
|
||||
"data": record["path"],
|
||||
}
|
||||
|
||||
def check_indicators(self):
|
||||
@@ -34,16 +37,39 @@ class Filesystem(IOSExtraction):
|
||||
return
|
||||
|
||||
for result in self.results:
|
||||
if self.indicators.check_file(result["file_path"]):
|
||||
if self.indicators.check_file(result["path"]):
|
||||
self.log.warning("Found a known malicious file at path: %s", result["path"])
|
||||
self.detected.append(result)
|
||||
|
||||
# If we are instructed to run fast, we skip this.
|
||||
if self.fast_mode:
|
||||
self.log.info("Flag --fast was enabled: skipping extended search for suspicious files/processes")
|
||||
else:
|
||||
for ioc in self.indicators.ioc_processes:
|
||||
parts = result["path"].split("/")
|
||||
if ioc in parts:
|
||||
self.log.warning("Found a known malicious file/process at path: %s", result["path"])
|
||||
self.detected.append(result)
|
||||
|
||||
def run(self):
|
||||
for root, dirs, files in os.walk(self.base_folder):
|
||||
for dir_name in dirs:
|
||||
try:
|
||||
dir_path = os.path.join(root, dir_name)
|
||||
result = {
|
||||
"path": os.path.relpath(dir_path, self.base_folder),
|
||||
"modified": convert_timestamp_to_iso(datetime.datetime.utcfromtimestamp(os.stat(dir_path).st_mtime)),
|
||||
}
|
||||
except:
|
||||
continue
|
||||
else:
|
||||
self.results.append(result)
|
||||
|
||||
for file_name in files:
|
||||
try:
|
||||
file_path = os.path.join(root, file_name)
|
||||
result = {
|
||||
"file_path": os.path.relpath(file_path, self.base_folder),
|
||||
"path": os.path.relpath(file_path, self.base_folder),
|
||||
"modified": convert_timestamp_to_iso(datetime.datetime.utcfromtimestamp(os.stat(file_path).st_mtime)),
|
||||
}
|
||||
except:
|
||||
|
||||
@@ -14,7 +14,10 @@ NETUSAGE_ROOT_PATHS = [
|
||||
|
||||
class Netusage(NetBase):
|
||||
"""This class extracts data from netusage.sqlite and attempts to identify
|
||||
any suspicious processes if running on a full filesystem dump."""
|
||||
any suspicious processes if running on a full filesystem dump.
|
||||
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, file_path=None, base_folder=None, output_folder=None,
|
||||
fast_mode=False, log=None, results=[]):
|
||||
|
||||
mvt/ios/modules/fs/networking_analytics.py (new file, 91 lines)
@@ -0,0 +1,91 @@
|
||||
# Mobile Verification Toolkit (MVT)
|
||||
# Copyright (c) 2021 The MVT Project Authors.
|
||||
# Use of this software is governed by the MVT License 1.1 that can be found at
|
||||
# https://license.mvt.re/1.1/
|
||||
|
||||
import plistlib
|
||||
import sqlite3
|
||||
|
||||
from mvt.common.utils import convert_mactime_to_unix, convert_timestamp_to_iso
|
||||
|
||||
from ..base import IOSExtraction
|
||||
|
||||
NETWORKING_ANALYTICS_DB_PATH = [
|
||||
"private/var/Keychains/Analytics/networking_analytics.db",
|
||||
]
|
||||
|
||||
class NetworkingAnalytics(IOSExtraction):
|
||||
"""This module extracts information from the networking_analytics.db file."""
|
||||
|
||||
def __init__(self, file_path=None, base_folder=None, output_folder=None,
|
||||
fast_mode=False, log=None, results=[]):
|
||||
super().__init__(file_path=file_path, base_folder=base_folder,
|
||||
output_folder=output_folder, fast_mode=fast_mode,
|
||||
log=log, results=results)
|
||||
|
||||
def serialize(self, record):
|
||||
return {
|
||||
"timestamp": record["timestamp"],
|
||||
"module": self.__class__.__name__,
|
||||
"event": "network_crash",
|
||||
"data": f"{record}",
|
||||
}
|
||||
|
||||
def check_indicators(self):
|
||||
if not self.indicators:
|
||||
return
|
||||
|
||||
for result in self.results:
|
||||
for ioc in self.indicators.ioc_processes:
|
||||
for key in result.keys():
|
||||
if ioc == result[key]:
|
||||
self.log.warning("Found mention of a known malicious process \"%s\" in networking_analytics.db at %s",
|
||||
ioc, result["timestamp"])
|
||||
self.detected.append(result)
|
||||
break
|
||||
|
||||
def _extract_networking_analytics_data(self):
|
||||
conn = sqlite3.connect(self.file_path)
|
||||
cur = conn.cursor()
|
||||
cur.execute("""
|
||||
SELECT
|
||||
timestamp,
|
||||
data
|
||||
FROM hard_failures
|
||||
UNION
|
||||
SELECT
|
||||
timestamp,
|
||||
data
|
||||
FROM soft_failures;
|
||||
""")
|
||||
|
||||
for row in cur:
|
||||
if row[0] and row[1]:
|
||||
timestamp = convert_timestamp_to_iso(convert_mactime_to_unix(row[0], False))
|
||||
data = plistlib.loads(row[1])
|
||||
data["timestamp"] = timestamp
|
||||
elif row[0]:
|
||||
timestamp = convert_timestamp_to_iso(convert_mactime_to_unix(row[0], False))
|
||||
data = {}
|
||||
data["timestamp"] = timestamp
|
||||
elif row[1]:
|
||||
timestamp = ""
|
||||
data = plistlib.loads(row[1])
|
||||
data["timestamp"] = timestamp
|
||||
|
||||
self.results.append(data)
|
||||
|
||||
self.results = sorted(self.results, key=lambda entry: entry["timestamp"])
|
||||
|
||||
cur.close()
|
||||
conn.close()
|
||||
|
||||
self.log.info("Extracted information on %d network crashes", len(self.results))
|
||||
|
||||
def run(self):
|
||||
self._find_ios_database(root_paths=NETWORKING_ANALYTICS_DB_PATH)
|
||||
if self.file_path:
|
||||
self.log.info("Found networking_analytics.db log at path: %s", self.file_path)
|
||||
self._extract_networking_analytics_data()
|
||||
else:
|
||||
self.log.info("networking_analytics.db not found")
|
||||
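A minimal sketch of the decoding step this new module performs on each row from hard_failures/soft_failures: the `data` column holds a binary property list that is loaded with `plistlib.loads` and annotated with the row timestamp. The sample field names ("process", "errorCode") and the numeric timestamp are purely illustrative, not taken from a real database.

```python
import plistlib

# Synthetic stand-in for one (timestamp, data) row; the real blob layout may differ.
sample_blob = plistlib.dumps({"process": "com.apple.WebKit.Networking", "errorCode": -1001})
sample_row = (656784000.0, sample_blob)

record = plistlib.loads(sample_row[1])   # decode the plist payload into a dict
record["timestamp"] = sample_row[0]      # keep the raw timestamp alongside it
print(record)
```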
81 mvt/ios/modules/fs/shutdownlog.py Normal file
@@ -0,0 +1,81 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021 The MVT Project Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

from mvt.common.utils import convert_mactime_to_unix, convert_timestamp_to_iso

from ..base import IOSExtraction

SHUTDOWN_LOG_PATH = [
    "private/var/db/diagnostics/shutdown.log",
]

class ShutdownLog(IOSExtraction):
    """This module extracts processes information from the shutdown log file."""

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
        super().__init__(file_path=file_path, base_folder=base_folder,
                         output_folder=output_folder, fast_mode=fast_mode,
                         log=log, results=results)

    def serialize(self, record):
        return {
            "timestamp": record["isodate"],
            "module": self.__class__.__name__,
            "event": "shutdown",
            "data": f"Client {record['client']} with PID {record['pid']} was running when the device was shut down",
        }

    def check_indicators(self):
        if not self.indicators:
            return

        for result in self.results:
            for ioc in self.indicators.ioc_processes:
                parts = result["client"].split("/")
                if ioc in parts:
                    self.log.warning("Found mention of a known malicious process \"%s\" in shutdown.log",
                                     ioc)
                    self.detected.append(result)

    def process_shutdownlog(self, content):
        current_processes = []
        for line in content.split("\n"):
            line = line.strip()

            if line.startswith("remaining client pid:"):
                current_processes.append({
                    "pid": line[line.find("pid: ")+5:line.find(" (")],
                    "client": line[line.find("(")+1:line.find(")")],
                })
            elif line.startswith("SIGTERM: "):
                try:
                    mac_timestamp = int(line[line.find("[")+1:line.find("]")])
                except ValueError:
                    try:
                        start = line.find(" @")+2
                        mac_timestamp = int(line[start:start+10])
                    except:
                        mac_timestamp = 0

                timestamp = convert_mactime_to_unix(mac_timestamp, from_2001=False)
                isodate = convert_timestamp_to_iso(timestamp)

                for current_process in current_processes:
                    self.results.append({
                        "isodate": isodate,
                        "pid": current_process["pid"],
                        "client": current_process["client"],
                    })

                current_processes = []

        self.results = sorted(self.results, key=lambda entry: entry["isodate"])

    def run(self):
        self._find_ios_database(root_paths=SHUTDOWN_LOG_PATH)
        self.log.info("Found shutdown log at path: %s", self.file_path)
        with open(self.file_path, "r") as handle:
            self.process_shutdownlog(handle.read())
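A small sketch of the string slicing this parser applies to shutdown.log lines. The sample excerpt below is synthetic (the PIDs, paths, and timestamp are made up), assuming the line format implied by the parsing logic above.

```python
# Synthetic shutdown.log excerpt; format assumed from the parsing logic above.
sample = """remaining client pid: 57 (/usr/libexec/locationd)
remaining client pid: 81 (/private/var/db/com.example.agent)
SIGTERM: [1625142600] shutdown initiated"""

clients = []
for line in sample.splitlines():
    line = line.strip()
    if line.startswith("remaining client pid:"):
        clients.append({
            "pid": line[line.find("pid: ") + 5:line.find(" (")],
            "client": line[line.find("(") + 1:line.find(")")],
        })
    elif line.startswith("SIGTERM: "):
        mac_timestamp = int(line[line.find("[") + 1:line.find("]")])
        # Every client still running at this SIGTERM gets one record.
        print(mac_timestamp, clients)
        clients = []
```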
@@ -11,7 +11,10 @@ WEBKIT_INDEXEDDB_ROOT_PATHS = [

class WebkitIndexedDB(WebkitBase):
    """This module looks extracts records from WebKit IndexedDB folders,
    and checks them against any provided list of suspicious domains."""
    and checks them against any provided list of suspicious domains.


    """

    slug = "webkit_indexeddb"
@@ -11,7 +11,10 @@ WEBKIT_LOCALSTORAGE_ROOT_PATHS = [

class WebkitLocalStorage(WebkitBase):
    """This module looks extracts records from WebKit LocalStorage folders,
    and checks them against any provided list of suspicious domains."""
    and checks them against any provided list of suspicious domains.


    """

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
@@ -11,7 +11,10 @@ WEBKIT_SAFARIVIEWSERVICE_ROOT_PATHS = [

class WebkitSafariViewService(WebkitBase):
    """This module looks extracts records from WebKit LocalStorage folders,
    and checks them against any provided list of suspicious domains."""
    and checks them against any provided list of suspicious domains.


    """

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
@@ -13,15 +13,18 @@ from .idstatuscache import IDStatusCache
from .interactionc import InteractionC
from .locationd import LocationdClients
from .net_datausage import Datausage
from .osanalytics_addaily import OSAnalyticsADDaily
from .safari_browserstate import SafariBrowserState
from .safari_history import SafariHistory
from .sms import SMS
from .sms_attachments import SMSAttachments
from .tcc import TCC
from .webkit_resource_load_statistics import WebkitResourceLoadStatistics
from .webkit_session_resource_log import WebkitSessionResourceLog
from .whatsapp import Whatsapp

MIXED_MODULES = [Calls, ChromeFavicon, ChromeHistory, Contacts, FirefoxFavicon,
                 FirefoxHistory, IDStatusCache, InteractionC, LocationdClients,
                 Datausage, SafariBrowserState, SafariHistory, SMS, SMSAttachments,
                 WebkitResourceLoadStatistics, WebkitSessionResourceLog, Whatsapp,]
                 OSAnalyticsADDaily, Datausage, SafariBrowserState, SafariHistory,
                 TCC, SMS, SMSAttachments, WebkitResourceLoadStatistics,
                 WebkitSessionResourceLog, Whatsapp,]
@@ -19,7 +19,10 @@ FIREFOX_HISTORY_ROOT_PATHS = [

class FirefoxHistory(IOSExtraction):
    """This module extracts all Firefox visits and tries to detect potential
    network injection attacks."""
    network injection attacks.


    """

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
@@ -51,12 +51,8 @@ class IDStatusCache(IOSExtraction):
                                 result.get("user"))
                self.detected.append(result)

    def run(self):
        self._find_ios_database(backup_ids=IDSTATUSCACHE_BACKUP_IDS,
                                root_paths=IDSTATUSCACHE_ROOT_PATHS)
        self.log.info("Found IDStatusCache plist at path: %s", self.file_path)

        with open(self.file_path, "rb") as handle:
    def _extract_idstatuscache_entries(self, file_path):
        with open(file_path, "rb") as handle:
            file_plist = plistlib.load(handle)

        id_status_cache_entries = []
@@ -84,4 +80,16 @@ class IDStatusCache(IOSExtraction):
            entry["occurrences"] = entry_counter[entry["user"]]
            self.results.append(entry)

    def run(self):

        if self.is_backup:
            self._find_ios_database(backup_ids=IDSTATUSCACHE_BACKUP_IDS)
            self.log.info("Found IDStatusCache plist at path: %s", self.file_path)
            self._extract_idstatuscache_entries(self.file_path)
        elif self.is_fs_dump:
            for idstatuscache_path in self._get_fs_files_from_patterns(IDSTATUSCACHE_ROOT_PATHS):
                self.file_path = idstatuscache_path
                self.log.info("Found IDStatusCache plist at path: %s", self.file_path)
                self._extract_idstatuscache_entries(self.file_path)

        self.log.info("Extracted a total of %d ID Status Cache entries", len(self.results))
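For illustration, one way the "occurrences" counting visible in this hunk can be computed (the hunk does not show how `entry_counter` is actually built, so `collections.Counter` and the sample entries here are assumptions):

```python
from collections import Counter

# Hypothetical extracted entries; only the "user" handles matter for counting.
entries = [
    {"user": "mailto:alice@example.com"},
    {"user": "tel:+15550001111"},
    {"user": "mailto:alice@example.com"},
]

entry_counter = Counter(entry["user"] for entry in entries)
for entry in entries:
    entry["occurrences"] = entry_counter[entry["user"]]

print(entries[0]["occurrences"])  # 2: the same handle appears twice in the cache
```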
@@ -14,10 +14,11 @@ LOCATIOND_BACKUP_IDS = [
]
LOCATIOND_ROOT_PATHS = [
    "private/var/mobile/Library/Caches/locationd/clients.plist",
    "private/var/root/Library/Caches/locationd/clients.plist"
]

class LocationdClients(IOSExtraction):
    """Extract information from apps who used geolocation"""
    """Extract information from apps who used geolocation."""

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
@@ -50,22 +51,40 @@ class LocationdClients(IOSExtraction):

        return records

    def run(self):
        self._find_ios_database(backup_ids=LOCATIOND_BACKUP_IDS,
                                root_paths=LOCATIOND_ROOT_PATHS)
        self.log.info("Found Locationd Clients plist at path: %s", self.file_path)
    def check_indicators(self):
        if not self.indicators:
            return

        with open(self.file_path, "rb") as handle:
        for result in self.results:
            parts = result["package"].split("/")
            proc_name = parts[len(parts)-1]

            if self.indicators.check_process(proc_name):
                self.detected.append(result)

    def _extract_locationd_entries(self, file_path):
        with open(file_path, "rb") as handle:
            file_plist = plistlib.load(handle)

        for app in file_plist:
            if file_plist[app] is dict:
                result = file_plist[app]
                result["package"] = app
                for ts in self.timestamps:
                    if ts in result.keys():
                        result[ts] = convert_timestamp_to_iso(convert_mactime_to_unix(result[ts]))
        for key, values in file_plist.items():
            result = file_plist[key]
            result["package"] = key
            for ts in self.timestamps:
                if ts in result.keys():
                    result[ts] = convert_timestamp_to_iso(convert_mactime_to_unix(result[ts]))

                self.results.append(result)
            self.results.append(result)

    def run(self):

        if self.is_backup:
            self._find_ios_database(backup_ids=LOCATIOND_BACKUP_IDS)
            self.log.info("Found Locationd Clients plist at path: %s", self.file_path)
            self._extract_locationd_entries(self.file_path)
        elif self.is_fs_dump:
            for locationd_path in self._get_fs_files_from_patterns(LOCATIOND_ROOT_PATHS):
                self.file_path = locationd_path
                self.log.info("Found Locationd Clients plist at path: %s", self.file_path)
                self._extract_locationd_entries(self.file_path)

        self.log.info("Extracted a total of %d Locationd Clients entries", len(self.results))
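A standalone sketch of the timestamp conversion the rewritten `_extract_locationd_entries` applies to each clients.plist entry. The bundle identifier, field names, and values below are hypothetical, and the hard-coded 978307200-second offset is the standard 1970-to-2001 (Cocoa/Mac epoch) delta assumed to match what `convert_mactime_to_unix` does by default.

```python
import datetime

COCOA_EPOCH_OFFSET = 978307200  # seconds between 1970-01-01 and 2001-01-01

# Hypothetical clients.plist content: one entry per bundle that requested location access.
file_plist = {
    "com.example.weather": {"ConsumptionPeriodBegin": 656784000.0, "Authorized": True},
}

timestamps = ["ConsumptionPeriodBegin"]
results = []
for key, values in file_plist.items():
    result = dict(values)
    result["package"] = key
    for ts in timestamps:
        if ts in result:
            unix_ts = result[ts] + COCOA_EPOCH_OFFSET
            result[ts] = datetime.datetime.utcfromtimestamp(unix_ts).isoformat()
    results.append(result)

print(results[0]["ConsumptionPeriodBegin"])
```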
@@ -14,7 +14,10 @@ DATAUSAGE_ROOT_PATHS = [

class Datausage(NetBase):
    """This class extracts data from DataUsage.sqlite and attempts to identify
    any suspicious processes if running on a full filesystem dump."""
    any suspicious processes if running on a full filesystem dump.


    """

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
64 mvt/ios/modules/mixed/osanalytics_addaily.py Normal file
@@ -0,0 +1,64 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021 The MVT Project Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import plistlib

from mvt.common.utils import convert_timestamp_to_iso

from ..base import IOSExtraction

OSANALYTICS_ADDAILY_BACKUP_IDS = [
    "f65b5fafc69bbd3c60be019c6e938e146825fa83",
]
OSANALYTICS_ADDAILY_ROOT_PATHS = [
    "private/var/mobile/Library/Preferences/com.apple.osanalytics.addaily.plist",
]

class OSAnalyticsADDaily(IOSExtraction):
    """Extract network usage information by process, from com.apple.osanalytics.addaily.plist"""

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
        super().__init__(file_path=file_path, base_folder=base_folder,
                         output_folder=output_folder, fast_mode=fast_mode,
                         log=log, results=results)

    def serialize(self, record):
        record_data = f"{record['package']} WIFI IN: {record['wifi_in']}, WIFI OUT: {record['wifi_out']} - " \
                      f"WWAN IN: {record['wwan_in']}, WWAN OUT: {record['wwan_out']}"
        return {
            "timestamp": record["ts"],
            "module": self.__class__.__name__,
            "event": "osanalytics_addaily",
            "data": record_data,
        }

    def check_indicators(self):
        if not self.indicators:
            return

        for result in self.results:
            if self.indicators.check_process(result["package"]):
                self.detected.append(result)

    def run(self):
        self._find_ios_database(backup_ids=OSANALYTICS_ADDAILY_BACKUP_IDS,
                                root_paths=OSANALYTICS_ADDAILY_ROOT_PATHS)
        self.log.info("Found com.apple.osanalytics.addaily plist at path: %s", self.file_path)

        with open(self.file_path, "rb") as handle:
            file_plist = plistlib.load(handle)

        for app, values in file_plist.get("netUsageBaseline", {}).items():
            self.results.append({
                "package": app,
                "ts": convert_timestamp_to_iso(values[0]),
                "wifi_in": values[1],
                "wifi_out": values[2],
                "wwan_in": values[3],
                "wwan_out": values[4],
            })

        self.log.info("Extracted a total of %d com.apple.osanalytics.addaily entries", len(self.results))
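A minimal sketch of the plist layout this new module assumes: each `netUsageBaseline` value is read as `[timestamp, wifi_in, wifi_out, wwan_in, wwan_out]`. The bundle identifier and the byte counts below are illustrative only.

```python
import datetime
import plistlib

# Hypothetical com.apple.osanalytics.addaily payload, built in memory for the example.
payload = plistlib.dumps({
    "netUsageBaseline": {
        "com.example.messenger": [datetime.datetime(2021, 10, 1), 1048576.0, 524288.0, 65536.0, 32768.0],
    }
})

file_plist = plistlib.loads(payload)
for app, values in file_plist.get("netUsageBaseline", {}).items():
    print(f"{app} WIFI IN: {values[1]}, WIFI OUT: {values[2]} - WWAN IN: {values[3]}, WWAN OUT: {values[4]}")
```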
@@ -13,9 +13,6 @@ from mvt.common.utils import (convert_mactime_to_unix,

from ..base import IOSExtraction

SAFARI_BROWSER_STATE_BACKUP_IDS = [
    "3a47b0981ed7c10f3e2800aa66bac96a3b5db28e",
]
SAFARI_BROWSER_STATE_BACKUP_RELPATH = "Library/Safari/BrowserState.db"
SAFARI_BROWSER_STATE_ROOT_PATHS = [
    "private/var/mobile/Library/Safari/BrowserState.db",
@@ -101,12 +98,17 @@ class SafariBrowserState(IOSExtraction):
            })

    def run(self):
        # TODO: Is there really only one BrowserState.db in a device?
        self._find_ios_database(backup_ids=SAFARI_BROWSER_STATE_BACKUP_IDS,
                                root_paths=SAFARI_BROWSER_STATE_ROOT_PATHS)
        self.log.info("Found Safari browser state database at path: %s", self.file_path)

        self._process_browser_state_db(self.file_path)
        if self.is_backup:
            for backup_file in self._get_backup_files_from_manifest(relative_path=SAFARI_BROWSER_STATE_BACKUP_RELPATH):
                self.file_path = self._get_backup_file_from_id(backup_file["file_id"])
                self.log.info("Found Safari browser state database at path: %s", self.file_path)
                self._process_browser_state_db(self.file_path)
        elif self.is_fs_dump:
            for safari_browserstate_path in self._get_fs_files_from_patterns(SAFARI_BROWSER_STATE_ROOT_PATHS):
                self.file_path = safari_browserstate_path
                self.log.info("Found Safari browser state database at path: %s", self.file_path)
                self._process_browser_state_db(self.file_path)

        self.log.info("Extracted a total of %d tab records and %d session history entries",
                      len(self.results), self._session_history_count)
@@ -19,7 +19,10 @@ SAFARI_HISTORY_ROOT_PATHS = [

class SafariHistory(IOSExtraction):
    """This module extracts all Safari visits and tries to detect potential
    network injection attacks."""
    network injection attacks.


    """

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
90 mvt/ios/modules/mixed/tcc.py Normal file
@@ -0,0 +1,90 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021 The MVT Project Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import sqlite3
from datetime import datetime

from mvt.common.utils import convert_timestamp_to_iso

from ..base import IOSExtraction

TCC_BACKUP_IDS = [
    "64d0019cb3d46bfc8cce545a8ba54b93e7ea9347",
]
TCC_ROOT_PATHS = [
    "private/var/mobile/Library/TCC/TCC.db",
]

AUTH_VALUES = {
    0: "denied",
    1: "unknown",
    2: "allowed",
    3: "limited",
}
AUTH_REASONS = {
    1: "error",
    2: "user_consent",
    3: "user_set",
    4: "system_set",
    5: "service_policy",
    6: "mdm_policy",
    7: "override_policy",
    8: "missing_usage_string",
    9: "prompt_timeout",
    10: "preflight_unknown",
    11: "entitled",
    12: "app_type_policy",
}

class TCC(IOSExtraction):
    """This module extracts records from the TCC.db SQLite database."""

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
        super().__init__(file_path=file_path, base_folder=base_folder,
                         output_folder=output_folder, fast_mode=fast_mode,
                         log=log, results=results)

    def process_db(self, file_path):
        conn = sqlite3.connect(file_path)
        cur = conn.cursor()
        cur.execute("""SELECT
            service, client, client_type, auth_value, auth_reason, last_modified
            FROM access;""")

        for row in cur:
            service = row[0]
            client = row[1]
            client_type = row[2]
            client_type_desc = "bundle_id" if client_type == 0 else "absolute_path"
            auth_value = row[3]
            auth_value_desc = AUTH_VALUES.get(auth_value, "")
            auth_reason = row[4]
            auth_reason_desc = AUTH_REASONS.get(auth_reason, "unknown")
            last_modified = convert_timestamp_to_iso(datetime.utcfromtimestamp((row[5])))

            if service in ["kTCCServiceMicrophone", "kTCCServiceCamera"]:
                device = "microphone" if service == "kTCCServiceMicrophone" else "camera"
                self.log.info("Found client \"%s\" with access %s to %s on %s by %s",
                              client, auth_value_desc, device, last_modified, auth_reason_desc)

            self.results.append({
                "service": service,
                "client": client,
                "client_type": client_type_desc,
                "auth_value": auth_value_desc,
                "auth_reason_desc": auth_reason_desc,
                "last_modified": last_modified,
            })

        cur.close()
        conn.close()

    def run(self):
        self._find_ios_database(backup_ids=TCC_BACKUP_IDS, root_paths=TCC_ROOT_PATHS)
        self.log.info("Found TCC database at path: %s", self.file_path)
        self.process_db(self.file_path)

        self.log.info("Extracted a total of %d TCC items", len(self.results))
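To illustrate how the new TCC module turns raw integers from the `access` table into readable records: the lookup tables map `auth_value` and `auth_reason`, and `client_type` 0 means the client is a bundle identifier. The row below is hypothetical, not from a real TCC.db.

```python
# Mirrors the lookup tables defined in the new tcc.py module above (subset shown).
AUTH_VALUES = {0: "denied", 1: "unknown", 2: "allowed", 3: "limited"}
AUTH_REASONS = {2: "user_consent", 3: "user_set", 4: "system_set"}

# Hypothetical row from the TCC access table:
# (service, client, client_type, auth_value, auth_reason, last_modified)
row = ("kTCCServiceMicrophone", "com.example.voicememo", 0, 2, 2, 1633046400)

client_type_desc = "bundle_id" if row[2] == 0 else "absolute_path"
print(row[1], client_type_desc, AUTH_VALUES.get(row[3], ""), AUTH_REASONS.get(row[4], "unknown"))
# -> com.example.voicememo bundle_id allowed user_consent
```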
@@ -18,8 +18,7 @@ WEBKIT_RESOURCELOADSTATICS_ROOT_PATHS = [
]

class WebkitResourceLoadStatistics(IOSExtraction):
    """This module extracts records from WebKit ResourceLoadStatistics observations.db.
    """
    """This module extracts records from WebKit ResourceLoadStatistics observations.db."""
    # TODO: Add serialize().

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
@@ -23,7 +23,10 @@ WEBKIT_SESSION_RESOURCE_LOG_ROOT_PATHS = [

class WebkitSessionResourceLog(IOSExtraction):
    """This module extracts records from WebKit browsing session
    resource logs, and checks them against any provided list of
    suspicious domains."""
    suspicious domains.


    """

    def __init__(self, file_path=None, base_folder=None, output_folder=None,
                 fast_mode=False, log=None, results=[]):
@@ -39,7 +39,9 @@ class NetBase(IOSExtraction):
                ZLIVEUSAGE.ZHASPROCESS,
                ZLIVEUSAGE.ZTIMESTAMP
            FROM ZLIVEUSAGE
            LEFT JOIN ZPROCESS ON ZLIVEUSAGE.ZHASPROCESS = ZPROCESS.Z_PK;
            LEFT JOIN ZPROCESS ON ZLIVEUSAGE.ZHASPROCESS = ZPROCESS.Z_PK
            UNION
            SELECT ZFIRSTTIMESTAMP, ZTIMESTAMP, ZPROCNAME, ZBUNDLENAME, Z_PK, NULL, NULL, NULL, NULL, NULL, NULL, NULL FROM ZPROCESS WHERE Z_PK NOT IN (SELECT ZHASPROCESS FROM ZLIVEUSAGE);
        """)

        for row in cur:
@@ -68,7 +70,7 @@ class NetBase(IOSExtraction):
                "wwan_out": row[8],
                "live_id": row[9],
                "live_proc_id": row[10],
                "live_isodate": live_timestamp,
                "live_isodate": live_timestamp if row[10] else first_isodate,
            })

        cur.close()
@@ -89,7 +91,7 @@ class NetBase(IOSExtraction):
        }]

        # Only included first_usage and current_usage records when a ZPROCESS entry exists.
        if "MANIPULATED" not in record["proc_name"] and "MISSING" not in record["proc_name"]:
        if "MANIPULATED" not in record["proc_name"] and "MISSING" not in record["proc_name"] and record["live_proc_id"] is not None:
            records.extend([
                {
                    "timestamp": record["first_isodate"],
@@ -145,14 +147,17 @@ class NetBase(IOSExtraction):
                self.log.debug("Located at %s", binary_path)
            else:
                msg = f"Could not find the binary associated with the process with name {proc['proc_name']}"
                if len(proc["proc_name"]) == 16:
                if (proc["proc_name"] is None):
                    msg = f"Found process entry with empty 'proc_name' : {proc['live_proc_id']} at {proc['live_isodate']}"
                elif len(proc["proc_name"]) == 16:
                    msg = msg + " (However, the process name might have been truncated in the database)"

                self.log.warning(msg)
                if not proc["live_proc_id"]:
                    self.log.info(f"Found process entry in ZPROCESS but not in ZLIVEUSAGE : {proc['proc_name']} at {proc['live_isodate']}")

    def check_manipulated(self):
        """Check for missing or manipulate DB entries
        """
        """Check for missing or manipulate DB entries"""
        # Don't show duplicates for each missing process.
        missing_process_cache = set()
        for result in sorted(self.results, key=operator.itemgetter("live_isodate")):
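The UNION added to the NetBase query above is meant to also surface ZPROCESS rows that have no matching ZLIVEUSAGE entry. A toy in-memory sketch of that behavior, with a deliberately simplified two-column schema (the real DataUsage.sqlite tables carry many more columns, and the process names here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ZPROCESS (Z_PK INTEGER PRIMARY KEY, ZPROCNAME TEXT);
    CREATE TABLE ZLIVEUSAGE (Z_PK INTEGER PRIMARY KEY, ZHASPROCESS INTEGER);
    INSERT INTO ZPROCESS VALUES (1, 'mDNSResponder'), (2, 'suspicious-agent');
    INSERT INTO ZLIVEUSAGE VALUES (10, 1);
""")

rows = conn.execute("""
    SELECT ZPROCESS.ZPROCNAME, ZLIVEUSAGE.Z_PK
    FROM ZLIVEUSAGE LEFT JOIN ZPROCESS ON ZLIVEUSAGE.ZHASPROCESS = ZPROCESS.Z_PK
    UNION
    SELECT ZPROCNAME, NULL FROM ZPROCESS
    WHERE Z_PK NOT IN (SELECT ZHASPROCESS FROM ZLIVEUSAGE);
""").fetchall()

# Contains both ('mDNSResponder', 10) and ('suspicious-agent', None):
# the second process never logged live usage but is still reported.
print(rows)
conn.close()
```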
@@ -4,40 +4,44 @@
# https://license.mvt.re/1.1/

IPHONE_MODELS = [
    {"description": "iPhone 4S", "identifier": "iPhone4,1"},
    {"description": "iPhone 5", "identifier": "iPhone5,1"},
    {"description": "iPhone 5", "identifier": "iPhone5,2"},
    {"description": "iPhone 5c", "identifier": "iPhone5,3"},
    {"description": "iPhone 5c", "identifier": "iPhone5,4"},
    {"description": "iPhone 5s", "identifier": "iPhone6,1"},
    {"description": "iPhone 5s", "identifier": "iPhone6,2"},
    {"description": "iPhone 6 Plus", "identifier": "iPhone7,1"},
    {"description": "iPhone 6", "identifier": "iPhone7,2"},
    {"description": "iPhone 6s", "identifier": "iPhone8,1"},
    {"description": "iPhone 6s Plus", "identifier": "iPhone8,2"},
    {"description": "iPhone SE (1st generation)", "identifier": "iPhone8,4"},
    {"description": "iPhone 7", "identifier": "iPhone9,1"},
    {"description": "iPhone 7 Plus", "identifier": "iPhone9,2"},
    {"description": "iPhone 7", "identifier": "iPhone9,3"},
    {"description": "iPhone 7 Plus", "identifier": "iPhone9,4"},
    {"description": "iPhone 8", "identifier": "iPhone10,1"},
    {"description": "iPhone 8 Plus", "identifier": "iPhone10,2"},
    {"description": "iPhone X", "identifier": "iPhone10,3"},
    {"description": "iPhone 8", "identifier": "iPhone10,4"},
    {"description": "iPhone 8 Plus", "identifier": "iPhone10,5"},
    {"description": "iPhone X", "identifier": "iPhone10,6"},
    {"description": "iPhone XS", "identifier": "iPhone11,2"},
    {"description": "iPhone XS Max", "identifier": "iPhone11,4"},
    {"description": "iPhone XS Max", "identifier": "iPhone11,6"},
    {"description": "iPhone XR", "identifier": "iPhone11,8"},
    {"description": "iPhone 11", "identifier": "iPhone12,1"},
    {"description": "iPhone 11 Pro", "identifier": "iPhone12,3"},
    {"description": "iPhone 11 Pro Max", "identifier": "iPhone12,5"},
    {"description": "iPhone SE (2nd generation)", "identifier": "iPhone12,8"},
    {"description": "iPhone 12 mini", "identifier": "iPhone13,1"},
    {"description": "iPhone 12", "identifier": "iPhone13,2"},
    {"description": "iPhone 12 Pro", "identifier": "iPhone13,3"},
    {"description": "iPhone 12 Pro Max", "identifier": "iPhone13,4"},
    {"identifier": "iPhone4,1", "description": "iPhone 4S"},
    {"identifier": "iPhone5,1", "description": "iPhone 5"},
    {"identifier": "iPhone5,2", "description": "iPhone 5"},
    {"identifier": "iPhone5,3", "description": "iPhone 5c"},
    {"identifier": "iPhone5,4", "description": "iPhone 5c"},
    {"identifier": "iPhone6,1", "description": "iPhone 5s"},
    {"identifier": "iPhone6,2", "description": "iPhone 5s"},
    {"identifier": "iPhone7,1", "description": "iPhone 6 Plus"},
    {"identifier": "iPhone7,2", "description": "iPhone 6"},
    {"identifier": "iPhone8,1", "description": "iPhone 6s"},
    {"identifier": "iPhone8,2", "description": "iPhone 6s Plus"},
    {"identifier": "iPhone8,4", "description": "iPhone SE (1st generation)"},
    {"identifier": "iPhone9,1", "description": "iPhone 7"},
    {"identifier": "iPhone9,2", "description": "iPhone 7 Plus"},
    {"identifier": "iPhone9,3", "description": "iPhone 7"},
    {"identifier": "iPhone9,4", "description": "iPhone 7 Plus"},
    {"identifier": "iPhone10,1", "description": "iPhone 8"},
    {"identifier": "iPhone10,2", "description": "iPhone 8 Plus"},
    {"identifier": "iPhone10,3", "description": "iPhone X"},
    {"identifier": "iPhone10,4", "description": "iPhone 8"},
    {"identifier": "iPhone10,5", "description": "iPhone 8 Plus"},
    {"identifier": "iPhone10,6", "description": "iPhone X"},
    {"identifier": "iPhone11,2", "description": "iPhone XS"},
    {"identifier": "iPhone11,4", "description": "iPhone XS Max"},
    {"identifier": "iPhone11,6", "description": "iPhone XS Max"},
    {"identifier": "iPhone11,8", "description": "iPhone XR"},
    {"identifier": "iPhone12,1", "description": "iPhone 11"},
    {"identifier": "iPhone12,3", "description": "iPhone 11 Pro"},
    {"identifier": "iPhone12,5", "description": "iPhone 11 Pro Max"},
    {"identifier": "iPhone12,8", "description": "iPhone SE (2nd generation)"},
    {"identifier": "iPhone13,1", "description": "iPhone 12 mini"},
    {"identifier": "iPhone13,2", "description": "iPhone 12"},
    {"identifier": "iPhone13,3", "description": "iPhone 12 Pro"},
    {"identifier": "iPhone13,4", "description": "iPhone 12 Pro Max"},
    {"identifier": "iPhone14,4", "description": "iPhone 13 Mini"},
    {"identifier": "iPhone14,5", "description": "iPhone 13"},
    {"identifier": "iPhone14,2", "description": "iPhone 13 Pro"},
    {"identifier": "iPhone14,3", "description": "iPhone 13 Pro Max"},
]

IPHONE_IOS_VERSIONS = [
@@ -222,6 +226,11 @@ IPHONE_IOS_VERSIONS = [
    {"build": "18F72", "version": "14.6"},
    {"build": "18G69", "version": "14.7"},
    {"build": "18G82", "version": "14.7.1"},
    {"build": "18H17", "version": "14.8"},
    {"build": "19A346", "version": "15.0"},
    {"build": "19A348", "version": "15.0.1"},
    {"build": "19A404", "version": "15.0.2"},
    {"build": "19B74", "version": "15.1"},
]

def get_device_desc_from_id(identifier, devices_list=IPHONE_MODELS):
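A minimal sketch of how a lookup over the reordered IPHONE_MODELS table can work; the diff only shows the signature of `get_device_desc_from_id`, so the body below is an assumed implementation over a truncated copy of the table, for illustration only.

```python
IPHONE_MODELS = [
    {"identifier": "iPhone14,5", "description": "iPhone 13"},
    {"identifier": "iPhone13,2", "description": "iPhone 12"},
]

def get_device_desc_from_id(identifier, devices_list=IPHONE_MODELS):
    # Return the human-readable name for a hardware identifier, or "" if unknown.
    for device in devices_list:
        if device["identifier"] == identifier:
            return device["description"]
    return ""

print(get_device_desc_from_id("iPhone14,5"))  # iPhone 13
```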
14 setup.py
@@ -16,18 +16,18 @@ with open(readme_path, encoding="utf-8") as handle:

requires = (
    # Base dependencies:
    "click>=8.0.1",
    "rich>=10.6.0",
    "click>=8.0.3",
    "rich>=10.12.0",
    "tld>=0.12.6",
    "tqdm>=4.61.2",
    "tqdm>=4.62.3",
    "requests>=2.26.0",
    "simplejson>=3.17.3",
    "simplejson>=3.17.5",
    "packaging>=21.0",
    # iOS dependencies:
    "iOSbackup>=0.9.912",
    "iOSbackup>=0.9.921",
    # Android dependencies:
    "adb-shell>=0.4.0",
    "libusb1>=1.9.3",
    "adb-shell>=0.4.2",
    "libusb1>=2.0.1",
)

def get_package_data(package):