Mirror of https://github.com/mvt-project/mvt.git (synced 2026-02-14 17:42:46 +00:00)

Compare commits: tmp/timeli ... fix_tests (86 commits)
86 commits compared (SHA1 only):

a5ae729b65, 86a0772eb2, 7d0be9db4f, 4e120b2640, dbe9e5db9b, 0b00398729, 87034d2c7a, 595a2f6536,
8ead44a31e, 5c19d02a73, 14ebc9ee4e, de53cc07f8, 22e066fc4a, 242052b8ec, 1df61b5bbf, b691de2cc0,
10915f250c, c60cef4009, dda798df8e, ffe6ad2014, a125b20fc5, 49108e67e2, 883b450601, ce813568ff,
93303f181a, bee453a090, 42106aa4d6, 95076c8f71, c9ac12f336, 486e3e7e9b, be1fc3bd8b, 4757cff262,
61f51caf31, 511063fd0e, 88bc5672cb, 0fce0acf7a, 61f95d07d3, 3dedd169c4, e34e03d3a3, 34374699ce,
cf5aa7c89f, 2766739512, 9c84afb4b0, 80fc8bd879, ca41f7f106, 55ddd86ad5, b184eeedf4, 4e97e85350,
e5865b166e, a2dabb4267, b7595b62eb, 02c02ca15c, 6da33394fe, 086871e21d, f32830c649, edcad488ab,
43901c96a0, 0962383b46, 34cd08fd9a, 579b53f7ec, dbb80d6320, 0fbf24e82a, a2493baead, 0dc6228a59,
6e230bdb6a, 2aa76c8a1c, 7d6dc9e6dc, 458195a0ab, 52e854b8b7, 0f1eec3971, f4425865c0, 28c0c86c4e,
154e6dab15, 0c73e3e8fa, 9b5f2d89d5, 3da61c8da8, 5b2fe3baec, a3a7789547, d3fcc686ff, 4bcc0e5f27,
9d81b5bfa8, 22fce280af, 4739d8853e, ace01ff7fb, 7e4f0aec4d, 57647583cc
11  .github/dependabot.yml (vendored, new file)

@@ -0,0 +1,11 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file

version: 2
updates:
  - package-ecosystem: "pip" # See documentation for possible values
    directory: "/" # Location of package manifests
    schedule:
      interval: "weekly"
4  .github/workflows/tests.yml (vendored)

@@ -12,7 +12,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: ['3.8', '3.9', '3.10'] # , '3.11']
+        python-version: ['3.10', '3.11', '3.12', '3.13']

     steps:
       - uses: actions/checkout@v4
@@ -35,4 +35,4 @@ jobs:
         if: github.event_name == 'pull_request'
         with:
           pytest-coverage-path: ./pytest-coverage.txt
-          junitxml-path: ./pytest.xml
+          junitxml-path: ./pytest.xml
1  .github/workflows/update-ios-data.yml (vendored)

@@ -21,6 +21,7 @@ jobs:
           title: '[auto] Update iOS releases and versions'
           commit-message: Add new iOS versions and build numbers
           branch: auto/add-new-ios-releases
+          draft: true
           body: |
             This is an automated pull request to update the iOS releases and version numbers.
           add-paths: |
@@ -1,19 +1,65 @@
-# Contributing
+# Contributing to Mobile Verification Toolkit (MVT)

-Thank you for your interest in contributing to Mobile Verification Toolkit (MVT)! Your help is very much appreciated.
+We greatly appreciate contributions to MVT!
+
+Your involvement, whether through identifying issues, improving functionality, or enhancing documentation, is very much appreciated. To ensure smooth collaboration and a welcoming environment, we've outlined some key guidelines for contributing below.
+
+## Getting started
+
+Contributing to an open-source project like MVT might seem overwhelming at first, but we're here to support you!
+
+Whether you're a technologist, a frontline human rights defender, a field researcher, or someone new to consensual spyware forensics, there are many ways to make meaningful contributions.
+
+Here's how you can get started:
+
+1. **Explore the codebase:**
+   - Browse the repository to get familiar with MVT. Many MVT modules are simple in functionality and easy to understand.
+   - Look for `TODO:` or `FIXME:` comments in the code for areas that need attention.
+
+2. **Check GitHub issues:**
+   - Look for issues tagged with ["help wanted"](https://github.com/mvt-project/mvt/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) or ["good first issue"](https://github.com/mvt-project/mvt/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) to find tasks that are beginner-friendly or where input from the community would be helpful.
+
+3. **Ask for guidance:**
+   - If you're unsure where to start, feel free to open a [discussion](https://github.com/mvt-project/mvt/discussions) or comment on an issue.
+
+## How to contribute:
+
+1. **Report issues:**
+   - Found a bug? Please check existing issues to see if it's already reported. If not, open a new issue. Mobile operating systems and databases are constantly evolving, and new errors may appear spontaneously in new app versions.
+
+   **Please provide as much information as possible about the problem, including any error messages, steps to reproduce the problem, and any logs or screenshots that can help.**

-## Where to start
+2. **Suggest features:**
+   - If you have an idea for new functionality, create a feature request issue and describe your proposal.

-Starting to contribute to a somewhat complex project like MVT might seem intimidating. Unless you have specific ideas of new functionality you would like to submit, some good starting points are searching for `TODO:` and `FIXME:` comments throughout the code. Alternatively you can check if any GitHub issues existed marked with the ["help wanted"](https://github.com/mvt-project/mvt/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) tag.
+3. **Submit code:**
+   - Fork the repository and create a new branch for your changes.
+   - Ensure your changes align with the code style guidelines (see below).
+   - Open a pull request (PR) with a clear description of your changes and link it to any relevant issues.
+
+4. **Documentation contributions:**
+   - Improving documentation is just as valuable as contributing code! If you notice gaps or inaccuracies in the documentation, feel free to submit changes or suggest updates.

 ## Code style

+Please follow these code style guidelines for consistency and readability:
+
-When contributing code to
+- **Indentation**: use 4 spaces per indentation level.
+- **Quotes**: Use double quotes (`"`) by default. Use single quotes (`'`) for nested strings instead of escaping (`\"`), or when using f-formatting.
+- **Maximum line length**:
+  - Aim for lines no longer than 80 characters.
+  - Exceptions are allowed for long log lines or strings, which may extend up to 100 characters.
+  - Wrap lines that exceed 100 characters.

-- **Indentation**: we use 4-spaces tabs.
+Follow [PEP 8 guidelines](https://peps.python.org/pep-0008/) for indentation and overall Python code style. All MVT code is automatically linted with [Ruff](https://github.com/astral-sh/ruff) before merging.

-- **Quotes**: we use double quotes (`"`) as a default. Single quotes (`'`) can be favored with nested strings instead of escaping (`\"`), or when using f-formatting.
+Please check your code before opening a pull request by running `make ruff`.

-- **Maximum line length**: we strongly encourage to respect a 80 characters long lines and to follow [PEP8 indentation guidelines](https://peps.python.org/pep-0008/#indentation) when having to wrap. However, if breaking at 80 is not possible or is detrimental to the readability of the code, exceptions are tolerated. For example, long log lines, or long strings can be extended to 100 characters long. Please hard wrap anything beyond 100 characters.
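To make the conventions above concrete, here is a minimal, hypothetical snippet (not taken from the MVT codebase) showing 4-space indentation, double quotes by default, single quotes inside an f-string, and lines kept under 80 characters:

```python
import logging

log = logging.getLogger(__name__)


def summarize_packages(packages: list[dict]) -> str:
    """Return a short, human-readable summary of package records."""
    names = [pkg.get("package_name", "unknown") for pkg in packages]
    # Double quotes by default; single quotes inside the f-string
    # avoid escaping.
    summary = f"{len(names)} packages: {', '.join(names)}"
    log.info("Built summary for %d packages", len(names))
    return summary


if __name__ == "__main__":
    print(summarize_packages([{"package_name": "com.example.app"}]))
```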
+## Community and support
+
+We aim to create a supportive and collaborative environment for all contributors. If you run into any challenges, feel free to reach out through the discussions or issues section of the repository.
+
+Your contributions, big or small, help improve MVT and are always appreciated.
@@ -103,7 +103,7 @@ RUN git clone https://github.com/libimobiledevice/usbmuxd && cd usbmuxd \


 # Create main image
-FROM ubuntu:22.04 as main
+FROM ubuntu:24.04 as main

 LABEL org.opencontainers.image.url="https://mvt.re"
 LABEL org.opencontainers.image.documentation="https://docs.mvt.re"
@@ -135,8 +135,7 @@ COPY --from=build-usbmuxd /build /
 COPY . mvt/
 RUN apt-get update \
     && apt-get install -y git python3-pip \
-    && PIP_NO_CACHE_DIR=1 pip3 install --upgrade pip \
-    && PIP_NO_CACHE_DIR=1 pip3 install ./mvt \
+    && PIP_NO_CACHE_DIR=1 pip3 install --break-system-packages ./mvt \
     && apt-get remove -y python3-pip git && apt-get autoremove -y \
     && rm -rf /var/lib/apt/lists/* \
     && rm -rf mvt
5  Makefile

@@ -27,9 +27,8 @@ test-requirements:

 generate-proto-parsers:
 	# Generate python parsers for protobuf files
-	PROTO_DIR="src/mvt/android/parsers/proto/"; \
-	PROTO_FILES=$$(find $(PROTO_DIR) -iname "*.proto"); \
-	protoc -I$(PROTO_DIR) --python_betterproto_out=$(PROTO_DIR) $$PROTO_FILES
+	PROTO_FILES=$$(find src/mvt/android/parsers/proto/ -iname "*.proto"); \
+	protoc -Isrc/mvt/android/parsers/proto/ --python_betterproto_out=src/mvt/android/parsers/proto/ $$PROTO_FILES

 clean:
 	rm -rf $(PWD)/build $(PWD)/dist $(PWD)/src/mvt.egg-info
43  docs/command_completion.md (new file)

@@ -0,0 +1,43 @@
# Command Completion

MVT utilizes the [Click](https://click.palletsprojects.com/en/stable/) library for creating its command line interface.

Click provides tab completion support for Bash (version 4.4 and up), Zsh, and Fish.

To enable it, you need to manually register a special function with your shell, which varies depending on the shell you are using.

The following describes how to generate the command completion scripts and add them to your shell configuration.

> **Note: You will need to start a new shell for the changes to take effect.**

### For Bash

```bash
# Generates bash completion scripts
echo "$(_MVT_IOS_COMPLETE=bash_source mvt-ios)" > ~/.mvt-ios-complete.bash &&
echo "$(_MVT_ANDROID_COMPLETE=bash_source mvt-android)" > ~/.mvt-android-complete.bash
```

Add the following to `~/.bashrc`:
```bash
# source mvt completion scripts
. ~/.mvt-ios-complete.bash && . ~/.mvt-android-complete.bash
```

### For Zsh

```bash
# Generates zsh completion scripts
echo "$(_MVT_IOS_COMPLETE=zsh_source mvt-ios)" > ~/.mvt-ios-complete.zsh &&
echo "$(_MVT_ANDROID_COMPLETE=zsh_source mvt-android)" > ~/.mvt-android-complete.zsh
```

Add the following to `~/.zshrc`:
```bash
# source mvt completion scripts
. ~/.mvt-ios-complete.zsh && . ~/.mvt-android-complete.zsh
```

For more information, visit the official [Click Docs](https://click.palletsprojects.com/en/stable/shell-completion/#enabling-completion).
@@ -98,3 +98,7 @@ You now should have the `mvt-ios` and `mvt-android` utilities installed.
 **Notes:**
 1. The `--force` flag is necessary to force the reinstallation of the package.
 2. To revert to using a PyPI version, it will be necessary to `pipx uninstall mvt` first.
+
+## Setting up command completions
+
+See ["Command completions"](command_completion.md)
@@ -1,5 +1,5 @@
 mkdocs==1.6.1
-mkdocs-autorefs==1.2.0
-mkdocs-material==9.5.42
+mkdocs-autorefs==1.4.2
+mkdocs-material==9.6.16
 mkdocs-material-extensions==1.3.1
-mkdocstrings==0.23.0
+mkdocstrings==0.30.0
@@ -19,22 +19,26 @@ classifiers = [
     "Programming Language :: Python"
 ]
 dependencies = [
-    "click >=8.1.3",
-    "rich >=12.6.0",
-    "tld >=0.12.6",
-    "requests >=2.28.1",
-    "simplejson >=3.17.6",
-    "packaging >=21.3",
-    "appdirs >=1.4.4",
-    "iOSbackup >=0.9.923",
-    "adb-shell[usb] >=0.4.3",
-    "libusb1 >=3.0.0",
-    "cryptography >=42.0.5",
-    "pyyaml >=6.0",
-    "pyahocorasick >= 2.0.0",
-    "betterproto >=1.2.0",
+    "click==8.2.1",
+    "rich==14.1.0",
+    "tld==0.13.1",
+    "requests==2.32.4",
+    "simplejson==3.20.1",
+    "packaging==25.0",
+    "appdirs==1.4.4",
+    "iOSbackup==0.9.925",
+    "adb-shell[usb]==0.4.4",
+    "libusb1==3.3.1",
+    "cryptography==45.0.6",
+    "PyYAML>=6.0.2",
+    "pyahocorasick==2.2.0",
+    "betterproto==1.2.5",
+    "pydantic==2.11.7",
+    "pydantic-settings==2.10.1",
+    "NSKeyedUnArchiver==1.5.2",
+    "python-dateutil==2.9.0.post0",
 ]
-requires-python = ">= 3.8"
+requires-python = ">= 3.10"

 [project.urls]
 homepage = "https://docs.mvt.re/en/latest/"
@@ -100,4 +104,4 @@ where = ["src"]
 mvt = ["ios/data/*.json"]

 [tool.setuptools.dynamic]
-version = {attr = "mvt.common.version.MVT_VERSION"}
+version = {attr = "mvt.common.version.MVT_VERSION"}
@@ -4,13 +4,14 @@
 # https://license.mvt.re/1.1/

 import base64
+import binascii
 import hashlib

 from .artifact import AndroidArtifact


 class DumpsysADBArtifact(AndroidArtifact):
-    multiline_fields = ["user_keys"]
+    multiline_fields = ["user_keys", "keystore"]

     def indented_dump_parser(self, dump_data):
         """
@@ -20,12 +21,22 @@ class DumpsysADBArtifact(AndroidArtifact):
         stack = [res]
         cur_indent = 0
         in_multiline = False
-        for line in dump_data.strip(b"\n").split(b"\n"):
+        # Normalize line endings to handle both Unix (\n) and Windows (\r\n)
+        normalized_data = dump_data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
+        for line in normalized_data.strip(b"\n").split(b"\n"):
             # Skip completely empty lines
             if not line.strip():
                 continue

             # Track the level of indentation
             indent = len(line) - len(line.lstrip())
             if indent < cur_indent:
                 # If the current line is less indented than the previous one, back out
                 stack.pop()
                 while len(stack) > 1 and indent < cur_indent:
                     stack.pop()
                 # Check if we were in multiline mode and need to exit it
                 if in_multiline and not isinstance(stack[-1], list):
                     in_multiline = False
                 cur_indent = indent
             else:
                 cur_indent = indent
@@ -37,12 +48,30 @@ class DumpsysADBArtifact(AndroidArtifact):

             # Annoyingly, some values are multiline and don't have a key on each line
             if in_multiline:
-                if key == "":
+                if key == "" and len(vals) < 2:
                     # If the line is empty, it's the terminator for the multiline value
                     in_multiline = False
                     stack.pop()
                     current_dict = stack[-1]
+                elif len(vals) >= 2 and (key in self.multiline_fields or key == "}" or vals[1] == b"{"):
+                    # If we encounter a new field while in multiline mode, exit multiline mode
+                    # and process this line as a new field
+                    in_multiline = False
+                    stack.pop()
+                    current_dict = stack[-1]
+                    # Don't continue here - let the line be processed as a new field
                 else:
-                    current_dict.append(line.lstrip())
+                    # When in multiline mode, the top of stack should be a list
+                    if isinstance(stack[-1], list):
+                        stack[-1].append(line.lstrip())
+                    else:
+                        # Something went wrong with the stack, exit multiline mode
+                        in_multiline = False
+                        current_dict = stack[-1]
                     continue

+            # Skip lines that don't have a value after '='
+            if len(vals) < 2:
+                continue
+
             if key == "}":
@@ -67,14 +96,38 @@ class DumpsysADBArtifact(AndroidArtifact):

         return res

+    def parse_xml(self, xml_data):
+        """
+        Parse XML data from dumpsys ADB output
+        """
+        import xml.etree.ElementTree as ET
+
+        keystore = []
+        keystore_root = ET.fromstring(xml_data)
+        for adb_key in keystore_root.findall("adbKey"):
+            key_info = self.calculate_key_info(adb_key.get("key").encode("utf-8"))
+            key_info["last_connected"] = adb_key.get("lastConnection")
+            keystore.append(key_info)
+
+        return keystore
+
     @staticmethod
     def calculate_key_info(user_key: bytes) -> str:
-        key_base64, user = user_key.split(b" ", 1)
-        key_raw = base64.b64decode(key_base64)
-        key_fingerprint = hashlib.md5(key_raw).hexdigest().upper()
-        key_fingerprint_colon = ":".join(
-            [key_fingerprint[i : i + 2] for i in range(0, len(key_fingerprint), 2)]
-        )
+        if b" " in user_key:
+            key_base64, user = user_key.split(b" ", 1)
+        else:
+            key_base64, user = user_key, b""
+
+        try:
+            key_raw = base64.b64decode(key_base64)
+            key_fingerprint = hashlib.md5(key_raw).hexdigest().upper()
+            key_fingerprint_colon = ":".join(
+                [key_fingerprint[i : i + 2] for i in range(0, len(key_fingerprint), 2)]
+            )
+        except binascii.Error:
+            # Impossible to parse base64
+            key_fingerprint_colon = ""

         return {
             "user": user.decode("utf-8"),
             "fingerprint": key_fingerprint_colon,
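As an illustration of the fingerprint calculation above, the following standalone sketch (hypothetical input, not an actual device key) reproduces the colon-separated MD5 fingerprint derived from a base64-encoded ADB public key:

```python
import base64
import binascii
import hashlib

# Hypothetical adbkey.pub-style entry: "<base64 key> <user@host>"
user_key = base64.b64encode(b"not-a-real-rsa-key") + b" user@laptop"

key_base64, user = user_key.split(b" ", 1)
try:
    key_raw = base64.b64decode(key_base64)
    fingerprint = hashlib.md5(key_raw).hexdigest().upper()
    # Group the hex digest into colon-separated byte pairs
    fingerprint_colon = ":".join(
        fingerprint[i : i + 2] for i in range(0, len(fingerprint), 2)
    )
except binascii.Error:
    fingerprint_colon = ""  # The key was not valid base64

print({"user": user.decode("utf-8"), "fingerprint": fingerprint_colon})
```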
@@ -108,15 +161,40 @@ class DumpsysADBArtifact(AndroidArtifact):

         # TODO: Parse AdbDebuggingManager line in output.
         start_of_json = content.find(b"\n{") + 2
-        end_of_json = content.rfind(b"}\n") - 2
+
+        # Handle both Unix (\n) and Windows (\r\n) line endings
+        end_of_json = content.rfind(b"}\n")
+        if end_of_json == -1:
+            end_of_json = content.rfind(b"}\r\n")
+            if end_of_json == -1:
+                self.log.error("Unable to find end of JSON block in dumpsys output")
+                return
+
+        end_of_json -= 2
         json_content = content[start_of_json:end_of_json].rstrip()

         parsed = self.indented_dump_parser(json_content)
         if parsed.get("debugging_manager") is None:
             self.log.error("Unable to find expected ADB entries in dumpsys output")  # noqa
             return

+        # Keystore can be in different levels, as the basic parser
+        # is not always consistent due to different dumpsys formats.
+        if parsed.get("keystore"):
+            keystore_data = b"\n".join(parsed["keystore"])
+        elif parsed["debugging_manager"].get("keystore"):
+            keystore_data = b"\n".join(parsed["debugging_manager"]["keystore"])
+        else:
-            parsed = parsed["debugging_manager"]
+            keystore_data = None
+
+        # Keystore is in XML format on some devices and we need to parse it
+        if keystore_data and keystore_data.startswith(b"<?xml"):
+            parsed["debugging_manager"]["keystore"] = self.parse_xml(keystore_data)
+        else:
+            # Keystore is not XML format
+            parsed["debugging_manager"]["keystore"] = keystore_data
+
+        parsed = parsed["debugging_manager"]

         # Calculate key fingerprints for better readability
         key_info = []
@@ -11,6 +11,10 @@ from mvt.common.utils import convert_datetime_to_iso
 from .artifact import AndroidArtifact


+RISKY_PERMISSIONS = ["REQUEST_INSTALL_PACKAGES"]
+RISKY_PACKAGES = ["com.android.shell"]
+
+
 class DumpsysAppopsArtifact(AndroidArtifact):
     """
     Parser for dumpsys app ops info
@@ -45,15 +49,39 @@ class DumpsysAppopsArtifact(AndroidArtifact):
                 self.detected.append(result)
                 continue

+            detected_permissions = []
             for perm in result["permissions"]:
                 if (
-                    perm["name"] == "REQUEST_INSTALL_PACKAGES"
-                    and perm["access"] == "allow"
+                    perm["name"] in RISKY_PERMISSIONS
+                    # and perm["access"] == "allow"
                 ):
-                    self.log.info(
-                        "Package %s with REQUEST_INSTALL_PACKAGES " "permission",
-                        result["package_name"],
-                    )
+                    detected_permissions.append(perm)
+                    for entry in sorted(perm["entries"], key=lambda x: x["timestamp"]):
+                        self.log.warning(
+                            "Package '%s' had risky permission '%s' set to '%s' at %s",
+                            result["package_name"],
+                            perm["name"],
+                            entry["access"],
+                            entry["timestamp"],
+                        )
+
+                elif result["package_name"] in RISKY_PACKAGES:
+                    detected_permissions.append(perm)
+                    for entry in sorted(perm["entries"], key=lambda x: x["timestamp"]):
+                        self.log.warning(
+                            "Risky package '%s' had '%s' permission set to '%s' at %s",
+                            result["package_name"],
+                            perm["name"],
+                            entry["access"],
+                            entry["timestamp"],
+                        )
+
+            if detected_permissions:
+                # We clean the result to only include the risky permission, otherwise the timeline
+                # will be polluted with all the other irrelevant permissions
+                cleaned_result = result.copy()
+                cleaned_result["permissions"] = detected_permissions
+                self.detected.append(cleaned_result)

     def parse(self, output: str) -> None:
         self.results: List[Dict[str, Any]] = []
@@ -121,11 +149,16 @@ class DumpsysAppopsArtifact(AndroidArtifact):
             if line.startswith(" "):
                 # Permission entry like:
                 # Reject: [fg-s]2021-05-19 22:02:52.054 (-314d1h25m2s33ms)
+                access_type = line.split(":")[0].strip()
+                if access_type not in ["Access", "Reject"]:
+                    # Skipping invalid access type. Some entries are not in the format we expect
+                    continue
+
                 if entry:
                     perm["entries"].append(entry)
                     entry = {}

-                entry["access"] = line.split(":")[0].strip()
+                entry["access"] = access_type
                 entry["type"] = line[line.find("[") + 1 : line.find("]")]

                 try:
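For context, a small hedged sketch of the access/reject parsing shown above, fed with a made-up `dumpsys appops` entry line (the bracketed type extraction mirrors the code; real device output can vary):

```python
# Hypothetical permission entry line from `dumpsys appops`
line = "          Reject: [fg-s]2021-05-19 22:02:52.054 (-314d1h25m2s33ms)"

entry = {}
access_type = line.split(":")[0].strip()
if access_type in ["Access", "Reject"]:
    entry["access"] = access_type
    # Text between the square brackets, e.g. "fg-s"
    entry["type"] = line[line.find("[") + 1 : line.find("]")]

print(entry)
# {'access': 'Reject', 'type': 'fg-s'}
```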
@@ -16,8 +16,7 @@ class DumpsysPackagesArtifact(AndroidArtifact):
         for result in self.results:
             if result["package_name"] in ROOT_PACKAGES:
                 self.log.warning(
-                    "Found an installed package related to "
-                    'rooting/jailbreaking: "%s"',
+                    'Found an installed package related to rooting/jailbreaking: "%s"',
                     result["package_name"],
                 )
                 self.detected.append(result)
42  src/mvt/android/artifacts/dumpsys_platform_compat.py (new file)

@@ -0,0 +1,42 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 The MVT Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

from .artifact import AndroidArtifact


class DumpsysPlatformCompatArtifact(AndroidArtifact):
    """
    Parser for uninstalled apps listed in platform_compat section.
    """

    def check_indicators(self) -> None:
        if not self.indicators:
            return

        for result in self.results:
            ioc = self.indicators.check_app_id(result["package_name"])
            if ioc:
                result["matched_indicator"] = ioc
                self.detected.append(result)
                continue

    def parse(self, data: str) -> None:
        for line in data.splitlines():
            if not line.startswith("ChangeId(168419799; name=DOWNSCALED;"):
                continue

            if line.strip() == "":
                break

            # Look for rawOverrides field
            if "rawOverrides={" in line:
                # Extract the content inside the braces for rawOverrides
                overrides_field = line.split("rawOverrides={", 1)[1].split("};", 1)[0]

                for entry in overrides_field.split(", "):
                    # Extract app name
                    uninstall_app = entry.split("=")[0].strip()

                    self.results.append({"package_name": uninstall_app})
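A self-contained sketch of the `rawOverrides` extraction above, using an invented `platform_compat` line (the real dumpsys output may differ in its surrounding fields):

```python
# Hypothetical dumpsys platform_compat line listing overridden (uninstalled) apps
line = (
    "ChangeId(168419799; name=DOWNSCALED; enableSinceTargetSdk=-1; "
    "rawOverrides={com.example.removed=true, com.example.other=false}; ...)"
)

packages = []
if "rawOverrides={" in line:
    # Take only the content between "rawOverrides={" and the closing "};"
    overrides_field = line.split("rawOverrides={", 1)[1].split("};", 1)[0]
    for entry in overrides_field.split(", "):
        packages.append({"package_name": entry.split("=")[0].strip()})

print(packages)
# [{'package_name': 'com.example.removed'}, {'package_name': 'com.example.other'}]
```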
43  src/mvt/android/artifacts/file_timestamps.py (new file)

@@ -0,0 +1,43 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 The MVT Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/
from typing import Union

from .artifact import AndroidArtifact


class FileTimestampsArtifact(AndroidArtifact):
    def serialize(self, record: dict) -> Union[dict, list]:
        records = []

        for ts in set(
            [
                record.get("access_time"),
                record.get("changed_time"),
                record.get("modified_time"),
            ]
        ):
            if not ts:
                continue

            macb = ""
            macb += "M" if ts == record.get("modified_time") else "-"
            macb += "A" if ts == record.get("access_time") else "-"
            macb += "C" if ts == record.get("changed_time") else "-"
            macb += "-"

            msg = record["path"]
            if record.get("context"):
                msg += f" ({record['context']})"

            records.append(
                {
                    "timestamp": ts,
                    "module": self.__class__.__name__,
                    "event": macb,
                    "data": msg,
                }
            )

        return records
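To illustrate the MACB-style serialization above, a hedged sketch with a made-up file record (the trailing `-` stands for the birth time, which is not collected here):

```python
# Hypothetical file record like those produced by the AndroidQF/bugreport modules
record = {
    "path": "/data/local/tmp/payload.so",
    "access_time": "2024-01-02 10:00:00.000000",
    "modified_time": "2024-01-01 09:30:00.000000",
    "changed_time": "2024-01-01 09:30:00.000000",
}

timeline = []
for ts in {record.get("access_time"), record.get("changed_time"), record.get("modified_time")}:
    if not ts:
        continue
    macb = ""
    macb += "M" if ts == record.get("modified_time") else "-"
    macb += "A" if ts == record.get("access_time") else "-"
    macb += "C" if ts == record.get("changed_time") else "-"
    macb += "-"  # birth time is not available
    timeline.append({"timestamp": ts, "event": macb, "data": record["path"]})

print(timeline)
# One "-A--" entry for the access time and one "M-C-" entry for the
# identical modified/changed times.
```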
@@ -42,6 +42,17 @@ class GetProp(AndroidArtifact):
             entry = {"name": matches[0][0], "value": matches[0][1]}
             self.results.append(entry)

+    def get_device_timezone(self) -> str:
+        """
+        Get the device timezone from the getprop results
+
+        Used in other modules to calculate the timezone offset
+        """
+        for entry in self.results:
+            if entry["name"] == "persist.sys.timezone":
+                return entry["value"]
+        return None
+
     def check_indicators(self) -> None:
         for entry in self.results:
             if entry["name"] in INTERESTING_PROPERTIES:
@@ -16,6 +16,11 @@ ANDROID_DANGEROUS_SETTINGS = [
         "key": "package_verifier_enable",
         "safe_value": "1",
     },
+    {
+        "description": "disabled APK package verification",
+        "key": "package_verifier_state",
+        "safe_value": "1",
+    },
     {
         "description": "disabled Google Play Protect",
         "key": "package_verifier_user_consent",
@@ -46,11 +51,6 @@ ANDROID_DANGEROUS_SETTINGS = [
         "key": "send_action_app_error",
         "safe_value": "1",
     },
-    {
-        "description": "enabled installation of non Google Play apps",
-        "key": "install_non_market_apps",
-        "safe_value": "0",
-    },
     {
         "description": "enabled accessibility services",
         "key": "accessibility_enabled",
@@ -3,11 +3,268 @@
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import datetime
from typing import List, Optional, Union

import pydantic
import betterproto
from dateutil import parser

from mvt.common.utils import convert_datetime_to_iso
from mvt.android.parsers.proto.tombstone import Tombstone
from .artifact import AndroidArtifact


TOMBSTONE_DELIMITER = "*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***"

# Map the legacy crash file keys to the new format.
TOMBSTONE_TEXT_KEY_MAPPINGS = {
    "Build fingerprint": "build_fingerprint",
    "Revision": "revision",
    "ABI": "arch",
    "Timestamp": "timestamp",
    "Process uptime": "process_uptime",
    "Cmdline": "command_line",
    "pid": "pid",
    "tid": "tid",
    "name": "process_name",
    "binary_path": "binary_path",
    "uid": "uid",
    "signal": "signal_info",
    "code": "code",
    "Cause": "cause",
}


class SignalInfo(pydantic.BaseModel):
    code: int
    code_name: str
    name: str
    number: Optional[int] = None


class TombstoneCrashResult(pydantic.BaseModel):
    """
    MVT Result model for a tombstone crash result.

    Needed for validation and serialization, and consistency between text and protobuf tombstones.
    """

    file_name: str
    file_timestamp: str  # We store the timestamp as a string to avoid timezone issues
    build_fingerprint: str
    revision: int
    arch: Optional[str] = None
    timestamp: str  # We store the timestamp as a string to avoid timezone issues
    process_uptime: Optional[int] = None
    command_line: Optional[List[str]] = None
    pid: int
    tid: int
    process_name: Optional[str] = None
    binary_path: Optional[str] = None
    selinux_label: Optional[str] = None
    uid: int
    signal_info: SignalInfo
    cause: Optional[str] = None
    extra: Optional[str] = None


class TombstoneCrashArtifact(AndroidArtifact):
    def parse(self, content: bytes) -> None:
        """
        Parser for Android tombstone crash files.

        This parser can parse both text and protobuf tombstone crash files.
        """

    def serialize(self, record: dict) -> Union[dict, list]:
        return {
            "timestamp": record["timestamp"],
            "module": self.__class__.__name__,
            "event": "Tombstone",
            "data": (
                f"Crash in '{record['process_name']}' process running as UID '{record['uid']}' in file '{record['file_name']}' "
                f"Crash type '{record['signal_info']['name']}' with code '{record['signal_info']['code_name']}'"
            ),
        }

    def check_indicators(self) -> None:
        if not self.indicators:
            return

        for result in self.results:
            ioc = self.indicators.check_process(result["process_name"])
            if ioc:
                result["matched_indicator"] = ioc
                self.detected.append(result)
                continue

            if result.get("command_line", []):
                command_name = result.get("command_line")[0].split("/")[-1]
                ioc = self.indicators.check_process(command_name)
                if ioc:
                    result["matched_indicator"] = ioc
                    self.detected.append(result)
                    continue

            SUSPICIOUS_UIDS = [
                0,  # root
                1000,  # system
                2000,  # shell
            ]
            if result["uid"] in SUSPICIOUS_UIDS:
                self.log.warning(
                    f"Potentially suspicious crash in process '{result['process_name']}' "
                    f"running as UID '{result['uid']}' in tombstone '{result['file_name']}' at {result['timestamp']}"
                )
                self.detected.append(result)

    def parse_protobuf(
        self, file_name: str, file_timestamp: datetime.datetime, data: bytes
    ) -> None:
        """
        Parse Android tombstone crash files from a protobuf object.
        """
        tombstone_pb = Tombstone().parse(data)
        tombstone_dict = tombstone_pb.to_dict(
            betterproto.Casing.SNAKE, include_default_values=True
        )

        # Add some extra metadata
        tombstone_dict["timestamp"] = self._parse_timestamp_string(
            tombstone_pb.timestamp
        )
        tombstone_dict["file_name"] = file_name
        tombstone_dict["file_timestamp"] = convert_datetime_to_iso(file_timestamp)
        tombstone_dict["process_name"] = self._proccess_name_from_thread(tombstone_dict)

        # Confirm the tombstone is valid, and matches the output model
        tombstone = TombstoneCrashResult.model_validate(tombstone_dict)
        self.results.append(tombstone.model_dump())

    def parse(
        self, file_name: str, file_timestamp: datetime.datetime, content: bytes
    ) -> None:
        """
        Parse text Android tombstone crash files.
        """

        # Split the tombstone file into a dictionary
        tombstone_dict = {
            "file_name": file_name,
            "file_timestamp": convert_datetime_to_iso(file_timestamp),
        }
        lines = content.decode("utf-8").splitlines()
        for line in lines:
            if not line.strip() or TOMBSTONE_DELIMITER in line:
                continue
            for key, destination_key in TOMBSTONE_TEXT_KEY_MAPPINGS.items():
                self._parse_tombstone_line(line, key, destination_key, tombstone_dict)

        # Validate the tombstone and add it to the results
        tombstone = TombstoneCrashResult.model_validate(tombstone_dict)
        self.results.append(tombstone.model_dump())

    def _parse_tombstone_line(
        self, line: str, key: str, destination_key: str, tombstone: dict
    ) -> bool:
        if not line.startswith(f"{key}"):
            return None

        if key == "pid":
            return self._load_pid_line(line, tombstone)
        elif key == "signal":
            return self._load_signal_line(line, tombstone)
        elif key == "Timestamp":
            return self._load_timestamp_line(line, tombstone)
        else:
            return self._load_key_value_line(line, key, destination_key, tombstone)

    def _load_key_value_line(
        self, line: str, key: str, destination_key: str, tombstone: dict
    ) -> bool:
        line_key, value = line.split(":", 1)
        if line_key != key:
            raise ValueError(f"Expected key {key}, got {line_key}")

        value_clean = value.strip().strip("'")
        if destination_key in ["uid", "revision"]:
            tombstone[destination_key] = int(value_clean)
        elif destination_key == "process_uptime":
            # eg. "Process uptime: 40s"
            tombstone[destination_key] = int(value_clean.rstrip("s"))
        elif destination_key == "command_line":
            # XXX: Check if command line should be a single string in a list, or a list of strings.
            tombstone[destination_key] = [value_clean]
        else:
            tombstone[destination_key] = value_clean
        return True

    def _load_pid_line(self, line: str, tombstone: dict) -> bool:
        pid_part, tid_part, name_part = [part.strip() for part in line.split(",")]

        pid_key, pid_value = pid_part.split(":", 1)
        if pid_key != "pid":
            raise ValueError(f"Expected key pid, got {pid_key}")
        pid_value = int(pid_value.strip())

        tid_key, tid_value = tid_part.split(":", 1)
        if tid_key != "tid":
            raise ValueError(f"Expected key tid, got {tid_key}")
        tid_value = int(tid_value.strip())

        name_key, name_value = name_part.split(":", 1)
        if name_key != "name":
            raise ValueError(f"Expected key name, got {name_key}")
        name_value = name_value.strip()
        process_name, binary_path = self._parse_process_name(name_value, tombstone)

        tombstone["pid"] = pid_value
        tombstone["tid"] = tid_value
        tombstone["process_name"] = process_name
        tombstone["binary_path"] = binary_path
        return True

    def _parse_process_name(self, process_name_part, tombstone: dict) -> bool:
        process_name, process_path = process_name_part.split(">>>")
        process_name = process_name.strip()
        binary_path = process_path.strip().split(" ")[0]
        return process_name, binary_path

    def _load_signal_line(self, line: str, tombstone: dict) -> bool:
        signal, code, _ = [part.strip() for part in line.split(",", 2)]
        signal = signal.split("signal ")[1]
        signal_code, signal_name = signal.split(" ")
        signal_name = signal_name.strip("()")

        code_part = code.split("code ")[1]
        code_number, code_name = code_part.split(" ")
        code_name = code_name.strip("()")

        tombstone["signal_info"] = {
            "code": int(code_number),
            "code_name": code_name,
            "name": signal_name,
            "number": int(signal_code),
        }
        return True

    def _load_timestamp_line(self, line: str, tombstone: dict) -> bool:
        timestamp = line.split(":", 1)[1].strip()
        tombstone["timestamp"] = self._parse_timestamp_string(timestamp)
        return True

    @staticmethod
    def _parse_timestamp_string(timestamp: str) -> str:
        timestamp_parsed = parser.parse(timestamp)

        # HACK: Swap the local timestamp to UTC, to keep the original time and avoid timezone conversion.
        local_timestamp = timestamp_parsed.replace(tzinfo=datetime.timezone.utc)
        return convert_datetime_to_iso(local_timestamp)

    @staticmethod
    def _proccess_name_from_thread(tombstone_dict: dict) -> str:
        if tombstone_dict.get("threads"):
            for thread in tombstone_dict["threads"].values():
                if thread.get("id") == tombstone_dict["tid"] and thread.get("name"):
                    return thread["name"]
        return "Unknown"
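The HACK in `_parse_timestamp_string` is easier to see with a small sketch: `replace(tzinfo=...)` keeps the wall-clock time recorded in the tombstone, whereas `astimezone()` would shift it (illustrative values only):

```python
import datetime

from dateutil import parser

# Hypothetical tombstone timestamp, recorded in the device's local time
raw = "2024-03-01 14:05:09.123456+0530"
parsed = parser.parse(raw)

# Keep the wall-clock time and just relabel it as UTC (what the parser above does)
relabeled = parsed.replace(tzinfo=datetime.timezone.utc)

# A real conversion would shift the clock value instead
converted = parsed.astimezone(datetime.timezone.utc)

print(relabeled.isoformat())  # 2024-03-01T14:05:09.123456+00:00
print(converted.isoformat())  # 2024-03-01T08:35:09.123456+00:00
```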
@@ -326,8 +326,7 @@ class AndroidExtraction(MVTModule):

         if not header["backup"]:
             self.log.error(
-                "Extracting SMS via Android backup failed. "
-                "No valid backup data found."
+                "Extracting SMS via Android backup failed. No valid backup data found."
             )
             return None
@@ -75,8 +75,7 @@ class Packages(AndroidExtraction):
         for result in self.results:
             if result["package_name"] in ROOT_PACKAGES:
                 self.log.warning(
-                    "Found an installed package related to "
-                    'rooting/jailbreaking: "%s"',
+                    'Found an installed package related to rooting/jailbreaking: "%s"',
                     result["package_name"],
                 )
                 self.detected.append(result)
@@ -70,7 +70,7 @@ class SMS(AndroidExtraction):
             "timestamp": record["isodate"],
             "module": self.__class__.__name__,
             "event": f"sms_{record['direction']}",
-            "data": f"{record.get('address', 'unknown source')}: \"{body}\"",
+            "data": f'{record.get("address", "unknown source")}: "{body}"',
         }

     def check_indicators(self) -> None:
@@ -14,6 +14,7 @@ from .dumpsys_receivers import DumpsysReceivers
 from .dumpsys_adb import DumpsysADBState
 from .getprop import Getprop
 from .packages import Packages
+from .dumpsys_platform_compat import DumpsysPlatformCompat
 from .processes import Processes
 from .settings import Settings
 from .sms import SMS
@@ -29,6 +30,7 @@ ANDROIDQF_MODULES = [
     DumpsysBatteryHistory,
     DumpsysADBState,
     Packages,
+    DumpsysPlatformCompat,
     Processes,
     Getprop,
     Settings,
@@ -48,6 +48,37 @@ class AndroidQFModule(MVTModule):
     def _get_files_by_pattern(self, pattern: str):
         return fnmatch.filter(self.files, pattern)

+    def _get_device_timezone(self):
+        """
+        Get the device timezone from the getprop.txt file.
+
+        This is needed to map local timestamps stored in some
+        Android log files to UTC/timezone-aware timestamps.
+        """
+        get_prop_files = self._get_files_by_pattern("*/getprop.txt")
+        if not get_prop_files:
+            self.log.warning(
+                "Could not find getprop.txt file. "
+                "Some timestamps and timeline data may be incorrect."
+            )
+            return None
+
+        from mvt.android.artifacts.getprop import GetProp
+
+        properties_artifact = GetProp()
+        prop_data = self._get_file_content(get_prop_files[0]).decode("utf-8")
+        properties_artifact.parse(prop_data)
+        timezone = properties_artifact.get_device_timezone()
+        if timezone:
+            self.log.debug("Identified local phone timezone: %s", timezone)
+            return timezone
+
+        self.log.warning(
+            "Could not find or determine local device timezone. "
+            "Some timestamps and timeline data may be incorrect."
+        )
+        return None
+
     def _get_file_content(self, file_path):
         if self.archive:
             handle = self.archive.open(file_path)
44  src/mvt/android/modules/androidqf/dumpsys_platform_compat.py (new file)

@@ -0,0 +1,44 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 The MVT Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import logging
from typing import Optional

from mvt.android.artifacts.dumpsys_platform_compat import DumpsysPlatformCompatArtifact

from .base import AndroidQFModule


class DumpsysPlatformCompat(DumpsysPlatformCompatArtifact, AndroidQFModule):
    """This module extracts details on uninstalled apps."""

    def __init__(
        self,
        file_path: Optional[str] = None,
        target_path: Optional[str] = None,
        results_path: Optional[str] = None,
        module_options: Optional[dict] = None,
        log: logging.Logger = logging.getLogger(__name__),
        results: Optional[list] = None,
    ) -> None:
        super().__init__(
            file_path=file_path,
            target_path=target_path,
            results_path=results_path,
            module_options=module_options,
            log=log,
            results=results,
        )

    def run(self) -> None:
        dumpsys_file = self._get_files_by_pattern("*/dumpsys.txt")
        if not dumpsys_file:
            return

        data = self._get_file_content(dumpsys_file[0]).decode("utf-8", errors="replace")
        content = self.extract_dumpsys_section(data, "DUMP OF SERVICE platform_compat:")
        self.parse(content)

        self.log.info("Found %d uninstalled apps", len(self.results))
@@ -6,6 +6,11 @@
 import datetime
 import json
 import logging
+
+try:
+    import zoneinfo
+except ImportError:
+    from backports import zoneinfo
 from typing import Optional, Union

 from mvt.android.modules.androidqf.base import AndroidQFModule
@@ -74,7 +79,7 @@ class Files(AndroidQFModule):
         for result in self.results:
             ioc = self.indicators.check_file_path(result["path"])
             if ioc:
-                result["matched_indicator"] == ioc
+                result["matched_indicator"] = ioc
                 self.detected.append(result)
                 continue
@@ -106,6 +111,20 @@ class Files(AndroidQFModule):
         # TODO: adds SHA1 and MD5 when available in MVT

     def run(self) -> None:
+        if timezone := self._get_device_timezone():
+            try:
+                device_timezone = zoneinfo.ZoneInfo(timezone)
+            except zoneinfo.ZoneInfoNotFoundError:
+                self.log.warning("Device timezone '%s' not found, using UTC", timezone)
+                device_timezone = datetime.timezone.utc
+        else:
+            self.log.warning("Unable to determine device timezone, using UTC")
+            try:
+                device_timezone = zoneinfo.ZoneInfo("UTC")
+            except zoneinfo.ZoneInfoNotFoundError:
+                # Fallback for Windows systems where zoneinfo might not have UTC
+                device_timezone = datetime.timezone.utc
+
         for file in self._get_files_by_pattern("*/files.json"):
             rawdata = self._get_file_content(file).decode("utf-8", errors="ignore")
             try:
@@ -120,11 +139,18 @@ class Files(AndroidQFModule):
             for file_data in data:
                 for ts in ["access_time", "changed_time", "modified_time"]:
                     if ts in file_data:
-                        file_data[ts] = convert_datetime_to_iso(
-                            datetime.datetime.fromtimestamp(
-                                file_data[ts], tz=datetime.timezone.utc
-                            )
+                        utc_timestamp = datetime.datetime.fromtimestamp(
+                            file_data[ts], tz=datetime.timezone.utc
                         )
+                        # Convert the UTC timestamp to local time in the Android device's local timezone
+                        local_timestamp = utc_timestamp.astimezone(device_timezone)
+
+                        # HACK: We only output the UTC timestamp in convert_datetime_to_iso, we
+                        # set the timestamp timezone to UTC, to avoid the timezone conversion again.
+                        local_timestamp = local_timestamp.replace(
+                            tzinfo=datetime.timezone.utc
+                        )
+                        file_data[ts] = convert_datetime_to_iso(local_timestamp)

                 self.results.append(file_data)
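A hedged sketch of the conversion above: an epoch value from files.json is interpreted as UTC, shifted into the device's timezone, then relabeled as UTC so a UTC-only serializer keeps the local wall-clock value (the timezone name is an example, not a default):

```python
import datetime
import zoneinfo

epoch_ts = 1704103200  # Hypothetical value from a files.json record
device_timezone = zoneinfo.ZoneInfo("Europe/Berlin")  # e.g. from persist.sys.timezone

utc_timestamp = datetime.datetime.fromtimestamp(epoch_ts, tz=datetime.timezone.utc)
local_timestamp = utc_timestamp.astimezone(device_timezone)

# Relabel as UTC so an ISO formatter that only prints UTC keeps the local time
local_as_utc = local_timestamp.replace(tzinfo=datetime.timezone.utc)

print(utc_timestamp.isoformat())  # 2024-01-01T10:00:00+00:00
print(local_as_utc.isoformat())   # 2024-01-01T11:00:00+00:00 (Berlin is UTC+1 in winter)
```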
65  src/mvt/android/modules/androidqf/logfile_timestamps.py (new file)

@@ -0,0 +1,65 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 The MVT Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import os
import datetime
import logging
from typing import Optional

from mvt.common.utils import convert_datetime_to_iso
from .base import AndroidQFModule
from mvt.android.artifacts.file_timestamps import FileTimestampsArtifact


class LogsFileTimestamps(FileTimestampsArtifact, AndroidQFModule):
    """This module extracts records from battery daily updates."""

    slug = "logfile_timestamps"

    def __init__(
        self,
        file_path: Optional[str] = None,
        target_path: Optional[str] = None,
        results_path: Optional[str] = None,
        module_options: Optional[dict] = None,
        log: logging.Logger = logging.getLogger(__name__),
        results: Optional[list] = None,
    ) -> None:
        super().__init__(
            file_path=file_path,
            target_path=target_path,
            results_path=results_path,
            module_options=module_options,
            log=log,
            results=results,
        )

    def _get_file_modification_time(self, file_path: str) -> dict:
        if self.archive:
            file_timetuple = self.archive.getinfo(file_path).date_time
            return datetime.datetime(*file_timetuple)
        else:
            file_stat = os.stat(os.path.join(self.parent_path, file_path))
            return datetime.datetime.fromtimestamp(file_stat.st_mtime)

    def run(self) -> None:
        filesystem_files = self._get_files_by_pattern("*/logs/*")

        self.results = []
        for file in filesystem_files:
            # Only the modification time is available in the zip file metadata.
            # The timezone is the local timezone of the phone.
            modification_time = self._get_file_modification_time(file)
            self.results.append(
                {
                    "path": file,
                    "modified_time": convert_datetime_to_iso(modification_time),
                }
            )

        self.log.info(
            "Extracted a total of %d filesystem timestamps from AndroidQF logs directory.",
            len(self.results),
        )
@@ -44,8 +44,7 @@ class Packages(AndroidQFModule):
         for result in self.results:
             if result["name"] in ROOT_PACKAGES:
                 self.log.warning(
-                    "Found an installed package related to "
-                    'rooting/jailbreaking: "%s"',
+                    'Found an installed package related to rooting/jailbreaking: "%s"',
                     result["name"],
                 )
                 self.detected.append(result)
@@ -3,10 +3,11 @@
 # Use of this software is governed by the MVT License 1.1 that can be found at
 # https://license.mvt.re/1.1/

-import os

 from rich.prompt import Prompt

+from mvt.common.config import settings

 MVT_ANDROID_BACKUP_PASSWORD = "MVT_ANDROID_BACKUP_PASSWORD"

@@ -16,24 +17,24 @@ def cli_load_android_backup_password(log, backup_password):

     Used in MVT CLI command parsers.
     """
-    password_from_env = os.environ.get(MVT_ANDROID_BACKUP_PASSWORD, None)
+    password_from_env_or_config = settings.ANDROID_BACKUP_PASSWORD
     if backup_password:
         log.info(
             "Your password may be visible in the process table because it "
             "was supplied on the command line!"
         )
-        if password_from_env:
+        if password_from_env_or_config:
             log.info(
                 "Ignoring %s environment variable, using --backup-password argument instead",
-                MVT_ANDROID_BACKUP_PASSWORD,
+                "MVT_ANDROID_BACKUP_PASSWORD",
             )
         return backup_password
-    elif password_from_env:
+    elif password_from_env_or_config:
         log.info(
-            "Using backup password from %s environment variable",
-            MVT_ANDROID_BACKUP_PASSWORD,
+            "Using backup password from %s environment variable or config file",
+            "MVT_ANDROID_BACKUP_PASSWORD",
         )
-    return password_from_env
+    return password_from_env_or_config


 def prompt_or_load_android_backup_password(log, module_options):
@@ -11,8 +11,11 @@ from .battery_history import BatteryHistory
 from .dbinfo import DBInfo
 from .getprop import Getprop
 from .packages import Packages
+from .platform_compat import PlatformCompat
 from .receivers import Receivers
 from .adb_state import DumpsysADBState
+from .fs_timestamps import BugReportTimestamps
+from .tombstones import Tombstones

 BUGREPORT_MODULES = [
     Accessibility,
@@ -23,6 +26,9 @@ BUGREPORT_MODULES = [
     DBInfo,
     Getprop,
     Packages,
+    PlatformCompat,
     Receivers,
     DumpsysADBState,
+    BugReportTimestamps,
+    Tombstones,
 ]
@@ -2,10 +2,11 @@
 # Copyright (c) 2021-2023 The MVT Authors.
 # See the file 'LICENSE' for usage and copying permissions, or find a copy at
 # https://github.com/mvt-project/mvt/blob/main/LICENSE

+import datetime
 import fnmatch
 import logging
 import os

 from typing import List, Optional
 from zipfile import ZipFile

@@ -91,3 +92,11 @@ class BugReportModule(MVTModule):
             return None

         return self._get_file_content(dumpstate_logs[0])

+    def _get_file_modification_time(self, file_path: str) -> dict:
+        if self.zip_archive:
+            file_timetuple = self.zip_archive.getinfo(file_path).date_time
+            return datetime.datetime(*file_timetuple)
+        else:
+            file_stat = os.stat(os.path.join(self.extract_path, file_path))
+            return datetime.datetime.fromtimestamp(file_stat.st_mtime)
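For reference, a small sketch of reading a member's modification time out of a ZIP archive the way the helper above does (the archive and file name here are made up):

```python
import datetime
import io
import zipfile

# Build a tiny archive in memory for demonstration purposes
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("FS/data/system/dropbox/event.txt", "example contents")

with zipfile.ZipFile(buffer) as archive:
    # date_time is a (year, month, day, hour, minute, second) tuple in the
    # local time of whatever machine produced the archive.
    file_timetuple = archive.getinfo("FS/data/system/dropbox/event.txt").date_time
    modification_time = datetime.datetime(*file_timetuple)

print(modification_time.isoformat())
```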
55  src/mvt/android/modules/bugreport/fs_timestamps.py (new file)

@@ -0,0 +1,55 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 The MVT Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import logging
from typing import Optional

from mvt.common.utils import convert_datetime_to_iso
from .base import BugReportModule
from mvt.android.artifacts.file_timestamps import FileTimestampsArtifact


class BugReportTimestamps(FileTimestampsArtifact, BugReportModule):
    """This module extracts records from battery daily updates."""

    slug = "bugreport_timestamps"

    def __init__(
        self,
        file_path: Optional[str] = None,
        target_path: Optional[str] = None,
        results_path: Optional[str] = None,
        module_options: Optional[dict] = None,
        log: logging.Logger = logging.getLogger(__name__),
        results: Optional[list] = None,
    ) -> None:
        super().__init__(
            file_path=file_path,
            target_path=target_path,
            results_path=results_path,
            module_options=module_options,
            log=log,
            results=results,
        )

    def run(self) -> None:
        filesystem_files = self._get_files_by_pattern("FS/*")

        self.results = []
        for file in filesystem_files:
            # Only the modification time is available in the zip file metadata.
            # The timezone is the local timezone of the phone.
            modification_time = self._get_file_modification_time(file)
            self.results.append(
                {
                    "path": file,
                    "modified_time": convert_datetime_to_iso(modification_time),
                }
            )

        self.log.info(
            "Extracted a total of %d filesystem timestamps from bugreport.",
            len(self.results),
        )
48  src/mvt/android/modules/bugreport/platform_compat.py (new file)

@@ -0,0 +1,48 @@
# Mobile Verification Toolkit (MVT)
# Copyright (c) 2021-2023 The MVT Authors.
# Use of this software is governed by the MVT License 1.1 that can be found at
# https://license.mvt.re/1.1/

import logging
from typing import Optional

from mvt.android.artifacts.dumpsys_platform_compat import DumpsysPlatformCompatArtifact

from mvt.android.modules.bugreport.base import BugReportModule


class PlatformCompat(DumpsysPlatformCompatArtifact, BugReportModule):
    """This module extracts details on uninstalled apps."""

    def __init__(
        self,
        file_path: Optional[str] = None,
        target_path: Optional[str] = None,
        results_path: Optional[str] = None,
        module_options: Optional[dict] = None,
        log: logging.Logger = logging.getLogger(__name__),
        results: Optional[list] = None,
    ) -> None:
        super().__init__(
            file_path=file_path,
            target_path=target_path,
            results_path=results_path,
            module_options=module_options,
            log=log,
            results=results,
        )

    def run(self) -> None:
        data = self._get_dumpstate_file()
        if not data:
            self.log.error(
                "Unable to find dumpstate file. "
                "Did you provide a valid bug report archive?"
            )
            return

        data = data.decode("utf-8", errors="replace")
        content = self.extract_dumpsys_section(data, "DUMP OF SERVICE platform_compat:")
        self.parse(content)

        self.log.info("Found %d uninstalled apps", len(self.results))
@@ -42,18 +42,23 @@ class Tombstones(TombstoneCrashArtifact, BugReportModule):
             )
             return

-        for tombstone_file in tombstone_files:
-            if tombstone_file.endswith("*.pb"):
-                self.log.info("Skipping protobuf tombstone file: %s", tombstone_file)
-                continue
-
-            print(tombstone_file)
+        for tombstone_file in sorted(tombstone_files):
+            tombstone_filename = tombstone_file.split("/")[-1]
+            modification_time = self._get_file_modification_time(tombstone_file)
             tombstone_data = self._get_file_content(tombstone_file)
-            tombstone = self.parse_tombstone(tombstone_data)
-            print(tombstone)
-            break
-
-        # self.log.info(
-        #     "Extracted a total of %d database connection pool records",
-        #     len(self.results),
-        # )
+            try:
+                if tombstone_file.endswith(".pb"):
+                    self.parse_protobuf(
+                        tombstone_filename, modification_time, tombstone_data
+                    )
+                else:
+                    self.parse(tombstone_filename, modification_time, tombstone_data)
+            except ValueError as e:
+                # Catch any exceptions raised during parsing or validation.
+                self.log.error(f"Error parsing tombstone file {tombstone_file}: {e}")
+
+        self.log.info(
+            "Extracted a total of %d tombstone files",
+            len(self.results),
+        )
@@ -231,6 +231,7 @@ def parse_sms_file(data):
             entry.pop("mms_body")

         body = entry.get("body", None)
+        message_links = None
         if body:
             message_links = check_for_links(entry["body"])

@@ -65,6 +65,10 @@ class CmdCheckIOCS(Command):
                 m = iocs_module.from_json(
                     file_path, log=logging.getLogger(iocs_module.__module__)
                 )
+                if not m:
+                    log.warning("No result from this module, skipping it")
+                    continue
+
                 if self.iocs.total_ioc_count > 0:
                     m.indicators = self.iocs
                     m.indicators.log = m.log
@@ -17,6 +17,7 @@ from mvt.common.utils import (
|
||||
generate_hashes_from_path,
|
||||
get_sha256_from_file_path,
|
||||
)
|
||||
from mvt.common.config import settings
|
||||
from mvt.common.version import MVT_VERSION
|
||||
|
||||
|
||||
@@ -81,7 +82,7 @@ class Command:
|
||||
os.path.join(self.results_path, "command.log")
|
||||
)
|
||||
formatter = logging.Formatter(
|
||||
"%(asctime)s - %(name)s - " "%(levelname)s - %(message)s"
|
||||
"%(asctime)s - %(name)s - %(levelname)s - %(message)s"
|
||||
)
|
||||
file_handler.setLevel(logging.DEBUG)
|
||||
file_handler.setFormatter(formatter)
|
||||
@@ -100,15 +101,25 @@ class Command:
        if not self.results_path:
            return

        # We use local timestamps in the timeline on Android as many
        # logs do not contain timezone information.
        if type(self).__name__.startswith("CmdAndroid"):
            is_utc = False
        else:
            is_utc = True

        if len(self.timeline) > 0:
            save_timeline(
                self.timeline, os.path.join(self.results_path, "timeline.csv")
                self.timeline,
                os.path.join(self.results_path, "timeline.csv"),
                is_utc=is_utc,
            )

        if len(self.timeline_detected) > 0:
            save_timeline(
                self.timeline_detected,
                os.path.join(self.results_path, "timeline_detected.csv"),
                is_utc=is_utc,
            )

def _store_info(self) -> None:
|
||||
@@ -132,7 +143,7 @@ class Command:
|
||||
if ioc_file_path and ioc_file_path not in info["ioc_files"]:
|
||||
info["ioc_files"].append(ioc_file_path)
|
||||
|
||||
if self.target_path and (os.environ.get("MVT_HASH_FILES") or self.hashes):
|
||||
if self.target_path and (settings.HASH_FILES or self.hashes):
|
||||
self.generate_hashes()
|
||||
|
||||
info["hashes"] = self.hash_values
|
||||
@@ -141,7 +152,7 @@ class Command:
|
||||
with open(info_path, "w+", encoding="utf-8") as handle:
|
||||
json.dump(info, handle, indent=4)
|
||||
|
||||
if self.target_path and (os.environ.get("MVT_HASH_FILES") or self.hashes):
|
||||
if self.target_path and (settings.HASH_FILES or self.hashes):
|
||||
info_hash = get_sha256_from_file_path(info_path)
|
||||
self.log.info('Reference hash of the info.json file: "%s"', info_hash)
|
||||
|
||||
|
||||
105
src/mvt/common/config.py
Normal file
@@ -0,0 +1,105 @@
import os
import yaml
import json

from typing import Tuple, Type, Optional
from appdirs import user_config_dir
from pydantic import AnyHttpUrl, Field
from pydantic_settings import (
    BaseSettings,
    InitSettingsSource,
    PydanticBaseSettingsSource,
    SettingsConfigDict,
    YamlConfigSettingsSource,
)

MVT_CONFIG_FOLDER = user_config_dir("mvt")
MVT_CONFIG_PATH = os.path.join(MVT_CONFIG_FOLDER, "config.yaml")


class MVTSettings(BaseSettings):
    model_config = SettingsConfigDict(
        env_prefix="MVT_",
        env_nested_delimiter="_",
        extra="ignore",
        nested_model_default_partial_updates=True,
    )
    # Allows deciding whether environment variables should be loaded.
    load_env: bool = Field(True, exclude=True)

    # General settings
    PYPI_UPDATE_URL: AnyHttpUrl = Field(
        "https://pypi.org/pypi/mvt/json",
        validate_default=False,
    )
    NETWORK_ACCESS_ALLOWED: bool = True
    NETWORK_TIMEOUT: int = 15

    # Command default settings; all can also be specified via MVT_-prefixed environment variables.
    IOS_BACKUP_PASSWORD: Optional[str] = Field(
        None, description="Default password to use to decrypt iOS backups"
    )
    ANDROID_BACKUP_PASSWORD: Optional[str] = Field(
        None, description="Default password to use to decrypt Android backups"
    )
    STIX2: Optional[str] = Field(
        None, description="List of directories where STIX2 files are stored"
    )
    VT_API_KEY: Optional[str] = Field(
        None, description="API key to use for VirusTotal lookups"
    )
    PROFILE: bool = Field(False, description="Profile the execution of MVT modules")
    HASH_FILES: bool = Field(False, description="Should MVT hash output files")

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: Type[BaseSettings],
        init_settings: InitSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> Tuple[PydanticBaseSettingsSource, ...]:
        sources = (
            YamlConfigSettingsSource(settings_cls, MVT_CONFIG_PATH),
            init_settings,
        )
        # Load env variables if enabled
        if init_settings.init_kwargs.get("load_env", True):
            sources = (env_settings,) + sources
        return sources

    def save_settings(
        self,
    ) -> None:
        """
        Save the current settings to a file.
        """
        if not os.path.isdir(MVT_CONFIG_FOLDER):
            os.makedirs(MVT_CONFIG_FOLDER)

        # Dump the settings to the YAML file
        model_serializable = json.loads(self.model_dump_json(exclude_defaults=True))
        with open(MVT_CONFIG_PATH, "w") as config_file:
            config_file.write(yaml.dump(model_serializable, default_flow_style=False))

    @classmethod
    def initialise(cls) -> "MVTSettings":
        """
        Initialise the settings file.

        We first initialise the settings (without env variables) and then persist
        them to file. This way we can update the config file with the default values.

        Afterwards we load the settings again, this time including the env variables.
        """
        # First pass: do not load env variables, so only defaults and config.yaml apply.
        settings = MVTSettings(load_env=False)
        settings.save_settings()

        # Load the settings again with any ENV variables.
        settings = MVTSettings(load_env=True)
        return settings


settings = MVTSettings.initialise()
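
As a quick orientation for the new settings object above, a minimal sketch of how values can be overridden through the MVT_ environment prefix; the concrete values are illustrative only:

    import os
    from mvt.common.config import MVTSettings

    # Environment variables prefixed with MVT_ override config.yaml and defaults.
    os.environ["MVT_NETWORK_TIMEOUT"] = "30"          # illustrative value
    os.environ["MVT_VT_API_KEY"] = "example-api-key"  # illustrative value

    settings = MVTSettings.initialise()  # persists defaults, then re-reads with env overrides
    assert settings.NETWORK_TIMEOUT == 30
    assert settings.VT_API_KEY == "example-api-key"
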
@@ -14,6 +14,7 @@ import ahocorasick
|
||||
from appdirs import user_data_dir
|
||||
|
||||
from .url import URL
|
||||
from .config import settings
|
||||
|
||||
MVT_DATA_FOLDER = user_data_dir("mvt")
|
||||
MVT_INDICATORS_FOLDER = os.path.join(MVT_DATA_FOLDER, "indicators")
|
||||
@@ -41,12 +42,12 @@ class Indicators:
|
||||
|
||||
def _check_stix2_env_variable(self) -> None:
|
||||
"""
|
||||
Checks if a variable MVT_STIX2 contains path to a STIX file. Also recursively searches through dirs in MVT_STIX2
|
||||
Checks if MVT_STIX2 setting or environment variable contains path to a STIX file. Also recursively searches through dirs in MVT_STIX2
|
||||
"""
|
||||
if "MVT_STIX2" not in os.environ:
|
||||
if not settings.STIX2:
|
||||
return
|
||||
|
||||
paths = os.environ["MVT_STIX2"].split(":")
|
||||
paths = settings.STIX2.split(":")
|
||||
for path in paths:
|
||||
if os.path.isfile(path) and path.lower().endswith(".stix2"):
|
||||
self.parse_stix2(path)
|
||||
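
For illustration, the colon-separated form that docstring refers to; the paths are made up, and the settings reset mirrors the test pattern further down in this diff:

    import logging
    import os
    from mvt.common.config import settings
    from mvt.common.indicators import Indicators

    os.environ["MVT_STIX2"] = "/home/user/iocs/pegasus.stix2:/home/user/iocs/more-iocs"
    settings.__init__()  # re-read settings so STIX2 picks up the env value

    ind = Indicators(log=logging.getLogger())
    ind.load_indicators_files([], load_default=False)
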
@@ -383,8 +384,7 @@ class Indicators:
|
||||
for ioc in self.get_iocs("urls"):
|
||||
if ioc["value"] == url:
|
||||
self.log.warning(
|
||||
"Found a known suspicious URL %s "
|
||||
'matching indicator "%s" from "%s"',
|
||||
'Found a known suspicious URL %s matching indicator "%s" from "%s"',
|
||||
url,
|
||||
ioc["value"],
|
||||
ioc["name"],
|
||||
@@ -654,7 +654,8 @@ class Indicators:
|
||||
return None
|
||||
|
||||
for ioc in self.get_iocs("processes"):
|
||||
parts = file_path.split("/")
|
||||
# Use os-agnostic path splitting to handle both Windows (\) and Unix (/) separators
|
||||
parts = file_path.replace("\\", "/").split("/")
|
||||
if ioc["value"] in parts:
|
||||
self.log.warning(
|
||||
"Found known suspicious process name mentioned in file at "
|
||||
|
||||
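
A small illustration of that change (the file paths and the "bh.exe" process name are made up):

    "C:\\Windows\\Temp\\bh.exe".replace("\\", "/").split("/")  # ["C:", "Windows", "Temp", "bh.exe"]
    "/private/var/tmp/bh.exe".replace("\\", "/").split("/")    # ["", "private", "var", "tmp", "bh.exe"]
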
@@ -29,7 +29,7 @@ def check_updates() -> None:
|
||||
if latest_version:
|
||||
rich_print(
|
||||
f"\t\t[bold]Version {latest_version} is available! "
|
||||
"Upgrade mvt with `pip3 install -U mvt`[/bold]"
|
||||
"Upgrade mvt with `pip3 install -U mvt` or with `pipx upgrade mvt`[/bold]"
|
||||
)
|
||||
|
||||
# Then we check for indicators files updates.
|
||||
|
||||
@@ -69,10 +69,14 @@ class MVTModule:
    @classmethod
    def from_json(cls, json_path: str, log: logging.Logger):
        with open(json_path, "r", encoding="utf-8") as handle:
            results = json.load(handle)
            if log:
                log.info('Loaded %d results from "%s"', len(results), json_path)
            return cls(results=results, log=log)
            try:
                results = json.load(handle)
                if log:
                    log.info('Loaded %d results from "%s"', len(results), json_path)
                return cls(results=results, log=log)
            except json.decoder.JSONDecodeError as err:
                log.error('Failed to decode the JSON file "%s": "%s"', json_path, err)
                return None

    @classmethod
    def get_slug(cls) -> str:
@@ -227,7 +231,7 @@ def run_module(module: MVTModule) -> None:
        module.save_to_json()


def save_timeline(timeline: list, timeline_path: str) -> None:
def save_timeline(timeline: list, timeline_path: str, is_utc: bool = True) -> None:
    """Save the timeline in a csv file.

    :param timeline: List of records to order and store
@@ -238,7 +242,12 @@ def save_timeline(timeline: list, timeline_path: str) -> None:
        csvoutput = csv.writer(
            handle, delimiter=",", quotechar='"', quoting=csv.QUOTE_ALL, escapechar="\\"
        )
        csvoutput.writerow(["UTC Timestamp", "Plugin", "Event", "Description"])

        if is_utc:
            timestamp_header = "UTC Timestamp"
        else:
            timestamp_header = "Device Local Timestamp"
        csvoutput.writerow([timestamp_header, "Plugin", "Event", "Description"])

        for event in sorted(
            timeline, key=lambda x: x["timestamp"] if x["timestamp"] is not None else ""

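
To make the new flag concrete, a hedged example of a call site; the timeline record fields other than "timestamp" are not shown in this hunk, so the dictionary shape below is only assumed:

    # Android commands pass is_utc=False so the first CSV column reads
    # "Device Local Timestamp" instead of "UTC Timestamp".
    save_timeline(
        [{"timestamp": "2023-04-12 12:32:40.518290", "module": "Tombstones",
          "event": "tombstone_crash", "data": "mtk.ape.decoder crash"}],
        "timeline.csv",
        is_utc=False,
    )
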
@@ -14,6 +14,7 @@ from packaging import version
|
||||
|
||||
from .indicators import MVT_DATA_FOLDER, MVT_INDICATORS_FOLDER
|
||||
from .version import MVT_VERSION
|
||||
from .config import settings
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
@@ -23,7 +24,7 @@ INDICATORS_CHECK_FREQUENCY = 12
|
||||
|
||||
class MVTUpdates:
|
||||
def check(self) -> str:
|
||||
res = requests.get("https://pypi.org/pypi/mvt/json", timeout=15)
|
||||
res = requests.get(settings.PYPI_UPDATE_URL, timeout=15)
|
||||
data = res.json()
|
||||
latest_version = data.get("info", {}).get("version", "")
|
||||
|
||||
|
||||
@@ -13,6 +13,7 @@ import re
|
||||
from typing import Any, Iterator, Union
|
||||
|
||||
from rich.logging import RichHandler
|
||||
from mvt.common.config import settings
|
||||
|
||||
|
||||
class CustomJSONEncoder(json.JSONEncoder):
|
||||
@@ -256,7 +257,7 @@ def set_verbose_logging(verbose: bool = False):
|
||||
|
||||
def exec_or_profile(module, globals, locals):
|
||||
"""Hook for profiling MVT modules"""
|
||||
if int(os.environ.get("MVT_PROFILE", False)):
|
||||
if settings.PROFILE:
|
||||
cProfile.runctx(module, globals, locals)
|
||||
else:
|
||||
exec(module, globals, locals)
|
||||
|
||||
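
A hedged note on driving this switch now that it reads from settings: the flag can be set in config.yaml or through the MVT_ environment prefix; the snippet mirrors the settings reset used by the tests later in this diff:

    import os
    from mvt.common.config import settings

    os.environ["MVT_PROFILE"] = "1"
    settings.__init__()  # re-read settings so the env override takes effect
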
@@ -3,4 +3,4 @@
|
||||
# Use of this software is governed by the MVT License 1.1 that can be found at
|
||||
# https://license.mvt.re/1.1/
|
||||
|
||||
MVT_VERSION = "2.5.4"
|
||||
MVT_VERSION = "2.6.1"
|
||||
|
||||
@@ -100,7 +100,7 @@ def decrypt_backup(ctx, destination, password, key_file, hashes, backup_path):
|
||||
if key_file:
|
||||
if MVT_IOS_BACKUP_PASSWORD in os.environ:
|
||||
log.info(
|
||||
"Ignoring %s environment variable, using --key-file" "'%s' instead",
|
||||
"Ignoring %s environment variable, using --key-file'%s' instead",
|
||||
MVT_IOS_BACKUP_PASSWORD,
|
||||
key_file,
|
||||
)
|
||||
@@ -114,7 +114,7 @@ def decrypt_backup(ctx, destination, password, key_file, hashes, backup_path):
|
||||
|
||||
if MVT_IOS_BACKUP_PASSWORD in os.environ:
|
||||
log.info(
|
||||
"Ignoring %s environment variable, using --password" "argument instead",
|
||||
"Ignoring %s environment variable, using --passwordargument instead",
|
||||
MVT_IOS_BACKUP_PASSWORD,
|
||||
)
|
||||
|
||||
@@ -168,8 +168,7 @@ def extract_key(password, key_file, backup_path):
|
||||
|
||||
if MVT_IOS_BACKUP_PASSWORD in os.environ:
|
||||
log.info(
|
||||
"Ignoring %s environment variable, using --password "
|
||||
"argument instead",
|
||||
"Ignoring %s environment variable, using --password argument instead",
|
||||
MVT_IOS_BACKUP_PASSWORD,
|
||||
)
|
||||
elif MVT_IOS_BACKUP_PASSWORD in os.environ:
|
||||
|
||||
@@ -891,6 +891,10 @@
|
||||
"version": "15.8.2",
|
||||
"build": "19H384"
|
||||
},
|
||||
{
|
||||
"version": "15.8.4",
|
||||
"build": "19H390"
|
||||
},
|
||||
{
|
||||
"build": "20A362",
|
||||
"version": "16.0"
|
||||
@@ -992,6 +996,10 @@
|
||||
"version": "16.7.8",
|
||||
"build": "20H343"
|
||||
},
|
||||
{
|
||||
"version": "16.7.11",
|
||||
"build": "20H360"
|
||||
},
|
||||
{
|
||||
"version": "17.0",
|
||||
"build": "21A327"
|
||||
@@ -1076,6 +1084,10 @@
|
||||
"version": "17.6.1",
|
||||
"build": "21G101"
|
||||
},
|
||||
{
|
||||
"version": "17.7.7",
|
||||
"build": "21H433"
|
||||
},
|
||||
{
|
||||
"version": "18",
|
||||
"build": "22A3354"
|
||||
@@ -1083,5 +1095,45 @@
|
||||
{
|
||||
"version": "18.0.1",
|
||||
"build": "22A3370"
|
||||
},
|
||||
{
|
||||
"version": "18.1",
|
||||
"build": "22B83"
|
||||
},
|
||||
{
|
||||
"version": "18.1.1",
|
||||
"build": "22B91"
|
||||
},
|
||||
{
|
||||
"version": "18.2",
|
||||
"build": "22C152"
|
||||
},
|
||||
{
|
||||
"version": "18.2.1",
|
||||
"build": "22C161"
|
||||
},
|
||||
{
|
||||
"version": "18.3",
|
||||
"build": "22D63"
|
||||
},
|
||||
{
|
||||
"version": "18.3.1",
|
||||
"build": "22D72"
|
||||
},
|
||||
{
|
||||
"version": "18.4",
|
||||
"build": "22E240"
|
||||
},
|
||||
{
|
||||
"version": "18.4.1",
|
||||
"build": "22E252"
|
||||
},
|
||||
{
|
||||
"version": "18.5",
|
||||
"build": "22F76"
|
||||
},
|
||||
{
|
||||
"version": "18.6",
|
||||
"build": "22G86"
|
||||
}
|
||||
]
|
||||
@@ -41,7 +41,7 @@ class BackupInfo(IOSExtraction):
|
||||
info_path = os.path.join(self.target_path, "Info.plist")
|
||||
if not os.path.exists(info_path):
|
||||
raise DatabaseNotFoundError(
|
||||
"No Info.plist at backup path, unable to extract device " "information"
|
||||
"No Info.plist at backup path, unable to extract device information"
|
||||
)
|
||||
|
||||
with open(info_path, "rb") as handle:
|
||||
|
||||
@@ -110,8 +110,7 @@ class Manifest(IOSExtraction):
|
||||
ioc = self.indicators.check_url(part)
|
||||
if ioc:
|
||||
self.log.warning(
|
||||
'Found mention of domain "%s" in a backup file with '
|
||||
"path: %s",
|
||||
'Found mention of domain "%s" in a backup file with path: %s',
|
||||
ioc["value"],
|
||||
rel_path,
|
||||
)
|
||||
|
||||
@@ -74,7 +74,7 @@ class IOSExtraction(MVTModule):
|
||||
|
||||
if not shutil.which("sqlite3"):
|
||||
raise DatabaseCorruptedError(
|
||||
"failed to recover without sqlite3 binary: please install " "sqlite3!"
|
||||
"failed to recover without sqlite3 binary: please install sqlite3!"
|
||||
)
|
||||
if '"' in file_path:
|
||||
raise DatabaseCorruptedError(
|
||||
|
||||
@@ -17,6 +17,12 @@ from mvt.ios.modules.base import IOSExtraction
|
||||
APPLICATIONS_DB_PATH = [
|
||||
"private/var/containers/Bundle/Application/*/iTunesMetadata.plist"
|
||||
]
|
||||
KNOWN_APP_INSTALLERS = [
|
||||
"com.apple.AppStore",
|
||||
"com.apple.AppStore.ProductPageExtension",
|
||||
"com.apple.dmd",
|
||||
"dmd",
|
||||
]
|
||||
|
||||
|
||||
class Applications(IOSExtraction):
|
||||
@@ -80,12 +86,10 @@ class Applications(IOSExtraction):
|
||||
self.detected.append(result)
|
||||
continue
|
||||
# Some apps installed from apple store with sourceApp "com.apple.AppStore.ProductPageExtension"
|
||||
if result.get("sourceApp", "com.apple.AppStore") not in [
|
||||
"com.apple.AppStore",
|
||||
"com.apple.AppStore.ProductPageExtension",
|
||||
"com.apple.dmd",
|
||||
"dmd",
|
||||
]:
|
||||
if (
|
||||
result.get("sourceApp", "com.apple.AppStore")
|
||||
not in KNOWN_APP_INSTALLERS
|
||||
):
|
||||
self.log.warning(
|
||||
"Suspicious app not installed from the App Store or MDM: %s",
|
||||
result["softwareVersionBundleId"],
|
||||
|
||||
@@ -43,6 +43,8 @@ class GlobalPreferences(IOSExtraction):
|
||||
self.log.warning("Lockdown mode enabled")
|
||||
else:
|
||||
self.log.warning("Lockdown mode disabled")
|
||||
return
|
||||
self.log.warning("Lockdown mode disabled")
|
||||
|
||||
def process_file(self, file_path: str) -> None:
|
||||
with open(file_path, "rb") as handle:
|
||||
|
||||
@@ -95,14 +95,17 @@ class SafariBrowserState(IOSExtraction):
|
||||
)
|
||||
except sqlite3.OperationalError:
|
||||
# Old version iOS <12 likely
|
||||
cur.execute(
|
||||
try:
|
||||
cur.execute(
|
||||
"""
|
||||
SELECT
|
||||
title, url, user_visible_url, last_viewed_time, session_data
|
||||
FROM tabs
|
||||
ORDER BY last_viewed_time;
|
||||
"""
|
||||
SELECT
|
||||
title, url, user_visible_url, last_viewed_time, session_data
|
||||
FROM tabs
|
||||
ORDER BY last_viewed_time;
|
||||
"""
|
||||
)
|
||||
)
|
||||
except sqlite3.OperationalError as e:
|
||||
self.log.error(f"Error executing query: {e}")
|
||||
|
||||
for row in cur:
|
||||
session_entries = []
|
||||
|
||||
@@ -43,7 +43,7 @@ class SMS(IOSExtraction):
|
||||
|
||||
def serialize(self, record: dict) -> Union[dict, list]:
|
||||
text = record["text"].replace("\n", "\\n")
|
||||
sms_data = f"{record['service']}: {record['guid']} \"{text}\" from {record['phone_number']} ({record['account']})"
|
||||
sms_data = f'{record["service"]}: {record["guid"]} "{text}" from {record["phone_number"]} ({record["account"]})'
|
||||
records = [
|
||||
{
|
||||
"timestamp": record["isodate"],
|
||||
|
||||
@@ -116,13 +116,16 @@ class TCC(IOSExtraction):
|
||||
)
|
||||
db_version = "v2"
|
||||
except sqlite3.OperationalError:
|
||||
cur.execute(
|
||||
"""SELECT
|
||||
service, client, client_type, allowed,
|
||||
prompt_count
|
||||
FROM access;"""
|
||||
)
|
||||
db_version = "v1"
|
||||
try:
|
||||
cur.execute(
|
||||
"""SELECT
|
||||
service, client, client_type, allowed,
|
||||
prompt_count
|
||||
FROM access;"""
|
||||
)
|
||||
db_version = "v1"
|
||||
except sqlite3.OperationalError as e:
|
||||
self.log.error(f"Error parsing TCC database: {e}")
|
||||
|
||||
for row in cur:
|
||||
service = row[0]
|
||||
|
||||
@@ -100,7 +100,7 @@ class WebkitSessionResourceLog(IOSExtraction):
|
||||
redirect_path += ", ".join(source_domains)
|
||||
redirect_path += " -> "
|
||||
|
||||
redirect_path += f"ORIGIN: \"{entry['origin']}\""
|
||||
redirect_path += f'ORIGIN: "{entry["origin"]}"'
|
||||
|
||||
if len(destination_domains) > 0:
|
||||
redirect_path += " -> "
|
||||
|
||||
@@ -38,44 +38,70 @@ class NetBase(IOSExtraction):
|
||||
|
||||
def _extract_net_data(self):
|
||||
conn = sqlite3.connect(self.file_path)
|
||||
conn.row_factory = sqlite3.Row
|
||||
cur = conn.cursor()
|
||||
cur.execute(
|
||||
try:
|
||||
cur.execute(
|
||||
"""
|
||||
SELECT
|
||||
ZPROCESS.ZFIRSTTIMESTAMP,
|
||||
ZPROCESS.ZTIMESTAMP,
|
||||
ZPROCESS.ZPROCNAME,
|
||||
ZPROCESS.ZBUNDLENAME,
|
||||
ZPROCESS.Z_PK AS ZPROCESS_PK,
|
||||
ZLIVEUSAGE.ZWIFIIN,
|
||||
ZLIVEUSAGE.ZWIFIOUT,
|
||||
ZLIVEUSAGE.ZWWANIN,
|
||||
ZLIVEUSAGE.ZWWANOUT,
|
||||
ZLIVEUSAGE.Z_PK AS ZLIVEUSAGE_PK,
|
||||
ZLIVEUSAGE.ZHASPROCESS,
|
||||
ZLIVEUSAGE.ZTIMESTAMP AS ZL_TIMESTAMP
|
||||
FROM ZLIVEUSAGE
|
||||
LEFT JOIN ZPROCESS ON ZLIVEUSAGE.ZHASPROCESS = ZPROCESS.Z_PK
|
||||
UNION
|
||||
SELECT ZFIRSTTIMESTAMP, ZTIMESTAMP, ZPROCNAME, ZBUNDLENAME, Z_PK,
|
||||
NULL, NULL, NULL, NULL, NULL, NULL, NULL
|
||||
FROM ZPROCESS WHERE Z_PK NOT IN
|
||||
(SELECT ZHASPROCESS FROM ZLIVEUSAGE);
|
||||
"""
|
||||
SELECT
|
||||
ZPROCESS.ZFIRSTTIMESTAMP,
|
||||
ZPROCESS.ZTIMESTAMP,
|
||||
ZPROCESS.ZPROCNAME,
|
||||
ZPROCESS.ZBUNDLENAME,
|
||||
ZPROCESS.Z_PK,
|
||||
ZLIVEUSAGE.ZWIFIIN,
|
||||
ZLIVEUSAGE.ZWIFIOUT,
|
||||
ZLIVEUSAGE.ZWWANIN,
|
||||
ZLIVEUSAGE.ZWWANOUT,
|
||||
ZLIVEUSAGE.Z_PK,
|
||||
ZLIVEUSAGE.ZHASPROCESS,
|
||||
ZLIVEUSAGE.ZTIMESTAMP
|
||||
FROM ZLIVEUSAGE
|
||||
LEFT JOIN ZPROCESS ON ZLIVEUSAGE.ZHASPROCESS = ZPROCESS.Z_PK
|
||||
UNION
|
||||
SELECT ZFIRSTTIMESTAMP, ZTIMESTAMP, ZPROCNAME, ZBUNDLENAME, Z_PK,
|
||||
NULL, NULL, NULL, NULL, NULL, NULL, NULL
|
||||
FROM ZPROCESS WHERE Z_PK NOT IN
|
||||
(SELECT ZHASPROCESS FROM ZLIVEUSAGE);
|
||||
"""
|
||||
)
|
||||
)
|
||||
except sqlite3.OperationalError:
|
||||
# Recent phones don't have ZWIFIIN and ZWIFIOUT columns
|
||||
cur.execute(
|
||||
"""
|
||||
SELECT
|
||||
ZPROCESS.ZFIRSTTIMESTAMP,
|
||||
ZPROCESS.ZTIMESTAMP,
|
||||
ZPROCESS.ZPROCNAME,
|
||||
ZPROCESS.ZBUNDLENAME,
|
||||
ZPROCESS.Z_PK AS ZPROCESS_PK,
|
||||
ZLIVEUSAGE.ZWWANIN,
|
||||
ZLIVEUSAGE.ZWWANOUT,
|
||||
ZLIVEUSAGE.Z_PK AS ZLIVEUSAGE_PK,
|
||||
ZLIVEUSAGE.ZHASPROCESS,
|
||||
ZLIVEUSAGE.ZTIMESTAMP AS ZL_TIMESTAMP
|
||||
FROM ZLIVEUSAGE
|
||||
LEFT JOIN ZPROCESS ON ZLIVEUSAGE.ZHASPROCESS = ZPROCESS.Z_PK
|
||||
UNION
|
||||
SELECT ZFIRSTTIMESTAMP, ZTIMESTAMP, ZPROCNAME, ZBUNDLENAME, Z_PK,
|
||||
NULL, NULL, NULL, NULL, NULL
|
||||
FROM ZPROCESS WHERE Z_PK NOT IN
|
||||
(SELECT ZHASPROCESS FROM ZLIVEUSAGE);
|
||||
"""
|
||||
)
|
||||
|
||||
for row in cur:
|
||||
# ZPROCESS records can be missing after the JOIN.
|
||||
# Handle NULL timestamps.
|
||||
if row[0] and row[1]:
|
||||
first_isodate = convert_mactime_to_iso(row[0])
|
||||
isodate = convert_mactime_to_iso(row[1])
|
||||
if row["ZFIRSTTIMESTAMP"] and row["ZTIMESTAMP"]:
|
||||
first_isodate = convert_mactime_to_iso(row["ZFIRSTTIMESTAMP"])
|
||||
isodate = convert_mactime_to_iso(row["ZTIMESTAMP"])
|
||||
else:
|
||||
first_isodate = row[0]
|
||||
isodate = row[1]
|
||||
first_isodate = row["ZFIRSTTIMESTAMP"]
|
||||
isodate = row["ZTIMESTAMP"]
|
||||
|
||||
if row[11]:
|
||||
live_timestamp = convert_mactime_to_iso(row[11])
|
||||
if row["ZL_TIMESTAMP"]:
|
||||
live_timestamp = convert_mactime_to_iso(row["ZL_TIMESTAMP"])
|
||||
else:
|
||||
live_timestamp = ""
|
||||
|
||||
@@ -83,16 +109,18 @@ class NetBase(IOSExtraction):
|
||||
{
|
||||
"first_isodate": first_isodate,
|
||||
"isodate": isodate,
|
||||
"proc_name": row[2],
|
||||
"bundle_id": row[3],
|
||||
"proc_id": row[4],
|
||||
"wifi_in": row[5],
|
||||
"wifi_out": row[6],
|
||||
"wwan_in": row[7],
|
||||
"wwan_out": row[8],
|
||||
"live_id": row[9],
|
||||
"live_proc_id": row[10],
|
||||
"live_isodate": live_timestamp if row[11] else first_isodate,
|
||||
"proc_name": row["ZPROCNAME"],
|
||||
"bundle_id": row["ZBUNDLENAME"],
|
||||
"proc_id": row["ZPROCESS_PK"],
|
||||
"wifi_in": row["ZWIFIIN"] if "ZWIFIIN" in row.keys() else None,
|
||||
"wifi_out": row["ZWIFIOUT"] if "ZWIFIOUT" in row.keys() else None,
|
||||
"wwan_in": row["ZWWANIN"],
|
||||
"wwan_out": row["ZWWANOUT"],
|
||||
"live_id": row["ZLIVEUSAGE_PK"],
|
||||
"live_proc_id": row["ZHASPROCESS"],
|
||||
"live_isodate": live_timestamp
|
||||
if row["ZL_TIMESTAMP"]
|
||||
else first_isodate,
|
||||
}
|
||||
)
|
||||
|
||||
@@ -108,8 +136,6 @@ class NetBase(IOSExtraction):
|
||||
)
|
||||
record_data_usage = (
|
||||
record_data + " "
|
||||
f"WIFI IN: {record['wifi_in']}, "
|
||||
f"WIFI OUT: {record['wifi_out']} - "
|
||||
f"WWAN IN: {record['wwan_in']}, "
|
||||
f"WWAN OUT: {record['wwan_out']}"
|
||||
)
|
||||
|
||||
@@ -29,3 +29,28 @@ class TestDumpsysADBArtifact:
|
||||
user_key["fingerprint"] == "F0:A1:3D:8C:B3:F4:7B:09:9F:EE:8B:D8:38:2E:BD:C6"
|
||||
)
|
||||
assert user_key["user"] == "user@linux"
|
||||
|
||||
def test_parsing_adb_xml(self):
|
||||
da_adb = DumpsysADBArtifact()
|
||||
file = get_artifact("android_data/dumpsys_adb_xml.txt")
|
||||
with open(file, "rb") as f:
|
||||
data = f.read()
|
||||
|
||||
da_adb.parse(data)
|
||||
|
||||
assert len(da_adb.results) == 1
|
||||
|
||||
adb_data = da_adb.results[0]
|
||||
assert "user_keys" in adb_data
|
||||
assert len(adb_data["user_keys"]) == 1
|
||||
|
||||
# Check key and fingerprint parsed successfully.
|
||||
expected_fingerprint = "F0:0B:27:08:E3:68:7B:FA:4C:79:A2:B4:BF:0E:CF:70"
|
||||
user_key = adb_data["user_keys"][0]
|
||||
user_key["fingerprint"] == expected_fingerprint
|
||||
assert user_key["user"] == "user@laptop"
|
||||
|
||||
key_store_entry = adb_data["keystore"][0]
|
||||
assert key_store_entry["user"] == "user@laptop"
|
||||
assert key_store_entry["fingerprint"] == expected_fingerprint
|
||||
assert key_store_entry["last_connected"] == "1628501829898"
|
||||
|
||||
@@ -43,5 +43,21 @@ class TestDumpsysAppopsArtifact:
|
||||
ind.ioc_collections[0]["app_ids"].append("com.facebook.katana")
|
||||
da.indicators = ind
|
||||
assert len(da.detected) == 0
|
||||
|
||||
da.check_indicators()
|
||||
assert len(da.detected) == 1
|
||||
detected_by_ioc = [
|
||||
detected for detected in da.detected if detected.get("matched_indicator")
|
||||
]
|
||||
detected_by_permission_heuristic = [
|
||||
detected
|
||||
for detected in da.detected
|
||||
if all(
|
||||
[
|
||||
perm["name"] == "REQUEST_INSTALL_PACKAGES"
|
||||
for perm in detected["permissions"]
|
||||
]
|
||||
)
|
||||
]
|
||||
assert len(da.detected) == 3
|
||||
assert len(detected_by_ioc) == 1
|
||||
assert len(detected_by_permission_heuristic) == 2
|
||||
|
||||
40
tests/android/test_artifact_dumpsys_platform_compat.py
Normal file
@@ -0,0 +1,40 @@
|
||||
# Mobile Verification Toolkit (MVT)
|
||||
# Copyright (c) 2021-2023 The MVT Authors.
|
||||
# Use of this software is governed by the MVT License 1.1 that can be found at
|
||||
# https://license.mvt.re/1.1/
|
||||
import logging
|
||||
|
||||
from mvt.android.artifacts.dumpsys_platform_compat import DumpsysPlatformCompatArtifact
|
||||
from mvt.common.indicators import Indicators
|
||||
|
||||
from ..utils import get_artifact
|
||||
|
||||
|
||||
class TestDumpsysPlatformCompatArtifact:
|
||||
def test_parsing(self):
|
||||
dbi = DumpsysPlatformCompatArtifact()
|
||||
file = get_artifact("android_data/dumpsys_platform_compat.txt")
|
||||
with open(file) as f:
|
||||
data = f.read()
|
||||
|
||||
assert len(dbi.results) == 0
|
||||
dbi.parse(data)
|
||||
assert len(dbi.results) == 2
|
||||
assert dbi.results[0]["package_name"] == "org.torproject.torbrowser"
|
||||
assert dbi.results[1]["package_name"] == "org.article19.circulo.next"
|
||||
|
||||
def test_ioc_check(self, indicator_file):
|
||||
dbi = DumpsysPlatformCompatArtifact()
|
||||
file = get_artifact("android_data/dumpsys_platform_compat.txt")
|
||||
with open(file) as f:
|
||||
data = f.read()
|
||||
dbi.parse(data)
|
||||
|
||||
ind = Indicators(log=logging.getLogger())
|
||||
ind.parse_stix2(indicator_file)
|
||||
ind.ioc_collections[0]["app_ids"].append("org.torproject.torbrowser")
|
||||
ind.ioc_collections[0]["app_ids"].append("org.article19.circulo.next")
|
||||
dbi.indicators = ind
|
||||
assert len(dbi.detected) == 0
|
||||
dbi.check_indicators()
|
||||
assert len(dbi.detected) == 2
|
||||
@@ -2,39 +2,66 @@
|
||||
# Copyright (c) 2021-2023 The MVT Authors.
|
||||
# Use of this software is governed by the MVT License 1.1 that can be found at
|
||||
# https://license.mvt.re/1.1/
|
||||
import os
|
||||
import datetime
|
||||
|
||||
import pytest
|
||||
|
||||
from mvt.android.artifacts.tombstone_crashes import TombstoneCrashArtifact
|
||||
from mvt.android.parsers.proto.tombstone import Tombstone
|
||||
|
||||
from ..utils import get_artifact
|
||||
|
||||
|
||||
class TestTombstoneCrashArtifact:
|
||||
# def test_tombtone_process_parsing(self):
|
||||
# tombstone_artifact = TombstoneCrashArtifact()
|
||||
# file = get_artifact("android_data/tombstone_process.txt")
|
||||
# with open(file, "rb") as f:
|
||||
# data = f.read()
|
||||
|
||||
# tombstone_artifact.parse_text(data)
|
||||
# assert len(tombstone_artifact.results) == 1
|
||||
|
||||
# def test_tombtone_kernel_parsing(self):
|
||||
# tombstone_artifact = TombstoneCrashArtifact()
|
||||
# file = get_artifact("android_data/tombstone_kernel.txt")
|
||||
# with open(file, "rb") as f:
|
||||
# data = f.read()
|
||||
|
||||
# tombstone_artifact.parse_text(data)
|
||||
# assert len(tombstone_artifact.results) == 1
|
||||
|
||||
def test_tombstone_pb_process_parsing(self):
|
||||
file = get_artifact("android_data/tombstone_process.pb")
|
||||
def test_tombtone_process_parsing(self):
|
||||
tombstone_artifact = TombstoneCrashArtifact()
|
||||
artifact_path = "android_data/tombstone_process.txt"
|
||||
file = get_artifact(artifact_path)
|
||||
with open(file, "rb") as f:
|
||||
data = f.read()
|
||||
|
||||
parsed_tombstone = Tombstone().parse(data)
|
||||
assert parsed_tombstone
|
||||
assert parsed_tombstone.command_line == ["/vendor/bin/hw/android.hardware.media.c2@1.2-mediatek"]
|
||||
assert parsed_tombstone.uid == 1046
|
||||
assert parsed_tombstone.timestamp == "2023-04-12 12:32:40.518290770+0200"
|
||||
# Pass the file name and timestamp to the parse method
|
||||
file_name = os.path.basename(artifact_path)
|
||||
file_timestamp = datetime.datetime(2023, 4, 12, 12, 32, 40, 518290)
|
||||
tombstone_artifact.parse(file_name, file_timestamp, data)
|
||||
|
||||
assert len(tombstone_artifact.results) == 1
|
||||
self.validate_tombstone_result(tombstone_artifact.results[0])
|
||||
|
||||
def test_tombstone_pb_process_parsing(self):
|
||||
tombstone_artifact = TombstoneCrashArtifact()
|
||||
artifact_path = "android_data/tombstone_process.pb"
|
||||
file = get_artifact(artifact_path)
|
||||
with open(file, "rb") as f:
|
||||
data = f.read()
|
||||
|
||||
file_name = os.path.basename(artifact_path)
|
||||
file_timestamp = datetime.datetime(2023, 4, 12, 12, 32, 40, 518290)
|
||||
tombstone_artifact.parse_protobuf(file_name, file_timestamp, data)
|
||||
|
||||
assert len(tombstone_artifact.results) == 1
|
||||
self.validate_tombstone_result(tombstone_artifact.results[0])
|
||||
|
||||
@pytest.mark.skip(reason="Not implemented yet")
|
||||
def test_tombtone_kernel_parsing(self):
|
||||
tombstone_artifact = TombstoneCrashArtifact()
|
||||
file = get_artifact("android_data/tombstone_kernel.txt")
|
||||
with open(file, "rb") as f:
|
||||
data = f.read()
|
||||
|
||||
tombstone_artifact.parse_text(data)
|
||||
assert len(tombstone_artifact.results) == 1
|
||||
|
||||
def validate_tombstone_result(self, tombstone_result: dict):
|
||||
assert tombstone_result.get("command_line") == [
|
||||
"/vendor/bin/hw/android.hardware.media.c2@1.2-mediatek"
|
||||
]
|
||||
assert tombstone_result.get("uid") == 1046
|
||||
assert tombstone_result.get("pid") == 25541
|
||||
assert tombstone_result.get("process_name") == "mtk.ape.decoder"
|
||||
|
||||
        # With Android logs we want to keep timestamps as device local time for consistency.
        # We often don't know the time offset for a log entry and so can't convert everything to UTC.
        # MVT should output the local time only:
        # So the original 2023-04-12 12:32:40.518290770+0200 -> 2023-04-12 12:32:40.518290
        assert tombstone_result.get("timestamp") == "2023-04-12 12:32:40.518290"
|
||||
|
||||
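
A sketch of the normalisation the comment above describes, written out for this one sample value; this is an assumed equivalent, not the artifact's actual implementation:

    raw = "2023-04-12 12:32:40.518290770+0200"
    # Drop the timezone offset and truncate nanoseconds to microseconds.
    local = raw.rsplit("+", 1)[0][:26]
    assert local == "2023-04-12 12:32:40.518290"
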
23
tests/android_androidqf/test_dumpsys_platform_compat.py
Normal file
@@ -0,0 +1,23 @@
|
||||
# Mobile Verification Toolkit (MVT)
|
||||
# Copyright (c) 2021-2023 The MVT Authors.
|
||||
# Use of this software is governed by the MVT License 1.1 that can be found at
|
||||
# https://license.mvt.re/1.1/
|
||||
|
||||
from pathlib import Path
|
||||
|
||||
from mvt.android.modules.androidqf.dumpsys_platform_compat import DumpsysPlatformCompat
|
||||
from mvt.common.module import run_module
|
||||
|
||||
from ..utils import get_android_androidqf, list_files
|
||||
|
||||
|
||||
class TestDumpsysPlatformCompatModule:
|
||||
def test_parsing(self):
|
||||
data_path = get_android_androidqf()
|
||||
m = DumpsysPlatformCompat(target_path=data_path)
|
||||
files = list_files(data_path)
|
||||
parent_path = Path(data_path).absolute().parent.as_posix()
|
||||
m.from_folder(parent_path, files)
|
||||
run_module(m)
|
||||
assert len(m.results) == 2
|
||||
assert len(m.detected) == 0
|
||||
@@ -21,4 +21,9 @@ class TestDumpsysAppOpsModule:
|
||||
run_module(m)
|
||||
assert len(m.results) == 12
|
||||
assert len(m.timeline) == 16
|
||||
assert len(m.detected) == 0
|
||||
|
||||
detected_by_ioc = [
|
||||
detected for detected in m.detected if detected.get("matched_indicator")
|
||||
]
|
||||
assert len(m.detected) == 1
|
||||
assert len(detected_by_ioc) == 0
|
||||
|
||||
@@ -9,6 +9,7 @@ from pathlib import Path
|
||||
from mvt.android.modules.bugreport.appops import Appops
|
||||
from mvt.android.modules.bugreport.getprop import Getprop
|
||||
from mvt.android.modules.bugreport.packages import Packages
|
||||
from mvt.android.modules.bugreport.tombstones import Tombstones
|
||||
from mvt.common.module import run_module
|
||||
|
||||
from ..utils import get_artifact_folder
|
||||
@@ -33,7 +34,12 @@ class TestBugreportAnalysis:
|
||||
m = self.launch_bug_report_module(Appops)
|
||||
assert len(m.results) == 12
|
||||
assert len(m.timeline) == 16
|
||||
assert len(m.detected) == 0
|
||||
|
||||
detected_by_ioc = [
|
||||
detected for detected in m.detected if detected.get("matched_indicator")
|
||||
]
|
||||
        assert len(m.detected) == 1  # Heuristic detection for suspicious permissions
|
||||
assert len(detected_by_ioc) == 0
|
||||
|
||||
def test_packages_module(self):
|
||||
m = self.launch_bug_report_module(Packages)
|
||||
@@ -49,3 +55,8 @@ class TestBugreportAnalysis:
|
||||
def test_getprop_module(self):
|
||||
m = self.launch_bug_report_module(Getprop)
|
||||
assert len(m.results) == 0
|
||||
|
||||
def test_tombstones_modules(self):
|
||||
m = self.launch_bug_report_module(Tombstones)
|
||||
assert len(m.results) == 2
|
||||
assert m.results[1]["pid"] == 3559
|
||||
|
||||
@@ -0,0 +1,27 @@
|
||||
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
|
||||
Build fingerprint: 'samsung/a10eea/a10:10/.190711.020/A105:user/release-keys'
|
||||
Revision: '5'
|
||||
ABI: 'arm'
|
||||
Timestamp: 2021-09-29 17:43:49+0200
|
||||
pid: 9850, tid: 9893, name: UsbFfs-worker >>> /system/bin/adbd <<<
|
||||
uid: 2000
|
||||
signal 6 (SIGABRT), code -1 (SI_QUEUE), fault addr --------
|
||||
Abort message: 'Check failed: payload.size() <= bytes_left (payload.size()=99, bytes_left=51) '
|
||||
r0 00000000 r1 000026a5 r2 00000006 r3 f11fad98
|
||||
r4 f11fadac r5 f11fad90 r6 0000267a r7 0000016b
|
||||
r8 f11fada8 r9 f11fad98 r10 f11fadc8 r11 f11fadb8
|
||||
ip 000026a5 sp f11fad68 lr f20c23b7 pc f20c23ca
|
||||
|
||||
backtrace:
|
||||
#00 pc 000603ca /apex/com.android.runtime/lib/bionic/libc.so (abort+166) (BuildId: 320fbdc2a1289fadd7dacae7f2eb77a3)
|
||||
#01 pc 00007e23 /system/lib/libbase.so (android::base::DefaultAborter(char const*)+6) (BuildId: a28585ee446ea17e3e6fcf9c907fff2a)
|
||||
#02 pc 0000855f /system/lib/libbase.so (android::base::LogMessage::~LogMessage()+406) (BuildId: a28585ee446ea17e3e6fcf9c907fff2a)
|
||||
#03 pc 000309cf /system/lib/libadbd.so (UsbFfsConnection::ProcessRead(IoBlock*)+814) (BuildId: 3645b175977ae210c156a57b25dfa599)
|
||||
#04 pc 00030459 /system/lib/libadbd.so (UsbFfsConnection::HandleRead(TransferId, long long)+84) (BuildId: 3645b175977ae210c156a57b25dfa599)
|
||||
#05 pc 00030349 /system/lib/libadbd.so (UsbFfsConnection::ReadEvents()+92) (BuildId: 3645b175977ae210c156a57b25dfa599)
|
||||
#06 pc 00030169 /system/lib/libadbd.so (_ZZN16UsbFfsConnection11StartWorkerEvENKUlvE_clEv+504) (BuildId: 3645b175977ae210c156a57b25dfa599)
|
||||
#07 pc 0002ff53 /system/lib/libadbd.so (_ZNSt3__114__thread_proxyINS_5tupleIJNS_10unique_ptrINS_15__thread_structENS_14default_deleteIS3_EEEEZN16UsbFfsConnection11StartWorkerEvEUlvE_EEEEEPvSA_+26) (BuildId: 3645b175977ae210c156a57b25dfa599)
|
||||
#08 pc 000a75b3 /apex/com.android.runtime/lib/bionic/libc.so (__pthread_start(void*)+20) (BuildId: 320fbdc2a1289fadd7dacae7f2eb77a3)
|
||||
#09 pc 00061b33 /apex/com.android.runtime/lib/bionic/libc.so (__start_thread+30) (BuildId: 320fbdc2a1289fadd7dacae7f2eb77a3)
|
||||
|
||||
|
||||
@@ -0,0 +1,38 @@
|
||||
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
|
||||
Build fingerprint: 'samsung/a10eea/a10:11/RP1A.200720.012/A105:user/release-keys'
|
||||
Revision: '5'
|
||||
ABI: 'arm'
|
||||
Timestamp: 2023-08-21 23:28:59-0400
|
||||
pid: 3559, tid: 3568, name: tzts_daemon >>> /vendor/bin/tzts_daemon <<<
|
||||
uid: 1000
|
||||
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0xe8b4d14c
|
||||
r0 e8b4d14c r1 e8b4d14c r2 0000002b r3 00000004
|
||||
r4 00000000 r5 e8b4d14c r6 00000000 r7 00000000
|
||||
r8 e7ef78b0 r9 0000002b r10 e7ef7dad r11 e7ef7400
|
||||
ip 00000000 sp e7ef7208 lr e89f4b01 pc e89c273a
|
||||
|
||||
backtrace:
|
||||
#00 pc 0005f73a /apex/com.android.runtime/lib/bionic/libc.so (strlen_a15+54) (BuildId: fef5b751123147ea65bf3f4f798c9518)
|
||||
#01 pc 00091afd /apex/com.android.runtime/lib/bionic/libc.so (__vfprintf+3364) (BuildId: fef5b751123147ea65bf3f4f798c9518)
|
||||
#02 pc 000a68e5 /apex/com.android.runtime/lib/bionic/libc.so (vsnprintf+152) (BuildId: fef5b751123147ea65bf3f4f798c9518)
|
||||
#03 pc 000051cf /system/lib/liblog.so (__android_log_vprint+74) (BuildId: 3fcead474cd0ecbdafb529ff176b0d13)
|
||||
#04 pc 000012e8 /vendor/bin/tzts_daemon
|
||||
|
||||
memory near r0:
|
||||
e8b4d12c -------- -------- -------- -------- ................
|
||||
e8b4d13c -------- -------- -------- -------- ................
|
||||
e8b4d14c -------- -------- -------- -------- ................
|
||||
e8b4d15c -------- -------- -------- -------- ................
|
||||
e8b4d16c -------- -------- -------- -------- ................
|
||||
e8b4d17c -------- -------- -------- -------- ................
|
||||
e8b4d18c -------- -------- -------- -------- ................
|
||||
e8b4d19c -------- -------- -------- -------- ................
|
||||
e8b4d1ac -------- -------- -------- -------- ................
|
||||
e8b4d1bc -------- -------- -------- -------- ................
|
||||
e8b4d1cc -------- -------- -------- -------- ................
|
||||
e8b4d1dc -------- -------- -------- -------- ................
|
||||
e8b4d1ec -------- -------- -------- -------- ................
|
||||
e8b4d1fc -------- -------- -------- -------- ................
|
||||
e8b4d20c -------- -------- -------- -------- ................
|
||||
e8b4d21c -------- -------- -------- -------- ................
|
||||
|
||||
@@ -246,6 +246,23 @@ Packages:
|
||||
com.instagram.direct.share.handler.DirectMultipleExternalMediaShareActivity
|
||||
com.instagram.share.handleractivity.ClipsShareHandlerActivity
|
||||
com.instagram.direct.share.handler.DirectMultipleExternalMediaShareActivityInterop
|
||||
|
||||
--------- 0.053s was the duration of dumpsys appops, ending at: 2022-03-29 23:14:27
|
||||
-------------------------------------------------------------------------------
|
||||
DUMP OF SERVICE platform_compat:
|
||||
ChangeId(180326845; name=OVERRIDE_MIN_ASPECT_RATIO_MEDIUM; disabled; overridable)
|
||||
ChangeId(189969744; name=DOWNSCALE_65; disabled; overridable)
|
||||
ChangeId(183372781; name=ENABLE_RAW_SYSTEM_GALLERY_ACCESS; enableSinceTargetSdk=30)
|
||||
ChangeId(150939131; name=ADD_CONTENT_OBSERVER_FLAGS; enableSinceTargetSdk=30)
|
||||
ChangeId(226439802; name=SCHEDULE_EXACT_ALARM_DENIED_BY_DEFAULT; disabled)
|
||||
ChangeId(270674727; name=ENABLE_STRICT_FORMATTER_VALIDATION; enableSinceTargetSdk=35)
|
||||
ChangeId(183155436; name=ALWAYS_USE_CONTEXT_USER; enableSinceTargetSdk=33)
|
||||
ChangeId(303742236; name=ROLE_MANAGER_USER_HANDLE_AWARE; enableSinceTargetSdk=35)
|
||||
ChangeId(203800354; name=MEDIA_CONTROL_SESSION_ACTIONS; enableSinceTargetSdk=33)
|
||||
ChangeId(144027538; name=BLOCK_GPS_STATUS_USAGE; enableSinceTargetSdk=31)
|
||||
ChangeId(189969749; name=DOWNSCALE_35; disabled; overridable)
|
||||
ChangeId(143539591; name=SELINUX_LATEST_CHANGES; disabled)
|
||||
ChangeId(247079863; name=DISALLOW_INVALID_GROUP_REFERENCE; enableSinceTargetSdk=34)
|
||||
ChangeId(174227820; name=FORCE_DISABLE_HEVC_SUPPORT; disabled)
|
||||
ChangeId(168419799; name=DOWNSCALED; disabled; packageOverrides={com.google.android.apps.tachyon=false, org.torproject.torbrowser=false}; rawOverrides={org.torproject.torbrowser=false, org.article19.circulo.next=false}; overridable)
|
||||
|
||||
|
||||
|
||||
16
tests/artifacts/android_data/dumpsys_adb_xml.txt
Normal file
@@ -0,0 +1,16 @@
|
||||
-------------------------------------------------------------------------------
|
||||
DUMP OF SERVICE adb:
|
||||
ADB MANAGER STATE (dumpsys adb):
|
||||
{
|
||||
debugging_manager={
|
||||
connected_to_adb=true
|
||||
user_keys=QAAAAAcgbytJst31DsaSP7hn8QcBXKR9NPVPK9MZssFVSNIP user@laptop
|
||||
|
||||
keystore=<?xml version='1.0' encoding='utf-8' standalone='yes' ?>
|
||||
<keyStore version="1">
|
||||
<adbKey key="QAAAAAcgbytJst31DsaSP7hn8QcBXKR9NPVPK9MZssFVSNIP user@laptop" lastConnection="1628501829898" />
|
||||
</keyStore>
|
||||
|
||||
}
|
||||
}
|
||||
--------- 0.012s was the duration of dumpsys adb, ending at: 2025-02-04 20:25:58
|
||||
16
tests/artifacts/android_data/dumpsys_platform_compat.txt
Normal file
@@ -0,0 +1,16 @@
|
||||
DUMP OF SERVICE platform_compat:
|
||||
ChangeId(180326845; name=OVERRIDE_MIN_ASPECT_RATIO_MEDIUM; disabled; overridable)
|
||||
ChangeId(189969744; name=DOWNSCALE_65; disabled; overridable)
|
||||
ChangeId(183372781; name=ENABLE_RAW_SYSTEM_GALLERY_ACCESS; enableSinceTargetSdk=30)
|
||||
ChangeId(150939131; name=ADD_CONTENT_OBSERVER_FLAGS; enableSinceTargetSdk=30)
|
||||
ChangeId(226439802; name=SCHEDULE_EXACT_ALARM_DENIED_BY_DEFAULT; disabled)
|
||||
ChangeId(270674727; name=ENABLE_STRICT_FORMATTER_VALIDATION; enableSinceTargetSdk=35)
|
||||
ChangeId(183155436; name=ALWAYS_USE_CONTEXT_USER; enableSinceTargetSdk=33)
|
||||
ChangeId(303742236; name=ROLE_MANAGER_USER_HANDLE_AWARE; enableSinceTargetSdk=35)
|
||||
ChangeId(203800354; name=MEDIA_CONTROL_SESSION_ACTIONS; enableSinceTargetSdk=33)
|
||||
ChangeId(144027538; name=BLOCK_GPS_STATUS_USAGE; enableSinceTargetSdk=31)
|
||||
ChangeId(189969749; name=DOWNSCALE_35; disabled; overridable)
|
||||
ChangeId(143539591; name=SELINUX_LATEST_CHANGES; disabled)
|
||||
ChangeId(247079863; name=DISALLOW_INVALID_GROUP_REFERENCE; enableSinceTargetSdk=34)
|
||||
ChangeId(174227820; name=FORCE_DISABLE_HEVC_SUPPORT; disabled)
|
||||
ChangeId(168419799; name=DOWNSCALED; disabled; packageOverrides={com.google.android.apps.tachyon=false, org.torproject.torbrowser=false}; rawOverrides={org.torproject.torbrowser=false, org.article19.circulo.next=false}; overridable)
|
||||
@@ -379,4 +379,22 @@ Daily stats:
|
||||
Update com.google.android.projection.gearhead vers=99632623
|
||||
Update com.google.android.projection.gearhead vers=99632623
|
||||
Update com.google.android.projection.gearhead vers=99632623
|
||||
--------- 0.053s was the duration of dumpsys batterystats, ending at: 2024-03-21 11:07:22
|
||||
-------------------------------------------------------------------------------
|
||||
DUMP OF SERVICE platform_compat:
|
||||
ChangeId(180326845; name=OVERRIDE_MIN_ASPECT_RATIO_MEDIUM; disabled; overridable)
|
||||
ChangeId(189969744; name=DOWNSCALE_65; disabled; overridable)
|
||||
ChangeId(183372781; name=ENABLE_RAW_SYSTEM_GALLERY_ACCESS; enableSinceTargetSdk=30)
|
||||
ChangeId(150939131; name=ADD_CONTENT_OBSERVER_FLAGS; enableSinceTargetSdk=30)
|
||||
ChangeId(226439802; name=SCHEDULE_EXACT_ALARM_DENIED_BY_DEFAULT; disabled)
|
||||
ChangeId(270674727; name=ENABLE_STRICT_FORMATTER_VALIDATION; enableSinceTargetSdk=35)
|
||||
ChangeId(183155436; name=ALWAYS_USE_CONTEXT_USER; enableSinceTargetSdk=33)
|
||||
ChangeId(303742236; name=ROLE_MANAGER_USER_HANDLE_AWARE; enableSinceTargetSdk=35)
|
||||
ChangeId(203800354; name=MEDIA_CONTROL_SESSION_ACTIONS; enableSinceTargetSdk=33)
|
||||
ChangeId(144027538; name=BLOCK_GPS_STATUS_USAGE; enableSinceTargetSdk=31)
|
||||
ChangeId(189969749; name=DOWNSCALE_35; disabled; overridable)
|
||||
ChangeId(143539591; name=SELINUX_LATEST_CHANGES; disabled)
|
||||
ChangeId(247079863; name=DISALLOW_INVALID_GROUP_REFERENCE; enableSinceTargetSdk=34)
|
||||
ChangeId(174227820; name=FORCE_DISABLE_HEVC_SUPPORT; disabled)
|
||||
ChangeId(168419799; name=DOWNSCALED; disabled; packageOverrides={com.google.android.apps.tachyon=false, org.torproject.torbrowser=false}; rawOverrides={org.torproject.torbrowser=false, org.article19.circulo.next=false}; overridable)
|
||||
|
||||
|
||||
@@ -6,6 +6,8 @@
|
||||
import logging
|
||||
import os
|
||||
|
||||
|
||||
from mvt.common.config import settings
|
||||
from mvt.common.indicators import Indicators
|
||||
from ..utils import get_artifact_folder
|
||||
|
||||
@@ -100,6 +102,8 @@ class TestIndicators:
|
||||
|
||||
def test_env_stix(self, indicator_file):
|
||||
os.environ["MVT_STIX2"] = indicator_file
|
||||
settings.__init__() # Reset settings
|
||||
|
||||
ind = Indicators(log=logging)
|
||||
ind.load_indicators_files([], load_default=False)
|
||||
assert ind.total_ioc_count == 9
|
||||
|
||||
@@ -75,7 +75,7 @@ class TestHashes:
|
||||
# This needs to be updated when we add or edit files in AndroidQF folder
|
||||
assert (
|
||||
hashes[1]["sha256"]
|
||||
== "1bd255f656a7f9d5647a730f0f0cc47053115576f11532d41bf28c16635b193d"
|
||||
== "9fb6396b64cfff30e2a459a64496d3c1386926d09edd68be2d878de45fa7b3a9"
|
||||
)
|
||||
|
||||
|
||||
|
||||
@@ -8,6 +8,7 @@ import os
|
||||
from click.testing import CliRunner
|
||||
|
||||
from mvt.android.cli import check_androidqf
|
||||
from mvt.common.config import settings
|
||||
|
||||
from .utils import get_artifact_folder
|
||||
|
||||
@@ -56,6 +57,8 @@ class TestCheckAndroidqfCommand:
|
||||
)
|
||||
|
||||
os.environ["MVT_ANDROID_BACKUP_PASSWORD"] = TEST_BACKUP_PASSWORD
|
||||
settings.__init__() # Reset settings
|
||||
|
||||
runner = CliRunner()
|
||||
path = os.path.join(get_artifact_folder(), "androidqf_encrypted")
|
||||
result = runner.invoke(check_androidqf, [path])
|
||||
@@ -63,3 +66,4 @@ class TestCheckAndroidqfCommand:
|
||||
assert prompt_mock.call_count == 0
|
||||
assert result.exit_code == 0
|
||||
del os.environ["MVT_ANDROID_BACKUP_PASSWORD"]
|
||||
settings.__init__() # Reset settings
|
||||
|
||||
@@ -9,6 +9,7 @@ import os
|
||||
from click.testing import CliRunner
|
||||
|
||||
from mvt.android.cli import check_backup
|
||||
from mvt.common.config import settings
|
||||
|
||||
from .utils import get_artifact_folder
|
||||
|
||||
@@ -63,6 +64,8 @@ class TestCheckAndroidBackupCommand:
|
||||
)
|
||||
|
||||
os.environ["MVT_ANDROID_BACKUP_PASSWORD"] = TEST_BACKUP_PASSWORD
|
||||
settings.__init__() # Reset settings
|
||||
|
||||
runner = CliRunner()
|
||||
path = os.path.join(get_artifact_folder(), "androidqf_encrypted/backup.ab")
|
||||
result = runner.invoke(check_backup, [path])
|
||||
@@ -70,3 +73,4 @@ class TestCheckAndroidBackupCommand:
|
||||
assert prompt_mock.call_count == 0
|
||||
assert result.exit_code == 0
|
||||
del os.environ["MVT_ANDROID_BACKUP_PASSWORD"]
|
||||
settings.__init__() # Reset settings
|
||||
|
||||