Compare commits
219 Commits

Author SHA1 Message Date
zhom e8c382400c chore: version bump 2025-11-30 17:36:31 +04:00
zhom c40f023d41 refactor: improved performance for old profiles 2025-11-30 17:34:04 +04:00
zhom e16512576c refactor: try getting high priority for local proxies 2025-11-30 17:04:07 +04:00
zhom f098128988 refactor: cleanup bandwidth tracking functionality 2025-11-30 16:55:23 +04:00
zhom cdba9aac33 feat: add network overview 2025-11-30 15:04:48 +04:00
zhom 01b3109dc1 feat: add mass-proxy-assign action 2025-11-30 11:03:19 +04:00
zhom 8aa3885240 style: move the body of the profile creation trigger a bit left 2025-11-30 10:46:03 +04:00
zhom 5947ec14e6 feat: add notes 2025-11-30 10:45:39 +04:00
zhom 2c7c07c414 test: fix flackyness 2025-11-30 09:51:28 +04:00
zhom 2e26b53db8 refactor: reuse copy-to-clipboard button 2025-11-30 00:31:42 +04:00
zhom 966a10c045 feat: allow user select system for which to generate fingerprint 2025-11-30 00:27:08 +04:00
zhom f72e3066f3 feat: add catppuccin themes 2025-11-30 00:14:23 +04:00
zhom cd8e1dcf18 refactor: allow the user to view the fingerprint while the profile is running 2025-11-29 23:50:54 +04:00
zhom dfc8cd4c9f chore: cleanup windows config 2025-11-29 23:40:33 +04:00
zhom 5a1726d119 build: disable msi bundler until ice30 issue is resolved 2025-11-29 23:38:54 +04:00
zhom 133ed98df1 chore: update windows config 2025-11-29 23:37:46 +04:00
zhom 4683410a2c feat: add openapi spec generation 2025-11-29 23:32:49 +04:00
zhom 44b5e71593 fix: remove parenthesis 2025-11-29 22:53:42 +04:00
zhom a02c16126b Merge pull request #145 from zhom/dependabot/cargo/src-tauri/rust-dependencies-2c3a25eb42
deps(rust)(deps): bump the rust-dependencies group in /src-tauri with 27 updates
2025-11-29 22:45:46 +04:00
zhom fd7edfc332 build: enable all build targets 2025-11-29 22:45:28 +04:00
zhom 1e48caf129 chore: linting 2025-11-29 22:44:55 +04:00
zhom e39047bdfd chore: prevent output truncation in release notes workflow 2025-11-29 22:42:15 +04:00
zhom f3fe0fa0e7 chore: remove unused permissions and variables 2025-11-29 22:41:32 +04:00
zhom a7f523ac4c chore: stop using automerge action 2025-11-29 22:24:07 +04:00
zhom 763d5a5a1b build: disable msi bundler until ice30 issue is resolved 2025-11-29 21:58:51 +04:00
dependabot[bot] 65d37d48e2 deps(rust)(deps): bump the rust-dependencies group
Bumps the rust-dependencies group in /src-tauri with 27 updates:

| Package | From | To |
| --- | --- | --- |
| [tauri](https://github.com/tauri-apps/tauri) | `2.9.2` | `2.9.3` |
| [zip](https://github.com/zip-rs/zip2) | `5.1.1` | `6.0.0` |
| [axum](https://github.com/tokio-rs/axum) | `0.8.6` | `0.8.7` |
| [tower-http](https://github.com/tower-rs/tower-http) | `0.6.6` | `0.6.7` |
| [hyper](https://github.com/hyperium/hyper) | `1.8.0` | `1.8.1` |
| [hyper-util](https://github.com/hyperium/hyper-util) | `0.1.17` | `0.1.18` |
| [tauri-build](https://github.com/tauri-apps/tauri) | `2.5.1` | `2.5.2` |
| [bytes](https://github.com/tokio-rs/bytes) | `1.10.1` | `1.11.0` |
| [cc](https://github.com/rust-lang/cc-rs) | `1.2.45` | `1.2.48` |
| [crc](https://github.com/mrhooray/crc-rs) | `3.3.0` | `3.4.0` |
| [crypto-common](https://github.com/RustCrypto/traits) | `0.1.6` | `0.1.7` |
| [endi](https://github.com/zeenix/endi) | `1.1.0` | `1.1.1` |
| [find-msvc-tools](https://github.com/rust-lang/cc-rs) | `0.1.4` | `0.1.5` |
| [generic-array](https://github.com/fizyk20/generic-array) | `0.14.9` | `0.14.7` |
| [http](https://github.com/hyperium/http) | `1.3.1` | `1.4.0` |
| [open](https://github.com/Byron/open-rs) | `5.3.2` | `5.3.3` |
| [rustls-pki-types](https://github.com/rustls/pki-types) | `1.13.0` | `1.13.1` |
| [serde_with](https://github.com/jonasbb/serde_with) | `3.15.1` | `3.16.1` |
| [serde_with_macros](https://github.com/jonasbb/serde_with) | `3.15.1` | `3.16.1` |
| [signal-hook-registry](https://github.com/vorner/signal-hook) | `1.4.6` | `1.4.7` |
| [tauri-codegen](https://github.com/tauri-apps/tauri) | `2.5.0` | `2.5.1` |
| [tauri-macros](https://github.com/tauri-apps/tauri) | `2.5.0` | `2.5.1` |
| [tracing](https://github.com/tokio-rs/tracing) | `0.1.41` | `0.1.43` |
| [tracing-attributes](https://github.com/tokio-rs/tracing) | `0.1.30` | `0.1.31` |
| [tracing-core](https://github.com/tokio-rs/tracing) | `0.1.34` | `0.1.35` |
| [zerocopy](https://github.com/google/zerocopy) | `0.8.27` | `0.8.30` |
| [zerocopy-derive](https://github.com/google/zerocopy) | `0.8.27` | `0.8.30` |


Updates `tauri` from 2.9.2 to 2.9.3
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.9.2...tauri-v2.9.3)

Updates `zip` from 5.1.1 to 6.0.0
- [Release notes](https://github.com/zip-rs/zip2/releases)
- [Changelog](https://github.com/zip-rs/zip2/blob/master/CHANGELOG.md)
- [Commits](https://github.com/zip-rs/zip2/compare/v5.1.1...v6.0.0)

Updates `axum` from 0.8.6 to 0.8.7
- [Release notes](https://github.com/tokio-rs/axum/releases)
- [Changelog](https://github.com/tokio-rs/axum/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tokio-rs/axum/compare/axum-v0.8.6...axum-v0.8.7)

Updates `tower-http` from 0.6.6 to 0.6.7
- [Release notes](https://github.com/tower-rs/tower-http/releases)
- [Commits](https://github.com/tower-rs/tower-http/compare/tower-http-0.6.6...tower-http-0.6.7)

Updates `hyper` from 1.8.0 to 1.8.1
- [Release notes](https://github.com/hyperium/hyper/releases)
- [Changelog](https://github.com/hyperium/hyper/blob/master/CHANGELOG.md)
- [Commits](https://github.com/hyperium/hyper/compare/v1.8.0...v1.8.1)

Updates `hyper-util` from 0.1.17 to 0.1.18
- [Release notes](https://github.com/hyperium/hyper-util/releases)
- [Changelog](https://github.com/hyperium/hyper-util/blob/master/CHANGELOG.md)
- [Commits](https://github.com/hyperium/hyper-util/compare/v0.1.17...v0.1.18)

Updates `tauri-build` from 2.5.1 to 2.5.2
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-build-v2.5.1...tauri-build-v2.5.2)

Updates `bytes` from 1.10.1 to 1.11.0
- [Release notes](https://github.com/tokio-rs/bytes/releases)
- [Changelog](https://github.com/tokio-rs/bytes/blob/master/CHANGELOG.md)
- [Commits](https://github.com/tokio-rs/bytes/compare/v1.10.1...v1.11.0)

Updates `cc` from 1.2.45 to 1.2.48
- [Release notes](https://github.com/rust-lang/cc-rs/releases)
- [Changelog](https://github.com/rust-lang/cc-rs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/cc-rs/compare/cc-v1.2.45...cc-v1.2.48)

Updates `crc` from 3.3.0 to 3.4.0
- [Commits](https://github.com/mrhooray/crc-rs/compare/3.3.0...3.4.0)

Updates `crypto-common` from 0.1.6 to 0.1.7
- [Commits](https://github.com/RustCrypto/traits/compare/crypto-common-v0.1.6...crypto-common-v0.1.7)

Updates `endi` from 1.1.0 to 1.1.1
- [Release notes](https://github.com/zeenix/endi/releases)
- [Commits](https://github.com/zeenix/endi/compare/1.1.0...1.1.1)

Updates `find-msvc-tools` from 0.1.4 to 0.1.5
- [Release notes](https://github.com/rust-lang/cc-rs/releases)
- [Changelog](https://github.com/rust-lang/cc-rs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/cc-rs/compare/find-msvc-tools-v0.1.4...find-msvc-tools-v0.1.5)

Updates `generic-array` from 0.14.9 to 0.14.7
- [Release notes](https://github.com/fizyk20/generic-array/releases)
- [Changelog](https://github.com/fizyk20/generic-array/blob/master/CHANGELOG.md)
- [Commits](https://github.com/fizyk20/generic-array/commits)

Updates `http` from 1.3.1 to 1.4.0
- [Release notes](https://github.com/hyperium/http/releases)
- [Changelog](https://github.com/hyperium/http/blob/master/CHANGELOG.md)
- [Commits](https://github.com/hyperium/http/compare/v1.3.1...v1.4.0)

Updates `open` from 5.3.2 to 5.3.3
- [Release notes](https://github.com/Byron/open-rs/releases)
- [Changelog](https://github.com/Byron/open-rs/blob/main/changelog.md)
- [Commits](https://github.com/Byron/open-rs/compare/v5.3.2...v5.3.3)

Updates `rustls-pki-types` from 1.13.0 to 1.13.1
- [Release notes](https://github.com/rustls/pki-types/releases)
- [Commits](https://github.com/rustls/pki-types/compare/v/1.13.0...v/1.13.1)

Updates `serde_with` from 3.15.1 to 3.16.1
- [Release notes](https://github.com/jonasbb/serde_with/releases)
- [Commits](https://github.com/jonasbb/serde_with/compare/v3.15.1...v3.16.1)

Updates `serde_with_macros` from 3.15.1 to 3.16.1
- [Release notes](https://github.com/jonasbb/serde_with/releases)
- [Commits](https://github.com/jonasbb/serde_with/compare/v3.15.1...v3.16.1)

Updates `signal-hook-registry` from 1.4.6 to 1.4.7
- [Changelog](https://github.com/vorner/signal-hook/blob/master/CHANGELOG.md)
- [Commits](https://github.com/vorner/signal-hook/compare/registry-v1.4.6...registry-v1.4.7)

Updates `tauri-codegen` from 2.5.0 to 2.5.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-codegen-v2.5.0...tauri-codegen-v2.5.1)

Updates `tauri-macros` from 2.5.0 to 2.5.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-macros-v2.5.0...tauri-macros-v2.5.1)

Updates `tracing` from 0.1.41 to 0.1.43
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-0.1.41...tracing-0.1.43)

Updates `tracing-attributes` from 0.1.30 to 0.1.31
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-attributes-0.1.30...tracing-attributes-0.1.31)

Updates `tracing-core` from 0.1.34 to 0.1.35
- [Release notes](https://github.com/tokio-rs/tracing/releases)
- [Commits](https://github.com/tokio-rs/tracing/compare/tracing-core-0.1.34...tracing-core-0.1.35)

Updates `zerocopy` from 0.8.27 to 0.8.30
- [Release notes](https://github.com/google/zerocopy/releases)
- [Changelog](https://github.com/google/zerocopy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/google/zerocopy/compare/v0.8.27...v0.8.30)

Updates `zerocopy-derive` from 0.8.27 to 0.8.30
- [Release notes](https://github.com/google/zerocopy/releases)
- [Changelog](https://github.com/google/zerocopy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/google/zerocopy/compare/v0.8.27...v0.8.30)

---
updated-dependencies:
- dependency-name: tauri
  dependency-version: 2.9.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: zip
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: rust-dependencies
- dependency-name: axum
  dependency-version: 0.8.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tower-http
  dependency-version: 0.6.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: hyper
  dependency-version: 1.8.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: hyper-util
  dependency-version: 0.1.18
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-build
  dependency-version: 2.5.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: bytes
  dependency-version: 1.11.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: cc
  dependency-version: 1.2.48
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: crc
  dependency-version: 3.4.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: crypto-common
  dependency-version: 0.1.7
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: endi
  dependency-version: 1.1.1
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: find-msvc-tools
  dependency-version: 0.1.5
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: generic-array
  dependency-version: 0.14.7
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: http
  dependency-version: 1.4.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: open
  dependency-version: 5.3.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: rustls-pki-types
  dependency-version: 1.13.1
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: serde_with
  dependency-version: 3.16.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: serde_with_macros
  dependency-version: 3.16.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: signal-hook-registry
  dependency-version: 1.4.7
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-codegen
  dependency-version: 2.5.1
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-macros
  dependency-version: 2.5.1
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tracing
  dependency-version: 0.1.43
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tracing-attributes
  dependency-version: 0.1.31
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tracing-core
  dependency-version: 0.1.35
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: zerocopy
  dependency-version: 0.8.30
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: zerocopy-derive
  dependency-version: 0.8.30
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-29 17:46:07 +00:00
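
The `updated-dependencies` trailer in the dependabot commit above is machine-readable YAML: one entry per package, each tagged with an `update-type` of `version-update:semver-patch`, `-minor`, or `-major`. As an illustration only (not part of the original log — the function name and sample are hypothetical), a minimal line-based tally of those update types could look like:

```python
from collections import Counter

def tally_update_types(metadata: str) -> Counter:
    """Count entries in an `updated-dependencies` trailer by semver level,
    using a simple line scan rather than a full YAML parser."""
    counts = Counter()
    for line in metadata.splitlines():
        line = line.strip()
        if line.startswith("update-type:"):
            # e.g. "update-type: version-update:semver-patch" -> "patch"
            counts[line.rsplit("semver-", 1)[-1]] += 1
    return counts

# Two entries excerpted from the trailer above:
sample = """\
- dependency-name: tauri
  dependency-version: 2.9.3
  update-type: version-update:semver-patch
- dependency-name: zip
  dependency-version: 6.0.0
  update-type: version-update:semver-major
"""
counts = tally_update_types(sample)
```

For the full 27-entry trailer this kind of tally makes the risk profile of the batch obvious at a glance (here, `zip` 5.1.1 → 6.0.0 is the lone semver-major jump).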
zhom eab5def6b1 refactor: explicitly name bin as donutbrowser 2025-11-29 21:06:52 +04:00
zhom 8da0dae545 refactor: rename lib to donutbrowser_lib 2025-11-29 21:00:22 +04:00
zhom 371abf33c1 build: fix wix issue on Windows 2025-11-29 20:10:30 +04:00
zhom 9ddc63931f refactor: add /open-url, /kill, and fix the /run endpoint to properly handle initial URL 2025-11-29 18:19:48 +04:00
zhom d2f4988635 build: execute next build directly 2025-11-29 17:40:48 +04:00
zhom 68228dcf3c build: verify frontend dir before donut-proxy build step 2025-11-29 17:04:05 +04:00
zhom 0805c37d33 refactor: move actions for selected items into its own component 2025-11-29 16:33:58 +04:00
zhom 61dcbbc715 build: don't fail if dist is already created 2025-11-29 16:23:36 +04:00
zhom 287b5a2190 test: cleanup stale proxies from previous test runs and better error handling 2025-11-29 15:44:12 +04:00
zhom 23f5921eb6 feat: add randomized fingerprint option 2025-11-29 15:38:51 +04:00
zhom 035d36e387 test: run donut-proxy tests in sequence 2025-11-29 15:23:21 +04:00
zhom 131ef92370 build: use next build directly 2025-11-29 15:03:19 +04:00
zhom c8b259e6ae build: build frontend before donut-proxy 2025-11-29 14:55:34 +04:00
zhom 98b4e9d145 chore: suppress cfg warning 2025-11-29 14:37:17 +04:00
zhom 8af318b5ce build: build frontend for rust linting 2025-11-29 14:36:33 +04:00
zhom 13c6946798 build: create dist folder for rust linting 2025-11-29 14:24:48 +04:00
zhom aed24c4df6 refactor: prompt manual update on linux 2025-11-29 14:21:18 +04:00
zhom a48d03a1e4 chore: linting 2025-11-29 14:12:46 +04:00
zhom 5d7ed0430e build: don't check for sidecars during donut-proxy build 2025-11-29 14:02:31 +04:00
zhom 84af35c4f5 build: copy donut-proxy binary in linting 2025-11-29 13:52:49 +04:00
zhom cff69fbd11 chore: fix syntax in issue validation workflow 2025-11-29 13:49:48 +04:00
zhom 2f639652c9 chore: fix syntax in issue validation workflow 2025-11-29 13:46:51 +04:00
zhom 30a787e50d refactor: improve process termination logic 2025-11-29 13:44:30 +04:00
zhom 71e3f4a078 chore: codegen 2025-11-29 13:31:56 +04:00
zhom 6e3dc6b657 build: copy donut-proxy into the binaries folder 2025-11-29 13:27:15 +04:00
zhom 8f0bb4a335 chore: fix syntax in issue validation workflow 2025-11-29 13:24:53 +04:00
zhom 9b770dc2e3 chore: codegen 2025-11-29 12:29:15 +04:00
zhom 955cf887a0 build: improve speed 2025-11-29 12:29:07 +04:00
zhom 0acfa66e16 refactor: improve chromium process killing mechanism 2025-11-26 20:52:15 +04:00
zhom 26099b3f7f chore: remove useless logging 2025-11-26 20:51:56 +04:00
zhom 0b63ad6556 refactor: add proper logging 2025-11-26 20:21:17 +04:00
zhom bab9301c31 style: make ui for proxies and groups similar 2025-11-26 16:17:30 +04:00
zhom a4cb3c6b1d style: add tooltip to more actions button 2025-11-26 13:57:59 +04:00
zhom e22838ca55 chore: linting 2025-11-26 13:46:54 +04:00
zhom 18bfb1ed5b fix: always show scroll indicator 2025-11-26 01:42:52 +04:00
zhom c7c910d1ca style: tight table design 2025-11-26 01:42:33 +04:00
zhom 35ba7e2d96 style: better margins 2025-11-26 00:55:49 +04:00
zhom 689eeafc75 style: prevent text from being selected on drag-to-scroll 2025-11-26 00:43:46 +04:00
zhom 40886f2ded refactor: disable deep links on linux 2025-11-26 00:21:33 +04:00
zhom e02f588a90 style: add drag-to-scroll 2025-11-26 00:00:50 +04:00
zhom 3690ceb734 style: improve group functionality 2025-11-25 23:33:28 +04:00
zhom d90a333eb0 style: simplify profile import dialog 2025-11-25 20:53:57 +04:00
zhom 93b85e760e style: make window smaller 2025-11-25 20:53:00 +04:00
zhom 64328e91a2 refactor: migrate proxy functionality from nodecar to rust sidecar 2025-11-25 20:43:12 +04:00
zhom 8a1943f84e feat: add proxy check button 2025-11-25 14:37:17 +04:00
zhom cc22384c54 chore: update contributors workflow 2025-11-25 13:26:20 +04:00
zhom a720f914b0 chore: improve greetings workflow for the first-time contributors 2025-11-22 15:53:53 +04:00
zhom b899af0983 chore: use personal token for automerge 2025-11-22 15:53:53 +04:00
zhom 43277a9579 chore: linting 2025-11-22 15:53:53 +04:00
zhom f4a36996db refactor: fix deprecation warning 2025-11-22 15:53:53 +04:00
zhom 15e8a1029a chore: rename "Donut Browser" to "Donut" 2025-11-22 15:53:53 +04:00
zhom 43b9f405ca chore: conditionally generate suggestions 2025-11-22 15:53:53 +04:00
zhom f9a527637f chore: update dependencies 2025-11-22 15:53:53 +04:00
zhom be0d3053e7 Merge pull request #139 from zhom/dependabot/github_actions/github-actions-3ad8007637
ci(deps): bump the github-actions group with 4 updates
2025-11-22 15:52:38 +04:00
dependabot[bot] 070e40ffe0 ci(deps): bump the github-actions group with 4 updates
Bumps the github-actions group with 4 updates: [actions/checkout](https://github.com/actions/checkout), [google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml](https://github.com/google/osv-scanner-action), [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action) and [google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml](https://github.com/google/osv-scanner-action).


Updates `actions/checkout` from 5.0.0 to 6.0.0
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/08c6903cd8c0fde910a37f88322edcfb5dd907a8...1af3b93b6815bc44a9784bd300feb67ff0d1eeb3)

Updates `google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml` from 2.2.4 to 2.3.0
- [Release notes](https://github.com/google/osv-scanner-action/releases)
- [Commits](https://github.com/google/osv-scanner-action/compare/9bb69575e74019c2ad085a1860787043adf47ccb...b77c075a1235514558f0eb88dbd31e22c45e0cd2)

Updates `ridedott/merge-me-action` from 2.10.134 to 2.10.138
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/a8b93e4510b1cb03192d058ddef97e6b1de25522...18dd4f01d259faf0a2d900a56cd6b7e765009209)

Updates `google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml` from 2.2.4 to 2.3.0
- [Release notes](https://github.com/google/osv-scanner-action/releases)
- [Commits](https://github.com/google/osv-scanner-action/compare/9bb69575e74019c2ad085a1860787043adf47ccb...b77c075a1235514558f0eb88dbd31e22c45e0cd2)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml
  dependency-version: 2.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.138
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml
  dependency-version: 2.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-22 09:05:45 +00:00
zhom 416bec77bc Merge pull request #135 from zhom/dependabot/github_actions/github-actions-8a794122f6
ci(deps): bump the github-actions group with 3 updates
2025-11-15 14:47:28 +04:00
dependabot[bot] d3a6c568dc ci(deps): bump the github-actions group with 3 updates
Bumps the github-actions group with 3 updates: [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action), [tauri-apps/tauri-action](https://github.com/tauri-apps/tauri-action) and [crate-ci/typos](https://github.com/crate-ci/typos).


Updates `ridedott/merge-me-action` from 2.10.133 to 2.10.134
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/a2e29d4313d8ee783692b40abfce8f2ad60d3f0c...a8b93e4510b1cb03192d058ddef97e6b1de25522)

Updates `tauri-apps/tauri-action` from 0.5.24 to 0.6.0
- [Release notes](https://github.com/tauri-apps/tauri-action/releases)
- [Changelog](https://github.com/tauri-apps/tauri-action/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/tauri-action/compare/3b50ac4d4512105f96edbaa78a6e2f9392805589...19b93bb55601e3e373a93cfb6eb4242e45f5af20)

Updates `crate-ci/typos` from 1.39.0 to 1.39.2
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/07d900b8fa1097806b8adb6391b0d3e0ac2fdea7...626c4bedb751ce0b7f03262ca97ddda9a076ae1c)

---
updated-dependencies:
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.134
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: tauri-apps/tauri-action
  dependency-version: 0.6.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.39.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-15 09:06:25 +00:00
zhom 0659d11ee7 Merge pull request #123 from zhom/dependabot/github_actions/github-actions-42af67f577
ci(deps): bump the github-actions group with 3 updates
2025-11-01 13:54:08 +00:00
dependabot[bot] 3175ecccf0 ci(deps): bump the github-actions group with 3 updates
Bumps the github-actions group with 3 updates: [google/osv-scanner-action](https://github.com/google/osv-scanner-action), [tauri-apps/tauri-action](https://github.com/tauri-apps/tauri-action) and [crate-ci/typos](https://github.com/crate-ci/typos).


Updates `google/osv-scanner-action` from 2.2.3 to 2.2.4
- [Release notes](https://github.com/google/osv-scanner-action/releases)
- [Commits](https://github.com/google/osv-scanner-action/compare/e92b5d07338d4f0ba0981dffed17c48976ca4730...9bb69575e74019c2ad085a1860787043adf47ccb)

Updates `tauri-apps/tauri-action` from 0.5.23 to 0.5.24
- [Release notes](https://github.com/tauri-apps/tauri-action/releases)
- [Changelog](https://github.com/tauri-apps/tauri-action/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/tauri-action/compare/e834788a94591d81e3ae0bd9ec06366f5afb8994...3b50ac4d4512105f96edbaa78a6e2f9392805589)

Updates `crate-ci/typos` from 1.38.1 to 1.39.0
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/80c8a4945eec0f6d464eaf9e65ed98ef085283d1...07d900b8fa1097806b8adb6391b0d3e0ac2fdea7)

---
updated-dependencies:
- dependency-name: google/osv-scanner-action
  dependency-version: 2.2.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: tauri-apps/tauri-action
  dependency-version: 0.5.24
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.39.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-01 09:05:50 +00:00
zhom 7b641e9b41 Merge pull request #116 from zhom/dependabot/github_actions/github-actions-ac4cdb36ca
ci(deps): bump the github-actions group with 2 updates
2025-10-19 08:58:57 +00:00
dependabot[bot] f438621bc8 ci(deps): bump the github-actions group with 2 updates
Bumps the github-actions group with 2 updates: [actions/setup-node](https://github.com/actions/setup-node) and [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action).


Updates `actions/setup-node` from 5.0.0 to 6.0.0
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](https://github.com/actions/setup-node/compare/a0853c24544627f65ddf259abe73b1d18a591444...2028fbc5c25fe9cf00d9f06a71cc4710d4507903)

Updates `ridedott/merge-me-action` from 2.10.131 to 2.10.133
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/a3b9ffd551d69f9f4375a87e9fa56235a0749518...a2e29d4313d8ee783692b40abfce8f2ad60d3f0c)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.133
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-18 09:05:40 +00:00
zhom 4fc2cb7730 Merge pull request #110 from zhom/dependabot/github_actions/github-actions-36c42c0093
ci(deps): bump the github-actions group with 2 updates
2025-10-13 06:30:41 +00:00
dependabot[bot] c41a5d84b2 ci(deps): bump the github-actions group with 2 updates
Bumps the github-actions group with 2 updates: [pnpm/action-setup](https://github.com/pnpm/action-setup) and [crate-ci/typos](https://github.com/crate-ci/typos).


Updates `pnpm/action-setup` from 4.1.0 to 4.2.0
- [Release notes](https://github.com/pnpm/action-setup/releases)
- [Commits](https://github.com/pnpm/action-setup/compare/a7487c7e89a18df4991f7f222e4898a00d66ddda...41ff72655975bd51cab0327fa583b6e92b6d3061)

Updates `crate-ci/typos` from 1.37.2 to 1.38.1
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/7436548694def3314aacd93ed06c721b1f91ea04...80c8a4945eec0f6d464eaf9e65ed98ef085283d1)

---
updated-dependencies:
- dependency-name: pnpm/action-setup
  dependency-version: 4.2.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.38.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-11 09:06:58 +00:00
zhom fda2887aef Merge pull request #106 from zhom/dependabot/github_actions/github-actions-61623bb75b
ci(deps): bump the github-actions group with 5 updates
2025-10-04 09:30:48 +00:00
dependabot[bot] f58b790293 ci(deps): bump the github-actions group with 5 updates
Bumps the github-actions group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [google/osv-scanner-action](https://github.com/google/osv-scanner-action) | `2.2.2` | `2.2.3` |
| [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action) | `2.10.130` | `2.10.131` |
| [actions/first-interaction](https://github.com/actions/first-interaction) | `3.0.0` | `3.1.0` |
| [crate-ci/typos](https://github.com/crate-ci/typos) | `1.36.3` | `1.37.2` |
| [actions/stale](https://github.com/actions/stale) | `10.0.0` | `10.1.0` |


Updates `google/osv-scanner-action` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/google/osv-scanner-action/releases)
- [Commits](https://github.com/google/osv-scanner-action/compare/90b209d0ea55cea1da9fc0c4e65782cc6acb6e2e...e92b5d07338d4f0ba0981dffed17c48976ca4730)

Updates `ridedott/merge-me-action` from 2.10.130 to 2.10.131
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/a310eac074af628e0fd6c6d78858bba5bcf01179...a3b9ffd551d69f9f4375a87e9fa56235a0749518)

Updates `actions/first-interaction` from 3.0.0 to 3.1.0
- [Release notes](https://github.com/actions/first-interaction/releases)
- [Commits](https://github.com/actions/first-interaction/compare/753c925c8d1ac6fede23781875376600628d9b5d...1c4688942c71f71d4f5502a26ea67c331730fa4d)

Updates `crate-ci/typos` from 1.36.3 to 1.37.2
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/0c17dabcee8b8f1957fa917d17393a23e02e1583...7436548694def3314aacd93ed06c721b1f91ea04)

Updates `actions/stale` from 10.0.0 to 10.1.0
- [Release notes](https://github.com/actions/stale/releases)
- [Changelog](https://github.com/actions/stale/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/stale/compare/3a9db7e6a41a89f618792c92c0e97cc736e1b13f...5f858e3efba33a5ca4407a664cc011ad407f2008)

---
updated-dependencies:
- dependency-name: google/osv-scanner-action
  dependency-version: 2.2.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.131
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: actions/first-interaction
  dependency-version: 3.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.37.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: actions/stale
  dependency-version: 10.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-04 09:05:48 +00:00
zhom 518a02f782 chore: version bump 2025-10-02 20:35:20 +04:00
zhom 0999a265dc chore: hide allow addon new tab 2025-10-02 20:33:09 +04:00
zhom 984f529505 chore: remove unused dependency 2025-10-02 19:50:22 +04:00
zhom 3b030df37f build: install dependencies in correct order 2025-10-02 19:49:10 +04:00
zhom 03b8cae825 feat: allow user configuring allowAddonNewTab 2025-10-02 19:43:18 +04:00
zhom 00e486cc85 build: set pnpm version only in package.json 2025-09-30 10:34:35 +04:00
zhom 640185ff2e build: set pnpm to 10 and install dependencies manually 2025-09-30 10:24:52 +04:00
zhom 22fa2cfef0 fix: don't create 2 .desktop files 2025-09-30 10:13:23 +04:00
zhom a1db587314 chore: pnpm update 2025-09-30 09:34:10 +04:00
zhom 8862630a09 chore: cleanup 2025-09-30 09:34:10 +04:00
zhom 5956daeb9a Merge pull request #99 from zhom/dependabot/github_actions/github-actions-7ebf98940a
ci(deps): bump crate-ci/typos from 1.36.2 to 1.36.3 in the github-actions group
2025-09-27 15:20:42 +04:00
dependabot[bot] dfde9df72e ci(deps): bump crate-ci/typos in the github-actions group
Bumps the github-actions group with 1 update: [crate-ci/typos](https://github.com/crate-ci/typos).

Updates `crate-ci/typos` from 1.36.2 to 1.36.3
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/85f62a8a84f939ae994ab3763f01a0296d61a7ee...0c17dabcee8b8f1957fa917d17393a23e02e1583)

---
updated-dependencies:
- dependency-name: crate-ci/typos
  dependency-version: 1.36.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-27 09:04:52 +00:00
zhom 3cbbd75618 Merge pull request #92 from zhom/dependabot/github_actions/github-actions-8939090574
ci(deps): bump the github-actions group with 2 updates
2025-09-20 13:25:16 +04:00
dependabot[bot] 8a32d73a25 ci(deps): bump the github-actions group with 2 updates
Bumps the github-actions group with 2 updates: [swatinem/rust-cache](https://github.com/swatinem/rust-cache) and [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action).

Updates `swatinem/rust-cache` from 2.8.0 to 2.8.1
- [Release notes](https://github.com/swatinem/rust-cache/releases)
- [Changelog](https://github.com/Swatinem/rust-cache/blob/master/CHANGELOG.md)
- [Commits](https://github.com/swatinem/rust-cache/compare/98c8021b550208e191a6a3145459bfc9fb29c4c0...f13886b937689c021905a6b90929199931d60db1)

Updates `ridedott/merge-me-action` from 2.10.129 to 2.10.130
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/884aad0742ac6ee2eb4ff7c4496786d73df4ff69...a310eac074af628e0fd6c6d78858bba5bcf01179)

---
updated-dependencies:
- dependency-name: swatinem/rust-cache
  dependency-version: 2.8.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.130
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-20 09:05:28 +00:00
zhom 2007080d4b Merge pull request #87 from zhom/dependabot/github_actions/github-actions-3a7b1f4069
ci(deps): bump ridedott/merge-me-action from 2.10.128 to 2.10.129 in the github-actions group
2025-09-14 23:40:26 +04:00
dependabot[bot] feb604ffaa ci(deps): bump ridedott/merge-me-action in the github-actions group
Bumps the github-actions group with 1 update: [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action).

Updates `ridedott/merge-me-action` from 2.10.128 to 2.10.129
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/60142b76c22362f5845c877672fd2822b4d07c13...884aad0742ac6ee2eb4ff7c4496786d73df4ff69)

---
updated-dependencies:
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.129
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-13 09:04:59 +00:00
zhom 14659180d7 Merge pull request #83 from zhom/dependabot/github_actions/github-actions-2ae6d0a682
ci(deps): bump the github-actions group with 4 updates
2025-09-06 20:36:18 +04:00
zhom 82ebd7dc18 Merge pull request #84 from zhom/dependabot/npm_and_yarn/frontend-dependencies-fa3fda6213
deps(deps): bump the frontend-dependencies group with 76 updates
2025-09-06 20:36:08 +04:00
zhom 1c995e676c Merge pull request #85 from zhom/dependabot/cargo/src-tauri/rust-dependencies-ebac297506
deps(rust)(deps): bump the rust-dependencies group in /src-tauri with 19 updates
2025-09-06 20:35:58 +04:00
dependabot[bot] e5fd63d03d deps(rust)(deps): bump the rust-dependencies group
Bumps the rust-dependencies group in /src-tauri with 19 updates:

| Package | From | To |
| --- | --- | --- |
| [tauri](https://github.com/tauri-apps/tauri) | `2.8.4` | `2.8.5` |
| [tauri-plugin-deep-link](https://github.com/tauri-apps/plugins-workspace) | `2.4.2` | `2.4.3` |
| [tauri-plugin-dialog](https://github.com/tauri-apps/plugins-workspace) | `2.3.3` | `2.4.0` |
| [zip](https://github.com/zip-rs/zip2) | `4.5.0` | `5.0.0` |
| [uuid](https://github.com/uuid-rs/uuid) | `1.18.0` | `1.18.1` |
| [windows](https://github.com/microsoft/windows-rs) | `0.61.3` | `0.62.0` |
| [tauri-build](https://github.com/tauri-apps/tauri) | `2.4.0` | `2.4.1` |
| [cc](https://github.com/rust-lang/cc-rs) | `1.2.34` | `1.2.36` |
| [deadpool](https://github.com/bikeshedder/deadpool) | `0.12.2` | `0.12.3` |
| [libz-rs-sys](https://github.com/trifectatechfoundation/zlib-rs) | `0.5.1` | `0.5.2` |
| [log](https://github.com/rust-lang/log) | `0.4.27` | `0.4.28` |
| [rust-ini](https://github.com/zonyitoo/rust-ini) | `0.21.1` | `0.21.3` |
| [tao](https://github.com/tauri-apps/tao) | `0.34.2` | `0.34.3` |
| [time](https://github.com/time-rs/time) | `0.3.41` | `0.3.43` |
| [time-core](https://github.com/time-rs/time) | `0.1.4` | `0.1.6` |
| [time-macros](https://github.com/time-rs/time) | `0.2.22` | `0.2.24` |
| [windows-version](https://github.com/microsoft/windows-rs) | `0.1.4` | `0.1.5` |
| [zlib-rs](https://github.com/trifectatechfoundation/zlib-rs) | `0.5.1` | `0.5.2` |
| [zstd-sys](https://github.com/gyscos/zstd-rs) | `2.0.15+zstd.1.5.7` | `2.0.16+zstd.1.5.7` |

Updates `tauri` from 2.8.4 to 2.8.5
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.4...tauri-v2.8.5)

Updates `tauri-plugin-deep-link` from 2.4.2 to 2.4.3
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/fs-v2.4.2...http-v2.4.3)

Updates `tauri-plugin-dialog` from 2.3.3 to 2.4.0
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/dialog-v2.3.3...fs-v2.4.0)

Updates `zip` from 4.5.0 to 5.0.0
- [Release notes](https://github.com/zip-rs/zip2/releases)
- [Changelog](https://github.com/zip-rs/zip2/blob/master/CHANGELOG.md)
- [Commits](https://github.com/zip-rs/zip2/compare/v4.5.0...v5.0.0)

Updates `uuid` from 1.18.0 to 1.18.1
- [Release notes](https://github.com/uuid-rs/uuid/releases)
- [Commits](https://github.com/uuid-rs/uuid/compare/v1.18.0...v1.18.1)

Updates `windows` from 0.61.3 to 0.62.0
- [Release notes](https://github.com/microsoft/windows-rs/releases)
- [Commits](https://github.com/microsoft/windows-rs/commits/0.62.0)

Updates `tauri-build` from 2.4.0 to 2.4.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-build-v2.4.0...tauri-build-v2.4.1)

Updates `cc` from 1.2.34 to 1.2.36
- [Release notes](https://github.com/rust-lang/cc-rs/releases)
- [Changelog](https://github.com/rust-lang/cc-rs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/cc-rs/compare/cc-v1.2.34...cc-v1.2.36)

Updates `deadpool` from 0.12.2 to 0.12.3
- [Changelog](https://github.com/deadpool-rs/deadpool/blob/main/release.toml)
- [Commits](https://github.com/bikeshedder/deadpool/compare/deadpool-v0.12.2...deadpool-v0.12.3)

Updates `libz-rs-sys` from 0.5.1 to 0.5.2
- [Release notes](https://github.com/trifectatechfoundation/zlib-rs/releases)
- [Changelog](https://github.com/trifectatechfoundation/zlib-rs/blob/main/docs/release.md)
- [Commits](https://github.com/trifectatechfoundation/zlib-rs/compare/v0.5.1...v0.5.2)

Updates `log` from 0.4.27 to 0.4.28
- [Release notes](https://github.com/rust-lang/log/releases)
- [Changelog](https://github.com/rust-lang/log/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/log/compare/0.4.27...0.4.28)

Updates `rust-ini` from 0.21.1 to 0.21.3
- [Release notes](https://github.com/zonyitoo/rust-ini/releases)
- [Commits](https://github.com/zonyitoo/rust-ini/compare/v0.21.1...v0.21.3)

Updates `tao` from 0.34.2 to 0.34.3
- [Release notes](https://github.com/tauri-apps/tao/releases)
- [Changelog](https://github.com/tauri-apps/tao/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/tao/compare/tao-v0.34.2...tao-v0.34.3)

Updates `time` from 0.3.41 to 0.3.43
- [Release notes](https://github.com/time-rs/time/releases)
- [Changelog](https://github.com/time-rs/time/blob/main/CHANGELOG.md)
- [Commits](https://github.com/time-rs/time/compare/v0.3.41...v0.3.43)

Updates `time-core` from 0.1.4 to 0.1.6
- [Release notes](https://github.com/time-rs/time/releases)
- [Changelog](https://github.com/time-rs/time/blob/main/CHANGELOG.md)
- [Commits](https://github.com/time-rs/time/commits)

Updates `time-macros` from 0.2.22 to 0.2.24
- [Release notes](https://github.com/time-rs/time/releases)
- [Changelog](https://github.com/time-rs/time/blob/main/CHANGELOG.md)
- [Commits](https://github.com/time-rs/time/compare/v0.2.22...v0.2.24)

Updates `windows-version` from 0.1.4 to 0.1.5
- [Release notes](https://github.com/microsoft/windows-rs/releases)
- [Commits](https://github.com/microsoft/windows-rs/commits)

Updates `zlib-rs` from 0.5.1 to 0.5.2
- [Release notes](https://github.com/trifectatechfoundation/zlib-rs/releases)
- [Changelog](https://github.com/trifectatechfoundation/zlib-rs/blob/main/docs/release.md)
- [Commits](https://github.com/trifectatechfoundation/zlib-rs/compare/v0.5.1...v0.5.2)

Updates `zstd-sys` from 2.0.15+zstd.1.5.7 to 2.0.16+zstd.1.5.7
- [Release notes](https://github.com/gyscos/zstd-rs/releases)
- [Commits](https://github.com/gyscos/zstd-rs/commits)

---
updated-dependencies:
- dependency-name: tauri
  dependency-version: 2.8.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-plugin-deep-link
  dependency-version: 2.4.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-plugin-dialog
  dependency-version: 2.4.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: zip
  dependency-version: 5.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: rust-dependencies
- dependency-name: uuid
  dependency-version: 1.18.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: windows
  dependency-version: 0.62.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-build
  dependency-version: 2.4.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: cc
  dependency-version: 1.2.36
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: deadpool
  dependency-version: 0.12.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: libz-rs-sys
  dependency-version: 0.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: log
  dependency-version: 0.4.28
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: rust-ini
  dependency-version: 0.21.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tao
  dependency-version: 0.34.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: time
  dependency-version: 0.3.43
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: time-core
  dependency-version: 0.1.6
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: time-macros
  dependency-version: 0.2.24
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: windows-version
  dependency-version: 0.1.5
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: zlib-rs
  dependency-version: 0.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: zstd-sys
  dependency-version: 2.0.16+zstd.1.5.7
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-06 09:38:49 +00:00
dependabot[bot] 11200dbe09 deps(deps): bump the frontend-dependencies group with 76 updates
Bumps the frontend-dependencies group with 76 updates:

| Package | From | To |
| --- | --- | --- |
| [@tauri-apps/plugin-deep-link](https://github.com/tauri-apps/plugins-workspace) | `2.4.2` | `2.4.3` |
| [@tauri-apps/plugin-dialog](https://github.com/tauri-apps/plugins-workspace) | `2.3.3` | `2.4.0` |
| [ahooks](https://github.com/alibaba/hooks) | `3.9.4` | `3.9.5` |
| [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@tailwindcss/postcss](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/@tailwindcss-postcss) | `4.1.12` | `4.1.13` |
| [@tauri-apps/cli](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) | `24.3.0` | `24.3.1` |
| [lint-staged](https://github.com/lint-staged/lint-staged) | `16.1.5` | `16.1.6` |
| [tailwindcss](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/tailwindcss) | `4.1.12` | `4.1.13` |
| [tw-animate-css](https://github.com/Wombosvideo/tw-animate-css) | `1.3.7` | `1.3.8` |
| [dotenv](https://github.com/motdotla/dotenv) | `17.2.1` | `17.2.2` |
| [fingerprint-generator](https://github.com/apify/fingerprint-suite) | `2.1.70` | `2.1.72` |
| [@babel/runtime](https://github.com/babel/babel/tree/HEAD/packages/babel-runtime) | `7.28.3` | `7.28.4` |
| [@biomejs/cli-darwin-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-darwin-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-linux-arm64-musl](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-linux-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-linux-x64-musl](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-linux-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-win32-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@biomejs/cli-win32-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.2` | `2.2.3` |
| [@rollup/rollup-android-arm-eabi](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-android-arm64](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-darwin-arm64](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-darwin-x64](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-freebsd-arm64](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-freebsd-x64](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-arm-gnueabihf](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-arm-musleabihf](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-arm64-gnu](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-arm64-musl](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-loongarch64-gnu](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-ppc64-gnu](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-riscv64-gnu](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-riscv64-musl](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-s390x-gnu](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-x64-gnu](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-linux-x64-musl](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-win32-arm64-msvc](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-win32-ia32-msvc](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@rollup/rollup-win32-x64-msvc](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |
| [@tailwindcss/node](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/packages/@tailwindcss-node) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-android-arm64](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/android-arm64) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-darwin-arm64](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/darwin-arm64) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-darwin-x64](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/darwin-x64) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-freebsd-x64](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/freebsd-x64) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-linux-arm-gnueabihf](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/linux-arm-gnueabihf) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-linux-arm64-gnu](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/linux-arm64-gnu) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-linux-arm64-musl](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/linux-arm64-musl) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-linux-x64-gnu](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/linux-x64-gnu) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-linux-x64-musl](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/linux-x64-musl) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-wasm32-wasi](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-win32-arm64-msvc](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/win32-arm64-msvc) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide-win32-x64-msvc](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node/npm/win32-x64-msvc) | `4.1.12` | `4.1.13` |
| [@tailwindcss/oxide](https://github.com/tailwindlabs/tailwindcss/tree/HEAD/crates/node) | `4.1.12` | `4.1.13` |
| [@tauri-apps/cli-darwin-arm64](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-darwin-x64](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-linux-arm-gnueabihf](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-linux-arm64-gnu](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-linux-arm64-musl](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-linux-riscv64-gnu](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-linux-x64-gnu](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-linux-x64-musl](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-win32-arm64-msvc](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-win32-ia32-msvc](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [@tauri-apps/cli-win32-x64-msvc](https://github.com/tauri-apps/tauri) | `2.8.3` | `2.8.4` |
| [browserslist](https://github.com/browserslist/browserslist) | `4.25.3` | `4.25.4` |
| [dayjs](https://github.com/iamkun/dayjs) | `1.11.13` | `1.11.18` |
| [electron-to-chromium](https://github.com/kilian/electron-to-chromium) | `1.5.209` | `1.5.214` |
| [generative-bayesian-network](https://github.com/apify/fingerprint-suite) | `2.1.70` | `2.1.72` |
| [get-east-asian-width](https://github.com/sindresorhus/get-east-asian-width) | `1.3.0` | `1.3.1` |
| [header-generator](https://github.com/apify/fingerprint-suite) | `2.1.70` | `2.1.72` |
| [listr2](https://github.com/listr2/listr2) | `9.0.2` | `9.0.3` |
| [nano-spawn](https://github.com/sindresorhus/nano-spawn) | `1.0.2` | `1.0.3` |
| [node-releases](https://github.com/chicoxyzzy/node-releases) | `2.0.19` | `2.0.20` |
| [rollup](https://github.com/rollup/rollup) | `4.49.0` | `4.50.0` |

Updates `@tauri-apps/plugin-deep-link` from 2.4.2 to 2.4.3
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/fs-v2.4.2...http-v2.4.3)

Updates `@tauri-apps/plugin-dialog` from 2.3.3 to 2.4.0
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/dialog-v2.3.3...fs-v2.4.0)

Updates `ahooks` from 3.9.4 to 3.9.5
- [Release notes](https://github.com/alibaba/hooks/releases)
- [Commits](https://github.com/alibaba/hooks/compare/v3.9.4...v3.9.5)

Updates `@biomejs/biome` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@tailwindcss/postcss` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/packages/@tailwindcss-postcss)

Updates `@tauri-apps/cli` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/@tauri-apps/cli-v2.8.3...@tauri-apps/cli-v2.8.4)

Updates `@types/node` from 24.3.0 to 24.3.1
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

Updates `lint-staged` from 16.1.5 to 16.1.6
- [Release notes](https://github.com/lint-staged/lint-staged/releases)
- [Changelog](https://github.com/lint-staged/lint-staged/blob/main/CHANGELOG.md)
- [Commits](https://github.com/lint-staged/lint-staged/compare/v16.1.5...v16.1.6)

Updates `tailwindcss` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/packages/tailwindcss)

Updates `tw-animate-css` from 1.3.7 to 1.3.8
- [Release notes](https://github.com/Wombosvideo/tw-animate-css/releases)
- [Commits](https://github.com/Wombosvideo/tw-animate-css/compare/v1.3.7...v1.3.8)

Updates `dotenv` from 17.2.1 to 17.2.2
- [Changelog](https://github.com/motdotla/dotenv/blob/master/CHANGELOG.md)
- [Commits](https://github.com/motdotla/dotenv/compare/v17.2.1...v17.2.2)

Updates `fingerprint-generator` from 2.1.70 to 2.1.72
- [Release notes](https://github.com/apify/fingerprint-suite/releases)
- [Commits](https://github.com/apify/fingerprint-suite/compare/v2.1.70...v2.1.72)

Updates `@babel/runtime` from 7.28.3 to 7.28.4
- [Release notes](https://github.com/babel/babel/releases)
- [Changelog](https://github.com/babel/babel/blob/main/CHANGELOG.md)
- [Commits](https://github.com/babel/babel/commits/v7.28.4/packages/babel-runtime)

Updates `@biomejs/cli-darwin-arm64` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-darwin-x64` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-arm64-musl` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-arm64` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-x64-musl` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-x64` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-win32-arm64` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@biomejs/cli-win32-x64` from 2.2.2 to 2.2.3
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.3/packages/@biomejs/biome)

Updates `@rollup/rollup-android-arm-eabi` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-android-arm64` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-darwin-arm64` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-darwin-x64` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-freebsd-arm64` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-freebsd-x64` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-arm-gnueabihf` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-arm-musleabihf` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-arm64-gnu` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-arm64-musl` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-loongarch64-gnu` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-ppc64-gnu` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-riscv64-gnu` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-riscv64-musl` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-s390x-gnu` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-x64-gnu` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-linux-x64-musl` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-win32-arm64-msvc` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-win32-ia32-msvc` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@rollup/rollup-win32-x64-msvc` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

Updates `@tailwindcss/node` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/packages/@tailwindcss-node)

Updates `@tailwindcss/oxide-android-arm64` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/android-arm64)

Updates `@tailwindcss/oxide-darwin-arm64` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/darwin-arm64)

Updates `@tailwindcss/oxide-darwin-x64` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/darwin-x64)

Updates `@tailwindcss/oxide-freebsd-x64` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/freebsd-x64)

Updates `@tailwindcss/oxide-linux-arm-gnueabihf` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/linux-arm-gnueabihf)

Updates `@tailwindcss/oxide-linux-arm64-gnu` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/linux-arm64-gnu)

Updates `@tailwindcss/oxide-linux-arm64-musl` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/linux-arm64-musl)

Updates `@tailwindcss/oxide-linux-x64-gnu` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/linux-x64-gnu)

Updates `@tailwindcss/oxide-linux-x64-musl` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/linux-x64-musl)

Updates `@tailwindcss/oxide-wasm32-wasi` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node)

Updates `@tailwindcss/oxide-win32-arm64-msvc` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/win32-arm64-msvc)

Updates `@tailwindcss/oxide-win32-x64-msvc` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node/npm/win32-x64-msvc)

Updates `@tailwindcss/oxide` from 4.1.12 to 4.1.13
- [Release notes](https://github.com/tailwindlabs/tailwindcss/releases)
- [Changelog](https://github.com/tailwindlabs/tailwindcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tailwindlabs/tailwindcss/commits/v4.1.13/crates/node)

Updates `@tauri-apps/cli-darwin-arm64` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-darwin-x64` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-linux-arm-gnueabihf` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-linux-arm64-gnu` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-linux-arm64-musl` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-linux-riscv64-gnu` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-linux-x64-gnu` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-linux-x64-musl` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-win32-arm64-msvc` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-win32-ia32-msvc` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `@tauri-apps/cli-win32-x64-msvc` from 2.8.3 to 2.8.4
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-v2.8.3...tauri-v2.8.4)

Updates `browserslist` from 4.25.3 to 4.25.4
- [Release notes](https://github.com/browserslist/browserslist/releases)
- [Changelog](https://github.com/browserslist/browserslist/blob/main/CHANGELOG.md)
- [Commits](https://github.com/browserslist/browserslist/compare/4.25.3...4.25.4)

Updates `dayjs` from 1.11.13 to 1.11.18
- [Release notes](https://github.com/iamkun/dayjs/releases)
- [Changelog](https://github.com/iamkun/dayjs/blob/v1.11.18/CHANGELOG.md)
- [Commits](https://github.com/iamkun/dayjs/compare/v1.11.13...v1.11.18)

Updates `electron-to-chromium` from 1.5.209 to 1.5.214
- [Changelog](https://github.com/Kilian/electron-to-chromium/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kilian/electron-to-chromium/compare/v1.5.209...v1.5.214)

Updates `generative-bayesian-network` from 2.1.70 to 2.1.72
- [Release notes](https://github.com/apify/fingerprint-suite/releases)
- [Commits](https://github.com/apify/fingerprint-suite/compare/v2.1.70...v2.1.72)

Updates `get-east-asian-width` from 1.3.0 to 1.3.1
- [Release notes](https://github.com/sindresorhus/get-east-asian-width/releases)
- [Commits](https://github.com/sindresorhus/get-east-asian-width/compare/v1.3.0...v1.3.1)

Updates `header-generator` from 2.1.70 to 2.1.72
- [Release notes](https://github.com/apify/fingerprint-suite/releases)
- [Commits](https://github.com/apify/fingerprint-suite/compare/v2.1.70...v2.1.72)

Updates `listr2` from 9.0.2 to 9.0.3
- [Release notes](https://github.com/listr2/listr2/releases)
- [Changelog](https://github.com/listr2/listr2/blob/master/release.config.js)
- [Commits](https://github.com/listr2/listr2/compare/listr2@9.0.2...listr2@9.0.3)

Updates `nano-spawn` from 1.0.2 to 1.0.3
- [Release notes](https://github.com/sindresorhus/nano-spawn/releases)
- [Commits](https://github.com/sindresorhus/nano-spawn/compare/v1.0.2...v1.0.3)

Updates `node-releases` from 2.0.19 to 2.0.20
- [Commits](https://github.com/chicoxyzzy/node-releases/commits)

Updates `rollup` from 4.49.0 to 4.50.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.49.0...v4.50.0)

---
updated-dependencies:
- dependency-name: "@tauri-apps/plugin-deep-link"
  dependency-version: 2.4.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/plugin-dialog"
  dependency-version: 2.4.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: ahooks
  dependency-version: 3.9.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/biome"
  dependency-version: 2.2.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/postcss"
  dependency-version: 4.1.13
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli"
  dependency-version: 2.8.4
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@types/node"
  dependency-version: 24.3.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: lint-staged
  dependency-version: 16.1.6
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: tailwindcss
  dependency-version: 4.1.13
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: tw-animate-css
  dependency-version: 1.3.8
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: dotenv
  dependency-version: 17.2.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: fingerprint-generator
  dependency-version: 2.1.72
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@babel/runtime"
  dependency-version: 7.28.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-darwin-arm64"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-darwin-x64"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-arm64-musl"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-arm64"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-x64-musl"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-x64"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-win32-arm64"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-win32-x64"
  dependency-version: 2.2.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-android-arm-eabi"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-android-arm64"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-darwin-arm64"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-darwin-x64"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-freebsd-arm64"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-freebsd-x64"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm-gnueabihf"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm-musleabihf"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm64-gnu"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm64-musl"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-loongarch64-gnu"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-ppc64-gnu"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-riscv64-gnu"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-riscv64-musl"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-s390x-gnu"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-x64-gnu"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-x64-musl"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-arm64-msvc"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-ia32-msvc"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-x64-msvc"
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/node"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-android-arm64"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-darwin-arm64"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-darwin-x64"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-freebsd-x64"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-linux-arm-gnueabihf"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-linux-arm64-gnu"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-linux-arm64-musl"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-linux-x64-gnu"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-linux-x64-musl"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-wasm32-wasi"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-win32-arm64-msvc"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide-win32-x64-msvc"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tailwindcss/oxide"
  dependency-version: 4.1.13
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-darwin-arm64"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-darwin-x64"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-arm-gnueabihf"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-arm64-gnu"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-arm64-musl"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-riscv64-gnu"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-x64-gnu"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-x64-musl"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-win32-arm64-msvc"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-win32-ia32-msvc"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-win32-x64-msvc"
  dependency-version: 2.8.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: browserslist
  dependency-version: 4.25.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: dayjs
  dependency-version: 1.11.18
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: electron-to-chromium
  dependency-version: 1.5.214
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: generative-bayesian-network
  dependency-version: 2.1.72
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: get-east-asian-width
  dependency-version: 1.3.1
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: header-generator
  dependency-version: 2.1.72
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: listr2
  dependency-version: 9.0.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: nano-spawn
  dependency-version: 1.0.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: node-releases
  dependency-version: 2.0.20
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: rollup
  dependency-version: 4.50.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-06 09:27:22 +00:00
dependabot[bot] 2bd01376db ci(deps): bump the github-actions group with 4 updates
Bumps the github-actions group with 4 updates: [actions/setup-node](https://github.com/actions/setup-node), [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action), [crate-ci/typos](https://github.com/crate-ci/typos) and [actions/stale](https://github.com/actions/stale).


Updates `actions/setup-node` from 4.4.0 to 5.0.0
- [Release notes](https://github.com/actions/setup-node/releases)
- [Commits](https://github.com/actions/setup-node/compare/49933ea5288caeca8642d1e84afbd3f7d6820020...a0853c24544627f65ddf259abe73b1d18a591444)

Updates `ridedott/merge-me-action` from 2.10.126 to 2.10.128
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/ad649157c69da4d34e601ee360de7a74ce4e2090...60142b76c22362f5845c877672fd2822b4d07c13)

Updates `crate-ci/typos` from 1.35.7 to 1.36.2
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/65f69f021b736bdbe548ce72200500752d42b40e...85f62a8a84f939ae994ab3763f01a0296d61a7ee)

Updates `actions/stale` from 9.1.0 to 10.0.0
- [Release notes](https://github.com/actions/stale/releases)
- [Changelog](https://github.com/actions/stale/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/stale/compare/5bef64f19d7facfb25b37b414482c7164d639639...3a9db7e6a41a89f618792c92c0e97cc736e1b13f)

---
updated-dependencies:
- dependency-name: actions/setup-node
  dependency-version: 5.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.128
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.36.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: actions/stale
  dependency-version: 10.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-06 09:05:57 +00:00
zhom ba36956158 chore: next-env autogen 2025-09-03 22:23:40 +04:00
zhom ce3e27ca64 chore: version bump 2025-09-03 22:05:42 +04:00
zhom fd0fb8c7ca refactor: mark all firefox developers edition releases as nightly 2025-09-03 22:02:48 +04:00
zhom 701c8aefd3 fix: properly update profile after downloading browser 2025-09-03 20:49:26 +04:00
zhom d4a7c347b6 chore: next-env autogen 2025-09-03 19:57:51 +04:00
zhom 3c3e6df3b2 chore: ignore ts files inside tauri output 2025-09-03 19:44:42 +04:00
zhom cd4b23bd27 refactor: simplify browser runner 2025-09-03 19:40:17 +04:00
zhom 042a348971 refactor: better binary management 2025-09-02 23:41:17 +04:00
zhom b8f4e4adda build: switch back to ubuntu 22.04 runner 2025-09-02 19:26:54 +04:00
zhom e8852a3caf build: switch to ubuntu 18 for linux build 2025-09-02 18:44:21 +04:00
zhom 6ed1adafc8 chore: version bump 2025-09-02 18:21:40 +04:00
zhom 22e6b2762e fix: search for correct folder on chromium extraction on linux x64 2025-09-02 18:18:27 +04:00
zhom bc7c8d1a1e refactor: better profile creation flow 2025-09-01 14:40:55 +04:00
zhom b133f928d4 Merge pull request #79 from zhom/dependabot/github_actions/github-actions-9dffe6f043
ci(deps): bump the github-actions group with 4 updates
2025-08-30 16:22:28 +04:00
zhom 02185e0480 Merge pull request #80 from zhom/dependabot/npm_and_yarn/frontend-dependencies-4db6c81266
deps(deps): bump the frontend-dependencies group with 46 updates
2025-08-30 16:22:15 +04:00
zhom 6402ff302a Merge pull request #81 from zhom/dependabot/cargo/src-tauri/rust-dependencies-84e3784149
deps(rust)(deps): bump the rust-dependencies group in /src-tauri with 5 updates
2025-08-30 16:21:59 +04:00
dependabot[bot] ed830ed789 deps(rust)(deps): bump the rust-dependencies group
Bumps the rust-dependencies group in /src-tauri with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [async-executor](https://github.com/smol-rs/async-executor) | `1.13.2` | `1.13.3` |
| [camino](https://github.com/camino-rs/camino) | `1.1.11` | `1.1.12` |
| [liblzma](https://github.com/portable-network-archive/liblzma-rs) | `0.4.3` | `0.4.4` |
| [potential_utf](https://github.com/unicode-org/icu4x) | `0.1.2` | `0.1.3` |
| [wry](https://github.com/tauri-apps/wry) | `0.53.2` | `0.53.3` |


Updates `async-executor` from 1.13.2 to 1.13.3
- [Release notes](https://github.com/smol-rs/async-executor/releases)
- [Changelog](https://github.com/smol-rs/async-executor/blob/master/CHANGELOG.md)
- [Commits](https://github.com/smol-rs/async-executor/compare/v1.13.2...v1.13.3)

Updates `camino` from 1.1.11 to 1.1.12
- [Release notes](https://github.com/camino-rs/camino/releases)
- [Changelog](https://github.com/camino-rs/camino/blob/main/CHANGELOG.md)
- [Commits](https://github.com/camino-rs/camino/compare/camino-1.1.11...camino-1.1.12)

Updates `liblzma` from 0.4.3 to 0.4.4
- [Release notes](https://github.com/portable-network-archive/liblzma-rs/releases)
- [Commits](https://github.com/portable-network-archive/liblzma-rs/compare/liblzma-0.4.3...liblzma-0.4.4)

Updates `potential_utf` from 0.1.2 to 0.1.3
- [Release notes](https://github.com/unicode-org/icu4x/releases)
- [Changelog](https://github.com/unicode-org/icu4x/blob/main/CHANGELOG.md)
- [Commits](https://github.com/unicode-org/icu4x/commits/ind/potential_utf@0.1.3)

Updates `wry` from 0.53.2 to 0.53.3
- [Release notes](https://github.com/tauri-apps/wry/releases)
- [Changelog](https://github.com/tauri-apps/wry/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/wry/compare/wry-v0.53.2...wry-v0.53.3)

---
updated-dependencies:
- dependency-name: async-executor
  dependency-version: 1.13.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: camino
  dependency-version: 1.1.12
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: liblzma
  dependency-version: 0.4.4
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: potential_utf
  dependency-version: 0.1.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: wry
  dependency-version: 0.53.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-30 09:41:48 +00:00
dependabot[bot] d03f598567 deps(deps): bump the frontend-dependencies group with 46 updates
Bumps the frontend-dependencies group with 46 updates:

| Package | From | To |
| --- | --- | --- |
| [lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react) | `0.541.0` | `0.542.0` |
| [next](https://github.com/vercel/next.js) | `15.5.1` | `15.5.2` |
| [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react) | `19.1.11` | `19.1.12` |
| [@types/react-dom](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react-dom) | `19.1.8` | `19.1.9` |
| [@vitejs/plugin-react](https://github.com/vitejs/vite-plugin-react/tree/HEAD/packages/plugin-react) | `5.0.1` | `5.0.2` |
| [@biomejs/cli-darwin-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-darwin-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-linux-arm64-musl](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-linux-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-linux-x64-musl](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-linux-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-win32-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@biomejs/cli-win32-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.2.0` | `2.2.2` |
| [@emnapi/runtime](https://github.com/toyobayashi/emnapi) | `1.4.5` | `1.5.0` |
| [@next/env](https://github.com/vercel/next.js/tree/HEAD/packages/next-env) | `15.5.1` | `15.5.2` |
| [@next/swc-darwin-arm64](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/darwin-arm64) | `15.5.1` | `15.5.2` |
| [@next/swc-darwin-x64](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/darwin-x64) | `15.5.1` | `15.5.2` |
| [@next/swc-linux-arm64-gnu](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-arm64-gnu) | `15.5.1` | `15.5.2` |
| [@next/swc-linux-arm64-musl](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-arm64-musl) | `15.5.1` | `15.5.2` |
| [@next/swc-linux-x64-gnu](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-x64-gnu) | `15.5.1` | `15.5.2` |
| [@next/swc-linux-x64-musl](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-x64-musl) | `15.5.1` | `15.5.2` |
| [@next/swc-win32-arm64-msvc](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/win32-arm64-msvc) | `15.5.1` | `15.5.2` |
| [@next/swc-win32-x64-msvc](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/win32-x64-msvc) | `15.5.1` | `15.5.2` |
| [@rolldown/pluginutils](https://github.com/rolldown/rolldown/tree/HEAD/packages/pluginutils) | `1.0.0-beta.32` | `1.0.0-beta.34` |
| [@rollup/rollup-android-arm-eabi](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-android-arm64](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-darwin-arm64](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-darwin-x64](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-freebsd-arm64](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-freebsd-x64](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-arm-gnueabihf](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-arm-musleabihf](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-arm64-gnu](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-arm64-musl](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-loongarch64-gnu](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-ppc64-gnu](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-riscv64-gnu](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-riscv64-musl](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-s390x-gnu](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-x64-gnu](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-linux-x64-musl](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-win32-arm64-msvc](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-win32-ia32-msvc](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [@rollup/rollup-win32-x64-msvc](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
| [rollup](https://github.com/rollup/rollup) | `4.48.1` | `4.49.0` |
Updates `lucide-react` from 0.541.0 to 0.542.0
- [Release notes](https://github.com/lucide-icons/lucide/releases)
- [Commits](https://github.com/lucide-icons/lucide/commits/0.542.0/packages/lucide-react)

Updates `next` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/compare/v15.5.1...v15.5.2)

Updates `@biomejs/biome` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@types/react` from 19.1.11 to 19.1.12
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react)

Updates `@types/react-dom` from 19.1.8 to 19.1.9
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react-dom)

Updates `@vitejs/plugin-react` from 5.0.1 to 5.0.2
- [Release notes](https://github.com/vitejs/vite-plugin-react/releases)
- [Changelog](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite-plugin-react/commits/plugin-react@5.0.2/packages/plugin-react)

Updates `@biomejs/cli-darwin-arm64` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-darwin-x64` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-arm64-musl` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-arm64` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-x64-musl` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-x64` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-win32-arm64` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@biomejs/cli-win32-x64` from 2.2.0 to 2.2.2
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.2/packages/@biomejs/biome)

Updates `@emnapi/runtime` from 1.4.5 to 1.5.0
- [Release notes](https://github.com/toyobayashi/emnapi/releases)
- [Commits](https://github.com/toyobayashi/emnapi/compare/v1.4.5...v1.5.0)

Updates `@next/env` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/packages/next-env)

Updates `@next/swc-darwin-arm64` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/darwin-arm64)

Updates `@next/swc-darwin-x64` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/darwin-x64)

Updates `@next/swc-linux-arm64-gnu` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/linux-arm64-gnu)

Updates `@next/swc-linux-arm64-musl` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/linux-arm64-musl)

Updates `@next/swc-linux-x64-gnu` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/linux-x64-gnu)

Updates `@next/swc-linux-x64-musl` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/linux-x64-musl)

Updates `@next/swc-win32-arm64-msvc` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/win32-arm64-msvc)

Updates `@next/swc-win32-x64-msvc` from 15.5.1 to 15.5.2
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.2/crates/napi/npm/win32-x64-msvc)

Updates `@rolldown/pluginutils` from 1.0.0-beta.32 to 1.0.0-beta.34
- [Release notes](https://github.com/rolldown/rolldown/releases)
- [Changelog](https://github.com/rolldown/rolldown/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rolldown/rolldown/commits/v1.0.0-beta.34/packages/pluginutils)

Updates `@rollup/rollup-android-arm-eabi` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-android-arm64` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-darwin-arm64` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-darwin-x64` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-freebsd-arm64` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-freebsd-x64` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-arm-gnueabihf` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-arm-musleabihf` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-arm64-gnu` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-arm64-musl` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-loongarch64-gnu` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-ppc64-gnu` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-riscv64-gnu` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-riscv64-musl` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-s390x-gnu` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-x64-gnu` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-linux-x64-musl` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-win32-arm64-msvc` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-win32-ia32-msvc` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `@rollup/rollup-win32-x64-msvc` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

Updates `rollup` from 4.48.1 to 4.49.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.48.1...v4.49.0)

---
updated-dependencies:
- dependency-name: lucide-react
  dependency-version: 0.542.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: next
  dependency-version: 15.5.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/biome"
  dependency-version: 2.2.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@types/react"
  dependency-version: 19.1.12
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@types/react-dom"
  dependency-version: 19.1.9
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@vitejs/plugin-react"
  dependency-version: 5.0.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-darwin-arm64"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-darwin-x64"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-arm64-musl"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-arm64"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-x64-musl"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-x64"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-win32-arm64"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-win32-x64"
  dependency-version: 2.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@emnapi/runtime"
  dependency-version: 1.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/env"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-darwin-arm64"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-darwin-x64"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-arm64-gnu"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-arm64-musl"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-x64-gnu"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-x64-musl"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-win32-arm64-msvc"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-win32-x64-msvc"
  dependency-version: 15.5.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@rolldown/pluginutils"
  dependency-version: 1.0.0-beta.34
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-android-arm-eabi"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-android-arm64"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-darwin-arm64"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-darwin-x64"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-freebsd-arm64"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-freebsd-x64"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm-gnueabihf"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm-musleabihf"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm64-gnu"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm64-musl"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-loongarch64-gnu"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-ppc64-gnu"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-riscv64-gnu"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-riscv64-musl"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-s390x-gnu"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-x64-gnu"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-x64-musl"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-arm64-msvc"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-ia32-msvc"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-x64-msvc"
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: rollup
  dependency-version: 4.49.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-30 09:23:10 +00:00
dependabot[bot] 6aedf58264 ci(deps): bump the github-actions group with 4 updates
Bumps the github-actions group with 4 updates: [google/osv-scanner-action](https://github.com/google/osv-scanner-action), [actions/ai-inference](https://github.com/actions/ai-inference), [tauri-apps/tauri-action](https://github.com/tauri-apps/tauri-action) and [crate-ci/typos](https://github.com/crate-ci/typos).
Updates `google/osv-scanner-action` from 2.2.1 to 2.2.2
- [Release notes](https://github.com/google/osv-scanner-action/releases)
- [Commits](https://github.com/google/osv-scanner-action/compare/456ceb78310755116e0a3738121351006286b797...90b209d0ea55cea1da9fc0c4e65782cc6acb6e2e)

Updates `actions/ai-inference` from 2.0.0 to 2.0.1
- [Release notes](https://github.com/actions/ai-inference/releases)
- [Commits](https://github.com/actions/ai-inference/compare/f347eae8ebabecb85d17f52960f909b8a4a8dad5...a1c11829223a786afe3b5663db904a3aa1eac3a2)

Updates `tauri-apps/tauri-action` from 0.5.22 to 0.5.23
- [Release notes](https://github.com/tauri-apps/tauri-action/releases)
- [Changelog](https://github.com/tauri-apps/tauri-action/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/tauri-action/compare/564aea5a8075c7a54c167bb0cf5b3255314a7f9d...e834788a94591d81e3ae0bd9ec06366f5afb8994)

Updates `crate-ci/typos` from 1.35.5 to 1.35.7
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/a4c3e43aea0a9e9b9e6578d2731ebd9a27e8f6cd...65f69f021b736bdbe548ce72200500752d42b40e)

---
updated-dependencies:
- dependency-name: google/osv-scanner-action
  dependency-version: 2.2.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: actions/ai-inference
  dependency-version: 2.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: tauri-apps/tauri-action
  dependency-version: 0.5.23
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.35.7
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-30 09:04:30 +00:00
zhom 636f1ea4ba chore: disable changelog update 2025-08-27 19:21:46 +04:00
zhom adb253e103 chore: version bump 2025-08-27 15:28:21 +04:00
zhom e12ac66c7a chore: next-env gen 2025-08-27 15:28:07 +04:00
zhom e06a824438 refactor: handle pending browser updates 2025-08-27 15:23:49 +04:00
zhom 4293b7eab5 chore: next-env gen 2025-08-27 15:13:58 +04:00
zhom 68b138d5ff chore: update dependencies 2025-08-27 07:47:38 +04:00
zhom b79bd94506 docs: readme 2025-08-24 22:24:48 +04:00
zhom 181c76980a feat: add profile search 2025-08-24 21:49:08 +04:00
zhom 274b275c03 chore: linting 2025-08-23 15:13:32 +04:00
zhom 821cce0986 chore: update biome config 2025-08-23 14:56:04 +04:00
zhom 716a028923 chore: pnpm update 2025-08-23 14:55:29 +04:00
zhom 7c25bd3ba2 Merge pull request #78 from zhom/dependabot/cargo/src-tauri/rust-dependencies-c76d1c372d
deps(rust)(deps): bump the rust-dependencies group in /src-tauri with 28 updates
2025-08-23 14:45:46 +04:00
zhom 6d89098263 Merge pull request #77 from zhom/dependabot/npm_and_yarn/frontend-dependencies-b8479b5523
deps(deps): bump the frontend-dependencies group with 64 updates
2025-08-23 14:45:35 +04:00
zhom a1a1a2202e Merge pull request #76 from zhom/dependabot/github_actions/github-actions-7fdfea5e97
ci(deps): bump the github-actions group with 7 updates
2025-08-23 14:45:25 +04:00
zhom 485daae40e chore: formatting 2025-08-23 14:02:49 +04:00
dependabot[bot] 9f22c57b7a deps(rust)(deps): bump the rust-dependencies group
Bumps the rust-dependencies group in /src-tauri with 28 updates:

| Package | From | To |
| --- | --- | --- |
| [serde_json](https://github.com/serde-rs/json) | `1.0.142` | `1.0.143` |
| [tauri-plugin-fs](https://github.com/tauri-apps/plugins-workspace) | `2.4.1` | `2.4.2` |
| [tauri-plugin-deep-link](https://github.com/tauri-apps/plugins-workspace) | `2.4.1` | `2.4.2` |
| [tauri-plugin-dialog](https://github.com/tauri-apps/plugins-workspace) | `2.3.2` | `2.3.3` |
| [zip](https://github.com/zip-rs/zip2) | `4.3.0` | `4.5.0` |
| [url](https://github.com/servo/rust-url) | `2.5.4` | `2.5.6` |
| [tempfile](https://github.com/Stebalien/tempfile) | `3.20.0` | `3.21.0` |
| [hyper](https://github.com/hyperium/hyper) | `1.6.0` | `1.7.0` |
| [tauri-build](https://github.com/tauri-apps/tauri) | `2.3.1` | `2.4.0` |
| [cc](https://github.com/rust-lang/cc-rs) | `1.2.33` | `1.2.34` |
| [cfg-if](https://github.com/rust-lang/cfg-if) | `1.0.1` | `1.0.3` |
| [dlopen2](https://github.com/OpenByteDev/dlopen2) | `0.7.0` | `0.8.0` |
| [filetime](https://github.com/alexcrichton/filetime) | `0.2.25` | `0.2.26` |
| [form_urlencoded](https://github.com/servo/rust-url) | `1.2.1` | `1.2.2` |
| [idna](https://github.com/servo/rust-url) | `1.0.3` | `1.1.0` |
| [io-uring](https://github.com/tokio-rs/io-uring) | `0.7.9` | `0.7.10` |
| [percent-encoding](https://github.com/servo/rust-url) | `2.3.1` | `2.3.2` |
| [serialize-to-javascript](https://github.com/chippers/serialize-to-javascript) | `0.1.1` | `0.1.2` |
| [serialize-to-javascript-impl](https://github.com/chippers/serialize-to-javascript) | `0.1.1` | `0.1.2` |
| [tao](https://github.com/tauri-apps/tao) | `0.34.0` | `0.34.2` |
| [tauri-codegen](https://github.com/tauri-apps/tauri) | `2.3.1` | `2.4.0` |
| [tauri-macros](https://github.com/tauri-apps/tauri) | `2.3.2` | `2.4.0` |
| [tauri-plugin](https://github.com/tauri-apps/tauri) | `2.3.1` | `2.4.0` |
| [tauri-runtime](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.0` |
| [tauri-runtime-wry](https://github.com/tauri-apps/tauri) | `2.7.2` | `2.8.0` |
| [tauri-utils](https://github.com/tauri-apps/tauri) | `2.6.0` | `2.7.0` |
| [winapi-util](https://github.com/BurntSushi/winapi-util) | `0.1.9` | `0.1.10` |
| [wry](https://github.com/tauri-apps/wry) | `0.52.1` | `0.53.1` |


Updates `serde_json` from 1.0.142 to 1.0.143
- [Release notes](https://github.com/serde-rs/json/releases)
- [Commits](https://github.com/serde-rs/json/compare/v1.0.142...v1.0.143)

Updates `tauri-plugin-fs` from 2.4.1 to 2.4.2
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/fs-v2.4.1...fs-v2.4.2)

Updates `tauri-plugin-deep-link` from 2.4.1 to 2.4.2
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/deep-link-v2.4.1...deep-link-v2.4.2)

Updates `tauri-plugin-dialog` from 2.3.2 to 2.3.3
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/dialog-v2.3.2...dialog-v2.3.3)

Updates `zip` from 4.3.0 to 4.5.0
- [Release notes](https://github.com/zip-rs/zip2/releases)
- [Changelog](https://github.com/zip-rs/zip2/blob/master/CHANGELOG.md)
- [Commits](https://github.com/zip-rs/zip2/compare/v4.3.0...v4.5.0)

Updates `url` from 2.5.4 to 2.5.6
- [Release notes](https://github.com/servo/rust-url/releases)
- [Commits](https://github.com/servo/rust-url/commits)

Updates `tempfile` from 3.20.0 to 3.21.0
- [Changelog](https://github.com/Stebalien/tempfile/blob/master/CHANGELOG.md)
- [Commits](https://github.com/Stebalien/tempfile/commits)

Updates `hyper` from 1.6.0 to 1.7.0
- [Release notes](https://github.com/hyperium/hyper/releases)
- [Changelog](https://github.com/hyperium/hyper/blob/master/CHANGELOG.md)
- [Commits](https://github.com/hyperium/hyper/compare/v1.6.0...v1.7.0)

Updates `tauri-build` from 2.3.1 to 2.4.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-build-v2.3.1...tauri-build-v2.4.0)

Updates `cc` from 1.2.33 to 1.2.34
- [Release notes](https://github.com/rust-lang/cc-rs/releases)
- [Changelog](https://github.com/rust-lang/cc-rs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/cc-rs/compare/cc-v1.2.33...cc-v1.2.34)

Updates `cfg-if` from 1.0.1 to 1.0.3
- [Release notes](https://github.com/rust-lang/cfg-if/releases)
- [Changelog](https://github.com/rust-lang/cfg-if/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/cfg-if/compare/v1.0.1...v1.0.3)

Updates `dlopen2` from 0.7.0 to 0.8.0
- [Commits](https://github.com/OpenByteDev/dlopen2/commits)

Updates `filetime` from 0.2.25 to 0.2.26
- [Commits](https://github.com/alexcrichton/filetime/commits)

Updates `form_urlencoded` from 1.2.1 to 1.2.2
- [Release notes](https://github.com/servo/rust-url/releases)
- [Commits](https://github.com/servo/rust-url/compare/v1.2.1...v1.2.2)

Updates `idna` from 1.0.3 to 1.1.0
- [Release notes](https://github.com/servo/rust-url/releases)
- [Commits](https://github.com/servo/rust-url/commits)

Updates `io-uring` from 0.7.9 to 0.7.10
- [Commits](https://github.com/tokio-rs/io-uring/commits)

Updates `percent-encoding` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/servo/rust-url/releases)
- [Commits](https://github.com/servo/rust-url/commits)

Updates `serialize-to-javascript` from 0.1.1 to 0.1.2
- [Commits](https://github.com/chippers/serialize-to-javascript/commits)

Updates `serialize-to-javascript-impl` from 0.1.1 to 0.1.2
- [Commits](https://github.com/chippers/serialize-to-javascript/commits)

Updates `tao` from 0.34.0 to 0.34.2
- [Release notes](https://github.com/tauri-apps/tao/releases)
- [Changelog](https://github.com/tauri-apps/tao/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/tao/compare/tao-v0.34.0...tao-v0.34.2)

Updates `tauri-codegen` from 2.3.1 to 2.4.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-codegen-v2.3.1...tauri-codegen-v2.4.0)

Updates `tauri-macros` from 2.3.2 to 2.4.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-macros-v2.3.2...tauri-macros-v2.4.0)

Updates `tauri-plugin` from 2.3.1 to 2.4.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-plugin-v2.3.1...tauri-plugin-v2.4.0)

Updates `tauri-runtime` from 2.7.1 to 2.8.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-runtime-v2.7.1...tauri-runtime-v2.8.0)

Updates `tauri-runtime-wry` from 2.7.2 to 2.8.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-runtime-wry-v2.7.2...tauri-runtime-wry-v2.8.0)

Updates `tauri-utils` from 2.6.0 to 2.7.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-utils-v2.6.0...tauri-utils-v2.7.0)

Updates `winapi-util` from 0.1.9 to 0.1.10
- [Commits](https://github.com/BurntSushi/winapi-util/compare/0.1.9...0.1.10)

Updates `wry` from 0.52.1 to 0.53.1
- [Release notes](https://github.com/tauri-apps/wry/releases)
- [Changelog](https://github.com/tauri-apps/wry/blob/dev/CHANGELOG.md)
- [Commits](https://github.com/tauri-apps/wry/compare/wry-v0.52.1...wry-v0.53.1)

---
updated-dependencies:
- dependency-name: serde_json
  dependency-version: 1.0.143
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-plugin-fs
  dependency-version: 2.4.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-plugin-deep-link
  dependency-version: 2.4.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-plugin-dialog
  dependency-version: 2.3.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: zip
  dependency-version: 4.5.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: url
  dependency-version: 2.5.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tempfile
  dependency-version: 3.21.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: hyper
  dependency-version: 1.7.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-build
  dependency-version: 2.4.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: cc
  dependency-version: 1.2.34
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: cfg-if
  dependency-version: 1.0.3
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: dlopen2
  dependency-version: 0.8.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: filetime
  dependency-version: 0.2.26
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: form_urlencoded
  dependency-version: 1.2.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: idna
  dependency-version: 1.1.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: io-uring
  dependency-version: 0.7.10
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: percent-encoding
  dependency-version: 2.3.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: serialize-to-javascript
  dependency-version: 0.1.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: serialize-to-javascript-impl
  dependency-version: 0.1.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tao
  dependency-version: 0.34.2
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: tauri-codegen
  dependency-version: 2.4.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-macros
  dependency-version: 2.4.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-plugin
  dependency-version: 2.4.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-runtime
  dependency-version: 2.8.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-runtime-wry
  dependency-version: 2.8.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: tauri-utils
  dependency-version: 2.7.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
- dependency-name: winapi-util
  dependency-version: 0.1.10
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: rust-dependencies
- dependency-name: wry
  dependency-version: 0.53.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: rust-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-23 09:56:23 +00:00
dependabot[bot] 45d959e407 deps(deps): bump the frontend-dependencies group with 64 updates
Bumps the frontend-dependencies group with 64 updates:

| Package | From | To |
| --- | --- | --- |
| [@tauri-apps/api](https://github.com/tauri-apps/tauri) | `2.7.0` | `2.8.0` |
| [@tauri-apps/plugin-deep-link](https://github.com/tauri-apps/plugins-workspace) | `2.4.1` | `2.4.2` |
| [@tauri-apps/plugin-dialog](https://github.com/tauri-apps/plugins-workspace) | `2.3.2` | `2.3.3` |
| [@tauri-apps/plugin-fs](https://github.com/tauri-apps/plugins-workspace) | `2.4.1` | `2.4.2` |
| [@tauri-apps/plugin-opener](https://github.com/tauri-apps/plugins-workspace) | `2.4.0` | `2.5.0` |
| [ahooks](https://github.com/alibaba/hooks) | `3.9.0` | `3.9.4` |
| [lucide-react](https://github.com/lucide-icons/lucide/tree/HEAD/packages/lucide-react) | `0.539.0` | `0.541.0` |
| [next](https://github.com/vercel/next.js) | `15.4.6` | `15.5.0` |
| [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@tauri-apps/cli](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react) | `19.1.10` | `19.1.11` |
| [@vitejs/plugin-react](https://github.com/vitejs/vite-plugin-react/tree/HEAD/packages/plugin-react) | `5.0.0` | `5.0.1` |
| [playwright-core](https://github.com/microsoft/playwright) | `1.54.2` | `1.55.0` |
| [@biomejs/cli-darwin-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-darwin-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-linux-arm64-musl](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-linux-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-linux-x64-musl](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-linux-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-win32-arm64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@biomejs/cli-win32-x64](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) | `2.1.4` | `2.2.0` |
| [@next/env](https://github.com/vercel/next.js/tree/HEAD/packages/next-env) | `15.4.6` | `15.5.0` |
| [@next/swc-darwin-arm64](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/darwin-arm64) | `15.4.6` | `15.5.0` |
| [@next/swc-darwin-x64](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/darwin-x64) | `15.4.6` | `15.5.0` |
| [@next/swc-linux-arm64-gnu](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-arm64-gnu) | `15.4.6` | `15.5.0` |
| [@next/swc-linux-arm64-musl](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-arm64-musl) | `15.4.6` | `15.5.0` |
| [@next/swc-linux-x64-gnu](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-x64-gnu) | `15.4.6` | `15.5.0` |
| [@next/swc-linux-x64-musl](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/linux-x64-musl) | `15.4.6` | `15.5.0` |
| [@next/swc-win32-arm64-msvc](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/win32-arm64-msvc) | `15.4.6` | `15.5.0` |
| [@next/swc-win32-x64-msvc](https://github.com/vercel/next.js/tree/HEAD/crates/napi/npm/win32-x64-msvc) | `15.4.6` | `15.5.0` |
| [@rolldown/pluginutils](https://github.com/rolldown/rolldown/tree/HEAD/packages/pluginutils) | `1.0.0-beta.30` | `1.0.0-beta.32` |
| [@rollup/rollup-android-arm-eabi](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-android-arm64](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-darwin-arm64](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-darwin-x64](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-freebsd-arm64](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-freebsd-x64](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-arm-gnueabihf](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-arm-musleabihf](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-arm64-gnu](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-arm64-musl](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-loongarch64-gnu](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-ppc64-gnu](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-riscv64-gnu](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-riscv64-musl](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-s390x-gnu](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-x64-gnu](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-linux-x64-musl](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-win32-arm64-msvc](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-win32-ia32-msvc](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@rollup/rollup-win32-x64-msvc](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |
| [@tauri-apps/cli-darwin-arm64](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-darwin-x64](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-linux-arm-gnueabihf](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-linux-arm64-gnu](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-linux-arm64-musl](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-linux-riscv64-gnu](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-linux-x64-gnu](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-linux-x64-musl](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-win32-arm64-msvc](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-win32-ia32-msvc](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [@tauri-apps/cli-win32-x64-msvc](https://github.com/tauri-apps/tauri) | `2.7.1` | `2.8.1` |
| [caniuse-lite](https://github.com/browserslist/caniuse-lite) | `1.0.30001735` | `1.0.30001737` |
| [rollup](https://github.com/rollup/rollup) | `4.46.2` | `4.48.0` |


Updates `@tauri-apps/api` from 2.7.0 to 2.8.0
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/@tauri-apps/api-v2.7.0...@tauri-apps/api-v2.8.0)

Updates `@tauri-apps/plugin-deep-link` from 2.4.1 to 2.4.2
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/deep-link-v2.4.1...deep-link-v2.4.2)

Updates `@tauri-apps/plugin-dialog` from 2.3.2 to 2.3.3
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/dialog-v2.3.2...dialog-v2.3.3)

Updates `@tauri-apps/plugin-fs` from 2.4.1 to 2.4.2
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/fs-v2.4.1...fs-v2.4.2)

Updates `@tauri-apps/plugin-opener` from 2.4.0 to 2.5.0
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](https://github.com/tauri-apps/plugins-workspace/compare/opener-v2.4.0...opener-v2.5.0)

Updates `ahooks` from 3.9.0 to 3.9.4
- [Release notes](https://github.com/alibaba/hooks/releases)
- [Commits](https://github.com/alibaba/hooks/compare/v3.9.0...v3.9.4)

Updates `lucide-react` from 0.539.0 to 0.541.0
- [Release notes](https://github.com/lucide-icons/lucide/releases)
- [Commits](https://github.com/lucide-icons/lucide/commits/0.541.0/packages/lucide-react)

Updates `next` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/compare/v15.4.6...v15.5.0)

Updates `@biomejs/biome` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@tauri-apps/cli` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/@tauri-apps/cli-v2.7.1...@tauri-apps/cli-v2.8.1)

Updates `@types/react` from 19.1.10 to 19.1.11
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react)

Updates `@vitejs/plugin-react` from 5.0.0 to 5.0.1
- [Release notes](https://github.com/vitejs/vite-plugin-react/releases)
- [Changelog](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite-plugin-react/commits/plugin-react@5.0.1/packages/plugin-react)

Updates `playwright-core` from 1.54.2 to 1.55.0
- [Release notes](https://github.com/microsoft/playwright/releases)
- [Commits](https://github.com/microsoft/playwright/compare/v1.54.2...v1.55.0)

Updates `@biomejs/cli-darwin-arm64` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-darwin-x64` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-arm64-musl` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-arm64` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-x64-musl` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-linux-x64` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-win32-arm64` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@biomejs/cli-win32-x64` from 2.1.4 to 2.2.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.2.0/packages/@biomejs/biome)

Updates `@next/env` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/packages/next-env)

Updates `@next/swc-darwin-arm64` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/darwin-arm64)

Updates `@next/swc-darwin-x64` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/darwin-x64)

Updates `@next/swc-linux-arm64-gnu` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/linux-arm64-gnu)

Updates `@next/swc-linux-arm64-musl` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/linux-arm64-musl)

Updates `@next/swc-linux-x64-gnu` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/linux-x64-gnu)

Updates `@next/swc-linux-x64-musl` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/linux-x64-musl)

Updates `@next/swc-win32-arm64-msvc` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/win32-arm64-msvc)

Updates `@next/swc-win32-x64-msvc` from 15.4.6 to 15.5.0
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/commits/v15.5.0/crates/napi/npm/win32-x64-msvc)

Updates `@rolldown/pluginutils` from 1.0.0-beta.30 to 1.0.0-beta.32
- [Release notes](https://github.com/rolldown/rolldown/releases)
- [Changelog](https://github.com/rolldown/rolldown/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rolldown/rolldown/commits/v1.0.0-beta.32/packages/pluginutils)

Updates `@rollup/rollup-android-arm-eabi` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-android-arm64` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-darwin-arm64` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-darwin-x64` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-freebsd-arm64` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-freebsd-x64` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-arm-gnueabihf` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-arm-musleabihf` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-arm64-gnu` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-arm64-musl` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-loongarch64-gnu` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-ppc64-gnu` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-riscv64-gnu` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-riscv64-musl` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-s390x-gnu` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-x64-gnu` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-linux-x64-musl` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-win32-arm64-msvc` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-win32-ia32-msvc` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@rollup/rollup-win32-x64-msvc` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

Updates `@tauri-apps/cli-darwin-arm64` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-darwin-x64` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-linux-arm-gnueabihf` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-linux-arm64-gnu` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-linux-arm64-musl` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-linux-riscv64-gnu` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-linux-x64-gnu` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-linux-x64-musl` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-win32-arm64-msvc` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-win32-ia32-msvc` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `@tauri-apps/cli-win32-x64-msvc` from 2.7.1 to 2.8.1
- [Release notes](https://github.com/tauri-apps/tauri/releases)
- [Commits](https://github.com/tauri-apps/tauri/compare/tauri-cli-v2.7.1...tauri-v2.8.1)

Updates `caniuse-lite` from 1.0.30001735 to 1.0.30001737
- [Commits](https://github.com/browserslist/caniuse-lite/compare/1.0.30001735...1.0.30001737)

Updates `rollup` from 4.46.2 to 4.48.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.46.2...v4.48.0)

---
updated-dependencies:
- dependency-name: "@tauri-apps/api"
  dependency-version: 2.8.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/plugin-deep-link"
  dependency-version: 2.4.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/plugin-dialog"
  dependency-version: 2.3.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/plugin-fs"
  dependency-version: 2.4.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/plugin-opener"
  dependency-version: 2.5.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: ahooks
  dependency-version: 3.9.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: lucide-react
  dependency-version: 0.541.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: next
  dependency-version: 15.5.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/biome"
  dependency-version: 2.2.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli"
  dependency-version: 2.8.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@types/react"
  dependency-version: 19.1.11
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@vitejs/plugin-react"
  dependency-version: 5.0.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: playwright-core
  dependency-version: 1.55.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-darwin-arm64"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-darwin-x64"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-arm64-musl"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-arm64"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-x64-musl"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-linux-x64"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-win32-arm64"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@biomejs/cli-win32-x64"
  dependency-version: 2.2.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/env"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-darwin-arm64"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-darwin-x64"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-arm64-gnu"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-arm64-musl"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-x64-gnu"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-linux-x64-musl"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-win32-arm64-msvc"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@next/swc-win32-x64-msvc"
  dependency-version: 15.5.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rolldown/pluginutils"
  dependency-version: 1.0.0-beta.32
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-android-arm-eabi"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-android-arm64"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-darwin-arm64"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-darwin-x64"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-freebsd-arm64"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-freebsd-x64"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm-gnueabihf"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm-musleabihf"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm64-gnu"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-arm64-musl"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-loongarch64-gnu"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-ppc64-gnu"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-riscv64-gnu"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-riscv64-musl"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-s390x-gnu"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-x64-gnu"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-linux-x64-musl"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-arm64-msvc"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-ia32-msvc"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@rollup/rollup-win32-x64-msvc"
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-darwin-arm64"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-darwin-x64"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-arm-gnueabihf"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-arm64-gnu"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-arm64-musl"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-riscv64-gnu"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-x64-gnu"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-linux-x64-musl"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-win32-arm64-msvc"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-win32-ia32-msvc"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: "@tauri-apps/cli-win32-x64-msvc"
  dependency-version: 2.8.1
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
- dependency-name: caniuse-lite
  dependency-version: 1.0.30001737
  dependency-type: indirect
  update-type: version-update:semver-patch
  dependency-group: frontend-dependencies
- dependency-name: rollup
  dependency-version: 4.48.0
  dependency-type: indirect
  update-type: version-update:semver-minor
  dependency-group: frontend-dependencies
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-23 09:43:13 +00:00
dependabot[bot] d75a367f39 ci(deps): bump the github-actions group with 7 updates
Bumps the github-actions group with 7 updates:

| Package | From | To |
| --- | --- | --- |
| [actions/checkout](https://github.com/actions/checkout) | `4.2.2` | `5.0.0` |
| [dtolnay/rust-toolchain](https://github.com/dtolnay/rust-toolchain) | `b3b07ba8b418998c39fb20f53e8b695cdcc8de1b` | `e97e2d8cc328f1b50210efc529dca0028893a2d9` |
| [google/osv-scanner-action](https://github.com/google/osv-scanner-action) | `2.1.0` | `2.2.1` |
| [ridedott/merge-me-action](https://github.com/ridedott/merge-me-action) | `2.10.124` | `2.10.126` |
| [actions/first-interaction](https://github.com/actions/first-interaction) | `2.0.0` | `3.0.0` |
| [actions/ai-inference](https://github.com/actions/ai-inference) | `1.2.8` | `2.0.0` |
| [crate-ci/typos](https://github.com/crate-ci/typos) | `1.35.3` | `1.35.5` |


Updates `actions/checkout` from 4.2.2 to 5.0.0
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/11bd71901bbe5b1630ceea73d27597364c9af683...08c6903cd8c0fde910a37f88322edcfb5dd907a8)

Updates `dtolnay/rust-toolchain` from b3b07ba8b418998c39fb20f53e8b695cdcc8de1b to e97e2d8cc328f1b50210efc529dca0028893a2d9
- [Release notes](https://github.com/dtolnay/rust-toolchain/releases)
- [Commits](https://github.com/dtolnay/rust-toolchain/compare/b3b07ba8b418998c39fb20f53e8b695cdcc8de1b...e97e2d8cc328f1b50210efc529dca0028893a2d9)

Updates `google/osv-scanner-action` from 2.1.0 to 2.2.1
- [Release notes](https://github.com/google/osv-scanner-action/releases)
- [Commits](https://github.com/google/osv-scanner-action/compare/b00f71e051ddddc6e46a193c31c8c0bf283bf9e6...456ceb78310755116e0a3738121351006286b797)

Updates `ridedott/merge-me-action` from 2.10.124 to 2.10.126
- [Release notes](https://github.com/ridedott/merge-me-action/releases)
- [Changelog](https://github.com/ridedott/merge-me-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ridedott/merge-me-action/compare/f96a67511b4be051e77760230e6a3fb9cb7b1903...ad649157c69da4d34e601ee360de7a74ce4e2090)

Updates `actions/first-interaction` from 2.0.0 to 3.0.0
- [Release notes](https://github.com/actions/first-interaction/releases)
- [Commits](https://github.com/actions/first-interaction/compare/2d4393e6bc0e2efb2e48fba7e06819c3bf61ffc9...753c925c8d1ac6fede23781875376600628d9b5d)

Updates `actions/ai-inference` from 1.2.8 to 2.0.0
- [Release notes](https://github.com/actions/ai-inference/releases)
- [Commits](https://github.com/actions/ai-inference/compare/b81b2afb8390ee6839b494a404766bef6493c7d9...f347eae8ebabecb85d17f52960f909b8a4a8dad5)

Updates `crate-ci/typos` from 1.35.3 to 1.35.5
- [Release notes](https://github.com/crate-ci/typos/releases)
- [Changelog](https://github.com/crate-ci/typos/blob/master/CHANGELOG.md)
- [Commits](https://github.com/crate-ci/typos/compare/52bd719c2c91f9d676e2aa359fc8e0db8925e6d8...a4c3e43aea0a9e9b9e6578d2731ebd9a27e8f6cd)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: 5.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: dtolnay/rust-toolchain
  dependency-version: e97e2d8cc328f1b50210efc529dca0028893a2d9
  dependency-type: direct:production
  dependency-group: github-actions
- dependency-name: google/osv-scanner-action
  dependency-version: 2.2.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: github-actions
- dependency-name: ridedott/merge-me-action
  dependency-version: 2.10.126
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
- dependency-name: actions/first-interaction
  dependency-version: 3.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: actions/ai-inference
  dependency-version: 2.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: github-actions
- dependency-name: crate-ci/typos
  dependency-version: 1.35.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: github-actions
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-23 09:20:24 +00:00
zhom a48eb5d631 refactor: cleanup empty browser folders 2025-08-21 09:13:30 +04:00
zhom 0d79f385bd chore: cleanup 2025-08-19 14:01:24 +04:00
zhom 25bb1dccdc chore: version bump 2025-08-19 13:57:58 +04:00
zhom 97044d58fe feat: proxy sorting 2025-08-19 13:55:40 +04:00
zhom 4748a31714 style: don't overflow 2025-08-19 13:50:24 +04:00
zhom d91c97dd85 chore: formatting 2025-08-19 13:47:19 +04:00
zhom 8e299fddd4 feat: docs inside ui 2025-08-19 13:45:19 +04:00
zhom 6c3c9fb58a chore: comment 2025-08-19 13:38:38 +04:00
zhom f5066e866b fix: pass id instead of profile name to open_url_with_profile 2025-08-19 13:38:29 +04:00
zhom e12a5661b1 refactor: use ids instead of names for all profile operations 2025-08-19 13:31:46 +04:00
zhom f8a4ec3277 refactor: require auth for local api 2025-08-19 13:31:28 +04:00
zhom 1e5664e3b2 feat: launch browsers via api and expose them to selenium 2025-08-19 09:49:39 +04:00
zhom d0fea2fec1 style: scroll area adjustments 2025-08-19 09:48:46 +04:00
zhom ce0627030d style: move geolocation and locale fields above webgl 2025-08-18 18:26:05 +04:00
zhom d70ec16706 refactor: use backend events for ui update 2025-08-18 18:10:37 +04:00
zhom 5863d5549e chore: version bump 2025-08-18 17:47:07 +04:00
zhom 4df35515ae chore: formatting 2025-08-18 17:46:52 +04:00
zhom 59f430ec43 refactor: make ui reactive for proxy changes 2025-08-18 17:37:42 +04:00
zhom 9f68a21824 refactor: make ui reactive for group changes 2025-08-18 17:37:22 +04:00
zhom 9bf7f39c0c docs: agents 2025-08-18 17:01:10 +04:00
zhom d1b45778c4 style: don't trim 'Not Selected' 2025-08-18 16:58:38 +04:00
zhom 6d6527d812 refactor: update ui on profile events 2025-08-18 16:54:36 +04:00
zhom c30df278fb chore: formatting 2025-08-17 13:34:11 +04:00
zhom 95592b4aa1 fix: pass proper argument to profile rename 2025-08-17 13:30:02 +04:00
zhom 58b0067b37 fix: pass correct props to the backend 2025-08-17 13:22:34 +04:00
zhom 6260d78901 style: adjust column width 2025-08-17 13:01:47 +04:00
zhom efab286dad chore: version bump 2025-08-17 12:48:22 +04:00
zhom e51e31911b chore: pnpm update 2025-08-17 12:47:13 +04:00
zhom 348a727da7 chore: formatting 2025-08-17 12:45:58 +04:00
zhom 10f8061acf refactor: better browser process handling 2025-08-17 12:44:20 +04:00
zhom 69348a101e chore: fix check-unused-commands command 2025-08-17 11:31:47 +04:00
zhom f7e116f345 refactor: reduce table re-renders 2025-08-17 11:31:24 +04:00
zhom 2e6bb2498b chore: remove console logs in production 2025-08-16 19:26:18 +04:00
zhom 178f07bec7 chore: pnpm update 2025-08-16 19:02:34 +04:00
zhom c6caf0633e style: init ui block spinner 2025-08-16 13:59:58 +04:00
zhom 29fe20af09 refactor: move profile filtering outside table component 2025-08-16 12:33:54 +04:00
zhom 1cb8e7236d style: highlight non-editing tags state 2025-08-16 12:05:59 +04:00
zhom ab256cd695 style: copy 2025-08-16 11:57:36 +04:00
zhom 96c42ae55e refactor: better error handling for failed downloads 2025-08-16 11:50:34 +04:00
zhom c98e12900f feat: add local api support 2025-08-16 11:42:15 +04:00
zhom 7a0d14642a refactor: better profile creation handling and navigation 2025-08-15 23:31:13 +04:00
zhom a1f153f4fa refactor: show cursor pointer on badge removal button 2025-08-15 19:39:31 +04:00
zhom ff9ad0a5ad refactor: prevent layout shift on tags mode change 2025-08-15 19:37:36 +04:00
zhom 3b78fea62a style: make the tag list more compact 2025-08-15 19:29:06 +04:00
zhom 74e1642aa2 refactor: attempt to rerender less 2025-08-15 19:26:53 +04:00
zhom c9d37519f7 style: make name cell same size in both fields 2025-08-15 19:21:22 +04:00
zhom da9e1d1b58 refactor: show tooltip on tags hover 2025-08-15 19:17:07 +04:00
zhom 77f93cc122 refactor: cleanup tags ui 2025-08-15 19:10:16 +04:00
zhom f7ccca0075 style: cleanup 2025-08-15 18:57:13 +04:00
zhom d7c2f08133 fix: show absolutely positioned items in the table 2025-08-15 18:48:47 +04:00
zhom 8dffd86ab9 refactor: allow window drag during ui block 2025-08-15 18:11:44 +04:00
zhom f3b3207489 refactor: default theme for custom is light 2025-08-15 18:09:55 +04:00
zhom d7a787586d refactor: custom theme cleanup 2025-08-15 18:08:33 +04:00
zhom 4a98eedba0 refactor: reduce table re-renders 2025-08-15 10:33:22 +04:00
zhom 95ee807f3b refactor: better theme initialization 2025-08-15 10:08:15 +04:00
zhom fac99f4a51 refactor: tags 2025-08-15 00:55:10 +04:00
zhom 88cb154fca refactor: color picker 2025-08-15 00:54:57 +04:00
zhom a6af568d9e feat: custom theme 2025-08-15 00:04:31 +04:00
zhom 7c2ed1e0fc refactor: tags 2025-08-15 00:04:10 +04:00
zhom 334f894e68 chore: version bump 2025-08-14 23:06:17 +04:00
zhom a77b733a31 chore: version bump 2025-08-14 22:39:16 +04:00
zhom c10c3b0f95 refactor: creation modal cleanup 2025-08-14 22:37:03 +04:00
zhom 4b16341401 refactor: integrate rename of profile into row 2025-08-14 22:35:44 +04:00
zhom 016d423d2c refactor: revert global listener 2025-08-14 22:01:14 +04:00
zhom 0596cc4009 style: fix toast width 2025-08-14 21:46:36 +04:00
zhom 269db678b7 chore: version bump 2025-08-13 16:29:42 +04:00
zhom f809b975f3 refactor: better ui handling for proxy changes 2025-08-13 16:28:08 +04:00
zhom e369214715 refactor: do not try to reuse old proxy port 2025-08-13 16:13:47 +04:00
zhom 5f93841bb7 style: copy 2025-08-13 15:28:05 +04:00
121 changed files with 22758 additions and 10557 deletions
+6
@@ -0,0 +1,6 @@
---
description:
globs:
alwaysApply: true
---
If you are modifying the UI, do not add random colors that are not controlled by src/lib/themes.ts file.
+7 -5
@@ -31,19 +31,21 @@ jobs:
# build-mode: none
steps:
- name: Checkout repository
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
+ uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Set up pnpm package manager
- uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda #v4.1.0
+ uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 #v4.2.0
with:
run_install: false
- name: Set up Node.js
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 #v4.4.0
+ uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 #v6.0.0
with:
node-version-file: .node-version
cache: "pnpm"
- name: Setup Rust
- uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b #master
+ uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 #master
with:
toolchain: stable
targets: x86_64-unknown-linux-gnu
@@ -55,7 +57,7 @@ jobs:
sudo apt-get install -y libwebkit2gtk-4.1-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev pkg-config xdg-utils
- name: Rust cache
- uses: swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 #v2.8.0
+ uses: swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 #v2.8.1
with:
workdir: ./src-tauri
+5
@@ -2,6 +2,9 @@ on:
push:
branches:
- main
+ release:
+   types:
+     - published
permissions:
contents: write
@@ -15,6 +18,8 @@ jobs:
contents: write
pull-requests: write
steps:
+ - name: Checkout repository
+   uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Contribute List
uses: akhilmhdh/contributors-readme-action@83ea0b4f1ac928fbfe88b9e8460a932a528eb79f #v2.3.11
env:
+7 -9
@@ -13,7 +13,7 @@ jobs:
security-scan:
name: Security Vulnerability Scan
if: ${{ github.actor == 'dependabot[bot]' }}
- uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml@b00f71e051ddddc6e46a193c31c8c0bf283bf9e6" # v2.1.0
+ uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml@b77c075a1235514558f0eb88dbd31e22c45e0cd2" # v2.3.0
with:
scan-args: |-
-r
@@ -70,13 +70,11 @@ jobs:
id: metadata
uses: dependabot/fetch-metadata@08eff52bf64351f401fb50d4972fa95b9f2c2d1b #v2.4.0
with:
compat-lookup: true
github-token: "${{ secrets.GITHUB_TOKEN }}"
- - name: Auto-merge minor and patch updates
-   uses: ridedott/merge-me-action@f96a67511b4be051e77760230e6a3fb9cb7b1903 #v2.10.124
-   with:
-     GITHUB_TOKEN: ${{ secrets.SECRET_DEPENDABOT_GITHUB_TOKEN }}
-     MERGE_METHOD: SQUASH
-     PRESET: DEPENDABOT_MINOR
-     MAXIMUM_RETRIES: 5
+ - name: Enable auto-merge for minor and patch updates
+   if: ${{ steps.metadata.outputs.update-type == 'version-update:semver-minor' || steps.metadata.outputs.update-type == 'version-update:semver-patch' }}
+   run: gh pr merge --auto --squash "$PR_URL"
+   env:
+     PR_URL: ${{ github.event.pull_request.html_url }}
+     GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+   timeout-minutes: 10
+2 -5
@@ -3,17 +3,14 @@ name: Greetings
on:
pull_request:
types: [opened]
issues:
types: [opened]
jobs:
greeting:
runs-on: ubuntu-latest
permissions:
issues: write
pull-requests: write
steps:
- - uses: actions/first-interaction@2d4393e6bc0e2efb2e48fba7e06819c3bf61ffc9 # v2.0.0
+ - uses: actions/first-interaction@1c4688942c71f71d4f5502a26ea67c331730fa4d # v3.1.0
with:
issue-message: "Thank you for your first issue ❤️ If it's a feature request, please make sure it's clear what you want, why you want it, and how important it is to you. If you posted a bug report, please make sure it includes as much detail as possible."
repo-token: ${{ secrets.GITHUB_TOKEN }}
pr-message: "Welcome to the community and thank you for your first contribution ❤️ A human will review your PR shortly. Make sure that the pipelines are green, so that the PR is considered ready for a review and could be merged."
+59 -45
@@ -15,7 +15,7 @@ jobs:
steps:
- name: Checkout repository
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
+ uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Get issue templates
id: get-templates
@@ -49,7 +49,7 @@ jobs:
- name: Validate issue with AI
id: validate
- uses: actions/ai-inference@b81b2afb8390ee6839b494a404766bef6493c7d9 # v1.2.8
+ uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
with:
prompt-file: issue_analysis.txt
system-prompt: |
@@ -93,6 +93,25 @@ jobs:
Be constructive and helpful in your feedback. If the issue is incomplete, provide specific guidance on what's needed.
model: openai/gpt-4o
+ - name: Check if first-time contributor
+   id: check-first-time
+   env:
+     GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+     ISSUE_AUTHOR: ${{ github.event.issue.user.login }}
+   run: |
+     # Check if user has created issues before (excluding the current one)
+     ISSUE_COUNT=$(gh api "/repos/${{ github.repository }}/issues" \
+       --jq "map(select(.user.login == \"$ISSUE_AUTHOR\" and .number != ${{ github.event.issue.number }})) | length" \
+       --paginate || echo "0")
+     if [ "$ISSUE_COUNT" = "0" ]; then
+       echo "is_first_time=true" >> $GITHUB_OUTPUT
+       echo "✅ First-time contributor detected"
+     else
+       echo "is_first_time=false" >> $GITHUB_OUTPUT
+       echo "Returning contributor"
+     fi
- name: Parse validation result and take action
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -120,41 +139,35 @@ jobs:
echo "Issue validation result: $IS_VALID"
echo "Issue type: $ISSUE_TYPE"
+ # Prepare greeting message for first-time contributors
+ IS_FIRST_TIME="${{ steps.check-first-time.outputs.is_first_time }}"
+ GREETING_SECTION=""
+ if [ "$IS_FIRST_TIME" = "true" ]; then
+   GREETING_SECTION="## 👋 Welcome!\n\nThank you for your first issue ❤️ If this is a feature request, please make sure it is clear what you want, why you want it, and how important it is to you. If you posted a bug report, please make sure it includes as much detail as possible.\n\n---\n\n"
+ fi
if [ "$IS_VALID" = "false" ]; then
# Create a comment asking for more information
- cat > comment.md << EOF
- ## 🤖 Issue Validation
- Thank you for submitting this issue! However, it appears that some required information might be missing to help us better understand and address your concern.
- **Issue Type Detected:** \`$ISSUE_TYPE\`
- **Assessment:** $ASSESSMENT
- ### 📋 Missing Information:
- $MISSING_INFO
- ### 💡 Suggestions for Improvement:
- $SUGGESTIONS
- ### 📝 How to Provide Additional Information:
- Please edit your original issue description to include the missing information. Here are our issue templates for reference:
- - **Bug Report Template:** [View Template](.github/ISSUE_TEMPLATE/01-bug-report.md)
- - **Feature Request Template:** [View Template](.github/ISSUE_TEMPLATE/02-feature-request.md)
- ### 🔧 Quick Tips:
- - For **bug reports**: Include step-by-step reproduction instructions, your environment details, and any error messages
- - For **feature requests**: Describe the use case, expected behavior, and why this feature would be valuable
- - Add **screenshots** or **logs** when applicable
- Once you've updated the issue with the missing information, feel free to remove this comment or reply to let us know you've made the updates.
- ---
- *This validation was performed automatically to ensure we have all the information needed to help you effectively.*
- EOF
{
printf "%b" "$GREETING_SECTION"
printf "## 🤖 Issue Validation\n\n"
printf "Thank you for submitting this issue! However, it appears that some required information might be missing to help us better understand and address your concern.\n\n"
printf "**Issue Type Detected:** \`%s\`\n\n" "$ISSUE_TYPE"
printf "**Assessment:** %s\n\n" "$ASSESSMENT"
printf "### 📋 Missing Information:\n%s\n\n" "$MISSING_INFO"
printf "### 💡 Suggestions for Improvement:\n%s\n\n" "$SUGGESTIONS"
printf "### 📝 How to Provide Additional Information:\n\n"
printf "Please edit your original issue description to include the missing information. Here are our issue templates for reference:\n\n"
printf -- "- **Bug Report Template:** [View Template](.github/ISSUE_TEMPLATE/01-bug-report.md)\n"
printf -- "- **Feature Request Template:** [View Template](.github/ISSUE_TEMPLATE/02-feature-request.md)\n\n"
printf "### 🔧 Quick Tips:\n"
printf -- "- For **bug reports**: Include step-by-step reproduction instructions, your environment details, and any error messages\n"
printf -- "- For **feature requests**: Describe the use case, expected behavior, and why this feature would be valuable\n"
printf -- "- Add **screenshots** or **logs** when applicable\n\n"
printf "Once you have updated the issue with the missing information, feel free to remove this comment or reply to let us know you have made the updates.\n\n"
printf -- "---\n*This validation was performed automatically to ensure we have all the information needed to help you effectively.*\n"
} > comment.md
# Post the comment
gh issue comment ${{ github.event.issue.number }} --body-file comment.md
@@ -167,18 +180,19 @@ jobs:
echo "✅ Issue contains sufficient information"
# Prepare a summary comment even when valid
cat > comment.md << EOF
## 🤖 Issue Validation
SUGGESTIONS_SECTION=""
if [ -n "$SUGGESTIONS" ]; then
SUGGESTIONS_SECTION=$(printf "### 💡 Suggestions:\n%s\n\n" "$SUGGESTIONS")
fi
**Issue Type Detected:** \`$ISSUE_TYPE\`
**Assessment:** $ASSESSMENT
$( [ -n "$SUGGESTIONS" ] && echo "### 💡 Suggestions:" && echo "$SUGGESTIONS" )
---
*This validation was performed automatically to help triage issues.*
EOF
{
printf "%b" "$GREETING_SECTION"
printf "## 🤖 Issue Validation\n\n"
printf "**Issue Type Detected:** \`%s\`\n\n" "$ISSUE_TYPE"
printf "**Assessment:** %s\n\n" "$ASSESSMENT"
printf "%b" "$SUGGESTIONS_SECTION"
printf -- "---\n*This validation was performed automatically to help triage issues.*\n"
} > comment.md
# Post the summary comment
gh issue comment ${{ github.event.issue.number }} --body-file comment.md
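The hunks above replace heredoc-built comments with a grouped `printf` block. A minimal sketch of that pattern, with illustrative variable values, shows the two details the workflow relies on: `%b` expands the `\n` escapes stored in the greeting string, while `%s` keeps interpolated text literal, and `--` stops `printf` from parsing a leading `-` as an option:

```shell
#!/usr/bin/env bash
# Sketch of the printf-based comment assembly; values are stand-ins.
GREETING='## Welcome!\n\nThanks for your first issue.\n\n'
ISSUE_TYPE='bug'
{
  printf "%b" "$GREETING"                                  # %b expands \n escapes
  printf "**Issue Type Detected:** \`%s\`\n" "$ISSUE_TYPE" # %s stays literal
  printf -- "- a list item\n"                              # -- guards the leading dash
} > comment.md
```

The grouped redirection writes everything in one pass, avoiding the quoting and indentation pitfalls of a heredoc inside a YAML `run:` block.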
+5 -3
View File
@@ -34,13 +34,15 @@ jobs:
run: git config --global core.autocrlf false
- name: Checkout repository code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Set up pnpm package manager
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda #v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 #v4.2.0
with:
run_install: false
- name: Set up Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 #v4.4.0
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 #v6.0.0
with:
node-version-file: .node-version
cache: "pnpm"
+37 -10
View File
@@ -31,7 +31,7 @@ jobs:
build:
strategy:
matrix:
os: [macos-latest, ubuntu-latest]
os: [macos-latest, ubuntu-22.04]
runs-on: ${{ matrix.os }}
@@ -41,19 +41,21 @@ jobs:
run: git config --global core.autocrlf false
- name: Checkout repository code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Set up pnpm package manager
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda #v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 #v4.2.0
with:
run_install: false
- name: Set up Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 #v4.4.0
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 #v6.0.0
with:
node-version-file: .node-version
cache: "pnpm"
- name: Install Rust toolchain
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b #master
uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 #master
with:
toolchain: stable
components: rustfmt, clippy
@@ -65,7 +67,7 @@ jobs:
run: cargo install banderole
- name: Install dependencies (Ubuntu only)
if: matrix.os == 'ubuntu-latest'
if: matrix.os == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt install libwebkit2gtk-4.1-dev build-essential curl wget file libxdo-dev libssl-dev libayatana-appindicator3-dev librsvg2-dev
@@ -77,7 +79,7 @@ jobs:
shell: bash
working-directory: ./nodecar
run: |
if [[ "${{ matrix.os }}" == "ubuntu-latest" ]]; then
if [[ "${{ matrix.os }}" == "ubuntu-22.04" ]]; then
pnpm run build:linux-x64
elif [[ "${{ matrix.os }}" == "macos-latest" ]]; then
pnpm run build:mac-aarch64
@@ -94,7 +96,7 @@ jobs:
shell: bash
run: |
mkdir -p src-tauri/binaries
if [[ "${{ matrix.os }}" == "ubuntu-latest" ]]; then
if [[ "${{ matrix.os }}" == "ubuntu-22.04" ]]; then
cp nodecar/nodecar-bin src-tauri/binaries/nodecar-x86_64-unknown-linux-gnu
elif [[ "${{ matrix.os }}" == "macos-latest" ]]; then
cp nodecar/nodecar-bin src-tauri/binaries/nodecar-aarch64-apple-darwin
@@ -102,8 +104,33 @@ jobs:
cp nodecar/nodecar-bin.exe src-tauri/binaries/nodecar-x86_64-pc-windows-msvc.exe
fi
- name: Create empty 'dist' directory
run: mkdir dist
- name: Build frontend
run: pnpm next build
- name: Get host target
id: host_target
shell: bash
run: |
HOST_TARGET=$(rustc -vV | sed -n 's|host: ||p')
echo "target=${HOST_TARGET}" >> $GITHUB_OUTPUT
echo "Host target: ${HOST_TARGET}"
- name: Build donut-proxy sidecar
shell: bash
working-directory: ./src-tauri
run: cargo build --bin donut-proxy
- name: Copy donut-proxy binary to Tauri binaries
shell: bash
run: |
mkdir -p src-tauri/binaries
HOST_TARGET="${{ steps.host_target.outputs.target }}"
if [[ "$HOST_TARGET" == *"windows"* ]]; then
cp src-tauri/target/debug/donut-proxy.exe src-tauri/binaries/donut-proxy-${HOST_TARGET}.exe
else
cp src-tauri/target/debug/donut-proxy src-tauri/binaries/donut-proxy-${HOST_TARGET}
chmod +x src-tauri/binaries/donut-proxy-${HOST_TARGET}
fi
- name: Run rustfmt check
run: cargo fmt --all -- --check
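The host-target step above derives the triple from `rustc -vV`. The `sed` expression can be exercised without a Rust toolchain by simulating the compiler output with `printf`:

```shell
#!/usr/bin/env bash
# Simulated `rustc -vV` output piped through the workflow's sed filter.
printf 'rustc 1.80.0\nhost: x86_64-unknown-linux-gnu\nrelease: 1.80.0\n' \
  | sed -n 's|host: ||p'
# prints: x86_64-unknown-linux-gnu
```

`sed -n` suppresses default output, and the `p` flag prints only the line where the `host: ` prefix was stripped.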
+2 -2
View File
@@ -50,7 +50,7 @@ jobs:
scan-scheduled:
name: Scheduled Security Scan
if: ${{ github.event_name == 'push' || github.event_name == 'schedule' }}
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml@b00f71e051ddddc6e46a193c31c8c0bf283bf9e6" # v2.1.0
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml@b77c075a1235514558f0eb88dbd31e22c45e0cd2" # v2.3.0
with:
scan-args: |-
-r
@@ -63,7 +63,7 @@ jobs:
scan-pr:
name: PR Security Scan
if: ${{ github.event_name == 'pull_request' || github.event_name == 'merge_group' }}
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml@b00f71e051ddddc6e46a193c31c8c0bf283bf9e6" # v2.1.0
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml@b77c075a1235514558f0eb88dbd31e22c45e0cd2" # v2.3.0
with:
scan-args: |-
-r
+1 -1
View File
@@ -29,7 +29,7 @@ jobs:
security-scan:
name: Security Vulnerability Scan
if: ${{ github.event_name == 'pull_request' || github.event_name == 'merge_group' }}
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml@b00f71e051ddddc6e46a193c31c8c0bf283bf9e6" # v2.1.0
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable-pr.yml@b77c075a1235514558f0eb88dbd31e22c45e0cd2" # v2.3.0
with:
scan-args: |-
-r
@@ -15,7 +15,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
with:
fetch-depth: 0 # Fetch full history to compare with previous release
@@ -58,7 +58,7 @@ jobs:
- name: Generate release notes with AI
id: generate-notes
uses: actions/ai-inference@b81b2afb8390ee6839b494a404766bef6493c7d9 # v1.2.8
uses: actions/ai-inference@a1c11829223a786afe3b5663db904a3aa1eac3a2 # v2.0.1
with:
prompt-file: commits.txt
system-prompt: |
@@ -104,8 +104,13 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
# Get the generated release notes
RELEASE_NOTES="${{ steps.generate-notes.outputs.response }}"
# Prefer reading from the response file to avoid output truncation
RESPONSE_FILE='${{ steps.generate-notes.outputs.response-file }}'
if [ -n "$RESPONSE_FILE" ] && [ -f "$RESPONSE_FILE" ]; then
RELEASE_NOTES=$(cat "$RESPONSE_FILE")
else
RELEASE_NOTES='${{ steps.generate-notes.outputs.response }}'
fi
# Update the release with the generated notes
gh api --method PATCH /repos/${{ github.repository }}/releases/${{ github.event.release.id }} \
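The release-notes change above prefers the `response-file` output over the inline `response` output, which GitHub Actions may truncate for large payloads. The fallback logic can be sketched as a small function; `pick_notes` and its arguments are illustrative stand-ins for the workflow values:

```shell
#!/usr/bin/env bash
# Sketch of the truncation-safe read: prefer the response file when it
# exists and is non-empty, otherwise fall back to the inline output.
pick_notes() {
  local response_file="$1" inline="$2"
  if [ -n "$response_file" ] && [ -f "$response_file" ]; then
    cat "$response_file"
  else
    printf '%s' "$inline"
  fi
}
```

In the workflow this corresponds to `pick_notes "$RESPONSE_FILE" "$INLINE_RESPONSE"` feeding `RELEASE_NOTES`.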
+49 -17
View File
@@ -13,7 +13,7 @@ env:
jobs:
security-scan:
name: Security Vulnerability Scan
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml@b00f71e051ddddc6e46a193c31c8c0bf283bf9e6" # v2.1.0
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml@b77c075a1235514558f0eb88dbd31e22c45e0cd2" # v2.3.0
with:
scan-args: |-
-r
@@ -105,18 +105,21 @@ jobs:
runs-on: ${{ matrix.platform }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
- name: Setup Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 #v4.4.0
with:
node-version-file: .node-version
- uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda #v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 #v4.2.0
with:
run_install: false
- name: Setup Node.js
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 #v6.0.0
with:
node-version-file: .node-version
cache: "pnpm"
- name: Setup Rust
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b #master
uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 #master
with:
toolchain: stable
targets: ${{ matrix.target }}
@@ -128,7 +131,7 @@ jobs:
sudo apt-get install -y libwebkit2gtk-4.1-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev pkg-config xdg-utils
- name: Rust cache
uses: swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 #v2.8.0
uses: swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 #v2.8.1
with:
workdir: ./src-tauri
@@ -159,14 +162,43 @@ jobs:
# continue-on-error: true
- name: Build frontend
run: pnpm build
run: pnpm exec next build
- name: Verify frontend dist exists
shell: bash
run: |
if [ ! -d "dist" ]; then
echo "Error: dist directory not found after build"
ls -la
exit 1
fi
echo "Frontend dist directory verified at $(pwd)/dist"
echo "Checking from src-tauri perspective:"
ls -la src-tauri/../dist || echo "Warning: dist not accessible from src-tauri"
- name: Build donut-proxy sidecar
shell: bash
working-directory: ./src-tauri
run: cargo build --bin donut-proxy --target ${{ matrix.target }} --release
- name: Copy donut-proxy binary to Tauri binaries
shell: bash
run: |
mkdir -p src-tauri/binaries
if [[ "${{ matrix.platform }}" == "windows-latest" ]]; then
cp src-tauri/target/${{ matrix.target }}/release/donut-proxy.exe src-tauri/binaries/donut-proxy-${{ matrix.target }}.exe
else
cp src-tauri/target/${{ matrix.target }}/release/donut-proxy src-tauri/binaries/donut-proxy-${{ matrix.target }}
chmod +x src-tauri/binaries/donut-proxy-${{ matrix.target }}
fi
- name: Build Tauri app
uses: tauri-apps/tauri-action@564aea5a8075c7a54c167bb0cf5b3255314a7f9d #v0.5.22
uses: tauri-apps/tauri-action@19b93bb55601e3e373a93cfb6eb4242e45f5af20 #v0.6.0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REF_NAME: ${{ github.ref_name }}
with:
projectPath: ./src-tauri
tagName: ${{ github.ref_name }}
releaseName: "Donut Browser ${{ github.ref_name }}"
releaseBody: "See the assets to download this version and install."
@@ -174,8 +206,8 @@ jobs:
prerelease: false
args: ${{ matrix.args }}
- name: Commit CHANGELOG.md
uses: stefanzweifel/git-auto-commit-action@778341af668090896ca464160c2def5d1d1a3eb0 #v6.0.1
with:
branch: main
commit_message: "docs: update CHANGELOG.md for ${{ github.ref_name }} [skip ci]"
# - name: Commit CHANGELOG.md
# uses: stefanzweifel/git-auto-commit-action@778341af668090896ca464160c2def5d1d1a3eb0 #v6.0.1
# with:
# branch: main
# commit_message: "docs: update CHANGELOG.md for ${{ github.ref_name }} [skip ci]"
+44 -12
View File
@@ -12,7 +12,7 @@ env:
jobs:
security-scan:
name: Security Vulnerability Scan
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml@b00f71e051ddddc6e46a193c31c8c0bf283bf9e6" # v2.1.0
uses: "google/osv-scanner-action/.github/workflows/osv-scanner-reusable.yml@b77c075a1235514558f0eb88dbd31e22c45e0cd2" # v2.3.0
with:
scan-args: |-
-r
@@ -104,18 +104,21 @@ jobs:
runs-on: ${{ matrix.platform }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
- name: Setup Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 #v4.4.0
with:
node-version-file: .node-version
- uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Setup pnpm
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda #v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 #v4.2.0
with:
run_install: false
- name: Setup Node.js
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 #v6.0.0
with:
node-version-file: .node-version
cache: "pnpm"
- name: Setup Rust
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b #master
uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 #master
with:
toolchain: stable
targets: ${{ matrix.target }}
@@ -127,7 +130,7 @@ jobs:
sudo apt-get install -y libwebkit2gtk-4.1-dev libgtk-3-dev libayatana-appindicator3-dev librsvg2-dev pkg-config xdg-utils
- name: Rust cache
uses: swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 #v2.8.0
uses: swatinem/rust-cache@f13886b937689c021905a6b90929199931d60db1 #v2.8.1
with:
workdir: ./src-tauri
@@ -158,7 +161,35 @@ jobs:
# continue-on-error: true
- name: Build frontend
run: pnpm build
run: pnpm exec next build
- name: Verify frontend dist exists
shell: bash
run: |
if [ ! -d "dist" ]; then
echo "Error: dist directory not found after build"
ls -la
exit 1
fi
echo "Frontend dist directory verified at $(pwd)/dist"
echo "Checking from src-tauri perspective:"
ls -la src-tauri/../dist || echo "Warning: dist not accessible from src-tauri"
- name: Build donut-proxy sidecar
shell: bash
working-directory: ./src-tauri
run: cargo build --bin donut-proxy --target ${{ matrix.target }} --release
- name: Copy donut-proxy binary to Tauri binaries
shell: bash
run: |
mkdir -p src-tauri/binaries
if [[ "${{ matrix.platform }}" == "windows-latest" ]]; then
cp src-tauri/target/${{ matrix.target }}/release/donut-proxy.exe src-tauri/binaries/donut-proxy-${{ matrix.target }}.exe
else
cp src-tauri/target/${{ matrix.target }}/release/donut-proxy src-tauri/binaries/donut-proxy-${{ matrix.target }}
chmod +x src-tauri/binaries/donut-proxy-${{ matrix.target }}
fi
- name: Generate nightly timestamp
id: timestamp
@@ -170,13 +201,14 @@ jobs:
echo "Generated timestamp: ${TIMESTAMP}-${COMMIT_HASH}"
- name: Build Tauri app
uses: tauri-apps/tauri-action@564aea5a8075c7a54c167bb0cf5b3255314a7f9d #v0.5.22
uses: tauri-apps/tauri-action@19b93bb55601e3e373a93cfb6eb4242e45f5af20 #v0.6.0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
BUILD_TAG: "nightly-${{ steps.timestamp.outputs.timestamp }}"
GITHUB_REF_NAME: "nightly-${{ steps.timestamp.outputs.timestamp }}"
GITHUB_SHA: ${{ github.sha }}
with:
projectPath: ./src-tauri
tagName: "nightly-${{ steps.timestamp.outputs.timestamp }}"
releaseName: "Donut Browser Nightly (Build ${{ steps.timestamp.outputs.timestamp }})"
releaseBody: "⚠️ **Nightly Release** - This is an automatically generated pre-release build from the latest main branch. Use with caution.\n\nCommit: ${{ github.sha }}\nBuild: ${{ steps.timestamp.outputs.timestamp }}"
+2 -2
View File
@@ -21,6 +21,6 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions Repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 #v4.2.2
uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
- name: Spell Check Repo
uses: crate-ci/typos@52bd719c2c91f9d676e2aa359fc8e0db8925e6d8 #v1.35.3
uses: crate-ci/typos@626c4bedb751ce0b7f03262ca97ddda9a076ae1c #v1.39.2
+1 -1
View File
@@ -12,7 +12,7 @@ jobs:
pull-requests: write
steps:
- uses: actions/stale@5bef64f19d7facfb25b37b414482c7164d639639 # v9.1.0
- uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
stale-issue-message: "This issue has been inactive for 60 days. Please respond to keep it open."
+22
View File
@@ -1,5 +1,7 @@
{
"cSpell.words": [
"ABORTIFHUNG",
"aboutwelcome",
"adwaita",
"ahooks",
"akhilmhdh",
@@ -15,11 +17,14 @@
"busctl",
"CAMOU",
"camoufox",
"catppuccin",
"cdylib",
"certifi",
"CFURL",
"checkin",
"chrono",
"ciphertext",
"cksum",
"CLICOLOR",
"clippy",
"cmdk",
@@ -31,7 +36,9 @@
"dataclasses",
"datareporting",
"datas",
"DBAPI",
"dconf",
"debuginfo",
"devedition",
"distro",
"doctest",
@@ -67,12 +74,14 @@
"kdeglobals",
"keras",
"KHTML",
"killall",
"Kolkata",
"kreadconfig",
"launchservices",
"letterboxing",
"libatk",
"libayatana",
"libc",
"libcairo",
"libgdk",
"libglib",
@@ -81,11 +90,15 @@
"libwebkit",
"libxdo",
"localtime",
"lpdw",
"lxml",
"lzma",
"macchiato",
"Matchalk",
"mmdb",
"mountpoint",
"msiexec",
"mstone",
"msvc",
"msys",
"Mullvad",
@@ -111,9 +124,12 @@
"peerconnection",
"pids",
"pixbuf",
"pkexec",
"pkill",
"plasmohq",
"platformdirs",
"prefs",
"PRIO",
"propertylist",
"psutil",
"pycache",
@@ -127,10 +143,15 @@
"ridedott",
"rlib",
"rustc",
"rwxr",
"SARIF",
"scipy",
"screeninfo",
"selectables",
"serde",
"setpriority",
"setsid",
"SETTINGCHANGE",
"setuptools",
"shadcn",
"showcursor",
@@ -138,6 +159,7 @@
"signon",
"signum",
"sklearn",
"SMTO",
"sonner",
"splitn",
"sspi",
+1
View File
@@ -6,3 +6,4 @@
- Before finishing the task and showing the summary, always run "pnpm format && pnpm lint && pnpm test" at the root of the project to ensure you don't finish with a broken application.
- Any time you change nodecar's code and want to test it, recompile it with "cd nodecar && pnpm build".
- If there is a global singleton of a struct, only use it inside a method while properly initializing it, unless I have explicitly specified in the request otherwise.
- If you are modifying the UI, do not add random colors that are not controlled by src/lib/themes.ts file.
+1 -1
View File
@@ -35,7 +35,7 @@
- Create unlimited number of local browser profiles completely isolated from each other
- Safely use multiple accounts on one device by using anti-detect browser profiles, powered by [Camoufox](https://camoufox.com)
- Proxy support with basic auth for all browsers except for TOR Browser
- Proxy support with basic auth for all browsers
- Import profiles from your existing browsers
- Automatic updates for browsers
- Set Donut Browser as your default browser to control in which profile to open links
+4 -4
View File
@@ -1,5 +1,5 @@
{
"$schema": "https://biomejs.dev/schemas/2.0.6/schema.json",
"$schema": "https://biomejs.dev/schemas/2.2.0/schema.json",
"vcs": {
"enabled": false,
"clientKind": "git",
@@ -18,11 +18,11 @@
"rules": {
"recommended": true,
"correctness": {
"useUniqueElementIds": "off",
"useHookAtTopLevel": "error"
},
"nursery": {
"useUniqueElementIds": "off"
},
"nursery": "off",
"suspicious": "off",
"a11y": {
"useSemanticElements": "off"
}
+1
View File
@@ -1,5 +1,6 @@
/// <reference types="next" />
/// <reference types="next/image-types/global" />
/// <reference path="./dist/types/routes.d.ts" />
// NOTE: This file should not be edited
// see https://nextjs.org/docs/app/api-reference/config/typescript for more information.
+3
View File
@@ -7,6 +7,9 @@ const nextConfig: NextConfig = {
unoptimized: true,
},
distDir: "dist",
compiler: {
removeConsole: process.env.NODE_ENV === "production",
},
};
export default nextConfig;
-3
View File
@@ -23,6 +23,3 @@ fi
# Copy the file with target triple suffix
cp "nodecar-bin" "../src-tauri/binaries/nodecar-${TARGET_TRIPLE}${EXT}"
# Also copy a generic version for Tauri to find
cp "nodecar-bin" "../src-tauri/binaries/nodecar${EXT}"
+8 -8
View File
@@ -21,18 +21,18 @@
"author": "",
"license": "AGPL-3.0",
"dependencies": {
"@types/node": "^24.2.1",
"commander": "^14.0.0",
"donutbrowser-camoufox-js": "^0.6.6",
"dotenv": "^17.2.1",
"fingerprint-generator": "^2.1.69",
"@types/node": "^24.10.0",
"commander": "^14.0.2",
"donutbrowser-camoufox-js": "^0.7.0",
"dotenv": "^17.2.3",
"fingerprint-generator": "^2.1.76",
"get-port": "^7.1.0",
"nodemon": "^3.1.10",
"playwright-core": "^1.54.2",
"nodemon": "^3.1.11",
"playwright-core": "^1.56.1",
"proxy-chain": "^2.5.9",
"tmp": "^0.2.5",
"ts-node": "^10.9.2",
"typescript": "^5.9.2"
"typescript": "^5.9.3"
},
"devDependencies": {
"@types/tmp": "^0.2.6"
+87 -27
View File
@@ -267,6 +267,18 @@ export async function startCamoufoxProcess(
});
}
/**
* Check if a process is running by PID
*/
function isProcessRunning(pid: number): boolean {
try {
process.kill(pid, 0);
return true;
} catch {
return false;
}
}
/**
* Stop a Camoufox process
* @param id The Camoufox ID to stop
@@ -279,45 +291,85 @@ export async function stopCamoufoxProcess(id: string): Promise<boolean> {
return false;
}
const pid = config.processId;
try {
// Method 1: If we have a process ID, kill by PID with proper signal sequence
if (config.processId) {
if (pid && isProcessRunning(pid)) {
try {
// First try SIGTERM for graceful shutdown
process.kill(config.processId, "SIGTERM");
// Give it more time to terminate gracefully (increased from 2s to 5s)
await new Promise((resolve) => setTimeout(resolve, 5000));
process.kill(pid, "SIGTERM");
// Check if process is still running
try {
process.kill(config.processId, 0); // Signal 0 checks if process exists
process.kill(config.processId, "SIGKILL");
} catch {}
} catch {}
// Wait up to 3 seconds for graceful shutdown
for (let i = 0; i < 30; i++) {
await new Promise((resolve) => setTimeout(resolve, 100));
if (!isProcessRunning(pid)) {
break;
}
}
// If still running, force kill
if (isProcessRunning(pid)) {
process.kill(pid, "SIGKILL");
// Wait for SIGKILL to take effect
for (let i = 0; i < 20; i++) {
await new Promise((resolve) => setTimeout(resolve, 100));
if (!isProcessRunning(pid)) {
break;
}
}
}
} catch {
// Process might have already exited
}
}
// Method 2: Pattern-based kill as fallback
const killByPattern = spawn(
"pkill",
["-TERM", "-f", `camoufox-worker.*${id}`],
{
stdio: "ignore",
},
);
// Wait for pattern-based kill command to complete
// Method 2: Pattern-based kill as fallback (kills any child processes)
await new Promise<void>((resolve) => {
const killByPattern = spawn(
"pkill",
["-TERM", "-f", `camoufox-worker.*${id}`],
{ stdio: "ignore" },
);
killByPattern.on("exit", () => resolve());
// Timeout after 3 seconds
setTimeout(() => resolve(), 3000);
setTimeout(() => resolve(), 1000);
});
// Final cleanup with SIGKILL if needed
setTimeout(() => {
spawn("pkill", ["-KILL", "-f", `camoufox-worker.*${id}`], {
stdio: "ignore",
// Wait a moment then force kill any remaining
await new Promise((resolve) => setTimeout(resolve, 500));
await new Promise<void>((resolve) => {
const killByPatternForce = spawn(
"pkill",
["-KILL", "-f", `camoufox-worker.*${id}`],
{ stdio: "ignore" },
);
killByPatternForce.on("exit", () => resolve());
setTimeout(() => resolve(), 1000);
});
// Also kill any Firefox processes associated with this profile
if (config.profilePath) {
await new Promise<void>((resolve) => {
const killFirefox = spawn(
"pkill",
["-KILL", "-f", config.profilePath!],
{ stdio: "ignore" },
);
killFirefox.on("exit", () => resolve());
setTimeout(() => resolve(), 1000);
});
}, 1000);
}
// Verify process is actually dead
if (pid && isProcessRunning(pid)) {
// Last resort: SIGKILL again
try {
process.kill(pid, "SIGKILL");
} catch {
// Ignore
}
}
// Delete the configuration
deleteCamoufoxConfig(id);
@@ -352,6 +404,7 @@ interface GenerateConfigOptions {
blockWebgl?: boolean;
executablePath?: string;
fingerprint?: string;
os?: "windows" | "macos" | "linux";
}
/**
@@ -431,6 +484,13 @@ export async function generateCamoufoxConfig(
}
}
launchOpts.allowAddonNewTab = true;
// Add OS option for fingerprint generation
if (options.os) {
launchOpts.os = options.os;
}
// Generate the configuration using launchOptions
const generatedOptions = await launchOptions(launchOpts);
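The reworked `stopCamoufoxProcess` above escalates through liveness-check (signal 0), `SIGTERM`, a polling wait, and finally `SIGKILL`. The same sequence, sketched in shell under the assumption of a plain POSIX process (function names are illustrative):

```shell
#!/usr/bin/env bash
# Escalating shutdown: signal 0 probes liveness without delivering a signal.
is_running() { kill -0 "$1" 2>/dev/null; }

stop_process() {
  local pid="$1"
  is_running "$pid" || return 0
  kill -TERM "$pid" 2>/dev/null
  for _ in $(seq 1 30); do          # wait up to ~3s for a graceful exit
    is_running "$pid" || return 0
    sleep 0.1
  done
  kill -KILL "$pid" 2>/dev/null     # escalate if still alive
  for _ in $(seq 1 20); do
    is_running "$pid" || return 0
    sleep 0.1
  done
  return 1                          # process survived SIGKILL (should not happen)
}
```

Polling after `SIGTERM` instead of sleeping a fixed 5 seconds is what lets the new code return as soon as the process actually exits.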
+124 -1
View File
@@ -1,18 +1,40 @@
import fs from "node:fs";
import path from "node:path";
import { launchOptions } from "donutbrowser-camoufox-js";
import type { LaunchOptions } from "donutbrowser-camoufox-js/dist/utils.js";
import { type Browser, type BrowserContext, firefox } from "playwright-core";
import tmp from "tmp";
import { getCamoufoxConfig, saveCamoufoxConfig } from "./camoufox-storage.js";
import { getEnvVars, parseProxyString } from "./utils.js";
// Set up debug logging to a file
const LOG_DIR = path.join(tmp.tmpdir, "donutbrowser", "camoufox-logs");
if (!fs.existsSync(LOG_DIR)) {
fs.mkdirSync(LOG_DIR, { recursive: true });
}
function debugLog(id: string, message: string, data?: any): void {
const logFile = path.join(LOG_DIR, `${id}.log`);
const timestamp = new Date().toISOString();
const logMessage = data
? `[${timestamp}] ${message}: ${JSON.stringify(data, null, 2)}\n`
: `[${timestamp}] ${message}\n`;
fs.appendFileSync(logFile, logMessage);
}
/**
* Run a Camoufox browser server as a worker process
* @param id The Camoufox configuration ID
*/
export async function runCamoufoxWorker(id: string): Promise<void> {
debugLog(id, "Worker starting", { pid: process.pid });
// Get the Camoufox configuration
debugLog(id, "Loading Camoufox configuration");
const config = getCamoufoxConfig(id);
if (!config) {
debugLog(id, "Configuration not found");
console.error(
JSON.stringify({
error: "Configuration not found",
@@ -22,6 +44,13 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
process.exit(1);
}
debugLog(id, "Configuration loaded successfully", {
profilePath: config.profilePath,
hasOptions: !!config.options,
hasCustomConfig: !!config.customConfig,
hasUrl: !!config.url,
});
config.processId = process.pid;
saveCamoufoxConfig(config);
@@ -37,12 +66,14 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
// Launch browser in background - this can take time and may fail
setImmediate(async () => {
debugLog(id, "Starting browser launch in background");
let browser: Browser | null = null;
let context: BrowserContext | null = null;
let windowCheckInterval: NodeJS.Timeout | null = null;
// Graceful shutdown handler with access to browser and server
const gracefulShutdown = async () => {
debugLog(id, "Graceful shutdown initiated");
try {
// Clear any intervals first
if (windowCheckInterval) {
@@ -76,14 +107,19 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
process.on("unhandledRejection", () => void gracefulShutdown());
try {
debugLog(id, "Preparing launch options");
// Deep clone to avoid reference sharing and ensure fresh configuration for each instance
const camoufoxOptions: LaunchOptions = JSON.parse(
JSON.stringify(config.options || {}),
);
debugLog(id, "Base options cloned", {
hasOptions: Object.keys(camoufoxOptions).length,
});
// Add profile path if provided
if (config.profilePath) {
camoufoxOptions.user_data_dir = config.profilePath;
debugLog(id, "Set user_data_dir", { profilePath: config.profilePath });
}
// Ensure block options are properly set
@@ -111,52 +147,94 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
showcursor: false,
...(camoufoxOptions.config || {}),
};
debugLog(id, "Set default options", {
i_know_what_im_doing: true,
disableTheming: true,
showcursor: false,
});
// Generate fresh options for this specific instance
debugLog(id, "Generating launch options via launchOptions function");
const generatedOptions = await launchOptions(camoufoxOptions);
debugLog(id, "Launch options generated successfully", {
hasEnv: !!generatedOptions.env,
argsLength: generatedOptions.args?.length || 0,
});
// Start with process environment to ensure proper inheritance
let finalEnv = { ...process.env };
debugLog(id, "Base environment variables set", {
envVarCount: Object.keys(finalEnv).length,
});
// Add generated options environment variables
if (generatedOptions.env) {
finalEnv = { ...finalEnv, ...generatedOptions.env };
debugLog(id, "Added generated environment variables", {
generatedEnvCount: Object.keys(generatedOptions.env).length,
totalEnvCount: Object.keys(finalEnv).length,
});
}
// If we have a custom config from Rust, use it directly as environment variables
if (config.customConfig) {
debugLog(id, "Processing custom config", {
customConfigLength: config.customConfig.length,
});
try {
// Parse the custom config JSON string
const customConfigObj = JSON.parse(config.customConfig);
debugLog(id, "Custom config parsed successfully", {
customConfigKeys: Object.keys(customConfigObj),
});
// Ensure default config values are preserved even with custom config
const mergedConfig = {
...customConfigObj,
disableTheming: true,
showcursor: false,
// allowAddonNewTab will be handled from the fingerprint config if present
};
// Convert merged config to environment variables using getEnvVars
const customEnvVars = getEnvVars(mergedConfig);
debugLog(id, "Custom config converted to environment variables", {
customEnvVarCount: Object.keys(customEnvVars).length,
});
// Merge custom config with generated config (custom takes precedence)
finalEnv = { ...finalEnv, ...customEnvVars };
debugLog(id, "Custom config merged with final environment", {
finalEnvCount: Object.keys(finalEnv).length,
});
} catch (error) {
debugLog(id, "Failed to parse custom config", {
error: error instanceof Error ? error.message : String(error),
});
console.error(
`Camoufox worker ${id}: Failed to parse custom config, using generated config:`,
error,
);
await gracefulShutdown();
return;
}
} else {
debugLog(id, "No custom config provided");
}
// Prepare profile path for persistent context
const profilePath = config.profilePath || "";
debugLog(id, "Profile path prepared", { profilePath });
// Launch persistent context with the final configuration
const finalOptions: any = {
...generatedOptions,
env: finalEnv,
};
debugLog(id, "Final launch options prepared", {
hasExecutablePath: !!finalOptions.executablePath,
hasProxy: !!camoufoxOptions.proxy,
profilePath,
});
// If a custom executable path was provided, ensure Playwright uses it
if (
@@ -165,46 +243,66 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
) {
finalOptions.executablePath = (camoufoxOptions as any)
.executable_path as string;
debugLog(id, "Custom executable path set", {
executablePath: finalOptions.executablePath,
});
}
// Only add proxy if it exists and is valid
if (camoufoxOptions.proxy) {
debugLog(id, "Processing proxy configuration", {
proxyString: camoufoxOptions.proxy,
});
try {
finalOptions.proxy = parseProxyString(camoufoxOptions.proxy);
debugLog(id, "Proxy parsed successfully");
} catch (error) {
debugLog(id, "Failed to parse proxy", {
error: error instanceof Error ? error.message : String(error),
});
console.error({
message: "Failed to parse proxy, shutting down",
error,
});
await gracefulShutdown();
return;
}
}
// Use launchPersistentContext instead of launchServer
debugLog(id, "Launching persistent context", { profilePath });
context = await firefox.launchPersistentContext(
profilePath,
finalOptions,
);
debugLog(id, "Persistent context launched successfully");
// Get the browser instance from context
browser = context.browser();
debugLog(id, "Browser instance obtained from context", {
browserConnected: browser?.isConnected(),
});
// Handle browser disconnection for proper cleanup
if (browser) {
browser.on("disconnected", () => void gracefulShutdown());
debugLog(id, "Browser disconnect handler registered");
}
// Handle context close for proper cleanup
context.on("close", () => void gracefulShutdown());
debugLog(id, "Context close handler registered");
saveCamoufoxConfig(config);
// Monitor for window closure
const startWindowMonitoring = () => {
debugLog(id, "Starting window monitoring");
windowCheckInterval = setInterval(async () => {
try {
// Check if context is still active
if (!context?.pages || context.pages().length === 0) {
debugLog(id, "No pages found in context, shutting down");
if (windowCheckInterval) {
clearInterval(windowCheckInterval);
}
@@ -214,6 +312,7 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
// Check if browser is still connected (if available)
if (browser && !browser.isConnected()) {
debugLog(id, "Browser disconnected, shutting down");
if (windowCheckInterval) {
clearInterval(windowCheckInterval);
}
@@ -224,12 +323,16 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
// Check pages in the persistent context
const pages = context.pages();
if (pages.length === 0) {
debugLog(id, "No pages in context, shutting down");
if (windowCheckInterval) {
clearInterval(windowCheckInterval);
}
await gracefulShutdown();
}
} catch {
} catch (error) {
debugLog(id, "Error in window monitoring", {
error: error instanceof Error ? error.message : String(error),
});
// If we can't check windows, assume browser is closing
if (windowCheckInterval) {
clearInterval(windowCheckInterval);
@@ -241,19 +344,29 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
// Handle URL opening if provided
if (config.url) {
debugLog(id, "Opening URL in browser", { url: config.url });
try {
const pages = context.pages(); // pages() is synchronous in Playwright
if (pages.length) {
const page = pages[0];
debugLog(id, "Navigating to URL");
await page.goto(config.url, {
waitUntil: "domcontentloaded",
timeout: 30000,
});
debugLog(id, "URL opened successfully");
// Start monitoring after page is created
startWindowMonitoring();
} else {
debugLog(id, "No pages available to open URL");
startWindowMonitoring();
}
} catch (urlError) {
debugLog(id, "Failed to open URL", {
error:
urlError instanceof Error ? urlError.message : String(urlError),
});
console.error({
message: "Failed to open URL",
error: urlError,
@@ -263,15 +376,18 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
startWindowMonitoring();
}
} else {
debugLog(id, "No URL provided, starting monitoring");
// Start monitoring after page is created
startWindowMonitoring();
}
// Monitor browser/context connection
debugLog(id, "Starting keep-alive monitoring");
const keepAlive = setInterval(async () => {
try {
// Check if context is still active
if (!context?.pages) {
debugLog(id, "Context not active in keep-alive, shutting down");
clearInterval(keepAlive);
await gracefulShutdown();
return;
@@ -279,11 +395,15 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
// Check browser connection if available
if (browser && !browser.isConnected()) {
debugLog(id, "Browser not connected in keep-alive, shutting down");
clearInterval(keepAlive);
await gracefulShutdown();
return;
}
} catch (error) {
debugLog(id, "Error in keep-alive check", {
error: error instanceof Error ? error.message : String(error),
});
console.error({
message: "Error in keepAlive check",
error,
@@ -293,6 +413,9 @@ export async function runCamoufoxWorker(id: string): Promise<void> {
}
}, 2000);
} catch (error) {
debugLog(id, "Failed to launch Camoufox", {
error: error instanceof Error ? error.message : String(error),
});
console.error({
message: "Failed to launch Camoufox",
error,
+8 -139
@@ -8,145 +8,6 @@ import {
} from "./camoufox-launcher.js";
import { listCamoufoxConfigs } from "./camoufox-storage.js";
import { runCamoufoxWorker } from "./camoufox-worker.js";
import {
startProxyProcess,
stopAllProxyProcesses,
stopProxyProcess,
} from "./proxy-runner";
import { listProxyConfigs } from "./proxy-storage";
import { runProxyWorker } from "./proxy-worker";
// Command for proxy management
program
.command("proxy")
.argument("<action>", "start, stop, or list proxies")
.option("--host <host>", "upstream proxy host")
.option("--proxy-port <port>", "upstream proxy port", Number.parseInt)
.option("--type <type>", "proxy type (http, https, socks4, socks5)")
.option("--username <username>", "proxy username")
.option("--password <password>", "proxy password")
.option(
"-p, --port <number>",
"local port to use (random if not specified)",
Number.parseInt,
)
.option("--ignore-certificate", "ignore certificate errors for HTTPS proxies")
.option("--id <id>", "proxy ID for stop command")
.option(
"-u, --upstream <url>",
"upstream proxy URL (protocol://[username:password@]host:port)",
)
.description("manage proxy servers")
.action(
async (
action: string,
options: {
host?: string;
proxyPort?: number;
type?: string;
username?: string;
password?: string;
port?: number;
ignoreCertificate?: boolean;
id?: string;
upstream?: string;
},
) => {
if (action === "start") {
let upstreamUrl: string | undefined;
// Build upstream URL from individual components if provided
if (options.host && options.proxyPort && options.type) {
// Preserve provided scheme (http, https, socks4, socks5)
const protocol = String(options.type).toLowerCase();
const auth =
options.username && options.password
? `${encodeURIComponent(options.username)}:${encodeURIComponent(
options.password,
)}@`
: "";
upstreamUrl = `${protocol}://${auth}${options.host}:${options.proxyPort}`;
} else if (options.upstream) {
upstreamUrl = options.upstream;
}
// If no upstream is provided, create a direct proxy
try {
const config = await startProxyProcess(upstreamUrl, {
port: options.port,
ignoreProxyCertificate: options.ignoreCertificate,
});
// Output the configuration as JSON for the Rust side to parse
console.log(
JSON.stringify({
id: config.id,
localPort: config.localPort,
localUrl: config.localUrl,
upstreamUrl: config.upstreamUrl,
}),
);
// Exit successfully to allow the process to detach
process.exit(0);
} catch (error: unknown) {
console.error(
`Failed to start proxy: ${
error instanceof Error ? error.message : JSON.stringify(error)
}`,
);
process.exit(1);
}
} else if (action === "stop") {
if (options.id) {
const stopped = await stopProxyProcess(options.id);
console.log(JSON.stringify({ success: stopped }));
} else if (options.upstream) {
// Find proxies with this upstream URL
const configs = listProxyConfigs().filter(
(config) => config.upstreamUrl === options.upstream,
);
if (configs.length === 0) {
console.error(`No proxies found for ${options.upstream}`);
process.exit(1);
return;
}
for (const config of configs) {
const stopped = await stopProxyProcess(config.id);
console.log(JSON.stringify({ success: stopped }));
}
} else {
await stopAllProxyProcesses();
console.log(JSON.stringify({ success: true }));
}
process.exit(0);
} else if (action === "list") {
const configs = listProxyConfigs();
console.log(JSON.stringify(configs));
process.exit(0);
} else {
console.error("Invalid action. Use 'start', 'stop', or 'list'");
process.exit(1);
}
},
);
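The `proxy start` action above assembles the upstream URL from individual flags, percent-encoding credentials so characters like `@` or `:` in passwords survive later URL parsing. A minimal sketch of that assembly (the helper name `buildUpstreamUrl` is hypothetical, not part of the diff):

```typescript
// Sketch of the upstream-URL assembly in the `proxy start` action:
// scheme comes from --type, optional credentials are percent-encoded.
function buildUpstreamUrl(opts: {
  host: string;
  proxyPort: number;
  type: string; // http, https, socks4, socks5
  username?: string;
  password?: string;
}): string {
  const protocol = opts.type.toLowerCase();
  const auth =
    opts.username && opts.password
      ? `${encodeURIComponent(opts.username)}:${encodeURIComponent(opts.password)}@`
      : "";
  return `${protocol}://${auth}${opts.host}:${opts.proxyPort}`;
}
```

Without encoding, a password such as `p@ss` would make the `@` ambiguous when the URL is parsed back into host and credentials.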
// Command for proxy worker (internal use)
program
.command("proxy-worker")
.argument("<action>", "start a proxy worker")
.requiredOption("--id <id>", "proxy configuration ID")
.description("run a proxy worker process")
.action(async (action: string, options: { id: string }) => {
if (action === "start") {
await runProxyWorker(options.id);
} else {
console.error("Invalid action for proxy-worker. Use 'start'");
process.exit(1);
}
});
// Command for Camoufox management
program
@@ -173,6 +34,10 @@ program
.option("--fingerprint <json>", "fingerprint JSON string")
.option("--headless", "run in headless mode")
.option("--custom-config <json>", "custom config JSON string")
.option(
"--os <os>",
"operating system for fingerprint: windows, macos, linux",
)
.description("manage Camoufox browser instances")
.action(
@@ -423,6 +288,10 @@ program
typeof options.fingerprint === "string"
? options.fingerprint
: undefined,
os:
typeof options.os === "string"
? (options.os as "windows" | "macos" | "linux")
: undefined,
});
console.log(config);
process.exit(0);
-124
@@ -1,124 +0,0 @@
import { spawn } from "node:child_process";
import path from "node:path";
import getPort from "get-port";
import {
deleteProxyConfig,
generateProxyId,
getProxyConfig,
isProcessRunning,
listProxyConfigs,
type ProxyConfig,
saveProxyConfig,
} from "./proxy-storage";
/**
* Start a proxy in a separate process
* @param upstreamUrl The upstream proxy URL (optional for direct proxy)
* @param options Optional configuration
* @returns Promise resolving to the proxy configuration
*/
export async function startProxyProcess(
upstreamUrl?: string,
options: { port?: number; ignoreProxyCertificate?: boolean } = {},
): Promise<ProxyConfig> {
// Generate a unique ID for this proxy
const id = generateProxyId();
// Get a random available port if not specified
const port = options.port ?? (await getPort());
// Create the proxy configuration
const config: ProxyConfig = {
id,
upstreamUrl: upstreamUrl || "DIRECT",
localPort: port,
ignoreProxyCertificate: options.ignoreProxyCertificate ?? false,
};
// Save the configuration before starting the process
saveProxyConfig(config);
// Build the command arguments
const args = [
path.join(__dirname, "index.js"),
"proxy-worker",
"start",
"--id",
id,
];
// Spawn the process with proper detachment
const child = spawn(process.execPath, args, {
detached: true,
stdio: ["ignore", "ignore", "ignore"], // Completely ignore all stdio
cwd: process.cwd(),
});
// Unref the child to allow the parent to exit independently
child.unref();
// Store the process ID and local URL
config.pid = child.pid;
config.localUrl = `http://127.0.0.1:${port}`;
// Update the configuration with the process ID
saveProxyConfig(config);
// Give the worker a moment to start before returning
await new Promise((resolve) => setTimeout(resolve, 100));
return config;
}
/**
* Stop a proxy process
* @param id The proxy ID to stop
* @returns Promise resolving to true if stopped, false if not found
*/
export async function stopProxyProcess(id: string): Promise<boolean> {
const config = getProxyConfig(id);
if (!config?.pid) {
// Try to delete the config anyway in case it exists without a PID
deleteProxyConfig(id);
return false;
}
try {
// Check if the process is running
if (isProcessRunning(config.pid)) {
// Send SIGTERM to the process
process.kill(config.pid, "SIGTERM");
// Wait a bit to ensure the process has terminated
await new Promise((resolve) => setTimeout(resolve, 500));
// If still running, send SIGKILL
if (isProcessRunning(config.pid)) {
process.kill(config.pid, "SIGKILL");
await new Promise((resolve) => setTimeout(resolve, 200));
}
}
// Delete the configuration
deleteProxyConfig(id);
return true;
} catch (error) {
console.error(`Error stopping proxy ${id}:`, error);
// Delete the configuration even if stopping failed
deleteProxyConfig(id);
return false;
}
}
/**
* Stop all proxy processes
* @returns Promise resolving when all proxies are stopped
*/
export async function stopAllProxyProcesses(): Promise<void> {
const configs = listProxyConfigs();
const stopPromises = configs.map((config) => stopProxyProcess(config.id));
await Promise.all(stopPromises);
}
-150
@@ -1,150 +0,0 @@
import fs from "node:fs";
import path from "node:path";
import tmp from "tmp";
export interface ProxyConfig {
id: string;
upstreamUrl: string; // Can be "DIRECT" for direct proxy
localPort?: number;
ignoreProxyCertificate?: boolean;
localUrl?: string;
pid?: number;
}
const STORAGE_DIR = path.join(tmp.tmpdir, "donutbrowser", "proxies");
if (!fs.existsSync(STORAGE_DIR)) {
fs.mkdirSync(STORAGE_DIR, { recursive: true });
}
/**
* Save a proxy configuration to disk
* @param config The proxy configuration to save
*/
export function saveProxyConfig(config: ProxyConfig): void {
const filePath = path.join(STORAGE_DIR, `${config.id}.json`);
fs.writeFileSync(filePath, JSON.stringify(config, null, 2));
}
/**
* Get a proxy configuration by ID
* @param id The proxy ID
* @returns The proxy configuration or null if not found
*/
export function getProxyConfig(id: string): ProxyConfig | null {
const filePath = path.join(STORAGE_DIR, `${id}.json`);
if (!fs.existsSync(filePath)) {
return null;
}
try {
const content = fs.readFileSync(filePath, "utf-8");
return JSON.parse(content) as ProxyConfig;
} catch (error) {
console.error(`Error reading proxy config ${id}:`, error);
return null;
}
}
/**
* Delete a proxy configuration
* @param id The proxy ID to delete
* @returns True if deleted, false if not found
*/
export function deleteProxyConfig(id: string): boolean {
const filePath = path.join(STORAGE_DIR, `${id}.json`);
if (!fs.existsSync(filePath)) {
return false;
}
try {
fs.unlinkSync(filePath);
return true;
} catch (error) {
console.error(`Error deleting proxy config ${id}:`, error);
return false;
}
}
/**
* List all saved proxy configurations
* @returns Array of proxy configurations
*/
export function listProxyConfigs(): ProxyConfig[] {
if (!fs.existsSync(STORAGE_DIR)) {
return [];
}
try {
return fs
.readdirSync(STORAGE_DIR)
.filter((file) => file.endsWith(".json"))
.map((file) => {
try {
const content = fs.readFileSync(
path.join(STORAGE_DIR, file),
"utf-8",
);
return JSON.parse(content) as ProxyConfig;
} catch (error) {
console.error(`Error reading proxy config ${file}:`, error);
return null;
}
})
.filter((config): config is ProxyConfig => config !== null);
} catch (error) {
console.error("Error listing proxy configs:", error);
return [];
}
}
/**
* Update a proxy configuration
* @param config The proxy configuration to update
* @returns True if updated, false if not found
*/
export function updateProxyConfig(config: ProxyConfig): boolean {
const filePath = path.join(STORAGE_DIR, `${config.id}.json`);
try {
// readFileSync doubles as an existence check: it throws ENOENT if the config file no longer exists
fs.readFileSync(filePath, "utf-8");
fs.writeFileSync(filePath, JSON.stringify(config, null, 2));
return true;
} catch (error) {
if ((error as NodeJS.ErrnoException).code === "ENOENT") {
console.error(
`Config ${config.id} was deleted while the app was running`,
);
return false;
}
console.error(`Error updating proxy config ${config.id}:`, error);
return false;
}
}
/**
* Check if a proxy process is running
* @param pid The process ID to check
* @returns True if running, false otherwise
*/
export function isProcessRunning(pid: number): boolean {
try {
// The kill method with signal 0 doesn't actually kill the process
// but checks if it exists
process.kill(pid, 0);
return true;
} catch {
return false;
}
}
/**
* Generate a unique ID for a proxy
* @returns A unique ID string
*/
export function generateProxyId(): string {
return `proxy_${Date.now()}_${Math.floor(Math.random() * 10000)}`;
}
-70
@@ -1,70 +0,0 @@
import { Server } from "proxy-chain";
import { getProxyConfig, updateProxyConfig } from "./proxy-storage";
/**
* Run a proxy server as a worker process
* @param id The proxy configuration ID
*/
export async function runProxyWorker(id: string): Promise<void> {
// Get the proxy configuration
const config = getProxyConfig(id);
if (!config) {
console.error(`Proxy configuration ${id} not found`);
process.exit(1);
}
// Create a new proxy server
const server = new Server({
port: config.localPort,
host: "127.0.0.1",
prepareRequestFunction: () => {
// If upstreamUrl is "DIRECT", don't use upstream proxy
if (config.upstreamUrl === "DIRECT") {
return {};
}
return {
upstreamProxyUrl: config.upstreamUrl,
ignoreUpstreamProxyCertificate: config.ignoreProxyCertificate ?? false,
};
},
});
// Handle process termination gracefully
const gracefulShutdown = async () => {
try {
await server.close(true);
} catch {}
process.exit(0);
};
process.on("SIGTERM", () => void gracefulShutdown());
process.on("SIGINT", () => void gracefulShutdown());
// Handle uncaught exceptions
process.on("uncaughtException", () => {
process.exit(1);
});
process.on("unhandledRejection", () => {
process.exit(1);
});
// Start the server
try {
await server.listen();
// Update the config with the actual port (in case it was auto-assigned)
config.localPort = server.port;
config.localUrl = `http://127.0.0.1:${server.port}`;
updateProxyConfig(config);
// Keep the process alive
setInterval(() => {
// Do nothing, just keep the process alive
}, 60000);
} catch (error) {
console.error(`Failed to start proxy worker ${id}:`, error);
process.exit(1);
}
}
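The `prepareRequestFunction` branch in the worker reduces to a pure decision: a `"DIRECT"` upstream means an empty options object (no upstream proxy), anything else is forwarded. Extracted as a testable helper (`upstreamSettings` is a hypothetical name; the returned shape follows the proxy-chain options used above):

```typescript
interface WorkerConfig {
  upstreamUrl: string; // "DIRECT" means connect without an upstream proxy
  ignoreProxyCertificate?: boolean;
}

// Mirrors the prepareRequestFunction branch in runProxyWorker.
function upstreamSettings(config: WorkerConfig): {
  upstreamProxyUrl?: string;
  ignoreUpstreamProxyCertificate?: boolean;
} {
  if (config.upstreamUrl === "DIRECT") {
    return {};
  }
  return {
    upstreamProxyUrl: config.upstreamUrl,
    ignoreUpstreamProxyCertificate: config.ignoreProxyCertificate ?? false,
  };
}
```

Keeping the decision pure makes it easy to verify both branches without starting a real proxy-chain server.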
+1
@@ -66,6 +66,7 @@ export function parseProxyString(proxyString: LaunchOptions["proxy"] | string) {
// Try parsing as URL first (handles protocol://username:password@host:port)
if (trimmed.includes("://")) {
const url = new URL(trimmed);
// Playwright accepts short form "host:port" for HTTP proxies
server = `${url.hostname}:${url.port}`;
if (url.username) {
+47 -36
@@ -2,7 +2,7 @@
"name": "donutbrowser",
"private": true,
"license": "AGPL-3.0",
"version": "0.9.2",
"version": "0.13.0",
"type": "module",
"scripts": {
"dev": "next dev --turbopack",
@@ -21,55 +21,66 @@
"format": "pnpm format:js && pnpm format:rust",
"cargo": "cd src-tauri && cargo",
"unused-exports:js": "ts-unused-exports tsconfig.json",
"check-unused-commands": "cd src-tauri && cargo run --bin check_unused_commands"
"check-unused-commands": "cd src-tauri && cargo test test_no_unused_tauri_commands",
"copy-proxy-binary": "cd src-tauri && bash copy-proxy-binary.sh",
"prebuild": "pnpm copy-proxy-binary",
"pretauri:dev": "pnpm copy-proxy-binary",
"precargo": "pnpm copy-proxy-binary"
},
"dependencies": {
"@radix-ui/react-checkbox": "^1.3.2",
"@radix-ui/react-dialog": "^1.1.14",
"@radix-ui/react-dropdown-menu": "^2.1.15",
"@radix-ui/react-label": "^2.1.7",
"@radix-ui/react-popover": "^1.1.14",
"@radix-ui/react-progress": "^1.1.7",
"@radix-ui/react-radio-group": "^1.3.7",
"@radix-ui/react-scroll-area": "^1.2.9",
"@radix-ui/react-select": "^2.2.5",
"@radix-ui/react-slot": "^1.2.3",
"@radix-ui/react-tabs": "^1.1.12",
"@radix-ui/react-tooltip": "^1.2.7",
"@radix-ui/react-checkbox": "^1.3.3",
"@radix-ui/react-dialog": "^1.1.15",
"@radix-ui/react-dropdown-menu": "^2.1.16",
"@radix-ui/react-label": "^2.1.8",
"@radix-ui/react-popover": "^1.1.15",
"@radix-ui/react-progress": "^1.1.8",
"@radix-ui/react-radio-group": "^1.3.8",
"@radix-ui/react-scroll-area": "^1.2.10",
"@radix-ui/react-select": "^2.2.6",
"@radix-ui/react-slot": "^1.2.4",
"@radix-ui/react-tabs": "^1.1.13",
"@radix-ui/react-tooltip": "^1.2.8",
"@tanstack/react-table": "^8.21.3",
"@tauri-apps/api": "^2.7.0",
"@tauri-apps/plugin-deep-link": "^2.4.1",
"@tauri-apps/plugin-dialog": "^2.3.2",
"@tauri-apps/plugin-fs": "~2.4.1",
"@tauri-apps/plugin-opener": "^2.4.0",
"ahooks": "^3.9.0",
"@tauri-apps/api": "^2.9.0",
"@tauri-apps/plugin-deep-link": "^2.4.5",
"@tauri-apps/plugin-dialog": "^2.4.2",
"@tauri-apps/plugin-fs": "~2.4.4",
"@tauri-apps/plugin-log": "^2.7.1",
"@tauri-apps/plugin-opener": "^2.5.2",
"ahooks": "^3.9.6",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"cmdk": "^1.1.1",
"motion": "^12.23.12",
"next": "^15.4.6",
"color": "^5.0.2",
"flag-icons": "^7.5.0",
"lucide-react": "^0.555.0",
"motion": "^12.23.24",
"next": "^15.5.6",
"next-themes": "^0.4.6",
"react": "^19.1.1",
"react-dom": "^19.1.1",
"radix-ui": "^1.4.3",
"react": "^19.2.0",
"react-dom": "^19.2.0",
"react-icons": "^5.5.0",
"recharts": "2.15.4",
"sonner": "^2.0.7",
"tailwind-merge": "^3.3.1",
"tailwind-merge": "^3.4.0",
"tauri-plugin-macos-permissions-api": "^2.3.0"
},
"devDependencies": {
"@biomejs/biome": "2.1.4",
"@tailwindcss/postcss": "^4.1.11",
"@tauri-apps/cli": "^2.7.1",
"@types/node": "^24.2.1",
"@types/react": "^19.1.9",
"@types/react-dom": "^19.1.7",
"@vitejs/plugin-react": "^5.0.0",
"@biomejs/biome": "2.2.3",
"@tailwindcss/postcss": "^4.1.17",
"@tauri-apps/cli": "^2.9.4",
"@types/color": "^4.2.0",
"@types/node": "^24.10.0",
"@types/react": "^19.2.3",
"@types/react-dom": "^19.2.2",
"@vitejs/plugin-react": "^5.1.0",
"husky": "^9.1.7",
"lint-staged": "^16.1.5",
"tailwindcss": "^4.1.11",
"lint-staged": "^16.2.6",
"tailwindcss": "^4.1.17",
"ts-unused-exports": "^11.0.1",
"tw-animate-css": "^1.3.6",
"typescript": "~5.9.2"
"tw-animate-css": "^1.4.0",
"typescript": "~5.9.3"
},
"packageManager": "pnpm@10.14.0+sha512.ad27a79641b49c3e481a16a805baa71817a04bbe06a38d17e60e2eaee83f6a146c6a688125f5792e48dd5ba30e7da52a5cda4c3992b9ccf333f9ce223af84748",
"lint-staged": {
+2658 -2470
File diff suppressed because it is too large
+2
@@ -1,8 +1,10 @@
packages:
- nodecar
onlyBuiltDependencies:
- '@biomejs/biome'
- '@tailwindcss/oxide'
- better-sqlite3
- esbuild
- sharp
- sqlite3
+4
@@ -0,0 +1,4 @@
[build]
# Omit jobs setting to use all cores
incremental = true
+1629 -989
File diff suppressed because it is too large
+57 -11
@@ -1,6 +1,6 @@
[package]
name = "donutbrowser"
version = "0.9.2"
version = "0.13.0"
description = "Simple Yet Powerful Anti-Detect Browser"
authors = ["zhom@github"]
edition = "2021"
@@ -12,10 +12,18 @@ default-run = "donutbrowser"
# The `_lib` suffix may seem redundant but it is necessary
# to make the lib name unique and wouldn't conflict with the bin name.
# This seems to be only an issue on Windows, see https://github.com/rust-lang/cargo/issues/8519
name = "donutbrowser"
name = "donutbrowser_lib"
crate-type = ["staticlib", "cdylib", "rlib"]
doctest = false
[[bin]]
name = "donutbrowser"
path = "src/main.rs"
[[bin]]
name = "donut-proxy"
path = "src/bin/proxy_server.rs"
[build-dependencies]
tauri-build = { version = "2", features = [] }
@@ -29,24 +37,42 @@ tauri-plugin-shell = "2"
tauri-plugin-deep-link = "2"
tauri-plugin-dialog = "2"
tauri-plugin-macos-permissions = "2"
tauri-plugin-log = "2"
log = "0.4"
directories = "6"
reqwest = { version = "0.12", features = ["json", "stream"] }
reqwest = { version = "0.12", features = ["json", "stream", "socks"] }
tokio = { version = "1", features = ["full", "sync"] }
sysinfo = "0.36"
sysinfo = "0.37"
lazy_static = "1.4"
base64 = "0.22"
libc = "0.2"
async-trait = "0.1"
futures-util = "0.3"
zip = "4"
zip = "6"
tar = "0"
bzip2 = "0"
flate2 = "1"
lzma-rs = "0"
msi-extract = "0"
uuid = { version = "1.0", features = ["v4", "serde"] }
uuid = { version = "1.18", features = ["v4", "serde"] }
url = "2.5"
urlencoding = "2.1"
chrono = { version = "0.4", features = ["serde"] }
axum = "0.8.7"
tower = "0.5"
tower-http = { version = "0.6", features = ["cors"] }
rand = "0.9.2"
utoipa = { version = "5", features = ["axum_extras", "chrono"] }
utoipa-axum = "0.2"
argon2 = "0.5"
aes-gcm = "0.10"
hyper = { version = "1.8", features = ["full"] }
hyper-util = { version = "0.1", features = ["full"] }
http-body-util = "0.1"
clap = { version = "4", features = ["derive"] }
async-socks5 = "0.6"
[target."cfg(any(target_os = \"macos\", windows, target_os = \"linux\"))".dependencies]
tauri-plugin-single-instance = { version = "2", features = ["deep-link"] }
@@ -58,7 +84,7 @@ objc2-app-kit = { version = "0.3.1", features = ["NSWindow"] }
[target.'cfg(target_os = "windows")'.dependencies]
winreg = "0.55"
windows = { version = "0.61", features = [
windows = { version = "0.62", features = [
"Win32_Foundation",
"Win32_System_ProcessStatus",
"Win32_System_Threading",
@@ -71,19 +97,39 @@ windows = { version = "0.61", features = [
] }
[dev-dependencies]
tempfile = "3.13.0"
tempfile = "3.21.0"
wiremock = "0.6"
hyper = { version = "1.0", features = ["full"] }
hyper = { version = "1.8", features = ["full"] }
hyper-util = { version = "0.1", features = ["full"] }
http-body-util = "0.1"
tower = "0.5"
tower-http = { version = "0.6", features = ["fs", "trace"] }
futures-util = "0.3"
serial_test = "3"
# Integration test configuration
[[test]]
name = "nodecar_integration"
path = "tests/nodecar_integration.rs"
name = "donut_proxy_integration"
path = "tests/donut_proxy_integration.rs"
[profile.dev]
codegen-units = 256
incremental = true
opt-level = 0
# Split debuginfo on macOS for faster linking (ignored on other platforms)
split-debuginfo = "unpacked"
[profile.release]
codegen-units = 1
opt-level = 3
lto = "thin"
# Split debuginfo on macOS for faster linking (ignored on other platforms)
split-debuginfo = "unpacked"
[profile.test]
# Optimize test builds for faster compilation
codegen-units = 256
incremental = true
[features]
# by default Tauri runs in production mode
+6 -6
@@ -3,15 +3,15 @@
<plist version="1.0">
<dict>
<key>NSCameraUsageDescription</key>
<string>Donut Browser needs camera access to enable camera functionality in web browsers. Each website will still ask for your permission individually.</string>
<string>Donut needs camera access to enable camera functionality in web browsers. Each website will still ask for your permission individually.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Donut Browser needs microphone access to enable microphone functionality in web browsers. Each website will still ask for your permission individually.</string>
<string>Donut needs microphone access to enable microphone functionality in web browsers. Each website will still ask for your permission individually.</string>
<key>NSLocalNetworkUsageDescription</key>
<string>Donut Browser has proxy functionality that requires local network access. You can deny this functionality if you don't plan on setting proxies for browser profiles.</string>
<string>Donut has proxy functionality that requires local network access. You can deny this functionality if you don't plan on setting proxies for browser profiles.</string>
<key>CFBundleDisplayName</key>
<string>Donut Browser</string>
<string>Donut</string>
<key>CFBundleName</key>
<string>Donut Browser</string>
<string>Donut</string>
<key>CFBundleIdentifier</key>
<string>com.donutbrowser</string>
<key>CFBundleURLName</key>
@@ -25,7 +25,7 @@
<key>LSApplicationCategoryType</key>
<string>public.app-category.productivity</string>
<key>NSHumanReadableCopyright</key>
<string>Copyright © 2025 Donut Browser</string>
<string>Copyright © 2025 Donut</string>
<key>CFBundleDocumentTypes</key>
<array>
<dict>
+58 -1
@@ -1,4 +1,6 @@
fn main() {
println!("cargo::rustc-check-cfg=cfg(mobile)");
#[cfg(target_os = "macos")]
{
println!("cargo:rustc-link-lib=framework=CoreFoundation");
@@ -26,5 +28,60 @@ fn main() {
println!("cargo:rustc-env=BUILD_VERSION=dev-{version}");
}
tauri_build::build()
// Inject vault password at build time
if let Ok(vault_password) = std::env::var("DONUT_BROWSER_VAULT_PASSWORD") {
println!("cargo:rustc-env=DONUT_BROWSER_VAULT_PASSWORD={vault_password}");
} else {
// Use default password if environment variable is not set
println!("cargo:rustc-env=DONUT_BROWSER_VAULT_PASSWORD=donutbrowser-api-vault-password");
}
// Tell Cargo to rebuild if the proxy binary source changes
println!("cargo:rerun-if-changed=src/bin/proxy_server.rs");
println!("cargo:rerun-if-changed=src/proxy_server.rs");
println!("cargo:rerun-if-changed=src/proxy_runner.rs");
println!("cargo:rerun-if-changed=src/proxy_storage.rs");
// Only run tauri_build if all external binaries exist
// This allows building donut-proxy sidecar without the other binaries present
if external_binaries_exist() {
tauri_build::build()
} else {
println!("cargo:warning=Skipping tauri_build: external binaries not found. This is expected when building sidecar binaries.");
}
}
fn external_binaries_exist() -> bool {
use std::env;
use std::path::PathBuf;
let manifest_dir = match env::var("CARGO_MANIFEST_DIR") {
Ok(dir) => dir,
Err(_) => return false,
};
let target = match env::var("TARGET") {
Ok(t) => t,
Err(_) => return false,
};
let binaries_dir = PathBuf::from(&manifest_dir).join("binaries");
// Check for both required external binaries
let nodecar_name = if target.contains("windows") {
format!("nodecar-{}.exe", target)
} else {
format!("nodecar-{}", target)
};
let donut_proxy_name = if target.contains("windows") {
format!("donut-proxy-{}.exe", target)
} else {
format!("donut-proxy-{}", target)
};
let nodecar_exists = binaries_dir.join(&nodecar_name).exists();
let donut_proxy_exists = binaries_dir.join(&donut_proxy_name).exists();
nodecar_exists && donut_proxy_exists
}
+2 -1
@@ -29,6 +29,7 @@
"macos-permissions:allow-request-microphone-permission",
"macos-permissions:allow-request-camera-permission",
"macos-permissions:allow-check-microphone-permission",
"macos-permissions:allow-check-camera-permission"
"macos-permissions:allow-check-camera-permission",
"log:default"
]
}
+69
@@ -0,0 +1,69 @@
#!/bin/bash
set -e
# Get the target triple from environment or use default
TARGET="${TARGET:-$(rustc -vV 2>/dev/null | sed -n 's|host: ||p' || echo "unknown")}"
MANIFEST_DIR="$(dirname "$0")"
# Determine binary name based on target
if [[ "$TARGET" == *"windows"* ]]; then
BIN_NAME="donut-proxy.exe"
else
BIN_NAME="donut-proxy"
fi
# Determine source path
HOST_TARGET=$(rustc -vV 2>/dev/null | sed -n 's|host: ||p' || echo "$TARGET")
if [[ "$TARGET" == "$HOST_TARGET" ]] || [[ "$TARGET" == "unknown" ]]; then
# Native target - use debug or release based on profile
if [[ "${PROFILE:-debug}" == "release" ]]; then
SRC_DIR="$MANIFEST_DIR/target/release"
else
SRC_DIR="$MANIFEST_DIR/target/debug"
fi
else
# Cross-compilation target
if [[ "${PROFILE:-debug}" == "release" ]]; then
SRC_DIR="$MANIFEST_DIR/target/$TARGET/release"
else
SRC_DIR="$MANIFEST_DIR/target/$TARGET/debug"
fi
fi
SOURCE="$SRC_DIR/$BIN_NAME"
DEST_DIR="$MANIFEST_DIR/binaries"
# Tauri expects the format: donut-proxy-{target} with hyphens (same as nodecar)
DEST_NAME="donut-proxy-$TARGET"
if [[ "$TARGET" == *"windows"* ]]; then
DEST_NAME="$DEST_NAME.exe"
fi
DEST="$DEST_DIR/$DEST_NAME"
# Create binaries directory if it doesn't exist
mkdir -p "$DEST_DIR"
# Copy the binary if it exists
if [[ -f "$SOURCE" ]]; then
cp "$SOURCE" "$DEST"
echo "Copied $BIN_NAME to $DEST"
else
echo "Warning: Binary not found at $SOURCE"
echo "Building donut-proxy binary..."
cd "$MANIFEST_DIR"
BUILD_ARGS=("build" "--bin" "donut-proxy")
if [[ -n "$PROFILE" ]] && [[ "$PROFILE" == "release" ]]; then
BUILD_ARGS+=("--release")
fi
if [[ -n "$TARGET" ]] && [[ "$TARGET" != "unknown" ]] && [[ "$TARGET" != "$HOST_TARGET" ]]; then
BUILD_ARGS+=("--target" "$TARGET")
fi
cargo "${BUILD_ARGS[@]}"
if [[ -f "$SOURCE" ]]; then
cp "$SOURCE" "$DEST"
echo "Built and copied $BIN_NAME to $DEST"
else
echo "Error: Failed to build donut-proxy binary"
exit 1
fi
fi
+4 -1
@@ -1,7 +1,10 @@
[Desktop Entry]
Version=1.0
Type=Application
Name=Donut Browser
Name=Donut
Name[en]=Donut
GenericName=Web Browser
X-GNOME-FullName=Donut
Comment=Simple Yet Powerful Anti-Detect Browser
Exec=donutbrowser %u
Icon=donutbrowser
+85 -58
@@ -221,6 +221,20 @@ pub fn sort_versions(versions: &mut [String]) {
});
}
// Helper function to compare two versions
pub fn compare_versions(version1: &str, version2: &str) -> std::cmp::Ordering {
let version_a = VersionComponent::parse(version1);
let version_b = VersionComponent::parse(version2);
version_a.cmp(&version_b)
}
pub fn is_version_newer(version1: &str, version2: &str) -> bool {
// Use the proper VersionComponent comparison from api_client.rs
let version_a = VersionComponent::parse(version1);
let version_b = VersionComponent::parse(version2);
version_a > version_b
}
// Helper function to sort GitHub releases
pub fn sort_github_releases(releases: &mut [GithubRelease]) {
releases.sort_by(|a, b| {
@@ -268,7 +282,12 @@ pub fn is_browser_version_nightly(
// Last resort: when no name available, treat as nightly (non-Release)
true
}
"firefox" | "firefox-developer" => {
"firefox-developer" => {
// For Firefox Developer Edition, always treat as nightly/prerelease
// This ensures consistent behavior regardless of cache state or API response parsing
true
}
"firefox" => {
// For Firefox, use the category from the API response to determine stability
// This will be handled in the API parsing, so this fallback is for cached versions
is_nightly_version(version)
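The split introduced in this hunk — Firefox Developer Edition always prerelease, plain Firefox falling back to version-string inspection — can be sketched as follows. Both helpers here are simplified stand-ins for the real functions in this file (the real `is_nightly_version` is richer; this one only recognizes the Firefox alpha pattern such as `135.0a1`):

```rust
// Simplified stand-in: detect the Firefox alpha/nightly pattern
// "digit 'a' digit" (e.g. "135.0a1"); "beta" versions do not match.
fn is_nightly_version(version: &str) -> bool {
    let b = version.as_bytes();
    b.windows(3)
        .any(|w| w[0].is_ascii_digit() && w[1] == b'a' && w[2].is_ascii_digit())
}

// Simplified stand-in for the per-browser classification above.
fn is_browser_version_nightly(browser: &str, version: &str) -> bool {
    match browser {
        // Developer Edition builds are prerelease by definition.
        "firefox-developer" => true,
        // Plain Firefox: fall back to the version string.
        "firefox" => is_nightly_version(version),
        _ => false,
    }
}

fn main() {
    assert!(is_browser_version_nightly("firefox-developer", "140.0b3"));
    assert!(is_browser_version_nightly("firefox", "135.0a1"));
    assert!(!is_browser_version_nightly("firefox", "135.0.5beta22"));
}
```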
@@ -334,9 +353,14 @@ pub struct ApiClient {
}
impl ApiClient {
fn new() -> Self {
pub fn new() -> Self {
let client = Client::builder()
.timeout(std::time::Duration::from_secs(30))
.build()
.unwrap_or_else(|_| Client::new());
Self {
client: Client::new(),
client,
firefox_api_base: "https://product-details.mozilla.org/1.0".to_string(),
firefox_dev_api_base: "https://product-details.mozilla.org/1.0".to_string(),
github_api_base: "https://api.github.com".to_string(),
@@ -383,8 +407,8 @@ impl ApiClient {
let text = response.text().await?;
let mut page_releases: Vec<GithubRelease> = serde_json::from_str(&text).map_err(|e| {
eprintln!("Failed to parse GitHub API response (page {page}): {e}");
eprintln!(
log::error!("Failed to parse GitHub API response (page {page}): {e}");
log::error!(
"Response text (first 500 chars): {}",
if text.len() > 500 {
&text[..500]
@@ -463,13 +487,13 @@ impl ApiClient {
let content = fs::read_to_string(&cache_file).ok()?;
if let Ok(cached) = serde_json::from_str::<CachedVersionData>(&content) {
// Always return cached releases regardless of age - they're always valid
println!("Using cached versions for {browser}");
log::info!("Using cached versions for {browser}");
return Some(cached.releases);
}
// Backward compatibility: legacy caches stored just an array of version strings
if let Ok(legacy_versions) = serde_json::from_str::<Vec<String>>(&content) {
println!("Using legacy cached versions for {browser}; upgrading in-memory");
log::info!("Using legacy cached versions for {browser}; upgrading in-memory");
let releases: Vec<BrowserRelease> = legacy_versions
.into_iter()
.map(|version| BrowserRelease {
@@ -524,7 +548,7 @@ impl ApiClient {
let content = serde_json::to_string_pretty(&cached_data)?;
fs::write(&cache_file, content)?;
println!("Cached {} versions for {}", releases.len(), browser);
log::info!("Cached {} versions for {}", releases.len(), browser);
Ok(())
}
@@ -540,7 +564,7 @@ impl ApiClient {
let cached_data: CachedGithubData = serde_json::from_str(&content).ok()?;
// Always use cached GitHub releases - cache never expires, only gets updated with new versions
println!("Using cached GitHub releases for {browser}");
log::info!("Using cached GitHub releases for {browser}");
Some(cached_data.releases)
}
@@ -564,7 +588,7 @@ impl ApiClient {
let content = serde_json::to_string_pretty(&cached_data)?;
fs::write(&cache_file, content)?;
println!("Cached {} GitHub releases for {}", releases.len(), browser);
log::info!("Cached {} GitHub releases for {}", releases.len(), browser);
Ok(())
}
@@ -579,7 +603,7 @@ impl ApiClient {
}
}
println!("Fetching Firefox releases from Mozilla API...");
log::info!("Fetching Firefox releases from Mozilla API...");
let url = format!("{}/firefox.json", self.firefox_api_base);
let response = self
@@ -624,7 +648,7 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_versions("firefox", &releases) {
eprintln!("Failed to cache Firefox versions: {e}");
log::error!("Failed to cache Firefox versions: {e}");
}
}
@@ -642,24 +666,24 @@ impl ApiClient {
}
}
println!("Fetching Firefox Developer Edition releases from Mozilla API...");
log::info!("Fetching Firefox Developer Edition releases from Mozilla API...");
let url = format!("{}/devedition.json", self.firefox_dev_api_base);
let response = self
.client
.get(url)
.get(&url)
.header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36")
.send()
.await?;
if !response.status().is_success() {
return Err(
format!(
"Failed to fetch Firefox Developer Edition versions: {}",
response.status()
)
.into(),
let error_msg = format!(
"Failed to fetch Firefox Developer Edition versions: {} - URL: {}",
response.status(),
url
);
log::error!("{error_msg}");
return Err(error_msg.into());
}
let firefox_response: FirefoxApiResponse = response.json().await?;
@@ -693,7 +717,7 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_versions("firefox-developer", &releases) {
eprintln!("Failed to cache Firefox Developer versions: {e}");
log::error!("Failed to cache Firefox Developer versions: {e}");
}
}
@@ -711,7 +735,7 @@ impl ApiClient {
}
}
println!("Fetching Mullvad releases from GitHub API");
log::info!("Fetching Mullvad releases from GitHub API");
let base_url = format!(
"{}/repos/mullvad/mullvad-browser/releases",
self.github_api_base
@@ -732,7 +756,7 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_github_releases("mullvad", &releases) {
eprintln!("Failed to cache Mullvad releases: {e}");
log::error!("Failed to cache Mullvad releases: {e}");
}
}
@@ -750,7 +774,7 @@ impl ApiClient {
}
}
println!("Fetching Zen releases from GitHub API");
log::info!("Fetching Zen releases from GitHub API");
let base_url = format!(
"{}/repos/zen-browser/desktop/releases",
self.github_api_base
@@ -768,7 +792,7 @@ impl ApiClient {
if release.tag_name.to_lowercase() == "twilight" {
if let Ok(has_update) = self.check_twilight_update(release).await {
if has_update {
println!(
log::info!(
"Detected update for Zen twilight release: {}",
release.tag_name
);
@@ -783,7 +807,7 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_github_releases("zen", &releases) {
eprintln!("Failed to cache Zen releases: {e}");
log::error!("Failed to cache Zen releases: {e}");
}
}
@@ -801,7 +825,7 @@ impl ApiClient {
}
}
println!("Fetching Brave releases from GitHub API");
log::info!("Fetching Brave releases from GitHub API");
let base_url = format!(
"{}/repos/brave/brave-browser/releases",
self.github_api_base
@@ -819,7 +843,6 @@ impl ApiClient {
let has_compatible_asset = Self::has_compatible_brave_asset(&release.assets, &os);
if has_compatible_asset {
println!("release.name: {:?}", release.name);
// Use the centralized nightly detection function
release.is_nightly =
is_browser_version_nightly("brave", &release.tag_name, Some(&release.name));
@@ -834,7 +857,7 @@ impl ApiClient {
sort_github_releases(&mut filtered_releases);
if let Err(e) = self.save_cached_github_releases("brave", &filtered_releases) {
eprintln!("Failed to cache Brave releases: {e}");
log::error!("Failed to cache Brave releases: {e}");
}
Ok(filtered_releases)
@@ -954,7 +977,7 @@ impl ApiClient {
}
}
println!("Fetching Chromium releases...");
log::info!("Fetching Chromium releases...");
// Get the latest version first
let latest_version = self.fetch_chromium_latest_version().await?;
@@ -982,7 +1005,7 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_versions("chromium", &releases) {
eprintln!("Failed to cache Chromium versions: {e}");
log::error!("Failed to cache Chromium versions: {e}");
}
}
@@ -996,7 +1019,7 @@ impl ApiClient {
// Check cache first (unless bypassing)
if !no_caching {
if let Some(cached_releases) = self.load_cached_github_releases("camoufox") {
println!(
log::info!(
"Using cached Camoufox releases, count: {}",
cached_releases.len()
);
@@ -1004,18 +1027,18 @@ impl ApiClient {
}
}
println!("Fetching Camoufox releases from GitHub API");
log::info!("Fetching Camoufox releases from GitHub API");
let base_url = format!("{}/repos/daijro/camoufox/releases", self.github_api_base);
let releases: Vec<GithubRelease> = self.fetch_github_releases_multiple_pages(&base_url).await?;
println!(
log::info!(
"Fetched {} total Camoufox releases from GitHub",
releases.len()
);
// Get platform info to filter appropriate releases
let (os, arch) = Self::get_platform_info();
println!("Filtering for platform: {os}/{arch}");
log::info!("Filtering for platform: {os}/{arch}");
// Filter releases that have assets compatible with the current platform
let mut compatible_releases: Vec<GithubRelease> = releases
@@ -1024,11 +1047,14 @@ impl ApiClient {
.filter_map(|(i, release)| {
let has_compatible = self.has_compatible_camoufox_asset(&release.assets, &os, &arch);
if !has_compatible {
println!(
log::info!(
"Release {} ({}) has no compatible assets for {}/{}",
i, release.tag_name, os, arch
i,
release.tag_name,
os,
arch
);
println!(
log::info!(
" Available assets: {:?}",
release.assets.iter().map(|a| &a.name).collect::<Vec<_>>()
);
@@ -1041,13 +1067,13 @@ impl ApiClient {
})
.collect();
println!(
log::info!(
"After platform filtering: {} compatible releases",
compatible_releases.len()
);
// Sort by version (latest first) with debugging
println!(
log::info!(
"Before sorting: {:?}",
compatible_releases
.iter()
@@ -1056,7 +1082,7 @@ impl ApiClient {
.collect::<Vec<_>>()
);
sort_github_releases(&mut compatible_releases);
println!(
log::info!(
"After sorting: {:?}",
compatible_releases
.iter()
@@ -1068,9 +1094,9 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_github_releases("camoufox", &compatible_releases) {
eprintln!("Failed to cache Camoufox releases: {e}");
log::error!("Failed to cache Camoufox releases: {e}");
} else {
println!("Cached {} Camoufox releases", compatible_releases.len());
log::info!("Cached {} Camoufox releases", compatible_releases.len());
}
}
@@ -1088,7 +1114,7 @@ impl ApiClient {
}
}
println!("Fetching TOR releases from archive...");
log::info!("Fetching TOR releases from archive...");
let url = format!("{}/", self.tor_archive_base);
let html = self
.client
@@ -1153,7 +1179,7 @@ impl ApiClient {
// Cache the results (unless bypassing cache)
if !no_caching {
if let Err(e) = self.save_cached_versions("tor-browser", &releases) {
eprintln!("Failed to cache TOR versions: {e}");
log::error!("Failed to cache TOR versions: {e}");
}
}
@@ -1224,9 +1250,10 @@ impl ApiClient {
// File size changed, update cache and return true
let content = serde_json::to_string_pretty(&current_info)?;
fs::write(&twilight_cache_file, content)?;
println!(
log::info!(
"Zen twilight release updated: file size changed from {} to {}",
cached_info.file_size, current_info.file_size
cached_info.file_size,
current_info.file_size
);
return Ok(true);
}
@@ -1244,10 +1271,10 @@ impl ApiClient {
let path = entry.path();
if path.is_file() {
fs::remove_file(&path)?;
println!("Removed cache file: {path:?}");
log::info!("Removed cache file: {path:?}");
}
}
println!("All version cache cleared successfully");
log::info!("All version cache cleared successfully");
}
Ok(())
@@ -1450,7 +1477,7 @@ mod tests {
let result = client.fetch_firefox_releases_with_caching(true).await;
if let Err(e) = &result {
println!("Firefox API test error: {e}");
log::info!("Firefox API test error: {e}");
}
assert!(result.is_ok());
let releases = result.unwrap();
@@ -1492,7 +1519,7 @@ mod tests {
.await;
if let Err(e) = &result {
println!("Firefox Developer API test error: {e}");
log::info!("Firefox Developer API test error: {e}");
}
assert!(result.is_ok());
let releases = result.unwrap();
@@ -1627,7 +1654,7 @@ mod tests {
let result = client.fetch_brave_releases_with_caching(true).await;
if let Err(e) = &result {
println!("Brave API test error: {e}");
log::info!("Brave API test error: {e}");
}
assert!(result.is_ok());
let releases = result.unwrap();
@@ -1968,8 +1995,8 @@ mod tests {
let v22 = VersionComponent::parse("135.0.5beta22");
let v24 = VersionComponent::parse("135.0.5beta24");
println!("v22: {v22:?}");
println!("v24: {v24:?}");
log::info!("v22: {v22:?}");
log::info!("v24: {v24:?}");
// v24 should be greater than v22
assert!(
@@ -1992,7 +2019,7 @@ mod tests {
sort_versions(&mut versions);
println!("Sorted versions: {versions:?}");
log::info!("Sorted versions: {versions:?}");
// Should be sorted from newest to oldest
assert_eq!(versions[0], "135.0.5beta24");
@@ -2007,8 +2034,8 @@ mod tests {
let v22 = VersionComponent::parse("135.0beta22");
let v24 = VersionComponent::parse("135.0.1beta24");
println!("User reported v22: {v22:?}");
println!("User reported v24: {v24:?}");
log::info!("User reported v22: {v22:?}");
log::info!("User reported v24: {v24:?}");
// 135.0.1beta24 should be greater than 135.0beta22 (newer patch version)
assert!(
@@ -2021,7 +2048,7 @@ mod tests {
sort_versions(&mut versions);
println!("User reported sorted versions: {versions:?}");
log::info!("User reported sorted versions: {versions:?}");
// Should be sorted from newest to oldest
assert_eq!(
File diff suppressed because it is too large
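The beta-aware ordering that the tests above exercise (`135.0.5beta24` newer than `135.0beta22`, `135.0.1beta24` newer than `135.0beta22`) can be sketched with a comparison key. The real `VersionComponent::parse` handles more suffixes; this stand-in splits on `beta` and ranks a final release above any beta of the same version:

```rust
// Simplified stand-in for VersionComponent: dotted numeric parts plus a
// beta number; a version without "beta" gets u32::MAX so the final
// release outranks every beta of the same version.
fn key(version: &str) -> (Vec<u32>, u32) {
    let (nums, beta) = match version.split_once("beta") {
        Some((n, b)) => (n, b.parse().unwrap_or(0)),
        None => (version, u32::MAX),
    };
    let parts: Vec<u32> = nums.split('.').filter_map(|p| p.parse().ok()).collect();
    (parts, beta)
}

fn main() {
    let mut versions = vec!["135.0beta22", "135.0.5beta24", "135.0.1beta24"];
    // Sort newest first, mirroring sort_versions.
    versions.sort_by(|a, b| key(b).cmp(&key(a)));
    assert_eq!(versions, ["135.0.5beta24", "135.0.1beta24", "135.0beta22"]);
    assert!(key("135.0.1beta24") > key("135.0beta22"));
}
```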
+129 -94
@@ -107,6 +107,8 @@ pub struct AppUpdateInfo {
pub download_url: String,
pub is_nightly: bool,
pub published_at: String,
pub manual_update_required: bool,
pub release_page_url: Option<String>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
@@ -120,12 +122,14 @@ pub struct AppUpdateProgress {
pub struct AppAutoUpdater {
client: Client,
extractor: &'static crate::extraction::Extractor,
}
impl AppAutoUpdater {
fn new() -> Self {
Self {
client: Client::new(),
extractor: crate::extraction::Extractor::instance(),
}
}
@@ -164,13 +168,13 @@ impl AppAutoUpdater {
let current_version = Self::get_current_version();
let is_nightly = Self::is_nightly_build();
println!("=== App Update Check ===");
println!("Current version: {current_version}");
println!("Is nightly build: {is_nightly}");
println!("STABLE_RELEASE env: {:?}", option_env!("STABLE_RELEASE"));
log::info!("=== App Update Check ===");
log::info!("Current version: {current_version}");
log::info!("Is nightly build: {is_nightly}");
log::info!("STABLE_RELEASE env: {:?}", option_env!("STABLE_RELEASE"));
let releases = self.fetch_app_releases().await?;
println!("Fetched {} releases from GitHub", releases.len());
log::info!("Fetched {} releases from GitHub", releases.len());
// Filter releases based on build type
let filtered_releases: Vec<&AppRelease> = if is_nightly {
@@ -179,7 +183,7 @@ impl AppAutoUpdater {
.iter()
.filter(|release| release.tag_name.starts_with("nightly-"))
.collect();
println!("Found {} nightly releases", nightly_releases.len());
log::info!("Found {} nightly releases", nightly_releases.len());
nightly_releases
} else {
// For stable builds, look for stable releases (semver format)
@@ -187,47 +191,87 @@ impl AppAutoUpdater {
.iter()
.filter(|release| release.tag_name.starts_with('v'))
.collect();
println!("Found {} stable releases", stable_releases.len());
log::info!("Found {} stable releases", stable_releases.len());
stable_releases
};
if filtered_releases.is_empty() {
println!("No releases found for build type (nightly: {is_nightly})");
log::info!("No releases found for build type (nightly: {is_nightly})");
return Ok(None);
}
// Get the latest release
let latest_release = filtered_releases[0];
println!(
log::info!(
"Latest release: {} ({})",
latest_release.tag_name, latest_release.name
latest_release.tag_name,
latest_release.name
);
// Check if we need to update
if self.should_update(&current_version, &latest_release.tag_name, is_nightly) {
println!("Update available!");
log::info!("Update available!");
// Build the release page URL
let release_page_url = format!(
"https://github.com/zhom/donutbrowser/releases/tag/{}",
latest_release.tag_name
);
// Find the appropriate asset for current platform
if let Some(download_url) = self.get_download_url_for_platform(&latest_release.assets) {
let download_url = self.get_download_url_for_platform(&latest_release.assets);
// On Linux, we show the update notification even if auto-update is disabled
// Users can manually download from the release page
#[cfg(target_os = "linux")]
{
let manual_update_required = download_url.is_none();
let update_info = AppUpdateInfo {
current_version,
new_version: latest_release.tag_name.clone(),
release_notes: latest_release.body.clone(),
download_url,
download_url: download_url.unwrap_or_else(|| release_page_url.clone()),
is_nightly,
published_at: latest_release.published_at.clone(),
manual_update_required,
release_page_url: Some(release_page_url),
};
println!(
"Update info prepared: {} -> {}",
update_info.current_version, update_info.new_version
log::info!(
"Update info prepared: {} -> {} (manual_update_required: {})",
update_info.current_version,
update_info.new_version,
update_info.manual_update_required
);
return Ok(Some(update_info));
} else {
println!("No suitable download asset found for current platform");
}
#[cfg(not(target_os = "linux"))]
{
if let Some(url) = download_url {
let update_info = AppUpdateInfo {
current_version,
new_version: latest_release.tag_name.clone(),
release_notes: latest_release.body.clone(),
download_url: url,
is_nightly,
published_at: latest_release.published_at.clone(),
manual_update_required: false,
release_page_url: Some(release_page_url),
};
log::info!(
"Update info prepared: {} -> {}",
update_info.current_version,
update_info.new_version
);
return Ok(Some(update_info));
} else {
log::info!("No suitable download asset found for current platform");
}
}
} else {
println!("No update needed");
log::info!("No update needed");
}
Ok(None)
@@ -259,7 +303,7 @@ impl AppAutoUpdater {
return false;
}
println!(
log::info!(
"Comparing versions: current={current_version}, new={new_version}, is_nightly={is_nightly}"
);
@@ -271,20 +315,20 @@ impl AppAutoUpdater {
) {
// Different commit hashes mean we should update
let should_update = new_hash != current_hash;
println!("Nightly comparison: current_hash={current_hash}, new_hash={new_hash}, should_update={should_update}");
log::info!("Nightly comparison: current_hash={current_hash}, new_hash={new_hash}, should_update={should_update}");
return should_update;
}
// If current version doesn't have nightly prefix but we're in nightly mode,
// this could be a dev build or stable build upgrading to nightly
if !current_version.starts_with("nightly-") {
println!("Upgrading from non-nightly to nightly: {new_version}");
log::info!("Upgrading from non-nightly to nightly: {new_version}");
return true;
}
} else {
// For stable builds, use semantic versioning comparison
let should_update = self.is_version_newer(new_version, current_version);
println!("Stable comparison: {new_version} > {current_version} = {should_update}");
log::info!("Stable comparison: {new_version} > {current_version} = {should_update}");
return should_update;
}
@@ -352,7 +396,7 @@ impl AppAutoUpdater {
};
let exe_path_str = exe_path.to_string_lossy();
println!("Detecting installation method for: {exe_path_str}");
log::info!("Detecting installation method for: {exe_path_str}");
// Check if installed via package manager by querying package databases
if let Some(exe_name) = exe_path.file_name().and_then(|n| n.to_str()) {
@@ -363,7 +407,7 @@ impl AppAutoUpdater {
if output.status.success() {
let stdout = String::from_utf8_lossy(&output.stdout);
if !stdout.trim().is_empty() && !stdout.contains("no path found") {
println!("Found DEB package owning the executable");
log::info!("Found DEB package owning the executable");
return LinuxInstallationMethod::Deb;
}
}
@@ -374,7 +418,7 @@ impl AppAutoUpdater {
if output.status.success() {
let stdout = String::from_utf8_lossy(&output.stdout);
if !stdout.trim().is_empty() && !stdout.contains("not owned") {
println!("Found RPM package owning the executable");
log::info!("Found RPM package owning the executable");
return LinuxInstallationMethod::Rpm;
}
}
@@ -389,7 +433,7 @@ impl AppAutoUpdater {
if output.status.success() {
let stdout = String::from_utf8_lossy(&output.stdout);
if !stdout.trim().is_empty() && stdout.contains(exe_name) {
println!("Found RPM package via {rpm_cmd}");
log::info!("Found RPM package via {rpm_cmd}");
return LinuxInstallationMethod::Rpm;
}
}
@@ -400,7 +444,7 @@ impl AppAutoUpdater {
// Check installation location to infer method
if exe_path_str.starts_with("/usr/bin/") || exe_path_str.starts_with("/usr/local/bin/") {
// Likely installed via package manager or system-wide installation
println!("Executable in system directory, assuming package installation");
log::info!("Executable in system directory, assuming package installation");
// Try to determine which package system is available
if Command::new("dpkg").arg("--version").output().is_ok() {
@@ -412,11 +456,11 @@ impl AppAutoUpdater {
return LinuxInstallationMethod::Manual;
} else if exe_path_str.contains("/.local/") || exe_path_str.starts_with("/home/") {
// User-local installation
println!("Executable in user directory, assuming manual installation");
log::info!("Executable in user directory, assuming manual installation");
return LinuxInstallationMethod::Manual;
}
println!("Could not determine installation method");
log::info!("Could not determine installation method");
LinuxInstallationMethod::Unknown
}
@@ -430,13 +474,13 @@ impl AppAutoUpdater {
"unknown"
};
println!("Looking for platform-specific asset for arch: {arch}");
log::info!("Looking for platform-specific asset for arch: {arch}");
#[cfg(target_os = "linux")]
{
// If we're running from an AppImage, disable auto-updates for safety
if self.is_running_from_appimage() {
println!("Running from AppImage - auto-updates disabled for safety");
log::info!("Running from AppImage - auto-updates disabled for safety");
return None;
}
}
@@ -458,7 +502,7 @@ impl AppAutoUpdater {
#[cfg(not(any(target_os = "macos", target_os = "windows", target_os = "linux")))]
{
println!("Unsupported platform for auto-update");
log::info!("Unsupported platform for auto-update");
None
}
}
@@ -473,7 +517,7 @@ impl AppAutoUpdater {
|| asset.name.contains(&format!("_{arch}_"))
|| asset.name.contains(&format!("-{arch}-")))
{
println!("Found exact architecture match: {}", asset.name);
log::info!("Found exact architecture match: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -484,7 +528,7 @@ impl AppAutoUpdater {
if asset.name.contains(".dmg")
&& (asset.name.contains("x86_64") || asset.name.contains("x86-64"))
{
println!("Found x86_64 variant: {}", asset.name);
log::info!("Found x86_64 variant: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -496,7 +540,7 @@ impl AppAutoUpdater {
if asset.name.contains(".dmg")
&& (asset.name.contains("arm64") || asset.name.contains("aarch64"))
{
println!("Found arm64 variant: {}", asset.name);
log::info!("Found arm64 variant: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -509,7 +553,7 @@ impl AppAutoUpdater {
|| asset.name.to_lowercase().contains("darwin")
|| !asset.name.contains(".app.tar.gz"))
{
println!("Found fallback DMG: {}", asset.name);
log::info!("Found fallback DMG: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -531,7 +575,7 @@ impl AppAutoUpdater {
|| asset.name.contains(&format!("_{arch}_"))
|| asset.name.contains(&format!("-{arch}-")))
{
println!("Found Windows {ext} with exact arch match: {}", asset.name);
log::info!("Found Windows {ext} with exact arch match: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -542,7 +586,7 @@ impl AppAutoUpdater {
if asset.name.to_lowercase().ends_with(&format!(".{ext}"))
&& (asset.name.contains("x86_64") || asset.name.contains("x86-64"))
{
println!("Found Windows {ext} with x86_64 variant: {}", asset.name);
log::info!("Found Windows {ext} with x86_64 variant: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -555,7 +599,7 @@ impl AppAutoUpdater {
|| asset.name.to_lowercase().contains("win32")
|| asset.name.to_lowercase().contains("win64"))
{
println!("Found Windows {ext} fallback: {}", asset.name);
log::info!("Found Windows {ext} fallback: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -568,7 +612,7 @@ impl AppAutoUpdater {
fn get_linux_download_url(&self, assets: &[AppReleaseAsset], arch: &str) -> Option<String> {
// Detect installation method to prioritize appropriate formats
let installation_method = self.detect_linux_installation_method();
println!("Detected Linux installation method: {installation_method:?}");
log::info!("Detected Linux installation method: {installation_method:?}");
// Priority order based on installation method
let extensions = match installation_method {
@@ -576,7 +620,7 @@ impl AppAutoUpdater {
LinuxInstallationMethod::Rpm => vec!["rpm", "tar.gz"],
LinuxInstallationMethod::AppImage => {
// AppImages should not auto-update for safety
println!("AppImage installation detected - auto-updates disabled");
log::info!("AppImage installation detected - auto-updates disabled");
return None;
}
LinuxInstallationMethod::Manual | LinuxInstallationMethod::Unknown => {
@@ -594,7 +638,7 @@ impl AppAutoUpdater {
|| asset.name.contains(&format!("_{arch}_"))
|| asset.name.contains(&format!("-{arch}-")))
{
println!("Found Linux {ext} with exact arch match: {}", asset.name);
log::info!("Found Linux {ext} with exact arch match: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -608,7 +652,7 @@ impl AppAutoUpdater {
|| asset.name.contains("x86-64")
|| asset.name.contains("amd64"))
{
println!("Found Linux {ext} with x86_64 variant: {}", asset.name);
log::info!("Found Linux {ext} with x86_64 variant: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -621,7 +665,7 @@ impl AppAutoUpdater {
if asset_name_lower.ends_with(&format!(".{ext}"))
&& (asset.name.contains("arm64") || asset.name.contains("aarch64"))
{
println!("Found Linux {ext} with arm64 variant: {}", asset.name);
log::info!("Found Linux {ext} with arm64 variant: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
@@ -635,7 +679,7 @@ impl AppAutoUpdater {
|| asset_name_lower.contains("ubuntu")
|| asset_name_lower.contains("debian"))
{
println!("Found Linux {ext} fallback: {}", asset.name);
log::info!("Found Linux {ext} fallback: {}", asset.name);
return Some(asset.browser_download_url.clone());
}
}
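The asset-selection priority above keys the preferred package formats off the detected installation method, with AppImage installs opting out of auto-update entirely. A sketch under stated assumptions (the enum and extension lists are illustrative stand-ins; the real `LinuxInstallationMethod` also carries an `Unknown` variant):

```rust
// Illustrative stand-in for the Linux install-method enum.
#[derive(Debug, PartialEq)]
enum InstallMethod {
    Deb,
    Rpm,
    AppImage,
    Manual,
}

// Preferred download extensions per install method; None means
// auto-update is disabled and the user must update manually.
fn preferred_extensions(method: &InstallMethod) -> Option<Vec<&'static str>> {
    match method {
        InstallMethod::Deb => Some(vec!["deb", "tar.gz"]),
        InstallMethod::Rpm => Some(vec!["rpm", "tar.gz"]),
        // An AppImage replacing itself on disk is risky; require manual update.
        InstallMethod::AppImage => None,
        InstallMethod::Manual => Some(vec!["tar.gz", "deb", "rpm"]),
    }
}

fn main() {
    assert_eq!(
        preferred_extensions(&InstallMethod::Deb),
        Some(vec!["deb", "tar.gz"])
    );
    assert!(preferred_extensions(&InstallMethod::AppImage).is_none());
}
```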
@@ -829,8 +873,6 @@ impl AppAutoUpdater {
archive_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
let extractor = crate::extraction::Extractor::instance();
let file_name = archive_path
.file_name()
.and_then(|name| name.to_str())
@@ -838,7 +880,7 @@ impl AppAutoUpdater {
// Handle compound extensions like .tar.gz
if file_name.ends_with(".tar.gz") {
return extractor.extract_tar_gz(archive_path, dest_dir).await;
return self.extractor.extract_tar_gz(archive_path, dest_dir).await;
}
let extension = archive_path
@@ -850,7 +892,7 @@ impl AppAutoUpdater {
"dmg" => {
#[cfg(target_os = "macos")]
{
extractor.extract_dmg(archive_path, dest_dir).await
self.extractor.extract_dmg(archive_path, dest_dir).await
}
#[cfg(not(target_os = "macos"))]
{
@@ -914,7 +956,7 @@ impl AppAutoUpdater {
Err("AppImage installation is only supported on Linux".into())
}
}
"zip" => extractor.extract_zip(archive_path, dest_dir).await,
"zip" => self.extractor.extract_zip(archive_path, dest_dir).await,
_ => Err(format!("Unsupported archive format: {extension}").into()),
}
}
@@ -967,12 +1009,12 @@ impl AppAutoUpdater {
.and_then(|ext| ext.to_str())
.unwrap_or("");
println!("Installing Windows update with extension: {extension}");
log::info!("Installing Windows update with extension: {extension}");
match extension {
"msi" => {
// Install MSI silently with enhanced error handling
println!("Running MSI installer: {}", installer_path.display());
log::info!("Running MSI installer: {}", installer_path.display());
let mut cmd = Command::new("msiexec");
cmd.args([
@@ -995,10 +1037,10 @@ impl AppAutoUpdater {
let log_path = format!("{}.log", installer_path.to_str().unwrap());
let log_content = fs::read_to_string(&log_path).unwrap_or_default();
println!("MSI installation failed with exit code: {exit_code}");
println!("Error output: {error_msg}");
log::info!("MSI installation failed with exit code: {exit_code}");
log::info!("Error output: {error_msg}");
if !log_content.is_empty() {
println!(
log::info!(
"Log file content (last 500 chars): {}",
&log_content
.chars()
@@ -1016,11 +1058,11 @@ impl AppAutoUpdater {
);
}
println!("MSI installation completed successfully");
log::info!("MSI installation completed successfully");
}
"exe" => {
// Run exe installer silently with multiple fallback options
println!("Running EXE installer: {}", installer_path.display());
log::info!("Running EXE installer: {}", installer_path.display());
// Try NSIS silent flag first (most common for Tauri)
let mut success = false;
@@ -1035,12 +1077,12 @@ impl AppAutoUpdater {
];
for args in nsis_args {
println!("Trying installer with args: {:?}", args);
log::info!("Trying installer with args: {:?}", args);
let output = Command::new(installer_path).args(&args).output();
match output {
Ok(output) if output.status.success() => {
println!(
log::info!(
"EXE installation completed successfully with args: {:?}",
args
);
@@ -1054,13 +1096,14 @@ impl AppAutoUpdater {
output.status.code().unwrap_or(-1),
error_msg
);
println!("Installer failed with args {:?}: {}", args, last_error);
log::info!("Installer failed with args {:?}: {}", args, last_error);
}
Err(e) => {
last_error = format!("Failed to execute installer: {e}");
println!(
log::info!(
"Failed to execute installer with args {:?}: {}",
args, last_error
args,
last_error
);
}
}
@@ -1077,14 +1120,14 @@ impl AppAutoUpdater {
}
"zip" => {
// Handle ZIP files by extracting and replacing the current executable
println!("Handling ZIP update: {}", installer_path.display());
log::info!("Handling ZIP update: {}", installer_path.display());
let temp_extract_dir = installer_path.parent().unwrap().join("extracted");
fs::create_dir_all(&temp_extract_dir)?;
// Extract ZIP file
let extractor = crate::extraction::Extractor::instance();
let extracted_path = extractor
let extracted_path = self
.extractor
.extract_zip(installer_path, &temp_extract_dir)
.await?;
@@ -1124,7 +1167,7 @@ impl AppAutoUpdater {
// Clean up
let _ = fs::remove_dir_all(&temp_extract_dir);
println!("ZIP update completed successfully");
log::info!("ZIP update completed successfully");
}
_ => {
return Err(format!("Unsupported installer format: {extension}").into());
@@ -1141,7 +1184,7 @@ impl AppAutoUpdater {
.and_then(|name| name.to_str())
.unwrap_or("");
println!("Installing Linux update: {}", installer_path.display());
log::info!("Installing Linux update: {}", installer_path.display());
// Handle compound extensions like .tar.gz
if file_name.ends_with(".tar.gz") {
@@ -1173,7 +1216,7 @@ impl AppAutoUpdater {
&self,
deb_path: &Path,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
println!("Installing DEB package: {}", deb_path.display());
log::info!("Installing DEB package: {}", deb_path.display());
// Try different package managers in order of preference
let package_managers = [
@@ -1186,23 +1229,23 @@ impl AppAutoUpdater {
for (manager, args) in &package_managers {
// Check if package manager exists
if Command::new("which").arg(manager).output().is_ok() {
println!("Trying to install with {manager}");
log::info!("Trying to install with {manager}");
let output = Command::new("pkexec").arg(manager).args(args).output();
match output {
Ok(output) if output.status.success() => {
println!("DEB installation completed successfully with {manager}");
log::info!("DEB installation completed successfully with {manager}");
return Ok(());
}
Ok(output) => {
let error_msg = String::from_utf8_lossy(&output.stderr);
last_error = format!("{manager} failed: {error_msg}");
println!("Installation failed with {manager}: {error_msg}");
log::info!("Installation failed with {manager}: {error_msg}");
}
Err(e) => {
last_error = format!("Failed to execute {manager}: {e}");
println!("Failed to execute {manager}: {e}");
log::info!("Failed to execute {manager}: {e}");
}
}
}
@@ -1217,7 +1260,7 @@ impl AppAutoUpdater {
&self,
rpm_path: &Path,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
println!("Installing RPM package: {}", rpm_path.display());
log::info!("Installing RPM package: {}", rpm_path.display());
// Try different package managers in order of preference
let package_managers = [
@@ -1232,23 +1275,23 @@ impl AppAutoUpdater {
for (manager, args) in &package_managers {
// Check if package manager exists
if Command::new("which").arg(manager).output().is_ok() {
println!("Trying to install with {manager}");
log::info!("Trying to install with {manager}");
let output = Command::new("pkexec").arg(manager).args(args).output();
match output {
Ok(output) if output.status.success() => {
println!("RPM installation completed successfully with {manager}");
log::info!("RPM installation completed successfully with {manager}");
return Ok(());
}
Ok(output) => {
let error_msg = String::from_utf8_lossy(&output.stderr);
last_error = format!("{manager} failed: {error_msg}");
println!("Installation failed with {manager}: {error_msg}");
log::info!("Installation failed with {manager}: {error_msg}");
}
Err(e) => {
last_error = format!("Failed to execute {manager}: {e}");
println!("Failed to execute {manager}: {e}");
log::info!("Failed to execute {manager}: {e}");
}
}
}
@@ -1263,7 +1306,7 @@ impl AppAutoUpdater {
&self,
appimage_path: &Path,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
println!("Installing AppImage: {}", appimage_path.display());
log::info!("Installing AppImage: {}", appimage_path.display());
// This function should not be called for AppImages since we disable auto-updates for them
// But if it somehow gets called, we'll handle it safely
@@ -1297,7 +1340,7 @@ impl AppAutoUpdater {
// Replace the AppImage
fs::copy(appimage_path, &current_appimage)?;
println!("AppImage replacement completed successfully");
log::info!("AppImage replacement completed successfully");
Ok(())
}
@@ -1307,15 +1350,15 @@ impl AppAutoUpdater {
&self,
tarball_path: &Path,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
println!("Installing tarball: {}", tarball_path.display());
log::info!("Installing tarball: {}", tarball_path.display());
let current_exe = self.get_current_app_path()?;
let temp_extract_dir = tarball_path.parent().unwrap().join("extracted");
fs::create_dir_all(&temp_extract_dir)?;
// Extract tarball
let extractor = crate::extraction::Extractor::instance();
let extracted_path = extractor
let extracted_path = self
.extractor
.extract_tar_gz(tarball_path, &temp_extract_dir)
.await?;
@@ -1369,7 +1412,7 @@ impl AppAutoUpdater {
// Clean up
let _ = fs::remove_dir_all(&temp_extract_dir);
println!("Tarball installation completed successfully");
log::info!("Tarball installation completed successfully");
Ok(())
}
@@ -1606,7 +1649,7 @@ pub async fn download_and_install_app_update(
#[tauri::command]
pub async fn check_for_app_updates_manual() -> Result<Option<AppUpdateInfo>, String> {
println!("Manual app update check triggered");
log::info!("Manual app update check triggered");
let updater = AppAutoUpdater::instance();
updater
.check_for_updates()
@@ -1614,14 +1657,6 @@ pub async fn check_for_app_updates_manual() -> Result<Option<AppUpdateInfo>, Str
.map_err(|e| format!("Failed to check for app updates: {e}"))
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct PlatformInfo {
pub os: String,
pub arch: String,
pub installation_method: String,
pub supported_formats: Vec<String>,
}
#[cfg(test)]
mod tests {
use super::*;
@@ -1630,7 +1665,7 @@ mod tests {
fn test_is_nightly_build() {
// This will depend on whether STABLE_RELEASE is set during test compilation
let is_nightly = AppAutoUpdater::is_nightly_build();
println!("Is nightly build: {is_nightly}");
log::info!("Is nightly build: {is_nightly}");
// The result should be true for test builds since STABLE_RELEASE is not set
// unless the test is run in a stable release environment
@@ -1,6 +1,5 @@
use crate::api_client::is_browser_version_nightly;
use crate::browser_version_manager::{BrowserVersionInfo, BrowserVersionManager};
use crate::profile::BrowserProfile;
use crate::profile::{BrowserProfile, ProfileManager};
use crate::settings_manager::SettingsManager;
use serde::{Deserialize, Serialize};
use std::collections::{HashMap, HashSet};
@@ -29,15 +28,17 @@ pub struct AutoUpdateState {
}
pub struct AutoUpdater {
version_service: &'static BrowserVersionManager,
browser_version_manager: &'static BrowserVersionManager,
settings_manager: &'static SettingsManager,
profile_manager: &'static ProfileManager,
}
impl AutoUpdater {
fn new() -> Self {
Self {
version_service: BrowserVersionManager::instance(),
browser_version_manager: BrowserVersionManager::instance(),
settings_manager: SettingsManager::instance(),
profile_manager: ProfileManager::instance(),
}
}
@@ -53,8 +54,8 @@ impl AutoUpdater {
let mut browser_versions: HashMap<String, Vec<BrowserVersionInfo>> = HashMap::new();
// Group profiles by browser
let profile_manager = crate::profile::ProfileManager::instance();
let profiles = profile_manager
let profiles = self
.profile_manager
.list_profiles()
.map_err(|e| format!("Failed to list profiles: {e}"))?;
let mut browser_profiles: HashMap<String, Vec<BrowserProfile>> = HashMap::new();
@@ -62,7 +63,7 @@ impl AutoUpdater {
for profile in profiles {
// Only check supported browsers
if !self
.version_service
.browser_version_manager
.is_browser_supported(&profile.browser)
.unwrap_or(false)
{
@@ -78,14 +79,14 @@ impl AutoUpdater {
for (browser, profiles) in browser_profiles {
// Get cached versions first, then try to fetch if needed
let versions = if let Some(cached) = self
.version_service
.browser_version_manager
.get_cached_browser_versions_detailed(&browser)
{
cached
} else if self.version_service.should_update_cache(&browser) {
} else if self.browser_version_manager.should_update_cache(&browser) {
// Try to fetch fresh versions
match self
.version_service
.browser_version_manager
.fetch_browser_versions_detailed(&browser, false)
.await
{
@@ -108,13 +109,13 @@ impl AutoUpdater {
let new_version = &update.new_version.parse::<u32>().unwrap();
let result = new_version - current_version;
println!(
log::info!(
"Current version: {current_version}, New version: {new_version}, Result: {result}"
);
if result > 400 {
notifications.push(update);
} else {
println!(
log::info!(
"Skipping chromium update notification: only {result} new versions (need 400+)"
);
}
@@ -129,62 +130,66 @@ impl AutoUpdater {
}
pub async fn check_for_updates_with_progress(&self, app_handle: &tauri::AppHandle) {
println!("Starting auto-update check with progress...");
log::info!("Starting auto-update check with progress...");
// Check for browser updates and trigger auto-downloads
match self.check_for_updates().await {
Ok(update_notifications) => {
if !update_notifications.is_empty() {
println!(
log::info!(
"Found {} browser updates to auto-download",
update_notifications.len()
);
// Trigger automatic downloads for each update
for notification in update_notifications {
println!(
log::info!(
"Auto-downloading {} version {}",
notification.browser, notification.new_version
notification.browser,
notification.new_version
);
// Clone app_handle for the async task
let app_handle_clone = app_handle.clone();
let browser = notification.browser.clone();
let new_version = notification.new_version.clone();
let notification_id = notification.id.clone();
let affected_profiles = notification.affected_profiles.clone();
let app_handle_clone = app_handle.clone();
// Spawn async task to handle the download and auto-update
tokio::spawn(async move {
// TODO: update the logic to use the downloaded browsers registry instance instead of the static method
// First, check if browser already exists
match crate::browser_runner::is_browser_downloaded(
match crate::downloaded_browsers_registry::is_browser_downloaded(
browser.clone(),
new_version.clone(),
) {
true => {
println!("Browser {browser} {new_version} already downloaded, proceeding to auto-update profiles");
log::info!("Browser {browser} {new_version} already downloaded, proceeding to auto-update profiles");
// Browser already exists, go straight to profile update
match crate::auto_updater::complete_browser_update_with_auto_update(
browser.clone(),
new_version.clone(),
)
.await
match AutoUpdater::instance()
.complete_browser_update_with_auto_update(
&app_handle_clone,
&browser.clone(),
&new_version.clone(),
)
.await
{
Ok(updated_profiles) => {
println!(
log::info!(
"Auto-update completed for {} profiles: {:?}",
updated_profiles.len(),
updated_profiles
);
}
Err(e) => {
eprintln!("Failed to complete auto-update for {browser}: {e}");
log::error!("Failed to complete auto-update for {browser}: {e}");
}
}
}
false => {
println!("Downloading browser {browser} version {new_version}...");
log::info!("Downloading browser {browser} version {new_version}...");
// Emit the auto-update event to trigger frontend handling
let auto_update_event = serde_json::json!({
@@ -197,20 +202,20 @@ impl AutoUpdater {
if let Err(e) =
app_handle_clone.emit("browser-auto-update-available", &auto_update_event)
{
eprintln!("Failed to emit auto-update event for {browser}: {e}");
log::error!("Failed to emit auto-update event for {browser}: {e}");
} else {
println!("Emitted auto-update event for {browser}");
log::info!("Emitted auto-update event for {browser}");
}
}
}
});
}
} else {
println!("No browser updates needed");
log::info!("No browser updates needed");
}
}
Err(e) => {
eprintln!("Failed to check for browser updates: {e}");
log::error!("Failed to check for browser updates: {e}");
}
}
}
@@ -222,7 +227,8 @@ impl AutoUpdater {
available_versions: &[BrowserVersionInfo],
) -> Result<Option<UpdateNotification>, Box<dyn std::error::Error + Send + Sync>> {
let current_version = &profile.version;
let is_current_nightly = is_browser_version_nightly(&profile.browser, current_version, None);
let is_current_nightly =
crate::api_client::is_browser_version_nightly(&profile.browser, current_version, None);
// Find the best available update
let best_update = available_versions
@@ -230,7 +236,8 @@ impl AutoUpdater {
.filter(|v| {
// Only consider versions newer than current
self.is_version_newer(&v.version, current_version)
&& is_browser_version_nightly(&profile.browser, &v.version, None) == is_current_nightly
&& crate::api_client::is_browser_version_nightly(&profile.browser, &v.version, None)
== is_current_nightly
})
.max_by(|a, b| self.compare_versions(&a.version, &b.version));
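The selection above only considers candidates that are both newer than the current version and on the same channel (nightly vs stable). A std-only sketch of that filter-then-max shape, with types simplified and all names illustrative (the real code works on `BrowserVersionInfo` and `is_browser_version_nightly`):

```rust
// Illustrative sketch: pick the newest available version that is newer
// than `current` and on the same channel (nightly vs stable).
fn best_update<'a>(
    current: &str,
    current_is_nightly: bool,
    available: &'a [(&'a str, bool)], // (version, is_nightly) — simplified stand-in
    newer: impl Fn(&str, &str) -> bool,
) -> Option<&'a str> {
    // Numeric dotted-version key so "10.0" sorts after "9.0".
    let key = |v: &str| -> Vec<u64> {
        v.split('.').map(|p| p.parse().unwrap_or(0)).collect()
    };
    available
        .iter()
        // Only versions newer than current AND on the matching channel.
        .filter(|&&(v, nightly)| newer(v, current) && nightly == current_is_nightly)
        .map(|&(v, _)| v)
        .max_by_key(|v| key(v))
}
```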
@@ -293,11 +300,12 @@ impl AutoUpdater {
/// Automatically update all affected profile versions after browser download
pub async fn auto_update_profile_versions(
&self,
app_handle: &tauri::AppHandle,
browser: &str,
new_version: &str,
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
let profile_manager = crate::profile::ProfileManager::instance();
let profiles = profile_manager
let profiles = self
.profile_manager
.list_profiles()
.map_err(|e| format!("Failed to list profiles: {e}"))?;
@@ -314,12 +322,16 @@ impl AutoUpdater {
// Check if this is an update (newer version)
if self.is_version_newer(new_version, &profile.version) {
// Update the profile version
match profile_manager.update_profile_version(&profile.name, new_version) {
match self.profile_manager.update_profile_version(
app_handle,
&profile.id.to_string(),
new_version,
) {
Ok(_) => {
updated_profiles.push(profile.name);
}
Err(e) => {
eprintln!("Failed to update profile {}: {}", profile.name, e);
log::error!("Failed to update profile {}: {}", profile.name, e);
}
}
}
@@ -332,12 +344,13 @@ impl AutoUpdater {
/// Complete browser update process with auto-update of profile versions
pub async fn complete_browser_update_with_auto_update(
&self,
app_handle: &tauri::AppHandle,
browser: &str,
new_version: &str,
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
// Auto-update profile versions first
let updated_profiles = self
.auto_update_profile_versions(browser, new_version)
.auto_update_profile_versions(app_handle, browser, new_version)
.await?;
// Remove browser from disabled list and clean up auto-update tracking
@@ -347,46 +360,9 @@ impl AutoUpdater {
state.auto_update_downloads.remove(&download_key);
self.save_auto_update_state(&state)?;
// Always perform cleanup after auto-update - don't fail the update if cleanup fails
if let Err(e) = self.cleanup_unused_binaries_internal() {
eprintln!("Warning: Failed to cleanup unused binaries after auto-update: {e}");
}
Ok(updated_profiles)
}
/// Internal method to cleanup unused binaries (used by auto-cleanup)
fn cleanup_unused_binaries_internal(
&self,
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
// Load current profiles
let profile_manager = crate::profile::ProfileManager::instance();
let profiles = profile_manager
.list_profiles()
.map_err(|e| format!("Failed to load profiles: {e}"))?;
// Get registry instance
let registry = crate::downloaded_browsers::DownloadedBrowsersRegistry::instance();
// Get active browser versions (all profiles)
let active_versions = registry.get_active_browser_versions(&profiles);
// Get running browser versions (only running profiles)
let running_versions = registry.get_running_browser_versions(&profiles);
// Cleanup unused binaries (but keep running ones)
let cleaned_up = registry
.cleanup_unused_binaries(&active_versions, &running_versions)
.map_err(|e| format!("Failed to cleanup unused binaries: {e}"))?;
// Save updated registry
registry
.save()
.map_err(|e| format!("Failed to save registry: {e}"))?;
Ok(cleaned_up)
}
/// Check if browser is disabled due to ongoing update
pub fn is_browser_disabled(
&self,
@@ -408,17 +384,11 @@ impl AutoUpdater {
}
fn is_version_newer(&self, version1: &str, version2: &str) -> bool {
// Use the proper VersionComponent comparison from api_client.rs
let version_a = crate::api_client::VersionComponent::parse(version1);
let version_b = crate::api_client::VersionComponent::parse(version2);
version_a > version_b
crate::api_client::is_version_newer(version1, version2)
}
fn compare_versions(&self, version1: &str, version2: &str) -> std::cmp::Ordering {
// Use the proper VersionComponent comparison from api_client.rs
let version_a = crate::api_client::VersionComponent::parse(version1);
let version_b = crate::api_client::VersionComponent::parse(version2);
version_a.cmp(&version_b)
crate::api_client::compare_versions(version1, version2)
}
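The refactor above delegates `is_version_newer` and `compare_versions` to shared helpers in `api_client.rs` instead of parsing `VersionComponent` inline. A simplified std-only sketch of the dotted-version semantics those helpers provide (the real `VersionComponent` also handles non-numeric suffixes; names here mirror the diff but the bodies are illustrative):

```rust
use std::cmp::Ordering;

// Simplified dotted-version comparison: compare numeric components
// left to right; missing components count as 0, so "1.2" == "1.2.0".
fn compare_versions(a: &str, b: &str) -> Ordering {
    let parse = |v: &str| -> Vec<u64> {
        v.split('.').map(|p| p.parse().unwrap_or(0)).collect()
    };
    let (va, vb) = (parse(a), parse(b));
    for i in 0..va.len().max(vb.len()) {
        let (x, y) = (*va.get(i).unwrap_or(&0), *vb.get(i).unwrap_or(&0));
        match x.cmp(&y) {
            Ordering::Equal => continue,
            other => return other,
        }
    }
    Ordering::Equal
}

fn is_version_newer(a: &str, b: &str) -> bool {
    compare_versions(a, b) == Ordering::Greater
}
```

Numeric comparison matters here: a plain string compare would rank "1.9" above "1.10".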
fn get_auto_update_state_file(&self) -> PathBuf {
@@ -455,6 +425,39 @@ impl AutoUpdater {
Ok(())
}
/// Get pending update versions for a specific browser
/// Returns a set of (browser, version) pairs that have pending updates
pub fn get_pending_update_versions(
&self,
) -> Result<std::collections::HashSet<(String, String)>, Box<dyn std::error::Error + Send + Sync>>
{
let state = self.load_auto_update_state()?;
let mut pending_versions = std::collections::HashSet::new();
for update in &state.pending_updates {
pending_versions.insert((update.browser.clone(), update.new_version.clone()));
}
Ok(pending_versions)
}
/// Get pending update for a specific browser version if it exists
pub fn get_pending_update(
&self,
browser: &str,
current_version: &str,
) -> Result<Option<UpdateNotification>, Box<dyn std::error::Error + Send + Sync>> {
let state = self.load_auto_update_state()?;
for update in &state.pending_updates {
if update.browser == browser && update.current_version == current_version {
return Ok(Some(update.clone()));
}
}
Ok(None)
}
}
// Tauri commands
@@ -470,14 +473,6 @@ pub async fn check_for_browser_updates() -> Result<Vec<UpdateNotification>, Stri
Ok(grouped)
}
#[tauri::command]
pub async fn is_browser_disabled_for_update(browser: String) -> Result<bool, String> {
let updater = AutoUpdater::instance();
updater
.is_browser_disabled(&browser)
.map_err(|e| format!("Failed to check browser status: {e}"))
}
#[tauri::command]
pub async fn dismiss_update_notification(notification_id: String) -> Result<(), String> {
let updater = AutoUpdater::instance();
@@ -488,12 +483,13 @@ pub async fn dismiss_update_notification(notification_id: String) -> Result<(),
#[tauri::command]
pub async fn complete_browser_update_with_auto_update(
app_handle: tauri::AppHandle,
browser: String,
new_version: String,
) -> Result<Vec<String>, String> {
let updater = AutoUpdater::instance();
updater
.complete_browser_update_with_auto_update(&browser, &new_version)
.complete_browser_update_with_auto_update(&app_handle, &browser, &new_version)
.await
.map_err(|e| format!("Failed to complete browser update: {e}"))
}
@@ -520,6 +516,8 @@ mod tests {
release_type: "stable".to_string(),
camoufox_config: None,
group_id: None,
tags: Vec::new(),
note: None,
}
}
@@ -883,7 +881,7 @@ mod tests {
use tempfile::TempDir;
// Create a temporary directory for testing
let temp_dir = TempDir::new().expect("Failed to create temp directory");
let temp_dir = TempDir::new().unwrap();
// Create a mock settings manager that uses the temp directory
struct TestSettingsManager {
@@ -0,0 +1,322 @@
use clap::{Arg, Command};
use donutbrowser_lib::proxy_runner::{
start_proxy_process_with_profile, stop_all_proxy_processes, stop_proxy_process,
};
use donutbrowser_lib::proxy_server::run_proxy_server;
use donutbrowser_lib::proxy_storage::get_proxy_config;
use std::process;
fn set_high_priority() {
#[cfg(unix)]
{
unsafe {
// Set high priority (negative nice value = higher priority)
// -10 is a reasonably high priority without being too aggressive
// This may fail without elevated privileges, which is fine
let result = libc::setpriority(libc::PRIO_PROCESS, 0, -10);
if result == 0 {
log::info!("Set process priority to -10 (high priority)");
} else {
// Try a less aggressive priority if -10 fails
let result = libc::setpriority(libc::PRIO_PROCESS, 0, -5);
if result == 0 {
log::info!("Set process priority to -5 (above normal)");
}
}
}
}
#[cfg(target_os = "linux")]
{
// Lower OOM score so this process is less likely to be killed under memory pressure
// Valid range is -1000 to 1000, lower = less likely to be killed
// -500 is a reasonable value that makes us less likely to be killed
if let Err(e) = std::fs::write("/proc/self/oom_score_adj", "-500") {
log::debug!("Could not set OOM score adjustment: {}", e);
} else {
log::info!("Set OOM score adjustment to -500");
}
}
#[cfg(windows)]
{
use windows::Win32::System::Threading::{
GetCurrentProcess, SetPriorityClass, ABOVE_NORMAL_PRIORITY_CLASS,
};
unsafe {
let process = GetCurrentProcess();
if SetPriorityClass(process, ABOVE_NORMAL_PRIORITY_CLASS).is_ok() {
log::info!("Set process priority to ABOVE_NORMAL_PRIORITY_CLASS");
} else {
log::debug!("Could not set process priority class");
}
}
}
}
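`set_high_priority` above tries an aggressive nice value (-10) and, if the OS rejects it, retries with a milder one (-5), treating total failure as acceptable. That "try strong, fall back to mild" pattern can be factored out as a std-only sketch (the helper name and shape are hypothetical, not from the diff):

```rust
// Illustrative helper: apply candidate values in order of preference,
// returning the index of the first one the setter accepts (None if all fail).
fn try_with_fallback<T, E>(
    attempts: &[T],
    mut apply: impl FnMut(&T) -> Result<(), E>,
) -> Option<usize> {
    for (i, a) in attempts.iter().enumerate() {
        if apply(a).is_ok() {
            return Some(i);
        }
    }
    None // all candidates rejected — caller may carry on without priority boost
}
```

Used with `[-10, -5]`, this mirrors the unix branch: the stronger value is tried first, and a permissions failure silently degrades to the next candidate.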
fn build_proxy_url(
proxy_type: &str,
host: &str,
port: u16,
username: Option<&str>,
password: Option<&str>,
) -> String {
let mut url = format!("{}://", proxy_type.to_lowercase());
if let (Some(user), Some(pass)) = (username, password) {
let encoded_user = urlencoding::encode(user);
let encoded_pass = urlencoding::encode(pass);
url.push_str(&format!("{}:{}@", encoded_user, encoded_pass));
} else if let Some(user) = username {
let encoded_user = urlencoding::encode(user);
url.push_str(&format!("{}@", encoded_user));
}
url.push_str(host);
url.push(':');
url.push_str(&port.to_string());
url
}
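`build_proxy_url` percent-encodes credentials before splicing them into the userinfo part, so passwords containing `:` or `@` cannot corrupt the URL. A self-contained sketch of the same assembly, with a hand-rolled std-only encoder standing in for the `urlencoding` crate (helper names are illustrative, not from the diff):

```rust
// Minimal userinfo percent-encoder: keep RFC 3986 unreserved characters,
// escape everything else as %XX.
fn encode_userinfo(s: &str) -> String {
    let mut out = String::new();
    for b in s.bytes() {
        match b {
            b'a'..=b'z' | b'A'..=b'Z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
                out.push(b as char)
            }
            _ => out.push_str(&format!("%{:02X}", b)),
        }
    }
    out
}

// Sketch of the URL assembly done by build_proxy_url above.
fn proxy_url(proxy_type: &str, host: &str, port: u16, creds: Option<(&str, &str)>) -> String {
    match creds {
        Some((user, pass)) => format!(
            "{}://{}:{}@{}:{}",
            proxy_type.to_lowercase(),
            encode_userinfo(user),
            encode_userinfo(pass),
            host,
            port
        ),
        None => format!("{}://{}:{}", proxy_type.to_lowercase(), host, port),
    }
}
```

Without the encoding step, a password like `p@ss` would make the `@` separator ambiguous when the URL is parsed downstream.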
#[tokio::main(flavor = "multi_thread")]
async fn main() {
// Set up panic handler to log panics before process exits
std::panic::set_hook(Box::new(|panic_info| {
log::error!("PANIC in proxy worker: {:?}", panic_info);
if let Some(location) = panic_info.location() {
log::error!(
"Location: {}:{}:{}",
location.file(),
location.line(),
location.column()
);
}
if let Some(s) = panic_info.payload().downcast_ref::<&str>() {
log::error!("Message: {}", s);
}
}));
let matches = Command::new("donut-proxy")
.subcommand(
Command::new("proxy")
.about("Manage proxy servers")
.subcommand(
Command::new("start")
.about("Start a proxy server")
.arg(Arg::new("host").long("host").help("Upstream proxy host"))
.arg(
Arg::new("proxy-port")
.long("proxy-port")
.value_parser(clap::value_parser!(u16))
.help("Upstream proxy port"),
)
.arg(
Arg::new("type")
.long("type")
.help("Proxy type (http, https, socks4, socks5)"),
)
.arg(Arg::new("username").long("username").help("Proxy username"))
.arg(Arg::new("password").long("password").help("Proxy password"))
.arg(
Arg::new("port")
.short('p')
.long("port")
.value_parser(clap::value_parser!(u16))
.help("Local port to use (random if not specified)"),
)
.arg(
Arg::new("ignore-certificate")
.long("ignore-certificate")
.help("Ignore certificate errors for HTTPS proxies"),
)
.arg(
Arg::new("upstream")
.short('u')
.long("upstream")
.help("Upstream proxy URL (protocol://[username:password@]host:port)"),
)
.arg(
Arg::new("profile-id")
.long("profile-id")
.help("ID of the profile this proxy is associated with"),
),
)
.subcommand(
Command::new("stop")
.about("Stop a proxy server")
.arg(Arg::new("id").long("id").help("Proxy ID to stop"))
.arg(
Arg::new("upstream")
.long("upstream")
.help("Stop proxies with this upstream URL"),
),
)
.subcommand(Command::new("list").about("List all proxy servers")),
)
.subcommand(
Command::new("proxy-worker")
.about("Run a proxy worker process (internal use)")
.arg(
Arg::new("id")
.long("id")
.required(true)
.help("Proxy configuration ID"),
)
.arg(Arg::new("action").required(true).help("Action (start)")),
)
.get_matches();
if let Some(proxy_matches) = matches.subcommand_matches("proxy") {
if let Some(start_matches) = proxy_matches.subcommand_matches("start") {
let mut upstream_url: Option<String> = None;
// Build upstream URL from individual components if provided
if let (Some(host), Some(port), Some(proxy_type)) = (
start_matches.get_one::<String>("host"),
start_matches.get_one::<u16>("proxy-port"),
start_matches.get_one::<String>("type"),
) {
let username = start_matches.get_one::<String>("username");
let password = start_matches.get_one::<String>("password");
upstream_url = Some(build_proxy_url(
proxy_type,
host,
*port,
username.map(|s| s.as_str()),
password.map(|s| s.as_str()),
));
} else if let Some(upstream) = start_matches.get_one::<String>("upstream") {
upstream_url = Some(upstream.clone());
}
let port = start_matches.get_one::<u16>("port").copied();
let profile_id = start_matches.get_one::<String>("profile-id").cloned();
match start_proxy_process_with_profile(upstream_url, port, profile_id).await {
Ok(config) => {
// Output the configuration as JSON for the Rust side to parse
// Use println! here because this needs to go to stdout for parsing
println!(
"{}",
serde_json::json!({
"id": config.id,
"localPort": config.local_port,
"localUrl": config.local_url,
"upstreamUrl": config.upstream_url,
})
);
process::exit(0);
}
Err(e) => {
eprintln!("Failed to start proxy: {}", e);
process::exit(1);
}
}
} else if let Some(stop_matches) = proxy_matches.subcommand_matches("stop") {
if let Some(id) = stop_matches.get_one::<String>("id") {
match stop_proxy_process(id).await {
Ok(success) => {
// Use println! here because this needs to go to stdout for parsing
println!("{}", serde_json::json!({ "success": success }));
process::exit(0);
}
Err(e) => {
eprintln!("Failed to stop proxy: {}", e);
process::exit(1);
}
}
} else if let Some(upstream) = stop_matches.get_one::<String>("upstream") {
// Find proxies with this upstream URL
let configs = donutbrowser_lib::proxy_storage::list_proxy_configs();
let matching_configs: Vec<_> = configs
.iter()
.filter(|config| config.upstream_url == *upstream)
.collect();
if matching_configs.is_empty() {
eprintln!("No proxies found for {}", upstream);
process::exit(1);
}
for config in matching_configs {
let _ = stop_proxy_process(&config.id).await;
}
// Use println! here because this needs to go to stdout for parsing
println!("{}", serde_json::json!({ "success": true }));
process::exit(0);
} else {
// Stop all proxies
match stop_all_proxy_processes().await {
Ok(_) => {
// Use println! here because this needs to go to stdout for parsing
println!("{}", serde_json::json!({ "success": true }));
process::exit(0);
}
Err(e) => {
eprintln!("Failed to stop all proxies: {}", e);
process::exit(1);
}
}
}
} else if proxy_matches.subcommand_matches("list").is_some() {
let configs = donutbrowser_lib::proxy_storage::list_proxy_configs();
// Use println! here because this needs to go to stdout for parsing
println!("{}", serde_json::to_string(&configs).unwrap());
process::exit(0);
} else {
log::error!("Invalid action. Use 'start', 'stop', or 'list'");
process::exit(1);
}
} else if let Some(worker_matches) = matches.subcommand_matches("proxy-worker") {
let id = worker_matches
.get_one::<String>("id")
.expect("id is required");
let action = worker_matches
.get_one::<String>("action")
.expect("action is required");
if action == "start" {
// Set high priority so this process is killed last under resource pressure
set_high_priority();
log::error!("Proxy worker starting, looking for config id: {}", id);
log::error!("Process PID: {}", std::process::id());
let config = match get_proxy_config(id) {
Some(config) => {
log::error!(
"Found config: id={}, port={:?}, upstream={}",
config.id,
config.local_port,
config.upstream_url
);
config
}
None => {
log::error!("Proxy configuration {} not found", id);
process::exit(1);
}
};
// Run the proxy server - this should never return (infinite loop)
log::error!("Starting proxy server for config id: {}", id);
if let Err(e) = run_proxy_server(config).await {
log::error!("Failed to run proxy server: {}", e);
log::error!("Error details: {:?}", e);
process::exit(1);
}
// This should never be reached - run_proxy_server has an infinite loop
log::error!("ERROR: Proxy server returned unexpectedly (this should never happen)");
process::exit(1);
} else {
log::error!("Invalid action for proxy-worker. Use 'start'");
process::exit(1);
}
} else {
log::error!("No command specified");
process::exit(1);
}
}
@@ -1,7 +1,8 @@
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};
use utoipa::ToSchema;
#[derive(Debug, Clone, Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize, ToSchema)]
pub struct ProxySettings {
pub proxy_type: String, // "http", "https", "socks4", or "socks5"
pub host: String,
@@ -58,6 +59,8 @@ pub trait Browser: Send + Sync {
profile_path: &str,
proxy_settings: Option<&ProxySettings>,
url: Option<String>,
remote_debugging_port: Option<u16>,
headless: bool,
) -> Result<Vec<String>, Box<dyn std::error::Error>>;
fn is_version_downloaded(&self, version: &str, binaries_dir: &Path) -> bool;
fn prepare_executable(&self, executable_path: &Path) -> Result<(), Box<dyn std::error::Error>>;
@@ -239,12 +242,29 @@ mod linux {
browser_type: &BrowserType,
) -> Result<PathBuf, Box<dyn std::error::Error>> {
let possible_executables = match browser_type {
BrowserType::Chromium => vec![install_dir.join("chromium"), install_dir.join("chrome")],
BrowserType::Chromium => vec![
// Direct paths (for manual installations)
install_dir.join("chromium"),
install_dir.join("chrome"),
install_dir.join("chromium-browser"),
// Subdirectory paths (for downloaded archives)
install_dir.join("chrome-linux").join("chrome"),
install_dir.join("chrome-linux").join("chromium"),
install_dir.join("chromium").join("chromium"),
install_dir.join("chromium").join("chrome"),
// Binary subdirectory
install_dir.join("bin").join("chromium"),
install_dir.join("bin").join("chrome"),
],
BrowserType::Brave => vec![
install_dir.join("brave"),
install_dir.join("brave-browser"),
install_dir.join("brave-browser-nightly"),
install_dir.join("brave-browser-beta"),
// Subdirectory paths
install_dir.join("brave").join("brave"),
install_dir.join("brave-browser").join("brave"),
install_dir.join("bin").join("brave"),
],
_ => vec![],
};
@@ -320,12 +340,29 @@ mod linux {
pub fn is_chromium_version_downloaded(install_dir: &Path, browser_type: &BrowserType) -> bool {
let possible_executables = match browser_type {
BrowserType::Chromium => vec![install_dir.join("chromium"), install_dir.join("chrome")],
BrowserType::Chromium => vec![
// Direct paths (for manual installations)
install_dir.join("chromium"),
install_dir.join("chrome"),
install_dir.join("chromium-browser"),
// Subdirectory paths (for downloaded archives)
install_dir.join("chrome-linux").join("chrome"),
install_dir.join("chrome-linux").join("chromium"),
install_dir.join("chromium").join("chromium"),
install_dir.join("chromium").join("chrome"),
// Binary subdirectory
install_dir.join("bin").join("chromium"),
install_dir.join("bin").join("chrome"),
],
BrowserType::Brave => vec![
install_dir.join("brave"),
install_dir.join("brave-browser"),
install_dir.join("brave-browser-nightly"),
install_dir.join("brave-browser-beta"),
// Subdirectory paths
install_dir.join("brave").join("brave"),
install_dir.join("brave-browser").join("brave"),
install_dir.join("bin").join("brave"),
],
_ => vec![],
};
@@ -341,7 +378,7 @@ mod linux {
pub fn prepare_executable(executable_path: &Path) -> Result<(), Box<dyn std::error::Error>> {
// On Linux, ensure the executable has proper permissions
println!("Setting execute permissions for: {:?}", executable_path);
log::info!("Setting execute permissions for: {:?}", executable_path);
let metadata = std::fs::metadata(executable_path)?;
let mut permissions = metadata.permissions();
@@ -352,7 +389,7 @@ mod linux {
std::fs::set_permissions(executable_path, permissions)?;
println!(
log::info!(
"Execute permissions set successfully for: {:?}",
executable_path
);
@@ -413,11 +450,18 @@ mod windows {
install_dir.join("chrome.exe"),
install_dir.join("chromium-browser.exe"),
install_dir.join("bin").join("chromium.exe"),
// Common archive extraction patterns
install_dir.join("chrome-win").join("chrome.exe"),
install_dir.join("chromium").join("chromium.exe"),
install_dir.join("chromium").join("chrome.exe"),
],
BrowserType::Brave => vec![
install_dir.join("brave.exe"),
install_dir.join("brave-browser.exe"),
install_dir.join("bin").join("brave.exe"),
// Subdirectory patterns
install_dir.join("brave").join("brave.exe"),
install_dir.join("brave-browser").join("brave.exe"),
],
_ => vec![],
};
@@ -489,11 +533,18 @@ mod windows {
install_dir.join("chrome.exe"),
install_dir.join("chromium-browser.exe"),
install_dir.join("bin").join("chromium.exe"),
// Common archive extraction patterns
install_dir.join("chrome-win").join("chrome.exe"),
install_dir.join("chromium").join("chromium.exe"),
install_dir.join("chromium").join("chrome.exe"),
],
BrowserType::Brave => vec![
install_dir.join("brave.exe"),
install_dir.join("brave-browser.exe"),
install_dir.join("bin").join("brave.exe"),
// Subdirectory patterns
install_dir.join("brave").join("brave.exe"),
install_dir.join("brave-browser").join("brave.exe"),
],
_ => vec![],
};
@@ -557,11 +608,23 @@ impl Browser for FirefoxBrowser {
profile_path: &str,
_proxy_settings: Option<&ProxySettings>,
url: Option<String>,
remote_debugging_port: Option<u16>,
headless: bool,
) -> Result<Vec<String>, Box<dyn std::error::Error>> {
let mut args = vec!["-profile".to_string(), profile_path.to_string()];
// Only use -no-remote for browsers that require it for security (Mullvad, Tor)
// Regular Firefox browsers can use remote commands for better URL handling
// Add remote debugging if requested
if let Some(port) = remote_debugging_port {
args.push("--start-debugger-server".to_string());
args.push(port.to_string());
}
// Add headless mode if requested
if headless {
args.push("--headless".to_string());
}
// Use -no-remote for browsers that require it for security (Mullvad, Tor) or when remote debugging
match self.browser_type {
BrowserType::MullvadBrowser | BrowserType::TorBrowser => {
args.push("-no-remote".to_string());
@@ -570,7 +633,11 @@ impl Browser for FirefoxBrowser {
| BrowserType::FirefoxDeveloper
| BrowserType::Zen
| BrowserType::Camoufox => {
// Don't use -no-remote so we can communicate with existing instances
// Use -no-remote when remote debugging to avoid conflicts
if remote_debugging_port.is_some() {
args.push("-no-remote".to_string());
}
// Don't use -no-remote for normal launches so we can communicate with existing instances
}
_ => {}
}
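The `-no-remote` policy in the hunk above can be sketched standalone. This is an assumed simplification for illustration (the `Kind` enum and `launch_args` helper are hypothetical, not the crate's `FirefoxBrowser` type):

```rust
// Sketch of the -no-remote decision: privacy browsers always isolate,
// regular Firefox isolates only when a remote-debugging port is requested.
enum Kind {
    Firefox,
    Mullvad,
    Tor,
}

fn launch_args(kind: Kind, profile: &str, debug_port: Option<u16>, headless: bool) -> Vec<String> {
    let mut args = vec!["-profile".into(), profile.into()];
    if let Some(port) = debug_port {
        args.push("--start-debugger-server".into());
        args.push(port.to_string());
    }
    if headless {
        args.push("--headless".into());
    }
    match kind {
        Kind::Mullvad | Kind::Tor => args.push("-no-remote".into()),
        Kind::Firefox => {
            if debug_port.is_some() {
                args.push("-no-remote".into());
            }
        }
    }
    args
}

fn main() {
    assert_eq!(launch_args(Kind::Firefox, "/p", None, false), vec!["-profile", "/p"]);
    assert!(launch_args(Kind::Firefox, "/p", Some(9222), false).contains(&"-no-remote".to_string()));
    assert!(launch_args(Kind::Tor, "/p", None, false).contains(&"-no-remote".to_string()));
    println!("ok");
}
```

This mirrors why the tests further down expect `-no-remote` for Firefox only in the debugging case but unconditionally for Mullvad and Tor.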
@@ -587,14 +654,14 @@ impl Browser for FirefoxBrowser {
// Expected structure: binaries/<browser>/<version>
let browser_dir = binaries_dir.join(self.browser_type.as_str()).join(version);
println!("Firefox browser checking version {version} in directory: {browser_dir:?}");
log::info!("Firefox browser checking version {version} in directory: {browser_dir:?}");
if !browser_dir.exists() {
println!("Directory does not exist: {browser_dir:?}");
log::info!("Directory does not exist: {browser_dir:?}");
return false;
}
println!("Directory exists, checking for browser files...");
log::info!("Directory exists, checking for browser files...");
#[cfg(target_os = "macos")]
return macos::is_firefox_version_downloaded(&browser_dir);
@@ -607,7 +674,7 @@ impl Browser for FirefoxBrowser {
#[cfg(not(any(target_os = "macos", target_os = "linux", target_os = "windows")))]
{
println!("Unsupported platform for browser verification");
log::info!("Unsupported platform for browser verification");
false
}
}
@@ -659,6 +726,8 @@ impl Browser for ChromiumBrowser {
profile_path: &str,
proxy_settings: Option<&ProxySettings>,
url: Option<String>,
remote_debugging_port: Option<u16>,
headless: bool,
) -> Result<Vec<String>, Box<dyn std::error::Error>> {
let mut args = vec![
format!("--user-data-dir={}", profile_path),
@@ -670,9 +739,19 @@ impl Browser for ChromiumBrowser {
"--disable-updater".to_string(),
];
// Add remote debugging if requested
if let Some(port) = remote_debugging_port {
args.push("--remote-debugging-address=0.0.0.0".to_string());
args.push(format!("--remote-debugging-port={port}"));
}
// Add headless mode if requested
if headless {
args.push("--headless".to_string());
}
// Add proxy configuration if provided
if let Some(proxy) = proxy_settings {
// Apply proxy settings
args.push(format!(
"--proxy-server=http://{}:{}",
proxy.host, proxy.port
@@ -690,14 +769,14 @@ impl Browser for ChromiumBrowser {
// Expected structure: binaries/<browser>/<version>
let browser_dir = binaries_dir.join(self.browser_type.as_str()).join(version);
println!("Chromium browser checking version {version} in directory: {browser_dir:?}");
log::info!("Chromium browser checking version {version} in directory: {browser_dir:?}");
if !browser_dir.exists() {
println!("Directory does not exist: {browser_dir:?}");
log::info!("Directory does not exist: {browser_dir:?}");
return false;
}
println!("Directory exists, checking for browser files...");
log::info!("Directory exists, checking for browser files...");
#[cfg(target_os = "macos")]
return macos::is_chromium_version_downloaded(&browser_dir);
@@ -710,7 +789,7 @@ impl Browser for ChromiumBrowser {
#[cfg(not(any(target_os = "macos", target_os = "linux", target_os = "windows")))]
{
println!("Unsupported platform for browser verification");
log::info!("Unsupported platform for browser verification");
false
}
}
@@ -758,6 +837,8 @@ impl Browser for CamoufoxBrowser {
profile_path: &str,
_proxy_settings: Option<&ProxySettings>,
url: Option<String>,
remote_debugging_port: Option<u16>,
headless: bool,
) -> Result<Vec<String>, Box<dyn std::error::Error>> {
// For Camoufox, we handle launching through the camoufox launcher
// This method won't be used directly, but we provide basic Firefox args as a fallback
@@ -767,6 +848,17 @@ impl Browser for CamoufoxBrowser {
"-no-remote".to_string(),
];
// Add remote debugging if requested
if let Some(port) = remote_debugging_port {
args.push("--start-debugger-server".to_string());
args.push(port.to_string());
}
// Add headless mode if requested
if headless {
args.push("--headless".to_string());
}
if let Some(url) = url {
args.push(url);
}
@@ -962,15 +1054,15 @@ mod tests {
#[test]
fn test_firefox_launch_args() {
// Test regular Firefox (should not use -no-remote)
// Test regular Firefox (should not use -no-remote for normal launch)
let browser = FirefoxBrowser::new(BrowserType::Firefox);
let args = browser
.create_launch_args("/path/to/profile", None, None)
.create_launch_args("/path/to/profile", None, None, None, false)
.expect("Failed to create launch args for Firefox");
assert_eq!(args, vec!["-profile", "/path/to/profile"]);
assert!(
!args.contains(&"-no-remote".to_string()),
"Firefox should not use -no-remote"
"Firefox should not use -no-remote for normal launch"
);
let args = browser
@@ -978,6 +1070,8 @@ mod tests {
"/path/to/profile",
None,
Some("https://example.com".to_string()),
None,
false,
)
.expect("Failed to create launch args for Firefox with URL");
assert_eq!(
@@ -985,29 +1079,55 @@ mod tests {
vec!["-profile", "/path/to/profile", "https://example.com"]
);
// Test Mullvad Browser (should use -no-remote)
// Test Firefox with remote debugging (should use -no-remote)
let args = browser
.create_launch_args("/path/to/profile", None, None, Some(9222), false)
.expect("Failed to create launch args for Firefox with remote debugging");
assert!(
args.contains(&"-no-remote".to_string()),
"Firefox should use -no-remote for remote debugging"
);
assert!(
args.contains(&"--start-debugger-server".to_string()),
"Firefox should include debugger server arg"
);
assert!(
args.contains(&"9222".to_string()),
"Firefox should include debugging port"
);
// Test Mullvad Browser (should always use -no-remote)
let browser = FirefoxBrowser::new(BrowserType::MullvadBrowser);
let args = browser
.create_launch_args("/path/to/profile", None, None)
.create_launch_args("/path/to/profile", None, None, None, false)
.expect("Failed to create launch args for Mullvad Browser");
assert_eq!(args, vec!["-profile", "/path/to/profile", "-no-remote"]);
// Test Tor Browser (should use -no-remote)
// Test Tor Browser (should always use -no-remote)
let browser = FirefoxBrowser::new(BrowserType::TorBrowser);
let args = browser
.create_launch_args("/path/to/profile", None, None)
.create_launch_args("/path/to/profile", None, None, None, false)
.expect("Failed to create launch args for Tor Browser");
assert_eq!(args, vec!["-profile", "/path/to/profile", "-no-remote"]);
// Test Zen Browser (should not use -no-remote)
// Test Zen Browser (should not use -no-remote for normal launch)
let browser = FirefoxBrowser::new(BrowserType::Zen);
let args = browser
.create_launch_args("/path/to/profile", None, None)
.create_launch_args("/path/to/profile", None, None, None, false)
.expect("Failed to create launch args for Zen Browser");
assert_eq!(args, vec!["-profile", "/path/to/profile"]);
assert!(
!args.contains(&"-no-remote".to_string()),
"Zen Browser should not use -no-remote"
"Zen Browser should not use -no-remote for normal launch"
);
// Test headless mode
let args = browser
.create_launch_args("/path/to/profile", None, None, None, true)
.expect("Failed to create launch args for Zen Browser headless");
assert!(
args.contains(&"--headless".to_string()),
"Browser should include headless flag when requested"
);
}
@@ -1015,7 +1135,7 @@ mod tests {
fn test_chromium_launch_args() {
let browser = ChromiumBrowser::new(BrowserType::Chromium);
let args = browser
.create_launch_args("/path/to/profile", None, None)
.create_launch_args("/path/to/profile", None, None, None, false)
.expect("Failed to create launch args for Chromium");
// Test that basic required arguments are present
@@ -1043,6 +1163,8 @@ mod tests {
"/path/to/profile",
None,
Some("https://example.com".to_string()),
None,
false,
)
.expect("Failed to create launch args for Chromium with URL");
assert!(
@@ -1055,6 +1177,28 @@ mod tests {
args_with_url.last().expect("Args should not be empty"),
"https://example.com"
);
// Test remote debugging
let args_with_debug = browser
.create_launch_args("/path/to/profile", None, None, Some(9222), false)
.expect("Failed to create launch args for Chromium with remote debugging");
assert!(
args_with_debug.contains(&"--remote-debugging-port=9222".to_string()),
"Chromium args should contain remote debugging port"
);
assert!(
args_with_debug.contains(&"--remote-debugging-address=0.0.0.0".to_string()),
"Chromium args should contain remote debugging address"
);
// Test headless mode
let args_headless = browser
.create_launch_args("/path/to/profile", None, None, None, true)
.expect("Failed to create launch args for Chromium headless");
assert!(
args_headless.contains(&"--headless".to_string()),
"Chromium args should contain headless flag when requested"
);
}
#[test]
File diff suppressed because it is too large.
@@ -117,7 +117,8 @@ impl BrowserVersionManager {
/// Get cached browser versions immediately (returns None if no cache exists)
pub fn get_cached_browser_versions(&self, browser: &str) -> Option<Vec<String>> {
if browser == "brave" {
return ApiClient::instance()
return self
.api_client
.get_cached_github_releases("brave")
.map(|releases| releases.into_iter().map(|r| r.tag_name).collect());
}
@@ -134,7 +135,7 @@ impl BrowserVersionManager {
browser: &str,
) -> Option<Vec<BrowserVersionInfo>> {
if browser == "brave" {
if let Some(releases) = ApiClient::instance().get_cached_github_releases("brave") {
if let Some(releases) = self.api_client.get_cached_github_releases("brave") {
let detailed_info: Vec<BrowserVersionInfo> = releases
.into_iter()
.map(|r| BrowserVersionInfo {
@@ -276,7 +277,7 @@ impl BrowserVersionManager {
.api_client
.save_cached_versions(browser, &merged_releases)
{
eprintln!("Failed to save merged cache for {browser}: {e}");
log::error!("Failed to save merged cache for {browser}: {e}");
}
}
@@ -533,7 +534,7 @@ impl BrowserVersionManager {
})
.collect();
if let Err(e) = self.api_client.save_cached_versions(browser, &releases) {
eprintln!("Failed to save updated cache for {browser}: {e}");
log::error!("Failed to save updated cache for {browser}: {e}");
}
Ok(new_versions_count)
@@ -921,7 +922,7 @@ impl BrowserVersionManager {
.collect();
// Always save so that other callers without release_name can classify correctly
if let Err(e) = self.api_client.save_cached_versions("brave", &converted) {
eprintln!("Failed to persist Brave versions cache: {e}");
log::error!("Failed to persist Brave versions cache: {e}");
}
Ok(releases.into_iter().map(|r| r.tag_name).collect())
@@ -946,7 +947,7 @@ impl BrowserVersionManager {
})
.collect();
if let Err(e) = self.api_client.save_cached_versions("brave", &converted) {
eprintln!("Failed to persist Brave versions cache: {e}");
log::error!("Failed to persist Brave versions cache: {e}");
}
Ok(releases)
@@ -1270,10 +1271,105 @@ mod tests {
let unsupported_result = service.get_download_info("unsupported", "1.0.0");
assert!(unsupported_result.is_err());
println!("Download info test passed for all browsers");
log::info!("Download info test passed for all browsers");
}
}
#[tauri::command]
pub fn get_supported_browsers() -> Result<Vec<String>, String> {
let service = BrowserVersionManager::instance();
Ok(service.get_supported_browsers())
}
#[tauri::command]
pub fn is_browser_supported_on_platform(browser_str: String) -> Result<bool, String> {
let service = BrowserVersionManager::instance();
service
.is_browser_supported(&browser_str)
.map_err(|e| format!("Failed to check browser support: {e}"))
}
#[tauri::command]
pub async fn fetch_browser_versions_cached_first(
browser_str: String,
) -> Result<Vec<BrowserVersionInfo>, String> {
let service = BrowserVersionManager::instance();
// Get cached versions immediately if available
if let Some(cached_versions) = service.get_cached_browser_versions_detailed(&browser_str) {
// Check if we should update cache in background
if service.should_update_cache(&browser_str) {
// Start background update but return cached data immediately
let service_clone = BrowserVersionManager::instance();
let browser_str_clone = browser_str.clone();
tokio::spawn(async move {
if let Err(e) = service_clone
.fetch_browser_versions_detailed(&browser_str_clone, false)
.await
{
log::error!("Background version update failed for {browser_str_clone}: {e}");
}
});
}
Ok(cached_versions)
} else {
// No cache available, fetch fresh
service
.fetch_browser_versions_detailed(&browser_str, false)
.await
.map_err(|e| format!("Failed to fetch detailed browser versions: {e}"))
}
}
#[tauri::command]
pub async fn fetch_browser_versions_with_count_cached_first(
browser_str: String,
) -> Result<BrowserVersionsResult, String> {
let service = BrowserVersionManager::instance();
// Get cached versions immediately if available
if let Some(cached_versions) = service.get_cached_browser_versions(&browser_str) {
// Check if we should update cache in background
if service.should_update_cache(&browser_str) {
// Start background update but return cached data immediately
let service_clone = BrowserVersionManager::instance();
let browser_str_clone = browser_str.clone();
tokio::spawn(async move {
if let Err(e) = service_clone
.fetch_browser_versions_with_count(&browser_str_clone, false)
.await
{
log::error!("Background version update failed for {browser_str_clone}: {e}");
}
});
}
// Return cached data in the expected format
Ok(BrowserVersionsResult {
versions: cached_versions.clone(),
new_versions_count: None, // No new versions when returning cached data
total_versions_count: cached_versions.len(),
})
} else {
// No cache available, fetch fresh
service
.fetch_browser_versions_with_count(&browser_str, false)
.await
.map_err(|e| format!("Failed to fetch browser versions: {e}"))
}
}
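Both `*_cached_first` commands above follow a stale-while-revalidate shape: serve the cache immediately and refresh it in the background, blocking on the network only when no cache exists. A minimal synchronous sketch of that shape, using std threads in place of `tokio::spawn` and a hypothetical `fetch_fresh` standing in for the version fetch:

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical slow fetch standing in for fetch_browser_versions_with_count.
fn fetch_fresh() -> Vec<String> {
    vec!["1.0.1".into(), "1.0.0".into()]
}

// Return the cache immediately when present and kick off a refresh;
// only block on the fetch when there is no cache at all.
fn versions_cached_first(cache: Option<Vec<String>>, tx: mpsc::Sender<Vec<String>>) -> Vec<String> {
    match cache {
        Some(cached) => {
            thread::spawn(move || {
                // Background update: errors would be logged, not surfaced.
                let _ = tx.send(fetch_fresh());
            });
            cached
        }
        None => fetch_fresh(),
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let v = versions_cached_first(Some(vec!["1.0.0".into()]), tx);
    assert_eq!(v, vec!["1.0.0"]); // stale data served instantly
    assert_eq!(rx.recv().unwrap().len(), 2); // refresh completed behind it
    println!("ok");
}
```

The real commands additionally gate the refresh on `should_update_cache`, so an already-fresh cache spawns no background task at all.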
#[tauri::command]
pub async fn fetch_browser_versions_with_count(
browser_str: String,
) -> Result<BrowserVersionsResult, String> {
let service = BrowserVersionManager::instance();
service
.fetch_browser_versions_with_count(&browser_str, false)
.await
.map_err(|e| format!("Failed to fetch browser versions: {e}"))
}
// Global singleton instance
lazy_static::lazy_static! {
static ref BROWSER_VERSION_SERVICE: BrowserVersionManager = BrowserVersionManager::new();
@@ -1,8 +1,10 @@
use crate::browser_runner::BrowserRunner;
use crate::profile::BrowserProfile;
use directories::BaseDirs;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;
use tauri::AppHandle;
use tauri_plugin_shell::ShellExt;
use tokio::sync::Mutex as AsyncMutex;
@@ -20,6 +22,8 @@ pub struct CamoufoxConfig {
pub block_webgl: Option<bool>,
pub executable_path: Option<String>,
pub fingerprint: Option<String>, // JSON string of the complete fingerprint config
pub randomize_fingerprint_on_launch: Option<bool>, // Generate new fingerprint on every launch
pub os: Option<String>, // Operating system for fingerprint generation: "windows", "macos", or "linux"
}
impl Default for CamoufoxConfig {
@@ -36,6 +40,8 @@ impl Default for CamoufoxConfig {
block_webgl: None,
executable_path: None,
fingerprint: None,
randomize_fingerprint_on_launch: None,
os: None,
}
}
}
@@ -60,27 +66,40 @@ struct CamoufoxInstance {
url: Option<String>,
}
struct CamoufoxNodecarLauncherInner {
struct CamoufoxManagerInner {
instances: HashMap<String, CamoufoxInstance>,
}
pub struct CamoufoxNodecarLauncher {
inner: Arc<AsyncMutex<CamoufoxNodecarLauncherInner>>,
pub struct CamoufoxManager {
inner: Arc<AsyncMutex<CamoufoxManagerInner>>,
base_dirs: BaseDirs,
}
impl CamoufoxNodecarLauncher {
impl CamoufoxManager {
fn new() -> Self {
Self {
inner: Arc::new(AsyncMutex::new(CamoufoxNodecarLauncherInner {
inner: Arc::new(AsyncMutex::new(CamoufoxManagerInner {
instances: HashMap::new(),
})),
base_dirs: BaseDirs::new().expect("Failed to get base directories"),
}
}
pub fn instance() -> &'static CamoufoxNodecarLauncher {
pub fn instance() -> &'static CamoufoxManager {
&CAMOUFOX_NODECAR_LAUNCHER
}
pub fn get_profiles_dir(&self) -> PathBuf {
let mut path = self.base_dirs.data_local_dir().to_path_buf();
path.push(if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
});
path.push("profiles");
path
}
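`get_profiles_dir` keys the directory name off the build profile via `cfg!(debug_assertions)`, so a dev build never touches real user profiles. The same pattern in isolation (the `profiles_dir` helper is illustrative; directory names are taken from the snippet above):

```rust
use std::path::PathBuf;

// Debug and release builds get separate data roots under the
// platform-local data directory, passed in here as a plain string.
fn profiles_dir(data_local: &str) -> PathBuf {
    let mut path = PathBuf::from(data_local);
    path.push(if cfg!(debug_assertions) {
        "DonutBrowserDev"
    } else {
        "DonutBrowser"
    });
    path.push("profiles");
    path
}

fn main() {
    let p = profiles_dir("/home/u/.local/share");
    // cfg!(debug_assertions) resolves at compile time, so the app-name
    // component is fixed per build; only the stable components are asserted.
    assert!(p.starts_with("/home/u/.local/share"));
    assert!(p.ends_with("profiles"));
    println!("ok");
}
```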
/// Generate Camoufox fingerprint configuration during profile creation
pub async fn generate_fingerprint_config(
&self,
@@ -95,8 +114,8 @@ impl CamoufoxNodecarLauncher {
path.clone()
} else {
// Use the browser runner helper with the real profile
let browser_runner = crate::browser_runner::BrowserRunner::instance();
browser_runner
// Resolve the executable via the shared BrowserRunner singleton
BrowserRunner::instance()
.get_browser_executable_path(profile)
.map_err(|e| format!("Failed to get Camoufox executable path: {e}"))?
.to_string_lossy()
@@ -154,6 +173,11 @@ impl CamoufoxNodecarLauncher {
}
}
// Add OS option for fingerprint generation
if let Some(os) = &config.os {
config_args.extend(["--os".to_string(), os.clone()]);
}
// Execute config generation command
let mut config_sidecar = self.get_nodecar_sidecar(app_handle)?;
for arg in &config_args {
@@ -191,7 +215,7 @@ impl CamoufoxNodecarLauncher {
url: Option<&str>,
) -> Result<CamoufoxLaunchResult, Box<dyn std::error::Error + Send + Sync>> {
let custom_config = if let Some(existing_fingerprint) = &config.fingerprint {
println!("Using existing fingerprint from profile metadata");
log::info!("Using existing fingerprint from profile metadata");
existing_fingerprint.clone()
} else {
return Err("No fingerprint provided".into());
@@ -202,8 +226,8 @@ impl CamoufoxNodecarLauncher {
path.clone()
} else {
// Use the browser runner helper with the real profile
let browser_runner = crate::browser_runner::BrowserRunner::instance();
browser_runner
// Resolve the executable via the shared BrowserRunner singleton
BrowserRunner::instance()
.get_browser_executable_path(profile)
.map_err(|e| format!("Failed to get Camoufox executable path: {e}"))?
.to_string_lossy()
@@ -251,18 +275,18 @@ impl CamoufoxNodecarLauncher {
}
// Execute nodecar sidecar command
println!("Executing nodecar command with args: {args:?}");
log::info!("Executing nodecar command with args: {args:?}");
let output = sidecar_command.output().await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
println!("nodecar camoufox failed - stdout: {stdout}, stderr: {stderr}");
log::info!("nodecar camoufox failed - stdout: {stdout}, stderr: {stderr}");
return Err(format!("nodecar camoufox failed: {stderr}").into());
}
let stdout = String::from_utf8_lossy(&output.stdout);
println!("nodecar camoufox output: {stdout}");
log::info!("nodecar camoufox output: {stdout}");
// Parse the JSON output
let launch_result: CamoufoxLaunchResult = serde_json::from_str(&stdout)
@@ -325,6 +349,8 @@ impl CamoufoxNodecarLauncher {
}
/// Find Camoufox server by profile path (for integration with browser_runner)
/// This method first checks in-memory instances, then scans system processes
/// to detect Camoufox instances that may have been started before the app restarted.
pub async fn find_camoufox_by_profile(
&self,
profile_path: &str,
@@ -332,41 +358,127 @@ impl CamoufoxNodecarLauncher {
// First clean up any dead instances
self.cleanup_dead_instances().await?;
let inner = self.inner.lock().await;
// Convert paths to canonical form for comparison
let target_path = std::path::Path::new(profile_path)
.canonicalize()
.unwrap_or_else(|_| std::path::Path::new(profile_path).to_path_buf());
for (id, instance) in inner.instances.iter() {
if let Some(instance_profile_path) = &instance.profile_path {
let instance_path = std::path::Path::new(instance_profile_path)
.canonicalize()
.unwrap_or_else(|_| std::path::Path::new(instance_profile_path).to_path_buf());
// Check in-memory instances first
{
let inner = self.inner.lock().await;
if instance_path == target_path {
// Verify the server is actually running by checking the process
if let Some(process_id) = instance.process_id {
if self.is_server_running(process_id).await {
// Found running Camoufox instance
return Ok(Some(CamoufoxLaunchResult {
id: id.clone(),
processId: instance.process_id,
profilePath: instance.profile_path.clone(),
url: instance.url.clone(),
}));
} else {
// Camoufox instance found but process is not running
for (id, instance) in inner.instances.iter() {
if let Some(instance_profile_path) = &instance.profile_path {
let instance_path = std::path::Path::new(instance_profile_path)
.canonicalize()
.unwrap_or_else(|_| std::path::Path::new(instance_profile_path).to_path_buf());
if instance_path == target_path {
// Verify the server is actually running by checking the process
if let Some(process_id) = instance.process_id {
if self.is_server_running(process_id).await {
// Found running Camoufox instance
return Ok(Some(CamoufoxLaunchResult {
id: id.clone(),
processId: instance.process_id,
profilePath: instance.profile_path.clone(),
url: instance.url.clone(),
}));
}
}
}
}
}
}
// If not found in in-memory instances, scan system processes
// This handles the case where the app was restarted but Camoufox is still running
if let Some((pid, found_profile_path)) = self.find_camoufox_process_by_profile(&target_path) {
log::info!(
"Found running Camoufox process (PID: {}) for profile path via system scan",
pid
);
// Register this instance in our tracking
let instance_id = format!("recovered_{}", pid);
let mut inner = self.inner.lock().await;
inner.instances.insert(
instance_id.clone(),
CamoufoxInstance {
id: instance_id.clone(),
process_id: Some(pid),
profile_path: Some(found_profile_path.clone()),
url: None,
},
);
return Ok(Some(CamoufoxLaunchResult {
id: instance_id,
processId: Some(pid),
profilePath: Some(found_profile_path),
url: None,
}));
}
Ok(None)
}
/// Scan system processes to find a Camoufox process using a specific profile path
fn find_camoufox_process_by_profile(
&self,
target_path: &std::path::Path,
) -> Option<(u32, String)> {
use sysinfo::{ProcessRefreshKind, RefreshKind, System};
let system = System::new_with_specifics(
RefreshKind::nothing().with_processes(ProcessRefreshKind::everything()),
);
let target_path_str = target_path.to_string_lossy();
for (pid, process) in system.processes() {
let cmd = process.cmd();
if cmd.is_empty() {
continue;
}
// Check if this is a Camoufox/Firefox process
let exe_name = process.name().to_string_lossy().to_lowercase();
let is_firefox_like = exe_name.contains("firefox")
|| exe_name.contains("camoufox")
|| exe_name.contains("firefox-bin");
if !is_firefox_like {
continue;
}
// Check if the command line contains our profile path
for (i, arg) in cmd.iter().enumerate() {
if let Some(arg_str) = arg.to_str() {
// Check for -profile argument followed by our path
if arg_str == "-profile" && i + 1 < cmd.len() {
if let Some(next_arg) = cmd.get(i + 1).and_then(|a| a.to_str()) {
let cmd_path = std::path::Path::new(next_arg)
.canonicalize()
.unwrap_or_else(|_| std::path::Path::new(next_arg).to_path_buf());
if cmd_path == target_path {
return Some((pid.as_u32(), next_arg.to_string()));
}
}
}
// Also check if the argument contains the profile path directly
if arg_str.contains(&*target_path_str) {
return Some((pid.as_u32(), target_path_str.to_string()));
}
}
}
}
None
}
/// Check if servers are still alive and clean up dead instances
pub async fn cleanup_dead_instances(
&self,
@@ -431,7 +543,7 @@ impl CamoufoxNodecarLauncher {
}
}
impl CamoufoxNodecarLauncher {
impl CamoufoxManager {
pub async fn launch_camoufox_profile_nodecar(
&self,
app_handle: AppHandle,
@@ -440,8 +552,7 @@ impl CamoufoxNodecarLauncher {
url: Option<String>,
) -> Result<CamoufoxLaunchResult, String> {
// Get profile path
let browser_runner = crate::browser_runner::BrowserRunner::instance();
let profiles_dir = browser_runner.get_profiles_dir();
let profiles_dir = self.get_profiles_dir();
let profile_path = profile.get_profile_data_path(&profiles_dir);
let profile_path_str = profile_path.to_string_lossy();
@@ -479,10 +590,12 @@ mod tests {
assert_eq!(default_config.geoip, Some(serde_json::Value::Bool(true)));
assert_eq!(default_config.proxy, None);
assert_eq!(default_config.fingerprint, None);
assert_eq!(default_config.randomize_fingerprint_on_launch, None);
assert_eq!(default_config.os, None);
}
}
// Global singleton instance
lazy_static::lazy_static! {
static ref CAMOUFOX_NODECAR_LAUNCHER: CamoufoxNodecarLauncher = CamoufoxNodecarLauncher::new();
static ref CAMOUFOX_NODECAR_LAUNCHER: CamoufoxManager = CamoufoxManager::new();
}
@@ -1,10 +1,10 @@
use tauri::command;
pub struct DefaultBrowser;
pub struct DefaultBrowser {}
impl DefaultBrowser {
fn new() -> Self {
Self
Self {}
}
pub fn instance() -> &'static DefaultBrowser {
@@ -38,38 +38,6 @@ impl DefaultBrowser {
#[cfg(not(any(target_os = "macos", target_os = "windows", target_os = "linux")))]
Err("Unsupported platform".to_string())
}
pub async fn open_url_with_profile(
&self,
app_handle: tauri::AppHandle,
profile_name: String,
url: String,
) -> Result<(), String> {
let runner = crate::browser_runner::BrowserRunner::instance();
// Get the profile by name
let profiles = runner
.list_profiles()
.map_err(|e| format!("Failed to list profiles: {e}"))?;
let profile = profiles
.into_iter()
.find(|p| p.name == profile_name)
.ok_or_else(|| format!("Profile '{profile_name}' not found"))?;
println!("Opening URL '{url}' with profile '{profile_name}'");
// Use launch_or_open_url which handles both launching new instances and opening in existing ones
runner
.launch_or_open_url(app_handle, &profile, Some(url.clone()), None)
.await
.map_err(|e| {
println!("Failed to open URL with profile '{profile_name}': {e}");
format!("Failed to open URL with profile: {e}")
})?;
println!("Successfully opened URL '{url}' with profile '{profile_name}'");
Ok(())
}
}
#[cfg(target_os = "macos")]
@@ -570,15 +538,3 @@ pub async fn set_as_default_browser() -> Result<(), String> {
let default_browser = DefaultBrowser::instance();
default_browser.set_as_default_browser().await
}
#[tauri::command]
pub async fn open_url_with_profile(
app_handle: tauri::AppHandle,
profile_name: String,
url: String,
) -> Result<(), String> {
let default_browser = DefaultBrowser::instance();
default_browser
.open_url_with_profile(app_handle, profile_name, url)
.await
}
@@ -1,586 +0,0 @@
use directories::BaseDirs;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::fs;
use std::path::PathBuf;
use std::sync::Mutex;
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct DownloadedBrowserInfo {
pub browser: String,
pub version: String,
pub file_path: PathBuf,
}
#[derive(Debug, Serialize, Deserialize, Default)]
struct RegistryData {
pub browsers: HashMap<String, HashMap<String, DownloadedBrowserInfo>>, // browser -> version -> info
}
pub struct DownloadedBrowsersRegistry {
data: Mutex<RegistryData>,
}
impl DownloadedBrowsersRegistry {
fn new() -> Self {
Self {
data: Mutex::new(RegistryData::default()),
}
}
pub fn instance() -> &'static DownloadedBrowsersRegistry {
&DOWNLOADED_BROWSERS_REGISTRY
}
pub fn load(&self) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let registry_path = Self::get_registry_path()?;
if !registry_path.exists() {
return Ok(());
}
let content = fs::read_to_string(&registry_path)?;
let registry_data: RegistryData = serde_json::from_str(&content)?;
let mut data = self.data.lock().unwrap();
*data = registry_data;
Ok(())
}
pub fn save(&self) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let registry_path = Self::get_registry_path()?;
// Ensure parent directory exists
if let Some(parent) = registry_path.parent() {
fs::create_dir_all(parent)?;
}
let data = self.data.lock().unwrap();
let content = serde_json::to_string_pretty(&*data)?;
fs::write(&registry_path, content)?;
Ok(())
}
fn get_registry_path() -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
let base_dirs = BaseDirs::new().ok_or("Failed to get base directories")?;
let mut path = base_dirs.data_local_dir().to_path_buf();
path.push(if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
});
path.push("data");
path.push("downloaded_browsers.json");
Ok(path)
}
pub fn add_browser(&self, info: DownloadedBrowserInfo) {
let mut data = self.data.lock().unwrap();
data
.browsers
.entry(info.browser.clone())
.or_default()
.insert(info.version.clone(), info);
}
pub fn remove_browser(&self, browser: &str, version: &str) -> Option<DownloadedBrowserInfo> {
let mut data = self.data.lock().unwrap();
data.browsers.get_mut(browser)?.remove(version)
}
pub fn is_browser_downloaded(&self, browser: &str, version: &str) -> bool {
let data = self.data.lock().unwrap();
data
.browsers
.get(browser)
.and_then(|versions| versions.get(version))
.is_some()
}
pub fn get_downloaded_versions(&self, browser: &str) -> Vec<String> {
let data = self.data.lock().unwrap();
data
.browsers
.get(browser)
.map(|versions| versions.keys().cloned().collect())
.unwrap_or_default()
}
pub fn mark_download_started(&self, browser: &str, version: &str, file_path: PathBuf) {
let info = DownloadedBrowserInfo {
browser: browser.to_string(),
version: version.to_string(),
file_path,
};
self.add_browser(info);
}
pub fn mark_download_completed(&self, browser: &str, version: &str) -> Result<(), String> {
let data = self.data.lock().unwrap();
if data
.browsers
.get(browser)
.and_then(|versions| versions.get(version))
.is_some()
{
Ok(())
} else {
Err(format!("Browser {browser}:{version} not found in registry"))
}
}
pub fn cleanup_failed_download(
&self,
browser: &str,
version: &str,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
if let Some(info) = self.remove_browser(browser, version) {
// Clean up extracted binaries but preserve downloaded archives
if info.file_path.exists() {
if info.file_path.is_dir() {
// Allowed archive extensions to preserve
let archive_exts = [
"zip", "dmg", "tar.xz", "tar.gz", "tar.bz2", "AppImage", "exe", "pkg", "msi",
];
for entry in fs::read_dir(&info.file_path)? {
let entry = entry?;
let path = entry.path();
if path.is_dir() {
fs::remove_dir_all(&path)?;
continue;
}
// For files, preserve if they look like downloaded archives/installers
let keep = path
.file_name()
.and_then(|n| n.to_str())
.map(|name| {
// Match suffixes (handles multi-part extensions like .tar.xz)
archive_exts
.iter()
.any(|ext| name.to_lowercase().ends_with(&ext.to_lowercase()))
})
.unwrap_or(false);
if !keep {
fs::remove_file(&path)?;
}
}
} else {
// It's a file. If it's not an archive, remove it; otherwise preserve it.
let file_name = info
.file_path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("");
let archive_exts = [
"zip", "dmg", "tar.xz", "tar.gz", "tar.bz2", "AppImage", "exe", "pkg", "msi",
];
let is_archive = archive_exts
.iter()
.any(|ext| file_name.to_lowercase().ends_with(&ext.to_lowercase()));
if !is_archive {
fs::remove_file(&info.file_path)?;
}
}
}
}
Ok(())
}
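The removed registry's `cleanup_failed_download` decides what to keep by suffix matching rather than `Path::extension`, so multi-part extensions like `.tar.xz` are recognized whole. That check in isolation (the `is_archive` helper is illustrative; the extension list is the one from the deleted code, lowercased since the comparison lowercases anyway):

```rust
// Keep downloaded archives/installers, delete everything else.
// Suffix matching (not Path::extension) so "x.tar.xz" matches as a unit.
fn is_archive(name: &str) -> bool {
    const ARCHIVE_EXTS: [&str; 9] = [
        "zip", "dmg", "tar.xz", "tar.gz", "tar.bz2", "appimage", "exe", "pkg", "msi",
    ];
    let lower = name.to_lowercase();
    ARCHIVE_EXTS.iter().any(|ext| lower.ends_with(ext))
}

fn main() {
    assert!(is_archive("chromium-130.tar.xz"));
    assert!(is_archive("Donut.AppImage"));
    assert!(!is_archive("chrome")); // extracted binary: safe to remove
    println!("ok");
}
```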
/// Find and remove unused browser binaries that are not referenced by any active profiles
pub fn cleanup_unused_binaries(
&self,
active_profiles: &[(String, String)], // (browser, version) pairs
running_profiles: &[(String, String)], // (browser, version) pairs for running profiles
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
let active_set: std::collections::HashSet<(String, String)> =
active_profiles.iter().cloned().collect();
let running_set: std::collections::HashSet<(String, String)> =
running_profiles.iter().cloned().collect();
let mut cleaned_up = Vec::new();
// Collect all downloaded browsers that are not in active profiles
let mut to_remove = Vec::new();
{
let data = self.data.lock().unwrap();
for (browser, versions) in &data.browsers {
for version in versions.keys() {
let browser_version = (browser.clone(), version.clone());
// Don't remove if it's used by any active profile
if active_set.contains(&browser_version) {
println!("Keeping: {browser} {version} (in use by profile)");
continue;
}
// Don't remove if it's currently running (even if not in active profiles)
if running_set.contains(&browser_version) {
println!("Keeping: {browser} {version} (currently running)");
continue;
}
// Mark for removal
to_remove.push(browser_version);
println!("Marking for removal: {browser} {version} (not used by any profile)");
}
}
}
// Remove unused binaries
for (browser, version) in to_remove {
if let Err(e) = self.cleanup_failed_download(&browser, &version) {
eprintln!("Failed to cleanup unused binary {browser}:{version}: {e}");
} else {
cleaned_up.push(format!("{browser} {version}"));
println!("Successfully removed unused binary: {browser} {version}");
}
}
if cleaned_up.is_empty() {
println!("No unused binaries found to clean up");
} else {
println!("Cleaned up {} unused binaries", cleaned_up.len());
}
Ok(cleaned_up)
}
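The keep/remove decision in `cleanup_unused_binaries` reduces to a set difference over `(browser, version)` pairs: anything registered that is neither active nor running is removed. A minimal standalone sketch of that core logic (function name hypothetical):

```rust
use std::collections::HashSet;

// Plan which (browser, version) pairs to remove: registered entries that are
// neither used by an active profile nor currently running.
fn plan_removal(
    registered: &[(String, String)],
    active: &[(String, String)],
    running: &[(String, String)],
) -> Vec<(String, String)> {
    let keep: HashSet<_> = active.iter().chain(running.iter()).collect();
    registered
        .iter()
        .filter(|bv| !keep.contains(bv))
        .cloned()
        .collect()
}

fn main() {
    let registered = vec![
        ("firefox".to_string(), "139.0".to_string()),
        ("firefox".to_string(), "140.0".to_string()),
    ];
    let active = vec![("firefox".to_string(), "139.0".to_string())];
    let running: Vec<(String, String)> = Vec::new();
    let plan = plan_removal(&registered, &active, &running);
    assert_eq!(plan, vec![("firefox".to_string(), "140.0".to_string())]);
}
```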
/// Get all browsers and versions referenced by active profiles
pub fn get_active_browser_versions(
&self,
profiles: &[crate::profile::BrowserProfile],
) -> Vec<(String, String)> {
profiles
.iter()
.map(|profile| (profile.browser.clone(), profile.version.clone()))
.collect()
}
/// Verify that all registered browsers actually exist on disk and clean up stale entries
pub fn verify_and_cleanup_stale_entries(
&self,
browser_runner: &crate::browser_runner::BrowserRunner,
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
use crate::browser::{create_browser, BrowserType};
let mut cleaned_up = Vec::new();
let binaries_dir = browser_runner.get_binaries_dir();
let browsers_to_check: Vec<(String, String)> = {
let data = self.data.lock().unwrap();
data
.browsers
.iter()
.flat_map(|(browser, versions)| {
versions
.keys()
.map(|version| (browser.clone(), version.clone()))
})
.collect()
};
for (browser_str, version) in browsers_to_check {
if let Ok(browser_type) = BrowserType::from_str(&browser_str) {
let browser = create_browser(browser_type);
if !browser.is_version_downloaded(&version, &binaries_dir) {
// Files don't exist, remove from registry
if let Some(_removed) = self.remove_browser(&browser_str, &version) {
cleaned_up.push(format!("{browser_str} {version}"));
println!("Removed stale registry entry for {browser_str} {version}");
}
}
}
}
if !cleaned_up.is_empty() {
self.save()?;
}
Ok(cleaned_up)
}
/// Get all browsers and versions that are currently running
pub fn get_running_browser_versions(
&self,
profiles: &[crate::profile::BrowserProfile],
) -> Vec<(String, String)> {
profiles
.iter()
.filter(|profile| profile.process_id.is_some())
.map(|profile| (profile.browser.clone(), profile.version.clone()))
.collect()
}
/// Scan the binaries directory and sync with registry
/// This ensures the registry reflects what's actually on disk
pub fn sync_with_binaries_directory(
&self,
binaries_dir: &std::path::Path,
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
let mut changes = Vec::new();
if !binaries_dir.exists() {
return Ok(changes);
}
// Scan for actual browser directories
for browser_entry in fs::read_dir(binaries_dir)? {
let browser_entry = browser_entry?;
let browser_path = browser_entry.path();
if !browser_path.is_dir() {
continue;
}
let browser_name = browser_path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("");
if browser_name.is_empty() || browser_name.starts_with('.') {
continue;
}
// Scan for version directories within this browser
for version_entry in fs::read_dir(&browser_path)? {
let version_entry = version_entry?;
let version_path = version_entry.path();
if !version_path.is_dir() {
continue;
}
let version_name = version_path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("");
if version_name.is_empty() || version_name.starts_with('.') {
continue;
}
// Only add to registry if this looks like a valid installed browser, not just an archive
if !self.is_browser_downloaded(browser_name, version_name) {
if let Ok(browser_type) = crate::browser::BrowserType::from_str(browser_name) {
let browser = crate::browser::create_browser(browser_type);
if browser.is_version_downloaded(version_name, binaries_dir) {
let info = DownloadedBrowserInfo {
browser: browser_name.to_string(),
version: version_name.to_string(),
file_path: version_path.clone(),
};
self.add_browser(info);
changes.push(format!("Added {browser_name} {version_name} to registry"));
}
}
}
}
}
if !changes.is_empty() {
self.save()?;
}
Ok(changes)
}
/// Comprehensive cleanup that removes unused binaries and syncs registry
pub fn comprehensive_cleanup(
&self,
binaries_dir: &std::path::Path,
active_profiles: &[(String, String)],
running_profiles: &[(String, String)],
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
let mut cleanup_results = Vec::new();
// First, sync registry with actual binaries on disk
let sync_results = self.sync_with_binaries_directory(binaries_dir)?;
cleanup_results.extend(sync_results);
// Then perform the regular cleanup
let regular_cleanup = self.cleanup_unused_binaries(active_profiles, running_profiles)?;
cleanup_results.extend(regular_cleanup);
// Finally, verify and cleanup stale entries
let stale_cleanup = self.verify_and_cleanup_stale_entries_simple(binaries_dir)?;
cleanup_results.extend(stale_cleanup);
if !cleanup_results.is_empty() {
self.save()?;
}
Ok(cleanup_results)
}
/// Simplified version of verify_and_cleanup_stale_entries that doesn't need BrowserRunner
pub fn verify_and_cleanup_stale_entries_simple(
&self,
binaries_dir: &std::path::Path,
) -> Result<Vec<String>, Box<dyn std::error::Error + Send + Sync>> {
let mut cleaned_up = Vec::new();
let mut browsers_to_remove = Vec::new();
{
let data = self.data.lock().unwrap();
for (browser_str, versions) in &data.browsers {
for version in versions.keys() {
// Check if the browser directory actually exists
let browser_dir = binaries_dir.join(browser_str).join(version);
if !browser_dir.exists() {
browsers_to_remove.push((browser_str.clone(), version.clone()));
}
}
}
}
// Remove stale entries
for (browser_str, version) in browsers_to_remove {
if let Some(_removed) = self.remove_browser(&browser_str, &version) {
cleaned_up.push(format!(
"Removed stale registry entry for {browser_str} {version}"
));
}
}
Ok(cleaned_up)
}
}
// Global singleton instance
lazy_static::lazy_static! {
static ref DOWNLOADED_BROWSERS_REGISTRY: DownloadedBrowsersRegistry = {
let registry = DownloadedBrowsersRegistry::new();
if let Err(e) = registry.load() {
eprintln!("Warning: Failed to load downloaded browsers registry: {e}");
}
registry
};
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_registry_creation() {
let registry = DownloadedBrowsersRegistry::new();
let data = registry.data.lock().unwrap();
assert!(data.browsers.is_empty());
}
#[test]
fn test_add_and_get_browser() {
let registry = DownloadedBrowsersRegistry::new();
let info = DownloadedBrowserInfo {
browser: "firefox".to_string(),
version: "139.0".to_string(),
file_path: PathBuf::from("/test/path"),
};
registry.add_browser(info.clone());
assert!(registry.is_browser_downloaded("firefox", "139.0"));
assert!(!registry.is_browser_downloaded("firefox", "140.0"));
assert!(!registry.is_browser_downloaded("chrome", "139.0"));
}
#[test]
fn test_get_downloaded_versions() {
let registry = DownloadedBrowsersRegistry::new();
let info1 = DownloadedBrowserInfo {
browser: "firefox".to_string(),
version: "139.0".to_string(),
file_path: PathBuf::from("/test/path1"),
};
let info2 = DownloadedBrowserInfo {
browser: "firefox".to_string(),
version: "140.0".to_string(),
file_path: PathBuf::from("/test/path2"),
};
let info3 = DownloadedBrowserInfo {
browser: "firefox".to_string(),
version: "141.0".to_string(),
file_path: PathBuf::from("/test/path3"),
};
registry.add_browser(info1);
registry.add_browser(info2);
registry.add_browser(info3);
let versions = registry.get_downloaded_versions("firefox");
assert_eq!(versions.len(), 3);
assert!(versions.contains(&"139.0".to_string()));
assert!(versions.contains(&"140.0".to_string()));
assert!(versions.contains(&"141.0".to_string()));
}
#[test]
fn test_mark_download_lifecycle() {
let registry = DownloadedBrowsersRegistry::new();
// Mark download started
registry.mark_download_started("firefox", "139.0", PathBuf::from("/test/path"));
// Should be considered downloaded immediately
assert!(
registry.is_browser_downloaded("firefox", "139.0"),
"Browser should be considered downloaded after marking as started"
);
// Mark as completed
registry
.mark_download_completed("firefox", "139.0")
.expect("Failed to mark download as completed");
// Should still be considered downloaded
assert!(
registry.is_browser_downloaded("firefox", "139.0"),
"Browser should still be considered downloaded after completion"
);
}
#[test]
fn test_remove_browser() {
let registry = DownloadedBrowsersRegistry::new();
let info = DownloadedBrowserInfo {
browser: "firefox".to_string(),
version: "139.0".to_string(),
file_path: PathBuf::from("/test/path"),
};
registry.add_browser(info);
assert!(
registry.is_browser_downloaded("firefox", "139.0"),
"Browser should be downloaded after adding"
);
let removed = registry.remove_browser("firefox", "139.0");
assert!(
removed.is_some(),
"Remove operation should return the removed browser info"
);
assert!(
!registry.is_browser_downloaded("firefox", "139.0"),
"Browser should not be downloaded after removal"
);
}
#[test]
fn test_twilight_download() {
let registry = DownloadedBrowsersRegistry::new();
// Mark twilight download started
registry.mark_download_started("zen", "twilight", PathBuf::from("/test/zen-twilight"));
// Check that it's registered
assert!(
registry.is_browser_downloaded("zen", "twilight"),
"Zen twilight version should be registered as downloaded"
);
}
}
File diff suppressed because it is too large
@@ -2,12 +2,19 @@ use reqwest::Client;
use serde::{Deserialize, Serialize};
use std::io;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
use tauri::Emitter;
use crate::api_client::ApiClient;
use crate::browser::BrowserType;
use crate::browser::{create_browser, BrowserType};
use crate::browser_version_manager::DownloadInfo;
// Global state to track currently downloading browser-version pairs
lazy_static::lazy_static! {
static ref DOWNLOADING_BROWSERS: std::sync::Arc<Mutex<std::collections::HashSet<String>>> =
std::sync::Arc::new(Mutex::new(std::collections::HashSet::new()));
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct DownloadProgress {
pub browser: String,
@@ -23,6 +30,10 @@ pub struct DownloadProgress {
pub struct Downloader {
client: Client,
api_client: &'static ApiClient,
registry: &'static crate::downloaded_browsers_registry::DownloadedBrowsersRegistry,
version_service: &'static crate::browser_version_manager::BrowserVersionManager,
extractor: &'static crate::extraction::Extractor,
geoip_downloader: &'static crate::geoip_downloader::GeoIPDownloader,
}
impl Downloader {
@@ -30,6 +41,10 @@ impl Downloader {
Self {
client: Client::new(),
api_client: ApiClient::instance(),
registry: crate::downloaded_browsers_registry::DownloadedBrowsersRegistry::instance(),
version_service: crate::browser_version_manager::BrowserVersionManager::instance(),
extractor: crate::extraction::Extractor::instance(),
geoip_downloader: crate::geoip_downloader::GeoIPDownloader::instance(),
}
}
@@ -42,6 +57,10 @@ impl Downloader {
Self {
client: Client::new(),
api_client: ApiClient::instance(),
registry: crate::downloaded_browsers_registry::DownloadedBrowsersRegistry::instance(),
version_service: crate::browser_version_manager::BrowserVersionManager::instance(),
extractor: crate::extraction::Extractor::instance(),
geoip_downloader: crate::geoip_downloader::GeoIPDownloader::instance(),
}
}
@@ -85,7 +104,7 @@ impl Downloader {
let releases = match self.api_client.fetch_zen_releases_with_caching(true).await {
Ok(releases) => releases,
Err(e) => {
eprintln!("Failed to fetch Zen releases: {e}");
log::error!("Failed to fetch Zen releases: {e}");
return Err(format!("Failed to fetch Zen releases from GitHub API: {e}. This might be due to GitHub API rate limiting or network issues. Please try again later.").into());
}
};
@@ -402,23 +421,44 @@ impl Downloader {
existing_size = meta.len();
}
// Build request, add Range only if we have bytes
let mut request = self
.client
.get(&download_url)
.header(
"User-Agent",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36",
);
// Build request, add Range only if we have bytes. If the server responds with 416 (Range Not
// Satisfiable), delete the partial file and retry once without the Range header.
let response = {
let mut request = self
.client
.get(&download_url)
.header(
"User-Agent",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36",
);
if existing_size > 0 {
request = request.header("Range", format!("bytes={existing_size}-"));
}
if existing_size > 0 {
request = request.header("Range", format!("bytes={existing_size}-"));
}
// Start download (or resume)
let response = request.send().await?;
let first = request.send().await?;
// Check if the response is successful
if first.status().as_u16() == 416 && existing_size > 0 {
// Partial file on disk is not acceptable to the server — remove it and retry from scratch
let _ = std::fs::remove_file(&file_path);
existing_size = 0;
let retry = self
.client
.get(&download_url)
.header(
"User-Agent",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36",
)
.send()
.await?;
retry
} else {
first
}
};
// Check if the response is successful (200 OK or 206 Partial Content)
if !(response.status().is_success() || response.status().as_u16() == 206) {
return Err(format!("Download failed with status: {}", response.status()).into());
}
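The resume logic above boils down to two decisions: send a `Range` header only when a partial file exists, and treat HTTP 416 (Range Not Satisfiable) with a partial file on disk as "discard and restart". That decision logic can be isolated without any networking (function names hypothetical):

```rust
// Returns the Range header value to send when resuming, if any.
fn range_header(existing_size: u64) -> Option<String> {
    (existing_size > 0).then(|| format!("bytes={existing_size}-"))
}

// 200 and 206 are acceptable responses; 416 while holding a partial file
// means the partial file must be deleted and the request retried without
// a Range header.
fn should_restart_from_scratch(status: u16, existing_size: u64) -> bool {
    status == 416 && existing_size > 0
}

fn main() {
    assert_eq!(range_header(0), None);
    assert_eq!(range_header(4096), Some("bytes=4096-".to_string()));
    assert!(should_restart_from_scratch(416, 4096));
    assert!(!should_restart_from_scratch(416, 0));
    assert!(!should_restart_from_scratch(206, 4096));
}
```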
@@ -552,6 +592,327 @@ impl Downloader {
Ok(file_path)
}
/// Download a browser binary, verify it, and register it in the downloaded browsers registry
pub async fn download_browser_full(
&self,
app_handle: &tauri::AppHandle,
browser_str: String,
version: String,
) -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
// Check if this browser-version pair is already being downloaded
let download_key = format!("{browser_str}-{version}");
{
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
if downloading.contains(&download_key) {
return Err(format!("Browser '{browser_str}' version '{version}' is already being downloaded. Please wait for the current download to complete.").into());
}
// Mark this browser-version pair as being downloaded
downloading.insert(download_key.clone());
}
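The `DOWNLOADING_BROWSERS` guard above is a mutex-protected in-flight set keyed by `"{browser}-{version}"`. A self-contained sketch of that pattern (type and method names hypothetical):

```rust
use std::collections::HashSet;
use std::sync::Mutex;

// Tracks in-flight download keys so the same browser-version pair is never
// downloaded twice concurrently.
struct InFlight(Mutex<HashSet<String>>);

impl InFlight {
    fn new() -> Self {
        Self(Mutex::new(HashSet::new()))
    }
    // Returns true if the key was newly claimed, false if already in flight.
    fn try_start(&self, key: &str) -> bool {
        self.0.lock().unwrap().insert(key.to_string())
    }
    fn finish(&self, key: &str) {
        self.0.lock().unwrap().remove(key);
    }
}

fn main() {
    let guard = InFlight::new();
    assert!(guard.try_start("firefox-139.0"));
    assert!(!guard.try_start("firefox-139.0"));
    guard.finish("firefox-139.0");
    assert!(guard.try_start("firefox-139.0"));
}
```

A RAII guard that removes the key on drop would also cover the early-return error paths automatically, instead of the manual `remove` on each exit path seen in the function body.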
let browser_type =
BrowserType::from_str(&browser_str).map_err(|e| format!("Invalid browser type: {e}"))?;
let browser = create_browser(browser_type.clone());
// Use injected registry instance
    // Resolve the binaries directory. BrowserRunner::get_binaries_dir is not
    // accessible from here, so its lookup logic is replicated below.
let binaries_dir = if let Some(base_dirs) = directories::BaseDirs::new() {
let mut path = base_dirs.data_local_dir().to_path_buf();
path.push(if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
});
path.push("binaries");
path
} else {
return Err("Failed to get base directories".into());
};
// Check if registry thinks it's downloaded, but also verify files actually exist
if self.registry.is_browser_downloaded(&browser_str, &version) {
let actually_exists = browser.is_version_downloaded(&version, &binaries_dir);
if actually_exists {
// Remove from downloading set since it's already downloaded
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
downloading.remove(&download_key);
return Ok(version);
} else {
// Registry says it's downloaded but files don't exist - clean up registry
log::info!("Registry indicates {browser_str} {version} is downloaded, but files are missing. Cleaning up registry entry.");
self.registry.remove_browser(&browser_str, &version);
self
.registry
.save()
.map_err(|e| format!("Failed to save cleaned registry: {e}"))?;
}
}
// Check if browser is supported on current platform before attempting download
if !self
.version_service
.is_browser_supported(&browser_str)
.unwrap_or(false)
{
// Remove from downloading set on error
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
downloading.remove(&download_key);
return Err(
format!(
"Browser '{}' is not supported on your platform ({} {}). Supported browsers: {}",
browser_str,
std::env::consts::OS,
std::env::consts::ARCH,
self.version_service.get_supported_browsers().join(", ")
)
.into(),
);
}
let download_info = self
.version_service
.get_download_info(&browser_str, &version)
.map_err(|e| format!("Failed to get download info: {e}"))?;
// Create browser directory
let mut browser_dir = binaries_dir.clone();
browser_dir.push(&browser_str);
browser_dir.push(&version);
std::fs::create_dir_all(&browser_dir)
.map_err(|e| format!("Failed to create browser directory: {e}"))?;
// Mark download as started (but don't add to registry yet)
self
.registry
.mark_download_started(&browser_str, &version, browser_dir.clone());
// Attempt to download the archive. If the download fails but an archive with the
// expected filename already exists (manual download), continue using that file.
let download_path: PathBuf = match self
.download_browser(
app_handle,
browser_type.clone(),
&version,
&download_info,
&browser_dir,
)
.await
{
Ok(path) => path,
Err(e) => {
// Do NOT continue with extraction on failed downloads. Partial files may exist but are invalid.
// Clean registry entry and stop here so the UI can show a single, clear error.
let _ = self.registry.remove_browser(&browser_str, &version);
let _ = self.registry.save();
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
downloading.remove(&download_key);
return Err(format!("Failed to download browser: {e}").into());
}
};
// Use the extraction module
if download_info.is_archive {
match self
.extractor
.extract_browser(
app_handle,
browser_type.clone(),
&version,
&download_path,
&browser_dir,
)
.await
{
Ok(_) => {
// Do not remove the archive here. We keep it until verification succeeds.
}
Err(e) => {
// Do not remove the archive or extracted files. Just drop the registry entry
// so it won't be reported as downloaded.
let _ = self.registry.remove_browser(&browser_str, &version);
let _ = self.registry.save();
// Remove browser-version pair from downloading set on error
{
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
downloading.remove(&download_key);
}
return Err(format!("Failed to extract browser: {e}").into());
}
}
// Give filesystem a moment to settle after extraction
tokio::time::sleep(tokio::time::Duration::from_millis(500)).await;
}
// Emit verification progress
let progress = DownloadProgress {
browser: browser_str.clone(),
version: version.clone(),
downloaded_bytes: 0,
total_bytes: None,
percentage: 100.0,
speed_bytes_per_sec: 0.0,
eta_seconds: None,
stage: "verifying".to_string(),
};
let _ = app_handle.emit("download-progress", &progress);
// Verify the browser was downloaded correctly
log::info!("Verifying download for browser: {browser_str}, version: {version}");
// Use the browser's own verification method
if !browser.is_version_downloaded(&version, &binaries_dir) {
// Provide detailed error information for debugging
let browser_dir = binaries_dir.join(&browser_str).join(&version);
let mut error_details = format!(
"Browser download completed but verification failed for {} {}. Expected directory: {}",
browser_str,
version,
browser_dir.display()
);
// List what files actually exist
if browser_dir.exists() {
error_details.push_str("\nFiles found in directory:");
if let Ok(entries) = std::fs::read_dir(&browser_dir) {
for entry in entries.flatten() {
let path = entry.path();
let file_type = if path.is_dir() { "DIR" } else { "FILE" };
error_details.push_str(&format!("\n {} {}", file_type, path.display()));
}
} else {
error_details.push_str("\n (Could not read directory contents)");
}
} else {
error_details.push_str("\nDirectory does not exist!");
}
// For Camoufox on Linux, provide specific expected files
if browser_str == "camoufox" && cfg!(target_os = "linux") {
let camoufox_subdir = browser_dir.join("camoufox");
error_details.push_str("\nExpected Camoufox executable locations:");
error_details.push_str(&format!("\n {}/camoufox-bin", camoufox_subdir.display()));
error_details.push_str(&format!("\n {}/camoufox", camoufox_subdir.display()));
if camoufox_subdir.exists() {
error_details.push_str(&format!(
"\nCamoufox subdirectory exists: {}",
camoufox_subdir.display()
));
if let Ok(entries) = std::fs::read_dir(&camoufox_subdir) {
error_details.push_str("\nFiles in camoufox subdirectory:");
for entry in entries.flatten() {
let path = entry.path();
let file_type = if path.is_dir() { "DIR" } else { "FILE" };
error_details.push_str(&format!("\n {} {}", file_type, path.display()));
}
}
} else {
error_details.push_str(&format!(
"\nCamoufox subdirectory does not exist: {}",
camoufox_subdir.display()
));
}
}
// Do not delete files on verification failure; keep archive for manual retry.
let _ = self.registry.remove_browser(&browser_str, &version);
let _ = self.registry.save();
// Remove browser-version pair from downloading set on verification failure
{
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
downloading.remove(&download_key);
}
return Err(error_details.into());
}
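The diagnostic built above walks the directory and appends one `DIR`/`FILE` line per entry. That listing step can be sketched on its own (helper name hypothetical):

```rust
use std::path::Path;

// Produce the per-entry listing used in the verification error message:
// one indented "DIR path" or "FILE path" line per directory entry.
fn describe_dir(dir: &Path) -> String {
    let mut out = String::new();
    match std::fs::read_dir(dir) {
        Ok(entries) => {
            for entry in entries.flatten() {
                let path = entry.path();
                let kind = if path.is_dir() { "DIR" } else { "FILE" };
                out.push_str(&format!("\n  {} {}", kind, path.display()));
            }
        }
        Err(_) => out.push_str("\n  (Could not read directory contents)"),
    }
    out
}

fn main() {
    let base = std::env::temp_dir().join("describe_dir_demo");
    std::fs::create_dir_all(base.join("sub")).unwrap();
    std::fs::write(base.join("a.txt"), b"x").unwrap();
    let report = describe_dir(&base);
    assert!(report.contains("DIR"));
    assert!(report.contains("FILE"));
    let _ = std::fs::remove_dir_all(&base);
}
```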
// Mark completion in registry - only now add to registry after verification
if let Err(e) =
self
.registry
.mark_download_completed(&browser_str, &version, browser_dir.clone())
{
log::warn!("Warning: Could not mark {browser_str} {version} as completed in registry: {e}");
}
self
.registry
.save()
.map_err(|e| format!("Failed to save registry: {e}"))?;
// Now that verification succeeded, remove the archive file if it exists
if download_info.is_archive {
let archive_path = browser_dir.join(&download_info.filename);
if archive_path.exists() {
if let Err(e) = std::fs::remove_file(&archive_path) {
log::warn!("Warning: Could not delete archive file after verification: {e}");
}
}
}
// If this is Camoufox, automatically download GeoIP database
if browser_str == "camoufox" {
// Check if GeoIP database is already available
if !crate::geoip_downloader::GeoIPDownloader::is_geoip_database_available() {
log::info!("Downloading GeoIP database for Camoufox...");
match self
.geoip_downloader
.download_geoip_database(app_handle)
.await
{
Ok(_) => {
log::info!("GeoIP database downloaded successfully");
}
Err(e) => {
log::error!("Failed to download GeoIP database: {e}");
// Don't fail the browser download if GeoIP download fails
}
}
} else {
log::info!("GeoIP database already available");
}
}
// Emit completion
let progress = DownloadProgress {
browser: browser_str.clone(),
version: version.clone(),
downloaded_bytes: 0,
total_bytes: None,
percentage: 100.0,
speed_bytes_per_sec: 0.0,
eta_seconds: Some(0.0),
stage: "completed".to_string(),
};
let _ = app_handle.emit("download-progress", &progress);
// Remove browser-version pair from downloading set
{
let mut downloading = DOWNLOADING_BROWSERS.lock().unwrap();
downloading.remove(&download_key);
}
Ok(version)
}
}
#[tauri::command]
pub async fn download_browser(
app_handle: tauri::AppHandle,
browser_str: String,
version: String,
) -> Result<String, String> {
let downloader = Downloader::instance();
downloader
.download_browser_full(&app_handle, browser_str, version)
.await
.map_err(|e| format!("Failed to download browser: {e}"))
}
#[cfg(test)]
@@ -4,7 +4,7 @@ use std::path::{Path, PathBuf};
use tauri::Emitter;
use crate::browser::BrowserType;
use crate::download::DownloadProgress;
use crate::downloader::DownloadProgress;
#[cfg(any(target_os = "macos", target_os = "windows"))]
use std::process::Command;
@@ -55,7 +55,7 @@ impl Extractor {
// If the executable is not in the expected subdirectory, create the structure
if !exe_path.starts_with(&expected_subdir) {
println!("Reorganizing directory structure for {}", browser_type);
log::info!("Reorganizing directory structure for {}", browser_type);
// Create the expected subdirectory
std::fs::create_dir_all(&expected_subdir)?;
@@ -78,19 +78,19 @@ impl Extractor {
// Move the file/directory
if let Err(e) = std::fs::rename(&path, &target_path) {
println!(
log::info!(
"Warning: Failed to move {} to {}: {}",
path.display(),
target_path.display(),
e
);
} else {
println!("Moved {} to {}", path.display(), target_path.display());
log::info!("Moved {} to {}", path.display(), target_path.display());
}
}
}
println!("Directory structure reorganized for {}", browser_type);
log::info!("Directory structure reorganized for {}", browser_type);
}
Ok(())
@@ -117,7 +117,7 @@ impl Extractor {
};
let _ = app_handle.emit("download-progress", &progress);
println!(
log::info!(
"Starting extraction of {} for browser {} version {}",
archive_path.display(),
browser_type.as_str(),
@@ -132,7 +132,7 @@ impl Extractor {
e
)
})?;
println!("Detected format: {actual_format}");
log::info!("Detected format: {actual_format}");
let extraction_result = match actual_format.as_str() {
"dmg" => {
@@ -210,7 +210,7 @@ impl Extractor {
match extraction_result {
Ok(path) => {
println!(
log::info!(
"Successfully extracted {} {} to: {}",
browser_type.as_str(),
version,
@@ -219,7 +219,7 @@ impl Extractor {
Ok(path)
}
Err(e) => {
eprintln!(
log::error!(
"Extraction failed for {} {}: {}",
browser_type.as_str(),
version,
@@ -337,7 +337,7 @@ impl Extractor {
dmg_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!(
log::info!(
"Extracting DMG: {} to {}",
dmg_path.display(),
dest_dir.display()
@@ -353,7 +353,7 @@ impl Extractor {
));
create_dir_all(&mount_point)?;
println!("Created mount point: {}", mount_point.display());
log::info!("Created mount point: {}", mount_point.display());
// Mount the DMG
let output = Command::new("hdiutil")
@@ -369,7 +369,7 @@ impl Extractor {
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
println!("Failed to mount DMG. stdout: {stdout}, stderr: {stderr}");
log::info!("Failed to mount DMG. stdout: {stdout}, stderr: {stderr}");
// Clean up mount point before returning error
let _ = fs::remove_dir_all(&mount_point);
@@ -377,7 +377,7 @@ impl Extractor {
return Err(format!("Failed to mount DMG: {stderr}").into());
}
println!("Successfully mounted DMG");
log::info!("Successfully mounted DMG");
// Find the .app directory in the mount point
let app_result = self.find_app_in_directory(&mount_point).await;
@@ -385,7 +385,7 @@ impl Extractor {
let app_entry = match app_result {
Ok(app_path) => app_path,
Err(e) => {
println!("Failed to find .app in mount point: {e}");
log::info!("Failed to find .app in mount point: {e}");
// Try to unmount before returning error
let _ = Command::new("hdiutil")
@@ -397,12 +397,12 @@ impl Extractor {
}
};
println!("Found .app bundle: {}", app_entry.display());
log::info!("Found .app bundle: {}", app_entry.display());
// Copy the .app to the destination
let app_path = dest_dir.join(app_entry.file_name().unwrap());
println!("Copying .app to: {}", app_path.display());
log::info!("Copying .app to: {}", app_path.display());
let output = Command::new("cp")
.args([
@@ -414,7 +414,7 @@ impl Extractor {
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
println!("Failed to copy app: {stderr}");
log::info!("Failed to copy app: {stderr}");
// Unmount before returning error
let _ = Command::new("hdiutil")
@@ -425,7 +425,7 @@ impl Extractor {
return Err(format!("Failed to copy app: {stderr}").into());
}
println!("Successfully copied .app bundle");
log::info!("Successfully copied .app bundle");
// Remove quarantine attributes
let _ = Command::new("xattr")
@@ -436,7 +436,7 @@ impl Extractor {
.args(["-cr", app_path.to_str().unwrap()])
.output();
println!("Removed quarantine attributes");
log::info!("Removed quarantine attributes");
// Unmount the DMG
let output = Command::new("hdiutil")
@@ -445,10 +445,10 @@ impl Extractor {
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
println!("Warning: Failed to unmount DMG: {stderr}");
log::warn!("Warning: Failed to unmount DMG: {stderr}");
// Don't fail if unmount fails - the extraction was successful
} else {
println!("Successfully unmounted DMG");
log::info!("Successfully unmounted DMG");
}
// Clean up mount point directory
@@ -486,7 +486,7 @@ impl Extractor {
if path.is_dir() {
if let Some(extension) = path.extension() {
if extension == "app" {
println!("Found .app bundle at depth {}: {}", depth, path.display());
log::info!("Found .app bundle at depth {}: {}", depth, path.display());
return Ok(path);
}
}
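The depth-limited `.app` lookup above follows a common pattern: recurse into subdirectories, stop at a maximum depth, and return the first directory whose extension matches. A generic standalone sketch (function name hypothetical, not the extractor's exact method):

```rust
use std::path::{Path, PathBuf};

// Depth-limited recursive search for a directory entry with the given
// extension (e.g. "app" for macOS bundles). Returns the first match.
fn find_with_ext(dir: &Path, ext: &str, depth: u32, max_depth: u32) -> Option<PathBuf> {
    if depth > max_depth {
        return None;
    }
    for entry in std::fs::read_dir(dir).ok()?.flatten() {
        let path = entry.path();
        if path.is_dir() {
            if path.extension().is_some_and(|e| e == ext) {
                return Some(path);
            }
            if let Some(found) = find_with_ext(&path, ext, depth + 1, max_depth) {
                return Some(found);
            }
        }
    }
    None
}

fn main() {
    let base = std::env::temp_dir().join("find_ext_demo");
    let _ = std::fs::create_dir_all(base.join("nested").join("Demo.app"));
    assert!(find_with_ext(&base, "app", 0, 3).is_some());
    let _ = std::fs::remove_dir_all(&base);
}
```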
@@ -535,7 +535,7 @@ impl Extractor {
zip_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Extracting ZIP archive: {}", zip_path.display());
log::info!("Extracting ZIP archive: {}", zip_path.display());
std::fs::create_dir_all(dest_dir)?;
let file = File::open(zip_path)
@@ -544,7 +544,7 @@ impl Extractor {
let mut archive = zip::ZipArchive::new(BufReader::new(file))
.map_err(|e| format!("Failed to read ZIP archive {}: {}", zip_path.display(), e))?;
println!("ZIP archive contains {} files", archive.len());
log::info!("ZIP archive contains {} files", archive.len());
for i in 0..archive.len() {
let mut entry = archive
@@ -591,7 +591,7 @@ impl Extractor {
}
}
println!("ZIP extraction completed. Searching for executable...");
log::info!("ZIP extraction completed. Searching for executable...");
self
.find_extracted_executable(dest_dir)
.await
@@ -603,7 +603,7 @@ impl Extractor {
tar_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Extracting tar.gz archive: {}", tar_path.display());
log::info!("Extracting tar.gz archive: {}", tar_path.display());
std::fs::create_dir_all(dest_dir)?;
let file = File::open(tar_path)?;
@@ -615,7 +615,7 @@ impl Extractor {
// Set executable permissions for extracted files
self.set_executable_permissions_recursive(dest_dir).await?;
println!("tar.gz extraction completed. Searching for executable...");
log::info!("tar.gz extraction completed. Searching for executable...");
self.find_extracted_executable(dest_dir).await
}
@@ -624,7 +624,7 @@ impl Extractor {
tar_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Extracting tar.bz2 archive: {}", tar_path.display());
log::info!("Extracting tar.bz2 archive: {}", tar_path.display());
std::fs::create_dir_all(dest_dir)?;
let file = File::open(tar_path)?;
@@ -636,7 +636,7 @@ impl Extractor {
// Set executable permissions for extracted files
self.set_executable_permissions_recursive(dest_dir).await?;
println!("tar.bz2 extraction completed. Searching for executable...");
log::info!("tar.bz2 extraction completed. Searching for executable...");
self.find_extracted_executable(dest_dir).await
}
@@ -645,7 +645,7 @@ impl Extractor {
tar_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Extracting tar.xz archive: {}", tar_path.display());
log::info!("Extracting tar.xz archive: {}", tar_path.display());
std::fs::create_dir_all(dest_dir)?;
let file = File::open(tar_path)?;
@@ -671,7 +671,7 @@ impl Extractor {
// Set executable permissions for extracted files
self.set_executable_permissions_recursive(dest_dir).await?;
println!("tar.xz extraction completed. Searching for executable...");
log::info!("tar.xz extraction completed. Searching for executable...");
self.find_extracted_executable(dest_dir).await
}
@@ -680,7 +680,7 @@ impl Extractor {
msi_path: &Path,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Extracting MSI archive: {}", msi_path.display());
log::info!("Extracting MSI archive: {}", msi_path.display());
std::fs::create_dir_all(dest_dir)?;
// Extract MSI in a separate scope to avoid Send issues
@@ -689,7 +689,7 @@ impl Extractor {
extractor.to(dest_dir);
}
println!("MSI extraction completed. Searching for executable...");
log::info!("MSI extraction completed. Searching for executable...");
self.find_extracted_executable(dest_dir).await
}
@@ -812,7 +812,7 @@ impl Extractor {
&self,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Searching for .app bundle in: {}", dest_dir.display());
log::info!("Searching for .app bundle in: {}", dest_dir.display());
// Use the enhanced recursive search
match self.find_app_in_directory(dest_dir).await {
@@ -820,7 +820,7 @@ impl Extractor {
// Check if the app is in a subdirectory and move it to the root if needed
let app_parent = app_path.parent().unwrap();
if app_parent != dest_dir {
println!(
log::info!(
"Found .app in subdirectory, moving to root: {} -> {}",
app_path.display(),
dest_dir.display()
@@ -837,15 +837,15 @@ impl Extractor {
}
}
println!("Successfully moved .app to: {}", target_path.display());
log::info!("Successfully moved .app to: {}", target_path.display());
Ok(target_path)
} else {
println!("Found .app at root level: {}", app_path.display());
log::info!("Found .app at root level: {}", app_path.display());
Ok(app_path)
}
}
Err(e) => {
println!("Failed to find .app bundle: {e}");
log::info!("Failed to find .app bundle: {e}");
Err("No .app found after extraction".into())
}
}
@@ -856,7 +856,7 @@ impl Extractor {
&self,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!(
log::info!(
"Searching for Windows executable in: {}",
dest_dir.display()
);
@@ -877,7 +877,7 @@ impl Extractor {
for exe_name in &priority_exe_names {
let exe_path = dest_dir.join(exe_name);
if exe_path.exists() {
println!("Found priority executable: {}", exe_path.display());
log::info!("Found priority executable: {}", exe_path.display());
return Ok(exe_path);
}
}
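The priority lookup above (try a short list of likely executable names directly under the destination directory before any recursive search) can be sketched in isolation. This is a hypothetical standalone helper, not code from this diff; the `existing` slice stands in for the `exe_path.exists()` filesystem check:

```rust
use std::path::{Path, PathBuf};

// Hypothetical sketch: return the first priority name that exists under
// dest_dir, mirroring the loop over priority_exe_names above.
fn find_priority_exe(dest_dir: &Path, priority: &[&str], existing: &[&str]) -> Option<PathBuf> {
    priority
        .iter()
        .find(|name| existing.contains(name))
        .map(|name| dest_dir.join(name))
}

fn main() {
    let found = find_priority_exe(
        Path::new("dest"),
        &["camoufox.exe", "firefox.exe"],
        &["readme.txt", "firefox.exe"],
    );
    assert_eq!(found, Some(PathBuf::from("dest").join("firefox.exe")));
    println!("ok");
}
```

Only if this fast path finds nothing does the code fall through to the depth-limited recursive search in the next hunk.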
@@ -885,7 +885,7 @@ impl Extractor {
// Recursively search for executables with depth limit
match self.find_windows_executable_recursive(dest_dir, 0, 3).await {
Ok(exe_path) => {
println!(
log::info!(
"Found executable via recursive search: {}",
exe_path.display()
);
@@ -983,7 +983,7 @@ impl Extractor {
&self,
dest_dir: &Path,
) -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>> {
println!("Searching for Linux executable in: {}", dest_dir.display());
log::info!("Searching for Linux executable in: {}", dest_dir.display());
// Enhanced list of common browser executable names
let exe_names = [
@@ -1031,7 +1031,7 @@ impl Extractor {
for exe_name in &exe_names {
let exe_path = dest_dir.join(exe_name);
if exe_path.exists() && self.is_executable(&exe_path) {
println!("Found executable at root level: {}", exe_path.display());
log::info!("Found executable at root level: {}", exe_path.display());
return Ok(exe_path);
}
}
@@ -1078,7 +1078,7 @@ impl Extractor {
for exe_name in &exe_names {
let exe_path = subdir_path.join(exe_name);
if exe_path.exists() && self.is_executable(&exe_path) {
println!("Found executable in subdirectory: {}", exe_path.display());
log::info!("Found executable in subdirectory: {}", exe_path.display());
return Ok(exe_path);
}
}
@@ -1091,7 +1091,7 @@ impl Extractor {
let path = entry.path();
if let Some(file_name) = path.file_name().and_then(|n| n.to_str()) {
if file_name.ends_with(".AppImage") && self.is_executable(&path) {
println!("Found AppImage: {}", path.display());
log::info!("Found AppImage: {}", path.display());
return Ok(path);
}
}
@@ -1099,15 +1099,15 @@ impl Extractor {
}
// Last resort: recursive search for any executable file
println!("Performing recursive search for executables...");
log::info!("Performing recursive search for executables...");
match self.find_any_executable_recursive(dest_dir, 0).await {
Ok(path) => {
println!("Found executable via recursive search: {}", path.display());
log::info!("Found executable via recursive search: {}", path.display());
Ok(path)
}
Err(e) => {
// List all files in the directory for debugging
println!("Failed to find executable. Directory contents:");
log::info!("Failed to find executable. Directory contents:");
if let Ok(entries) = fs::read_dir(dest_dir) {
for entry in entries.flatten() {
let path = entry.path();
@@ -1116,7 +1116,7 @@ impl Extractor {
} else {
false
};
println!(" {} (executable: {})", path.display(), is_exec);
log::info!(" {} (executable: {})", path.display(), is_exec);
}
}
Err(
@@ -1220,7 +1220,7 @@ impl Extractor {
|| name_lower.contains("camoufox")
|| file_name.ends_with(".AppImage")
{
println!(
log::info!(
"Found priority executable at depth {}: {}",
depth,
path.display()
@@ -1262,7 +1262,7 @@ impl Extractor {
a_name.len().cmp(&b_name.len())
});
println!(
log::info!(
"Found potential executable at depth {}: {}",
depth,
potential_executables[0].display()
+33 -5
@@ -1,4 +1,5 @@
use crate::browser::GithubRelease;
use crate::profile::manager::ProfileManager;
use directories::BaseDirs;
use reqwest::Client;
use serde::{Deserialize, Serialize};
@@ -75,6 +76,25 @@ impl GeoIPDownloader {
false
}
}
/// Check if GeoIP database is missing for Camoufox profiles
pub fn check_missing_geoip_database(
&self,
) -> Result<bool, Box<dyn std::error::Error + Send + Sync>> {
// Get all profiles
let profiles = ProfileManager::instance()
.list_profiles()
.map_err(|e| format!("Failed to list profiles: {e}"))?;
// Check if there are any Camoufox profiles
let has_camoufox_profiles = profiles.iter().any(|profile| profile.browser == "camoufox");
if has_camoufox_profiles {
// Check if GeoIP database is available
return Ok(!Self::is_geoip_database_available());
}
Ok(false)
}
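The decision logic of `check_missing_geoip_database` above reduces to: the GeoIP database only needs downloading when at least one profile uses the `camoufox` browser and the database is not already available. A minimal standalone sketch of that rule (hypothetical types, not the actual `ProfileManager`):

```rust
// Hypothetical stand-in for crate::profile::BrowserProfile.
struct Profile {
    browser: String,
}

// Mirrors check_missing_geoip_database: true means a download is required.
fn needs_geoip_download(profiles: &[Profile], db_available: bool) -> bool {
    let has_camoufox = profiles.iter().any(|p| p.browser == "camoufox");
    has_camoufox && !db_available
}

fn main() {
    let profiles = vec![Profile { browser: "camoufox".into() }];
    assert!(needs_geoip_download(&profiles, false));
    assert!(!needs_geoip_download(&profiles, true));
    assert!(!needs_geoip_download(&[], false)); // no Camoufox profiles, no download
    println!("ok");
}
```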
fn find_city_mmdb_asset(&self, release: &GithubRelease) -> Option<String> {
for asset in &release.assets {
@@ -218,6 +238,19 @@ impl GeoIPDownloader {
}
}
#[tauri::command]
pub fn check_missing_geoip_database() -> Result<bool, String> {
let geoip_downloader = GeoIPDownloader::instance();
geoip_downloader
.check_missing_geoip_database()
.map_err(|e| format!("Failed to check missing GeoIP database: {e}"))
}
// Global singleton instance
lazy_static::lazy_static! {
static ref GEOIP_DOWNLOADER: GeoIPDownloader = GeoIPDownloader::new();
}
#[cfg(test)]
mod tests {
use super::*;
@@ -353,8 +386,3 @@ mod tests {
);
}
}
// Global singleton instance
lazy_static::lazy_static! {
static ref GEOIP_DOWNLOADER: GeoIPDownloader = GeoIPDownloader::new();
}
+53 -273
@@ -4,6 +4,7 @@ use std::collections::HashMap;
use std::fs;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
use tauri::Emitter;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ProfileGroup {
@@ -97,7 +98,11 @@ impl GroupManager {
Ok(groups_data.groups)
}
pub fn create_group(&self, name: String) -> Result<ProfileGroup, Box<dyn std::error::Error>> {
pub fn create_group(
&self,
app_handle: &tauri::AppHandle,
name: String,
) -> Result<ProfileGroup, Box<dyn std::error::Error>> {
let mut groups_data = self.load_groups_data()?;
// Check if group with this name already exists
@@ -113,11 +118,17 @@ impl GroupManager {
groups_data.groups.push(group.clone());
self.save_groups_data(&groups_data)?;
// Emit event for reactive UI updates
if let Err(e) = app_handle.emit("groups-changed", ()) {
log::error!("Failed to emit groups-changed event: {e}");
}
Ok(group)
}
pub fn update_group(
&self,
app_handle: &tauri::AppHandle,
id: String,
name: String,
) -> Result<ProfileGroup, Box<dyn std::error::Error>> {
@@ -142,10 +153,20 @@ impl GroupManager {
let updated_group = group.clone();
self.save_groups_data(&groups_data)?;
// Emit event for reactive UI updates
if let Err(e) = app_handle.emit("groups-changed", ()) {
log::error!("Failed to emit groups-changed event: {e}");
}
Ok(updated_group)
}
pub fn delete_group(&self, id: String) -> Result<(), Box<dyn std::error::Error>> {
pub fn delete_group(
&self,
app_handle: &tauri::AppHandle,
id: String,
) -> Result<(), Box<dyn std::error::Error>> {
let mut groups_data = self.load_groups_data()?;
let initial_len = groups_data.groups.len();
@@ -156,6 +177,12 @@ impl GroupManager {
}
self.save_groups_data(&groups_data)?;
// Emit event for reactive UI updates
if let Err(e) = app_handle.emit("groups-changed", ()) {
log::error!("Failed to emit groups-changed event: {e}");
}
Ok(())
}
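`create_group`, `update_group`, and `delete_group` above all follow the same save-then-notify pattern: persist the mutation first, then emit `groups-changed`, logging rather than propagating emit failures so a dead listener cannot fail the mutation. A hypothetical dependency-free sketch of that pattern, with a closure standing in for `app_handle.emit`:

```rust
// Hypothetical sketch of the save-then-notify pattern; `notify` stands in
// for app_handle.emit("groups-changed", ()).
fn mutate_groups<N>(groups: &mut Vec<String>, name: String, notify: N) -> Result<(), String>
where
    N: Fn(&str) -> Result<(), String>,
{
    if groups.iter().any(|g| *g == name) {
        return Err(format!("Group '{name}' already exists"));
    }
    groups.push(name); // persistence (save_groups_data) would happen here
    if let Err(e) = notify("groups-changed") {
        eprintln!("Failed to emit groups-changed event: {e}"); // logged, not returned
    }
    Ok(())
}

fn main() {
    let mut groups = Vec::new();
    assert!(mutate_groups(&mut groups, "Work".to_string(), |_| Ok(())).is_ok());
    // Duplicate names are rejected before any notification fires.
    assert!(mutate_groups(&mut groups, "Work".to_string(), |_| Ok(())).is_err());
    // A failing emit does not fail the mutation itself.
    assert!(mutate_groups(&mut groups, "Personal".to_string(), |_| Err("no listeners".into())).is_ok());
    println!("ok");
}
```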
@@ -203,267 +230,6 @@ lazy_static::lazy_static! {
pub static ref GROUP_MANAGER: Mutex<GroupManager> = Mutex::new(GroupManager::new());
}
#[cfg(test)]
mod tests {
use super::*;
use std::env;
use tempfile::TempDir;
fn create_test_group_manager() -> (GroupManager, TempDir) {
let temp_dir = TempDir::new().expect("Failed to create temp directory");
// Set up a temporary home directory for testing
env::set_var("HOME", temp_dir.path());
// Use per-test isolated data directory without relying on global env vars
let data_override = temp_dir.path().join("donutbrowser_test_data");
let manager = GroupManager::with_data_dir_override(&data_override);
(manager, temp_dir)
}
#[test]
fn test_group_manager_creation() {
let (_manager, _temp_dir) = create_test_group_manager();
// Test passes if no panic occurs
}
#[test]
fn test_create_and_get_groups() {
let (manager, _temp_dir) = create_test_group_manager();
// Initially should have no groups
let groups = manager
.get_all_groups()
.expect("Should be able to get groups");
assert!(groups.is_empty(), "Should start with no groups");
// Create a group
let group_name = "Test Group".to_string();
let created_group = manager
.create_group(group_name.clone())
.expect("Should create group successfully");
assert_eq!(
created_group.name, group_name,
"Created group should have correct name"
);
assert!(
!created_group.id.is_empty(),
"Created group should have an ID"
);
// Verify group was saved
let groups = manager
.get_all_groups()
.expect("Should be able to get groups");
assert_eq!(groups.len(), 1, "Should have one group");
assert_eq!(
groups[0].name, group_name,
"Retrieved group should have correct name"
);
assert_eq!(
groups[0].id, created_group.id,
"Retrieved group should have correct ID"
);
}
#[test]
fn test_create_duplicate_group_fails() {
let (manager, _temp_dir) = create_test_group_manager();
let group_name = "Duplicate Group".to_string();
// Create first group
let _first_group = manager
.create_group(group_name.clone())
.expect("Should create first group");
// Try to create duplicate group
let result = manager.create_group(group_name.clone());
assert!(result.is_err(), "Should fail to create duplicate group");
let error_msg = result.unwrap_err().to_string();
assert!(
error_msg.contains("already exists"),
"Error should mention group already exists"
);
}
#[test]
fn test_update_group() {
let (manager, _temp_dir) = create_test_group_manager();
// Create a group
let original_name = "Original Name".to_string();
let created_group = manager
.create_group(original_name)
.expect("Should create group");
// Update the group
let new_name = "Updated Name".to_string();
let updated_group = manager
.update_group(created_group.id.clone(), new_name.clone())
.expect("Should update group successfully");
assert_eq!(
updated_group.name, new_name,
"Updated group should have new name"
);
assert_eq!(
updated_group.id, created_group.id,
"Updated group should keep same ID"
);
// Verify update was persisted
let groups = manager.get_all_groups().expect("Should get groups");
assert_eq!(groups.len(), 1, "Should still have one group");
assert_eq!(
groups[0].name, new_name,
"Persisted group should have updated name"
);
}
#[test]
fn test_update_nonexistent_group_fails() {
let (manager, _temp_dir) = create_test_group_manager();
let result = manager.update_group("nonexistent-id".to_string(), "New Name".to_string());
assert!(result.is_err(), "Should fail to update nonexistent group");
let error_msg = result.unwrap_err().to_string();
assert!(
error_msg.contains("not found"),
"Error should mention group not found"
);
}
#[test]
fn test_delete_group() {
let (manager, _temp_dir) = create_test_group_manager();
// Create a group
let group_name = "To Delete".to_string();
let created_group = manager
.create_group(group_name)
.expect("Should create group");
// Verify group exists
let groups = manager.get_all_groups().expect("Should get groups");
assert_eq!(groups.len(), 1, "Should have one group");
// Delete the group
manager
.delete_group(created_group.id)
.expect("Should delete group successfully");
// Verify group was deleted
let groups = manager.get_all_groups().expect("Should get groups");
assert!(groups.is_empty(), "Should have no groups after deletion");
}
#[test]
fn test_delete_nonexistent_group_fails() {
let (manager, _temp_dir) = create_test_group_manager();
let result = manager.delete_group("nonexistent-id".to_string());
assert!(result.is_err(), "Should fail to delete nonexistent group");
let error_msg = result.unwrap_err().to_string();
assert!(
error_msg.contains("not found"),
"Error should mention group not found"
);
}
#[test]
fn test_get_groups_with_profile_counts() {
let (manager, _temp_dir) = create_test_group_manager();
// Create test groups
let group1 = manager
.create_group("Group 1".to_string())
.expect("Should create group 1");
let _group2 = manager
.create_group("Group 2".to_string())
.expect("Should create group 2");
// Create mock profiles
let profiles = vec![
crate::profile::BrowserProfile {
id: uuid::Uuid::new_v4(),
name: "Profile 1".to_string(),
browser: "firefox".to_string(),
version: "1.0".to_string(),
proxy_id: None,
process_id: None,
last_launch: None,
release_type: "stable".to_string(),
camoufox_config: None,
group_id: Some(group1.id.clone()),
},
crate::profile::BrowserProfile {
id: uuid::Uuid::new_v4(),
name: "Profile 2".to_string(),
browser: "firefox".to_string(),
version: "1.0".to_string(),
proxy_id: None,
process_id: None,
last_launch: None,
release_type: "stable".to_string(),
camoufox_config: None,
group_id: Some(group1.id.clone()),
},
crate::profile::BrowserProfile {
id: uuid::Uuid::new_v4(),
name: "Profile 3".to_string(),
browser: "firefox".to_string(),
version: "1.0".to_string(),
proxy_id: None,
process_id: None,
last_launch: None,
release_type: "stable".to_string(),
camoufox_config: None,
group_id: None, // Default group
},
];
let groups_with_counts = manager
.get_groups_with_profile_counts(&profiles)
.expect("Should get groups with counts");
// Should have default group + group1 + group2 (group2 has 0 profiles but should still appear)
assert_eq!(
groups_with_counts.len(),
3,
"Should include all groups, even those with 0 profiles"
);
// Check default group
let default_group = groups_with_counts
.iter()
.find(|g| g.id == "default")
.expect("Should have default group");
assert_eq!(
default_group.count, 1,
"Default group should have 1 profile"
);
// Check group1
let group1_with_count = groups_with_counts
.iter()
.find(|g| g.id == group1.id)
.expect("Should have group1");
assert_eq!(group1_with_count.count, 2, "Group1 should have 2 profiles");
// Check that group2 exists with 0 profiles
let group2_with_count = groups_with_counts
.iter()
.find(|g| g.name == "Group 2")
.expect("Should have group2 present even with 0 profiles");
assert_eq!(group2_with_count.count, 0, "Group2 should have 0 profiles");
}
}
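The removed `test_get_groups_with_profile_counts` above pins down the counting contract: profiles with `group_id: None` fall into a `"default"` bucket, and groups with zero profiles still appear in the result. That contract can be sketched with plain maps (hypothetical helper, not the actual `get_groups_with_profile_counts`):

```rust
use std::collections::HashMap;

// Hypothetical sketch: group_ids are all known groups; profile_groups holds
// each profile's Option<group_id>, with None meaning the default group.
fn count_profiles_per_group(
    group_ids: &[&str],
    profile_groups: &[Option<&str>],
) -> HashMap<String, usize> {
    let mut counts: HashMap<String, usize> = HashMap::new();
    counts.insert("default".to_string(), 0); // default bucket always present
    for id in group_ids {
        counts.insert((*id).to_string(), 0); // empty groups still listed
    }
    for g in profile_groups {
        let key = g.unwrap_or("default").to_string();
        *counts.entry(key).or_insert(0) += 1;
    }
    counts
}

fn main() {
    let counts = count_profiles_per_group(&["g1", "g2"], &[Some("g1"), Some("g1"), None]);
    assert_eq!(counts["g1"], 2);
    assert_eq!(counts["g2"], 0); // present even with no profiles
    assert_eq!(counts["default"], 1);
    println!("ok");
}
```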
// Helper function to get groups with counts
pub fn get_groups_with_counts(profiles: &[crate::profile::BrowserProfile]) -> Vec<GroupWithCount> {
let group_manager = GROUP_MANAGER.lock().unwrap();
@@ -491,44 +257,58 @@ pub async fn get_groups_with_profile_counts() -> Result<Vec<GroupWithCount>, Str
}
#[tauri::command]
pub async fn create_profile_group(name: String) -> Result<ProfileGroup, String> {
pub async fn create_profile_group(
app_handle: tauri::AppHandle,
name: String,
) -> Result<ProfileGroup, String> {
let group_manager = GROUP_MANAGER.lock().unwrap();
group_manager
.create_group(name)
.create_group(&app_handle, name)
.map_err(|e| format!("Failed to create group: {e}"))
}
#[tauri::command]
pub async fn update_profile_group(group_id: String, name: String) -> Result<ProfileGroup, String> {
pub async fn update_profile_group(
app_handle: tauri::AppHandle,
group_id: String,
name: String,
) -> Result<ProfileGroup, String> {
let group_manager = GROUP_MANAGER.lock().unwrap();
group_manager
.update_group(group_id, name)
.update_group(&app_handle, group_id, name)
.map_err(|e| format!("Failed to update group: {e}"))
}
#[tauri::command]
pub async fn delete_profile_group(group_id: String) -> Result<(), String> {
pub async fn delete_profile_group(
app_handle: tauri::AppHandle,
group_id: String,
) -> Result<(), String> {
let group_manager = GROUP_MANAGER.lock().unwrap();
group_manager
.delete_group(group_id)
.delete_group(&app_handle, group_id)
.map_err(|e| format!("Failed to delete group: {e}"))
}
#[tauri::command]
pub async fn assign_profiles_to_group(
profile_names: Vec<String>,
app_handle: tauri::AppHandle,
profile_ids: Vec<String>,
group_id: Option<String>,
) -> Result<(), String> {
let profile_manager = crate::profile::ProfileManager::instance();
profile_manager
.assign_profiles_to_group(profile_names, group_id)
.assign_profiles_to_group(&app_handle, profile_ids, group_id)
.map_err(|e| format!("Failed to assign profiles to group: {e}"))
}
#[tauri::command]
pub async fn delete_selected_profiles(profile_names: Vec<String>) -> Result<(), String> {
pub async fn delete_selected_profiles(
app_handle: tauri::AppHandle,
profile_ids: Vec<String>,
) -> Result<(), String> {
let profile_manager = crate::profile::ProfileManager::instance();
profile_manager
.delete_multiple_profiles(profile_names)
.delete_multiple_profiles(&app_handle, profile_ids)
.map_err(|e| format!("Failed to delete profiles: {e}"))
}
+307 -69
@@ -3,20 +3,22 @@ use std::env;
use std::sync::Mutex;
use tauri::{Emitter, Manager, Runtime, WebviewUrl, WebviewWindow, WebviewWindowBuilder};
use tauri_plugin_deep_link::DeepLinkExt;
use tauri_plugin_log::{Target, TargetKind};
// Store pending URLs that need to be handled when the window is ready
static PENDING_URLS: Mutex<Vec<String>> = Mutex::new(Vec::new());
mod api_client;
mod api_server;
mod app_auto_updater;
mod auto_updater;
mod browser;
mod browser_runner;
mod browser_version_manager;
mod camoufox;
mod camoufox_manager;
mod default_browser;
mod download;
mod downloaded_browsers;
mod downloaded_browsers_registry;
mod downloader;
mod extraction;
mod geoip_downloader;
mod group_manager;
@@ -24,36 +26,53 @@ mod platform_browser;
mod profile;
mod profile_importer;
mod proxy_manager;
pub mod proxy_runner;
pub mod proxy_server;
pub mod proxy_storage;
mod settings_manager;
pub mod traffic_stats;
// mod theme_detector; // removed: theme detection handled in webview via CSS prefers-color-scheme
mod tag_manager;
mod version_updater;
extern crate lazy_static;
use browser_runner::{
check_browser_exists, check_browser_status, check_missing_binaries, check_missing_geoip_database,
create_browser_profile_new, delete_profile, download_browser, ensure_all_binaries_exist,
fetch_browser_versions_cached_first, fetch_browser_versions_with_count,
fetch_browser_versions_with_count_cached_first, get_downloaded_browser_versions,
get_supported_browsers, is_browser_supported_on_platform, kill_browser_profile,
launch_browser_profile, list_browser_profiles, rename_profile, update_camoufox_config,
update_profile_proxy,
check_browser_exists, kill_browser_profile, launch_browser_profile, open_url_with_profile,
};
use profile::manager::{
check_browser_status, create_browser_profile_new, delete_profile, list_browser_profiles,
rename_profile, update_camoufox_config, update_profile_note, update_profile_proxy,
update_profile_tags,
};
use browser_version_manager::{
fetch_browser_versions_cached_first, fetch_browser_versions_with_count,
fetch_browser_versions_with_count_cached_first, get_supported_browsers,
is_browser_supported_on_platform,
};
use downloaded_browsers_registry::{
check_missing_binaries, ensure_all_binaries_exist, get_downloaded_browser_versions,
};
use downloader::download_browser;
use settings_manager::{
clear_all_version_cache_and_refetch, get_app_settings, get_table_sorting_settings,
save_app_settings, save_table_sorting_settings, should_show_settings_on_startup,
get_app_settings, get_table_sorting_settings, save_app_settings, save_table_sorting_settings,
should_show_settings_on_startup,
};
use default_browser::{is_default_browser, open_url_with_profile, set_as_default_browser};
use tag_manager::get_all_tags;
use default_browser::{is_default_browser, set_as_default_browser};
use version_updater::{
get_version_update_status, get_version_updater, trigger_manual_version_update,
clear_all_version_cache_and_refetch, get_version_update_status, get_version_updater,
trigger_manual_version_update,
};
use auto_updater::{
check_for_browser_updates, complete_browser_update_with_auto_update, dismiss_update_notification,
is_browser_disabled_for_update,
};
use app_auto_updater::{
@@ -62,17 +81,17 @@ use app_auto_updater::{
use profile_importer::{detect_existing_profiles, import_browser_profile};
// use theme_detector::get_system_theme;
use group_manager::{
assign_profiles_to_group, create_profile_group, delete_profile_group, delete_selected_profiles,
get_groups_with_profile_counts, get_profile_groups, update_profile_group,
};
use geoip_downloader::GeoIPDownloader;
use geoip_downloader::{check_missing_geoip_database, GeoIPDownloader};
use browser_version_manager::get_browser_release_types;
use api_server::{get_api_server_status, start_api_server, stop_api_server};
// Trait to extend WebviewWindow with transparent titlebar functionality
pub trait WindowExt {
#[cfg(target_os = "macos")]
@@ -136,7 +155,7 @@ async fn warm_up_nodecar(app: tauri::AppHandle) -> Result<(), String> {
match timeout(Duration::from_secs(120), exec_future).await {
Ok(Ok(_output)) => {
let duration = start_time.elapsed();
println!(
log::info!(
"Nodecar warm-up (frontend-triggered) completed in {:.2}s",
duration.as_secs_f64()
);
@@ -149,11 +168,11 @@ async fn warm_up_nodecar(app: tauri::AppHandle) -> Result<(), String> {
#[tauri::command]
async fn handle_url_open(app: tauri::AppHandle, url: String) -> Result<(), String> {
println!("handle_url_open called with URL: {url}");
log::info!("handle_url_open called with URL: {url}");
// Check if the main window exists and is ready
if let Some(window) = app.get_webview_window("main") {
println!("Main window exists");
log::debug!("Main window exists");
// Try to show and focus the window first
let _ = window.show();
@@ -165,7 +184,7 @@ async fn handle_url_open(app: tauri::AppHandle, url: String) -> Result<(), Strin
.map_err(|e| format!("Failed to emit URL open event: {e}"))?;
} else {
// Window doesn't exist yet - add to pending URLs
println!("Main window doesn't exist, adding URL to pending list");
log::debug!("Main window doesn't exist, adding URL to pending list");
let mut pending = PENDING_URLS.lock().unwrap();
pending.push(url);
}
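The `PENDING_URLS` buffering in `handle_url_open` above can be sketched without Tauri: deep-link URLs arriving before the main window exists are queued in a static `Mutex<Vec<String>>` and drained once the window is ready. A hypothetical minimal version, with a `Vec` standing in for emitting to the webview:

```rust
use std::sync::Mutex;

// Queue for URLs that arrive before the window is ready (const-initialized
// static Mutex, as in the real code).
static PENDING_URLS: Mutex<Vec<String>> = Mutex::new(Vec::new());

fn handle_url(window_ready: bool, url: String, delivered: &mut Vec<String>) {
    if window_ready {
        delivered.push(url); // the real app emits an event to the webview here
    } else {
        PENDING_URLS.lock().unwrap().push(url); // buffer until the window exists
    }
}

fn drain_pending(delivered: &mut Vec<String>) {
    let mut pending = PENDING_URLS.lock().unwrap();
    delivered.append(&mut pending); // flush the backlog in arrival order
}

fn main() {
    let mut delivered = Vec::new();
    handle_url(false, "https://a.example".to_string(), &mut delivered);
    assert!(delivered.is_empty()); // buffered, not delivered
    drain_pending(&mut delivered);
    assert_eq!(delivered, vec!["https://a.example".to_string()]);
    println!("ok");
}
```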
@@ -175,11 +194,12 @@ async fn handle_url_open(app: tauri::AppHandle, url: String) -> Result<(), Strin
#[tauri::command]
async fn create_stored_proxy(
app_handle: tauri::AppHandle,
name: String,
proxy_settings: crate::browser::ProxySettings,
) -> Result<crate::proxy_manager::StoredProxy, String> {
crate::proxy_manager::PROXY_MANAGER
.create_stored_proxy(name, proxy_settings)
.create_stored_proxy(&app_handle, name, proxy_settings)
.map_err(|e| format!("Failed to create stored proxy: {e}"))
}
@@ -190,27 +210,70 @@ async fn get_stored_proxies() -> Result<Vec<crate::proxy_manager::StoredProxy>,
#[tauri::command]
async fn update_stored_proxy(
app_handle: tauri::AppHandle,
proxy_id: String,
name: Option<String>,
proxy_settings: Option<crate::browser::ProxySettings>,
) -> Result<crate::proxy_manager::StoredProxy, String> {
crate::proxy_manager::PROXY_MANAGER
.update_stored_proxy(&proxy_id, name, proxy_settings)
.update_stored_proxy(&app_handle, &proxy_id, name, proxy_settings)
.map_err(|e| format!("Failed to update stored proxy: {e}"))
}
#[tauri::command]
async fn delete_stored_proxy(proxy_id: String) -> Result<(), String> {
async fn delete_stored_proxy(app_handle: tauri::AppHandle, proxy_id: String) -> Result<(), String> {
crate::proxy_manager::PROXY_MANAGER
.delete_stored_proxy(&proxy_id)
.delete_stored_proxy(&app_handle, &proxy_id)
.map_err(|e| format!("Failed to delete stored proxy: {e}"))
}
#[tauri::command]
async fn check_proxy_validity(
proxy_id: String,
proxy_settings: crate::browser::ProxySettings,
) -> Result<crate::proxy_manager::ProxyCheckResult, String> {
crate::proxy_manager::PROXY_MANAGER
.check_proxy_validity(&proxy_id, &proxy_settings)
.await
}
#[tauri::command]
fn get_cached_proxy_check(proxy_id: String) -> Option<crate::proxy_manager::ProxyCheckResult> {
crate::proxy_manager::PROXY_MANAGER.get_cached_proxy_check(&proxy_id)
}
#[tauri::command]
async fn is_geoip_database_available() -> Result<bool, String> {
Ok(GeoIPDownloader::is_geoip_database_available())
}
#[tauri::command]
async fn get_all_traffic_snapshots() -> Result<Vec<crate::traffic_stats::TrafficSnapshot>, String> {
Ok(
crate::traffic_stats::list_traffic_stats()
.into_iter()
.map(|s| s.to_snapshot())
.collect(),
)
}
#[tauri::command]
async fn clear_all_traffic_stats() -> Result<(), String> {
crate::traffic_stats::clear_all_traffic_stats()
.map_err(|e| format!("Failed to clear traffic stats: {e}"))
}
#[tauri::command]
async fn get_traffic_stats_for_period(
profile_id: String,
seconds: u64,
) -> Result<Option<crate::traffic_stats::FilteredTrafficStats>, String> {
Ok(crate::traffic_stats::get_traffic_stats_for_period(
&profile_id,
seconds,
))
}
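A period filter like the one `get_traffic_stats_for_period` exposes above amounts to: keep only samples whose timestamp falls within the last `seconds`, then aggregate. A hypothetical sketch under that assumption (the real `traffic_stats` types are not shown in this diff):

```rust
// Hypothetical per-profile traffic sample.
struct Sample {
    ts: u64,    // unix seconds
    bytes: u64, // bytes transferred at this sample
}

// Sum bytes for samples within the trailing window [now - seconds, now].
fn bytes_in_period(samples: &[Sample], now: u64, seconds: u64) -> u64 {
    samples
        .iter()
        .filter(|s| now.saturating_sub(s.ts) <= seconds)
        .map(|s| s.bytes)
        .sum()
}

fn main() {
    let samples = [
        Sample { ts: 100, bytes: 10 },
        Sample { ts: 190, bytes: 20 },
        Sample { ts: 200, bytes: 30 },
    ];
    assert_eq!(bytes_in_period(&samples, 200, 15), 50); // only ts 190 and 200
    assert_eq!(bytes_in_period(&samples, 200, 1000), 60);
    println!("ok");
}
```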
#[tauri::command]
async fn download_geoip_database(app_handle: tauri::AppHandle) -> Result<(), String> {
let downloader = GeoIPDownloader::instance();
@@ -226,14 +289,49 @@ pub fn run() {
let startup_url = args.iter().find(|arg| arg.starts_with("http")).cloned();
if let Some(url) = startup_url.clone() {
println!("Found startup URL in command line: {url}");
log::info!("Found startup URL in command line: {url}");
let mut pending = PENDING_URLS.lock().unwrap();
pending.push(url.clone());
}
// Configure logging plugin with separate logs for dev and production
let log_file_name = if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
};
tauri::Builder::default()
.plugin(
tauri_plugin_log::Builder::new()
.clear_targets() // Clear default targets to avoid duplicates
.target(Target::new(TargetKind::Stdout))
.target(Target::new(TargetKind::Webview))
.target(Target::new(TargetKind::LogDir {
file_name: Some(log_file_name.to_string()),
}))
.max_file_size(100_000) // 100KB
.level(log::LevelFilter::Info)
.format(|out, message, record| {
use chrono::Local;
let now = Local::now();
let timestamp = format!(
"{}.{:03}",
now.format("%Y-%m-%d %H:%M:%S"),
now.timestamp_subsec_millis()
);
out.finish(format_args!(
"[{}][{}][{}] {}",
timestamp,
record.target(),
record.level(),
message
))
})
.build(),
)
.plugin(tauri_plugin_single_instance::init(|_, args, _cwd| {
println!("Single instance triggered with args: {args:?}");
log::info!("Single instance triggered with args: {args:?}");
}))
.plugin(tauri_plugin_deep_link::init())
.plugin(tauri_plugin_fs::init())
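The format closure added to the log plugin above produces `[timestamp][target][level] message` lines with millisecond precision. A hypothetical dependency-free sketch of that layout (chrono replaced by a preformatted timestamp string):

```rust
// Hypothetical sketch of the log-line layout the format closure emits.
fn format_log_line(timestamp: &str, target: &str, level: &str, message: &str) -> String {
    format!("[{timestamp}][{target}][{level}] {message}")
}

fn main() {
    let line = format_log_line("2025-11-30 17:36:31.042", "donutbrowser", "INFO", "startup");
    assert_eq!(line, "[2025-11-30 17:36:31.042][donutbrowser][INFO] startup");
    println!("{line}");
}
```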
@@ -246,7 +344,7 @@ pub fn run() {
#[allow(unused_variables)]
let win_builder = WebviewWindowBuilder::new(app, "main", WebviewUrl::default())
.title("Donut Browser")
.inner_size(900.0, 600.0)
.inner_size(800.0, 500.0)
.resizable(false)
.fullscreen(false)
.center()
@@ -260,18 +358,18 @@ pub fn run() {
#[cfg(target_os = "macos")]
{
if let Err(e) = window.set_transparent_titlebar(true) {
eprintln!("Failed to set transparent titlebar: {e}");
log::warn!("Failed to set transparent titlebar: {e}");
}
}
// Set up deep link handler
let handle = app.handle().clone();
#[cfg(any(windows, target_os = "linux"))]
#[cfg(windows)]
{
// For Windows and Linux, register all deep links at runtime for development
// For Windows, register all deep links at runtime
if let Err(e) = app.deep_link().register_all() {
eprintln!("Failed to register deep links: {e}");
log::warn!("Failed to register deep links: {e}");
}
}
@@ -279,7 +377,7 @@ pub fn run() {
{
// On macOS, try to register deep links for development builds
if let Err(e) = app.deep_link().register_all() {
eprintln!(
log::debug!(
"Note: Deep link registration failed on macOS (this is normal for production): {e}"
);
}
@@ -289,11 +387,11 @@ pub fn run() {
let handle = handle.clone();
move |event| {
let urls = event.urls();
println!("Deep link event received with {} URLs", urls.len());
log::info!("Deep link event received with {} URLs", urls.len());
for url in urls {
let url_string = url.to_string();
println!("Deep link received: {url_string}");
log::info!("Deep link received: {url_string}");
// Clone the handle for each async task
let handle_clone = handle.clone();
@@ -301,7 +399,7 @@ pub fn run() {
// Handle the URL asynchronously
tauri::async_runtime::spawn(async move {
if let Err(e) = handle_url_open(handle_clone, url_string.clone()).await {
eprintln!("Failed to handle deep link URL: {e}");
log::error!("Failed to handle deep link URL: {e}");
}
});
}
@@ -311,9 +409,9 @@ pub fn run() {
if let Some(startup_url) = startup_url {
let handle_clone = handle.clone();
tauri::async_runtime::spawn(async move {
println!("Processing startup URL from command line: {startup_url}");
log::info!("Processing startup URL from command line: {startup_url}");
if let Err(e) = handle_url_open(handle_clone, startup_url.clone()).await {
eprintln!("Failed to handle startup URL: {e}");
log::error!("Failed to handle startup URL: {e}");
}
});
}
@@ -333,7 +431,7 @@ pub fn run() {
{
let updater_guard = version_updater.lock().await;
if let Err(e) = updater_guard.start_background_updates().await {
eprintln!("Failed to start background updates: {e}");
log::error!("Failed to start background updates: {e}");
}
}
});
@@ -364,9 +462,9 @@ pub fn run() {
};
for url in pending_urls {
println!("Processing pending URL: {url}");
log::info!("Processing pending URL: {url}");
if let Err(e) = handle_url_open(handle_pending.clone(), url).await {
eprintln!("Failed to handle pending URL: {e}");
log::error!("Failed to handle pending URL: {e}");
}
}
});
@@ -378,37 +476,39 @@ pub fn run() {
loop {
interval.tick().await;
let browser_runner = crate::browser_runner::BrowserRunner::instance();
if let Err(e) = browser_runner.cleanup_unused_binaries_internal() {
eprintln!("Periodic cleanup failed: {e}");
let registry =
crate::downloaded_browsers_registry::DownloadedBrowsersRegistry::instance();
if let Err(e) = registry.cleanup_unused_binaries() {
log::error!("Periodic cleanup failed: {e}");
} else {
println!("Periodic cleanup completed successfully");
log::debug!("Periodic cleanup completed successfully");
}
}
});
let app_handle_update = app.handle().clone();
tauri::async_runtime::spawn(async move {
println!("Starting app update check at startup...");
log::info!("Starting app update check at startup...");
let updater = app_auto_updater::AppAutoUpdater::instance();
match updater.check_for_updates().await {
Ok(Some(update_info)) => {
println!(
log::info!(
"App update available: {} -> {}",
update_info.current_version, update_info.new_version
update_info.current_version,
update_info.new_version
);
// Emit update available event to the frontend
if let Err(e) = app_handle_update.emit("app-update-available", &update_info) {
eprintln!("Failed to emit app update event: {e}");
log::error!("Failed to emit app update event: {e}");
} else {
println!("App update event emitted successfully");
log::debug!("App update event emitted successfully");
}
}
Ok(None) => {
println!("No app updates available");
log::debug!("No app updates available");
}
Err(e) => {
eprintln!("Failed to check for app updates: {e}");
log::error!("Failed to check for app updates: {e}");
}
}
});
@@ -416,18 +516,18 @@ pub fn run() {
// Start Camoufox cleanup task
let _app_handle_cleanup = app.handle().clone();
tauri::async_runtime::spawn(async move {
let launcher = crate::camoufox::CamoufoxNodecarLauncher::instance();
let camoufox_manager = crate::camoufox_manager::CamoufoxManager::instance();
let mut interval = tokio::time::interval(tokio::time::Duration::from_secs(5));
loop {
interval.tick().await;
match launcher.cleanup_dead_instances().await {
Ok(_dead_instances) => {
match camoufox_manager.cleanup_dead_instances().await {
Ok(_) => {
// Cleanup completed silently
}
Err(e) => {
eprintln!("Error during Camoufox cleanup: {e}");
log::error!("Error during Camoufox cleanup: {e}");
}
}
}
@@ -439,25 +539,27 @@ pub fn run() {
// Wait a bit for the app to fully initialize
tokio::time::sleep(tokio::time::Duration::from_secs(2)).await;
let browser_runner = crate::browser_runner::BrowserRunner::instance();
match browser_runner.check_missing_geoip_database() {
let geoip_downloader = crate::geoip_downloader::GeoIPDownloader::instance();
match geoip_downloader.check_missing_geoip_database() {
Ok(true) => {
- println!("GeoIP database is missing for Camoufox profiles, downloading at startup...");
+ log::info!(
+ "GeoIP database is missing for Camoufox profiles, downloading at startup..."
+ );
- let geoip_downloader = GeoIPDownloader::instance();
if let Err(e) = geoip_downloader
.download_geoip_database(&app_handle_geoip)
.await
{
- eprintln!("Failed to download GeoIP database at startup: {e}");
+ log::error!("Failed to download GeoIP database at startup: {e}");
} else {
- println!("GeoIP database downloaded successfully at startup");
+ log::info!("GeoIP database downloaded successfully at startup");
}
}
Ok(false) => {
// No Camoufox profiles or GeoIP database already available
}
Err(e) => {
- eprintln!("Failed to check GeoIP database status at startup: {e}");
+ log::error!("Failed to check GeoIP database status at startup: {e}");
}
}
});
@@ -476,14 +578,92 @@ pub fn run() {
{
Ok(dead_pids) => {
if !dead_pids.is_empty() {
- println!(
+ log::info!(
"Cleaned up proxies for {} dead browser processes",
dead_pids.len()
);
}
}
Err(e) => {
- eprintln!("Error during proxy cleanup: {e}");
+ log::error!("Error during proxy cleanup: {e}");
}
}
}
});
// Periodically broadcast browser running status to the frontend
let app_handle_status = app.handle().clone();
tauri::async_runtime::spawn(async move {
let mut interval = tokio::time::interval(tokio::time::Duration::from_millis(500));
let mut last_running_states: std::collections::HashMap<String, bool> =
std::collections::HashMap::new();
loop {
interval.tick().await;
let runner = crate::browser_runner::BrowserRunner::instance();
// If listing profiles fails, skip this tick
let profiles = match runner.profile_manager.list_profiles() {
Ok(p) => p,
Err(e) => {
log::warn!("Failed to list profiles in status checker: {e}");
continue;
}
};
for profile in profiles {
// Check browser status and track changes
match runner
.check_browser_status(app_handle_status.clone(), &profile)
.await
{
Ok(is_running) => {
let profile_id = profile.id.to_string();
let last_state = last_running_states
.get(&profile_id)
.copied()
.unwrap_or(false);
// Only emit event if state actually changed
if last_state != is_running {
log::debug!(
"Status checker detected change for profile {}: {} -> {}",
profile.name,
last_state,
is_running
);
#[derive(serde::Serialize)]
struct RunningChangedPayload {
id: String,
is_running: bool,
}
let payload = RunningChangedPayload {
id: profile_id.clone(),
is_running,
};
if let Err(e) = app_handle_status.emit("profile-running-changed", &payload) {
log::warn!("Failed to emit profile running changed event: {e}");
} else {
log::debug!(
"Status checker emitted profile-running-changed event for {}: running={}",
profile.name,
is_running
);
}
last_running_states.insert(profile_id, is_running);
} else {
// Update the state even if unchanged to ensure we have it tracked
last_running_states.insert(profile_id, is_running);
}
}
Err(e) => {
log::warn!("Status check failed for profile {}: {}", profile.name, e);
continue;
}
}
}
}
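The status-broadcast loop above emits `profile-running-changed` only when a profile's state differs from the last observed value, and updates the tracked state either way. A minimal sketch of that change-detection pattern (the `detect_changes` helper and the tuple representation are illustrative, not part of the codebase):

```rust
use std::collections::HashMap;

// Return only the (id, is_running) pairs whose state changed since the
// last tick; unseen profiles default to "not running", and the map is
// updated even for unchanged entries so the state stays tracked.
fn detect_changes(
    last: &mut HashMap<String, bool>,
    current: &[(String, bool)],
) -> Vec<(String, bool)> {
    let mut changed = Vec::new();
    for (id, is_running) in current {
        let prev = last.get(id).copied().unwrap_or(false);
        if prev != *is_running {
            changed.push((id.clone(), *is_running));
        }
        last.insert(id.clone(), *is_running);
    }
    changed
}
```

This keeps event traffic to the frontend proportional to actual state transitions rather than the 500 ms polling rate.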
@@ -491,6 +671,55 @@ pub fn run() {
// Nodecar warm-up is now triggered from the frontend to allow UI blocking overlay
// Start API server if enabled in settings
let app_handle_api = app.handle().clone();
tauri::async_runtime::spawn(async move {
match crate::settings_manager::get_app_settings(app_handle_api.clone()).await {
Ok(settings) => {
if settings.api_enabled {
log::info!("API is enabled in settings, starting API server...");
match crate::api_server::start_api_server_internal(settings.api_port, &app_handle_api)
.await
{
Ok(port) => {
log::info!("API server started successfully on port {port}");
// Emit success toast to frontend
if let Err(e) = app_handle_api.emit(
"show-toast",
crate::api_server::ToastPayload {
message: "API server started successfully".to_string(),
variant: "success".to_string(),
title: "Local API Started".to_string(),
description: Some(format!("API server running on port {port}")),
},
) {
log::error!("Failed to emit API start toast: {e}");
}
}
Err(e) => {
log::error!("Failed to start API server at startup: {e}");
// Emit error toast to frontend
if let Err(toast_err) = app_handle_api.emit(
"show-toast",
crate::api_server::ToastPayload {
message: "Failed to start API server".to_string(),
variant: "error".to_string(),
title: "Failed to Start Local API".to_string(),
description: Some(format!("Error: {e}")),
},
) {
log::error!("Failed to emit API error toast: {toast_err}");
}
}
}
}
}
Err(e) => {
log::error!("Failed to load app settings for API startup: {e}");
}
}
});
Ok(())
})
.invoke_handler(tauri::generate_handler![
@@ -506,8 +735,11 @@ pub fn run() {
fetch_browser_versions_cached_first,
fetch_browser_versions_with_count_cached_first,
get_downloaded_browser_versions,
get_all_tags,
get_browser_release_types,
update_profile_proxy,
update_profile_tags,
update_profile_note,
check_browser_status,
kill_browser_profile,
rename_profile,
@@ -523,13 +755,11 @@ pub fn run() {
trigger_manual_version_update,
get_version_update_status,
check_for_browser_updates,
is_browser_disabled_for_update,
dismiss_update_notification,
complete_browser_update_with_auto_update,
check_for_app_updates,
check_for_app_updates_manual,
download_and_install_app_update,
// get_system_theme, // removed
detect_existing_profiles,
import_browser_profile,
check_missing_binaries,
@@ -539,6 +769,8 @@ pub fn run() {
get_stored_proxies,
update_stored_proxy,
delete_stored_proxy,
check_proxy_validity,
get_cached_proxy_check,
update_camoufox_config,
get_profile_groups,
get_groups_with_profile_counts,
@@ -550,6 +782,12 @@ pub fn run() {
is_geoip_database_available,
download_geoip_database,
warm_up_nodecar,
start_api_server,
stop_api_server,
get_api_server_status,
get_all_traffic_snapshots,
clear_all_traffic_stats,
get_traffic_stats_for_period
])
.run(tauri::generate_context!())
.expect("error while running tauri application");
+1 -1
@@ -2,5 +2,5 @@
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
fn main() {
- donutbrowser::run()
+ donutbrowser_lib::run()
}
+203 -104
@@ -8,6 +8,7 @@ use std::process::Command;
#[cfg(target_os = "macos")]
pub mod macos {
use super::*;
use sysinfo::{Pid, System};
pub fn is_tor_or_mullvad_browser(exe_name: &str, cmd: &[OsString], browser_type: &str) -> bool {
match browser_type {
@@ -47,7 +48,7 @@ pub mod macos {
executable_path: &std::path::Path,
args: &[String],
) -> Result<std::process::Child, Box<dyn std::error::Error + Send + Sync>> {
- println!("Launching browser on macOS: {executable_path:?} with args: {args:?}");
+ log::info!("Launching browser on macOS: {executable_path:?} with args: {args:?}");
// If the executable is inside an app bundle, launch via Launch Services so
// macOS recognizes the real application for privacy permissions (e.g. Screen Recording).
// This ensures TCC prompts are attributed to the browser app, not our launcher.
@@ -93,7 +94,7 @@ pub mod macos {
let profile_data_path = profile.get_profile_data_path(profiles_dir);
// First try: Use Firefox remote command
- println!("Trying Firefox remote command for PID: {pid}");
+ log::info!("Trying Firefox remote command for PID: {pid}");
let browser = create_browser(browser_type);
if let Ok(executable_path) = browser.get_executable_path(browser_dir) {
let remote_args = vec![
@@ -107,17 +108,17 @@ pub mod macos {
match remote_output {
Ok(output) if output.status.success() => {
- println!("Firefox remote command succeeded");
+ log::info!("Firefox remote command succeeded");
return Ok(());
}
Ok(output) => {
let stderr = String::from_utf8_lossy(&output.stderr);
- println!(
+ log::info!(
"Firefox remote command failed with stderr: {stderr}, trying AppleScript fallback"
);
}
Err(e) => {
- println!("Firefox remote command error: {e}, trying AppleScript fallback");
+ log::info!("Firefox remote command error: {e}, trying AppleScript fallback");
}
}
}
@@ -195,12 +196,12 @@ end try
"#
);
- println!("Executing AppleScript fallback for Firefox-based browser (PID: {pid})...");
+ log::info!("Executing AppleScript fallback for Firefox-based browser (PID: {pid})...");
let output = Command::new("osascript").args(["-e", &script]).output()?;
if !output.status.success() {
let error_msg = String::from_utf8_lossy(&output.stderr);
- println!("AppleScript failed: {error_msg}");
+ log::info!("AppleScript failed: {error_msg}");
return Err(
format!(
"Both Firefox remote command and AppleScript failed. AppleScript error: {error_msg}"
@@ -208,12 +209,172 @@ end try
.into(),
);
} else {
- println!("AppleScript succeeded");
+ log::info!("AppleScript succeeded");
}
Ok(())
}
pub async fn kill_browser_process_impl(
pid: u32,
profile_data_path: Option<&str>,
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
log::info!("Attempting to kill browser process with PID: {pid}");
let mut pids_to_kill = vec![pid];
let descendants = get_all_descendant_pids(pid).await;
pids_to_kill.extend(descendants);
if let Some(profile_path) = profile_data_path {
let additional_pids = find_processes_by_profile_path(profile_path).await;
for p in additional_pids {
if !pids_to_kill.contains(&p) {
log::info!("Found additional process {} using profile path", p);
pids_to_kill.push(p);
}
}
}
log::info!("Total processes to kill: {:?}", pids_to_kill);
for &p in &pids_to_kill {
log::info!("Sending SIGKILL to PID: {p}");
let _ = Command::new("kill")
.args(["-KILL", &p.to_string()])
.output();
}
let pid_str = pid.to_string();
let _ = Command::new("pkill")
.args(["-KILL", "-P", &pid_str])
.output();
let _ = Command::new("pkill")
.args(["-KILL", "-g", &pid_str])
.output();
for &p in &pids_to_kill {
let system = System::new_all();
if system.process(Pid::from(p as usize)).is_some() {
log::info!("Process {p} still running, retrying kill");
let _ = Command::new("kill")
.args(["-KILL", &p.to_string()])
.output();
}
}
tokio::time::sleep(tokio::time::Duration::from_millis(500)).await;
let system = System::new_all();
let mut still_running = Vec::new();
for &p in &pids_to_kill {
if system.process(Pid::from(p as usize)).is_some() {
still_running.push(p);
}
}
if !still_running.is_empty() {
log::info!(
"Processes {:?} still running, trying final termination",
still_running
);
for p in &still_running {
let _ = Command::new("/bin/kill")
.args(["-KILL", &p.to_string()])
.output();
}
tokio::time::sleep(tokio::time::Duration::from_millis(1000)).await;
let system = System::new_all();
let mut final_still_running = Vec::new();
for &p in &pids_to_kill {
if system.process(Pid::from(p as usize)).is_some() {
final_still_running.push(p);
}
}
if !final_still_running.is_empty() {
log::error!(
"ERROR: Processes {:?} could not be terminated despite aggressive attempts",
final_still_running
);
return Err(
format!(
"Failed to terminate browser processes {:?} - still running",
final_still_running
)
.into(),
);
}
}
log::info!("Browser termination completed for PID: {pid}");
Ok(())
}
async fn find_processes_by_profile_path(profile_path: &str) -> Vec<u32> {
use sysinfo::System;
let mut pids = Vec::new();
let system = System::new_all();
for (pid, process) in system.processes() {
let cmd = process.cmd();
if cmd.is_empty() {
continue;
}
// Check if any command line argument contains the profile path
let has_profile = cmd.iter().any(|arg| {
if let Some(arg_str) = arg.to_str() {
arg_str.contains(profile_path)
} else {
false
}
});
if has_profile {
pids.push(pid.as_u32());
}
}
pids
}
// Recursively find all descendant processes
async fn get_all_descendant_pids(parent_pid: u32) -> Vec<u32> {
use sysinfo::System;
let system = System::new_all();
let mut descendants = Vec::new();
let mut to_check = vec![parent_pid];
let mut checked = std::collections::HashSet::new();
while let Some(current_pid) = to_check.pop() {
if checked.contains(&current_pid) {
continue;
}
checked.insert(current_pid);
// Find direct children of current_pid
for (pid, process) in system.processes() {
let pid_u32 = pid.as_u32();
if let Some(parent) = process.parent() {
if parent.as_u32() == current_pid && !checked.contains(&pid_u32) {
descendants.push(pid_u32);
to_check.push(pid_u32);
}
}
}
}
descendants
}
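`get_all_descendant_pids` above walks the process table from the parent PID, collecting transitive children while guarding against cycles with a visited set. The same traversal over a plain child-to-parent map (the `descendants` helper below is a hypothetical reduction so the logic can be exercised without `sysinfo`):

```rust
use std::collections::{HashMap, HashSet};

// Breadth-first collection of all transitive children of `root`,
// given a map of child PID -> parent PID.
fn descendants(parent_of: &HashMap<u32, u32>, root: u32) -> Vec<u32> {
    let mut out = Vec::new();
    let mut to_check = vec![root];
    let mut checked = HashSet::new();
    while let Some(cur) = to_check.pop() {
        if !checked.insert(cur) {
            continue; // already visited; avoids loops in malformed data
        }
        for (&pid, &parent) in parent_of {
            if parent == cur && !checked.contains(&pid) {
                out.push(pid);
                to_check.push(pid);
            }
        }
    }
    out
}
```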
pub async fn open_url_in_existing_browser_tor_mullvad(
profile: &BrowserProfile,
url: &str,
@@ -223,10 +384,10 @@ end try
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let pid = profile.process_id.unwrap();
- println!("Opening URL in TOR/Mullvad browser using file-based approach (PID: {pid})");
+ log::info!("Opening URL in TOR/Mullvad browser using file-based approach (PID: {pid})");
// Method 1: Try using a temporary HTML file approach
- println!("Attempting file-based URL opening for TOR/Mullvad browser");
+ log::info!("Attempting file-based URL opening for TOR/Mullvad browser");
let temp_dir = std::env::temp_dir();
let temp_file_name = format!("donut_browser_url_{}.html", std::process::id());
@@ -251,7 +412,7 @@ end try
match std::fs::write(&temp_file_path, html_content) {
Ok(()) => {
- println!("Created temporary HTML file: {temp_file_path:?}");
+ log::info!("Created temporary HTML file: {temp_file_path:?}");
let browser = create_browser(browser_type.clone());
if let Ok(executable_path) = browser.get_executable_path(browser_dir) {
@@ -272,15 +433,15 @@ end try
match open_result {
Ok(output) if output.status.success() => {
- println!("Successfully opened URL using file-based approach");
+ log::info!("Successfully opened URL using file-based approach");
return Ok(());
}
Ok(output) => {
let stderr = String::from_utf8_lossy(&output.stderr);
- println!("File-based approach failed: {stderr}");
+ log::info!("File-based approach failed: {stderr}");
}
Err(e) => {
- println!("File-based approach error: {e}");
+ log::info!("File-based approach error: {e}");
}
}
}
@@ -288,12 +449,12 @@ end try
let _ = std::fs::remove_file(&temp_file_path);
}
Err(e) => {
- println!("Failed to create temporary HTML file: {e}");
+ log::info!("Failed to create temporary HTML file: {e}");
}
}
// Method 2: Try using the 'open' command directly with the URL
- println!("Attempting direct URL opening with 'open' command");
+ log::info!("Attempting direct URL opening with 'open' command");
let browser = create_browser(browser_type.clone());
if let Ok(executable_path) = browser.get_executable_path(browser_dir) {
@@ -303,15 +464,15 @@ end try
match direct_open_result {
Ok(output) if output.status.success() => {
- println!("Successfully opened URL using direct 'open' command");
+ log::info!("Successfully opened URL using direct 'open' command");
return Ok(());
}
Ok(output) => {
let stderr = String::from_utf8_lossy(&output.stderr);
- println!("Direct 'open' command failed: {stderr}");
+ log::info!("Direct 'open' command failed: {stderr}");
}
Err(e) => {
- println!("Direct 'open' command error: {e}");
+ log::info!("Direct 'open' command error: {e}");
}
}
}
@@ -340,7 +501,7 @@ end try
let pid = profile.process_id.unwrap();
// First, try using the browser's built-in URL opening capability
- println!("Trying Chromium URL opening for PID: {pid}");
+ log::info!("Trying Chromium URL opening for PID: {pid}");
let browser = create_browser(browser_type);
if let Ok(executable_path) = browser.get_executable_path(browser_dir) {
@@ -354,15 +515,15 @@ end try
match remote_output {
Ok(output) if output.status.success() => {
- println!("Chromium URL opening succeeded");
+ log::info!("Chromium URL opening succeeded");
return Ok(());
}
Ok(output) => {
let stderr = String::from_utf8_lossy(&output.stderr);
- println!("Chromium URL opening failed: {stderr}, trying AppleScript");
+ log::info!("Chromium URL opening failed: {stderr}, trying AppleScript");
}
Err(e) => {
- println!("Chromium URL opening error: {e}, trying AppleScript");
+ log::info!("Chromium URL opening error: {e}, trying AppleScript");
}
}
}
@@ -440,54 +601,21 @@ end try
"#
);
- println!("Executing AppleScript for Chromium-based browser (PID: {pid})...");
+ log::info!("Executing AppleScript for Chromium-based browser (PID: {pid})...");
let output = Command::new("osascript").args(["-e", &script]).output()?;
if !output.status.success() {
let error_msg = String::from_utf8_lossy(&output.stderr);
- println!("AppleScript failed: {error_msg}");
+ log::info!("AppleScript failed: {error_msg}");
return Err(
format!("Failed to open URL in existing Chromium-based browser: {error_msg}").into(),
);
} else {
- println!("AppleScript succeeded");
+ log::info!("AppleScript succeeded");
}
Ok(())
}
- pub async fn kill_browser_process_impl(
- pid: u32,
- ) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
- println!("Attempting to kill browser process with PID: {pid}");
- // First try SIGTERM (graceful shutdown)
- let output = Command::new("kill")
- .args(["-TERM", &pid.to_string()])
- .output()
- .map_err(|e| format!("Failed to execute kill command: {e}"))?;
- if !output.status.success() {
- // If SIGTERM fails, try SIGKILL (force kill)
- let output = Command::new("kill")
- .args(["-KILL", &pid.to_string()])
- .output()?;
- if !output.status.success() {
- return Err(
- format!(
- "Failed to kill process {}: {}",
- pid,
- String::from_utf8_lossy(&output.stderr)
- )
- .into(),
- );
- }
- }
- println!("Successfully killed browser process with PID: {pid}");
- Ok(())
- }
}
#[cfg(target_os = "windows")]
@@ -534,9 +662,10 @@ pub mod windows {
executable_path: &std::path::Path,
args: &[String],
) -> Result<std::process::Child, Box<dyn std::error::Error + Send + Sync>> {
- println!(
+ log::info!(
"Launching browser on Windows: {:?} with args: {:?}",
- executable_path, args
+ executable_path,
+ args
);
// Check if the executable exists
@@ -575,7 +704,7 @@ pub mod windows {
.spawn()
.map_err(|e| format!("Failed to launch browser process: {}", e))?;
- println!(
+ log::info!(
"Successfully launched browser process with PID: {}",
child.id()
);
@@ -727,41 +856,10 @@ pub mod windows {
cmd.current_dir(parent_dir);
}
- let output = cmd.output()?;
- if !output.status.success() {
- // Try fallback without --new-window
- let mut fallback_cmd = Command::new(&executable_path);
- fallback_cmd.args([
- &format!(
- "--user-data-dir={}",
- profile
- .get_profile_data_path(profiles_dir)
- .to_string_lossy()
- ),
- url,
- ]);
- if let Some(parent_dir) = browser_dir
- .parent()
- .or_else(|| browser_dir.ancestors().nth(1))
- {
- fallback_cmd.current_dir(parent_dir);
- }
- let fallback_output = fallback_cmd.output()?;
- if !fallback_output.status.success() {
- return Err(
- format!(
- "Failed to open URL in existing Chromium-based browser: {}",
- String::from_utf8_lossy(&fallback_output.stderr)
- )
- .into(),
- );
- }
- }
+ // Do not call output() to avoid blocking the UI thread while the browser processes the request.
+ // Spawn the helper process and return immediately. This applies to Chromium-based browsers
+ // including Brave to prevent UI freezes observed in production.
+ let _child = cmd.spawn()?;
Ok(())
}
@@ -773,7 +871,7 @@ pub mod windows {
let system = System::new_all();
if let Some(process) = system.process(Pid::from(pid as usize)) {
if process.kill() {
- println!("Successfully killed browser process with PID: {pid}");
+ log::info!("Successfully killed browser process with PID: {pid}");
return Ok(());
}
}
@@ -789,7 +887,7 @@ pub mod windows {
match output {
Ok(result) => {
if result.status.success() {
- println!("Successfully killed browser process with PID: {pid} using taskkill");
+ log::info!("Successfully killed browser process with PID: {pid} using taskkill");
Ok(())
} else {
Err(
@@ -824,9 +922,10 @@ pub mod linux {
executable_path: &std::path::Path,
args: &[String],
) -> Result<std::process::Child, Box<dyn std::error::Error + Send + Sync>> {
- println!(
+ log::info!(
"Launching browser on Linux: {:?} with args: {:?}",
- executable_path, args
+ executable_path,
+ args
);
// Check if the executable exists and is executable
@@ -890,7 +989,7 @@ pub mod linux {
// Set the combined LD_LIBRARY_PATH
if !ld_library_path.is_empty() {
cmd.env("LD_LIBRARY_PATH", ld_library_path.join(":"));
- println!("Set LD_LIBRARY_PATH to: {}", ld_library_path.join(":"));
+ log::info!("Set LD_LIBRARY_PATH to: {}", ld_library_path.join(":"));
}
}
@@ -907,7 +1006,7 @@ pub mod linux {
// Disable GPU acceleration if running in headless environments
if std::env::var("DISPLAY").is_err() || std::env::var("WAYLAND_DISPLAY").is_err() {
- println!("No display detected, browser may fail to start");
+ log::info!("No display detected, browser may fail to start");
}
// Attempt to spawn with better error handling for architecture issues
@@ -1031,7 +1130,7 @@ pub mod linux {
return Err(format!("Process {} not found", pid).into());
}
- println!("Successfully killed browser process with PID: {pid}");
+ log::info!("Successfully killed browser process with PID: {pid}");
Ok(())
}
}
File diff suppressed because it is too large
+5 -1
@@ -1,4 +1,4 @@
- use crate::camoufox::CamoufoxConfig;
+ use crate::camoufox_manager::CamoufoxConfig;
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};
@@ -20,6 +20,10 @@ pub struct BrowserProfile {
pub camoufox_config: Option<CamoufoxConfig>, // Camoufox configuration
#[serde(default)]
pub group_id: Option<String>, // Reference to profile group
#[serde(default)]
pub tags: Vec<String>, // Free-form tags
#[serde(default)]
pub note: Option<String>, // User note
}
pub fn default_release_type() -> String {
+15 -7
@@ -5,7 +5,8 @@ use std::fs::{self, create_dir_all};
use std::path::{Path, PathBuf};
use crate::browser::BrowserType;
- use crate::browser_runner::BrowserRunner;
+ use crate::downloaded_browsers_registry::DownloadedBrowsersRegistry;
+ use crate::profile::ProfileManager;
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct DetectedProfile {
@@ -17,12 +18,16 @@ pub struct DetectedProfile {
pub struct ProfileImporter {
base_dirs: BaseDirs,
downloaded_browsers_registry: &'static DownloadedBrowsersRegistry,
profile_manager: &'static ProfileManager,
}
impl ProfileImporter {
fn new() -> Self {
Self {
base_dirs: BaseDirs::new().expect("Failed to get base directories"),
downloaded_browsers_registry: DownloadedBrowsersRegistry::instance(),
profile_manager: ProfileManager::instance(),
}
}
@@ -520,7 +525,7 @@ impl ProfileImporter {
.map_err(|_| format!("Invalid browser type: {browser_type}"))?;
// Check if a profile with this name already exists
- let existing_profiles = BrowserRunner::instance().list_profiles()?;
+ let existing_profiles = self.profile_manager.list_profiles()?;
if existing_profiles
.iter()
.any(|p| p.name.to_lowercase() == new_profile_name.to_lowercase())
@@ -530,7 +535,7 @@ impl ProfileImporter {
// Generate UUID for new profile and create the directory structure
let profile_id = uuid::Uuid::new_v4();
- let profiles_dir = BrowserRunner::instance().get_profiles_dir();
+ let profiles_dir = self.profile_manager.get_profiles_dir();
let new_profile_uuid_dir = profiles_dir.join(profile_id.to_string());
let new_profile_data_dir = new_profile_uuid_dir.join("profile");
@@ -555,12 +560,14 @@ impl ProfileImporter {
release_type: "stable".to_string(),
camoufox_config: None,
group_id: None,
tags: Vec::new(),
note: None,
};
// Save the profile metadata
- BrowserRunner::instance().save_profile(&profile)?;
+ self.profile_manager.save_profile(&profile)?;
- println!(
+ log::info!(
"Successfully imported profile '{}' from '{}'",
new_profile_name,
source_path.display()
@@ -575,8 +582,9 @@ impl ProfileImporter {
browser_type: &str,
) -> Result<String, Box<dyn std::error::Error>> {
// Check if any version of the browser is downloaded
- let registry = crate::downloaded_browsers::DownloadedBrowsersRegistry::instance();
- let downloaded_versions = registry.get_downloaded_versions(browser_type);
+ let downloaded_versions = self
+ .downloaded_browsers_registry
+ .get_downloaded_versions(browser_type);
if let Some(version) = downloaded_versions.first() {
return Ok(version.clone());
File diff suppressed because it is too large
+269
@@ -0,0 +1,269 @@
use crate::proxy_storage::{
delete_proxy_config, generate_proxy_id, get_proxy_config, is_process_running, list_proxy_configs,
save_proxy_config, ProxyConfig,
};
use std::process::Stdio;
lazy_static::lazy_static! {
static ref PROXY_PROCESSES: std::sync::Mutex<std::collections::HashMap<String, u32>> =
std::sync::Mutex::new(std::collections::HashMap::new());
}
pub async fn start_proxy_process(
upstream_url: Option<String>,
port: Option<u16>,
) -> Result<ProxyConfig, Box<dyn std::error::Error>> {
start_proxy_process_with_profile(upstream_url, port, None).await
}
pub async fn start_proxy_process_with_profile(
upstream_url: Option<String>,
port: Option<u16>,
profile_id: Option<String>,
) -> Result<ProxyConfig, Box<dyn std::error::Error>> {
let id = generate_proxy_id();
let upstream = upstream_url.unwrap_or_else(|| "DIRECT".to_string());
// Get available port if not specified
let local_port = port.unwrap_or_else(|| {
// Find an available port
let listener = std::net::TcpListener::bind("127.0.0.1:0").unwrap();
listener.local_addr().unwrap().port()
});
let config =
ProxyConfig::new(id.clone(), upstream, Some(local_port)).with_profile_id(profile_id.clone());
save_proxy_config(&config)?;
// Log profile_id for debugging
if let Some(ref pid) = profile_id {
log::info!("Saved proxy config {} with profile_id: {}", id, pid);
} else {
log::info!("Saved proxy config {} without profile_id", id);
}
// Spawn proxy worker process in the background using std::process::Command
// This ensures proper process detachment on Unix systems
let exe = std::env::current_exe()?;
#[cfg(unix)]
{
use std::os::unix::process::CommandExt;
use std::process::Command as StdCommand;
let mut cmd = StdCommand::new(&exe);
cmd.arg("proxy-worker");
cmd.arg("start");
cmd.arg("--id");
cmd.arg(&id);
cmd.stdin(Stdio::null());
cmd.stdout(Stdio::null());
#[cfg(debug_assertions)]
{
let log_path = std::path::PathBuf::from("/tmp").join(format!("donut-proxy-{}.log", id));
if let Ok(file) = std::fs::File::create(&log_path) {
log::error!("Proxy worker stderr will be logged to: {:?}", log_path);
cmd.stderr(Stdio::from(file));
} else {
cmd.stderr(Stdio::null());
}
}
#[cfg(not(debug_assertions))]
{
cmd.stderr(Stdio::null());
}
// Properly detach the process on Unix by creating a new session
unsafe {
cmd.pre_exec(|| {
// Create a new process group so the process survives parent exit
libc::setsid();
// Set high priority so the proxy is killed last under resource pressure
// Negative nice value = higher priority. Try -10, fall back to -5 if it fails.
if libc::setpriority(libc::PRIO_PROCESS, 0, -10) != 0 {
let _ = libc::setpriority(libc::PRIO_PROCESS, 0, -5);
}
Ok(())
});
}
// Spawn detached process
let child = cmd.spawn()?;
let pid = child.id();
// Store PID
{
let mut processes = PROXY_PROCESSES.lock().unwrap();
processes.insert(id.clone(), pid);
}
// Update config with PID
let mut config_with_pid = config.clone();
config_with_pid.pid = Some(pid);
save_proxy_config(&config_with_pid)?;
// Don't wait for the child - it's detached
drop(child);
}
#[cfg(windows)]
{
use std::os::windows::process::CommandExt;
use std::process::Command as StdCommand;
use windows::Win32::Foundation::CloseHandle;
use windows::Win32::System::Threading::{
OpenProcess, SetPriorityClass, ABOVE_NORMAL_PRIORITY_CLASS, PROCESS_SET_INFORMATION,
};
let mut cmd = StdCommand::new(&exe);
cmd.arg("proxy-worker");
cmd.arg("start");
cmd.arg("--id");
cmd.arg(&id);
cmd.stdin(Stdio::null());
cmd.stdout(Stdio::null());
cmd.stderr(Stdio::null());
// On Windows, use CREATE_NEW_PROCESS_GROUP flag for proper detachment
const CREATE_NEW_PROCESS_GROUP: u32 = 0x00000200;
cmd.creation_flags(CREATE_NEW_PROCESS_GROUP);
let child = cmd.spawn()?;
let pid = child.id();
// Set high priority so the proxy is killed last under resource pressure
unsafe {
if let Ok(handle) = OpenProcess(PROCESS_SET_INFORMATION, false, pid) {
let _ = SetPriorityClass(handle, ABOVE_NORMAL_PRIORITY_CLASS);
let _ = CloseHandle(handle);
}
}
// Store PID
{
let mut processes = PROXY_PROCESSES.lock().unwrap();
processes.insert(id.clone(), pid);
}
// Update config with PID
let mut config_with_pid = config.clone();
config_with_pid.pid = Some(pid);
save_proxy_config(&config_with_pid)?;
drop(child);
}
// Give the process a moment to start up before checking
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
// Wait for the worker to bind to the port and update config
// Since we pre-allocated the port, the worker should bind immediately
// We check quickly with short intervals to make startup fast
let mut attempts = 0;
let max_attempts = 40; // 4 seconds max (40 * 100ms) - give it more time to start
loop {
// Use shorter sleep for faster startup
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
if let Some(updated_config) = get_proxy_config(&id) {
// Check if local_url is set (worker has bound and updated config)
if let Some(ref local_url) = updated_config.local_url {
if !local_url.is_empty() {
if let Some(port) = updated_config.local_port {
// Try to connect immediately - port should be ready since we pre-allocated it
match tokio::time::timeout(
tokio::time::Duration::from_millis(100),
tokio::net::TcpStream::connect(("127.0.0.1", port)),
)
.await
{
Ok(Ok(_stream)) => {
// Port is listening and accepting connections!
return Ok(updated_config);
}
Ok(Err(_)) | Err(_) => {
// Port not ready yet, continue waiting
}
}
}
}
}
}
attempts += 1;
if attempts >= max_attempts {
// Try to get the config one more time for better error message
if let Some(config) = get_proxy_config(&id) {
// Check if process is still running
let process_running = config.pid.map(is_process_running).unwrap_or(false);
return Err(
format!(
"Proxy worker failed to start in time. Config: id={}, local_url={:?}, local_port={:?}, pid={:?}, process_running={}",
config.id, config.local_url, config.local_port, config.pid, process_running
)
.into(),
);
}
return Err(
format!(
"Proxy worker failed to start in time. Config not found for id: {}",
id
)
.into(),
);
}
}
}
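Two ideas in the function above are worth isolating: the port is pre-allocated by binding to port 0 (so the OS picks a free ephemeral port), and the worker is then polled on a fixed cadence with a bounded number of attempts. Hypothetical helpers sketching both patterns (`pick_free_port` and `wait_until` are illustrative names, not part of the codebase):

```rust
use std::net::TcpListener;
use std::thread;
use std::time::Duration;

// Binding to port 0 asks the OS for a free ephemeral port. Note the port
// is released when the listener drops, so there is a small window in
// which another process could claim it before the worker binds.
fn pick_free_port() -> u16 {
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind ephemeral port");
    listener.local_addr().expect("local addr").port()
}

// Bounded polling: retry a readiness check on a fixed cadence, giving up
// after max_attempts (the startup loop above uses 100 ms x 40 attempts).
fn wait_until<F: FnMut() -> bool>(mut ready: F, max_attempts: u32) -> bool {
    for _ in 0..max_attempts {
        if ready() {
            return true;
        }
        thread::sleep(Duration::from_millis(100));
    }
    false
}
```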
pub async fn stop_proxy_process(id: &str) -> Result<bool, Box<dyn std::error::Error>> {
let config = get_proxy_config(id);
if let Some(config) = config {
if let Some(pid) = config.pid {
// Kill the process
#[cfg(unix)]
{
use std::process::Command;
let _ = Command::new("kill")
.arg("-TERM")
.arg(pid.to_string())
.output();
}
#[cfg(windows)]
{
use std::process::Command;
let _ = Command::new("taskkill")
.args(["/F", "/PID", &pid.to_string()])
.output();
}
// Wait a bit for the process to exit
tokio::time::sleep(tokio::time::Duration::from_millis(500)).await;
// Remove from tracking
{
let mut processes = PROXY_PROCESSES.lock().unwrap();
processes.remove(id);
}
// Delete the config file
delete_proxy_config(id);
return Ok(true);
}
}
Ok(false)
}
pub async fn stop_all_proxy_processes() -> Result<(), Box<dyn std::error::Error>> {
let configs = list_proxy_configs();
for config in configs {
let _ = stop_proxy_process(&config.id).await;
}
Ok(())
}
+888
@@ -0,0 +1,888 @@
use crate::proxy_storage::ProxyConfig;
use crate::traffic_stats::{get_traffic_tracker, init_traffic_tracker};
use http_body_util::{BodyExt, Full};
use hyper::body::Bytes;
use hyper::server::conn::http1;
use hyper::service::service_fn;
use hyper::{Method, Request, Response, StatusCode};
use hyper_util::rt::TokioIo;
use std::convert::Infallible;
use std::io;
use std::net::SocketAddr;
use std::pin::Pin;
use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::Arc;
use std::task::{Context, Poll};
use tokio::io::{AsyncRead, AsyncReadExt, AsyncWrite, AsyncWriteExt, ReadBuf};
use tokio::net::TcpListener;
use tokio::net::TcpStream;
use url::Url;
/// Wrapper stream that counts bytes read and written
struct CountingStream<S> {
inner: S,
bytes_read: Arc<AtomicU64>,
bytes_written: Arc<AtomicU64>,
}
impl<S> CountingStream<S> {
fn new(inner: S) -> Self {
Self {
inner,
bytes_read: Arc::new(AtomicU64::new(0)),
bytes_written: Arc::new(AtomicU64::new(0)),
}
}
}
impl<S: AsyncRead + Unpin> AsyncRead for CountingStream<S> {
fn poll_read(
mut self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &mut ReadBuf<'_>,
) -> Poll<io::Result<()>> {
let filled_before = buf.filled().len();
let result = Pin::new(&mut self.inner).poll_read(cx, buf);
if let Poll::Ready(Ok(())) = &result {
let bytes_read = buf.filled().len() - filled_before;
if bytes_read > 0 {
self
.bytes_read
.fetch_add(bytes_read as u64, Ordering::Relaxed);
// Update global tracker - count as received (data coming into proxy)
if let Some(tracker) = get_traffic_tracker() {
tracker.add_bytes_received(bytes_read as u64);
}
}
}
result
}
}
impl<S: AsyncWrite + Unpin> AsyncWrite for CountingStream<S> {
fn poll_write(
mut self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &[u8],
) -> Poll<io::Result<usize>> {
let result = Pin::new(&mut self.inner).poll_write(cx, buf);
if let Poll::Ready(Ok(n)) = &result {
self.bytes_written.fetch_add(*n as u64, Ordering::Relaxed);
// Update global tracker - count as sent (data going out of proxy)
if let Some(tracker) = get_traffic_tracker() {
tracker.add_bytes_sent(*n as u64);
}
}
result
}
fn poll_flush(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<io::Result<()>> {
Pin::new(&mut self.inner).poll_flush(cx)
}
fn poll_shutdown(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<io::Result<()>> {
Pin::new(&mut self.inner).poll_shutdown(cx)
}
}
/// Wrapper that prepends already-consumed bytes before reading from the inner stream
struct PrependReader {
prepended: Vec<u8>,
prepended_pos: usize,
inner: TcpStream,
}
impl AsyncRead for PrependReader {
fn poll_read(
mut self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &mut ReadBuf<'_>,
) -> Poll<io::Result<()>> {
// First, read from prepended bytes if any
if self.prepended_pos < self.prepended.len() {
let available = self.prepended.len() - self.prepended_pos;
let to_copy = available.min(buf.remaining());
buf.put_slice(&self.prepended[self.prepended_pos..self.prepended_pos + to_copy]);
self.prepended_pos += to_copy;
return Poll::Ready(Ok(()));
}
// Then read from inner stream
Pin::new(&mut self.inner).poll_read(cx, buf)
}
}
impl AsyncWrite for PrependReader {
fn poll_write(
mut self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &[u8],
) -> Poll<io::Result<usize>> {
Pin::new(&mut self.inner).poll_write(cx, buf)
}
fn poll_flush(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<io::Result<()>> {
Pin::new(&mut self.inner).poll_flush(cx)
}
fn poll_shutdown(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<io::Result<()>> {
Pin::new(&mut self.inner).poll_shutdown(cx)
}
}
async fn handle_request(
req: Request<hyper::body::Incoming>,
upstream_url: Option<String>,
) -> Result<Response<Full<Bytes>>, Infallible> {
// Handle CONNECT method for HTTPS tunneling
if req.method() == Method::CONNECT {
return handle_connect(req, upstream_url).await;
}
// Handle regular HTTP requests
handle_http(req, upstream_url).await
}
async fn handle_connect(
req: Request<hyper::body::Incoming>,
upstream_url: Option<String>,
) -> Result<Response<Full<Bytes>>, Infallible> {
let authority = req.uri().authority().cloned();
if let Some(authority) = authority {
let target_addr = authority.to_string();
// Parse target host and port
let (target_host, target_port) = if let Some(colon_pos) = target_addr.find(':') {
let host = &target_addr[..colon_pos];
let port: u16 = target_addr[colon_pos + 1..].parse().unwrap_or(443);
(host, port)
} else {
(&target_addr[..], 443)
};
// If no upstream proxy, connect directly
if upstream_url.is_none()
|| upstream_url
.as_ref()
.map(|s| s == "DIRECT")
.unwrap_or(false)
{
match TcpStream::connect(&target_addr).await {
Ok(_stream) => {
// Note: this hyper-based path only verifies that the target is
// reachable; actual tunneling happens in handle_connect_from_buffer,
// which intercepts CONNECT before requests reach hyper.
let mut response = Response::new(Full::new(Bytes::from("")));
*response.status_mut() = StatusCode::OK;
return Ok(response);
}
Err(e) => {
log::error!("Failed to connect to {}: {}", target_addr, e);
let mut response =
Response::new(Full::new(Bytes::from(format!("Connection failed: {}", e))));
*response.status_mut() = StatusCode::BAD_GATEWAY;
return Ok(response);
}
}
}
// Connect through upstream proxy
let upstream = match upstream_url.as_ref().and_then(|u| Url::parse(u).ok()) {
Some(url) => url,
None => {
let mut response = Response::new(Full::new(Bytes::from("Invalid upstream URL")));
*response.status_mut() = StatusCode::BAD_GATEWAY;
return Ok(response);
}
};
let scheme = upstream.scheme();
match scheme {
"http" | "https" => {
// Use manual CONNECT for HTTP/HTTPS proxies
match connect_via_http_proxy(&upstream, target_host, target_port).await {
Ok(_) => {
let mut response = Response::new(Full::new(Bytes::from("")));
*response.status_mut() = StatusCode::OK;
Ok(response)
}
Err(e) => {
log::error!("HTTP proxy CONNECT failed: {}", e);
let mut response = Response::new(Full::new(Bytes::from(format!(
"Proxy connection failed: {}",
e
))));
*response.status_mut() = StatusCode::BAD_GATEWAY;
Ok(response)
}
}
}
"socks4" | "socks5" => {
// Use async-socks5 for SOCKS proxies
let host = upstream.host_str().unwrap_or("127.0.0.1");
let port = upstream.port().unwrap_or(1080);
let socks_addr = format!("{}:{}", host, port);
let username = upstream.username();
let password = upstream.password().unwrap_or("");
match connect_via_socks(
&socks_addr,
target_host,
target_port,
scheme == "socks5",
if !username.is_empty() {
Some((username, password))
} else {
None
},
)
.await
{
Ok(_stream) => {
let mut response = Response::new(Full::new(Bytes::from("")));
*response.status_mut() = StatusCode::OK;
Ok(response)
}
Err(e) => {
log::error!("SOCKS connection failed: {}", e);
let mut response = Response::new(Full::new(Bytes::from(format!(
"SOCKS connection failed: {}",
e
))));
*response.status_mut() = StatusCode::BAD_GATEWAY;
Ok(response)
}
}
}
_ => {
let mut response = Response::new(Full::new(Bytes::from("Unsupported upstream scheme")));
*response.status_mut() = StatusCode::BAD_GATEWAY;
Ok(response)
}
}
} else {
let mut response = Response::new(Full::new(Bytes::from("Bad Request")));
*response.status_mut() = StatusCode::BAD_REQUEST;
Ok(response)
}
}
async fn connect_via_http_proxy(
upstream: &Url,
target_host: &str,
target_port: u16,
) -> Result<TcpStream, Box<dyn std::error::Error>> {
let proxy_host = upstream.host_str().unwrap_or("127.0.0.1");
let proxy_port = upstream.port().unwrap_or(8080);
let mut stream = TcpStream::connect((proxy_host, proxy_port)).await?;
// Add proxy authentication if provided
let mut connect_req = format!(
"CONNECT {}:{} HTTP/1.1\r\nHost: {}:{}\r\n",
target_host, target_port, target_host, target_port
);
if !upstream.username().is_empty() {
use base64::{engine::general_purpose, Engine as _};
let username = upstream.username();
let password = upstream.password().unwrap_or("");
let auth = general_purpose::STANDARD.encode(format!("{}:{}", username, password));
connect_req.push_str(&format!("Proxy-Authorization: Basic {}\r\n", auth));
}
connect_req.push_str("\r\n");
stream.write_all(connect_req.as_bytes()).await?;
// A single read is usually enough for the short CONNECT status line; a
// fully robust client would keep reading until the terminating blank line.
let mut buffer = [0u8; 4096];
let n = stream.read(&mut buffer).await?;
let response = String::from_utf8_lossy(&buffer[..n]);
if response.starts_with("HTTP/1.1 200") || response.starts_with("HTTP/1.0 200") {
Ok(stream)
} else {
Err(format!("Upstream proxy CONNECT failed: {}", response).into())
}
}
async fn connect_via_socks(
socks_addr: &str,
target_host: &str,
target_port: u16,
is_socks5: bool,
auth: Option<(&str, &str)>,
) -> Result<TcpStream, Box<dyn std::error::Error>> {
let mut stream = TcpStream::connect(socks_addr).await?;
if is_socks5 {
// SOCKS5 connection using async_socks5
use async_socks5::{connect, AddrKind, Auth};
let target = if let Ok(ip) = target_host.parse::<std::net::IpAddr>() {
AddrKind::Ip(std::net::SocketAddr::new(ip, target_port))
} else {
AddrKind::Domain(target_host.to_string(), target_port)
};
let auth_info: Option<Auth> = auth.map(|(user, pass)| Auth {
username: user.to_string(),
password: pass.to_string(),
});
connect(&mut stream, target, auth_info).await?;
Ok(stream)
} else {
// SOCKS4 - simplified implementation
let ip: std::net::IpAddr = target_host.parse()?;
let mut request = vec![0x04, 0x01]; // SOCKS4, CONNECT
request.extend_from_slice(&target_port.to_be_bytes());
match ip {
std::net::IpAddr::V4(ipv4) => {
request.extend_from_slice(&ipv4.octets());
}
std::net::IpAddr::V6(_) => {
return Err("SOCKS4 does not support IPv6".into());
}
}
request.push(0); // NULL terminator for userid
stream.write_all(&request).await?;
let mut response = [0u8; 8];
stream.read_exact(&mut response).await?;
if response[1] != 0x5A {
return Err("SOCKS4 connection failed".into());
}
Ok(stream)
}
}
async fn handle_http(
req: Request<hyper::body::Incoming>,
upstream_url: Option<String>,
) -> Result<Response<Full<Bytes>>, Infallible> {
// Use reqwest for plain HTTP requests: it handles upstream proxies natively,
// which avoids the version conflicts that come with wiring in hyper-proxy.
use reqwest::Client;
// Extract domain for traffic tracking
let domain = req
.uri()
.host()
.map(|h| h.to_string())
.unwrap_or_else(|| "unknown".to_string());
let client_builder = Client::builder();
let client = if let Some(ref upstream) = upstream_url {
if upstream == "DIRECT" {
client_builder.build().unwrap_or_default()
} else {
// Build reqwest client with proxy
match build_reqwest_client_with_proxy(upstream) {
Ok(c) => c,
Err(e) => {
log::error!("Failed to create proxy client: {}", e);
let mut response = Response::new(Full::new(Bytes::from(format!(
"Proxy configuration error: {}",
e
))));
*response.status_mut() = StatusCode::BAD_GATEWAY;
return Ok(response);
}
}
}
} else {
client_builder.build().unwrap_or_default()
};
// Convert hyper request to reqwest request
let uri = req.uri().to_string();
let method = req.method().clone();
let headers = req.headers().clone();
let mut request_builder = match method.as_str() {
"GET" => client.get(&uri),
"POST" => client.post(&uri),
"PUT" => client.put(&uri),
"DELETE" => client.delete(&uri),
"PATCH" => client.patch(&uri),
"HEAD" => client.head(&uri),
_ => {
let mut response = Response::new(Full::new(Bytes::from("Unsupported method")));
*response.status_mut() = StatusCode::METHOD_NOT_ALLOWED;
return Ok(response);
}
};
// Copy headers, but skip proxy-specific headers that shouldn't be forwarded
for (name, value) in headers.iter() {
// Skip proxy-specific headers - these are for the local proxy, not the upstream
if name.as_str().eq_ignore_ascii_case("proxy-authorization")
|| name.as_str().eq_ignore_ascii_case("proxy-connection")
|| name.as_str().eq_ignore_ascii_case("proxy-authenticate")
{
continue;
}
if let Ok(val) = value.to_str() {
request_builder = request_builder.header(name.as_str(), val);
}
}
// Get body
let body_bytes = match req.collect().await {
Ok(collected) => collected.to_bytes(),
Err(_) => Bytes::new(),
};
if !body_bytes.is_empty() {
request_builder = request_builder.body(body_bytes.to_vec());
}
// Execute request
match request_builder.send().await {
Ok(response) => {
let status = response.status();
let headers = response.headers().clone();
let body = response.bytes().await.unwrap_or_default();
// Record request in traffic tracker
let response_size = body.len() as u64;
if let Some(tracker) = get_traffic_tracker() {
tracker.record_request(&domain, body_bytes.len() as u64, response_size);
}
let mut hyper_response = Response::new(Full::new(body));
*hyper_response.status_mut() = StatusCode::from_u16(status.as_u16()).unwrap();
// Copy response headers
for (name, value) in headers.iter() {
if let Ok(val) = value.to_str() {
// Skip values that cannot be represented as a hyper HeaderValue
// rather than panicking on unwrap
if let Ok(parsed) = val.parse() {
hyper_response.headers_mut().insert(name, parsed);
}
}
}
Ok(hyper_response)
}
Err(e) => {
log::error!("Request failed: {}", e);
let mut response = Response::new(Full::new(Bytes::from(format!("Request failed: {}", e))));
*response.status_mut() = StatusCode::BAD_GATEWAY;
Ok(response)
}
}
}
fn build_reqwest_client_with_proxy(
upstream_url: &str,
) -> Result<reqwest::Client, Box<dyn std::error::Error>> {
use reqwest::Proxy;
let client_builder = reqwest::Client::builder();
// Parse the upstream URL
let url = Url::parse(upstream_url)?;
let scheme = url.scheme();
let proxy = match scheme {
"http" | "https" => {
// For HTTP/HTTPS proxies, reqwest handles them directly
Proxy::http(upstream_url)?
}
"socks5" => {
// For SOCKS5, reqwest supports it directly
Proxy::all(upstream_url)?
}
"socks4" => {
// SOCKS4 is not directly supported by reqwest, would need custom handling
return Err("SOCKS4 not supported for HTTP requests via reqwest".into());
}
_ => {
return Err(format!("Unsupported proxy scheme: {}", scheme).into());
}
};
Ok(client_builder.proxy(proxy).build()?)
}
pub async fn run_proxy_server(config: ProxyConfig) -> Result<(), Box<dyn std::error::Error>> {
log::error!(
"Proxy worker starting, looking for config id: {}",
config.id
);
// Load the config from disk to get the latest state
let config = match crate::proxy_storage::get_proxy_config(&config.id) {
Some(c) => c,
None => {
log::error!("Config not found for id: {}", config.id);
return Err("Config not found".into());
}
};
log::error!(
"Found config: id={}, port={:?}, upstream={}, profile_id={:?}",
config.id,
config.local_port,
config.upstream_url,
config.profile_id
);
log::error!("Starting proxy server for config id: {}", config.id);
// Initialize traffic tracker with profile ID if available
// This can now be called multiple times to update the tracker
init_traffic_tracker(config.id.clone(), config.profile_id.clone());
log::error!(
"Traffic tracker initialized for proxy: {} (profile_id: {:?})",
config.id,
config.profile_id
);
// Verify tracker was initialized correctly
if let Some(tracker) = crate::traffic_stats::get_traffic_tracker() {
log::error!(
"Tracker verified: proxy_id={}, profile_id={:?}",
tracker.proxy_id,
tracker.profile_id
);
} else {
log::error!("WARNING: Tracker was not initialized!");
}
// Determine the bind address
let bind_addr = SocketAddr::from(([127, 0, 0, 1], config.local_port.unwrap_or(0)));
log::error!("Attempting to bind proxy server to {}", bind_addr);
// Bind to the port
let listener = TcpListener::bind(bind_addr).await?;
let actual_port = listener.local_addr()?.port();
log::error!("Successfully bound to port {}", actual_port);
// Update config with actual port and local_url
let mut updated_config = config.clone();
updated_config.local_port = Some(actual_port);
updated_config.local_url = Some(format!("http://127.0.0.1:{}", actual_port));
// Save the updated config
log::error!(
"Saving updated config with local_url={:?}",
updated_config.local_url
);
if !crate::proxy_storage::update_proxy_config(&updated_config) {
log::error!("Failed to update proxy config");
return Err("Failed to update proxy config".into());
}
let upstream_url = if updated_config.upstream_url == "DIRECT" {
None
} else {
Some(updated_config.upstream_url.clone())
};
log::error!("Proxy server bound to 127.0.0.1:{}", actual_port);
log::error!(
"Proxy server listening on 127.0.0.1:{} (ready to accept connections)",
actual_port
);
log::error!("Proxy server entering accept loop - process should stay alive");
// Start a background task to periodically flush traffic stats to disk
tokio::spawn(async move {
let mut interval = tokio::time::interval(tokio::time::Duration::from_secs(1));
loop {
interval.tick().await;
if let Some(tracker) = get_traffic_tracker() {
if let Err(e) = tracker.flush_to_disk() {
log::error!("Failed to flush traffic stats: {}", e);
}
}
}
});
// Keep the runtime alive with an infinite loop
// This ensures the process doesn't exit even if there are no active connections
loop {
match listener.accept().await {
Ok((mut stream, _)) => {
let upstream = upstream_url.clone();
tokio::task::spawn(async move {
// Read first bytes to detect CONNECT requests
// CONNECT requests need special handling for tunneling
let mut peek_buffer = [0u8; 8];
match stream.read(&mut peek_buffer).await {
Ok(n) if n >= 7 => {
let request_start = String::from_utf8_lossy(&peek_buffer[..n.min(7)]);
if request_start.starts_with("CONNECT") {
// Handle CONNECT request manually for tunneling
let mut full_request = Vec::with_capacity(4096);
full_request.extend_from_slice(&peek_buffer[..n]);
// Read the rest of the CONNECT request
let mut remaining = [0u8; 4096];
loop {
match stream.read(&mut remaining).await {
Ok(0) => break,
Ok(m) => {
full_request.extend_from_slice(&remaining[..m]);
if full_request.ends_with(b"\r\n\r\n") || full_request.ends_with(b"\n\n") {
break;
}
}
Err(_) => break,
}
}
// Handle CONNECT manually
log::error!(
"DEBUG: Handling CONNECT manually for: {}",
String::from_utf8_lossy(&full_request[..full_request.len().min(100)])
);
if let Err(e) = handle_connect_from_buffer(stream, full_request, upstream).await {
log::error!("Error handling CONNECT request: {:?}", e);
} else {
log::error!("DEBUG: CONNECT handled successfully");
}
return;
}
// Not CONNECT - reconstruct stream with consumed bytes prepended
let prepended_bytes = peek_buffer[..n].to_vec();
let prepended_reader = PrependReader {
prepended: prepended_bytes,
prepended_pos: 0,
inner: stream,
};
let io = TokioIo::new(prepended_reader);
let service = service_fn(move |req| handle_request(req, upstream.clone()));
if let Err(err) = http1::Builder::new().serve_connection(io, service).await {
log::error!("Error serving connection: {:?}", err);
}
return;
}
_ => {}
}
// For non-CONNECT requests, use hyper's HTTP handling
let io = TokioIo::new(stream);
let service = service_fn(move |req| handle_request(req, upstream.clone()));
if let Err(err) = http1::Builder::new().serve_connection(io, service).await {
log::error!("Error serving connection: {:?}", err);
}
});
}
Err(e) => {
log::error!("Error accepting connection: {:?}", e);
// Continue accepting connections even if one fails
// Add a small delay to avoid busy-waiting on errors
tokio::time::sleep(tokio::time::Duration::from_millis(100)).await;
}
}
}
}
async fn handle_connect_from_buffer(
mut client_stream: TcpStream,
request_buffer: Vec<u8>,
upstream_url: Option<String>,
) -> Result<(), Box<dyn std::error::Error>> {
// Parse the CONNECT request from the buffer
let request_str = String::from_utf8_lossy(&request_buffer);
let lines: Vec<&str> = request_str.lines().collect();
if lines.is_empty() {
let _ = client_stream
.write_all(b"HTTP/1.1 400 Bad Request\r\n\r\n")
.await;
return Err("Empty CONNECT request".into());
}
// Parse CONNECT request: "CONNECT host:port HTTP/1.1"
let parts: Vec<&str> = lines[0].split_whitespace().collect();
if parts.len() < 2 || parts[0] != "CONNECT" {
let _ = client_stream
.write_all(b"HTTP/1.1 400 Bad Request\r\n\r\n")
.await;
return Err("Invalid CONNECT request".into());
}
let target = parts[1];
let (target_host, target_port) = if let Some(colon_pos) = target.find(':') {
let host = &target[..colon_pos];
let port: u16 = target[colon_pos + 1..].parse().unwrap_or(443);
(host, port)
} else {
(target, 443)
};
// Record domain access in traffic tracker
let domain = target_host.to_string();
if let Some(tracker) = get_traffic_tracker() {
tracker.record_request(&domain, 0, 0);
}
// Connect to target (directly or via upstream proxy)
let target_stream = if upstream_url.is_none()
|| upstream_url
.as_ref()
.map(|s| s == "DIRECT")
.unwrap_or(false)
{
// Direct connection
TcpStream::connect((target_host, target_port)).await?
} else {
// Connect via upstream proxy
let upstream = Url::parse(upstream_url.as_ref().unwrap())?;
let scheme = upstream.scheme();
match scheme {
"http" | "https" => {
// Connect via HTTP proxy CONNECT
let proxy_host = upstream.host_str().unwrap_or("127.0.0.1");
let proxy_port = upstream.port().unwrap_or(8080);
let mut proxy_stream = TcpStream::connect((proxy_host, proxy_port)).await?;
// Add authentication if provided
let mut connect_req = format!(
"CONNECT {}:{} HTTP/1.1\r\nHost: {}:{}\r\n",
target_host, target_port, target_host, target_port
);
if !upstream.username().is_empty() {
use base64::{engine::general_purpose, Engine as _};
let username = upstream.username();
let password = upstream.password().unwrap_or("");
let auth = general_purpose::STANDARD.encode(format!("{}:{}", username, password));
connect_req.push_str(&format!("Proxy-Authorization: Basic {}\r\n", auth));
}
connect_req.push_str("\r\n");
// Send CONNECT request to upstream proxy
proxy_stream.write_all(connect_req.as_bytes()).await?;
// Read the CONNECT reply (a single read is usually enough for the short
// status line; a fully robust client would read to the blank line)
let mut buffer = [0u8; 4096];
let n = proxy_stream.read(&mut buffer).await?;
let response = String::from_utf8_lossy(&buffer[..n]);
if !response.starts_with("HTTP/1.1 200") && !response.starts_with("HTTP/1.0 200") {
return Err(format!("Upstream proxy CONNECT failed: {}", response).into());
}
proxy_stream
}
"socks4" | "socks5" => {
// Connect via SOCKS proxy
let socks_host = upstream.host_str().unwrap_or("127.0.0.1");
let socks_port = upstream.port().unwrap_or(1080);
let socks_addr = format!("{}:{}", socks_host, socks_port);
let username = upstream.username();
let password = upstream.password().unwrap_or("");
connect_via_socks(
&socks_addr,
target_host,
target_port,
scheme == "socks5",
if !username.is_empty() {
Some((username, password))
} else {
None
},
)
.await?
}
_ => {
return Err(format!("Unsupported upstream proxy scheme: {}", scheme).into());
}
}
};
// Send 200 Connection Established response to client
// CRITICAL: Must flush after writing to ensure response is sent before tunneling
client_stream
.write_all(b"HTTP/1.1 200 Connection Established\r\n\r\n")
.await?;
client_stream.flush().await?;
log::error!("DEBUG: Sent 200 Connection Established response, starting tunnel");
// Now tunnel data bidirectionally with counting
// Wrap streams to count bytes transferred
let counting_client = CountingStream::new(client_stream);
let counting_target = CountingStream::new(target_stream);
// Get references for final stats
let client_read_counter = counting_client.bytes_read.clone();
let client_write_counter = counting_client.bytes_written.clone();
let target_read_counter = counting_target.bytes_read.clone();
let target_write_counter = counting_target.bytes_written.clone();
// Split streams for bidirectional copying
let (mut client_read, mut client_write) = tokio::io::split(counting_client);
let (mut target_read, mut target_write) = tokio::io::split(counting_target);
log::error!("DEBUG: Starting bidirectional tunnel");
// Spawn two tasks to forward data in both directions
let client_to_target = tokio::spawn(async move {
let result = tokio::io::copy(&mut client_read, &mut target_write).await;
match result {
Ok(bytes) => {
log::error!("DEBUG: Tunneled {} bytes from client->target", bytes);
}
Err(e) => {
log::error!("Error forwarding client->target: {:?}", e);
}
}
});
let target_to_client = tokio::spawn(async move {
let result = tokio::io::copy(&mut target_read, &mut client_write).await;
match result {
Ok(bytes) => {
log::error!("DEBUG: Tunneled {} bytes from target->client", bytes);
}
Err(e) => {
log::error!("Error forwarding target->client: {:?}", e);
}
}
});
// Wait for either direction to finish (connection closed)
tokio::select! {
_ = client_to_target => {
log::error!("DEBUG: Client->target tunnel closed");
}
_ = target_to_client => {
log::error!("DEBUG: Target->client tunnel closed");
}
}
// Log final byte counts and update domain stats
let final_sent =
client_read_counter.load(Ordering::Relaxed) + target_write_counter.load(Ordering::Relaxed);
let final_recv =
target_read_counter.load(Ordering::Relaxed) + client_write_counter.load(Ordering::Relaxed);
log::error!(
"DEBUG: Tunnel closed - sent: {} bytes, received: {} bytes",
final_sent,
final_recv
);
// Update domain-specific byte counts now that tunnel is complete
if let Some(tracker) = get_traffic_tracker() {
tracker.update_domain_bytes(&domain, final_sent, final_recv);
}
Ok(())
}
+125
@@ -0,0 +1,125 @@
#[cfg(test)]
mod tests {
use super::*;
use crate::proxy_runner::{start_proxy_process, stop_proxy_process};
use crate::proxy_storage::{generate_proxy_id, list_proxy_configs};
use std::time::Duration;
use tokio::time::sleep;
#[tokio::test]
async fn test_proxy_storage() {
// Test proxy config storage
let id = generate_proxy_id();
let config = crate::proxy_storage::ProxyConfig::new(id.clone(), "DIRECT".to_string(), Some(8080));
// Save config
crate::proxy_storage::save_proxy_config(&config).unwrap();
// Load config
let loaded = crate::proxy_storage::get_proxy_config(&id).unwrap();
assert_eq!(loaded.id, id);
assert_eq!(loaded.upstream_url, "DIRECT");
assert_eq!(loaded.local_port, Some(8080));
// Delete config
assert!(crate::proxy_storage::delete_proxy_config(&id));
assert!(crate::proxy_storage::get_proxy_config(&id).is_none());
}
#[tokio::test]
async fn test_proxy_id_generation() {
let id1 = generate_proxy_id();
let id2 = generate_proxy_id();
assert_ne!(id1, id2);
assert!(id1.starts_with("proxy_"));
}
#[tokio::test]
async fn test_proxy_process_lifecycle() {
// Start a direct proxy
let config = start_proxy_process(None, Some(0)).await.unwrap();
let id = config.id.clone();
// Verify config was saved
let loaded = crate::proxy_storage::get_proxy_config(&id).unwrap();
assert_eq!(loaded.id, id);
// Wait a bit for the proxy to start
sleep(Duration::from_millis(500)).await;
// Stop the proxy
let stopped = stop_proxy_process(&id).await.unwrap();
assert!(stopped);
// Verify config was deleted
assert!(crate::proxy_storage::get_proxy_config(&id).is_none());
}
#[tokio::test]
async fn test_proxy_with_upstream_http() {
// Start a proxy with HTTP upstream (using a non-existent proxy for testing)
let upstream_url = "http://127.0.0.1:9999";
let config = start_proxy_process(Some(upstream_url.to_string()), Some(0))
.await
.unwrap();
let id = config.id.clone();
// Wait a bit
sleep(Duration::from_millis(500)).await;
// Clean up
let _ = stop_proxy_process(&id).await;
}
#[tokio::test]
async fn test_proxy_with_upstream_socks5() {
// Start a proxy with SOCKS5 upstream
let upstream_url = "socks5://127.0.0.1:1080";
let config = start_proxy_process(Some(upstream_url.to_string()), Some(0))
.await
.unwrap();
let id = config.id.clone();
// Wait a bit
sleep(Duration::from_millis(500)).await;
// Clean up
let _ = stop_proxy_process(&id).await;
}
#[tokio::test]
async fn test_proxy_port_assignment() {
// Start multiple proxies and verify they get different ports
let config1 = start_proxy_process(None, None).await.unwrap();
sleep(Duration::from_millis(100)).await;
let config2 = start_proxy_process(None, None).await.unwrap();
// They should have different IDs
assert_ne!(config1.id, config2.id);
// Clean up
let _ = stop_proxy_process(&config1.id).await;
let _ = stop_proxy_process(&config2.id).await;
}
#[tokio::test]
async fn test_proxy_list() {
// Start a few proxies
let config1 = start_proxy_process(None, None).await.unwrap();
sleep(Duration::from_millis(100)).await;
let config2 = start_proxy_process(None, None).await.unwrap();
// List all proxies
let configs = list_proxy_configs();
assert!(configs.len() >= 2);
// Clean up
let _ = stop_proxy_process(&config1.id).await;
let _ = stop_proxy_process(&config2.id).await;
}
}
+138
@@ -0,0 +1,138 @@
use directories::BaseDirs;
use serde::{Deserialize, Serialize};
use std::fs;
use std::path::PathBuf;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ProxyConfig {
pub id: String,
pub upstream_url: String, // Can be "DIRECT" for direct proxy
pub local_port: Option<u16>,
pub ignore_proxy_certificate: Option<bool>,
pub local_url: Option<String>,
pub pid: Option<u32>,
#[serde(default)]
pub profile_id: Option<String>,
}
impl ProxyConfig {
pub fn new(id: String, upstream_url: String, local_port: Option<u16>) -> Self {
Self {
id,
upstream_url,
local_port,
ignore_proxy_certificate: None,
local_url: None,
pid: None,
profile_id: None,
}
}
pub fn with_profile_id(mut self, profile_id: Option<String>) -> Self {
self.profile_id = profile_id;
self
}
}
pub fn get_storage_dir() -> PathBuf {
let base_dirs = BaseDirs::new().expect("Failed to get base directories");
let mut path = base_dirs.data_local_dir().to_path_buf();
path.push(if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
});
path.push("proxies");
path
}
pub fn save_proxy_config(config: &ProxyConfig) -> Result<(), Box<dyn std::error::Error>> {
let storage_dir = get_storage_dir();
fs::create_dir_all(&storage_dir)?;
let file_path = storage_dir.join(format!("{}.json", config.id));
let content = serde_json::to_string_pretty(config)?;
fs::write(&file_path, content)?;
Ok(())
}
pub fn get_proxy_config(id: &str) -> Option<ProxyConfig> {
let storage_dir = get_storage_dir();
let file_path = storage_dir.join(format!("{}.json", id));
if !file_path.exists() {
return None;
}
match fs::read_to_string(&file_path) {
Ok(content) => serde_json::from_str(&content).ok(),
Err(_) => None,
}
}
pub fn delete_proxy_config(id: &str) -> bool {
let storage_dir = get_storage_dir();
let file_path = storage_dir.join(format!("{}.json", id));
if !file_path.exists() {
return false;
}
fs::remove_file(&file_path).is_ok()
}
pub fn list_proxy_configs() -> Vec<ProxyConfig> {
let storage_dir = get_storage_dir();
if !storage_dir.exists() {
return Vec::new();
}
let mut configs = Vec::new();
if let Ok(entries) = fs::read_dir(&storage_dir) {
for entry in entries.flatten() {
let path = entry.path();
if path.extension().is_some_and(|ext| ext == "json") {
if let Ok(content) = fs::read_to_string(&path) {
if let Ok(config) = serde_json::from_str::<ProxyConfig>(&content) {
configs.push(config);
}
}
}
}
}
configs
}
pub fn update_proxy_config(config: &ProxyConfig) -> bool {
let storage_dir = get_storage_dir();
let file_path = storage_dir.join(format!("{}.json", config.id));
if !file_path.exists() {
return false;
}
match serde_json::to_string_pretty(config) {
Ok(content) => fs::write(&file_path, content).is_ok(),
Err(_) => false,
}
}
pub fn generate_proxy_id() -> String {
format!(
"proxy_{}_{}",
std::time::SystemTime::now()
.duration_since(std::time::UNIX_EPOCH)
.unwrap()
.as_secs(),
rand::random::<u32>()
)
}
pub fn is_process_running(pid: u32) -> bool {
use sysinfo::{Pid, System};
let system = System::new_all();
system.process(Pid::from(pid as usize)).is_some()
}
+269 -59
@@ -3,8 +3,11 @@ use serde::{Deserialize, Serialize};
use std::fs::{self, create_dir_all};
use std::path::PathBuf;
use crate::api_client::ApiClient;
use crate::version_updater;
use aes_gcm::{
aead::{Aead, AeadCore, KeyInit, OsRng},
Aes256Gcm, Key, Nonce,
};
use argon2::{password_hash::SaltString, Argon2, PasswordHasher};
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct TableSortingSettings {
@@ -27,17 +30,33 @@ pub struct AppSettings {
pub set_as_default_browser: bool,
#[serde(default = "default_theme")]
pub theme: String, // "light", "dark", or "system"
#[serde(default)]
pub custom_theme: Option<std::collections::HashMap<String, String>>, // CSS var name -> value (e.g., "--background": "#1a1b26")
#[serde(default)]
pub api_enabled: bool,
#[serde(default = "default_api_port")]
pub api_port: u16,
#[serde(default)]
pub api_token: Option<String>, // Displayed token for user to copy
}
fn default_theme() -> String {
"system".to_string()
}
fn default_api_port() -> u16 {
10108
}
impl Default for AppSettings {
fn default() -> Self {
Self {
set_as_default_browser: false,
theme: "system".to_string(),
custom_theme: None,
api_enabled: false,
api_port: 10108,
api_token: None,
}
}
}
@@ -91,17 +110,17 @@ impl SettingsManager {
Ok(settings) => {
// Save the settings back to ensure any missing fields are written with defaults
if let Err(e) = self.save_settings(&settings) {
eprintln!("Warning: Failed to update settings file with defaults: {e}");
log::warn!("Warning: Failed to update settings file with defaults: {e}");
}
Ok(settings)
}
Err(e) => {
eprintln!("Warning: Failed to parse settings file, using defaults: {e}");
log::warn!("Warning: Failed to parse settings file, using defaults: {e}");
let default_settings = AppSettings::default();
// Try to save default settings to fix the corrupted file
if let Err(save_error) = self.save_settings(&default_settings) {
eprintln!("Warning: Failed to save default settings: {save_error}");
log::warn!("Warning: Failed to save default settings: {save_error}");
}
Ok(default_settings)
@@ -151,22 +170,257 @@ impl SettingsManager {
// Always return false - we don't show settings on startup anymore
Ok(false)
}
fn get_vault_password() -> String {
env!("DONUT_BROWSER_VAULT_PASSWORD").to_string()
}
pub async fn generate_api_token(
&self,
app_handle: &tauri::AppHandle,
) -> Result<String, Box<dyn std::error::Error>> {
// Generate a secure random token (base64 encoded for URL safety)
let token_bytes: [u8; 32] = {
use rand::RngCore;
let mut rng = rand::rng();
let mut bytes = [0u8; 32];
rng.fill_bytes(&mut bytes);
bytes
};
use base64::{engine::general_purpose, Engine as _};
let token = general_purpose::URL_SAFE_NO_PAD.encode(token_bytes);
// Store token securely
self.store_api_token(app_handle, &token).await?;
Ok(token)
}
pub async fn store_api_token(
&self,
_app_handle: &tauri::AppHandle,
token: &str,
) -> Result<(), Box<dyn std::error::Error>> {
// Store token in an encrypted file using Argon2 + AES-GCM
let token_file = self.get_settings_dir().join("api_token.dat");
// Create directory if it doesn't exist
if let Some(parent) = token_file.parent() {
std::fs::create_dir_all(parent)?;
}
let vault_password = Self::get_vault_password();
// Generate a random salt for Argon2
let salt = SaltString::generate(&mut OsRng);
// Use Argon2 to derive a 32-byte key from the vault password
let argon2 = Argon2::default();
let password_hash = argon2
.hash_password(vault_password.as_bytes(), &salt)
.map_err(|e| format!("Argon2 key derivation failed: {e}"))?;
let hash_value = password_hash.hash.unwrap();
let hash_bytes = hash_value.as_bytes();
// Take first 32 bytes for AES-256 key
let key_bytes: [u8; 32] = hash_bytes[..32]
.try_into()
.map_err(|_| "Invalid key length")?;
let key = Key::<Aes256Gcm>::from(key_bytes);
let cipher = Aes256Gcm::new(&key);
// Generate a random nonce
let nonce = Aes256Gcm::generate_nonce(&mut OsRng);
// Encrypt the token
let ciphertext = cipher
.encrypt(&nonce, token.as_bytes())
.map_err(|e| format!("Encryption failed: {e}"))?;
// Create file data with header, salt, nonce, and encrypted data
let mut file_data = Vec::new();
file_data.extend_from_slice(b"DBAPI"); // 5-byte header
file_data.push(2u8); // Version 2 (Argon2 + AES-GCM)
// Store salt length and salt
let salt_str = salt.as_str();
file_data.push(salt_str.len() as u8);
file_data.extend_from_slice(salt_str.as_bytes());
// Store nonce (12 bytes for AES-GCM)
file_data.extend_from_slice(&nonce);
// Store ciphertext length and ciphertext
file_data.extend_from_slice(&(ciphertext.len() as u32).to_le_bytes());
file_data.extend_from_slice(&ciphertext);
std::fs::write(token_file, file_data)?;
Ok(())
}
pub async fn get_api_token(
&self,
_app_handle: &tauri::AppHandle,
) -> Result<Option<String>, Box<dyn std::error::Error>> {
let token_file = self.get_settings_dir().join("api_token.dat");
if !token_file.exists() {
return Ok(None);
}
let file_data = std::fs::read(token_file)?;
// Validate header
if file_data.len() < 6 || &file_data[0..5] != b"DBAPI" {
return Ok(None);
}
let version = file_data[5];
// Only support Argon2 + AES-GCM (version 2)
if version != 2 {
return Ok(None);
}
// Argon2 + AES-GCM decryption
let mut offset = 6;
// Read salt
if offset >= file_data.len() {
return Ok(None);
}
let salt_len = file_data[offset] as usize;
offset += 1;
if offset + salt_len > file_data.len() {
return Ok(None);
}
let salt_bytes = &file_data[offset..offset + salt_len];
let salt_str = std::str::from_utf8(salt_bytes).map_err(|_| "Invalid salt encoding")?;
let salt = SaltString::from_b64(salt_str).map_err(|_| "Invalid salt format")?;
offset += salt_len;
// Read nonce (12 bytes)
if offset + 12 > file_data.len() {
return Ok(None);
}
let nonce_bytes: [u8; 12] = file_data[offset..offset + 12]
.try_into()
.map_err(|_| "Invalid nonce length")?;
let nonce = Nonce::from(nonce_bytes);
offset += 12;
// Read ciphertext
if offset + 4 > file_data.len() {
return Ok(None);
}
let ciphertext_len = u32::from_le_bytes([
file_data[offset],
file_data[offset + 1],
file_data[offset + 2],
file_data[offset + 3],
]) as usize;
offset += 4;
if offset + ciphertext_len > file_data.len() {
return Ok(None);
}
let ciphertext = &file_data[offset..offset + ciphertext_len];
// Derive key using Argon2
let vault_password = Self::get_vault_password();
let argon2 = Argon2::default();
let password_hash = argon2
.hash_password(vault_password.as_bytes(), &salt)
.map_err(|e| format!("Argon2 key derivation failed: {e}"))?;
let hash_value = password_hash.hash.unwrap();
let hash_bytes = hash_value.as_bytes();
let key_bytes: [u8; 32] = hash_bytes[..32]
.try_into()
.map_err(|_| "Invalid key length")?;
let key = Key::<Aes256Gcm>::from(key_bytes);
let cipher = Aes256Gcm::new(&key);
// Decrypt the token
let plaintext = cipher
.decrypt(&nonce, ciphertext)
.map_err(|_| "Decryption failed")?;
match String::from_utf8(plaintext) {
Ok(token) => Ok(Some(token)),
Err(_) => Ok(None),
}
}
pub async fn remove_api_token(
&self,
_app_handle: &tauri::AppHandle,
) -> Result<(), Box<dyn std::error::Error>> {
let token_file = self.get_settings_dir().join("api_token.dat");
if token_file.exists() {
std::fs::remove_file(token_file)?;
}
Ok(())
}
}
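The encrypted token file written above uses a small fixed framing: a `DBAPI` magic header, a version byte, a length-prefixed Argon2 salt, a 12-byte AES-GCM nonce, and a length-prefixed ciphertext. As a std-only sketch (crypto omitted; `parse_token_file` is an illustrative helper, not part of the codebase), the version-2 layout can be split back into its fields like this:

```rust
// Sketch of the version-2 `api_token.dat` framing used above (crypto omitted).
// Layout: b"DBAPI" | version(1) | salt_len(1) | salt | nonce(12) | ct_len(4, LE) | ciphertext
fn parse_token_file(data: &[u8]) -> Option<(&[u8], [u8; 12], &[u8])> {
    // Header and version check mirror the validation in get_api_token.
    if data.len() < 6 || &data[0..5] != b"DBAPI" || data[5] != 2 {
        return None;
    }
    let mut off = 6;
    let salt_len = *data.get(off)? as usize;
    off += 1;
    let salt = data.get(off..off + salt_len)?;
    off += salt_len;
    let nonce: [u8; 12] = data.get(off..off + 12)?.try_into().ok()?;
    off += 12;
    let len_bytes: [u8; 4] = data.get(off..off + 4)?.try_into().ok()?;
    let ct_len = u32::from_le_bytes(len_bytes) as usize;
    off += 4;
    let ciphertext = data.get(off..off + ct_len)?;
    Some((salt, nonce, ciphertext))
}
```

Using checked `get(..)` accesses instead of direct indexing makes every truncated-file case fall out as `None`, matching the defensive bounds checks in `get_api_token`.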
#[tauri::command]
pub async fn get_app_settings(app_handle: tauri::AppHandle) -> Result<AppSettings, String> {
let manager = SettingsManager::instance();
let mut settings = manager
.load_settings()
.map_err(|e| format!("Failed to load settings: {e}"))?;
// Always load token for display purposes if it exists
settings.api_token = manager
.get_api_token(&app_handle)
.await
.map_err(|e| format!("Failed to load API token: {e}"))?;
Ok(settings)
}
#[tauri::command]
pub async fn save_app_settings(
app_handle: tauri::AppHandle,
mut settings: AppSettings,
) -> Result<AppSettings, String> {
let manager = SettingsManager::instance();
if settings.api_enabled {
if let Some(ref token) = settings.api_token {
manager
.store_api_token(&app_handle, token)
.await
.map_err(|e| format!("Failed to store API token: {e}"))?;
} else {
let token = manager
.generate_api_token(&app_handle)
.await
.map_err(|e| format!("Failed to generate API token: {e}"))?;
settings.api_token = Some(token);
}
}
// If API is being disabled, remove the token
if !settings.api_enabled {
manager
.remove_api_token(&app_handle)
.await
.map_err(|e| format!("Failed to remove API token: {e}"))?;
settings.api_token = None;
}
let mut persist_settings = settings.clone();
persist_settings.api_token = None;
manager
.save_settings(&persist_settings)
.map_err(|e| format!("Failed to save settings: {e}"))?;
Ok(settings)
}
#[tauri::command]
@@ -193,54 +447,6 @@ pub async fn save_table_sorting_settings(sorting: TableSortingSettings) -> Resul
.map_err(|e| format!("Failed to save table sorting settings: {e}"))
}
#[tauri::command]
pub async fn clear_all_version_cache_and_refetch(
app_handle: tauri::AppHandle,
) -> Result<(), String> {
let api_client = ApiClient::instance();
// Clear all cache first
api_client
.clear_all_cache()
.map_err(|e| format!("Failed to clear version cache: {e}"))?;
// Disable all browsers during the update process
let auto_updater = crate::auto_updater::AutoUpdater::instance();
let supported_browsers =
crate::browser_version_manager::BrowserVersionManager::instance().get_supported_browsers();
// Load current state and disable all browsers
let mut state = auto_updater
.load_auto_update_state()
.map_err(|e| format!("Failed to load auto update state: {e}"))?;
for browser in &supported_browsers {
state.disabled_browsers.insert(browser.clone());
}
auto_updater
.save_auto_update_state(&state)
.map_err(|e| format!("Failed to save auto update state: {e}"))?;
let updater = version_updater::get_version_updater();
let updater_guard = updater.lock().await;
let result = updater_guard
.trigger_manual_update(&app_handle)
.await
.map_err(|e| format!("Failed to trigger version update: {e}"));
// Re-enable all browsers after the update completes (regardless of success/failure)
let mut final_state = auto_updater.load_auto_update_state().unwrap_or_default();
for browser in &supported_browsers {
final_state.disabled_browsers.remove(browser);
}
if let Err(e) = auto_updater.save_auto_update_state(&final_state) {
eprintln!("Warning: Failed to re-enable browsers after cache clear: {e}");
}
result?;
Ok(())
}
// Global singleton instance
lazy_static::lazy_static! {
static ref SETTINGS_MANAGER: SettingsManager = SettingsManager::new();
@@ -321,6 +527,10 @@ mod tests {
let test_settings = AppSettings {
set_as_default_browser: true,
theme: "dark".to_string(),
custom_theme: None,
api_enabled: false,
api_port: 10108,
api_token: None,
};
// Save settings
@@ -0,0 +1,114 @@
use crate::profile::BrowserProfile;
use directories::BaseDirs;
use serde::{Deserialize, Serialize};
use std::collections::BTreeSet;
use std::fs;
use std::path::{Path, PathBuf};
#[derive(Debug, Serialize, Deserialize, Default, Clone)]
struct TagsData {
tags: Vec<String>,
}
pub struct TagManager {
base_dirs: BaseDirs,
data_dir_override: Option<PathBuf>,
}
impl TagManager {
pub fn new() -> Self {
Self {
base_dirs: BaseDirs::new().expect("Failed to get base directories"),
data_dir_override: std::env::var("DONUTBROWSER_DATA_DIR")
.ok()
.map(PathBuf::from),
}
}
// Helper for tests to override data directory without global env var
#[allow(dead_code)]
pub fn with_data_dir_override(dir: &Path) -> Self {
Self {
base_dirs: BaseDirs::new().expect("Failed to get base directories"),
data_dir_override: Some(dir.to_path_buf()),
}
}
fn get_tags_file_path(&self) -> PathBuf {
if let Some(dir) = &self.data_dir_override {
let mut override_path = dir.clone();
let _ = fs::create_dir_all(&override_path);
override_path.push("tags.json");
return override_path;
}
let mut path = self.base_dirs.data_local_dir().to_path_buf();
path.push(if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
});
path.push("data");
path.push("tags.json");
path
}
fn load_tags_data(&self) -> Result<TagsData, Box<dyn std::error::Error>> {
let file_path = self.get_tags_file_path();
if !file_path.exists() {
return Ok(TagsData::default());
}
let content = fs::read_to_string(file_path)?;
let data: TagsData = serde_json::from_str(&content)?;
Ok(data)
}
fn save_tags_data(&self, data: &TagsData) -> Result<(), Box<dyn std::error::Error>> {
let file_path = self.get_tags_file_path();
if let Some(parent) = file_path.parent() {
fs::create_dir_all(parent)?;
}
let json = serde_json::to_string_pretty(data)?;
fs::write(file_path, json)?;
Ok(())
}
pub fn get_all_tags(&self) -> Result<Vec<String>, Box<dyn std::error::Error>> {
let mut all = self.load_tags_data()?.tags;
// Ensure deterministic order
all.sort();
all.dedup();
Ok(all)
}
pub fn rebuild_from_profiles(
&self,
profiles: &[BrowserProfile],
) -> Result<Vec<String>, Box<dyn std::error::Error>> {
// Build a set of all tags currently used by any profile
let mut set: BTreeSet<String> = BTreeSet::new();
for profile in profiles {
for tag in &profile.tags {
// Store exactly as provided (no normalization) to preserve characters
set.insert(tag.clone());
}
}
let combined: Vec<String> = set.into_iter().collect();
self.save_tags_data(&TagsData {
tags: combined.clone(),
})?;
Ok(combined)
}
}
#[tauri::command]
pub fn get_all_tags() -> Result<Vec<String>, String> {
let tag_manager = crate::tag_manager::TAG_MANAGER.lock().unwrap();
tag_manager
.get_all_tags()
.map_err(|e| format!("Failed to get tags: {e}"))
}
lazy_static::lazy_static! {
pub static ref TAG_MANAGER: std::sync::Mutex<TagManager> = std::sync::Mutex::new(TagManager::new());
}
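`rebuild_from_profiles` derives the global tag list from whatever tags profiles currently carry; inserting into a `BTreeSet` gives deduplication and deterministic lexicographic order in a single pass. A std-only sketch of that behavior (the `Profile` struct here is a simplified stand-in for `BrowserProfile`):

```rust
use std::collections::BTreeSet;

// Simplified stand-in for BrowserProfile; only the tags field matters here.
struct Profile {
    tags: Vec<String>,
}

// Collect every tag used by any profile, deduplicated and sorted.
// BTreeSet iterates in ascending order, so the resulting Vec is deterministic.
fn rebuild_tags(profiles: &[Profile]) -> Vec<String> {
    let mut set = BTreeSet::new();
    for profile in profiles {
        for tag in &profile.tags {
            // Tags are stored exactly as provided (no normalization).
            set.insert(tag.clone());
        }
    }
    set.into_iter().collect()
}
```

This is why `get_all_tags` can still call `sort()` and `dedup()` cheaply afterwards: the data on disk is already sorted and unique whenever it was last rebuilt.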
@@ -0,0 +1,668 @@
use directories::BaseDirs;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::fs;
use std::path::PathBuf;
use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::{Arc, RwLock};
/// Individual bandwidth data point for time-series tracking
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct BandwidthDataPoint {
/// Unix timestamp in seconds
pub timestamp: u64,
/// Bytes sent in this interval
pub bytes_sent: u64,
/// Bytes received in this interval
pub bytes_received: u64,
}
/// Domain access information
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DomainAccess {
/// Domain name
pub domain: String,
/// Number of requests to this domain
pub request_count: u64,
/// Total bytes sent to this domain
pub bytes_sent: u64,
/// Total bytes received from this domain
pub bytes_received: u64,
/// First access timestamp
pub first_access: u64,
/// Last access timestamp
pub last_access: u64,
}
/// Lightweight snapshot for real-time updates (sent via events)
/// Contains only the data needed for the mini chart and summary display
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TrafficSnapshot {
/// Profile ID (for matching)
pub profile_id: Option<String>,
/// Session start timestamp
pub session_start: u64,
/// Last update timestamp
pub last_update: u64,
/// Total bytes sent across all time
pub total_bytes_sent: u64,
/// Total bytes received across all time
pub total_bytes_received: u64,
/// Total requests made
pub total_requests: u64,
/// Current bandwidth (bytes per second) sent
pub current_bytes_sent: u64,
/// Current bandwidth (bytes per second) received
pub current_bytes_received: u64,
/// Recent bandwidth history (last 60 seconds only, for mini chart)
pub recent_bandwidth: Vec<BandwidthDataPoint>,
}
/// Traffic statistics for a profile/proxy session
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TrafficStats {
/// Proxy ID this stats belong to (for backwards compatibility)
pub proxy_id: String,
/// Profile ID (if associated) - this is now the primary key for storage
pub profile_id: Option<String>,
/// Session start timestamp
pub session_start: u64,
/// Last update timestamp
pub last_update: u64,
/// Total bytes sent across all time
pub total_bytes_sent: u64,
/// Total bytes received across all time
pub total_bytes_received: u64,
/// Total requests made
pub total_requests: u64,
/// Bandwidth data points (time-series, 1 point per second, stored indefinitely)
#[serde(default)]
pub bandwidth_history: Vec<BandwidthDataPoint>,
/// Domain access statistics
#[serde(default)]
pub domains: HashMap<String, DomainAccess>,
/// Unique IPs accessed
#[serde(default)]
pub unique_ips: Vec<String>,
}
impl TrafficStats {
pub fn new(proxy_id: String, profile_id: Option<String>) -> Self {
let now = current_timestamp();
Self {
proxy_id,
profile_id,
session_start: now,
last_update: now,
total_bytes_sent: 0,
total_bytes_received: 0,
total_requests: 0,
bandwidth_history: Vec::new(),
domains: HashMap::new(),
unique_ips: Vec::new(),
}
}
/// Create a lightweight snapshot for real-time UI updates
pub fn to_snapshot(&self) -> TrafficSnapshot {
let now = current_timestamp();
let cutoff = now.saturating_sub(60); // Last 60 seconds for mini chart
// Get current bandwidth from last data point
let (current_sent, current_recv) = self
.bandwidth_history
.last()
.filter(|dp| dp.timestamp >= now.saturating_sub(2)) // Within last 2 seconds
.map(|dp| (dp.bytes_sent, dp.bytes_received))
.unwrap_or((0, 0));
TrafficSnapshot {
profile_id: self.profile_id.clone(),
session_start: self.session_start,
last_update: self.last_update,
total_bytes_sent: self.total_bytes_sent,
total_bytes_received: self.total_bytes_received,
total_requests: self.total_requests,
current_bytes_sent: current_sent,
current_bytes_received: current_recv,
recent_bandwidth: self
.bandwidth_history
.iter()
.filter(|dp| dp.timestamp >= cutoff)
.cloned()
.collect(),
}
}
/// Record bandwidth for current second (data is stored indefinitely)
pub fn record_bandwidth(&mut self, bytes_sent: u64, bytes_received: u64) {
let now = current_timestamp();
self.last_update = now;
self.total_bytes_sent += bytes_sent;
self.total_bytes_received += bytes_received;
// Find or create data point for this second
if let Some(last) = self.bandwidth_history.last_mut() {
if last.timestamp == now {
last.bytes_sent += bytes_sent;
last.bytes_received += bytes_received;
return;
}
}
// Add new data point (even if bytes are zero, to ensure chart has continuous data)
self.bandwidth_history.push(BandwidthDataPoint {
timestamp: now,
bytes_sent,
bytes_received,
});
}
/// Record a request to a domain
pub fn record_request(&mut self, domain: &str, bytes_sent: u64, bytes_received: u64) {
let now = current_timestamp();
self.total_requests += 1;
let entry = self
.domains
.entry(domain.to_string())
.or_insert(DomainAccess {
domain: domain.to_string(),
request_count: 0,
bytes_sent: 0,
bytes_received: 0,
first_access: now,
last_access: now,
});
entry.request_count += 1;
entry.bytes_sent += bytes_sent;
entry.bytes_received += bytes_received;
entry.last_access = now;
}
/// Record an IP address access
pub fn record_ip(&mut self, ip: &str) {
if !self.unique_ips.contains(&ip.to_string()) {
self.unique_ips.push(ip.to_string());
}
}
/// Get bandwidth data for the last N seconds
pub fn get_recent_bandwidth(&self, seconds: u64) -> Vec<BandwidthDataPoint> {
let now = current_timestamp();
let cutoff = now.saturating_sub(seconds);
self
.bandwidth_history
.iter()
.filter(|dp| dp.timestamp >= cutoff)
.cloned()
.collect()
}
}
/// Get current Unix timestamp in seconds
fn current_timestamp() -> u64 {
std::time::SystemTime::now()
.duration_since(std::time::UNIX_EPOCH)
.unwrap_or_default()
.as_secs()
}
/// Get the traffic stats storage directory
pub fn get_traffic_stats_dir() -> PathBuf {
let base_dirs = BaseDirs::new().expect("Failed to get base directories");
let mut path = base_dirs.cache_dir().to_path_buf();
path.push(if cfg!(debug_assertions) {
"DonutBrowserDev"
} else {
"DonutBrowser"
});
path.push("traffic_stats");
path
}
/// Get the storage key for traffic stats (profile_id if available, otherwise proxy_id)
fn get_stats_storage_key(stats: &TrafficStats) -> String {
stats
.profile_id
.clone()
.unwrap_or_else(|| stats.proxy_id.clone())
}
/// Save traffic stats to disk using profile_id as the key
pub fn save_traffic_stats(stats: &TrafficStats) -> Result<(), Box<dyn std::error::Error>> {
let storage_dir = get_traffic_stats_dir();
fs::create_dir_all(&storage_dir)?;
let key = get_stats_storage_key(stats);
let file_path = storage_dir.join(format!("{key}.json"));
let content = serde_json::to_string(stats)?;
fs::write(&file_path, content)?;
Ok(())
}
/// Load traffic stats from disk by profile_id or proxy_id
pub fn load_traffic_stats(id: &str) -> Option<TrafficStats> {
let storage_dir = get_traffic_stats_dir();
let file_path = storage_dir.join(format!("{id}.json"));
if !file_path.exists() {
return None;
}
let content = fs::read_to_string(&file_path).ok()?;
serde_json::from_str(&content).ok()
}
/// Load traffic stats by profile_id
pub fn load_traffic_stats_by_profile(profile_id: &str) -> Option<TrafficStats> {
load_traffic_stats(profile_id)
}
/// List all traffic stats files and migrate old proxy-id based files to profile-id based
pub fn list_traffic_stats() -> Vec<TrafficStats> {
let storage_dir = get_traffic_stats_dir();
if !storage_dir.exists() {
return Vec::new();
}
let mut stats_map: HashMap<String, TrafficStats> = HashMap::new();
let mut files_to_delete: Vec<std::path::PathBuf> = Vec::new();
if let Ok(entries) = fs::read_dir(&storage_dir) {
for entry in entries.flatten() {
let path = entry.path();
if path.extension().is_some_and(|ext| ext == "json") {
if let Ok(content) = fs::read_to_string(&path) {
if let Ok(s) = serde_json::from_str::<TrafficStats>(&content) {
// Determine the key for this stats entry
let key = s.profile_id.clone().unwrap_or_else(|| s.proxy_id.clone());
// Check if this is an old proxy-id based file that should be migrated
let file_stem = path.file_stem().and_then(|s| s.to_str()).unwrap_or("");
let is_old_proxy_file = file_stem.starts_with("proxy_")
&& s.profile_id.is_some()
&& file_stem != s.profile_id.as_ref().unwrap();
if let Some(existing) = stats_map.get_mut(&key) {
// Merge stats from this file into existing
merge_traffic_stats(existing, &s);
if is_old_proxy_file {
files_to_delete.push(path.clone());
}
} else {
stats_map.insert(key.clone(), s);
if is_old_proxy_file {
files_to_delete.push(path.clone());
}
}
}
}
}
}
}
// Save merged stats and delete old files
for stats in stats_map.values() {
if let Err(e) = save_traffic_stats(stats) {
log::warn!("Failed to save merged traffic stats: {}", e);
}
}
for path in files_to_delete {
if let Err(e) = fs::remove_file(&path) {
log::warn!("Failed to delete old traffic stats file {:?}: {}", path, e);
}
}
stats_map.into_values().collect()
}
/// Merge traffic stats from source into destination
fn merge_traffic_stats(dest: &mut TrafficStats, src: &TrafficStats) {
// Update totals
dest.total_bytes_sent += src.total_bytes_sent;
dest.total_bytes_received += src.total_bytes_received;
dest.total_requests += src.total_requests;
// Update timestamps
dest.session_start = dest.session_start.min(src.session_start);
dest.last_update = dest.last_update.max(src.last_update);
// Merge bandwidth history (keep all data, sorted by timestamp)
let mut combined_history: Vec<BandwidthDataPoint> = dest.bandwidth_history.clone();
for point in &src.bandwidth_history {
if !combined_history
.iter()
.any(|p| p.timestamp == point.timestamp)
{
combined_history.push(point.clone());
}
}
combined_history.sort_by_key(|p| p.timestamp);
dest.bandwidth_history = combined_history;
// Merge domains
for (domain, access) in &src.domains {
let entry = dest.domains.entry(domain.clone()).or_insert(DomainAccess {
domain: domain.clone(),
request_count: 0,
bytes_sent: 0,
bytes_received: 0,
first_access: access.first_access,
last_access: access.last_access,
});
entry.request_count += access.request_count;
entry.bytes_sent += access.bytes_sent;
entry.bytes_received += access.bytes_received;
entry.first_access = entry.first_access.min(access.first_access);
entry.last_access = entry.last_access.max(access.last_access);
}
// Merge unique IPs
for ip in &src.unique_ips {
if !dest.unique_ips.contains(ip) {
dest.unique_ips.push(ip.clone());
}
}
}
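The migration path in `list_traffic_stats` can find several files for the same profile, so `merge_traffic_stats` unions their bandwidth histories by timestamp: destination points win, source points are only added for seconds not already present, and the result is re-sorted. A reduced sketch of that merge rule (the `Point` struct is a simplified stand-in for `BandwidthDataPoint`):

```rust
// Simplified stand-in for BandwidthDataPoint.
#[derive(Clone, PartialEq, Debug)]
struct Point {
    ts: u64,
    sent: u64,
    recv: u64,
}

// Union two histories: keep every destination point, add source points whose
// timestamp is not already present, then sort by timestamp so the series
// stays chronological for charting.
fn merge_history(dest: &mut Vec<Point>, src: &[Point]) {
    for p in src {
        if !dest.iter().any(|d| d.ts == p.ts) {
            dest.push(p.clone());
        }
    }
    dest.sort_by_key(|p| p.ts);
}
```

Note the dedup-by-timestamp means a colliding second from the source file is dropped rather than summed, trading a little precision for never double-counting an interval.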
/// Delete traffic stats by id (profile_id or proxy_id)
pub fn delete_traffic_stats(id: &str) -> bool {
let storage_dir = get_traffic_stats_dir();
let file_path = storage_dir.join(format!("{id}.json"));
if file_path.exists() {
fs::remove_file(&file_path).is_ok()
} else {
false
}
}
/// Clear all traffic stats (used when clearing cache)
pub fn clear_all_traffic_stats() -> Result<(), Box<dyn std::error::Error>> {
let storage_dir = get_traffic_stats_dir();
if storage_dir.exists() {
for entry in fs::read_dir(&storage_dir)?.flatten() {
let path = entry.path();
if path.extension().is_some_and(|ext| ext == "json") {
let _ = fs::remove_file(&path);
}
}
}
Ok(())
}
/// Live bandwidth tracker for real-time stats collection in the proxy
/// This is designed to be used from within the proxy server
pub struct LiveTrafficTracker {
pub proxy_id: String,
pub profile_id: Option<String>,
bytes_sent: AtomicU64,
bytes_received: AtomicU64,
requests: AtomicU64,
domain_stats: RwLock<HashMap<String, (u64, u64, u64)>>, // domain -> (count, sent, recv)
ips: RwLock<Vec<String>>,
#[allow(dead_code)]
session_start: u64,
}
impl LiveTrafficTracker {
pub fn new(proxy_id: String, profile_id: Option<String>) -> Self {
Self {
proxy_id,
profile_id,
bytes_sent: AtomicU64::new(0),
bytes_received: AtomicU64::new(0),
requests: AtomicU64::new(0),
domain_stats: RwLock::new(HashMap::new()),
ips: RwLock::new(Vec::new()),
session_start: current_timestamp(),
}
}
pub fn add_bytes_sent(&self, bytes: u64) {
self.bytes_sent.fetch_add(bytes, Ordering::Relaxed);
}
pub fn add_bytes_received(&self, bytes: u64) {
self.bytes_received.fetch_add(bytes, Ordering::Relaxed);
}
pub fn record_request(&self, domain: &str, bytes_sent: u64, bytes_received: u64) {
self.requests.fetch_add(1, Ordering::Relaxed);
// Also update total byte counters for HTTP requests (not tunneled)
self.bytes_sent.fetch_add(bytes_sent, Ordering::Relaxed);
self
.bytes_received
.fetch_add(bytes_received, Ordering::Relaxed);
if let Ok(mut stats) = self.domain_stats.write() {
let entry = stats.entry(domain.to_string()).or_insert((0, 0, 0));
entry.0 += 1;
entry.1 += bytes_sent;
entry.2 += bytes_received;
}
}
pub fn record_ip(&self, ip: &str) {
if let Ok(mut ips) = self.ips.write() {
if !ips.contains(&ip.to_string()) {
ips.push(ip.to_string());
}
}
}
/// Update domain-specific byte counts (called when CONNECT tunnel closes)
pub fn update_domain_bytes(&self, domain: &str, bytes_sent: u64, bytes_received: u64) {
if let Ok(mut stats) = self.domain_stats.write() {
let entry = stats.entry(domain.to_string()).or_insert((0, 0, 0));
entry.1 += bytes_sent;
entry.2 += bytes_received;
}
}
/// Get current stats snapshot
pub fn get_snapshot(&self) -> (u64, u64, u64) {
(
self.bytes_sent.load(Ordering::Relaxed),
self.bytes_received.load(Ordering::Relaxed),
self.requests.load(Ordering::Relaxed),
)
}
/// Flush current stats to disk and return the delta
pub fn flush_to_disk(&self) -> Result<(u64, u64), Box<dyn std::error::Error>> {
let bytes_sent = self.bytes_sent.swap(0, Ordering::Relaxed);
let bytes_received = self.bytes_received.swap(0, Ordering::Relaxed);
// Use profile_id as storage key if available, otherwise fall back to proxy_id
let storage_key = self
.profile_id
.clone()
.unwrap_or_else(|| self.proxy_id.clone());
// Load or create stats using the storage key
let mut stats = load_traffic_stats(&storage_key)
.unwrap_or_else(|| TrafficStats::new(self.proxy_id.clone(), self.profile_id.clone()));
// Ensure profile_id is set (in case stats were loaded from disk without it)
if stats.profile_id.is_none() && self.profile_id.is_some() {
stats.profile_id = self.profile_id.clone();
}
// Update the proxy_id to current session (for debugging/tracking)
stats.proxy_id = self.proxy_id.clone();
// Update bandwidth history
stats.record_bandwidth(bytes_sent, bytes_received);
// Update domain stats
if let Ok(mut domain_map) = self.domain_stats.write() {
for (domain, (count, sent, recv)) in domain_map.drain() {
stats.record_request(&domain, sent, recv);
// Adjust request count (record_request increments total_requests)
stats.total_requests = stats.total_requests.saturating_sub(1) + count;
}
}
// Update IPs
if let Ok(ips) = self.ips.read() {
for ip in ips.iter() {
stats.record_ip(ip);
}
}
// Save to disk
save_traffic_stats(&stats)?;
Ok((bytes_sent, bytes_received))
}
}
/// Global traffic tracker that can be accessed from connection handlers
/// Using RwLock to allow reinitialization when proxy config changes
static TRAFFIC_TRACKER: std::sync::RwLock<Option<Arc<LiveTrafficTracker>>> =
std::sync::RwLock::new(None);
/// Initialize the global traffic tracker
/// This can be called multiple times to update the tracker when proxy config changes
pub fn init_traffic_tracker(proxy_id: String, profile_id: Option<String>) {
let tracker = Arc::new(LiveTrafficTracker::new(proxy_id, profile_id));
if let Ok(mut guard) = TRAFFIC_TRACKER.write() {
*guard = Some(tracker);
}
}
/// Get the global traffic tracker
pub fn get_traffic_tracker() -> Option<Arc<LiveTrafficTracker>> {
TRAFFIC_TRACKER.read().ok().and_then(|guard| guard.clone())
}
/// Filtered traffic stats for client display (only contains data for requested period)
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FilteredTrafficStats {
pub profile_id: Option<String>,
pub session_start: u64,
pub last_update: u64,
pub total_bytes_sent: u64,
pub total_bytes_received: u64,
pub total_requests: u64,
/// Bandwidth history filtered to requested time period
pub bandwidth_history: Vec<BandwidthDataPoint>,
/// Period stats: bytes sent/received within the requested period
pub period_bytes_sent: u64,
pub period_bytes_received: u64,
/// Domain access statistics (always full, as it's already aggregated)
pub domains: HashMap<String, DomainAccess>,
/// Unique IPs accessed
pub unique_ips: Vec<String>,
}
/// Get traffic stats for a profile, filtered to a specific time period
/// seconds: number of seconds to include (0 = all time)
pub fn get_traffic_stats_for_period(
profile_id: &str,
seconds: u64,
) -> Option<FilteredTrafficStats> {
let stats = load_traffic_stats(profile_id)?;
let now = current_timestamp();
let cutoff = if seconds == 0 {
0 // All time
} else {
now.saturating_sub(seconds)
};
// Filter bandwidth history to requested period
let filtered_history: Vec<BandwidthDataPoint> = stats
.bandwidth_history
.iter()
.filter(|dp| dp.timestamp >= cutoff)
.cloned()
.collect();
// Calculate period totals
let period_bytes_sent: u64 = filtered_history.iter().map(|dp| dp.bytes_sent).sum();
let period_bytes_received: u64 = filtered_history.iter().map(|dp| dp.bytes_received).sum();
Some(FilteredTrafficStats {
profile_id: stats.profile_id,
session_start: stats.session_start,
last_update: stats.last_update,
total_bytes_sent: stats.total_bytes_sent,
total_bytes_received: stats.total_bytes_received,
total_requests: stats.total_requests,
bandwidth_history: filtered_history,
period_bytes_sent,
period_bytes_received,
domains: stats.domains,
unique_ips: stats.unique_ips,
})
}
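`get_traffic_stats_for_period` treats `seconds == 0` as "all time" by dropping the cutoff to zero; otherwise only data points at or after `now - seconds` are kept, and the period totals are the sums over that filtered slice. A std-only sketch of the filtering rule (a tuple stands in for `BandwidthDataPoint` here):

```rust
// (timestamp, bytes_sent, bytes_received) stands in for BandwidthDataPoint.
fn period_totals(history: &[(u64, u64, u64)], now: u64, seconds: u64) -> (u64, u64) {
    // seconds == 0 means "all time"; otherwise keep points newer than the cutoff.
    let cutoff = if seconds == 0 { 0 } else { now.saturating_sub(seconds) };
    history
        .iter()
        .filter(|(ts, _, _)| *ts >= cutoff)
        .fold((0, 0), |(sent, recv), (_, s, r)| (sent + s, recv + r))
}
```

`saturating_sub` keeps the cutoff well-defined even when the requested window is longer than the machine's uptime since the Unix epoch would allow.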
/// Get lightweight traffic snapshot for a profile (for mini charts, only recent 60 seconds)
pub fn get_traffic_snapshot_for_profile(profile_id: &str) -> Option<TrafficSnapshot> {
let stats = load_traffic_stats(profile_id)?;
Some(stats.to_snapshot())
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_traffic_stats_creation() {
let stats = TrafficStats::new(
"test_proxy".to_string(),
Some("test-profile-id".to_string()),
);
assert_eq!(stats.proxy_id, "test_proxy");
assert_eq!(stats.profile_id, Some("test-profile-id".to_string()));
assert_eq!(stats.total_bytes_sent, 0);
assert_eq!(stats.total_bytes_received, 0);
}
#[test]
fn test_bandwidth_recording() {
let mut stats = TrafficStats::new("test_proxy".to_string(), None);
stats.record_bandwidth(1000, 2000);
assert_eq!(stats.total_bytes_sent, 1000);
assert_eq!(stats.total_bytes_received, 2000);
assert_eq!(stats.bandwidth_history.len(), 1);
stats.record_bandwidth(500, 1000);
assert_eq!(stats.total_bytes_sent, 1500);
assert_eq!(stats.total_bytes_received, 3000);
}
#[test]
fn test_domain_recording() {
let mut stats = TrafficStats::new("test_proxy".to_string(), None);
stats.record_request("example.com", 100, 500);
stats.record_request("example.com", 200, 1000);
stats.record_request("google.com", 50, 200);
assert_eq!(stats.domains.len(), 2);
assert_eq!(stats.domains["example.com"].request_count, 2);
assert_eq!(stats.domains["example.com"].bytes_sent, 300);
assert_eq!(stats.domains["google.com"].request_count, 1);
}
#[test]
fn test_ip_recording() {
let mut stats = TrafficStats::new("test_proxy".to_string(), None);
stats.record_ip("192.168.1.1");
stats.record_ip("192.168.1.1"); // Duplicate
stats.record_ip("10.0.0.1");
assert_eq!(stats.unique_ips.len(), 2);
}
}
@@ -46,8 +46,9 @@ impl Default for BackgroundUpdateState {
}
}
/// Extension of auto_updater.rs for background updates
pub struct VersionUpdater {
browser_version_manager: &'static BrowserVersionManager,
auto_updater: &'static AutoUpdater,
app_handle: Option<tauri::AppHandle>,
}
@@ -55,7 +56,7 @@ pub struct VersionUpdater {
impl VersionUpdater {
pub fn new() -> Self {
Self {
browser_version_manager: BrowserVersionManager::instance(),
auto_updater: AutoUpdater::instance(),
app_handle: None,
}
@@ -128,14 +129,14 @@ impl VersionUpdater {
let should_update = state.last_update_time == 0 || elapsed_secs >= update_interval_secs;
if should_update {
log::debug!(
"Background update needed: last_update={}, elapsed={}h, required={}h",
state.last_update_time,
elapsed_secs / 3600,
state.update_interval_hours
);
} else {
log::debug!(
"Background update not needed: last_update={}, elapsed={}h, required={}h",
state.last_update_time,
elapsed_secs / 3600,
@@ -151,12 +152,12 @@ impl VersionUpdater {
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
// Only run if an update is actually needed
if !Self::should_run_background_update() {
log::debug!("No startup version update needed");
return Ok(());
}
if let Some(ref app_handle) = self.app_handle {
log::info!("Running startup version update...");
match self.update_all_browser_versions(app_handle).await {
Ok(_) => {
@@ -167,13 +168,13 @@ impl VersionUpdater {
};
if let Err(e) = Self::save_background_update_state(&state) {
eprintln!("Failed to save background update state: {e}");
log::error!("Failed to save background update state: {e}");
} else {
println!("Startup version update completed successfully");
log::info!("Startup version update completed successfully");
}
}
Err(e) => {
eprintln!("Startup version update failed: {e}");
log::error!("Startup version update failed: {e}");
return Err(e);
}
}
@@ -263,7 +264,7 @@ impl VersionUpdater {
&self,
app_handle: &tauri::AppHandle,
) -> Result<Vec<BackgroundUpdateResult>, Box<dyn std::error::Error + Send + Sync>> {
let supported_browsers = self.version_service.get_supported_browsers();
let supported_browsers = self.browser_version_manager.get_supported_browsers();
let total_browsers = supported_browsers.len();
let mut results = Vec::new();
let mut total_new_versions = 0;
@@ -279,11 +280,11 @@ impl VersionUpdater {
};
if let Err(e) = app_handle.emit("version-update-progress", &initial_progress) {
eprintln!("Failed to emit initial progress: {e}");
log::error!("Failed to emit initial progress: {e}");
}
for (index, browser) in supported_browsers.iter().enumerate() {
println!("Updating browser versions for: {browser}");
log::debug!("Updating browser versions for: {browser}");
// Emit progress update for current browser
let progress = VersionUpdateProgress {
@@ -296,7 +297,7 @@ impl VersionUpdater {
};
if let Err(e) = app_handle.emit("version-update-progress", &progress) {
eprintln!("Failed to emit progress for {browser}: {e}");
log::error!("Failed to emit progress for {browser}: {e}");
}
match self.update_browser_versions(browser).await {
@@ -322,7 +323,7 @@ impl VersionUpdater {
};
if let Err(e) = app_handle.emit("version-update-progress", &progress) {
eprintln!("Failed to emit progress with versions for {browser}: {e}");
log::error!("Failed to emit progress with versions for {browser}: {e}");
}
}
Err(e) => {
@@ -374,7 +375,7 @@ impl VersionUpdater {
browser: &str,
) -> Result<usize, Box<dyn std::error::Error + Send + Sync>> {
self
.version_service
.browser_version_manager
.update_browser_versions_incrementally(browser)
.await
}
@@ -392,7 +393,7 @@ impl VersionUpdater {
};
if let Err(e) = Self::save_background_update_state(&state) {
eprintln!("Failed to save background update state after manual update: {e}");
log::error!("Failed to save background update state after manual update: {e}");
}
Ok(results)
@@ -455,6 +456,63 @@ pub async fn get_version_update_status() -> Result<(Option<u64>, u64), String> {
Ok((last_update, time_until_next))
}
#[tauri::command]
pub async fn clear_all_version_cache_and_refetch(
app_handle: tauri::AppHandle,
) -> Result<(), String> {
let api_client = crate::api_client::ApiClient::instance();
let version_updater = VersionUpdater::new();
// Clear all cache first
api_client
.clear_all_cache()
.map_err(|e| format!("Failed to clear version cache: {e}"))?;
// Disable all browsers during the update process
let supported_browsers = version_updater
.browser_version_manager
.get_supported_browsers();
// Load current state and disable all browsers
let mut state = version_updater
.auto_updater
.load_auto_update_state()
.map_err(|e| format!("Failed to load auto update state: {e}"))?;
for browser in &supported_browsers {
state.disabled_browsers.insert(browser.clone());
}
version_updater
.auto_updater
.save_auto_update_state(&state)
.map_err(|e| format!("Failed to save auto update state: {e}"))?;
let updater = get_version_updater();
let updater_guard = updater.lock().await;
let result = updater_guard
.trigger_manual_update(&app_handle)
.await
.map_err(|e| format!("Failed to trigger version update: {e}"));
// Re-enable all browsers after the update completes (regardless of success/failure)
let mut final_state = version_updater
.auto_updater
.load_auto_update_state()
.unwrap_or_default();
for browser in &supported_browsers {
final_state.disabled_browsers.remove(browser);
}
if let Err(e) = version_updater
.auto_updater
.save_auto_update_state(&final_state)
{
log::warn!("Failed to re-enable browsers after cache clear: {e}");
}
result?;
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
@@ -487,13 +545,22 @@ mod tests {
Err(_) => return BackgroundUpdateState::default(),
};
serde_json::from_str(&content).unwrap_or_default()
match serde_json::from_str(&content) {
Ok(state) => state,
Err(e) => {
eprintln!("Failed to parse test state file {:?}: {}", state_file, e);
BackgroundUpdateState::default()
}
}
}
#[test]
fn test_background_update_state_persistence() {
let test_name = "persistence";
// Clean up any existing test file first
let _ = fs::remove_file(get_test_state_file(test_name));
// Create a test state
let test_state = BackgroundUpdateState {
last_update_time: 1609459200, // 2021-01-01 00:00:00 UTC
@@ -503,14 +570,22 @@ mod tests {
// Save the state
save_test_state(test_name, &test_state).unwrap();
// Verify file was created
let state_file = get_test_state_file(test_name);
assert!(state_file.exists(), "State file should exist after saving");
// Load the state back
let loaded_state = load_test_state(test_name);
// Verify the values match
assert_eq!(loaded_state.last_update_time, test_state.last_update_time);
assert_eq!(
loaded_state.update_interval_hours,
test_state.update_interval_hours
loaded_state.last_update_time, test_state.last_update_time,
"last_update_time should match. Expected: {}, Got: {}",
test_state.last_update_time, loaded_state.last_update_time
);
assert_eq!(
loaded_state.update_interval_hours, test_state.update_interval_hours,
"update_interval_hours should match"
);
// Clean up
@@ -570,44 +645,16 @@ mod tests {
);
}
#[test]
fn test_cache_dir_creation() {
// This should not panic and should create the directory if it doesn't exist
let cache_dir_result = VersionUpdater::get_cache_dir();
assert!(
cache_dir_result.is_ok(),
"Should successfully get cache directory"
);
let cache_dir = cache_dir_result.unwrap();
assert!(
cache_dir.exists(),
"Cache directory should exist after creation"
);
assert!(cache_dir.is_dir(), "Cache directory should be a directory");
// Verify the path contains expected components
let path_str = cache_dir.to_string_lossy();
assert!(
path_str.contains("version_cache"),
"Path should contain version_cache"
);
// Test that calling it again returns the same directory
let cache_dir2 = VersionUpdater::get_cache_dir().unwrap();
assert_eq!(
cache_dir, cache_dir2,
"Multiple calls should return same directory"
);
}
#[test]
fn test_version_updater_creation() {
let updater = VersionUpdater::new();
// Should have valid references to services
assert!(
!std::ptr::eq(updater.version_service as *const _, std::ptr::null()),
!std::ptr::eq(
updater.browser_version_manager as *const _,
std::ptr::null()
),
"Version service should not be null"
);
assert!(
+15 -14
@@ -1,12 +1,12 @@
{
"$schema": "https://schema.tauri.app/config/2",
"productName": "Donut Browser",
"version": "0.9.2",
"productName": "Donut",
"version": "0.13.0",
"identifier": "com.donutbrowser",
"build": {
"beforeDevCommand": "pnpm dev",
"beforeDevCommand": "pnpm copy-proxy-binary && pnpm dev",
"devUrl": "http://localhost:3000",
"beforeBuildCommand": "pnpm build",
"beforeBuildCommand": "pnpm copy-proxy-binary && (test -d ../dist || pnpm build)",
"frontendDist": "../dist"
},
"app": {
@@ -17,9 +17,9 @@
},
"bundle": {
"active": true,
"targets": "all",
"targets": ["app", "dmg", "nsis", "deb", "rpm", "appimage"],
"category": "Productivity",
"externalBin": ["binaries/nodecar"],
"externalBin": ["binaries/nodecar", "binaries/donut-proxy"],
"icon": [
"icons/32x32.png",
"icons/128x128.png",
@@ -41,22 +41,23 @@
},
"linux": {
"deb": {
"depends": ["xdg-utils"],
"files": {
"/usr/share/applications/donutbrowser.desktop": "donutbrowser.desktop"
}
"desktopTemplate": "donutbrowser.desktop",
"depends": ["xdg-utils"]
},
"rpm": {
"depends": ["xdg-utils"],
"files": {
"/usr/share/applications/donutbrowser.desktop": "donutbrowser.desktop"
}
"desktopTemplate": "donutbrowser.desktop",
"depends": ["xdg-utils"]
},
"appimage": {
"files": {
"usr/share/applications/donutbrowser.desktop": "donutbrowser.desktop"
}
}
},
"windows": {
"certificateThumbprint": null,
"digestAlgorithm": "sha256",
"timestampUrl": ""
}
},
"plugins": {
+3 -115
@@ -1,59 +1,13 @@
use std::env;
use std::path::PathBuf;
use std::process::Command;
use std::time::Duration;
/// Utility functions for integration tests
pub struct TestUtils;
impl TestUtils {
/// Build the nodecar binary if it doesn't exist
pub async fn ensure_nodecar_binary() -> Result<PathBuf, Box<dyn std::error::Error + Send + Sync>>
{
let cargo_manifest_dir = env::var("CARGO_MANIFEST_DIR")?;
let project_root = PathBuf::from(cargo_manifest_dir)
.parent()
.unwrap()
.to_path_buf();
let nodecar_dir = project_root.join("nodecar");
let nodecar_binary = nodecar_dir.join("nodecar-bin");
// Check if binary already exists
if nodecar_binary.exists() {
return Ok(nodecar_binary);
}
println!("Building nodecar binary for integration tests...");
// Install dependencies
let install_status = Command::new("pnpm")
.args(["install", "--frozen-lockfile"])
.current_dir(&nodecar_dir)
.status()?;
if !install_status.success() {
return Err("Failed to install nodecar dependencies".into());
}
// Build the binary
let build_status = Command::new("pnpm")
.args(["run", "build"])
.current_dir(&nodecar_dir)
.status()?;
if !build_status.success() {
return Err("Failed to build nodecar binary".into());
}
if !nodecar_binary.exists() {
return Err("Nodecar binary was not created successfully".into());
}
Ok(nodecar_binary)
}
/// Execute a nodecar command with timeout
pub async fn execute_nodecar_command(
/// Execute a command (generic, for donut-proxy tests)
#[allow(dead_code)]
pub async fn execute_command(
binary_path: &PathBuf,
args: &[&str],
) -> Result<std::process::Output, Box<dyn std::error::Error + Send + Sync>> {
@@ -64,70 +18,4 @@ impl TestUtils {
Ok(output)
}
/// Check if a port is available
pub async fn is_port_available(port: u16) -> bool {
tokio::net::TcpListener::bind(format!("127.0.0.1:{port}"))
.await
.is_ok()
}
/// Wait for a port to become available or occupied
pub async fn wait_for_port_state(port: u16, should_be_occupied: bool, timeout_secs: u64) -> bool {
let start = std::time::Instant::now();
while start.elapsed().as_secs() < timeout_secs {
let is_available = Self::is_port_available(port).await;
if should_be_occupied && !is_available {
return true; // Port is occupied as expected
} else if !should_be_occupied && is_available {
return true; // Port is available as expected
}
tokio::time::sleep(Duration::from_millis(100)).await;
}
false
}
/// Create a temporary directory for test files
pub fn create_temp_dir() -> Result<tempfile::TempDir, Box<dyn std::error::Error + Send + Sync>> {
Ok(tempfile::tempdir()?)
}
/// Clean up specific nodecar processes by IDs (for targeted test cleanup)
pub async fn cleanup_specific_processes(
nodecar_path: &PathBuf,
proxy_ids: &[String],
camoufox_ids: &[String],
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
println!("Cleaning up specific test processes...");
// Stop specific proxies
for proxy_id in proxy_ids {
let stop_args = ["proxy", "stop", "--id", proxy_id];
if let Ok(output) = Self::execute_nodecar_command(nodecar_path, &stop_args).await {
if output.status.success() {
println!("Stopped test proxy: {proxy_id}");
}
}
}
// Stop specific camoufox instances
for camoufox_id in camoufox_ids {
let stop_args = ["camoufox", "stop", "--id", camoufox_id];
if let Ok(output) = Self::execute_nodecar_command(nodecar_path, &stop_args).await {
if output.status.success() {
println!("Stopped test camoufox instance: {camoufox_id}");
}
}
}
// Give processes time to clean up
tokio::time::sleep(Duration::from_millis(500)).await;
println!("Test process cleanup completed");
Ok(())
}
}
+640
@@ -0,0 +1,640 @@
mod common;
use common::TestUtils;
use serde_json::Value;
use serial_test::serial;
use std::time::Duration;
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::TcpStream;
use tokio::time::sleep;
/// Setup function to ensure donut-proxy binary exists and cleanup stale proxies
async fn setup_test() -> Result<std::path::PathBuf, Box<dyn std::error::Error + Send + Sync>> {
let cargo_manifest_dir = std::env::var("CARGO_MANIFEST_DIR")?;
let project_root = std::path::PathBuf::from(cargo_manifest_dir)
.parent()
.unwrap()
.to_path_buf();
// Build donut-proxy binary if it doesn't exist
let proxy_binary = project_root
.join("src-tauri")
.join("target")
.join("debug")
.join("donut-proxy");
if !proxy_binary.exists() {
println!("Building donut-proxy binary for integration tests...");
let build_status = std::process::Command::new("cargo")
.args(["build", "--bin", "donut-proxy"])
.current_dir(project_root.join("src-tauri"))
.status()?;
if !build_status.success() {
return Err("Failed to build donut-proxy binary".into());
}
}
if !proxy_binary.exists() {
return Err("donut-proxy binary was not created successfully".into());
}
// Clean up any stale proxies from previous test runs
let _ = TestUtils::execute_command(&proxy_binary, &["proxy", "stop"]).await;
Ok(proxy_binary)
}
/// Helper to track and cleanup proxy processes
struct ProxyTestTracker {
proxy_ids: Vec<String>,
binary_path: std::path::PathBuf,
}
impl ProxyTestTracker {
fn new(binary_path: std::path::PathBuf) -> Self {
Self {
proxy_ids: Vec::new(),
binary_path,
}
}
fn track_proxy(&mut self, proxy_id: String) {
self.proxy_ids.push(proxy_id);
}
async fn cleanup_all(&self) {
for proxy_id in &self.proxy_ids {
let _ =
TestUtils::execute_command(&self.binary_path, &["proxy", "stop", "--id", proxy_id]).await;
}
}
}
impl Drop for ProxyTestTracker {
fn drop(&mut self) {
let proxy_ids = self.proxy_ids.clone();
let binary_path = self.binary_path.clone();
tokio::spawn(async move {
for proxy_id in &proxy_ids {
let _ =
TestUtils::execute_command(&binary_path, &["proxy", "stop", "--id", proxy_id]).await;
}
});
}
}
/// Test starting a local proxy without upstream proxy (DIRECT)
#[tokio::test]
#[serial]
async fn test_local_proxy_direct() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let mut tracker = ProxyTestTracker::new(binary_path.clone());
println!("Starting local proxy without upstream (DIRECT)...");
let output = TestUtils::execute_command(&binary_path, &["proxy", "start"]).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
return Err(format!("Proxy start failed - stdout: {stdout}, stderr: {stderr}").into());
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
let local_url = config["localUrl"].as_str().unwrap();
let upstream_url = config["upstreamUrl"].as_str().unwrap();
tracker.track_proxy(proxy_id.clone());
println!(
"Proxy started: id={}, port={}, url={}, upstream={}",
proxy_id, local_port, local_url, upstream_url
);
// Verify proxy is listening
sleep(Duration::from_millis(500)).await;
match TcpStream::connect(("127.0.0.1", local_port)).await {
Ok(_) => {
println!("Proxy is listening on port {local_port}");
}
Err(e) => {
return Err(format!("Proxy port {local_port} is not listening: {e}").into());
}
}
// Test making an HTTP request through the proxy
let mut stream = TcpStream::connect(("127.0.0.1", local_port)).await?;
let request =
b"GET http://httpbin.org/ip HTTP/1.1\r\nHost: httpbin.org\r\nConnection: close\r\n\r\n";
stream.write_all(request).await?;
let mut response = Vec::new();
stream.read_to_end(&mut response).await?;
let response_str = String::from_utf8_lossy(&response);
if response_str.contains("200 OK") || response_str.contains("origin") {
println!("Proxy successfully forwarded HTTP request");
} else {
println!(
"Warning: Proxy response may be unexpected: {}",
&response_str[..response_str.len().min(200)]
);
}
// Cleanup
tracker.cleanup_all().await;
Ok(())
}
/// Test chaining local proxies (local proxy -> local proxy -> internet)
#[tokio::test]
#[serial]
async fn test_chained_local_proxies() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let mut tracker = ProxyTestTracker::new(binary_path.clone());
println!("Testing chained local proxies...");
// Start first proxy (DIRECT - connects to internet)
let output1 = TestUtils::execute_command(&binary_path, &["proxy", "start"]).await?;
if !output1.status.success() {
let stderr = String::from_utf8_lossy(&output1.stderr);
let stdout = String::from_utf8_lossy(&output1.stdout);
return Err(format!("Failed to start first proxy - stdout: {stdout}, stderr: {stderr}").into());
}
let config1: Value = serde_json::from_str(&String::from_utf8(output1.stdout)?)?;
let proxy1_id = config1["id"].as_str().unwrap().to_string();
let proxy1_port = config1["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy1_id.clone());
println!("First proxy started on port {}", proxy1_port);
// Wait for first proxy to be ready
sleep(Duration::from_millis(500)).await;
match TcpStream::connect(("127.0.0.1", proxy1_port)).await {
Ok(_) => println!("First proxy is ready"),
Err(e) => return Err(format!("First proxy not ready: {e}").into()),
}
// Start second proxy chained to first proxy
let output2 = TestUtils::execute_command(
&binary_path,
&[
"proxy",
"start",
"--host",
"127.0.0.1",
"--proxy-port",
&proxy1_port.to_string(),
"--type",
"http",
],
)
.await?;
if !output2.status.success() {
let stderr = String::from_utf8_lossy(&output2.stderr);
let stdout = String::from_utf8_lossy(&output2.stdout);
return Err(
format!("Failed to start second proxy - stdout: {stdout}, stderr: {stderr}").into(),
);
}
let config2: Value = serde_json::from_str(&String::from_utf8(output2.stdout)?)?;
let proxy2_id = config2["id"].as_str().unwrap().to_string();
let proxy2_port = config2["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy2_id.clone());
println!(
"Second proxy started on port {} (chained to proxy on port {})",
proxy2_port, proxy1_port
);
// Wait for second proxy to be ready
sleep(Duration::from_millis(500)).await;
match TcpStream::connect(("127.0.0.1", proxy2_port)).await {
Ok(_) => println!("Second proxy is ready"),
Err(e) => return Err(format!("Second proxy not ready: {e}").into()),
}
// Test making an HTTP request through the chained proxy
let mut stream = TcpStream::connect(("127.0.0.1", proxy2_port)).await?;
let request =
b"GET http://httpbin.org/ip HTTP/1.1\r\nHost: httpbin.org\r\nConnection: close\r\n\r\n";
stream.write_all(request).await?;
let mut response = Vec::new();
stream.read_to_end(&mut response).await?;
let response_str = String::from_utf8_lossy(&response);
if response_str.contains("200 OK") || response_str.contains("origin") {
println!("Chained proxy successfully forwarded HTTP request");
} else {
println!(
"Warning: Chained proxy response may be unexpected: {}",
&response_str[..response_str.len().min(200)]
);
}
// Cleanup
tracker.cleanup_all().await;
Ok(())
}
/// Test starting a local proxy with HTTP upstream proxy
#[tokio::test]
#[serial]
async fn test_local_proxy_with_http_upstream(
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let mut tracker = ProxyTestTracker::new(binary_path.clone());
// Start a mock HTTP upstream proxy server
let upstream_listener = tokio::net::TcpListener::bind("127.0.0.1:0").await?;
let upstream_addr = upstream_listener.local_addr()?;
let upstream_port = upstream_addr.port();
let upstream_handle = tokio::spawn(async move {
use http_body_util::Full;
use hyper::body::Bytes;
use hyper::server::conn::http1;
use hyper::service::service_fn;
use hyper::{Response, StatusCode};
use hyper_util::rt::TokioIo;
while let Ok((stream, _)) = upstream_listener.accept().await {
let io = TokioIo::new(stream);
tokio::task::spawn(async move {
let service = service_fn(|_req| async {
Ok::<_, hyper::Error>(
Response::builder()
.status(StatusCode::OK)
.body(Full::new(Bytes::from("Upstream Proxy Response")))
.unwrap(),
)
});
let _ = http1::Builder::new().serve_connection(io, service).await;
});
}
});
sleep(Duration::from_millis(200)).await;
println!("Starting local proxy with HTTP upstream proxy...");
let output = TestUtils::execute_command(
&binary_path,
&[
"proxy",
"start",
"--host",
"127.0.0.1",
"--proxy-port",
&upstream_port.to_string(),
"--type",
"http",
],
)
.await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
upstream_handle.abort();
return Err(format!("Proxy start failed - stdout: {stdout}, stderr: {stderr}").into());
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy_id.clone());
println!("Proxy started: id={}, port={}", proxy_id, local_port);
// Verify proxy is listening
sleep(Duration::from_millis(500)).await;
match TcpStream::connect(("127.0.0.1", local_port)).await {
Ok(_) => {
println!("Proxy is listening on port {local_port}");
}
Err(e) => {
upstream_handle.abort();
return Err(format!("Proxy port {local_port} is not listening: {e}").into());
}
}
// Cleanup
tracker.cleanup_all().await;
upstream_handle.abort();
Ok(())
}
/// Test multiple proxies running simultaneously
#[tokio::test]
#[serial]
async fn test_multiple_proxies_simultaneously(
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let mut tracker = ProxyTestTracker::new(binary_path.clone());
println!("Starting multiple proxies simultaneously...");
let mut proxy_ports = Vec::new();
// Start 3 proxies, waiting for each to be ready before starting the next
// This avoids race conditions on macOS where processes need time to initialize
for i in 0..3 {
let output = TestUtils::execute_command(&binary_path, &["proxy", "start"]).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
return Err(
format!(
"Failed to start proxy {} - stdout: {}, stderr: {}",
i + 1,
stdout,
stderr
)
.into(),
);
}
let config: Value = serde_json::from_str(&String::from_utf8(output.stdout)?)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy_id);
proxy_ports.push(local_port);
println!("Proxy {} started on port {}", i + 1, local_port);
// Wait for this proxy to be ready before starting the next one
// This prevents race conditions on macOS where processes need time to initialize
let mut attempts = 0;
let max_attempts = 50; // 5 seconds max (50 * 100ms)
loop {
sleep(Duration::from_millis(100)).await;
match TcpStream::connect(("127.0.0.1", local_port)).await {
Ok(_) => {
println!("Proxy {} is ready on port {}", i + 1, local_port);
break;
}
Err(_) => {
attempts += 1;
if attempts >= max_attempts {
return Err(
format!(
"Proxy {} on port {} failed to become ready after {} attempts",
i + 1,
local_port,
max_attempts
)
.into(),
);
}
}
}
}
}
// Verify all proxies are still listening
for (i, port) in proxy_ports.iter().enumerate() {
match TcpStream::connect(("127.0.0.1", *port)).await {
Ok(_) => {
println!("Proxy {} is listening on port {}", i + 1, port);
}
Err(e) => {
return Err(format!("Proxy {} on port {} is not listening: {e}", i + 1, port).into());
}
}
}
// Cleanup
tracker.cleanup_all().await;
Ok(())
}
/// Test proxy listing
#[tokio::test]
#[serial]
async fn test_proxy_list() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let mut tracker = ProxyTestTracker::new(binary_path.clone());
// Start a proxy
let output = TestUtils::execute_command(&binary_path, &["proxy", "start"]).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
return Err(format!("Failed to start proxy - stdout: {stdout}, stderr: {stderr}").into());
}
let config: Value = serde_json::from_str(&String::from_utf8(output.stdout)?)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
tracker.track_proxy(proxy_id.clone());
// List proxies
let list_output = TestUtils::execute_command(&binary_path, &["proxy", "list"]).await?;
if !list_output.status.success() {
return Err("Failed to list proxies".into());
}
let list_stdout = String::from_utf8(list_output.stdout)?;
let proxies: Vec<Value> = serde_json::from_str(&list_stdout)?;
// Verify our proxy is in the list
let found = proxies.iter().any(|p| p["id"].as_str() == Some(&proxy_id));
assert!(found, "Proxy should be in the list");
// Cleanup
tracker.cleanup_all().await;
Ok(())
}
/// Test traffic tracking through proxy
#[tokio::test]
#[serial]
async fn test_traffic_tracking() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let mut tracker = ProxyTestTracker::new(binary_path.clone());
println!("Testing traffic tracking through proxy...");
// Start a proxy
let output = TestUtils::execute_command(&binary_path, &["proxy", "start"]).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
return Err(format!("Failed to start proxy - stdout: {stdout}, stderr: {stderr}").into());
}
let config: Value = serde_json::from_str(&String::from_utf8(output.stdout)?)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy_id.clone());
println!("Proxy started on port {}", local_port);
// Wait for proxy to be ready
sleep(Duration::from_millis(500)).await;
// Make an HTTP request through the proxy
let mut stream = TcpStream::connect(("127.0.0.1", local_port)).await?;
let request =
b"GET http://httpbin.org/ip HTTP/1.1\r\nHost: httpbin.org\r\nConnection: close\r\n\r\n";
// Track bytes sent
let bytes_sent = request.len();
stream.write_all(request).await?;
// Read response
let mut response = Vec::new();
stream.read_to_end(&mut response).await?;
let bytes_received = response.len();
println!(
"HTTP request completed: sent {} bytes, received {} bytes",
bytes_sent, bytes_received
);
// Wait for traffic stats to be flushed (happens every second)
sleep(Duration::from_secs(2)).await;
// Verify traffic was tracked by checking traffic stats file exists
// Note: Traffic stats are stored in the cache directory
let cache_dir = directories::BaseDirs::new()
.expect("Failed to get base directories")
.cache_dir()
.to_path_buf();
let traffic_stats_dir = cache_dir.join("DonutBrowserDev").join("traffic_stats");
let stats_file = traffic_stats_dir.join(format!("{}.json", proxy_id));
if stats_file.exists() {
let content = std::fs::read_to_string(&stats_file)?;
let stats: Value = serde_json::from_str(&content)?;
let total_sent = stats["total_bytes_sent"].as_u64().unwrap_or(0);
let total_received = stats["total_bytes_received"].as_u64().unwrap_or(0);
let total_requests = stats["total_requests"].as_u64().unwrap_or(0);
println!(
"Traffic stats recorded: sent {} bytes, received {} bytes, {} requests",
total_sent, total_received, total_requests
);
// Check if domains are being tracked
let mut domain_traffic = false;
if let Some(domains) = stats.get("domains") {
if let Some(domain_map) = domains.as_object() {
println!("Domains tracked: {}", domain_map.len());
for (domain, domain_stats) in domain_map {
println!(" - {}", domain);
// Check if any domain has traffic
if let Some(domain_obj) = domain_stats.as_object() {
let domain_sent = domain_obj
.get("bytes_sent")
.and_then(|v| v.as_u64())
.unwrap_or(0);
let domain_recv = domain_obj
.get("bytes_received")
.and_then(|v| v.as_u64())
.unwrap_or(0);
let domain_reqs = domain_obj
.get("request_count")
.and_then(|v| v.as_u64())
.unwrap_or(0);
println!(
" sent: {}, received: {}, requests: {}",
domain_sent, domain_recv, domain_reqs
);
if domain_sent > 0 || domain_recv > 0 || domain_reqs > 0 {
domain_traffic = true;
}
}
}
}
}
// Verify that some traffic was recorded - check either total bytes or domain traffic
assert!(
total_sent > 0 || total_received > 0 || total_requests > 0 || domain_traffic,
"Traffic stats should record some activity (sent: {}, received: {}, requests: {})",
total_sent,
total_received,
total_requests
);
println!("Traffic tracking test passed!");
} else {
println!("Warning: Traffic stats file not found at {:?}", stats_file);
// This is not necessarily a failure - the file may not have been created yet
// The important thing is that the proxy is working
}
// Cleanup
tracker.cleanup_all().await;
// Clean up the traffic stats file
if stats_file.exists() {
let _ = std::fs::remove_file(&stats_file);
}
Ok(())
}
/// Test proxy stop
#[tokio::test]
#[serial]
async fn test_proxy_stop() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let binary_path = setup_test().await?;
let _tracker = ProxyTestTracker::new(binary_path.clone());
// Start a proxy
let output = TestUtils::execute_command(&binary_path, &["proxy", "start"]).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
return Err(format!("Failed to start proxy - stdout: {stdout}, stderr: {stderr}").into());
}
let config: Value = serde_json::from_str(&String::from_utf8(output.stdout)?)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
// Verify proxy is running
sleep(Duration::from_millis(500)).await;
match TcpStream::connect(("127.0.0.1", local_port)).await {
Ok(_) => println!("Proxy is running"),
Err(_) => return Err("Proxy is not running".into()),
}
// Stop the proxy
let stop_output =
TestUtils::execute_command(&binary_path, &["proxy", "stop", "--id", &proxy_id]).await?;
if !stop_output.status.success() {
return Err("Failed to stop proxy".into());
}
// Wait a bit for the process to exit
sleep(Duration::from_millis(500)).await;
// Verify proxy is stopped (connection should fail)
match TcpStream::connect(("127.0.0.1", local_port)).await {
Ok(_) => return Err("Proxy should be stopped but is still listening".into()),
Err(_) => println!("Proxy successfully stopped"),
}
Ok(())
}
-999
@@ -1,999 +0,0 @@
mod common;
use common::TestUtils;
use serde_json::Value;
/// Setup function to ensure clean state before tests
async fn setup_test() -> Result<std::path::PathBuf, Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = TestUtils::ensure_nodecar_binary().await?;
// Only clean up test-specific processes, not all processes
// This prevents interfering with actual app usage during testing
println!("Setting up test environment...");
Ok(nodecar_path)
}
/// Helper to track and cleanup specific test resources
struct TestResourceTracker {
proxy_ids: Vec<String>,
camoufox_ids: Vec<String>,
nodecar_path: std::path::PathBuf,
}
impl TestResourceTracker {
fn new(nodecar_path: std::path::PathBuf) -> Self {
Self {
proxy_ids: Vec::new(),
camoufox_ids: Vec::new(),
nodecar_path,
}
}
fn track_proxy(&mut self, proxy_id: String) {
self.proxy_ids.push(proxy_id);
}
fn track_camoufox(&mut self, camoufox_id: String) {
self.camoufox_ids.push(camoufox_id);
}
async fn cleanup_all(&self) {
// Use targeted cleanup to only stop test-specific processes
let _ = TestUtils::cleanup_specific_processes(
&self.nodecar_path,
&self.proxy_ids,
&self.camoufox_ids,
)
.await;
}
}
impl Drop for TestResourceTracker {
fn drop(&mut self) {
// Ensure cleanup happens even if test panics
let proxy_ids = self.proxy_ids.clone();
let camoufox_ids = self.camoufox_ids.clone();
let nodecar_path = self.nodecar_path.clone();
tokio::spawn(async move {
let _ = TestUtils::cleanup_specific_processes(&nodecar_path, &proxy_ids, &camoufox_ids).await;
});
}
}
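The `Drop` implementation above guarantees cleanup even when a test panics. A minimal, synchronous sketch of the same guard pattern (simplified: the `CleanupGuard` type and the `RefCell` log are illustrative stand-ins, not part of the test suite) looks like this:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Records which resource IDs were cleaned up, standing in for
// TestUtils::cleanup_specific_processes in the real tracker.
struct CleanupGuard {
    ids: Vec<String>,
    cleaned: Rc<RefCell<Vec<String>>>,
}

impl Drop for CleanupGuard {
    fn drop(&mut self) {
        // Runs on scope exit or unwind, mirroring TestResourceTracker's Drop.
        for id in self.ids.drain(..) {
            self.cleaned.borrow_mut().push(id);
        }
    }
}

fn main() {
    let log = Rc::new(RefCell::new(Vec::new()));
    {
        let mut guard = CleanupGuard {
            ids: Vec::new(),
            cleaned: log.clone(),
        };
        guard.ids.push("proxy-1".to_string());
        guard.ids.push("camoufox-1".to_string());
        // guard is dropped here, so cleanup runs before the assertion below
    }
    assert_eq!(log.borrow().len(), 2);
    println!("cleaned: {:?}", log.borrow());
}
```

The real tracker additionally spawns the async cleanup onto the tokio runtime, since `Drop` itself cannot be `async`.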
/// Integration tests for nodecar proxy functionality
#[tokio::test]
async fn test_nodecar_proxy_lifecycle() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
// Test proxy start with a known working upstream
let args = [
"proxy",
"start",
"--host",
"httpbin.org",
"--proxy-port",
"80",
"--type",
"http",
];
println!("Starting proxy with nodecar...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
tracker.cleanup_all().await;
return Err(format!("Proxy start failed - stdout: {stdout}, stderr: {stderr}").into());
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
// Verify proxy configuration structure
assert!(config["id"].is_string(), "Proxy ID should be a string");
assert!(
config["localPort"].is_number(),
"Local port should be a number"
);
assert!(
config["localUrl"].is_string(),
"Local URL should be a string"
);
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy_id.clone());
println!("Proxy started with ID: {proxy_id} on port: {local_port}");
// Wait for the proxy to start listening
let is_listening = TestUtils::wait_for_port_state(local_port, true, 10).await;
assert!(
is_listening,
"Proxy should be listening on the assigned port"
);
// Test stopping the proxy
let stop_args = ["proxy", "stop", "--id", &proxy_id];
let stop_output = TestUtils::execute_nodecar_command(&nodecar_path, &stop_args).await?;
assert!(stop_output.status.success(), "Proxy stop should succeed");
let port_available = TestUtils::wait_for_port_state(local_port, false, 5).await;
assert!(
port_available,
"Port should be available after stopping proxy"
);
tracker.cleanup_all().await;
Ok(())
}
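The lifecycle test leans on `TestUtils::wait_for_port_state` to confirm the proxy is actually listening before and after the stop command. A minimal sketch of such a helper (an assumption: the real implementation is not shown here and may differ in polling interval and async style) can be written with the standard library alone:

```rust
use std::net::{TcpListener, TcpStream};
use std::thread;
use std::time::Duration;

// Poll a local port until it reaches the desired listening state,
// returning false if the state is not observed within `attempts` polls.
fn wait_for_port_state(port: u16, should_listen: bool, attempts: u32) -> bool {
    for _ in 0..attempts {
        let listening = TcpStream::connect(("127.0.0.1", port)).is_ok();
        if listening == should_listen {
            return true;
        }
        thread::sleep(Duration::from_millis(200));
    }
    false
}

fn main() {
    // Bind an ephemeral port to demonstrate both state transitions.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let port = listener.local_addr().unwrap().port();
    assert!(wait_for_port_state(port, true, 5));
    drop(listener);
    assert!(wait_for_port_state(port, false, 25));
    println!("observed both port state transitions");
}
```

Polling like this avoids the fixed `sleep` used earlier in the file, which is a common source of flaky timing-dependent tests.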
/// Test proxy with authentication
#[tokio::test]
async fn test_nodecar_proxy_with_auth() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
let args = [
"proxy",
"start",
"--host",
"httpbin.org",
"--proxy-port",
"80",
"--type",
"http",
"--username",
"testuser",
"--password",
"testpass",
];
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if output.status.success() {
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
tracker.track_proxy(proxy_id.clone());
// Verify upstream URL contains encoded credentials
if let Some(upstream_url) = config["upstreamUrl"].as_str() {
assert!(
upstream_url.contains("testuser"),
"Upstream URL should contain username"
);
// Password might be encoded, so we check for the presence of auth info
assert!(
upstream_url.contains("@"),
"Upstream URL should contain auth separator"
);
}
}
tracker.cleanup_all().await;
Ok(())
}
/// Test proxy list functionality
#[tokio::test]
async fn test_nodecar_proxy_list() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
// Start a proxy first
let start_args = [
"proxy",
"start",
"--host",
"httpbin.org",
"--proxy-port",
"80",
"--type",
"http",
];
let start_output = TestUtils::execute_nodecar_command(&nodecar_path, &start_args).await?;
if start_output.status.success() {
let stdout = String::from_utf8(start_output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
tracker.track_proxy(proxy_id.clone());
// Test list command
let list_args = ["proxy", "list"];
let list_output = TestUtils::execute_nodecar_command(&nodecar_path, &list_args).await?;
assert!(list_output.status.success(), "Proxy list should succeed");
let list_stdout = String::from_utf8(list_output.stdout)?;
let proxy_list: Value = serde_json::from_str(&list_stdout)?;
assert!(proxy_list.is_array(), "Proxy list should be an array");
let proxies = proxy_list.as_array().unwrap();
assert!(
!proxies.is_empty(),
"Should have at least one proxy in the list"
);
// Find our proxy in the list
let found_proxy = proxies.iter().find(|p| p["id"].as_str() == Some(&proxy_id));
assert!(found_proxy.is_some(), "Started proxy should be in the list");
}
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox functionality
#[tokio::test]
async fn test_nodecar_camoufox_lifecycle() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
let temp_dir = TestUtils::create_temp_dir()?;
let profile_path = temp_dir.path().join("test_profile");
let args = [
"camoufox",
"start",
"--profile-path",
profile_path.to_str().unwrap(),
"--headless",
];
println!("Starting Camoufox with nodecar...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
// If Camoufox is not installed or times out, skip the test
if stderr.contains("not installed")
|| stderr.contains("not found")
|| stderr.contains("timeout")
|| stdout.contains("timeout")
{
println!("Skipping Camoufox test - Camoufox not available or timed out");
tracker.cleanup_all().await;
return Ok(());
}
tracker.cleanup_all().await;
return Err(format!("Camoufox start failed - stdout: {stdout}, stderr: {stderr}").into());
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
// Verify Camoufox configuration structure
assert!(config["id"].is_string(), "Camoufox ID should be a string");
let camoufox_id = config["id"].as_str().unwrap().to_string();
tracker.track_camoufox(camoufox_id.clone());
println!("Camoufox started with ID: {camoufox_id}");
// Test stopping Camoufox
let stop_args = ["camoufox", "stop", "--id", &camoufox_id];
let stop_output = TestUtils::execute_nodecar_command(&nodecar_path, &stop_args).await?;
assert!(stop_output.status.success(), "Camoufox stop should succeed");
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox with URL opening
#[tokio::test]
async fn test_nodecar_camoufox_with_url() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
let temp_dir = TestUtils::create_temp_dir()?;
let profile_path = temp_dir.path().join("test_profile_url");
let args = [
"camoufox",
"start",
"--profile-path",
profile_path.to_str().unwrap(),
"--url",
"https://httpbin.org/get",
"--headless",
];
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if output.status.success() {
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let camoufox_id = config["id"].as_str().unwrap().to_string();
tracker.track_camoufox(camoufox_id.clone());
// Verify URL is set
if let Some(url) = config["url"].as_str() {
assert_eq!(
url, "https://httpbin.org/get",
"URL should match what was provided"
);
}
// Test stopping Camoufox explicitly
let stop_args = ["camoufox", "stop", "--id", &camoufox_id];
let stop_output = TestUtils::execute_nodecar_command(&nodecar_path, &stop_args).await?;
assert!(stop_output.status.success(), "Camoufox stop should succeed");
} else {
println!("Skipping Camoufox URL test - likely not installed");
tracker.cleanup_all().await;
return Ok(());
}
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox list functionality
#[tokio::test]
async fn test_nodecar_camoufox_list() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let tracker = TestResourceTracker::new(nodecar_path.clone());
// Test list command (should work even without Camoufox installed)
let list_args = ["camoufox", "list"];
let list_output = TestUtils::execute_nodecar_command(&nodecar_path, &list_args).await?;
assert!(list_output.status.success(), "Camoufox list should succeed");
let list_stdout = String::from_utf8(list_output.stdout)?;
let camoufox_list: Value = serde_json::from_str(&list_stdout)?;
assert!(camoufox_list.is_array(), "Camoufox list should be an array");
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox process tracking and management
#[tokio::test]
async fn test_nodecar_camoufox_process_tracking(
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
let temp_dir = TestUtils::create_temp_dir()?;
let profile_path = temp_dir.path().join("test_profile_tracking");
// Start multiple Camoufox instances
let mut instance_ids: Vec<String> = Vec::new();
for i in 0..2 {
let instance_profile_path = format!("{}_instance_{}", profile_path.to_str().unwrap(), i);
let args = [
"camoufox",
"start",
"--profile-path",
&instance_profile_path,
"--headless",
];
println!("Starting Camoufox instance {i}...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
// If Camoufox is not installed, skip the test
if stderr.contains("not installed") || stderr.contains("not found") {
println!("Skipping Camoufox process tracking test - Camoufox not installed");
tracker.cleanup_all().await;
return Ok(());
}
tracker.cleanup_all().await;
return Err(
format!("Camoufox instance {i} start failed - stdout: {stdout}, stderr: {stderr}").into(),
);
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let camoufox_id = config["id"].as_str().unwrap().to_string();
instance_ids.push(camoufox_id.clone());
tracker.track_camoufox(camoufox_id.clone());
println!("Camoufox instance {i} started with ID: {camoufox_id}");
}
// Verify all instances are tracked
let list_args = ["camoufox", "list"];
let list_output = TestUtils::execute_nodecar_command(&nodecar_path, &list_args).await?;
assert!(list_output.status.success(), "Camoufox list should succeed");
let list_stdout = String::from_utf8(list_output.stdout)?;
println!("Camoufox list output: {list_stdout}");
let instances: Value = serde_json::from_str(&list_stdout)?;
let instances_array = instances.as_array().unwrap();
println!("Found {} instances in list", instances_array.len());
// Verify our instances are in the list
for instance_id in &instance_ids {
let instance_found = instances_array
.iter()
.any(|i| i["id"].as_str() == Some(instance_id));
if !instance_found {
println!("Instance {instance_id} not found in list. Available instances:");
for instance in instances_array {
if let Some(id) = instance["id"].as_str() {
println!(" - {id}");
}
}
}
assert!(
instance_found,
"Camoufox instance {instance_id} should be found in list"
);
}
// Stop all instances individually
for instance_id in &instance_ids {
println!("Stopping Camoufox instance: {instance_id}");
let stop_args = ["camoufox", "stop", "--id", instance_id];
let stop_output = TestUtils::execute_nodecar_command(&nodecar_path, &stop_args).await?;
if stop_output.status.success() {
let stop_stdout = String::from_utf8(stop_output.stdout)?;
if let Ok(stop_result) = serde_json::from_str::<Value>(&stop_stdout) {
let success = stop_result["success"].as_bool().unwrap_or(false);
if !success {
println!("Warning: Stop command returned success=false for instance {instance_id}");
}
} else {
println!("Warning: Could not parse stop result for instance {instance_id}");
}
} else {
println!("Warning: Stop command failed for instance {instance_id}");
}
}
// Verify all instances are removed
let list_output_after = TestUtils::execute_nodecar_command(&nodecar_path, &list_args).await?;
let instances_after: Value = serde_json::from_str(&String::from_utf8(list_output_after.stdout)?)?;
let instances_after_array = instances_after.as_array().unwrap();
for instance_id in &instance_ids {
let instance_still_exists = instances_after_array
.iter()
.any(|i| i["id"].as_str() == Some(instance_id));
assert!(
!instance_still_exists,
"Stopped Camoufox instance {instance_id} should not be found in list"
);
}
println!("Camoufox process tracking test completed successfully");
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox with various configuration options
#[tokio::test]
async fn test_nodecar_camoufox_configuration_options(
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
let temp_dir = TestUtils::create_temp_dir()?;
let profile_path = temp_dir.path().join("test_profile_config");
let args = [
"camoufox",
"start",
"--profile-path",
profile_path.to_str().unwrap(),
"--block-images",
"--max-width",
"1920",
"--max-height",
"1080",
"--headless",
];
println!("Starting Camoufox with configuration options...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
// If Camoufox is not installed, skip the test
if stderr.contains("not installed") || stderr.contains("not found") {
println!("Skipping Camoufox configuration test - Camoufox not installed");
tracker.cleanup_all().await;
return Ok(());
}
tracker.cleanup_all().await;
return Err(
format!("Camoufox with config start failed - stdout: {stdout}, stderr: {stderr}").into(),
);
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let camoufox_id = config["id"].as_str().unwrap().to_string();
tracker.track_camoufox(camoufox_id.clone());
println!("Camoufox with configuration started with ID: {camoufox_id}");
// Verify configuration was applied by checking the profile path
if let Some(returned_profile_path) = config["profilePath"].as_str() {
assert!(
returned_profile_path.contains("test_profile_config"),
"Profile path should match what was provided"
);
}
// Test stopping Camoufox explicitly
let stop_args = ["camoufox", "stop", "--id", &camoufox_id];
let stop_output = TestUtils::execute_nodecar_command(&nodecar_path, &stop_args).await?;
assert!(stop_output.status.success(), "Camoufox stop should succeed");
println!("Camoufox configuration test completed successfully");
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox generate-config command with basic options
#[ignore = "CI is rate limited for camoufox download"]
#[tokio::test]
async fn test_nodecar_camoufox_generate_config_basic(
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let tracker = TestResourceTracker::new(nodecar_path.clone());
let args = [
"camoufox",
"generate-config",
"--max-width",
"1920",
"--max-height",
"1080",
"--block-images",
];
println!("Testing Camoufox config generation with basic options...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
tracker.cleanup_all().await;
return Err(
format!("Camoufox generate-config failed - stdout: {stdout}, stderr: {stderr}").into(),
);
}
let stdout = String::from_utf8(output.stdout)?;
println!("Generated config output: {stdout}");
// Parse the generated config as JSON
let config: Value = serde_json::from_str(&stdout)?;
// Verify the config contains expected properties
assert!(
config.is_object(),
"Generated config should be a JSON object"
);
// Check for some expected fingerprint properties
assert!(
config.get("screen.width").is_some(),
"Config should contain screen.width"
);
assert!(
config.get("screen.height").is_some(),
"Config should contain screen.height"
);
assert!(
config.get("navigator.userAgent").is_some(),
"Config should contain navigator.userAgent"
);
println!("Camoufox generate-config basic test completed successfully");
tracker.cleanup_all().await;
Ok(())
}
/// Test Camoufox generate-config command with custom fingerprint
#[ignore = "CI is rate limited for camoufox download"]
#[tokio::test]
async fn test_nodecar_camoufox_generate_config_custom_fingerprint(
) -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let tracker = TestResourceTracker::new(nodecar_path.clone());
// Create a custom fingerprint JSON
let custom_fingerprint = r#"{
"screen.width": 1440,
"screen.height": 900,
"navigator.userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:135.0) Gecko/20100101 Firefox/140.0",
"navigator.platform": "TestPlatform",
"timezone": "America/New_York",
"locale:language": "en",
"locale:region": "US"
}"#;
let args = [
"camoufox",
"generate-config",
"--fingerprint",
custom_fingerprint,
"--block-webrtc",
];
println!("Testing Camoufox config generation with custom fingerprint...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
tracker.cleanup_all().await;
return Err(
format!("Camoufox generate-config with custom fingerprint failed - stdout: {stdout}, stderr: {stderr}").into(),
);
}
let stdout = String::from_utf8(output.stdout)?;
// Parse the generated config as JSON
let config: Value = serde_json::from_str(&stdout)?;
// Verify the config contains expected properties
assert!(
config.is_object(),
"Generated config should be a JSON object"
);
// Check that our custom values are preserved
assert_eq!(
config.get("screen.width").and_then(|v| v.as_u64()),
Some(1440),
"Custom screen width should be preserved"
);
assert_eq!(
config.get("screen.height").and_then(|v| v.as_u64()),
Some(900),
"Custom screen height should be preserved"
);
assert_eq!(
config.get("navigator.userAgent").and_then(|v| v.as_str()),
Some("Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:135.0) Gecko/20100101 Firefox/140.0"),
"Custom user agent should be preserved"
);
assert_eq!(
config.get("timezone").and_then(|v| v.as_str()),
Some("America/New_York"),
"Custom timezone should be preserved"
);
println!("Camoufox generate-config custom fingerprint test completed successfully");
tracker.cleanup_all().await;
Ok(())
}
/// Test nodecar command validation
#[tokio::test]
async fn test_nodecar_command_validation() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let tracker = TestResourceTracker::new(nodecar_path.clone());
// Test invalid command
let invalid_args = ["invalid", "command"];
let output = TestUtils::execute_nodecar_command(&nodecar_path, &invalid_args).await?;
assert!(!output.status.success(), "Invalid command should fail");
tracker.cleanup_all().await;
Ok(())
}
/// Test concurrent proxy operations
#[tokio::test]
async fn test_nodecar_concurrent_proxies() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
// Start multiple proxies concurrently
let mut handles = vec![];
for i in 0..3 {
let nodecar_path_clone = nodecar_path.clone();
let handle = tokio::spawn(async move {
let args = [
"proxy",
"start",
"--host",
"httpbin.org",
"--proxy-port",
"80",
"--type",
"http",
];
TestUtils::execute_nodecar_command(&nodecar_path_clone, &args).await
});
handles.push((i, handle));
}
// Wait for all proxies to start
for (i, handle) in handles {
match handle.await.map_err(|e| format!("Join error: {e}"))? {
Ok(output) if output.status.success() => {
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
tracker.track_proxy(proxy_id.clone());
println!("Proxy {i} started successfully");
}
Ok(output) => {
let stderr = String::from_utf8_lossy(&output.stderr);
println!("Proxy {i} failed to start: {stderr}");
}
Err(e) => {
println!("Proxy {i} error: {e}");
}
}
}
tracker.cleanup_all().await;
Ok(())
}
/// Test proxy with different upstream types
#[tokio::test]
async fn test_nodecar_proxy_types() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
let test_cases = vec![
("http", "httpbin.org", "80"),
("https", "httpbin.org", "443"),
];
for (proxy_type, host, port) in test_cases {
println!("Testing {proxy_type} proxy to {host}:{port}");
let args = [
"proxy",
"start",
"--host",
host,
"--proxy-port",
port,
"--type",
proxy_type,
];
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if output.status.success() {
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
let proxy_id = config["id"].as_str().unwrap().to_string();
tracker.track_proxy(proxy_id.clone());
println!("{proxy_type} proxy test passed");
} else {
let stderr = String::from_utf8_lossy(&output.stderr);
println!("{proxy_type} proxy test failed: {stderr}");
}
}
tracker.cleanup_all().await;
Ok(())
}
/// Test direct proxy (no upstream) functionality
#[tokio::test]
async fn test_nodecar_direct_proxy() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
// Test starting a direct proxy (no upstream)
let args = ["proxy", "start"];
println!("Starting direct proxy with nodecar...");
let output = TestUtils::execute_nodecar_command(&nodecar_path, &args).await?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
tracker.cleanup_all().await;
return Err(format!("Direct proxy start failed - stdout: {stdout}, stderr: {stderr}").into());
}
let stdout = String::from_utf8(output.stdout)?;
let config: Value = serde_json::from_str(&stdout)?;
// Verify proxy configuration structure
assert!(config["id"].is_string(), "Proxy ID should be a string");
assert!(
config["localPort"].is_number(),
"Local port should be a number"
);
assert!(
config["localUrl"].is_string(),
"Local URL should be a string"
);
assert_eq!(
config["upstreamUrl"].as_str().unwrap(),
"DIRECT",
"Upstream URL should be DIRECT"
);
let proxy_id = config["id"].as_str().unwrap().to_string();
let local_port = config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(proxy_id.clone());
println!("Direct proxy started with ID: {proxy_id} on port: {local_port}");
// Wait for the proxy to start listening
let is_listening = TestUtils::wait_for_port_state(local_port, true, 10).await;
assert!(
is_listening,
"Direct proxy should be listening on the assigned port"
);
// Test stopping the proxy
let stop_args = ["proxy", "stop", "--id", &proxy_id];
let stop_output = TestUtils::execute_nodecar_command(&nodecar_path, &stop_args).await?;
assert!(
stop_output.status.success(),
"Direct proxy stop should succeed"
);
let port_available = TestUtils::wait_for_port_state(local_port, false, 5).await;
assert!(
port_available,
"Port should be available after stopping direct proxy"
);
println!("Direct proxy test completed successfully");
tracker.cleanup_all().await;
Ok(())
}
/// Test proxy chaining - create two proxies where the second uses the first as upstream
#[tokio::test]
async fn test_nodecar_socks5_proxy_chaining() -> Result<(), Box<dyn std::error::Error + Send + Sync>>
{
let nodecar_path = setup_test().await?;
let mut tracker = TestResourceTracker::new(nodecar_path.clone());
// Step 1: Start a SOCKS5 proxy with a known working upstream (httpbin.org)
let socks5_args = [
"proxy",
"start",
"--host",
"httpbin.org",
"--proxy-port",
"80",
"--type",
"http", // Use HTTP upstream for the first proxy
];
println!("Starting first proxy with HTTP upstream...");
let socks5_output = TestUtils::execute_nodecar_command(&nodecar_path, &socks5_args).await?;
if !socks5_output.status.success() {
let stderr = String::from_utf8_lossy(&socks5_output.stderr);
let stdout = String::from_utf8_lossy(&socks5_output.stdout);
tracker.cleanup_all().await;
return Err(format!("First proxy start failed - stdout: {stdout}, stderr: {stderr}").into());
}
let socks5_stdout = String::from_utf8(socks5_output.stdout)?;
let socks5_config: Value = serde_json::from_str(&socks5_stdout)?;
let socks5_proxy_id = socks5_config["id"].as_str().unwrap().to_string();
let socks5_local_port = socks5_config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(socks5_proxy_id.clone());
println!("First proxy started with ID: {socks5_proxy_id} on port: {socks5_local_port}");
// Step 2: Start a second proxy that uses the first proxy as upstream
let http_proxy_args = [
"proxy",
"start",
"--upstream",
&format!("http://127.0.0.1:{socks5_local_port}"),
];
println!("Starting second proxy with first proxy as upstream...");
let http_output = TestUtils::execute_nodecar_command(&nodecar_path, &http_proxy_args).await?;
if !http_output.status.success() {
let stderr = String::from_utf8_lossy(&http_output.stderr);
let stdout = String::from_utf8_lossy(&http_output.stdout);
tracker.cleanup_all().await;
return Err(
format!("Second proxy with chained upstream failed - stdout: {stdout}, stderr: {stderr}")
.into(),
);
}
let http_stdout = String::from_utf8(http_output.stdout)?;
let http_config: Value = serde_json::from_str(&http_stdout)?;
let http_proxy_id = http_config["id"].as_str().unwrap().to_string();
let http_local_port = http_config["localPort"].as_u64().unwrap() as u16;
tracker.track_proxy(http_proxy_id.clone());
println!(
"Second proxy started with ID: {http_proxy_id} on port: {http_local_port} (chained through first proxy)"
);
// Verify both proxies are listening by waiting for them to be occupied
let socks5_listening = TestUtils::wait_for_port_state(socks5_local_port, true, 5).await;
let http_listening = TestUtils::wait_for_port_state(http_local_port, true, 5).await;
assert!(
socks5_listening,
"First proxy should be listening on port {socks5_local_port}"
);
assert!(
http_listening,
"Second proxy should be listening on port {http_local_port}"
);
// Clean up both proxies
let stop_http_args = ["proxy", "stop", "--id", &http_proxy_id];
let stop_socks5_args = ["proxy", "stop", "--id", &socks5_proxy_id];
let http_stop_result = TestUtils::execute_nodecar_command(&nodecar_path, &stop_http_args).await;
let socks5_stop_result =
TestUtils::execute_nodecar_command(&nodecar_path, &stop_socks5_args).await;
// Verify cleanup
assert!(
http_stop_result.is_ok() && http_stop_result.unwrap().status.success(),
"Second proxy stop should succeed"
);
assert!(
socks5_stop_result.is_ok() && socks5_stop_result.unwrap().status.success(),
"First proxy stop should succeed"
);
let http_port_available = TestUtils::wait_for_port_state(http_local_port, false, 5).await;
let socks5_port_available = TestUtils::wait_for_port_state(socks5_local_port, false, 5).await;
assert!(
http_port_available,
"Second proxy port should be available after stopping"
);
assert!(
socks5_port_available,
"First proxy port should be available after stopping"
);
println!("Proxy chaining test completed successfully");
tracker.cleanup_all().await;
Ok(())
}
+8 -1
@@ -1,10 +1,13 @@
"use client";
import { Geist, Geist_Mono } from "next/font/google";
import "@/styles/globals.css";
import "flag-icons/css/flag-icons.min.css";
import { useEffect } from "react";
import { CustomThemeProvider } from "@/components/theme-provider";
import { Toaster } from "@/components/ui/sonner";
import { TooltipProvider } from "@/components/ui/tooltip";
import { WindowDragArea } from "@/components/window-drag-area";
import { setupLogging } from "@/lib/logger";
const geistSans = Geist({
variable: "--font-geist-sans",
@@ -21,10 +24,14 @@ export default function RootLayout({
}: Readonly<{
children: React.ReactNode;
}>) {
useEffect(() => {
void setupLogging();
}, []);
return (
<html lang="en" suppressHydrationWarning>
<body
className={`${geistSans.variable} ${geistMono.variable} antialiased overflow-hidden`}
className={`${geistSans.variable} ${geistMono.variable} antialiased overflow-hidden bg-background`}
>
<CustomThemeProvider>
<WindowDragArea />
+250 -272
@@ -3,7 +3,7 @@
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { getCurrent } from "@tauri-apps/plugin-deep-link";
import { useCallback, useEffect, useRef, useState } from "react";
import { useCallback, useEffect, useMemo, useState } from "react";
import { CamoufoxConfigDialog } from "@/components/camoufox-config-dialog";
import { CreateProfileDialog } from "@/components/create-profile-dialog";
import { DeleteConfirmationDialog } from "@/components/delete-confirmation-dialog";
@@ -15,14 +15,19 @@ import { ImportProfileDialog } from "@/components/import-profile-dialog";
import { PermissionDialog } from "@/components/permission-dialog";
import { ProfilesDataTable } from "@/components/profile-data-table";
import { ProfileSelectorDialog } from "@/components/profile-selector-dialog";
import { ProxyAssignmentDialog } from "@/components/proxy-assignment-dialog";
import { ProxyManagementDialog } from "@/components/proxy-management-dialog";
import { SettingsDialog } from "@/components/settings-dialog";
import { useAppUpdateNotifications } from "@/hooks/use-app-update-notifications";
import { useGroupEvents } from "@/hooks/use-group-events";
import type { PermissionType } from "@/hooks/use-permissions";
import { usePermissions } from "@/hooks/use-permissions";
import { useProfileEvents } from "@/hooks/use-profile-events";
import { useProxyEvents } from "@/hooks/use-proxy-events";
import { useUpdateNotifications } from "@/hooks/use-update-notifications";
import { useVersionUpdater } from "@/hooks/use-version-updater";
import { showErrorToast, showToast } from "@/lib/toast-utils";
import type { BrowserProfile, CamoufoxConfig, GroupWithCount } from "@/types";
import type { BrowserProfile, CamoufoxConfig } from "@/types";
type BrowserTypeString =
| "mullvad-browser"
@@ -40,9 +45,30 @@ interface PendingUrl {
}
export default function Home() {
// Mount global version update listener/toasts
useVersionUpdater();
const [isInitializing, setIsInitializing] = useState(true);
const [profiles, setProfiles] = useState<BrowserProfile[]>([]);
const [error, setError] = useState<string | null>(null);
// Use the new profile events hook for centralized profile management
const {
profiles,
runningProfiles,
isLoading: profilesLoading,
error: profilesError,
} = useProfileEvents();
const {
groups: groupsData,
isLoading: groupsLoading,
error: groupsError,
} = useGroupEvents();
const {
storedProxies,
isLoading: proxiesLoading,
error: proxiesError,
} = useProxyEvents();
const [createProfileDialogOpen, setCreateProfileDialogOpen] = useState(false);
const [settingsDialogOpen, setSettingsDialogOpen] = useState(false);
const [importProfileDialogOpen, setImportProfileDialogOpen] = useState(false);
@@ -54,18 +80,22 @@ export default function Home() {
useState(false);
const [groupAssignmentDialogOpen, setGroupAssignmentDialogOpen] =
useState(false);
const [proxyAssignmentDialogOpen, setProxyAssignmentDialogOpen] =
useState(false);
const [selectedGroupId, setSelectedGroupId] = useState<string>("default");
const [selectedProfilesForGroup, setSelectedProfilesForGroup] = useState<
string[]
>([]);
const [selectedProfilesForProxy, setSelectedProfilesForProxy] = useState<
string[]
>([]);
const [selectedProfiles, setSelectedProfiles] = useState<string[]>([]);
const [searchQuery, setSearchQuery] = useState<string>("");
const [pendingUrls, setPendingUrls] = useState<PendingUrl[]>([]);
const [currentProfileForCamoufoxConfig, setCurrentProfileForCamoufoxConfig] =
useState<BrowserProfile | null>(null);
const [hasCheckedStartupPrompt, setHasCheckedStartupPrompt] = useState(false);
const [permissionDialogOpen, setPermissionDialogOpen] = useState(false);
const [groups, setGroups] = useState<GroupWithCount[]>([]);
const [areGroupsLoading, setGroupsLoading] = useState(true);
const [currentPermissionType, setCurrentPermissionType] =
useState<PermissionType>("microphone");
const [showBulkDeleteConfirmation, setShowBulkDeleteConfirmation] =
@@ -142,11 +172,6 @@ export default function Home() {
"Failed to download missing components:",
downloadError,
);
setError(
`Failed to download missing components: ${JSON.stringify(
downloadError,
)}`,
);
}
}
} catch (err: unknown) {
@@ -154,22 +179,6 @@ export default function Home() {
}
}, []);
// Simple profiles loader without updates check (for use as callback)
const loadProfiles = useCallback(async () => {
try {
const profileList = await invoke<BrowserProfile[]>(
"list_browser_profiles",
);
setProfiles(profileList);
// Check for missing binaries after loading profiles
await checkMissingBinaries();
} catch (err: unknown) {
console.error("Failed to load profiles:", err);
setError(`Failed to load profiles: ${JSON.stringify(err)}`);
}
}, [checkMissingBinaries]);
const [processingUrls, setProcessingUrls] = useState<Set<string>>(new Set());
const handleUrlOpen = useCallback(
@@ -203,26 +212,9 @@ export default function Home() {
);
// Auto-update functionality - use the existing hook for compatibility
const updateNotifications = useUpdateNotifications(loadProfiles);
const updateNotifications = useUpdateNotifications();
const { checkForUpdates, isUpdating } = updateNotifications;
// Profiles loader with update check (for initial load and manual refresh)
const loadProfilesWithUpdateCheck = useCallback(async () => {
try {
const profileList = await invoke<BrowserProfile[]>(
"list_browser_profiles",
);
setProfiles(profileList);
// Check for updates after loading profiles
await checkForUpdates();
await checkMissingBinaries();
} catch (err: unknown) {
console.error("Failed to load profiles:", err);
setError(`Failed to load profiles: ${JSON.stringify(err)}`);
}
}, [checkForUpdates, checkMissingBinaries]);
useAppUpdateNotifications();
// Check for startup URLs but only process them once
@@ -269,9 +261,8 @@ export default function Home() {
await invoke("warm_up_nodecar");
} catch (err) {
if (!cancelled) {
setError(
`Initialization failed: ${err instanceof Error ? err.message : String(err)}`,
);
// Don't set error here since useProfileEvents handles profile errors
console.error("Initialization failed:", err);
}
} finally {
if (!cancelled) setIsInitializing(false);
@@ -283,6 +274,27 @@ export default function Home() {
};
}, []);
// Handle profile errors from useProfileEvents hook
useEffect(() => {
if (profilesError) {
showErrorToast(profilesError);
}
}, [profilesError]);
// Handle group errors from useGroupEvents hook
useEffect(() => {
if (groupsError) {
showErrorToast(groupsError);
}
}, [groupsError]);
// Handle proxy errors from useProxyEvents hook
useEffect(() => {
if (proxiesError) {
showErrorToast(proxiesError);
}
}, [proxiesError]);
const checkAllPermissions = useCallback(async () => {
try {
// Wait for permissions to be initialized before checking
@@ -339,7 +351,7 @@ export default function Home() {
"Received show create profile dialog request:",
event.payload,
);
setError(
showErrorToast(
"No profiles available. Please create a profile first before opening URLs.",
);
setCreateProfileDialogOpen(true);
@@ -375,38 +387,24 @@ export default function Home() {
const handleSaveCamoufoxConfig = useCallback(
async (profile: BrowserProfile, config: CamoufoxConfig) => {
setError(null);
try {
await invoke("update_camoufox_config", {
profileName: profile.name,
profileId: profile.id,
config,
});
await loadProfiles();
// No need to manually reload - useProfileEvents will handle the update
setCamoufoxConfigDialogOpen(false);
} catch (err: unknown) {
console.error("Failed to update camoufox config:", err);
setError(`Failed to update camoufox config: ${JSON.stringify(err)}`);
showErrorToast(
`Failed to update camoufox config: ${JSON.stringify(err)}`,
);
throw err;
}
},
[loadProfiles],
[],
);
const loadGroups = useCallback(async () => {
setGroupsLoading(true);
try {
const groupsWithCounts = await invoke<GroupWithCount[]>(
"get_groups_with_profile_counts",
);
setGroups(groupsWithCounts);
} catch (err) {
console.error("Failed to load groups with counts:", err);
setGroups([]);
} finally {
setGroupsLoading(false);
}
}, []);
const handleCreateProfile = useCallback(
async (profileData: {
name: string;
@@ -417,29 +415,22 @@ export default function Home() {
camoufoxConfig?: CamoufoxConfig;
groupId?: string;
}) => {
setError(null);
try {
const _profile = await invoke<BrowserProfile>(
"create_browser_profile_new",
{
name: profileData.name,
browserStr: profileData.browserStr,
version: profileData.version,
releaseType: profileData.releaseType,
proxyId: profileData.proxyId,
camoufoxConfig: profileData.camoufoxConfig,
groupId:
profileData.groupId ||
(selectedGroupId !== "default" ? selectedGroupId : undefined),
},
);
await invoke<BrowserProfile>("create_browser_profile_new", {
name: profileData.name,
browserStr: profileData.browserStr,
version: profileData.version,
releaseType: profileData.releaseType,
proxyId: profileData.proxyId,
camoufoxConfig: profileData.camoufoxConfig,
groupId:
profileData.groupId ||
(selectedGroupId !== "default" ? selectedGroupId : undefined),
});
await loadProfiles();
await loadGroups();
// Trigger proxy data reload in the table
// No need to manually reload - useProfileEvents will handle the update
} catch (error) {
setError(
showErrorToast(
`Failed to create profile: ${
error instanceof Error ? error.message : String(error)
}`,
@@ -447,166 +438,102 @@ export default function Home() {
throw error;
}
},
[loadProfiles, loadGroups, selectedGroupId],
[selectedGroupId],
);
const [runningProfiles, setRunningProfiles] = useState<Set<string>>(
new Set(),
);
const launchProfile = useCallback(async (profile: BrowserProfile) => {
console.log("Starting launch for profile:", profile.name);
const runningProfilesRef = useRef<Set<string>>(new Set());
const checkBrowserStatus = useCallback(async (profile: BrowserProfile) => {
try {
const result = await invoke<BrowserProfile>("launch_browser_profile", {
profile,
});
console.log("Successfully launched profile:", result.name);
} catch (err: unknown) {
console.error("Failed to launch browser:", err);
const errorMessage = err instanceof Error ? err.message : String(err);
showErrorToast(`Failed to launch browser: ${errorMessage}`);
// Re-throw the error so the table component can handle loading state cleanup
throw err;
}
}, []);
const handleDeleteProfile = useCallback(async (profile: BrowserProfile) => {
console.log("Attempting to delete profile:", profile.name);
try {
// First check if the browser is running for this profile
const isRunning = await invoke<boolean>("check_browser_status", {
profile,
});
const currentRunning = runningProfilesRef.current.has(profile.name);
if (isRunning !== currentRunning) {
console.log(
`Profile ${profile.name} (${profile.browser}) status changed: ${currentRunning} -> ${isRunning}`,
if (isRunning) {
showErrorToast(
"Cannot delete profile while browser is running. Please stop the browser first.",
);
setRunningProfiles((prev) => {
const next = new Set(prev);
if (isRunning) {
next.add(profile.name);
} else {
next.delete(profile.name);
}
runningProfilesRef.current = next;
return next;
});
return;
}
} catch (err) {
console.error("Failed to check browser status:", err);
// Attempt to delete the profile
await invoke("delete_profile", { profileId: profile.id });
console.log("Profile deletion command completed successfully");
// No need to manually reload - useProfileEvents will handle the update
console.log("Profile deleted successfully");
} catch (err: unknown) {
console.error("Failed to delete profile:", err);
const errorMessage = err instanceof Error ? err.message : String(err);
showErrorToast(`Failed to delete profile: ${errorMessage}`);
}
}, []);
const launchProfile = useCallback(
async (profile: BrowserProfile) => {
setError(null);
// Check if browser is disabled due to ongoing update
try {
const isDisabled = await invoke<boolean>(
"is_browser_disabled_for_update",
{
browser: profile.browser,
},
);
if (isDisabled || isUpdating(profile.browser)) {
setError(
`${profile.browser} is currently being updated. Please wait for the update to complete.`,
);
return;
}
} catch (err) {
console.error("Failed to check browser update status:", err);
}
try {
const updatedProfile = await invoke<BrowserProfile>(
"launch_browser_profile",
{ profile },
);
await loadProfiles();
await checkBrowserStatus(updatedProfile);
} catch (err: unknown) {
console.error("Failed to launch browser:", err);
setError(`Failed to launch browser: ${JSON.stringify(err)}`);
}
},
[loadProfiles, checkBrowserStatus, isUpdating],
);
const handleDeleteProfile = useCallback(
async (profile: BrowserProfile) => {
setError(null);
console.log("Attempting to delete profile:", profile.name);
try {
// First check if the browser is running for this profile
const isRunning = await invoke<boolean>("check_browser_status", {
profile,
});
if (isRunning) {
setError(
"Cannot delete profile while browser is running. Please stop the browser first.",
);
return;
}
// Attempt to delete the profile
await invoke("delete_profile", { profileName: profile.name });
console.log("Profile deletion command completed successfully");
// Give a small delay to ensure file system operations complete
await new Promise((resolve) => setTimeout(resolve, 500));
// Reload profiles and groups to ensure UI is updated
await loadProfiles();
await loadGroups();
console.log("Profile deleted and profiles reloaded successfully");
} catch (err: unknown) {
console.error("Failed to delete profile:", err);
const errorMessage = err instanceof Error ? err.message : String(err);
setError(`Failed to delete profile: ${errorMessage}`);
}
},
[loadProfiles, loadGroups],
);
const handleRenameProfile = useCallback(
async (oldName: string, newName: string) => {
setError(null);
async (profileId: string, newName: string) => {
try {
await invoke("rename_profile", { oldName, newName });
await loadProfiles();
await invoke("rename_profile", { profileId, newName });
// No need to manually reload - useProfileEvents will handle the update
} catch (err: unknown) {
console.error("Failed to rename profile:", err);
setError(`Failed to rename profile: ${JSON.stringify(err)}`);
showErrorToast(`Failed to rename profile: ${JSON.stringify(err)}`);
throw err;
}
},
[loadProfiles],
[],
);
const handleKillProfile = useCallback(
async (profile: BrowserProfile) => {
setError(null);
try {
await invoke("kill_browser_profile", { profile });
await loadProfiles();
} catch (err: unknown) {
console.error("Failed to kill browser:", err);
setError(`Failed to kill browser: ${JSON.stringify(err)}`);
}
},
[loadProfiles],
);
const handleKillProfile = useCallback(async (profile: BrowserProfile) => {
console.log("Starting kill for profile:", profile.name);
try {
await invoke("kill_browser_profile", { profile });
console.log("Successfully killed profile:", profile.name);
// No need to manually reload - useProfileEvents will handle the update
} catch (err: unknown) {
console.error("Failed to kill browser:", err);
const errorMessage = err instanceof Error ? err.message : String(err);
showErrorToast(`Failed to kill browser: ${errorMessage}`);
// Re-throw the error so the table component can handle loading state cleanup
throw err;
}
}, []);
const handleDeleteSelectedProfiles = useCallback(
async (profileNames: string[]) => {
setError(null);
async (profileIds: string[]) => {
try {
await invoke("delete_selected_profiles", { profileNames });
await loadProfiles();
await loadGroups();
await invoke("delete_selected_profiles", { profileIds });
// No need to manually reload - useProfileEvents will handle the update
} catch (err: unknown) {
console.error("Failed to delete selected profiles:", err);
setError(`Failed to delete selected profiles: ${JSON.stringify(err)}`);
showErrorToast(
`Failed to delete selected profiles: ${JSON.stringify(err)}`,
);
}
},
[loadProfiles, loadGroups],
[],
);
const handleAssignProfilesToGroup = useCallback((profileNames: string[]) => {
setSelectedProfilesForGroup(profileNames);
const handleAssignProfilesToGroup = useCallback((profileIds: string[]) => {
setSelectedProfilesForGroup(profileIds);
setGroupAssignmentDialogOpen(true);
}, []);
@@ -621,19 +548,20 @@ export default function Home() {
setIsBulkDeleting(true);
try {
await invoke("delete_selected_profiles", {
profileNames: selectedProfiles,
profileIds: selectedProfiles,
});
await loadProfiles();
await loadGroups();
// No need to manually reload - useProfileEvents will handle the update
setSelectedProfiles([]);
setShowBulkDeleteConfirmation(false);
} catch (error) {
console.error("Failed to delete selected profiles:", error);
setError(`Failed to delete selected profiles: ${JSON.stringify(error)}`);
showErrorToast(
`Failed to delete selected profiles: ${JSON.stringify(error)}`,
);
} finally {
setIsBulkDeleting(false);
}
}, [selectedProfiles, loadProfiles, loadGroups]);
}, [selectedProfiles]);
const handleBulkGroupAssignment = useCallback(() => {
if (selectedProfiles.length === 0) return;
@@ -641,21 +569,34 @@ export default function Home() {
setSelectedProfiles([]);
}, [selectedProfiles, handleAssignProfilesToGroup]);
const handleAssignProfilesToProxy = useCallback((profileIds: string[]) => {
setSelectedProfilesForProxy(profileIds);
setProxyAssignmentDialogOpen(true);
}, []);
const handleBulkProxyAssignment = useCallback(() => {
if (selectedProfiles.length === 0) return;
handleAssignProfilesToProxy(selectedProfiles);
setSelectedProfiles([]);
}, [selectedProfiles, handleAssignProfilesToProxy]);
const handleGroupAssignmentComplete = useCallback(async () => {
await loadProfiles();
await loadGroups();
// No need to manually reload - useProfileEvents will handle the update
setGroupAssignmentDialogOpen(false);
setSelectedProfilesForGroup([]);
}, [loadProfiles, loadGroups]);
}, []);
const handleProxyAssignmentComplete = useCallback(async () => {
// No need to manually reload - useProfileEvents will handle the update
setProxyAssignmentDialogOpen(false);
setSelectedProfilesForProxy([]);
}, []);
const handleGroupManagementComplete = useCallback(async () => {
await loadGroups();
}, [loadGroups]);
// No need to manually reload - useProfileEvents will handle the update
}, []);
useEffect(() => {
void loadProfilesWithUpdateCheck();
void loadGroups();
// Check for startup default browser prompt
void checkStartupPrompt();
@@ -681,6 +622,11 @@ export default function Home() {
30 * 60 * 1000,
);
// Check for missing binaries after initial profile load
if (!profilesLoading && profiles.length > 0) {
void checkMissingBinaries();
}
return () => {
clearInterval(updateInterval);
if (cleanup) {
@@ -688,12 +634,13 @@ export default function Home() {
}
};
}, [
loadProfilesWithUpdateCheck,
checkForUpdates,
checkStartupPrompt,
listenForUrlEvents,
checkCurrentUrl,
loadGroups,
checkMissingBinaries,
profilesLoading,
profiles.length,
]);
// Show deprecation warning for unsupported profiles (with names)
@@ -729,31 +676,6 @@ export default function Home() {
}
}, [profiles]);
useEffect(() => {
if (profiles.length === 0) return;
const interval = setInterval(() => {
for (const profile of profiles) {
void checkBrowserStatus(profile);
}
}, 500);
return () => {
clearInterval(interval);
};
}, [profiles, checkBrowserStatus]);
useEffect(() => {
runningProfilesRef.current = runningProfiles;
}, [runningProfiles]);
useEffect(() => {
if (error) {
showErrorToast(error);
setError(null);
}
}, [error]);
// Check permissions when they are initialized
useEffect(() => {
if (isInitialized) {
@@ -761,30 +683,66 @@ export default function Home() {
}
}, [isInitialized, checkAllPermissions]);
// Filter data by selected group and search query
const filteredProfiles = useMemo(() => {
let filtered = profiles;
// Filter by group
if (!selectedGroupId || selectedGroupId === "default") {
filtered = profiles.filter((profile) => !profile.group_id);
} else {
filtered = profiles.filter(
(profile) => profile.group_id === selectedGroupId,
);
}
// Filter by search query
if (searchQuery.trim()) {
const query = searchQuery.toLowerCase().trim();
filtered = filtered.filter((profile) => {
// Search in profile name
if (profile.name.toLowerCase().includes(query)) return true;
// Search in note
if (profile.note?.toLowerCase().includes(query)) return true;
// Search in tags
if (profile.tags?.some((tag) => tag.toLowerCase().includes(query)))
return true;
return false;
});
}
return filtered;
}, [profiles, selectedGroupId, searchQuery]);
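The group-then-search filtering in the `useMemo` above reduces to a pure predicate over name, note, and tags. A minimal sketch — `ProfileLike` is an assumed shape for illustration, not the app's actual `BrowserProfile` type:

```typescript
interface ProfileLike {
  name: string;
  group_id?: string;
  note?: string;
  tags?: string[];
}

// Mirrors the useMemo logic: group filter first, then a case-insensitive
// substring search across name, note, and tags.
function filterProfiles(
  profiles: ProfileLike[],
  selectedGroupId: string,
  searchQuery: string,
): ProfileLike[] {
  let filtered =
    !selectedGroupId || selectedGroupId === "default"
      ? profiles.filter((p) => !p.group_id)
      : profiles.filter((p) => p.group_id === selectedGroupId);

  const query = searchQuery.toLowerCase().trim();
  if (query) {
    filtered = filtered.filter(
      (p) =>
        p.name.toLowerCase().includes(query) ||
        p.note?.toLowerCase().includes(query) ||
        p.tags?.some((t) => t.toLowerCase().includes(query)),
    );
  }
  return filtered;
}
```

Keeping the predicate pure like this makes the filtering unit-testable without rendering the component.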
// Update loading states
const isLoading = profilesLoading || groupsLoading || proxiesLoading;
return (
<div className="grid grid-rows-[20px_1fr_20px] items-center justify-items-center min-h-screen gap-8 font-[family-name:var(--font-geist-sans)] bg-white dark:bg-black">
<main className="flex flex-col row-start-2 gap-6 items-center w-full max-w-3xl">
<div className="grid items-center justify-items-center min-h-screen gap-8 font-(family-name:--font-geist-sans) bg-background">
<main className="flex flex-col items-center w-full max-w-3xl">
<div className="w-full">
<HomeHeader
selectedProfiles={selectedProfiles}
onBulkDelete={handleBulkDelete}
onBulkGroupAssignment={handleBulkGroupAssignment}
onCreateProfileDialogOpen={setCreateProfileDialogOpen}
onGroupManagementDialogOpen={setGroupManagementDialogOpen}
onImportProfileDialogOpen={setImportProfileDialogOpen}
onProxyManagementDialogOpen={setProxyManagementDialogOpen}
onSettingsDialogOpen={setSettingsDialogOpen}
searchQuery={searchQuery}
onSearchQueryChange={setSearchQuery}
/>
</div>
<div className="space-y-4 w-full">
<div className="w-full mt-2.5">
<GroupBadges
selectedGroupId={selectedGroupId}
onGroupSelect={handleSelectGroup}
groups={groups}
isLoading={areGroupsLoading}
groups={groupsData}
isLoading={isLoading}
/>
<ProfilesDataTable
data={profiles}
profiles={filteredProfiles}
onLaunchProfile={launchProfile}
onKillProfile={handleKillProfile}
onDeleteProfile={handleDeleteProfile}
@@ -797,18 +755,21 @@ export default function Home() {
selectedGroupId={selectedGroupId}
selectedProfiles={selectedProfiles}
onSelectedProfilesChange={setSelectedProfiles}
onBulkDelete={handleBulkDelete}
onBulkGroupAssignment={handleBulkGroupAssignment}
onBulkProxyAssignment={handleBulkProxyAssignment}
/>
</div>
</main>
{isInitializing && (
<div className="fixed inset-0 z-[100000] backdrop-blur-sm bg-black/30 flex items-center justify-center">
<div className="bg-white dark:bg-neutral-900 rounded-xl p-6 shadow-xl border border-black/10 dark:border-white/10 w-[320px] text-center">
<div className="fixed inset-0 z-1000 backdrop-blur-sm bg-background/30 flex items-center justify-center">
<div className="bg-background rounded-xl p-6 shadow-xl border border-border/10 w-[320px] text-center">
<div className="text-lg font-medium">Initializing</div>
<div className="mt-1 mb-2 text-sm text-gray-600 dark:text-gray-300">
Please don't close the app
</div>
<div className="mx-auto mb-4 w-8 h-8 rounded-full border-2 border-gray-300 animate-spin border-t-gray-900 dark:border-gray-700 dark:border-t-white" />
<div className="mx-auto mb-4 w-8 h-8 rounded-full border-2 animate-spin border-muted/40 border-t-primary" />
</div>
</div>
)}
@@ -834,7 +795,6 @@ export default function Home() {
onClose={() => {
setImportProfileDialogOpen(false);
}}
onImportComplete={() => void loadProfiles()}
/>
<ProxyManagementDialog
@@ -875,6 +835,11 @@ export default function Home() {
}}
profile={currentProfileForCamoufoxConfig}
onSave={handleSaveCamoufoxConfig}
isRunning={
currentProfileForCamoufoxConfig
? runningProfiles.has(currentProfileForCamoufoxConfig.id)
: false
}
/>
<GroupManagementDialog
@@ -892,6 +857,18 @@ export default function Home() {
}}
selectedProfiles={selectedProfilesForGroup}
onAssignmentComplete={handleGroupAssignmentComplete}
profiles={profiles}
/>
<ProxyAssignmentDialog
isOpen={proxyAssignmentDialogOpen}
onClose={() => {
setProxyAssignmentDialogOpen(false);
}}
selectedProfiles={selectedProfilesForProxy}
onAssignmentComplete={handleProxyAssignmentComplete}
profiles={profiles}
storedProxies={storedProxies}
/>
<DeleteConfirmationDialog
@@ -902,7 +879,8 @@ export default function Home() {
description={`This action cannot be undone. This will permanently delete ${selectedProfiles.length} profile${selectedProfiles.length !== 1 ? "s" : ""} and all associated data.`}
confirmButtonText={`Delete ${selectedProfiles.length} Profile${selectedProfiles.length !== 1 ? "s" : ""}`}
isLoading={isBulkDeleting}
profileNames={selectedProfiles}
profileIds={selectedProfiles}
profiles={profiles.map((p) => ({ id: p.id, name: p.name }))}
/>
</div>
);
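The manual `loadProfiles()`/`loadGroups()` round-trips removed throughout this file are replaced by event hooks (`useProfileEvents`, `useGroupEvents`, `useProxyEvents`) that keep state in sync when the backend announces changes. The real hooks sit on top of Tauri's event system; a framework-free sketch of the underlying pattern, with all names assumed for illustration:

```typescript
type Listener<T> = (value: T) => void;

// Minimal observable store: whoever calls `emit` (in the app, a Tauri
// event listener) pushes fresh data, and every subscriber sees it, so
// UI code never needs an explicit re-fetch after a mutation.
class EventStore<T> {
  private listeners = new Set<Listener<T>>();
  constructor(private value: T) {}

  get(): T {
    return this.value;
  }

  subscribe(listener: Listener<T>): () => void {
    this.listeners.add(listener);
    listener(this.value); // replay current state to new subscribers
    return () => this.listeners.delete(listener);
  }

  emit(next: T): void {
    this.value = next;
    for (const l of this.listeners) l(next);
  }
}
```

In the actual hooks, `emit` would presumably be driven by `listen(...)` from `@tauri-apps/api/event`, which is why commands like `delete_profile` no longer need a follow-up reload.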
+35 -9

@@ -1,6 +1,6 @@
"use client";
import { FaDownload, FaTimes } from "react-icons/fa";
import { FaDownload, FaExternalLinkAlt, FaTimes } from "react-icons/fa";
import { LuCheckCheck, LuCog, LuRefreshCw } from "react-icons/lu";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
@@ -60,6 +60,16 @@ export function AppUpdateToast({
await onUpdate(updateInfo);
};
const handleViewRelease = () => {
if (updateInfo.release_page_url) {
// Trigger the same URL handling logic as if the URL came from the system
const event = new CustomEvent("url-open-request", {
detail: updateInfo.release_page_url,
});
window.dispatchEvent(event);
}
};
const showDownloadProgress =
isUpdating &&
updateProgress?.stage === "downloading" &&
@@ -101,6 +111,11 @@ export function AppUpdateToast({
<>
Update from {updateInfo.current_version} to{" "}
<span className="font-medium">{updateInfo.new_version}</span>
{updateInfo.manual_update_required && (
<span className="block mt-1 text-muted-foreground/80">
Manual download required on Linux
</span>
)}
</>
)}
</div>
@@ -155,14 +170,25 @@ export function AppUpdateToast({
{!isUpdating && (
<div className="flex gap-2 items-center mt-3">
<RippleButton
onClick={() => void handleUpdateClick()}
size="sm"
className="flex gap-2 items-center text-xs"
>
<FaDownload className="w-3 h-3" />
Update Now
</RippleButton>
{updateInfo.manual_update_required ? (
<RippleButton
onClick={handleViewRelease}
size="sm"
className="flex gap-2 items-center text-xs"
>
<FaExternalLinkAlt className="w-3 h-3" />
View Release
</RippleButton>
) : (
<RippleButton
onClick={() => void handleUpdateClick()}
size="sm"
className="flex gap-2 items-center text-xs"
>
<FaDownload className="w-3 h-3" />
Update Now
</RippleButton>
)}
<RippleButton
variant="outline"
onClick={onDismiss}
+118
@@ -0,0 +1,118 @@
"use client";
import * as React from "react";
import { Area, AreaChart, ResponsiveContainer } from "recharts";
import { cn } from "@/lib/utils";
import type { BandwidthDataPoint } from "@/types";
interface BandwidthMiniChartProps {
data: BandwidthDataPoint[];
currentBandwidth?: number;
onClick?: () => void;
className?: string;
}
export function BandwidthMiniChart({
data,
currentBandwidth: externalBandwidth,
onClick,
className,
}: BandwidthMiniChartProps) {
// Transform data for the chart - combine sent and received for total bandwidth
const chartData = React.useMemo(() => {
// Fill in missing seconds with zeros for smooth chart
if (data.length === 0) {
// Create 60 seconds of zero data for the past minute
const now = Math.floor(Date.now() / 1000);
return Array.from({ length: 60 }, (_, i) => ({
time: now - (59 - i),
bandwidth: 0,
}));
}
const now = Math.floor(Date.now() / 1000);
const result: { time: number; bandwidth: number }[] = [];
// Get the last 60 seconds
for (let i = 59; i >= 0; i--) {
const targetTime = now - i;
const point = data.find((d) => d.timestamp === targetTime);
result.push({
time: targetTime,
bandwidth: point ? point.bytes_sent + point.bytes_received : 0,
});
}
return result;
}, [data]);
// Find max value for scaling
const _maxBandwidth = React.useMemo(() => {
const max = Math.max(...chartData.map((d) => d.bandwidth), 1);
return max;
}, [chartData]);
// Use external bandwidth if provided, otherwise calculate from last data point
const currentBandwidth =
externalBandwidth ?? chartData[chartData.length - 1]?.bandwidth ?? 0;
// Format bytes to human readable
const formatBytes = (bytes: number): string => {
if (bytes === 0) return "0 B/s";
if (bytes < 1024) return `${bytes} B/s`;
if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB/s`;
return `${(bytes / (1024 * 1024)).toFixed(1)} MB/s`;
};
return (
<button
type="button"
onClick={onClick}
className={cn(
"relative flex items-center gap-1.5 px-2 rounded cursor-pointer hover:bg-accent/50 transition-colors min-w-[130px] border-none bg-transparent",
className,
)}
>
<div className="flex-1 h-3">
<ResponsiveContainer width="100%" height="100%">
<AreaChart
data={chartData}
margin={{ top: 0, right: 0, bottom: 0, left: 0 }}
>
<defs>
<linearGradient
id="bandwidthGradient"
x1="0"
y1="0"
x2="0"
y2="1"
>
<stop
offset="0%"
stopColor="var(--chart-1)"
stopOpacity={0.6}
/>
<stop
offset="100%"
stopColor="var(--chart-1)"
stopOpacity={0.1}
/>
</linearGradient>
</defs>
<Area
type="monotone"
dataKey="bandwidth"
stroke="var(--chart-1)"
strokeWidth={1}
fill="url(#bandwidthGradient)"
isAnimationActive={false}
/>
</AreaChart>
</ResponsiveContainer>
</div>
<span className="text-xs text-muted-foreground whitespace-nowrap min-w-[60px] text-right">
{formatBytes(currentBandwidth)}
</span>
</button>
);
}
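The chart's zero-fill loop above can be factored into a pure function for testing, assuming `BandwidthDataPoint` carries `timestamp`, `bytes_sent`, and `bytes_received` (the fields the component reads):

```typescript
interface BandwidthDataPoint {
  timestamp: number;
  bytes_sent: number;
  bytes_received: number;
}

// Same shape as the component's useMemo: one sample per second for the
// last 60 seconds, with missing seconds filled as zero bandwidth.
function fillLastMinute(
  data: BandwidthDataPoint[],
  now: number,
): { time: number; bandwidth: number }[] {
  const byTime = new Map(data.map((d) => [d.timestamp, d]));
  const result: { time: number; bandwidth: number }[] = [];
  for (let i = 59; i >= 0; i--) {
    const t = now - i;
    const point = byTime.get(t);
    result.push({
      time: t,
      bandwidth: point ? point.bytes_sent + point.bytes_received : 0,
    });
  }
  return result;
}
```

One possible refinement: indexing samples in a `Map` makes the fill O(n) rather than the O(60·n) of the component's per-second `Array.find` scan — a cheap win if the history buffer ever grows beyond a minute.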
+33 -15
@@ -10,7 +10,16 @@ import {
DialogTitle,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import type { BrowserProfile, CamoufoxConfig } from "@/types";
import type { BrowserProfile, CamoufoxConfig, CamoufoxOS } from "@/types";
const getCurrentOS = (): CamoufoxOS => {
if (typeof navigator === "undefined") return "linux";
const platform = navigator.platform.toLowerCase();
if (platform.includes("win")) return "windows";
if (platform.includes("mac")) return "macos";
return "linux";
};
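`getCurrentOS` above keys off `navigator.platform`. Factoring the string mapping out makes it testable without a browser — and is where a `navigator.userAgent` fallback could live, since `navigator.platform` is marked deprecated in current web specs. A sketch mirroring the helper's logic:

```typescript
type CamoufoxOS = "windows" | "macos" | "linux";

// Pure mapping behind getCurrentOS; "linux" doubles as the fallback for
// unknown platforms (and for SSR, where navigator is undefined).
function osFromPlatform(platform: string): CamoufoxOS {
  const p = platform.toLowerCase();
  if (p.includes("win")) return "windows";
  if (p.includes("mac")) return "macos";
  return "linux";
}
```

Note the check order: `"darwin"` contains `"win"`, so a Node-style platform string would be misreported as Windows. Browser values (`Win32`, `MacIntel`) avoid the collision, but checking `"mac"` first would be the more robust ordering.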
import { LoadingButton } from "./loading-button";
import { RippleButton } from "./ui/ripple";
@@ -19,6 +28,7 @@ interface CamoufoxConfigDialogProps {
onClose: () => void;
profile: BrowserProfile | null;
onSave: (profile: BrowserProfile, config: CamoufoxConfig) => Promise<void>;
isRunning?: boolean;
}
export function CamoufoxConfigDialog({
@@ -26,10 +36,12 @@ export function CamoufoxConfigDialog({
onClose,
profile,
onSave,
isRunning = false,
}: CamoufoxConfigDialogProps) {
const [config, setConfig] = useState<CamoufoxConfig>({
const [config, setConfig] = useState<CamoufoxConfig>(() => ({
geoip: true,
});
os: getCurrentOS(),
}));
const [isSaving, setIsSaving] = useState(false);
// Initialize config when profile changes
@@ -38,6 +50,7 @@ export function CamoufoxConfigDialog({
setConfig(
profile.camoufox_config || {
geoip: true,
os: getCurrentOS(),
},
);
}
@@ -86,6 +99,7 @@ export function CamoufoxConfigDialog({
setConfig(
profile.camoufox_config || {
geoip: true,
os: getCurrentOS(),
},
);
}
@@ -101,33 +115,37 @@ export function CamoufoxConfigDialog({
return (
<Dialog open={isOpen} onOpenChange={handleClose}>
<DialogContent className="max-w-3xl max-h-[90vh] flex flex-col">
<DialogHeader className="flex-shrink-0">
<DialogHeader className="shrink-0">
<DialogTitle>
Configure Fingerprint Settings - {profile.name}
{isRunning ? "View" : "Configure"} Fingerprint Settings -{" "}
{profile.name}
</DialogTitle>
</DialogHeader>
<ScrollArea className="flex-1 h-[400px]">
<ScrollArea className="flex-1 h-[300px]">
<div className="py-4">
<SharedCamoufoxConfigForm
config={config}
onConfigChange={updateConfig}
forceAdvanced={true}
readOnly={isRunning}
/>
</div>
</ScrollArea>
<DialogFooter className="flex-shrink-0 pt-4 border-t">
<DialogFooter className="shrink-0 pt-4 border-t">
<RippleButton variant="outline" onClick={handleClose}>
Cancel
{isRunning ? "Close" : "Cancel"}
</RippleButton>
<LoadingButton
isLoading={isSaving}
onClick={handleSave}
disabled={isSaving}
>
Save
</LoadingButton>
{!isRunning && (
<LoadingButton
isLoading={isSaving}
onClick={handleSave}
disabled={isSaving}
>
Save
</LoadingButton>
)}
</DialogFooter>
</DialogContent>
</Dialog>
+512 -260
@@ -6,7 +6,7 @@ import { GoPlus } from "react-icons/go";
import { LoadingButton } from "@/components/loading-button";
import { ProxyFormDialog } from "@/components/proxy-form-dialog";
import { SharedCamoufoxConfigForm } from "@/components/shared-camoufox-config-form";
import { Combobox } from "@/components/ui/combobox";
import { Button } from "@/components/ui/button";
import {
Dialog,
DialogContent,
@@ -25,9 +25,20 @@ import {
SelectValue,
} from "@/components/ui/select";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { useBrowserDownload } from "@/hooks/use-browser-download";
import { useProxyEvents } from "@/hooks/use-proxy-events";
import { getBrowserIcon } from "@/lib/browser-utils";
import type { BrowserReleaseTypes, CamoufoxConfig, StoredProxy } from "@/types";
import type { BrowserReleaseTypes, CamoufoxConfig, CamoufoxOS } from "@/types";
const getCurrentOS = (): CamoufoxOS => {
if (typeof navigator === "undefined") return "linux";
const platform = navigator.platform.toLowerCase();
if (platform.includes("win")) return "windows";
if (platform.includes("mac")) return "macos";
return "linux";
};
import { RippleButton } from "./ui/ripple";
type BrowserTypeString =
@@ -98,30 +109,46 @@ export function CreateProfileDialog({
selectedGroupId,
}: CreateProfileDialogProps) {
const [profileName, setProfileName] = useState("");
const [currentStep, setCurrentStep] = useState<
"browser-selection" | "browser-config"
>("browser-selection");
const [activeTab, setActiveTab] = useState("anti-detect");
// Regular browser states
// Browser selection states
const [selectedBrowser, setSelectedBrowser] =
useState<BrowserTypeString | null>("camoufox");
useState<BrowserTypeString | null>(null);
const [selectedProxyId, setSelectedProxyId] = useState<string>();
const handleTabChange = (value: string) => {
if (value === "regular") {
setSelectedBrowser(null);
} else if (value === "anti-detect") {
setSelectedBrowser("camoufox");
}
// Camoufox anti-detect states
const [camoufoxConfig, setCamoufoxConfig] = useState<CamoufoxConfig>(() => ({
geoip: true, // Default to automatic geoip
os: getCurrentOS(), // Default to current OS
}));
setActiveTab(value);
// Handle browser selection from the initial screen
const handleBrowserSelect = (browser: BrowserTypeString) => {
setSelectedBrowser(browser);
setCurrentStep("browser-config");
};
// Camoufox anti-detect states
const [camoufoxConfig, setCamoufoxConfig] = useState<CamoufoxConfig>({
geoip: true, // Default to automatic geoip
});
// Handle back button
const handleBack = () => {
setCurrentStep("browser-selection");
setSelectedBrowser(null);
setProfileName("");
setSelectedProxyId(undefined);
};
const handleTabChange = (value: string) => {
setActiveTab(value);
setCurrentStep("browser-selection");
setSelectedBrowser(null);
setProfileName("");
setSelectedProxyId(undefined);
};
const [supportedBrowsers, setSupportedBrowsers] = useState<string[]>([]);
const [storedProxies, setStoredProxies] = useState<StoredProxy[]>([]);
const { storedProxies } = useProxyEvents();
const [showProxyForm, setShowProxyForm] = useState(false);
const [isCreating, setIsCreating] = useState(false);
const [releaseTypes, setReleaseTypes] = useState<BrowserReleaseTypes>();
@@ -144,15 +171,6 @@ export function CreateProfileDialog({
}
}, []);
const loadStoredProxies = useCallback(async () => {
try {
const proxies = await invoke<StoredProxy[]>("get_stored_proxies");
setStoredProxies(proxies);
} catch (error) {
console.error("Failed to load stored proxies:", error);
}
}, []);
const checkAndDownloadGeoIPDatabase = useCallback(async () => {
try {
const isAvailable = await invoke<boolean>("is_geoip_database_available");
@@ -178,6 +196,8 @@ export function CreateProfileDialog({
{ browserStr: browser },
);
await loadDownloadedVersions(browser);
// Only update state if this browser is still the one we're loading
if (loadingBrowserRef.current === browser) {
// Filter to enforce stable-only creation, except Firefox Developer (nightly-only)
@@ -197,12 +217,29 @@ export function CreateProfileDialog({
filtered.stable = rawReleaseTypes.stable;
setReleaseTypes(filtered);
}
// Load downloaded versions for this browser
await loadDownloadedVersions(browser);
}
} catch (error) {
console.error(`Failed to load release types for ${browser}:`, error);
// Fallback: still load downloaded versions and derive release type from them if possible
try {
const downloaded = await loadDownloadedVersions(browser);
if (loadingBrowserRef.current === browser && downloaded.length > 0) {
const latest = downloaded[0];
const fallback: BrowserReleaseTypes = {};
if (browser === "firefox-developer") {
fallback.nightly = latest;
} else {
fallback.stable = latest;
}
setReleaseTypes(fallback);
}
} catch (e) {
console.error(
`Failed to load downloaded versions for ${browser}:`,
e,
);
}
} finally {
// Clear loading state only if we're still loading this browser
if (loadingBrowserRef.current === browser) {
@@ -217,18 +254,21 @@ export function CreateProfileDialog({
useEffect(() => {
if (isOpen) {
void loadSupportedBrowsers();
void loadStoredProxies();
// Load camoufox release types when dialog opens
void loadReleaseTypes("camoufox");
// Load release types when a browser is selected
if (selectedBrowser) {
void loadReleaseTypes(selectedBrowser);
}
// Check and download GeoIP database if needed for Camoufox
void checkAndDownloadGeoIPDatabase();
if (selectedBrowser === "camoufox") {
void checkAndDownloadGeoIPDatabase();
}
}
}, [
isOpen,
loadSupportedBrowsers,
loadStoredProxies,
loadReleaseTypes,
checkAndDownloadGeoIPDatabase,
selectedBrowser,
]);
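The catch branch above falls back to already-downloaded versions when fetching release types fails, treating Firefox Developer as nightly-only and everything else as stable. A minimal sketch of that rule (versions simplified to plain strings for illustration; the real code stores richer version objects):

```typescript
type BrowserReleaseTypes = { stable?: string; nightly?: string };

// Simplified restatement of the fallback in the catch branch above.
function deriveFallbackReleaseTypes(
  browser: string,
  downloaded: string[], // assumed sorted newest-first
): BrowserReleaseTypes {
  if (downloaded.length === 0) return {};
  const latest = downloaded[0];
  // Firefox Developer only ships nightly builds; everything else is stable.
  return browser === "firefox-developer"
    ? { nightly: latest }
    : { stable: latest };
}
```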
// Load release types when browser selection changes
@@ -283,7 +323,29 @@ export function CreateProfileDialog({
setIsCreating(true);
try {
if (activeTab === "regular") {
if (activeTab === "anti-detect") {
// Anti-detect browser - always use Camoufox with best available version
const bestCamoufoxVersion = getBestAvailableVersion("camoufox");
if (!bestCamoufoxVersion) {
console.error("No Camoufox version available");
return;
}
// The fingerprint will be generated at launch time by the Rust backend
// We don't need to generate it here during profile creation
const finalCamoufoxConfig = { ...camoufoxConfig };
await onCreateProfile({
name: profileName.trim(),
browserStr: "camoufox" as BrowserTypeString,
version: bestCamoufoxVersion.version,
releaseType: bestCamoufoxVersion.releaseType,
proxyId: selectedProxyId,
camoufoxConfig: finalCamoufoxConfig,
groupId: selectedGroupId !== "default" ? selectedGroupId : undefined,
});
} else {
// Regular browser
if (!selectedBrowser) {
console.error("Missing required browser selection");
return;
@@ -304,27 +366,6 @@ export function CreateProfileDialog({
proxyId: selectedProxyId,
groupId: selectedGroupId !== "default" ? selectedGroupId : undefined,
});
} else {
// Anti-detect tab - always use Camoufox with best available version
const bestCamoufoxVersion = getBestAvailableVersion("camoufox");
if (!bestCamoufoxVersion) {
console.error("No Camoufox version available");
return;
}
// The fingerprint will be generated at launch time by the Rust backend
// We don't need to generate it here during profile creation
const finalCamoufoxConfig = { ...camoufoxConfig };
await onCreateProfile({
name: profileName.trim(),
browserStr: "camoufox" as BrowserTypeString,
version: bestCamoufoxVersion.version,
releaseType: bestCamoufoxVersion.releaseType,
proxyId: selectedProxyId,
camoufoxConfig: finalCamoufoxConfig,
groupId: selectedGroupId !== "default" ? selectedGroupId : undefined,
});
}
handleClose();
@@ -341,13 +382,15 @@ export function CreateProfileDialog({
// Reset all states
setProfileName("");
setCurrentStep("browser-selection");
setActiveTab("anti-detect");
setSelectedBrowser(null);
setSelectedProxyId(undefined);
setReleaseTypes({});
setCamoufoxConfig({
geoip: true, // Reset to automatic geoip
os: getCurrentOS(), // Reset to current OS
});
setActiveTab("anti-detect");
onClose();
};
@@ -386,25 +429,23 @@ export function CreateProfileDialog({
isBrowserVersionAvailable,
]);
useEffect(() => {
console.log(
selectedBrowser,
selectedBrowser && isBrowserCurrentlyDownloading(selectedBrowser),
selectedBrowser && isBrowserVersionAvailable(selectedBrowser),
selectedBrowser && getBestAvailableVersion(selectedBrowser),
);
}, [
selectedBrowser,
isBrowserCurrentlyDownloading,
isBrowserVersionAvailable,
getBestAvailableVersion,
]);
// Filter supported browsers for regular browsers (excluding mullvad and tor)
const regularBrowsers = browserOptions.filter(
(browser) =>
supportedBrowsers.includes(browser.value) &&
browser.value !== "mullvad-browser" &&
browser.value !== "tor-browser",
);
return (
<Dialog open={isOpen} onOpenChange={handleClose}>
<DialogContent className="w-full max-h-[90vh] flex flex-col">
<DialogHeader className="flex-shrink-0">
<DialogTitle>Create New Profile</DialogTitle>
<DialogTitle>
{currentStep === "browser-selection"
? "Create New Profile"
: "Configure Profile"}
</DialogTitle>
</DialogHeader>
<Tabs
@@ -420,221 +461,432 @@ export function CreateProfileDialog({
<TabsTrigger value="regular">Regular</TabsTrigger>
</TabsList>
<ScrollArea className="flex-1 h-[330px] overflow-y-hidden">
<ScrollArea className="overflow-y-auto flex-1">
<div className="flex flex-col justify-center items-center w-full">
<div className="py-4 space-y-6 w-full max-w-md">
{/* Profile Name - Common to both tabs */}
<div className="space-y-2">
<Label htmlFor="profile-name">Profile Name</Label>
<Input
id="profile-name"
value={profileName}
onChange={(e) => setProfileName(e.target.value)}
placeholder="Enter profile name"
/>
</div>
{currentStep === "browser-selection" ? (
<>
<TabsContent value="anti-detect" className="mt-0 space-y-6">
{/* Anti-Detect Browser Selection */}
<div className="space-y-6">
<div className="text-center">
<h3 className="text-lg font-medium">
Anti-Detect Browser
</h3>
<p className="mt-2 text-sm text-muted-foreground">
Choose Firefox for anti-detection capabilities
</p>
</div>
<TabsContent value="regular" className="mt-0 space-y-6">
<div className="space-y-4">
<div className="space-y-2">
<Label>Browser</Label>
<Combobox
options={browserOptions
.filter(
(browser) =>
supportedBrowsers.includes(browser.value) &&
browser.value !== "mullvad-browser" &&
browser.value !== "tor-browser",
)
.map((browser) => {
const IconComponent = getBrowserIcon(browser.value);
return {
value: browser.value,
label: browser.label,
icon: IconComponent,
};
})}
value={selectedBrowser || ""}
onValueChange={(value) =>
setSelectedBrowser(value as BrowserTypeString)
}
placeholder="Select a browser..."
searchPlaceholder="Search browsers..."
/>
</div>
{selectedBrowser && (
<div className="space-y-3">
{!isBrowserCurrentlyDownloading(selectedBrowser) &&
!isBrowserVersionAvailable(selectedBrowser) &&
getBestAvailableVersion(selectedBrowser) && (
<div className="flex gap-3 items-center">
<p className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(selectedBrowser);
return `Latest version (${bestVersion?.version}) needs to be downloaded`;
})()}
</p>
<LoadingButton
onClick={() => handleDownload(selectedBrowser)}
isLoading={isBrowserCurrentlyDownloading(
selectedBrowser,
)}
className="ml-auto"
size="sm"
disabled={isBrowserCurrentlyDownloading(
selectedBrowser,
)}
>
Download
</LoadingButton>
</div>
)}
{!isBrowserCurrentlyDownloading(selectedBrowser) &&
isBrowserVersionAvailable(selectedBrowser) && (
<div className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(selectedBrowser);
return `✓ Latest version (${bestVersion?.version}) is available`;
})()}
</div>
)}
{isBrowserCurrentlyDownloading(selectedBrowser) && (
<div className="text-sm text-muted-foreground">
<Button
onClick={() => handleBrowserSelect("camoufox")}
className="flex gap-3 justify-start items-center p-4 w-full h-16 border-2 transition-colors hover:border-primary/50"
variant="outline"
>
<div className="flex justify-center items-center w-8 h-8">
{(() => {
const bestVersion =
getBestAvailableVersion(selectedBrowser);
return `Downloading version (${bestVersion?.version})...`;
const IconComponent = getBrowserIcon("firefox");
return IconComponent ? (
<IconComponent className="w-6 h-6" />
) : null;
})()}
</div>
)}
<div className="text-left">
<div className="font-medium">Firefox</div>
<div className="text-sm text-muted-foreground">
Anti-Detect Browser
</div>
</div>
</Button>
</div>
)}
</div>
</TabsContent>
</TabsContent>
<TabsContent value="anti-detect" className="mt-0 space-y-6">
<div className="space-y-6">
{/* Camoufox Download Status */}
{!isBrowserCurrentlyDownloading("camoufox") &&
!isBrowserVersionAvailable("camoufox") &&
getBestAvailableVersion("camoufox") && (
<div className="flex gap-3 items-center p-3 rounded-md border">
<p className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion("camoufox");
return `Camoufox version (${bestVersion?.version}) needs to be downloaded`;
})()}
<TabsContent value="regular" className="mt-0 space-y-6">
{/* Regular Browser Selection */}
<div className="space-y-6">
<div className="text-center">
<h3 className="text-lg font-medium">
Regular Browsers
</h3>
<p className="mt-2 text-sm text-muted-foreground">
Choose from supported regular browsers
</p>
<LoadingButton
onClick={() => handleDownload("camoufox")}
isLoading={isBrowserCurrentlyDownloading(
"camoufox",
)}
size="sm"
disabled={isBrowserCurrentlyDownloading("camoufox")}
>
{isBrowserCurrentlyDownloading("camoufox")
? "Downloading..."
: "Download"}
</LoadingButton>
</div>
)}
{!isBrowserCurrentlyDownloading("camoufox") &&
isBrowserVersionAvailable("camoufox") && (
<div className="p-3 text-sm rounded-md border text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion("camoufox");
return `✓ Camoufox version (${bestVersion?.version}) is available`;
})()}
<div className="space-y-3">
{regularBrowsers.map((browser) => {
if (browser.value === "camoufox") return null; // Skip camoufox as it's handled in anti-detect tab
const IconComponent = getBrowserIcon(browser.value);
return (
<Button
key={browser.value}
onClick={() =>
handleBrowserSelect(browser.value)
}
className="flex gap-3 justify-start items-center p-4 w-full h-16 border-2 transition-colors hover:border-primary/50"
variant="outline"
>
<div className="flex justify-center items-center w-8 h-8">
{IconComponent && (
<IconComponent className="w-6 h-6" />
)}
</div>
<div className="text-left">
<div className="font-medium">
{browser.label}
</div>
<div className="text-sm text-muted-foreground">
Regular Browser
</div>
</div>
</Button>
);
})}
</div>
)}
{isBrowserCurrentlyDownloading("camoufox") && (
<div className="p-3 text-sm rounded-md border text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion("camoufox");
return `Downloading Camoufox version (${bestVersion?.version})...`;
})()}
</div>
)}
</TabsContent>
</>
) : (
<>
<TabsContent value="anti-detect" className="mt-0">
{/* Anti-Detect Configuration */}
<div className="space-y-6">
{/* Profile Name */}
<div className="space-y-2">
<Label htmlFor="profile-name">Profile Name</Label>
<Input
id="profile-name"
value={profileName}
onChange={(e) => setProfileName(e.target.value)}
placeholder="Enter profile name"
/>
</div>
<SharedCamoufoxConfigForm
config={camoufoxConfig}
onConfigChange={updateCamoufoxConfig}
isCreating
/>
</div>
</TabsContent>
{selectedBrowser === "camoufox" ? (
// Camoufox Configuration
<div className="space-y-6">
{/* Camoufox Download Status */}
{!isBrowserCurrentlyDownloading("camoufox") &&
!isBrowserVersionAvailable("camoufox") &&
getBestAvailableVersion("camoufox") && (
<div className="flex gap-3 items-center p-3 rounded-md border">
<p className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion("camoufox");
return `Camoufox version (${bestVersion?.version}) needs to be downloaded`;
})()}
</p>
<LoadingButton
onClick={() => handleDownload("camoufox")}
isLoading={isBrowserCurrentlyDownloading(
"camoufox",
)}
size="sm"
disabled={isBrowserCurrentlyDownloading(
"camoufox",
)}
>
{isBrowserCurrentlyDownloading("camoufox")
? "Downloading..."
: "Download"}
</LoadingButton>
</div>
)}
{!isBrowserCurrentlyDownloading("camoufox") &&
isBrowserVersionAvailable("camoufox") && (
<div className="p-3 text-sm rounded-md border text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion("camoufox");
return `✓ Camoufox version (${bestVersion?.version}) is available`;
})()}
</div>
)}
{isBrowserCurrentlyDownloading("camoufox") && (
<div className="p-3 text-sm rounded-md border text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion("camoufox");
return `Downloading Camoufox version (${bestVersion?.version})...`;
})()}
</div>
)}
{/* Proxy Selection - Common to both tabs - Always visible */}
<div className="space-y-3">
<div className="flex justify-between items-center">
<Label>Proxy</Label>
<RippleButton
size="sm"
variant="outline"
onClick={() => setShowProxyForm(true)}
className="px-2 h-7 text-xs"
>
<GoPlus className="mr-1 w-3 h-3" /> Add Proxy
</RippleButton>
</div>
{storedProxies.length > 0 ? (
<Select
value={selectedProxyId || "none"}
onValueChange={(value) =>
setSelectedProxyId(value === "none" ? undefined : value)
}
>
<SelectTrigger>
<SelectValue placeholder="No proxy" />
</SelectTrigger>
<SelectContent>
<SelectItem value="none">No proxy</SelectItem>
{storedProxies.map((proxy) => (
<SelectItem key={proxy.id} value={proxy.id}>
{proxy.name}
</SelectItem>
))}
</SelectContent>
</Select>
) : (
<div className="flex gap-3 items-center p-3 text-sm rounded-md border text-muted-foreground">
No proxies available. Add one to route this profile's
traffic.
</div>
)}
</div>
<SharedCamoufoxConfigForm
config={camoufoxConfig}
onConfigChange={updateCamoufoxConfig}
isCreating
/>
</div>
) : (
// Regular Browser Configuration
<div className="space-y-4">
{selectedBrowser && (
<div className="space-y-3">
{!isBrowserCurrentlyDownloading(
selectedBrowser,
) &&
!isBrowserVersionAvailable(selectedBrowser) &&
getBestAvailableVersion(selectedBrowser) && (
<div className="flex gap-3 items-center">
<p className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(
selectedBrowser,
);
return `Latest version (${bestVersion?.version}) needs to be downloaded`;
})()}
</p>
<LoadingButton
onClick={() =>
handleDownload(selectedBrowser)
}
isLoading={isBrowserCurrentlyDownloading(
selectedBrowser,
)}
className="ml-auto"
size="sm"
disabled={isBrowserCurrentlyDownloading(
selectedBrowser,
)}
>
Download
</LoadingButton>
</div>
)}
{!isBrowserCurrentlyDownloading(
selectedBrowser,
) &&
isBrowserVersionAvailable(
selectedBrowser,
) && (
<div className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(
selectedBrowser,
);
return `✓ Latest version (${bestVersion?.version}) is available`;
})()}
</div>
)}
{isBrowserCurrentlyDownloading(
selectedBrowser,
) && (
<div className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(
selectedBrowser,
);
return `Downloading version (${bestVersion?.version})...`;
})()}
</div>
)}
</div>
)}
</div>
)}
{/* Proxy Selection - Always visible */}
<div className="space-y-3">
<div className="flex justify-between items-center">
<Label>Proxy</Label>
<RippleButton
size="sm"
variant="outline"
onClick={() => setShowProxyForm(true)}
className="px-2 h-7 text-xs"
>
<GoPlus className="mr-1 w-3 h-3" /> Add Proxy
</RippleButton>
</div>
{storedProxies.length > 0 ? (
<Select
value={selectedProxyId || "none"}
onValueChange={(value) =>
setSelectedProxyId(
value === "none" ? undefined : value,
)
}
>
<SelectTrigger>
<SelectValue placeholder="No proxy" />
</SelectTrigger>
<SelectContent>
<SelectItem value="none">No proxy</SelectItem>
{storedProxies.map((proxy) => (
<SelectItem key={proxy.id} value={proxy.id}>
{proxy.name}
</SelectItem>
))}
</SelectContent>
</Select>
) : (
<div className="flex gap-3 items-center p-3 text-sm rounded-md border text-muted-foreground">
No proxies available. Add one to route this
profile's traffic.
</div>
)}
</div>
</div>
</TabsContent>
<TabsContent value="regular" className="mt-0">
{/* Regular Browser Configuration */}
<div className="space-y-6">
{/* Profile Name */}
<div className="space-y-2">
<Label htmlFor="profile-name">Profile Name</Label>
<Input
id="profile-name"
value={profileName}
onChange={(e) => setProfileName(e.target.value)}
placeholder="Enter profile name"
/>
</div>
{/* Regular Browser Configuration */}
<div className="space-y-4">
{selectedBrowser && (
<div className="space-y-3">
{!isBrowserCurrentlyDownloading(
selectedBrowser,
) &&
!isBrowserVersionAvailable(selectedBrowser) &&
getBestAvailableVersion(selectedBrowser) && (
<div className="flex gap-3 items-center">
<p className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(
selectedBrowser,
);
return `Latest version (${bestVersion?.version}) needs to be downloaded`;
})()}
</p>
<LoadingButton
onClick={() =>
handleDownload(selectedBrowser)
}
isLoading={isBrowserCurrentlyDownloading(
selectedBrowser,
)}
className="ml-auto"
size="sm"
disabled={isBrowserCurrentlyDownloading(
selectedBrowser,
)}
>
Download
</LoadingButton>
</div>
)}
{!isBrowserCurrentlyDownloading(
selectedBrowser,
) &&
isBrowserVersionAvailable(selectedBrowser) && (
<div className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(
selectedBrowser,
);
return `✓ Latest version (${bestVersion?.version}) is available`;
})()}
</div>
)}
{isBrowserCurrentlyDownloading(
selectedBrowser,
) && (
<div className="text-sm text-muted-foreground">
{(() => {
const bestVersion =
getBestAvailableVersion(selectedBrowser);
return `Downloading version (${bestVersion?.version})...`;
})()}
</div>
)}
</div>
)}
</div>
{/* Proxy Selection - Always visible */}
<div className="space-y-3">
<div className="flex justify-between items-center">
<Label>Proxy</Label>
<RippleButton
size="sm"
variant="outline"
onClick={() => setShowProxyForm(true)}
className="px-2 h-7 text-xs"
>
<GoPlus className="mr-1 w-3 h-3" /> Add Proxy
</RippleButton>
</div>
{storedProxies.length > 0 ? (
<Select
value={selectedProxyId || "none"}
onValueChange={(value) =>
setSelectedProxyId(
value === "none" ? undefined : value,
)
}
>
<SelectTrigger>
<SelectValue placeholder="No proxy" />
</SelectTrigger>
<SelectContent>
<SelectItem value="none">No proxy</SelectItem>
{storedProxies.map((proxy) => (
<SelectItem key={proxy.id} value={proxy.id}>
{proxy.name}
</SelectItem>
))}
</SelectContent>
</Select>
) : (
<div className="flex gap-3 items-center p-3 text-sm rounded-md border text-muted-foreground">
No proxies available. Add one to route this
profile's traffic.
</div>
)}
</div>
</div>
</TabsContent>
</>
)}
</div>
</div>
</ScrollArea>
</Tabs>
<DialogFooter className="flex-shrink-0 pt-4 border-t">
{currentStep === "browser-config" ? (
<>
<RippleButton variant="outline" onClick={handleBack}>
Back
</RippleButton>
<LoadingButton
onClick={handleCreate}
isLoading={isCreating}
disabled={isCreateDisabled}
>
Create
</LoadingButton>
</>
) : (
  <RippleButton variant="outline" onClick={handleClose}>
    Cancel
  </RippleButton>
)}
</DialogFooter>
</DialogContent>
<ProxyFormDialog
isOpen={showProxyForm}
onClose={() => setShowProxyForm(false)}
onSave={(proxy) => {
setStoredProxies((prev) => [...prev, proxy]);
setSelectedProxyId(proxy.id);
}}
/>
</Dialog>
);
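The proxy `Select` above uses the literal string "none" as a sentinel, since a select item cannot carry an undefined value; the sentinel is mapped back to `undefined` before profile creation. The round-trip in isolation (helper names are illustrative):

```typescript
// Illustrative helpers for the "none" sentinel used by the proxy Select:
// undefined (no proxy) maps to "none" for display, and back again on change.
function toSelectValue(proxyId: string | undefined): string {
  return proxyId ?? "none";
}

function fromSelectValue(value: string): string | undefined {
  return value === "none" ? undefined : value;
}
```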
+1 -1
@@ -168,7 +168,7 @@ export function UnifiedToast(props: ToastProps) {
const progress = "progress" in props ? props.progress : undefined;
return (
<div className="flex items-start p-4 w-full max-w-md rounded-lg border shadow-lg bg-card border-border text-card-foreground">
<div className="flex items-start p-3 w-96 rounded-lg border shadow-lg bg-card border-border text-card-foreground">
<div className="mr-3 mt-0.5">{getToastIcon(type, stage)}</div>
<div className="flex-1 min-w-0">
<p className="text-sm font-semibold leading-tight text-foreground">
+176
@@ -0,0 +1,176 @@
"use client";
import type { Table } from "@tanstack/react-table";
import { AnimatePresence, motion } from "motion/react";
import * as React from "react";
import * as ReactDOM from "react-dom";
import { LuX } from "react-icons/lu";
import { Button } from "@/components/ui/button";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { cn } from "@/lib/utils";
interface DataTableActionBarProps<TData>
extends React.ComponentProps<typeof motion.div> {
table: Table<TData>;
visible?: boolean;
portalContainer?: Element | DocumentFragment | null;
}
function DataTableActionBar<TData>({
table,
visible: visibleProp,
portalContainer: portalContainerProp,
children,
className,
...props
}: DataTableActionBarProps<TData>) {
const [mounted, setMounted] = React.useState(false);
React.useLayoutEffect(() => {
setMounted(true);
}, []);
React.useEffect(() => {
function onKeyDown(event: KeyboardEvent) {
if (event.key === "Escape") {
table.toggleAllRowsSelected(false);
}
}
window.addEventListener("keydown", onKeyDown);
return () => window.removeEventListener("keydown", onKeyDown);
}, [table]);
const portalContainer =
portalContainerProp ?? (mounted ? globalThis.document?.body : null);
if (!portalContainer) return null;
const visible =
visibleProp ?? table.getFilteredSelectedRowModel().rows.length > 0;
return ReactDOM.createPortal(
<AnimatePresence>
{visible && (
<motion.div
role="toolbar"
aria-orientation="horizontal"
initial={{ opacity: 0, y: 20 }}
animate={{ opacity: 1, y: 0 }}
exit={{ opacity: 0, y: 20 }}
transition={{ duration: 0.2, ease: "easeInOut" }}
className={cn(
"fixed inset-x-0 bottom-6 z-50 mx-auto flex w-fit flex-wrap items-center justify-center gap-2 rounded-md border bg-background p-2 text-foreground shadow-sm",
className,
)}
{...props}
>
{children}
</motion.div>
)}
</AnimatePresence>,
portalContainer,
);
}
interface DataTableActionBarActionProps
extends React.ComponentProps<typeof Button> {
tooltip?: string;
isPending?: boolean;
}
function DataTableActionBarAction({
size = "sm",
tooltip,
isPending,
disabled,
className,
children,
...props
}: DataTableActionBarActionProps) {
const trigger = (
<Button
variant="secondary"
size={size}
className={cn(
"gap-1.5 border border-secondary bg-secondary/50 hover:bg-secondary/70 [&>svg]:size-3.5",
size === "icon" ? "size-7" : "h-7",
className,
)}
disabled={disabled || isPending}
{...props}
>
{isPending ? (
<div className="w-3.5 h-3.5 rounded-full border border-current animate-spin border-t-transparent" />
) : (
children
)}
</Button>
);
if (!tooltip) return trigger;
return (
<Tooltip>
<TooltipTrigger asChild>{trigger}</TooltipTrigger>
<TooltipContent
sideOffset={6}
className="border bg-accent font-semibold text-foreground dark:bg-zinc-900 [&>span]:hidden"
>
<p>{tooltip}</p>
</TooltipContent>
</Tooltip>
);
}
interface DataTableActionBarSelectionProps<TData> {
table: Table<TData>;
}
function DataTableActionBarSelection<TData>({
table,
}: DataTableActionBarSelectionProps<TData>) {
const onClearSelection = React.useCallback(() => {
table.toggleAllRowsSelected(false);
}, [table]);
return (
<div className="flex h-7 items-center rounded-md border pr-1 pl-2.5">
<span className="whitespace-nowrap text-xs">
{table.getFilteredSelectedRowModel().rows.length} selected
</span>
<div className="mr-1 ml-2 h-4 w-px bg-border" />
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="icon"
className="size-5"
onClick={onClearSelection}
>
<LuX className="size-3.5" />
</Button>
</TooltipTrigger>
<TooltipContent
sideOffset={10}
className="flex items-center gap-2 border bg-accent px-2 py-1 font-semibold text-foreground dark:bg-zinc-900 [&>span]:hidden"
>
<p>Clear selection</p>
<kbd className="select-none rounded border bg-background px-1.5 py-px font-mono font-normal text-[0.7rem] text-foreground shadow-xs">
<abbr title="Escape" className="no-underline">
Esc
</abbr>
</kbd>
</TooltipContent>
</Tooltip>
</div>
);
}
export {
DataTableActionBar,
DataTableActionBarAction,
DataTableActionBarSelection,
};
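When no explicit `visible` prop is passed, the bar above derives its visibility from the table's filtered selection via `visibleProp ?? table.getFilteredSelectedRowModel().rows.length > 0`. The same rule restated in isolation (helper name is illustrative):

```typescript
// Illustrative restatement of the action bar's visibility rule: an explicit
// `visible` prop always wins; otherwise the bar shows while any rows are
// selected.
function isActionBarVisible(
  visibleProp: boolean | undefined,
  selectedRowCount: number,
): boolean {
  return visibleProp ?? selectedRowCount > 0;
}
```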
+14 -8
@@ -19,7 +19,8 @@ interface DeleteConfirmationDialogProps {
description: string;
confirmButtonText?: string;
isLoading?: boolean;
profileNames?: string[];
profileIds?: string[];
profiles?: { id: string; name: string }[];
}
export function DeleteConfirmationDialog({
@@ -30,7 +31,8 @@ export function DeleteConfirmationDialog({
description,
confirmButtonText = "Delete",
isLoading = false,
profileNames,
profileIds,
profiles = [],
}: DeleteConfirmationDialogProps) {
const handleConfirm = async () => {
await onConfirm();
@@ -42,18 +44,22 @@ export function DeleteConfirmationDialog({
<DialogHeader>
<DialogTitle>{title}</DialogTitle>
<DialogDescription>{description}</DialogDescription>
{profileNames && profileNames.length > 0 && (
{profileIds && profileIds.length > 0 && (
<div className="mt-4">
<p className="text-sm font-medium mb-2">
Profiles to be deleted:
</p>
<div className="bg-muted rounded-md p-3 max-h-32 overflow-y-auto">
<ul className="space-y-1">
{profileNames.map((name) => (
<li key={name} className="text-sm text-muted-foreground">
{name}
</li>
))}
{profileIds.map((id) => {
const profile = profiles.find((p) => p.id === id);
const displayName = profile ? profile.name : id;
return (
<li key={id} className="text-sm text-muted-foreground">
{displayName}
</li>
);
})}
</ul>
</div>
</div>
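The list above resolves each profile id to its name for display and falls back to the raw id when no match is found. The same lookup, condensed (types are illustrative):

```typescript
// Illustrative id-to-name lookup mirroring the dialog's fallback behavior:
// unknown ids are rendered as-is rather than dropped.
interface ProfileRef {
  id: string;
  name: string;
}

function resolveDisplayName(id: string, profiles: ProfileRef[]): string {
  return profiles.find((p) => p.id === id)?.name ?? id;
}
```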
+4 -4
@@ -74,13 +74,13 @@ export function DeleteGroupDialog({
try {
if (deleteAction === "delete" && associatedProfiles.length > 0) {
// Delete all associated profiles first
const profileNames = associatedProfiles.map((p) => p.name);
await invoke("delete_selected_profiles", { profileNames });
const profileIds = associatedProfiles.map((p) => p.id);
await invoke("delete_selected_profiles", { profileIds });
} else if (deleteAction === "move" && associatedProfiles.length > 0) {
// Move profiles to default group (null group_id)
const profileNames = associatedProfiles.map((p) => p.name);
const profileIds = associatedProfiles.map((p) => p.id);
await invoke("assign_profiles_to_group", {
profileNames,
profileIds,
groupId: null,
});
}
+25
@@ -0,0 +1,25 @@
import { getFlagIconClass } from "@/lib/flag-utils";
import { cn } from "@/lib/utils";
interface FlagIconProps {
countryCode?: string;
className?: string;
squared?: boolean;
}
export function FlagIcon({
countryCode,
className,
squared = false,
}: FlagIconProps) {
if (!countryCode) {
return null;
}
const flagClass = getFlagIconClass(countryCode);
if (!flagClass) {
return null;
}
return <span className={cn(flagClass, squared && "fis", className)} />;
}
+16 -7
@@ -22,7 +22,7 @@ import {
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import type { ProfileGroup } from "@/types";
import type { BrowserProfile, ProfileGroup } from "@/types";
import { RippleButton } from "./ui/ripple";
interface GroupAssignmentDialogProps {
@@ -30,6 +30,7 @@ interface GroupAssignmentDialogProps {
onClose: () => void;
selectedProfiles: string[];
onAssignmentComplete: () => void;
profiles?: BrowserProfile[];
}
export function GroupAssignmentDialog({
@@ -37,6 +38,7 @@ export function GroupAssignmentDialog({
onClose,
selectedProfiles,
onAssignmentComplete,
profiles = [],
}: GroupAssignmentDialogProps) {
const [groups, setGroups] = useState<ProfileGroup[]>([]);
const [selectedGroupId, setSelectedGroupId] = useState<string | null>(null);
@@ -64,7 +66,7 @@ export function GroupAssignmentDialog({
setError(null);
try {
await invoke("assign_profiles_to_group", {
profileNames: selectedProfiles,
profileIds: selectedProfiles,
groupId: selectedGroupId,
});
@@ -119,11 +121,18 @@ export function GroupAssignmentDialog({
<Label>Selected Profiles:</Label>
<div className="p-3 bg-muted rounded-md max-h-32 overflow-y-auto">
<ul className="text-sm space-y-1">
{selectedProfiles.map((profileName) => (
<li key={profileName} className="truncate">
{profileName}
</li>
))}
{selectedProfiles.map((profileId) => {
// Find the profile name for display
const profile = profiles.find(
(p: BrowserProfile) => p.id === profileId,
);
const displayName = profile ? profile.name : profileId;
return (
<li key={profileId} className="truncate">
{displayName}
</li>
);
})}
</ul>
</div>
</div>
+161 -17
@@ -1,5 +1,6 @@
"use client";
import { useCallback, useEffect, useRef, useState } from "react";
import { Badge } from "@/components/ui/badge";
import type { GroupWithCount } from "@/types";
@@ -17,9 +18,124 @@ export function GroupBadges({
groups,
isLoading,
}: GroupBadgesProps) {
const scrollContainerRef = useRef<HTMLDivElement>(null);
const [showLeftFade, setShowLeftFade] = useState(false);
const [showRightFade, setShowRightFade] = useState(false);
const [isDragging, setIsDragging] = useState(false);
const dragStartRef = useRef<{ x: number; scrollLeft: number } | null>(null);
const hasMovedRef = useRef(false);
const clickBlockedRef = useRef(false);
const checkScrollPosition = useCallback(() => {
const container = scrollContainerRef.current;
if (!container) return;
const { scrollLeft, scrollWidth, clientWidth } = container;
setShowLeftFade(scrollLeft > 0);
setShowRightFade(scrollLeft < scrollWidth - clientWidth - 1);
}, []);
const handleMouseDown = useCallback((e: React.MouseEvent) => {
const container = scrollContainerRef.current;
if (!container) return;
e.preventDefault();
dragStartRef.current = {
x: e.clientX,
scrollLeft: container.scrollLeft,
};
hasMovedRef.current = false;
setIsDragging(true);
container.style.cursor = "grabbing";
container.style.userSelect = "none";
}, []);
const handleMouseMove = useCallback(
(e: MouseEvent) => {
if (!isDragging || !dragStartRef.current) return;
const container = scrollContainerRef.current;
if (!container) return;
const deltaX = e.clientX - dragStartRef.current.x;
const distance = Math.abs(deltaX);
if (distance > 5) {
hasMovedRef.current = true;
}
container.scrollLeft = dragStartRef.current.scrollLeft - deltaX;
checkScrollPosition();
},
[isDragging, checkScrollPosition],
);
const handleMouseUp = useCallback(() => {
if (!isDragging) return;
const container = scrollContainerRef.current;
if (container) {
container.style.cursor = "";
container.style.userSelect = "";
}
clickBlockedRef.current = hasMovedRef.current;
setIsDragging(false);
dragStartRef.current = null;
setTimeout(() => {
hasMovedRef.current = false;
clickBlockedRef.current = false;
}, 100);
}, [isDragging]);
useEffect(() => {
if (isDragging) {
document.addEventListener("mousemove", handleMouseMove);
document.addEventListener("mouseup", handleMouseUp);
return () => {
document.removeEventListener("mousemove", handleMouseMove);
document.removeEventListener("mouseup", handleMouseUp);
};
}
}, [isDragging, handleMouseMove, handleMouseUp]);
useEffect(() => {
const container = scrollContainerRef.current;
if (!container) return;
checkScrollPosition();
container.addEventListener("scroll", checkScrollPosition);
const resizeObserver = new ResizeObserver(checkScrollPosition);
resizeObserver.observe(container);
return () => {
container.removeEventListener("scroll", checkScrollPosition);
resizeObserver.disconnect();
};
}, [checkScrollPosition]);
useEffect(() => {
if (groups.length === 0) {
setShowLeftFade(false);
setShowRightFade(false);
return;
}
const container = scrollContainerRef.current;
if (!container) return;
requestAnimationFrame(() => {
requestAnimationFrame(() => {
checkScrollPosition();
});
});
}, [groups, checkScrollPosition]);
if (isLoading && !groups.length) {
return (
<div className="flex flex-wrap gap-2 mb-4">
<div className="flex gap-2 mb-4">
<div className="flex items-center gap-2 px-4.5 py-1.5 text-xs">
Loading groups...
</div>
@@ -28,22 +144,50 @@ export function GroupBadges({
}
return (
<div className="flex flex-wrap gap-2 mb-4">
{groups.map((group) => (
<Badge
key={group.id}
variant={selectedGroupId === group.id ? "default" : "secondary"}
className="flex gap-2 items-center px-3 py-1 transition-colors cursor-pointer dark:hover:bg-primary/60 hover:bg-primary/80"
onClick={() => {
onGroupSelect(selectedGroupId === group.id ? "default" : group.id);
}}
>
<span>{group.name}</span>
<span className="bg-background/20 text-xs px-1.5 py-0.5 rounded-sm">
{group.count}
</span>
</Badge>
))}
<div className="relative mb-4">
{showLeftFade && (
<div className="absolute left-0 top-0 bottom-0 w-8 bg-gradient-to-r from-background to-transparent pointer-events-none z-10" />
)}
{showRightFade && (
<div className="absolute right-0 top-0 bottom-0 w-8 bg-gradient-to-l from-background to-transparent pointer-events-none z-10" />
)}
<div
ref={scrollContainerRef}
role="region"
aria-label="Profile groups"
className={`flex gap-2 overflow-x-auto pb-2 -mb-2 [scrollbar-width:none] [-ms-overflow-style:none] [&::-webkit-scrollbar]:hidden ${isDragging ? "cursor-grabbing" : "cursor-grab"}`}
onScroll={checkScrollPosition}
onMouseDown={handleMouseDown}
>
{groups.map((group) => (
<Badge
key={group.id}
variant={selectedGroupId === group.id ? "default" : "secondary"}
className="flex gap-2 items-center px-3 py-1 transition-colors cursor-pointer dark:hover:bg-primary/60 hover:bg-primary/80 flex-shrink-0"
onClick={(e) => {
if (hasMovedRef.current || clickBlockedRef.current) {
e.preventDefault();
e.stopPropagation();
return;
}
onGroupSelect(
selectedGroupId === group.id ? "default" : group.id,
);
}}
onMouseDown={(e) => {
if (isDragging) {
e.preventDefault();
e.stopPropagation();
}
}}
>
<span>{group.name}</span>
<span className="bg-background/20 text-xs px-1.5 py-0.5 rounded-sm">
{group.count}
</span>
</Badge>
))}
</div>
</div>
);
}
+77 -50
@@ -7,6 +7,7 @@ import { LuPencil, LuTrash2 } from "react-icons/lu";
import { CreateGroupDialog } from "@/components/create-group-dialog";
import { DeleteGroupDialog } from "@/components/delete-group-dialog";
import { EditGroupDialog } from "@/components/edit-group-dialog";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import {
Dialog,
@@ -17,6 +18,7 @@ import {
DialogTitle,
} from "@/components/ui/dialog";
import { Label } from "@/components/ui/label";
import { ScrollArea } from "@/components/ui/scroll-area";
import {
Table,
TableBody,
@@ -25,7 +27,12 @@ import {
TableHeader,
TableRow,
} from "@/components/ui/table";
import type { ProfileGroup } from "@/types";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import type { GroupWithCount, ProfileGroup } from "@/types";
import { RippleButton } from "./ui/ripple";
interface GroupManagementDialogProps {
@@ -39,7 +46,7 @@ export function GroupManagementDialog({
onClose,
onGroupManagementComplete,
}: GroupManagementDialogProps) {
const [groups, setGroups] = useState<ProfileGroup[]>([]);
const [groups, setGroups] = useState<GroupWithCount[]>([]);
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
@@ -47,13 +54,17 @@ export function GroupManagementDialog({
const [createDialogOpen, setCreateDialogOpen] = useState(false);
const [editDialogOpen, setEditDialogOpen] = useState(false);
const [deleteDialogOpen, setDeleteDialogOpen] = useState(false);
const [selectedGroup, setSelectedGroup] = useState<ProfileGroup | null>(null);
const [selectedGroup, setSelectedGroup] = useState<GroupWithCount | null>(
null,
);
const loadGroups = useCallback(async () => {
setIsLoading(true);
setError(null);
try {
const groupList = await invoke<ProfileGroup[]>("get_profile_groups");
const groupList = await invoke<GroupWithCount[]>(
"get_groups_with_profile_counts",
);
setGroups(groupList);
} catch (err) {
console.error("Failed to load groups:", err);
@@ -64,23 +75,19 @@ export function GroupManagementDialog({
}, []);
const handleGroupCreated = useCallback(
(newGroup: ProfileGroup) => {
setGroups((prev) => [...prev, newGroup]);
(_newGroup: ProfileGroup) => {
void loadGroups();
onGroupManagementComplete();
},
[onGroupManagementComplete],
[loadGroups, onGroupManagementComplete],
);
const handleGroupUpdated = useCallback(
(updatedGroup: ProfileGroup) => {
setGroups((prev) =>
prev.map((group) =>
group.id === updatedGroup.id ? updatedGroup : group,
),
);
(_updatedGroup: ProfileGroup) => {
void loadGroups();
onGroupManagementComplete();
},
[onGroupManagementComplete],
[loadGroups, onGroupManagementComplete],
);
const handleGroupDeleted = useCallback(() => {
@@ -88,12 +95,12 @@ export function GroupManagementDialog({
onGroupManagementComplete();
}, [loadGroups, onGroupManagementComplete]);
const handleEditGroup = useCallback((group: ProfileGroup) => {
const handleEditGroup = useCallback((group: GroupWithCount) => {
setSelectedGroup(group);
setEditDialogOpen(true);
}, []);
const handleDeleteGroup = useCallback((group: ProfileGroup) => {
const handleDeleteGroup = useCallback((group: GroupWithCount) => {
setSelectedGroup(group);
setDeleteDialogOpen(true);
}, []);
@@ -148,41 +155,61 @@ export function GroupManagementDialog({
</div>
) : (
<div className="border rounded-md">
<Table>
<TableHeader>
<TableRow>
<TableHead>Name</TableHead>
<TableHead className="w-24">Actions</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{groups.map((group) => (
<TableRow key={group.id}>
<TableCell className="font-medium">
{group.name}
</TableCell>
<TableCell>
<div className="flex gap-1">
<Button
variant="ghost"
size="sm"
onClick={() => handleEditGroup(group)}
>
<LuPencil className="w-4 h-4" />
</Button>
<Button
variant="ghost"
size="sm"
onClick={() => handleDeleteGroup(group)}
>
<LuTrash2 className="w-4 h-4" />
</Button>
</div>
</TableCell>
<ScrollArea className="h-[240px]">
<Table>
<TableHeader>
<TableRow>
<TableHead>Name</TableHead>
<TableHead className="w-20">Profiles</TableHead>
<TableHead className="w-24">Actions</TableHead>
</TableRow>
))}
</TableBody>
</Table>
</TableHeader>
<TableBody>
{groups.map((group) => (
<TableRow key={group.id}>
<TableCell className="font-medium">
{group.name}
</TableCell>
<TableCell>
<Badge variant="secondary">{group.count}</Badge>
</TableCell>
<TableCell>
<div className="flex gap-1">
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="sm"
onClick={() => handleEditGroup(group)}
>
<LuPencil className="w-4 h-4" />
</Button>
</TooltipTrigger>
<TooltipContent>
<p>Edit group</p>
</TooltipContent>
</Tooltip>
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="sm"
onClick={() => handleDeleteGroup(group)}
>
<LuTrash2 className="w-4 h-4" />
</Button>
</TooltipTrigger>
<TooltipContent>
<p>Delete group</p>
</TooltipContent>
</Tooltip>
</div>
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</ScrollArea>
</div>
)}
</div>
+51 -48
@@ -1,7 +1,7 @@
import { FaDownload } from "react-icons/fa";
import { FiWifi } from "react-icons/fi";
import { GoGear, GoKebabHorizontal, GoPlus } from "react-icons/go";
import { LuTrash2, LuUsers } from "react-icons/lu";
import { LuSearch, LuUsers, LuX } from "react-icons/lu";
import { Logo } from "./icons/logo";
import { Button } from "./ui/button";
import { CardTitle } from "./ui/card";
@@ -11,29 +11,27 @@ import {
DropdownMenuItem,
DropdownMenuTrigger,
} from "./ui/dropdown-menu";
import { RippleButton } from "./ui/ripple";
import { Input } from "./ui/input";
import { Tooltip, TooltipContent, TooltipTrigger } from "./ui/tooltip";
type Props = {
selectedProfiles: string[];
onBulkGroupAssignment: () => void;
onBulkDelete: () => void;
onSettingsDialogOpen: (open: boolean) => void;
onProxyManagementDialogOpen: (open: boolean) => void;
onGroupManagementDialogOpen: (open: boolean) => void;
onImportProfileDialogOpen: (open: boolean) => void;
onCreateProfileDialogOpen: (open: boolean) => void;
searchQuery: string;
onSearchQueryChange: (query: string) => void;
};
const HomeHeader = ({
selectedProfiles,
onBulkGroupAssignment,
onBulkDelete,
onSettingsDialogOpen,
onProxyManagementDialogOpen,
onGroupManagementDialogOpen,
onImportProfileDialogOpen,
onCreateProfileDialogOpen,
searchQuery,
onSearchQueryChange,
}: Props) => {
const handleLogoClick = () => {
// Trigger the same URL handling logic as if the URL came from the system
@@ -43,7 +41,7 @@ const HomeHeader = ({
window.dispatchEvent(event);
};
return (
<div className="flex justify-between items-center">
<div className="flex justify-between items-center mt-6">
<div className="flex gap-3 items-center">
<button
type="button"
@@ -53,47 +51,47 @@ const HomeHeader = ({
>
<Logo className="w-10 h-10 transition-transform duration-300 ease-out will-change-transform hover:scale-110" />
</button>
{selectedProfiles.length > 0 ? (
<div className="flex gap-3 items-center">
<span className="text-sm font-medium">
{selectedProfiles.length} profile
{selectedProfiles.length !== 1 ? "s" : ""} selected
</span>
<div className="flex gap-2">
<RippleButton
variant="outline"
size="sm"
onClick={onBulkGroupAssignment}
className="flex gap-2 items-center"
>
<LuUsers className="w-4 h-4" />
Assign to Group
</RippleButton>
<RippleButton
variant="destructive"
size="sm"
onClick={onBulkDelete}
className="flex gap-2 items-center"
>
<LuTrash2 className="w-4 h-4" />
Delete Selected
</RippleButton>
</div>
</div>
) : (
<CardTitle>Donut</CardTitle>
)}
<CardTitle>Donut</CardTitle>
</div>
<div className="flex gap-2 items-center">
<div className="relative">
<Input
type="text"
placeholder="Search profiles..."
value={searchQuery}
onChange={(e) => onSearchQueryChange(e.target.value)}
className="pr-8 pl-10 w-48"
/>
<LuSearch className="absolute left-3 top-1/2 w-4 h-4 transform -translate-y-1/2 text-muted-foreground" />
{searchQuery && (
<button
type="button"
onClick={() => onSearchQueryChange("")}
className="absolute right-2 top-1/2 p-1 rounded-sm transition-colors transform -translate-y-1/2 hover:bg-accent"
aria-label="Clear search"
>
<LuX className="w-4 h-4 text-muted-foreground hover:text-foreground" />
</button>
)}
</div>
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button
size="sm"
variant="outline"
className="flex gap-2 items-center"
>
<GoKebabHorizontal className="w-4 h-4" />
</Button>
<span>
<Tooltip>
<TooltipTrigger asChild>
<span>
<Button
size="sm"
variant="outline"
className="flex gap-2 items-center h-[36px]"
>
<GoKebabHorizontal className="w-4 h-4" />
</Button>
</span>
</TooltipTrigger>
<TooltipContent>More actions</TooltipContent>
</Tooltip>
</span>
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem
@@ -138,13 +136,18 @@ const HomeHeader = ({
onClick={() => {
onCreateProfileDialogOpen(true);
}}
className="flex gap-2 items-center"
className="flex gap-2 items-center h-[36px]"
>
<GoPlus className="w-4 h-4" />
</Button>
</span>
</TooltipTrigger>
<TooltipContent>Create a new profile</TooltipContent>
<TooltipContent
arrowOffset={-8}
style={{ transform: "translateX(-8px)" }}
>
Create a new profile
</TooltipContent>
</Tooltip>
</div>
</div>
+1 -19
@@ -31,13 +31,11 @@ import { RippleButton } from "./ui/ripple";
interface ImportProfileDialogProps {
isOpen: boolean;
onClose: () => void;
onImportComplete?: () => void;
}
export function ImportProfileDialog({
isOpen,
onClose,
onImportComplete,
}: ImportProfileDialogProps) {
const [detectedProfiles, setDetectedProfiles] = useState<DetectedProfile[]>(
[],
@@ -140,9 +138,6 @@ export function ImportProfileDialog({
toast.success(
`Successfully imported profile "${autoDetectProfileName.trim()}"`,
);
if (onImportComplete) {
onImportComplete();
}
onClose();
} catch (error) {
console.error("Failed to import profile:", error);
@@ -168,7 +163,6 @@ export function ImportProfileDialog({
selectedDetectedProfile,
autoDetectProfileName,
detectedProfiles,
onImportComplete,
onClose,
]);
@@ -193,9 +187,6 @@ export function ImportProfileDialog({
toast.success(
`Successfully imported profile "${manualProfileName.trim()}"`,
);
if (onImportComplete) {
onImportComplete();
}
onClose();
} catch (error) {
console.error("Failed to import profile:", error);
@@ -217,13 +208,7 @@ export function ImportProfileDialog({
} finally {
setIsImporting(false);
}
}, [
manualBrowserType,
manualProfilePath,
manualProfileName,
onImportComplete,
onClose,
]);
}, [manualBrowserType, manualProfilePath, manualProfileName, onClose]);
const handleClose = () => {
setSelectedDetectedProfile(null);
@@ -345,9 +330,6 @@ export function ImportProfileDialog({
<span className="font-medium">
{profile.name}
</span>
<span className="text-xs text-muted-foreground">
{profile.description}
</span>
</div>
</div>
</SelectItem>
+43 -4
@@ -287,6 +287,7 @@ const MultipleSelector = React.forwardRef<
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [debouncedSearchTerm, groupBy, open, triggerSearchOnFocus, onSearch]);
// biome-ignore lint/correctness/noNestedComponentDefinitions: public code, TODO: fix
const CreatableItem = () => {
if (!creatable) return undefined;
if (
@@ -352,6 +353,10 @@ const MultipleSelector = React.forwardRef<
[options, selected],
);
const hasAvailableOptions = React.useMemo(() => {
return Object.values(selectables).some((group) => group.length > 0);
}, [selectables]);
/** Avoid Creatable Selector freezing or lagging when paste a long string. */
const commandFilter = React.useCallback(() => {
if (commandProps?.filter) {
@@ -416,7 +421,7 @@ const MultipleSelector = React.forwardRef<
<button
type="button"
className={cn(
"ml-1 rounded-full outline-none ring-offset-background focus:ring-2 focus:ring-ring focus:ring-offset-2",
"cursor-pointer ml-1 rounded-full outline-none ring-offset-background focus:ring-2 focus:ring-ring focus:ring-offset-2",
(disabled || option.fixed) && "hidden",
)}
onKeyDown={(e) => {
@@ -445,6 +450,40 @@ const MultipleSelector = React.forwardRef<
setInputValue(value);
inputProps?.onValueChange?.(value);
}}
onKeyDown={(e) => {
// Allow consumer to handle first
inputProps?.onKeyDown?.(
e as unknown as React.KeyboardEvent<HTMLInputElement>,
);
if (e.defaultPrevented) return;
if (e.key === "Enter") {
const value = inputValue.trim();
if (value.length === 0) return;
// If option already exists among available options, pick that; otherwise create
const entries = Object.values(options).flat();
const existing = entries.find(
(o) => o.value === value && !o.disable,
);
// Prevent duplicates in the current selection
if (
selected.some((s) => s.value === (existing?.value ?? value))
) {
e.preventDefault();
setInputValue("");
return;
}
if (selected.length >= maxSelected) {
onMaxSelected?.(selected.length);
return;
}
e.preventDefault();
setInputValue("");
const picked = existing ?? { value, label: value };
const newOptions = [...selected, picked];
setSelected(newOptions);
onChange?.(newOptions);
}
}}
onBlur={(event) => {
setOpen(false);
inputProps?.onBlur?.(event);
@@ -465,7 +504,7 @@ const MultipleSelector = React.forwardRef<
"flex-1 bg-transparent outline-none placeholder:text-muted-foreground",
{
"w-full": hidePlaceholderWhenSelected,
"px-3 py-2": selected.length === 0,
"px-3 mt-1": selected.length === 0,
"ml-1": selected.length !== 0,
},
inputProps?.className,
@@ -474,7 +513,7 @@ const MultipleSelector = React.forwardRef<
</div>
</div>
<div className="relative">
{open && (
{open && hasAvailableOptions && (
<CommandList className="absolute top-1 z-10 w-full rounded-md border shadow-md outline-none bg-popover text-popover-foreground animate-in">
{isLoading ? (
loadingIndicator
@@ -489,7 +528,7 @@ const MultipleSelector = React.forwardRef<
<CommandGroup
key={key}
heading={key}
className="overflow-auto h-full"
className="overflow-auto h-24"
>
{dropdowns.map((option) => {
return (
File diff suppressed because it is too large
+55 -87
@@ -2,8 +2,6 @@
import { invoke } from "@tauri-apps/api/core";
import { useCallback, useEffect, useState } from "react";
import { LuCopy } from "react-icons/lu";
import { toast } from "sonner";
import { LoadingButton } from "@/components/loading-button";
import { Badge } from "@/components/ui/badge";
import {
@@ -27,8 +25,11 @@ import {
TooltipTrigger,
} from "@/components/ui/tooltip";
import { useBrowserState } from "@/hooks/use-browser-state";
import { useProfileEvents } from "@/hooks/use-profile-events";
import { useProxyEvents } from "@/hooks/use-proxy-events";
import { getBrowserDisplayName, getBrowserIcon } from "@/lib/browser-utils";
import type { BrowserProfile, StoredProxy } from "@/types";
import type { BrowserProfile } from "@/types";
import { CopyToClipboard } from "./ui/copy-to-clipboard";
import { RippleButton } from "./ui/ripple";
interface ProfileSelectorDialogProps {
@@ -43,14 +44,19 @@ export function ProfileSelectorDialog({
isOpen,
onClose,
url,
runningProfiles = new Set(),
runningProfiles: externalRunningProfiles,
isUpdating,
}: ProfileSelectorDialogProps) {
const [profiles, setProfiles] = useState<BrowserProfile[]>([]);
// Use the centralized profile events hook
const { profiles, runningProfiles: hookRunningProfiles } = useProfileEvents();
// Use external runningProfiles if provided, otherwise use hook's runningProfiles
const runningProfiles = externalRunningProfiles || hookRunningProfiles;
const { storedProxies } = useProxyEvents();
const [selectedProfile, setSelectedProfile] = useState<string | null>(null);
const [isLoading, setIsLoading] = useState(false);
const [isLaunching, setIsLaunching] = useState(false);
const [storedProxies, setStoredProxies] = useState<StoredProxy[]>([]);
const [launchingProfiles, setLaunchingProfiles] = useState<Set<string>>(
new Set(),
);
@@ -77,48 +83,6 @@ export function ProfileSelectorDialog({
[storedProxies],
);
const loadProfiles = useCallback(async () => {
setIsLoading(true);
try {
// Load both profiles and stored proxies
const [profileList, proxiesList] = await Promise.all([
invoke<BrowserProfile[]>("list_browser_profiles"),
invoke<StoredProxy[]>("get_stored_proxies"),
]);
// Sort profiles by name
profileList.sort((a, b) => a.name.localeCompare(b.name));
// Set both profiles and proxies
setProfiles(profileList);
setStoredProxies(proxiesList);
// Auto-select first available profile for link opening
if (profileList.length > 0) {
// First, try to find a running profile that can be used for opening links
const runningAvailableProfile = profileList.find((profile) => {
const isRunning = runningProfiles.has(profile.name);
// Simple check without browserState dependency
return (
isRunning &&
profile.browser !== "tor-browser" &&
profile.browser !== "mullvad-browser"
);
});
if (runningAvailableProfile) {
setSelectedProfile(runningAvailableProfile.name);
} else {
setSelectedProfile(profileList[0].name);
}
}
} catch (err) {
console.error("Failed to load profiles:", err);
} finally {
setIsLoading(false);
}
}, [runningProfiles]);
// Helper function to get tooltip content for profiles - now uses shared hook
const getProfileTooltipContent = (profile: BrowserProfile): string | null => {
return browserState.getProfileTooltipContent(profile);
@@ -128,10 +92,13 @@ export function ProfileSelectorDialog({
if (!selectedProfile || !url) return;
setIsLaunching(true);
setLaunchingProfiles((prev) => new Set(prev).add(selectedProfile));
const selected = profiles.find((p) => p.name === selectedProfile);
if (!selected) return;
setLaunchingProfiles((prev) => new Set(prev).add(selected.id));
try {
await invoke("open_url_with_profile", {
profileName: selectedProfile,
profileId: selected.id,
url,
});
onClose();
@@ -139,31 +106,21 @@ export function ProfileSelectorDialog({
console.error("Failed to open URL with profile:", error);
} finally {
setIsLaunching(false);
setLaunchingProfiles((prev) => {
const next = new Set(prev);
next.delete(selectedProfile);
return next;
});
if (selected) {
setLaunchingProfiles((prev) => {
const next = new Set(prev);
next.delete(selected.id);
return next;
});
}
}
}, [selectedProfile, url, onClose]);
}, [selectedProfile, url, onClose, profiles]);
const handleCancel = useCallback(() => {
setSelectedProfile(null);
onClose();
}, [onClose]);
const handleCopyUrl = useCallback(async () => {
if (!url) return;
try {
await navigator.clipboard.writeText(url);
toast.success("URL copied to clipboard!");
} catch (error) {
console.error("Failed to copy URL:", error);
toast.error("Failed to copy URL to clipboard");
}
}, [url]);
const selectedProfileData = profiles.find((p) => p.name === selectedProfile);
// Check if the selected profile can be used for opening links
@@ -178,11 +135,31 @@ export function ProfileSelectorDialog({
return getProfileTooltipContent(selectedProfileData);
};
// Auto-select first available profile when dialog opens and profiles are loaded
useEffect(() => {
if (isOpen) {
void loadProfiles();
if (isOpen && profiles.length > 0 && !selectedProfile) {
// First, try to find a running profile that can be used for opening links
const runningAvailableProfile = profiles.find((profile) => {
const isRunning = runningProfiles.has(profile.id);
// Simple check without browserState dependency
return (
isRunning &&
profile.browser !== "tor-browser" &&
profile.browser !== "mullvad-browser"
);
});
if (runningAvailableProfile) {
setSelectedProfile(runningAvailableProfile.name);
} else {
// Sort profiles by name and select first
const sortedProfiles = [...profiles].sort((a, b) =>
a.name.localeCompare(b.name),
);
setSelectedProfile(sortedProfiles[0].name);
}
}
}, [isOpen, loadProfiles]);
}, [isOpen, profiles, selectedProfile, runningProfiles]);
return (
<Dialog open={isOpen} onOpenChange={onClose}>
@@ -196,15 +173,10 @@ export function ProfileSelectorDialog({
<div className="space-y-2">
<div className="flex justify-between items-center">
<Label className="text-sm font-medium">Opening URL:</Label>
<RippleButton
variant="outline"
size="sm"
onClick={() => void handleCopyUrl()}
className="flex gap-2 items-center"
>
<LuCopy className="w-3 h-3" />
Copy
</RippleButton>
<CopyToClipboard
text={url}
successMessage="URL copied to clipboard!"
/>
</div>
<div className="p-2 text-sm break-all rounded bg-muted">
{url}
@@ -214,11 +186,7 @@ export function ProfileSelectorDialog({
<div className="space-y-2">
<Label htmlFor="profile-select">Select Profile:</Label>
{isLoading ? (
<div className="text-sm text-muted-foreground">
Loading profiles...
</div>
) : profiles.length === 0 ? (
{profiles.length === 0 ? (
<div className="space-y-2">
<div className="text-sm text-muted-foreground">
No profiles available. Please create a profile first.
@@ -238,7 +206,7 @@ export function ProfileSelectorDialog({
</SelectTrigger>
<SelectContent>
{profiles.map((profile) => {
const isRunning = runningProfiles.has(profile.name);
const isRunning = runningProfiles.has(profile.id);
const canUseForLinks =
browserState.canUseProfileForLinks(profile);
const tooltipContent = getProfileTooltipContent(profile);
+185
@@ -0,0 +1,185 @@
"use client";
import { invoke } from "@tauri-apps/api/core";
import { emit } from "@tauri-apps/api/event";
import { useCallback, useEffect, useState } from "react";
import { toast } from "sonner";
import { LoadingButton } from "@/components/loading-button";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { Label } from "@/components/ui/label";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import type { BrowserProfile, StoredProxy } from "@/types";
import { RippleButton } from "./ui/ripple";
interface ProxyAssignmentDialogProps {
isOpen: boolean;
onClose: () => void;
selectedProfiles: string[];
onAssignmentComplete: () => void;
profiles?: BrowserProfile[];
storedProxies?: StoredProxy[];
}
export function ProxyAssignmentDialog({
isOpen,
onClose,
selectedProfiles,
onAssignmentComplete,
profiles = [],
storedProxies = [],
}: ProxyAssignmentDialogProps) {
const [selectedProxyId, setSelectedProxyId] = useState<string | null>(null);
const [isAssigning, setIsAssigning] = useState(false);
const [error, setError] = useState<string | null>(null);
const handleAssign = useCallback(async () => {
setIsAssigning(true);
setError(null);
try {
// Filter out TOR browser profiles as they don't support proxies
const validProfiles = selectedProfiles.filter((profileId) => {
const profile = profiles.find((p) => p.id === profileId);
return profile && profile.browser !== "tor-browser";
});
if (validProfiles.length === 0) {
setError("No valid profiles selected.");
setIsAssigning(false);
return;
}
// Update each profile's proxy sequentially to avoid file locking issues
for (const profileId of validProfiles) {
await invoke("update_profile_proxy", {
profileId,
proxyId: selectedProxyId,
});
}
// Notify other parts of the app so usage counts and lists refresh
await emit("profile-updated");
onAssignmentComplete();
onClose();
} catch (err) {
console.error("Failed to assign proxies to profiles:", err);
const errorMessage =
err instanceof Error
? err.message
: "Failed to assign proxies to profiles";
setError(errorMessage);
toast.error(errorMessage);
} finally {
setIsAssigning(false);
}
}, [
selectedProfiles,
selectedProxyId,
profiles,
onAssignmentComplete,
onClose,
]);
useEffect(() => {
if (isOpen) {
setSelectedProxyId(null);
setError(null);
}
}, [isOpen]);
return (
<Dialog open={isOpen} onOpenChange={onClose}>
<DialogContent className="max-w-md">
<DialogHeader>
<DialogTitle>Assign Proxy</DialogTitle>
<DialogDescription>
Assign a proxy to {selectedProfiles.length} selected profile(s).
</DialogDescription>
</DialogHeader>
<div className="space-y-4">
<div className="space-y-2">
<Label>Selected Profiles:</Label>
<div className="p-3 bg-muted rounded-md max-h-32 overflow-y-auto">
<ul className="text-sm space-y-1">
{selectedProfiles.map((profileId) => {
const profile = profiles.find(
(p: BrowserProfile) => p.id === profileId,
);
const displayName = profile ? profile.name : profileId;
const isTorBrowser = profile?.browser === "tor-browser";
return (
<li key={profileId} className="truncate">
{displayName}
{isTorBrowser && (
<span className="ml-2 text-xs text-muted-foreground">
(TOR - no proxy support)
</span>
)}
</li>
);
})}
</ul>
</div>
</div>
<div className="space-y-2">
<Label htmlFor="proxy-select">Assign Proxy:</Label>
<Select
value={selectedProxyId || "none"}
onValueChange={(value) => {
setSelectedProxyId(value === "none" ? null : value);
}}
>
<SelectTrigger>
<SelectValue placeholder="Select a proxy" />
</SelectTrigger>
<SelectContent>
<SelectItem value="none">No Proxy</SelectItem>
{storedProxies.map((proxy) => (
<SelectItem key={proxy.id} value={proxy.id}>
{proxy.name}
</SelectItem>
))}
</SelectContent>
</Select>
</div>
{error && (
<div className="p-3 text-sm text-red-600 bg-red-50 rounded-md dark:bg-red-900/20 dark:text-red-400">
{error}
</div>
)}
</div>
<DialogFooter>
<RippleButton
variant="outline"
onClick={onClose}
disabled={isAssigning}
>
Cancel
</RippleButton>
<LoadingButton
isLoading={isAssigning}
onClick={() => void handleAssign()}
>
Assign
</LoadingButton>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
+166
@@ -0,0 +1,166 @@
"use client";
import { invoke } from "@tauri-apps/api/core";
import * as React from "react";
import { FiCheck } from "react-icons/fi";
import { toast } from "sonner";
import { FlagIcon } from "@/components/flag-icon";
import { Button } from "@/components/ui/button";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { formatRelativeTime } from "@/lib/flag-utils";
import type { ProxyCheckResult, StoredProxy } from "@/types";
interface ProxyCheckButtonProps {
proxy: StoredProxy;
profileId: string;
checkingProfileId: string | null;
cachedResult?: ProxyCheckResult;
onCheckComplete?: (result: ProxyCheckResult) => void;
onCheckFailed?: (result: ProxyCheckResult) => void;
disabled?: boolean;
setCheckingProfileId?: (id: string | null) => void;
}
export function ProxyCheckButton({
proxy,
profileId,
checkingProfileId,
cachedResult,
onCheckComplete,
onCheckFailed,
disabled = false,
setCheckingProfileId,
}: ProxyCheckButtonProps) {
const [localResult, setLocalResult] = React.useState<
ProxyCheckResult | undefined
>(cachedResult);
React.useEffect(() => {
setLocalResult(cachedResult);
}, [cachedResult]);
const handleCheck = React.useCallback(async () => {
if (checkingProfileId === profileId) return;
setCheckingProfileId?.(profileId);
try {
const result = await invoke<ProxyCheckResult>("check_proxy_validity", {
proxyId: proxy.id,
proxySettings: proxy.proxy_settings,
});
setLocalResult(result);
onCheckComplete?.(result);
// Show toast with location
const locationParts: string[] = [];
if (result.city) locationParts.push(result.city);
if (result.country) locationParts.push(result.country);
const location =
locationParts.length > 0 ? locationParts.join(", ") : "Unknown";
toast.success(
<div className="flex flex-col">
Your proxy location is:
<div className="flex items-center whitespace-nowrap">
{location}
{result.country_code && (
<FlagIcon
countryCode={result.country_code}
className="ml-1 text-sm"
/>
)}
</div>
</div>,
);
} catch (error) {
const errorMessage =
error instanceof Error ? error.message : String(error);
toast.error(`Proxy check failed: ${errorMessage}`);
// Save failed check result
const failedResult: ProxyCheckResult = {
ip: "",
city: undefined,
country: undefined,
country_code: undefined,
timestamp: Math.floor(Date.now() / 1000),
is_valid: false,
};
setLocalResult(failedResult);
onCheckFailed?.(failedResult);
} finally {
setCheckingProfileId?.(null);
}
}, [
proxy,
profileId,
checkingProfileId,
onCheckComplete,
onCheckFailed,
setCheckingProfileId,
]);
const isCurrentlyChecking = checkingProfileId === profileId;
const result = localResult;
return (
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="sm"
className="h-7 w-7 p-0"
onClick={handleCheck}
disabled={isCurrentlyChecking || disabled}
>
{isCurrentlyChecking ? (
<div className="w-3 h-3 rounded-full border border-current animate-spin border-t-transparent" />
) : result?.is_valid && result.country_code ? (
<span className="relative inline-flex items-center justify-center">
<FlagIcon countryCode={result.country_code} className="h-2.5" />
<FiCheck className="absolute bottom-[-6px] right-[-4px]" />
</span>
) : result && !result.is_valid ? (
<span className="text-destructive text-sm"></span>
) : (
<FiCheck className="w-3 h-3" />
)}
</Button>
</TooltipTrigger>
<TooltipContent>
{isCurrentlyChecking ? (
<p>Checking proxy...</p>
) : result?.is_valid ? (
<div className="space-y-1">
<p className="flex items-center gap-1">
{result.country_code && (
<FlagIcon countryCode={result.country_code} />
)}
{[result.city, result.country].filter(Boolean).join(", ") ||
"Unknown"}
</p>
<p className="text-xs text-primary-foreground/70">
IP: {result.ip}
</p>
<p className="text-xs text-primary-foreground/70">
Checked {formatRelativeTime(result.timestamp)}
</p>
</div>
) : result && !result.is_valid ? (
<div>
<p>Proxy check failed</p>
<p className="text-xs text-primary-foreground/70">
Failed {formatRelativeTime(result.timestamp)}
</p>
</div>
) : (
<p>Check proxy validity</p>
)}
</TooltipContent>
</Tooltip>
);
}
+3 -8
@@ -35,14 +35,12 @@ interface ProxyFormData {
interface ProxyFormDialogProps {
isOpen: boolean;
onClose: () => void;
onSave: (proxy: StoredProxy) => void;
editingProxy?: StoredProxy | null;
}
export function ProxyFormDialog({
isOpen,
onClose,
onSave,
editingProxy,
}: ProxyFormDialogProps) {
const [isSubmitting, setIsSubmitting] = useState(false);
@@ -105,11 +103,9 @@ export function ProxyFormDialog({
password: formData.password.trim() || undefined,
};
let savedProxy: StoredProxy;
if (editingProxy) {
// Update existing proxy
savedProxy = await invoke<StoredProxy>("update_stored_proxy", {
await invoke("update_stored_proxy", {
proxyId: editingProxy.id,
name: formData.name.trim(),
proxySettings,
@@ -117,14 +113,13 @@ export function ProxyFormDialog({
toast.success("Proxy updated successfully");
} else {
// Create new proxy
savedProxy = await invoke<StoredProxy>("create_stored_proxy", {
await invoke("create_stored_proxy", {
name: formData.name.trim(),
proxySettings,
});
toast.success("Proxy created successfully");
}
onSave(savedProxy);
onClose();
} catch (error) {
console.error("Failed to save proxy:", error);
@@ -134,7 +129,7 @@ export function ProxyFormDialog({
} finally {
setIsSubmitting(false);
}
}, [formData, editingProxy, onSave, onClose]);
}, [formData, editingProxy, onClose]);
const handleClose = useCallback(() => {
if (!isSubmitting) {
+189 -194
@@ -1,27 +1,42 @@
"use client";
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { useCallback, useEffect, useState } from "react";
import { FiEdit2, FiPlus, FiTrash2, FiWifi } from "react-icons/fi";
import { emit } from "@tauri-apps/api/event";
import * as React from "react";
import { useCallback, useState } from "react";
import { GoPlus } from "react-icons/go";
import { LuPencil, LuTrash2 } from "react-icons/lu";
import { toast } from "sonner";
import { DeleteConfirmationDialog } from "@/components/delete-confirmation-dialog";
import { ProxyFormDialog } from "@/components/proxy-form-dialog";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { Label } from "@/components/ui/label";
import { ScrollArea } from "@/components/ui/scroll-area";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { trimName } from "@/lib/name-utils";
import type { StoredProxy } from "@/types";
import { useProxyEvents } from "@/hooks/use-proxy-events";
import type { ProxyCheckResult, StoredProxy } from "@/types";
import { ProxyCheckButton } from "./proxy-check-button";
import { RippleButton } from "./ui/ripple";
interface ProxyManagementDialogProps {
@@ -33,80 +48,61 @@ export function ProxyManagementDialog({
isOpen,
onClose,
}: ProxyManagementDialogProps) {
const [storedProxies, setStoredProxies] = useState<StoredProxy[]>([]);
const [loading, setLoading] = useState(false);
const [showProxyForm, setShowProxyForm] = useState(false);
const [editingProxy, setEditingProxy] = useState<StoredProxy | null>(null);
const [proxyUsage, setProxyUsage] = useState<Record<string, number>>({});
const [proxyToDelete, setProxyToDelete] = useState<StoredProxy | null>(null);
const [isDeleting, setIsDeleting] = useState(false);
const [checkingProxyId, setCheckingProxyId] = useState<string | null>(null);
const [proxyCheckResults, setProxyCheckResults] = useState<
Record<string, ProxyCheckResult>
>({});
const loadStoredProxies = useCallback(async () => {
try {
setLoading(true);
const proxies = await invoke<StoredProxy[]>("get_stored_proxies");
setStoredProxies(proxies);
} catch (error) {
console.error("Failed to load stored proxies:", error);
toast.error("Failed to load proxies");
} finally {
setLoading(false);
const { storedProxies, proxyUsage, isLoading } = useProxyEvents();
// Load cached check results on mount and when proxies change
React.useEffect(() => {
const loadCachedResults = async () => {
const results: Record<string, ProxyCheckResult> = {};
for (const proxy of storedProxies) {
try {
const cached = await invoke<ProxyCheckResult | null>(
"get_cached_proxy_check",
{ proxyId: proxy.id },
);
if (cached) {
results[proxy.id] = cached;
}
} catch (_error) {
// Ignore errors
}
}
setProxyCheckResults(results);
};
if (storedProxies.length > 0) {
void loadCachedResults();
}
}, [storedProxies]);
const handleDeleteProxy = useCallback((proxy: StoredProxy) => {
// Open in-app confirmation dialog
setProxyToDelete(proxy);
}, []);
const loadProxyUsage = useCallback(async () => {
const handleConfirmDelete = useCallback(async () => {
if (!proxyToDelete) return;
setIsDeleting(true);
try {
const profiles = await invoke<Array<{ proxy_id?: string }>>(
"list_browser_profiles",
);
const counts: Record<string, number> = {};
for (const p of profiles) {
if (p.proxy_id) counts[p.proxy_id] = (counts[p.proxy_id] ?? 0) + 1;
}
setProxyUsage(counts);
} catch (_err) {
// ignore non-critical errors
}
}, []);
useEffect(() => {
if (isOpen) {
loadStoredProxies();
void loadProxyUsage();
}
}, [isOpen, loadStoredProxies, loadProxyUsage]);
useEffect(() => {
let unlisten: (() => void) | undefined;
const setup = async () => {
try {
unlisten = await listen("profile-updated", () => {
void loadProxyUsage();
});
} catch (_err) {
// ignore non-critical errors
}
};
if (isOpen) void setup();
return () => {
if (unlisten) unlisten();
};
}, [isOpen, loadProxyUsage]);
const handleDeleteProxy = useCallback(async (proxy: StoredProxy) => {
if (
!confirm(`Are you sure you want to delete the proxy "${proxy.name}"?`)
) {
return;
}
try {
await invoke("delete_stored_proxy", { proxyId: proxy.id });
setStoredProxies((prev) => prev.filter((p) => p.id !== proxy.id));
await invoke("delete_stored_proxy", { proxyId: proxyToDelete.id });
toast.success("Proxy deleted successfully");
await emit("stored-proxies-changed");
} catch (error) {
console.error("Failed to delete proxy:", error);
toast.error("Failed to delete proxy");
} finally {
setIsDeleting(false);
setProxyToDelete(null);
}
}, []);
}, [proxyToDelete]);
const handleCreateProxy = useCallback(() => {
setEditingProxy(null);
@@ -118,23 +114,6 @@ export function ProxyManagementDialog({
setShowProxyForm(true);
}, []);
const handleProxySaved = useCallback((savedProxy: StoredProxy) => {
setStoredProxies((prev) => {
const existingIndex = prev.findIndex((p) => p.id === savedProxy.id);
if (existingIndex >= 0) {
// Update existing proxy
const updated = [...prev];
updated[existingIndex] = savedProxy;
return updated;
} else {
// Add new proxy
return [...prev, savedProxy];
}
});
setShowProxyForm(false);
setEditingProxy(null);
}, []);
const handleProxyFormClose = useCallback(() => {
setShowProxyForm(false);
setEditingProxy(null);
@@ -143,127 +122,135 @@ export function ProxyManagementDialog({
return (
<>
<Dialog open={isOpen} onOpenChange={onClose}>
<DialogContent className="max-w-2xl max-h-[80vh] flex flex-col">
<DialogHeader className="flex-shrink-0">
<div className="flex gap-2 items-center">
<FiWifi className="w-5 h-5" />
<DialogTitle>Proxy Management</DialogTitle>
</div>
<DialogContent className="max-w-2xl">
<DialogHeader>
<DialogTitle>Proxy Management</DialogTitle>
<DialogDescription>
Manage your saved proxy configurations for reuse across profiles
</DialogDescription>
</DialogHeader>
<div className="flex flex-col flex-1 gap-4 py-4 min-h-0">
{/* Header with Create Button */}
<div className="flex flex-shrink-0 justify-between items-center">
<div>
<h3 className="text-lg font-medium">Stored Proxies</h3>
<p className="text-sm text-muted-foreground">
Manage your saved proxy configurations for reuse across
profiles
</p>
</div>
<div className="space-y-4">
{/* Create new proxy button */}
<div className="flex justify-between items-center">
<Label>Proxies</Label>
<RippleButton
size="sm"
onClick={handleCreateProxy}
className="flex gap-2 items-center"
>
<FiPlus className="w-4 h-4" />
Create Proxy
<GoPlus className="w-4 h-4" />
Create
</RippleButton>
</div>
{/* Proxy List - Scrollable */}
<div className="flex-1 min-h-0">
{loading ? (
<div className="flex justify-center items-center h-32">
<p className="text-sm text-muted-foreground">
Loading proxies...
</p>
</div>
) : storedProxies.length === 0 ? (
<div className="flex flex-col justify-center items-center h-32 text-center">
<FiWifi className="mx-auto mb-4 w-12 h-12 text-muted-foreground" />
<p className="mb-2 text-muted-foreground">
No proxies configured
</p>
<p className="mb-4 text-sm text-muted-foreground">
Create your first proxy configuration to get started
</p>
<RippleButton variant="outline" onClick={handleCreateProxy}>
<FiPlus className="mr-2 w-4 h-4" />
Create First Proxy
</RippleButton>
</div>
) : (
<div className="overflow-y-auto pr-2 space-y-2 h-full">
{storedProxies.map((proxy) => (
<div
key={proxy.id}
className="flex justify-between items-center p-1 rounded border bg-card"
>
<div className="flex-1 ml-2 min-w-0">
{proxy.name.length > 30 ? (
<Tooltip>
<TooltipTrigger asChild>
<span className="block font-medium truncate text-card-foreground">
{trimName(proxy.name)}
</span>
</TooltipTrigger>
<TooltipContent>
<span className="text-sm font-medium text-card-foreground">
{proxy.name}
</span>
</TooltipContent>
</Tooltip>
) : (
<span className="text-sm font-medium text-card-foreground">
{/* Proxies list */}
{isLoading ? (
<div className="text-sm text-muted-foreground">
Loading proxies...
</div>
) : storedProxies.length === 0 ? (
<div className="text-sm text-muted-foreground">
No proxies created yet. Create your first proxy using the button
above.
</div>
) : (
<div className="border rounded-md">
<ScrollArea className="h-[240px]">
<Table>
<TableHeader>
<TableRow>
<TableHead>Name</TableHead>
<TableHead className="w-20">Usage</TableHead>
<TableHead className="w-24">Actions</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{storedProxies.map((proxy) => (
<TableRow key={proxy.id}>
<TableCell className="font-medium">
{proxy.name}
</span>
)}
</div>
<div className="mr-2">
<Badge variant="secondary">
{proxyUsage[proxy.id] ?? 0}
</Badge>
</div>
<div className="flex flex-shrink-0 gap-1 items-center">
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="sm"
onClick={() => handleEditProxy(proxy)}
>
<FiEdit2 className="w-4 h-4" />
</Button>
</TooltipTrigger>
<TooltipContent>
<p>Edit proxy</p>
</TooltipContent>
</Tooltip>
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="sm"
onClick={() => handleDeleteProxy(proxy)}
className="text-destructive hover:text-destructive"
>
<FiTrash2 className="w-4 h-4" />
</Button>
</TooltipTrigger>
<TooltipContent>
<p>Delete proxy</p>
</TooltipContent>
</Tooltip>
</div>
</div>
))}
</div>
)}
</div>
</TableCell>
<TableCell>
<Badge variant="secondary">
{proxyUsage[proxy.id] ?? 0}
</Badge>
</TableCell>
<TableCell>
<div className="flex gap-1">
<ProxyCheckButton
proxy={proxy}
profileId={proxy.id}
checkingProfileId={checkingProxyId}
cachedResult={proxyCheckResults[proxy.id]}
setCheckingProfileId={setCheckingProxyId}
onCheckComplete={(result) => {
setProxyCheckResults((prev) => ({
...prev,
[proxy.id]: result,
}));
}}
onCheckFailed={(result) => {
setProxyCheckResults((prev) => ({
...prev,
[proxy.id]: result,
}));
}}
/>
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="ghost"
size="sm"
onClick={() => handleEditProxy(proxy)}
>
<LuPencil className="w-4 h-4" />
</Button>
</TooltipTrigger>
<TooltipContent>
<p>Edit proxy</p>
</TooltipContent>
</Tooltip>
<Tooltip>
<TooltipTrigger asChild>
<span>
<Button
variant="ghost"
size="sm"
onClick={() => handleDeleteProxy(proxy)}
disabled={(proxyUsage[proxy.id] ?? 0) > 0}
>
<LuTrash2 className="w-4 h-4" />
</Button>
</span>
</TooltipTrigger>
<TooltipContent>
{(proxyUsage[proxy.id] ?? 0) > 0 ? (
<p>
Cannot delete: in use by{" "}
{proxyUsage[proxy.id]} profile
{proxyUsage[proxy.id] > 1 ? "s" : ""}
</p>
) : (
<p>Delete proxy</p>
)}
</TooltipContent>
</Tooltip>
</div>
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</ScrollArea>
</div>
)}
</div>
<DialogFooter className="flex-shrink-0">
<RippleButton onClick={onClose}>Close</RippleButton>
<DialogFooter>
<RippleButton variant="outline" onClick={onClose}>
Close
</RippleButton>
</DialogFooter>
</DialogContent>
</Dialog>
@@ -271,9 +258,17 @@ export function ProxyManagementDialog({
<ProxyFormDialog
isOpen={showProxyForm}
onClose={handleProxyFormClose}
onSave={handleProxySaved}
editingProxy={editingProxy}
/>
<DeleteConfirmationDialog
isOpen={proxyToDelete !== null}
onClose={() => setProxyToDelete(null)}
onConfirm={handleConfirmDelete}
title="Delete Proxy"
description={`This action cannot be undone. This will permanently delete the proxy "${proxyToDelete?.name ?? ""}".`}
confirmButtonText="Delete"
isLoading={isDeleting}
/>
</>
);
}
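The refactor above moves proxy loading and usage counting out of the dialog and into the shared `useProxyEvents` hook. The usage-count logic is the same as what the removed `loadProxyUsage` did inline: tally how many profiles reference each `proxy_id`. As a standalone sketch (hook and Tauri wiring omitted; `ProfileRef` is a hypothetical minimal shape):

```typescript
// Minimal shape of a profile as far as usage counting is concerned.
interface ProfileRef {
  proxy_id?: string;
}

// Count how many profiles reference each stored proxy, keyed by proxy id.
// Profiles without a proxy_id are skipped.
function computeProxyUsage(profiles: ProfileRef[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const p of profiles) {
    if (p.proxy_id) counts[p.proxy_id] = (counts[p.proxy_id] ?? 0) + 1;
  }
  return counts;
}
```

This count is what drives both the usage `Badge` and the disabled state of the delete button in the table above.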
+648 -106
@@ -1,13 +1,22 @@
"use client";
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import Color from "color";
import { useTheme } from "next-themes";
import { useCallback, useEffect, useState } from "react";
import { BsCamera, BsMic } from "react-icons/bs";
import { LoadingButton } from "@/components/loading-button";
import { Badge } from "@/components/ui/badge";
import { Checkbox } from "@/components/ui/checkbox";
import {
ColorPicker,
ColorPickerAlpha,
ColorPickerEyeDropper,
ColorPickerFormat,
ColorPickerHue,
ColorPickerOutput,
ColorPickerSelection,
} from "@/components/ui/color-picker";
import {
Dialog,
DialogContent,
@@ -16,6 +25,11 @@ import {
DialogTitle,
} from "@/components/ui/dialog";
import { Label } from "@/components/ui/label";
import {
Popover,
PopoverContent,
PopoverTrigger,
} from "@/components/ui/popover";
import {
Select,
SelectContent,
@@ -25,18 +39,28 @@ import {
} from "@/components/ui/select";
import type { PermissionType } from "@/hooks/use-permissions";
import { usePermissions } from "@/hooks/use-permissions";
import { getBrowserDisplayName } from "@/lib/browser-utils";
import {
dismissToast,
showErrorToast,
showSuccessToast,
showUnifiedVersionUpdateToast,
} from "@/lib/toast-utils";
getThemeByColors,
getThemeById,
THEME_VARIABLES,
THEMES,
} from "@/lib/themes";
import { showErrorToast, showSuccessToast } from "@/lib/toast-utils";
import { CopyToClipboard } from "./ui/copy-to-clipboard";
import { RippleButton } from "./ui/ripple";
interface AppSettings {
set_as_default_browser: boolean;
theme: string;
custom_theme?: Record<string, string>;
api_enabled: boolean;
api_port: number;
api_token?: string;
}
interface CustomThemeState {
selectedThemeId: string | null;
colors: Record<string, string>;
}
interface PermissionInfo {
@@ -45,14 +69,7 @@ interface PermissionInfo {
description: string;
}
interface VersionUpdateProgress {
current_browser: string;
total_browsers: number;
completed_browsers: number;
new_versions_found: number;
browser_new_versions: number;
status: string; // "updating", "completed", "error"
}
// Version update progress toasts are handled globally via useVersionUpdater
interface SettingsDialogProps {
isOpen: boolean;
@@ -63,10 +80,22 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
const [settings, setSettings] = useState<AppSettings>({
set_as_default_browser: false,
theme: "system",
custom_theme: undefined,
api_enabled: false,
api_port: 10108,
api_token: undefined,
});
const [originalSettings, setOriginalSettings] = useState<AppSettings>({
set_as_default_browser: false,
theme: "system",
custom_theme: undefined,
api_enabled: false,
api_port: 10108,
api_token: undefined,
});
const [customThemeState, setCustomThemeState] = useState<CustomThemeState>({
selectedThemeId: null,
colors: {},
});
const [isDefaultBrowser, setIsDefaultBrowser] = useState(false);
const [isLoading, setIsLoading] = useState(false);
@@ -78,6 +107,7 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
const [requestingPermission, setRequestingPermission] =
useState<PermissionType | null>(null);
const [isMacOS, setIsMacOS] = useState(false);
const [apiServerPort, setApiServerPort] = useState<number | null>(null);
const { setTheme } = useTheme();
const {
@@ -123,12 +153,40 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
return "Access to camera for browser applications";
}
}, []);
const loadSettings = useCallback(async () => {
setIsLoading(true);
try {
const appSettings = await invoke<AppSettings>("get_app_settings");
setSettings(appSettings);
setOriginalSettings(appSettings);
const tokyoNightTheme = getThemeById("tokyo-night");
if (!tokyoNightTheme) {
throw new Error("Tokyo Night theme not found");
}
const merged: AppSettings = {
...appSettings,
custom_theme:
appSettings.custom_theme &&
Object.keys(appSettings.custom_theme).length > 0
? appSettings.custom_theme
: tokyoNightTheme.colors,
};
setSettings(merged);
setOriginalSettings(merged);
// Initialize custom theme state
if (merged.theme === "custom" && merged.custom_theme) {
const matchingTheme = getThemeByColors(merged.custom_theme);
setCustomThemeState({
selectedThemeId: matchingTheme?.id || null,
colors: merged.custom_theme,
});
} else if (merged.theme === "custom") {
// Initialize with Tokyo Night if no custom theme exists
setCustomThemeState({
selectedThemeId: "tokyo-night",
colors: tokyoNightTheme.colors,
});
}
} catch (error) {
console.error("Failed to load settings:", error);
} finally {
@@ -136,6 +194,20 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
}
}, []);
const applyCustomTheme = useCallback((vars: Record<string, string>) => {
const root = document.documentElement;
Object.entries(vars).forEach(([k, v]) =>
root.style.setProperty(k, v, "important"),
);
}, []);
const clearCustomTheme = useCallback(() => {
const root = document.documentElement;
THEME_VARIABLES.forEach(({ key }) =>
root.style.removeProperty(key as string),
);
}, []);
const loadPermissions = useCallback(async () => {
setIsLoadingPermissions(true);
try {
@@ -196,6 +268,8 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
setIsClearingCache(true);
try {
await invoke("clear_all_version_cache_and_refetch");
// Also clear traffic stats cache
await invoke("clear_all_traffic_stats");
// Don't show immediate success toast - let the version update progress events handle it
} catch (error) {
console.error("Failed to clear cache:", error);
@@ -225,31 +299,162 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
},
[getPermissionDisplayName, requestPermission],
);
const handleSave = useCallback(async () => {
setIsSaving(true);
try {
await invoke("save_app_settings", { settings });
setTheme(settings.theme);
setOriginalSettings(settings);
// Update settings with current custom theme state
let settingsToSave: AppSettings = {
...settings,
custom_theme:
settings.theme === "custom"
? customThemeState.colors
: settings.custom_theme,
};
const savedSettings = await invoke<AppSettings>("save_app_settings", {
settings: settingsToSave,
});
// Update settings with any generated tokens
setSettings(savedSettings);
settingsToSave = savedSettings;
setTheme(settings.theme === "custom" ? "dark" : settings.theme);
// Apply or clear custom variables only on Save
if (settings.theme === "custom") {
if (
customThemeState.colors &&
Object.keys(customThemeState.colors).length > 0
) {
try {
const root = document.documentElement;
// Clear any previous custom vars first
THEME_VARIABLES.forEach(({ key }) =>
root.style.removeProperty(key as string),
);
Object.entries(customThemeState.colors).forEach(([k, v]) =>
root.style.setProperty(k, v, "important"),
);
} catch {}
}
} else {
try {
const root = document.documentElement;
THEME_VARIABLES.forEach(({ key }) =>
root.style.removeProperty(key as string),
);
} catch {}
}
// Handle API server start/stop based on settings
const wasApiEnabled = originalSettings.api_enabled;
const isApiEnabled = settingsToSave.api_enabled;
if (isApiEnabled && !wasApiEnabled) {
// Start API server
try {
const port = await invoke<number>("start_api_server", {
port: settingsToSave.api_port,
});
setApiServerPort(port);
showSuccessToast(`Local API started on port ${port}`);
} catch (error) {
console.error("Failed to start API server:", error);
showErrorToast("Failed to start API server", {
description:
error instanceof Error ? error.message : "Unknown error occurred",
});
// Revert the API enabled setting if start failed
settingsToSave.api_enabled = false;
const revertedSettings = await invoke<AppSettings>(
"save_app_settings",
{ settings: settingsToSave },
);
setSettings(revertedSettings);
settingsToSave = revertedSettings;
}
} else if (!isApiEnabled && wasApiEnabled) {
// Stop API server
try {
await invoke("stop_api_server");
setApiServerPort(null);
showSuccessToast("Local API stopped");
} catch (error) {
console.error("Failed to stop API server:", error);
showErrorToast("Failed to stop API server", {
description:
error instanceof Error ? error.message : "Unknown error occurred",
});
}
}
setOriginalSettings(settingsToSave);
onClose();
} catch (error) {
console.error("Failed to save settings:", error);
} finally {
setIsSaving(false);
}
}, [onClose, setTheme, settings]);
}, [onClose, setTheme, settings, customThemeState, originalSettings]);
const updateSetting = useCallback(
(key: keyof AppSettings, value: boolean | string) => {
setSettings((prev) => ({ ...prev, [key]: value }));
(
key: keyof AppSettings,
value: boolean | string | Record<string, string> | undefined,
) => {
setSettings((prev) => ({ ...prev, [key]: value as unknown as never }));
},
[],
);
const loadApiServerStatus = useCallback(async () => {
try {
const port = await invoke<number | null>("get_api_server_status");
setApiServerPort(port);
} catch (error) {
console.error("Failed to load API server status:", error);
setApiServerPort(null);
}
}, []);
const handleClose = useCallback(() => {
// Restore original theme when closing without saving
if (originalSettings.theme === "custom" && originalSettings.custom_theme) {
applyCustomTheme(originalSettings.custom_theme);
} else {
clearCustomTheme();
}
// Reset custom theme state to original
if (originalSettings.theme === "custom" && originalSettings.custom_theme) {
const matchingTheme = getThemeByColors(originalSettings.custom_theme);
setCustomThemeState({
selectedThemeId: matchingTheme?.id || null,
colors: originalSettings.custom_theme,
});
}
onClose();
}, [
originalSettings.theme,
originalSettings.custom_theme,
applyCustomTheme,
clearCustomTheme,
onClose,
]);
// Only clear custom theme when switching away from custom, don't apply live changes
useEffect(() => {
if (settings.theme !== "custom") {
clearCustomTheme();
}
}, [settings.theme, clearCustomTheme]);
useEffect(() => {
if (isOpen) {
loadSettings().catch(console.error);
checkDefaultBrowserStatus().catch(console.error);
loadApiServerStatus().catch(console.error);
// Check if we're on macOS
const userAgent = navigator.userAgent;
@@ -265,86 +470,18 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
checkDefaultBrowserStatus().catch(console.error);
}, 500); // Check every 500ms
// Listen for version update progress events
let unlistenFn: (() => void) | null = null;
const setupVersionUpdateListener = async () => {
try {
unlistenFn = await listen<VersionUpdateProgress>(
"version-update-progress",
(event) => {
const progress = event.payload;
if (progress.status === "updating") {
// Show unified progress toast
const currentBrowserName = progress.current_browser
? getBrowserDisplayName(progress.current_browser)
: undefined;
showUnifiedVersionUpdateToast(
"Checking for browser updates...",
{
description: currentBrowserName
? `Fetching ${currentBrowserName} release information...`
: "Initializing version check...",
progress: {
current: progress.completed_browsers,
total: progress.total_browsers,
found: progress.new_versions_found,
current_browser: currentBrowserName,
},
},
);
} else if (progress.status === "completed") {
dismissToast("unified-version-update");
if (progress.new_versions_found > 0) {
showSuccessToast("Browser versions updated successfully", {
duration: 5000,
description:
"Auto-downloads will start shortly for available updates.",
});
} else {
showSuccessToast("No new browser versions found", {
duration: 3000,
description: "All browser versions are up to date",
});
}
} else if (progress.status === "error") {
dismissToast("unified-version-update");
showErrorToast("Failed to update browser versions", {
duration: 6000,
description: "Check your internet connection and try again",
});
}
},
);
} catch (error) {
console.error(
"Failed to setup version update progress listener:",
error,
);
}
};
setupVersionUpdateListener();
// Cleanup interval and listener on component unmount or dialog close
// Cleanup interval on component unmount or dialog close
return () => {
clearInterval(intervalId);
if (unlistenFn) {
try {
unlistenFn();
} catch (error) {
console.error(
"Failed to cleanup version update progress listener:",
error,
);
}
}
};
}
}, [isOpen, loadPermissions, checkDefaultBrowserStatus, loadSettings]);
}, [
isOpen,
loadPermissions,
checkDefaultBrowserStatus,
loadSettings,
loadApiServerStatus,
]);
// Update permissions when the permission states change
useEffect(() => {
@@ -373,12 +510,20 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
]);
// Check if settings have changed (excluding default browser setting)
const hasChanges = settings.theme !== originalSettings.theme;
const hasChanges =
settings.theme !== originalSettings.theme ||
settings.api_enabled !== originalSettings.api_enabled ||
(settings.theme === "custom" &&
JSON.stringify(customThemeState.colors) !==
JSON.stringify(originalSettings.custom_theme ?? {})) ||
(settings.theme !== "custom" &&
JSON.stringify(settings.custom_theme ?? {}) !==
JSON.stringify(originalSettings.custom_theme ?? {}));
return (
<Dialog open={isOpen} onOpenChange={onClose}>
<Dialog open={isOpen} onOpenChange={handleClose}>
<DialogContent className="max-w-md max-h-[80vh] my-8 flex flex-col">
<DialogHeader className="flex-shrink-0">
<DialogHeader className="shrink-0">
<DialogTitle>Settings</DialogTitle>
</DialogHeader>
@@ -395,6 +540,15 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
value={settings.theme}
onValueChange={(value) => {
updateSetting("theme", value);
if (value === "custom") {
const tokyoNightTheme = getThemeById("tokyo-night");
if (tokyoNightTheme) {
setCustomThemeState({
selectedThemeId: "tokyo-night",
colors: tokyoNightTheme.colors,
});
}
}
}}
>
<SelectTrigger id="theme-select">
@@ -404,13 +558,126 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
<SelectItem value="light">Light</SelectItem>
<SelectItem value="dark">Dark</SelectItem>
<SelectItem value="system">System</SelectItem>
<SelectItem value="custom">Custom</SelectItem>
</SelectContent>
</Select>
</div>
<p className="text-xs text-muted-foreground">
Choose your preferred theme or follow your system settings.
Choose your preferred theme or follow your system settings. Custom
theme changes are applied only when you save.
</p>
{settings.theme === "custom" && (
<div className="space-y-3">
<div className="space-y-2">
<Label
htmlFor="theme-preset-select"
className="text-sm font-medium"
>
Theme Preset
</Label>
<Select
value={customThemeState.selectedThemeId || "custom"}
onValueChange={(value) => {
if (value === "custom") {
setCustomThemeState((prev) => ({
...prev,
selectedThemeId: null,
}));
} else {
const theme = getThemeById(value);
if (theme) {
setCustomThemeState({
selectedThemeId: value,
colors: theme.colors,
});
}
}
}}
>
<SelectTrigger id="theme-preset-select">
<SelectValue placeholder="Select a theme preset" />
</SelectTrigger>
<SelectContent>
{THEMES.map((theme) => (
<SelectItem key={theme.id} value={theme.id}>
{theme.name}
</SelectItem>
))}
<SelectItem value="custom">Your Own</SelectItem>
</SelectContent>
</Select>
</div>
<div className="text-sm font-medium">Custom Colors</div>
<div className="grid grid-cols-4 gap-3">
{THEME_VARIABLES.map(({ key, label }) => {
const colorValue =
customThemeState.colors[key] || "#000000";
return (
<div
key={key}
className="flex flex-col gap-1 items-center"
>
<Popover>
<PopoverTrigger asChild>
<button
type="button"
aria-label={label}
className="w-8 h-8 rounded-md border shadow-sm cursor-pointer"
style={{ backgroundColor: colorValue }}
/>
</PopoverTrigger>
<PopoverContent
className="w-[320px] p-3"
sideOffset={6}
>
<ColorPicker
className="p-3 rounded-md border shadow-sm bg-background"
value={colorValue}
onColorChange={([r, g, b, a]) => {
const next = Color({ r, g, b }).alpha(a);
const nextStr = next.hexa();
const newColors = {
...customThemeState.colors,
[key]: nextStr,
};
// Check if colors match any preset theme
const matchingTheme =
getThemeByColors(newColors);
setCustomThemeState({
selectedThemeId: matchingTheme?.id || null,
colors: newColors,
});
}}
>
<ColorPickerSelection className="h-36 rounded" />
<div className="flex gap-3 items-center mt-3">
<ColorPickerEyeDropper />
<div className="grid gap-1 w-full">
<ColorPickerHue />
<ColorPickerAlpha />
</div>
</div>
<div className="flex gap-2 items-center mt-3">
<ColorPickerOutput />
<ColorPickerFormat />
</div>
</ColorPicker>
</PopoverContent>
</Popover>
<div className="text-[10px] text-muted-foreground text-center leading-tight">
{label}
</div>
</div>
);
})}
</div>
</div>
)}
</div>
{/* Default Browser Section */}
@@ -505,6 +772,281 @@ export function SettingsDialog({ isOpen, onClose }: SettingsDialogProps) {
</div>
)}
{/* Local API Section */}
<div className="space-y-4">
<Label className="text-base font-medium">Local API</Label>
<div className="flex items-center space-x-2">
<Checkbox
id="api-enabled"
checked={settings.api_enabled}
onCheckedChange={async (checked: boolean) => {
updateSetting("api_enabled", checked);
try {
if (checked) {
// Ask backend to enable API and return settings with token
const next = await invoke<AppSettings>(
"save_app_settings",
{
settings: { ...settings, api_enabled: true },
},
);
setSettings(next);
} else {
const next = await invoke<AppSettings>(
"save_app_settings",
{
settings: {
...settings,
api_enabled: false,
api_token: null,
},
},
);
setSettings(next);
}
} catch (e) {
console.error("Failed to toggle API:", e);
}
}}
/>
<div className="grid gap-1.5 leading-none">
<Label
htmlFor="api-enabled"
className="text-sm font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70"
>
(ALPHA) Enable Local API Server
</Label>
<p className="text-xs text-muted-foreground">
Allow managing the application data externally via REST API.
Server will start on port 10108 or a random port if
unavailable.
{apiServerPort && (
<span className="ml-1 font-medium text-green-600">
(Currently running on port {apiServerPort})
</span>
)}
</p>
</div>
</div>
{settings.api_enabled && settings.api_token && (
<div className="space-y-2">
<Label className="text-sm font-medium">
API Authentication Token
</Label>
<div className="flex items-center space-x-2">
<input
type="text"
value={settings.api_token}
readOnly
className="flex-1 px-3 py-2 font-mono text-sm rounded-md border bg-muted"
/>
<CopyToClipboard
text={settings.api_token || ""}
successMessage="API token copied to clipboard"
/>
</div>
<p className="text-xs text-muted-foreground">
Include this token in the Authorization header as "Bearer{" "}
{settings.api_token}" for all API requests.
</p>
{/* Temporary in-app API docs */}
<div className="p-3 mt-3 space-y-2 text-xs leading-relaxed rounded-md border bg-muted/40">
<div className="font-medium">
Temporary in-app API docs (alpha)
</div>
<div>
<div>
Base URL:{" "}
<code className="font-mono">{`http://127.0.0.1:${apiServerPort ?? settings.api_port ?? 10108}/v1`}</code>
</div>
<div>
Auth:{" "}
<code className="font-mono">
Authorization: Bearer {settings.api_token}
</code>
</div>
</div>
<div className="space-y-1">
<div className="font-medium">Profiles</div>
<ul className="list-disc ml-5 space-y-0.5">
<li>
<code className="font-mono">GET /profiles</code> list
profiles
</li>
<li>
<code className="font-mono">
GET /profiles/{"{"}id{"}"}
</code>{" "}
get one
</li>
<li>
<code className="font-mono">POST /profiles</code>
create
<span className="ml-1 text-muted-foreground">
(required: name, browser, version; optional:
release_type, proxy_id, camoufox_config, group_id,
tags)
</span>
</li>
<li>
<code className="font-mono">
PUT /profiles/{"{"}id{"}"}
</code>{" "}
update
<span className="ml-1 text-muted-foreground">
(any of: name, version, proxy_id, camoufox_config,
group_id, tags)
</span>
</li>
<li>
<code className="font-mono">
DELETE /profiles/{"{"}id{"}"}
</code>{" "}
delete
</li>
<li>
<code className="font-mono">
POST /profiles/{"{"}id{"}"}/run
</code>{" "}
launch with remote debugging
<span className="ml-1 text-muted-foreground">
(body: {"{"}url?, headless?{"}"})
</span>
</li>
<li>
<code className="font-mono">
POST /profiles/{"{"}id{"}"}/open-url
</code>{" "}
open URL in running profile
<span className="ml-1 text-muted-foreground">
(body: {"{"}url{"}"})
</span>
</li>
<li>
<code className="font-mono">
POST /profiles/{"{"}id{"}"}/kill
</code>{" "}
stop browser process
</li>
</ul>
</div>
<div className="space-y-1">
<div className="font-medium">Groups</div>
<ul className="list-disc ml-5 space-y-0.5">
<li>
<code className="font-mono">GET /groups</code> list
</li>
<li>
<code className="font-mono">
GET /groups/{"{"}id{"}"}
</code>{" "}
get one
</li>
<li>
<code className="font-mono">POST /groups</code> create
<span className="ml-1 text-muted-foreground">
(required: name)
</span>
</li>
<li>
<code className="font-mono">
PUT /groups/{"{"}id{"}"}
</code>{" "}
rename
<span className="ml-1 text-muted-foreground">
(required: name)
</span>
</li>
<li>
<code className="font-mono">
DELETE /groups/{"{"}id{"}"}
</code>{" "}
delete
</li>
</ul>
</div>
<div className="space-y-1">
<div className="font-medium">Tags</div>
<ul className="list-disc ml-5 space-y-0.5">
<li>
<code className="font-mono">GET /tags</code> list
</li>
</ul>
</div>
<div className="space-y-1">
<div className="font-medium">Proxies</div>
<ul className="list-disc ml-5 space-y-0.5">
<li>
<code className="font-mono">GET /proxies</code> list
</li>
<li>
<code className="font-mono">
GET /proxies/{"{"}id{"}"}
</code>{" "}
get one
</li>
<li>
<code className="font-mono">POST /proxies</code>{" "}
create
<span className="ml-1 text-muted-foreground">
(required: name, proxy_settings object)
</span>
</li>
<li>
<code className="font-mono">
PUT /proxies/{"{"}id{"}"}
</code>{" "}
update
<span className="ml-1 text-muted-foreground">
(optional: name, proxy_settings)
</span>
</li>
<li>
<code className="font-mono">
DELETE /proxies/{"{"}id{"}"}
</code>{" "}
delete
</li>
</ul>
</div>
<div className="space-y-1">
<div className="font-medium">Browsers</div>
<ul className="list-disc ml-5 space-y-0.5">
<li>
<code className="font-mono">
POST /browsers/download
</code>{" "}
download
<span className="ml-1 text-muted-foreground">
(required: browser, version)
</span>
</li>
<li>
<code className="font-mono">
GET /browsers/{"{"}browser{"}"}/versions
</code>{" "}
list versions
</li>
<li>
<code className="font-mono">
GET /browsers/{"{"}browser{"}"}/versions/{"{"}version
{"}"}/downloaded
</code>{" "}
is downloaded
</li>
</ul>
</div>
<div className="text-muted-foreground">
These docs are temporary and will be replaced with full
documentation later.
</div>
</div>
</div>
)}
</div>
{/* Advanced Section */}
<div className="space-y-4">
<Label className="text-base font-medium">Advanced</Label>
</div>
</div>
<DialogFooter className="shrink-0">
<RippleButton variant="outline" onClick={handleClose}>
Cancel
</RippleButton>
<LoadingButton

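For reference, the endpoints listed in the in-app docs above can be exercised with a small client helper. This is an illustrative sketch only, not part of the diff: `buildApiRequest` and `runProfile` are hypothetical names, and the sketch assumes the default port 10108 and the `Authorization: Bearer <token>` scheme shown in the settings dialog.

```typescript
// Hypothetical helper (illustration only, not part of this diff).
// Builds a request descriptor for the local API documented above,
// assuming the default port 10108 and Bearer-token authentication.
function buildApiRequest(
  path: string,
  token: string,
  port: number = 10108,
): { url: string; headers: Record<string, string> } {
  return {
    url: `http://127.0.0.1:${port}/v1${path}`,
    headers: { Authorization: `Bearer ${token}` },
  };
}

// Example: launch a profile with remote debugging
// (POST /profiles/{id}/run, body: { url?, headless? }).
async function runProfile(id: string, token: string, url?: string) {
  const req = buildApiRequest(`/profiles/${id}/run`, token);
  const res = await fetch(req.url, {
    method: "POST",
    headers: { ...req.headers, "Content-Type": "application/json" },
    body: JSON.stringify({ url }),
  });
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json();
}
```

The same helper covers the other documented routes (e.g. `buildApiRequest("/groups", token)` for `GET /groups`), since they all share the `/v1` base path and token header.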