167 Commits

Author SHA1 Message Date
Steve Lau
9421180dba docs: release procedure 2025-09-28 11:18:43 +08:00
Steve Lau
db07dec505 docs: release procedure 2025-09-28 11:16:07 +08:00
medcl
386ebb60c0 v0.8.0 2025-09-28 10:23:58 +08:00
SteveLauC
17c7227a44 chore: release 0.8 (#907) 2025-09-28 10:23:36 +08:00
ayangweb
23faaf6fc3 refactor: update extension icon (#906) 2025-09-27 12:51:58 +08:00
ayangweb
3131d3cea4 fix: update window not closing (#904) 2025-09-27 11:32:23 +08:00
ayangweb
3014dc8839 refactor: update icons for window management extension (#903) 2025-09-27 10:11:18 +08:00
SteveLauC
829d3868c4 chore: remove example iframe title (#902) 2025-09-26 15:39:39 +08:00
SteveLauC
6584504142 chore: convertFileSrc() "link[href]" and "img[src]" (#901)
These two tags can contain local file paths, so we need to
`convertFileSrc()` them as well.
2025-09-26 14:16:05 +08:00
ayangweb
01c51d83d6 feat: support opening file in its containing folder (#900) 2025-09-26 14:14:59 +08:00
ayangweb
29442826c5 refactor: preserve top-most state when pinning (#899) 2025-09-26 10:41:10 +08:00
SteveLauC
e249c02123 fix: bump applications-rs to fix empty app name issue (#898) 2025-09-25 20:55:26 +08:00
SteveLauC
7ac4508e8d feat: new extension type View (#894)
This commit introduces a new extension type, View, which enables developers
to implement extensions with a GUI. It is implemented using an iframe: developers
specify the path to the HTML file in the `Extension.page` field, and Coco will
load and render that page when the extension is opened.

coco-api

We provide a TypeScript library [1] that contains the APIs developers
need to make the experience better.

We start with file system APIs. Since the embedded HTML page is loaded
by the WebView, which has no access to the local file system, we provide APIs
to bridge that gap. Currently, `fs:read_dir()` is the only API implemented; more
will come soon.

Permission

As View extensions run user-provided code, we introduce a permission
mechanism to sandbox the code. Developers must manually specify the
permissions their extension needs in the "plugin.json" file, e.g.:

"permissions": {
  "fs": [
    { "path": "/Users/foo/Downloads", "access": ["read", "write"] },
    { "path": "/Users/foo/Documents", "access": ["read"] }
  ],
  "http": [
    { "host": "api.github.com" }
  ],
  "api": ["fs:read_dir"]
}

Currently, both fs and api permissions are implemented. Permission checks
apply only to View extensions for now; Command extensions will support
them in the future.
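
To illustrate, here is a minimal, hypothetical sketch (not Coco's actual code) of
the kind of check a bridge API such as `fs:read_dir` could run against the declared
`fs` permissions before touching the disk; `FsPermission` and `fs_access_allowed`
are made-up names:

```rust
use std::path::Path;

// Hypothetical shape of one entry from the "permissions.fs" array in plugin.json.
struct FsPermission {
    path: String,
    access: Vec<String>, // e.g. ["read", "write"]
}

// The requested path must fall under a declared directory that grants the
// requested access mode; otherwise the bridge API refuses the call.
fn fs_access_allowed(perms: &[FsPermission], requested: &Path, mode: &str) -> bool {
    perms
        .iter()
        .any(|p| requested.starts_with(&p.path) && p.access.iter().any(|a| a.as_str() == mode))
}

fn main() {
    let perms = vec![FsPermission {
        path: "/Users/foo/Downloads".into(),
        access: vec!["read".into(), "write".into()],
    }];
    assert!(fs_access_allowed(&perms, Path::new("/Users/foo/Downloads/a.txt"), "read"));
    assert!(!fs_access_allowed(&perms, Path::new("/Users/foo/Documents/b.txt"), "read"));
}
```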

[1]: https://github.com/infinilabs/coco-api
2025-09-25 11:12:29 +08:00
SteveLauC
450baccc92 fix: ensure search paths are indexed (#896)
The file search extension relies on the OS's desktop search to work, and it
is possible that the desktop search indexer may not index the search paths
we specify.

This commit adds a hook that signals to the indexer and lets it index
the paths we need. This hook will be invoked when:

* initializing the extension
* enabling the extension
* upon every configuration change

to make our best effort to fix the issue.
2025-09-22 18:10:33 +08:00
SteveLauC
bd0c9a740b chore: update extension detail API URL (#897)
Now we send requests to the default Coco server.
2025-09-18 10:59:09 +08:00
Medcl
fca11a9001 chore: skip login check for web widget (#895)
* chore: skip login check for web widget

* chore: update docs

* chore: update docs

* chore: bump widget version
2025-09-16 17:18:44 +08:00
weiqinzhou3
1aa30ee5bc Update README.md (#893)
docs(README): add official download links for prerequisites
2025-09-08 16:03:40 +08:00
SteveLauC
cdaa151028 feat: extension Window Management for macOS (#892)
* feat: extension Window Management for macOS

* release note

* revert frontend code changes

* new line char

* remove todo

* it is macos-only

* format code

* macos-only

* more conditional compilation

* correct field Document.icon
2025-09-08 12:14:11 +08:00
SteveLauC
fd8d5819b8 refactor: ensure Coco won't take focus on macOS (#891)
* refactor: ensure Coco won't take focus

Or the Window Management extension won't work

* bring back set_focus() on Win/Linux; doc code
2025-09-04 11:24:47 +08:00
ayangweb
4a5a4da399 fix: fix ai extension assistant list fetch (#890)
* fix: fix ai extension assistant list fetch

* refactor: update

* refactor: update

* refactor: update
2025-08-29 11:55:37 +08:00
SteveLauC
efaaf73cd7 fix: settings window rendering/loading issue (#889)
This commit fixes (I guess?) the issue where the Settings window may not be
rendered or loaded; in that case, the whole window is gray.

Background, aka, why this issue exists
=============================================================

In commit [1], we wrapped all the backend setup routines in a tauri command, so
that frontend code can call it before invoking any other backend interfaces to
guarantee that these interfaces won't be called until the data/state they rely
on is ready.

The implementation in [1] had an issue: it didn't work with window reloading.
To fix this issue, we made another commit [2].  Commit [2] fixed the refresh
issue, but it also caused the settings window issue that this commit tries to fix.

The backend setup tauri command needs a state to track whether the setup has
completed.  In the previous implementation, this was done in the frontend.  In
this commit, it is moved to the backend.

Why didn't you guys move that state to backend in previous commits, e.g., commit [2]?
=============================================================

We tried, but failed.  In previous attempts, the backend would send an event
to the frontend, but the frontend couldn't receive it, for reasons we still
don't understand.  This weird issue still exists; we just happened to find
a way to work around it.

[1]: f93c527561
[2]: 993da9a8ad

Co-authored-by: ayang <473033518@qq.com>
2025-08-28 09:01:08 +08:00
SteveLauC
86540ad1a9 chore: clean up unused warning (#888)
The function `get_system_lang()` will only be used when the feature
"use_pizza_engine" is enabled, feature-gate it to clear the compiler
warning.
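
As a minimal sketch, the fix boils down to a `cfg` attribute like the following;
the function body here is illustrative, not Coco's implementation:

```rust
// Compile the function only when the feature is enabled, so builds without
// "use_pizza_engine" no longer emit a dead-code warning.
#[cfg(feature = "use_pizza_engine")]
fn get_system_lang() -> String {
    std::env::var("LANG").unwrap_or_else(|_| "en-US".into())
}
```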
2025-08-27 09:51:50 +08:00
SteveLauC
950482608d fix: use kill_on_drop() to avoid zombie proc in error case (#887)
In the previous macOS file search implementation, we spawned an mdfind child
process and killed it once we got the results we needed, to avoid zombie
processes.  However, this kill step would be skipped if an error happened
during query result processing, since we propagate errors.

This commit replaces the manual kill operation with the `ChildProcHandle.kill_on_drop()`
API, letting RAII do the job and fixing the issue.
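
The `ChildProcHandle` wrapper itself is not shown here; as an analogous sketch,
tokio's `Command::kill_on_drop` gives the same RAII behavior:

```rust
use std::process::Stdio;
use tokio::process::Command;

// Sketch (assumes tokio with the "process" feature): with kill_on_drop(true),
// the mdfind child is killed when `child` is dropped, so an early `?` return
// during result processing no longer leaves a zombie behind.
async fn run_mdfind(query: &str) -> std::io::Result<()> {
    let mut child = Command::new("mdfind")
        .arg(query)
        .stdout(Stdio::piped())
        .kill_on_drop(true)
        .spawn()?;

    // ... read and process child.stdout here; error propagation is now safe ...

    child.wait().await?;
    Ok(())
}
```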
2025-08-26 17:26:31 +08:00
SteveLauC
412c8d8612 feat: file search for Linux/KDE (#886)
This commit implements the file search extension for Linux/KDE using its
desktop search engine Baloo.
2025-08-26 17:26:17 +08:00
SteveLauC
de3c78a5aa feat: file search for Linux/GNOME (#884)
This commit implements the file search extension for Linux with the
GNOME desktop environment by employing the engine that powers GNOME's
desktop search - Tracker.

It also fixes an edge case bug where the search and exclude path
configuration entries would not work.  For example, say I set the search path
to ["~/Documents"], and I have a file named "Documents_foobarbuzz" under
my home directory.  This file is not in the specified search path, but
Coco would return it because the check was done via string prefix.
Claude Code found this when I asked it to write unit tests.  Thanks to both
tests and Claude Code.
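
A tiny sketch of the difference (not Coco's code): a raw string prefix matches
the sibling entry, while a component-wise `Path` check does not:

```rust
use std::path::Path;

fn main() {
    let search_path = "/home/user/Documents";
    let candidate = "/home/user/Documents_foobarbuzz";

    // Buggy check: string prefix matches the sibling entry.
    assert!(candidate.starts_with(search_path));

    // Correct check: path components do not match.
    assert!(!Path::new(candidate).starts_with(search_path));
}
```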
2025-08-25 19:29:37 +08:00
SteveLauC
eafa704ca5 docs: doc dylib dependencies in install doc (#885)
Update the installation document [1] to mention that we have some
dynamic libraries that need to be installed as well.

[1]: https://docs.infinilabs.com/coco-app/main/docs/getting-started/installation/ubuntu/
2025-08-25 16:22:11 +08:00
Medcl
86357079f8 chore: update request accesstoken api (#866)
* chore: update request accesstoken api

* chore: update docs
2025-08-25 16:21:37 +08:00
SteveLauC
ed118151cc refactor: relax the file search conditions on macOS (#883)
* refactor: relax the file search conditions on macOS

This commit makes the file search conditions more permissive on macOS:

* Searching by filename

  Now this is case-insensitive

* Searching by filename and content

  We previously only searched for 2 attributes:

  1. kMDItemFSName
  2. kMDItemTextContent

  as the semantics should be exactly right (search filename and content).  But
  kMDItemTextContent does not work as expected.  For example, if a PDF document
  contains both "Waterloo" and "waterloo", it is only matched by "Waterloo".

  To work around this (I consider it a Spotlight bug), we now search all
  the attributes.

* format code

* document
2025-08-25 09:37:23 +08:00
ayangweb
50b26e2d9e fix: resolve deeplink login issue (#881)
* fix: resolve deeplink login issue

* docs: update changelog

* refactor: update
2025-08-22 09:40:53 +08:00
ayangweb
a4aacc16d9 feat: support context menu in debug mode 2025-08-22 09:19:32 +08:00
ayangweb
9aa7d23632 fix: shortcut key not opening extension store (#877)
* fix: shortcut key not opening extension store

* docs: update changelog
2025-08-20 17:52:50 +08:00
SteveLauC
99b316da19 chore: bump applications-rs to latest commit (#880)
Bump it to include support for localized app names on Linux.
2025-08-20 17:38:53 +08:00
SteveLauC
828c84762b fix: set up hotkey on main thread or Windows will complain (#879)
Coco panicked on Windows while I was testing the applications-rs crate there.
The error message seemingly indicates that we should run the hotkey setup on
the main thread, and doing so indeed fixes the issue, so let's do it.
2025-08-20 17:35:18 +08:00
SteveLauC
5dae5d1cc1 refactor: accept both '-' and '_' as locale str separator (#876)
I wasn't aware that both '-' (specified by the standard) and '_' (not in the
standard, but commonly used) can be used as the tag delimiter in locale
strings[1] when I originally wrote this commit[2].

Both "zh-CN" and "zh_CN" are valid locale strings!

Since '_' is more commonly used, I thought it was the only correct form, and
thus our code only accepted it.

This commit refactors the implementation to accept both.
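
A minimal sketch of the idea, with hypothetical names (the real Coco code differs):
normalize the separator first, then split:

```rust
// Accept both "zh-CN" and "zh_CN" by normalizing the separator before
// splitting into language and region tags.
fn split_locale(locale: &str) -> (String, Option<String>) {
    let normalized = locale.replace('_', "-");
    let mut parts = normalized.splitn(2, '-');
    let lang = parts.next().unwrap_or_default().to_string();
    let region = parts.next().map(|s| s.to_string());
    (lang, region)
}

fn main() {
    assert_eq!(split_locale("zh-CN"), ("zh".to_string(), Some("CN".to_string())));
    assert_eq!(split_locale("zh_CN"), ("zh".to_string(), Some("CN".to_string())));
}
```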

[1]: https://stackoverflow.com/a/36752015/14092446
[2]: f5b33af7f1
2025-08-19 20:08:25 +08:00
Steve Lau
23372655ca feat: index app names in system language 2025-08-19 14:26:30 +08:00
Steve Lau
f5b33af7f1 feat: index both en/zh_CN app names and show app name in Coco app language
After this commit, we index both English and Chinese application names
so that searches in either language will work.  And the names of the
applications in search results and application list will be in the app language.

Pizza index structure has been updated, but backward compatibility is preserved
by keeping the support code for the old index field.

The changes in this commit are not macOS-specific; they apply to all supported
platforms.  However, this feature won't work on Linux and Windows until we implement
localized app name support in the underlying applications-rs crate.
2025-08-19 14:26:30 +08:00
ayangweb
993da9a8ad refactor: improved loss after refresh (#874)
* refactor: improved loss after refresh

* refactor: update

* style: change code line
2025-08-18 18:18:49 +08:00
ayangweb
93f1024230 refactor: optimize language changes (#873)
* refactor: optimize language changes

* update
2025-08-18 15:38:13 +08:00
SteveLauC
7b5e528060 refactor: index iOS apps and macOS apps that store icon in Assets.car (#872)
Bumps the 'applications' crate to include this commit[1].  With this,
Coco now indexes iOS apps and macOS apps that store icon in Assets.car.

[1]: 814b16ea84
2025-08-15 18:41:44 +08:00
SteveLauC
1d5ba3ab07 refactor: coordinate third-party extension operations using lock (#867)
While debugging 783cb73b29,
I realized that some extension operations were not synchronized and thus would
behave incorrectly under concurrency, though GUI apps like Coco typically
won't have much concurrency.  This commit synchronizes them by putting them
behind a lock.
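
Conceptually it is just this pattern (a sketch with placeholder types, not Coco's
actual state):

```rust
use std::sync::Mutex;

// Placeholder for the real third-party extension state.
struct Extensions(Vec<String>);

// All install/uninstall/list operations go through this one lock, so they can
// no longer interleave.
static EXTENSIONS: Mutex<Extensions> = Mutex::new(Extensions(Vec::new()));

fn install_extension(name: &str) {
    let mut exts = EXTENSIONS.lock().unwrap();
    exts.0.push(name.to_string());
}
```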
2025-08-13 17:36:37 +08:00
SteveLauC
f93c527561 refactor: let frontend set up backend states to avoid races (#870)
Co-authored-by: ayang <473033518@qq.com>
2025-08-13 15:33:30 +08:00
BiggerRain
6065353ac9 chore: remove log (#868) 2025-08-11 11:45:16 +08:00
SteveLauC
783cb73b29 feat: deeplink handler for install ext from store (#860)
Co-authored-by: rain9 <15911122312@163.com>
Co-authored-by: ayang <473033518@qq.com>
2025-08-05 18:08:00 +08:00
SteveLauC
ee75f0d119 feat: impl extension settings 'hide_before_open' (#862)
This commit implements a new extension setting entry
"hide_before_open":

> Extension plugin.json

```json
{
  "name": "Screenshot",
  "settings": {
    "hide_before_open": true
  }
}
```

which, when set to true, makes Coco hide the main window before opening the
extension.

Only extensions that can be opened may set their "settings" field; a
check rule is added to guarantee this.
2025-08-04 16:58:27 +08:00
SteveLauC
aaac874f2c ci: check frontend code by building it (#861)
Adds a build check for our frontend code
2025-08-03 16:04:32 +08:00
SteveLauC
cd9e454991 chore: remove unused deeplink-releated rust code (#859)
Deep linking is handled on the frontend, so this commit removes the related and
unused backend code.

"src-tauri/tauri.conf.json" is also modified, field "plugins.deep-link.schema"
does not exist so I removed it as well.
2025-08-03 14:38:47 +08:00
BiggerRain
d0fc79238b build: web component build error (#858)
* build: build error

* docs: update notes
2025-08-02 11:14:56 +08:00
BiggerRain
3ed84c2318 fix: web component login state (#857)
* fix: web component login state

* docs: update notes

* build: build error
2025-08-02 10:29:23 +08:00
ayangweb
bd039398ba refactor: optimize uninstall extension (#856) 2025-08-01 18:40:58 +08:00
ayangweb
568db6aba0 feat: add extension uninstall option in settings (#855)
* feat: add extension uninstall option in settings

* docs: update changelog

* update
2025-08-01 18:28:37 +08:00
SteveLauC
2eb10933e7 refactor: pinning window won't set CanJoinAllSpaces on macOS (#854)
This commit reverts the logic introduced in
e7dd27c744:

> Pinning the main window would bring "NSWindowCollectionBehaviorCanJoinAllSpaces"
> back to make it really stay pinned across all the spaces.

Commit 6bc78b41ef (diff-b55e9c1de63ea370ce736826e4dea5685bfa3992d8dee58427337e68b71a1fc1)
did a tiny refactor to the frontend code merged in the above commit,
these changes are reverted as well.

We revert these changes because we observed an issue with window
focus, and we know neither the root cause nor how to fix it.

The following change is kept because we don't want to hit this NS panel
bug[1].  But if the issue still exists after this commit, it will
be removed as well.

In "src-tauri/src/setup/mac.rs":

```diff
 panel.set_collection_behaviour(
-       NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces
+       NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace
```

[1]: https://github.com/ahkohd/tauri-nspanel/issues/76
2025-08-01 15:28:11 +08:00
ayangweb
5c6cf18139 chore: revert microphone permission (#852) 2025-08-01 12:53:38 +08:00
ayangweb
01c31d884a fix: fix microphone permission issue (#851) 2025-08-01 11:02:25 +08:00
ayangweb
d48d4af7d2 refactor: optimize upload shortcut display (#850) 2025-08-01 10:24:30 +08:00
ayangweb
876d14f9d9 refactor: optimize enter key display (#849) 2025-08-01 10:20:47 +08:00
ayangweb
a8e090c9be refactor: optimized sending messages (#848) 2025-08-01 09:36:06 +08:00
SteveLauC
c30df6cee0 feat: sub extension can set 'platforms' now (#847)
Before this commit, sub extensions were not allowed to set their
"platforms" field; this restriction is lifted in this commit.

By allowing this, a group extension can have sub extensions for
different platforms. Here is an example (irrelevant fields are omitted
for the sake of simplicity):

```json
{
  "name": "Suspend my machine",
  "type": "Group",
  "platforms": ["macos", "windows"],
  "commands": [
    {
      "name": "Suspend macOS":
      "platforms": ["macos"],
      "action": {...}
    },
    {
      "name": "Suspend Windows":
      "platforms": ["windows"],
      "action": {...}
    }
  ]
}
```

While loading or installing extensions, incompatible sub extensions will
be filtered out by Coco, e.g., you won't see that "Suspend Windows"
command if you are on macOS.

An extra check is added in this commit to ensure a sub extension won't
support platforms that are incompatible with its main extension.

Even though main extensions and sub extensions can both have "platforms"
specified, the semantics of this field, when not set, differs between them.
For main extensions, it means this extension is compatible with all the
platforms supported by Coco (null == all).  For sub extensions, having it
not set implicitly says that this field has the same value as the main
extension's "platforms" field.

The primary reason behind this design is that if we chose the semantics used
by the main extension, treating null as all, all the extensions we currently
have would become invalid: they are all macOS-only, the main extension's
"platforms" field is ["macos"], and the sub extensions' "platforms" is not set
(null), so they would be equivalent to:

```json
{
  "name": "this is macOS-only",
  "type": "Group",
  "platforms": ["macos"],
  "commands": [
    {
      "name": "How the fxxk can this support all the platforms!"
      "platforms": ["macos", "windows", "linux"],
      "type": "Command",
      "action": {...}
    }
  ]
}
```
This hits exactly the check we mentioned earlier and would be rejected by
Coco.  If users have already installed such extensions, they would be treated
as invalid and rejected by a future Coco release; boom, we break backward
compatibility.

Also, the current design actually makes sense.  Nobody wants to repeatedly
tell Coco that all the sub extensions support macOS if this can be said only
once:

```json
{
  "name": "this is macOS-only",
  "platforms": ["macos"],
  "commands": [
    {
      "name": "This supports macOS",
      "platforms": ["macos"]
    },
    {
      "name": "This supports macOS too",
      "platforms": ["macos"]
    },
    {
      "name": "Guess what! this also supports macOS",
      "platforms": ["macos"]
    },
    {
      "name": "Come on dude, do I really need to say platform=macos so many times",
      "platforms": ["macos"]
    }
  ]
}
```
2025-07-31 21:49:59 +08:00
BiggerRain
b833769c25 refactor: calling service related interfaces (#831)
* chore: server

* chore: add

* refactor: calling service related interfaces

* chore: server list

* chore: add

* chore: add

* update

* chore: remove logs

* docs: update notes

* docs: remove server doc

---------

Co-authored-by: ayang <473033518@qq.com>
2025-07-31 15:59:35 +08:00
ayangweb
855fb2a168 feat: support sending files in chat messages (#764)
* feat: support sending files in chat messages

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* docs: update changelog
2025-07-31 15:36:03 +08:00
SteveLauC
d2735ec13b refactor: check Extension/plugin.json from all sources (#846)
Coco App has 4 sources of Extension/plugin.json that should be checked:

1. From the "<data directory>/third_party_extensions" directory
2. Imported via "Import Local Extension"
3. Downloaded from the "store/extension/<extension ID>/_download" API
4. From coco-extensions repository

   Granted, Coco App won't check these files directly, but we will
   re-use the code and run it in that repository's CI.

Previously, only the Extensions from the first source were checked/validated.
This commit extracts the validation logic to a function and applies it to all
4 sources.

Also, the return value of the Tauri command "list_extensions()" has changed.
We no longer return a boolean indicating if any invalid extensions
are found during loading, which only makes sense when installing
extensions requires users to manually edit data files. Since we now
support extension store and local extension imports, it could be omitted.
2025-07-31 14:27:23 +08:00
SteveLauC
c40fc5818a chore: ignore tauri::AppHandle's generic argument R (#845)
This commit removes the generic argument R from all the AppHandle imports, which
is feasible as it has a default type. This change is made not only for simplicity,
but also **consistency**. Trait SearchSource uses this type:

```rust
pub trait SearchSource {
    async fn search(
        &self,
        tauri_app_handle: AppHandle,
        query: SearchQuery,
    ) -> Result<QueryResponse, SearchError>;
}
```

In order to make trait SearchSource object-safe, the AppHandle used in it cannot
contain generic arguments. So some parts of Coco already omit this generic
argument. This commit cleans up the remaining instances and unifies the usage
project-wide.
2025-07-29 21:55:03 +08:00
SteveLauC
a553ebd593 feat: support Quicklink on Rust side (#760)
This commit implements support for Quicklink on the Rust side. We still
need the frontend part to make this complete.
2025-07-29 16:30:12 +08:00
BiggerRain
232166eb89 chore: delete unused code files and dependencies (#841)
Mainly deletes unused WebSocket content, along with other unused code files and dependencies
2025-07-29 13:02:28 +08:00
Medcl
99144950d9 Revert "chore: add macos config for tauri (#837)" (#840)
This reverts commit ee45d21bbe.
2025-07-29 11:04:53 +08:00
SteveLauC
32d4f45144 feat: support installing local extensions (#749)
This commit adds support for installing extensions from a local folder path:

```text
extension-directory/
├── assets/
│   ├── icon.png
│   └── other-assets...
└── plugin.json
```

Useful for testing and development of extensions before publishing.

Co-authored-by: ayang <473033518@qq.com>
2025-07-29 10:26:47 +08:00
BiggerRain
6bc78b41ef chore: web component loading font icon (#838)
* chore: web component loading font icon

* docs: update notes
2025-07-28 19:03:40 +08:00
SteveLauC
cd54beee04 refactor: split query_coco_fusion() (#836)
This commit splits query_coco_fusion() into 2 functions:

1. query_coco_fusion_single_query_source()
2. query_coco_fusion_multi_query_sources()

query_coco_fusion_single_query_source(), as the name suggests, will only search
1 query source. Due to this simplicity, it does not need the complex re-ranking
procedure used by query_coco_fusion_multi_query_sources(), which is the primary
reason why this commit was made.

Another reason behind the change is that the re-ranking logic makes the
search results of a single query source incorrect: it removes documents
from the results. I didn't investigate the issue because dropping the complex
logic in the single-query-source search path is the best solution here.
2025-07-28 17:29:17 +08:00
ayangweb
ee45d21bbe chore: add macos config for tauri (#837) 2025-07-28 16:35:11 +08:00
ayangweb
4709f8c660 feat: enhance ui for skipped version (#834) 2025-07-28 11:43:10 +08:00
SteveLauC
4696aa1759 test: test extract_build_number() (#835)
This commit adds a test for extract_build_number(), which I forgot to do
in commit 067fb7144f6[1].

[1]: 067fb7144f
2025-07-28 11:42:50 +08:00
ayangweb
924fc09516 fix: fix issue with update check failure (#833)
* fix: fix issue with update check failure

* docs: update changelog
2025-07-28 10:06:07 +08:00
SteveLauC
5a700662dd chore: release notes for 0.7.1 (#832) 2025-07-28 10:00:12 +08:00
BiggerRain
8f992bfa92 chore: bump version number to 0.7.1 (#830) 2025-07-27 17:26:08 +08:00
BiggerRain
e7dd27c744 chore: add toggle_move_to_active_space_attribute (#829)
* chore: add toggle_move_to_active_space_attribute

* chore: pin

* chore: add

* update

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-27 16:50:11 +08:00
ayangweb
7914836c3e fix: correct enter key behavior (#828) 2025-07-27 11:52:40 +08:00
BiggerRain
b37bf1f7c7 chore: bump version number to 0.7.0 (#827) 2025-07-25 19:54:33 +08:00
BiggerRain
419d9d55c5 chore: web component remove server name (#826) 2025-07-25 18:16:07 +08:00
BiggerRain
d3ed54c771 chore: web component add notification component (#825)
* chore: web component add notification component

* docs: update notes
2025-07-25 18:15:49 +08:00
ayangweb
8f26dbcbe6 refactor: optimize subpage shortcut context menu (#822)
* refactor: optimize subpage shortcut context menu

* update

* update
2025-07-25 16:43:41 +08:00
ayangweb
663873ae14 refactor: optimize carriage return copying (#823) 2025-07-25 16:43:05 +08:00
SteveLauC
286b1be212 fix: panic on Ubuntu (GNOME) when opening apps (#821)
On Ubuntu (the GNOME version), Coco would panic when users open an app
because Coco thinks it is running in an unsupported desktop
environment (DE).

We rely on the environment variable XDG_CURRENT_DESKTOP to detect the DE,
Ubuntu sets this variable to "ubuntu:GNOME" instead of just "GNOME",
which was not handled by the previous implementation.

This commit handles this case. Also, when Coco runs in an unsupported DE,
opening apps should not panic the app; after this commit, we return
an error instead.
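
A minimal sketch of the detection idea (hypothetical helper name): treat
XDG_CURRENT_DESKTOP as a colon-separated list instead of comparing it as a whole:

```rust
// "ubuntu:GNOME" and "GNOME" should both be recognized as GNOME.
fn is_gnome(xdg_current_desktop: &str) -> bool {
    xdg_current_desktop
        .split(':')
        .any(|part| part.eq_ignore_ascii_case("gnome"))
}

fn main() {
    assert!(is_gnome("GNOME"));
    assert!(is_gnome("ubuntu:GNOME"));
    assert!(!is_gnome("KDE"));
}
```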
2025-07-25 15:32:48 +08:00
ayangweb
37221782b0 refactor: optimize shortcut key triggering (#820) 2025-07-25 14:54:32 +08:00
ayangweb
644e291105 fix: fix update window config sync (#818)
* fix: fix update window config sync

* docs: update changelog
2025-07-25 14:47:20 +08:00
BiggerRain
aae6984aa7 fix: re-search data initialization (#817) 2025-07-25 14:43:27 +08:00
ayangweb
dbd296d399 fix: fix enter key on subpages (#819)
* fix: fix enter key on subpages

* docs: update changelog
2025-07-25 14:43:16 +08:00
ayangweb
e2ad25967d fix: fix ctrl+k not working (#815) 2025-07-25 14:30:03 +08:00
ayangweb
21b61d80d8 refactor: optimize method calls for checking for updates (#814) 2025-07-25 13:42:12 +08:00
ayangweb
9f4c693ac4 refactor: optimize line breaks in input boxes (#813) 2025-07-25 12:36:07 +08:00
BiggerRain
45c27cac56 chore: cancel interface param (#816) 2025-07-25 12:16:23 +08:00
BiggerRain
e46035afd4 fix:the client id is the same (#812)
* chore: add

* fix: client id
2025-07-25 11:25:22 +08:00
BiggerRain
1004bb73f4 chore: delay the chat monitoring event (#811) 2025-07-24 20:03:30 +08:00
BiggerRain
d664fa7271 chore: handle reply to message (#799)
* chore: add reply to message

* chore: handle rust data

* log

* chore: id

* feat: add

* chore: loading step

* chore: cur id

* feat: add

* accept query parameters

* chore: add message id for cancel

* chore: remove log

* chore: remove log

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-24 18:06:59 +08:00
SteveLauC
067fb7144f refactor: use custom version comparator to determine if we should update (#810) 2025-07-24 16:05:36 +08:00
ayangweb
579f91f3aa refactor: refactor version update check (#809) 2025-07-24 11:56:57 +08:00
ayangweb
abe2aecedf fix: fix multiline input issue (#808) 2025-07-24 10:58:57 +08:00
SteveLauC
e8f9a4e627 chore: log querysources to search only when querysource is not set (#807) 2025-07-24 09:39:29 +08:00
ayangweb
22b1558e8b refactor: optimized data fetching for secondary pages (#803) 2025-07-23 18:56:56 +08:00
SteveLauC
ca3b514a65 fix: panic caused by "state() called before manage()" (#806)
This commit fixes the following panic:

```
Time: [2025-07-23-17-03-23]
Location: [/Users/steve/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/tauri-2.5.1/src/lib.rs:742:7]
Message: [state() called before manage() for tauri_plugin_global_shortcut::GlobalShortcut<tauri_runtime_wry::Wry<tauri::EventLoopMessage>>]
```

The root cause is that, in a Tauri application, before you can access a piece of
managed state with the .state() method, you must first register it with Tauri
using .manage(). When a user registers a hotkey for an extension,
initializing extensions will invoke the .state() method, and at that point
.manage() hasn't been called yet.

The fix is simple: we call .manage() earlier (it is invoked by our
`shortcut::enable_shortcut(app)` function).
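
A minimal sketch of the ordering constraint, with `HotkeyState` as a hypothetical
placeholder type:

```rust
use tauri::Manager;

// Hypothetical placeholder for the state the global-shortcut setup manages.
struct HotkeyState(u32);

fn init(app: &tauri::AppHandle) {
    // manage() must run before any state() call, otherwise Tauri panics with
    // "state() called before manage()".
    app.manage(HotkeyState(0));
    let _hotkeys = app.state::<HotkeyState>();
}
```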
2025-07-23 18:56:16 +08:00
SteveLauC
c694c4eda9 chore: display backtrace in panic log (#805)
Having a backtrace in the panic log helps debugging a lot. In release
builds, we strip our binary, so symbol information is unavailable, but
the backtrace is still useful in debug builds.

Panic log in release builds:

```
Time: [YYYY-MM-DD-HH-MM-SS]
Location: [/Users/foo/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/tauri-2.5.1/src/lib.rs:742:7]
Message: [state() called before manage() for tauri_plugin_global_shortcut::GlobalShortcut<tauri_runtime_wry::Wry<tauri::EventLoopMessage>>]
Backtrace:
   0: __mh_execute_header
   1: __mh_execute_header
   2: __mh_execute_header
   3: __mh_execute_header
   4: __mh_execute_header
   5: __mh_execute_header
   6: __mh_execute_header
   7: __mh_execute_header
   8: __mh_execute_header
   9: __mh_execute_header
  10: __mh_execute_header
  11: __mh_execute_header
  12: __mh_execute_header
  13: __mh_execute_header
  14: __mh_execute_header
  15: __mh_execute_header
  16: __mh_execute_header
  17: <unknown>
  18: <unknown>
```
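
For reference, a minimal sketch of such a hook using std's `Backtrace` (Coco's
actual hook and log format may differ):

```rust
use std::backtrace::Backtrace;

// Record the panic message, its location, and a captured backtrace; the real
// hook writes this to a log file instead of stderr.
fn install_panic_hook() {
    std::panic::set_hook(Box::new(|info| {
        let location = info.location().map(|l| l.to_string()).unwrap_or_default();
        let backtrace = Backtrace::force_capture();
        eprintln!("Location: [{location}]");
        eprintln!("Message: [{info}]");
        eprintln!("Backtrace:\n{backtrace}");
    }));
}
```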
2025-07-23 17:00:48 +08:00
ayangweb
ac835c76aa fix: fix shortcut issue in windows context menu (#804)
* fix: fix shortcut issue in windows context menu

* docs: update changelog
2025-07-23 16:20:46 +08:00
SteveLauC
25bbab7432 refactor: clean up unsupported characters from query string in Win Search (#802)
We found that Windows Search errors out if it encounters a single
quote character. The natural solution would be to escape it, but I couldn't
figure out how. The approach mentioned by most posts:

```
~="<Unsupported Char>"
```

didn't work in my tests, so I decided to replace it with a whitespace.

The single quote is not the only unsupported character; the newline
character is not supported either, so it is handled in the same
way.
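
A sketch of that cleanup (hypothetical helper name, not the exact Coco function):

```rust
// Replace characters Windows Search cannot handle (single quote, newline)
// with a space before the query string is embedded in the WHERE clause.
fn sanitize_query(query: &str) -> String {
    query
        .chars()
        .map(|c| if c == '\'' || c == '\n' { ' ' } else { c })
        .collect()
}

fn main() {
    assert_eq!(sanitize_query("it's\nfine"), "it s fine");
}
```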
2025-07-23 16:13:15 +08:00
ayangweb
cca00e944e fix: fix selection issue after renaming (#800) 2025-07-23 13:59:33 +08:00
SteveLauC
e78fe4ac89 fix: broken windows search (#801)
This commit fixes the search issue introduced by [commit](5c0a865822). We have no idea why the tauri command `get_app_search_source` won't be invoked after that commit on Windows.

This commit resolves the issue by moving the extension init logic to the Rust side.

Also, this commit updates the query source logs in `query_coco_fusion()`: the old log said nothing if the query source list was empty, the new one tells us that.
2025-07-23 12:33:18 +08:00
Medcl
60fd79f1fa fix: increase read_timeout for HTTP streaming stability (#798) 2025-07-22 18:44:27 +08:00
BiggerRain
5c0a865822 chore: not request the interface if not logged in (#795)
* chore: not request the interface if not logged in

* chore: res

* chore: res

* chore: common interface

* chore: no login

* chore: login

* chore: login

* chore: add

* dbg print servers

* chore: id

* docs: update notes

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-22 16:15:58 +08:00
SteveLauC
5b50e4b51b ci: add Rust code format check to CI (#797)
This commit adds the Rust code format check to our CI.
2025-07-22 15:11:13 +08:00
SteveLauC
b97386a827 refactor: avoid GLOBAL_TAURI_APP_HANDLE if possible (#796)
This commit fixes the Windows panic issue. 

Coco panicked because it accessed `GLOBAL_TAURI_APP_HANDLE` when this global variable wasn't initialized. I removed all the uses of this variable except for the one use in `src-tauri/src/server/http_client.rs`, which I don't have a good way to refactor.

If you are wondering why this didn't happen in the past: the access was triggered by frontend code, and something there likely changed. Regardless, this global variable is still dangerous and error-prone, so we should avoid it.

Also, this commit fixes the issue where the panic hook did not work on Windows because the log filename contained ":", which is not allowed by the Windows file system.
2025-07-22 14:43:27 +08:00
SteveLauC
29aa26af94 chore: add a panic hook to catch panic msg (#793) 2025-07-22 10:34:27 +08:00
BiggerRain
3650d9914c fix: enter key problem (#794)
* fixed: enter key problem

* docs: update notes

* fix: enter key problem
2025-07-22 10:13:08 +08:00
SteveLauC
f26031047c fix: refreshing Coco server should register it to SearchSource (#792) 2025-07-22 08:51:57 +08:00
BiggerRain
c8719926be chore: add 401 unauthorized (#791) 2025-07-21 22:21:07 +08:00
BiggerRain
f1dfc5c730 fixed: chat message confusion (#782)
* fix: chat

* fix: chat

* chore: add session id

* fix: fixed incorrect taskbar icon display on linux (#783)

* fix: fixed incorrect taskbar icon display on linux

* docs: update changelog

* fix: fix data inconsistency issue on secondary pages (#784)

* chore: chat

* chore: chat

* chore: add logging message

* chore: chat

* chore: chat

* chore: add

* feat: add

* chore: chat end

* style: message width

---------

Co-authored-by: ayangweb <75017711+ayangweb@users.noreply.github.com>
Co-authored-by: medcl <m@medcl.net>
2025-07-21 21:17:20 +08:00
SteveLauC
74ed642a42 refactor: tighten up Coco servers state management (#790)
* refactor: tighten up Coco servers state management

* ignore unused warnings

* log out if the failed request has status 401
2025-07-21 20:39:16 +08:00
ayangweb
5a17173620 fix: incorrect status when installing extension (#789)
* fix: incorrect status when installing extension

* docs: update changelog
2025-07-21 18:17:30 +08:00
SteveLauC
29d14ff931 chore: remove unused type ServerTokenResponse (#788)
After this commit[1], type `ServerTokenResponse` became unused, remove
it as well.

[1]: 57ab08fb6d
2025-07-21 15:30:26 +08:00
ayangweb
ad01504766 refactor: decouple window switch services to ensure they operate independently (#786) 2025-07-20 17:26:15 +08:00
SteveLauC
57ab08fb6d chore: remove unused tauri cmd get_server_token (#787)
I found this tauri command while reading the code and realized that
token management logic should all be kept in the backend; there is no
need to expose it to the frontend. Indeed, searching for it in the
frontend code showed that it is not used at all.

```sh
$ cd src

$ rg get_server_token
commands/servers.ts
75:export function get_server_token(id: string): Promise<ServerTokenResponse> {
76:  return invokeWithErrorHandler(`get_server_token`, { id });
```

So remove it.
2025-07-20 17:25:32 +08:00
ayangweb
db5c09f80c fix: fix data inconsistency issue on secondary pages (#784) 2025-07-20 10:54:51 +08:00
ayangweb
b1e2c6961d fix: fixed incorrect taskbar icon display on linux (#783)
* fix: fixed incorrect taskbar icon display on linux

* docs: update changelog
2025-07-20 10:08:11 +08:00
BiggerRain
3f4abe51e5 fix: web component server list error (#781)
* chore: update app

* fix: web component server list error

* feat: add

* chore: remove default version
2025-07-19 17:07:11 +08:00
ayangweb
060c09e11c fix: resolved minor issues with voice playback (#780)
* fix: resolved minor issues with voice playback

* docs: update changelog

* update
2025-07-19 14:25:19 +08:00
ayangweb
657df482bf fix: correct incorrect assistant display when quick ai access (#779)
* fix: correct incorrect assistant display when quick ai access

* docs: update changelog
2025-07-19 13:54:39 +08:00
ayangweb
f4f7732927 refactor: show specific values in shortcut key conflict tips (#778)
* refactor: show specific values in shortcut key conflict tips

* update

* update

* update

* update

* update

* update

* update
2025-07-19 11:05:17 +08:00
ayangweb
5e536e1444 refactor: separate user agreement and privacy policy links (#777) 2025-07-19 10:24:29 +08:00
ayangweb
2b48cdf84a refactor: add border-radius to extended categories (#776) 2025-07-19 10:08:04 +08:00
BiggerRain
bc37616506 chore: search-chat add language and formatUrl parameters (#775)
* chore: add language

* build: build web

* docs: update notes
2025-07-19 09:34:38 +08:00
ayangweb
07bcd80776 refactor: invoke language update logic earlier (#774) 2025-07-18 16:44:43 +08:00
SteveLauC
7b8b396368 fix: indexing apps does not respect search scope config (#773)
This commit fixes the issue that indexing applications does not
respect the search scope configuration, it always uses the default
values.
2025-07-18 16:26:34 +08:00
ayangweb
823a95d601 fix: restore missing category titles on subpages (#772) 2025-07-18 16:25:44 +08:00
ayangweb
af0b98a41b refactor: rebuild app index with improved suggestions (#771) 2025-07-18 16:15:28 +08:00
SteveLauC
7d0e7cd7dc fix: unregister ext hotkey when it gets deleted (#770)
This commit fixes the bug that when an extension gets uninstalled, its
registered hotkey won't be cleared.
2025-07-18 13:20:41 +08:00
ayangweb
e56d6b1b60 refactor: close the file upload port (#769) 2025-07-18 10:45:05 +08:00
BiggerRain
941cf96a07 style: splash adapts to the width of mobile phones (#768)
* style: splash width style

* docs: update notes
2025-07-17 15:33:24 +08:00
SteveLauC
14fbf2ac5d refactor: do status code check before deserializing response (#767)
* refactor: do status code check before deserializing response

This commit adds a status code check to the following requests; only when
this check passes do we deserialize the response JSON body:

- get_connectors_by_server
- mcp_server_search
- datasource_search

A helper function `status_code_check(response, allowed_status_codes)`
is added to make the refactoring easier (a sketch follows below).

* chore: release notes
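
A hedged sketch of that helper, assuming a `reqwest::Response`; the real signature
in Coco may differ:

```rust
// Return Ok(()) only when the response status is in the allowed set;
// callers deserialize the JSON body only after this check passes.
fn status_code_check(
    response: &reqwest::Response,
    allowed_status_codes: &[reqwest::StatusCode],
) -> Result<(), String> {
    let status = response.status();
    if allowed_status_codes.contains(&status) {
        Ok(())
    } else {
        Err(format!("unexpected status code: {status}"))
    }
}
```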
2025-07-17 15:08:14 +08:00
SteveLauC
494e2f0d8a chore: Coco app http request headers (#744)
Add the following HTTP headers when making HTTP requests:

- X-OS-NAME
- X-OS-VER
- X-OS-ARCH
- X-APP-NAME
- X-APP-VER
- X-APP-LANG
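
As a hedged illustration (not Coco's actual code), attaching such headers with
reqwest's default header map could look like this; the header values are
placeholders:

```rust
use reqwest::header::{HeaderMap, HeaderValue};

// Build a client whose every request carries the custom X-OS-*/X-APP-* headers.
fn build_client() -> reqwest::Result<reqwest::Client> {
    let mut headers = HeaderMap::new();
    headers.insert("X-OS-NAME", HeaderValue::from_static("macos"));
    headers.insert("X-OS-VER", HeaderValue::from_static("14.5"));
    headers.insert("X-OS-ARCH", HeaderValue::from_static("aarch64"));
    headers.insert("X-APP-NAME", HeaderValue::from_static("coco"));
    headers.insert("X-APP-VER", HeaderValue::from_static("0.8.0"));
    headers.insert("X-APP-LANG", HeaderValue::from_static("en"));
    reqwest::Client::builder().default_headers(headers).build()
}
```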
2025-07-17 11:31:19 +08:00
BiggerRain
e3a3849fa4 chore: search-chat components add formatUrl & think data & icons url (#765)
* chore: web components add formatUrl & think data

* chore: add headers

* chore: add

* chore: add server url

* docs: update notes

* chore: url

* docs: search chat docs
2025-07-17 09:22:23 +08:00
SteveLauC
0b5e31a476 chore(deps): bump the windows crate (#766)
This commit bumps the windows crate from "0.60.0" to "0.61.3"; it should
solve the CI issue that happened here[1]:

```text
error[E0277]: `DBOBJECT` doesn't implement `Debug`
     --> C:\Users\runneradmin\.cargo\registry\src\index.crates.io-1949cf8c6b5b557f\windows-0.60.0\src\Windows\Win32\System\Search\mod.rs:21828:5
      |
21826 | #[derive(Clone, Debug, PartialEq)]
      |                 ----- in this derive macro expansion
21827 | pub struct SSVARIANT_0_4 {
21828 |     pub dbobj: DBOBJECT,
      |     ^^^^^^^^^^^^^^^^^^^ the trait `Debug` is not implemented for `DBOBJECT`
      |
      = note: add `#[derive(Debug)]` to `DBOBJECT` or manually `impl Debug for DBOBJECT`
```

[1]: https://github.com/infinilabs/ci/actions/runs/16314479643/job/46076989290
2025-07-16 17:10:32 +08:00
SteveLauC
c8a723ed9d feat: file search for Windows (#762)
This commit implements the file search extension for Windows platforms using the [Windows Search](https://learn.microsoft.com/en-us/windows/win32/search/-search-3x-wds-qryidx-overview) functionality.

Something to note:

1. Searching by file content is not natively supported. Coco would search for all the columns (attributes/fields within the index) with this option:

```rust
        SearchBy::NameAndContents => {
            // Windows File Search does not support searching by file content.
            //
            // `CONTAINS('query_string')` would search all columns for `query_string`,
            // this is the closest solution we have.
            format!("((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))")
        }
```

2. Tests have been added, but they failed in our CI for unknown reasons so I disabled them:

```rust
// Skip these tests in our CI, they fail with the following error 
// "SQL is invalid: "0x80041820""
// 
// I have no idea about the underlying root cause
#[cfg(all(test, not(ci)))]
mod test {
```

3. The Windows Search index is not real-time and can return obsolete results. Opening the returned documents could fail if the chosen file has been deleted or moved.
2025-07-16 09:11:53 +08:00
ayangweb
aaf4bf2737 refactor: update the font icon link (#763) 2025-07-15 09:10:26 +08:00
BiggerRain
24b0123a61 docs: add deep wiki docs (#761) 2025-07-11 17:22:18 +08:00
ayangweb
e8bd970cdb refactor: updated the upload endpoint for attachments (#759) 2025-07-10 18:20:32 +08:00
ayangweb
dd3be3a819 refactor: refactored file icon retrieval logic (#757)
* refactor: refactored file icon retrieval logic

* update

* update

* update
2025-07-10 18:10:39 +08:00
Medcl
5b034c28ac chore: make optional fields optional (#758)
* chore: make optional fields optional

* chore: update docs
2025-07-10 18:06:05 +08:00
ayangweb
b17949fe29 refactor: enabling the upload file component (#755)
* refactor: enabling the upload file component

* update
2025-07-10 17:26:44 +08:00
SteveLauC
5d37420109 feat: tauri command get_file_icon() (#756) 2025-07-10 16:51:34 +08:00
ayangweb
1d3ceb0c70 refactor: remove speech-to-text shortcuts (#754) 2025-07-10 13:58:37 +08:00
BiggerRain
4d11afe18e chore: assistant params & styles (#753)
* chore: add

* chore: add

* chore: assistant params & styles

* docs: update notes
2025-07-10 11:47:10 +08:00
SteveLauC
0c0291c8c0 chore: rename QuickLink/quick_link to Quicklink/quicklink (#752)
* chore: rename QuickLink/quick_link to Quicklink/quicklink

Standardize variable naming to match the correct terms: "Quicklink" and "quicklink".
This updates all incorrect variants such as "QuickLink" and "quick_link".

* chore: release notes
2025-07-10 10:18:57 +08:00
ayangweb
cca672b2cb feat: text to speech now powered by LLM (#750)
* feat: support text to speech

* chore: receive bytes stream

* chore: update testing code

* feat: mp3 play

* update

* docs: update changelog

* update

* update

* update

---------

Co-authored-by: medcl <m@medcl.net>
Co-authored-by: rain9 <15911122312@163.com>
2025-07-10 10:16:51 +08:00
BiggerRain
5b27488402 refactor: adjusted assistant, datasource, mcp_server interface parameters (#746)
* chore: handle mcp interface parameters

* docs: update notes

* chore: remove code

* chore: assistant params

* fix: assistant params

* docs: update notes
2025-07-10 09:48:42 +08:00
SteveLauC
c1c4e0db7b chore: bump dep applications-rs (#751)
* chore: bump dep applications-rs

Currently Coco depends on atty v0.2.14, a crate that has
[vulnerability](https://github.com/infinilabs/coco-app/security/dependabot/25),
here is the dependency chain:

```
coco -> applications-rs -> freedesktop-file-parser 0.1.0 -> atty 0.2.14
```

I bumped the [`freedesktop-file-parser`](7bdb070e45)
crate in our applications-rs crate, which would eliminate the `atty` crate
from the chain to fix the vulnerability.

This commit bumps the applications-rs crate to include the above change.

* chore: release notes
2025-07-09 18:52:17 +08:00
ayangweb
074a7c8b0a fix: prevent window from hiding when moved on Windows (#748)
* fix: prevent window from hiding when moved on Windows

* docs: update changelog

* update
2025-07-09 16:30:41 +08:00
SteveLauC
bc524e19db refactor: adjust extension code hierarchy (#747)
* refactor: adjust extension code hierarchy

In this commit, I refactored the extension code structure.

* We can only install third-party extensions so the `store.rs` file should
  belong to the `third_party` directory.

* Move tauri command `uninstall_extension()` to `extension/mod.rs` from
  `third_party.rs` since one can uninstall an extension regardless of
  how you installed it.

* Refactor the `install_extension_from_store()` function, add more
  descriptive code comments.

Also, a trivial change, bump Rust toolchain and edition to use the
[let-chains](https://blog.rust-lang.org/2025/06/26/Rust-1.88.0/#let-chains) syntax.

* chore: release notes
2025-07-09 16:28:59 +08:00
SteveLauC
05f70d26d9 chore: replace meval-rs with our fork to clear dep warning (#745)
* chore: replace meval-rs with our fork to clear dep warning

This commit replaces the meval-rs dependency with our
[fork](https://github.com/infinilabs/meval-rs). The original meval-rs
crate has not been maintained for a long time and uses nom 1.0, a crate
that was released 9 years ago, which would be rejected by future Rust
compiler because it contains outdated Rust syntaxes. This is the reason
why we are seeing the following warning:

```
warning: the following packages contain code that will be rejected by a future version of Rust: nom v1.2.4
note: to see what the problems were, use the option `--future-incompat-report`, or run `cargo report future-incompatibilities --id 1
```

Switching to our fork would solve this warning.

* chore: release notes
2025-07-08 15:39:58 +08:00
SteveLauC
ab26dc7c6a fix(file search): searching by name&content does not search file name (#743)
* fix(file search): searching by name&content does not search file name

* release note
2025-07-08 09:21:43 +08:00
BiggerRain
6ff6b46139 refactor: create chat & send chat api (#739)
* chore: code format

* fix: build error

* refactor: chat create & chat

* chore: aa

* chore: aa

* refactor: send chat messages

* chore: chat

* chore: web

* chore: add

* docs: update notes
2025-07-07 19:41:29 +08:00
SteveLauC
119fd87a25 fix(file search): apply filters before from/size parameters (#741) 2025-07-07 19:40:46 +08:00
SteveLauC
de226a8fa4 ci: compile-check rust code & run rust tests when Rust code changes (#742)
Run some basic Rust checks in our CI iff rust code changes
2025-07-07 18:14:25 +08:00
SteveLauC
6865957725 chore: icon support for more file types (#740)
This PR adds icon support for more types of files, see the code for the full file type list.

Co-authored-by: ayang <473033518@qq.com>
2025-07-02 16:27:44 +08:00
SteveLauC
87818d69ed refactor: change File Search ext type to extension (#738)
* refactor: change File Search ext type to extension

* chore: release notes
2025-07-02 10:45:54 +08:00
SteveLauC
38b67d01b8 refactor: prioritize stat(2) when checking if a file is dir (#737)
* refactor: prioritize stat(2) when checking if a file is dir

* chore: release notes
2025-07-02 10:00:33 +08:00
ayangweb
a4f4a24730 feat: voice input support in both search and chat modes (#732)
* feat: voice input support in both search and chat modes

* docs: update changelog

* update

* update

* update

* update
2025-07-02 09:35:16 +08:00
BiggerRain
87bd3d020f fix: build error (#736) 2025-07-02 07:03:09 +08:00
SteveLauC
825ac5d565 feat: file search using spotlight (#705)
Co-authored-by: ayang <473033518@qq.com>
2025-07-01 19:19:16 +08:00
BiggerRain
f21a35e15d fix: update information storage cache and styles (#735) 2025-07-01 15:46:37 +08:00
BiggerRain
6e90b28204 style: extension icon styles (#734) 2025-07-01 13:44:44 +08:00
Hardy
e92e5e5158 chore: typo step name and env (#731)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:40:26 +08:00
255 changed files with 18840 additions and 7782 deletions

.env

@@ -1,5 +1,3 @@
COCO_SERVER_URL=http://localhost:9000 #https://coco.infini.cloud #http://localhost:9000
COCO_WEBSOCKET_URL=ws://localhost:9000/ws #wss://coco.infini.cloud/ws #ws://localhost:9000/ws
#TAURI_DEV_HOST=0.0.0.0

.github/workflows/frontend-ci.yml

@@ -0,0 +1,34 @@
name: Frontend Code Check
on:
  pull_request:
    # Only run it when Frontend code changes
    paths:
      - 'src/**'
jobs:
  check:
    strategy:
      matrix:
        platform: [ubuntu-latest, windows-latest, macos-latest]
    runs-on: ${{ matrix.platform }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
      # No need to pass the version arg as it is specified by "packageManager" in package.json
      - name: Install pnpm
        uses: pnpm/action-setup@v4
      - name: Install dependencies
        run: pnpm install --frozen-lockfile
      - name: Build frontend
        run: pnpm build

.github/workflows/release.yml

@@ -77,7 +77,6 @@ jobs:
target: "aarch64-unknown-linux-gnu"
env:
APP_VERSION: ${{ needs.create-release.outputs.APP_VERSION }}
RELEASE_BODY: ${{ needs.create-release.outputs.RELEASE_BODY }}
runs-on: ${{ matrix.platform }}
steps:
@@ -105,9 +104,19 @@ jobs:
if: startsWith(matrix.platform, 'ubuntu-22.04')
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils libtracker-sparql-3.0-dev
- name: Add Rust build target at ${{ matrix.platform}} for ${{ matrix.target }}
# On Windows, we need to generate bindings for 'searchapi.h' using bindgen.
# And bindgen relies on 'libclang'
# https://rust-lang.github.io/rust-bindgen/requirements.html#windows
- name: Install dependencies (Windows only)
if: startsWith(matrix.platform, 'windows-latest')
shell: bash
run: winget install LLVM.LLVM --silent --accept-package-agreements --accept-source-agreements
- name: Add Rust build target
working-directory: src-tauri
shell: bash
run: |
@@ -158,7 +167,7 @@ jobs:
with:
tagName: ${{ github.ref_name }}
releaseName: Coco ${{ env.APP_VERSION }}
releaseBody: "${{ env.RELEASE_BODY }}"
releaseBody: "${{ needs.create-release.outputs.RELEASE_BODY }}"
releaseDraft: true
prerelease: false
args: ${{ env.BUILD_ARGS }}

.github/workflows/rust_code_check.yml

@@ -0,0 +1,69 @@
name: Rust Code Check
on:
  pull_request:
    # Only run it when Rust code changes
    paths:
      - 'src-tauri/**'
jobs:
  check:
    strategy:
      matrix:
        platform: [ubuntu-latest, windows-latest, macos-latest]
    runs-on: ${{ matrix.platform }}
    steps:
      - uses: actions/checkout@v4
      - name: Checkout dependency (pizza-engine) repository
        uses: actions/checkout@v4
        with:
          repository: 'infinilabs/pizza'
          ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
          submodules: recursive
          ref: main
          path: pizza
      - name: Install dependencies (ubuntu only)
        if: startsWith(matrix.platform, 'ubuntu-latest')
        run: |
          sudo apt-get update
          sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils libtracker-sparql-3.0-dev
      # On Windows, we need to generate bindings for 'searchapi.h' using bindgen.
      # And bindgen relies on 'libclang'
      # https://rust-lang.github.io/rust-bindgen/requirements.html#windows
      - name: Install dependencies (Windows only)
        if: startsWith(matrix.platform, 'windows-latest')
        shell: bash
        run: winget install LLVM.LLVM --silent --accept-package-agreements --accept-source-agreements
      - name: Add pizza engine as a dependency
        working-directory: src-tauri
        shell: bash
        run: cargo add --path ../pizza/lib/engine --features query_string_parser,persistence
      - name: Format check
        working-directory: src-tauri
        shell: bash
        run: |
          rustup component add rustfmt
          cargo fmt --all --check
      - name: Check compilation (Without Pizza engine enabled)
        working-directory: ./src-tauri
        run: cargo check
      - name: Check compilation (With Pizza engine enabled)
        working-directory: ./src-tauri
        run: cargo check --features use_pizza_engine
      - name: Run tests (Without Pizza engine)
        working-directory: ./src-tauri
        run: cargo test
      - name: Run tests (With Pizza engine)
        working-directory: ./src-tauri
        run: cargo test --features use_pizza_engine

.vscode/settings.json

@@ -8,6 +8,8 @@
"clsx",
"codegen",
"dataurl",
"deeplink",
"deepthink",
"dtolnay",
"dyld",
"elif",
@@ -30,6 +32,8 @@
"localstorage",
"lucide",
"maximizable",
"mdast",
"meval",
"Minimizable",
"msvc",
"nord",
@@ -39,9 +43,11 @@
"overscan",
"partialize",
"patchelf",
"Quicklink",
"Raycast",
"rehype",
"reqwest",
"rerank",
"rgba",
"rustup",
"screenshotable",
@@ -56,6 +62,7 @@
"traptitech",
"unlisten",
"unlistener",
"unlisteners",
"unminimize",
"uuidv",
"VITE",
@@ -76,5 +83,6 @@
"i18n-ally.keystyle": "nested",
"editor.tabSize": 2,
"editor.insertSpaces": true,
"editor.detectIndentation": false
"editor.detectIndentation": false,
"i18n-ally.displayLanguage": "zh"
}

README.md

@@ -64,9 +64,9 @@ At Coco AI, we aim to streamline workplace collaboration by centralizing access
### Prerequisites
- Node.js >= 18.12
- Rust (latest stable)
- pnpm (package manager)
- [Node.js >= 18.12](https://nodejs.org/en/download/)
- [Rust (latest stable)](https://www.rust-lang.org/tools/install)
- [pnpm (package manager)](https://pnpm.io/installation)
### Development Setup
@@ -91,6 +91,8 @@ pnpm tauri build
- [Coco App Documentation](https://docs.infinilabs.com/coco-app/main/)
- [Coco Server Documentation](https://docs.infinilabs.com/coco-server/main/)
- [DeepWiki Coco App](https://deepwiki.com/infinilabs/coco-app)
- [DeepWiki Coco Server](https://deepwiki.com/infinilabs/coco-server)
- [Tauri Documentation](https://tauri.app/)
## Contributors

RELEASE_PROCEDURE.md

@@ -0,0 +1,56 @@
1. Send a PR that updates the release notes "docs/content.en/docs/release-notes/_index.md", and
merge it into `main`.
2. Run release command (by @medcl)
Make sure you are on the latest main branch, then run `pnpm release`:
> NOTE: A tag is needed to trigger the [release CI][release_ci].
```sh
➜ coco-app git:(main) ✗ pnpm release
🚀 Let's release coco (currently at a.b.c)
Changelog:
* xxx
* xxx
✔ Select increment (next version):
Changeset:
M package.json
M src-tauri/Cargo.lock
M src-tauri/Cargo.toml
✔ Commit (vX.Y.Z)? Yes
✔ Tag (vX.Y.Z)? Yes
✔ Push? Yes
🏁 Done
```
3. Build & Move Release Package
1. [Build][ci] the package for this release
2. @luohoufu moves the package to the stable folder.
![release](./docs/static/img/release.png)
4. Update the [roadmap](https://coco.rs/en/roadmap) (if needed)
> You should update both English and Chinese JSON files
>
> * English: https://github.com/infinilabs/coco-website/blob/main/i18n/locales/en.json
> * Chinese: https://github.com/infinilabs/coco-website/blob/main/i18n/locales/zh.json
1. Add a new [section][roadmap_new] for the new release
2. Adjust the entries under [In Progress][in_prog] and [Up Next][up_next] accordingly
* Completed items should be removed from "In Progress"
* Some items should be moved from "Up Next" to "In Progress"
[release_ci]: https://github.com/infinilabs/coco-app/blob/main/.github/workflows/release.yml
[ci]: https://github.com/infinilabs/ci/actions/workflows/coco-app.yml
[roadmap_new]: https://github.com/infinilabs/coco-website/blob/5ae30bdfad0724bf27b4da8621b86be1dbe7bb8b/i18n/locales/en.json#L206-L218
[in_prog]: https://github.com/infinilabs/coco-website/blob/5ae30bdfad0724bf27b4da8621b86be1dbe7bb8b/i18n/locales/en.json#L121
[up_next]: https://github.com/infinilabs/coco-website/blob/5ae30bdfad0724bf27b4da8621b86be1dbe7bb8b/i18n/locales/en.json#L156

View File

@@ -13,6 +13,12 @@ asciinema: true
[x11_protocol]: https://en.wikipedia.org/wiki/X_Window_System
[if_x11]: https://unix.stackexchange.com/q/202891/498440
## Install dependencies
```sh
$ sudo apt-get update
$ sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils libtracker-sparql-3.0-dev
```
## Go to the download page

docs/content.en/docs/release-notes/_index.md

@@ -5,15 +5,139 @@ title: "Release Notes"
# Release Notes
Information about release notes of Coco Server is provided here.
Information about release notes of Coco App is provided here.
## Latest (In development)
### ❌ Breaking changes
### 🚀 Features
### 🐛 Bug fix
### ✈️ Improvements
## 0.8.0 (2025-09-28)
### ❌ Breaking changes
- chore: update request accesstoken api #866
### 🚀 Features
- feat: enhance ui for skipped version #834
- feat: support installing local extensions #749
- feat: support sending files in chat messages #764
- feat: sub extension can set 'platforms' now #847
- feat: add extension uninstall option in settings #855
- feat: impl extension settings 'hide_before_open' #862
- feat: index both en/zh_CN app names and show app name in chosen language #875
- feat: support context menu in debug mode #882
- feat: file search for Linux/GNOME #884
- feat: file search for Linux/KDE #886
- feat: extension Window Management for macOS #892
- feat: new extension type View #894
- feat: support opening file in its containing folder #900
### 🐛 Bug fix
- fix: fix issue with update check failure #833
- fix: web component login state #857
- fix: shortcut key not opening extension store #877
- fix: set up hotkey on main thread or Windows will complain #879
- fix: resolve deeplink login issue #881
- fix: use kill_on_drop() to avoid zombie proc in error case #887
- fix: settings window rendering/loading issue #889
- fix: ensure search paths are indexed #896
- fix: bump applications-rs to fix empty app name issue #898
### ✈️ Improvements
- refactor: calling service related interfaces #831
- refactor: split query_coco_fusion() #836
- chore: web component loading font icon #838
- chore: delete unused code files and dependencies #841
- chore: ignore tauri::AppHandle's generic argument R #845
- refactor: check Extension/plugin.json from all sources #846
- refactor: pinning window won't set CanJoinAllSpaces on macOS #854
- build: web component build error #858
- refactor: coordinate third-party extension operations using lock #867
- refactor: index iOS apps and macOS apps that store icon in Assets.car #872
- refactor: accept both '-' and '\_' as locale str separator #876
- refactor: relax the file search conditions on macOS #883
- refactor: ensure Coco won't take focus #891
- chore: skip login check for web widget #895
- chore: convertFileSrc() "link[href]" and "img[src]" #901
## 0.7.1 (2025-07-27)
### ❌ Breaking changes
### 🚀 Features
### 🐛 Bug fix
- fix: correct enter key behavior #828
### ✈️ Improvements
- chore: web component add notification component #825
- refactor: collection behavior defaults to `MoveToActiveSpace`, and only use `CanJoinAllSpaces` when window is pinned #829
## 0.7.0 (2025-07-25)
### ❌ Breaking changes
### 🚀 Features
- feat: file search using spotlight #705
- feat: voice input support in both search and chat modes #732
- feat: text to speech now powered by LLM #750
- feat: file search for Windows #762
### 🐛 Bug fix
- fix(file search): apply filters before from/size parameters #741
- fix(file search): searching by name&content does not search file name #743
- fix: prevent window from hiding when moved on Windows #748
- fix: unregister ext hotkey when it gets deleted #770
- fix: indexing apps does not respect search scope config #773
- fix: restore missing category titles on subpages #772
- fix: correct incorrect assistant display when quick ai access #779
- fix: resolved minor issues with voice playback #780
- fix: fixed incorrect taskbar icon display on linux #783
- fix: fix data inconsistency issue on secondary pages #784
- fix: incorrect status when installing extension #789
- fix: increase read_timeout for HTTP streaming stability #798
- fix: enter key problem #794
- fix: fix selection issue after renaming #800
- fix: fix shortcut issue in windows context menu #804
- fix: panic caused by "state() called before manage()" #806
- fix: fix multiline input issue #808
- fix: fix ctrl+k not working #815
- fix: fix update window config sync #818
- fix: fix enter key on subpages #819
- fix: panic on Ubuntu (GNOME) when opening apps #821
### ✈️ Improvements
- refactor: prioritize stat(2) when checking if a file is dir #737
- refactor: change File Search ext type to extension #738
- refactor: create chat & send chat api #739
- chore: icon support for more file types #740
- chore: replace meval-rs with our fork to clear dep warning #745
- refactor: adjusted assistant, datasource, mcp_server interface parameters #746
- refactor: adjust extension code hierarchy #747
- chore: bump dep applications-rs #751
- chore: rename QuickLink/quick_link to Quicklink/quicklink #752
- chore: assistant params & styles #753
- chore: make optional fields optional #758
- chore: search-chat components add formatUrl & think data & icons url #765
- chore: Coco app http request headers #744
- refactor: do status code check before deserializing response #767
- style: splash adapts to the width of mobile phones #768
- chore: search-chat add language and formatUrl parameters #775
- chore: not request the interface if not logged in #795
- refactor: clean up unsupported characters from query string in Win Search #802
- chore: display backtrace in panic log #805
## 0.6.0 (2025-06-29)
### ❌ Breaking changes
@@ -301,4 +425,4 @@ Information about release notes of Coco Server is provided here.
### Bug fix
### Improvements
### Improvements

BIN docs/static/img/release.png — new binary file (27 KiB), not shown

View File

@@ -1,7 +1,7 @@
{
"name": "coco",
"private": true,
"version": "0.6.0",
"version": "0.8.0",
"type": "module",
"scripts": {
"dev": "vite",
@@ -18,7 +18,6 @@
"release-beta": "release-it --preRelease=beta --preReleaseBase=1"
},
"dependencies": {
"@ant-design/icons": "^6.0.0",
"@headlessui/react": "^2.2.2",
"@tauri-apps/api": "^2.5.0",
"@tauri-apps/plugin-autostart": "~2.2.0",
@@ -27,12 +26,11 @@
"@tauri-apps/plugin-global-shortcut": "~2.0.0",
"@tauri-apps/plugin-http": "~2.0.2",
"@tauri-apps/plugin-log": "~2.4.0",
"@tauri-apps/plugin-opener": "^2.2.7",
"@tauri-apps/plugin-opener": "^2.5.0",
"@tauri-apps/plugin-os": "^2.2.1",
"@tauri-apps/plugin-process": "^2.2.1",
"@tauri-apps/plugin-shell": "^2.2.1",
"@tauri-apps/plugin-updater": "github:infinilabs/tauri-plugin-updater#v2",
"@tauri-apps/plugin-websocket": "~2.3.0",
"@tauri-apps/plugin-window": "2.0.0-alpha.1",
"@wavesurfer/react": "^1.0.11",
"ahooks": "^3.8.4",
@@ -95,4 +93,4 @@
"vite": "^5.4.19"
},
"packageManager": "pnpm@10.11.0+sha512.6540583f41cc5f628eb3d9773ecee802f4f9ef9923cc45b69890fb47991d4b092964694ec3a4f738a420c918a333062c8b925d312f42e4f0c263eb603551f977"
}
}

pnpm-lock.yaml generated (132 changed lines)
View File

@@ -8,9 +8,6 @@ importers:
.:
dependencies:
'@ant-design/icons':
specifier: ^6.0.0
version: 6.0.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
'@headlessui/react':
specifier: ^2.2.2
version: 2.2.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -36,8 +33,8 @@ importers:
specifier: ~2.4.0
version: 2.4.0
'@tauri-apps/plugin-opener':
specifier: ^2.2.7
version: 2.2.7
specifier: ^2.5.0
version: 2.5.0
'@tauri-apps/plugin-os':
specifier: ^2.2.1
version: 2.2.1
@@ -50,9 +47,6 @@ importers:
'@tauri-apps/plugin-updater':
specifier: github:infinilabs/tauri-plugin-updater#v2
version: https://codeload.github.com/infinilabs/tauri-plugin-updater/tar.gz/358e689c65e9943b53eff50bcb9dfd5b1cfc4072
'@tauri-apps/plugin-websocket':
specifier: ~2.3.0
version: 2.3.0
'@tauri-apps/plugin-window':
specifier: 2.0.0-alpha.1
version: 2.0.0-alpha.1
@@ -239,23 +233,6 @@ packages:
resolution: {integrity: sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==}
engines: {node: '>=6.0.0'}
'@ant-design/colors@8.0.0':
resolution: {integrity: sha512-6YzkKCw30EI/E9kHOIXsQDHmMvTllT8STzjMb4K2qzit33RW2pqCJP0sk+hidBntXxE+Vz4n1+RvCTfBw6OErw==}
'@ant-design/fast-color@3.0.0':
resolution: {integrity: sha512-eqvpP7xEDm2S7dUzl5srEQCBTXZMmY3ekf97zI+M2DHOYyKdJGH0qua0JACHTqbkRnD/KHFQP9J1uMJ/XWVzzA==}
engines: {node: '>=8.x'}
'@ant-design/icons-svg@4.4.2':
resolution: {integrity: sha512-vHbT+zJEVzllwP+CM+ul7reTEfBR0vgxFe7+lREAsAA7YGsYpboiq2sQNeQeRvh09GfQgs/GyFEvZpJ9cLXpXA==}
'@ant-design/icons@6.0.0':
resolution: {integrity: sha512-o0aCCAlHc1o4CQcapAwWzHeaW2x9F49g7P3IDtvtNXgHowtRWYb7kiubt8sQPFvfVIVU/jLw2hzeSlNt0FU+Uw==}
engines: {node: '>=8'}
peerDependencies:
react: '>=16.0.0'
react-dom: '>=16.0.0'
'@antfu/install-pkg@1.1.0':
resolution: {integrity: sha512-MGQsmw10ZyI+EJo45CdSER4zEb+p31LpDAFp2Z3gkSd1yqVZGi0Ebx++YTEMonJy4oChEMLsxZ64j8FH6sSqtQ==}
@@ -813,6 +790,9 @@ packages:
resolution: {integrity: sha512-O8jcjabXaleOG9DQ0+ARXWZBTfnP4WNAqzuiJK7ll44AmxGKv/J2M4TPjxjY3znBCfvBXFzucm1twdyFybFqEA==}
engines: {node: '>=12'}
'@jridgewell/gen-mapping@0.3.13':
resolution: {integrity: sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==}
'@jridgewell/gen-mapping@0.3.8':
resolution: {integrity: sha512-imAbBGkb+ebQyxKgzv5Hu2nmROxoDOXHh80evxdoXNOrvAnVx7zimzc1Oo5h9RlfV4vPXaE2iM5pOFbvOCClWA==}
engines: {node: '>=6.0.0'}
@@ -825,15 +805,21 @@ packages:
resolution: {integrity: sha512-R8gLRTZeyp03ymzP/6Lil/28tGeGEzhx1q2k703KGWRAI1VdvPIXdG70VJc2pAMw3NA6JKL5hhFu1sJX0Mnn/A==}
engines: {node: '>=6.0.0'}
'@jridgewell/source-map@0.3.6':
resolution: {integrity: sha512-1ZJTZebgqllO79ue2bm3rIGud/bOe0pP5BjSRCRxxYkEZS8STV7zN84UBbiYu7jy+eCKSnVIUgoWWE/tt+shMQ==}
'@jridgewell/source-map@0.3.11':
resolution: {integrity: sha512-ZMp1V8ZFcPG5dIWnQLr3NSI1MiCU7UETdS/A0G8V/XWHvJv3ZsFqutJn1Y5RPmAPX6F3BiE397OqveU/9NCuIA==}
'@jridgewell/sourcemap-codec@1.5.0':
resolution: {integrity: sha512-gv3ZRaISU3fjPAgNsriBRqGWQL6quFx04YMPW/zD8XMLsU32mhCCbfbO6KZFLjvYpCZ8zyDEgqsgf+PwPaM7GQ==}
'@jridgewell/sourcemap-codec@1.5.5':
resolution: {integrity: sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==}
'@jridgewell/trace-mapping@0.3.25':
resolution: {integrity: sha512-vNk6aEwybGtawWmy/PzwnGDOjCkLWSD2wqvjGGAgOAwCGWySYXfYoxt00IJkTF+8Lb57DwOb3Aa0o9CApepiYQ==}
'@jridgewell/trace-mapping@0.3.31':
resolution: {integrity: sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==}
'@mermaid-js/parser@0.4.0':
resolution: {integrity: sha512-wla8XOWvQAwuqy+gxiZqY+c7FokraOTHRWMsbB4AgRx9Sy7zKslNyejy7E+a77qHfey5GXw/ik3IXv/NHMJgaA==}
@@ -1005,12 +991,6 @@ packages:
resolution: {integrity: sha512-c83qWb22rNRuB0UaVCI0uRPNRr8Z0FWnEIvT47jiHAmOIUHbBOg5XvV7pM5x+rKn9HRpjxquDbXYSXr3fAKFcw==}
engines: {node: '>=12'}
'@rc-component/util@1.2.1':
resolution: {integrity: sha512-AUVu6jO+lWjQnUOOECwu8iR0EdElQgWW5NBv5vP/Uf9dWbAX3udhMutRlkVXjuac2E40ghkFy+ve00mc/3Fymg==}
peerDependencies:
react: '>=18.0.0'
react-dom: '>=18.0.0'
'@react-aria/focus@3.20.2':
resolution: {integrity: sha512-Q3rouk/rzoF/3TuH6FzoAIKrl+kzZi9LHmr8S5EqLAOyP9TXIKG34x2j42dZsAhrw7TbF9gA8tBKwnCNH4ZV+Q==}
peerDependencies:
@@ -1182,6 +1162,9 @@ packages:
'@tauri-apps/api@2.5.0':
resolution: {integrity: sha512-Ldux4ip+HGAcPUmuLT8EIkk6yafl5vK0P0c0byzAKzxJh7vxelVtdPONjfgTm96PbN24yjZNESY8CKo8qniluA==}
'@tauri-apps/api@2.8.0':
resolution: {integrity: sha512-ga7zdhbS2GXOMTIZRT0mYjKJtR9fivsXzsyq5U3vjDL0s6DTMwYRm0UHNjzTY5dh4+LSC68Sm/7WEiimbQNYlw==}
'@tauri-apps/cli-darwin-arm64@2.5.0':
resolution: {integrity: sha512-VuVAeTFq86dfpoBDNYAdtQVLbP0+2EKCHIIhkaxjeoPARR0sLpFHz2zs0PcFU76e+KAaxtEtAJAXGNUc8E1PzQ==}
engines: {node: '>= 10'}
@@ -1271,8 +1254,8 @@ packages:
'@tauri-apps/plugin-log@2.4.0':
resolution: {integrity: sha512-j7yrDtLNmayCBOO2esl3aZv9jSXy2an8MDLry3Ys9ZXerwUg35n1Y2uD8HoCR+8Ng/EUgx215+qOUfJasjYrHw==}
'@tauri-apps/plugin-opener@2.2.7':
resolution: {integrity: sha512-uduEyvOdjpPOEeDRrhwlCspG/f9EQalHumWBtLBnp3fRp++fKGLqDOyUhSIn7PzX45b/rKep//ZQSAQoIxobLA==}
'@tauri-apps/plugin-opener@2.5.0':
resolution: {integrity: sha512-B0LShOYae4CZjN8leiNDbnfjSrTwoZakqKaWpfoH6nXiJwt6Rgj6RnVIffG3DoJiKsffRhMkjmBV9VeilSb4TA==}
'@tauri-apps/plugin-os@2.2.1':
resolution: {integrity: sha512-cNYpNri2CCc6BaNeB6G/mOtLvg8dFyFQyCUdf2y0K8PIAKGEWdEcu8DECkydU2B+oj4OJihDPD2de5K6cbVl9A==}
@@ -1287,9 +1270,6 @@ packages:
resolution: {tarball: https://codeload.github.com/infinilabs/tauri-plugin-updater/tar.gz/358e689c65e9943b53eff50bcb9dfd5b1cfc4072}
version: 2.7.1
'@tauri-apps/plugin-websocket@2.3.0':
resolution: {integrity: sha512-eAwRGe3tnqDeQYE0wq4g1PUKbam9tYvlC4uP/au12Y/z7MP4lrS4ylv+aoZ5Ly+hTlBdi7hDkhHomwF/UeBesA==}
'@tauri-apps/plugin-window@2.0.0-alpha.1':
resolution: {integrity: sha512-dFOAgal/3Txz3SQ+LNQq0AK1EPC+acdaFlwPVB/6KXUZYmaFleIlzgxDVoJCQ+/xOhxvYrdQaFLefh0I/Kldbg==}
@@ -1496,6 +1476,11 @@ packages:
engines: {node: '>=0.4.0'}
hasBin: true
acorn@8.15.0:
resolution: {integrity: sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==}
engines: {node: '>=0.4.0'}
hasBin: true
agent-base@7.1.3:
resolution: {integrity: sha512-jRR5wdylq8CkOe6hei19GGZnxM6rBGwFl3Bg0YItGDimvjGtAvdZk4Pu6Cl4u4Igsws4a1fd1Vq3ezrhn4KmFw==}
engines: {node: '>= 14'}
@@ -1679,9 +1664,6 @@ packages:
resolution: {integrity: sha512-cYY9mypksY8NRqgDB1XD1RiJL338v/551niynFTGkZOO2LHuB2OmOYxDIe/ttN9AHwrqdum1360G3ald0W9kCg==}
engines: {node: '>=8'}
classnames@2.5.1:
resolution: {integrity: sha512-saHYOzhIQs6wy2sVxTM6bUDsQO4F50V9RQ22qBpEdCW+I+/Wmke2HOl6lS6dTpdxVhb88/I6+Hs+438c3lfUow==}
cli-boxes@3.0.0:
resolution: {integrity: sha512-/lzGpEWL/8PfI0BmBOPRwp0c/wFNX1RdUML3jK/RcSBA9T8mZDdQpqYBKtCFTOfQbwPqWEOpjqW+Fnayc0969g==}
engines: {node: '>=10'}
@@ -3161,9 +3143,6 @@ packages:
typescript:
optional: true
react-is@18.3.1:
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
react-markdown@9.1.0:
resolution: {integrity: sha512-xaijuJB0kzGiUdG7nc2MOMDUDBWPyGAjZtUrow9XxUeua8IqeP+VlIfAZ3bphpcLTnSZXz6z9jcVC/TCwbfgdw==}
peerDependencies:
@@ -3809,23 +3788,6 @@ snapshots:
'@jridgewell/gen-mapping': 0.3.8
'@jridgewell/trace-mapping': 0.3.25
'@ant-design/colors@8.0.0':
dependencies:
'@ant-design/fast-color': 3.0.0
'@ant-design/fast-color@3.0.0': {}
'@ant-design/icons-svg@4.4.2': {}
'@ant-design/icons@6.0.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
'@ant-design/colors': 8.0.0
'@ant-design/icons-svg': 4.4.2
'@rc-component/util': 1.2.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
classnames: 2.5.1
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
'@antfu/install-pkg@1.1.0':
dependencies:
package-manager-detector: 1.3.0
@@ -4285,6 +4247,12 @@ snapshots:
wrap-ansi: 8.1.0
wrap-ansi-cjs: wrap-ansi@7.0.0
'@jridgewell/gen-mapping@0.3.13':
dependencies:
'@jridgewell/sourcemap-codec': 1.5.5
'@jridgewell/trace-mapping': 0.3.31
optional: true
'@jridgewell/gen-mapping@0.3.8':
dependencies:
'@jridgewell/set-array': 1.2.1
@@ -4295,19 +4263,28 @@ snapshots:
'@jridgewell/set-array@1.2.1': {}
'@jridgewell/source-map@0.3.6':
'@jridgewell/source-map@0.3.11':
dependencies:
'@jridgewell/gen-mapping': 0.3.8
'@jridgewell/trace-mapping': 0.3.25
'@jridgewell/gen-mapping': 0.3.13
'@jridgewell/trace-mapping': 0.3.31
optional: true
'@jridgewell/sourcemap-codec@1.5.0': {}
'@jridgewell/sourcemap-codec@1.5.5':
optional: true
'@jridgewell/trace-mapping@0.3.25':
dependencies:
'@jridgewell/resolve-uri': 3.1.2
'@jridgewell/sourcemap-codec': 1.5.0
'@jridgewell/trace-mapping@0.3.31':
dependencies:
'@jridgewell/resolve-uri': 3.1.2
'@jridgewell/sourcemap-codec': 1.5.5
optional: true
'@mermaid-js/parser@0.4.0':
dependencies:
langium: 3.3.1
@@ -4468,12 +4445,6 @@ snapshots:
'@pnpm/network.ca-file': 1.0.2
config-chain: 1.1.13
'@rc-component/util@1.2.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
react-is: 18.3.1
'@react-aria/focus@3.20.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
'@react-aria/interactions': 3.25.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -4607,6 +4578,8 @@ snapshots:
'@tauri-apps/api@2.5.0': {}
'@tauri-apps/api@2.8.0': {}
'@tauri-apps/cli-darwin-arm64@2.5.0':
optional: true
@@ -4678,9 +4651,9 @@ snapshots:
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-opener@2.2.7':
'@tauri-apps/plugin-opener@2.5.0':
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/api': 2.8.0
'@tauri-apps/plugin-os@2.2.1':
dependencies:
@@ -4698,10 +4671,6 @@ snapshots:
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-websocket@2.3.0':
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-window@2.0.0-alpha.1':
dependencies:
'@tauri-apps/api': 2.0.0-alpha.6
@@ -4941,6 +4910,9 @@ snapshots:
acorn@8.14.1: {}
acorn@8.15.0:
optional: true
agent-base@7.1.3: {}
ahooks@3.8.4(react@18.3.1):
@@ -5132,8 +5104,6 @@ snapshots:
ci-info@4.2.0: {}
classnames@2.5.1: {}
cli-boxes@3.0.0: {}
cli-cursor@5.0.0:
@@ -6881,8 +6851,6 @@ snapshots:
react-dom: 18.3.1(react@18.3.1)
typescript: 5.8.3
react-is@18.3.1: {}
react-markdown@9.1.0(@types/react@18.3.21)(react@18.3.1):
dependencies:
'@types/hast': 3.0.4
@@ -7301,8 +7269,8 @@ snapshots:
terser@5.40.0:
dependencies:
'@jridgewell/source-map': 0.3.6
acorn: 8.14.1
'@jridgewell/source-map': 0.3.11
acorn: 8.15.0
commander: 2.20.3
source-map-support: 0.5.21
optional: true

src-tauri/Cargo.lock generated (3208 changed lines)

File diff suppressed because it is too large.

View File

@@ -1,9 +1,9 @@
[package]
name = "coco"
version = "0.6.0"
version = "0.8.0"
description = "Search, connect, collaborate all in one place."
authors = ["INFINI Labs"]
edition = "2021"
edition = "2024"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[lib]
@@ -15,6 +15,7 @@ crate-type = ["staticlib", "cdylib", "rlib"]
[build-dependencies]
tauri-build = { version = "2", features = ["default"] }
cfg-if = "1.0.1"
[features]
default = ["desktop"]
@@ -51,7 +52,6 @@ serde = { version = "1", features = ["derive"] }
# see: https://docs.rs/serde_json/latest/serde_json/struct.Number.html#method.from_u128
serde_json = { version = "1", features = ["arbitrary_precision", "preserve_order"] }
tauri-plugin-http = "2"
tauri-plugin-websocket = "2"
tauri-plugin-deep-link = "2.0.0"
tauri-plugin-store = "2.2.0"
tauri-plugin-os = "2"
@@ -62,7 +62,7 @@ tauri-plugin-drag = "2"
tauri-plugin-macos-permissions = "2"
tauri-plugin-fs-pro = "2"
tauri-plugin-screenshots = "2"
applications = { git = "https://github.com/infinilabs/applications-rs", rev = "7bb507e6b12f73c96f3a52f0578d0246a689f381" }
applications = { git = "https://github.com/infinilabs/applications-rs", rev = "b5fac4034a40d42e72f727f1aa1cc1f19fe86653" }
tokio-native-tls = "0.3" # For wss connections
tokio = { version = "1", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
@@ -87,7 +87,7 @@ http = "1.1.0"
tungstenite = "0.24.0"
tokio-util = "0.7.14"
tauri-plugin-windows-version = "2"
meval = "0.2"
meval = { git = "https://github.com/infinilabs/meval-rs" }
chinese-number = "0.7"
num2words = "1"
tauri-plugin-log = "2"
@@ -102,12 +102,44 @@ tauri-plugin-opener = "2"
async-recursion = "1.1.1"
zip = "4.0.0"
url = "2.5.2"
camino = "1.1.10"
tokio-stream = { version = "0.1.17", features = ["io-util"] }
sysinfo = "0.35.2"
indexmap = { version = "2.10.0", features = ["serde"] }
strum = { version = "0.27.2", features = ["derive"] }
sys-locale = "0.3.2"
tauri-plugin-prevent-default = "1"
oneshot = "0.1.11"
bitflags = "2.9.3"
cfg-if = "1.0.1"
dunce = "1.0.5"
urlencoding = "2.1.3"
scraper = "0.17"
toml = "0.8"
path-clean = "1.0.1"
[dev-dependencies]
tempfile = "3.23.0"
[target."cfg(target_os = \"macos\")".dependencies]
tauri-nspanel = { git = "https://github.com/ahkohd/tauri-nspanel", branch = "v2" }
objc2-app-kit = { version = "0.3.1", features = ["NSWindow"] }
objc2 = "0.6.2"
objc2-core-foundation = {version = "0.3.1", features = ["CFString", "CFCGTypes", "CFArray"] }
objc2-application-services = { version = "0.3.1", features = ["HIServices"] }
objc2-core-graphics = { version = "=0.3.1", features = ["CGEvent"] }
[target."cfg(target_os = \"linux\")".dependencies]
gio = "0.21.2"
glib = "0.21.2"
tracker-rs = "0.7"
which = "8.0.0"
configparser = "3.1.0"
[target."cfg(any(target_os = \"macos\", windows, target_os = \"linux\"))".dependencies]
tauri-plugin-single-instance = { version = "2.0.0", features = ["deep-link"] }
serde = { version = "1.0.219", features = ["derive"], optional = true }
[profile.dev]
incremental = true # Compile your binary in smaller steps.
@@ -123,6 +155,13 @@ strip = true # Ensures debug symbols are removed.
tauri-plugin-autostart = "^2.2"
tauri-plugin-global-shortcut = "2"
tauri-plugin-updater = { git = "https://github.com/infinilabs/plugins-workspace", branch = "v2" }
# This should be compatible with the semver used by `tauri-plugin-updater`
semver = { version = "1", features = ["serde"] }
[target."cfg(target_os = \"windows\")".dependencies]
enigo="0.3"
windows = { version = "0.61", features = ["Win32_Foundation", "Win32_System_Com", "Win32_System_Ole", "Win32_System_Search", "Win32_UI_Shell_PropertiesSystem", "Win32_Data"] }
windows-sys = { version = "0.61", features = ["Win32", "Win32_System", "Win32_System_Com"] }
[target."cfg(target_os = \"windows\")".build-dependencies]
bindgen = "0.72.1"

View File

@@ -1,3 +1,42 @@
fn main() {
tauri_build::build()
tauri_build::build();
// If env var `GITHUB_ACTIONS` exists, we are running in CI, set up the `ci`
// attribute
if std::env::var("GITHUB_ACTIONS").is_ok() {
println!("cargo:rustc-cfg=ci");
}
// Notify `rustc` of this `cfg` attribute to suppress unknown attribute warnings.
//
// unexpected condition name: `ci`
println!("cargo::rustc-check-cfg=cfg(ci)");
// Run bindgen on searchapi.h on Windows, as the windows crate does not provide
// bindings for it.
cfg_if::cfg_if! {
if #[cfg(target_os = "windows")] {
use std::env;
use std::path::PathBuf;
let wrapper_header = r#"#include <windows.h>
#include <searchapi.h>"#;
let searchapi_bindings = bindgen::Builder::default()
.header_contents("wrapper.h", wrapper_header)
.generate()
.expect("failed to generate bindings for <searchapi.h>");
let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
searchapi_bindings
.write_to_file(out_path.join("searchapi_bindings.rs"))
.expect("couldn't write bindings to <OUT_DIR/searchapi_bindings.rs>")
// Looks like there is no need to link the library that contains the
// implementation of functions declared in 'searchapi.h' manually as
// the FFI bindings work (without doing that).
//
// This is weird; I did not expect the linker to link it automatically.
}
}
}
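
With the `ci` cfg registered above, other code in the crate can branch on whether the build ran inside GitHub Actions. A minimal sketch (not taken from the repository) of how such a cfg could be consumed:

```rust
// Hypothetical consumer of the `ci` cfg emitted by the build.rs above.
// `cfg!(ci)` evaluates to true only when build.rs printed `cargo:rustc-cfg=ci`.
fn running_in_ci() -> bool {
    cfg!(ci)
}

fn main() {
    if running_in_ci() {
        println!("built inside GitHub Actions");
    } else {
        println!("built locally");
    }
}
```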

View File

@@ -37,9 +37,6 @@
"http:allow-fetch-cancel",
"http:allow-fetch-read-body",
"http:allow-fetch-send",
"websocket:default",
"websocket:allow-connect",
"websocket:allow-send",
"autostart:allow-enable",
"autostart:allow-disable",
"autostart:allow-is-enabled",
@@ -72,6 +69,7 @@
"updater:default",
"windows-version:default",
"log:default",
"opener:default"
"opener:default",
"core:window:allow-unminimize"
]
}

View File

@@ -1,2 +1,2 @@
[toolchain]
channel = "nightly-2025-02-28"
channel = "nightly-2025-06-26"

View File

@@ -1,20 +1,20 @@
use crate::common::assistant::ChatRequestMessage;
use crate::common::http::{convert_query_params_to_strings, GetResponse};
use crate::common::http::convert_query_params_to_strings;
use crate::common::register::SearchSourceRegistry;
use crate::server::http_client::HttpClient;
use crate::{common, server::servers::COCO_SERVERS};
use futures::stream::FuturesUnordered;
use futures::StreamExt;
use futures::stream::FuturesUnordered;
use futures_util::TryStreamExt;
use http::Method;
use serde_json::Value;
use std::collections::HashMap;
use tauri::{AppHandle, Emitter, Manager, Runtime};
use tauri::{AppHandle, Emitter, Manager};
use tokio::io::AsyncBufReadExt;
#[tauri::command]
pub async fn chat_history<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn chat_history(
_app_handle: AppHandle,
server_id: String,
from: u32,
size: u32,
@@ -43,8 +43,8 @@ pub async fn chat_history<R: Runtime>(
}
#[tauri::command]
pub async fn session_chat_history<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn session_chat_history(
_app_handle: AppHandle,
server_id: String,
session_id: String,
from: u32,
@@ -66,8 +66,8 @@ pub async fn session_chat_history<R: Runtime>(
}
#[tauri::command]
pub async fn open_session_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn open_session_chat(
_app_handle: AppHandle,
server_id: String,
session_id: String,
) -> Result<String, String> {
@@ -81,8 +81,8 @@ pub async fn open_session_chat<R: Runtime>(
}
#[tauri::command]
pub async fn close_session_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn close_session_chat(
_app_handle: AppHandle,
server_id: String,
session_id: String,
) -> Result<String, String> {
@@ -95,14 +95,16 @@ pub async fn close_session_chat<R: Runtime>(
common::http::get_response_body_text(response).await
}
#[tauri::command]
pub async fn cancel_session_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn cancel_session_chat(
_app_handle: AppHandle,
server_id: String,
session_id: String,
query_params: Option<HashMap<String, Value>>,
) -> Result<String, String> {
let path = format!("/chat/{}/_cancel", session_id);
let query_params = convert_query_params_to_strings(query_params);
let response = HttpClient::post(&server_id, path.as_str(), None, None)
let response = HttpClient::post(&server_id, path.as_str(), query_params, None)
.await
.map_err(|e| format!("Error cancel session: {}", e))?;
@@ -110,82 +112,161 @@ pub async fn cancel_session_chat<R: Runtime>(
}
#[tauri::command]
pub async fn new_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn chat_create(
app_handle: AppHandle,
server_id: String,
websocket_id: String,
message: String,
message: Option<String>,
attachments: Option<Vec<String>>,
query_params: Option<HashMap<String, Value>>,
) -> Result<GetResponse, String> {
let body = if !message.is_empty() {
let message = ChatRequestMessage {
message: Some(message),
client_id: String,
) -> Result<(), String> {
println!("chat_create message: {:?}", message);
println!("chat_create attachments: {:?}", attachments);
let message_empty = message.as_ref().map_or(true, |m| m.is_empty());
let attachments_empty = attachments.as_ref().map_or(true, |a| a.is_empty());
if message_empty && attachments_empty {
return Err("Message and attachments are empty".to_string());
}
let body = {
let request_message: ChatRequestMessage = ChatRequestMessage {
message,
attachments,
};
println!("chat_create body: {:?}", request_message);
Some(
serde_json::to_string(&message)
serde_json::to_string(&request_message)
.map_err(|e| format!("Failed to serialize message: {}", e))?
.into(),
)
} else {
None
};
let mut headers = HashMap::new();
headers.insert("WEBSOCKET-SESSION-ID".to_string(), websocket_id.into());
let response = HttpClient::advanced_post(
&server_id,
"/chat/_new",
Some(headers),
"/chat/_create",
None,
convert_query_params_to_strings(query_params),
body,
)
.await
.map_err(|e| format!("Error sending message: {}", e))?;
let body_text = common::http::get_response_body_text(response).await?;
log::debug!("New chat response: {}", &body_text);
let chat_response: GetResponse = serde_json::from_str(&body_text)
.map_err(|e| format!("Failed to parse response JSON: {}", e))?;
if chat_response.result != "created" {
return Err(format!("Unexpected result: {}", chat_response.result));
if response.status() == 429 {
log::warn!("Rate limit exceeded for chat create");
return Err("Rate limited".to_string());
}
Ok(chat_response)
if !response.status().is_success() {
return Err(format!("Request failed with status: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
log::info!("client_id_create: {}", &client_id);
while let Ok(Some(line)) = lines.next_line().await {
log::info!("Received chat stream line: {}", &line);
if let Err(err) = app_handle.emit(&client_id, line) {
log::error!("Emit failed: {:?}", err);
let _ = app_handle.emit("chat-create-error", format!("Emit failed: {:?}", err));
}
}
Ok(())
}
#[tauri::command]
pub async fn send_message<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn chat_chat(
app_handle: AppHandle,
server_id: String,
websocket_id: String,
session_id: String,
message: String,
message: Option<String>,
attachments: Option<Vec<String>>,
query_params: Option<HashMap<String, Value>>, //search,deep_thinking
) -> Result<String, String> {
let path = format!("/chat/{}/_send", session_id);
let msg = ChatRequestMessage {
message: Some(message),
client_id: String,
) -> Result<(), String> {
println!("chat_chat message: {:?}", message);
println!("chat_chat attachments: {:?}", attachments);
let message_empty = message.as_ref().map_or(true, |m| m.is_empty());
let attachments_empty = attachments.as_ref().map_or(true, |a| a.is_empty());
if message_empty && attachments_empty {
return Err("Message and attachments are empty".to_string());
}
let body = {
let request_message = ChatRequestMessage {
message,
attachments,
};
println!("chat_chat body: {:?}", request_message);
Some(
serde_json::to_string(&request_message)
.map_err(|e| format!("Failed to serialize message: {}", e))?
.into(),
)
};
let mut headers = HashMap::new();
headers.insert("WEBSOCKET-SESSION-ID".to_string(), websocket_id.into());
let path = format!("/chat/{}/_chat", session_id);
let body = reqwest::Body::from(serde_json::to_string(&msg).unwrap());
let response = HttpClient::advanced_post(
&server_id,
path.as_str(),
Some(headers),
None,
convert_query_params_to_strings(query_params),
Some(body),
body,
)
.await
.map_err(|e| format!("Error cancel session: {}", e))?;
.map_err(|e| format!("Error sending message: {}", e))?;
common::http::get_response_body_text(response).await
if response.status() == 429 {
log::warn!("Rate limit exceeded for chat create");
return Err("Rate limited".to_string());
}
if !response.status().is_success() {
return Err(format!("Request failed with status: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
let mut first_log = true;
log::info!("client_id: {}", &client_id);
while let Ok(Some(line)) = lines.next_line().await {
log::info!("Received chat stream line: {}", &line);
if first_log {
log::info!("first stream line: {}", &line);
first_log = false;
}
if let Err(err) = app_handle.emit(&client_id, line) {
log::error!("Emit failed: {:?}", err);
print!("Error sending message: {:?}", err);
let _ = app_handle.emit("chat-create-error", format!("Emit failed: {:?}", err));
}
}
Ok(())
}
#[tauri::command]
@@ -232,8 +313,8 @@ pub async fn update_session_chat(
}
#[tauri::command]
pub async fn assistant_search<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn assistant_search(
_app_handle: AppHandle,
server_id: String,
query_params: Option<Vec<String>>,
) -> Result<Value, String> {
@@ -248,8 +329,8 @@ pub async fn assistant_search<R: Runtime>(
}
#[tauri::command]
pub async fn assistant_get<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn assistant_get(
_app_handle: AppHandle,
server_id: String,
assistant_id: String,
) -> Result<Value, String> {
@@ -272,8 +353,8 @@ pub async fn assistant_get<R: Runtime>(
///
/// Returns as soon as the assistant is found on any Coco server.
#[tauri::command]
pub async fn assistant_get_multi<R: Runtime>(
app_handle: AppHandle<R>,
pub async fn assistant_get_multi(
app_handle: AppHandle,
assistant_id: String,
) -> Result<Value, String> {
let search_sources = app_handle.state::<SearchSourceRegistry>();
@@ -366,8 +447,8 @@ pub fn remove_icon_fields(json: &str) -> String {
}
#[tauri::command]
pub async fn ask_ai<R: Runtime>(
app_handle: AppHandle<R>,
pub async fn ask_ai(
app_handle: AppHandle,
message: String,
server_id: String,
assistant_id: String,

View File

@@ -1,15 +1,15 @@
use std::{fs::create_dir, io::Read};
use tauri::{Manager, Runtime};
use tauri::{AppHandle, Manager};
use tauri_plugin_autostart::ManagerExt;
/// If the state reported from the OS and the state stored by us differ, our state is
/// prioritized and seen as the correct one. Update the OS state to make them consistent.
pub fn ensure_autostart_state_consistent(app: &mut tauri::App) -> Result<(), String> {
let autostart_manager = app.autolaunch();
pub fn ensure_autostart_state_consistent(tauri_app_handle: &AppHandle) -> Result<(), String> {
let autostart_manager = tauri_app_handle.autolaunch();
let os_state = autostart_manager.is_enabled().map_err(|e| e.to_string())?;
let coco_stored_state = current_autostart(app.app_handle()).map_err(|e| e.to_string())?;
let coco_stored_state = current_autostart(tauri_app_handle).map_err(|e| e.to_string())?;
if os_state != coco_stored_state {
log::warn!(
@@ -42,7 +42,7 @@ pub fn ensure_autostart_state_consistent(app: &mut tauri::App) -> Result<(), Str
Ok(())
}
fn current_autostart<R: Runtime>(app: &tauri::AppHandle<R>) -> Result<bool, String> {
fn current_autostart(app: &tauri::AppHandle) -> Result<bool, String> {
use std::fs::File;
let path = app.path().app_config_dir().unwrap();
@@ -65,10 +65,7 @@ fn current_autostart<R: Runtime>(app: &tauri::AppHandle<R>) -> Result<bool, Stri
}
#[tauri::command]
pub async fn change_autostart<R: Runtime>(
app: tauri::AppHandle<R>,
open: bool,
) -> Result<(), String> {
pub async fn change_autostart(app: tauri::AppHandle, open: bool) -> Result<(), String> {
use std::fs::File;
use std::io::Write;

View File

@@ -3,7 +3,10 @@ use serde_json::Value;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ChatRequestMessage {
#[serde(skip_serializing_if = "Option::is_none")]
pub message: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub attachments: Option<Vec<String>>,
}
#[allow(dead_code)]
@@ -30,4 +33,4 @@ pub struct Session {
#[derive(Debug, Serialize, Deserialize)]
pub struct SessionContext {
pub attachments: Option<Vec<String>>,
}
}
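
Because both fields carry `skip_serializing_if = "Option::is_none"`, a `None` field is simply dropped from the JSON body sent to the Coco server. A small self-contained sketch (the struct is duplicated here only so the snippet compiles on its own):

```rust
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ChatRequestMessage {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub message: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub attachments: Option<Vec<String>>,
}

fn main() {
    let msg = ChatRequestMessage {
        message: Some("hello".to_string()),
        attachments: None,
    };
    // `attachments` is None, so it is omitted: {"message":"hello"}
    println!("{}", serde_json::to_string(&msg).unwrap());
}
```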

View File

@@ -1,6 +1,6 @@
use serde::{Deserialize, Serialize};
#[derive(Debug,Clone, Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Connector {
pub id: String,
pub created: Option<String>,
@@ -13,7 +13,7 @@ pub struct Connector {
pub url: Option<String>,
pub assets: Option<ConnectorAssets>,
}
#[derive(Debug,Clone, Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ConnectorAssets {
pub icons: Option<std::collections::HashMap<String, String>>,
}
}

View File

@@ -18,4 +18,4 @@ pub struct DataSource {
pub struct ConnectorConfig {
pub id: Option<String>,
pub config: Option<serde_json::Value>, // Using serde_json::Value to handle any type of config
}
}

View File

@@ -1,5 +1,9 @@
#[cfg(target_os = "macos")]
use crate::extension::built_in::window_management::actions::Action;
use crate::extension::{ExtensionPermission, ExtensionSettings};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use tauri::{AppHandle, Emitter};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RichLabel {
@@ -29,17 +33,63 @@ pub struct EditorInfo {
pub timestamp: Option<String>,
}
/// Defines the action that would be performed when a document gets opened.
/// Defines the action that would be performed when a [document](Document) gets opened.
///
/// "Document" is a uniform type that the backend uses to send the search results
/// back to the frontend. Since Coco can search many sources, "Document" can
/// represent different things, application, web page, local file, extensions, and
/// so on. Each has its own specific open action.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub(crate) enum OnOpened {
/// Launch the application
Application { app_path: String },
/// Open the URL.
Document { url: String },
/// Perform this WM action.
#[cfg(target_os = "macos")]
WindowManagementAction { action: Action },
/// The document is an extension.
Extension(ExtensionOnOpened),
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub(crate) struct ExtensionOnOpened {
/// Different types of extensions have different open behaviors.
pub(crate) ty: ExtensionOnOpenedType,
/// Extensions settings. Some could affect open action.
///
/// Optional because not all extensions have their settings.
pub(crate) settings: Option<ExtensionSettings>,
/// Permission needed by this extension.
///
/// We run a permission check when opening this extension. Currently, we only
/// do this for View extensions.
pub(crate) permission: Option<ExtensionPermission>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub(crate) enum ExtensionOnOpenedType {
/// Spawn a child process to run the `CommandAction`.
Command {
action: crate::extension::CommandAction,
},
/// Open the `link`.
//
// NOTE that this variant has the same definition as `struct Quicklink`, but we
// cannot use it directly, its `link` field should be deserialized/serialized
// from/to a string, but we need a JSON object here.
//
// See also the comments in `struct Quicklink`.
Quicklink {
link: crate::extension::QuicklinkLink,
open_with: Option<String>,
},
View {
/// Path to the HTML file that coco will load and render.
///
/// It should be an absolute path or Tauri cannot open it.
page: String,
},
}
impl OnOpened {
@@ -47,64 +97,158 @@ impl OnOpened {
match self {
Self::Application { app_path } => app_path.clone(),
Self::Document { url } => url.clone(),
Self::Command { action } => {
const WHITESPACE: &str = " ";
let mut ret = action.exec.clone();
ret.push_str(WHITESPACE);
if let Some(ref args) = action.args {
ret.push_str(args.join(WHITESPACE).as_str());
}
#[cfg(target_os = "macos")]
Self::WindowManagementAction { action: _ } => {
// We don't have URL for this
String::from("N/A")
}
Self::Extension(ext_on_opened) => {
match &ext_on_opened.ty {
ExtensionOnOpenedType::Command { action } => {
const WHITESPACE: &str = " ";
let mut ret = action.exec.clone();
ret.push_str(WHITESPACE);
if let Some(ref args) = action.args {
ret.push_str(args.join(WHITESPACE).as_str());
}
ret
ret
}
// Currently, our URL is static and does not support dynamic parameters.
// The URL of a quicklink is nearly useless without such dynamic user
// inputs, so until we have dynamic URL support, we just use "N/A".
ExtensionOnOpenedType::Quicklink { .. } => String::from("N/A"),
ExtensionOnOpenedType::View { page: _ } => {
// We currently don't have URL for this kind of extension.
String::from("N/A")
}
}
}
}
}
}
#[tauri::command]
pub(crate) async fn open(on_opened: OnOpened) -> Result<(), String> {
log::debug!("open({})", on_opened.url());
pub(crate) async fn open(
tauri_app_handle: AppHandle,
on_opened: OnOpened,
extra_args: Option<HashMap<String, String>>,
) -> Result<(), String> {
use crate::util::open as homemade_tauri_shell_open;
use crate::GLOBAL_TAURI_APP_HANDLE;
use std::process::Command;
let global_tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
match on_opened {
OnOpened::Application { app_path } => {
homemade_tauri_shell_open(global_tauri_app_handle.clone(), app_path).await?
log::debug!("open application [{}]", app_path);
homemade_tauri_shell_open(tauri_app_handle.clone(), app_path).await?
}
OnOpened::Document { url } => {
homemade_tauri_shell_open(global_tauri_app_handle.clone(), url).await?
}
OnOpened::Command { action } => {
let mut cmd = Command::new(action.exec);
if let Some(args) = action.args {
cmd.args(args);
}
let output = cmd.output().map_err(|e| e.to_string())?;
// Sometimes, we want to see the result in logs even though it doesn't fail.
log::debug!(
"executing open(Command) result, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
if !output.status.success() {
log::warn!(
"executing open(Command) failed, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
log::debug!("open document [{}]", url);
return Err(format!(
"Command failed, stderr [{}]",
String::from_utf8_lossy(&output.stderr)
));
homemade_tauri_shell_open(tauri_app_handle.clone(), url).await?
}
#[cfg(target_os = "macos")]
OnOpened::WindowManagementAction { action } => {
log::debug!("perform Window Management action [{:?}]", action);
crate::extension::built_in::window_management::perform_action_on_main_thread(
&tauri_app_handle,
action,
)?;
}
OnOpened::Extension(ext_on_opened) => {
// Apply the settings that would affect open behavior
if let Some(settings) = ext_on_opened.settings {
if let Some(should_hide) = settings.hide_before_open {
if should_hide {
crate::hide_coco(tauri_app_handle.clone()).await;
}
}
}
let permission = ext_on_opened.permission;
match ext_on_opened.ty {
ExtensionOnOpenedType::Command { action } => {
log::debug!("open (execute) command [{:?}]", action);
let mut cmd = Command::new(action.exec);
if let Some(args) = action.args {
cmd.args(args);
}
let output = cmd.output().map_err(|e| e.to_string())?;
// Sometimes, we want to see the result in logs even though it doesn't fail.
log::debug!(
"executing open(Command) result, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
if !output.status.success() {
log::warn!(
"executing open(Command) failed, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
return Err(format!(
"Command failed, stderr [{}]",
String::from_utf8_lossy(&output.stderr)
));
}
}
ExtensionOnOpenedType::Quicklink {
link,
open_with: opt_open_with,
} => {
let url = link.concatenate_url(&extra_args);
log::debug!("open quicklink [{}] with [{:?}]", url, opt_open_with);
cfg_if::cfg_if! {
// The `open_with` functionality is only supported on macOS, provided
// by the `open -a` command.
if #[cfg(target_os = "macos")] {
let mut cmd = Command::new("open");
if let Some(ref open_with) = opt_open_with {
cmd.arg("-a");
cmd.arg(open_with.as_str());
}
cmd.arg(&url);
let output = cmd.output().map_err(|e| format!("failed to spawn [open] due to error [{}]", e))?;
if !output.status.success() {
return Err(format!(
"failed to open with app {:?}: {}",
opt_open_with,
String::from_utf8_lossy(&output.stderr)
));
}
} else {
homemade_tauri_shell_open(tauri_app_handle.clone(), url).await?
}
}
}
ExtensionOnOpenedType::View { page } => {
/*
* Emit an event to let the frontend code open this extension.
*
* Payload `page_and_permission` contains the information needed
* to do that.
*
* See "src/pages/main/index.tsx" for more info.
*/
use serde_json::Value as Json;
use serde_json::to_value;
let page_and_permission: [Json; 2] =
[Json::String(page), to_value(permission).unwrap()];
tauri_app_handle
.emit("open_view_extension", page_and_permission)
.unwrap();
}
}
}
}
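
Since `OnOpened` derives `Serialize`/`Deserialize` without serde container attributes, the frontend invokes the `open` command using serde's default externally tagged enum encoding. A hedged, simplified sketch of what a payload could look like (only two variants reproduced, purely for illustration):

```rust
use serde::Deserialize;

// Simplified stand-in for the `OnOpened` enum above so this sketch compiles
// on its own; the real type has more variants and crate-private visibility.
#[derive(Debug, Deserialize)]
enum OnOpened {
    Application { app_path: String },
    Document { url: String },
}

fn main() {
    // Externally tagged representation: the variant name wraps its fields.
    // The path below is illustrative, not from the repository.
    let payload = r#"{ "Application": { "app_path": "/Applications/Safari.app" } }"#;
    let on_opened: OnOpened = serde_json::from_str(payload).unwrap();
    println!("{:?}", on_opened);
}
```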

View File

@@ -1,8 +1,22 @@
use serde::{Deserialize, Serialize};
use reqwest::StatusCode;
use serde::{Deserialize, Serialize, Serializer};
use thiserror::Error;
fn serialize_optional_status_code<S>(
status_code: &Option<StatusCode>,
serializer: S,
) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
match status_code {
Some(code) => serializer.serialize_str(&format!("{:?}", code)),
None => serializer.serialize_none(),
}
}
#[allow(unused)]
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
pub struct ErrorCause {
#[serde(default)]
pub r#type: Option<String>,
@@ -11,7 +25,7 @@ pub struct ErrorCause {
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
#[allow(unused)]
pub struct ErrorDetail {
#[serde(default)]
pub root_cause: Option<Vec<ErrorCause>>,
@@ -24,18 +38,22 @@ pub struct ErrorDetail {
}
#[derive(Debug, Deserialize)]
#[allow(dead_code)]
pub struct ErrorResponse {
#[serde(default)]
pub error: Option<ErrorDetail>,
#[serde(default)]
#[allow(unused)]
pub status: Option<u16>,
}
#[derive(Debug, Error, Serialize)]
pub enum SearchError {
#[error("HttpError: {0}")]
HttpError(String),
#[error("HttpError: status code [{status_code:?}], msg [{msg}]")]
HttpError {
#[serde(serialize_with = "serialize_optional_status_code")]
status_code: Option<StatusCode>,
msg: String,
},
#[error("ParseError: {0}")]
ParseError(String),
@@ -43,12 +61,7 @@ pub enum SearchError {
#[error("Timeout occurred")]
Timeout,
#[error("UnknownError: {0}")]
#[allow(dead_code)]
Unknown(String),
#[error("InternalError: {0}")]
#[allow(dead_code)]
InternalError(String),
}
@@ -59,7 +72,10 @@ impl From<reqwest::Error> for SearchError {
} else if err.is_decode() {
SearchError::ParseError(err.to_string())
} else {
SearchError::HttpError(err.to_string())
SearchError::HttpError {
status_code: err.status(),
msg: err.to_string(),
}
}
}
}

View File

@@ -38,7 +38,6 @@ pub async fn get_response_body_text(response: Response) -> Result<String, String
return Err(fallback_error);
}
match serde_json::from_str::<common::error::ErrorResponse>(&body) {
Ok(parsed_error) => {
dbg!(&parsed_error);
@@ -57,7 +56,6 @@ pub async fn get_response_body_text(response: Response) -> Result<String, String
}
}
pub fn convert_query_params_to_strings(
query_params: Option<HashMap<String, JsonValue>>,
) -> Option<Vec<String>> {
@@ -68,13 +66,10 @@ pub fn convert_query_params_to_strings(
JsonValue::Number(n) => Some(format!("{}={}", k, n)),
JsonValue::Bool(b) => Some(format!("{}={}", k, b)),
_ => {
eprintln!(
"Skipping unsupported query value for key '{}': {:?}",
k, v
);
eprintln!("Skipping unsupported query value for key '{}': {:?}", k, v);
None
}
})
.collect()
})
}
}

View File

@@ -13,4 +13,4 @@ pub struct UserProfile {
pub email: String,
pub avatar: Option<String>,
pub preferences: Option<Preferences>,
}
}

View File

@@ -7,8 +7,8 @@ use std::error::Error;
#[derive(Debug, Serialize, Deserialize)]
pub struct SearchResponse<T> {
pub took: u64,
pub timed_out: bool,
pub took: Option<u64>,
pub timed_out: Option<bool>,
pub _shards: Option<Shards>,
pub hits: Hits<T>,
}
@@ -83,20 +83,6 @@ where
.collect())
}
#[allow(dead_code)]
pub async fn parse_search_results_with_score<T>(
response: Response,
) -> Result<Vec<(T, Option<f64>)>, Box<dyn Error>>
where
T: for<'de> Deserialize<'de> + std::fmt::Debug,
{
Ok(parse_search_hits(response)
.await?
.into_iter()
.map(|hit| (hit._source, hit._score))
.collect())
}
#[derive(Debug, Clone, Serialize)]
pub struct SearchQuery {
pub from: u64,

View File

@@ -50,9 +50,17 @@ pub struct Server {
pub updated: String,
#[serde(default = "default_enabled_type")]
pub enabled: bool,
/// Public Coco servers can be used without signing in.
#[serde(default = "default_bool_type")]
pub public: bool,
/// A Coco server is available if:
///
/// 1. It is still online; we check this via the `GET /base_url/provider/_info`
/// interface.
/// 2. A user is logged in to this Coco server, i.e., a token is stored in the
/// `SERVER_TOKEN_LIST_CACHE`.
/// For public Coco servers, requirement 2 is not needed.
#[serde(default = "default_available_type")]
pub available: bool,
@@ -84,7 +92,10 @@ pub struct ServerAccessToken {
#[serde(default = "default_empty_string")] // Custom default function for empty string
pub id: String,
pub access_token: String,
pub expired_at: u32, //unix timestamp in seconds
/// Unix timestamp in seconds
///
/// Currently, this is UNUSED.
pub expired_at: u32,
}
impl ServerAccessToken {

View File

@@ -2,10 +2,15 @@ use crate::common::error::SearchError;
use crate::common::search::SearchQuery;
use crate::common::search::{QueryResponse, QuerySource};
use async_trait::async_trait;
use tauri::AppHandle;
#[async_trait]
pub trait SearchSource: Send + Sync {
fn get_type(&self) -> QuerySource;
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError>;
async fn search(
&self,
tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError>;
}
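
Any new search source now plugs in by implementing this trait with the extra `AppHandle` parameter. A bare, hypothetical skeleton against the crate's own types (bodies elided):

```rust
use async_trait::async_trait;
use tauri::AppHandle;

// Hypothetical search source; `QuerySource`, `SearchQuery`, `QueryResponse`,
// and `SearchError` are the crate types referenced above.
struct MySearchSource;

#[async_trait]
impl SearchSource for MySearchSource {
    fn get_type(&self) -> QuerySource {
        // Describe this source (id, name, ...) so results can be attributed to it.
        todo!()
    }

    async fn search(
        &self,
        _tauri_app_handle: AppHandle,
        _query: SearchQuery,
    ) -> Result<QueryResponse, SearchError> {
        // Run the query against the backing data; map failures to `SearchError`.
        todo!()
    }
}
```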

View File

@@ -0,0 +1,5 @@
# Complete Coco extension API list grouped by its category.
fs = [
"read_dir"
]

View File

@@ -0,0 +1,22 @@
//! File system APIs
use tokio::fs::read_dir as tokio_read_dir;
#[tauri::command]
pub(crate) async fn read_dir(path: String) -> Result<Vec<String>, String> {
let mut iter = tokio_read_dir(path).await.map_err(|e| e.to_string())?;
let mut file_names = Vec::new();
loop {
let opt_entry = iter.next_entry().await.map_err(|e| e.to_string())?;
let Some(entry) = opt_entry else {
break;
};
let file_name = entry.file_name().to_string_lossy().into_owned();
file_names.push(file_name);
}
Ok(file_names)
}

View File

@@ -0,0 +1,21 @@
//! The Rust implementation of the Coco extension APIs.
//!
//! Extension developers do not use these Rust APIs directly; they use our
//! [Typescript library][ts_lib], which eventually calls these APIs.
//!
//! [ts_lib]: https://github.com/infinilabs/coco-api
pub(crate) mod fs;
use std::collections::HashMap;
/// Return all the available APIs grouped by their category.
#[tauri::command]
pub(crate) fn apis() -> HashMap<String, Vec<String>> {
static APIS_TOML: &str = include_str!("./apis.toml");
let apis: HashMap<String, Vec<String>> =
toml::from_str(APIS_TOML).expect("Failed to parse apis.toml file");
apis
}
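
As a rough illustration (not the actual permission-check code), a requested API string such as `fs:read_dir` from a plugin manifest could be validated against the map returned by `apis()` by splitting it into category and name:

```rust
use std::collections::HashMap;

// Hypothetical helper: returns true if a "category:name" API string
// (e.g. "fs:read_dir") exists in the map produced by `apis()` above.
fn is_known_api(apis: &HashMap<String, Vec<String>>, requested: &str) -> bool {
    match requested.split_once(':') {
        Some((category, name)) => apis
            .get(category)
            .map_or(false, |names| names.iter().any(|n| n == name)),
        None => false,
    }
}

fn main() {
    let mut apis = HashMap::new();
    apis.insert("fs".to_string(), vec!["read_dir".to_string()]);

    assert!(is_known_api(&apis, "fs:read_dir"));
    assert!(!is_known_api(&apis, "fs:remove_file"));
}
```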

View File

@@ -14,6 +14,8 @@ pub use without_feature::*;
#[derive(Debug, Serialize, Clone)]
#[serde(rename_all = "camelCase")]
#[allow(dead_code)]
pub struct AppEntry {
path: String,
name: String,
@@ -45,4 +47,4 @@ pub(crate) const PLUGIN_JSON_FILE: &str = r#"
"type": "group",
"enabled": true
}
"#;
"#;

View File

@@ -1,8 +1,9 @@
use super::super::Extension;
use super::super::pizza_engine_runtime::RUNTIME_TX;
use super::super::pizza_engine_runtime::SearchSourceState;
use super::super::pizza_engine_runtime::Task;
use super::super::pizza_engine_runtime::RUNTIME_TX;
use super::super::Extension;
use super::AppMetadata;
use crate::GLOBAL_TAURI_APP_HANDLE;
use crate::common::document::{DataSourceReference, Document, OnOpened};
use crate::common::error::SearchError;
use crate::common::search::{QueryResponse, QuerySource, SearchQuery};
@@ -10,7 +11,6 @@ use crate::common::traits::SearchSource;
use crate::extension::ExtensionType;
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::util::open;
use crate::GLOBAL_TAURI_APP_HANDLE;
use applications::{App, AppTrait};
use async_trait::async_trait;
use log::{error, warn};
@@ -23,12 +23,12 @@ use pizza_engine::error::PizzaEngineError;
use pizza_engine::search::{OriginalQuery, QueryContext, SearchResult, Searcher};
use pizza_engine::store::{DiskStore, DiskStoreSnapshot};
use pizza_engine::writer::Writer;
use pizza_engine::{doc, Engine, EngineBuilder};
use pizza_engine::{Engine, EngineBuilder, doc};
use serde_json::Value as Json;
use std::path::Path;
use std::path::PathBuf;
use tauri::{async_runtime, AppHandle, Manager, Runtime};
use tauri_plugin_fs_pro::{icon, metadata, name, IconOptions};
use tauri::{AppHandle, Manager, async_runtime};
use tauri_plugin_fs_pro::{IconOptions, icon, metadata};
use tauri_plugin_global_shortcut::GlobalShortcutExt;
use tauri_plugin_global_shortcut::Shortcut;
use tauri_plugin_global_shortcut::ShortcutEvent;
@@ -36,7 +36,13 @@ use tauri_plugin_global_shortcut::ShortcutState;
use tauri_plugin_store::StoreExt;
use tokio::sync::oneshot::Sender as OneshotSender;
// Deprecated. We no longer index this field, but to be backward-compatible, we
// have to keep it.
const FIELD_APP_NAME: &str = "app_name";
const FIELD_APP_NAME_IN_SYSTEM_LANG: &str = "app_name_in_system_lang";
const FIELD_APP_NAME_ZH: &str = "app_name_zh";
const FIELD_APP_NAME_EN: &str = "app_name_en";
const FIELD_ICON_PATH: &str = "icon_path";
const FIELD_APP_ALIAS: &str = "app_alias";
const APPLICATION_SEARCH_SOURCE_ID: &str = "application";
@@ -58,37 +64,18 @@ const INDEX_DIR: &str = "local_application_index";
pub(crate) const QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME: &str = "Applications";
pub fn get_default_search_paths() -> Vec<String> {
#[cfg(target_os = "macos")]
{
let home_dir =
PathBuf::from(std::env::var_os("HOME").expect("environment variable $HOME not found"));
return vec![
"/Applications".into(),
"/System/Applications".into(),
"/System/Library/CoreServices".into(),
home_dir
.join("Applications")
.into_os_string()
.into_string()
.expect("this path should be UTF-8 encoded"),
];
let paths = applications::get_default_search_paths();
let mut ret = Vec::with_capacity(paths.len());
for search_path in paths {
let path_string = search_path
.into_os_string()
.into_string()
.expect("path should be UTF-8 encoded");
ret.push(path_string);
}
#[cfg(not(target_os = "macos"))]
{
let paths = applications::get_default_search_paths();
let mut ret = Vec::with_capacity(paths.len());
for search_path in paths {
let path_string = search_path
.into_os_string()
.into_string()
.expect("path should be UTF-8 encoded");
ret.push(path_string);
}
ret
}
ret
}
/// Helper function to return `app`'s path.
@@ -115,26 +102,63 @@ fn get_app_path(app: &App) -> String {
.expect("should be UTF-8 encoded")
}
/// Helper function to return `app`'s path.
///
/// * macOS: extract `app_path`'s file name and remove the file extension
/// * Windows/Linux: return the name specified in `.desktop` file
async fn get_app_name(app: &App) -> String {
if cfg!(any(target_os = "linux", target_os = "windows")) {
app.name.clone()
/// Helper function to return `app`'s Chinese name.
async fn get_app_name_zh(app: &App) -> String {
// zh_CN or zh-CN
if let Some(name) = app.localized_app_names.get("zh_CN") {
return name.clone();
}
if let Some(name) = app.localized_app_names.get("zh-CN") {
return name.clone();
}
// zh_Hans or zh-Hans
if let Some(name) = app.localized_app_names.get("zh_Hans") {
return name.clone();
}
if let Some(name) = app.localized_app_names.get("zh-Hans") {
return name.clone();
}
// Fall back to base name
app.name.clone()
}
/// Helper function to return `app`'s English name.
async fn get_app_name_en(app: &App) -> String {
// en_US or en-US
if let Some(name) = app.localized_app_names.get("en_US") {
return name.clone();
}
if let Some(name) = app.localized_app_names.get("en-US") {
return name.clone();
}
// English (General)
if let Some(name) = app.localized_app_names.get("en") {
return name.clone();
}
// Fall back to base name
app.name.clone()
}
/// Helper function to return `app`'s name in system language.
async fn get_app_name_in_system_lang(app: &App) -> String {
let system_lang = crate::util::system_lang::get_system_lang();
if let Some(name) = app.localized_app_names.get(&system_lang) {
name.clone()
} else {
let app_path = get_app_path(app);
name(app_path.into()).await
// Fall back to base name
app.name.clone()
}
}
/// Helper function to return an absolute path to `app`'s icon.
///
/// On macOS/Windows, we cache icons in our data directory using the `icon()` function.
async fn get_app_icon_path<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
app: &App,
) -> Result<String, String> {
async fn get_app_icon_path(tauri_app_handle: &AppHandle, app: &App) -> Result<String, String> {
let res_path = if cfg!(target_os = "linux") {
let icon_path = app
.icon_path
@@ -213,8 +237,8 @@ impl SearchSourceState for ApplicationSearchSourceState {
}
/// Index applications if they have not been indexed (by checking if `app_index_dir` exists).
async fn index_applications_if_not_indexed<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
async fn index_applications_if_not_indexed(
tauri_app_handle: &AppHandle,
app_index_dir: &Path,
) -> anyhow::Result<ApplicationSearchSourceState> {
let index_exists = app_index_dir.exists();
@@ -224,9 +248,17 @@ async fn index_applications_if_not_indexed<R: Runtime>(
pizza_engine_builder.set_data_store(disk_store);
let mut schema = Schema::new();
let field_app_name = Property::builder(FieldType::Text).build();
let field_app_name_zh = Property::builder(FieldType::Text).build();
schema
.add_property(FIELD_APP_NAME, field_app_name)
.add_property(FIELD_APP_NAME_ZH, field_app_name_zh)
.expect("no collision could happen");
let field_app_name_en = Property::builder(FieldType::Text).build();
schema
.add_property(FIELD_APP_NAME_EN, field_app_name_en)
.expect("no collision could happen");
let field_app_name_in_system_lang = Property::builder(FieldType::Text).build();
schema
.add_property(FIELD_APP_NAME_IN_SYSTEM_LANG, field_app_name_in_system_lang)
.expect("no collision could happen");
let property_icon = Property::builder(FieldType::Text).index(false).build();
schema
@@ -245,26 +277,65 @@ async fn index_applications_if_not_indexed<R: Runtime>(
let mut writer = pizza_engine.acquire_writer();
if !index_exists {
let default_search_path = get_default_search_paths();
let apps = list_app_in(default_search_path).map_err(|str| anyhow::anyhow!(str))?;
let search_path = {
let disabled_app_list_and_search_path_store =
tauri_app_handle.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)?;
let search_path_json = disabled_app_list_and_search_path_store
.get(TAURI_STORE_KEY_SEARCH_PATH)
.unwrap_or_else(|| {
panic!("search path should be persisted in the store, but it is not, plz ensure that the store gets initialized before calling this function")
});
let search_path: Vec<String> = match search_path_json {
Json::Array(array) => array
.into_iter()
.map(|json| match json {
Json::String(str) => str,
_ => unreachable!("search path is stored in a string"),
})
.collect(),
_ => unreachable!("search path is stored in an array"),
};
search_path
};
let apps = list_app_in(search_path).map_err(|str| anyhow::anyhow!(str))?;
for app in apps.iter() {
let app_path = get_app_path(app);
let app_name = get_app_name(app).await;
let app_name_zh = get_app_name_zh(app).await;
let app_name_en = get_app_name_en(app).await;
let app_name_in_system_lang = get_app_name_in_system_lang(app).await;
let app_icon_path = get_app_icon_path(&tauri_app_handle, app)
.await
.map_err(|str| anyhow::anyhow!(str))?;
let app_alias = get_app_alias(&tauri_app_handle, &app_path).unwrap_or(String::new());
if app_name.is_empty() || app_name.eq(&tauri_app_handle.package_info().name) {
// Skip if all names are empty
if app_name_zh.is_empty()
&& app_name_en.is_empty()
&& app_name_in_system_lang.is_empty()
{
continue;
}
// Skip if this is Coco itself
//
// Coco does not have localized app names, so app_name_en and app_name_zh
// should both have value "Coco-AI", so either should work.
if app_name_en == tauri_app_handle.package_info().name {
continue;
}
// You cannot write `app_name.clone()` within the `doc!()` macro, we should fix this.
let app_name_clone = app_name.clone();
let app_name_zh_clone = app_name_zh.clone();
let app_name_en_clone = app_name_en.clone();
let app_name_in_system_lang = app_name_in_system_lang.clone();
let app_path_clone = app_path.clone();
let document = doc!( app_path_clone, {
FIELD_APP_NAME => app_name_clone,
FIELD_APP_NAME_ZH => app_name_zh_clone,
FIELD_APP_NAME_EN => app_name_en_clone,
FIELD_APP_NAME_IN_SYSTEM_LANG => app_name_in_system_lang,
FIELD_ICON_PATH => app_icon_path,
FIELD_APP_ALIAS => app_alias,
}
@@ -273,8 +344,9 @@ async fn index_applications_if_not_indexed<R: Runtime>(
// We don't error out because one failure won't break the whole thing
if let Err(e) = writer.create_document(document).await {
warn!(
"failed to index application [app name: '{}', app path: '{}'] due to error [{}]", app_name, app_path, e
)
"failed to index application [app name zh: '{}', app name en: '{}', app path: '{}'] due to error [{}]",
app_name_zh, app_name_en, app_path, e
)
}
}
@@ -293,13 +365,13 @@ async fn index_applications_if_not_indexed<R: Runtime>(
}
/// Upon application start, index all the applications found in `get_default_search_paths()`.
struct IndexAllApplicationsTask<R: Runtime> {
tauri_app_handle: AppHandle<R>,
struct IndexAllApplicationsTask {
tauri_app_handle: AppHandle,
callback: Option<tokio::sync::oneshot::Sender<Result<(), String>>>,
}
#[async_trait::async_trait(?Send)]
impl<R: Runtime> Task for IndexAllApplicationsTask<R> {
impl Task for IndexAllApplicationsTask {
fn search_source_id(&self) -> &'static str {
APPLICATION_SEARCH_SOURCE_ID
}
@@ -321,13 +393,13 @@ impl<R: Runtime> Task for IndexAllApplicationsTask<R> {
}
}
struct ReindexAllApplicationsTask<R: Runtime> {
tauri_app_handle: AppHandle<R>,
struct ReindexAllApplicationsTask {
tauri_app_handle: AppHandle,
callback: Option<tokio::sync::oneshot::Sender<Result<(), String>>>,
}
#[async_trait::async_trait(?Send)]
impl<R: Runtime> Task for ReindexAllApplicationsTask<R> {
impl Task for ReindexAllApplicationsTask {
fn search_source_id(&self) -> &'static str {
APPLICATION_SEARCH_SOURCE_ID
}
@@ -355,14 +427,14 @@ impl<R: Runtime> Task for ReindexAllApplicationsTask<R> {
}
}
struct SearchApplicationsTask<R: Runtime> {
tauri_app_handle: AppHandle<R>,
struct SearchApplicationsTask {
tauri_app_handle: AppHandle,
query_string: String,
callback: Option<OneshotSender<Result<SearchResult, PizzaEngineError>>>,
}
#[async_trait::async_trait(?Send)]
impl<R: Runtime> Task for SearchApplicationsTask<R> {
impl Task for SearchApplicationsTask {
fn search_source_id(&self) -> &'static str {
APPLICATION_SEARCH_SOURCE_ID
}
@@ -380,7 +452,9 @@ impl<R: Runtime> Task for SearchApplicationsTask<R> {
let rx_dropped_error = callback.send(Ok(empty_hits)).is_err();
if rx_dropped_error {
warn!("failed to send local app search result back because the corresponding channel receiver end has been unexpected dropped, which could happen due to a low query timeout")
warn!(
"failed to send local app search result back because the corresponding channel receiver end has been unexpected dropped, which could happen due to a low query timeout"
)
}
return;
@@ -400,8 +474,20 @@ impl<R: Runtime> Task for SearchApplicationsTask<R> {
//
// It will be passed to Pizza like "Google\nChrome". Using the Display impl would result
// in an invalid query DSL, and serde would complain.
//
// In order to be backward compatible, we still run match and prefix queries against the
// app_name field.
let dsl = format!(
"{{ \"query\": {{ \"bool\": {{ \"should\": [ {{ \"match\": {{ \"{FIELD_APP_NAME}\": {:?} }} }}, {{ \"prefix\": {{ \"{FIELD_APP_NAME}\": {:?} }} }} ] }} }} }}", self.query_string, self.query_string);
"{{ \"query\": {{ \"bool\": {{ \"should\": [ {{ \"match\": {{ \"{FIELD_APP_NAME_ZH}\": {:?} }} }}, {{ \"prefix\": {{ \"{FIELD_APP_NAME_ZH}\": {:?} }} }}, {{ \"match\": {{ \"{FIELD_APP_NAME_EN}\": {:?} }} }}, {{ \"prefix\": {{ \"{FIELD_APP_NAME_EN}\": {:?} }} }}, {{ \"match\": {{ \"{FIELD_APP_NAME_IN_SYSTEM_LANG}\": {:?} }} }}, {{ \"prefix\": {{ \"{FIELD_APP_NAME_IN_SYSTEM_LANG}\": {:?} }} }}, {{ \"match\": {{ \"{FIELD_APP_NAME}\": {:?} }} }}, {{ \"prefix\": {{ \"{FIELD_APP_NAME}\": {:?} }} }} ] }} }} }}",
self.query_string,
self.query_string,
self.query_string,
self.query_string,
self.query_string,
self.query_string,
self.query_string,
self.query_string
);
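// Illustrative sketch: assuming the field constants resolve to names like
// "app_name_zh", "app_name_en", "app_name_in_system_lang" and "app_name", a
// hypothetical query string "chrome" would produce roughly the following DSL
// (whitespace added for readability):
//
// {
//   "query": {
//     "bool": {
//       "should": [
//         { "match":  { "app_name_zh": "chrome" } },
//         { "prefix": { "app_name_zh": "chrome" } },
//         { "match":  { "app_name_en": "chrome" } },
//         { "prefix": { "app_name_en": "chrome" } },
//         { "match":  { "app_name_in_system_lang": "chrome" } },
//         { "prefix": { "app_name_in_system_lang": "chrome" } },
//         { "match":  { "app_name": "chrome" } },
//         { "prefix": { "app_name": "chrome" } }
//       ]
//     }
//   }
// }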
let state = state
.as_mut_any()
@@ -432,7 +518,9 @@ impl<R: Runtime> Task for SearchApplicationsTask<R> {
let rx_dropped_error = callback.send(Ok(search_result)).is_err();
if rx_dropped_error {
warn!("failed to send local app search result back because the corresponding channel receiver end has been unexpected dropped, which could happen due to a low query timeout")
warn!(
"failed to send local app search result back because the corresponding channel receiver end has been unexpected dropped, which could happen due to a low query timeout"
)
}
}
}
@@ -486,9 +574,33 @@ impl Task for IndexNewApplicationsTask {
pub struct ApplicationSearchSource;
impl ApplicationSearchSource {
pub async fn prepare_index_and_store<R: Runtime>(
app_handle: AppHandle<R>,
) -> Result<(), String> {
pub async fn prepare_index_and_store(app_handle: AppHandle) -> Result<(), String> {
app_handle
.store(TAURI_STORE_APP_HOTKEY)
.map_err(|e| e.to_string())?;
let disabled_app_list_and_search_path_store = app_handle
.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)
.map_err(|e| e.to_string())?;
if disabled_app_list_and_search_path_store
.get(TAURI_STORE_KEY_DISABLED_APP_LIST)
.is_none()
{
disabled_app_list_and_search_path_store
.set(TAURI_STORE_KEY_DISABLED_APP_LIST, Json::Array(Vec::new()));
}
// IndexAllApplicationsTask will read the apps installed in search paths and
// index them, so it depends on this configuration entry. Init this entry
// before indexing apps.
if disabled_app_list_and_search_path_store
.get(TAURI_STORE_KEY_SEARCH_PATH)
.is_none()
{
let default_search_path = get_default_search_paths();
disabled_app_list_and_search_path_store
.set(TAURI_STORE_KEY_SEARCH_PATH, default_search_path);
}
let (tx, rx) = tokio::sync::oneshot::channel();
let index_applications_task = IndexAllApplicationsTask {
tauri_app_handle: app_handle.clone(),
@@ -509,29 +621,6 @@ impl ApplicationSearchSource {
)
}
app_handle
.store(TAURI_STORE_APP_HOTKEY)
.map_err(|e| e.to_string())?;
let disabled_app_list_and_search_path_store = app_handle
.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)
.map_err(|e| e.to_string())?;
if disabled_app_list_and_search_path_store
.get(TAURI_STORE_KEY_DISABLED_APP_LIST)
.is_none()
{
disabled_app_list_and_search_path_store
.set(TAURI_STORE_KEY_DISABLED_APP_LIST, Json::Array(Vec::new()));
}
if disabled_app_list_and_search_path_store
.get(TAURI_STORE_KEY_SEARCH_PATH)
.is_none()
{
let default_search_path = get_default_search_paths();
disabled_app_list_and_search_path_store
.set(TAURI_STORE_KEY_SEARCH_PATH, default_search_path);
}
Ok(())
}
}
@@ -549,7 +638,11 @@ impl SearchSource for ApplicationSearchSource {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let query_string = query
.query_strings
.get("query")
@@ -590,7 +683,7 @@ impl SearchSource for ApplicationSearchSource {
let total_hits = search_result.total_hits;
let source = self.get_type();
let hits = pizza_engine_hits_to_coco_hits(search_result.hits);
let hits = pizza_engine_hits_to_coco_hits(search_result.hits).await;
Ok(QueryResponse {
source,
@@ -600,9 +693,11 @@ impl SearchSource for ApplicationSearchSource {
}
}
fn pizza_engine_hits_to_coco_hits(
async fn pizza_engine_hits_to_coco_hits(
pizza_engine_hits: Option<Vec<PizzaEngineDocument>>,
) -> Vec<(Document, f64)> {
use crate::util::app_lang::{Lang, get_app_lang};
let Some(engine_hits) = pizza_engine_hits else {
return Vec::new();
};
@@ -611,10 +706,43 @@ fn pizza_engine_hits_to_coco_hits(
for engine_hit in engine_hits {
let score = engine_hit.score.unwrap_or(0.0) as f64;
let mut document_fields = engine_hit.fields;
let app_name = match document_fields.remove(FIELD_APP_NAME).unwrap() {
FieldValue::Text(string) => string,
_ => unreachable!("field name is of type Text"),
// Get both Chinese and English names
let opt_app_name_zh = match document_fields.remove(FIELD_APP_NAME_ZH) {
Some(FieldValue::Text(string)) => Some(string),
_ => None,
};
let opt_app_name_en = match document_fields.remove(FIELD_APP_NAME_EN) {
Some(FieldValue::Text(string)) => Some(string),
_ => None,
};
let opt_app_name_deprecated = match document_fields.remove(FIELD_APP_NAME) {
Some(FieldValue::Text(string)) => Some(string),
_ => None,
};
let app_name: String = {
if let Some(legacy_app_name) = opt_app_name_deprecated {
// Old version of index, which only contains the field app_name.
legacy_app_name
} else {
// New version of index store the following 2 fields
let panic_msg = format!(
"new version of index should contain field [{}] and [{}]",
FIELD_APP_NAME_EN, FIELD_APP_NAME_ZH
);
let app_name_zh = opt_app_name_zh.expect(&panic_msg);
let app_name_en = opt_app_name_en.expect(&panic_msg);
// Choose the appropriate name based on current language
match get_app_lang().await {
Lang::zh_CN => app_name_zh,
Lang::en_US => app_name_en,
}
}
};
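// Illustrative example (hypothetical values): an old index that only stored
// app_name = "Google Chrome" takes the legacy branch above, while a new index
// with app_name_zh = "谷歌浏览器" and app_name_en = "Google Chrome" resolves to
// the Chinese name when get_app_lang() returns Lang::zh_CN and to the English
// name for Lang::en_US.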
let app_path = engine_hit.key.expect("key should be set to app path");
let app_icon_path = match document_fields.remove(FIELD_ICON_PATH).unwrap() {
FieldValue::Text(string) => string,
@@ -634,7 +762,7 @@ fn pizza_engine_hits_to_coco_hits(
}),
id: app_path.clone(),
category: Some("Application".to_string()),
title: Some(app_name.clone()),
title: Some(app_name),
icon: Some(app_icon_path),
on_opened: Some(on_opened),
url: Some(url),
@@ -648,7 +776,7 @@ fn pizza_engine_hits_to_coco_hits(
coco_hits
}
pub fn set_app_alias<R: Runtime>(tauri_app_handle: &AppHandle<R>, app_path: &str, alias: &str) {
pub fn set_app_alias(tauri_app_handle: &AppHandle, app_path: &str, alias: &str) {
let store = tauri_app_handle
.store(TAURI_STORE_APP_ALIAS)
.unwrap_or_else(|_| panic!("store [{}] not found/loaded", TAURI_STORE_APP_ALIAS));
@@ -661,7 +789,7 @@ pub fn set_app_alias<R: Runtime>(tauri_app_handle: &AppHandle<R>, app_path: &str
// deleted while updating it.
}
fn get_app_alias<R: Runtime>(tauri_app_handle: &AppHandle<R>, app_path: &str) -> Option<String> {
fn get_app_alias(tauri_app_handle: &AppHandle, app_path: &str) -> Option<String> {
let store = tauri_app_handle
.store(TAURI_STORE_APP_ALIAS)
.unwrap_or_else(|_| panic!("store [{}] not found/loaded", TAURI_STORE_APP_ALIAS));
@@ -679,9 +807,9 @@ fn get_app_alias<R: Runtime>(tauri_app_handle: &AppHandle<R>, app_path: &str) ->
/// The handler that will be invoked when an application hotkey is pressed.
///
/// The `app_path` argument is for logging-only.
fn app_hotkey_handler<R: Runtime>(
fn app_hotkey_handler(
app_path: String,
) -> impl Fn(&AppHandle<R>, &Shortcut, ShortcutEvent) + Send + Sync + 'static {
) -> impl Fn(&AppHandle, &Shortcut, ShortcutEvent) + Send + Sync + 'static {
move |tauri_app_handle, _hot_key, event| {
if event.state() == ShortcutState::Pressed {
let app_path_clone = app_path.clone();
@@ -697,7 +825,7 @@ fn app_hotkey_handler<R: Runtime>(
}
/// For all the applications, if it is enabled & has hotkey set, then set it up.
pub(crate) fn set_apps_hotkey<R: Runtime>(tauri_app_handle: &AppHandle<R>) -> Result<(), String> {
pub(crate) fn set_apps_hotkey(tauri_app_handle: &AppHandle) -> Result<(), String> {
let app_hotkey_store = tauri_app_handle
.store(TAURI_STORE_APP_HOTKEY)
.unwrap_or_else(|_| panic!("store [{}] not found/loaded", TAURI_STORE_APP_HOTKEY));
@@ -721,7 +849,7 @@ pub(crate) fn set_apps_hotkey<R: Runtime>(tauri_app_handle: &AppHandle<R>) -> Re
}
/// For all the applications, if it is enabled & has hotkey set, then unset it.
pub(crate) fn unset_apps_hotkey<R: Runtime>(tauri_app_handle: &AppHandle<R>) -> Result<(), String> {
pub(crate) fn unset_apps_hotkey(tauri_app_handle: &AppHandle) -> Result<(), String> {
let app_hotkey_store = tauri_app_handle
.store(TAURI_STORE_APP_HOTKEY)
.unwrap_or_else(|_| panic!("store [{}] not found/loaded", TAURI_STORE_APP_HOTKEY));
@@ -748,8 +876,8 @@ pub(crate) fn unset_apps_hotkey<R: Runtime>(tauri_app_handle: &AppHandle<R>) ->
}
/// Set the hotkey but won't persist this settings change.
pub(crate) fn set_app_hotkey<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
pub(crate) fn set_app_hotkey(
tauri_app_handle: &AppHandle,
app_path: &str,
hotkey: &str,
) -> Result<(), String> {
@@ -759,8 +887,8 @@ pub(crate) fn set_app_hotkey<R: Runtime>(
.map_err(|e| e.to_string())
}
pub fn register_app_hotkey<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
pub fn register_app_hotkey(
tauri_app_handle: &AppHandle,
app_path: &str,
hotkey: &str,
) -> Result<(), String> {
@@ -777,10 +905,7 @@ pub fn register_app_hotkey<R: Runtime>(
Ok(())
}
pub fn unregister_app_hotkey<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
app_path: &str,
) -> Result<(), String> {
pub fn unregister_app_hotkey(tauri_app_handle: &AppHandle, app_path: &str) -> Result<(), String> {
let app_hotkey_store = tauri_app_handle
.store(TAURI_STORE_APP_HOTKEY)
.unwrap_or_else(|_| panic!("store [{}] not found/loaded", TAURI_STORE_APP_HOTKEY));
@@ -807,7 +932,9 @@ pub fn unregister_app_hotkey<R: Runtime>(
.global_shortcut()
.is_registered(hotkey.as_str())
{
panic!("inconsistent state, tauri store a hotkey is stored in the tauri store but it is not registered");
panic!(
"inconsistent state, tauri store a hotkey is stored in the tauri store but it is not registered"
);
}
tauri_app_handle
@@ -818,7 +945,7 @@ pub fn unregister_app_hotkey<R: Runtime>(
Ok(())
}
fn get_disabled_app_list<R: Runtime>(tauri_app_handle: &AppHandle<R>) -> Vec<String> {
fn get_disabled_app_list(tauri_app_handle: &AppHandle) -> Vec<String> {
let store = tauri_app_handle
.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)
.unwrap_or_else(|_| {
@@ -855,10 +982,7 @@ pub fn is_app_search_enabled(app_path: &str) -> bool {
disabled_app_list.iter().all(|path| path != app_path)
}
pub fn disable_app_search<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
app_path: &str,
) -> Result<(), String> {
pub fn disable_app_search(tauri_app_handle: &AppHandle, app_path: &str) -> Result<(), String> {
let store = tauri_app_handle
.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)
.unwrap_or_else(|_| {
@@ -902,10 +1026,7 @@ pub fn disable_app_search<R: Runtime>(
Ok(())
}
pub fn enable_app_search<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
app_path: &str,
) -> Result<(), String> {
pub fn enable_app_search(tauri_app_handle: &AppHandle, app_path: &str) -> Result<(), String> {
let store = tauri_app_handle
.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)
.unwrap_or_else(|_| {
@@ -947,8 +1068,8 @@ pub fn enable_app_search<R: Runtime>(
}
#[tauri::command]
pub async fn add_app_search_path<R: Runtime>(
tauri_app_handle: AppHandle<R>,
pub async fn add_app_search_path(
tauri_app_handle: AppHandle,
search_path: String,
) -> Result<(), String> {
let mut search_paths = get_app_search_path(tauri_app_handle.clone()).await;
@@ -973,8 +1094,8 @@ pub async fn add_app_search_path<R: Runtime>(
}
#[tauri::command]
pub async fn remove_app_search_path<R: Runtime>(
tauri_app_handle: AppHandle<R>,
pub async fn remove_app_search_path(
tauri_app_handle: AppHandle,
search_path: String,
) -> Result<(), String> {
let mut search_paths = get_app_search_path(tauri_app_handle.clone()).await;
@@ -999,7 +1120,7 @@ pub async fn remove_app_search_path<R: Runtime>(
}
#[tauri::command]
pub async fn get_app_search_path<R: Runtime>(tauri_app_handle: AppHandle<R>) -> Vec<String> {
pub async fn get_app_search_path(tauri_app_handle: AppHandle) -> Vec<String> {
let store = tauri_app_handle
.store(TAURI_STORE_DISABLED_APP_LIST_AND_SEARCH_PATH)
.unwrap_or_else(|_| {
@@ -1028,18 +1149,25 @@ pub async fn get_app_search_path<R: Runtime>(tauri_app_handle: AppHandle<R>) ->
}
#[tauri::command]
pub async fn get_app_list<R: Runtime>(
tauri_app_handle: AppHandle<R>,
) -> Result<Vec<Extension>, String> {
pub async fn get_app_list(tauri_app_handle: AppHandle) -> Result<Vec<Extension>, String> {
use crate::util::app_lang::{Lang, get_app_lang};
let search_paths = get_app_search_path(tauri_app_handle.clone()).await;
let apps = list_app_in(search_paths)?;
let mut app_entries = Vec::with_capacity(apps.len());
let lang = get_app_lang().await;
for app in apps {
let name = get_app_name(&app).await;
let name = match lang {
Lang::zh_CN => get_app_name_zh(&app).await,
Lang::en_US => get_app_name_en(&app).await,
};
// filter out Coco-AI
//
// Coco does not have localized app names, so regardless of the chosen language, name
// should have value "Coco-AI".
if name.eq(&tauri_app_handle.package_info().name) {
continue;
}
@@ -1107,11 +1235,14 @@ pub async fn get_app_list<R: Runtime>(
quicklink: None,
commands: None,
scripts: None,
views: None,
quicklinks: None,
alias: Some(alias),
hotkey,
enabled,
settings: None,
page: None,
permission: None,
screenshots: None,
url: None,
version: None,
@@ -1165,9 +1296,7 @@ pub async fn get_app_metadata(app_name: String, app_path: String) -> Result<AppM
}
#[tauri::command]
pub async fn reindex_applications<R: Runtime>(
tauri_app_handle: AppHandle<R>,
) -> Result<(), String> {
pub async fn reindex_applications(tauri_app_handle: AppHandle) -> Result<(), String> {
let (tx, rx) = tokio::sync::oneshot::channel();
let reindex_applications_task = ReindexAllApplicationsTask {
tauri_app_handle: tauri_app_handle.clone(),

View File

@@ -5,16 +5,14 @@ use crate::common::search::{QueryResponse, QuerySource, SearchQuery};
use crate::common::traits::SearchSource;
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use async_trait::async_trait;
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
pub(crate) const QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME: &str = "Applications";
pub struct ApplicationSearchSource;
impl ApplicationSearchSource {
pub async fn prepare_index_and_store<R: Runtime>(
_app_handle: AppHandle<R>,
) -> Result<(), String> {
pub async fn prepare_index_and_store(_app_handle: AppHandle) -> Result<(), String> {
Ok(())
}
}
@@ -32,7 +30,11 @@ impl SearchSource for ApplicationSearchSource {
}
}
async fn search(&self, _query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
_query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
@@ -41,37 +43,28 @@ impl SearchSource for ApplicationSearchSource {
}
}
pub fn set_app_alias<R: Runtime>(_tauri_app_handle: &AppHandle<R>, _app_path: &str, _alias: &str) {
pub fn set_app_alias(_tauri_app_handle: &AppHandle, _app_path: &str, _alias: &str) {
unreachable!("app list should be empty, there is no way this can be invoked")
}
pub fn register_app_hotkey<R: Runtime>(
_tauri_app_handle: &AppHandle<R>,
pub fn register_app_hotkey(
_tauri_app_handle: &AppHandle,
_app_path: &str,
_hotkey: &str,
) -> Result<(), String> {
unreachable!("app list should be empty, there is no way this can be invoked")
}
pub fn unregister_app_hotkey<R: Runtime>(
_tauri_app_handle: &AppHandle<R>,
_app_path: &str,
) -> Result<(), String> {
pub fn unregister_app_hotkey(_tauri_app_handle: &AppHandle, _app_path: &str) -> Result<(), String> {
unreachable!("app list should be empty, there is no way this can be invoked")
}
pub fn disable_app_search<R: Runtime>(
_tauri_app_handle: &AppHandle<R>,
_app_path: &str,
) -> Result<(), String> {
pub fn disable_app_search(_tauri_app_handle: &AppHandle, _app_path: &str) -> Result<(), String> {
// no-op
Ok(())
}
pub fn enable_app_search<R: Runtime>(
_tauri_app_handle: &AppHandle<R>,
_app_path: &str,
) -> Result<(), String> {
pub fn enable_app_search(_tauri_app_handle: &AppHandle, _app_path: &str) -> Result<(), String> {
// no-op
Ok(())
}
@@ -81,8 +74,8 @@ pub fn is_app_search_enabled(_app_path: &str) -> bool {
}
#[tauri::command]
pub async fn add_app_search_path<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
pub async fn add_app_search_path(
_tauri_app_handle: AppHandle,
_search_path: String,
) -> Result<(), String> {
// no-op
@@ -90,8 +83,8 @@ pub async fn add_app_search_path<R: Runtime>(
}
#[tauri::command]
pub async fn remove_app_search_path<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
pub async fn remove_app_search_path(
_tauri_app_handle: AppHandle,
_search_path: String,
) -> Result<(), String> {
// no-op
@@ -99,43 +92,37 @@ pub async fn remove_app_search_path<R: Runtime>(
}
#[tauri::command]
pub async fn get_app_search_path<R: Runtime>(_tauri_app_handle: AppHandle<R>) -> Vec<String> {
pub async fn get_app_search_path(_tauri_app_handle: AppHandle) -> Vec<String> {
// Return an empty list
Vec::new()
}
#[tauri::command]
pub async fn get_app_list<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
) -> Result<Vec<Extension>, String> {
pub async fn get_app_list(_tauri_app_handle: AppHandle) -> Result<Vec<Extension>, String> {
// Return an empty list
Ok(Vec::new())
}
#[tauri::command]
pub async fn get_app_metadata<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
pub async fn get_app_metadata(
_tauri_app_handle: AppHandle,
_app_path: String,
) -> Result<AppMetadata, String> {
unreachable!("app list should be empty, there is no way this can be invoked")
}
pub(crate) fn set_apps_hotkey<R: Runtime>(_tauri_app_handle: &AppHandle<R>) -> Result<(), String> {
pub(crate) fn set_apps_hotkey(_tauri_app_handle: &AppHandle) -> Result<(), String> {
// no-op
Ok(())
}
pub(crate) fn unset_apps_hotkey<R: Runtime>(
_tauri_app_handle: &AppHandle<R>,
) -> Result<(), String> {
pub(crate) fn unset_apps_hotkey(_tauri_app_handle: &AppHandle) -> Result<(), String> {
// no-op
Ok(())
}
#[tauri::command]
pub async fn reindex_applications<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
) -> Result<(), String> {
pub async fn reindex_applications(_tauri_app_handle: AppHandle) -> Result<(), String> {
// no-op
Ok(())
}

View File

@@ -10,6 +10,7 @@ use chinese_number::{ChineseCase, ChineseCountMethod, ChineseVariant, NumberToCh
use num2words::Num2Words;
use serde_json::Value;
use std::collections::HashMap;
use tauri::AppHandle;
pub(crate) const DATA_SOURCE_ID: &str = "Calculator";
@@ -120,7 +121,11 @@ impl SearchSource for CalculatorSource {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
@@ -176,13 +181,11 @@ impl SearchSource for CalculatorSource {
total_hits: 1,
}
}
Err(_) => {
QueryResponse {
source: query_source,
hits: Vec::new(),
total_hits: 0,
}
}
Err(_) => QueryResponse {
source: query_source,
hits: Vec::new(),
total_hits: 0,
},
}
};

View File

@@ -0,0 +1,216 @@
//! File Search configuration entries definition and getter/setter functions.
use crate::extension::built_in::file_search::implementation::apply_config;
use serde::Deserialize;
use serde::Serialize;
use serde_json::Value;
use std::sync::LazyLock;
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;
// Tauri store keys for file system configuration
const TAURI_STORE_FILE_SYSTEM_CONFIG: &str = "file_system_config";
const TAURI_STORE_KEY_SEARCH_BY: &str = "search_by";
const TAURI_STORE_KEY_SEARCH_PATHS: &str = "search_paths";
const TAURI_STORE_KEY_EXCLUDE_PATHS: &str = "exclude_paths";
const TAURI_STORE_KEY_FILE_TYPES: &str = "file_types";
static HOME_DIR: LazyLock<String> = LazyLock::new(|| {
let os_string = dirs::home_dir()
.expect("$HOME should be set")
.into_os_string();
os_string
.into_string()
.expect("User home directory should be encoded with UTF-8")
});
#[derive(Debug, Clone, Serialize, Deserialize, Copy, PartialEq)]
pub enum SearchBy {
Name,
NameAndContents,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FileSearchConfig {
pub search_paths: Vec<String>,
pub exclude_paths: Vec<String>,
pub file_types: Vec<String>,
pub search_by: SearchBy,
}
impl Default for FileSearchConfig {
fn default() -> Self {
Self {
search_paths: vec![
format!("{}/Documents", HOME_DIR.as_str()),
format!("{}/Desktop", HOME_DIR.as_str()),
format!("{}/Downloads", HOME_DIR.as_str()),
],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
}
}
}
impl FileSearchConfig {
pub(crate) fn get(tauri_app_handle: &AppHandle) -> Self {
let store = tauri_app_handle
.store(TAURI_STORE_FILE_SYSTEM_CONFIG)
.unwrap_or_else(|e| {
panic!(
"store [{}] not found/loaded, error [{}]",
TAURI_STORE_FILE_SYSTEM_CONFIG, e
)
});
// Default values, used when specific config entries are not set
let default_config = FileSearchConfig::default();
let search_paths = {
if let Some(search_paths) = store.get(TAURI_STORE_KEY_SEARCH_PATHS) {
match search_paths {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'search_paths' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'search_paths' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_SEARCH_PATHS,
default_config.search_paths.as_slice(),
);
default_config.search_paths
}
};
let exclude_paths = {
if let Some(exclude_paths) = store.get(TAURI_STORE_KEY_EXCLUDE_PATHS) {
match exclude_paths {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'exclude_paths' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'exclude_paths' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_EXCLUDE_PATHS,
default_config.exclude_paths.as_slice(),
);
default_config.exclude_paths
}
};
let file_types = {
if let Some(file_types) = store.get(TAURI_STORE_KEY_FILE_TYPES) {
match file_types {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'file_types' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'file_types' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_FILE_TYPES,
default_config.file_types.as_slice(),
);
default_config.file_types
}
};
let search_by = {
if let Some(search_by) = store.get(TAURI_STORE_KEY_SEARCH_BY) {
serde_json::from_value(search_by.clone()).unwrap_or_else(|e| {
panic!(
"Failed to deserialize 'search_by' from file system config store. Invalid JSON: {:?}, error: {}",
search_by, e
)
})
} else {
store.set(
TAURI_STORE_KEY_SEARCH_BY,
serde_json::to_value(default_config.search_by).unwrap(),
);
default_config.search_by
}
};
Self {
search_by,
search_paths,
exclude_paths,
file_types,
}
}
}
// Tauri commands for managing file system configuration
#[tauri::command]
pub async fn get_file_system_config(tauri_app_handle: AppHandle) -> FileSearchConfig {
FileSearchConfig::get(&tauri_app_handle)
}
#[tauri::command]
pub async fn set_file_system_config(
tauri_app_handle: AppHandle,
config: FileSearchConfig,
) -> Result<(), String> {
let store = tauri_app_handle
.store(TAURI_STORE_FILE_SYSTEM_CONFIG)
.map_err(|e| e.to_string())?;
store.set(TAURI_STORE_KEY_SEARCH_PATHS, config.search_paths.as_slice());
store.set(
TAURI_STORE_KEY_EXCLUDE_PATHS,
config.exclude_paths.as_slice(),
);
store.set(TAURI_STORE_KEY_FILE_TYPES, config.file_types.as_slice());
store.set(
TAURI_STORE_KEY_SEARCH_BY,
serde_json::to_value(config.search_by).unwrap(),
);
// Apply the config when we know that this set operation won't fail
apply_config(&config)?;
Ok(())
}
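A minimal sketch of the payload shape that `set_file_system_config` accepts, assuming serde's default external representation for the `SearchBy` enum; the paths below are made up for illustration:
#[cfg(test)]
mod config_payload_example {
    use super::{FileSearchConfig, SearchBy};

    // Hypothetical frontend payload; the paths are illustrative only.
    #[test]
    fn deserializes_frontend_payload() {
        let payload = serde_json::json!({
            "search_paths": ["/home/alice/Documents", "/home/alice/Desktop"],
            "exclude_paths": ["/home/alice/Documents/tmp"],
            "file_types": ["pdf", "md"],
            "search_by": "Name"
        });
        let config: FileSearchConfig =
            serde_json::from_value(payload).expect("shape should match FileSearchConfig");
        assert_eq!(config.search_by, SearchBy::Name);
    }
}
If a `#[serde(rename_all = ...)]` attribute were ever added to `SearchBy`, the `"Name"` string in this sketch would change accordingly.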

View File

@@ -0,0 +1,388 @@
//! File system powered by GNOME's Tracker engine.
use super::super::super::EXTENSION_ID;
use super::super::super::config::FileSearchConfig;
use super::super::should_be_filtered_out;
use crate::common::document::DataSourceReference;
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::util::file::sync_get_file_icon;
use crate::{
common::document::{Document, OnOpened},
extension::built_in::file_search::config::SearchBy,
};
use camino::Utf8Path;
use gio::Cancellable;
use gio::Settings;
use gio::prelude::SettingsExtManual;
use glib::GString;
use glib::collections::strv::StrV;
use tracker::{SparqlConnection, SparqlCursor, prelude::SparqlCursorExtManual};
/// The service that we will connect to.
const SERVICE_NAME: &str = "org.freedesktop.Tracker3.Miner.Files";
/// Tracker won't return scores when we are not using full-text search. In that
/// case, we use this score.
const SCORE: f64 = 1.0;
/// Helper function to return different SPARQL queries depending on the different configurations.
fn query_sparql(query_string: &str, config: &FileSearchConfig) -> String {
match config.search_by {
SearchBy::Name => {
// Cannot use the inverted index as that searches for all the attributes,
// but we only want to search the filename.
format!(
"SELECT nie:url(?file_item) WHERE {{ ?file_item nfo:fileName ?fileName . FILTER(regex(?fileName, '{query_string}', 'i')) }}"
)
}
SearchBy::NameAndContents => {
// Full-text search against all attributes
// OR
// filename search
format!(
"SELECT nie:url(?file_item) fts:rank(?file_item) WHERE {{ {{ ?file_item fts:match '{query_string}' }} UNION {{ ?file_item nfo:fileName ?fileName . FILTER(regex(?fileName, '{query_string}', 'i')) }} }} ORDER BY DESC fts:rank(?file_item)"
)
}
}
}
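// A minimal sketch of what the generated SPARQL looks like, using a made-up
// query string; it only asserts the fragments the code above relies on.
#[cfg(test)]
mod query_sparql_example {
    use super::query_sparql;
    use crate::extension::built_in::file_search::config::{FileSearchConfig, SearchBy};

    #[test]
    fn name_only_query_filters_on_file_name() {
        let config = FileSearchConfig {
            search_paths: vec!["/home/alice/Documents".to_string()],
            exclude_paths: Vec::new(),
            file_types: Vec::new(),
            search_by: SearchBy::Name,
        };
        let sparql = query_sparql("report", &config);
        assert!(sparql.contains("nfo:fileName"));
        assert!(sparql.contains("regex(?fileName, 'report', 'i')"));
        assert!(!sparql.contains("fts:match"));
    }
}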
/// Helper function to replace unsupported characters with whitespace.
///
/// Tracker will error out if it encounters these characters.
///
/// The complete list of unsupported characters is unknown and we don't know how
/// to escape them, so let's replace them.
fn query_string_cleanup(old: &str) -> String {
const UNSUPPORTED_CHAR: [char; 3] = ['\'', '\n', '\\'];
// Using len in bytes is ok
let mut chars = Vec::with_capacity(old.len());
for char in old.chars() {
if UNSUPPORTED_CHAR.contains(&char) {
chars.push(' ');
} else {
chars.push(char);
}
}
chars.into_iter().collect()
}
struct Query {
conn: SparqlConnection,
cursor: SparqlCursor,
}
impl Query {
fn new(query_string: &str, config: &FileSearchConfig) -> Result<Self, String> {
let query_string = query_string_cleanup(query_string);
let sparql = query_sparql(&query_string, config);
let conn =
SparqlConnection::bus_new(SERVICE_NAME, None, None).map_err(|e| e.to_string())?;
let cursor = conn
.query(&sparql, Cancellable::NONE)
.map_err(|e| e.to_string())?;
Ok(Self { conn, cursor })
}
}
impl Drop for Query {
fn drop(&mut self) {
self.cursor.close();
self.conn.close();
}
}
impl Iterator for Query {
/// It yields a tuple `(file path, score)`
type Item = Result<(String, f64), String>;
fn next(&mut self) -> Option<Self::Item> {
loop {
let has_next = match self
.cursor
.next(Cancellable::NONE)
.map_err(|e| e.to_string())
{
Ok(has_next) => has_next,
Err(err_str) => return Some(Err(err_str)),
};
if !has_next {
return None;
}
// The first column is the URL
let file_url_column = self.cursor.string(0);
// It could be None (or NULL ptr if you use C), I have no clue why.
let opt_str = file_url_column.as_ref().map(|gstr| gstr.as_str());
match opt_str {
Some(url) => {
// The returned URL has a prefix that we need to trim
const PREFIX: &str = "file://";
const PREFIX_LEN: usize = PREFIX.len();
let file_path = url[PREFIX_LEN..].to_string();
assert!(!file_path.is_empty());
assert_ne!(file_path, "/", "file search should not hit the root path");
let score = {
// The second column is the score; this column may not
// exist. We use SCORE if the real value is absent.
let score_column = self.cursor.string(1);
let opt_score_str = score_column.as_ref().map(|g_str| g_str.as_str());
let opt_score = opt_score_str.map(|str| {
str.parse::<f64>()
.expect("score should be valid for type f64")
});
opt_score.unwrap_or(SCORE)
};
return Some(Ok((file_path, score)));
}
None => {
// another try
continue;
}
}
}
}
}
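// Illustrative example of the URL handling above (hypothetical value): a cursor
// row of "file:///home/alice/report.pdf" yields the file path
// "/home/alice/report.pdf", paired with the default SCORE when no rank column
// is present.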
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
// Special cases that will make querying faster.
if query_string.is_empty() || size == 0 || config.search_paths.is_empty() {
return Ok(Vec::new());
}
let mut result_hits = Vec::with_capacity(size);
let need_to_skip = {
if matches!(config.search_by, SearchBy::Name) {
// We don't use full-text search in this case; the returned documents
// won't be scored and the hits won't be sorted, so processing the
// `from` parameter is meaningless.
false
} else {
from > 0
}
};
let mut num_skipped = 0;
let should_skip = from;
let query = Query::new(query_string, config)?;
for res_entry in query {
let (file_path, score) = res_entry?;
// This should be called before processing the `from` parameter.
if should_be_filtered_out(config, &file_path, true, true, true) {
continue;
}
// Process the `from` parameter.
if need_to_skip && num_skipped < should_skip {
// Skip this
num_skipped += 1;
continue;
}
let icon = sync_get_file_icon(&file_path);
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.to_string(),
};
let doc = Document {
id: file_path.to_string(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path),
icon: Some(icon.to_string()),
..Default::default()
};
result_hits.push((doc, score));
// Collected enough documents, return
if result_hits.len() >= size {
break;
}
}
Ok(result_hits)
}
fn ensure_path_in_recursive_indexing_scope(list: &mut StrV, path: &str) {
for item in list.iter() {
let item_path = Utf8Path::new(item.as_str());
let path = Utf8Path::new(path);
// It is already covered or listed
if path.starts_with(item_path) {
return;
}
}
list.push(
GString::from_utf8_checked(path.as_bytes().to_vec())
.expect("search_path_str contains an interior NUL"),
);
}
fn ensure_path_and_descendants_not_in_single_indexing_scope(list: &mut StrV, path: &str) {
// Indexes to the items that should be removed
let mut item_to_remove = Vec::new();
for (idx, item) in list.iter().enumerate() {
let item_path = Utf8Path::new(item.as_str());
let path = Utf8Path::new(path);
if item_path.starts_with(path) {
item_to_remove.push(idx);
}
}
// Reverse the indexes so that the remove operation won't invalidate them.
for idx in item_to_remove.into_iter().rev() {
list.remove(idx);
}
}
pub(crate) fn apply_config(config: &FileSearchConfig) -> Result<(), String> {
// Tracker provides the following configuration entries to allow users to
// tweak the indexing scope:
//
// 1. ignored-directories: A list of names, directories with such names will be ignored.
// ['po', 'CVS', 'core-dumps', 'lost+found']
// 2. ignored-directories-with-content: Avoid any directory containing a file blocklisted here
// ['.trackerignore', '.git', '.hg', '.nomedia']
// 3. ignored-files: List of file patterns to avoid
// ['*~', '*.o', '*.la', '*.lo', '*.loT', '*.in', '*.m4', '*.rej', ...]
// 4. index-recursive-directories: List of directories to index recursively
// ['&DESKTOP', '&DOCUMENTS', '&MUSIC', '&PICTURES', '&VIDEOS']
// 5. index-single-directories: List of directories to index without inspecting subfolders,
// ['$HOME', '&DOWNLOAD']
//
// The first 3 entries specify patterns, in order to use them, we have to walk
// through the whole directory tree listed in search paths, which is impractical.
// So we only use the last 2 entries.
//
//
// Just want to mention that setting search path to "/home" could break Tracker:
//
// ```text
// Unknown target graph for uri:'file:///home' and mime:'inode/directory'
// ```
//
// See the related bug reports:
//
// https://gitlab.gnome.org/GNOME/localsearch/-/issues/313
// https://bugs.launchpad.net/bugs/2077181
//
//
// There is nothing we can do.
const TRACKER_SETTINGS_SCHEMA: &str = "org.freedesktop.Tracker3.Miner.Files";
const KEY_INDEX_RECURSIVE_DIRECTORIES: &str = "index-recursive-directories";
const KEY_INDEX_SINGLE_DIRECTORIES: &str = "index-single-directories";
let search_paths = &config.search_paths;
let settings = Settings::new(TRACKER_SETTINGS_SCHEMA);
let mut recursive_list: StrV = settings.strv(KEY_INDEX_RECURSIVE_DIRECTORIES);
let mut single_list: StrV = settings.strv(KEY_INDEX_SINGLE_DIRECTORIES);
for search_path in search_paths {
// We want our search path to be included in the recursive directories list,
// or already covered by a directory in that list.
ensure_path_in_recursive_indexing_scope(&mut recursive_list, search_path);
// We want to ensure that neither our search path nor any of its descendants
// is listed in the single-directory index list.
ensure_path_and_descendants_not_in_single_indexing_scope(&mut single_list, search_path);
}
settings
.set_strv(KEY_INDEX_RECURSIVE_DIRECTORIES, recursive_list)
.expect("key is not read-only");
settings
.set_strv(KEY_INDEX_SINGLE_DIRECTORIES, single_list)
.expect("key is not be read-only");
Ok(())
}
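// Illustrative sketch with hypothetical values: if the user's search paths
// contain "/home/alice/Projects" while Tracker currently has
//
//   index-recursive-directories = ['&DESKTOP', '&DOCUMENTS']
//   index-single-directories    = ['$HOME', '/home/alice/Projects/old']
//
// then apply_config() appends '/home/alice/Projects' to the recursive list
// (no existing entry covers it) and removes '/home/alice/Projects/old' from
// the single-directory list, since it is a descendant of the search path.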
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_query_string_cleanup_basic() {
assert_eq!(query_string_cleanup("test"), "test");
assert_eq!(query_string_cleanup("hello world"), "hello world");
assert_eq!(query_string_cleanup("file.txt"), "file.txt");
}
#[test]
fn test_query_string_cleanup_unsupported_chars() {
assert_eq!(query_string_cleanup("test'file"), "test file");
assert_eq!(query_string_cleanup("test\nfile"), "test file");
assert_eq!(query_string_cleanup("test\\file"), "test file");
}
#[test]
fn test_query_string_cleanup_multiple_unsupported() {
assert_eq!(query_string_cleanup("test'file\nname"), "test file name");
assert_eq!(query_string_cleanup("test\'file"), "test file");
assert_eq!(query_string_cleanup("\n'test"), " test");
}
#[test]
fn test_query_string_cleanup_edge_cases() {
assert_eq!(query_string_cleanup(""), "");
assert_eq!(query_string_cleanup("'"), " ");
assert_eq!(query_string_cleanup("\n"), " ");
assert_eq!(query_string_cleanup("\\"), " ");
assert_eq!(query_string_cleanup(" '\n\\ "), " ");
}
#[test]
fn test_query_string_cleanup_mixed_content() {
assert_eq!(
query_string_cleanup("document's content\nwith\\backslash"),
"document s content with backslash"
);
assert_eq!(
query_string_cleanup("path/to'file\nextension\\test"),
"path/to file extension test"
);
}
}

View File

@@ -0,0 +1,308 @@
//! File search for KDE, powered by its Baloo engine.
use super::super::super::EXTENSION_ID;
use super::super::super::config::FileSearchConfig;
use super::super::super::config::SearchBy;
use super::super::should_be_filtered_out;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::sync_get_file_icon;
use camino::Utf8Path;
use configparser::ini::Ini;
use configparser::ini::WriteOptions;
use futures::stream::Stream;
use futures::stream::StreamExt;
use std::os::fd::OwnedFd;
use std::path::PathBuf;
use tokio::io::AsyncBufReadExt;
use tokio::io::BufReader;
use tokio::process::Child;
use tokio::process::Command;
use tokio_stream::wrappers::LinesStream;
/// Baloo does not support scoring, use this score for all the documents.
const SCORE: f64 = 1.0;
/// KDE6 updates the binary name to "baloosearch6", but I believe there are still
/// distros using the original name, so we need to check both.
fn cli_tool_lookup() -> Option<PathBuf> {
use which::which;
let res_path = which("baloosearch").or_else(|_| which("baloosearch6"));
res_path.ok()
}
pub(crate) async fn hits(
query_string: &str,
_from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
// Special cases that will make querying faster.
if query_string.is_empty() || size == 0 || config.search_paths.is_empty() {
return Ok(Vec::new());
}
// If the tool is not found, return an empty result as well.
let Some(tool_path) = cli_tool_lookup() else {
return Ok(Vec::new());
};
let (mut iter, _baloosearch_child_process) =
execute_baloosearch_query(tool_path, query_string, size, config)?;
// Convert results to documents
let mut hits: Vec<(Document, f64)> = Vec::new();
while let Some(res_file_path) = iter.next().await {
let file_path = res_file_path.map_err(|io_err| io_err.to_string())?;
let icon = sync_get_file_icon(&file_path);
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.clone(),
};
let doc = Document {
id: file_path.clone(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path),
icon: Some(icon.to_string()),
..Default::default()
};
hits.push((doc, SCORE));
}
Ok(hits)
}
/// Return an array containing the `baloosearch` command and its arguments.
fn build_baloosearch_query(
tool_path: PathBuf,
query_string: &str,
config: &FileSearchConfig,
) -> Vec<String> {
let tool_path = tool_path
.into_os_string()
.into_string()
.expect("binary path should be UTF-8 encoded");
let mut args = vec![tool_path];
match config.search_by {
SearchBy::Name => {
args.push(format!("filename:{query_string}"));
}
SearchBy::NameAndContents => {
args.push(query_string.to_string());
}
}
for search_path in config.search_paths.iter() {
args.extend_from_slice(&["-d".into(), search_path.clone()]);
}
args
}
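// A small sketch of the argument vector this produces, using made-up paths.
#[cfg(test)]
mod build_query_example {
    use super::build_baloosearch_query;
    use crate::extension::built_in::file_search::config::{FileSearchConfig, SearchBy};
    use std::path::PathBuf;

    #[test]
    fn filename_search_args() {
        let config = FileSearchConfig {
            search_paths: vec!["/home/alice/Documents".to_string()],
            exclude_paths: Vec::new(),
            file_types: Vec::new(),
            search_by: SearchBy::Name,
        };
        let args = build_baloosearch_query(
            PathBuf::from("/usr/bin/baloosearch"),
            "report",
            &config,
        );
        assert_eq!(
            args,
            vec![
                "/usr/bin/baloosearch".to_string(),
                "filename:report".to_string(),
                "-d".to_string(),
                "/home/alice/Documents".to_string(),
            ]
        );
    }
}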
/// Spawn the `baloosearch` child process and return an async iterator over its output,
/// allowing us to collect the results asynchronously.
///
/// # Return value:
///
/// * impl Stream: an async iterator that will yield the matched files
/// * Child: The handle to the baloosearch process. The child process will be
/// killed when this handle gets dropped, so we need to keep it alive until we
/// exhaust the stream.
fn execute_baloosearch_query(
tool_path: PathBuf,
query_string: &str,
size: usize,
config: &FileSearchConfig,
) -> Result<(impl Stream<Item = std::io::Result<String>>, Child), String> {
let args = build_baloosearch_query(tool_path, query_string, config);
let (rx, tx) = std::io::pipe().unwrap();
let rx_owned = OwnedFd::from(rx);
let async_rx = tokio::net::unix::pipe::Receiver::from_owned_fd(rx_owned).unwrap();
let buffered_rx = BufReader::new(async_rx);
let lines = LinesStream::new(buffered_rx.lines());
let child = Command::new(&args[0])
.args(&args[1..])
.stdout(tx)
.stderr(std::process::Stdio::null())
// The child process will be killed when the Child instance gets dropped.
.kill_on_drop(true)
.spawn()
.map_err(|e| format!("Failed to spawn baloosearch: {e}"))?;
let config_clone = config.clone();
let iter = lines
.filter(move |res_path| {
std::future::ready({
match res_path {
Ok(path) => !should_be_filtered_out(&config_clone, path, false, true, true),
Err(_) => {
// Don't filter out Err() values
true
}
}
})
})
.take(size);
Ok((iter, child))
}
pub(crate) fn apply_config(config: &FileSearchConfig) -> Result<(), String> {
// Users can tweak Baloo via its configuration file, below are the fields that
// we need to modify:
//
// * Indexing-Enabled: turn indexing on or off
// * only basic indexing: If true, Baloo only indexes file names
// * folders: directories to index
// * exclude folders: directories to skip
//
// ```ini
// [Basic Settings]
// Indexing-Enabled=true
//
// [General]
// only basic indexing=true
// folders[$e]=$HOME/
// exclude folders[$e]=$HOME/FolderA/,$HOME/FolderB/
// ```
const SECTION_GENERAL: &str = "General";
const KEY_INCLUDE_FOLDERS: &str = "folders[$e]";
const KEY_EXCLUDE_FOLDERS: &str = "exclude folders[$e]";
const FOLDERS_SEPARATOR: &str = ",";
let rc_file_path = {
let mut home = dirs::home_dir()
.expect("cannot find the home directory, Coco should never run in such a environment");
home.push(".config/baloofilerc");
home
};
// Parse and load the rc file; it is in INI format.
//
// Use `new_cs()`, the case-sensitive constructor, because the config file
// contains uppercase keys and is therefore case-sensitive.
let mut baloo_config = Ini::new_cs();
if rc_file_path.try_exists().map_err(|e| e.to_string())? {
let _ = baloo_config.load(rc_file_path.as_path())?;
}
// Ensure indexing is enabled
let _ = baloo_config.setstr("Basic Settings", "Indexing-Enabled", Some("true"));
// Let baloo index file content if we need that
if config.search_by == SearchBy::NameAndContents {
let _ = baloo_config.setstr(SECTION_GENERAL, "only basic indexing", Some("false"));
}
let mut include_folders = {
match baloo_config.get(SECTION_GENERAL, KEY_INCLUDE_FOLDERS) {
Some(str) => str
.split(FOLDERS_SEPARATOR)
.map(|str| str.to_string())
.collect::<Vec<String>>(),
None => Vec::new(),
}
};
let mut exclude_folders = {
match baloo_config.get(SECTION_GENERAL, KEY_EXCLUDE_FOLDERS) {
Some(str) => str
.split(FOLDERS_SEPARATOR)
.map(|str| str.to_string())
.collect::<Vec<String>>(),
None => Vec::new(),
}
};
fn ensure_path_included_include_folders(
include_folders: &mut Vec<String>,
search_path: &Utf8Path,
) {
for include_folder in include_folders.iter() {
let include_folder = Utf8Path::new(include_folder.as_str());
if search_path.starts_with(include_folder) {
return;
}
}
include_folders.push(search_path.as_str().to_string());
}
fn ensure_path_and_descendants_not_excluded(
exclude_folders: &mut Vec<String>,
search_path: &Utf8Path,
) {
let mut items_to_remove = Vec::new();
for (idx, exclude_folder) in exclude_folders.iter().enumerate() {
let exclude_folder = Utf8Path::new(exclude_folder);
if exclude_folder.starts_with(search_path) {
items_to_remove.push(idx);
}
}
for idx in items_to_remove.into_iter().rev() {
exclude_folders.remove(idx);
}
}
for search_path in config.search_paths.iter() {
let search_path = Utf8Path::new(search_path.as_str());
ensure_path_included_include_folders(&mut include_folders, search_path);
ensure_path_and_descendants_not_excluded(&mut exclude_folders, search_path);
}
let include_folders_str: String = include_folders.as_slice().join(FOLDERS_SEPARATOR);
let exclude_folders_str: String = exclude_folders.as_slice().join(FOLDERS_SEPARATOR);
let _ = baloo_config.set(
SECTION_GENERAL,
KEY_INCLUDE_FOLDERS,
Some(include_folders_str),
);
let _ = baloo_config.set(
SECTION_GENERAL,
KEY_EXCLUDE_FOLDERS,
Some(exclude_folders_str),
);
baloo_config
.pretty_write(rc_file_path.as_path(), &WriteOptions::new())
.map_err(|e| e.to_string())?;
Ok(())
}

View File

@@ -0,0 +1,50 @@
mod gnome;
mod kde;
use super::super::config::FileSearchConfig;
use crate::common::document::Document;
use crate::util::LinuxDesktopEnvironment;
use crate::util::get_linux_desktop_environment;
use std::ops::Deref;
use std::sync::LazyLock;
static DESKTOP_ENVIRONMENT: LazyLock<Option<LinuxDesktopEnvironment>> =
LazyLock::new(|| get_linux_desktop_environment());
/// Dispatch to implementations powered by different backends.
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let de = DESKTOP_ENVIRONMENT.deref();
match de {
Some(LinuxDesktopEnvironment::Gnome) => gnome::hits(query_string, from, size, config).await,
Some(LinuxDesktopEnvironment::Kde) => kde::hits(query_string, from, size, config).await,
Some(LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: _,
}) => {
return Err("file search is not supported on this desktop environment".into());
}
None => {
return Err("could not determine Linux desktop environment".into());
}
}
}
pub(crate) fn apply_config(config: &FileSearchConfig) -> Result<(), String> {
let de = DESKTOP_ENVIRONMENT.deref();
match de {
Some(LinuxDesktopEnvironment::Gnome) => gnome::apply_config(config),
Some(LinuxDesktopEnvironment::Kde) => kde::apply_config(config),
Some(LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: _,
}) => {
return Err("file search is not supported on this desktop environment".into());
}
None => {
return Err("could not determine Linux desktop environment".into());
}
}
}

View File

@@ -0,0 +1,190 @@
use super::super::EXTENSION_ID;
use super::super::config::FileSearchConfig;
use super::super::config::SearchBy;
use super::should_be_filtered_out;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::sync_get_file_icon;
use futures::stream::Stream;
use futures::stream::StreamExt;
use std::os::fd::OwnedFd;
use std::path::Path;
use tokio::io::AsyncBufReadExt;
use tokio::io::BufReader;
use tokio::process::Child;
use tokio::process::Command;
use tokio_stream::wrappers::LinesStream;
/// `mdfind` won't return scores, so we use this score for all the documents.
const SCORE: f64 = 1.0;
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let (mut iter, _mdfind_child_process) =
execute_mdfind_query(&query_string, from, size, &config)?;
// Convert results to documents
let mut hits: Vec<(Document, f64)> = Vec::new();
while let Some(res_file_path) = iter.next().await {
let file_path = res_file_path.map_err(|io_err| io_err.to_string())?;
let icon = sync_get_file_icon(&file_path);
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.clone(),
};
let doc = Document {
id: file_path.clone(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path),
icon: Some(icon.to_string()),
..Default::default()
};
hits.push((doc, SCORE));
}
Ok(hits)
}
/// Return an array containing the `mdfind` command and its arguments.
fn build_mdfind_query(query_string: &str, config: &FileSearchConfig) -> Vec<String> {
let mut args = vec!["mdfind".to_string()];
match config.search_by {
SearchBy::Name => {
// The trailing char 'c' makes the search case-insensitive.
//
// According to [1], we should use this syntax "kMDItemFSName ==[c] '*{}*'",
// but it does not work on my machine (macOS 26 beta 7), and you
// can find similar complaints as well [2].
//
// [1]: https://developer.apple.com/library/archive/documentation/Carbon/Conceptual/SpotlightQuery/Concepts/QueryFormat.html
// [2]: https://apple.stackexchange.com/q/263671/394687
args.push(format!("kMDItemFSName == '*{}*'c", query_string));
}
SearchBy::NameAndContents => {
// Do not specify any File System Metadata Attribute Keys so that all of
// them are searched; the search is case-insensitive by default.
//
// Previously, we used:
//
// "kMDItemFSName == '*{}*' || kMDItemTextContent == '{}'"
//
// But the kMDItemTextContent attribute does not work as expected.
// For example, if a PDF document contains both "Waterloo" and
// "waterloo", it is only matched by "Waterloo".
args.push(query_string.to_string());
}
}
// Add search paths using -onlyin
for path in &config.search_paths {
if Path::new(path).exists() {
args.extend_from_slice(&["-onlyin".to_string(), path.to_string()]);
}
}
args
}
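// A small sketch of the argument vector for a name-only search; the search-path
// list is left empty here so the existence check on `-onlyin` paths does not
// affect the result. The query string is made up for illustration.
#[cfg(test)]
mod build_query_example {
    use super::build_mdfind_query;
    use crate::extension::built_in::file_search::config::{FileSearchConfig, SearchBy};

    #[test]
    fn name_only_query_args() {
        let config = FileSearchConfig {
            search_paths: Vec::new(),
            exclude_paths: Vec::new(),
            file_types: Vec::new(),
            search_by: SearchBy::Name,
        };
        let args = build_mdfind_query("report", &config);
        assert_eq!(
            args,
            vec![
                "mdfind".to_string(),
                "kMDItemFSName == '*report*'c".to_string(),
            ]
        );
    }
}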
/// Spawn the `mdfind` child process and return an async iterator over its output,
/// allowing us to collect the results asynchronously.
///
/// # Return value:
///
/// * impl Stream: an async iterator that will yield the matched files
/// * Child: The handle to the mdfind process. The child process will be killed
/// when this handle gets dropped, so we need to keep it alive until we exhaust
/// all the query results.
fn execute_mdfind_query(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<(impl Stream<Item = std::io::Result<String>>, Child), String> {
let args = build_mdfind_query(query_string, &config);
let (rx, tx) = std::io::pipe().unwrap();
let rx_owned = OwnedFd::from(rx);
let async_rx = tokio::net::unix::pipe::Receiver::from_owned_fd(rx_owned).unwrap();
let buffered_rx = BufReader::new(async_rx);
let lines = LinesStream::new(buffered_rx.lines());
let child = Command::new(&args[0])
.args(&args[1..])
.stdout(tx)
.stderr(std::process::Stdio::null())
.kill_on_drop(true)
.spawn()
.map_err(|e| format!("Failed to spawn mdfind: {}", e))?;
let config_clone = config.clone();
let iter = lines
.filter(move |res_path| {
std::future::ready({
match res_path {
Ok(path) => !should_be_filtered_out(&config_clone, path, false, true, true),
Err(_) => {
// Don't filter out Err() values
true
}
}
})
})
.skip(from)
.take(size);
Ok((iter, child))
}
pub(crate) fn apply_config(_: &FileSearchConfig) -> Result<(), String> {
// By default, macOS indexes all the files within a volume if indexing is
// enabled. So, to ensure our search paths are indexed by Spotlight,
// theoretically, we can do the following things:
//
// 1. Ensure indexing is enabled on the volumes where our search paths reside.
// However, we cannot do this as doing so requires `sudo`.
//
// 2. Ensure the search paths are not excluded from indexing scope. Users can
// stop Spotlight from indexing a directory by:
// 1. adding it to the "Privacy" list in 'System Settings'. Coco cannot
// modify this list, since the only way to change it is manually
// through System Settings.
// 2. Renaming directory name, adding a `.noindex` file extension to it.
// I don't want to use this trick, users won't feel comfortable and it
// could break at any time.
// 3. Creating a `.metadata_never_index` file within the directory (no longer works
// since macOS Mojave)
//
// There is nothing we can do.
Ok(())
}

View File

@@ -0,0 +1,396 @@
use cfg_if::cfg_if;
// * hits: the implementation of search
//
// * apply_config: routines that should be performed to keep "other things"
// in sync with the passed configuration.
// Currently, "other things" only include the system indexer's settings.
cfg_if! {
if #[cfg(target_os = "linux")] {
mod linux;
pub(crate) use linux::hits;
pub(crate) use linux::apply_config;
} else if #[cfg(target_os = "macos")] {
mod macos;
pub(crate) use macos::hits;
pub(crate) use macos::apply_config;
} else if #[cfg(target_os = "windows")] {
mod windows;
pub(crate) use windows::hits;
pub(crate) use windows::apply_config;
}
}
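// Sketch of the contract each platform module is expected to expose; the
// signatures below are inferred from the platform implementations and their
// call sites; they are not a trait defined by this crate.
//
//   pub(crate) async fn hits(
//       query_string: &str,
//       from: usize,
//       size: usize,
//       config: &FileSearchConfig,
//   ) -> Result<Vec<(Document, f64)>, String>;
//
//   pub(crate) fn apply_config(config: &FileSearchConfig) -> Result<(), String>;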
cfg_if! {
if #[cfg(not(target_os = "windows"))] {
use super::config::FileSearchConfig;
use camino::Utf8Path;
}
}
/// If `file_path` should be removed from the search results given the filter
/// conditions specified in `config`.
#[cfg(not(target_os = "windows"))] // Not used on Windows
pub(crate) fn should_be_filtered_out(
config: &FileSearchConfig,
file_path: &str,
check_search_paths: bool,
check_exclude_paths: bool,
check_file_type: bool,
) -> bool {
let file_path = Utf8Path::new(file_path);
if check_search_paths {
// search path
let in_search_paths = config.search_paths.iter().any(|search_path| {
let search_path = Utf8Path::new(search_path);
file_path.starts_with(search_path)
});
if !in_search_paths {
return true;
}
}
if check_exclude_paths {
// exclude path
let is_excluded = config
.exclude_paths
.iter()
.any(|exclude_path| file_path.starts_with(exclude_path));
if is_excluded {
return true;
}
}
if check_file_type {
// file type
let matches_file_type = if config.file_types.is_empty() {
true
} else {
let path_obj = camino::Utf8Path::new(&file_path);
if let Some(extension) = path_obj.extension() {
config
.file_types
.iter()
.any(|file_type| file_type == extension)
} else {
// Since `config.file_types` is not empty, matching files must have extensions.
false
}
};
if !matches_file_type {
return true;
}
}
false
}
// should_be_filtered_out() is not defined for Windows
#[cfg(all(test, not(target_os = "windows")))]
mod tests {
use super::super::config::SearchBy;
use super::*;
#[test]
fn test_should_be_filtered_out_with_no_check() {
let config = FileSearchConfig {
search_paths: vec!["/home/user/Documents".to_string()],
exclude_paths: vec![],
file_types: vec!["fffffff".into()],
search_by: SearchBy::Name,
};
assert!(!should_be_filtered_out(
&config, "abbc", false, false, false
));
}
#[test]
fn test_should_be_filtered_out_search_paths() {
let config = FileSearchConfig {
search_paths: vec![
"/home/user/Documents".to_string(),
"/home/user/Downloads".to_string(),
],
exclude_paths: vec![],
file_types: vec![],
search_by: SearchBy::Name,
};
// Files in search paths should not be filtered
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/file.txt",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Downloads/image.jpg",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/folder/file.txt",
true,
true,
true
));
// Files not in search paths should be filtered
assert!(should_be_filtered_out(
&config,
"/home/user/Pictures/photo.jpg",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/tmp/tempfile",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/usr/bin/ls",
true,
true,
true
));
}
#[test]
fn test_should_be_filtered_out_exclude_paths() {
let config = FileSearchConfig {
search_paths: vec!["/home/user".to_string()],
exclude_paths: vec![
"/home/user/Trash".to_string(),
"/home/user/.cache".to_string(),
],
file_types: vec![],
search_by: SearchBy::Name,
};
// Files in search paths but not excluded should not be filtered
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/file.txt",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Downloads/image.jpg",
true,
true,
true
));
// Files in excluded paths should be filtered
assert!(should_be_filtered_out(
&config,
"/home/user/Trash/deleted_file",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/home/user/.cache/temp",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/home/user/Trash/folder/file.txt",
true,
true,
true
));
}
#[test]
fn test_should_be_filtered_out_file_types() {
let config = FileSearchConfig {
search_paths: vec!["/home/user/Documents".to_string()],
exclude_paths: vec![],
file_types: vec!["txt".to_string(), "md".to_string()],
search_by: SearchBy::Name,
};
// Files with allowed extensions should not be filtered
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/notes.txt",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/readme.md",
true,
true,
true
));
// Files with disallowed extensions should be filtered
assert!(should_be_filtered_out(
&config,
"/home/user/Documents/image.jpg",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/home/user/Documents/document.pdf",
true,
true,
true
));
// Files without extensions should be filtered when file_types is not empty
assert!(should_be_filtered_out(
&config,
"/home/user/Documents/file",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/home/user/Documents/folder",
true,
true,
true
));
}
#[test]
fn test_should_be_filtered_out_empty_file_types() {
let config = FileSearchConfig {
search_paths: vec!["/home/user/Documents".to_string()],
exclude_paths: vec![],
file_types: vec![],
search_by: SearchBy::Name,
};
// When file_types is empty, all file types should be allowed
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/file.txt",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/image.jpg",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/document",
true,
true,
true
));
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/folder/",
true,
true,
true
));
}
#[test]
fn test_should_be_filtered_out_combined_filters() {
let config = FileSearchConfig {
search_paths: vec!["/home/user".to_string()],
exclude_paths: vec!["/home/user/Trash".to_string()],
file_types: vec!["txt".to_string()],
search_by: SearchBy::Name,
};
// Should pass all filters: in search path, not excluded, and correct file type
assert!(!should_be_filtered_out(
&config,
"/home/user/Documents/notes.txt",
true,
true,
true
));
// Fails file type filter
assert!(should_be_filtered_out(
&config,
"/home/user/Documents/image.jpg",
true,
true,
true
));
// Fails exclude path filter
assert!(should_be_filtered_out(
&config,
"/home/user/Trash/deleted.txt",
true,
true,
true
));
// Fails search path filter
assert!(should_be_filtered_out(
&config,
"/tmp/temp.txt",
true,
true,
true
));
}
#[test]
fn test_should_be_filtered_out_edge_cases() {
let config = FileSearchConfig {
search_paths: vec!["/home/user".to_string()],
exclude_paths: vec![],
file_types: vec!["txt".to_string()],
search_by: SearchBy::Name,
};
// Empty path
assert!(should_be_filtered_out(&config, "", true, true, true));
// Root path
assert!(should_be_filtered_out(&config, "/", true, true, true));
// Path that starts with search path but continues differently
assert!(!should_be_filtered_out(
&config,
"/home/user/document.txt",
true,
true,
true
));
assert!(should_be_filtered_out(
&config,
"/home/user_other/file.txt",
true,
true,
true
));
}
}

View File

@@ -0,0 +1,234 @@
//! Wraps Windows `ISearchCrawlScopeManager`
mod searchapi_h_bindings;
use searchapi_h_bindings::CLSID_CSEARCH_MANAGER;
use searchapi_h_bindings::IID_ISEARCH_MANAGER;
use searchapi_h_bindings::{
HRESULT, ISearchCatalogManager, ISearchCatalogManagerVtbl, ISearchCrawlScopeManager,
ISearchCrawlScopeManagerVtbl, ISearchManager,
};
use std::ffi::OsStr;
use std::ffi::OsString;
use std::os::windows::ffi::OsStrExt;
use std::path::Path;
use std::path::PathBuf;
use std::ptr::null_mut;
use windows::core::w;
use windows_sys::Win32::Foundation::S_OK;
use windows_sys::Win32::System::Com::{
CLSCTX_LOCAL_SERVER, COINIT_APARTMENTTHREADED, CoCreateInstance, CoInitializeEx, CoUninitialize,
};
#[derive(Debug, thiserror::Error)]
#[error("{msg}, function [{function}], HRESULT [{hresult}]")]
pub(crate) struct WindowSearchApiError {
function: &'static str,
hresult: HRESULT,
msg: String,
}
/// See doc of [`Rule`].
#[derive(Debug, PartialEq)]
pub(crate) enum RuleMode {
Inclusion,
Exclusion,
}
/// A rule adds or removes one or more paths to/from the Windows Search index.
#[derive(Debug)]
pub(crate) struct Rule {
/// A path or path pattern (wildcard supported, only for exclusion rule) that
/// specifies the paths that this rule applies to.
///
/// The rules used by Windows Search actually specify URLs rather than paths,
/// but we only care about paths, i.e., URLs with the scheme `file://`.
pub(crate) paths: PathBuf,
/// Add or remove paths to/from the index.
pub(crate) mode: RuleMode,
}
/// A wrapper around Windows' `ISearchCrawlScopeManager` type
pub(crate) struct CrawlScopeManager {
i_search_crawl_scope_manager: *mut ISearchCrawlScopeManager,
}
impl CrawlScopeManager {
fn vtable(&self) -> *mut ISearchCrawlScopeManagerVtbl {
unsafe { (*self.i_search_crawl_scope_manager).lpVtbl }
}
pub(crate) fn new() -> Result<Self, WindowSearchApiError> {
unsafe {
// 1. Initialize the COM library; use apartment threading, as `Self` is not Send/Sync.
let hr = CoInitializeEx(null_mut(), COINIT_APARTMENTTHREADED as u32);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "CoInitializeEx()",
hresult: hr,
msg: "failed to initialize the COM library".into(),
});
}
// 2. Create an instance of the CSearchManager.
let mut search_manager: *mut ISearchManager = null_mut();
let hr = CoCreateInstance(
&CLSID_CSEARCH_MANAGER, // CLSID of the object
null_mut(), // No outer unknown
CLSCTX_LOCAL_SERVER, // Server context
&IID_ISEARCH_MANAGER, // IID of the interface we want
&mut search_manager as *mut _ as *mut _, // Pointer to receive the interface
);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "CoCreateInstance()",
hresult: hr,
msg: "failed to initialize ISearchManager".into(),
});
}
assert!(!search_manager.is_null());
let search_manager_vtable = (*search_manager).lpVtbl;
let search_manager_fn_get_catalog = (*search_manager_vtable).GetCatalog.unwrap();
let mut search_catalog_manager: *mut ISearchCatalogManager = null_mut();
let string_literal_system_index = w!("SystemIndex");
let hr: HRESULT = search_manager_fn_get_catalog(
search_manager,
string_literal_system_index.0,
&mut search_catalog_manager as *mut *mut ISearchCatalogManager,
);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "ISearchManager::GetCatalog()",
hresult: hr,
msg: "failed to initialize ISearchCatalogManager".into(),
});
}
assert!(!search_catalog_manager.is_null());
let search_catalog_manager_vtable: *mut ISearchCatalogManagerVtbl =
(*search_catalog_manager).lpVtbl;
let fn_get_crawl_scope_manager = (*search_catalog_manager_vtable)
.GetCrawlScopeManager
.unwrap();
let mut search_crawl_scope_manager: *mut ISearchCrawlScopeManager = null_mut();
let hr =
fn_get_crawl_scope_manager(search_catalog_manager, &mut search_crawl_scope_manager);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "ISearchCatalogManager::GetCrawlScopeManager()",
hresult: hr,
msg: "failed to initialize ISearchCrawlScopeManager".into(),
});
}
assert!(!search_crawl_scope_manager.is_null());
Ok(Self {
i_search_crawl_scope_manager: search_crawl_scope_manager,
})
}
}
/// Does nothing until the changes are committed via [`Self::commit()`].
pub(crate) fn add_rule(&mut self, rule: Rule) -> Result<(), WindowSearchApiError> {
unsafe {
let vtable = self.vtable();
let fn_add_rule = (*vtable).AddUserScopeRule.unwrap();
let url: Vec<u16> = encode_path(&rule.paths);
let inclusion = (rule.mode == RuleMode::Inclusion) as i32;
let override_child_rules = true as i32;
let follow_flag = 0x1_u32; /* FF_INDEXCOMPLEXURLS */
let hr = fn_add_rule(
self.i_search_crawl_scope_manager,
url.as_ptr(),
inclusion,
override_child_rules,
follow_flag,
);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "ISearchCrawlScopeManager::AddUserScopeRule()",
hresult: hr,
msg: "failed to add scope rule".into(),
});
}
Ok(())
}
}
pub(crate) fn is_path_included<P: AsRef<Path> + ?Sized>(
&self,
path: &P,
) -> Result<bool, WindowSearchApiError> {
unsafe {
let vtable = self.vtable();
let fn_included_in_crawl_scope = (*vtable).IncludedInCrawlScope.unwrap();
let path: Vec<u16> = encode_path(path);
let mut included: i32 = 0 /* false */;
let hr = fn_included_in_crawl_scope(
self.i_search_crawl_scope_manager,
path.as_ptr(),
&mut included,
);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "ISearchCrawlScopeManager::IncludedInCrawlScope()",
hresult: hr,
msg: "failed to call IncludedInCrawlScope()".into(),
});
}
Ok(included == 1)
}
}
pub(crate) fn commit(&self) -> Result<(), WindowSearchApiError> {
unsafe {
let vtable = self.vtable();
let fn_commit = (*vtable).SaveAll.unwrap();
let hr = fn_commit(self.i_search_crawl_scope_manager);
if hr != S_OK {
return Err(WindowSearchApiError {
function: "ISearchCrawlScopeManager::SaveAll()",
hresult: hr,
msg: "failed to commit the changes".into(),
});
}
Ok(())
}
}
}
impl Drop for CrawlScopeManager {
fn drop(&mut self) {
unsafe {
CoUninitialize();
}
}
}
fn encode_path<P: AsRef<Path> + ?Sized>(path: &P) -> Vec<u16> {
let mut buffer = OsString::new();
// schema
buffer.push("file:///");
buffer.push(path.as_ref().as_os_str());
osstr_to_wstr(&buffer)
}
fn osstr_to_wstr<S: AsRef<OsStr> + ?Sized>(str: &S) -> Vec<u16> {
let os_str: &OsStr = str.as_ref();
let mut chars = os_str.encode_wide().collect::<Vec<u16>>();
chars.push(0 /* NUL */);
chars
}
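
To make the intended call sequence concrete, here is a minimal usage sketch of the wrapper above (a hypothetical caller, mirroring what `apply_config()` does later in this change): create the manager, add a rule, and commit, since nothing takes effect before `commit()`.

// Hypothetical example: include one folder in the Windows Search index.
fn include_in_index(path: &str) -> Result<(), String> {
    let mut manager = CrawlScopeManager::new().map_err(|e| e.to_string())?;
    manager
        .add_rule(Rule {
            paths: std::path::PathBuf::from(path),
            mode: RuleMode::Inclusion,
        })
        .map_err(|e| e.to_string())?;
    // The rule is only persisted once the changes are committed.
    manager.commit().map_err(|e| e.to_string())
}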

View File

@@ -0,0 +1,30 @@
//! Rust binding of the types and functions declared in 'searchapi.h'
#![allow(unused)]
#![allow(non_camel_case_types)]
#![allow(non_snake_case)]
#![allow(non_upper_case_globals)]
#![allow(unsafe_op_in_unsafe_fn)]
#![allow(unnecessary_transmutes)]
include!(concat!(env!("OUT_DIR"), "/searchapi_bindings.rs"));
// The generated bindings contain a GUID type as well, but we use the one
// provided by the `windows_sys` crate here.
use windows_sys::core::GUID as WIN_SYS_GUID;
// https://github.com/search?q=CLSID_CSearchManager+language%3AC&type=code&l=C
pub(crate) static CLSID_CSEARCH_MANAGER: WIN_SYS_GUID = WIN_SYS_GUID {
data1: 0x7d096c5f,
data2: 0xac08,
data3: 0x4f1f,
data4: [0xbe, 0xb7, 0x5c, 0x22, 0xc5, 0x17, 0xce, 0x39],
};
// https://github.com/search?q=IID_ISearchManager+language%3AC&type=code
pub(crate) static IID_ISEARCH_MANAGER: WIN_SYS_GUID = WIN_SYS_GUID {
data1: 0xAB310581,
data2: 0xac80,
data3: 0x11d1,
data4: [0x8d, 0xf3, 0x00, 0xc0, 0x4f, 0xb6, 0xef, 0x69],
};

View File

@@ -0,0 +1,834 @@
//! # Credits
//!
//! https://github.com/IRONAGE-Park/rag-sample/blob/3f0ad8c8012026cd3a7e453d08f041609426cb91/src/native/windows.rs
//! is the starting point of this implementation.
mod crawl_scope_manager;
use super::super::EXTENSION_ID;
use super::super::config::FileSearchConfig;
use super::super::config::SearchBy;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::sync_get_file_icon;
use std::borrow::Borrow;
use std::path::PathBuf;
use windows::{
Win32::System::{
Com::{CLSCTX_INPROC_SERVER, CoCreateInstance},
Ole::{OleInitialize, OleUninitialize},
Search::{
DB_NULL_HCHAPTER, DBACCESSOR_ROWDATA, DBBINDING, DBMEMOWNER_CLIENTOWNED,
DBPARAMIO_NOTPARAM, DBPART_VALUE, DBTYPE_WSTR, HACCESSOR, IAccessor, ICommand,
ICommandText, IDBCreateCommand, IDBCreateSession, IDBInitialize, IDataInitialize,
IRowset, MSDAINITIALIZE,
},
},
core::{GUID, IUnknown, Interface, PWSTR, w},
};
/// Owned version of `PWSTR` that holds the heap memory.
///
/// Use `as_pwstr()` to convert it to a raw pointer.
struct PwStrOwned(Vec<u16>);
impl PwStrOwned {
/// # SAFETY
///
/// The returned `PWSTR` is essentially a raw pointer; it is only valid within
/// the lifetime of the `PwStrOwned` it was created from.
unsafe fn as_pwstr(&mut self) -> PWSTR {
let raw_ptr = self.0.as_mut_ptr();
PWSTR::from_raw(raw_ptr)
}
}
/// Construct `PwStrOwned` from any `str`.
impl<S: AsRef<str> + ?Sized> From<&S> for PwStrOwned {
fn from(value: &S) -> Self {
let mut utf16_bytes = value.as_ref().encode_utf16().collect::<Vec<u16>>();
utf16_bytes.push(0); // the trailing NUL terminator
PwStrOwned(utf16_bytes)
}
}
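
A short sketch of the ownership contract described above (illustrative only; the SQL text and the COM call mentioned in the comments are placeholders):

// The owner must stay alive for as long as the raw pointer is in use.
let mut owned = PwStrOwned::from("SELECT TOP 5 System.ItemUrl FROM SystemIndex");
// SAFETY: `owned` outlives every use of `raw` below.
let raw: PWSTR = unsafe { owned.as_pwstr() };
// ... hand `raw` to a COM call such as `ICommandText::SetCommandText()` ...
// Only after that call returns may `owned` be dropped.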
/// Helper function to replace unsupported characters with whitespace.
///
/// Windows Search will error out if it encounters these characters.
///
/// The complete list of unsupported characters is unknown, and we don't know how
/// to escape them, so we simply replace them.
fn query_string_cleanup(old: &str) -> String {
const UNSUPPORTED_CHAR: [char; 2] = ['\'', '\n'];
// Using len in bytes is ok
let mut chars = Vec::with_capacity(old.len());
for char in old.chars() {
if UNSUPPORTED_CHAR.contains(&char) {
chars.push(' ');
} else {
chars.push(char);
}
}
chars.into_iter().collect()
}
/// Helper function to construct the Windows Search SQL.
///
/// Paging is not natively supported by Windows Search SQL; it only supports `size`
/// via the `TOP` keyword ("SELECT TOP {n} {columns}"). The SQL returned by this
/// function sets `{n}` to `from + size`, and paging is then implemented manually by
/// skipping the first `from` rows. For example, `from = 10` and `size = 25` yield
/// "SELECT TOP 35 ...", and the first 10 hits are dropped later in `hits()`.
fn query_sql(query_string: &str, from: usize, size: usize, config: &FileSearchConfig) -> String {
let top_n = from
.checked_add(size)
.expect("[from + size] cannot fit into an [usize]");
// System.ItemUrl is a column that contains the file path
// example: "file:C:/Users/desktop.ini"
//
// System.Search.Rank is the relevance score
let mut sql = format!(
"SELECT TOP {} System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE",
top_n
);
let query_string = query_string_cleanup(query_string);
let search_by_predicate = match config.search_by {
SearchBy::Name => {
// `contains(System.FileName, '{query_string}')` would be faster
// because it uses the inverted index, but that's not what we want
// due to the limitation of tokenization. For example, suppose "Coco AI.rs"
// is tokenized to `["Coco", "AI", "rs"]`; if users then search
// for `Co`, this file won't be returned because the term `Co` does not
// exist in the index.
//
// So we use a wildcard instead, even though it is slower.
format!("(System.FileName LIKE '%{query_string}%')")
}
SearchBy::NameAndContents => {
// Windows File Search does not support searching by file content.
//
// `CONTAINS('query_string')` would search all columns for `query_string`,
// this is the closest solution we have.
format!("((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))")
}
};
let search_paths_predicate: Option<String> = {
if config.search_paths.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, search_path) in config.search_paths.iter().enumerate() {
if idx != 0 {
output.push_str(" OR ");
}
output.push_str("SCOPE = 'file:");
output.push_str(&search_path);
output.push('\'');
}
output.push(')');
Some(output)
}
};
let exclude_paths_predicate: Option<String> = {
if config.exclude_paths.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, exclude_path) in config.exclude_paths.iter().enumerate() {
if idx != 0 {
output.push_str(" AND ");
}
output.push_str("(NOT SCOPE = 'file:");
output.push_str(&exclude_path);
output.push('\'');
output.push(')');
}
output.push(')');
Some(output)
}
};
let file_types_predicate: Option<String> = {
if config.file_types.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, file_type) in config.file_types.iter().enumerate() {
if idx != 0 {
output.push_str(" OR ");
}
// NOTE that this column contains a starting dot
output.push_str("System.FileExtension = '.");
output.push_str(&file_type);
output.push('\'');
}
output.push(')');
Some(output)
}
};
sql.push(' ');
sql.push_str(search_by_predicate.as_str());
if let Some(search_paths_predicate) = search_paths_predicate {
sql.push_str(" AND ");
sql.push_str(search_paths_predicate.as_str());
}
if let Some(exclude_paths_predicate) = exclude_paths_predicate {
sql.push_str(" AND ");
sql.push_str(exclude_paths_predicate.as_str());
}
if let Some(file_types_predicate) = file_types_predicate {
sql.push_str(" AND ");
sql.push_str(file_types_predicate.as_str());
}
sql
}
/// Default GUID for Search.CollatorDSO.1
const DBGUID_DEFAULT: GUID = GUID {
data1: 0xc8b521fb,
data2: 0x5cf3,
data3: 0x11ce,
data4: [0xad, 0xe5, 0x00, 0xaa, 0x00, 0x44, 0x77, 0x3d],
};
unsafe fn create_accessor_handle(accessor: &IAccessor, index: usize) -> Result<HACCESSOR, String> {
let bindings = DBBINDING {
iOrdinal: index,
obValue: 0,
obStatus: 0,
obLength: 0,
dwPart: DBPART_VALUE.0 as u32,
dwMemOwner: DBMEMOWNER_CLIENTOWNED.0 as u32,
eParamIO: DBPARAMIO_NOTPARAM.0 as u32,
cbMaxLen: 512,
dwFlags: 0,
wType: DBTYPE_WSTR.0 as u16,
bPrecision: 0,
bScale: 0,
..Default::default()
};
let mut status = 0;
let mut accessor_handle = HACCESSOR::default();
unsafe {
accessor
.CreateAccessor(
DBACCESSOR_ROWDATA.0 as u32,
1,
&bindings,
0,
&mut accessor_handle,
Some(&mut status),
)
.map_err(|e| e.to_string())?;
}
Ok(accessor_handle)
}
fn create_db_initialize() -> Result<IDBInitialize, String> {
unsafe {
let data_init: IDataInitialize =
CoCreateInstance(&MSDAINITIALIZE, None, CLSCTX_INPROC_SERVER)
.map_err(|e| e.to_string())?;
let mut unknown: Option<IUnknown> = None;
data_init
.GetDataSource(
None,
CLSCTX_INPROC_SERVER.0,
w!("provider=Search.CollatorDSO.1;EXTENDED PROPERTIES=\"Application=Windows\""),
&IDBInitialize::IID,
&mut unknown as *mut _ as *mut _,
)
.map_err(|e| e.to_string())?;
Ok(unknown.unwrap().cast().map_err(|e| e.to_string())?)
}
}
fn create_command(db_init: IDBInitialize) -> Result<ICommandText, String> {
unsafe {
let db_create_session: IDBCreateSession = db_init.cast().map_err(|e| e.to_string())?;
let session: IUnknown = db_create_session
.CreateSession(None, &IUnknown::IID)
.map_err(|e| e.to_string())?;
let db_create_command: IDBCreateCommand = session.cast().map_err(|e| e.to_string())?;
Ok(db_create_command
.CreateCommand(None, &ICommand::IID)
.map_err(|e| e.to_string())?
.cast()
.map_err(|e| e.to_string())?)
}
}
fn execute_windows_search_sql(sql_query: &str) -> Result<Vec<(String, String)>, String> {
unsafe {
let mut pwstr_owned_sql = PwStrOwned::from(sql_query);
// SAFETY: pwstr_owned_sql will live for the whole lifetime of this function.
let sql_query = pwstr_owned_sql.as_pwstr();
let db_init = create_db_initialize()?;
db_init.Initialize().map_err(|e| e.to_string())?;
let command = create_command(db_init)?;
// Set the command text
command
.SetCommandText(&DBGUID_DEFAULT, sql_query)
.map_err(|e| e.to_string())?;
// Execute the command
let mut rowset: Option<IRowset> = None;
command
.Execute(
None,
&IRowset::IID,
None,
None,
Some(&mut rowset as *mut _ as *mut _),
)
.map_err(|e| e.to_string())?;
let rowset = rowset.ok_or_else(|| {
format!(
"No rowset returned for query: {}",
// SAFETY: the raw pointer is not dangling
sql_query
.to_string()
.expect("the conversion should work as `sql_query` was created from a String",)
)
})?;
let accessor: IAccessor = rowset
.cast()
.map_err(|e| format!("Failed to cast to IAccessor: {}", e.to_string()))?;
let mut output = Vec::new();
let mut count = 0;
loop {
let mut rows_fetched = 0;
let mut row_handles = [std::ptr::null_mut(); 1];
let result = rowset.GetNextRows(
DB_NULL_HCHAPTER as usize,
0,
&mut rows_fetched,
&mut row_handles,
);
if result.is_err() {
break;
}
if rows_fetched == 0 {
break;
}
let mut data = Vec::new();
for i in 0..2 {
let mut item_name = [0u16; 512];
let accessor_handle = create_accessor_handle(&accessor, i + 1)?;
rowset
.GetData(
*row_handles[0],
accessor_handle,
item_name.as_mut_ptr() as *mut _,
)
.map_err(|e| {
format!(
"Failed to get data at count {}, index {}: {}",
count,
i,
e.to_string()
)
})?;
let name = String::from_utf16_lossy(&item_name);
// Remove null characters
data.push(name.trim_end_matches('\u{0000}').to_string());
accessor
.ReleaseAccessor(accessor_handle, None)
.map_err(|e| {
format!(
"Failed to release accessor at count {}, index {}: {}",
count,
i,
e.to_string()
)
})?;
}
output.push((data[0].clone(), data[1].clone()));
count += 1;
rowset
.ReleaseRows(
1,
row_handles[0],
std::ptr::null_mut(),
std::ptr::null_mut(),
std::ptr::null_mut(),
)
.map_err(|e| {
format!(
"Failed to release rows at count {}: {}",
count,
e.to_string()
)
})?;
}
Ok(output)
}
}
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let sql = query_sql(query_string, from, size, config);
unsafe { OleInitialize(None).map_err(|e| e.to_string())? };
let result = execute_windows_search_sql(&sql)?;
unsafe { OleUninitialize() };
// .take(size) is not needed as `result` will contain `from+size` files at most
let result_with_paging = result.into_iter().skip(from);
// result_with_paging won't contain more than `size` entries
let mut hits = Vec::with_capacity(size);
const ITEM_URL_PREFIX: &str = "file:";
const ITEM_URL_PREFIX_LEN: usize = ITEM_URL_PREFIX.len();
for (item_url, score_str) in result_with_paging {
// The path returned from Windows Search contains a prefix; we need to trim it:
//
// "file:C:/Users/desktop.ini" => "C:/Users/desktop.ini"
let file_path = &item_url[ITEM_URL_PREFIX_LEN..];
let icon = sync_get_file_icon(file_path);
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.to_string(),
};
let doc = Document {
id: file_path.to_string(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path.into()),
icon: Some(icon.to_string()),
..Default::default()
};
let score: f64 = score_str.parse().expect(
"System.Search.Rank should be in range [0, 1000], which should be valid for [f64]",
);
hits.push((doc, score));
}
Ok(hits)
}
pub(crate) fn apply_config(config: &FileSearchConfig) -> Result<(), String> {
// To ensure the Windows Search indexer indexes the paths specified in the
// config, we will:
//
// 1. Add an inclusion rule for every search path so that the indexer
//    indexes them.
// 2. For the exclude paths, exclude them from the crawl scope only if they
//    were not already included in the scope before we update it. Otherwise,
//    we cannot exclude them, as doing so could break other apps (by
//    removing the index entries they rely on).
//
// The Windows APIs are smart about this: they won't blindly add an inclusion
// rule if the path is already included, and the same applies to exclusion
// rules. Since the APIs handle these checks for us, we don't need to worry
// about them.
use crawl_scope_manager::CrawlScopeManager;
use crawl_scope_manager::Rule;
use crawl_scope_manager::RuleMode;
use std::borrow::Cow;
/// Windows APIs need the path to end with a trailing '\'
fn add_tailing_backslash(path: &str) -> Cow<'_, str> {
if path.ends_with(r#"\"#) {
Cow::Borrowed(path)
} else {
let mut owned = path.to_string();
owned.push_str(r#"\"#);
Cow::Owned(owned)
}
}
let mut manager = CrawlScopeManager::new().map_err(|e| e.to_string())?;
let search_paths = &config.search_paths;
let exclude_paths = &config.exclude_paths;
// Indexes into `exclude_paths` of the paths we need to exclude
let mut paths_to_exclude: Vec<usize> = Vec::new();
for (idx, exclude_path) in exclude_paths.into_iter().enumerate() {
let exclude_path = add_tailing_backslash(&exclude_path);
let exclude_path: &str = exclude_path.borrow();
if !manager
.is_path_included(exclude_path)
.map_err(|e| e.to_string())?
{
paths_to_exclude.push(idx);
}
}
for search_path in search_paths {
let inclusion_rule = Rule {
paths: PathBuf::from(add_tailing_backslash(&search_path).into_owned()),
mode: RuleMode::Inclusion,
};
manager
.add_rule(inclusion_rule)
.map_err(|e| e.to_string())?;
}
for idx in paths_to_exclude {
let exclusion_rule = Rule {
paths: PathBuf::from(add_tailing_backslash(&exclude_paths[idx]).into_owned()),
mode: RuleMode::Exclusion,
};
manager
.add_rule(exclusion_rule)
.map_err(|e| e.to_string())?;
}
manager.commit().map_err(|e| e.to_string())?;
Ok(())
}
// Skip these tests in our CI; they fail with the following error:
// "SQL is invalid: "0x80041820""
//
// The underlying root cause is unknown.
#[cfg(all(test, not(ci)))]
mod test_windows_search {
use super::*;
/// Helper function for ensuring `sql` is valid SQL by actually executing it.
fn ensure_it_is_valid_sql(sql: &str) {
unsafe { OleInitialize(None).unwrap() };
execute_windows_search_sql(&sql).expect("SQL is invalid");
unsafe { OleUninitialize() };
}
#[test]
fn test_query_sql_empty_config_search_by_name() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%coco%')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_empty_config_search_by_name_and_content() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::NameAndContents,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE ((System.FileName LIKE '%coco%') OR CONTAINS('coco'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_search_paths() {
let config = FileSearchConfig {
search_paths: vec!["C:/Users/".into()],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%coco%') AND (SCOPE = 'file:C:/Users/')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_search_paths() {
let config = FileSearchConfig {
search_paths: vec![
"C:/Users/".into(),
"D:/Projects/".into(),
"E:/Documents/".into(),
],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("test", 0, 5, &config);
assert_eq!(
sql,
"SELECT TOP 5 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%test%') AND (SCOPE = 'file:C:/Users/' OR SCOPE = 'file:D:/Projects/' OR SCOPE = 'file:E:/Documents/')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_exclude_paths() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: vec!["C:/Windows/".into()],
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("file", 0, 20, &config);
assert_eq!(
sql,
"SELECT TOP 20 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%file%') AND ((NOT SCOPE = 'file:C:/Windows/'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_exclude_paths() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: vec!["C:/Windows/".into(), "C:/System/".into(), "C:/Temp/".into()],
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("data", 5, 15, &config);
assert_eq!(
sql,
"SELECT TOP 20 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%data%') AND ((NOT SCOPE = 'file:C:/Windows/') AND (NOT SCOPE = 'file:C:/System/') AND (NOT SCOPE = 'file:C:/Temp/'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_file_types() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: vec!["txt".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("readme", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%readme%') AND (System.FileExtension = '.txt')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_file_types() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: vec!["rs".into(), "toml".into(), "md".into(), "json".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("config", 0, 50, &config);
assert_eq!(
sql,
"SELECT TOP 50 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%config%') AND (System.FileExtension = '.rs' OR System.FileExtension = '.toml' OR System.FileExtension = '.md' OR System.FileExtension = '.json')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_all_fields_combined() {
let config = FileSearchConfig {
search_paths: vec!["C:/Projects/".into(), "D:/Code/".into()],
exclude_paths: vec!["C:/Projects/temp/".into()],
file_types: vec!["rs".into(), "ts".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("main", 10, 25, &config);
assert_eq!(
sql,
"SELECT TOP 35 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%main%') AND (SCOPE = 'file:C:/Projects/' OR SCOPE = 'file:D:/Code/') AND ((NOT SCOPE = 'file:C:/Projects/temp/')) AND (System.FileExtension = '.rs' OR System.FileExtension = '.ts')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_special_characters() {
let config = FileSearchConfig {
search_paths: vec!["C:/Users/John Doe/".into()],
exclude_paths: Vec::new(),
file_types: vec!["c++".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("hello-world", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%hello-world%') AND (SCOPE = 'file:C:/Users/John Doe/') AND (System.FileExtension = '.c++')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_edge_case_large_offset() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("test", 100, 50, &config);
assert_eq!(
sql,
"SELECT TOP 150 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%test%')"
);
ensure_it_is_valid_sql(&sql);
}
}
#[cfg(test)]
mod test {
use super::*;
#[test]
fn test_query_string_cleanup_no_unsupported_chars() {
let input = "hello world";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
#[test]
fn test_query_string_cleanup_single_quote() {
let input = "don't worry";
let result = query_string_cleanup(input);
assert_eq!(result, "don t worry");
}
#[test]
fn test_query_string_cleanup_newline() {
let input = "line1\nline2";
let result = query_string_cleanup(input);
assert_eq!(result, "line1 line2");
}
#[test]
fn test_query_string_cleanup_both_unsupported_chars() {
let input = "don't\nworry";
let result = query_string_cleanup(input);
assert_eq!(result, "don t worry");
}
#[test]
fn test_query_string_cleanup_multiple_single_quotes() {
let input = "it's a 'test' string";
let result = query_string_cleanup(input);
assert_eq!(result, "it s a test string");
}
#[test]
fn test_query_string_cleanup_multiple_newlines() {
let input = "line1\n\nline2\nline3";
let result = query_string_cleanup(input);
assert_eq!(result, "line1 line2 line3");
}
#[test]
fn test_query_string_cleanup_empty_string() {
let input = "";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
#[test]
fn test_query_string_cleanup_only_unsupported_chars() {
let input = "'\n'";
let result = query_string_cleanup(input);
assert_eq!(result, " ");
}
#[test]
fn test_query_string_cleanup_unicode_characters() {
let input = "héllo wörld's\nfile";
let result = query_string_cleanup(input);
assert_eq!(result, "héllo wörld s file");
}
#[test]
fn test_query_string_cleanup_special_chars_preserved() {
let input = "test@file#name$with%symbols";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
}

View File

@@ -0,0 +1,97 @@
pub(crate) mod config;
pub(crate) mod implementation;
use super::super::LOCAL_QUERY_SOURCE_TYPE;
use crate::common::{
error::SearchError,
search::{QueryResponse, QuerySource, SearchQuery},
traits::SearchSource,
};
use async_trait::async_trait;
use config::FileSearchConfig;
use hostname;
use tauri::AppHandle;
pub(crate) const EXTENSION_ID: &str = "File Search";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "File Search",
"name": "File Search",
"platforms": ["macos", "windows", "linux"],
"description": "Search files on your system",
"icon": "font_Filesearch",
"type": "extension"
}
"#;
pub struct FileSearchExtensionSearchSource;
#[async_trait]
impl SearchSource for FileSearchExtensionSearchSource {
fn get_type(&self) -> QuerySource {
QuerySource {
r#type: LOCAL_QUERY_SOURCE_TYPE.into(),
name: hostname::get()
.unwrap_or(EXTENSION_ID.into())
.to_string_lossy()
.into(),
id: EXTENSION_ID.into(),
}
}
async fn search(
&self,
tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
let from = usize::try_from(query.from).expect("from too big");
let size = usize::try_from(query.size).expect("size too big");
let query_string = query_string.trim();
if query_string.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
// Get configuration from tauri store
let config = FileSearchConfig::get(&tauri_app_handle);
// If the search paths are empty, then the hits should be empty.
//
// Without this, empty search paths would result in an mdfind invocation with no
// `-onlyin` option, which would in turn query the whole disk volume.
if config.search_paths.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
// Execute search in a blocking task
let query_source = self.get_type();
let hits = implementation::hits(&query_string, from, size, &config)
.await
.map_err(SearchError::InternalError)?;
let total_hits = hits.len();
Ok(QueryResponse {
source: query_source,
hits,
total_hits,
})
}
}

View File

@@ -3,34 +3,32 @@
pub mod ai_overview;
pub mod application;
pub mod calculator;
pub mod file_system;
pub mod file_search;
pub mod pizza_engine_runtime;
pub mod quick_ai_access;
#[cfg(target_os = "macos")]
pub mod window_management;
use super::Extension;
use crate::SearchSourceRegistry;
use crate::extension::built_in::application::{set_apps_hotkey, unset_apps_hotkey};
use crate::extension::{
alter_extension_json_file, ExtensionBundleIdBorrowed, PLUGIN_JSON_FILE_NAME,
ExtensionBundleIdBorrowed, PLUGIN_JSON_FILE_NAME, alter_extension_json_file,
};
use crate::{SearchSourceRegistry, GLOBAL_TAURI_APP_HANDLE};
use anyhow::Context;
use file_search::config::FileSearchConfig;
use file_search::implementation::apply_config as file_search_apply_config;
use std::path::{Path, PathBuf};
use std::sync::LazyLock;
use tauri::{AppHandle, Manager, Runtime};
use tauri::{AppHandle, Manager};
pub(crate) static BUILT_IN_EXTENSION_DIRECTORY: LazyLock<PathBuf> = LazyLock::new(|| {
let mut resource_dir = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set")
.path()
.app_data_dir()
.expect(
"User home directory not found, which should be impossible on desktop environments",
);
pub(crate) fn get_built_in_extension_directory(tauri_app_handle: &AppHandle) -> PathBuf {
let mut resource_dir = tauri_app_handle.path().app_data_dir().expect(
"User home directory not found, which should be impossible on desktop environments",
);
resource_dir.push("built_in_extensions");
resource_dir
});
}
/// Helper function to load the built-in extension specified by `extension_id`, used
/// in `list_built_in_extensions()`.
@@ -85,7 +83,10 @@ async fn load_built_in_extension(
.map_err(|e| e.to_string())?;
let res_plugin_json = serde_json::from_str::<Extension>(&plugin_json_file_content);
let Ok(plugin_json) = res_plugin_json else {
log::warn!("user invalidated built-in extension [{}] file, overwriting it with the default template", extension_id);
log::warn!(
"user invalidated built-in extension [{}] file, overwriting it with the default template",
extension_id
);
// If the JSON file cannot be parsed as `struct Extension`, overwrite it with the default template and return.
tokio::fs::write(plugin_json_file_path, default_plugin_json_file)
@@ -136,13 +137,15 @@ async fn load_built_in_extension(
/// We only read alias/hotkey/enabled from the JSON file, we have ensured that if
/// alias/hotkey is not supported, then it will be `None`. Besides that, no further
/// validation is needed because nothing could go wrong.
pub(crate) async fn list_built_in_extensions() -> Result<Vec<Extension>, String> {
let dir = BUILT_IN_EXTENSION_DIRECTORY.as_path();
pub(crate) async fn list_built_in_extensions(
tauri_app_handle: &AppHandle,
) -> Result<Vec<Extension>, String> {
let dir = get_built_in_extension_directory(tauri_app_handle);
let mut built_in_extensions = Vec::new();
built_in_extensions.push(
load_built_in_extension(
dir,
&dir,
application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME,
application::PLUGIN_JSON_FILE,
)
@@ -150,7 +153,7 @@ pub(crate) async fn list_built_in_extensions() -> Result<Vec<Extension>, String>
);
built_in_extensions.push(
load_built_in_extension(
dir,
&dir,
calculator::DATA_SOURCE_ID,
calculator::PLUGIN_JSON_FILE,
)
@@ -158,7 +161,7 @@ pub(crate) async fn list_built_in_extensions() -> Result<Vec<Extension>, String>
);
built_in_extensions.push(
load_built_in_extension(
dir,
&dir,
ai_overview::EXTENSION_ID,
ai_overview::PLUGIN_JSON_FILE,
)
@@ -166,22 +169,44 @@ pub(crate) async fn list_built_in_extensions() -> Result<Vec<Extension>, String>
);
built_in_extensions.push(
load_built_in_extension(
dir,
&dir,
quick_ai_access::EXTENSION_ID,
quick_ai_access::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
file_search::EXTENSION_ID,
file_search::PLUGIN_JSON_FILE,
)
.await?,
);
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
built_in_extensions.push(
load_built_in_extension(
&dir,
window_management::EXTENSION_ID,
window_management::PLUGIN_JSON_FILE,
)
.await?,
);
}
}
Ok(built_in_extensions)
}
pub(super) async fn init_built_in_extension<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
pub(super) async fn init_built_in_extension(
tauri_app_handle: &AppHandle,
extension: &Extension,
search_source_registry: &SearchSourceRegistry,
) -> Result<(), String> {
log::trace!("initializing built-in extensions");
log::trace!("initializing built-in extensions [{}]", extension.id);
if extension.id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
search_source_registry
@@ -199,6 +224,30 @@ pub(super) async fn init_built_in_extension<R: Runtime>(
log::debug!("built-in extension [{}] initialized", extension.id);
}
if extension.id == file_search::EXTENSION_ID {
let file_system_search = file_search::FileSearchExtensionSearchSource;
search_source_registry
.register_source(file_system_search)
.await;
let file_search_config = FileSearchConfig::get(tauri_app_handle);
file_search_apply_config(&file_search_config)?;
log::debug!("built-in extension [{}] initialized", extension.id);
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
if extension.id == window_management::EXTENSION_ID {
let file_system_search = window_management::search_source::WindowManagementSearchSource;
search_source_registry
.register_source(file_system_search)
.await;
window_management::set_up_commands_hotkeys(tauri_app_handle, extension)?;
log::debug!("built-in extension [{}] initialized", extension.id);
}
}
}
Ok(())
}
@@ -207,11 +256,9 @@ pub(crate) fn is_extension_built_in(bundle_id: &ExtensionBundleIdBorrowed<'_>) -
}
pub(crate) async fn enable_built_in_extension(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
let update_extension = |extension: &mut Extension| -> Result<(), String> {
@@ -228,7 +275,7 @@ pub(crate) async fn enable_built_in_extension(
set_apps_hotkey(tauri_app_handle)?;
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -251,7 +298,7 @@ pub(crate) async fn enable_built_in_extension(
.register_source(calculator_search)
.await;
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -260,7 +307,7 @@ pub(crate) async fn enable_built_in_extension(
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -269,22 +316,69 @@ pub(crate) async fn enable_built_in_extension(
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == file_search::EXTENSION_ID {
let file_system_search = file_search::FileSearchExtensionSearchSource;
search_source_registry_tauri_state
.register_source(file_system_search)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
let file_search_config = FileSearchConfig::get(tauri_app_handle);
file_search_apply_config(&file_search_config)?;
return Ok(());
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
if bundle_id.extension_id == window_management::EXTENSION_ID
&& bundle_id.sub_extension_id.is_none()
{
let built_in_extension_dir = get_built_in_extension_directory(tauri_app_handle);
let file_system_search = window_management::search_source::WindowManagementSearchSource;
search_source_registry_tauri_state
.register_source(file_system_search)
.await;
let extension =
load_extension_from_json_file(&built_in_extension_dir, bundle_id.extension_id)?;
window_management::set_up_commands_hotkeys(tauri_app_handle, &extension)?;
alter_extension_json_file(&built_in_extension_dir, bundle_id, update_extension)?;
return Ok(());
}
if bundle_id.extension_id == window_management::EXTENSION_ID {
if let Some(command_id) = bundle_id.sub_extension_id {
let built_in_extension_dir = get_built_in_extension_directory(tauri_app_handle);
alter_extension_json_file(&built_in_extension_dir, bundle_id, update_extension)?;
let extension =
load_extension_from_json_file(&built_in_extension_dir, bundle_id.extension_id)?;
window_management::set_up_command_hotkey(tauri_app_handle, &extension, command_id)?;
}
}
}
}
Ok(())
}
pub(crate) async fn disable_built_in_extension(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
let update_extension = |extension: &mut Extension| -> Result<(), String> {
@@ -301,7 +395,7 @@ pub(crate) async fn disable_built_in_extension(
unset_apps_hotkey(tauri_app_handle)?;
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -322,7 +416,7 @@ pub(crate) async fn disable_built_in_extension(
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -331,7 +425,7 @@ pub(crate) async fn disable_built_in_extension(
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -341,7 +435,7 @@ pub(crate) async fn disable_built_in_extension(
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
alter_extension_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -349,49 +443,157 @@ pub(crate) async fn disable_built_in_extension(
return Ok(());
}
if bundle_id.extension_id == file_search::EXTENSION_ID {
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
if bundle_id.extension_id == window_management::EXTENSION_ID
&& bundle_id.sub_extension_id.is_none()
{
let built_in_extension_dir = get_built_in_extension_directory(tauri_app_handle);
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(&built_in_extension_dir, bundle_id, update_extension)?;
let extension =
load_extension_from_json_file(&built_in_extension_dir, bundle_id.extension_id)?;
window_management::unset_commands_hotkeys(tauri_app_handle, &extension)?;
}
if bundle_id.extension_id == window_management::EXTENSION_ID {
if let Some(command_id) = bundle_id.sub_extension_id {
let built_in_extension_dir = get_built_in_extension_directory(tauri_app_handle);
alter_extension_json_file(&built_in_extension_dir, bundle_id, update_extension)?;
let extension =
load_extension_from_json_file(&built_in_extension_dir, bundle_id.extension_id)?;
window_management::unset_command_hotkey(tauri_app_handle, &extension, command_id)?;
}
}
}
}
Ok(())
}
pub(crate) fn set_built_in_extension_alias(bundle_id: &ExtensionBundleIdBorrowed<'_>, alias: &str) {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
pub(crate) fn set_built_in_extension_alias(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
alias: &str,
) -> Result<(), String> {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::set_app_alias(tauri_app_handle, app_path, alias);
}
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
if bundle_id.extension_id == window_management::EXTENSION_ID
&& bundle_id.sub_extension_id.is_some()
{
let update_function = |ext: &mut Extension| {
ext.alias = Some(alias.to_string());
Ok(())
};
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_function,
)?;
}
}
}
Ok(())
}
pub(crate) fn register_built_in_extension_hotkey(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
hotkey: &str,
) -> Result<(), String> {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::register_app_hotkey(&tauri_app_handle, app_path, hotkey)?;
}
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
let update_function = |ext: &mut Extension| {
ext.hotkey = Some(hotkey.into());
Ok(())
};
if bundle_id.extension_id == window_management::EXTENSION_ID {
if let Some(command_id) = bundle_id.sub_extension_id {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_function,
)?;
window_management::register_command_hotkey(tauri_app_handle, command_id, hotkey)?;
}
}
}
}
Ok(())
}
pub(crate) fn unregister_built_in_extension_hotkey(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::unregister_app_hotkey(&tauri_app_handle, app_path)?;
}
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
let update_function = |ext: &mut Extension| {
ext.hotkey = None;
Ok(())
};
if bundle_id.extension_id == window_management::EXTENSION_ID {
if let Some(command_id) = bundle_id.sub_extension_id {
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)
.unwrap();
window_management::unregister_command_hotkey(tauri_app_handle, &extension, command_id)?;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_function,
)
.unwrap();
}
}
}
}
Ok(())
}
@@ -431,12 +633,12 @@ fn load_extension_from_json_file(
Ok(extension)
}
#[allow(unused_macros)] // #[function_name::named] only used on macOS
#[function_name::named]
pub(crate) async fn is_built_in_extension_enabled(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<bool, String> {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
@@ -464,7 +666,7 @@ pub(crate) async fn is_built_in_extension_enabled(
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
let extension = load_extension_from_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
return Ok(extension.enabled);
@@ -472,11 +674,51 @@ pub(crate) async fn is_built_in_extension_enabled(
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
let extension = load_extension_from_json_file(
&BUILT_IN_EXTENSION_DIRECTORY.as_path(),
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
return Ok(extension.enabled);
}
if bundle_id.extension_id == file_search::EXTENSION_ID && bundle_id.sub_extension_id.is_none() {
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
cfg_if::cfg_if! {
if #[cfg(target_os = "macos")] {
// Window Management
if bundle_id.extension_id == window_management::EXTENSION_ID
&& bundle_id.sub_extension_id.is_none()
{
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
// Window Management commands
if bundle_id.extension_id == window_management::EXTENSION_ID
&& let Some(command_id) = bundle_id.sub_extension_id
{
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
let commands = extension
.commands
.expect("window management extension has commands");
let extension = commands.iter().find( |cmd| cmd.id == command_id).unwrap_or_else(|| {
panic!("function [{}()] invoked with a Window Management command that does not exist, extension ID [{}] ", function_name!(), command_id)
});
return Ok(extension.enabled);
}
}
}
unreachable!("extension [{:?}] is not a built-in extension", bundle_id)
}

View File

@@ -8,8 +8,8 @@
//! which forces us to create a dedicated thread/runtime to execute them.
use std::any::Any;
use std::collections::hash_map::Entry;
use std::collections::HashMap;
use std::collections::hash_map::Entry;
use std::sync::OnceLock;
pub(crate) trait SearchSourceState {

View File

@@ -0,0 +1,134 @@
#[derive(Debug, Clone, PartialEq, Copy, Hash, serde::Serialize, serde::Deserialize)]
pub enum Action {
/// Move the window to fill the top half of the screen.
TopHalf,
/// Move the window to fill bottom half of the screen.
BottomHalf,
/// Move the window to fill left half of the screen.
LeftHalf,
/// Move the window to fill right half of the screen.
RightHalf,
/// Move the window to fill center half of the screen.
CenterHalf,
/// Resize window to the top left quarter of the screen.
TopLeftQuarter,
/// Resize window to the top right quarter of the screen.
TopRightQuarter,
/// Resize window to the bottom left quarter of the screen.
BottomLeftQuarter,
/// Resize window to the bottom right quarter of the screen.
BottomRightQuarter,
/// Resize window to the top left sixth of the screen.
TopLeftSixth,
/// Resize window to the top center sixth of the screen.
TopCenterSixth,
/// Resize window to the top right sixth of the screen.
TopRightSixth,
/// Resize window to the bottom left sixth of the screen.
BottomLeftSixth,
/// Resize window to the bottom center sixth of the screen.
BottomCenterSixth,
/// Resize window to the bottom right sixth of the screen.
BottomRightSixth,
/// Resize window to the top third of the screen.
TopThird,
/// Resize window to the middle third of the screen.
MiddleThird,
/// Resize window to the bottom third of the screen.
BottomThird,
/// Center window in the screen.
Center,
/// Resize window to the first fourth of the screen.
FirstFourth,
/// Resize window to the second fourth of the screen.
SecondFourth,
/// Resize window to the third fourth of the screen.
ThirdFourth,
/// Resize window to the last fourth of the screen.
LastFourth,
/// Resize window to the first third of the screen.
FirstThird,
/// Resize window to the center third of the screen.
CenterThird,
/// Resize window to the last third of the screen.
LastThird,
/// Resize window to the first two thirds of the screen.
FirstTwoThirds,
/// Resize window to the center two thirds of the screen.
CenterTwoThirds,
/// Resize window to the last two thirds of the screen.
LastTwoThirds,
/// Resize window to the first three fourths of the screen.
FirstThreeFourths,
/// Resize window to the center three fourths of the screen.
CenterThreeFourths,
/// Resize window to the last three fourths of the screen.
LastThreeFourths,
/// Resize window to the top three fourths of the screen.
TopThreeFourths,
/// Resize window to the bottom three fourths of the screen.
BottomThreeFourths,
/// Resize window to the top two thirds of the screen.
TopTwoThirds,
/// Resize window to the bottom two thirds of the screen.
BottomTwoThirds,
/// Resize window to the top center two thirds of the screen.
TopCenterTwoThirds,
/// Resize window to the top first fourth of the screen.
TopFirstFourth,
/// Resize window to the top second fourth of the screen.
TopSecondFourth,
/// Resize window to the top third fourth of the screen.
TopThirdFourth,
/// Resize window to the top last fourth of the screen.
TopLastFourth,
/// Increase the window until it reaches the screen size.
MakeLarger,
/// Decrease the window until it reaches its minimal size.
MakeSmaller,
/// Maximize window to almost fit the screen.
AlmostMaximize,
/// Maximize window to fit the screen.
Maximize,
/// Maximize width of window to fit the screen.
MaximizeWidth,
/// Maximize height of window to fit the screen.
MaximizeHeight,
/// Move window to the top edge of the screen.
MoveUp,
/// Move window to the bottom of the screen.
MoveDown,
/// Move window to the left edge of the screen.
MoveLeft,
/// Move window to the right edge of the screen.
MoveRight,
/// Move window to the next desktop.
NextDesktop,
/// Move window to the previous desktop.
PreviousDesktop,
/// Move window to the next display.
NextDisplay,
/// Move window to the previous display.
PreviousDisplay,
/// Restore window to its last position.
Restore,
/// Toggle fullscreen mode.
ToggleFullscreen,
}

View File

@@ -0,0 +1,638 @@
//! This module calls macOS APIs to implement the various helper functions needed
//! to perform the defined actions.
mod private;
use std::ffi::c_uint;
use std::ffi::c_ushort;
use std::ffi::c_void;
use std::ops::Deref;
use std::ptr::NonNull;
use objc2::MainThreadMarker;
use objc2_app_kit::NSEvent;
use objc2_app_kit::NSScreen;
use objc2_app_kit::NSWorkspace;
use objc2_application_services::AXError;
use objc2_application_services::AXUIElement;
use objc2_application_services::AXValue;
use objc2_application_services::AXValueType;
use objc2_core_foundation::CFBoolean;
use objc2_core_foundation::CFRetained;
use objc2_core_foundation::CFString;
use objc2_core_foundation::CFType;
use objc2_core_foundation::CGPoint;
use objc2_core_foundation::CGRect;
use objc2_core_foundation::CGSize;
use objc2_core_foundation::Type;
use objc2_core_foundation::{CFArray, CFDictionary, CFNumber};
use objc2_core_graphics::CGError;
use objc2_core_graphics::CGEvent;
use objc2_core_graphics::CGEventFlags;
use objc2_core_graphics::CGEventTapLocation;
use objc2_core_graphics::CGEventType;
use objc2_core_graphics::CGMouseButton;
use objc2_core_graphics::CGRectGetMidX;
use objc2_core_graphics::CGRectGetMinY;
use objc2_core_graphics::CGWindowID;
use super::error::Error;
use private::CGSCopyManagedDisplaySpaces;
use private::CGSGetActiveSpace;
use private::CGSMainConnectionID;
use private::CGSSpaceID;
use std::collections::HashMap;
use std::sync::{LazyLock, Mutex};
fn intersects(r1: CGRect, r2: CGRect) -> bool {
let overlapping = !(r1.origin.x + r1.size.width < r2.origin.x
|| r1.origin.y + r1.size.height < r2.origin.y
|| r1.origin.x > r2.origin.x + r2.size.width
|| r1.origin.y > r2.origin.y + r2.size.height);
overlapping
}
/// Core Graphics APIs use a flipped coordinate system, while AppKit uses the
/// unflipped version; they differ in the y-axis. We need to do the conversion
/// (to `CGPoint.y`) manually.
fn flip_frame_y(main_screen_height: f64, frame_height: f64, frame_unflipped_y: f64) -> f64 {
main_screen_height - (frame_unflipped_y + frame_height)
}
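// A minimal sketch of the flip, assuming a hypothetical 1080-pt-tall main
// screen: a 200-pt-tall frame whose unflipped (AppKit, bottom-left origin)
// y is 100 lands at flipped (top-left origin) y = 1080 - (100 + 200) = 780.
#[cfg(test)]
mod flip_frame_y_tests {
use super::flip_frame_y;
#[test]
fn flips_around_the_main_screen_height() {
assert_eq!(flip_frame_y(1080.0, 200.0, 100.0), 780.0);
// Flipping twice recovers the original y.
assert_eq!(flip_frame_y(1080.0, 200.0, 780.0), 100.0);
}
}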
/// Helper function to extract a UI element's origin.
fn get_ui_element_origin(ui_element: &CFRetained<AXUIElement>) -> Result<CGPoint, Error> {
let mut position_value: *const CFType = std::ptr::null();
let ptr_to_position_value = NonNull::new(&mut position_value).unwrap();
let position_attr = CFString::from_static_str("AXPosition");
let error = unsafe { ui_element.copy_attribute_value(&position_attr, ptr_to_position_value) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
assert!(!position_value.is_null());
let position: CFRetained<AXValue> =
unsafe { CFRetained::from_raw(NonNull::new(position_value.cast_mut().cast()).unwrap()) };
let mut position_cg_point = CGPoint::ZERO;
let ptr_to_position_cg_point =
NonNull::new((&mut position_cg_point as *mut CGPoint).cast()).unwrap();
let result = unsafe { position.value(AXValueType::CGPoint, ptr_to_position_cg_point) };
assert!(result, "type mismatched");
Ok(position_cg_point)
}
/// Helper function to extract a UI element's size.
fn get_ui_element_size(ui_element: &CFRetained<AXUIElement>) -> Result<CGSize, Error> {
let mut size_value: *const CFType = std::ptr::null();
let ptr_to_size_value = NonNull::new(&mut size_value).unwrap();
let size_attr = CFString::from_static_str("AXSize");
let error = unsafe { ui_element.copy_attribute_value(&size_attr, ptr_to_size_value) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
assert!(!size_value.is_null());
let size: CFRetained<AXValue> =
unsafe { CFRetained::from_raw(NonNull::new(size_value.cast_mut().cast()).unwrap()) };
let mut size_cg_size = CGSize::ZERO;
let ptr_to_size_cg_size = NonNull::new((&mut size_cg_size as *mut CGSize).cast()).unwrap();
let result = unsafe { size.value(AXValueType::CGSize, ptr_to_size_cg_size) };
assert!(result, "type mismatched");
Ok(size_cg_size)
}
/// Get the frontmost/focused window (as a UI element).
fn get_frontmost_window() -> Result<CFRetained<AXUIElement>, Error> {
let workspace = unsafe { NSWorkspace::sharedWorkspace() };
let frontmost_app =
unsafe { workspace.frontmostApplication() }.ok_or(Error::CannotFindFocusWindow)?;
let pid = unsafe { frontmost_app.processIdentifier() };
let app_element = unsafe { AXUIElement::new_application(pid) };
let mut window_element: *const CFType = std::ptr::null();
let ptr_to_window_element = NonNull::new(&mut window_element).unwrap();
let focused_window_attr = CFString::from_static_str("AXFocusedWindow");
let error =
unsafe { app_element.copy_attribute_value(&focused_window_attr, ptr_to_window_element) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
assert!(!window_element.is_null());
let window_element: *mut AXUIElement = window_element.cast::<AXUIElement>().cast_mut();
let window = unsafe { CFRetained::from_raw(NonNull::new(window_element).unwrap()) };
Ok(window)
}
/// Get the CGWindowID of the frontmost/focused window.
#[allow(unused)] // In case we need it in the future
pub(crate) fn get_frontmost_window_id() -> Result<CGWindowID, Error> {
let element = get_frontmost_window()?;
let ptr: NonNull<AXUIElement> = CFRetained::as_ptr(&element);
let mut window_id_buffer: CGWindowID = 0;
let error =
unsafe { private::_AXUIElementGetWindow(ptr.as_ptr(), &mut window_id_buffer as *mut _) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
Ok(window_id_buffer)
}
/// Returns the workspace ID list grouped by display. For example, suppose you
/// have 2 displays and 10 workspaces (5 workspaces per display), then this
/// function might return something like:
///
/// ```text
/// [
/// [8, 11, 12, 13, 24],
/// [519, 77, 15, 249, 414]
/// ]
/// ```
///
/// Even though this function returns macOS internal space IDs, they should correspond
/// to the logical workspaces that users are familiar with. The display that contains
/// workspaces `[8, 11, 12, 13, 24]` should be your main display; workspace 8 represents
/// Desktop 1, and workspace 414 represents Desktop 10.
fn workspace_ids_grouped_by_display() -> Vec<Vec<CGSSpaceID>> {
unsafe {
let mut ret = Vec::new();
let conn = CGSMainConnectionID();
let display_spaces_raw = CGSCopyManagedDisplaySpaces(conn);
let display_spaces: CFRetained<CFArray> =
CFRetained::from_raw(NonNull::new(display_spaces_raw).unwrap());
let key_spaces: CFRetained<CFString> = CFString::from_static_str("Spaces");
let key_spaces_ptr: NonNull<CFString> = CFRetained::as_ptr(&key_spaces);
let key_id64: CFRetained<CFString> = CFString::from_static_str("id64");
let key_id64_ptr: NonNull<CFString> = CFRetained::as_ptr(&key_id64);
for i in 0..display_spaces.count() {
let mut workspaces_of_this_display = Vec::new();
let dict_ref = display_spaces.value_at_index(i);
let dict: &CFDictionary = &*(dict_ref as *const CFDictionary);
let mut ptr_to_value_buffer: *const c_void = std::ptr::null();
let key_exists = dict.value_if_present(
key_spaces_ptr.as_ptr().cast::<c_void>().cast_const(),
&mut ptr_to_value_buffer as *mut _,
);
assert!(key_exists);
assert!(!ptr_to_value_buffer.is_null());
let spaces_raw: *const CFArray = ptr_to_value_buffer.cast::<CFArray>();
let spaces = &*spaces_raw;
for idx in 0..spaces.count() {
let workspace_dictionary: &CFDictionary =
&*spaces.value_at_index(idx).cast::<CFDictionary>();
let mut ptr_to_value_buffer: *const c_void = std::ptr::null();
let key_exists = workspace_dictionary.value_if_present(
key_id64_ptr.as_ptr().cast::<c_void>().cast_const(),
&mut ptr_to_value_buffer as *mut _,
);
assert!(key_exists);
assert!(!ptr_to_value_buffer.is_null());
let ptr_workspace_id = ptr_to_value_buffer.cast::<CFNumber>();
let workspace_id = (&*ptr_workspace_id).as_i32().unwrap();
workspaces_of_this_display.push(workspace_id);
}
ret.push(workspaces_of_this_display);
}
ret
}
}
/// Get the next workspace's logical ID. By logical ID, we mean the ID that
/// users are familiar with, workspace 1/2/3 and so on, rather than the internal
/// `CGSSpaceID`.
///
/// NOTE that this function returns None when the current workspace is the last
/// workspace in the current display.
pub(crate) fn get_next_workspace_logical_id() -> Option<usize> {
let window_server_connection = unsafe { CGSMainConnectionID() };
let current_workspace_id = unsafe { CGSGetActiveSpace(window_server_connection) };
// Logical ID starts from 1
let mut logical_id = 1_usize;
for workspaces_in_a_display in workspace_ids_grouped_by_display() {
for (idx, workspace_raw_id) in workspaces_in_a_display.iter().enumerate() {
if *workspace_raw_id == current_workspace_id {
// We found it, now check if it is the last workspace in this display
if idx == workspaces_in_a_display.len() - 1 {
return None;
} else {
return Some(logical_id + 1);
}
} else {
logical_id += 1;
continue;
}
}
}
unreachable!(
"unless the private API CGSGetActiveSpace() is broken, it should return an ID that is in the workspace ID list"
)
}
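// A hedged walk-through of the bookkeeping above: given grouped IDs
// [[8, 11, 12], [519, 77]] from workspace_ids_grouped_by_display(), the
// logical IDs run 1..=5 from left to right. If CGSGetActiveSpace() reports
// 519 (logical ID 4), this function returns Some(5); if it reports 12 or 77
// (the last space on their respective displays), it returns None.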
/// Get the previous workspace's logical ID.
///
/// See [`get_next_workspace_logical_id`] for the doc.
pub(crate) fn get_previous_workspace_logical_id() -> Option<usize> {
let window_server_connection = unsafe { CGSMainConnectionID() };
let current_workspace_id = unsafe { CGSGetActiveSpace(window_server_connection) };
// Logical ID starts from 1
let mut logical_id = 1_usize;
for workspaces_in_a_display in workspace_ids_grouped_by_display() {
for (idx, workspace_raw_id) in workspaces_in_a_display.iter().enumerate() {
if *workspace_raw_id == current_workspace_id {
// We found it, now check if it is the first workspace in this display
if idx == 0 {
return None;
} else {
// this sub operation is safe, logical_id is at least 2
return Some(logical_id - 1);
}
} else {
logical_id += 1;
continue;
}
}
}
unreachable!(
"unless the private API CGSGetActiveSpace() is broken, it should return an ID that is in the workspace ID list"
)
}
/// Move the frontmost window to the specified workspace.
///
/// Credits to the Silica library
///
/// * https://github.com/ianyh/Silica/blob/b91a18dbb822e99ce6b487d1cb4841e863139b2a/Silica/Sources/SIWindow.m#L215-L260
/// * https://github.com/ianyh/Silica/blob/b91a18dbb822e99ce6b487d1cb4841e863139b2a/Silica/Sources/SISystemWideElement.m#L29-L65
pub(crate) fn move_frontmost_window_to_workspace(space: usize) -> Result<(), Error> {
assert!(space >= 1);
if space > 16 {
return Err(Error::TooManyWorkspace);
}
let window_frame = get_frontmost_window_frame()?;
let close_button_frame = get_frontmost_window_close_button_frame()?;
let mouse_cursor_point = CGPoint::new(
unsafe { CGRectGetMidX(close_button_frame) },
window_frame.origin.y
+ (window_frame.origin.y - unsafe { CGRectGetMinY(close_button_frame) }).abs() / 2.0,
);
let mouse_move_event = unsafe {
CGEvent::new_mouse_event(
None,
CGEventType::MouseMoved,
mouse_cursor_point,
CGMouseButton::Left,
)
};
let mouse_drag_event = unsafe {
CGEvent::new_mouse_event(
None,
CGEventType::LeftMouseDragged,
mouse_cursor_point,
CGMouseButton::Left,
)
};
let mouse_down_event = unsafe {
CGEvent::new_mouse_event(
None,
CGEventType::LeftMouseDown,
mouse_cursor_point,
CGMouseButton::Left,
)
};
let mouse_up_event = unsafe {
CGEvent::new_mouse_event(
None,
CGEventType::LeftMouseUp,
mouse_cursor_point,
CGMouseButton::Left,
)
};
unsafe {
CGEvent::set_flags(mouse_move_event.as_deref(), CGEventFlags(0));
CGEvent::set_flags(mouse_down_event.as_deref(), CGEventFlags(0));
CGEvent::set_flags(mouse_up_event.as_deref(), CGEventFlags(0));
// Move the mouse into place at the window's toolbar
CGEvent::post(CGEventTapLocation::HIDEventTap, mouse_move_event.as_deref());
// Mouse down to set up the drag
CGEvent::post(CGEventTapLocation::HIDEventTap, mouse_down_event.as_deref());
// Drag event to grab hold of the window
CGEvent::post(CGEventTapLocation::HIDEventTap, mouse_drag_event.as_deref());
}
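// Background on the constant below (an assumption based on the Silica code
// credited above): symbolic hot key UID 118 appears to be "Switch to
// Desktop 1", so spaces 1..=16 map to UIDs 118..=133; that is also why this
// function rejects anything beyond 16 workspaces.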
// cast is safe as space is in range [1, 16]
let hot_key: c_ushort = 118 + space as c_ushort - 1;
let mut flags: c_uint = 0;
let mut key_code: c_ushort = 0;
let error = unsafe {
private::CGSGetSymbolicHotKeyValue(hot_key, std::ptr::null_mut(), &mut key_code, &mut flags)
};
if error != CGError::Success {
return Err(Error::CGError(error));
}
unsafe {
// If the hotkey is disabled, enable it.
if !private::CGSIsSymbolicHotKeyEnabled(hot_key) {
let enable_error = private::CGSSetSymbolicHotKeyEnabled(hot_key, true);
if enable_error != CGError::Success {
return Err(Error::CGError(enable_error));
}
}
}
let opt_keyboard_event = unsafe { CGEvent::new_keyboard_event(None, key_code, true) };
unsafe {
// cast is safe (uint -> u64)
CGEvent::set_flags(opt_keyboard_event.as_deref(), CGEventFlags(flags as u64));
}
let keyboard_event = opt_keyboard_event.unwrap();
let event = unsafe { NSEvent::eventWithCGEvent(&keyboard_event) }.unwrap();
let keyboard_event_up = unsafe { CGEvent::new_keyboard_event(None, event.keyCode(), false) };
unsafe {
CGEvent::set_flags(keyboard_event_up.as_deref(), CGEventFlags(0));
// Send the shortcut command to get Mission Control to switch spaces from under the window.
CGEvent::post(CGEventTapLocation::HIDEventTap, event.CGEvent().as_deref());
CGEvent::post(
CGEventTapLocation::HIDEventTap,
keyboard_event_up.as_deref(),
);
}
unsafe {
// Let go of the window.
CGEvent::post(CGEventTapLocation::HIDEventTap, mouse_up_event.as_deref());
}
Ok(())
}
pub(crate) fn get_frontmost_window_origin() -> Result<CGPoint, Error> {
let frontmost_window = get_frontmost_window()?;
get_ui_element_origin(&frontmost_window)
}
pub(crate) fn get_frontmost_window_size() -> Result<CGSize, Error> {
let frontmost_window = get_frontmost_window()?;
get_ui_element_size(&frontmost_window)
}
pub(crate) fn get_frontmost_window_frame() -> Result<CGRect, Error> {
let origin = get_frontmost_window_origin()?;
let size = get_frontmost_window_size()?;
Ok(CGRect { origin, size })
}
/// Get the frontmost window's close button, then extract its frame.
fn get_frontmost_window_close_button_frame() -> Result<CGRect, Error> {
let window = get_frontmost_window()?;
let mut ptr_to_close_button: *const CFType = std::ptr::null();
let ptr_to_buffer = NonNull::new(&mut ptr_to_close_button).unwrap();
let close_button_attribute = CFString::from_static_str("AXCloseButton");
let error = unsafe { window.copy_attribute_value(&close_button_attribute, ptr_to_buffer) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
assert!(!ptr_to_close_button.is_null());
let close_button_element = ptr_to_close_button.cast::<AXUIElement>().cast_mut();
let close_button = unsafe { CFRetained::from_raw(NonNull::new(close_button_element).unwrap()) };
let origin = get_ui_element_origin(&close_button)?;
let size = get_ui_element_size(&close_button)?;
Ok(CGRect { origin, size })
}
/// This function returns the "visible frame" [^1] of all the screens.
///
/// FIXME: This function relies on the [`visibleFrame()`][vf_doc] API, which
/// has 2 bugs we need to work around:
///
/// 1. It assumes the Dock is on the main display, which in reality depends on
/// how users arrange their displays and the "Dock position on screen" setting
/// entry.
/// 2. For non-main displays, it assumes that they don't have a menu bar, but macOS
/// puts a menu bar on every display.
///
///
/// [^1]: Visible frame: a rectangle that defines the portion of the screen in which
/// it is currently safe to draw your app's content.
///
/// [vf_doc]: https://developer.apple.com/documentation/AppKit/NSScreen/visibleFrame
pub(crate) fn list_visible_frame_of_all_screens() -> Result<Vec<CGRect>, Error> {
let main_thread_marker = MainThreadMarker::new().ok_or(Error::NotInMainThread)?;
let screens = NSScreen::screens(main_thread_marker).to_vec();
if screens.is_empty() {
return Ok(Vec::new());
}
let main_screen = screens.first().expect("screens is not empty");
let frames = screens
.iter()
.map(|ns_screen| {
// NSScreen is an AppKit API, which uses unflipped coordinate
// system, flip it
let mut unflipped_frame = ns_screen.visibleFrame();
let flipped_frame_origin_y = flip_frame_y(
main_screen.frame().size.height,
unflipped_frame.size.height,
unflipped_frame.origin.y,
);
unflipped_frame.origin.y = flipped_frame_origin_y;
unflipped_frame
})
.collect();
Ok(frames)
}
/// Get the visible frame of the "active screen"[^1].
///
///
/// [^1]: the screen which the frontmost window is on.
pub(crate) fn get_active_screen_visible_frame() -> Result<CGRect, Error> {
let main_thread_marker = MainThreadMarker::new().ok_or(Error::NotInMainThread)?;
let frontmost_window_frame = get_frontmost_window_frame()?;
let screens = NSScreen::screens(main_thread_marker)
.into_iter()
.collect::<Vec<_>>();
if screens.is_empty() {
return Err(Error::NoDisplay);
}
let main_screen_height = screens[0].frame().size.height;
// AppKit uses an unflipped coordinate system, but the Accessibility APIs use
// a flipped one, so we need to flip the origin of these screens.
for screen in screens {
let mut screen_frame = screen.frame();
let unflipped_y = screen_frame.origin.y;
let flipped_y = flip_frame_y(main_screen_height, screen_frame.size.height, unflipped_y);
screen_frame.origin.y = flipped_y;
if intersects(screen_frame, frontmost_window_frame) {
let mut visible_frame = screen.visibleFrame();
let flipped_y = flip_frame_y(
main_screen_height,
visible_frame.size.height,
visible_frame.origin.y,
);
visible_frame.origin.y = flipped_y;
return Ok(visible_frame);
}
}
unreachable!()
}
/// Move the frontmost window's origin to the point specified by `x` and `y`.
pub fn move_frontmost_window(x: f64, y: f64) -> Result<(), Error> {
let frontmost_window = get_frontmost_window()?;
let mut point = CGPoint::new(x, y);
let ptr_to_point = NonNull::new((&mut point as *mut CGPoint).cast::<c_void>()).unwrap();
let pos_value = unsafe { AXValue::new(AXValueType::CGPoint, ptr_to_point) }.unwrap();
let pos_attr = CFString::from_static_str("AXPosition");
let error = unsafe { frontmost_window.set_attribute_value(&pos_attr, pos_value.deref()) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
Ok(())
}
/// Set the frontmost window's frame to the specified frame - adjust size and
/// location at the same time.
pub fn set_frontmost_window_frame(frame: CGRect) -> Result<(), Error> {
let frontmost_window = get_frontmost_window()?;
let mut point = frame.origin;
let ptr_to_point = NonNull::new((&mut point as *mut CGPoint).cast::<c_void>()).unwrap();
let pos_value = unsafe { AXValue::new(AXValueType::CGPoint, ptr_to_point) }.unwrap();
let pos_attr = CFString::from_static_str("AXPosition");
let error = unsafe { frontmost_window.set_attribute_value(&pos_attr, pos_value.deref()) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
let mut size = frame.size;
let ptr_to_size = NonNull::new((&mut size as *mut CGSize).cast::<c_void>()).unwrap();
let size_value = unsafe { AXValue::new(AXValueType::CGSize, ptr_to_size) }.unwrap();
let size_attr = CFString::from_static_str("AXSize");
let error = unsafe { frontmost_window.set_attribute_value(&size_attr, size_value.deref()) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
Ok(())
}
pub fn toggle_fullscreen() -> Result<(), Error> {
let frontmost_window = get_frontmost_window()?;
let fullscreen_attr = CFString::from_static_str("AXFullScreen");
let mut current_value_ref: *const CFType = std::ptr::null();
let error = unsafe {
frontmost_window.copy_attribute_value(
&fullscreen_attr,
NonNull::new(&mut current_value_ref).unwrap(),
)
};
// TODO: If the attribute doesn't exist, the error won't be Success either.
// Before we can handle that, we need to know which error code is returned
// in that case.
if error != AXError::Success {
return Err(Error::AXError(error));
}
assert!(!current_value_ref.is_null());
let current_value = unsafe {
let retained_boolean: CFRetained<CFBoolean> = CFRetained::from_raw(
NonNull::new(current_value_ref.cast::<CFBoolean>().cast_mut()).unwrap(),
);
retained_boolean.as_bool()
};
let new_value = !current_value;
let new_value_ref: CFRetained<CFBoolean> = CFBoolean::new(new_value).retain();
let error =
unsafe { frontmost_window.set_attribute_value(&fullscreen_attr, new_value_ref.deref()) };
if error != AXError::Success {
return Err(Error::AXError(error));
}
Ok(())
}
static LAST_FRAME: LazyLock<Mutex<HashMap<CGWindowID, CGRect>>> =
LazyLock::new(|| Mutex::new(HashMap::new()));
pub(crate) fn set_frontmost_window_last_frame(window_id: CGWindowID, frame: CGRect) {
let mut map = LAST_FRAME.lock().unwrap();
map.insert(window_id, frame);
}
pub(crate) fn get_frontmost_window_last_frame(window_id: CGWindowID) -> Option<CGRect> {
let map = LAST_FRAME.lock().unwrap();
map.get(&window_id).cloned()
}
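// These two helpers back the Restore action: perform_action() records the
// focused window's frame under its CGWindowID before every action (except
// Restore itself), and Action::Restore reads that frame back to undo the
// most recent change.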

View File

@@ -0,0 +1,70 @@
//! Private macOS APIs.
use bitflags::bitflags;
use objc2_application_services::AXError;
use objc2_application_services::AXUIElement;
use objc2_core_foundation::CFArray;
use objc2_core_graphics::CGError;
use objc2_core_graphics::CGWindowID;
use std::ffi::c_int;
use std::ffi::c_uint;
use std::ffi::c_ushort;
pub(crate) type CGSConnectionID = u32;
pub(crate) type CGSSpaceID = c_int;
bitflags! {
#[derive(Debug, Copy, Clone, PartialEq, Eq)]
#[repr(transparent)]
pub struct CGSSpaceMask: c_int {
const INCLUDE_CURRENT = 1 << 0;
const INCLUDE_OTHERS = 1 << 1;
const INCLUDE_USER = 1 << 2;
const INCLUDE_OS = 1 << 3;
const VISIBLE = 1 << 16;
const CURRENT_SPACES = Self::INCLUDE_USER.bits() | Self::INCLUDE_CURRENT.bits();
const OTHER_SPACES = Self::INCLUDE_USER.bits() | Self::INCLUDE_OTHERS.bits();
const ALL_SPACES =
Self::INCLUDE_USER.bits() | Self::INCLUDE_OTHERS.bits() | Self::INCLUDE_CURRENT.bits();
const ALL_VISIBLE_SPACES = Self::ALL_SPACES.bits() | Self::VISIBLE.bits();
const CURRENT_OS_SPACES = Self::INCLUDE_OS.bits() | Self::INCLUDE_CURRENT.bits();
const OTHER_OS_SPACES = Self::INCLUDE_OS.bits() | Self::INCLUDE_OTHERS.bits();
const ALL_OS_SPACES =
Self::INCLUDE_OS.bits() | Self::INCLUDE_OTHERS.bits() | Self::INCLUDE_CURRENT.bits();
}
}
unsafe extern "C" {
/// Extract `window_id` from an AXUIElement.
pub(crate) fn _AXUIElementGetWindow(
elem: *mut AXUIElement,
window_id: *mut CGWindowID,
) -> AXError;
/// Connect to the WindowServer and get a connection descriptor.
pub(crate) fn CGSMainConnectionID() -> CGSConnectionID;
/// It returns a CFArray of dictionaries. Each dictionary contains information
/// about a display, including a list of all the spaces (CGSSpaceID) on that display.
pub(crate) fn CGSCopyManagedDisplaySpaces(cid: CGSConnectionID) -> *mut CFArray;
/// Gets the ID of the space currently visible to the user.
pub(crate) fn CGSGetActiveSpace(cid: CGSConnectionID) -> CGSSpaceID;
/// Returns the values the symbolic hot key represented by the given UID is configured with.
pub(crate) fn CGSGetSymbolicHotKeyValue(
hotKey: c_ushort,
outKeyEquivalent: *mut c_ushort,
outVirtualKeyCode: *mut c_ushort,
outModifiers: *mut c_uint,
) -> CGError;
/// Returns whether the symbolic hot key represented by the given UID is enabled.
pub(crate) fn CGSIsSymbolicHotKeyEnabled(hotKey: c_ushort) -> bool;
/// Sets whether the symbolic hot key represented by the given UID is enabled.
pub(crate) fn CGSSetSymbolicHotKeyEnabled(hotKey: c_ushort, isEnabled: bool) -> CGError;
}

View File

@@ -0,0 +1,25 @@
use objc2_application_services::AXError;
use objc2_core_graphics::CGError;
use thiserror::Error;
#[derive(Debug, Error)]
pub enum Error {
/// Cannot find the focused window.
#[error("Cannot find the focused window.")]
CannotFindFocusWindow,
/// Error code from the macOS Accessibility APIs.
#[error("Error code from the macOS Accessibility APIs: {0:?}")]
AXError(AXError),
/// Function should be called from the main thread, but it is not.
#[error("Function should be called from the main thread, but it is not.")]
NotInMainThread,
/// No monitor detected.
#[error("No monitor detected.")]
NoDisplay,
/// Can handle at most 16 workspaces.
#[error("libwmgr can handle at most 16 workspaces.")]
TooManyWorkspace,
/// Error code from the macOS Core Graphics APIs.
#[error("Error code from the macOS Core Graphics APIs: {0:?}")]
CGError(CGError),
}

View File

@@ -0,0 +1,973 @@
pub(crate) mod actions;
mod backend;
mod error;
pub(crate) mod on_opened;
pub(crate) mod search_source;
use crate::common::document::open;
use crate::extension::Extension;
use actions::Action;
use backend::get_active_screen_visible_frame;
use backend::get_frontmost_window_frame;
use backend::get_frontmost_window_id;
use backend::get_frontmost_window_last_frame;
use backend::get_next_workspace_logical_id;
use backend::get_previous_workspace_logical_id;
use backend::list_visible_frame_of_all_screens;
use backend::move_frontmost_window;
use backend::move_frontmost_window_to_workspace;
use backend::set_frontmost_window_frame;
use backend::set_frontmost_window_last_frame;
use backend::toggle_fullscreen;
use error::Error;
use objc2_core_foundation::{CGPoint, CGRect, CGSize};
use oneshot::channel as oneshot_channel;
use tauri::AppHandle;
use tauri::async_runtime;
use tauri_plugin_global_shortcut::GlobalShortcutExt;
use tauri_plugin_global_shortcut::ShortcutState;
pub(crate) const EXTENSION_ID: &str = "Window Management";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = include_str!("./plugin.json");
pub(crate) fn perform_action_on_main_thread(
tauri_app_handle: &AppHandle,
action: Action,
) -> Result<(), String> {
let (tx, rx) = oneshot_channel();
tauri_app_handle
.run_on_main_thread(move || {
let res = perform_action(action).map_err(|e| e.to_string());
tx.send(res)
.expect("oneshot channel receiver unexpectedly dropped");
})
.expect("tauri internal bug, channel receiver dropped");
rx.recv()
.expect("oneshot channel sender unexpectedly dropped before sending function return value")
}
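// A minimal usage sketch, assuming a hypothetical Tauri command `snap_left`
// that is not defined anywhere in this change:
//
//     #[tauri::command]
//     fn snap_left(app: AppHandle) -> Result<(), String> {
//         perform_action_on_main_thread(&app, Action::LeftHalf)
//     }
//
// The round-trip through run_on_main_thread() matters because the AppKit and
// Accessibility calls behind perform_action() expect the main thread (see
// Error::NotInMainThread).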
/// Perform this action on the focused window.
fn perform_action(action: Action) -> Result<(), Error> {
let visible_frame = get_active_screen_visible_frame()?;
let frontmost_window_id = get_frontmost_window_id()?;
let frontmost_window_frame = get_frontmost_window_frame()?;
// Skip recording for Restore; otherwise the saved frame would be overwritten
// with the current one and restoring would be a no-op.
if !matches!(&action, Action::Restore) {
set_frontmost_window_last_frame(frontmost_window_id, frontmost_window_frame);
}
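// Note on the geometry below: visible_frame is already in the flipped
// (top-left origin, y grows downward) coordinate space that the
// Accessibility APIs expect, so "bottom half" is origin.y + height / 2.0,
// "right half" is origin.x + width / 2.0, and so on.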
match action {
Action::TopHalf => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomHalf => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 2.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::LeftHalf => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::RightHalf => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 2.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::CenterHalf => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 4.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopLeftQuarter => {
let origin = visible_frame.origin;
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopRightQuarter => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 2.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomLeftQuarter => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 2.0,
};
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomRightQuarter => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 2.0,
y: visible_frame.origin.y + visible_frame.size.height / 2.0,
};
let size = CGSize {
width: visible_frame.size.width / 2.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopLeftSixth => {
let origin = visible_frame.origin;
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopCenterSixth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 3.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopRightSixth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width * 2.0 / 3.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomLeftSixth => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 2.0,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomCenterSixth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 3.0,
y: visible_frame.origin.y + visible_frame.size.height / 2.0,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomRightSixth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width * 2.0 / 3.0,
y: visible_frame.origin.y + visible_frame.size.height / 2.0,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height / 2.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopThird => {
let origin = visible_frame.origin;
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 3.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::MiddleThird => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 3.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 3.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomThird => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height * 2.0 / 3.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 3.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::Center => {
let window_size = frontmost_window_frame.size;
let origin = CGPoint {
x: visible_frame.origin.x + (visible_frame.size.width - window_size.width) / 2.0,
y: visible_frame.origin.y + (visible_frame.size.height - window_size.height) / 2.0,
};
move_frontmost_window(origin.x, origin.y)
}
Action::FirstFourth => {
let origin = visible_frame.origin;
let size = CGSize {
width: visible_frame.size.width / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::SecondFourth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 4.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::ThirdFourth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width * 2.0 / 4.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::LastFourth => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width * 3.0 / 4.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::FirstThird => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::CenterThird => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 3.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::LastThird => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width * 2.0 / 3.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width / 3.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::FirstTwoThirds => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 2.0 / 3.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::CenterTwoThirds => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 6.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 2.0 / 3.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::LastTwoThirds => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 3.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 2.0 / 3.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::FirstThreeFourths => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 3.0 / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::CenterThreeFourths => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 8.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 3.0 / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::LastThreeFourths => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 4.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 3.0 / 4.0,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopThreeFourths => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height * 3.0 / 4.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomThreeFourths => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 4.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height * 3.0 / 4.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopTwoThirds => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height * 2.0 / 3.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::BottomTwoThirds => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 3.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height * 2.0 / 3.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopCenterTwoThirds => {
let origin = CGPoint {
x: visible_frame.origin.x + visible_frame.size.width / 6.0,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width * 2.0 / 3.0,
height: visible_frame.size.height * 2.0 / 3.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopFirstFourth => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 4.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopSecondFourth => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height / 4.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 4.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopThirdFourth => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height * 2.0 / 4.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 4.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::TopLastFourth => {
let origin = CGPoint {
x: visible_frame.origin.x,
y: visible_frame.origin.y + visible_frame.size.height * 3.0 / 4.0,
};
let size = CGSize {
width: visible_frame.size.width,
height: visible_frame.size.height / 4.0,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::MakeLarger => {
let window_origin = frontmost_window_frame.origin;
let window_size = frontmost_window_frame.size;
let delta_width = 20_f64;
let delta_height = window_size.height / window_size.width * delta_width;
let delta_origin_x = delta_width / 2.0;
let delta_origin_y = delta_height / 2.0;
let new_width = {
let possible_value = window_size.width + delta_width;
if possible_value > visible_frame.size.width {
visible_frame.size.width
} else {
possible_value
}
};
let new_height = {
let possible_value = window_size.height + delta_height;
if possible_value > visible_frame.size.height {
visible_frame.size.height
} else {
possible_value
}
};
let new_origin_x = {
let possible_value = window_origin.x - delta_origin_x;
if possible_value < visible_frame.origin.x {
visible_frame.origin.x
} else {
possible_value
}
};
let new_origin_y = {
let possible_value = window_origin.y - delta_origin_y;
if possible_value < visible_frame.origin.y {
visible_frame.origin.y
} else {
possible_value
}
};
let origin = CGPoint {
x: new_origin_x,
y: new_origin_y,
};
let size = CGSize {
width: new_width,
height: new_height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::MakeSmaller => {
let window_origin = frontmost_window_frame.origin;
let window_size = frontmost_window_frame.size;
let delta_width = 20_f64;
let delta_height = window_size.height / window_size.width * delta_width;
let delta_origin_x = delta_width / 2.0;
let delta_origin_y = delta_height / 2.0;
let origin = CGPoint {
x: window_origin.x + delta_origin_x,
y: window_origin.y + delta_origin_y,
};
let size = CGSize {
width: window_size.width - delta_width,
height: window_size.height - delta_height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::AlmostMaximize => {
let new_size = CGSize {
width: visible_frame.size.width * 0.9,
height: visible_frame.size.height * 0.9,
};
let new_origin = CGPoint {
x: visible_frame.origin.x + (visible_frame.size.width * 0.1),
y: visible_frame.origin.y + (visible_frame.size.height * 0.1),
};
let new_frame = CGRect {
origin: new_origin,
size: new_size,
};
set_frontmost_window_frame(new_frame)
}
Action::Maximize => {
let new_frame = CGRect {
origin: visible_frame.origin,
size: visible_frame.size,
};
set_frontmost_window_frame(new_frame)
}
Action::MaximizeWidth => {
let window_origin = frontmost_window_frame.origin;
let window_size = frontmost_window_frame.size;
let origin = CGPoint {
x: visible_frame.origin.x,
y: window_origin.y,
};
let size = CGSize {
width: visible_frame.size.width,
height: window_size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::MaximizeHeight => {
let window_origin = frontmost_window_frame.origin;
let window_size = frontmost_window_frame.size;
let origin = CGPoint {
x: window_origin.x,
y: visible_frame.origin.y,
};
let size = CGSize {
width: window_size.width,
height: visible_frame.size.height,
};
let new_frame = CGRect { origin, size };
set_frontmost_window_frame(new_frame)
}
Action::MoveUp => {
let window_origin = frontmost_window_frame.origin;
let new_y = (window_origin.y - 10.0).max(visible_frame.origin.y);
move_frontmost_window(window_origin.x, new_y)
}
Action::MoveDown => {
let window_origin = frontmost_window_frame.origin;
let window_size = frontmost_window_frame.size;
let new_y = (window_origin.y + 10.0)
.min(visible_frame.origin.y + visible_frame.size.height - window_size.height);
move_frontmost_window(window_origin.x, new_y)
}
Action::MoveLeft => {
let window_origin = frontmost_window_frame.origin;
let new_x = (window_origin.x - 10.0).max(visible_frame.origin.x);
move_frontmost_window(new_x, window_origin.y)
}
Action::MoveRight => {
let window_origin = frontmost_window_frame.origin;
let window_size = frontmost_window_frame.size;
let new_x = (window_origin.x + 10.0)
.min(visible_frame.origin.x + visible_frame.size.width - window_size.width);
move_frontmost_window(new_x, window_origin.y)
}
Action::NextDesktop => {
let Some(next_workspace_logical_id) = get_next_workspace_logical_id() else {
// nothing to do
return Ok(());
};
move_frontmost_window_to_workspace(next_workspace_logical_id)
}
Action::PreviousDesktop => {
let Some(previous_workspace_logical_id) = get_previous_workspace_logical_id() else {
// nothing to do
return Ok(());
};
// Now let's switch the workspace
move_frontmost_window_to_workspace(previous_workspace_logical_id)
}
Action::NextDisplay => {
const TOO_MANY_MONITORS: &str = "I don't think you can have so many monitors";
let frames = list_visible_frame_of_all_screens()?;
let n_frames = frames.len();
if n_frames == 0 {
return Err(Error::NoDisplay);
}
if n_frames == 1 {
return Ok(());
}
let index = frames
.iter()
.position(|fr| fr == &visible_frame)
.expect("active screen should be in the list");
let new_index: usize = {
let index_i32: i32 = index.try_into().expect(TOO_MANY_MONITORS);
let index_i32_plus_one = index_i32.checked_add(1).expect(TOO_MANY_MONITORS);
let final_value = index_i32_plus_one % n_frames as i32;
final_value
.try_into()
.expect("final value should be positive")
};
let new_frame = frames[new_index];
set_frontmost_window_frame(new_frame)
}
Action::PreviousDisplay => {
const TOO_MANY_MONITORS: &str = "I don't think you can have so many monitors";
let frames = list_visible_frame_of_all_screens()?;
let n_frames = frames.len();
if n_frames == 0 {
return Err(Error::NoDisplay);
}
if n_frames == 1 {
return Ok(());
}
let index = frames
.iter()
.position(|fr| fr == &visible_frame)
.expect("active screen should be in the list");
let new_index: usize = {
let index_i32: i32 = index.try_into().expect(TOO_MANY_MONITORS);
let index_i32_minus_one = index_i32 - 1;
let n_frames_i32: i32 = n_frames.try_into().expect(TOO_MANY_MONITORS);
let final_value = (index_i32_minus_one + n_frames_i32) % n_frames_i32;
final_value
.try_into()
.expect("final value should be positive")
};
let new_frame = frames[new_index];
set_frontmost_window_frame(new_frame)
}
Action::Restore => {
let Some(previous_frame) = get_frontmost_window_last_frame(frontmost_window_id) else {
// No previous frame recorded, nothing to do
return Ok(());
};
set_frontmost_window_frame(previous_frame)
}
Action::ToggleFullscreen => toggle_fullscreen(),
}
}
pub(crate) fn set_up_commands_hotkeys(
tauri_app_handle: &AppHandle,
wm_extension: &Extension,
) -> Result<(), String> {
for command in wm_extension
.commands
.as_ref()
.expect("Window Management extension has commands")
.iter()
.filter(|cmd| cmd.enabled)
{
if let Some(ref hotkey) = command.hotkey {
let on_opened = on_opened::on_opened(&command.id);
let extension_id_clone = command.id.clone();
tauri_app_handle
.global_shortcut()
.on_shortcut(hotkey.as_str(), move |tauri_app_handle, _hotkey, event| {
let on_opened_clone = on_opened.clone();
let extension_id_clone = extension_id_clone.clone();
let app_handle_clone = tauri_app_handle.clone();
if event.state() == ShortcutState::Pressed {
async_runtime::spawn(async move {
let result = open(app_handle_clone, on_opened_clone, None).await;
if let Err(msg) = result {
log::warn!(
"failed to open extension [{}], error [{}]",
extension_id_clone,
msg
);
}
});
}
})
.map_err(|e| e.to_string())?;
}
}
Ok(())
}
pub(crate) fn unset_commands_hotkeys(
tauri_app_handle: &AppHandle,
wm_extension: &Extension,
) -> Result<(), String> {
for command in wm_extension
.commands
.as_ref()
.expect("Window Management extension has commands")
.iter()
.filter(|cmd| cmd.enabled)
{
if let Some(ref hotkey) = command.hotkey {
tauri_app_handle
.global_shortcut()
.unregister(hotkey.as_str())
.map_err(|e| e.to_string())?;
}
}
Ok(())
}
pub(crate) fn set_up_command_hotkey(
tauri_app_handle: &AppHandle,
wm_extension: &Extension,
command_id: &str,
) -> Result<(), String> {
let commands = wm_extension
.commands
.as_ref()
.expect("Window Management has commands");
let opt_command = commands.iter().find(|ext| ext.id == command_id);
let Some(command) = opt_command else {
panic!("Window Management command does not exist {}", command_id);
};
if let Some(ref hotkey) = command.hotkey {
let on_opened = on_opened::on_opened(&command.id);
let extension_id_clone = command.id.clone();
tauri_app_handle
.global_shortcut()
.on_shortcut(hotkey.as_str(), move |tauri_app_handle, _hotkey, event| {
let on_opened_clone = on_opened.clone();
let extension_id_clone = extension_id_clone.clone();
let app_handle_clone = tauri_app_handle.clone();
if event.state() == ShortcutState::Pressed {
async_runtime::spawn(async move {
let result = open(app_handle_clone, on_opened_clone, None).await;
if let Err(msg) = result {
log::warn!(
"failed to open extension [{}], error [{}]",
extension_id_clone,
msg
);
}
});
}
})
.map_err(|e| e.to_string())?;
}
Ok(())
}
pub(crate) fn unset_command_hotkey(
tauri_app_handle: &AppHandle,
wm_extension: &Extension,
command_id: &str,
) -> Result<(), String> {
let commands = wm_extension
.commands
.as_ref()
.expect("Window Management has commands");
let opt_command = commands.iter().find(|ext| ext.id == command_id);
let Some(command) = opt_command else {
panic!("Window Management command does not exist {}", command_id);
};
if let Some(ref hotkey) = command.hotkey {
tauri_app_handle
.global_shortcut()
.unregister(hotkey.as_str())
.map_err(|e| e.to_string())?;
}
Ok(())
}
pub(crate) fn register_command_hotkey(
tauri_app_handle: &AppHandle,
command_id: &str,
hotkey: &str,
) -> Result<(), String> {
let on_opened = on_opened::on_opened(&command_id);
let extension_id_clone = command_id.to_string();
tauri_app_handle
.global_shortcut()
.on_shortcut(hotkey, move |tauri_app_handle, _hotkey, event| {
let on_opened_clone = on_opened.clone();
let extension_id_clone = extension_id_clone.clone();
let app_handle_clone = tauri_app_handle.clone();
if event.state() == ShortcutState::Pressed {
async_runtime::spawn(async move {
let result = open(app_handle_clone, on_opened_clone, None).await;
if let Err(msg) = result {
log::warn!(
"failed to open extension [{}], error [{}]",
extension_id_clone,
msg
);
}
});
}
})
.map_err(|e| e.to_string())?;
Ok(())
}
pub(crate) fn unregister_command_hotkey(
tauri_app_handle: &AppHandle,
wm_extension: &Extension,
command_id: &str,
) -> Result<(), String> {
let commands = wm_extension
.commands
.as_ref()
.expect("Window Management has commands");
let opt_command = commands.iter().find(|ext| ext.id == command_id);
let Some(command) = opt_command else {
panic!("Window Management command does not exist {}", command_id);
};
let Some(ref hotkey) = command.hotkey else {
return Ok(());
};
tauri_app_handle
.global_shortcut()
.unregister(hotkey.as_str())
.map_err(|e| e.to_string())?;
Ok(())
}

View File

@@ -0,0 +1,10 @@
use super::actions::Action;
use crate::common::document::OnOpened;
use serde_plain;
pub(crate) fn on_opened(command_id: &str) -> OnOpened {
let action: Action = serde_plain::from_str(command_id).unwrap_or_else(|_| {
panic!("Window Management commands IDs should be valid for `enum Action`, someone corrupts the JSON file");
});
OnOpened::WindowManagementAction { action }
}
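// A hedged illustration of the mapping above: the command IDs in plugin.json
// are the Action variant names, so serde_plain::from_str::<Action>("LeftHalf")
// yields Action::LeftHalf, and an ID that matches no variant hits the panic
// branch.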

View File

@@ -0,0 +1,415 @@
{
"id": "Window Management",
"name": "Window Management",
"platforms": [
"macos"
],
"description": "Resize, reorganize and move your focused window effortlessly",
"icon": "font_a-Windowmanagement",
"type": "extension",
"category": "Utilities",
"tags": [
"Productivity"
],
"commands": [
{
"id": "TopHalf",
"name": "Top Half",
"description": "Move the focused window to fill left half of the screen.",
"icon": "font_a-TopHalf",
"type": "command"
},
{
"id": "BottomHalf",
"name": "Bottom Half",
"description": "Move the focused window to fill bottom half of the screen.",
"icon": "font_a-BottomHalf",
"type": "command"
},
{
"id": "LeftHalf",
"name": "Left Half",
"description": "Move the focused window to fill left half of the screen.",
"icon": "font_a-LeftHalf",
"type": "command"
},
{
"id": "RightHalf",
"name": "Right Half",
"description": "Move the focused window to fill right half of the screen.",
"icon": "font_a-RightHalf",
"type": "command"
},
{
"id": "CenterHalf",
"name": "Center Half",
"description": "Move the focused window to fill center half of the screen.",
"icon": "font_a-CenterHalf",
"type": "command"
},
{
"id": "Maximize",
"name": "Maximize",
"description": "Maximize the focused window to fit the screen.",
"icon": "font_Maximize",
"type": "command"
},
{
"id": "TopLeftQuarter",
"name": "Top Left Quarter",
"description": "Resize the focused window to the top left quarter of the screen.",
"icon": "font_a-TopLeftQuarter",
"type": "command"
},
{
"id": "TopRightQuarter",
"name": "Top Right Quarter",
"description": "Resize the focused window to the top right quarter of the screen.",
"icon": "font_a-TopRightQuarter",
"type": "command"
},
{
"id": "BottomLeftQuarter",
"name": "Bottom Left Quarter",
"description": "Resize the focused window to the bottom left quarter of the screen.",
"icon": "font_a-BottomLeftQuarter",
"type": "command"
},
{
"id": "BottomRightQuarter",
"name": "Bottom Right Quarter",
"description": "Resize the focused window to the bottom right quarter of the screen.",
"icon": "font_a-BottomRightQuarter",
"type": "command"
},
{
"id": "TopLeftSixth",
"name": "Top Left Sixth",
"description": "Resize the focused window to the top left sixth of the screen.",
"icon": "font_a-TopLeftSixth",
"type": "command"
},
{
"id": "TopCenterSixth",
"name": "Top Center Sixth",
"description": "Resize the focused window to the top center sixth of the screen.",
"icon": "font_a-TopCenterSixth",
"type": "command"
},
{
"id": "TopRightSixth",
"name": "Top Right Sixth",
"description": "Resize the focused window to the top right sixth of the screen.",
"icon": "font_a-TopRightSixth",
"type": "command"
},
{
"id": "BottomLeftSixth",
"name": "Bottom Left Sixth",
"description": "Resize the focused window to the bottom left sixth of the screen.",
"icon": "font_a-BottomLeftSixth",
"type": "command"
},
{
"id": "BottomCenterSixth",
"name": "Bottom Center Sixth",
"description": "Resize the focused window to the bottom center sixth of the screen.",
"icon": "font_a-BottomCenterSixth",
"type": "command"
},
{
"id": "BottomRightSixth",
"name": "Bottom Right Sixth",
"description": "Resize the focused window to the bottom right sixth of the screen.",
"icon": "font_a-BottomRightSixth",
"type": "command"
},
{
"id": "TopThird",
"name": "Top Third",
"description": "Resize the focused window to the top third of the screen.",
"icon": "font_a-TopThirdFourth",
"type": "command"
},
{
"id": "MiddleThird",
"name": "Middle Third",
"description": "Resize the focused window to the middle third of the screen.",
"icon": "font_a-MiddleThird",
"type": "command"
},
{
"id": "BottomThird",
"name": "Bottom Third",
"description": "Resize the focused window to the bottom third of the screen.",
"icon": "font_a-BottomThird",
"type": "command"
},
{
"id": "Center",
"name": "Center",
"description": "Center the focused window in the screen.",
"icon": "font_Center",
"type": "command"
},
{
"id": "FirstFourth",
"name": "First Fourth",
"description": "Resize the focused window to the first fourth of the screen.",
"icon": "font_a-FirstFourth",
"type": "command"
},
{
"id": "SecondFourth",
"name": "Second Fourth",
"description": "Resize the focused window to the second fourth of the screen.",
"icon": "font_a-SecondFourth",
"type": "command"
},
{
"id": "ThirdFourth",
"name": "Third Fourth",
"description": "Resize the focused window to the third fourth of the screen.",
"icon": "font_a-ThirdFourth",
"type": "command"
},
{
"id": "LastFourth",
"name": "Last Fourth",
"description": "Resize the focused window to the last fourth of the screen.",
"icon": "font_a-LastFourth",
"type": "command"
},
{
"id": "FirstThird",
"name": "First Third",
"description": "Resize the focused window to the first third of the screen.",
"icon": "font_a-FirstThird",
"type": "command"
},
{
"id": "CenterThird",
"name": "Center Third",
"description": "Resize the focused window to the center third of the screen.",
"icon": "font_a-CenterThird",
"type": "command"
},
{
"id": "LastThird",
"name": "Last Third",
"description": "Resize the focused window to the last third of the screen.",
"icon": "font_a-LastThird",
"type": "command"
},
{
"id": "FirstTwoThirds",
"name": "First Two Thirds",
"description": "Resize the focused window to the first two thirds of the screen.",
"icon": "font_a-FirstTwoThirds",
"type": "command"
},
{
"id": "CenterTwoThirds",
"name": "Center Two Thirds",
"description": "Resize the focused window to the center two thirds of the screen.",
"icon": "font_a-CenterTwoThirds",
"type": "command"
},
{
"id": "LastTwoThirds",
"name": "Last Two Thirds",
"description": "Resize the focused window to the last two thirds of the screen.",
"icon": "font_a-LastTwoThirds",
"type": "command"
},
{
"id": "FirstThreeFourths",
"name": "First Three Fourths",
"description": "Resize the focused window to the first three fourths of the screen.",
"icon": "font_a-FirstThreeFourths",
"type": "command"
},
{
"id": "CenterThreeFourths",
"name": "Center Three Fourths",
"description": "Resize the focused window to the center three fourths of the screen.",
"icon": "font_a-CenterThreeFourths",
"type": "command"
},
{
"id": "LastThreeFourths",
"name": "Last Three Fourths",
"description": "Resize the focused window to the last three fourths of the screen.",
"icon": "font_a-LastThreeFourths",
"type": "command"
},
{
"id": "TopThreeFourths",
"name": "Top Three Fourths",
"description": "Resize the focused window to the top three fourths of the screen.",
"icon": "font_a-TopThreeFourths",
"type": "command"
},
{
"id": "BottomThreeFourths",
"name": "Bottom Three Fourths",
"description": "Resize the focused window to the bottom three fourths of the screen.",
"icon": "font_a-BottomThreeFourths",
"type": "command"
},
{
"id": "TopTwoThirds",
"name": "Top Two Thirds",
"description": "Resize the focused window to the top two thirds of the screen.",
"icon": "font_a-TopTwoThirds",
"type": "command"
},
{
"id": "BottomTwoThirds",
"name": "Bottom Two Thirds",
"description": "Resize the focused window to the bottom two thirds of the screen.",
"icon": "font_a-BottomTwoThirds",
"type": "command"
},
{
"id": "TopCenterTwoThirds",
"name": "Top Center Two Thirds",
"description": "Resize the focused window to the top center two thirds of the screen.",
"icon": "font_a-TopCenterTwoThirds",
"type": "command"
},
{
"id": "TopFirstFourth",
"name": "Top First Fourth",
"description": "Resize the focused window to the top first fourth of the screen.",
"icon": "font_a-TopFirstFourth",
"type": "command"
},
{
"id": "TopSecondFourth",
"name": "Top Second Fourth",
"description": "Resize the focused window to the top second fourth of the screen.",
"icon": "font_a-TopSecondFourth",
"type": "command"
},
{
"id": "TopThirdFourth",
"name": "Top Third Fourth",
"description": "Resize the focused window to the top third fourth of the screen.",
"icon": "font_a-TopThirdFourth",
"type": "command"
},
{
"id": "TopLastFourth",
"name": "Top Last Fourth",
"description": "Resize the focused window to the top last fourth of the screen.",
"icon": "font_a-TopLastFourth",
"type": "command"
},
{
"id": "MakeLarger",
"name": "Make Larger",
"description": "Increase the focused window until it reaches the screen size.",
"icon": "font_a-MakeLarger",
"type": "command"
},
{
"id": "MakeSmaller",
"name": "Make Smaller",
"description": "Decrease the focused window until it reaches its minimal size.",
"icon": "font_a-MakeSmaller",
"type": "command"
},
{
"id": "AlmostMaximize",
"name": "Almost Maximize",
"description": "Maximize the focused window to almost fit the screen.",
"icon": "font_a-AlmostMaximize",
"type": "command"
},
{
"id": "MaximizeWidth",
"name": "Maximize Width",
"description": "Maximize width of the focused window to fit the screen.",
"icon": "font_a-MaximizeWidth",
"type": "command"
},
{
"id": "MaximizeHeight",
"name": "Maximize Height",
"description": "Maximize height of the focused window to fit the screen.",
"icon": "font_a-MaximizeHeight",
"type": "command"
},
{
"id": "MoveUp",
"name": "Move Up",
"description": "Move the focused window to the top edge of the screen.",
"icon": "font_a-MoveUp",
"type": "command"
},
{
"id": "MoveDown",
"name": "Move Down",
"description": "Move the focused window to the bottom of the screen.",
"icon": "font_a-MoveDown",
"type": "command"
},
{
"id": "MoveLeft",
"name": "Move Left",
"description": "Move the focused window to the left edge of the screen.",
"icon": "font_a-MoveLeft",
"type": "command"
},
{
"id": "MoveRight",
"name": "Move Right",
"description": "Move the focused window to the right edge of the screen.",
"icon": "font_a-MoveRight",
"type": "command"
},
{
"id": "NextDesktop",
"name": "Next Desktop",
"description": "Move the focused window to the next desktop.",
"icon": "font_a-NextDesktop",
"type": "command"
},
{
"id": "PreviousDesktop",
"name": "Previous Desktop",
"description": "Move the focused window to the previous desktop.",
"icon": "font_a-PreviousDesktop",
"type": "command"
},
{
"id": "NextDisplay",
"name": "Next Display",
"description": "Move the focused window to the next display.",
"icon": "font_a-NextDisplay",
"type": "command"
},
{
"id": "PreviousDisplay",
"name": "Previous Display",
"description": "Move the focused window to the previous display.",
"icon": "font_a-PreviousDisplay",
"type": "command"
},
{
"id": "Restore",
"name": "Restore",
"description": "Restore the focused window to its last position.",
"icon": "font_Restore",
"type": "command"
},
{
"id": "ToggleFullscreen",
"name": "Toggle Fullscreen",
"description": "Toggle fullscreen mode.",
"icon": "font_a-ToggleFullscreen",
"type": "command"
}
]
}


@@ -0,0 +1,127 @@
use super::EXTENSION_ID;
use crate::common::document::{DataSourceReference, Document};
use crate::common::{
error::SearchError,
search::{QueryResponse, QuerySource, SearchQuery},
traits::SearchSource,
};
use crate::extension::built_in::{get_built_in_extension_directory, load_extension_from_json_file};
use crate::extension::{ExtensionType, LOCAL_QUERY_SOURCE_TYPE, calculate_text_similarity};
use async_trait::async_trait;
use hostname;
use tauri::AppHandle;
/// A search source to allow users to search WM actions.
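/// The searchable actions come from the built-in window management extension's
/// `plugin.json` (the command list shown above), which is reloaded on every query.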
pub(crate) struct WindowManagementSearchSource;
#[async_trait]
impl SearchSource for WindowManagementSearchSource {
fn get_type(&self) -> QuerySource {
QuerySource {
r#type: LOCAL_QUERY_SOURCE_TYPE.into(),
name: hostname::get()
.unwrap_or(EXTENSION_ID.into())
.to_string_lossy()
.into(),
id: EXTENSION_ID.into(),
}
}
async fn search(
&self,
tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
let from = usize::try_from(query.from).expect("from too big");
let size = usize::try_from(query.size).expect("size too big");
let query_string = query_string.trim();
if query_string.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
let query_string_lowercase = query_string.to_lowercase();
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(&tauri_app_handle),
super::EXTENSION_ID,
)
.map_err(SearchError::InternalError)?;
let commands = extension.commands.expect("this extension has commands");
let mut hits: Vec<(Document, f64)> = Vec::new();
// We know they are all commands
let command_type_string = ExtensionType::Command.to_string();
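// Score each enabled command by fuzzy-matching the lowercased query against
// its name and, if present, its alias; the two similarity scores are summed.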
for command in commands.iter().filter(|ext| ext.enabled) {
let score = {
let mut score = 0_f64;
if let Some(name_score) =
calculate_text_similarity(&query_string_lowercase, &command.name.to_lowercase())
{
score += name_score;
}
if let Some(ref alias) = command.alias {
if let Some(alias_score) =
calculate_text_similarity(&query_string_lowercase, &alias.to_lowercase())
{
score += alias_score;
}
}
score
};
if score > 0.0 {
let on_opened = super::on_opened::on_opened(&command.id);
let url = on_opened.url();
let document = Document {
id: command.id.clone(),
title: Some(command.name.clone()),
icon: Some(command.icon.clone()),
on_opened: Some(on_opened),
url: Some(url),
category: Some(command_type_string.clone()),
source: Some(DataSourceReference {
id: Some(command_type_string.clone()),
name: Some(command_type_string.clone()),
icon: None,
r#type: Some(command_type_string.clone()),
}),
..Default::default()
};
hits.push((document, score));
}
}
hits.sort_by(|(_, score_a), (_, score_b)| {
score_a
.partial_cmp(&score_b)
.expect("expect no NAN/INFINITY/...")
});
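// Record the total before pagination; `from` and `size` then select the
// requested slice of the sorted hits.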
let total_hits = hits.len();
let from_size_applied = hits.into_iter().skip(from).take(size).collect();
Ok(QueryResponse {
source: self.get_type(),
hits: from_size_applied,
total_hits,
})
}
}

File diff suppressed because it is too large


@@ -0,0 +1,789 @@
//! Coco has 4 sources of `plugin.json` to check and validate:
//!
//! 1. From coco-extensions repository
//!
//! Granted, the Coco app won't check these files directly, but the code here
//! will run in that repository's CI to prevent errors in the first place.
//!
//! 2. From the "<data directory>/third_party_extensions" directory
//! 3. Imported via "Import Local Extension"
//! 4. Downloaded from the "store/extension/<extension ID>/_download" API
//!
//! This file contains the checks that are general enough to be applied to all
//! these four sources.
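//!
//! As a rough usage sketch (hypothetical caller, error handling simplified),
//! each of these sources runs the same validation after parsing the file:
//!
//! ```ignore
//! let extension: Extension = serde_json::from_str(&plugin_json_content)
//!     .map_err(|e| e.to_string())?;
//! // Reject the extension early if it violates any of the rules below.
//! general_check(&extension)?;
//! ```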
use crate::extension::Extension;
use crate::extension::ExtensionType;
use crate::util::platform::Platform;
use std::collections::HashSet;
pub(crate) fn general_check(extension: &Extension) -> Result<(), String> {
// Check main extension
check_main_extension_only(extension)?;
check_main_extension_or_sub_extension(extension, &format!("extension [{}]", extension.id))?;
// `None` if `extension` is compatible with all the platforms. Otherwise `Some(limited_platforms)`
let limited_supported_platforms = match extension.platforms.as_ref() {
Some(platforms) => {
if platforms.len() == Platform::num_of_supported_platforms() {
None
} else {
Some(platforms)
}
}
None => None,
};
// Check sub extensions
let commands = match extension.commands {
Some(ref v) => v.as_slice(),
None => &[],
};
let scripts = match extension.scripts {
Some(ref v) => v.as_slice(),
None => &[],
};
let quicklinks = match extension.quicklinks {
Some(ref v) => v.as_slice(),
None => &[],
};
let views = match extension.views {
Some(ref v) => v.as_slice(),
None => &[],
};
let sub_extensions = [commands, scripts, quicklinks, views].concat();
let mut sub_extension_ids = HashSet::new();
for sub_extension in sub_extensions.iter() {
check_sub_extension_only(&extension.id, sub_extension, limited_supported_platforms)?;
check_main_extension_or_sub_extension(
extension,
&format!("sub-extension [{}-{}]", extension.id, sub_extension.id),
)?;
if !sub_extension_ids.insert(sub_extension.id.as_str()) {
// extension ID already exists
return Err(format!(
"sub-extension with ID [{}] already exists",
sub_extension.id
));
}
}
Ok(())
}
/// This checks the main extension only, it won't check sub-extensions.
fn check_main_extension_only(extension: &Extension) -> Result<(), String> {
// Group and Extension cannot have alias
if extension.alias.is_some() {
if extension.r#type == ExtensionType::Group || extension.r#type == ExtensionType::Extension
{
return Err(format!(
"invalid extension [{}], extension of type [{:?}] cannot have alias",
extension.id, extension.r#type
));
}
}
// Group and Extension cannot have hotkey
if extension.hotkey.is_some() {
if extension.r#type == ExtensionType::Group || extension.r#type == ExtensionType::Extension
{
return Err(format!(
"invalid extension [{}], extension of type [{:?}] cannot have hotkey",
extension.id, extension.r#type
));
}
}
if extension.commands.is_some()
|| extension.scripts.is_some()
|| extension.quicklinks.is_some()
|| extension.views.is_some()
{
if extension.r#type != ExtensionType::Group && extension.r#type != ExtensionType::Extension
{
return Err(format!(
"invalid extension [{}], only extension of type [Group] and [Extension] can have sub-extensions",
extension.id,
));
}
}
if extension.settings.is_some() {
// Sub-extensions are all searchable, so this check is only for main extensions.
if !extension.searchable() {
return Err(format!(
"invalid extension {}, field [settings] is currently only allowed in searchable extension, this type of extension is not searchable [{}]",
extension.id, extension.r#type
));
}
}
Ok(())
}
fn check_sub_extension_only(
extension_id: &str,
sub_extension: &Extension,
limited_platforms: Option<&HashSet<Platform>>,
) -> Result<(), String> {
if sub_extension.r#type == ExtensionType::Group
|| sub_extension.r#type == ExtensionType::Extension
{
return Err(format!(
"invalid sub-extension [{}-{}]: sub-extensions should not be of type [Group] or [Extension]",
extension_id, sub_extension.id
));
}
if sub_extension.commands.is_some()
|| sub_extension.scripts.is_some()
|| sub_extension.quicklinks.is_some()
|| sub_extension.views.is_some()
{
return Err(format!(
"invalid sub-extension [{}-{}]: fields [commands/scripts/quicklinks/views] should not be set in sub-extensions",
extension_id, sub_extension.id
));
}
if sub_extension.developer.is_some() {
return Err(format!(
"invalid sub-extension [{}-{}]: field [developer] should not be set in sub-extensions",
extension_id, sub_extension.id
));
}
if let Some(platforms_supported_by_main_extension) = limited_platforms {
match sub_extension.platforms {
Some(ref platforms_supported_by_sub_extension) => {
let diff = platforms_supported_by_sub_extension
.difference(&platforms_supported_by_main_extension)
.into_iter()
.map(|p| p.to_string())
.collect::<Vec<String>>();
if !diff.is_empty() {
return Err(format!(
"invalid sub-extension [{}-{}]: it supports platforms {:?} that are not supported by the main extension",
extension_id, sub_extension.id, diff
));
}
}
None => {
// If `sub_extension.platforms` is None, it has the same value as the
// main extension's `platforms` field, so we don't need to check it.
}
}
}
Ok(())
}
fn check_main_extension_or_sub_extension(
extension: &Extension,
identifier: &str,
) -> Result<(), String> {
// If field `action` is Some, then it should be a Command
if extension.action.is_some() && extension.r#type != ExtensionType::Command {
return Err(format!(
"invalid {}, field [action] is set for a non-Command extension",
identifier
));
}
if extension.r#type == ExtensionType::Command && extension.action.is_none() {
return Err(format!(
"invalid {}, field [action] should be set for a Command extension",
identifier
));
}
// If field `quicklink` is Some, then it should be a Quicklink
if extension.quicklink.is_some() && extension.r#type != ExtensionType::Quicklink {
return Err(format!(
"invalid {}, field [quicklink] is set for a non-Quicklink extension",
identifier
));
}
if extension.r#type == ExtensionType::Quicklink && extension.quicklink.is_none() {
return Err(format!(
"invalid {}, field [quicklink] should be set for a Quicklink extension",
identifier
));
}
// If field `page` is Some, then it should be a View
if extension.page.is_some() && extension.r#type != ExtensionType::View {
return Err(format!(
"invalid {}, field [page] is set for a non-View extension",
identifier
));
}
if extension.r#type == ExtensionType::View && extension.page.is_none() {
return Err(format!(
"invalid {}, field [page] should be set for a View extension",
identifier
));
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
use crate::extension::{
CommandAction, ExtensionSettings, Quicklink, QuicklinkLink, QuicklinkLinkComponent,
};
/// Helper function to create a basic valid extension
fn create_basic_extension(id: &str, extension_type: ExtensionType) -> Extension {
let page = if extension_type == ExtensionType::View {
Some("index.html".into())
} else {
None
};
Extension {
id: id.to_string(),
name: "Test Extension".to_string(),
developer: None,
platforms: None,
description: "Test description".to_string(),
icon: "test-icon.png".to_string(),
r#type: extension_type,
action: None,
quicklink: None,
commands: None,
scripts: None,
quicklinks: None,
views: None,
alias: None,
hotkey: None,
enabled: true,
page,
permission: None,
settings: None,
screenshots: None,
url: None,
version: None,
}
}
/// Helper function to create a command action
fn create_command_action() -> CommandAction {
CommandAction {
exec: "echo".to_string(),
args: Some(vec!["test".to_string()]),
}
}
/// Helper function to create a quicklink
fn create_quicklink() -> Quicklink {
Quicklink {
link: QuicklinkLink {
components: vec![QuicklinkLinkComponent::StaticStr(
"https://example.com".to_string(),
)],
},
open_with: None,
}
}
/* test_check_main_extension_only */
#[test]
fn test_group_cannot_have_alias() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.alias = Some("group-alias".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have alias"));
}
#[test]
fn test_extension_cannot_have_alias() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
extension.alias = Some("ext-alias".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have alias"));
}
#[test]
fn test_group_cannot_have_hotkey() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.hotkey = Some("cmd+g".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have hotkey"));
}
#[test]
fn test_extension_cannot_have_hotkey() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
extension.hotkey = Some("cmd+e".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have hotkey"));
}
#[test]
fn test_non_container_types_cannot_have_sub_extensions() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.commands = Some(vec![create_basic_extension(
"sub-cmd",
ExtensionType::Command,
)]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("only extension of type [Group] and [Extension] can have sub-extensions")
);
}
#[test]
fn test_non_searchable_extension_set_field_settings() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.settings = Some(ExtensionSettings {
hide_before_open: None,
});
let error_msg = general_check(&extension).unwrap_err();
assert!(
error_msg
.contains("field [settings] is currently only allowed in searchable extension")
);
let mut extension = create_basic_extension("test-extension", ExtensionType::Extension);
extension.settings = Some(ExtensionSettings {
hide_before_open: None,
});
let error_msg = general_check(&extension).unwrap_err();
assert!(
error_msg
.contains("field [settings] is currently only allowed in searchable extension")
);
}
/* test_check_main_extension_only */
/* test check_main_extension_or_sub_extension */
#[test]
fn test_command_must_have_action() {
let extension = create_basic_extension("test-cmd", ExtensionType::Command);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [action] should be set for a Command extension")
);
}
#[test]
fn test_non_command_cannot_have_action() {
let mut extension = create_basic_extension("test-script", ExtensionType::Script);
extension.action = Some(create_command_action());
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [action] is set for a non-Command extension")
);
}
#[test]
fn test_quicklink_must_have_quicklink_field() {
let extension = create_basic_extension("test-quicklink", ExtensionType::Quicklink);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [quicklink] should be set for a Quicklink extension")
);
}
#[test]
fn test_non_quicklink_cannot_have_quicklink_field() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.quicklink = Some(create_quicklink());
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [quicklink] is set for a non-Quicklink extension")
);
}
#[test]
fn test_view_must_have_page_field() {
let mut extension = create_basic_extension("test-view", ExtensionType::View);
// create_basic_extension() will set its page field if type is View, clear it
extension.page = None;
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [page] should be set for a View extension")
);
}
#[test]
fn test_non_view_cannot_have_page_field() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.page = Some("index.html".into());
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [page] is set for a non-View extension")
);
}
/* test check_main_extension_or_sub_extension */
/* Test check_sub_extension_only */
#[test]
fn test_sub_extension_cannot_be_group() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let sub_group = create_basic_extension("sub-group", ExtensionType::Group);
extension.commands = Some(vec![sub_group]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extensions should not be of type [Group] or [Extension]")
);
}
#[test]
fn test_sub_extension_cannot_be_extension() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
let sub_ext = create_basic_extension("sub-ext", ExtensionType::Extension);
extension.scripts = Some(vec![sub_ext]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extensions should not be of type [Group] or [Extension]")
);
}
#[test]
fn test_sub_extension_cannot_have_developer() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.developer = Some("test-dev".to_string());
extension.commands = Some(vec![sub_cmd]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [developer] should not be set in sub-extensions")
);
}
#[test]
fn test_sub_extension_cannot_have_sub_extensions() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.commands = Some(vec![create_basic_extension(
"nested-cmd",
ExtensionType::Command,
)]);
extension.commands = Some(vec![sub_cmd]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains(
"fields [commands/scripts/quicklinks/views] should not be set in sub-extensions"
));
}
/* Test check_sub_extension_only */
#[test]
fn test_duplicate_sub_extension_ids() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut cmd1 = create_basic_extension("duplicate-id", ExtensionType::Command);
cmd1.action = Some(create_command_action());
let mut cmd2 = create_basic_extension("duplicate-id", ExtensionType::Command);
cmd2.action = Some(create_command_action());
extension.commands = Some(vec![cmd1, cmd2]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extension with ID [duplicate-id] already exists")
);
}
#[test]
fn test_duplicate_ids_across_different_sub_extension_types() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut cmd = create_basic_extension("same-id", ExtensionType::Command);
cmd.action = Some(create_command_action());
let script = create_basic_extension("same-id", ExtensionType::Script);
extension.commands = Some(vec![cmd]);
extension.scripts = Some(vec![script]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extension with ID [same-id] already exists")
);
}
#[test]
fn test_valid_group_extension() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.commands = Some(vec![create_basic_extension("cmd1", ExtensionType::Command)]);
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_extension_type() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
extension.scripts = Some(vec![create_basic_extension(
"script1",
ExtensionType::Script,
)]);
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_command_extension() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_quicklink_extension() {
let mut extension = create_basic_extension("test-quicklink", ExtensionType::Quicklink);
extension.quicklink = Some(create_quicklink());
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_complex_extension() {
let mut extension = create_basic_extension("spotify-controls", ExtensionType::Extension);
// Add valid commands
let mut play_pause = create_basic_extension("play-pause", ExtensionType::Command);
play_pause.action = Some(create_command_action());
let mut next_track = create_basic_extension("next-track", ExtensionType::Command);
next_track.action = Some(create_command_action());
let mut prev_track = create_basic_extension("prev-track", ExtensionType::Command);
prev_track.action = Some(create_command_action());
extension.commands = Some(vec![play_pause, next_track, prev_track]);
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_single_layer_command() {
let mut extension = create_basic_extension("empty-trash", ExtensionType::Command);
extension.action = Some(create_command_action());
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_command_alias_and_hotkey_allowed() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.alias = Some("cmd-alias".to_string());
extension.hotkey = Some("cmd+t".to_string());
assert!(general_check(&extension).is_ok());
}
/*
* Tests for the check that a sub-extension cannot support platforms that are
* not supported by the main extension
*
* Start here
*/
#[test]
fn test_platform_validation_both_none() {
// Case 1: main extension's platforms = None, sub extension's platforms = None
// Should return Ok(())
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = None;
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = None;
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_all_sub_none() {
// Case 2: main extension's platforms = Some(all platforms), sub extension's platforms = None
// Should return Ok(())
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(Platform::all());
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = None;
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_none_sub_some() {
// Case 3: main extension's platforms = None, sub extension's platforms = Some([Platform::Macos])
// Should return Ok(()) because None means supports all platforms
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = None;
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Macos]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_all_sub_subset() {
// Case 4: main extension's platforms = Some(all platforms), sub extension's platforms = Some([Platform::Macos])
// Should return Ok(()) because sub extension supports a subset of main extension's platforms
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(Platform::all());
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Macos]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_limited_sub_unsupported() {
// Case 5: main extension's platforms = Some([Platform::Macos]), sub extension's platforms = Some([Platform::Linux])
// Should return Err because sub extension supports a platform not supported by main extension
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(HashSet::from([Platform::Macos]));
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Linux]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_err());
let error_msg = result.unwrap_err();
assert!(error_msg.contains("it supports platforms"));
assert!(error_msg.contains("that are not supported by the main extension"));
assert!(error_msg.contains("Linux")); // Should mention the unsupported platform
}
#[test]
fn test_platform_validation_main_partial_sub_unsupported() {
// Case 6: main extension's platforms = Some([Platform::Macos, Platform::Windows]), sub extension's platforms = Some([Platform::Linux])
// Should return Err because sub extension supports a platform not supported by main extension
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(HashSet::from([Platform::Macos, Platform::Windows]));
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Linux]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_err());
let error_msg = result.unwrap_err();
assert!(error_msg.contains("it supports platforms"));
assert!(error_msg.contains("that are not supported by the main extension"));
assert!(error_msg.contains("Linux")); // Should mention the unsupported platform
}
#[test]
fn test_platform_validation_main_limited_sub_none() {
// Case 7: main extension's platforms = Some([Platform::Macos]), sub extension's platforms = None
// Should return Ok(()) because when sub extension's platforms is None, it inherits main extension's platforms
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(HashSet::from([Platform::Macos]));
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = None;
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
/*
* Tests for the check that a sub-extension cannot support platforms that are
* not supported by the main extension
*
* End here
*/
}


@@ -0,0 +1,303 @@
use crate::extension::third_party::check::general_check;
use crate::extension::third_party::install::{
convert_page, filter_out_incompatible_sub_extensions, is_extension_installed,
};
use crate::extension::third_party::{
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE, get_third_party_extension_directory,
};
use crate::extension::{
Extension, canonicalize_relative_icon_path, canonicalize_relative_page_path,
};
use crate::extension::{ExtensionType, PLUGIN_JSON_FILE_NAME};
use crate::util::platform::Platform;
use serde_json::Value as Json;
use std::path::Path;
use std::path::PathBuf;
use tauri::AppHandle;
use tokio::fs;
/// All the extensions installed from local files will belong to a special developer
/// "__local__".
const DEVELOPER_ID_LOCAL: &str = "__local__";
/// Install the extension specified by `path`.
///
/// `path` should point to a directory with the following structure:
///
/// ```text
/// extension-directory/
/// ├── assets/
/// │ ├── icon.png
/// │ └── other-assets...
/// └── plugin.json
/// ```
#[tauri::command]
pub(crate) async fn install_local_extension(
tauri_app_handle: AppHandle,
path: PathBuf,
) -> Result<(), String> {
let extension_dir_name = path
.file_name()
.ok_or_else(|| "Invalid extension: no directory name".to_string())?
.to_str()
.ok_or_else(|| "Invalid extension: non-UTF8 extension id".to_string())?;
// We use the extension directory name as the extension ID.
let extension_id = extension_dir_name;
if is_extension_installed(DEVELOPER_ID_LOCAL, extension_id).await {
// The frontend code uses this string to distinguish between 3 error cases:
//
// 1. This extension is already imported
// 2. This extension is incompatible with the current platform
// 3. The selected directory does not contain a valid extension
//
// do NOT edit this without updating the frontend code.
//
// ```ts
// if (errorMessage === "already imported") {
// addError(t("settings.extensions.hints.extensionAlreadyImported"));
// } else if (errorMessage === "incompatible") {
// addError(t("settings.extensions.hints.incompatibleExtension"));
// } else {
// addError(t("settings.extensions.hints.importFailed"));
// }
// ```
//
// This is definitely error-prone, but we have to do this until we have
// structured error type
return Err("already imported".into());
}
let plugin_json_path = path.join(PLUGIN_JSON_FILE_NAME);
let plugin_json_content = fs::read_to_string(&plugin_json_path)
.await
.map_err(|e| e.to_string())?;
// Parse as plain JSON first since it is not yet valid for `struct Extension`; we need to
// correct it (set the `id` and `developer` fields) before converting it to `struct Extension`:
let mut extension_json: Json =
serde_json::from_str(&plugin_json_content).map_err(|e| e.to_string())?;
// Set the main extension ID to the directory name
let extension_obj = extension_json
.as_object_mut()
.expect("extension_json should be an object");
extension_obj.insert("id".to_string(), Json::String(extension_id.to_string()));
extension_obj.insert(
"developer".to_string(),
Json::String(DEVELOPER_ID_LOCAL.to_string()),
);
// Counter for sub-extension IDs
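// Sub-extension IDs are assigned here as sequential numbers (1, 2, 3, ...)
// across commands, quicklinks, and scripts, overwriting whatever the
// developer-written plugin.json contains.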
let mut counter = 1u32;
// Set IDs for commands
if let Some(commands) = extension_obj.get_mut("commands") {
if let Some(commands_array) = commands.as_array_mut() {
for command in commands_array {
if let Some(command_obj) = command.as_object_mut() {
command_obj.insert("id".to_string(), Json::String(counter.to_string()));
counter += 1;
}
}
}
}
// Set IDs for quicklinks
if let Some(quicklinks) = extension_obj.get_mut("quicklinks") {
if let Some(quicklinks_array) = quicklinks.as_array_mut() {
for quicklink in quicklinks_array {
if let Some(quicklink_obj) = quicklink.as_object_mut() {
quicklink_obj.insert("id".to_string(), Json::String(counter.to_string()));
counter += 1;
}
}
}
}
// Set IDs for scripts
if let Some(scripts) = extension_obj.get_mut("scripts") {
if let Some(scripts_array) = scripts.as_array_mut() {
for script in scripts_array {
if let Some(script_obj) = script.as_object_mut() {
script_obj.insert("id".to_string(), Json::String(counter.to_string()));
counter += 1;
}
}
}
}
// Now we can convert JSON to `struct Extension`
let mut extension: Extension =
serde_json::from_value(extension_json).map_err(|e| e.to_string())?;
let current_platform = Platform::current();
/* Check begins here */
general_check(&extension)?;
if let Some(ref platforms) = extension.platforms {
if !platforms.contains(&current_platform) {
// The frontend code uses this string to distinguish between 3 error cases:
//
// 1. This extension is already imported
// 2. This extension is incompatible with the current platform
// 3. The selected directory does not contain a valid extension
//
// do NOT edit this without updating the frontend code.
//
// ```ts
// if (errorMessage === "already imported") {
// addError(t("settings.extensions.hints.extensionAlreadyImported"));
// } else if (errorMessage === "incompatible") {
// addError(t("settings.extensions.hints.incompatibleExtension"));
// } else {
// addError(t("settings.extensions.hints.importFailed"));
// }
// ```
//
// This is definitely error-prone, but we have to do this until we have
// structured error type
return Err("incompatible".into());
}
}
/* Check ends here */
// The extension is compatible with the current platform, but it could contain
// sub-extensions that are not; filter them out.
filter_out_incompatible_sub_extensions(&mut extension, current_platform);
// We are going to modify our third-party extension list, grab the write lock
// to ensure exclusive access.
let mut third_party_ext_list_write_lock = THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.expect("global third party search source not set")
.write_lock()
.await;
// Create destination directory
let dest_dir = get_third_party_extension_directory(&tauri_app_handle)
.join(DEVELOPER_ID_LOCAL)
.join(extension_dir_name);
fs::create_dir_all(&dest_dir)
.await
.map_err(|e| e.to_string())?;
// Copy all files except plugin.json
let mut entries = fs::read_dir(&path).await.map_err(|e| e.to_string())?;
while let Some(entry) = entries.next_entry().await.map_err(|e| e.to_string())? {
let file_name = entry.file_name();
let file_name_str = file_name
.to_str()
.ok_or_else(|| "Invalid filename: non-UTF8".to_string())?;
// plugin.json will be handled separately.
if file_name_str == PLUGIN_JSON_FILE_NAME {
continue;
}
let src_path = entry.path();
let dest_path = dest_dir.join(&file_name);
if src_path.is_dir() {
// Recursively copy directory
copy_dir_recursively(&src_path, &dest_path).await?;
} else {
// Copy file
fs::copy(&src_path, &dest_path)
.await
.map_err(|e| e.to_string())?;
}
}
// Write the corrected plugin.json file
let corrected_plugin_json =
serde_json::to_string_pretty(&extension).map_err(|e| e.to_string())?;
let dest_plugin_json_path = dest_dir.join(PLUGIN_JSON_FILE_NAME);
fs::write(&dest_plugin_json_path, corrected_plugin_json)
.await
.map_err(|e| e.to_string())?;
/*
* Call convert_page() to update the page files. This has to be done after
* writing the extension files
*/
let absolute_page_paths: Vec<PathBuf> = {
fn canonicalize_page_path(page_path: &Path, extension_root: &Path) -> PathBuf {
if page_path.is_relative() {
// It is relative to the extension root directory
extension_root.join(page_path)
} else {
page_path.into()
}
}
if extension.r#type == ExtensionType::View {
let page = extension
.page
.as_ref()
.expect("View extension should set its page field");
let path = canonicalize_page_path(Path::new(page.as_str()), &dest_dir);
vec![path]
} else if extension.r#type.contains_sub_items()
&& let Some(ref views) = extension.views
{
let mut paths = Vec::with_capacity(views.len());
for view in views.iter() {
let page = view
.page
.as_ref()
.expect("View extension should set its page field");
let path = canonicalize_page_path(Path::new(page.as_str()), &dest_dir);
paths.push(path);
}
paths
} else {
// No pages in this extension
Vec::new()
}
};
for page_path in absolute_page_paths {
convert_page(&page_path).await?;
}
// Canonicalize relative icon and page paths
canonicalize_relative_icon_path(&dest_dir, &mut extension)?;
canonicalize_relative_page_path(&dest_dir, &mut extension)?;
// Add extension to the search source
third_party_ext_list_write_lock.push(extension);
Ok(())
}
/// Helper function to recursively copy directories.
#[async_recursion::async_recursion]
async fn copy_dir_recursively(src: &Path, dest: &Path) -> Result<(), String> {
tokio::fs::create_dir_all(dest)
.await
.map_err(|e| e.to_string())?;
let mut read_dir = tokio::fs::read_dir(src).await.map_err(|e| e.to_string())?;
while let Some(entry) = read_dir.next_entry().await.map_err(|e| e.to_string())? {
let src_path = entry.path();
let dest_path = dest.join(entry.file_name());
if src_path.is_dir() {
copy_dir_recursively(&src_path, &dest_path).await?;
} else {
tokio::fs::copy(&src_path, &dest_path)
.await
.map_err(|e| e.to_string())?;
}
}
Ok(())
}


@@ -0,0 +1,684 @@
//! This module contains the code for extension installation.
//!
//!
//! # How
//!
//! Technically, installing an extension involves the following steps. The order
//! may vary between implementations.
//!
//! 1. Check if it is already installed; if so, return
//!
//! 2. Correct the `plugin.json` JSON if it does not conform to our `struct
//! Extension` definition. This can happen because the JSON written by
//! developers is in a simplified form for a better developer experience.
//!
//! 3. Validate the corrected `plugin.json`
//! 1. misc checks
//! 2. Platform compatibility check
//!
//! 4. Write the extension files to the corresponding location
//!
//! * developer directory
//! * extension directory
//! * assets directory
//! * various assets files, e.g., "icon.png"
//! * plugin.json file
//! * View pages if exist
//!
//! 5. If this extension contains any View extensions, call `convert_page()`
//! on them to make them loadable by the Tauri webview.
//!
//! See `convert_page()` for more info.
//!
//! 6. Canonicalize `Extension.icon` and `Extension.page` fields if they are
//! relative paths
//!
//! * icon: relative to the `assets` directory
//! * page: relative to the extension root directory
//!
//! 7. Add the extension to the in-memory extension list.
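//!
//! As an illustration (hypothetical extension name), a locally imported View
//! extension ends up on disk roughly like this:
//!
//! ```text
//! <data directory>/third_party_extensions/
//! └── __local__/              <- developer directory
//!     └── my-extension/       <- extension directory, name doubles as the ID
//!         ├── plugin.json     <- corrected and validated copy
//!         ├── index.html      <- View page, rewritten by convert_page()
//!         └── assets/
//!             └── icon.png
//! ```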
pub(crate) mod local_extension;
pub(crate) mod store;
use crate::extension::Extension;
use crate::util::platform::Platform;
use std::path::Path;
use super::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE;
pub(crate) async fn is_extension_installed(developer: &str, extension_id: &str) -> bool {
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.expect("global third party search source not set")
.extension_exists(developer, extension_id)
.await
}
/// Filters out sub-extensions that are not compatible with the current platform.
///
/// We make `current_platform` an argument so that this function is testable.
pub(crate) fn filter_out_incompatible_sub_extensions(
extension: &mut Extension,
current_platform: Platform,
) {
// Only process extensions of type Group or Extension that can have sub-extensions
if !extension.r#type.contains_sub_items() {
return;
}
// For main extensions, `None` means all platforms.
let main_extension_supported_platforms = extension.platforms.clone().unwrap_or(Platform::all());
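// A sub-extension whose `platforms` field is `None` inherits the main
// extension's platform set, which is what the `else` branches below rely on.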
// Filter commands
if let Some(ref mut commands) = extension.commands {
commands.retain(|sub_ext| {
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
main_extension_supported_platforms.contains(&current_platform)
}
});
}
// Filter scripts
if let Some(ref mut scripts) = extension.scripts {
scripts.retain(|sub_ext| {
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
main_extension_supported_platforms.contains(&current_platform)
}
});
}
// Filter quicklinks
if let Some(ref mut quicklinks) = extension.quicklinks {
quicklinks.retain(|sub_ext| {
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
main_extension_supported_platforms.contains(&current_platform)
}
});
}
// Filter views
if let Some(ref mut views) = extension.views {
views.retain(|sub_ext| {
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
main_extension_supported_platforms.contains(&current_platform)
}
});
}
}
/// Convert the page file to make it loadable by the Tauri webview.
pub(crate) async fn convert_page(absolute_page_path: &Path) -> Result<(), String> {
assert!(absolute_page_path.is_absolute());
let page_content = tokio::fs::read_to_string(absolute_page_path)
.await
.map_err(|e| e.to_string())?;
let new_page_content = _convert_page(&page_content, absolute_page_path)?;
// overwrite it
tokio::fs::write(absolute_page_path, new_page_content)
.await
.map_err(|e| e.to_string())?;
Ok(())
}
/// NOTE: There is no Rust implementation of `convertFileSrc()` in Tauri. Our
/// implementation here is based on this [comment](https://github.com/tauri-apps/tauri/issues/12022#issuecomment-2572879115)
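///
/// As a rough example (Unix host, hypothetical path), `/home/foo/ext/main.js`
/// is canonicalized and percent-encoded into
/// `asset://localhost/%2Fhome%2Ffoo%2Fext%2Fmain.js`.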
fn convert_file_src(path: &Path) -> Result<String, String> {
#[cfg(any(windows, target_os = "android"))]
let base = "http://asset.localhost/";
#[cfg(not(any(windows, target_os = "android")))]
let base = "asset://localhost/";
let path =
dunce::canonicalize(path).map_err(|e| format!("Failed to canonicalize path: {}", e))?;
let path_str = path.to_string_lossy();
let encoded = urlencoding::encode(&path_str);
Ok(format!("{base}{encoded}"))
}
/// Tauri cannot directly access the file system; to make a file loadable, we
/// have to `canonicalize()` and `convertFileSrc()` its path before passing it
/// to Tauri.
///
/// A View extension's page is an HTML file that Coco (Tauri) will load; we need
/// to process all `<PATH>` tags:
///
/// 1. `<script type="xxx" crossorigin src="<PATH>"></script>`
/// 2. `<a href="<PATH>">xxx</a>`
/// 3. `<link rel="xxx" href="<PATH>"/>`
/// 4. `<img class="xxx" src="<PATH>" alt="xxx"/>`
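///
/// For example (hypothetical page at `/ext/index.html`), a relative
/// `<script src="main.js"></script>` is rewritten so that `src` points at the
/// `convert_file_src()` URL for `/ext/main.js`, while `http://`, `https://`,
/// and already-converted `asset://` URLs are left untouched.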
fn _convert_page(page_content: &str, absolute_page_path: &Path) -> Result<String, String> {
use scraper::{Html, Selector};
/// Helper function.
///
/// Search `document` for the tag attributes specified by `tag_with_attribute`
/// and `tag_attribute`, call `convert_file_src()`, then update the attribute
/// value with the function return value.
fn modify_tag_attributes(
document: &Html,
modified_html: &mut String,
base_dir: &Path,
tag_with_attribute: &str,
tag_attribute: &str,
) -> Result<(), String> {
let script_selector = Selector::parse(tag_with_attribute).unwrap();
for element in document.select(&script_selector) {
if let Some(src) = element.value().attr(tag_attribute) {
if !src.starts_with("http://")
&& !src.starts_with("https://")
&& !src.starts_with("asset://")
&& !src.starts_with("http://asset.localhost/")
{
// It could be a path like "/assets/index-41be3ec9.js", but it
// is still a relative path. We need to remove the starting /
// or path.join() will treat it as an absolute path and ignore the base directory
let corrected_src = if src.starts_with('/') { &src[1..] } else { src };
let full_path = base_dir.join(corrected_src);
let converted_path = convert_file_src(full_path.as_path())?;
*modified_html = modified_html.replace(
&format!("{}=\"{}\"", tag_attribute, src),
&format!("{}=\"{}\"", tag_attribute, converted_path),
);
}
}
}
Ok(())
}
let base_dir = absolute_page_path
.parent()
.ok_or_else(|| format!("page path is invalid, it should have a parent path"))?;
let document: Html = Html::parse_document(page_content);
let mut modified_html: String = page_content.to_string();
modify_tag_attributes(
&document,
&mut modified_html,
base_dir,
"script[src]",
"src",
)?;
modify_tag_attributes(&document, &mut modified_html, base_dir, "a[href]", "href")?;
modify_tag_attributes(
&document,
&mut modified_html,
base_dir,
"link[href]",
"href",
)?;
modify_tag_attributes(&document, &mut modified_html, base_dir, "img[src]", "src")?;
Ok(modified_html)
}
#[cfg(test)]
mod tests {
use super::*;
use crate::extension::ExtensionType;
use std::collections::HashSet;
/// Helper function to create a basic extension for testing
/// `filter_out_incompatible_sub_extensions`
fn create_test_extension(
extension_type: ExtensionType,
platforms: Option<HashSet<Platform>>,
) -> Extension {
Extension {
id: "ID".into(),
name: "name".into(),
developer: None,
platforms,
description: "Test extension".to_string(),
icon: "test-icon".to_string(),
r#type: extension_type,
action: None,
quicklink: None,
commands: None,
scripts: None,
quicklinks: None,
views: None,
alias: None,
hotkey: None,
enabled: true,
settings: None,
page: None,
permission: None,
screenshots: None,
url: None,
version: None,
}
}
#[test]
fn test_filter_out_incompatible_sub_extensions_filter_non_group_extension_unchanged() {
// Command
let mut extension = create_test_extension(ExtensionType::Command, None);
let clone = extension.clone();
filter_out_incompatible_sub_extensions(&mut extension, Platform::Linux);
assert_eq!(extension, clone);
// Quicklink
let mut extension = create_test_extension(ExtensionType::Quicklink, None);
let clone = extension.clone();
filter_out_incompatible_sub_extensions(&mut extension, Platform::Linux);
assert_eq!(extension, clone);
}
#[test]
fn test_filter_out_incompatible_sub_extensions() {
let mut main_extension = create_test_extension(ExtensionType::Group, None);
// init sub extensions, which are macOS-only
let commands = vec![create_test_extension(
ExtensionType::Command,
Some(HashSet::from([Platform::Macos])),
)];
let quicklinks = vec![create_test_extension(
ExtensionType::Quicklink,
Some(HashSet::from([Platform::Macos])),
)];
let scripts = vec![create_test_extension(
ExtensionType::Script,
Some(HashSet::from([Platform::Macos])),
)];
let views = vec![create_test_extension(
ExtensionType::View,
Some(HashSet::from([Platform::Macos])),
)];
// Set sub extensions
main_extension.commands = Some(commands);
main_extension.quicklinks = Some(quicklinks);
main_extension.scripts = Some(scripts);
main_extension.views = Some(views);
// Current platform is Linux, all the sub extensions should be filtered out.
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
// assertions
assert!(main_extension.commands.unwrap().is_empty());
assert!(main_extension.quicklinks.unwrap().is_empty());
assert!(main_extension.scripts.unwrap().is_empty());
assert!(main_extension.views.unwrap().is_empty());
}
/// Sub-extensions are compatible with all the platforms; nothing to filter out.
#[test]
fn test_filter_out_incompatible_sub_extensions_all_compatible() {
{
let mut main_extension = create_test_extension(ExtensionType::Group, None);
// init sub extensions, which are compatible with all the platforms
let commands = vec![create_test_extension(
ExtensionType::Command,
Some(Platform::all()),
)];
let quicklinks = vec![create_test_extension(
ExtensionType::Quicklink,
Some(Platform::all()),
)];
let scripts = vec![create_test_extension(
ExtensionType::Script,
Some(Platform::all()),
)];
let views = vec![create_test_extension(
ExtensionType::View,
Some(Platform::all()),
)];
// Set sub extensions
main_extension.commands = Some(commands);
main_extension.quicklinks = Some(quicklinks);
main_extension.scripts = Some(scripts);
main_extension.views = Some(views);
// Current platform is Linux; the sub-extensions support all platforms, so none should be filtered out.
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
// assertions
assert_eq!(main_extension.commands.unwrap().len(), 1);
assert_eq!(main_extension.quicklinks.unwrap().len(), 1);
assert_eq!(main_extension.scripts.unwrap().len(), 1);
assert_eq!(main_extension.views.unwrap().len(), 1);
}
// main extension is compatible with all platforms, sub extension's platforms
// is None, which means all platforms are supported
{
let mut main_extension = create_test_extension(ExtensionType::Group, None);
// init sub extensions, which are compatible with all the platforms
let commands = vec![create_test_extension(ExtensionType::Command, None)];
let quicklinks = vec![create_test_extension(ExtensionType::Quicklink, None)];
let scripts = vec![create_test_extension(ExtensionType::Script, None)];
let views = vec![create_test_extension(ExtensionType::View, None)];
// Set sub extensions
main_extension.commands = Some(commands);
main_extension.quicklinks = Some(quicklinks);
main_extension.scripts = Some(scripts);
main_extension.views = Some(views);
// Current platform is Linux; the sub-extensions inherit the main extension's platforms (all), so none should be filtered out.
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
// assertions
assert_eq!(main_extension.commands.unwrap().len(), 1);
assert_eq!(main_extension.quicklinks.unwrap().len(), 1);
assert_eq!(main_extension.scripts.unwrap().len(), 1);
assert_eq!(main_extension.views.unwrap().len(), 1);
}
}
#[test]
fn test_main_extension_is_incompatible_sub_extension_platforms_none() {
{
let mut main_extension =
create_test_extension(ExtensionType::Group, Some(HashSet::from([Platform::Macos])));
let commands = vec![create_test_extension(ExtensionType::Command, None)];
main_extension.commands = Some(commands);
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
assert_eq!(main_extension.commands.unwrap().len(), 0);
}
{
let mut main_extension =
create_test_extension(ExtensionType::Group, Some(HashSet::from([Platform::Macos])));
let scripts = vec![create_test_extension(ExtensionType::Script, None)];
main_extension.scripts = Some(scripts);
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
assert_eq!(main_extension.scripts.unwrap().len(), 0);
}
{
let mut main_extension =
create_test_extension(ExtensionType::Group, Some(HashSet::from([Platform::Macos])));
let quicklinks = vec![create_test_extension(ExtensionType::Quicklink, None)];
main_extension.quicklinks = Some(quicklinks);
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
assert_eq!(main_extension.quicklinks.unwrap().len(), 0);
}
{
let mut main_extension =
create_test_extension(ExtensionType::Group, Some(HashSet::from([Platform::Macos])));
let views = vec![create_test_extension(ExtensionType::View, None)];
main_extension.views = Some(views);
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
assert_eq!(main_extension.views.unwrap().len(), 0);
}
}
#[test]
fn test_main_extension_compatible_sub_extension_platforms_none() {
let mut main_extension =
create_test_extension(ExtensionType::Group, Some(HashSet::from([Platform::Macos])));
let views = vec![create_test_extension(ExtensionType::View, None)];
main_extension.views = Some(views);
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Macos);
assert_eq!(main_extension.views.unwrap().len(), 1);
}
#[test]
fn test_convert_page_script_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let js_file = temp_dir.path().join("main.js");
let html_content = r#"<html><body><script src="main.js"></script></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&js_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&js_file).unwrap();
let expected = format!(
"<html><body><script src=\"{}\"></script></body></html>",
path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_script_tag_with_a_root_char() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let js_file = temp_dir.path().join("main.js");
let html_content = r#"<html><body><script src="/main.js"></script></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&js_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&js_file).unwrap();
let expected = format!(
"<html><body><script src=\"{}\"></script></body></html>",
path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_a_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let js_file = temp_dir.path().join("main.js");
let html_content = r#"<html><body><a href="main.js">foo</a></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&js_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&js_file).unwrap();
let expected = format!("<html><body><a href=\"{}\">foo</a></body></html>", path);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_a_tag_with_a_root_char() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let js_file = temp_dir.path().join("main.js");
let html_content = r#"<html><body><a href="/main.js">foo</a></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&js_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&js_file).unwrap();
let expected = format!("<html><body><a href=\"{}\">foo</a></body></html>", path);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_link_href_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let css_file = temp_dir.path().join("main.css");
let html_content = r#"<html><body><link rel="stylesheet" href="main.css"/></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&css_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&css_file).unwrap();
let expected = format!(
"<html><body><link rel=\"stylesheet\" href=\"{}\"/></body></html>",
path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_link_href_tag_with_a_root_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let css_file = temp_dir.path().join("main.css");
let html_content = r#"<html><body><link rel="stylesheet" href="/main.css"/></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&css_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&css_file).unwrap();
let expected = format!(
"<html><body><link rel=\"stylesheet\" href=\"{}\"/></body></html>",
path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_img_src_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let png_file = temp_dir.path().join("main.png");
let html_content =
r#"<html><body> <img class="fit-picture" src="main.png" alt="xxx" /></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&png_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&png_file).unwrap();
let expected = format!(
"<html><body> <img class=\"fit-picture\" src=\"{}\" alt=\"xxx\" /></body></html>",
path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_img_src_tag_with_a_root_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let png_file = temp_dir.path().join("main.png");
let html_content =
r#"<html><body> <img class="fit-picture" src="/main.png" alt="xxx" /></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&png_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&png_file).unwrap();
let expected = format!(
"<html><body> <img class=\"fit-picture\" src=\"{}\" alt=\"xxx\" /></body></html>",
path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_contain_both_script_and_a_tags() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let js_file = temp_dir.path().join("main.js");
let html_content =
r#"<html><body><a href="main.js">foo</a><script src="main.js"></script></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&js_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&js_file).unwrap();
let expected = format!(
"<html><body><a href=\"{}\">foo</a><script src=\"{}\"></script></body></html>",
path, path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_contain_both_script_and_a_tags_with_root_char() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let js_file = temp_dir.path().join("main.js");
let html_content = r#"<html><body><a href="/main.js">foo</a><script src="/main.js"></script></body></html>"#;
std::fs::write(&html_file, html_content).unwrap();
std::fs::write(&js_file, "").unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
let path = convert_file_src(&js_file).unwrap();
let expected = format!(
"<html><body><a href=\"{}\">foo</a><script src=\"{}\"></script></body></html>",
path, path
);
assert_eq!(result, expected);
}
#[test]
fn test_convert_page_empty_html() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let html_content = "";
std::fs::write(&html_file, html_content).unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
assert!(result.is_empty());
}
#[test]
fn test_convert_page_only_html_tag() {
use tempfile::TempDir;
let temp_dir = TempDir::new().unwrap();
let html_file = temp_dir.path().join("test.html");
let html_content = "<html></html>";
std::fs::write(&html_file, html_content).unwrap();
let result = _convert_page(html_content, &html_file).unwrap();
assert_eq!(result, html_content);
}
}
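Each of the assertions above follows the same pattern: every local `src`/`href` value is replaced with whatever `convert_file_src()` returns for the file resolved against the HTML file's directory. As a rough illustration, assuming `convert_file_src()` mirrors Tauri's asset-protocol URLs (the scheme and encoding below are an assumption, not something this diff states):
// Hypothetical illustration only; the real URL depends on the platform and
// on how convert_file_src() is implemented in this crate.
// before:  <script src="/main.js"></script>
// on disk: <temp dir>/main.js
// after:   <script src="asset://localhost/<encoded path to main.js>"></script>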

@@ -1,6 +1,7 @@
//! Extension store related stuff.
use super::LOCAL_QUERY_SOURCE_TYPE;
use super::super::LOCAL_QUERY_SOURCE_TYPE;
use super::is_extension_installed;
use crate::common::document::DataSourceReference;
use crate::common::document::Document;
use crate::common::error::SearchError;
@@ -8,16 +9,26 @@ use crate::common::search::QueryResponse;
use crate::common::search::QuerySource;
use crate::common::search::SearchQuery;
use crate::common::traits::SearchSource;
use crate::extension::canonicalize_relative_icon_path;
use crate::extension::third_party::THIRD_PARTY_EXTENSIONS_DIRECTORY;
use crate::extension::Extension;
use crate::extension::ExtensionType;
use crate::extension::PLUGIN_JSON_FILE_NAME;
use crate::extension::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE;
use crate::extension::canonicalize_relative_icon_path;
use crate::extension::canonicalize_relative_page_path;
use crate::extension::third_party::check::general_check;
use crate::extension::third_party::get_third_party_extension_directory;
use crate::extension::third_party::install::convert_page;
use crate::extension::third_party::install::filter_out_incompatible_sub_extensions;
use crate::server::http_client::HttpClient;
use crate::util::platform::Platform;
use async_trait::async_trait;
use reqwest::StatusCode;
use serde_json::Map as JsonObject;
use serde_json::Value as Json;
use std::io::Read;
use std::path::Path;
use std::path::PathBuf;
use tauri::AppHandle;
const DATA_SOURCE_ID: &str = "Extension Store";
@@ -36,7 +47,11 @@ impl SearchSource for ExtensionStore {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
const SCORE: f64 = 2000.0;
let Some(query_string) = query.query_strings.get("query") else {
@@ -146,14 +161,12 @@ pub(crate) async fn search_extension(
.get("developer")
.and_then(|dev| dev.get("id"))
.and_then(|id| id.as_str())
.expect("developer.id should exist")
.to_string();
.expect("developer.id should exist");
let extension_id = source_obj
.get("id")
.and_then(|id| id.as_str())
.expect("extension id should exist")
.to_string();
.expect("extension id should exist");
let installed = is_extension_installed(developer_id, extension_id).await;
source_obj.insert("installed".to_string(), Json::Bool(installed));
@@ -164,16 +177,58 @@ pub(crate) async fn search_extension(
Ok(extensions)
}
async fn is_extension_installed(developer: String, extension_id: String) -> bool {
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.extension_exists(&developer, &extension_id)
#[tauri::command]
pub(crate) async fn extension_detail(
id: String,
) -> Result<Option<JsonObject<String, Json>>, String> {
let path = format!("store/extension/{}", id);
let response = HttpClient::get("default_coco_server", path.as_str(), None)
.await
.map_err(|e| format!("Failed to send request: {:?}", e))?;
if response.status() == StatusCode::NOT_FOUND {
return Ok(None);
}
let response_dbg_str = format!("{:?}", response);
// The response of an ES style GET request
let mut response: JsonObject<String, Json> = response.json().await.unwrap_or_else(|_e| {
panic!(
"response body of [/store/extension/<ID>] is not a JSON object, response [{:?}]",
response_dbg_str
)
});
let source_json = response.remove("_source").unwrap_or_else(|| {
panic!("field [_source] not found in the JSON returned from [/store/extension/<ID>]")
});
let mut source_obj = match source_json {
Json::Object(obj) => obj,
_ => panic!(
"field [_source] should be a JSON object, but it is not, value: [{}]",
source_json
),
};
let developer_id = match &source_obj["developer"]["id"] {
Json::String(dev) => dev,
_ => {
panic!(
"field [_source.developer.id] should be a string, but it is not, value: [{}]",
source_obj["developer"]["id"]
)
}
};
let installed = is_extension_installed(developer_id, &id).await;
source_obj.insert("installed".to_string(), Json::Bool(installed));
Ok(Some(source_obj))
}
#[tauri::command]
pub(crate) async fn install_extension(id: String) -> Result<(), String> {
pub(crate) async fn install_extension_from_store(
tauri_app_handle: AppHandle,
id: String,
) -> Result<(), String> {
let path = format!("store/extension/{}/_download", id);
let response = HttpClient::get("default_coco_server", &path, None)
.await
@@ -192,7 +247,15 @@ pub(crate) async fn install_extension(id: String) -> Result<(), String> {
let mut archive =
zip::ZipArchive::new(cursor).map_err(|e| format!("Failed to read zip archive: {}", e))?;
let mut plugin_json = archive.by_name("plugin.json").map_err(|e| e.to_string())?;
// The plugin.json sent from the server does not conform to our `struct Extension` definition:
//
// 1. Its `developer` field is a JSON object, but we need a string
// 2. sub-extensions won't have their `id` fields set
//
// we need to correct it
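// An illustrative before/after of that correction (hypothetical values, not
// taken from any real server response):
//
//   before: { "id": "my-ext", "developer": { "id": "foo", ... },
//             "commands": [ { "name": "do-thing", ... } ] }
//   after:  { "id": "my-ext", "developer": "foo",
//             "commands": [ { "id": "<generated>", "name": "do-thing", ... } ] }
//
// i.e. `developer` is collapsed to a plain string (presumably its `id`) and
// every sub-extension gets an `id` generated from the counter below.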
let mut plugin_json = archive
.by_name(PLUGIN_JSON_FILE_NAME)
.map_err(|e| e.to_string())?;
let mut plugin_json_content = String::new();
std::io::Read::read_to_string(&mut plugin_json, &mut plugin_json_content)
.map_err(|e| e.to_string())?;
@@ -213,7 +276,6 @@ pub(crate) async fn install_extension(id: String) -> Result<(), String> {
// Set IDs for sub-extensions (commands, quicklinks, scripts)
let mut counter = 0;
// Set IDs for commands
// Helper function to set IDs for array fields
fn set_ids_for_field(extension: &mut Json, field_name: &str, counter: &mut i32) {
if let Some(field) = extension.as_object_mut().unwrap().get_mut(field_name) {
@@ -229,72 +291,107 @@ pub(crate) async fn install_extension(id: String) -> Result<(), String> {
}
}
}
// Set IDs for sub-extensions
set_ids_for_field(&mut extension, "commands", &mut counter);
set_ids_for_field(&mut extension, "quicklinks", &mut counter);
set_ids_for_field(&mut extension, "scripts", &mut counter);
// Now the extension JSON is valid
let mut extension: Extension = serde_json::from_value(extension).unwrap_or_else(|e| {
panic!(
"cannot parse plugin.json as struct Extension, error [{:?}]",
e
);
});
let developer_id = extension.developer.clone().expect("developer has been set");
drop(plugin_json);
let developer = extension.developer.clone().unwrap_or_default();
let extension_id = extension.id.clone();
general_check(&extension)?;
// Extract the zip file
let current_platform = Platform::current();
if let Some(ref platforms) = extension.platforms {
if !platforms.contains(&current_platform) {
return Err("this extension is not compatible with your OS".into());
}
}
if is_extension_installed(&developer_id, &id).await {
return Err("Extension already installed.".into());
}
// Extension is compatible with the current platform, but it could contain
// sub-extensions that are not; filter them out.
filter_out_incompatible_sub_extensions(&mut extension, current_platform);
// We are going to modify our third-party extension list, grab the write lock
// to ensure exclusive access.
let mut third_party_ext_list_write_lock = THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.expect("global third party search source not set")
.write_lock()
.await;
// Write extension files to the extension directory
let extension_id = extension.id.clone();
let extension_directory = {
let mut path = THIRD_PARTY_EXTENSIONS_DIRECTORY.to_path_buf();
path.push(developer);
let mut path = get_third_party_extension_directory(&tauri_app_handle);
path.push(developer_id);
path.push(extension_id.as_str());
path
};
tokio::fs::create_dir_all(extension_directory.as_path())
.await
.map_err(|e| e.to_string())?;
// Extract all files except plugin.json
for i in 0..archive.len() {
let mut file = archive.by_index(i).map_err(|e| e.to_string())?;
let outpath = match file.enclosed_name() {
Some(path) => extension_directory.join(path),
None => continue,
};
let mut zip_file = archive.by_index(i).map_err(|e| e.to_string())?;
// `.name()` is safe to use in our case; the problematic cases listed on
// the page below won't happen to us.
//
// https://docs.rs/zip/4.2.0/zip/read/struct.ZipFile.html#method.name
//
// Example names:
//
// * `assets/icon.png`
// * `assets/screenshot.png`
// * `plugin.json`
//
// Note that the `assets` directory itself does not appear as a separate entry.
let zip_file_name = zip_file.name();
// Skip the plugin.json file as we'll create it from the extension variable
if file.name() == "plugin.json" {
if zip_file_name == PLUGIN_JSON_FILE_NAME {
continue;
}
if file.name().ends_with('/') {
tokio::fs::create_dir_all(&outpath)
.await
.map_err(|e| e.to_string())?;
} else {
if let Some(p) = outpath.parent() {
if !p.exists() {
tokio::fs::create_dir_all(p)
.await
.map_err(|e| e.to_string())?;
}
}
let mut outfile = tokio::fs::File::create(&outpath)
.await
.map_err(|e| e.to_string())?;
let mut content = Vec::new();
std::io::Read::read_to_end(&mut file, &mut content).map_err(|e| e.to_string())?;
tokio::io::AsyncWriteExt::write_all(&mut outfile, &content)
let dest_file_path = extension_directory.join(zip_file_name);
// For cases like `assets/xxx.png`
if let Some(parent_dir) = dest_file_path.parent()
&& !parent_dir.exists()
{
tokio::fs::create_dir_all(parent_dir)
.await
.map_err(|e| e.to_string())?;
}
}
let mut dest_file = tokio::fs::File::create(&dest_file_path)
.await
.map_err(|e| e.to_string())?;
let mut src_bytes = Vec::with_capacity(
zip_file
.size()
.try_into()
.expect("we won't have an extension file that is bigger than 4GiB"),
);
zip_file
.read_to_end(&mut src_bytes)
.map_err(|e| e.to_string())?;
tokio::io::copy(&mut src_bytes.as_slice(), &mut dest_file)
.await
.map_err(|e| e.to_string())?;
}
// Create plugin.json from the extension variable
let plugin_json_path = extension_directory.join(PLUGIN_JSON_FILE_NAME);
let extension_json = serde_json::to_string_pretty(&extension).map_err(|e| e.to_string())?;
@@ -302,44 +399,58 @@ pub(crate) async fn install_extension(id: String) -> Result<(), String> {
.await
.map_err(|e| e.to_string())?;
// Turn it into an absolute path if it is a valid relative path because frontend code needs this.
canonicalize_relative_icon_path(&extension_directory, &mut extension)?;
/*
* Call convert_page() to update the page files. This has to be done after
* writing the extension files
*/
let absolute_page_paths: Vec<PathBuf> = {
fn canonicalize_page_path(page_path: &Path, extension_root: &Path) -> PathBuf {
if page_path.is_relative() {
// It is relative to the extension root directory
extension_root.join(page_path)
} else {
page_path.into()
}
}
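// For illustration (hypothetical paths): with extension_root set to
// ".../third_party_extensions/<developer>/<extension_id>", the helper maps
//   "page.html"       -> ".../third_party_extensions/<developer>/<extension_id>/page.html"
//   "/abs/page.html"  -> "/abs/page.html"   (absolute paths are kept as-is)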
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.add_extension(extension)
.await;
if extension.r#type == ExtensionType::View {
let page = extension
.page
.as_ref()
.expect("View extension should set its page field");
let path = canonicalize_page_path(Path::new(page.as_str()), &extension_directory);
Ok(())
}
vec![path]
} else if extension.r#type.contains_sub_items()
&& let Some(ref views) = extension.views
{
let mut paths = Vec::with_capacity(views.len());
#[tauri::command]
pub(crate) async fn uninstall_extension(
developer: String,
extension_id: String,
) -> Result<(), String> {
let extension_dir = {
let mut path = THIRD_PARTY_EXTENSIONS_DIRECTORY.join(developer.as_str());
path.push(extension_id.as_str());
for view in views.iter() {
let page = view
.page
.as_ref()
.expect("View extension should set its page field");
let path = canonicalize_page_path(Path::new(page.as_str()), &extension_directory);
path
paths.push(path);
}
paths
} else {
// No pages in this extension
Vec::new()
}
};
if !extension_dir.try_exists().map_err(|e| e.to_string())? {
panic!(
"we are uninstalling extension [{}/{}], but there is no such extension files on disk",
developer, extension_id
)
for page_path in absolute_page_paths {
convert_page(&page_path).await?;
}
tokio::fs::remove_dir_all(extension_dir.as_path())
.await
.map_err(|e| e.to_string())?;
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.remove_extension(&developer, &extension_id)
.await;
// Canonicalize relative icon and page paths
canonicalize_relative_icon_path(&extension_directory, &mut extension)?;
canonicalize_relative_page_path(&extension_directory, &mut extension)?;
third_party_ext_list_write_lock.push(extension);
Ok(())
}

@@ -1,66 +1,55 @@
use super::alter_extension_json_file;
use super::canonicalize_relative_icon_path;
pub(crate) mod check;
pub(crate) mod install;
use super::Extension;
use super::ExtensionType;
use super::Platform;
use super::LOCAL_QUERY_SOURCE_TYPE;
use super::PLUGIN_JSON_FILE_NAME;
use crate::common::document::open;
use super::alter_extension_json_file;
use super::canonicalize_relative_icon_path;
use crate::common::document::DataSourceReference;
use crate::common::document::Document;
use crate::common::document::open;
use crate::common::error::SearchError;
use crate::common::search::QueryResponse;
use crate::common::search::QuerySource;
use crate::common::search::SearchQuery;
use crate::common::traits::SearchSource;
use crate::extension::ExtensionBundleIdBorrowed;
use crate::GLOBAL_TAURI_APP_HANDLE;
use crate::extension::calculate_text_similarity;
use crate::extension::canonicalize_relative_page_path;
use crate::util::platform::Platform;
use async_trait::async_trait;
use borrowme::ToOwned;
use check::general_check;
use function_name::named;
use std::ffi::OsStr;
use std::io::ErrorKind;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::LazyLock;
use std::sync::OnceLock;
use tauri::async_runtime;
use tauri::AppHandle;
use tauri::Manager;
use tauri::async_runtime;
use tauri_plugin_global_shortcut::GlobalShortcutExt;
use tauri_plugin_global_shortcut::ShortcutState;
use tokio::fs::read_dir;
use tokio::sync::RwLock;
use tokio::sync::RwLockWriteGuard;
pub(crate) static THIRD_PARTY_EXTENSIONS_DIRECTORY: LazyLock<PathBuf> = LazyLock::new(|| {
let mut app_data_dir = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set")
.path()
.app_data_dir()
.expect(
"User home directory not found, which should be impossible on desktop environments",
);
pub(crate) fn get_third_party_extension_directory(tauri_app_handle: &AppHandle) -> PathBuf {
let mut app_data_dir = tauri_app_handle.path().app_data_dir().expect(
"User home directory not found, which should be impossible on desktop environments",
);
app_data_dir.push("third_party_extensions");
app_data_dir
});
/// Helper function to determine the current platform.
fn current_platform() -> Platform {
let os_str = std::env::consts::OS;
serde_plain::from_str(os_str).unwrap_or_else(|_e| {
panic!("std::env::consts::OS is [{}], which is not a valid value for [enum Platform], valid values: ['macos', 'linux', 'windows']", os_str)
})
}
pub(crate) async fn list_third_party_extensions(
pub(crate) async fn load_third_party_extensions_from_directory(
directory: &Path,
) -> Result<(bool, Vec<Extension>), String> {
let mut found_invalid_extensions = false;
) -> Result<Vec<Extension>, String> {
let mut extensions_dir_iter = read_dir(&directory).await.map_err(|e| e.to_string())?;
let current_platform = current_platform();
let current_platform = Platform::current();
let mut extensions = Vec::new();
@@ -74,7 +63,6 @@ pub(crate) async fn list_third_party_extensions(
};
let developer_dir_file_type = developer_dir.file_type().await.map_err(|e| e.to_string())?;
if !developer_dir_file_type.is_dir() {
found_invalid_extensions = true;
log::warn!(
"file [{}] under the third party extension directory should be a directory, but it is not",
developer_dir.file_name().display()
@@ -84,18 +72,6 @@ pub(crate) async fn list_third_party_extensions(
continue 'developer;
}
let Ok(developer) = developer_dir.file_name().into_string() else {
found_invalid_extensions = true;
log::warn!(
"developer [{}] ID is not UTF-8 encoded",
developer_dir.file_name().display()
);
// Skip this file
continue 'developer;
};
let mut developer_dir_iter = read_dir(&developer_dir.path())
.await
.map_err(|e| e.to_string())?;
@@ -108,14 +84,17 @@ pub(crate) async fn list_third_party_extensions(
let Some(extension_dir) = opt_extension_dir else {
break 'extension;
};
let extension_dir_file_name = extension_dir
.file_name()
.into_string()
.expect("extension directory name should be UTF-8 encoded");
let extension_dir_file_type =
extension_dir.file_type().await.map_err(|e| e.to_string())?;
if !extension_dir_file_type.is_dir() {
found_invalid_extensions = true;
log::warn!(
"invalid extension [{}]: a valid extension should be a directory, but it is not",
extension_dir.file_name().display()
extension_dir_file_name
);
// Skip invalid extension
@@ -130,7 +109,6 @@ pub(crate) async fn list_third_party_extensions(
};
if !plugin_json_file_path.is_file() {
found_invalid_extensions = true;
log::warn!(
"invalid extension: [{}]: extension file [{}] should be a JSON file, but it is not",
extension_dir.file_name().display(),
@@ -147,10 +125,9 @@ pub(crate) async fn list_third_party_extensions(
let mut extension = match serde_json::from_str::<Extension>(&plugin_json_file_content) {
Ok(extension) => extension,
Err(e) => {
found_invalid_extensions = true;
log::warn!(
"invalid extension: [{}]: extension file [{}] is invalid, error: '{}'",
extension_dir.file_name().display(),
"invalid extension: [{}]: cannot parse file [{}] as a [struct Extension], error: '{}'",
extension_dir_file_name,
plugin_json_file_path.display(),
e
);
@@ -158,22 +135,56 @@ pub(crate) async fn list_third_party_extensions(
}
};
// Turn it into an absolute path if it is a valid relative path because frontend code needs this.
canonicalize_relative_icon_path(&extension_dir.path(), &mut extension)?;
/* Check starts here */
if extension.id != extension_dir_file_name {
log::warn!(
"extension under [{}:{}] has an ID [{}] that does not match its directory name",
developer_dir.file_name().display(),
extension_dir_file_name,
extension.id,
);
continue;
}
// Extension should be unique
if extensions.iter().any(|ext: &Extension| {
ext.id == extension.id && ext.developer == extension.developer
}) {
log::warn!(
"an extension with the same bundle ID [ID {}, developer {:?}] already exists, skip this one",
extension.id,
extension.developer
);
continue;
}
if let Err(error_msg) = general_check(&extension) {
log::warn!("{}", error_msg);
if !validate_extension(
&extension,
&extension_dir.file_name(),
&extensions,
current_platform,
) {
found_invalid_extensions = true;
// Skip invalid extension
continue;
}
// Set extension's developer info manually.
extension.developer = Some(developer.clone());
if let Some(ref platforms) = extension.platforms {
if !platforms.contains(&current_platform) {
log::warn!(
"installed third-party extension [developer {}, ID {}] is not compatible with the current platform; either the user tampered with our extension directory or something is wrong with our extension check",
extension
.developer
.as_ref()
.expect("third party extension should have [developer] set"),
extension.id
);
continue;
}
}
/* Check ends here */
// Turn it into an absolute path if it is a valid relative path because frontend code needs this.
canonicalize_relative_icon_path(&extension_dir.path(), &mut extension)?;
canonicalize_relative_page_path(&extension_dir.path(), &mut extension)?;
extensions.push(extension);
}
@@ -187,194 +198,7 @@ pub(crate) async fn list_third_party_extensions(
.collect::<Vec<_>>()
);
Ok((found_invalid_extensions, extensions))
}
/// Helper function to validate `extension`, return `true` if it is valid.
fn validate_extension(
extension: &Extension,
extension_dir_name: &OsStr,
listed_extensions: &[Extension],
current_platform: Platform,
) -> bool {
if OsStr::new(&extension.id) != extension_dir_name {
log::warn!(
"invalid extension []: id [{}] and extension directory name [{}] do not match",
extension.id,
extension_dir_name.display()
);
return false;
}
// Extension ID should be unique
if listed_extensions.iter().any(|ext| ext.id == extension.id) {
log::warn!(
"invalid extension []: extension with id [{}] already exists",
extension.id,
);
return false;
}
if !validate_extension_or_sub_item(extension) {
return false;
}
// Extension is incompatible
if let Some(ref platforms) = extension.platforms {
if !platforms.contains(&current_platform) {
log::warn!("extension [{}] is not compatible with the current platform [{}], it is available to {:?}", extension.id, current_platform, platforms.iter().map(|os|os.to_string()).collect::<Vec<_>>());
return false;
}
}
if let Some(ref commands) = extension.commands {
if !validate_sub_items(&extension.id, commands) {
return false;
}
}
if let Some(ref scripts) = extension.scripts {
if !validate_sub_items(&extension.id, scripts) {
return false;
}
}
if let Some(ref quick_links) = extension.quicklinks {
if !validate_sub_items(&extension.id, quick_links) {
return false;
}
}
true
}
/// Checks that can be performed against an extension or a sub item.
fn validate_extension_or_sub_item(extension: &Extension) -> bool {
// If field `action` is Some, then it should be a Command
if extension.action.is_some() && extension.r#type != ExtensionType::Command {
log::warn!(
"invalid extension [{}], [action] is set for a non-Command extension",
extension.id
);
return false;
}
if extension.r#type == ExtensionType::Command && extension.action.is_none() {
log::warn!(
"invalid extension [{}], [action] should be set for a Command extension",
extension.id
);
return false;
}
// If field `quick_link` is Some, then it should be a QuickLink
if extension.quicklink.is_some() && extension.r#type != ExtensionType::Quicklink {
log::warn!(
"invalid extension [{}], [quick_link] is set for a non-QuickLink extension",
extension.id
);
return false;
}
if extension.r#type == ExtensionType::Quicklink && extension.quicklink.is_none() {
log::warn!(
"invalid extension [{}], [quick_link] should be set for a QuickLink extension",
extension.id
);
return false;
}
// Group and Extension cannot have alias
if extension.alias.is_some() {
if extension.r#type == ExtensionType::Group || extension.r#type == ExtensionType::Extension
{
log::warn!(
"invalid extension [{}], extension of type [{:?}] cannot have alias",
extension.id,
extension.r#type
);
return false;
}
}
// Group and Extension cannot have hotkey
if extension.hotkey.is_some() {
if extension.r#type == ExtensionType::Group || extension.r#type == ExtensionType::Extension
{
log::warn!(
"invalid extension [{}], extension of type [{:?}] cannot have hotkey",
extension.id,
extension.r#type
);
return false;
}
}
if extension.commands.is_some() || extension.scripts.is_some() || extension.quicklinks.is_some()
{
if extension.r#type != ExtensionType::Group && extension.r#type != ExtensionType::Extension
{
log::warn!(
"invalid extension [{}], only extension of type [Group] and [Extension] can have sub-items",
extension.id,
);
return false;
}
}
true
}
/// Helper function to check sub-items.
fn validate_sub_items(extension_id: &str, sub_items: &[Extension]) -> bool {
for (sub_item_index, sub_item) in sub_items.iter().enumerate() {
// If field `action` is Some, then it should be a Command
if sub_item.action.is_some() && sub_item.r#type != ExtensionType::Command {
log::warn!(
"invalid extension sub-item [{}-{}]: [action] is set for a non-Command extension",
extension_id,
sub_item.id
);
return false;
}
if sub_item.r#type == ExtensionType::Group || sub_item.r#type == ExtensionType::Extension {
log::warn!(
"invalid extension sub-item [{}-{}]: sub-item should not be of type [Group] or [Extension]",
extension_id, sub_item.id
);
return false;
}
let sub_item_with_same_id_count = sub_items
.iter()
.enumerate()
.filter(|(_idx, ext)| ext.id == sub_item.id)
.filter(|(idx, _ext)| *idx != sub_item_index)
.count();
if sub_item_with_same_id_count != 0 {
log::warn!(
"invalid extension [{}]: found more than one sub-item with the same ID [{}]",
extension_id,
sub_item.id
);
return false;
}
if !validate_extension_or_sub_item(sub_item) {
return false;
}
if sub_item.platforms.is_some() {
log::warn!(
"invalid extension [{}]: key [platforms] should not be set in sub-items",
extension_id,
);
return false;
}
}
true
Ok(extensions)
}
/// All the third-party extensions will be registered as one search source.
@@ -382,7 +206,7 @@ fn validate_sub_items(extension_id: &str, sub_items: &[Extension]) -> bool {
/// Since some `#[tauri::command]`s need to access it, we store it in a global
/// static variable as well.
#[derive(Debug, Clone)]
pub(super) struct ThirdPartyExtensionsSearchSource {
pub(crate) struct ThirdPartyExtensionsSearchSource {
inner: Arc<ThirdPartyExtensionsSearchSourceInner>,
}
@@ -415,11 +239,11 @@ impl ThirdPartyExtensionsSearchSource {
/// Note that when you enable a parent extension, its **enabled** child
/// extensions should also be enabled.
#[async_recursion::async_recursion]
async fn _enable_extension(extension: &Extension) -> Result<(), String> {
async fn _enable_extension(
tauri_app_handle: &AppHandle,
extension: &Extension,
) -> Result<(), String> {
if extension.supports_alias_hotkey() {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
if let Some(ref hotkey) = extension.hotkey {
let on_opened = extension.on_opened().unwrap_or_else(|| panic!( "extension has hotkey, but on_open() returns None, extension ID [{}], extension type [{:?}]", extension.id, extension.r#type));
@@ -427,12 +251,14 @@ impl ThirdPartyExtensionsSearchSource {
tauri_app_handle
.global_shortcut()
.on_shortcut(hotkey.as_str(), move |_tauri_app_handle, _hotkey, event| {
.on_shortcut(hotkey.as_str(), move |tauri_app_handle, _hotkey, event| {
let on_opened_clone = on_opened.clone();
let extension_id_clone = extension_id_clone.clone();
let app_handle_clone = tauri_app_handle.clone();
if event.state() == ShortcutState::Pressed {
async_runtime::spawn(async move {
let result = open(on_opened_clone).await;
let result = open(app_handle_clone, on_opened_clone, None).await;
if let Err(msg) = result {
log::warn!(
"failed to open extension [{}], error [{}]",
@@ -451,19 +277,24 @@ impl ThirdPartyExtensionsSearchSource {
if extension.r#type.contains_sub_items() {
if let Some(commands) = &extension.commands {
for command in commands.iter().filter(|ext| ext.enabled) {
Self::_enable_extension(command).await?;
Self::_enable_extension(&tauri_app_handle, command).await?;
}
}
if let Some(scripts) = &extension.scripts {
for script in scripts.iter().filter(|ext| ext.enabled) {
Self::_enable_extension(script).await?;
Self::_enable_extension(&tauri_app_handle, script).await?;
}
}
if let Some(quicklinks) = &extension.quicklinks {
for quicklink in quicklinks.iter().filter(|ext| ext.enabled) {
Self::_enable_extension(quicklink).await?;
Self::_enable_extension(&tauri_app_handle, quicklink).await?;
}
}
if let Some(views) = &extension.views {
for view in views.iter().filter(|ext| ext.enabled) {
Self::_enable_extension(&tauri_app_handle, view).await?;
}
}
}
@@ -478,12 +309,11 @@ impl ThirdPartyExtensionsSearchSource {
/// Note that when you disable a parent extension, its **enabled** child
/// extensions should also be disabled.
#[async_recursion::async_recursion]
async fn _disable_extension(extension: &Extension) -> Result<(), String> {
async fn _disable_extension(
tauri_app_handle: &AppHandle,
extension: &Extension,
) -> Result<(), String> {
if let Some(ref hotkey) = extension.hotkey {
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
tauri_app_handle
.global_shortcut()
.unregister(hotkey.as_str())
@@ -494,19 +324,24 @@ impl ThirdPartyExtensionsSearchSource {
if extension.r#type.contains_sub_items() {
if let Some(commands) = &extension.commands {
for command in commands.iter().filter(|ext| ext.enabled) {
Self::_disable_extension(command).await?;
Self::_disable_extension(tauri_app_handle, command).await?;
}
}
if let Some(scripts) = &extension.scripts {
for script in scripts.iter().filter(|ext| ext.enabled) {
Self::_disable_extension(script).await?;
Self::_disable_extension(tauri_app_handle, script).await?;
}
}
if let Some(quicklinks) = &extension.quicklinks {
for quicklink in quicklinks.iter().filter(|ext| ext.enabled) {
Self::_disable_extension(quicklink).await?;
Self::_disable_extension(tauri_app_handle, quicklink).await?;
}
}
if let Some(views) = &extension.views {
for view in views.iter().filter(|ext| ext.enabled) {
Self::_disable_extension(tauri_app_handle, view).await?;
}
}
}
@@ -522,9 +357,15 @@ impl ThirdPartyExtensionsSearchSource {
}
}
/// Acquire the write lock to the extension list.
pub(crate) async fn write_lock(&self) -> RwLockWriteGuard<'_, Vec<Extension>> {
self.inner.extensions.write().await
}
#[named]
pub(super) async fn enable_extension(
&self,
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let mut extensions_write_lock = self.inner.extensions.write().await;
@@ -552,11 +393,11 @@ impl ThirdPartyExtensionsSearchSource {
update_extension(extension)?;
alter_extension_json_file(
&THIRD_PARTY_EXTENSIONS_DIRECTORY,
&get_third_party_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
Self::_enable_extension(extension).await?;
Self::_enable_extension(tauri_app_handle, extension).await?;
Ok(())
}
@@ -564,6 +405,7 @@ impl ThirdPartyExtensionsSearchSource {
#[named]
pub(super) async fn disable_extension(
&self,
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let mut extensions_write_lock = self.inner.extensions.write().await;
@@ -591,11 +433,11 @@ impl ThirdPartyExtensionsSearchSource {
update_extension(extension)?;
alter_extension_json_file(
&THIRD_PARTY_EXTENSIONS_DIRECTORY,
&get_third_party_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
Self::_disable_extension(extension).await?;
Self::_disable_extension(tauri_app_handle, extension).await?;
Ok(())
}
@@ -603,6 +445,7 @@ impl ThirdPartyExtensionsSearchSource {
#[named]
pub(super) async fn set_extension_alias(
&self,
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
alias: &str,
) -> Result<(), String> {
@@ -623,7 +466,7 @@ impl ThirdPartyExtensionsSearchSource {
update_extension(extension)?;
alter_extension_json_file(
&THIRD_PARTY_EXTENSIONS_DIRECTORY,
&get_third_party_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
@@ -633,11 +476,11 @@ impl ThirdPartyExtensionsSearchSource {
/// Initialize the third-party extensions, which literally means
/// enabling/activating the enabled extensions.
pub(super) async fn init(&self) -> Result<(), String> {
pub(crate) async fn init(&self, tauri_app_handle: &AppHandle) -> Result<(), String> {
let extensions_read_lock = self.inner.extensions.read().await;
for extension in extensions_read_lock.iter().filter(|ext| ext.enabled) {
Self::_enable_extension(extension).await?;
Self::_enable_extension(tauri_app_handle, extension).await?;
}
Ok(())
@@ -646,10 +489,12 @@ impl ThirdPartyExtensionsSearchSource {
#[named]
pub(super) async fn register_extension_hotkey(
&self,
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
hotkey: &str,
) -> Result<(), String> {
self.unregister_extension_hotkey(bundle_id).await?;
self.unregister_extension_hotkey(tauri_app_handle, bundle_id)
.await?;
let mut extensions_write_lock = self.inner.extensions.write().await;
let extension =
@@ -669,15 +514,12 @@ impl ThirdPartyExtensionsSearchSource {
// Update extension (memory and file)
update_extension(extension)?;
alter_extension_json_file(
&THIRD_PARTY_EXTENSIONS_DIRECTORY,
&get_third_party_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
// Set hotkey
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
let on_opened = extension.on_opened().unwrap_or_else(|| panic!(
"setting hotkey for an extension that cannot be opened, extension ID [{:?}], extension type [{:?}]", bundle_id, extension.r#type,
));
@@ -685,12 +527,14 @@ impl ThirdPartyExtensionsSearchSource {
let bundle_id_owned = bundle_id.to_owned();
tauri_app_handle
.global_shortcut()
.on_shortcut(hotkey, move |_tauri_app_handle, _hotkey, event| {
.on_shortcut(hotkey, move |tauri_app_handle, _hotkey, event| {
let on_opened_clone = on_opened.clone();
let bundle_id_clone = bundle_id_owned.clone();
let app_handle_clone = tauri_app_handle.clone();
if event.state() == ShortcutState::Pressed {
async_runtime::spawn(async move {
let result = open(on_opened_clone).await;
let result = open(app_handle_clone, on_opened_clone, None).await;
if let Err(msg) = result {
log::warn!(
"failed to open extension [{:?}], error [{}]",
@@ -711,6 +555,7 @@ impl ThirdPartyExtensionsSearchSource {
#[named]
pub(super) async fn unregister_extension_hotkey(
&self,
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let mut extensions_write_lock = self.inner.extensions.write().await;
@@ -738,15 +583,12 @@ impl ThirdPartyExtensionsSearchSource {
update_extension(extension)?;
alter_extension_json_file(
&THIRD_PARTY_EXTENSIONS_DIRECTORY,
&get_third_party_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
// Set hotkey
let tauri_app_handle = GLOBAL_TAURI_APP_HANDLE
.get()
.expect("global tauri app handle not set");
tauri_app_handle
.global_shortcut()
.unregister(hotkey.as_str())
@@ -805,46 +647,68 @@ impl ThirdPartyExtensionsSearchSource {
.any(|ext| ext.developer.as_deref() == Some(developer) && ext.id == extension_id)
}
pub(crate) async fn add_extension(&self, extension: Extension) {
assert!(
extension.developer.is_some(),
"loaded third party extension should have its developer set"
);
pub(crate) async fn uninstall_extension(
&self,
tauri_app_handle: &AppHandle,
developer: &str,
extension_id: &str,
) -> Result<(), String> {
let mut write_lock = self.inner.extensions.write().await;
let mut write_lock_guard = self.inner.extensions.write().await;
if write_lock_guard
.iter()
.any(|ext| ext.developer == extension.developer && ext.id == extension.id)
{
panic!(
"extension [{}/{}] already installed",
extension
.developer
.as_ref()
.expect("just checked it is Some"),
extension.id
);
}
write_lock_guard.push(extension);
}
pub(crate) async fn remove_extension(&self, developer: &str, extension_id: &str) {
let mut write_lock_guard = self.inner.extensions.write().await;
let Some(index) = write_lock_guard
let Some(index) = write_lock
.iter()
.position(|ext| ext.developer.as_deref() == Some(developer) && ext.id == extension_id)
else {
panic!(
"extension [{}/{}] not installed, but we are trying to remove it",
return Err(format!(
"The extension we are trying to uninstall [{}/{}] does not exist",
developer, extension_id
);
));
};
let deleted_extension = write_lock.remove(index);
let extension_dir = {
let mut path = get_third_party_extension_directory(&tauri_app_handle);
path.push(developer);
path.push(extension_id);
path
};
write_lock_guard.remove(index);
if let Err(e) = tokio::fs::remove_dir_all(extension_dir.as_path()).await {
let error_kind = e.kind();
if error_kind == ErrorKind::NotFound {
// We accept this error because we want the directory gone anyway. But
// since this is not a state we expect, log a warning.
log::warn!(
"trying to uninstall extension [developer {} id {}], but its directory does not exist",
developer,
extension_id
);
} else {
return Err(format!(
"failed to uninstall extension [developer {} id {}] due to error {}",
developer, extension_id, e
));
}
}
// Unregister the extension hotkey, if set.
//
// Unregistering the hotkey is the only thing we do when we disable
// an extension, so we reuse this function here, even though "disabling"
// an extension that is about to be uninstalled does not make much sense.
Self::_disable_extension(&tauri_app_handle, &deleted_extension).await?;
Ok(())
}
/// Take a point-in-time snapshot of the extension list and return it.
pub(crate) async fn extensions_snapshot(&self) -> Vec<Extension> {
self.inner.extensions.read().await.clone()
}
}
pub(super) static THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE: OnceLock<ThirdPartyExtensionsSearchSource> =
pub(crate) static THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE: OnceLock<ThirdPartyExtensionsSearchSource> =
OnceLock::new();
#[derive(Debug)]
@@ -865,7 +729,11 @@ impl SearchSource for ThirdPartyExtensionsSearchSource {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
@@ -909,10 +777,10 @@ impl SearchSource for ThirdPartyExtensionsSearchSource {
}
}
if let Some(ref quick_links) = extension.quicklinks {
for quick_link in quick_links.iter().filter(|link| link.enabled) {
if let Some(ref quicklinks) = extension.quicklinks {
for quicklink in quicklinks.iter().filter(|link| link.enabled) {
if let Some(hit) = extension_to_hit(
quick_link,
quicklink,
&query_lower,
opt_data_source.as_deref(),
) {
@@ -920,6 +788,16 @@ impl SearchSource for ThirdPartyExtensionsSearchSource {
}
}
}
if let Some(ref views) = extension.views {
for view in views.iter().filter(|link| link.enabled) {
if let Some(hit) =
extension_to_hit(view, &query_lower, opt_data_source.as_deref())
{
hits.push(hit);
}
}
}
} else {
if let Some(hit) =
extension_to_hit(extension, &query_lower, opt_data_source.as_deref())
@@ -948,7 +826,20 @@ impl SearchSource for ThirdPartyExtensionsSearchSource {
}
}
fn extension_to_hit(
#[tauri::command]
pub(crate) async fn uninstall_extension(
tauri_app_handle: AppHandle,
developer: String,
extension_id: String,
) -> Result<(), String> {
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.expect("global third party search source not set")
.uninstall_extension(&tauri_app_handle, &developer, &extension_id)
.await
}
pub(crate) fn extension_to_hit(
extension: &Extension,
query_lower: &str,
opt_data_source: Option<&str>,
@@ -1017,144 +908,3 @@ fn extension_to_hit(
None
}
}
// Calculates a similarity score between a query and a text, aiming for a [0, 1] range.
// Assumes query and text are already lowercased.
fn calculate_text_similarity(query: &str, text: &str) -> Option<f64> {
if query.is_empty() || text.is_empty() {
return None;
}
if text == query {
return Some(1.0); // Perfect match
}
let query_len = query.len() as f64;
let text_len = text.len() as f64;
let ratio = query_len / text_len;
let mut score: f64 = 0.0;
// Case 1: Text starts with the query (prefix match)
// Score: base 0.5, bonus up to 0.4 for how much of `text` is covered by `query`. Max 0.9.
if text.starts_with(query) {
score = score.max(0.5 + 0.4 * ratio);
}
// Case 2: Text contains the query (substring match, not necessarily prefix)
// Score: base 0.3, bonus up to 0.3. Max 0.6.
// `score.max` ensures that if it's both a prefix and contains, the higher score (prefix) is taken.
if text.contains(query) {
score = score.max(0.3 + 0.3 * ratio);
}
// Case 3: Fallback for "all query characters exist in text" (order-independent)
if score < 0.2 {
if query.chars().all(|c_q| text.contains(c_q)) {
score = score.max(0.15); // Fixed low score for this weaker match type
}
}
if score > 0.0 {
// Cap non-perfect matches slightly below 1.0 to make perfect (1.0) distinct.
Some(score.min(0.95))
} else {
None
}
}
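// Worked example for the order-independent fallback above (the arithmetic
// follows the code): for query "tca" and text "contact", the text neither
// starts with nor contains the query, so the score stays at 0.0 (< 0.2);
// every query character does occur in the text, so the fallback applies and
// the function returns Some(0.15).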
#[cfg(test)]
mod tests {
use super::*;
// Helper function for approximate floating point comparison
fn approx_eq(a: f64, b: f64) -> bool {
(a - b).abs() < 1e-10
}
#[test]
fn test_empty_strings() {
assert_eq!(calculate_text_similarity("", "text"), None);
assert_eq!(calculate_text_similarity("query", ""), None);
assert_eq!(calculate_text_similarity("", ""), None);
}
#[test]
fn test_perfect_match() {
assert_eq!(calculate_text_similarity("text", "text"), Some(1.0));
assert_eq!(calculate_text_similarity("a", "a"), Some(1.0));
}
#[test]
fn test_prefix_match() {
// For "te" and "text":
// score = 0.5 + 0.4 * (2/4) = 0.5 + 0.2 = 0.7
let score = calculate_text_similarity("te", "text").unwrap();
assert!(approx_eq(score, 0.7));
// For "tex" and "text":
// score = 0.5 + 0.4 * (3/4) = 0.5 + 0.3 = 0.8
let score = calculate_text_similarity("tex", "text").unwrap();
assert!(approx_eq(score, 0.8));
}
#[test]
fn test_substring_match() {
// For "ex" and "text":
// score = 0.3 + 0.3 * (2/4) = 0.3 + 0.15 = 0.45
let score = calculate_text_similarity("ex", "text").unwrap();
assert!(approx_eq(score, 0.45));
// Prefix should score higher than substring
assert!(
calculate_text_similarity("te", "text").unwrap()
> calculate_text_similarity("ex", "text").unwrap()
);
}
#[test]
fn test_character_presence() {
// Intended to exercise character presence, but "tac" is actually a
// substring of "contact" ("con-tac-t"), so the substring score applies.
let score = calculate_text_similarity("tac", "contact").unwrap();
assert!(approx_eq(0.3 + 0.3 * (3.0 / 7.0), score));
assert!(calculate_text_similarity("ac", "contact").is_some());
// Should not apply if some characters are missing
assert_eq!(calculate_text_similarity("xyz", "contact"), None);
}
#[test]
fn test_combined_scenarios() {
// Test that character presence fallback doesn't override higher scores
// "tex" is a prefix of "text" with score 0.8
let score = calculate_text_similarity("tex", "text").unwrap();
assert!(approx_eq(score, 0.8));
// Test a case where the characters exist but it's already a substring
// "act" is a substring of "contact" with score > 0.2, so fallback won't apply
let expected_score = 0.3 + 0.3 * (3.0 / 7.0);
let actual_score = calculate_text_similarity("act", "contact").unwrap();
assert!(approx_eq(actual_score, expected_score));
}
#[test]
fn test_no_similarity() {
assert_eq!(calculate_text_similarity("xyz", "test"), None);
}
#[test]
fn test_score_capping() {
// Use a long query that's a prefix of a slightly longer text
let long_text = "abcdefghijklmnopqrstuvwxyz";
let long_prefix = "abcdefghijklmnopqrstuvwxy"; // All but last letter
// Expected score would be 0.5 + 0.4 * (25/26) = 0.5 + 0.385 = 0.885
let expected_score = 0.5 + 0.4 * (25.0 / 26.0);
let actual_score = calculate_text_similarity(long_prefix, long_text).unwrap();
assert!(approx_eq(actual_score, expected_score));
// Verify that non-perfect matches are capped at 0.95
assert!(calculate_text_similarity("almost", "almost perfect").unwrap() <= 0.95);
}
}

@@ -10,16 +10,15 @@ mod shortcut;
mod util;
use crate::common::register::SearchSourceRegistry;
// use crate::common::traits::SearchSource;
use crate::common::{CHECK_WINDOW_LABEL, MAIN_WINDOW_LABEL, SETTINGS_WINDOW_LABEL};
use crate::server::servers::{load_or_insert_default_server, load_servers_token};
use autostart::{change_autostart, ensure_autostart_state_consistent};
use crate::util::prevent_default;
use autostart::change_autostart;
use lazy_static::lazy_static;
use std::sync::Mutex;
use std::sync::OnceLock;
use tauri::async_runtime::block_on;
use tauri::plugin::TauriPlugin;
use tauri::{AppHandle, Emitter, Manager, PhysicalPosition, Runtime, WebviewWindow, WindowEvent};
use tauri::{AppHandle, Emitter, Manager, PhysicalPosition, WebviewWindow, WindowEvent};
use tauri_plugin_autostart::MacosLauncher;
/// Tauri store name
@@ -28,9 +27,14 @@ pub(crate) const COCO_TAURI_STORE: &str = "coco_tauri_store";
lazy_static! {
static ref PREVIOUS_MONITOR_NAME: Mutex<Option<String>> = Mutex::new(None);
}
/// To allow us to access tauri's `AppHandle` when its context is inaccessible,
/// store it globally. It will be set in `init()`.
///
/// # WARNING
///
/// You may find that this works, but its usage is discouraged and should generally
/// be avoided. If you do need it, be aware that it may not have been set() yet
/// when you access it.
pub(crate) static GLOBAL_TAURI_APP_HANDLE: OnceLock<AppHandle> = OnceLock::new();
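// A minimal sketch (not part of this commit) of the guarded access pattern the
// warning above asks for; `emit_if_handle_ready` is a hypothetical helper name.
fn emit_if_handle_ready() {
    if let Some(app_handle) = GLOBAL_TAURI_APP_HANDLE.get() {
        // The handle is available: use it like any other `&AppHandle`.
        let _ = app_handle.emit("handle-ready", ());
    } else {
        // Not set yet (e.g. called before setup ran): degrade gracefully
        // instead of unwrapping the OnceLock.
        log::warn!("GLOBAL_TAURI_APP_HANDLE is not set yet, skipping emit");
    }
}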
#[tauri::command]
@@ -65,10 +69,8 @@ pub fn run() {
#[cfg(desktop)]
{
app_builder = app_builder.plugin(tauri_plugin_single_instance::init(|_app, argv, _cwd| {
log::debug!("a new app instance was opened with {argv:?} and the deep link event was already triggered");
// when defining deep link schemes at runtime, you must also check `argv` here
}));
app_builder =
app_builder.plugin(tauri_plugin_single_instance::init(|_app, _argv, _cwd| {}));
}
app_builder = app_builder
@@ -85,9 +87,14 @@ pub fn run() {
.plugin(tauri_plugin_macos_permissions::init())
.plugin(tauri_plugin_screenshots::init())
.plugin(tauri_plugin_process::init())
.plugin(tauri_plugin_updater::Builder::new().build())
.plugin(
tauri_plugin_updater::Builder::new()
.default_version_comparator(crate::util::updater::custom_version_comparator)
.build(),
)
.plugin(tauri_plugin_windows_version::init())
.plugin(tauri_plugin_opener::init());
.plugin(tauri_plugin_opener::init())
.plugin(prevent_default::init());
// Conditional compilation for macOS
#[cfg(target_os = "macos")]
@@ -107,7 +114,6 @@ pub fn run() {
show_settings,
show_check,
hide_check,
server::servers::get_server_token,
server::servers::add_coco_server,
server::servers::remove_coco_server,
server::servers::list_coco_servers,
@@ -122,8 +128,8 @@ pub fn run() {
server::connector::get_connectors_by_server,
search::query_coco_fusion,
assistant::chat_history,
assistant::new_chat,
assistant::send_message,
assistant::chat_create,
assistant::chat_chat,
assistant::session_chat_history,
assistant::open_session_chat,
assistant::close_session_chat,
@@ -135,21 +141,19 @@ pub fn run() {
assistant::assistant_get_multi,
// server::get_coco_server_datasources,
// server::get_coco_server_connectors,
server::websocket::connect_to_server,
server::websocket::disconnect,
get_app_search_source,
server::attachment::upload_attachment,
server::attachment::get_attachment,
server::attachment::get_attachment_by_ids,
server::attachment::delete_attachment,
server::transcription::transcription,
server::system_settings::get_system_settings,
simulate_mouse_click,
extension::built_in::application::get_app_list,
extension::built_in::application::get_app_search_path,
extension::built_in::application::get_app_metadata,
extension::built_in::application::add_app_search_path,
extension::built_in::application::remove_app_search_path,
extension::built_in::application::reindex_applications,
extension::quicklink_link_arguments,
extension::list_extensions,
extension::enable_extension,
extension::disable_extension,
@@ -157,13 +161,24 @@ pub fn run() {
extension::register_extension_hotkey,
extension::unregister_extension_hotkey,
extension::is_extension_enabled,
extension::store::search_extension,
extension::store::install_extension,
extension::store::uninstall_extension,
extension::third_party::install::store::search_extension,
extension::third_party::install::store::extension_detail,
extension::third_party::install::store::install_extension_from_store,
extension::third_party::install::local_extension::install_local_extension,
extension::third_party::uninstall_extension,
extension::api::apis,
extension::api::fs::read_dir,
settings::set_allow_self_signature,
settings::get_allow_self_signature,
assistant::ask_ai,
crate::common::document::open,
extension::built_in::file_search::config::get_file_system_config,
extension::built_in::file_search::config::set_file_system_config,
server::synthesize::synthesize,
util::file::get_file_icon,
setup::backend_setup,
util::app_lang::update_app_lang,
util::path::path_absolute,
])
.setup(|app| {
#[cfg(target_os = "macos")]
@@ -173,55 +188,21 @@ pub fn run() {
log::trace!("Dock icon should be hidden now");
}
let app_handle = app.handle().clone();
GLOBAL_TAURI_APP_HANDLE
.set(app_handle.clone())
.expect("variable already initialized");
log::trace!("global Tauri app handle set");
let registry = SearchSourceRegistry::default();
app.manage(registry); // Store registry in Tauri's app state
app.manage(server::websocket::WebSocketManager::default());
block_on(async {
init(app.handle()).await;
});
shortcut::enable_shortcut(app);
ensure_autostart_state_consistent(app)?;
// app.listen("theme-changed", move |event| {
// if let Ok(payload) = serde_json::from_str::<ThemeChangedPayload>(event.payload()) {
// // switch_tray_icon(app.app_handle(), payload.is_dark_mode);
// log::debug!("Theme changed: is_dark_mode = {}", payload.is_dark_mode);
// }
// });
#[cfg(desktop)]
{
#[cfg(any(windows, target_os = "linux"))]
{
app.deep_link().register("coco")?;
use tauri_plugin_deep_link::DeepLinkExt;
app.deep_link().register_all()?;
}
}
// app.deep_link().on_open_url(|event| {
// dbg!(event.urls());
// });
let main_window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
let settings_window = app.get_webview_window(SETTINGS_WINDOW_LABEL).unwrap();
let check_window = app.get_webview_window(CHECK_WINDOW_LABEL).unwrap();
/* ----------- This code must be executed on the main thread and must not be relocated. ----------- */
let app_handle = app.app_handle();
let main_window = app_handle.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
let settings_window = app_handle
.get_webview_window(SETTINGS_WINDOW_LABEL)
.unwrap();
let check_window = app_handle.get_webview_window(CHECK_WINDOW_LABEL).unwrap();
setup::default(
app,
app_handle,
main_window.clone(),
settings_window.clone(),
check_window.clone(),
);
/* ----------- This code must be executed on the main thread and must not be relocated. ----------- */
Ok(())
})
@@ -256,7 +237,7 @@ pub fn run() {
});
}
pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
pub async fn init(app_handle: &AppHandle) {
// Await the async functions to load the servers and tokens
if let Err(err) = load_or_insert_default_server(app_handle).await {
log::error!("Failed to load servers: {}", err);
@@ -266,7 +247,7 @@ pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
log::error!("Failed to load server tokens: {}", err);
}
let coco_servers = server::servers::get_all_servers();
let coco_servers = server::servers::get_all_servers().await;
// Get the registry from Tauri's state
// let registry: State<SearchSourceRegistry> = app_handle.state::<SearchSourceRegistry>();
@@ -280,12 +261,22 @@ pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
}
#[tauri::command]
async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) {
async fn show_coco(app_handle: AppHandle) {
if let Some(window) = app_handle.get_webview_window(MAIN_WINDOW_LABEL) {
move_window_to_active_monitor(&window);
let _ = window.show();
let _ = window.unminimize();
// The Window Management (WM) extension (macOS-only) controls the
// frontmost window. Setting focus on macOS makes Coco the frontmost
// window, which means the WM extension would control Coco instead of other
// windows, which is not what we want.
//
// On Linux/Windows, however, setting focus is necessary so that users can
// start typing right after opening Coco's window, without needing to
// click on it first.
#[cfg(not(target_os = "macos"))]
let _ = window.set_focus();
let _ = app_handle.emit("show-coco", ());
@@ -293,7 +284,7 @@ async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) {
}
#[tauri::command]
async fn hide_coco<R: Runtime>(app: AppHandle<R>) {
async fn hide_coco(app: AppHandle) {
if let Some(window) = app.get_webview_window(MAIN_WINDOW_LABEL) {
if let Err(err) = window.hide() {
log::error!("Failed to hide the window: {}", err);
@@ -305,7 +296,7 @@ async fn hide_coco<R: Runtime>(app: AppHandle<R>) {
}
}
fn move_window_to_active_monitor<R: Runtime>(window: &WebviewWindow<R>) {
fn move_window_to_active_monitor(window: &WebviewWindow) {
//dbg!("Moving window to active monitor");
// Try to get the available monitors, handle failure gracefully
let available_monitors = match window.available_monitors() {
@@ -399,13 +390,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &WebviewWindow<R>) {
}
#[tauri::command]
async fn get_app_search_source<R: Runtime>(app_handle: AppHandle<R>) -> Result<(), String> {
// We want all the extensions here, so no filter condition specified.
let (_found_invalid_extensions, extensions) = extension::list_extensions(None, None, false)
.await
.map_err(|e| e.to_string())?;
extension::init_extensions(extensions).await?;
async fn get_app_search_source(app_handle: AppHandle) -> Result<(), String> {
let _ = server::connector::refresh_all_connectors(&app_handle).await;
let _ = server::datasource::refresh_all_datasources(&app_handle).await;
@@ -446,52 +431,6 @@ async fn hide_check(app_handle: AppHandle) {
window.hide().unwrap();
}
#[tauri::command]
async fn simulate_mouse_click<R: Runtime>(window: WebviewWindow<R>, is_chat_mode: bool) {
#[cfg(target_os = "windows")]
{
use enigo::{Button, Coordinate, Direction, Enigo, Mouse, Settings};
use std::{thread, time::Duration};
if let Ok(mut enigo) = Enigo::new(&Settings::default()) {
// Save the current mouse position
if let Ok((original_x, original_y)) = enigo.location() {
// Retrieve the window's outer position (top-left corner)
if let Ok(position) = window.outer_position() {
// Retrieve the window's inner size (client area)
if let Ok(size) = window.inner_size() {
// Calculate the center position of the title bar
let x = position.x + (size.width as i32 / 2);
let y = if is_chat_mode {
position.y + size.height as i32 - 50
} else {
position.y + 30
};
// Move the mouse cursor to the calculated position
if enigo.move_mouse(x, y, Coordinate::Abs).is_ok() {
// // Simulate a left mouse click
let _ = enigo.button(Button::Left, Direction::Click);
// let _ = enigo.button(Button::Left, Direction::Release);
thread::sleep(Duration::from_millis(100));
// Move the mouse cursor back to the original position
let _ = enigo.move_mouse(original_x, original_y, Coordinate::Abs);
}
}
}
}
}
}
#[cfg(not(target_os = "windows"))]
{
let _ = window;
let _ = is_chat_mode;
}
}
/// Log format:
///
/// ```text
@@ -601,7 +540,12 @@ fn set_up_tauri_logger() -> TauriPlugin<tauri::Wry> {
// When running the built binary, set `COCO_LOG` to `coco_lib=trace` to capture all logs
// that come from Coco in the log file, which helps with debugging.
if !tauri::is_dev() {
std::env::set_var("COCO_LOG", "coco_lib=trace");
// We have absolutely no guarantee that `envp` will not be read/written
// concurrently (we control the Rust code, but have no idea what the libc C
// code and all the shared objects we link against do), so just use unsafe.
unsafe {
std::env::set_var("COCO_LOG", "coco_lib=trace");
}
}
let mut builder = tauri_plugin_log::Builder::new();

@@ -1,5 +1,112 @@
// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
use std::fs::OpenOptions;
use std::io::Write;
use std::path::PathBuf;
/// Helper function to return the log directory.
///
/// This should return the same value as `tauri_app_handle.path().app_log_dir().unwrap()`.
fn app_log_dir() -> PathBuf {
// This function `app_log_dir()` is for the panic hook, which should be set
// before Tauri performs any initialization. At that point, we do not have
// access to the identifier provided by Tauri, so we need to define our own
// copy here.
//
// NOTE: If you update the identifier in the following files, update this one
// as well!
//
// src-tauri/tauri.linux.conf.json
// src-tauri/Entitlements.plist
// src-tauri/tauri.conf.json
// src-tauri/Info.plist
const IDENTIFIER: &str = "rs.coco.app";
#[cfg(target_os = "macos")]
let path = dirs::home_dir()
.expect("cannot find the home directory, Coco should never run in such a environment")
.join("Library/Logs")
.join(IDENTIFIER);
#[cfg(not(target_os = "macos"))]
let path = dirs::data_local_dir()
.expect("app local dir is None, we should not encounter this")
.join(IDENTIFIER)
.join("logs");
path
}
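// For illustration, with the identifier above this resolves to paths like the
// following (usernames hypothetical):
//
//   macOS:   /Users/foo/Library/Logs/rs.coco.app
//   Linux:   /home/foo/.local/share/rs.coco.app/logs
//   Windows: C:\Users\foo\AppData\Local\rs.coco.app\logs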
/// Set up panic hook to log panic information to a file
fn setup_panic_hook() {
std::panic::set_hook(Box::new(|panic_info| {
let timestamp = chrono::Local::now();
// "%Y-%m-%d %H:%M:%S"
//
// I would like to use the above format, but Windows does not allow that
// and complains with OS error 123.
let datetime_str = timestamp.format("%Y-%m-%d-%H-%M-%S").to_string();
let log_dir = app_log_dir();
// Ensure the log directory exists
if let Err(e) = std::fs::create_dir_all(&log_dir) {
eprintln!("Panic hook error: failed to create log directory: {}", e);
return;
}
let panic_file = log_dir.join(format!("{}_rust_panic.log", datetime_str));
// Prepare panic information
let panic_message = if let Some(s) = panic_info.payload().downcast_ref::<&str>() {
s.to_string()
} else if let Some(s) = panic_info.payload().downcast_ref::<String>() {
s.clone()
} else {
"Unknown panic message".to_string()
};
let location = if let Some(location) = panic_info.location() {
format!(
"{}:{}:{}",
location.file(),
location.line(),
location.column()
)
} else {
"Unknown location".to_string()
};
// Use `force_capture()` instead of `capture()` as we want a backtrace
// regardless of whether the corresponding env vars are set or not.
let backtrace = std::backtrace::Backtrace::force_capture();
let panic_log = format!(
"Time: [{}]\nLocation: [{}]\nMessage: [{}]\nBacktrace: \n{}",
datetime_str, location, panic_message, backtrace
);
// Write to panic file
match OpenOptions::new()
.create(true)
.append(true)
.open(&panic_file)
{
Ok(mut file) => {
if let Err(e) = writeln!(file, "{}", panic_log) {
eprintln!("Panic hook error: Failed to write panic to file: {}", e);
}
}
Err(e) => {
eprintln!("Panic hook error: Failed to open panic log file: {}", e);
}
}
}));
}
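// For illustration, each entry appended to `<timestamp>_rust_panic.log` has the
// shape below (timestamp, location, and message are hypothetical):
//
// Time: [2025-09-28-10-00-00]
// Location: [src/some_module.rs:42:9]
// Message: [called `Option::unwrap()` on a `None` value]
// Backtrace:
// <frames captured by `Backtrace::force_capture()`>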
fn main() {
// Panic hook setup should be the first thing to do, everything could panic!
setup_panic_hook();
coco_lib::run();
}

View File

@@ -1,141 +1,210 @@
use crate::common::error::SearchError;
use crate::common::register::SearchSourceRegistry;
use crate::common::search::{
FailedRequest, MultiSourceQueryResponse, QueryHits, QueryResponse, QuerySource, SearchQuery,
FailedRequest, MultiSourceQueryResponse, QueryHits, QuerySource, SearchQuery,
};
use crate::common::traits::SearchSource;
use crate::server::servers::logout_coco_server;
use crate::server::servers::mark_server_as_offline;
use function_name::named;
use futures::stream::FuturesUnordered;
use futures::StreamExt;
use futures::stream::FuturesUnordered;
use reqwest::StatusCode;
use std::cmp::Reverse;
use std::collections::HashMap;
use std::collections::HashSet;
use std::future::Future;
use std::sync::Arc;
use tauri::{AppHandle, Manager, Runtime};
use tokio::time::error::Elapsed;
use tokio::time::{timeout, Duration};
/// Helper function to return the Future used for querying query sources.
///
/// It is a workaround for the following limitations:
///
/// 1. Two async blocks have different types in Rust's type system even though
/// they are literally the same
/// 2. `futures::stream::FuturesUnordered` needs the `Future`s pushed to it to
/// have exactly 1 type
///
/// Putting the async block in a single function unifies the types.
fn same_type_futures(
query_source: QuerySource,
query_source_trait_object: Arc<dyn SearchSource>,
timeout_duration: Duration,
search_query: SearchQuery,
) -> impl Future<
Output = (
QuerySource,
Result<Result<QueryResponse, SearchError>, Elapsed>,
),
> + 'static {
async move {
(
// Store `query_source` as part of future for debugging purposes.
query_source,
timeout(timeout_duration, async {
query_source_trait_object.search(search_query).await
})
.await,
)
}
}
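A minimal standalone sketch (not part of this diff) of the limitation described in the doc comment above, assuming the `futures` and `tokio` crates: two identical-looking `async` blocks written in different places have distinct anonymous types, so `FuturesUnordered` only accepts them when they are all produced in one place, e.g. by a single helper function.
use futures::stream::{FuturesUnordered, StreamExt};
use std::future::Future;

// Every call site gets the same opaque `impl Future` type.
fn tagged(value: u32) -> impl Future<Output = u32> {
    async move { value * 10 }
}

#[tokio::main]
async fn main() {
    let mut futures = FuturesUnordered::new();
    futures.push(tagged(1));
    futures.push(tagged(2)); // OK: same opaque type as the first push
    // futures.push(async move { 30 }); // would not compile: a different anonymous type
    while let Some(v) = futures.next().await {
        println!("{v}");
    }
}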
use tauri::{AppHandle, Manager};
use tokio::time::{Duration, timeout};
#[named]
#[tauri::command]
pub async fn query_coco_fusion<R: Runtime>(
app_handle: AppHandle<R>,
pub async fn query_coco_fusion(
tauri_app_handle: AppHandle,
from: u64,
size: u64,
query_strings: HashMap<String, String>,
query_timeout: u64,
) -> Result<MultiSourceQueryResponse, SearchError> {
let query_keyword = query_strings
.get("query")
.unwrap_or(&"".to_string())
.clone();
let opt_query_source_id = query_strings.get("querysource");
let search_sources = app_handle.state::<SearchSourceRegistry>();
let sources_future = search_sources.get_sources();
let mut futures = FuturesUnordered::new();
let mut sources_list = sources_future.await;
let sources_list_len = sources_list.len();
// Time limit for each query
let search_sources = tauri_app_handle.state::<SearchSourceRegistry>();
let query_source_list = search_sources.get_sources().await;
let timeout_duration = Duration::from_millis(query_timeout);
let search_query = SearchQuery::new(from, size, query_strings.clone());
log::debug!(
"{}(): {:?}, timeout: {:?}",
"{}() invoked with parameters: from: [{}], size: [{}], query_strings: [{:?}], timeout: [{:?}]",
function_name!(),
from,
size,
query_strings,
timeout_duration
);
let search_query = SearchQuery::new(from, size, query_strings.clone());
// Dispatch to different `query_coco_fusion_xxx()` functions.
if let Some(query_source_id) = opt_query_source_id {
// If this query source ID is specified, we only query this query source.
log::debug!(
"parameter [querysource={}] specified, will only query this querysource",
query_source_id
);
let opt_query_source_trait_object_index = sources_list
.iter()
.position(|query_source| &query_source.get_type().id == query_source_id);
let Some(query_source_trait_object_index) = opt_query_source_trait_object_index else {
// It is possible (an edge case) that the frontend invokes `query_coco_fusion()` with a
// datasource that does not exist in the source list:
//
// 1. Search applications
// 2. Navigate to the application sub page
// 3. Disable the application extension in settings
// 4. hide the search window
// 5. Re-open the search window and search for something
//
// The application search source is not in the source list because the extension
// has been disabled, but the last search is indeed invoked with parameter
// `datasource=application`.
return Ok(MultiSourceQueryResponse {
failed: Vec::new(),
hits: Vec::new(),
total_hits: 0,
});
};
let query_source_trait_object = sources_list.remove(query_source_trait_object_index);
let query_source = query_source_trait_object.get_type();
futures.push(same_type_futures(
query_source,
query_source_trait_object,
query_coco_fusion_single_query_source(
tauri_app_handle,
query_source_list,
query_source_id.clone(),
timeout_duration,
search_query,
));
)
.await
} else {
for query_source_trait_object in sources_list {
let query_source = query_source_trait_object.get_type().clone();
log::debug!("will query querysource [{}]", query_source.id);
futures.push(same_type_futures(
query_source,
query_source_trait_object,
timeout_duration,
search_query.clone(),
));
query_coco_fusion_multi_query_sources(
tauri_app_handle,
query_source_list,
timeout_duration,
search_query,
)
.await
}
}
/// Query only 1 query source.
///
/// The logic here is much simpler than `query_coco_fusion_multi_query_sources()`
/// as we don't need to re-rank: re-ranking only matters when multiple query
/// sources are involved.
async fn query_coco_fusion_single_query_source(
tauri_app_handle: AppHandle,
mut query_source_list: Vec<Arc<dyn SearchSource>>,
id_of_query_source_to_query: String,
timeout_duration: Duration,
search_query: SearchQuery,
) -> Result<MultiSourceQueryResponse, SearchError> {
// If this query source ID is specified, we only query this query source.
log::debug!(
"parameter [querysource={}] specified, will only query this query source",
id_of_query_source_to_query
);
let opt_query_source_trait_object_index = query_source_list
.iter()
.position(|query_source| query_source.get_type().id == id_of_query_source_to_query);
let Some(query_source_trait_object_index) = opt_query_source_trait_object_index else {
// It is possible (an edge case) that the frontend invokes `query_coco_fusion()`
// with a querysource that does not exist in the source list:
//
// 1. Search applications
// 2. Navigate to the application sub page
// 3. Disable the application extension in settings, which removes this
// query source from the list
// 4. Hide the search window
// 5. Re-open the search window; you will still be in the sub page, type to
// search something
//
// The application query source is not in the source list because the extension
// was disabled and thus removed from the query sources, but the last
// search is indeed invoked with parameter `querysource=application`.
return Ok(MultiSourceQueryResponse {
failed: Vec::new(),
hits: Vec::new(),
total_hits: 0,
});
};
let query_source_trait_object = query_source_list.remove(query_source_trait_object_index);
let query_source = query_source_trait_object.get_type();
let search_fut = query_source_trait_object.search(tauri_app_handle.clone(), search_query);
let timeout_result = timeout(timeout_duration, search_fut).await;
let mut failed_requests: Vec<FailedRequest> = Vec::new();
let mut hits = Vec::new();
let mut total_hits = 0;
match timeout_result {
// Ignore the `_timeout` variable as it won't provide any useful debugging information.
Err(_timeout) => {
log::warn!(
"searching query source [{}] timed out, skip this request",
query_source.id
);
}
Ok(query_result) => match query_result {
Ok(response) => {
total_hits = response.total_hits;
for (document, score) in response.hits {
log::debug!(
"document from query source [{}]: ID [{}], title [{:?}], score [{}]",
response.source.id,
document.id,
document.title,
score
);
let query_hit = QueryHits {
source: Some(response.source.clone()),
score,
document,
};
hits.push(query_hit);
}
}
Err(search_error) => {
query_coco_fusion_handle_failed_request(
tauri_app_handle.clone(),
&mut failed_requests,
query_source,
search_error,
)
.await;
}
},
}
Ok(MultiSourceQueryResponse {
failed: failed_requests,
hits,
total_hits,
})
}
async fn query_coco_fusion_multi_query_sources(
tauri_app_handle: AppHandle,
query_source_trait_object_list: Vec<Arc<dyn SearchSource>>,
timeout_duration: Duration,
search_query: SearchQuery,
) -> Result<MultiSourceQueryResponse, SearchError> {
log::debug!(
"will query query sources {:?}",
query_source_trait_object_list
.iter()
.map(|search_source| search_source.get_type().id.clone())
.collect::<Vec<String>>()
);
let query_keyword = search_query
.query_strings
.get("query")
.unwrap_or(&"".to_string())
.clone();
let size = search_query.size;
let mut futures = FuturesUnordered::new();
let query_source_list_len = query_source_trait_object_list.len();
for query_source_trait_object in query_source_trait_object_list {
let query_source = query_source_trait_object.get_type().clone();
let tauri_app_handle_clone = tauri_app_handle.clone();
let search_query_clone = search_query.clone();
futures.push(async move {
(
// Store `query_source` as part of future for debugging purposes.
query_source,
timeout(timeout_duration, async {
query_source_trait_object
.search(tauri_app_handle_clone, search_query_clone)
.await
})
.await,
)
});
}
let mut total_hits = 0;
@@ -144,7 +213,7 @@ pub async fn query_coco_fusion<R: Runtime>(
let mut all_hits: Vec<(String, QueryHits, f64)> = Vec::new();
let mut hits_per_source: HashMap<String, Vec<(QueryHits, f64)>> = HashMap::new();
if sources_list_len > 1 {
if query_source_list_len > 1 {
need_rerank = true; // If we have more than one source, we need to rerank the hits
}
@@ -156,25 +225,25 @@ pub async fn query_coco_fusion<R: Runtime>(
"searching query source [{}] timed out, skip this request",
query_source.id
);
// failed_requests.push(FailedRequest {
// source: query_source,
// status: 0,
// error: Some("querying timed out".into()),
// reason: None,
// });
}
Ok(query_result) => match query_result {
Ok(response) => {
total_hits += response.total_hits;
let source_id = response.source.id.clone();
for (doc, score) in response.hits {
log::debug!("doc: {}, {:?}, {}", doc.id, doc.title, score);
for (document, score) in response.hits {
log::debug!(
"document from query source [{}]: ID [{}], title [{:?}], score [{}]",
response.source.id,
document.id,
document.title,
score
);
let query_hit = QueryHits {
source: Some(response.source.clone()),
score,
document: doc,
document,
};
all_hits.push((source_id.clone(), query_hit.clone(), score));
@@ -186,17 +255,13 @@ pub async fn query_coco_fusion<R: Runtime>(
}
}
Err(search_error) => {
log::error!(
"searching query source [{}] failed, error [{}]",
query_source.id,
search_error
);
failed_requests.push(FailedRequest {
source: query_source,
status: 0,
error: Some(search_error.to_string()),
reason: None,
});
query_coco_fusion_handle_failed_request(
tauri_app_handle.clone(),
&mut failed_requests,
query_source,
search_error,
)
.await;
}
},
}
@@ -356,3 +421,54 @@ fn boosted_levenshtein_rerank(query: &str, titles: Vec<(usize, &str)>) -> Vec<(u
})
.collect()
}
/// Helper function to handle a failed request.
///
/// Extracted as a function because `query_coco_fusion_single_query_source()` and
/// `query_coco_fusion_multi_query_sources()` share the same error handling logic.
async fn query_coco_fusion_handle_failed_request(
tauri_app_handle: AppHandle,
failed_requests: &mut Vec<FailedRequest>,
query_source: QuerySource,
search_error: SearchError,
) {
log::error!(
"searching query source [{}] failed, error [{}]",
query_source.id,
search_error
);
let mut status_code_num: u16 = 0;
if let SearchError::HttpError {
status_code: opt_status_code,
msg: _,
} = search_error
{
if let Some(status_code) = opt_status_code {
status_code_num = status_code.as_u16();
if status_code != StatusCode::OK {
if status_code == StatusCode::UNAUTHORIZED {
// This Coco server is unavailable. In addition to marking it as
// unavailable, we need to log out because the status code is 401.
logout_coco_server(tauri_app_handle.clone(), query_source.id.to_string()).await.unwrap_or_else(|e| {
panic!(
"the search request to Coco server [id {}, name {}] failed with status code {}, the login token is invalid, we are trying to log out, but failed with error [{}]",
query_source.id, query_source.name, StatusCode::UNAUTHORIZED, e
);
})
} else {
// This Coco server is unavailable
mark_server_as_offline(tauri_app_handle.clone(), &query_source.id).await;
}
}
}
}
failed_requests.push(FailedRequest {
source: query_source,
status: status_code_num,
error: Some(search_error.to_string()),
reason: None,
});
}

View File

@@ -15,42 +15,6 @@ pub struct UploadAttachmentResponse {
pub attachments: Vec<String>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentSource {
pub id: String,
pub created: String,
pub updated: String,
pub session: String,
pub name: String,
pub icon: String,
pub url: String,
pub size: u64,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentHit {
pub _index: String,
pub _type: Option<String>,
pub _id: String,
pub _score: Option<f64>,
pub _source: AttachmentSource,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentHits {
pub total: Value,
pub max_score: Option<f64>,
pub hits: Option<Vec<AttachmentHit>>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct GetAttachmentResponse {
pub took: u32,
pub timed_out: bool,
pub _shards: Option<Value>,
pub hits: AttachmentHits,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct DeleteAttachmentResponse {
pub _id: String,
@@ -60,7 +24,6 @@ pub struct DeleteAttachmentResponse {
#[command]
pub async fn upload_attachment(
server_id: String,
session_id: String,
file_paths: Vec<PathBuf>,
) -> Result<UploadAttachmentResponse, String> {
let mut form = Form::new();
@@ -82,10 +45,12 @@ pub async fn upload_attachment(
form = form.part("files", part);
}
let server = get_server_by_id(&server_id).ok_or("Server not found")?;
let url = HttpClient::join_url(&server.endpoint, &format!("chat/{}/_upload", session_id));
let server = get_server_by_id(&server_id)
.await
.ok_or("Server not found")?;
let url = HttpClient::join_url(&server.endpoint, &format!("attachment/_upload"));
let token = get_server_token(&server_id).await?;
let token = get_server_token(&server_id).await;
let mut headers = HashMap::new();
if let Some(token) = token {
headers.insert("X-API-TOKEN".to_string(), token.access_token);
@@ -107,20 +72,25 @@ pub async fn upload_attachment(
}
#[command]
pub async fn get_attachment(
pub async fn get_attachment_by_ids(
server_id: String,
session_id: String,
) -> Result<GetAttachmentResponse, String> {
let mut query_params = Vec::new();
query_params.push(format!("session={}", session_id));
attachments: Vec<String>,
) -> Result<Value, String> {
println!("get_attachment_by_ids server_id: {}", server_id);
println!("get_attachment_by_ids attachments: {:?}", attachments);
let response = HttpClient::get(&server_id, "/attachment/_search", Some(query_params))
let request_body = serde_json::json!({
"attachments": attachments
});
let body = reqwest::Body::from(serde_json::to_string(&request_body).unwrap());
let response = HttpClient::post(&server_id, "/attachment/_search", None, Some(body))
.await
.map_err(|e| format!("Request error: {}", e))?;
let body = get_response_body_text(response).await?;
serde_json::from_str::<GetAttachmentResponse>(&body)
serde_json::from_str::<Value>(&body)
.map_err(|e| format!("Failed to parse attachment response: {}", e))
}

View File

@@ -4,31 +4,31 @@ use crate::server::servers::{
get_server_by_id, persist_servers, persist_servers_token, save_access_token, save_server,
try_register_server_to_search_source,
};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
#[allow(dead_code)]
fn request_access_token_url(request_id: &str) -> String {
// Remove the endpoint part and keep just the path for the request
format!("/auth/request_access_token?request_id={}", request_id)
format!("/auth/access_token?request_id={}", request_id)
}
#[tauri::command]
pub async fn handle_sso_callback<R: Runtime>(
app_handle: AppHandle<R>,
pub async fn handle_sso_callback(
app_handle: AppHandle,
server_id: String,
request_id: String,
code: String,
) -> Result<(), String> {
// Retrieve the server details using the server ID
let server = get_server_by_id(&server_id);
let server = get_server_by_id(&server_id).await;
let expire_in = 3600; // TODO, need to update to actual expire_in value
if let Some(mut server) = server {
// Save the access token for the server
let access_token = ServerAccessToken::new(server_id.clone(), code.clone(), expire_in);
// dbg!(&server_id, &request_id, &code, &token);
save_access_token(server_id.clone(), access_token);
persist_servers_token(&app_handle)?;
save_access_token(server_id.clone(), access_token).await;
persist_servers_token(&app_handle).await?;
// Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await;
@@ -41,7 +41,7 @@ pub async fn handle_sso_callback<R: Runtime>(
Ok(p) => {
server.profile = Some(p);
server.available = true;
save_server(&server);
save_server(&server).await;
persist_servers(&app_handle).await?;
Ok(())
}

View File

@@ -1,11 +1,12 @@
use crate::common::connector::Connector;
use crate::common::search::parse_search_results;
use crate::server::http_client::HttpClient;
use crate::server::http_client::{HttpClient, status_code_check};
use crate::server::servers::get_all_servers;
use http::StatusCode;
use lazy_static::lazy_static;
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
lazy_static! {
static ref CONNECTOR_CACHE: Arc<RwLock<HashMap<String, HashMap<String, Connector>>>> =
@@ -28,8 +29,8 @@ pub fn get_connector_by_id(server_id: &str, connector_id: &str) -> Option<Connec
Some(connector.clone())
}
pub async fn refresh_all_connectors<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let servers = get_all_servers();
pub async fn refresh_all_connectors(app_handle: &AppHandle) -> Result<(), String> {
let servers = get_all_servers().await;
// Collect all the tasks for fetching and refreshing connectors
let mut server_map = HashMap::new();
@@ -107,6 +108,7 @@ pub async fn fetch_connectors_by_server(id: &str) -> Result<Vec<Connector>, Stri
// dbg!("Error fetching connector for id {}: {}", &id, &e);
format!("Error fetching connector: {}", e)
})?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results directly from the response body
let datasource: Vec<Connector> = parse_search_results(resp)
@@ -120,8 +122,8 @@ pub async fn fetch_connectors_by_server(id: &str) -> Result<Vec<Connector>, Stri
}
#[tauri::command]
pub async fn get_connectors_by_server<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn get_connectors_by_server(
_app_handle: AppHandle,
id: String,
) -> Result<Vec<Connector>, String> {
let connectors = fetch_connectors_by_server(&id).await?;

View File

@@ -1,12 +1,13 @@
use crate::common::datasource::DataSource;
use crate::common::search::parse_search_results;
use crate::server::connector::get_connector_by_id;
use crate::server::http_client::HttpClient;
use crate::server::http_client::{HttpClient, status_code_check};
use crate::server::servers::get_all_servers;
use http::StatusCode;
use lazy_static::lazy_static;
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
lazy_static! {
static ref DATASOURCE_CACHE: Arc<RwLock<HashMap<String, HashMap<String, DataSource>>>> =
@@ -25,15 +26,15 @@ pub fn save_datasource_to_cache(server_id: &str, datasources: Vec<DataSource>) {
#[allow(dead_code)]
pub fn get_datasources_from_cache(server_id: &str) -> Option<HashMap<String, DataSource>> {
let cache = DATASOURCE_CACHE.read().unwrap(); // Acquire read lock
// dbg!("cache: {:?}", &cache);
// dbg!("cache: {:?}", &cache);
let server_cache = cache.get(server_id)?; // Get the server's cache
Some(server_cache.clone())
}
pub async fn refresh_all_datasources<R: Runtime>(_app_handle: &AppHandle<R>) -> Result<(), String> {
pub async fn refresh_all_datasources(_app_handle: &AppHandle) -> Result<(), String> {
// dbg!("Attempting to refresh all datasources");
let servers = get_all_servers();
let servers = get_all_servers().await;
let mut server_map = HashMap::new();
@@ -95,6 +96,7 @@ pub async fn datasource_search(
let resp = HttpClient::post(id, "/datasource/_search", query_params, None)
.await
.map_err(|e| format!("Error fetching datasource: {}", e))?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results from the response
let datasources: Vec<DataSource> = parse_search_results(resp).await.map_err(|e| {
@@ -117,6 +119,7 @@ pub async fn mcp_server_search(
let resp = HttpClient::post(id, "/mcp_server/_search", query_params, None)
.await
.map_err(|e| format!("Error fetching datasource: {}", e))?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results from the response
let mcp_server: Vec<DataSource> = parse_search_results(resp).await.map_err(|e| {

View File

@@ -1,16 +1,19 @@
use crate::server::servers::{get_server_by_id, get_server_token};
use http::{HeaderName, HeaderValue};
use crate::util::app_lang::get_app_lang;
use crate::util::platform::Platform;
use http::{HeaderName, HeaderValue, StatusCode};
use once_cell::sync::Lazy;
use reqwest::{Client, Method, RequestBuilder};
use std::collections::HashMap;
use std::sync::LazyLock;
use std::time::Duration;
use tokio::sync::Mutex;
pub(crate) fn new_reqwest_http_client(accept_invalid_certs: bool) -> Client {
Client::builder()
.read_timeout(Duration::from_secs(3)) // Set a timeout of 3 second
.connect_timeout(Duration::from_secs(3)) // Set a timeout of 3 second
.timeout(Duration::from_secs(10)) // Set a timeout of 10 seconds
.read_timeout(Duration::from_secs(60)) // Set a read timeout of 60 seconds
.connect_timeout(Duration::from_secs(30)) // Set a connect timeout of 30 seconds
.timeout(Duration::from_secs(5 * 60)) // Set a total timeout of 5 minutes
.danger_accept_invalid_certs(accept_invalid_certs) // allow self-signed certificates
.build()
.expect("Failed to build client")
@@ -26,6 +29,26 @@ pub static HTTP_CLIENT: Lazy<Mutex<Client>> = Lazy::new(|| {
Mutex::new(new_reqwest_http_client(allow_self_signature))
});
/// These header values won't change during a process's lifetime.
static STATIC_HEADERS: LazyLock<HashMap<String, String>> = LazyLock::new(|| {
HashMap::from([
(
"X-OS-NAME".into(),
Platform::current()
.to_os_name_http_header_str()
.into_owned(),
),
(
"X-OS-VER".into(),
sysinfo::System::os_version()
.expect("sysinfo::System::os_version() should be Some on major systems"),
),
("X-OS-ARCH".into(), sysinfo::System::cpu_arch()),
("X-APP-NAME".into(), "coco-app".into()),
("X-APP-VER".into(), env!("CARGO_PKG_VERSION").into()),
])
});
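// For illustration, every request built by `HttpClient` ends up carrying a
// header set shaped like the following (values hypothetical; `X-APP-LANG` is
// added per request further below):
//
//   X-OS-NAME: macos
//   X-OS-VER: 15.0
//   X-OS-ARCH: arm64
//   X-APP-NAME: coco-app
//   X-APP-VER: 0.8.0
//   X-APP-LANG: en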
pub struct HttpClient;
impl HttpClient {
@@ -81,8 +104,32 @@ impl HttpClient {
// Build the request
let mut request_builder = client.request(method.clone(), url);
// Populate the headers defined by us
let mut req_headers = reqwest::header::HeaderMap::new();
for (key, value) in STATIC_HEADERS.iter() {
let key = HeaderName::from_bytes(key.as_bytes())
.expect("headers defined by us should be valid");
let value = HeaderValue::from_str(value.trim()).unwrap_or_else(|e| {
panic!(
"header value [{}] is invalid, error [{}], this should be unreachable",
value, e
);
});
req_headers.insert(key, value);
}
let app_lang = get_app_lang().await.to_string();
req_headers.insert(
"X-APP-LANG",
HeaderValue::from_str(&app_lang).unwrap_or_else(|e| {
panic!(
"header value [{}] is invalid, error [{}], this should be unreachable",
app_lang, e
);
}),
);
// Headers from the function parameter
if let Some(h) = headers {
let mut req_headers = reqwest::header::HeaderMap::new();
for (key, value) in h.into_iter() {
match (
HeaderName::from_bytes(key.as_bytes()),
@@ -106,10 +153,8 @@ impl HttpClient {
}
if let Some(params) = query_params {
let query: Vec<(&str, &str)> = params
.iter()
.filter_map(|s| s.split_once('='))
.collect();
let query: Vec<(&str, &str)> =
params.iter().filter_map(|s| s.split_once('=')).collect();
request_builder = request_builder.query(&query);
}
@@ -121,7 +166,6 @@ impl HttpClient {
request_builder
}
pub async fn send_request(
server_id: &str,
method: Method,
@@ -131,14 +175,14 @@ impl HttpClient {
body: Option<reqwest::Body>,
) -> Result<reqwest::Response, String> {
// Fetch the server using the server_id
let server = get_server_by_id(server_id);
let server = get_server_by_id(server_id).await;
if let Some(s) = server {
// Construct the URL
let url = HttpClient::join_url(&s.endpoint, path);
// Retrieve the token for the server (token is optional)
let token = get_server_token(server_id)
.await?
.await
.map(|t| t.access_token.clone());
let mut headers = if let Some(custom_headers) = custom_headers {
@@ -161,7 +205,7 @@ impl HttpClient {
Self::send_raw_request(method, &url, query_params, Some(headers), body).await
} else {
Err("Server not found".to_string())
Err(format!("Server [{}] not found", server_id))
}
}
@@ -171,8 +215,7 @@ impl HttpClient {
path: &str,
query_params: Option<Vec<String>>,
) -> Result<reqwest::Response, String> {
HttpClient::send_request(server_id, Method::GET, path, None, query_params,
None).await
HttpClient::send_request(server_id, Method::GET, path, None, query_params, None).await
}
// Convenience method for POST requests
@@ -200,7 +243,7 @@ impl HttpClient {
query_params,
body,
)
.await
.await
}
// Convenience method for PUT requests
@@ -220,7 +263,7 @@ impl HttpClient {
query_params,
body,
)
.await
.await
}
// Convenience method for DELETE requests
@@ -239,6 +282,33 @@ impl HttpClient {
query_params,
None,
)
.await
.await
}
}
/// Helper function to check status code.
///
/// If the status code is not in the `allowed_status_codes` list, return an error.
pub(crate) fn status_code_check(
response: &reqwest::Response,
allowed_status_codes: &[StatusCode],
) -> Result<(), String> {
let status_code = response.status();
if !allowed_status_codes.contains(&status_code) {
let msg = format!(
"Response of request [{}] status code failed: status code [{}], which is not in the 'allow' list {:?}",
response.url(),
status_code,
allowed_status_codes
.iter()
.map(|status| status.to_string())
.collect::<Vec<String>>()
);
log::warn!("{}", msg);
Err(msg)
} else {
Ok(())
}
}
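A hypothetical call-site fragment, mirroring how the connector and datasource fetches in this change use the helper: run the check right after the request succeeds and before parsing the body.
let resp = HttpClient::post(&server_id, "/datasource/_search", None, None)
    .await
    .map_err(|e| format!("Error fetching datasource: {}", e))?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
let datasources: Vec<DataSource> = parse_search_results(resp)
    .await
    .map_err(|e| format!("Failed to parse datasources: {}", e))?;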

View File

@@ -8,6 +8,6 @@ pub mod http_client;
pub mod profile;
pub mod search;
pub mod servers;
pub mod synthesize;
pub mod system_settings;
pub mod transcription;
pub mod websocket;

View File

@@ -1,11 +1,11 @@
use crate::common::http::get_response_body_text;
use crate::common::profile::UserProfile;
use crate::server::http_client::HttpClient;
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
#[tauri::command]
pub async fn get_user_profiles<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn get_user_profiles(
_app_handle: AppHandle,
server_id: String,
) -> Result<UserProfile, String> {
// Use the generic GET method from HttpClient

View File

@@ -6,10 +6,10 @@ use crate::common::server::Server;
use crate::common::traits::SearchSource;
use crate::server::http_client::HttpClient;
use async_trait::async_trait;
// use futures::stream::StreamExt;
use ordered_float::OrderedFloat;
use reqwest::StatusCode;
use std::collections::HashMap;
// use std::hash::Hash;
use tauri::AppHandle;
#[allow(dead_code)]
pub(crate) struct DocumentsSizedCollector {
@@ -44,7 +44,7 @@ impl DocumentsSizedCollector {
}
}
fn documents(self) -> impl ExactSizeIterator<Item=Document> {
fn documents(self) -> impl ExactSizeIterator<Item = Document> {
self.docs.into_iter().map(|(_, doc, _)| doc)
}
@@ -90,7 +90,11 @@ impl SearchSource for CocoSearchSource {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let url = "/query/_search";
let mut total_hits = 0;
let mut hits: Vec<(Document, f64)> = Vec::new();
@@ -108,7 +112,18 @@ impl SearchSource for CocoSearchSource {
let response = HttpClient::get(&self.server.id, &url, Some(query_params))
.await
.map_err(|e| SearchError::HttpError(format!("{}", e)))?;
.map_err(|e| SearchError::HttpError {
status_code: None,
msg: format!("{}", e),
})?;
let status_code = response.status();
if ![StatusCode::OK, StatusCode::CREATED].contains(&status_code) {
return Err(SearchError::HttpError {
status_code: Some(status_code),
msg: format!("Request failed with status code [{}]", status_code),
});
}
// Use the helper function to parse the response body
let response_body = get_response_body_text(response)
@@ -123,7 +138,6 @@ impl SearchSource for CocoSearchSource {
let parsed: SearchResponse<Document> = serde_json::from_str(&response_body)
.map_err(|e| SearchError::ParseError(format!("{}", e)))?;
// Process the parsed response
total_hits = parsed.hits.total.value as usize;

View File

@@ -1,3 +1,4 @@
use crate::COCO_TAURI_STORE;
use crate::common::http::get_response_body_text;
use crate::common::register::SearchSourceRegistry;
use crate::common::server::{AuthProvider, Provider, Server, ServerAccessToken, Sso, Version};
@@ -5,68 +6,71 @@ use crate::server::connector::fetch_connectors_by_server;
use crate::server::datasource::datasource_search;
use crate::server::http_client::HttpClient;
use crate::server::search::CocoSearchSource;
use crate::COCO_TAURI_STORE;
use lazy_static::lazy_static;
use function_name;
use http::StatusCode;
use reqwest::Method;
use serde_json::from_value;
use serde_json::Value as JsonValue;
use serde_json::from_value;
use std::collections::HashMap;
use std::sync::Arc;
use std::sync::RwLock;
use tauri::Runtime;
use std::sync::LazyLock;
use tauri::{AppHandle, Manager};
use tauri_plugin_store::StoreExt;
// Assuming you're using serde_json
use tokio::sync::RwLock;
lazy_static! {
static ref SERVER_CACHE: Arc<RwLock<HashMap<String, Server>>> =
Arc::new(RwLock::new(HashMap::new()));
static ref SERVER_TOKEN: Arc<RwLock<HashMap<String, ServerAccessToken>>> =
Arc::new(RwLock::new(HashMap::new()));
}
/// Coco server list
static SERVER_LIST_CACHE: LazyLock<RwLock<HashMap<String, Server>>> =
LazyLock::new(|| RwLock::new(HashMap::new()));
#[allow(dead_code)]
fn check_server_exists(id: &str) -> bool {
let cache = SERVER_CACHE.read().unwrap(); // Acquire read lock
cache.contains_key(id)
}
/// If a server has a token stored here that has not expired, it is considered logged in.
///
/// Since the `expire_at` field of `struct ServerAccessToken` is currently unused,
/// all servers stored here are treated as logged in.
static SERVER_TOKEN_LIST_CACHE: LazyLock<RwLock<HashMap<String, ServerAccessToken>>> =
LazyLock::new(|| RwLock::new(HashMap::new()));
pub fn get_server_by_id(id: &str) -> Option<Server> {
let cache = SERVER_CACHE.read().unwrap(); // Acquire read lock
/// `SERVER_LIST_CACHE` will be stored in KV store COCO_TAURI_STORE, under this key.
pub const COCO_SERVERS: &str = "coco_servers";
/// `SERVER_TOKEN_LIST_CACHE` will be stored in KV store COCO_TAURI_STORE, under this key.
const COCO_SERVER_TOKENS: &str = "coco_server_tokens";
pub async fn get_server_by_id(id: &str) -> Option<Server> {
let cache = SERVER_LIST_CACHE.read().await;
cache.get(id).cloned()
}
#[tauri::command]
pub async fn get_server_token(id: &str) -> Result<Option<ServerAccessToken>, String> {
let cache = SERVER_TOKEN.read().map_err(|err| err.to_string())?;
pub async fn get_server_token(id: &str) -> Option<ServerAccessToken> {
let cache = SERVER_TOKEN_LIST_CACHE.read().await;
Ok(cache.get(id).cloned())
cache.get(id).cloned()
}
pub fn save_access_token(server_id: String, token: ServerAccessToken) -> bool {
let mut cache = SERVER_TOKEN.write().unwrap();
pub async fn save_access_token(server_id: String, token: ServerAccessToken) -> bool {
let mut cache = SERVER_TOKEN_LIST_CACHE.write().await;
cache.insert(server_id, token).is_none()
}
fn check_endpoint_exists(endpoint: &str) -> bool {
let cache = SERVER_CACHE.read().unwrap();
async fn check_endpoint_exists(endpoint: &str) -> bool {
let cache = SERVER_LIST_CACHE.read().await;
cache.values().any(|server| server.endpoint == endpoint)
}
pub fn save_server(server: &Server) -> bool {
let mut cache = SERVER_CACHE.write().unwrap();
cache.insert(server.id.clone(), server.clone()).is_none() // If the server id did not exist, `insert` will return `None`
/// Return true if `server` does not exist in the server list, i.e., it is a newly-added
/// server.
pub async fn save_server(server: &Server) -> bool {
let mut cache = SERVER_LIST_CACHE.write().await;
cache.insert(server.id.clone(), server.clone()).is_none()
}
fn remove_server_by_id(id: String) -> bool {
/// Return the removed `Server` if it exists in the server list.
async fn remove_server_by_id(id: &str) -> Option<Server> {
log::debug!("remove server by id: {}", &id);
let mut cache = SERVER_CACHE.write().unwrap();
let deleted = cache.remove(id.as_str());
deleted.is_some()
let mut cache = SERVER_LIST_CACHE.write().await;
cache.remove(id)
}
pub async fn persist_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let cache = SERVER_CACHE.read().unwrap(); // Acquire a read lock, not a write lock, since you're not modifying the cache
pub async fn persist_servers(app_handle: &AppHandle) -> Result<(), String> {
let cache = SERVER_LIST_CACHE.read().await;
// Convert HashMap to Vec for serialization (iterating over values of HashMap)
let servers: Vec<Server> = cache.values().cloned().collect();
@@ -86,14 +90,16 @@ pub async fn persist_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<()
Ok(())
}
pub fn remove_server_token(id: &str) -> bool {
/// Return true if the server token of the server specified by `id` exists in
/// the token list and gets deleted.
pub async fn remove_server_token(id: &str) -> bool {
log::debug!("remove server token by id: {}", &id);
let mut cache = SERVER_TOKEN.write().unwrap();
let mut cache = SERVER_TOKEN_LIST_CACHE.write().await;
cache.remove(id).is_some()
}
pub fn persist_servers_token<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let cache = SERVER_TOKEN.read().unwrap(); // Acquire a read lock, not a write lock, since you're not modifying the cache
pub async fn persist_servers_token(app_handle: &AppHandle) -> Result<(), String> {
let cache = SERVER_TOKEN_LIST_CACHE.read().await;
// Convert HashMap to Vec for serialization (iterating over values of HashMap)
let servers: Vec<ServerAccessToken> = cache.values().cloned().collect();
@@ -151,9 +157,7 @@ fn get_default_server() -> Server {
}
}
pub async fn load_servers_token<R: Runtime>(
app_handle: &AppHandle<R>,
) -> Result<Vec<ServerAccessToken>, String> {
pub async fn load_servers_token(app_handle: &AppHandle) -> Result<Vec<ServerAccessToken>, String> {
log::debug!("Attempting to load servers token");
let store = app_handle
@@ -173,30 +177,46 @@ pub async fn load_servers_token<R: Runtime>(
servers.ok_or_else(|| "Failed to read servers from store: No servers found".to_string())?;
// Convert each item in the JsonValue array to a Server
if let JsonValue::Array(servers_array) = servers {
// Deserialize each JsonValue into Server, filtering out any errors
let deserialized_tokens: Vec<ServerAccessToken> = servers_array
.into_iter()
.filter_map(|server_json| from_value(server_json).ok()) // Only keep valid Server instances
.collect();
match servers {
JsonValue::Array(servers_array) => {
let mut deserialized_tokens: Vec<ServerAccessToken> =
Vec::with_capacity(servers_array.len());
for server_json in servers_array {
match from_value(server_json.clone()) {
Ok(token) => {
deserialized_tokens.push(token);
}
Err(e) => {
panic!(
"failed to deserialize JSON [{}] to [struct ServerAccessToken], error [{}], store [{}] key [{}] is possibly corrupted!",
server_json, e, COCO_TAURI_STORE, COCO_SERVER_TOKENS
);
}
}
}
if deserialized_tokens.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
if deserialized_tokens.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
}
for server in deserialized_tokens.iter() {
save_access_token(server.id.clone(), server.clone()).await;
}
log::debug!("loaded {:?} servers's token", &deserialized_tokens.len());
Ok(deserialized_tokens)
}
for server in deserialized_tokens.iter() {
save_access_token(server.id.clone(), server.clone());
_ => {
unreachable!(
"coco server tokens should be stored in an array under store [{}] key [{}], but it is not",
COCO_TAURI_STORE, COCO_SERVER_TOKENS
);
}
log::debug!("loaded {:?} servers's token", &deserialized_tokens.len());
Ok(deserialized_tokens)
} else {
Err("Failed to read servers from store: Invalid format".to_string())
}
}
pub async fn load_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<Server>, String> {
pub async fn load_servers(app_handle: &AppHandle) -> Result<Vec<Server>, String> {
let store = app_handle
.store(COCO_TAURI_STORE)
.expect("create or load a store should not fail");
@@ -214,33 +234,46 @@ pub async fn load_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<S
servers.ok_or_else(|| "Failed to read servers from store: No servers found".to_string())?;
// Convert each item in the JsonValue array to a Server
if let JsonValue::Array(servers_array) = servers {
// Deserialize each JsonValue into Server, filtering out any errors
let deserialized_servers: Vec<Server> = servers_array
.into_iter()
.filter_map(|server_json| from_value(server_json).ok()) // Only keep valid Server instances
.collect();
match servers {
JsonValue::Array(servers_array) => {
let mut deserialized_servers = Vec::with_capacity(servers_array.len());
for server_json in servers_array {
match from_value(server_json.clone()) {
Ok(server) => {
deserialized_servers.push(server);
}
Err(e) => {
panic!(
"failed to deserialize JSON [{}] to [struct Server], error [{}], store [{}] key [{}] is possibly corrupted!",
server_json, e, COCO_TAURI_STORE, COCO_SERVERS
);
}
}
}
if deserialized_servers.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
if deserialized_servers.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
}
for server in deserialized_servers.iter() {
save_server(&server).await;
}
log::debug!("load servers: {:?}", &deserialized_servers);
Ok(deserialized_servers)
}
for server in deserialized_servers.iter() {
save_server(&server);
_ => {
unreachable!(
"coco servers should be stored in an array under store [{}] key [{}], but it is not",
COCO_TAURI_STORE, COCO_SERVERS
);
}
log::debug!("load servers: {:?}", &deserialized_servers);
Ok(deserialized_servers)
} else {
Err("Failed to read servers from store: Invalid format".to_string())
}
}
/// Function to load servers or insert a default one if none exist
pub async fn load_or_insert_default_server<R: Runtime>(
app_handle: &AppHandle<R>,
) -> Result<Vec<Server>, String> {
pub async fn load_or_insert_default_server(app_handle: &AppHandle) -> Result<Vec<Server>, String> {
log::debug!("Attempting to load or insert default server");
let exists_servers = load_servers(&app_handle).await;
@@ -250,7 +283,7 @@ pub async fn load_or_insert_default_server<R: Runtime>(
}
let default = get_default_server();
save_server(&default);
save_server(&default).await;
log::debug!("loaded default servers");
@@ -258,47 +291,32 @@ pub async fn load_or_insert_default_server<R: Runtime>(
}
#[tauri::command]
pub async fn list_coco_servers<R: Runtime>(
_app_handle: AppHandle<R>,
) -> Result<Vec<Server>, String> {
pub async fn list_coco_servers(app_handle: AppHandle) -> Result<Vec<Server>, String> {
// Hard refresh all servers' info in order to get their actual health
refresh_all_coco_server_info(_app_handle.clone()).await;
refresh_all_coco_server_info(app_handle.clone()).await;
let servers: Vec<Server> = get_all_servers().await;
let servers: Vec<Server> = get_all_servers();
Ok(servers)
}
#[allow(dead_code)]
pub fn get_servers_as_hashmap() -> HashMap<String, Server> {
let cache = SERVER_CACHE.read().unwrap();
cache.clone()
}
pub fn get_all_servers() -> Vec<Server> {
let cache = SERVER_CACHE.read().unwrap();
pub async fn get_all_servers() -> Vec<Server> {
let cache = SERVER_LIST_CACHE.read().await;
cache.values().cloned().collect()
}
/// We store added Coco servers in the Tauri store using this key.
pub const COCO_SERVERS: &str = "coco_servers";
const COCO_SERVER_TOKENS: &str = "coco_server_tokens";
pub async fn refresh_all_coco_server_info<R: Runtime>(app_handle: AppHandle<R>) {
let servers = get_all_servers();
pub async fn refresh_all_coco_server_info(app_handle: AppHandle) {
let servers = get_all_servers().await;
for server in servers {
let _ = refresh_coco_server_info(app_handle.clone(), server.id.clone()).await;
}
}
#[tauri::command]
pub async fn refresh_coco_server_info<R: Runtime>(
app_handle: AppHandle<R>,
id: String,
) -> Result<Server, String> {
pub async fn refresh_coco_server_info(app_handle: AppHandle, id: String) -> Result<Server, String> {
// Retrieve the server from the cache
let cached_server = {
let cache = SERVER_CACHE.read().unwrap();
let cache = SERVER_LIST_CACHE.read().await;
cache.get(&id).cloned()
};
@@ -313,19 +331,16 @@ pub async fn refresh_coco_server_info<R: Runtime>(
let profile = server.profile;
// Send request to fetch updated server info
let response = HttpClient::get(&id, "/provider/_info", None)
.await
.map_err(|e| format!("Failed to contact the server: {}", e));
if response.is_err() {
let _ = mark_server_as_offline(app_handle, &id).await;
return Err(response.err().unwrap());
}
let response = response?;
let response = match HttpClient::get(&id, "/provider/_info", None).await {
Ok(response) => response,
Err(e) => {
mark_server_as_offline(app_handle, &id).await;
return Err(e);
}
};
if !response.status().is_success() {
let _ = mark_server_as_offline(app_handle, &id).await;
mark_server_as_offline(app_handle, &id).await;
return Err(format!("Request failed with status: {}", response.status()));
}
@@ -336,19 +351,26 @@ pub async fn refresh_coco_server_info<R: Runtime>(
let mut updated_server: Server = serde_json::from_str(&body)
.map_err(|e| format!("Failed to deserialize the response: {}", e))?;
// Mark server as online
let _ = mark_server_as_online(app_handle.clone(), &id).await;
// Restore local state
updated_server.id = id.clone();
updated_server.builtin = is_builtin;
updated_server.enabled = is_enabled;
updated_server.available = true;
updated_server.available = {
if server.public {
// Public Coco servers are available as long as they are online.
true
} else {
// For a non-public Coco server, we still need to check whether it is
// logged in, i.e., has a token stored in `SERVER_TOKEN_LIST_CACHE`.
get_server_token(&id).await.is_some()
}
};
updated_server.profile = profile;
trim_endpoint_last_forward_slash(&mut updated_server);
// Save and persist
save_server(&updated_server);
save_server(&updated_server).await;
try_register_server_to_search_source(app_handle.clone(), &updated_server).await;
persist_servers(&app_handle)
.await
.map_err(|e| format!("Failed to persist servers: {}", e))?;
@@ -361,20 +383,17 @@ pub async fn refresh_coco_server_info<R: Runtime>(
}
#[tauri::command]
pub async fn add_coco_server<R: Runtime>(
app_handle: AppHandle<R>,
endpoint: String,
) -> Result<Server, String> {
pub async fn add_coco_server(app_handle: AppHandle, endpoint: String) -> Result<Server, String> {
load_or_insert_default_server(&app_handle)
.await
.map_err(|e| format!("Failed to load default servers: {}", e))?;
let endpoint = endpoint.trim_end_matches('/');
if check_endpoint_exists(endpoint) {
if check_endpoint_exists(endpoint).await {
log::debug!(
"This Coco server has already been registered: {:?}",
&endpoint
"trying to register a Coco server [{}] that has already been registered",
endpoint
);
return Err("This Coco server has already been registered.".into());
}
@@ -386,6 +405,15 @@ pub async fn add_coco_server<R: Runtime>(
log::debug!("Get provider info response: {:?}", &response);
if response.status() != StatusCode::OK {
log::debug!(
"trying to register a Coco server [{}] that is possibly down",
endpoint
);
return Err("This Coco server is possibly down".into());
}
let body = get_response_body_text(response).await?;
let mut server: Server = serde_json::from_str(&body)
@@ -393,15 +421,32 @@ pub async fn add_coco_server<R: Runtime>(
trim_endpoint_last_forward_slash(&mut server);
// The JSON returned from `provider/_info` won't have this field; serde will set
// it to an empty string during deserialization, so we need to set a valid value here.
if server.id.is_empty() {
server.id = pizza_common::utils::uuid::Uuid::new().to_string();
}
// Use the default name, if it is not set.
if server.name.is_empty() {
server.name = "Coco Server".to_string();
}
save_server(&server);
// Update the `available` field
if server.public {
// Serde already sets this to true, but just to make the code clear, do it again.
server.available = true;
} else {
let opt_token = get_server_token(&server.id).await;
assert!(
opt_token.is_none(),
"this Coco server is newly-added, we should have no token stored for it!"
);
// This is a non-public Coco server, and it is not logged in, so it is unavailable.
server.available = false;
}
save_server(&server).await;
try_register_server_to_search_source(app_handle.clone(), &server).await;
persist_servers(&app_handle)
@@ -413,58 +458,106 @@ pub async fn add_coco_server<R: Runtime>(
}
#[tauri::command]
pub async fn remove_coco_server<R: Runtime>(
app_handle: AppHandle<R>,
id: String,
) -> Result<(), ()> {
#[function_name::named]
pub async fn remove_coco_server(app_handle: AppHandle, id: String) -> Result<(), ()> {
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
remove_server_token(id.as_str());
remove_server_by_id(id);
let opt_server = remove_server_by_id(id.as_str()).await;
let Some(server) = opt_server else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
persist_servers(&app_handle)
.await
.expect("failed to save servers");
persist_servers_token(&app_handle).expect("failed to save server tokens");
// Only non-public Coco servers require tokens
if !server.public {
// If it is logged in, clear the token as well.
let deleted = remove_server_token(id.as_str()).await;
if deleted {
persist_servers_token(&app_handle)
.await
.expect("failed to save server tokens");
}
}
Ok(())
}
#[tauri::command]
pub async fn enable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
println!("enable_server: {}", id);
#[function_name::named]
pub async fn enable_server(app_handle: AppHandle, id: String) -> Result<(), ()> {
let opt_server = get_server_by_id(id.as_str()).await;
let server = get_server_by_id(id.as_str());
if let Some(mut server) = server {
server.enabled = true;
save_server(&server);
let Some(mut server) = opt_server else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
// Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await;
server.enabled = true;
save_server(&server).await;
persist_servers(&app_handle)
.await
.expect("failed to save servers");
}
// Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await;
persist_servers(&app_handle)
.await
.expect("failed to save servers");
Ok(())
}
pub async fn try_register_server_to_search_source(
app_handle: AppHandle<impl Runtime>,
server: &Server,
) {
#[tauri::command]
#[function_name::named]
pub async fn disable_server(app_handle: AppHandle, id: String) -> Result<(), ()> {
let opt_server = get_server_by_id(id.as_str()).await;
let Some(mut server) = opt_server else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
server.enabled = false;
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
save_server(&server).await;
persist_servers(&app_handle)
.await
.expect("failed to save servers");
Ok(())
}
/// For public Coco servers, we add them to the search source as long as they are
/// enabled.
///
/// For non-public Coco servers, an extra token (i.e., being logged in) is required.
pub async fn try_register_server_to_search_source(app_handle: AppHandle, server: &Server) {
if server.enabled {
log::trace!(
"Server {} is public: {} and available: {}",
"Server [name: {}, id: {}] is public: {} and available: {}",
&server.name,
&server.id,
&server.public,
&server.available
);
if !server.public {
let token = get_server_token(&server.id).await;
let opt_token = get_server_token(&server.id).await;
if !token.is_ok() || token.is_ok() && token.unwrap().is_none() {
if opt_token.is_none() {
log::debug!("Server {} is not public and no token was found", &server.id);
return;
}
@@ -476,113 +569,107 @@ pub async fn try_register_server_to_search_source(
}
}
#[tauri::command]
pub async fn mark_server_as_online<R: Runtime>(
app_handle: AppHandle<R>, id: &str) -> Result<(), ()> {
// println!("server_is_offline: {}", id);
let server = get_server_by_id(id);
#[function_name::named]
#[allow(unused)]
async fn mark_server_as_online(app_handle: AppHandle, id: &str) {
let server = get_server_by_id(id).await;
if let Some(mut server) = server {
server.available = true;
server.health = None;
save_server(&server);
save_server(&server).await;
try_register_server_to_search_source(app_handle.clone(), &server).await;
} else {
log::warn!(
"[{}()] invoked with a server [{}] that does not exist!",
function_name!(),
id
);
}
Ok(())
}
#[tauri::command]
pub async fn mark_server_as_offline<R: Runtime>(
app_handle: AppHandle<R>,
id: &str,
) -> Result<(), ()> {
// println!("server_is_offline: {}", id);
let server = get_server_by_id(id);
#[function_name::named]
pub(crate) async fn mark_server_as_offline(app_handle: AppHandle, id: &str) {
let server = get_server_by_id(id).await;
if let Some(mut server) = server {
server.available = false;
server.health = None;
save_server(&server);
save_server(&server).await;
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id).await;
} else {
log::warn!(
"[{}()] invoked with a server [{}] that does not exist!",
function_name!(),
id
);
}
Ok(())
}
#[tauri::command]
pub async fn disable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
let server = get_server_by_id(id.as_str());
if let Some(mut server) = server {
server.enabled = false;
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
save_server(&server);
persist_servers(&app_handle)
.await
.expect("failed to save servers");
}
Ok(())
}
#[tauri::command]
pub async fn logout_coco_server<R: Runtime>(
app_handle: AppHandle<R>,
id: String,
) -> Result<(), String> {
#[function_name::named]
pub async fn logout_coco_server(app_handle: AppHandle, id: String) -> Result<(), String> {
log::debug!("Attempting to log out server by id: {}", &id);
// Check if server token exists
if let Some(_token) = get_server_token(id.as_str()).await? {
log::debug!("Found server token for id: {}", &id);
// Check if the server exists
let Some(mut server) = get_server_by_id(id.as_str()).await else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
// Clear server profile
server.profile = None;
// Logging out from a non-public Coco server makes it unavailable
if !server.public {
server.available = false;
}
// Save the updated server data
save_server(&server).await;
// Persist the updated server data
if let Err(e) = persist_servers(&app_handle).await {
log::debug!("Failed to save server for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save server: {}", &e));
}
let has_token = get_server_token(id.as_str()).await.is_some();
if server.public {
if has_token {
panic!("Public Coco server won't have token")
}
} else {
assert!(
has_token,
"This is a non-public Coco server, and it is logged in, we should have a token"
);
// Remove the server token from cache
remove_server_token(id.as_str());
remove_server_token(id.as_str()).await;
// Persist the updated tokens
if let Err(e) = persist_servers_token(&app_handle) {
if let Err(e) = persist_servers_token(&app_handle).await {
log::debug!("Failed to save tokens for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save tokens: {}", &e));
}
} else {
// Log the case where server token is not found
log::debug!("No server token found for id: {}", &id);
}
// Check if the server exists
if let Some(mut server) = get_server_by_id(id.as_str()) {
log::debug!("Found server for id: {}", &id);
// Clear server profile
server.profile = None;
let _ = mark_server_as_offline(app_handle.clone(), id.as_str()).await;
// Save the updated server data
save_server(&server);
// Persist the updated server data
if let Err(e) = persist_servers(&app_handle).await {
log::debug!("Failed to save server for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save server: {}", &e));
}
} else {
// Log the case where server is not found
log::debug!("No server found for id: {}", &id);
return Err(format!("No server found for id: {}", id));
// Remove it from the search source if it becomes unavailable
if !server.available {
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
}
log::debug!("Successfully logged out server with id: {}", &id);
Ok(())
}
/// Removes the trailing slash from the server's endpoint if present.
/// Helper function to remove any trailing slashes from the server's endpoint.
fn trim_endpoint_last_forward_slash(server: &mut Server) {
if server.endpoint.ends_with('/') {
server.endpoint.pop(); // Remove the last character
while server.endpoint.ends_with('/') {
server.endpoint.pop();
}
let endpoint = &mut server.endpoint;
while endpoint.ends_with('/') {
endpoint.pop();
}
}
@@ -591,42 +678,47 @@ fn provider_info_url(endpoint: &str) -> String {
format!("{endpoint}/provider/_info")
}
#[test]
fn test_trim_endpoint_last_forward_slash() {
let mut server = Server {
id: "test".to_string(),
builtin: false,
enabled: true,
name: "".to_string(),
endpoint: "https://example.com///".to_string(),
provider: Provider {
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_trim_endpoint_last_forward_slash() {
let mut server = Server {
id: "test".to_string(),
builtin: false,
enabled: true,
name: "".to_string(),
icon: "".to_string(),
website: "".to_string(),
eula: "".to_string(),
privacy_policy: "".to_string(),
banner: "".to_string(),
description: "".to_string(),
},
version: Version {
number: "".to_string(),
},
minimal_client_version: None,
updated: "".to_string(),
public: false,
available: false,
health: None,
profile: None,
auth_provider: AuthProvider {
sso: Sso {
url: "".to_string(),
endpoint: "https://example.com///".to_string(),
provider: Provider {
name: "".to_string(),
icon: "".to_string(),
website: "".to_string(),
eula: "".to_string(),
privacy_policy: "".to_string(),
banner: "".to_string(),
description: "".to_string(),
},
},
priority: 0,
stats: None,
};
version: Version {
number: "".to_string(),
},
minimal_client_version: None,
updated: "".to_string(),
public: false,
available: false,
health: None,
profile: None,
auth_provider: AuthProvider {
sso: Sso {
url: "".to_string(),
},
},
priority: 0,
stats: None,
};
trim_endpoint_last_forward_slash(&mut server);
trim_endpoint_last_forward_slash(&mut server);
assert_eq!(server.endpoint, "https://example.com");
assert_eq!(server.endpoint, "https://example.com");
}
}

View File

@@ -0,0 +1,57 @@
use crate::server::http_client::HttpClient;
use futures_util::StreamExt;
use http::Method;
use serde_json::json;
use tauri::{AppHandle, Emitter, command};
#[command]
pub async fn synthesize(
app_handle: AppHandle,
client_id: String,
server_id: String,
voice: String,
content: String,
) -> Result<(), String> {
let body = json!({
"voice": voice,
"content": content,
})
.to_string();
let response = HttpClient::send_request(
server_id.as_str(),
Method::POST,
"/services/audio/synthesize",
None,
None,
Some(reqwest::Body::from(body.to_string())),
)
.await?;
log::info!("Synthesize response status: {}", response.status());
if response.status() == 429 {
return Ok(());
}
if !response.status().is_success() {
return Err(format!("Request Failed: {}", response.status()));
}
let mut stream = response.bytes_stream();
while let Some(chunk) = stream.next().await {
match chunk {
Ok(bytes) => {
if let Err(err) = app_handle.emit(&client_id, bytes.to_vec()) {
log::error!("Emit error: {:?}", err);
}
}
Err(e) => {
log::error!("Stream error: {:?}", e);
break;
}
}
}
Ok(())
}
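For context, a minimal sketch of how a frontend caller might consume this command: it listens on the `client_id` event for the emitted byte chunks and then invokes `synthesize`. The event name and argument shapes follow the Rust signature above; the helper name is hypothetical and the imports assume the Tauri v2 JavaScript API already used elsewhere in this repo.

import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";

// Hypothetical helper: collect the synthesized audio chunks emitted on `clientId`.
async function synthesizeToBuffer(serverId: string, voice: string, content: string) {
  const clientId = `tts-${Date.now()}`; // any unique id works; the backend echoes it back as the event name
  const chunks: number[][] = [];

  // The Rust side emits each Vec<u8> chunk on the event named after client_id.
  const unlisten = await listen<number[]>(clientId, (event) => {
    chunks.push(event.payload);
  });

  try {
    // Argument keys are camelCase; Tauri maps them to the snake_case Rust parameters.
    await invoke("synthesize", { clientId, serverId, voice, content });
  } finally {
    unlisten();
  }

  return new Uint8Array(chunks.flat());
}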

View File

@@ -1,41 +1,96 @@
use crate::common::http::get_response_body_text;
use crate::server::http_client::HttpClient;
use serde::{Deserialize, Serialize};
use serde_json::{Value, from_str};
use tauri::command;
#[derive(Debug, Serialize, Deserialize)]
pub struct TranscriptionResponse {
pub text: String,
task_id: String,
results: Vec<Value>,
}
#[command]
pub async fn transcription(
server_id: String,
_audio_type: String,
_audio_content: String,
audio_content: String,
) -> Result<TranscriptionResponse, String> {
// let mut query_params = HashMap::new();
// query_params.insert("type".to_string(), JsonValue::String(audio_type));
// query_params.insert("content".to_string(), JsonValue::String(audio_content));
// Send the HTTP POST request
let response = HttpClient::post(
// Send request to initiate transcription task
let init_response = HttpClient::post(
&server_id,
"/services/audio/transcription",
None,
None,
Some(audio_content.into()),
)
.await
.map_err(|e| format!("Error sending transcription request: {}", e))?;
.await
.map_err(|e| format!("Failed to initiate transcription: {}", e))?;
// Use get_response_body_text to extract the response body as text
let response_body = get_response_body_text(response)
// Extract response body as text
let init_response_text = get_response_body_text(init_response)
.await
.map_err(|e| format!("Failed to read response body: {}", e))?;
.map_err(|e| format!("Failed to read initial response body: {}", e))?;
// Deserialize the response body into TranscriptionResponse
let transcription_response: TranscriptionResponse = serde_json::from_str(&response_body)
.map_err(|e| format!("Failed to parse transcription response: {}", e))?;
// Parse response JSON to extract task ID
let init_response_json: Value = from_str(&init_response_text).map_err(|e| {
format!(
"Failed to parse initial response JSON: {}. Raw response: {}",
e, init_response_text
)
})?;
let transcription_task_id = init_response_json["task_id"]
.as_str()
.ok_or_else(|| {
format!(
"Missing or invalid task_id in initial response: {}",
init_response_text
)
})?
.to_string();
// Set up polling with timeout
let polling_start = std::time::Instant::now();
const POLLING_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(30);
const POLLING_INTERVAL: std::time::Duration = std::time::Duration::from_millis(200);
let mut transcription_response: TranscriptionResponse;
loop {
// Poll for transcription results
let poll_response = HttpClient::get(
&server_id,
&format!("/services/audio/task/{}", transcription_task_id),
None,
)
.await
.map_err(|e| format!("Failed to poll transcription task: {}", e))?;
// Extract poll response body
let poll_response_text = get_response_body_text(poll_response)
.await
.map_err(|e| format!("Failed to read poll response body: {}", e))?;
// Parse poll response JSON
transcription_response = from_str(&poll_response_text).map_err(|e| {
format!(
"Failed to parse poll response JSON: {}. Raw response: {}",
e, poll_response_text
)
})?;
// Check if transcription results are available
if !transcription_response.results.is_empty() {
break;
}
// Check for timeout
if polling_start.elapsed() >= POLLING_TIMEOUT {
return Err("Transcription task timed out after 30 seconds".to_string());
}
// Wait before next poll
tokio::time::sleep(POLLING_INTERVAL).await;
}
Ok(transcription_response)
}
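A hedged usage sketch from the frontend side: the command initiates the task and polls internally, so the caller only sees the final `TranscriptionResponse`. The parameter keys below are an assumption (camelCase counterparts of the Rust signature), and whether the audio-type argument is still accepted under this key is not confirmed by the diff.

import { invoke } from "@tauri-apps/api/core";

interface TranscriptionResponse {
  text: string;
}

// Hypothetical wrapper around the `transcription` command shown above.
async function transcribe(serverId: string, audioContent: string): Promise<string> {
  const response = await invoke<TranscriptionResponse>("transcription", {
    serverId,
    audioType: "wav", // assumption: the backend still accepts (and ignores) this field
    audioContent,
  });
  return response.text;
}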

View File

@@ -1,170 +0,0 @@
use crate::server::servers::{get_server_by_id, get_server_token};
use futures::StreamExt;
use std::collections::HashMap;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Runtime};
use tokio::net::TcpStream;
use tokio::sync::{mpsc, Mutex};
use tokio_tungstenite::tungstenite::handshake::client::generate_key;
use tokio_tungstenite::tungstenite::Message;
use tokio_tungstenite::MaybeTlsStream;
use tokio_tungstenite::WebSocketStream;
use tokio_tungstenite::{connect_async_tls_with_config, Connector};
#[derive(Default)]
pub struct WebSocketManager {
connections: Arc<Mutex<HashMap<String, Arc<WebSocketInstance>>>>,
}
struct WebSocketInstance {
ws_connection: Mutex<WebSocketStream<MaybeTlsStream<TcpStream>>>, // No need to lock the entire map
cancel_tx: mpsc::Sender<()>,
}
fn convert_to_websocket(endpoint: &str) -> Result<String, String> {
let url = url::Url::parse(endpoint).map_err(|e| format!("Invalid URL: {}", e))?;
let ws_protocol = if url.scheme() == "https" {
"wss://"
} else {
"ws://"
};
let host = url.host_str().ok_or("No host found in URL")?;
let port = url
.port_or_known_default()
.unwrap_or(if url.scheme() == "https" { 443 } else { 80 });
let ws_endpoint = if port == 80 || port == 443 {
format!("{}{}{}", ws_protocol, host, "/ws")
} else {
format!("{}{}:{}/ws", ws_protocol, host, port)
};
Ok(ws_endpoint)
}
#[tauri::command]
pub async fn connect_to_server<R: Runtime>(
tauri_app_handle: AppHandle<R>,
id: String,
client_id: String,
state: tauri::State<'_, WebSocketManager>,
app_handle: AppHandle,
) -> Result<(), String> {
let connections_clone = state.connections.clone();
// Disconnect old connection first
disconnect(client_id.clone(), state.clone()).await.ok();
let server = get_server_by_id(&id).ok_or(format!("Server with ID {} not found", id))?;
let endpoint = convert_to_websocket(&server.endpoint)?;
let token = get_server_token(&id).await?.map(|t| t.access_token.clone());
let mut request =
tokio_tungstenite::tungstenite::client::IntoClientRequest::into_client_request(&endpoint)
.map_err(|e| format!("Failed to create WebSocket request: {}", e))?;
request
.headers_mut()
.insert("Connection", "Upgrade".parse().unwrap());
request
.headers_mut()
.insert("Upgrade", "websocket".parse().unwrap());
request
.headers_mut()
.insert("Sec-WebSocket-Version", "13".parse().unwrap());
request
.headers_mut()
.insert("Sec-WebSocket-Key", generate_key().parse().unwrap());
if let Some(token) = token {
request
.headers_mut()
.insert("X-API-TOKEN", token.parse().unwrap());
}
let allow_self_signature =
crate::settings::get_allow_self_signature(tauri_app_handle.clone()).await;
let tls_connector = tokio_native_tls::native_tls::TlsConnector::builder()
.danger_accept_invalid_certs(allow_self_signature)
.build()
.map_err(|e| format!("TLS build error: {:?}", e))?;
let connector = Connector::NativeTls(tls_connector.into());
let (ws_stream, _) = connect_async_tls_with_config(
request,
None, // WebSocketConfig
true, // disable_nagle
Some(connector), // Connector
)
.await
.map_err(|e| format!("WebSocket TLS error: {:?}", e))?;
let (cancel_tx, mut cancel_rx) = mpsc::channel(1);
let instance = Arc::new(WebSocketInstance {
ws_connection: Mutex::new(ws_stream),
cancel_tx,
});
// Insert connection into the map (lock is held briefly)
{
let mut connections = connections_clone.lock().await;
connections.insert(client_id.clone(), instance.clone());
}
// Spawn WebSocket handler in a separate task
let app_handle_clone = app_handle.clone();
let client_id_clone = client_id.clone();
tokio::spawn(async move {
let ws = &mut *instance.ws_connection.lock().await;
loop {
tokio::select! {
msg = ws.next() => {
match msg {
Some(Ok(Message::Text(text))) => {
let _ = app_handle_clone.emit(&format!("ws-message-{}", client_id_clone), text);
},
Some(Err(_)) | None => {
log::debug!("WebSocket connection closed or error");
let _ = app_handle_clone.emit(&format!("ws-error-{}", client_id_clone), id.clone());
break;
}
_ => {}
}
}
_ = cancel_rx.recv() => {
log::debug!("WebSocket connection cancelled");
let _ = app_handle_clone.emit(&format!("ws-cancel-{}", client_id_clone), id.clone());
break;
}
}
}
// Remove connection after it closes
let mut connections = connections_clone.lock().await;
connections.remove(&client_id_clone);
});
Ok(())
}
#[tauri::command]
pub async fn disconnect(
client_id: String,
state: tauri::State<'_, WebSocketManager>,
) -> Result<(), String> {
let instance = {
let mut connections = state.connections.lock().await;
connections.remove(&client_id)
};
if let Some(instance) = instance {
let _ = instance.cancel_tx.send(()).await;
// Close WebSocket (lock only the connection, not the whole map)
let mut ws = instance.ws_connection.lock().await;
let _ = ws.close(None).await;
}
Ok(())
}

View File

@@ -1,12 +1,12 @@
use crate::COCO_TAURI_STORE;
use serde_json::Value as Json;
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;
const SETTINGS_ALLOW_SELF_SIGNATURE: &str = "settings_allow_self_signature";
#[tauri::command]
pub async fn set_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>, value: bool) {
pub async fn set_allow_self_signature(tauri_app_handle: AppHandle, value: bool) {
use crate::server::http_client;
let store = tauri_app_handle
@@ -40,7 +40,7 @@ pub async fn set_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>
}
/// Synchronous version of `async get_allow_self_signature()`.
pub fn _get_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>) -> bool {
pub fn _get_allow_self_signature(tauri_app_handle: AppHandle) -> bool {
let store = tauri_app_handle
.store(COCO_TAURI_STORE)
.unwrap_or_else(|e| {
@@ -67,6 +67,6 @@ pub fn _get_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>) ->
}
#[tauri::command]
pub async fn get_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>) -> bool {
pub async fn get_allow_self_signature(tauri_app_handle: AppHandle) -> bool {
_get_allow_self_signature(tauri_app_handle)
}

View File

@@ -1,7 +1,7 @@
use tauri::{App, WebviewWindow};
use tauri::{AppHandle, WebviewWindow};
pub fn platform(
_app: &mut App,
_tauri_app_handle: &AppHandle,
_main_window: WebviewWindow,
_settings_window: WebviewWindow,
_check_window: WebviewWindow,

View File

@@ -1,11 +1,9 @@
//credits to: https://github.com/ayangweb/ayangweb-EcoPaste/blob/169323dbe6365ffe4abb64d867439ed2ea84c6d1/src-tauri/src/core/setup/mac.rs
use tauri::{App, Emitter, EventTarget, WebviewWindow};
use tauri_nspanel::{cocoa::appkit::NSWindowCollectionBehavior, panel_delegate, WebviewWindowExt};
//! credits to: https://github.com/ayangweb/ayangweb-EcoPaste/blob/169323dbe6365ffe4abb64d867439ed2ea84c6d1/src-tauri/src/core/setup/mac.rs
use crate::common::MAIN_WINDOW_LABEL;
#[allow(non_upper_case_globals)]
const NSWindowStyleMaskNonActivatingPanel: i32 = 1 << 7;
use objc2_app_kit::NSNonactivatingPanelMask;
use tauri::{AppHandle, Emitter, EventTarget, WebviewWindow};
use tauri_nspanel::{WebviewWindowExt, cocoa::appkit::NSWindowCollectionBehavior, panel_delegate};
const WINDOW_FOCUS_EVENT: &str = "tauri://focus";
const WINDOW_BLUR_EVENT: &str = "tauri://blur";
@@ -13,7 +11,7 @@ const WINDOW_MOVED_EVENT: &str = "tauri://move";
const WINDOW_RESIZED_EVENT: &str = "tauri://resize";
pub fn platform(
_app: &mut App,
_tauri_app_handle: &AppHandle,
main_window: WebviewWindow,
_settings_window: WebviewWindow,
_check_window: WebviewWindow,
@@ -21,15 +19,21 @@ pub fn platform(
// Convert ns_window to ns_panel
let panel = main_window.to_panel().unwrap();
// Make the window above the dock
panel.set_level(20);
// Do not steal focus from other windows
panel.set_style_mask(NSWindowStyleMaskNonActivatingPanel);
//
// Cast is safe
panel.set_style_mask(NSNonactivatingPanelMask.0 as i32);
// Set its level to NSFloatingWindowLevel to ensure it appears in front of
// all normal-level windows
//
// NOTE: some Chinese input methods use a level between NSDockWindowLevel (20)
// and NSMainMenuWindowLevel (24), setting our level above NSDockWindowLevel
// would block their window
panel.set_floating_panel(true);
// Share the window across all desktop spaces and full screen
// Open the window in the active workspace and full screen
panel.set_collection_behaviour(
NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces
NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace
| NSWindowCollectionBehavior::NSWindowCollectionBehaviorStationary
| NSWindowCollectionBehavior::NSWindowCollectionBehaviorFullScreenAuxiliary,
);

View File

@@ -1,4 +1,10 @@
use tauri::{App, WebviewWindow};
use crate::GLOBAL_TAURI_APP_HANDLE;
use crate::autostart;
use crate::common::register::SearchSourceRegistry;
use crate::util::app_lang::update_app_lang;
use std::sync::atomic::AtomicBool;
use std::sync::atomic::Ordering;
use tauri::{AppHandle, Manager, WebviewWindow};
#[cfg(target_os = "macos")]
mod mac;
@@ -19,7 +25,7 @@ pub use windows::*;
pub use linux::*;
pub fn default(
app: &mut App,
tauri_app_handle: &AppHandle,
main_window: WebviewWindow,
settings_window: WebviewWindow,
check_window: WebviewWindow,
@@ -29,9 +35,83 @@ pub fn default(
main_window.open_devtools();
platform(
app,
tauri_app_handle,
main_window.clone(),
settings_window.clone(),
check_window.clone(),
);
}
/// Indicates if the setup job is completed.
static BACKEND_SETUP_COMPLETED: AtomicBool = AtomicBool::new(false);
/// The function `backend_setup()` may be called concurrently; this lock
/// ensures that only one async task performs the actual setup job.
static MUTEX_LOCK: tokio::sync::Mutex<()> = tokio::sync::Mutex::const_new(());
/// This function includes the setup job that has to be coordinated with the
/// frontend, or the App will panic due to races[1]. The way we coordinate is to
/// expose this function as a Tauri command, and let the frontend code invoke
/// it.
///
/// The frontend code should ensure that:
///
/// 1. This command gets called before invoking other commands.
/// 2. This command should only be called once.
///
/// [1]: For instance, Tauri command `list_extensions()` relies on an in-memory
/// extension list that won't be initialized until `init_extensions()` gets
/// called. If the frontend code invokes `list_extensions()` before `init_extensions()`
/// gets executed, we get a panic.
#[tauri::command]
pub(crate) async fn backend_setup(tauri_app_handle: AppHandle, app_lang: String) {
if BACKEND_SETUP_COMPLETED.load(Ordering::Relaxed) {
return;
}
// Race to let one async task do the setup job
let _guard = MUTEX_LOCK.lock().await;
// Re-check in case the current async task is not the first one that acquires
// the lock
if BACKEND_SETUP_COMPLETED.load(Ordering::Relaxed) {
return;
}
GLOBAL_TAURI_APP_HANDLE
.set(tauri_app_handle.clone())
.expect("global tauri AppHandle already initialized");
log::trace!("global Tauri AppHandle set");
let registry = SearchSourceRegistry::default();
tauri_app_handle.manage(registry); // Store registry in Tauri's app state
// This has to be called before initializing extensions as doing that
// requires access to the shortcut store, which will be set by this
// function.
//
//
// Windows requires that hotkey setup has to be done on the main thread, or
// we will get error "ERROR_WINDOW_OF_OTHER_THREAD 1408 (0x580)"
let tauri_app_handle_clone = tauri_app_handle.clone();
tauri_app_handle
.run_on_main_thread(move || {
crate::shortcut::enable_shortcut(&tauri_app_handle_clone);
})
.expect("failed to run this closure on the main thread");
crate::init(&tauri_app_handle).await;
if let Err(err) = crate::extension::init_extensions(&tauri_app_handle).await {
log::error!(
"failed to initialize extension-related stuff, error [{}]",
err
);
}
autostart::ensure_autostart_state_consistent(&tauri_app_handle).unwrap();
update_app_lang(app_lang).await;
// Invoked, now update the state
BACKEND_SETUP_COMPLETED.store(true, Ordering::Relaxed);
}
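The contract described in the doc comment above translates into frontend code along these lines: invoke `backend_setup` exactly once, before any other command. This is a minimal sketch assuming the Tauri v2 `invoke` API; the helper name and bootstrap shape are hypothetical, and the backend additionally guards itself with the atomic flag shown above.

import { invoke } from "@tauri-apps/api/core";

let setupPromise: Promise<void> | null = null;

// Call this once at startup, before any other Tauri command is invoked.
// Reusing a single promise guarantees the command is only sent once even if
// several callers race to initialize.
export function ensureBackendSetup(appLang: "en" | "zh"): Promise<void> {
  if (!setupPromise) {
    setupPromise = invoke("backend_setup", { appLang });
  }
  return setupPromise;
}

// Example bootstrap (hypothetical):
// await ensureBackendSetup("en");
// const extensions = await invoke("list_extensions");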

View File

@@ -1,7 +1,7 @@
use tauri::{App, WebviewWindow};
use tauri::{AppHandle, WebviewWindow};
pub fn platform(
_app: &mut App,
_tauri_app_handle: &AppHandle,
_main_window: WebviewWindow,
_settings_window: WebviewWindow,
_check_window: WebviewWindow,

View File

@@ -1,5 +1,6 @@
use crate::{hide_coco, show_coco, COCO_TAURI_STORE};
use tauri::{async_runtime, App, AppHandle, Manager, Runtime};
use crate::common::MAIN_WINDOW_LABEL;
use crate::{COCO_TAURI_STORE, hide_coco, show_coco};
use tauri::{AppHandle, Manager, async_runtime};
use tauri_plugin_global_shortcut::{GlobalShortcutExt, Shortcut, ShortcutState};
use tauri_plugin_store::{JsonValue, StoreExt};
@@ -16,9 +17,9 @@ const DEFAULT_SHORTCUT: &str = "command+shift+space";
const DEFAULT_SHORTCUT: &str = "ctrl+shift+space";
/// Set up the shortcut upon app start.
pub fn enable_shortcut(app: &App) {
pub fn enable_shortcut(tauri_app_handle: &AppHandle) {
log::trace!("setting up Coco hotkey");
let store = app
let store = tauri_app_handle
.store(COCO_TAURI_STORE)
.expect("creating a store should not fail");
@@ -33,7 +34,7 @@ pub fn enable_shortcut(app: &App) {
let stored_shortcut = stored_shortcut_str
.parse::<Shortcut>()
.expect("stored shortcut string should be valid");
_register_shortcut_upon_start(app, stored_shortcut);
_register_shortcut_upon_start(tauri_app_handle, stored_shortcut);
} else {
store.set(
COCO_GLOBAL_SHORTCUT,
@@ -42,7 +43,7 @@ pub fn enable_shortcut(app: &App) {
let default_shortcut = DEFAULT_SHORTCUT
.parse::<Shortcut>()
.expect("default shortcut should never be invalid");
_register_shortcut_upon_start(app, default_shortcut);
_register_shortcut_upon_start(tauri_app_handle, default_shortcut);
}
log::trace!("Coco hotkey has been set");
}
@@ -50,14 +51,14 @@ pub fn enable_shortcut(app: &App) {
/// Get the stored shortcut as a string, same as [`_get_shortcut()`], except that
/// this is a `tauri::command` interface.
#[tauri::command]
pub async fn get_current_shortcut<R: Runtime>(app: AppHandle<R>) -> Result<String, String> {
pub async fn get_current_shortcut(app: AppHandle) -> Result<String, String> {
let shortcut = _get_shortcut(&app);
Ok(shortcut)
}
/// Get the current shortcut and unregister it on the tauri side.
#[tauri::command]
pub async fn unregister_shortcut<R: Runtime>(app: AppHandle<R>) {
pub async fn unregister_shortcut(app: AppHandle) {
let shortcut_str = _get_shortcut(&app);
let shortcut = shortcut_str
.parse::<Shortcut>()
@@ -70,9 +71,9 @@ pub async fn unregister_shortcut<R: Runtime>(app: AppHandle<R>) {
/// Change the global shortcut to `key`.
#[tauri::command]
pub async fn change_shortcut<R: Runtime>(
app: AppHandle<R>,
_window: tauri::Window<R>,
pub async fn change_shortcut(
app: AppHandle,
_window: tauri::Window,
key: String,
) -> Result<(), String> {
println!("key {}:", key);
@@ -94,7 +95,7 @@ pub async fn change_shortcut<R: Runtime>(
}
/// Helper function to register a shortcut, used for shortcut updates.
fn _register_shortcut<R: Runtime>(app: &AppHandle<R>, shortcut: Shortcut) {
fn _register_shortcut(app: &AppHandle, shortcut: Shortcut) {
app.global_shortcut()
.on_shortcut(shortcut, move |app, scut, event| {
if scut == &shortcut {
@@ -118,12 +119,9 @@ fn _register_shortcut<R: Runtime>(app: &AppHandle<R>, shortcut: Shortcut) {
.unwrap();
}
use crate::common::MAIN_WINDOW_LABEL;
/// Helper function to register a shortcut, used to set up the shortcut up App's first start.
fn _register_shortcut_upon_start(app: &App, shortcut: Shortcut) {
let handler = app.app_handle();
handler
fn _register_shortcut_upon_start(tauri_app_handle: &AppHandle, shortcut: Shortcut) {
tauri_app_handle
.plugin(
tauri_plugin_global_shortcut::Builder::new()
.with_handler(move |app, scut, event| {
@@ -147,11 +145,14 @@ fn _register_shortcut_upon_start(app: &App, shortcut: Shortcut) {
.build(),
)
.unwrap();
app.global_shortcut().register(shortcut).unwrap();
tauri_app_handle
.global_shortcut()
.register(shortcut)
.unwrap();
}
/// Helper function to get the stored global shortcut, as a string.
pub fn _get_shortcut<R: Runtime>(app: &AppHandle<R>) -> String {
pub fn _get_shortcut(app: &AppHandle) -> String {
let store = app
.get_store(COCO_TAURI_STORE)
.expect("store should be loaded or created");

View File

@@ -0,0 +1,62 @@
//! The configuration entry "App language" is persisted in the frontend code, but we
//! need to access it on the backend.
//!
//! So we duplicate it here **in MEMORY** and expose a setter method to the
//! frontend so that the value can be updated and stay up-to-date.
use tokio::sync::RwLock;
#[derive(Debug, Clone, Copy, PartialEq)]
#[allow(non_camel_case_types)]
pub(crate) enum Lang {
en_US,
zh_CN,
}
impl std::fmt::Display for Lang {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Lang::en_US => write!(f, "en_US"),
Lang::zh_CN => write!(f, "zh_CN"),
}
}
}
/// Frontend code uses "en" and "zh" to represent the Application language.
///
/// This impl is not meant to be used as a parser for locale strings such as
/// "en_US" or "zh_CN".
impl std::str::FromStr for Lang {
type Err = String;
fn from_str(s: &str) -> Result<Self, Self::Err> {
match s {
"en" => Ok(Lang::en_US),
"zh" => Ok(Lang::zh_CN),
_ => Err(format!("Invalid language: {}", s)),
}
}
}
/// Cache the language config in memory.
static APP_LANG: RwLock<Option<Lang>> = RwLock::const_new(None);
/// Update the in-memory cached `APP_LANG` config.
#[tauri::command]
pub(crate) async fn update_app_lang(lang: String) {
let app_lang = lang.parse::<Lang>().unwrap_or_else(|e| {
panic!(
"invalid argument [{}], could not parse it to [struct Lang], parsing error [{}]",
lang, e
)
});
let mut write_guard = APP_LANG.write().await;
*write_guard = Some(app_lang);
}
/// Helper getter method to handle the `None` case.
pub(crate) async fn get_app_lang() -> Lang {
let opt_lang = *APP_LANG.read().await;
opt_lang.expect("frontend code did not invoke [update_app_lang()] to set the APP_LANG")
}
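On the frontend side, the setter is just another command; a sketch of keeping the backend cache in sync when the UI language changes ("en" and "zh" are the only values the `FromStr` impl above accepts):

import { invoke } from "@tauri-apps/api/core";

// Keep the backend's in-memory APP_LANG in sync with the UI language.
// Passing anything other than "en" or "zh" would make the backend panic,
// so the parameter type narrows the accepted values.
export function syncAppLang(lang: "en" | "zh"): Promise<void> {
  return invoke("update_app_lang", { lang });
}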

src-tauri/src/util/file.rs (new file, 179 lines)
View File

@@ -0,0 +1,179 @@
#[derive(Debug, Clone, PartialEq, Copy)]
pub(crate) enum FileType {
Folder,
JPEGImage,
PNGImage,
PDFDocument,
PlainTextDocument,
MicrosoftWordDocument,
MicrosoftExcelSpreadsheet,
AudioFile,
VideoFile,
CHeaderFile,
TOMLDocument,
RustScript,
CSourceCode,
MarkdownDocument,
TerminalSettings,
ZipArchive,
Dmg,
Html,
Json,
Xml,
Yaml,
Css,
Vue,
React,
Sql,
Csv,
Javascript,
Lnk,
Typescript,
Python,
Java,
Golang,
Ruby,
Php,
Sass,
Sketch,
AdobeAi,
AdobePsd,
AdobePr,
AdobeAu,
AdobeAe,
AdobeLr,
AdobeXd,
AdobeFl,
AdobeId,
Svg,
Epub,
Unknown,
}
fn get_file_type(path: &str) -> FileType {
let path = camino::Utf8Path::new(path);
// stat() is more precise than file extension, use it if possible.
if path.is_dir() {
return FileType::Folder;
}
let Some(ext) = path.extension() else {
return FileType::Unknown;
};
let ext = ext.to_lowercase();
match ext.as_str() {
"pdf" => FileType::PDFDocument,
"txt" | "text" => FileType::PlainTextDocument,
"doc" | "docx" => FileType::MicrosoftWordDocument,
"xls" | "xlsx" => FileType::MicrosoftExcelSpreadsheet,
"jpg" | "jpeg" => FileType::JPEGImage,
"png" => FileType::PNGImage,
"mp3" | "wav" | "flac" | "aac" | "ogg" | "m4a" => FileType::AudioFile,
"mp4" | "avi" | "mov" | "mkv" | "wmv" | "flv" | "webm" => FileType::VideoFile,
"h" | "hpp" => FileType::CHeaderFile,
"c" | "cpp" | "cc" | "cxx" => FileType::CSourceCode,
"toml" => FileType::TOMLDocument,
"rs" => FileType::RustScript,
"md" | "markdown" => FileType::MarkdownDocument,
"terminal" => FileType::TerminalSettings,
"zip" | "rar" | "7z" | "tar" | "gz" | "bz2" => FileType::ZipArchive,
"dmg" => FileType::Dmg,
"html" | "htm" => FileType::Html,
"json" => FileType::Json,
"xml" => FileType::Xml,
"yaml" | "yml" => FileType::Yaml,
"css" => FileType::Css,
"vue" => FileType::Vue,
"jsx" | "tsx" => FileType::React,
"sql" => FileType::Sql,
"csv" => FileType::Csv,
"js" | "mjs" => FileType::Javascript,
"ts" => FileType::Typescript,
"py" | "pyw" => FileType::Python,
"java" => FileType::Java,
"go" => FileType::Golang,
"rb" => FileType::Ruby,
"php" => FileType::Php,
"sass" | "scss" => FileType::Sass,
"sketch" => FileType::Sketch,
"ai" => FileType::AdobeAi,
"psd" => FileType::AdobePsd,
"prproj" => FileType::AdobePr,
"aup" | "aup3" => FileType::AdobeAu,
"aep" => FileType::AdobeAe,
"lrcat" => FileType::AdobeLr,
"xd" => FileType::AdobeXd,
"fla" => FileType::AdobeFl,
"indd" => FileType::AdobeId,
"svg" => FileType::Svg,
"epub" => FileType::Epub,
"lnk" => FileType::Lnk,
_ => FileType::Unknown,
}
}
fn type_to_icon(ty: FileType) -> &'static str {
match ty {
FileType::Folder => "font_file_folder",
FileType::JPEGImage => "font_file_image",
FileType::PNGImage => "font_file_image",
FileType::PDFDocument => "font_file_document_pdf",
FileType::PlainTextDocument => "font_file_txt",
FileType::MicrosoftWordDocument => "font_file_document_word",
FileType::MicrosoftExcelSpreadsheet => "font_file_spreadsheet_excel",
FileType::AudioFile => "font_file_audio",
FileType::VideoFile => "font_file_video",
FileType::CHeaderFile => "font_file_csource",
FileType::TOMLDocument => "font_file_toml",
FileType::RustScript => "font_file_rustscript1",
FileType::CSourceCode => "font_file_csource",
FileType::MarkdownDocument => "font_file_markdown",
FileType::TerminalSettings => "font_file_terminal1",
FileType::ZipArchive => "font_file_zip",
FileType::Dmg => "font_file_dmg",
FileType::Html => "font_file_html",
FileType::Json => "font_file_json",
FileType::Xml => "font_file_xml",
FileType::Yaml => "font_file_yaml",
FileType::Css => "font_file_css",
FileType::Vue => "font_file_vue",
FileType::React => "font_file_react",
FileType::Sql => "font_file_sql",
FileType::Csv => "font_file_csv",
FileType::Javascript => "font_file_javascript",
FileType::Lnk => "font_file_lnk",
FileType::Typescript => "font_file_typescript",
FileType::Python => "font_file_python",
FileType::Java => "font_file_java",
FileType::Golang => "font_file_golang",
FileType::Ruby => "font_file_ruby",
FileType::Php => "font_file_php",
FileType::Sass => "font_file_sass",
FileType::Sketch => "font_file_sketch",
FileType::AdobeAi => "font_file_adobe_ai",
FileType::AdobePsd => "font_file_adobe_psd",
FileType::AdobePr => "font_file_adobe_pr",
FileType::AdobeAu => "font_file_adobe_au",
FileType::AdobeAe => "font_file_adobe_ae",
FileType::AdobeLr => "font_file_adobe_lr",
FileType::AdobeXd => "font_file_adobe_xd",
FileType::AdobeFl => "font_file_adobe_fl",
FileType::AdobeId => "font_file_adobe_id",
FileType::Svg => "font_file_svg",
FileType::Epub => "font_file_epub",
FileType::Unknown => "font_file_unknown",
}
}
/// Synchronous version of `get_file_icon()`.
pub(crate) fn sync_get_file_icon(path: &str) -> &'static str {
let ty = get_file_type(path);
type_to_icon(ty)
}
#[tauri::command]
pub(crate) async fn get_file_icon(path: String) -> &'static str {
sync_get_file_icon(&path)
}
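Since `get_file_icon` is exposed as a command, the frontend can map a path straight to one of the icon-font identifiers listed in `type_to_icon`; a minimal sketch (the helper name is hypothetical):

import { invoke } from "@tauri-apps/api/core";

// Resolve the icon-font identifier for a file path, e.g. "font_file_markdown"
// for "/home/user/notes/README.md" and "font_file_folder" for directories.
export function fileIcon(path: string): Promise<string> {
  return invoke<string>("get_file_icon", { path });
}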

View File

@@ -1,10 +1,23 @@
pub(crate) mod app_lang;
pub(crate) mod file;
pub(crate) mod path;
pub(crate) mod platform;
pub(crate) mod prevent_default;
pub(crate) mod system_lang;
pub(crate) mod updater;
use std::{path::Path, process::Command};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
use tauri_plugin_shell::ShellExt;
enum LinuxDesktopEnvironment {
/// We use this env variable to determine the DE on Linux.
const XDG_CURRENT_DESKTOP: &str = "XDG_CURRENT_DESKTOP";
#[derive(Debug, PartialEq)]
pub(crate) enum LinuxDesktopEnvironment {
Gnome,
Kde,
Unsupported { xdg_current_desktop: String },
}
impl LinuxDesktopEnvironment {
@@ -30,6 +43,14 @@ impl LinuxDesktopEnvironment {
.arg(path)
.output()
.map_err(|e| e.to_string())?,
Self::Unsupported {
xdg_current_desktop,
} => {
return Err(format!(
"Cannot open apps as this Linux desktop environment [{}] is not supported",
xdg_current_desktop
));
}
};
if !cmd_output.status.success() {
@@ -44,20 +65,23 @@ impl LinuxDesktopEnvironment {
}
}
fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
let de_os_str = std::env::var_os("XDG_CURRENT_DESKTOP")?;
let de_str = de_os_str
.into_string()
.expect("$XDG_CURRENT_DESKTOP should be UTF-8 encoded");
/// None means that it is likely that we do not have a desktop environment.
pub(crate) fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
let de_os_str = std::env::var_os(XDG_CURRENT_DESKTOP)?;
let de_str = de_os_str.into_string().unwrap_or_else(|_os_string| {
panic!("${} should be UTF-8 encoded", XDG_CURRENT_DESKTOP);
});
let de = match de_str.as_str() {
"GNOME" => LinuxDesktopEnvironment::Gnome,
// Ubuntu uses "ubuntu:GNOME" instead of just "GNOME", they really love
// their distro name.
"ubuntu:GNOME" => LinuxDesktopEnvironment::Gnome,
"KDE" => LinuxDesktopEnvironment::Kde,
unsupported_de => unimplemented!(
"This desktop environment [{}] has not been supported yet",
unsupported_de
),
_ => LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: de_str,
},
};
Some(de)
@@ -67,12 +91,12 @@ fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
//
// tauri_plugin_shell::open() is deprecated, but we still use it.
#[allow(deprecated)]
pub async fn open<R: Runtime>(app_handle: AppHandle<R>, path: String) -> Result<(), String> {
pub async fn open(app_handle: AppHandle, path: String) -> Result<(), String> {
if cfg!(target_os = "linux") {
let borrowed_path = Path::new(&path);
if let Some(file_extension) = borrowed_path.extension() {
if file_extension == "desktop" {
let desktop_environment = get_linux_desktop_environment().expect("The Linux OS is running without a desktop, Coco could never run in such a environment");
let desktop_environment = get_linux_desktop_environment().expect("The Linux OS is running without a desktop, Coco could never run in such an environment");
return desktop_environment.launch_app_via_desktop_file(path);
}
}
@@ -83,3 +107,55 @@ pub async fn open<R: Runtime>(app_handle: AppHandle<R>, path: String) -> Result<
.open(path, None)
.map_err(|e| e.to_string())
}
#[cfg(test)]
mod tests {
use super::*;
// This test modifies the env var XDG_CURRENT_DESKTOP, which is kinda unsafe,
// but since this is just a test, it is ok to do so.
#[test]
fn test_get_linux_desktop_environment() {
// SAFETY: the Rust code won't modify/read XDG_CURRENT_DESKTOP concurrently,
// though we have no such guarantee for the underlying C code.
unsafe {
// Save the original value if it exists
let original_value = std::env::var_os(XDG_CURRENT_DESKTOP);
// Test when XDG_CURRENT_DESKTOP is not set
std::env::remove_var(XDG_CURRENT_DESKTOP);
assert!(get_linux_desktop_environment().is_none());
// Test GNOME
std::env::set_var(XDG_CURRENT_DESKTOP, "GNOME");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Gnome);
// Test ubuntu:GNOME
std::env::set_var(XDG_CURRENT_DESKTOP, "ubuntu:GNOME");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Gnome);
// Test KDE
std::env::set_var(XDG_CURRENT_DESKTOP, "KDE");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Kde);
// Test unsupported desktop environment
std::env::set_var(XDG_CURRENT_DESKTOP, "XFCE");
let result = get_linux_desktop_environment();
assert_eq!(
result.unwrap(),
LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: "XFCE".into()
}
);
// Restore the original value
match original_value {
Some(value) => std::env::set_var(XDG_CURRENT_DESKTOP, value),
None => std::env::remove_var(XDG_CURRENT_DESKTOP),
}
}
}
}

View File

@@ -0,0 +1,12 @@
#[tauri::command]
pub(crate) fn path_absolute(path: &str) -> String {
// We do not use std::path::absolute() because it does not clean ".."
// https://doc.rust-lang.org/stable/std/path/fn.absolute.html#platform-specific-behavior
use path_clean::clean;
let clean_path = clean(path);
clean_path
.into_os_string()
.into_string()
.expect("path should be UTF-8 encoded")
}
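A quick illustration of why `path_clean` is used here: unlike `std::path::absolute`, it collapses `..` components, so a frontend call like the sketch below gets back the cleaned path. The wrapper name is hypothetical; the argument key matches the Rust parameter.

import { invoke } from "@tauri-apps/api/core";

// e.g. "/Users/foo/Downloads/../Documents" comes back as "/Users/foo/Documents",
// because the backend cleans the ".." component instead of leaving it in place.
export function absolutePath(path: string): Promise<string> {
  return invoke<string>("path_absolute", { path });
}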

View File

@@ -0,0 +1,60 @@
use derive_more::Display;
use serde::{Deserialize, Serialize};
use std::borrow::Cow;
use strum::EnumCount;
use strum::VariantArray;
#[derive(
Debug,
Deserialize,
Serialize,
Copy,
Clone,
Hash,
PartialEq,
Eq,
Display,
EnumCount,
VariantArray,
)]
#[serde(rename_all(serialize = "lowercase", deserialize = "lowercase"))]
pub(crate) enum Platform {
#[display("macOS")]
Macos,
#[display("Linux")]
Linux,
#[display("windows")]
Windows,
}
impl Platform {
/// Helper function to determine the current platform.
pub(crate) fn current() -> Platform {
let os_str = std::env::consts::OS;
serde_plain::from_str(os_str).unwrap_or_else(|_e| {
panic!("std::env::consts::OS is [{}], which is not a valid value for [enum Platform], valid values: {:?}", os_str, Self::VARIANTS.iter().map(|platform|platform.to_string()).collect::<Vec<String>>());
})
}
/// Return the `X-OS-NAME` HTTP request header.
pub(crate) fn to_os_name_http_header_str(&self) -> Cow<'static, str> {
match self {
Self::Macos => Cow::Borrowed("macos"),
Self::Windows => Cow::Borrowed("windows"),
// For Linux, we need the actual distro `ID`, not just a "linux".
Self::Linux => Cow::Owned(sysinfo::System::distribution_id()),
}
}
/// Returns the number of platforms supported by Coco.
//
// a.k.a., the number of this enum's variants.
pub(crate) fn num_of_supported_platforms() -> usize {
Platform::COUNT
}
/// Returns a set that contains all the platforms.
pub(crate) fn all() -> std::collections::HashSet<Self> {
Platform::VARIANTS.into_iter().copied().collect()
}
}

View File

@@ -0,0 +1,13 @@
pub fn init() -> tauri::plugin::TauriPlugin<tauri::Wry> {
#[cfg(debug_assertions)]
{
use tauri_plugin_prevent_default::Flags;
tauri_plugin_prevent_default::Builder::new()
.with_flags(Flags::all().difference(Flags::CONTEXT_MENU))
.build()
}
#[cfg(not(debug_assertions))]
tauri_plugin_prevent_default::init()
}

View File

@@ -0,0 +1,14 @@
/// Helper function to get the system language.
///
/// We cannot return `enum Lang` here because Coco has limited language support
/// but the OS supports many more languages.
#[cfg(feature = "use_pizza_engine")]
pub(crate) fn get_system_lang() -> String {
use sys_locale::get_locale;
// fall back to English (general) when we cannot get the locale
//
// We replace '-' with '_' in applications-rs, to make the locales match,
// we need to do this here as well.
get_locale().unwrap_or("en".into()).replace('-', "_")
}

View File

@@ -0,0 +1,87 @@
use semver::Version;
use tauri_plugin_updater::RemoteRelease;
/// Helper function to extract the build number out of `version`.
///
/// If the version string is in the `x.y.z` format and does not include a build
/// number, we assume a build number of 0.
fn extract_build_number(version: &Version) -> u32 {
let pre = &version.pre;
if pre.is_empty() {
// Special value for versions that do not include a build number
0
} else {
let pre_str = pre.as_str();
let build_number_str = {
match pre_str.strip_prefix("SNAPSHOT-") {
Some(str) => str,
None => pre_str,
}
};
let build_number : u32 = build_number_str.parse().unwrap_or_else(|e| {
panic!(
"invalid build number, cannot parse [{}] to a valid build number, error [{}], version [{}]",
build_number_str, e, version
)
});
build_number
}
}
/// # Local version format
///
/// Packages built in our CI use the following format:
///
/// * `x.y.z-SNAPSHOT-<build number>`
/// * `x.y.z-<build number>`
///
/// If you build Coco from src, the version will be in format `x.y.z`
///
/// # Remote version format
///
/// `x.y.z-<build number>`
///
/// # How we compare versions
///
/// We compare versions based solely on the build number.
/// If the version string is in the `x.y.z` format and does not include a build number,
/// we assume a build number of 0. As a result, such versions are considered older
/// than any version with an explicit build number.
pub(crate) fn custom_version_comparator(local: Version, remote_release: RemoteRelease) -> bool {
let remote = remote_release.version;
let local_build_number = extract_build_number(&local);
let remote_build_number = extract_build_number(&remote);
let should_update = remote_build_number > local_build_number;
log::debug!(
"custom version comparator invoked, local version [{}], remote version [{}], should update [{}]",
local,
remote,
should_update
);
should_update
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_extract_build_number() {
// 0.6.0 => 0
let version = Version::parse("0.6.0").unwrap();
assert_eq!(extract_build_number(&version), 0);
// 0.6.0-2371 => 2371
let version = Version::parse("0.6.0-2371").unwrap();
assert_eq!(extract_build_number(&version), 2371);
// 0.6.0-SNAPSHOT-2371 => 2371
let version = Version::parse("0.6.0-SNAPSHOT-2371").unwrap();
assert_eq!(extract_build_number(&version), 2371);
}
}

View File

@@ -113,20 +113,6 @@
"icons/Square310x310Logo.png",
"icons/StoreLogo.png"
],
"macOS": {
"minimumSystemVersion": "10.12",
"hardenedRuntime": true,
"dmg": {
"appPosition": {
"x": 180,
"y": 180
},
"applicationFolderPosition": {
"x": 480,
"y": 180
}
}
},
"resources": ["assets/**/*", "icons"]
},
"plugins": {
@@ -140,11 +126,9 @@
"https://release.infinilabs.com/coco/app/.latest.json?target={{target}}&arch={{arch}}&current_version={{current_version}}"
]
},
"websocket": {},
"shell": {},
"globalShortcut": {},
"deep-link": {
"schema": "coco",
"mobile": [
{
"host": "app.infini.cloud",

View File

@@ -0,0 +1,15 @@
{
"identifier": "rs.coco.app",
"bundle": {
"linux": {
"deb": {
"depends": ["gstreamer1.0-plugins-good"],
"desktopTemplate": "./Coco.desktop"
},
"rpm": {
"depends": ["gstreamer1-plugins-good"],
"desktopTemplate": "./Coco.desktop"
}
}
}
}

View File

@@ -86,6 +86,7 @@ export const Get = <T>(
} else {
res = result?.data as FcResponse<T>;
}
resolve([null, res as FcResponse<T>]);
})
.catch((err) => {
@@ -96,14 +97,14 @@ export const Get = <T>(
export const Post = <T>(
url: string,
data: IAnyObj,
data: IAnyObj | undefined,
params: IAnyObj = {},
headers: IAnyObj = {}
): Promise<[any, FcResponse<T> | undefined]> => {
return new Promise((resolve) => {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
let baseURL = appStore.state?.endpoint_http
let baseURL = appStore.state?.endpoint_http;
if (!baseURL || baseURL === "undefined") {
baseURL = "";
}

src/api/streamFetch.ts (new file, 63 lines)
View File

@@ -0,0 +1,63 @@
export async function streamPost({
url,
body,
queryParams,
headers,
onMessage,
onError,
}: {
url: string;
body: any;
queryParams?: Record<string, any>;
headers?: Record<string, string>;
onMessage: (chunk: string) => void;
onError?: (err: any) => void;
}) {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
let baseURL = appStore.state?.endpoint_http;
if (!baseURL || baseURL === "undefined") {
baseURL = "";
}
const headersStr = localStorage.getItem("headers") || "{}";
const headersStorage = JSON.parse(headersStr);
const query = new URLSearchParams(queryParams || {}).toString();
const fullUrl = `${baseURL}${url}?${query}`;
try {
const res = await fetch(fullUrl, {
method: "POST",
headers: {
"Content-Type": "application/json",
...(headersStorage),
...(headers || {}),
},
body: JSON.stringify(body),
});
if (!res.ok || !res.body) throw new Error("Stream failed");
const reader = res.body.getReader();
const decoder = new TextDecoder("utf-8");
let buffer = "";
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
const lines = buffer.split("\n");
for (let i = 0; i < lines.length - 1; i++) {
const line = lines[i].trim();
if (line) onMessage(line);
}
buffer = lines[lines.length - 1];
}
} catch (err) {
console.error("streamPost error:", err);
onError?.(err);
}
}
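A usage sketch for `streamPost`: the helper splits the response body on newlines and hands each non-empty line to `onMessage`, so a caller typically accumulates or parses those lines as they arrive. The route below is a placeholder, not necessarily a real endpoint in this repo.

import { streamPost } from "@/api/streamFetch";

// Hypothetical consumer: collect each newline-delimited chunk the server streams back.
async function runStreamingQuery(message: string): Promise<string[]> {
  const lines: string[] = [];

  await streamPost({
    url: "/services/chat/_stream", // assumption: placeholder route
    body: { message },
    onMessage: (line) => {
      lines.push(line); // each call receives one trimmed, non-empty line
    },
    onError: (err) => {
      console.error("stream failed", err);
    },
  });

  return lines;
}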

View File

@@ -1,133 +0,0 @@
import { fetch } from "@tauri-apps/plugin-http";
import { clientEnv } from "@/utils/env";
import { useLogStore } from "@/stores/logStore";
import { get_server_token } from "@/commands";
interface FetchRequestConfig {
url: string;
method?: "GET" | "POST" | "PUT" | "DELETE";
headers?: Record<string, string>;
body?: any;
timeout?: number;
parseAs?: "json" | "text" | "binary";
baseURL?: string;
}
interface FetchResponse<T = any> {
data: T;
status: number;
statusText: string;
headers: Headers;
}
const timeoutPromise = (ms: number) => {
return new Promise<never>((_, reject) =>
setTimeout(() => reject(new Error(`Request timed out after ${ms} ms`)), ms)
);
};
export const tauriFetch = async <T = any>({
url,
method = "GET",
headers = {},
body,
timeout = 30,
parseAs = "json",
baseURL = clientEnv.COCO_SERVER_URL
}: FetchRequestConfig): Promise<FetchResponse<T>> => {
const addLog = useLogStore.getState().addLog;
try {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
const connectStore = JSON.parse(localStorage.getItem("connect-store") || "{}");
console.log("baseURL", appStore.state?.endpoint_http)
baseURL = appStore.state?.endpoint_http || baseURL;
const authStore = JSON.parse(localStorage.getItem("auth-store") || "{}")
const auth = authStore?.state?.auth
console.log("auth", auth)
if (baseURL.endsWith("/")) {
baseURL = baseURL.slice(0, -1);
}
if (!url.startsWith("http://") && !url.startsWith("https://")) {
// If not, prepend the defaultPrefix
url = baseURL + url;
}
if (method !== "GET") {
headers["Content-Type"] = "application/json";
}
const server_id = connectStore.state?.currentService?.id || "default_coco_server"
const res: any = await get_server_token(server_id);
headers["X-API-TOKEN"] = headers["X-API-TOKEN"] || res?.access_token || undefined;
// debug API
const requestInfo = {
url,
method,
headers,
body,
timeout,
parseAs,
};
const fetchPromise = fetch(url, {
method,
headers,
body,
});
const response = await Promise.race([
fetchPromise,
timeoutPromise(timeout * 1000),
]);
const statusText = response.ok ? "OK" : "Error";
let data: any;
if (parseAs === "json") {
data = await response.json();
} else if (parseAs === "text") {
data = await response.text();
} else {
data = await response.arrayBuffer();
}
// debug API
const log = {
request: requestInfo,
response: {
data,
status: response.status,
statusText,
headers: response.headers,
},
};
addLog(log);
return log.response;
} catch (error) {
console.error("Request failed:", error);
// debug API
const log = {
request: {
url,
method,
headers,
body,
timeout,
parseAs,
},
error,
};
addLog(log);
throw error;
}
};

View File

@@ -1,14 +1,13 @@
import { invoke } from "@tauri-apps/api/core";
import {
ServerTokenResponse,
Server,
Connector,
DataSource,
GetResponse,
UploadAttachmentPayload,
UploadAttachmentResponse,
GetAttachmentPayload,
GetAttachmentByIdsPayload,
GetAttachmentResponse,
DeleteAttachmentPayload,
TranscriptionPayload,
@@ -16,7 +15,10 @@ import {
MultiSourceQueryResponse,
} from "@/types/commands";
import { useAppStore } from "@/stores/appStore";
import { useAuthStore } from "@/stores/authStore";
import {
getCurrentWindowService,
handleLogout,
} from "@/commands/windowService";
// Endpoints that don't require authentication
const WHITELIST_SERVERS = [
@@ -36,8 +38,9 @@ async function invokeWithErrorHandler<T>(
command: string,
args?: Record<string, any>
): Promise<T> {
const isCurrentLogin = useAuthStore.getState().isCurrentLogin;
if (!WHITELIST_SERVERS.includes(command) && !isCurrentLogin) {
const service = await getCurrentWindowService();
if (!WHITELIST_SERVERS.includes(command) && !service?.profile) {
console.error("This command requires authentication");
throw new Error("This command requires authentication");
}
@@ -64,18 +67,31 @@ async function invokeWithErrorHandler<T>(
}
}
// Server Data log
let parsedResult = result;
let logData = result;
if (typeof result === "string") {
parsedResult = JSON.parse(result);
logData = parsedResult;
}
infoLog({
username: "@/commands/servers.ts",
logName: command,
})(logData);
return result;
} catch (error: any) {
const errorMessage = error || "Command execution failed";
addError(command + ":" + errorMessage, "error");
// 401 Unauthorized
if (errorMessage.includes("Unauthorized")) {
handleLogout();
} else {
addError(command + ":" + errorMessage, "error");
}
throw error;
}
}
export function get_server_token(id: string): Promise<ServerTokenResponse> {
return invokeWithErrorHandler(`get_server_token`, { id });
}
export function list_coco_servers(): Promise<Server[]> {
return invokeWithErrorHandler(`list_coco_servers`);
}
@@ -146,14 +162,6 @@ export function mcp_server_search({
return invokeWithErrorHandler(`mcp_server_search`, { id, queryParams });
}
export function connect_to_server(id: string, clientId: string): Promise<void> {
return invokeWithErrorHandler(`connect_to_server`, { id, clientId });
}
export function disconnect(clientId: string): Promise<void> {
return invokeWithErrorHandler(`disconnect`, { clientId });
}
export function chat_history({
serverId,
from = 0,
@@ -221,54 +229,63 @@ export function open_session_chat({
export function cancel_session_chat({
serverId,
sessionId,
queryParams,
}: {
serverId: string;
sessionId: string;
queryParams?: Record<string, any>;
}): Promise<string> {
return invokeWithErrorHandler(`cancel_session_chat`, {
serverId,
sessionId,
});
}
export function new_chat({
serverId,
websocketId,
message,
queryParams,
}: {
serverId: string;
websocketId: string;
message: string;
queryParams?: Record<string, any>;
}): Promise<GetResponse> {
return invokeWithErrorHandler(`new_chat`, {
serverId,
websocketId,
message,
queryParams,
});
}
export function send_message({
export function chat_create({
serverId,
message,
attachments,
queryParams,
clientId,
}: {
serverId: string;
message: string;
attachments: string[];
queryParams?: Record<string, any>;
clientId: string;
}): Promise<GetResponse> {
return invokeWithErrorHandler(`chat_create`, {
serverId,
message,
attachments,
queryParams,
clientId,
});
}
export function chat_chat({
serverId,
websocketId,
sessionId,
message,
attachments,
queryParams,
clientId,
}: {
serverId: string;
websocketId: string;
sessionId: string;
message: string;
attachments: string[];
queryParams?: Record<string, any>;
clientId: string;
}): Promise<string> {
return invokeWithErrorHandler(`send_message`, {
return invokeWithErrorHandler(`chat_chat`, {
serverId,
websocketId,
sessionId,
message,
attachments,
queryParams,
clientId,
});
}
@@ -290,9 +307,7 @@ export const update_session_chat = (payload: {
export const assistant_search = (payload: {
serverId: string;
from: number;
size: number;
query?: Record<string, any>;
queryParams?: string[];
}): Promise<boolean> => {
return invokeWithErrorHandler<boolean>("assistant_search", payload);
};
@@ -323,10 +338,13 @@ export const upload_attachment = async (payload: UploadAttachmentPayload) => {
}
};
export const get_attachment = (payload: GetAttachmentPayload) => {
return invokeWithErrorHandler<GetAttachmentResponse>("get_attachment", {
...payload,
});
export const get_attachment_by_ids = (payload: GetAttachmentByIdsPayload) => {
return invokeWithErrorHandler<GetAttachmentResponse>(
"get_attachment_by_ids",
{
...payload,
}
);
};
export const delete_attachment = (payload: DeleteAttachmentPayload) => {
@@ -349,3 +367,7 @@ export const query_coco_fusion = (payload: {
...payload,
});
};
export const get_app_search_source = () => {
return invokeWithErrorHandler<void>("get_app_search_source");
};

View File

@@ -0,0 +1,73 @@
import { useConnectStore } from "@/stores/connectStore";
import { SETTINGS_WINDOW_LABEL } from "@/constants";
import platformAdapter from "@/utils/platformAdapter";
import { useAuthStore } from "@/stores/authStore";
import { useExtensionsStore } from "@/stores/extensionsStore";
export async function getCurrentWindowService() {
const currentService = useConnectStore.getState().currentService;
const cloudSelectService = useConnectStore.getState().cloudSelectService;
const windowLabel = await platformAdapter.getCurrentWindowLabel();
return windowLabel === SETTINGS_WINDOW_LABEL
? cloudSelectService
: currentService;
}
export async function setCurrentWindowService(service: any, isAll?: boolean) {
const { setCurrentService, setCloudSelectService } =
useConnectStore.getState();
// all refresh logout
if (isAll) {
setCloudSelectService(service);
return setCurrentService(service);
}
// current refresh
const windowLabel = await platformAdapter.getCurrentWindowLabel();
if (windowLabel === SETTINGS_WINDOW_LABEL) {
const { currentService } = useConnectStore.getState();
const {
aiOverviewServer,
setAiOverviewServer,
quickAiAccessServer,
setQuickAiAccessServer,
} = useExtensionsStore.getState();
if (currentService?.id === service.id) {
setCurrentService(service);
}
if (aiOverviewServer?.id === service.id) {
setAiOverviewServer(service);
}
if (quickAiAccessServer?.id === service.id) {
setQuickAiAccessServer(service);
}
return setCloudSelectService(service);
}
return setCurrentService(service);
}
export async function handleLogout(serverId?: string) {
const setIsCurrentLogin = useAuthStore.getState().setIsCurrentLogin;
const { serverList, setServerList } = useConnectStore.getState();
const service = await getCurrentWindowService();
const id = serverId || service?.id;
if (!id) return;
// Update the status first
setIsCurrentLogin(false);
if (service?.id === id) {
await setCurrentWindowService({ ...service, profile: null }, true);
}
const updatedServerList = serverList.map((server) =>
server.id === id ? { ...server, profile: null } : server
);
setServerList(updatedServerList);
}

View File

@@ -1,10 +1,8 @@
import { useRef } from "react";
import { Post } from "@/api/axiosRequest";
import platformAdapter from "@/utils/platformAdapter";
import { useConnectStore } from "@/stores/connectStore";
import { useAppStore } from "@/stores/appStore";
import { parseSearchQuery, SearchQuery, unrequitable } from "@/utils";
import { parseSearchQuery, unrequitable } from "@/utils";
interface AssistantFetcherProps {
debounceKeyword?: string;
@@ -15,8 +13,6 @@ export const AssistantFetcher = ({
debounceKeyword = "",
assistantIDs = [],
}: AssistantFetcherProps) => {
const isTauri = useAppStore((state) => state.isTauri);
const { currentService, currentAssistant, setCurrentAssistant } =
useConnectStore();
@@ -29,7 +25,7 @@ export const AssistantFetcher = ({
query?: string;
}) => {
try {
if (unrequitable()) {
if (await unrequitable()) {
return {
total: 0,
list: [],
@@ -43,7 +39,7 @@ export const AssistantFetcher = ({
query,
} = params;
const searchQuery: SearchQuery = {
const queryParams = parseSearchQuery({
from: (current - 1) * pageSize,
size: pageSize,
query: query ?? debounceKeyword,
@@ -52,38 +48,15 @@ export const AssistantFetcher = ({
enabled: true,
id: assistantIDs,
},
};
});
const queryParams = parseSearchQuery(searchQuery);
const body: Record<string, any> = {
const response = await platformAdapter.fetchAssistant(
serverId,
queryParams,
};
let response: any;
if (isTauri) {
if (!currentService?.id) {
throw new Error("currentService is undefined");
}
response = await platformAdapter.commands("assistant_search", body);
} else {
body.serverId = undefined;
const [error, res] = await Post(`/assistant/_search`, body);
if (error) {
throw new Error(error);
}
response = res;
}
queryParams
);
let assistantList = response?.hits?.hits ?? [];
console.log("assistantList", assistantList);
if (
!currentAssistant?._id ||
currentService?.id !== lastServerId.current

Some files were not shown because too many files have changed in this diff.