321 Commits

Author SHA1 Message Date
Steve Lau
7d0d11860c try to enable microphone access for signed macOS builds 2025-08-01 11:54:06 +08:00
ayangweb
d48d4af7d2 refactor: optimize upload shortcut display (#850) 2025-08-01 10:24:30 +08:00
ayangweb
876d14f9d9 refactor: optimize enter key display (#849) 2025-08-01 10:20:47 +08:00
ayangweb
a8e090c9be refactor: optimized sending messages (#848) 2025-08-01 09:36:06 +08:00
SteveLauC
c30df6cee0 feat: sub extension can set 'platforms' now (#847)
Before this commit, sub extensions were not allowed to set their
"platforms" field; this restriction is lifted in this commit.

By allowing this, a group extension can have sub extensions for
different platforms. Here is an example (irrelevant fields are omitted
for the sake of simplicity):

```json
{
  "name": "Suspend my machine",
  "type": "Group",
  "platforms": ["macos", "windows"],
  "commands": [
    {
      "name": "Suspend macOS":
      "platforms": ["macos"],
      "action": {...}
    },
    {
      "name": "Suspend Windows":
      "platforms": ["windows"],
      "action": {...}
    }
  ]
}
```

While loading or installing extensions, incompatible sub extensions will
be filtered out by Coco, e.g., you won't see the "Suspend Windows"
command if you are on macOS.

An extra check is added in this commit to ensure a sub extension won't
support platforms that are incompatible with its main extension.

Even though main extensions and sub extensions can both have "platforms"
specified, the semantics of this field, when not set, differ between them.
For main extensions, leaving it unset means the extension is compatible with
all the platforms supported by Coco (null == all).  For sub extensions,
leaving it unset implicitly means the field has the same value as the main
extension's "platforms" field.
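
In code, the resolution and the extra check amount to something like this
minimal Rust sketch (simplified, hypothetical types, not Coco's actual
implementation):

```rust
#[derive(Clone, PartialEq)]
enum Platform {
    Macos,
    Windows,
    Linux,
}

// A main extension with no "platforms" supports every platform Coco runs on.
fn main_platforms(main: &Option<Vec<Platform>>) -> Vec<Platform> {
    main.clone()
        .unwrap_or_else(|| vec![Platform::Macos, Platform::Windows, Platform::Linux])
}

// A sub extension with no "platforms" inherits its main extension's value.
fn sub_platforms(sub: &Option<Vec<Platform>>, main: &Option<Vec<Platform>>) -> Vec<Platform> {
    sub.clone().unwrap_or_else(|| main_platforms(main))
}

// The extra check: every platform a sub extension declares must also be
// supported by its main extension.
fn sub_is_compatible(sub: &[Platform], main: &[Platform]) -> bool {
    sub.iter().all(|p| main.contains(p))
}
```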

The primary reason behind this design is that if we chose the semantics used
by the main extension, treating null as all, all the extensions we currently
have would become invalid: they are all macOS-only, their main extension's
"platforms" field is ["macos"] and their sub extensions' "platforms" is not
set (null), so they would be equivalent to:

```json
{
  "name": "this is macOS-only",
  "type": "Group",
  "platforms": ["macos"],
  "commands": [
    {
      "name": "How the fxxk can this support all the platforms!"
      "platforms": ["macos", "windows", "linux"],
      "type": "Command",
      "action": {...}
    }
  ]
}
```
This hits exactly the check we mentioned earlier and will be rejected by
Coco.  If users have already installed these extensions, they would be
treated as invalid and rejected by future Coco releases, and boom, we break
backward compatibility.

Also, the current design actually makes sense.  Nobody wants to repeatedly
tell Coco that all the sub extensions support macOS when it can be said only
once:

```json
{
  "name": "this is macOS-only",
  "platforms": ["macos"],
  "commands": [
    {
      "name": "This supports macOS",
      "platforms": ["macos"]
    },
    {
      "name": "This supports macOS too",
      "platforms": ["macos"]
    },
    {
      "name": "Guess what! this also supports macOS",
      "platforms": ["macos"]
    },
    {
      "name": "Come on dude, do I really have to say platform=macos so many times",
      "platforms": ["macos"]
    }
  ]
}
```
2025-07-31 21:49:59 +08:00
BiggerRain
b833769c25 refactor: calling service related interfaces (#831)
* chore: server

* chore: add

* refactor: calling service related interfaces

* chore: server list

* chore: add

* chore: add

* update

* chore: remove logs

* docs: update notes

* docs: remove server doc

---------

Co-authored-by: ayang <473033518@qq.com>
2025-07-31 15:59:35 +08:00
ayangweb
855fb2a168 feat: support sending files in chat messages (#764)
* feat: support sending files in chat messages

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* update

* docs: update changelog
2025-07-31 15:36:03 +08:00
SteveLauC
d2735ec13b refactor: check Extension/plugin.json from all sources (#846)
Coco App has 4 sources of Extension/plugin.json that should be checked:

1. From the "<data directory>/third_party_extensions" directory
2. Imported via "Import Local Extension"
3. Downloaded from the "store/extension/<extension ID>/_download" API
4. From coco-extensions repository

   Granted, the Coco App won't check these files directly, but we will
   re-use the code and run the checks in that repository's CI.

Previously, only the Extensions from the first source were checked/validated.
This commit extracts the validation logic to a function and applies it to all
4 sources.

Also, the return value of the Tauri command "list_extensions()" has changed.
We no longer return a boolean indicating whether any invalid extensions
were found during loading; that only made sense when installing
extensions required users to manually edit data files. Since we now
support the extension store and local extension imports, it can be omitted.
2025-07-31 14:27:23 +08:00
SteveLauC
c40fc5818a chore: ignore tauri::AppHandle's generic argument R (#845)
This commit removes the generic argument R from all the AppHandle imports, which
is feasible as it has a default type. This change is made not only for simplicity,
but also **consistency**. Trait SearchSource uses this type:

```rust
pub trait SearchSource {
    async fn search(
        &self,
        tauri_app_handle: AppHandle,
        query: SearchQuery,
    ) -> Result<QueryResponse, SearchError>;
}
```

In order to make trait SearchSource object-safe, the AppHandle used in it cannot
contain generic arguments. So some parts of Coco already omit this generic
argument. This commit cleans up the remaining instances and unifies the usage
project-wide.
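
A minimal sketch of why omitting the argument is harmless, assuming the
default wry runtime that Tauri ships with:

```rust
use tauri::{AppHandle, Wry};

// AppHandle's runtime parameter defaults to the wry runtime, so these two
// signatures name the same type; omitting `R` is purely cosmetic.
fn with_default(_handle: AppHandle) {}
fn with_explicit(_handle: AppHandle<Wry>) {}
```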
2025-07-29 21:55:03 +08:00
SteveLauC
a553ebd593 feat: support Quicklink on Rust side (#760)
This commit implements support for Quicklink on the Rust side. We still
need the frontend part to make this complete.
2025-07-29 16:30:12 +08:00
BiggerRain
232166eb89 chore: delete unused code files and dependencies (#841)
Mainly deletes the unused WebSocket content, along with other unused code files and dependencies
2025-07-29 13:02:28 +08:00
Medcl
99144950d9 Revert "chore: add macos config for tauri (#837)" (#840)
This reverts commit ee45d21bbe.
2025-07-29 11:04:53 +08:00
SteveLauC
32d4f45144 feat: support installing local extensions (#749)
This commit adds support for installing extensions from a local folder path:

```text
extension-directory/
├── assets/
│   ├── icon.png
│   └── other-assets...
└── plugin.json
```

Useful for testing and development of extensions before publishing.

Co-authored-by: ayang <473033518@qq.com>
2025-07-29 10:26:47 +08:00
BiggerRain
6bc78b41ef chore: web component loading font icon (#838)
* chore: web component loading font icon

* docs: update notes
2025-07-28 19:03:40 +08:00
SteveLauC
cd54beee04 refactor: split query_coco_fusion() (#836)
This commit splits query_coco_fusion() into 2 functions:

1. query_coco_fusion_single_query_source()
2. query_coco_fusion_multi_query_sources()

query_coco_fusion_single_query_source(), as the name suggests, will only search
1 query source. Due to this simplicity, it does not need the complex re-ranking
procedure used by query_coco_fusion_multi_query_sources(), which is the primary
reason why this commit was made.

Another reason behind the change is that the re-ranking logic makes the
search results of querying a single query source incorrect: it removes
documents from the results. I didn't investigate the issue because dropping
the complex logic in single query source search is the best solution here.
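
A minimal sketch of the resulting dispatch, with simplified placeholder
types rather than Coco's actual ones:

```rust
// Hypothetical simplified types, used only to illustrate the split.
struct QuerySource;
struct SearchQuery;
struct QueryResponse;

async fn query_coco_fusion_single_query_source(
    _source: &QuerySource,
    _query: &SearchQuery,
) -> QueryResponse {
    // Single source: return its hits directly, no re-ranking needed.
    QueryResponse
}

async fn query_coco_fusion_multi_query_sources(
    _sources: &[QuerySource],
    _query: &SearchQuery,
) -> QueryResponse {
    // Multiple sources: merge and re-rank the hits before returning them.
    QueryResponse
}

async fn query_coco_fusion(sources: Vec<QuerySource>, query: SearchQuery) -> QueryResponse {
    if sources.len() == 1 {
        query_coco_fusion_single_query_source(&sources[0], &query).await
    } else {
        query_coco_fusion_multi_query_sources(&sources, &query).await
    }
}
```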
2025-07-28 17:29:17 +08:00
ayangweb
ee45d21bbe chore: add macos config for tauri (#837) 2025-07-28 16:35:11 +08:00
ayangweb
4709f8c660 feat: enhance ui for skipped version (#834) 2025-07-28 11:43:10 +08:00
SteveLauC
4696aa1759 test: test extract_build_number() (#835)
This commit adds a test for extract_build_number(), which I forgot to do
in commit 067fb7144f6[1].

[1]: 067fb7144f
2025-07-28 11:42:50 +08:00
ayangweb
924fc09516 fix: fix issue with update check failure (#833)
* fix: fix issue with update check failure

* docs: update changelog
2025-07-28 10:06:07 +08:00
SteveLauC
5a700662dd chore: release notes for 0.7.1 (#832) 2025-07-28 10:00:12 +08:00
BiggerRain
8f992bfa92 chore: bump version number to 0.7.1 (#830) 2025-07-27 17:26:08 +08:00
BiggerRain
e7dd27c744 chore: add toggle_move_to_active_space_attribute (#829)
* chore: add toggle_move_to_active_space_attribute

* chore: pin

* chore: add

* update

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-27 16:50:11 +08:00
ayangweb
7914836c3e fix: correct enter key behavior (#828) 2025-07-27 11:52:40 +08:00
BiggerRain
b37bf1f7c7 chore: bump version number to 0.7.0 (#827) 2025-07-25 19:54:33 +08:00
BiggerRain
419d9d55c5 chore: web component remove server name (#826) 2025-07-25 18:16:07 +08:00
BiggerRain
d3ed54c771 chore: web component add notification component (#825)
* chore: web component add notification component

* docs: update notes
2025-07-25 18:15:49 +08:00
ayangweb
8f26dbcbe6 refactor: optimize subpage shortcut context menu (#822)
* refactor: optimize subpage shortcut context menu

* update

* update
2025-07-25 16:43:41 +08:00
ayangweb
663873ae14 refactor: optimize carriage return copying (#823) 2025-07-25 16:43:05 +08:00
SteveLauC
286b1be212 fix: panic on Ubuntu (GNOME) when opening apps (#821)
On Ubuntu (the GNOME version), Coco would panic when users open an app
because Coco thinks it is running in an unsupported desktop
environment (DE).

We rely on the environment variable XDG_CURRENT_DESKTOP to detect the DE,
Ubuntu sets this variable to "ubuntu:GNOME" instead of just "GNOME",
which was not handled by the previous implementation.

This commit handles this case. Also, when Coco runs in an unsupported DE,
opening apps should not panic the app; after this commit, we return
an error instead.
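
A sketch of a more tolerant detection, based on the fact that
XDG_CURRENT_DESKTOP is a colon-separated list (not Coco's exact code):

```rust
use std::env;

// Returns true for values such as "GNOME", "ubuntu:GNOME", "GNOME-Flashback:GNOME".
fn is_gnome() -> bool {
    env::var("XDG_CURRENT_DESKTOP")
        .map(|value| {
            value
                .split(':')
                .any(|part| part.eq_ignore_ascii_case("gnome"))
        })
        .unwrap_or(false)
}
```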
2025-07-25 15:32:48 +08:00
ayangweb
37221782b0 refactor: optimize shortcut key triggering (#820) 2025-07-25 14:54:32 +08:00
ayangweb
644e291105 fix: fix update window config sync (#818)
* fix: fix update window config sync

* docs: update changelog
2025-07-25 14:47:20 +08:00
BiggerRain
aae6984aa7 fix: re-search data initialization (#817) 2025-07-25 14:43:27 +08:00
ayangweb
dbd296d399 fix: fix enter key on subpages (#819)
* fix: fix enter key on subpages

* docs: update changelog
2025-07-25 14:43:16 +08:00
ayangweb
e2ad25967d fix: fix ctrl+k not working (#815) 2025-07-25 14:30:03 +08:00
ayangweb
21b61d80d8 refactor: optimize method calls for checking for updates (#814) 2025-07-25 13:42:12 +08:00
ayangweb
9f4c693ac4 refactor: optimize line breaks in input boxes (#813) 2025-07-25 12:36:07 +08:00
BiggerRain
45c27cac56 chore: cancel interface param (#816) 2025-07-25 12:16:23 +08:00
BiggerRain
e46035afd4 fix:the client id is the same (#812)
* chore: add

* fix: client id
2025-07-25 11:25:22 +08:00
BiggerRain
1004bb73f4 chore: delay the chat monitoring event (#811) 2025-07-24 20:03:30 +08:00
BiggerRain
d664fa7271 chore: handle reply to message (#799)
* chore: add reply to message

* chore: handle rust data

* log

* chore: id

* feat: add

* chore: loading step

* chore: cur id

* feat: add

* accept query parameters

* chore: add message id for cancel

* chore: remove log

* chore: remove log

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-24 18:06:59 +08:00
SteveLauC
067fb7144f refactor: use custom version comparator to determine if we should update (#810) 2025-07-24 16:05:36 +08:00
ayangweb
579f91f3aa refactor: refactor version update check (#809) 2025-07-24 11:56:57 +08:00
ayangweb
abe2aecedf fix: fix multiline input issue (#808) 2025-07-24 10:58:57 +08:00
SteveLauC
e8f9a4e627 chore: log querysources to search only when querysource is not set (#807) 2025-07-24 09:39:29 +08:00
ayangweb
22b1558e8b refactor: optimized data fetching for secondary pages (#803) 2025-07-23 18:56:56 +08:00
SteveLauC
ca3b514a65 fix: panic caused by "state() called before manage()" (#806)
This commit fixes the following panic:

```
Time: [2025-07-23-17-03-23]
Location: [/Users/steve/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/tauri-2.5.1/src/lib.rs:742:7]
Message: [state() called before manage() for tauri_plugin_global_shortcut::GlobalShortcut<tauri_runtime_wry::Wry<tauri::EventLoopMessage>>]
```

The root cause is that, in a Tauri application, before you can access a piece of
managed state with the .state() method, you must first register it with Tauri
using .manage(). When a user registers a hotkey for an extension,
initializing extensions will invoke the .state() method, and at that point,
.manage() hasn't been called yet.

The fix is simple: we call .manage() earlier (it is invoked by our
`shortcut::enable_shortcut(app)` function).
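
A minimal sketch of the ordering requirement, with a hypothetical state type
standing in for the global-shortcut state:

```rust
use tauri::{AppHandle, Manager};

// Hypothetical state type, used only for illustration.
struct GlobalShortcutState;

fn setup_shortcuts(app: &AppHandle) {
    // manage() must run before any state() call for this type, otherwise
    // Tauri panics with "state() called before manage()".
    app.manage(GlobalShortcutState);
    let _shortcuts = app.state::<GlobalShortcutState>();
}
```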
2025-07-23 18:56:16 +08:00
SteveLauC
c694c4eda9 chore: display backtrace in panic log (#805)
Having a backtrace in the panic log helps debugging a lot. In release
builds, we strip our binary so symbol information is unavailable, but the
backtrace is still useful in debug builds.

Panic log in release builds:

```
Time: [YYYY-MM-DD-HH-MM-SS]
Location: [/Users/foo/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/tauri-2.5.1/src/lib.rs:742:7]
Message: [state() called before manage() for tauri_plugin_global_shortcut::GlobalShortcut<tauri_runtime_wry::Wry<tauri::EventLoopMessage>>]
Backtrace:
   0: __mh_execute_header
   1: __mh_execute_header
   2: __mh_execute_header
   3: __mh_execute_header
   4: __mh_execute_header
   5: __mh_execute_header
   6: __mh_execute_header
   7: __mh_execute_header
   8: __mh_execute_header
   9: __mh_execute_header
  10: __mh_execute_header
  11: __mh_execute_header
  12: __mh_execute_header
  13: __mh_execute_header
  14: __mh_execute_header
  15: __mh_execute_header
  16: __mh_execute_header
  17: <unknown>
  18: <unknown>
```
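
For reference, a minimal sketch of a panic hook that captures such a
backtrace (not Coco's exact hook):

```rust
use std::backtrace::Backtrace;

fn install_panic_hook() {
    std::panic::set_hook(Box::new(|info| {
        // Capture the backtrace regardless of RUST_BACKTRACE; symbols only
        // resolve usefully when the binary is not stripped (debug builds).
        let backtrace = Backtrace::force_capture();
        eprintln!("Message: [{info}]");
        eprintln!("Backtrace:\n{backtrace}");
    }));
}
```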
2025-07-23 17:00:48 +08:00
ayangweb
ac835c76aa fix: fix shortcut issue in windows context menu (#804)
* fix: fix shortcut issue in windows context menu

* docs: update changelog
2025-07-23 16:20:46 +08:00
SteveLauC
25bbab7432 refactor: clean up unsupported characters from query string in Win Search (#802)
We found that Windows Search errors out if it encounters a single
quote character. The natural solution would be to escape it, but I couldn't
find out how. The approach mentioned by most posts:

```
~="<Unsupported Char>"
```

didn't work in my tests, so I decided to replace it with a whitespace.

The single quote is not the first unsupported character we have encountered;
the newline character is not supported either, so it is handled in the same
way.
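
A sketch of what this clean-up amounts to (simplified, not the exact helper
in Coco):

```rust
// Replace characters Windows Search cannot handle with spaces.
fn sanitize_windows_search_query(query: &str) -> String {
    query.replace('\'', " ").replace('\n', " ")
}
```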
2025-07-23 16:13:15 +08:00
ayangweb
cca00e944e fix: fix selection issue after renaming (#800) 2025-07-23 13:59:33 +08:00
SteveLauC
e78fe4ac89 fix: broken windows search (#801)
This commit fixes the search issue introduced by [commit](5c0a865822). We have no idea why the tauri command `get_app_search_source` won't be invoked after that commit on Windows.

This commit resolves the issue by moving the extension init logic to the Rust side.

Also, update the query source logs in `query_coco_fusion()`: the old log said nothing if the query source list was empty, the new one reports that case.
2025-07-23 12:33:18 +08:00
Medcl
60fd79f1fa fix: increase read_timeout for HTTP streaming stability (#798) 2025-07-22 18:44:27 +08:00
BiggerRain
5c0a865822 chore: not request the interface if not logged in (#795)
* chore: not request the interface if not logged in

* chore: res

* chore: res

* chore: common interface

* chore: no login

* chore: login

* chore: login

* chore: add

* dbg print servers

* chore: id

* docs: update notes

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-22 16:15:58 +08:00
SteveLauC
5b50e4b51b ci: add Rust code format check to CI (#797)
This commit adds the Rust code format check to our CI.
2025-07-22 15:11:13 +08:00
SteveLauC
b97386a827 refactor: avoid GLOBAL_TAURI_APP_HANDLE if possible (#796)
This commit fixes the Windows panic issue. 

Coco panicked because it accessed `GLOBAL_TAURI_APP_HANDLE` when this global variable wasn't initialized. I removed all the uses of this variable except for the one use in `src-tauri/src/server/http_client.rs`, which I don't have a good way to refactor.

If you are wondering why this didn't happen in the past: the access was triggered by the frontend code, and something there likely changed. Regardless, this global variable is still dangerous and error-prone, so we should avoid it.

Also, this commit fixes the issue that the panic hook does not work on Windows because the log filename contains ":", which is not allowed by the Windows file system.
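
For instance, a filename-safe timestamp (a sketch assuming the chrono crate;
Coco may format it differently) avoids ':' entirely:

```rust
use chrono::Local;

// e.g. "2025-07-22-14-43-27": valid in a Windows filename, unlike a
// timestamp that contains ':'.
fn log_file_timestamp() -> String {
    Local::now().format("%Y-%m-%d-%H-%M-%S").to_string()
}
```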
2025-07-22 14:43:27 +08:00
SteveLauC
29aa26af94 chore: add a panic hook to catch panic msg (#793) 2025-07-22 10:34:27 +08:00
BiggerRain
3650d9914c fix: enter key problem (#794)
* fixed: enter key problem

* docs: update notes

* fix: enter key problem
2025-07-22 10:13:08 +08:00
SteveLauC
f26031047c fix: refreshing Coco server should register it to SearchSource (#792) 2025-07-22 08:51:57 +08:00
BiggerRain
c8719926be chore: add 401 unauthorized (#791) 2025-07-21 22:21:07 +08:00
BiggerRain
f1dfc5c730 fixed: chat message confusion (#782)
* fix: chat

* fix: chat

* chore: add session id

* fix: fixed incorrect taskbar icon display on linux (#783)

* fix: fixed incorrect taskbar icon display on linux

* docs: update changelog

* fix: fix data inconsistency issue on secondary pages (#784)

* chore: chat

* chore: chat

* chore: add logging message

* chore: chat

* chore: chat

* chore: add

* feat: add

* chore: chat end

* style: message width

---------

Co-authored-by: ayangweb <75017711+ayangweb@users.noreply.github.com>
Co-authored-by: medcl <m@medcl.net>
2025-07-21 21:17:20 +08:00
SteveLauC
74ed642a42 refactor: tighten up Coco servers state management (#790)
* refactor: tighten up Coco servers state management

* ignore unused warnings

* log out if the failed request has status 401
2025-07-21 20:39:16 +08:00
ayangweb
5a17173620 fix: incorrect status when installing extension (#789)
* fix: incorrect status when installing extension

* docs: update changelog
2025-07-21 18:17:30 +08:00
SteveLauC
29d14ff931 chore: remove unused type ServerTokenResponse (#788)
After commit [1], type `ServerTokenResponse` became unused; remove
it as well.

[1]: 57ab08fb6d
2025-07-21 15:30:26 +08:00
ayangweb
ad01504766 refactor: decouple window switch services to ensure they operate independently (#786) 2025-07-20 17:26:15 +08:00
SteveLauC
57ab08fb6d chore: remove unused tauri cmd get_server_token (#787)
I found this tauri command while reading the code, and then realized that
token management logic should all be kept in the backend; there is no
need to expose it to the frontend. And indeed, searching for it in the
frontend code showed that it is not used at all.

```sh
$ cd src

$ rg get_server_token
commands/servers.ts
75:export function get_server_token(id: string): Promise<ServerTokenResponse> {
76:  return invokeWithErrorHandler(`get_server_token`, { id });
```

So remove it.
2025-07-20 17:25:32 +08:00
ayangweb
db5c09f80c fix: fix data inconsistency issue on secondary pages (#784) 2025-07-20 10:54:51 +08:00
ayangweb
b1e2c6961d fix: fixed incorrect taskbar icon display on linux (#783)
* fix: fixed incorrect taskbar icon display on linux

* docs: update changelog
2025-07-20 10:08:11 +08:00
BiggerRain
3f4abe51e5 fix: web component server list error (#781)
* chore: update app

* fix: web component server list error

* feat: add

* chore: remove default version
2025-07-19 17:07:11 +08:00
ayangweb
060c09e11c fix: resolved minor issues with voice playback (#780)
* fix: resolved minor issues with voice playback

* docs: update changelog

* update
2025-07-19 14:25:19 +08:00
ayangweb
657df482bf fix: correct incorrect assistant display when quick ai access (#779)
* fix: correct incorrect assistant display when quick ai access

* docs: update changelog
2025-07-19 13:54:39 +08:00
ayangweb
f4f7732927 refactor: show specific values in shortcut key conflict tips (#778)
* refactor: show specific values in shortcut key conflict tips

* update

* update

* update

* update

* update

* update

* update
2025-07-19 11:05:17 +08:00
ayangweb
5e536e1444 refactor: separate user agreement and privacy policy links (#777) 2025-07-19 10:24:29 +08:00
ayangweb
2b48cdf84a refactor: add border-radius to extended categories (#776) 2025-07-19 10:08:04 +08:00
BiggerRain
bc37616506 chore: search-chat add language and formatUrl parameters (#775)
* chore: add language

* build: build web

* docs: update notes
2025-07-19 09:34:38 +08:00
ayangweb
07bcd80776 refactor: invoke language update logic earlier (#774) 2025-07-18 16:44:43 +08:00
SteveLauC
7b8b396368 fix: indexing apps does not respect search scope config (#773)
This commit fixes the issue that indexing applications does not
respect the search scope configuration; it always uses the default
values.
2025-07-18 16:26:34 +08:00
ayangweb
823a95d601 fix: restore missing category titles on subpages (#772) 2025-07-18 16:25:44 +08:00
ayangweb
af0b98a41b refactor: rebuild app index with improved suggestions (#771) 2025-07-18 16:15:28 +08:00
SteveLauC
7d0e7cd7dc fix: unregister ext hotkey when it gets deleted (#770)
This commit fixes the bug that when an extension gets uninstalled, its
registered hotkey won't be cleared.
2025-07-18 13:20:41 +08:00
ayangweb
e56d6b1b60 refactor: close the file upload port (#769) 2025-07-18 10:45:05 +08:00
BiggerRain
941cf96a07 style: splash adapts to the width of mobile phones (#768)
* style: splash width style

* docs: update notes
2025-07-17 15:33:24 +08:00
SteveLauC
14fbf2ac5d refactor: do status code check before deserializing response (#767)
* refactor: do status code check before deserializing response

This commit adds a status code check to the following requests; only when
this check passes do we deserialize the response JSON body:

- get_connectors_by_server
- mcp_server_search
- datasource_search

A helper function `status_code_check(response, allowed_status_codes)`
is added to make refactoring easier.
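
A rough sketch of what such a helper could look like, assuming reqwest (the
real signature and error type may differ):

```rust
use reqwest::{Response, StatusCode};

// Return an error unless the response status is one of the allowed codes.
fn status_code_check(
    response: &Response,
    allowed_status_codes: &[StatusCode],
) -> Result<(), String> {
    let status = response.status();
    if allowed_status_codes.contains(&status) {
        Ok(())
    } else {
        Err(format!("unexpected status code: {status}"))
    }
}
```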

* chore: release notes
2025-07-17 15:08:14 +08:00
SteveLauC
494e2f0d8a chore: Coco app http request headers (#744)
Add the following HTTP headers when making HTTP requests:

- X-OS-NAME
- X-OS-VER
- X-OS-ARCH
- X-APP-NAME
- X-APP-VER
- X-APP-LANG
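
A minimal sketch of attaching such headers, assuming reqwest is the HTTP
client (the values below are placeholders, not what Coco actually sends):

```rust
use reqwest::header::{HeaderMap, HeaderValue};

// HTTP header names are case-insensitive; lowercase is used here because
// that is how the http crate stores them internally.
fn default_headers() -> HeaderMap {
    let mut headers = HeaderMap::new();
    headers.insert("x-os-name", HeaderValue::from_static("macos"));
    headers.insert("x-os-ver", HeaderValue::from_static("15.0"));
    headers.insert("x-os-arch", HeaderValue::from_static("aarch64"));
    headers.insert("x-app-name", HeaderValue::from_static("coco"));
    headers.insert("x-app-ver", HeaderValue::from_static("0.7.1"));
    headers.insert("x-app-lang", HeaderValue::from_static("en"));
    headers
}
```

Such a map could then be installed once via `reqwest::Client::builder().default_headers(...)`.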
2025-07-17 11:31:19 +08:00
BiggerRain
e3a3849fa4 chore: search-chat components add formatUrl & think data & icons url (#765)
* chore: web components add formatUrl & think data

* chore: add headers

* chore: add

* chore: add server url

* docs: update notes

* chore: url

* docs: search chat docs
2025-07-17 09:22:23 +08:00
SteveLauC
0b5e31a476 chore(deps): bump the windows crate (#766)
This commit bumps the windows crate from "0.60.0" to "0.61.3"; it should
solve the CI issue that happened here[1]:

```text
error[E0277]: `DBOBJECT` doesn't implement `Debug`
     --> C:\Users\runneradmin\.cargo\registry\src\index.crates.io-1949cf8c6b5b557f\windows-0.60.0\src\Windows\Win32\System\Search\mod.rs:21828:5
      |
21826 | #[derive(Clone, Debug, PartialEq)]
      |                 ----- in this derive macro expansion
21827 | pub struct SSVARIANT_0_4 {
21828 |     pub dbobj: DBOBJECT,
      |     ^^^^^^^^^^^^^^^^^^^ the trait `Debug` is not implemented for `DBOBJECT`
      |
      = note: add `#[derive(Debug)]` to `DBOBJECT` or manually `impl Debug for DBOBJECT`
```

[1]: https://github.com/infinilabs/ci/actions/runs/16314479643/job/46076989290
2025-07-16 17:10:32 +08:00
SteveLauC
c8a723ed9d feat: file search for Windows (#762)
This commit implements the file search extension for Windows platforms using the [Windows Search](https://learn.microsoft.com/en-us/windows/win32/search/-search-3x-wds-qryidx-overview) functionality.

Something to note:

1. Searching by file content is not natively supported. Coco searches all the columns (attributes/fields within the index) when this option is used:

```rust
        SearchBy::NameAndContents => {
            // Windows File Search does not support searching by file content.
            //
            // `CONTAINS('query_string')` would search all columns for `query_string`,
            // this is the closest solution we have.
            format!("((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))")
        }
```

2. Tests have been added, but they failed in our CI for unknown reasons so I disabled them:

```rust
// Skip these tests in our CI, they fail with the following error 
// "SQL is invalid: "0x80041820""
// 
// I have no idea about the underlying root cause
#[cfg(all(test, not(ci)))]
mod test {
```

3. The Windows Search index is not real-time and can return obsolete results. Opening the returned documents could fail if the chosen file has been deleted or moved.
2025-07-16 09:11:53 +08:00
ayangweb
aaf4bf2737 refactor: update the font icon link (#763) 2025-07-15 09:10:26 +08:00
BiggerRain
24b0123a61 docs: add deep wiki docs (#761) 2025-07-11 17:22:18 +08:00
ayangweb
e8bd970cdb refactor: updated the upload endpoint for attachments (#759) 2025-07-10 18:20:32 +08:00
ayangweb
dd3be3a819 refactor: refactored file icon retrieval logic (#757)
* refactor: refactored file icon retrieval logic

* update

* update

* update
2025-07-10 18:10:39 +08:00
Medcl
5b034c28ac chore: make optional fields optional (#758)
* chore: make optional fields optional

* chore: update docs
2025-07-10 18:06:05 +08:00
ayangweb
b17949fe29 refactor: enabling the upload file component (#755)
* refactor: enabling the upload file component

* update
2025-07-10 17:26:44 +08:00
SteveLauC
5d37420109 feat: tauri command get_file_icon() (#756) 2025-07-10 16:51:34 +08:00
ayangweb
1d3ceb0c70 refactor: remove speech-to-text shortcuts (#754) 2025-07-10 13:58:37 +08:00
BiggerRain
4d11afe18e chore: assistant params & styles (#753)
* chore: add

* chore: add

* chore: assistant params & styles

* docs: update notes
2025-07-10 11:47:10 +08:00
SteveLauC
0c0291c8c0 chore: rename QuickLink/quick_link to Quicklink/quicklink (#752)
* chore: rename QuickLink/quick_link to Quicklink/quicklink

Standardize variable naming to match the correct terms: "Quicklink" and "quicklink".
This updates all incorrect variants such as "QuickLink" and "quick_link".

* chore: release notes
2025-07-10 10:18:57 +08:00
ayangweb
cca672b2cb feat: text to speech now powered by LLM (#750)
* feat: support text to speech

* chore: receive bytes stream

* chore: update testing code

* feat: mp3 play

* update

* docs: update changelog

* update

* update

* update

---------

Co-authored-by: medcl <m@medcl.net>
Co-authored-by: rain9 <15911122312@163.com>
2025-07-10 10:16:51 +08:00
BiggerRain
5b27488402 refactor: adjusted assistant, datasource, mcp_server interface parameters (#746)
* chore: handle mcp interface parameters

* docs: update notes

* chore: remove code

* chore: assistant params

* fix: assistant params

* docs: update notes
2025-07-10 09:48:42 +08:00
SteveLauC
c1c4e0db7b chore: bump dep applications-rs (#751)
* chore: bump dep applications-rs

Currently Coco depends on atty v0.2.14, a crate that has a
[vulnerability](https://github.com/infinilabs/coco-app/security/dependabot/25);
here is the dependency chain:

```
coco -> applications-rs -> freedesktop-file-parser 0.1.0 -> atty 0.2.14
```

I bumped the [`freedesktop-file-parser`](7bdb070e45)
crate in our applications-rs crate, which would eliminate the `atty` crate
from the chain to fix the vulnerability.

This commit bumps the applications-rs crate to include the above change.

* chore: release notes
2025-07-09 18:52:17 +08:00
ayangweb
074a7c8b0a fix: prevent window from hiding when moved on Windows (#748)
* fix: prevent window from hiding when moved on Windows

* docs: update changelog

* update
2025-07-09 16:30:41 +08:00
SteveLauC
bc524e19db refactor: adjust extension code hierarchy (#747)
* refactor: adjust extension code hierarchy

In this commit, I refactored the extension code structure.

* We can only install third-party extensions so the `store.rs` file should
  belong to the `third_party` directory.

* Move tauri command `uninstall_extension()` to `extension/mod.rs` from
  `third_party.rs`, since an extension can be uninstalled regardless of
  how it was installed.

* Refactor the `install_extension_from_store()` function, add more
  descriptive code comments.

Also, a trivial change, bump Rust toolchain and edition to use the
[let-chains](https://blog.rust-lang.org/2025/06/26/Rust-1.88.0/#let-chains) syntax.

* chore: release notes
2025-07-09 16:28:59 +08:00
SteveLauC
05f70d26d9 chore: replace meval-rs with our fork to clear dep warning (#745)
* chore: replace meval-rs with our fork to clear dep warning

This commit replaces the meval-rs dependency with our
[fork](https://github.com/infinilabs/meval-rs). The original meval-rs
crate has not been maintained for a long time and uses nom 1.0, a crate
released 9 years ago that will be rejected by future Rust compilers
because it contains outdated Rust syntax. This is why we are seeing the
following warning:

```
warning: the following packages contain code that will be rejected by a future version of Rust: nom v1.2.4
note: to see what the problems were, use the option `--future-incompat-report`, or run `cargo report future-incompatibilities --id 1
```

Switching to our fork would solve this warning.

* chore: release notes
2025-07-08 15:39:58 +08:00
SteveLauC
ab26dc7c6a fix(file search): searching by name&content does not search file name (#743)
* fix(file search): searching by name&content does not search file name

* release note
2025-07-08 09:21:43 +08:00
BiggerRain
6ff6b46139 refactor: create chat & send chat api (#739)
* chore: code format

* fix: build error

* refactor: chat create & chat

* chore: aa

* chore: aa

* refactor: send chat messages

* chore: chat

* chore: web

* chore: add

* docs: update notes
2025-07-07 19:41:29 +08:00
SteveLauC
119fd87a25 fix(file search): apply filters before from/size parameters (#741) 2025-07-07 19:40:46 +08:00
SteveLauC
de226a8fa4 ci: compile-check rust code & run rust tests when Rust code changes (#742)
Run some basic Rust checks in our CI iff rust code changes
2025-07-07 18:14:25 +08:00
SteveLauC
6865957725 chore: icon support for more file types (#740)
This PR adds icon support for more types of files; see the code for the full file type list.

Co-authored-by: ayang <473033518@qq.com>
2025-07-02 16:27:44 +08:00
SteveLauC
87818d69ed refactor: change File Search ext type to extension (#738)
* refactor: change File Search ext type to extension

* chore: release notes
2025-07-02 10:45:54 +08:00
SteveLauC
38b67d01b8 refactor: prioritize stat(2) when checking if a file is dir (#737)
* refactor: prioritize stat(2) when checking if a file is dir

* chore: release notes
2025-07-02 10:00:33 +08:00
ayangweb
a4f4a24730 feat: voice input support in both search and chat modes (#732)
* feat: voice input support in both search and chat modes

* docs: update changelog

* update

* update

* update

* update
2025-07-02 09:35:16 +08:00
BiggerRain
87bd3d020f fix: build error (#736) 2025-07-02 07:03:09 +08:00
SteveLauC
825ac5d565 feat: file search using spotlight (#705)
Co-authored-by: ayang <473033518@qq.com>
2025-07-01 19:19:16 +08:00
BiggerRain
f21a35e15d fix: update information storage cache and styles (#735) 2025-07-01 15:46:37 +08:00
BiggerRain
6e90b28204 style: extension icon styles (#734) 2025-07-01 13:44:44 +08:00
Hardy
e92e5e5158 chore: typo step name and env (#731)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:40:26 +08:00
Hardy
2ac81566c6 Fix run shell (#730)
* fix: windows platform run with shell

* chore: add rust target

* fix: fix app version and release body

* chore: update step id

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:29:52 +08:00
Hardy
b004670dec fix: windows platform run with shell (#729)
* fix: windows platform run with shell

* chore: add rust target

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:11:51 +08:00
Hardy
a426e33e6b fix: feature dependency local path (#728)
* fix: feature dependency local path

* chore: use build args from env

* chore: remove no use step

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 13:16:54 +08:00
Hardy
bb7dd6bf7c fix: build error on windows platform with cargo add git repo (#727)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 12:13:46 +08:00
BiggerRain
37c5f2de24 fix: tray not on display (#726) 2025-06-30 10:53:40 +08:00
SteveLauC
ab6c25fe96 chore: release notes for 0.6.0 (#725) 2025-06-29 17:42:32 +08:00
BiggerRain
1fb464df09 fix: open extension store display (#724) 2025-06-29 17:38:14 +08:00
SteveLauC
65aa75043f chore: bump version number to 0.6.0 (#723) 2025-06-29 17:08:06 +08:00
BiggerRain
79dcc7b4ec fix: text display error (#722)
* fix: text  display error

* fix: text  display error

* fix: select extension display install

* fix: select extension display install
2025-06-29 16:47:02 +08:00
BiggerRain
3d29cfe235 chore: rebuild index position (#721) 2025-06-29 15:53:43 +08:00
BiggerRain
aea3a7ba98 chore: rebuild index (#720)
* chore: rebuild index

* chore: rebuild index
2025-06-29 15:39:01 +08:00
BiggerRain
190dfc6ecd chore: adjust styles and add button reindex (#719)
* chore: adjust styles and add button reindex

* docs: update notes

* style: remove margin bottom
2025-06-29 13:32:07 +08:00
SteveLauC
316a7940d6 chore: log command execution results (#718)
* chore: log command execution results

* release note
2025-06-29 10:46:47 +08:00
SteveLauC
acfc1bb32d feat: interface reindex_applications() (#704)
* feat: impl re-indexing applications

* drop pizza engine
2025-06-29 10:27:02 +08:00
ayangweb
c4d178dc2d feat: support back navigation via delete key (#717)
* feat: support back navigation via delete key

* docs: update changelog
2025-06-27 19:17:27 +08:00
ayangweb
6333c697d5 refactor: support large preview for extensions (#716) 2025-06-27 17:30:49 +08:00
ayangweb
810541494f refactor: update extension detail page ui (#715) 2025-06-27 15:07:34 +08:00
ayangweb
e45dc2acbe fix: context menu search not working (#713) 2025-06-27 14:18:54 +08:00
ayangweb
2d1ccb9744 refactor: improve layout of the extension list (#714) 2025-06-27 14:18:32 +08:00
SteveLauC
406f3b31e9 chore: change extension store request URL to default coco server (#712) 2025-06-27 10:40:12 +08:00
ayangweb
f51dd81014 refactor: optimized some issues with extensions (#711) 2025-06-27 10:22:51 +08:00
SteveLauC
3b38cbfb6c chore: update category name and icon (#710) 2025-06-27 10:16:27 +08:00
ayangweb
a4483ba277 fix: some input fields couldn’t accept spaces (#709)
* fix: some input fields couldn’t accept spaces

* docs: update changelog

* update
2025-06-27 10:16:02 +08:00
ayangweb
bf46979b80 refactor: remove special character filtering and clean up related code (#708) 2025-06-27 10:08:33 +08:00
ayangweb
070f171ad4 refactor: update context menu color for the delete action (#707) 2025-06-27 09:43:35 +08:00
ayangweb
3180704a0d refactor: show all extensions by default in the extension store (#706) 2025-06-27 09:36:20 +08:00
SteveLauC
b3f68697ce feat: impl extension store (#699)
Implements extension store so that users can install extensions from a GUI interface


---------

Co-authored-by: ayang <473033518@qq.com>
2025-06-26 18:40:33 +08:00
BiggerRain
69d2b4b834 chore: add message for latest version check (#703)
* chore: add message for latest version check

* docs: update notes
2025-06-25 10:38:38 +08:00
BiggerRain
6837286061 feat: add manual check for updates (#701)
* feat: add check for update

* feat: add Check for Updates

* docs: update notes

* build: build bundle test

* docs: update notes

* chore: recovering files
2025-06-19 20:58:54 +08:00
ayangweb
a431ead22a feat: support Tab and Enter for delete dialog buttons (#700)
* feat: support `Tab` and `Enter` for delete dialog buttons

* docs: update changelog

* refactor: update
2025-06-19 08:59:01 +08:00
ayangweb
7ec41dfe80 refactor: request data when service is available (#698) 2025-06-18 15:47:49 +08:00
ayangweb
06053e9fd9 refactor: getting service info only when a profile is available (#697)
* refactor: getting service info only when a profile is available

* refactor: update
2025-06-18 14:47:21 +08:00
Medcl
70b048fba3 fix: take coco server back on refresh (#696)
* fix: take coco server back on refresh

* chore: update release notes:
2025-06-18 13:33:59 +08:00
ayangweb
45083f829b refactor: optimized the style of the drop-down selection box (#695)
* refactor: optimized the style of the drop-down selection box

* refactor: update
2025-06-17 18:15:40 +08:00
SteveLauC
e4f6fb8e98 fix: toggle extension should register/unregister hotkey (#691) 2025-06-17 16:56:06 +08:00
BiggerRain
ee182b22da chore: keeping windows and documents safe (#694) 2025-06-17 15:39:18 +08:00
BiggerRain
a37e22c227 fix: quick ai state synchronous (#693)
* fix: quick ai state synchronous

* docs: update notes
2025-06-17 15:38:39 +08:00
BiggerRain
d75ab1018d chore: improve server list selection with enter key (#692)
* chore: server list enter selected

* docs: update notes

* chore: remove log
2025-06-17 09:36:04 +08:00
Medcl
40ad066e69 refactor: refactoring search api (#679)
* refactor: refactoring search api

* chore: interface type

* chore: interface type

* refactor: assistant search

* refactor: arrays into multiple fields

* refactor: update

* feat: search to add fuzziness to 5

* refactor: update

* chore: update release notes

---------

Co-authored-by: rain9 <15911122312@163.com>
Co-authored-by: ayang <473033518@qq.com>
Co-authored-by: ayangweb <75017711+ayangweb@users.noreply.github.com>
2025-06-17 09:31:43 +08:00
BiggerRain
a2a5a9f8fe chore: continue to chat page display (#690)
* chore: Continue to chat page display

* docs: update notes
2025-06-16 18:02:47 +08:00
SteveLauC
5fd9339e56 refactor: use author/ext_id as extension unique identifier (#643)
* refactor: use author/ext_id as extension unique identifier

* refactor: refactoring extended component interfaces and logic

* refactor: update

* style: remove console

* refactor: update

* drop pizza engine

* refactor: restore hotkey upon start no matter if the ext is enabled or not

* chore: release note

---------

Co-authored-by: ayang <473033518@qq.com>
2025-06-16 10:52:01 +08:00
Hardy
a8a9208b1f fix: no make target with project (#689)
* fix: no make with project

* chore: set working directory

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-13 22:17:37 +08:00
medcl
8c9a2ff441 v0.5.0 2025-06-13 19:28:38 +08:00
Medcl
2251b0af95 chore: update release notes (#687) 2025-06-13 18:37:47 +08:00
BiggerRain
560a12ab93 fix: search & chat display (#686) 2025-06-13 18:18:46 +08:00
ayangweb
2ff66c0b91 fix: arrow inserting escape sequences (#683)
* fix: arrow inserting escape sequences

* fix build

* docs: update changelog

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-06-13 18:06:21 +08:00
ayangweb
ef4a184233 refactor: optimize the operation of the small assistant on the secondary page (#685)
* refactor: optimize the operation of the small assistant on the secondary page

* refactor: update
2025-06-13 16:13:31 +08:00
ayangweb
8422bc03e7 refactor: optimize the timing of arrow key triggers on secondary pages (#684) 2025-06-13 15:52:20 +08:00
BiggerRain
370113129c fix: web component start page (#681) 2025-06-13 15:17:52 +08:00
ayangweb
cb758ef452 feat: context menu support for secondary pages (#680)
* feat: context menu support for secondary pages

* docs: update changelog
2025-06-13 15:07:05 +08:00
ayangweb
12b9b4bb81 refactor: blocking the default behavior of the tab key (#678)
* refactor: blocking the default behavior of the tab key

* refactor: update

* refactor: update

* refactor: update
2025-06-13 14:19:27 +08:00
BiggerRain
562db19f16 fix: filter services for unlogged-in users (#677) 2025-06-13 11:04:58 +08:00
ayangweb
dc5cd9aecb fix: fix problem with up and down key indexing (#676)
* fix: fix problem with up and down key indexing

* refactor: update

* docs: update changelog
2025-06-13 10:39:27 +08:00
BiggerRain
0b018cd24f chore: search & deep think & mcp (#675)
* fix: keep line breaks

* chore: search & deep think & mcp
2025-06-12 22:06:48 +08:00
BiggerRain
2ed22d3d7c fix: keep line breaks (#674) 2025-06-12 18:20:44 +08:00
BiggerRain
4ce9561eb7 style: safari styles (#673) 2025-06-12 14:50:02 +08:00
BiggerRain
3aeb39b3af refactor: optimize global state synchronization (#672)
* refactor: optimize global state synchronization

* refactor: reconstruct the language change processing logic

---------

Co-authored-by: ayang <473033518@qq.com>
2025-06-12 14:45:33 +08:00
BiggerRain
27e99d4629 fix: web assistant list (#671) 2025-06-12 11:28:10 +08:00
ayangweb
df70276a54 refactor: ai assistant hides the copy menu (#670)
* refactor: ai assistant hides the copy menu

* style: remove console
2025-06-12 10:39:37 +08:00
BiggerRain
6553a8f5d3 chore: add special character filtering (#668)
* chore: add special character filtering

* docs: update notes
2025-06-12 10:31:15 +08:00
ayangweb
4ebbc9ec6e refactor: improved ai overview and ai quick access blank issue (#669)
* refactor: improved ai overview and ai quick access blank issue

* refactor: update
2025-06-12 10:30:41 +08:00
BiggerRain
4208633556 fix: Fix Special Character input (#667) 2025-06-11 17:50:39 +08:00
ayangweb
fc43fbe798 refactor: improve AI assistant interaction logic and Tab key handling (#666)
* refactor: improve AI assistant interaction logic and Tab key handling

* refactor: update

* style: remove
2025-06-11 17:49:05 +08:00
ayangweb
b5bb9105d4 refactor: re-enable the service to get a list of assistants (#665) 2025-06-11 16:28:53 +08:00
BiggerRain
b6ebd6e5f8 fix: web component display (#663)
* fix: web component display

* fix: web component display

* fix: add showChatHistory & connected

* fix: add isCurrentLogin

* chore: add history
2025-06-11 16:28:43 +08:00
ayangweb
22216491b6 refactor: dynamically generated copy button id (#664) 2025-06-11 15:52:49 +08:00
ayangweb
44ca66259c refactor: don't hide pinned window on search result open (#662)
* refactor: don't hide pinned window on search result open

* refactor: update
2025-06-11 15:26:06 +08:00
ayangweb
be3cae36e2 fix: number keys not following settings (#661)
* fix: number keys not following settings

* refactor: remove unused `modifierKey` dependencies

* docs: update changelog
2025-06-11 14:15:32 +08:00
ayangweb
35ea30626f refactor: improve tooltip display in chinese (#660) 2025-06-11 14:01:08 +08:00
BiggerRain
4bcae5cffb fix: delete history (#659) 2025-06-11 13:36:21 +08:00
BiggerRain
76458db8ab chore: remove enter disabled (#658) 2025-06-11 12:10:37 +08:00
BiggerRain
5b41e190d3 chore: add i18n to services (#657) 2025-06-11 11:03:42 +08:00
ayangweb
43ac9a054c refactor: remove the behavior that organizes event bubbling (#656) 2025-06-11 10:14:05 +08:00
BiggerRain
ac485a32cc style: user message styles (#655) 2025-06-10 19:25:54 +08:00
ayangweb
e10908a095 refactor: optimize the timing of the enter key (#654)
* refactor: optimize the timing of the enter key

* fix: remove input element

---------

Co-authored-by: rain <15911122312@163.com>
2025-06-10 19:01:25 +08:00
BiggerRain
78b8908ac8 fix: stop event bubbling (#653) 2025-06-10 18:22:54 +08:00
ayangweb
3c54cb84a8 refactor: filter unavailable servers (#652) 2025-06-10 17:37:57 +08:00
ayangweb
8ed808c591 fix: fix the problem of local path not opening (#650)
* fix: fix the problem of local path not opening

* docs: update changelog

* chore: remove pizza-engine
2025-06-10 17:26:19 +08:00
ayangweb
7a2dde7448 refactor: check if the message block is purely blank (#651) 2025-06-10 17:22:13 +08:00
BiggerRain
65451fc63e style: user message line break (#648) 2025-06-10 15:41:08 +08:00
BiggerRain
5d108a46d3 style: differentiate between hover and selected styles (#649) 2025-06-10 15:37:17 +08:00
BiggerRain
f9567c2d46 chore: remove default current service (#647) 2025-06-10 14:54:07 +08:00
BiggerRain
da917e6012 fix: web page unmount event (#645)
* fix: web page unmount event

* docs: update notes
2025-06-10 14:28:00 +08:00
ayangweb
335a906674 refactor: refactoring shortcut reset logic and optimizing UI interactions (#646) 2025-06-10 14:27:23 +08:00
ayangweb
a50a636d59 fix: input lost when reopening dialog after search (#644)
* fix: input lost when reopening dialog after search

* docs: update changelog
2025-06-10 11:45:45 +08:00
ayangweb
2dd3f776e6 fix: arrow keys still navigated search when menu opened with Cmd+K (#642)
* fix: arrow keys still navigated search when menu opened with `Cmd+K`

* docs: update changelog
2025-06-10 09:56:27 +08:00
BiggerRain
40f6aa0ccd chore: copy supports http protocol (#639)
* chore: copy supports http protocol

* docs: update notes
2025-06-09 18:12:43 +08:00
ayangweb
4da9e024e0 refactor: update login status when service is not enabled (#638) 2025-06-09 18:11:35 +08:00
ayangweb
c20bba51f5 fix: tab key hides window in chat mode (#641)
* fix: tab key hides window in chat mode

* docs: update changelog
2025-06-09 18:10:56 +08:00
BiggerRain
0a62a2095b fix: add shift line break to chat input (#637) 2025-06-09 15:06:59 +08:00
SteveLauC
5677995185 chore: more logs for the setup process (#634)
* chore: more logs for the setup process

* chore: more logs for the setup process

* chore: more logs for the setup process

* chore: release note
2025-06-09 14:46:06 +08:00
BiggerRain
ec4e5e7d1d fix: remove stopImmediatePropagation event (#636) 2025-06-09 12:05:27 +08:00
BiggerRain
1df5265b1a chore: add onContextMenu event (#629) 2025-06-09 11:57:48 +08:00
ayangweb
fb8a4684dc refactor: improved page content after disabling the service (#635)
* refactor: improved page content after disabling the service

* style: remove useless code

* style: remove useless code
2025-06-09 11:54:44 +08:00
BiggerRain
0b609e570d chore: web component default mode (#627) 2025-06-09 09:54:09 +08:00
BiggerRain
f91f6bdc17 fix: web component set IsDark (#630) 2025-06-07 10:49:16 +08:00
ayangweb
57590f3b57 feat: add internationalized translations of AI-related extensions (#632)
* feat: add internationalized translations of AI-related extensions

* docs: update changelog

* refactor: update
2025-06-07 10:48:55 +08:00
ayangweb
c18f9ea154 refactor: optimized input box logic for transparency (#628) 2025-06-06 17:58:18 +08:00
ayangweb
441875d9b4 refactor: optimize data filtering logic (#626) 2025-06-06 17:20:45 +08:00
ayangweb
eddf9075bb feat: add ai overview minimum number of search results configuration (#625)
* feat: add ai overview minimum number of search results configuration

* docs: update changelog

* style: remove useless code
2025-06-06 17:05:20 +08:00
ayangweb
9eac8f8a8e feat: support right-click actions after text selection (#624)
* feat: support right-click actions after text selection

* docs: update changelog

* feat: support for selecting messages sent by users
2025-06-06 16:43:27 +08:00
ayangweb
515260c43f feat: calculator extension add description (#623)
* feat: calculator extension add description

* docs: update changelog
2025-06-06 15:43:24 +08:00
ayangweb
118de0e80b fix: fix ai overview hidden height before message (#622)
* fix: fix ai overview hidden height before message

* docs: update changelog
2025-06-06 15:30:42 +08:00
SteveLauC
19ce896fdc chore: release note for PR 620 (#621) 2025-06-06 15:17:59 +08:00
SteveLauC
4a41ea5d8b fix: invalid DSL error if input contains multiple lines (#620) 2025-06-06 14:58:45 +08:00
ayangweb
880e1206ce fix: fixed modifier keys not working with continue chat (#619)
* fix: fixed modifier keys not working with continue chat

* docs: update changelog
2025-06-06 14:24:36 +08:00
SteveLauC
1e6d9f9550 fix: do not panic when the datasource specified does not exist (#618)
* fix: do not panic when the datasource specified does not exist

* release note
2025-06-06 14:07:27 +08:00
BiggerRain
ff0faf425f fix: only select history and then set the assistant (#617)
* fix: only select history and then set the assistant

* fix: only select history and then set the assistant
2025-06-06 14:06:49 +08:00
ayangweb
1fbf5d6552 fix: resolved an issue where number keys were not working on the web (#616)
* fix: resolved an issue where number keys were not working on the web

* docs: update changelog
2025-06-06 11:47:38 +08:00
ayangweb
db41e817c3 feat: add key monitoring during reset (#615)
* feat: add key monitoring during reset

* docs: update changelog
2025-06-06 11:23:40 +08:00
BiggerRain
1296755bc5 fix: datasource and mcp data updates (#614) 2025-06-06 11:11:33 +08:00
ayangweb
d410f20864 refactor: remove footer from standalone history window (#613) 2025-06-06 11:11:06 +08:00
ayangweb
61d0a3b79a fix: fix chat log update and sorting issues (#612)
* fix: fix chat log update and sorting issues

* docs: update changelog
2025-06-06 10:52:47 +08:00
BiggerRain
b24319b649 fix: datasource refresh status feedback (#611) 2025-06-06 10:51:31 +08:00
BiggerRain
3c0fb24548 fix: shortcut key prompts cannot be hidden (#610) 2025-06-06 10:51:09 +08:00
BiggerRain
2fcbed0381 fix: i18n is not accurate (#609) 2025-06-06 10:50:36 +08:00
SteveLauC
7444347e0c docs: new doc for macOS (#608) 2025-06-05 19:23:14 +08:00
SteveLauC
725ce042de docs: remove the hyperlink in title (#607) 2025-06-05 18:26:09 +08:00
BiggerRain
3b67de5387 chore: initialize current assistant from history (#606)
* chore: the last assistant in history is set as current

* docs: update notes

* docs: update notes
2025-06-05 08:54:39 +08:00
SteveLauC
9b53a026ff refactor: execute Calculator/Extension search() in spawn_blocking (#601) 2025-06-04 18:45:17 +08:00
ayangweb
9ea7dbf3aa fix: resolve regex error on older macOS versions (#605)
* fix: resolve regex error on older macOS versions

* docs: update changelog

* style: remove useless code

* style: remove useless code
2025-06-04 18:38:34 +08:00
BiggerRain
55622911ac style: Switch selected color in dark mode (#604) 2025-06-04 14:10:17 +08:00
BiggerRain
92f78ad08c fix: new chat assistant id not found (#603)
* fix: new chat assistant id

* docs: update notes
2025-06-04 13:06:30 +08:00
ayangweb
f690dbaab2 refactor: web use the default icon for now (#602) 2025-06-04 11:30:59 +08:00
ayangweb
210efe763d fix: fixed issue with incorrect login status (#600)
* fix: fixed issue with incorrect login status

* style: remove useless code

* fix: user avatar error

* refactor: replace with default svg icon

* style: remove useless code

* docs: update changelog

---------

Co-authored-by: rain <15911122312@163.com>
2025-06-04 10:24:56 +08:00
BiggerRain
f23498afa0 fix: web icon isAbsolute (#599) 2025-06-03 19:28:26 +08:00
BiggerRain
a80a5d928f fix: app icon load console error (#598) 2025-06-03 15:47:58 +08:00
ayangweb
b733bb5516 feat: ai overview support is enabled with shortcut (#597)
* feat: ai overview support is enabled with shortcut

* docs: update changelog
2025-06-03 15:01:29 +08:00
ayangweb
5046754534 refactor: optimized loading of font icons on the web side (#596)
* refactor: optimized loading of font icons on the web side

* refactor: update
2025-06-03 11:22:22 +08:00
SteveLauC
f557f7e780 chore: set log level to coco_lib=trace for built Coco app (#595) 2025-06-03 11:18:28 +08:00
BiggerRain
18feb2d690 fix: set chat message assistant (#594) 2025-06-03 10:53:01 +08:00
BiggerRain
af59f2fe9f fix: web component removes redundant parameters (#593) 2025-06-03 10:35:26 +08:00
BiggerRain
5e1bb54d5e chore: web component adds variable process (#592) 2025-06-03 10:12:22 +08:00
Hardy
33fa516aad fix: rustup for i686 (#590)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-01 07:19:28 +08:00
Hardy
d2c1cf513d chore: use version fix (#591)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 20:22:53 +08:00
Hardy
f81bec8403 chore: rollback publish (#589)
* chore: rollback publish

* chore: set toolchain

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 16:29:08 +08:00
medcl
cce956ac15 v0.5.2 2025-05-31 16:06:12 +08:00
Hardy
0d1174c8dd chore: fix ci publish error (#588)
* chore: fix ci publish error

* docs: update release notes

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 16:05:28 +08:00
ayangweb
e0258dc2fa fix: fixed issue with quick ai access making multiple requests at once (#586) 2025-05-31 15:56:35 +08:00
medcl
310a70838b v0.5.1 2025-05-31 15:55:33 +08:00
Hardy
94d7f809d2 chore: add ssh private key for pizza engine (#587)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 15:51:20 +08:00
medcl
e1d1bc2684 v0.5.0 2025-05-31 15:01:02 +08:00
Medcl
a9e3bb3eee chore: ignore throttle message (#585) 2025-05-31 11:07:01 +08:00
Medcl
d184851e3b chore: remove icon field before ask ai (#584) 2025-05-31 10:03:19 +08:00
BiggerRain
c9b785ccf3 fix: sent chat once more (#583) 2025-05-31 08:53:37 +08:00
Medcl
4c5ae8c718 chore: update error handling (#582)
* chore: update error handling

* chore: update min osx version
2025-05-31 08:50:27 +08:00
Hardy
8a7f7bc708 chore: add pizza feature for release (#581)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-30 22:28:44 +08:00
ayangweb
3d44d10048 refactor: remove unused disabledExtensions related code (#580) 2025-05-30 19:41:51 +08:00
BiggerRain
97d880ea27 fix: useScript error (#579) 2025-05-30 19:41:29 +08:00
Medcl
6c53056edd chore: update default coco server (#578) 2025-05-30 19:27:41 +08:00
ayangweb
a6fd2ebd16 fix: fix web carriage return not jumping (#577) 2025-05-30 18:41:58 +08:00
SteveLauC
b509176572 fix: make extension search source respect parameter datasource (#576) 2025-05-30 18:39:09 +08:00
ayangweb
17f2bcf7a8 fix: fix the problem that web cannot click on the jump (#575) 2025-05-30 18:22:18 +08:00
ayangweb
c471a83821 feat: support third party extensions (#572)
* refactor: support third party extensions

* fix tests

* fix: assistant_get error

* aaa

* bbb

* ccc

* ddd

* fix: aa

* fix: aa

* sss

* fix:asds

* eee

* refactor: loosen restriction of query string length

* fix: input auto

* feat: add ai overview trigger condition configuration

* refactor: continue chatting to select the corresponding mini-helper

* chore: settings width height

* aaa

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
Co-authored-by: rain <15911122312@163.com>
2025-05-30 17:18:52 +08:00
SteveLauC
51b0a2a545 refactor: remove thread app list synchronizer as it leaks memory on macOS (#573) 2025-05-29 17:55:24 +08:00
BiggerRain
baded2af1e refactor: search result related components (#571)
* refactor: search result related components

* refactor: search result related components

* docs: update notes

* refactor: search result related components

* fix: ArrowLeft error

* chore: remove log

* fix: ask ai
2025-05-29 16:01:52 +08:00
BiggerRain
2b21426355 refactor: input box related components (#568)
* refactor: input box components

* chore: change variable name

* docs: update notes

* fix: shortcut key failure issue
2025-05-28 12:29:28 +08:00
BiggerRain
8edc938426 chore: only show available servers in chat (#570)
* chore: add server available

* docs: update notes

* docs: update notes
2025-05-28 10:51:25 +08:00
Medcl
fa919bee11 chore: mark unavailable server to offline on refresh info (#569)
* chore: mark server offline on refresh info

* chore: update release notes
2025-05-28 10:43:53 +08:00
Medcl
50f1e611c3 refactor: refactoring rerank feature (#567)
* refactor: refactoring rerank feature

* chore: remove unused code

* chore: pull back unrelated changes
2025-05-27 18:27:53 +08:00
BiggerRain
4c3cf28012 chore: assistant chat placeholder & refactor input box components (#566)
* chore: input placeholder

* chore: add assistant

* impl assistant_get_multi()

* chore: add assistant

* refactor: input box components

* chore: ask ai search placeholder

* chore: ask ai search placeholder

* docs: update notes

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-05-27 16:29:43 +08:00
BiggerRain
89fcc67222 fix: assistant list (#563)
* fix: assistant list

* fix: assistant list

* fix: assistant list

* fix: assistant list
2025-05-27 09:24:58 +08:00
Medcl
33c9ce67df chore: remove pizza deps (#565) 2025-05-27 09:09:17 +08:00
SteveLauC
c6dadfd83e ci: deny dep pizza-engine (#564)
* ci: deny dep pizza-engine

* ci: set PWD to cargo workspace
2025-05-27 08:59:46 +08:00
Medcl
e707a8b5c7 chore: rerank support ignore case (#562)
* chore: rerank support ignore case

* chore: remove unused deps
2025-05-26 19:24:01 +08:00
BiggerRain
5c5364974a chore: web component start page config (#560)
* chore: web component start page config

* chore: web component start page config

* docs: update notes
2025-05-26 18:54:33 +08:00
Medcl
9d3e3e8dde feat: rerank search results (#561)
* feat: rerank search results

* chore: update release notes
2025-05-26 18:54:06 +08:00
BiggerRain
e065ba749f chore: assistant keyboard events and mouse events (#559)
* chore: assistant keyboard events and mouse events

* docs: update notes
2025-05-26 15:44:05 +08:00
ayangweb
2dd8e3160c fix: resolved navigation error on continue chat action (#558)
* fix: resolved navigation error on continue chat action

* docs: update changelog
2025-05-26 10:56:29 +08:00
ayangweb
6aeecfe3ac feat: add quick AI access to search mode (#556)
* feat: add quick AI access to search mode

* feat: add AI assistant quick access

* refactor: adjusting lodash-es import location to optimize code structure

* docs: update changelog

* fix: fix the logic of assigning serverId in AskAi component

* refactor: optimized layout

* refactor: optimized some issues
2025-05-23 18:14:41 +08:00
SteveLauC
334e29d69b chore: add make cmd dev-build-with-pizza (#555) 2025-05-23 16:43:38 +08:00
BiggerRain
382f89ace0 fix: independent chat app has no datasources (#554)
* fix: independent chat window has no data

* docs: update notes
2025-05-23 16:42:35 +08:00
BiggerRain
32c7cc5060 fix: suggestion list position (#553)
* fix: suggestion List position

* docs: update notes
2025-05-23 15:31:27 +08:00
BiggerRain
c13151d69e fix: the scroll button is not displayed by default (#552)
* fix: the scroll button is not displayed by default

* docs: update notes
2025-05-23 14:53:57 +08:00
BiggerRain
07c4ab03b5 fix: secondary page cannot be searched (#551)
* fix: secondary page cannot be searched

* docs: update notes
2025-05-22 19:45:28 +08:00
BiggerRain
cf3f2affa5 fix: history list height (#550)
* fix: history list height

* docs: update notes
2025-05-22 16:28:11 +08:00
BiggerRain
401832ad43 chore: logout update server profile (#549)
* chore: logout update server profile

* docs: update notes
2025-05-22 11:53:23 +08:00
Medcl
6a6f48d2fc chore: mark server offline on user logout (#546)
* chore: mark server offline on user logout

* update release notes
2025-05-22 11:37:20 +08:00
BiggerRain
8a6c90d124 chore: add global login judgment (#544)
* chore: add global login judgment

* docs: update notes
2025-05-22 10:59:46 +08:00
BiggerRain
34acecbcb0 chore: add assistant count (#542)
* fix: switch server assistant and session session unchanged

* docs: update notes

* fix: add server error

* chore: add assistant count

* docs: update notes
2025-05-21 15:29:04 +08:00
SteveLauC
4474212b7d chore: dead code cleanup (#543) 2025-05-21 14:40:38 +08:00
Medcl
1187b641d4 refactor: refactoring search error (#541)
* refactor: refactoring search error

* chore: update release notes
2025-05-21 14:27:17 +08:00
BiggerRain
ef8cd569e4 fix: switch server assistant and session session unchanged (#540)
* fix: switch server assistant and session session unchanged

* docs: update notes
2025-05-21 11:34:03 +08:00
BiggerRain
5ef06bfc95 fix: service switching error (#539)
* fix: service switching error

* build: build error

* chore: chat content can be copied

* docs: update notes

* fix: service switching error

* chore: change to send cancel event to ws_cancel

* chore: add ws-cancel

---------

Co-authored-by: medcl <m@medcl.net>
2025-05-21 09:04:57 +08:00
SteveLauC
2b59addb08 fix: panic when fetching app metadata on Windows (#538)
* fix: panic when fetching app metadata on Windows

* release note
2025-05-21 09:04:08 +08:00
BiggerRain
ee750620f2 refactor: service info related components (#537)
* refactor: service info related components

* docs: update notes

* refactor: chat header service status
2025-05-20 17:02:10 +08:00
Medcl
acc3b1a0d2 chore: skip register server that not logged in (#536)
* chore: update logging message

* chore: skip register server that not logged in

* chore: update logging message

* chore: update release notes
2025-05-20 15:10:27 +08:00
SteveLauC
4372747014 feat: dynamic log level via env var COCO_LOG (#535) 2025-05-20 12:54:07 +08:00
BiggerRain
ee531209aa fix: server image loading failure (#534)
* fix: server image loading failure

* docs: update notes
2025-05-20 09:31:54 +08:00
BiggerRain
ee0bbce3e2 style: search error styles (#533)
* style: search error styles

* docs: update notes
2025-05-19 19:54:34 +08:00
SteveLauC
7eccf99f92 fix: do not pass whitespace-only strings to Calculator expr evaluation lib (#532) 2025-05-19 19:24:32 +08:00
SteveLauC
5044a98bb7 fix: app hotkey handler invoked twice (key pressed and released) (#531) 2025-05-19 18:40:44 +08:00
SteveLauC
72165812bf refactor: ignore the error happens while indexing a specific app (#530)
* refactor: ignore the error happens while indexing a specific app

* refactor: ignore the error happens while indexing a specific app
2025-05-19 17:28:13 +08:00
BiggerRain
f9c1be8517 fix: app icon & category icon (#529) 2025-05-19 17:24:51 +08:00
BiggerRain
71ce23ef21 style: history component styles (#528)
* style: history component styles

* docs: update notes

* build: build & publish web component version 1.2.1

* build: build & publish web component version 1.2.2
2025-05-19 16:56:00 +08:00
Medcl
3e6041cbd8 chore: update minimum macOS version to 10 (#527) 2025-05-18 15:06:06 +08:00
SteveLauC
0b9e158b55 fix: panic caused by an unwrap() (#526) 2025-05-17 18:44:17 +08:00
BiggerRain
688ced3fc3 build: build & publish web component (#524) 2025-05-17 16:53:17 +08:00
BiggerRain
16e0382a8b docs: update release notes (#525) 2025-05-17 16:52:26 +08:00
BiggerRain
91c9cd5725 fix: show only enabled datasource & MCP list (#523)
* fix: show only enabled datasource & MCP list

* docs: update notes

* fix: show only enabled datasource & MCP list
2025-05-17 12:01:18 +08:00
ayangweb
7f3e602bb3 feat: add a component for text reading aloud (#522)
* feat: add a component for text reading aloud

* docs: update changelog
2025-05-16 16:21:57 +08:00
BiggerRain
5e9d41ea5c fix: datasource & MCP list synchronization update (#521)
* fix: datasource & MCP list update

* docs: update notes

* docs:update notes
2025-05-16 15:09:51 +08:00
Medcl
8bdb93d813 refactor: refactoring icon component (#514)
* chore: try to fix icon for insecure-tls deployment

* chore: handling icon resource loading errors

* refactor: refactored icon component

* chore: update release notes

---------

Co-authored-by: rain <15911122312@163.com>
2025-05-16 12:03:43 +08:00
ayangweb
690e6a3225 refactor: optimizing list styles in markdown content (#520)
* refactor: optimizing list styles in markdown content

* docs: update changelog

* style: remove useless code
2025-05-16 10:21:41 +08:00
ayangweb
111d9bddca style: remove useless code (#519) 2025-05-16 09:17:41 +08:00
ayangweb
7645b3e736 feat: add AI summary component (#518)
* feat: add AI summary component

* docs: update changelog

* refactor: update
2025-05-15 18:27:17 +08:00
311 changed files with 21796 additions and 8970 deletions

.env

@@ -1,5 +1,3 @@
COCO_SERVER_URL=http://localhost:9000 #https://coco.infini.cloud #http://localhost:9000
COCO_WEBSOCKET_URL=ws://localhost:9000/ws #wss://coco.infini.cloud/ws #ws://localhost:9000/ws
#TAURI_DEV_HOST=0.0.0.0


@@ -0,0 +1,18 @@
name: Enforce no dependency pizza-engine
on:
pull_request:
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name:
working-directory: ./src-tauri
run: |
# if cargo remove pizza-engine succeeds, then it is in our dependency list, fail the CI pipeline.
if cargo remove pizza-engine; then exit 1; fi


@@ -9,10 +9,16 @@ on:
jobs:
create-release:
runs-on: ubuntu-latest
outputs:
APP_VERSION: ${{ steps.get-version.outputs.APP_VERSION }}
RELEASE_BODY: ${{ steps.get-changelog.outputs.RELEASE_BODY }}
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set output
id: vars
run: echo "tag=${GITHUB_REF#refs/*/}" >> $GITHUB_OUTPUT
@@ -22,11 +28,28 @@ jobs:
with:
node-version: 20
- name: Get build version
shell: bash
id: get-version
run: |
PACKAGE_VERSION=$(jq -r '.version' package.json)
CARGO_VERSION=$(grep -m 1 '^version =' src-tauri/Cargo.toml | sed -E 's/.*"([^"]+)".*/\1/')
if [ "$PACKAGE_VERSION" != "$CARGO_VERSION" ]; then
echo "::error::Version mismatch!"
else
echo "Version match: $PACKAGE_VERSION"
fi
echo "APP_VERSION=$PACKAGE_VERSION" >> $GITHUB_OUTPUT
- name: Generate changelog
id: create_release
run: npx changelogithub --draft --name ${{ steps.vars.outputs.tag }}
id: get-changelog
run: |
CHANGELOG_BODY=$(npx changelogithub --draft --name ${{ steps.vars.outputs.tag }})
echo "RELEASE_BODY<<EOF" >> $GITHUB_OUTPUT
echo "$CHANGELOG_BODY" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
env:
GITHUB_TOKEN: ${{ secrets.RELEASE_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
build-app:
needs: create-release
@@ -52,11 +75,23 @@ jobs:
target: "x86_64-unknown-linux-gnu"
- platform: "ubuntu-22.04-arm"
target: "aarch64-unknown-linux-gnu"
env:
APP_VERSION: ${{ needs.create-release.outputs.APP_VERSION }}
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Checkout dependency repository
uses: actions/checkout@v4
with:
repository: 'infinilabs/pizza'
ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
submodules: recursive
ref: main
path: pizza
- name: Setup node
uses: actions/setup-node@v4
with:
@@ -65,17 +100,31 @@ jobs:
with:
version: latest
- name: Install rust target
run: rustup target add ${{ matrix.target }}
- name: Install dependencies (ubuntu only)
if: startsWith(matrix.platform, 'ubuntu-22.04')
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
- name: Install Rust stable
run: rustup toolchain install stable
- name: Add Rust build target
working-directory: src-tauri
shell: bash
run: |
rustup target add ${{ matrix.target }} || true
- name: Add pizza engine as a dependency
working-directory: src-tauri
shell: bash
run: |
BUILD_ARGS="--target ${{ matrix.target }}"
if [[ "${{matrix.target }}" != "i686-pc-windows-msvc" ]]; then
echo "Adding pizza engine as a dependency for ${{matrix.platform }}-${{matrix.target }}"
( cargo add --path ../pizza/lib/engine --features query_string_parser,persistence )
BUILD_ARGS+=" --features use_pizza_engine"
else
echo "Skipping pizza engine dependency for ${{matrix.platform }}-${{matrix.target }}"
fi
echo "BUILD_ARGS=${BUILD_ARGS}" >> $GITHUB_ENV
- name: Rust cache
uses: swatinem/rust-cache@v2
@@ -90,8 +139,8 @@ jobs:
- name: Install app dependencies and build web
run: pnpm install --frozen-lockfile
- name: Build the app
- name: Build the coco at ${{ matrix.platform}} for ${{ matrix.target }} @ ${{ env.APP_VERSION }}
uses: tauri-apps/tauri-action@v0
env:
CI: false
@@ -107,8 +156,8 @@ jobs:
APPLE_TEAM_ID: ${{ secrets.APPLE_TEAM_ID }}
with:
tagName: ${{ github.ref_name }}
releaseName: Coco ${{ needs.create-release.outputs.APP_VERSION }}
releaseBody: ""
releaseName: Coco ${{ env.APP_VERSION }}
releaseBody: "${{ needs.create-release.outputs.RELEASE_BODY }}"
releaseDraft: true
prerelease: false
args: --target ${{ matrix.target }}
args: ${{ env.BUILD_ARGS }}

.github/workflows/rust_code_check.yml vendored Normal file

@@ -0,0 +1,61 @@
name: Rust Code Check
on:
pull_request:
# Only run it when Rust code changes
paths:
- 'src-tauri/**'
jobs:
check:
strategy:
matrix:
platform: [ubuntu-latest, windows-latest, macos-latest]
runs-on: ${{ matrix.platform }}
steps:
- uses: actions/checkout@v4
- name: Checkout dependency (pizza-engine) repository
uses: actions/checkout@v4
with:
repository: 'infinilabs/pizza'
ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
submodules: recursive
ref: main
path: pizza
- name: Install dependencies (ubuntu only)
if: startsWith(matrix.platform, 'ubuntu-latest')
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
- name: Add pizza engine as a dependency
working-directory: src-tauri
shell: bash
run: cargo add --path ../pizza/lib/engine --features query_string_parser,persistence
- name: Format check
working-directory: src-tauri
shell: bash
run: |
rustup component add rustfmt
cargo fmt --all --check
- name: Check compilation (Without Pizza engine enabled)
working-directory: ./src-tauri
run: cargo check
- name: Check compilation (With Pizza engine enabled)
working-directory: ./src-tauri
run: cargo check --features use_pizza_engine
- name: Run tests (Without Pizza engine)
working-directory: ./src-tauri
run: cargo test
- name: Run tests (With Pizza engine)
working-directory: ./src-tauri
run: cargo test --features use_pizza_engine


@@ -8,11 +8,14 @@
"clsx",
"codegen",
"dataurl",
"deeplink",
"deepthink",
"dtolnay",
"dyld",
"elif",
"errmsg",
"fullscreen",
"fulltext",
"headlessui",
"Icdbb",
"icns",
@@ -29,6 +32,8 @@
"localstorage",
"lucide",
"maximizable",
"mdast",
"meval",
"Minimizable",
"msvc",
"nord",
@@ -38,9 +43,11 @@
"overscan",
"partialize",
"patchelf",
"Quicklink",
"Raycast",
"rehype",
"reqwest",
"rerank",
"rgba",
"rustup",
"screenshotable",
@@ -55,6 +62,7 @@
"traptitech",
"unlisten",
"unlistener",
"unlisteners",
"unminimize",
"uuidv",
"VITE",


@@ -78,4 +78,8 @@ clean-rebuild:
$(MAKE) dev-build
add-dep-pizza-engine:
cd src-tauri && cargo add --git ssh://git@github.com/infinilabs/pizza.git pizza-engine --features query_string_parser,persistence
cd src-tauri && cargo add --git ssh://git@github.com/infinilabs/pizza.git pizza-engine --features query_string_parser,persistence
dev-build-with-pizza: add-dep-pizza-engine
@echo "Starting desktop development with Pizza Engine pulled in..."
RUST_BACKTRACE=1 pnpm tauri dev --features use_pizza_engine


@@ -91,6 +91,8 @@ pnpm tauri build
- [Coco App Documentation](https://docs.infinilabs.com/coco-app/main/)
- [Coco Server Documentation](https://docs.infinilabs.com/coco-server/main/)
- [DeepWiki Coco App](https://deepwiki.com/infinilabs/coco-app)
- [DeepWiki Coco Server](https://deepwiki.com/infinilabs/coco-server)
- [Tauri Documentation](https://tauri.app/)
## Contributors


@@ -1,21 +1,35 @@
---
weight: 10
title: "Mac OS"
title: "macOS"
asciinema: true
---
# Mac OS
# macOS
## Download Coco AI
Goto [https://coco.rs/](https://coco.rs/)
Go to [coco.rs](https://coco.rs/) and download the package for your architecture:
{{% load-img "/img/download-mac-app.png" "" %}}
{{% load-img "/img/macos/mac-download-app.png" "" %}}
It should be placed in your "Downloads" folder:
{{% load-img "/img/macos/mac-zip-file.png" "" %}}
## Unzip DMG file
{{% load-img "/img/unzip-dmg-file.png" "" %}}
Unzip the file:
{{% load-img "/img/macos/mac-unzip-zip-file.png" "" %}}
You will get a `dmg` file:
{{% load-img "/img/macos/mac-dmg.png" "" %}}
## Drag to Application Folder
{{% load-img "/img/drag-to-application-folder.png" "" %}}
Double-click the `dmg` file; a window will pop up. Then drag the "Coco-AI" app to
your "Applications" folder:
{{% load-img "/img/macos/drag-to-app-folder.png" "" %}}


@@ -14,7 +14,9 @@ asciinema: true
[if_x11]: https://unix.stackexchange.com/q/202891/498440
## Goto [https://coco.rs/](https://coco.rs/)
## Go to the download page
Download page: [link](https://coco.rs/#install)
## Download the package


@@ -5,7 +5,7 @@ title: "Release Notes"
# Release Notes
Information about release notes of Coco Server is provided here.
Information about release notes of Coco App is provided here.
## Latest (In development)
@@ -13,6 +13,132 @@ Information about release notes of Coco Server is provided here.
### 🚀 Features
- feat: enhance ui for skipped version #834
- feat: support installing local extensions #749
- feat: support sending files in chat messages #764
- feat: sub extension can set 'platforms' now #847
### 🐛 Bug fix
- fix: fix issue with update check failure #833
### ✈️ Improvements
- refactor: calling service related interfaces #831
- refactor: split query_coco_fusion() #836
- chore: web component loading font icon #838
- chore: delete unused code files and dependencies #841
- chore: ignore tauri::AppHandle's generic argument R #845
- refactor: check Extension/plugin.json from all sources #846
## 0.7.1 (2025-07-27)
### ❌ Breaking changes
### 🚀 Features
### 🐛 Bug fix
- fix: correct enter key behavior #828
### ✈️ Improvements
- chore: web component add notification component #825
- refactor: collection behavior defaults to `MoveToActiveSpace`, and only use `CanJoinAllSpaces` when window is pinned #829
## 0.7.0 (2025-07-25)
### ❌ Breaking changes
### 🚀 Features
- feat: file search using spotlight #705
- feat: voice input support in both search and chat modes #732
- feat: text to speech now powered by LLM #750
- feat: file search for Windows #762
### 🐛 Bug fix
- fix(file search): apply filters before from/size parameters #741
- fix(file search): searching by name&content does not search file name #743
- fix: prevent window from hiding when moved on Windows #748
- fix: unregister ext hotkey when it gets deleted #770
- fix: indexing apps does not respect search scope config #773
- fix: restore missing category titles on subpages #772
- fix: correct incorrect assistant display when quick ai access #779
- fix: resolved minor issues with voice playback #780
- fix: fixed incorrect taskbar icon display on linux #783
- fix: fix data inconsistency issue on secondary pages #784
- fix: incorrect status when installing extension #789
- fix: increase read_timeout for HTTP streaming stability #798
- fix: enter key problem #794
- fix: fix selection issue after renaming #800
- fix: fix shortcut issue in windows context menu #804
- fix: panic caused by "state() called before manage()" #806
- fix: fix multiline input issue #808
- fix: fix ctrl+k not working #815
- fix: fix update window config sync #818
- fix: fix enter key on subpages #819
- fix: panic on Ubuntu (GNOME) when opening apps #821
### ✈️ Improvements
- refactor: prioritize stat(2) when checking if a file is dir #737
- refactor: change File Search ext type to extension #738
- refactor: create chat & send chat api #739
- chore: icon support for more file types #740
- chore: replace meval-rs with our fork to clear dep warning #745
- refactor: adjusted assistant, datasource, mcp_server interface parameters #746
- refactor: adjust extension code hierarchy #747
- chore: bump dep applications-rs #751
- chore: rename QuickLink/quick_link to Quicklink/quicklink #752
- chore: assistant params & styles #753
- chore: make optional fields optional #758
- chore: search-chat components add formatUrl & think data & icons url #765
- chore: Coco app http request headers #744
- refactor: do status code check before deserializing response #767
- style: splash adapts to the width of mobile phones #768
- chore: search-chat add language and formatUrl parameters #775
- chore: not request the interface if not logged in #795
- refactor: clean up unsupported characters from query string in Win Search #802
- chore: display backtrace in panic log #805
## 0.6.0 (2025-06-29)
### ❌ Breaking changes
### 🚀 Features
- feat: support `Tab` and `Enter` for delete dialog buttons #700
- feat: add check for updates #701
- feat: impl extension store #699
- feat: support back navigation via delete key #717
### 🐛 Bug fix
- fix: quick ai state synchronous #693
- fix: toggle extension should register/unregister hotkey #691
- fix: take coco server back on refresh #696
- fix: some input fields couldn't accept spaces #709
- fix: context menu search not working #713
- fix: open extension store display #724
### ✈️ Improvements
- refactor: use author/ext_id as extension unique identifier #643
- refactor: refactoring search api #679
- chore: continue to chat page display #690
- chore: improve server list selection with enter key #692
- chore: add message for latest version check #703
- chore: log command execution results #718
- chore: adjust styles and add button reindex #719
## 0.5.0 (2025-06-13)
### ❌ Breaking changes
### 🚀 Features
- feat: check or enter to close the list of assistants #469
- feat: add dimness settings for pinned window #470
- feat: supports Shift + Enter input box line feeds #472
@@ -24,17 +150,59 @@ Information about release notes of Coco Server is provided here.
- feat: the search input box supports multi-line input #501
- feat: websocket support self-signed TLS #504
- feat: add option to allow self-signed certificates #509
- feat: add AI summary component #518
- feat: dynamic log level via env var COCO_LOG #535
- feat: add quick AI access to search mode #556
- feat: rerank search results #561
- feat: ai overview support is enabled with shortcut #597
- feat: add key monitoring during reset #615
- feat: calculator extension add description #623
- feat: support right-click actions after text selection #624
- feat: add ai overview minimum number of search results configuration #625
- feat: add internationalized translations of AI-related extensions #632
- feat: context menu support for secondary pages #680
### 🐛 Bug fix
- fix: solve the problem of modifying the assistant in the chat #476
- fix: several issues around search #502
- fix: fixed the newly created session has no title when it is deleted #511
- fix: loading chat history for potential empty attachments
- fix: datasource & MCP list synchronization update #521
- fix: app icon & category icon #529
- fix: show only enabled datasource & MCP list
- fix: server image loading failure #534
- fix: panic when fetching app metadata on Windows #538
- fix: service switching error #539
- fix: switch server assistant and session unchanged #540
- fix: history list height #550
- fix: secondary page cannot be searched #551
- fix: the scroll button is not displayed by default #552
- fix: suggestion list position #553
- fix: independent chat window has no data #554
- fix: resolved navigation error on continue chat action #558
- fix: make extension search source respect parameter datasource #576
- fix: fixed issue with incorrect login status #600
- fix: new chat assistant id not found #603
- fix: resolve regex error on older macOS versions #605
- fix: fix chat log update and sorting issues #612
- fix: resolved an issue where number keys were not working on the web #616
- fix: do not panic when the datasource specified does not exist #618
- fix: fixed modifier keys not working with continue chat #619
- fix: invalid DSL error if input contains multiple lines #620
- fix: fix ai overview hidden height before message #622
- fix: tab key hides window in chat mode #641
- fix: arrow keys still navigated search when menu opened with Cmd+K #642
- fix: input lost when reopening dialog after search #644
- fix: web page unmount event #645
- fix: fix the problem of local path not opening #650
- fix: number keys not following settings #661
- fix: fix problem with up and down key indexing #676
- fix: arrow inserting escape sequences #683
### ✈️ Improvements
- chore: adjust list error message #475
- fix: solve the problem of modifying the assistant in the chat #476
- chore: refine wording on search failure
- chore: search and MCP show hidden logic #494
- chore: greetings show hidden logic #496
@@ -45,6 +213,32 @@ Information about release notes of Coco Server is provided here.
- refactor: optimized the modification operation of the numeric input box #508
- style: modify the style of the search input box #513
- style: chat input icons show #515
- refactor: refactoring icon component #514
- refactor: optimizing list styles in markdown content #520
- feat: add a component for text reading aloud #522
- style: history component styles #528
- style: search error styles #533
- chore: skip register server that not logged in #536
- refactor: service info related components #537
- chore: chat content can be copied #539
- refactor: refactoring search error #541
- chore: add assistant count #542
- chore: add global login judgment #544
- chore: mark server offline on user logout #546
- chore: logout update server profile #549
- chore: assistant keyboard events and mouse events #559
- chore: web component start page config #560
- chore: assistant chat placeholder & refactor input box components #566
- refactor: input box related components #568
- chore: mark unavailable server to offline on refresh info #569
- chore: only show available servers in chat #570
- refactor: search result related components #571
- chore: initialize current assistant from history #606
- chore: add onContextMenu event #629
- chore: more logs for the setup process #634
- chore: copy supports http protocol #639
- refactor: use author/ext_id as extension unique identifier #643
- chore: add special character filtering #668
## 0.4.0 (2025-04-27)
@@ -74,6 +268,8 @@ Information about release notes of Coco Server is provided here.
- feat: data sources support displaying customized icons #432
- feat: add shortcut key conflict hint and reset function #442
- feat: updated to include error message #465
- feat: support third party extensions #572
- feat: support ai overview #572
### Bug fix

Binary image files changed (previews not shown): three images removed and five added, including the new files docs/static/img/macos/mac-dmg.png and docs/static/img/macos/mac-zip-file.png.

@@ -1,7 +1,7 @@
{
"name": "coco",
"private": true,
"version": "0.4.0",
"version": "0.7.1",
"type": "module",
"scripts": {
"dev": "vite",
@@ -18,7 +18,6 @@
"release-beta": "release-it --preRelease=beta --preReleaseBase=1"
},
"dependencies": {
"@ant-design/icons": "^6.0.0",
"@headlessui/react": "^2.2.2",
"@tauri-apps/api": "^2.5.0",
"@tauri-apps/plugin-autostart": "~2.2.0",
@@ -27,11 +26,11 @@
"@tauri-apps/plugin-global-shortcut": "~2.0.0",
"@tauri-apps/plugin-http": "~2.0.2",
"@tauri-apps/plugin-log": "~2.4.0",
"@tauri-apps/plugin-opener": "^2.2.7",
"@tauri-apps/plugin-os": "^2.2.1",
"@tauri-apps/plugin-process": "^2.2.1",
"@tauri-apps/plugin-shell": "^2.2.1",
"@tauri-apps/plugin-updater": "github:infinilabs/tauri-plugin-updater#v2",
"@tauri-apps/plugin-websocket": "~2.3.0",
"@tauri-apps/plugin-window": "2.0.0-alpha.1",
"@wavesurfer/react": "^1.0.11",
"ahooks": "^3.8.4",
@@ -44,6 +43,7 @@
"i18next-browser-languagedetector": "^8.1.0",
"lodash-es": "^4.17.21",
"lucide-react": "^0.461.0",
"mdast-util-gfm-autolink-literal": "2.0.0",
"mermaid": "^11.6.0",
"nanoid": "^5.1.5",
"react": "^18.3.1",
@@ -58,10 +58,12 @@
"remark-breaks": "^4.0.0",
"remark-gfm": "^4.0.1",
"remark-math": "^6.0.0",
"tailwind-merge": "^3.3.1",
"tauri-plugin-fs-pro-api": "^2.4.0",
"tauri-plugin-macos-permissions-api": "^2.3.0",
"tauri-plugin-screenshots-api": "^2.2.0",
"tauri-plugin-windows-version-api": "^2.0.0",
"type-fest": "^4.41.0",
"use-debounce": "^10.0.4",
"uuid": "^11.1.0",
"wavesurfer.js": "^7.9.5",
@@ -89,5 +91,6 @@
"tsx": "^4.19.4",
"typescript": "^5.8.3",
"vite": "^5.4.19"
}
}
},
"packageManager": "pnpm@10.11.0+sha512.6540583f41cc5f628eb3d9773ecee802f4f9ef9923cc45b69890fb47991d4b092964694ec3a4f738a420c918a333062c8b925d312f42e4f0c263eb603551f977"
}

pnpm-lock.yaml generated

@@ -8,9 +8,6 @@ importers:
.:
dependencies:
'@ant-design/icons':
specifier: ^6.0.0
version: 6.0.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
'@headlessui/react':
specifier: ^2.2.2
version: 2.2.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -35,6 +32,9 @@ importers:
'@tauri-apps/plugin-log':
specifier: ~2.4.0
version: 2.4.0
'@tauri-apps/plugin-opener':
specifier: ^2.2.7
version: 2.2.7
'@tauri-apps/plugin-os':
specifier: ^2.2.1
version: 2.2.1
@@ -47,9 +47,6 @@ importers:
'@tauri-apps/plugin-updater':
specifier: github:infinilabs/tauri-plugin-updater#v2
version: https://codeload.github.com/infinilabs/tauri-plugin-updater/tar.gz/358e689c65e9943b53eff50bcb9dfd5b1cfc4072
'@tauri-apps/plugin-websocket':
specifier: ~2.3.0
version: 2.3.0
'@tauri-apps/plugin-window':
specifier: 2.0.0-alpha.1
version: 2.0.0-alpha.1
@@ -86,6 +83,9 @@ importers:
lucide-react:
specifier: ^0.461.0
version: 0.461.0(react@18.3.1)
mdast-util-gfm-autolink-literal:
specifier: 2.0.0
version: 2.0.0
mermaid:
specifier: ^11.6.0
version: 11.6.0
@@ -128,6 +128,9 @@ importers:
remark-math:
specifier: ^6.0.0
version: 6.0.0
tailwind-merge:
specifier: ^3.3.1
version: 3.3.1
tauri-plugin-fs-pro-api:
specifier: ^2.4.0
version: 2.4.0
@@ -140,6 +143,9 @@ importers:
tauri-plugin-windows-version-api:
specifier: ^2.0.0
version: 2.0.0
type-fest:
specifier: ^4.41.0
version: 4.41.0
use-debounce:
specifier: ^10.0.4
version: 10.0.4(react@18.3.1)
@@ -182,7 +188,7 @@ importers:
version: 1.8.8
'@vitejs/plugin-react':
specifier: ^4.4.1
version: 4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0))
version: 4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0))
autoprefixer:
specifier: ^10.4.21
version: 10.4.21(postcss@8.5.3)
@@ -215,7 +221,7 @@ importers:
version: 5.8.3
vite:
specifier: ^5.4.19
version: 5.4.19(@types/node@22.15.17)(sass@1.87.0)
version: 5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0)
packages:
@@ -227,23 +233,6 @@ packages:
resolution: {integrity: sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==}
engines: {node: '>=6.0.0'}
'@ant-design/colors@8.0.0':
resolution: {integrity: sha512-6YzkKCw30EI/E9kHOIXsQDHmMvTllT8STzjMb4K2qzit33RW2pqCJP0sk+hidBntXxE+Vz4n1+RvCTfBw6OErw==}
'@ant-design/fast-color@3.0.0':
resolution: {integrity: sha512-eqvpP7xEDm2S7dUzl5srEQCBTXZMmY3ekf97zI+M2DHOYyKdJGH0qua0JACHTqbkRnD/KHFQP9J1uMJ/XWVzzA==}
engines: {node: '>=8.x'}
'@ant-design/icons-svg@4.4.2':
resolution: {integrity: sha512-vHbT+zJEVzllwP+CM+ul7reTEfBR0vgxFe7+lREAsAA7YGsYpboiq2sQNeQeRvh09GfQgs/GyFEvZpJ9cLXpXA==}
'@ant-design/icons@6.0.0':
resolution: {integrity: sha512-o0aCCAlHc1o4CQcapAwWzHeaW2x9F49g7P3IDtvtNXgHowtRWYb7kiubt8sQPFvfVIVU/jLw2hzeSlNt0FU+Uw==}
engines: {node: '>=8'}
peerDependencies:
react: '>=16.0.0'
react-dom: '>=16.0.0'
'@antfu/install-pkg@1.1.0':
resolution: {integrity: sha512-MGQsmw10ZyI+EJo45CdSER4zEb+p31LpDAFp2Z3gkSd1yqVZGi0Ebx++YTEMonJy4oChEMLsxZ64j8FH6sSqtQ==}
@@ -813,6 +802,9 @@ packages:
resolution: {integrity: sha512-R8gLRTZeyp03ymzP/6Lil/28tGeGEzhx1q2k703KGWRAI1VdvPIXdG70VJc2pAMw3NA6JKL5hhFu1sJX0Mnn/A==}
engines: {node: '>=6.0.0'}
'@jridgewell/source-map@0.3.6':
resolution: {integrity: sha512-1ZJTZebgqllO79ue2bm3rIGud/bOe0pP5BjSRCRxxYkEZS8STV7zN84UBbiYu7jy+eCKSnVIUgoWWE/tt+shMQ==}
'@jridgewell/sourcemap-codec@1.5.0':
resolution: {integrity: sha512-gv3ZRaISU3fjPAgNsriBRqGWQL6quFx04YMPW/zD8XMLsU32mhCCbfbO6KZFLjvYpCZ8zyDEgqsgf+PwPaM7GQ==}
@@ -990,12 +982,6 @@ packages:
resolution: {integrity: sha512-c83qWb22rNRuB0UaVCI0uRPNRr8Z0FWnEIvT47jiHAmOIUHbBOg5XvV7pM5x+rKn9HRpjxquDbXYSXr3fAKFcw==}
engines: {node: '>=12'}
'@rc-component/util@1.2.1':
resolution: {integrity: sha512-AUVu6jO+lWjQnUOOECwu8iR0EdElQgWW5NBv5vP/Uf9dWbAX3udhMutRlkVXjuac2E40ghkFy+ve00mc/3Fymg==}
peerDependencies:
react: '>=18.0.0'
react-dom: '>=18.0.0'
'@react-aria/focus@3.20.2':
resolution: {integrity: sha512-Q3rouk/rzoF/3TuH6FzoAIKrl+kzZi9LHmr8S5EqLAOyP9TXIKG34x2j42dZsAhrw7TbF9gA8tBKwnCNH4ZV+Q==}
peerDependencies:
@@ -1256,6 +1242,9 @@ packages:
'@tauri-apps/plugin-log@2.4.0':
resolution: {integrity: sha512-j7yrDtLNmayCBOO2esl3aZv9jSXy2an8MDLry3Ys9ZXerwUg35n1Y2uD8HoCR+8Ng/EUgx215+qOUfJasjYrHw==}
'@tauri-apps/plugin-opener@2.2.7':
resolution: {integrity: sha512-uduEyvOdjpPOEeDRrhwlCspG/f9EQalHumWBtLBnp3fRp++fKGLqDOyUhSIn7PzX45b/rKep//ZQSAQoIxobLA==}
'@tauri-apps/plugin-os@2.2.1':
resolution: {integrity: sha512-cNYpNri2CCc6BaNeB6G/mOtLvg8dFyFQyCUdf2y0K8PIAKGEWdEcu8DECkydU2B+oj4OJihDPD2de5K6cbVl9A==}
@@ -1269,9 +1258,6 @@ packages:
resolution: {tarball: https://codeload.github.com/infinilabs/tauri-plugin-updater/tar.gz/358e689c65e9943b53eff50bcb9dfd5b1cfc4072}
version: 2.7.1
'@tauri-apps/plugin-websocket@2.3.0':
resolution: {integrity: sha512-eAwRGe3tnqDeQYE0wq4g1PUKbam9tYvlC4uP/au12Y/z7MP4lrS4ylv+aoZ5Ly+hTlBdi7hDkhHomwF/UeBesA==}
'@tauri-apps/plugin-window@2.0.0-alpha.1':
resolution: {integrity: sha512-dFOAgal/3Txz3SQ+LNQq0AK1EPC+acdaFlwPVB/6KXUZYmaFleIlzgxDVoJCQ+/xOhxvYrdQaFLefh0I/Kldbg==}
@@ -1583,6 +1569,9 @@ packages:
engines: {node: ^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7}
hasBin: true
buffer-from@1.1.2:
resolution: {integrity: sha512-E+XQCRwSbaaiChtv6k6Dwgc+bx+Bs6vuKJHHl5kox/BaKbhiXzqQOwK4cO22yElGp2OCmjwVhT3HmxgyPGnJfQ==}
bundle-name@4.1.0:
resolution: {integrity: sha512-tjwM5exMg6BGRI+kNmTntNsvdZS1X8BFYS6tnJ2hdH0kVxM6/eVZ2xy+FqStSWvYmtfFMDLIxurorHwDKfDz5Q==}
engines: {node: '>=18'}
@@ -1658,9 +1647,6 @@ packages:
resolution: {integrity: sha512-cYY9mypksY8NRqgDB1XD1RiJL338v/551niynFTGkZOO2LHuB2OmOYxDIe/ttN9AHwrqdum1360G3ald0W9kCg==}
engines: {node: '>=8'}
classnames@2.5.1:
resolution: {integrity: sha512-saHYOzhIQs6wy2sVxTM6bUDsQO4F50V9RQ22qBpEdCW+I+/Wmke2HOl6lS6dTpdxVhb88/I6+Hs+438c3lfUow==}
cli-boxes@3.0.0:
resolution: {integrity: sha512-/lzGpEWL/8PfI0BmBOPRwp0c/wFNX1RdUML3jK/RcSBA9T8mZDdQpqYBKtCFTOfQbwPqWEOpjqW+Fnayc0969g==}
engines: {node: '>=10'}
@@ -1695,6 +1681,9 @@ packages:
comma-separated-tokens@2.0.3:
resolution: {integrity: sha512-Fu4hJdvzeylCfQPp9SGWidpzrMs7tTrlu6Vb8XGaRGck8QSNZJJp538Wrb60Lax4fPwR64ViY468OIUTbRlGZg==}
commander@2.20.3:
resolution: {integrity: sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==}
commander@4.1.1:
resolution: {integrity: sha512-NOKm8xhkzAjzFx8B2v5OAHT+u5pRQc2UCa2Vq9jYL/31o2wi9mxBA7LIFs3sV5VSC49z6pEhfbMULvShKj26WA==}
engines: {node: '>= 6'}
@@ -2640,8 +2629,8 @@ packages:
mdast-util-from-markdown@2.0.2:
resolution: {integrity: sha512-uZhTV/8NBuw0WHkPTrCqDOl0zVe1BIng5ZtHoDk49ME1qqcjYmmLmOf0gELgcRMxN4w2iuIeVso5/6QymSrgmA==}
mdast-util-gfm-autolink-literal@2.0.1:
resolution: {integrity: sha512-5HVP2MKaP6L+G6YaxPNjuL0BPrq9orG3TsrZ9YXbA3vDw/ACI4MEsnoDpn6ZNm7GnZgtAcONJyPhOP8tNJQavQ==}
mdast-util-gfm-autolink-literal@2.0.0:
resolution: {integrity: sha512-FyzMsduZZHSc3i0Px3PQcBT4WJY/X/RCtEJKuybiC6sjPqLv7h1yqAkmILZtuxMSsUyaLUWNp71+vQH2zqp5cg==}
mdast-util-gfm-footnote@2.1.0:
resolution: {integrity: sha512-sqpDWlsHn7Ac9GNZQMeUzPQSMzR6Wv0WKRNvQRg0KqHh02fpTz69Qc1QSseNX29bhz1ROIyNyxExfawVKTm1GQ==}
@@ -3137,9 +3126,6 @@ packages:
typescript:
optional: true
react-is@18.3.1:
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
react-markdown@9.1.0:
resolution: {integrity: sha512-xaijuJB0kzGiUdG7nc2MOMDUDBWPyGAjZtUrow9XxUeua8IqeP+VlIfAZ3bphpcLTnSZXz6z9jcVC/TCwbfgdw==}
peerDependencies:
@@ -3346,6 +3332,9 @@ packages:
resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==}
engines: {node: '>=0.10.0'}
source-map-support@0.5.21:
resolution: {integrity: sha512-uBHU3L3czsIyYXKX88fdrGovxdSCoTGDRZ6SYXtSRxLZUzHg5P/66Ht6uoUlHu9EZod+inXhKo3qQgwXUT/y1w==}
source-map@0.6.1:
resolution: {integrity: sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==}
engines: {node: '>=0.10.0'}
@@ -3423,6 +3412,9 @@ packages:
tabbable@6.2.0:
resolution: {integrity: sha512-Cat63mxsVJlzYvN51JmVXIgNoUokrIaT2zLclCXjRd8boZ0004U4KCs/sToJ75C6sdlByWxpYnb5Boif1VSFew==}
tailwind-merge@3.3.1:
resolution: {integrity: sha512-gBXpgUm/3rp1lMZZrM/w7D8GKqshif0zAymAhbCyIt8KMe+0v9DQ7cdYLR4FHH/cKpdTXb+A/tKKU3eolfsI+g==}
tailwindcss@3.4.17:
resolution: {integrity: sha512-w33E2aCvSDP0tW9RZuNXadXlkHXqFzSkQew/aIa2i/Sj8fThxwovwlXHSPXTbAHwEIhBFXAedUhP2tueAKP8Og==}
engines: {node: '>=14.0.0'}
@@ -3440,6 +3432,11 @@ packages:
tauri-plugin-windows-version-api@2.0.0:
resolution: {integrity: sha512-tty5n4ASYbXpnsD5ws2iTcTTpDCrSbzRTVp5Bo3UTpYGqlN1gBn2Zk8s3oO4w7VIM5WtJhDM9Jr/UgoTk7tFJQ==}
terser@5.40.0:
resolution: {integrity: sha512-cfeKl/jjwSR5ar7d0FGmave9hFGJT8obyo0z+CrQOylLDbk7X81nPU6vq9VORa5jU30SkDnT2FXjLbR8HLP+xA==}
engines: {node: '>=10'}
hasBin: true
thenify-all@1.6.0:
resolution: {integrity: sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA==}
engines: {node: '>=0.8'}
@@ -3774,23 +3771,6 @@ snapshots:
'@jridgewell/gen-mapping': 0.3.8
'@jridgewell/trace-mapping': 0.3.25
'@ant-design/colors@8.0.0':
dependencies:
'@ant-design/fast-color': 3.0.0
'@ant-design/fast-color@3.0.0': {}
'@ant-design/icons-svg@4.4.2': {}
'@ant-design/icons@6.0.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
'@ant-design/colors': 8.0.0
'@ant-design/icons-svg': 4.4.2
'@rc-component/util': 1.2.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
classnames: 2.5.1
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
'@antfu/install-pkg@1.1.0':
dependencies:
package-manager-detector: 1.3.0
@@ -4260,6 +4240,12 @@ snapshots:
'@jridgewell/set-array@1.2.1': {}
'@jridgewell/source-map@0.3.6':
dependencies:
'@jridgewell/gen-mapping': 0.3.8
'@jridgewell/trace-mapping': 0.3.25
optional: true
'@jridgewell/sourcemap-codec@1.5.0': {}
'@jridgewell/trace-mapping@0.3.25':
@@ -4427,12 +4413,6 @@ snapshots:
'@pnpm/network.ca-file': 1.0.2
config-chain: 1.1.13
'@rc-component/util@1.2.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
react-is: 18.3.1
'@react-aria/focus@3.20.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
'@react-aria/interactions': 3.25.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -4637,6 +4617,10 @@ snapshots:
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-opener@2.2.7':
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-os@2.2.1':
dependencies:
'@tauri-apps/api': 2.5.0
@@ -4653,10 +4637,6 @@ snapshots:
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-websocket@2.3.0':
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-window@2.0.0-alpha.1':
dependencies:
'@tauri-apps/api': 2.0.0-alpha.6
@@ -4878,14 +4858,14 @@ snapshots:
'@ungap/structured-clone@1.3.0': {}
'@vitejs/plugin-react@4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0))':
'@vitejs/plugin-react@4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0))':
dependencies:
'@babel/core': 7.27.1
'@babel/plugin-transform-react-jsx-self': 7.27.1(@babel/core@7.27.1)
'@babel/plugin-transform-react-jsx-source': 7.27.1(@babel/core@7.27.1)
'@types/babel__core': 7.20.5
react-refresh: 0.17.0
vite: 5.4.19(@types/node@22.15.17)(sass@1.87.0)
vite: 5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0)
transitivePeerDependencies:
- supports-color
@@ -5014,6 +4994,9 @@ snapshots:
node-releases: 2.0.19
update-browserslist-db: 1.1.3(browserslist@4.24.5)
buffer-from@1.1.2:
optional: true
bundle-name@4.1.0:
dependencies:
run-applescript: 7.0.0
@@ -5084,8 +5067,6 @@ snapshots:
ci-info@4.2.0: {}
classnames@2.5.1: {}
cli-boxes@3.0.0: {}
cli-cursor@5.0.0:
@@ -5110,6 +5091,9 @@ snapshots:
comma-separated-tokens@2.0.3: {}
commander@2.20.3:
optional: true
commander@4.1.1: {}
commander@7.2.0: {}
@@ -6114,7 +6098,7 @@ snapshots:
transitivePeerDependencies:
- supports-color
mdast-util-gfm-autolink-literal@2.0.1:
mdast-util-gfm-autolink-literal@2.0.0:
dependencies:
'@types/mdast': 4.0.4
ccount: 2.0.1
@@ -6162,7 +6146,7 @@ snapshots:
mdast-util-gfm@3.1.0:
dependencies:
mdast-util-from-markdown: 2.0.2
mdast-util-gfm-autolink-literal: 2.0.1
mdast-util-gfm-autolink-literal: 2.0.0
mdast-util-gfm-footnote: 2.1.0
mdast-util-gfm-strikethrough: 2.0.0
mdast-util-gfm-table: 2.0.0
@@ -6830,8 +6814,6 @@ snapshots:
react-dom: 18.3.1(react@18.3.1)
typescript: 5.8.3
react-is@18.3.1: {}
react-markdown@9.1.0(@types/react@18.3.21)(react@18.3.1):
dependencies:
'@types/hast': 3.0.4
@@ -7121,6 +7103,12 @@ snapshots:
source-map-js@1.2.1: {}
source-map-support@0.5.21:
dependencies:
buffer-from: 1.1.2
source-map: 0.6.1
optional: true
source-map@0.6.1:
optional: true
@@ -7197,6 +7185,8 @@ snapshots:
tabbable@6.2.0: {}
tailwind-merge@3.3.1: {}
tailwindcss@3.4.17:
dependencies:
'@alloc/quick-lru': 5.2.0
@@ -7240,6 +7230,14 @@ snapshots:
dependencies:
'@tauri-apps/api': 2.5.0
terser@5.40.0:
dependencies:
'@jridgewell/source-map': 0.3.6
acorn: 8.14.1
commander: 2.20.3
source-map-support: 0.5.21
optional: true
thenify-all@1.6.0:
dependencies:
thenify: 3.3.1
@@ -7426,7 +7424,7 @@ snapshots:
'@types/unist': 3.0.3
vfile-message: 4.0.2
vite@5.4.19(@types/node@22.15.17)(sass@1.87.0):
vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0):
dependencies:
esbuild: 0.21.5
postcss: 8.5.3
@@ -7435,6 +7433,7 @@ snapshots:
'@types/node': 22.15.17
fsevents: 2.3.3
sass: 1.87.0
terser: 5.40.0
void-elements@3.1.0: {}

File diff suppressed because one or more lines are too long

scripts/devWeb.ts Normal file

@@ -0,0 +1 @@
(() => {})();

src-tauri/Cargo.lock generated

File diff suppressed because it is too large.


@@ -1,9 +1,9 @@
[package]
name = "coco"
version = "0.4.0"
version = "0.7.1"
description = "Search, connect, collaborate all in one place."
authors = ["INFINI Labs"]
edition = "2021"
edition = "2024"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[lib]
@@ -44,14 +44,13 @@ use_pizza_engine = []
[dependencies]
pizza-common = { git = "https://github.com/infinilabs/pizza-common", branch = "main" }
tauri = { version = "2", features = ["protocol-asset", "macos-private-api", "tray-icon", "image-ico", "image-png", "unstable"] }
tauri = { version = "2", features = ["protocol-asset", "macos-private-api", "tray-icon", "image-ico", "image-png"] }
tauri-plugin-shell = "2"
serde = { version = "1", features = ["derive"] }
# Need `arbitrary_precision` feature to support storing u128
# see: https://docs.rs/serde_json/latest/serde_json/struct.Number.html#method.from_u128
serde_json = { version = "1", features = ["arbitrary_precision"] }
serde_json = { version = "1", features = ["arbitrary_precision", "preserve_order"] }
tauri-plugin-http = "2"
tauri-plugin-websocket = "2"
tauri-plugin-deep-link = "2.0.0"
tauri-plugin-store = "2.2.0"
tauri-plugin-os = "2"
@@ -62,7 +61,7 @@ tauri-plugin-drag = "2"
tauri-plugin-macos-permissions = "2"
tauri-plugin-fs-pro = "2"
tauri-plugin-screenshots = "2"
applications = { git = "https://github.com/infinilabs/applications-rs", rev = "7bb507e6b12f73c96f3a52f0578d0246a689f381" }
applications = { git = "https://github.com/infinilabs/applications-rs", rev = "31b0c030a0f3bc82275fe12debe526153978671d" }
tokio-native-tls = "0.3" # For wss connections
tokio = { version = "1", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
@@ -81,21 +80,37 @@ plist = "1.7"
base64 = "0.13"
walkdir = "2"
log = "0.4"
strsim = "0.10"
futures-util = "0.3.31"
url = "2.5.2"
http = "1.1.0"
tungstenite = "0.24.0"
tokio-util = "0.7.14"
tauri-plugin-windows-version = "2"
meval = "0.2"
meval = { git = "https://github.com/infinilabs/meval-rs" }
chinese-number = "0.7"
num2words = "1"
tauri-plugin-log = "2"
chrono = "0.4.41"
serde_plain = "1.0.2"
derive_more = { version = "2.0.1", features = ["display"] }
anyhow = "1.0.98"
function_name = "0.3.0"
regex = "1.11.1"
borrowme = "0.0.15"
tauri-plugin-opener = "2"
async-recursion = "1.1.1"
zip = "4.0.0"
url = "2.5.2"
camino = "1.1.10"
tokio-stream = { version = "0.1.17", features = ["io-util"] }
cfg-if = "1.0.1"
sysinfo = "0.35.2"
indexmap = { version = "2.10.0", features = ["serde"] }
strum = { version = "0.27.2", features = ["derive"] }
[target."cfg(target_os = \"macos\")".dependencies]
tauri-nspanel = { git = "https://github.com/ahkohd/tauri-nspanel", branch = "v2" }
cocoa = "0.24"
[target."cfg(any(target_os = \"macos\", windows, target_os = \"linux\"))".dependencies]
tauri-plugin-single-instance = { version = "2.0.0", features = ["deep-link"] }
@@ -114,6 +129,9 @@ strip = true # Ensures debug symbols are removed.
tauri-plugin-autostart = "^2.2"
tauri-plugin-global-shortcut = "2"
tauri-plugin-updater = { git = "https://github.com/infinilabs/plugins-workspace", branch = "v2" }
# This should be compatible with the semver used by `tauri-plugin-updater`
semver = { version = "1", features = ["serde"] }
[target."cfg(target_os = \"windows\")".dependencies]
enigo="0.3"
windows = { version = "0.61.3", features = ["Win32_Foundation", "Win32_System_Com", "Win32_System_Ole", "Win32_System_Search", "Win32_UI_Shell_PropertiesSystem", "Win32_Data"] }


@@ -12,6 +12,8 @@
<true/>
<key>com.apple.security.automation.apple-events</key>
<true/>
<key>com.apple.security.device.microphone</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
<key>com.apple.security.network.client</key>
@@ -24,6 +26,5 @@
<string>6GVZT94974.rs.coco.app</string>
<key>com.apple.developer.team-identifier</key>
<string>6GVZT94974</string>
</dict>
</plist>
</plist>


@@ -2,11 +2,6 @@
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>NSCameraUsageDescription</key>
<string>Request camera access for WebRTC</string>
<key>NSMicrophoneUsageDescription</key>
<string>Request microphone access for WebRTC</string>
<key>CFBundleIdentifier</key>
<string>rs.coco.app</string>
<key>CFBundleExecutable</key>


@@ -1,3 +1,14 @@
fn main() {
tauri_build::build()
tauri_build::build();
// If env var `GITHUB_ACTIONS` exists, we are running in CI, set up the `ci`
// attribute
if std::env::var("GITHUB_ACTIONS").is_ok() {
println!("cargo:rustc-cfg=ci");
}
// Notify `rustc` of this `cfg` attribute to suppress unknown attribute warnings.
//
// unexpected condition name: `ci`
println!("cargo::rustc-check-cfg=cfg(ci)");
}


@@ -2,7 +2,7 @@
"$schema": "../gen/schemas/desktop-schema.json",
"identifier": "default",
"description": "Capability for the main window",
"windows": ["main", "chat", "settings"],
"windows": ["main", "chat", "settings", "check"],
"permissions": [
"core:default",
"core:event:allow-emit",
@@ -37,9 +37,6 @@
"http:allow-fetch-cancel",
"http:allow-fetch-read-body",
"http:allow-fetch-send",
"websocket:default",
"websocket:allow-connect",
"websocket:allow-send",
"autostart:allow-enable",
"autostart:allow-disable",
"autostart:allow-is-enabled",
@@ -71,6 +68,7 @@
"process:default",
"updater:default",
"windows-version:default",
"log:default"
"log:default",
"opener:default"
]
}


@@ -1,2 +1,2 @@
[toolchain]
channel = "nightly-2024-10-29"
channel = "nightly-2025-06-26"


@@ -1,30 +1,34 @@
use crate::common;
use crate::common::assistant::ChatRequestMessage;
use crate::common::http::GetResponse;
use crate::common::http::convert_query_params_to_strings;
use crate::common::register::SearchSourceRegistry;
use crate::server::http_client::HttpClient;
use crate::{common, server::servers::COCO_SERVERS};
use futures::StreamExt;
use futures::stream::FuturesUnordered;
use futures_util::TryStreamExt;
use http::Method;
use serde_json::Value;
use std::collections::HashMap;
use tauri::{AppHandle, Runtime};
use tauri::{AppHandle, Emitter, Manager};
use tokio::io::AsyncBufReadExt;
#[tauri::command]
pub async fn chat_history<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn chat_history(
_app_handle: AppHandle,
server_id: String,
from: u32,
size: u32,
query: Option<String>,
) -> Result<String, String> {
let mut query_params: HashMap<String, Value> = HashMap::new();
if from > 0 {
query_params.insert("from".to_string(), from.into());
}
if size > 0 {
query_params.insert("size".to_string(), size.into());
}
let mut query_params = Vec::new();
// Add from/size as number values
query_params.push(format!("from={}", from));
query_params.push(format!("size={}", size));
if let Some(query) = query {
if !query.is_empty() {
query_params.insert("query".to_string(), query.into());
query_params.push(format!("query={}", query.to_string()));
}
}
@@ -39,20 +43,18 @@ pub async fn chat_history<R: Runtime>(
}
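The refactored `chat_history` keeps its `from`/`size`/`query` parameters but now flattens them into plain `key=value` strings before calling the server. From the webview it is a single `invoke` call; here is a minimal TypeScript sketch (the server id and query value are hypothetical, and the default camelCase-to-snake_case argument mapping of Tauri commands is assumed):

```ts
import { invoke } from "@tauri-apps/api/core";

// chat_history resolves with the raw response body as a string.
async function loadChatHistory(serverId: string) {
  const raw = await invoke<string>("chat_history", {
    serverId,
    from: 0,
    size: 20,
    query: "release", // optional filter (hypothetical value); an empty string skips it
  });
  return JSON.parse(raw); // assuming the Coco server returns a JSON body
}
```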
#[tauri::command]
pub async fn session_chat_history<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn session_chat_history(
_app_handle: AppHandle,
server_id: String,
session_id: String,
from: u32,
size: u32,
) -> Result<String, String> {
let mut query_params: HashMap<String, Value> = HashMap::new();
if from > 0 {
query_params.insert("from".to_string(), from.into());
}
if size > 0 {
query_params.insert("size".to_string(), size.into());
}
let mut query_params = Vec::new();
// Add from/size as number values
query_params.push(format!("from={}", from));
query_params.push(format!("size={}", size));
let path = format!("/chat/{}/_history", session_id);
@@ -64,15 +66,14 @@ pub async fn session_chat_history<R: Runtime>(
}
#[tauri::command]
pub async fn open_session_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn open_session_chat(
_app_handle: AppHandle,
server_id: String,
session_id: String,
) -> Result<String, String> {
let query_params = HashMap::new();
let path = format!("/chat/{}/_open", session_id);
let response = HttpClient::post(&server_id, path.as_str(), Some(query_params), None)
let response = HttpClient::post(&server_id, path.as_str(), None, None)
.await
.map_err(|e| format!("Error open session: {}", e))?;
@@ -80,30 +81,30 @@ pub async fn open_session_chat<R: Runtime>(
}
#[tauri::command]
pub async fn close_session_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn close_session_chat(
_app_handle: AppHandle,
server_id: String,
session_id: String,
) -> Result<String, String> {
let query_params = HashMap::new();
let path = format!("/chat/{}/_close", session_id);
let response = HttpClient::post(&server_id, path.as_str(), Some(query_params), None)
let response = HttpClient::post(&server_id, path.as_str(), None, None)
.await
.map_err(|e| format!("Error close session: {}", e))?;
common::http::get_response_body_text(response).await
}
#[tauri::command]
pub async fn cancel_session_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn cancel_session_chat(
_app_handle: AppHandle,
server_id: String,
session_id: String,
query_params: Option<HashMap<String, Value>>,
) -> Result<String, String> {
let query_params = HashMap::new();
let path = format!("/chat/{}/_cancel", session_id);
let query_params = convert_query_params_to_strings(query_params);
let response = HttpClient::post(&server_id, path.as_str(), Some(query_params), None)
let response = HttpClient::post(&server_id, path.as_str(), query_params, None)
.await
.map_err(|e| format!("Error cancel session: {}", e))?;
@@ -111,75 +112,161 @@ pub async fn cancel_session_chat<R: Runtime>(
}
#[tauri::command]
pub async fn new_chat<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn chat_create(
app_handle: AppHandle,
server_id: String,
websocket_id: String,
message: String,
message: Option<String>,
attachments: Option<Vec<String>>,
query_params: Option<HashMap<String, Value>>,
) -> Result<GetResponse, String> {
let body = if !message.is_empty() {
let message = ChatRequestMessage {
message: Some(message),
client_id: String,
) -> Result<(), String> {
println!("chat_create message: {:?}", message);
println!("chat_create attachments: {:?}", attachments);
let message_empty = message.as_ref().map_or(true, |m| m.is_empty());
let attachments_empty = attachments.as_ref().map_or(true, |a| a.is_empty());
if message_empty && attachments_empty {
return Err("Message and attachments are empty".to_string());
}
let body = {
let request_message: ChatRequestMessage = ChatRequestMessage {
message,
attachments,
};
println!("chat_create body: {:?}", request_message);
Some(
serde_json::to_string(&message)
serde_json::to_string(&request_message)
.map_err(|e| format!("Failed to serialize message: {}", e))?
.into(),
)
} else {
None
};
let mut headers = HashMap::new();
headers.insert("WEBSOCKET-SESSION-ID".to_string(), websocket_id.into());
let response = HttpClient::advanced_post(
&server_id,
"/chat/_create",
None,
convert_query_params_to_strings(query_params),
body,
)
.await
.map_err(|e| format!("Error sending message: {}", e))?;
let response =
HttpClient::advanced_post(&server_id, "/chat/_new", Some(headers), query_params, body)
.await
.map_err(|e| format!("Error sending message: {}", e))?;
let body_text = common::http::get_response_body_text(response).await?;
let chat_response: GetResponse =
serde_json::from_str(&body_text).map_err(|e| format!("Failed to parse response JSON: {}", e))?;
if chat_response.result != "created" {
return Err(format!("Unexpected result: {}", chat_response.result));
if response.status() == 429 {
log::warn!("Rate limit exceeded for chat create");
return Err("Rate limited".to_string());
}
Ok(chat_response)
if !response.status().is_success() {
return Err(format!("Request failed with status: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
log::info!("client_id_create: {}", &client_id);
while let Ok(Some(line)) = lines.next_line().await {
log::info!("Received chat stream line: {}", &line);
if let Err(err) = app_handle.emit(&client_id, line) {
log::error!("Emit failed: {:?}", err);
let _ = app_handle.emit("chat-create-error", format!("Emit failed: {:?}", err));
}
}
Ok(())
}
#[tauri::command]
pub async fn send_message<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn chat_chat(
app_handle: AppHandle,
server_id: String,
websocket_id: String,
session_id: String,
message: String,
message: Option<String>,
attachments: Option<Vec<String>>,
query_params: Option<HashMap<String, Value>>, //search,deep_thinking
) -> Result<String, String> {
let path = format!("/chat/{}/_send", session_id);
let msg = ChatRequestMessage {
message: Some(message),
client_id: String,
) -> Result<(), String> {
println!("chat_chat message: {:?}", message);
println!("chat_chat attachments: {:?}", attachments);
let message_empty = message.as_ref().map_or(true, |m| m.is_empty());
let attachments_empty = attachments.as_ref().map_or(true, |a| a.is_empty());
if message_empty && attachments_empty {
return Err("Message and attachments are empty".to_string());
}
let body = {
let request_message = ChatRequestMessage {
message,
attachments,
};
println!("chat_chat body: {:?}", request_message);
Some(
serde_json::to_string(&request_message)
.map_err(|e| format!("Failed to serialize message: {}", e))?
.into(),
)
};
let mut headers = HashMap::new();
headers.insert("WEBSOCKET-SESSION-ID".to_string(), websocket_id.into());
let path = format!("/chat/{}/_chat", session_id);
let body = reqwest::Body::from(serde_json::to_string(&msg).unwrap());
let response = HttpClient::advanced_post(
&server_id,
path.as_str(),
Some(headers),
query_params,
Some(body),
None,
convert_query_params_to_strings(query_params),
body,
)
.await
.map_err(|e| format!("Error cancel session: {}", e))?;
.await
.map_err(|e| format!("Error sending message: {}", e))?;
common::http::get_response_body_text(response).await
if response.status() == 429 {
log::warn!("Rate limit exceeded for chat create");
return Err("Rate limited".to_string());
}
if !response.status().is_success() {
return Err(format!("Request failed with status: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
let mut first_log = true;
log::info!("client_id: {}", &client_id);
while let Ok(Some(line)) = lines.next_line().await {
log::info!("Received chat stream line: {}", &line);
if first_log {
log::info!("first stream line: {}", &line);
first_log = false;
}
if let Err(err) = app_handle.emit(&client_id, line) {
log::error!("Emit failed: {:?}", err);
print!("Error sending message: {:?}", err);
let _ = app_handle.emit("chat-create-error", format!("Emit failed: {:?}", err));
}
}
Ok(())
}
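Both `chat_create` and `chat_chat` now stream the HTTP response line by line and re-emit every line as a Tauri event named after the caller-supplied `client_id`, replacing the old `WEBSOCKET-SESSION-ID` plumbing (this change also drops the `@tauri-apps/plugin-websocket` dependency and its capabilities). A minimal TypeScript sketch of the consuming side, assuming `@tauri-apps/api` v2, the project's `nanoid` dependency, and the default camelCase argument mapping (all ids below are hypothetical):

```ts
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { nanoid } from "nanoid";

async function sendChatMessage(serverId: string, sessionId: string, message: string) {
  // The backend emits each streamed line on this event name.
  const clientId = `chat-stream-${nanoid()}`;

  const unlisten = await listen<string>(clientId, (event) => {
    console.log("stream line:", event.payload);
  });

  try {
    // Resolves once the server closes the stream; rejects on 429 or any non-2xx status.
    await invoke("chat_chat", {
      serverId,
      sessionId,
      message,
      attachments: null,
      queryParams: { search: true, deep_thinking: false }, // keys taken from the Rust comment above
      clientId,
    });
  } finally {
    unlisten();
  }
}
```

`chat_create` follows the same pattern, minus the `sessionId` argument.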
#[tauri::command]
@@ -219,40 +306,194 @@ pub async fn update_session_chat(
None,
Some(reqwest::Body::from(serde_json::to_string(&body).unwrap())),
)
.await
.map_err(|e| format!("Error updating session: {}", e))?;
.await
.map_err(|e| format!("Error updating session: {}", e))?;
Ok(response.status().is_success())
}
#[tauri::command]
pub async fn assistant_search<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn assistant_search(
_app_handle: AppHandle,
server_id: String,
from: u32,
size: u32,
query: Option<HashMap<String, Value>>,
query_params: Option<Vec<String>>,
) -> Result<Value, String> {
let mut body = serde_json::json!({
"from": from,
"size": size,
});
if let Some(q) = query {
body["query"] = serde_json::to_value(q).map_err(|e| e.to_string())?;
}
let response = HttpClient::post(
&server_id,
"/assistant/_search",
None,
Some(reqwest::Body::from(body.to_string())),
)
.await
.map_err(|e| format!("Error searching assistants: {}", e))?;
let response = HttpClient::post(&server_id, "/assistant/_search", query_params, None)
.await
.map_err(|e| format!("Error searching assistants: {}", e))?;
response
.json::<Value>()
.await
.map_err(|err| err.to_string())
}
#[tauri::command]
pub async fn assistant_get(
_app_handle: AppHandle,
server_id: String,
assistant_id: String,
) -> Result<Value, String> {
let response = HttpClient::get(
&server_id,
&format!("/assistant/{}", assistant_id),
None, // headers
)
.await
.map_err(|e| format!("Error getting assistant: {}", e))?;
response
.json::<Value>()
.await
.map_err(|err| err.to_string())
}
/// Gets the information of the assistant specified by `assistant_id` by querying **all**
/// Coco servers.
///
/// Returns as soon as the assistant is found on any Coco server.
#[tauri::command]
pub async fn assistant_get_multi(
app_handle: AppHandle,
assistant_id: String,
) -> Result<Value, String> {
let search_sources = app_handle.state::<SearchSourceRegistry>();
let sources_future = search_sources.get_sources();
let sources_list = sources_future.await;
let mut futures = FuturesUnordered::new();
for query_source in &sources_list {
let query_source_type = query_source.get_type();
if query_source_type.r#type != COCO_SERVERS {
// Assistants only exist on Coco servers.
continue;
}
let coco_server_id = query_source_type.id.clone();
let path = format!("/assistant/{}", assistant_id);
let fut = async move {
let res_response = HttpClient::get(
&coco_server_id,
&path,
None, // headers
)
.await;
match res_response {
Ok(response) => response
.json::<serde_json::Value>()
.await
.map_err(|e| e.to_string()),
Err(e) => Err(e),
}
};
futures.push(fut);
}
while let Some(res_response_json) = futures.next().await {
let response_json = match res_response_json {
Ok(json) => json,
Err(e) => return Err(e),
};
// Example response JSON
//
// When assistant is not found:
// ```json
// {
// "_id": "ID",
// "result": "not_found"
// }
// ```
//
// When assistant is found:
// ```json
// {
// "_id": "ID",
// "_source": {...}
// "found": true
// }
// ```
if let Some(found) = response_json.get("found") {
if found.as_bool() == Some(true) {
return Ok(response_json);
}
}
}
Err(format!(
"could not find Assistant [{}] on all the Coco servers",
assistant_id
))
}
use regex::Regex;
/// Remove all `"icon": "..."` fields from a JSON string
pub fn remove_icon_fields(json: &str) -> String {
// Regex to match `"icon": "..."` fields, including base64 or escaped strings
let re = Regex::new(r#""icon"\s*:\s*"[^"]*"(,?)"#).unwrap();
// The regex match covers the field, its value, and any trailing comma, so replacing
// it with an empty string removes the whole `"icon": "..."` entry.
re.replace_all(json, "")
.to_string()
}
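// A minimal, hypothetical usage sketch assuming the regex above. Note that when
// `icon` is the last field of an object, the comma preceding it is left in place.
#[cfg(test)]
mod remove_icon_fields_sketch {
    use super::*;

    #[test]
    fn strips_icon_field_and_its_trailing_comma() {
        let input = r#"{"name":"Coco","icon":"data:image/png;base64,AAAA","type":"app"}"#;
        assert_eq!(remove_icon_fields(input), r#"{"name":"Coco","type":"app"}"#);
    }
}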
#[tauri::command]
pub async fn ask_ai(
app_handle: AppHandle,
message: String,
server_id: String,
assistant_id: String,
client_id: String,
) -> Result<(), String> {
let cleaned = remove_icon_fields(message.as_str());
let body = serde_json::json!({ "message": cleaned });
let path = format!("/assistant/{}/_ask", assistant_id);
println!("Sending request to {}", &path);
let response = HttpClient::send_request(
server_id.as_str(),
Method::POST,
path.as_str(),
None,
None,
Some(reqwest::Body::from(body.to_string())),
)
.await?;
if response.status() == 429 {
log::warn!("Rate limit exceeded for assistant: {}", &assistant_id);
return Ok(());
}
if !response.status().is_success() {
return Err(format!("Request Failed: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
while let Ok(Some(line)) = lines.next_line().await {
log::debug!("Received line: {}", &line);
let _ = app_handle.emit(&client_id, line).map_err(|err| {
println!("Failed to emit: {:?}", err);
});
}
Ok(())
}

View File

@@ -1,43 +1,48 @@
use std::{fs::create_dir, io::Read};
use tauri::{Manager, Runtime};
use tauri::Manager;
use tauri_plugin_autostart::ManagerExt;
// Start or stop according to configuration
pub fn enable_autostart(app: &mut tauri::App) {
use tauri_plugin_autostart::MacosLauncher;
use tauri_plugin_autostart::ManagerExt;
app.handle()
.plugin(tauri_plugin_autostart::init(
MacosLauncher::AppleScript,
None,
))
.unwrap();
/// If the state reported by the OS and the state stored by us differ, our stored state
/// takes priority and is treated as the correct one. Update the OS state to make them consistent.
pub fn ensure_autostart_state_consistent(app: &mut tauri::App) -> Result<(), String> {
let autostart_manager = app.autolaunch();
// close autostart
// autostart_manager.disable().unwrap();
// return;
let os_state = autostart_manager.is_enabled().map_err(|e| e.to_string())?;
let coco_stored_state = current_autostart(app.app_handle()).map_err(|e| e.to_string())?;
match (
autostart_manager.is_enabled(),
current_autostart(app.app_handle()),
) {
(Ok(false), Ok(true)) => match autostart_manager.enable() {
Ok(_) => println!("Autostart enabled successfully."),
Err(err) => eprintln!("Failed to enable autostart: {}", err),
},
(Ok(true), Ok(false)) => match autostart_manager.disable() {
Ok(_) => println!("Autostart disable successfully."),
Err(err) => eprintln!("Failed to disable autostart: {}", err),
},
_ => (),
if os_state != coco_stored_state {
log::warn!(
"autostart inconsistent states, OS state [{}], Coco state [{}], config file could be deleted or corrupted",
os_state,
coco_stored_state
);
log::info!("trying to correct the inconsistent states");
let result = if coco_stored_state {
autostart_manager.enable()
} else {
autostart_manager.disable()
};
match result {
Ok(_) => {
log::info!("inconsistent autostart states fixed");
}
Err(e) => {
log::error!(
"failed to fix inconsistent autostart state due to error [{}]",
e
);
return Err(e.to_string());
}
}
}
Ok(())
}
fn current_autostart<R: Runtime>(app: &tauri::AppHandle<R>) -> Result<bool, String> {
fn current_autostart(app: &tauri::AppHandle) -> Result<bool, String> {
use std::fs::File;
let path = app.path().app_config_dir().unwrap();
@@ -60,10 +65,7 @@ fn current_autostart<R: Runtime>(app: &tauri::AppHandle<R>) -> Result<bool, Stri
}
#[tauri::command]
pub async fn change_autostart<R: Runtime>(
app: tauri::AppHandle<R>,
open: bool,
) -> Result<(), String> {
pub async fn change_autostart(app: tauri::AppHandle, open: bool) -> Result<(), String> {
use std::fs::File;
use std::io::Write;

View File

@@ -3,19 +3,22 @@ use serde_json::Value;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ChatRequestMessage {
#[serde(skip_serializing_if = "Option::is_none")]
pub message: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub attachments: Option<Vec<String>>,
}
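// A minimal, hypothetical sketch of how this request body serializes: fields that
// are `None` are omitted thanks to `skip_serializing_if`.
#[cfg(test)]
mod chat_request_message_sketch {
    use super::*;

    #[test]
    fn none_fields_are_omitted_from_the_body() {
        let msg = ChatRequestMessage {
            message: Some("hi".to_string()),
            attachments: None,
        };
        assert_eq!(serde_json::to_string(&msg).unwrap(), r#"{"message":"hi"}"#);
    }
}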
#[allow(dead_code)]
pub struct NewChatResponse {
pub _id: String,
pub _source: Source,
pub _source: Session,
pub result: String,
pub payload: Option<Value>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct Source {
pub struct Session {
pub id: String,
pub created: String,
pub updated: String,
@@ -23,4 +26,11 @@ pub struct Source {
pub title: Option<String>,
pub summary: Option<String>,
pub manually_renamed_title: bool,
pub visible: Option<bool>,
pub context: Option<SessionContext>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct SessionContext {
pub attachments: Option<Vec<String>>,
}

View File

@@ -1,6 +1,6 @@
use serde::{Deserialize, Serialize};
#[derive(Debug,Clone, Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Connector {
pub id: String,
pub created: Option<String>,
@@ -13,7 +13,7 @@ pub struct Connector {
pub url: Option<String>,
pub assets: Option<ConnectorAssets>,
}
#[derive(Debug,Clone, Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ConnectorAssets {
pub icons: Option<std::collections::HashMap<String, String>>,
}

View File

@@ -18,4 +18,4 @@ pub struct DataSource {
pub struct ConnectorConfig {
pub id: Option<String>,
pub config: Option<serde_json::Value>, // Using serde_json::Value to handle any type of config
}

View File

@@ -1,5 +1,6 @@
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use tauri::AppHandle;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RichLabel {
@@ -29,6 +30,138 @@ pub struct EditorInfo {
pub timestamp: Option<String>,
}
/// Defines the action that would be performed when a document gets opened.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub(crate) enum OnOpened {
/// Launch the application
Application { app_path: String },
/// Open the URL.
Document { url: String },
/// Spawn a child process to run the `CommandAction`.
Command {
action: crate::extension::CommandAction,
},
// NOTE that this variant has the same definition as `struct Quicklink`, but we
// cannot use that type directly: its `link` field is serialized/deserialized
// to/from a string, whereas here we need a JSON object (see the sketch after this enum).
//
// See also the comments in `struct Quicklink`.
Quicklink {
link: crate::extension::QuicklinkLink,
open_with: Option<String>,
},
}
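// A minimal, hypothetical sketch of the wire format, assuming serde's default
// externally-tagged enum encoding and no extra serde attributes on `OnOpened`:
// a `Document` variant serializes to a JSON object keyed by the variant name.
#[cfg(test)]
mod on_opened_sketch {
    use super::*;

    #[test]
    fn document_variant_is_externally_tagged() {
        let opened = OnOpened::Document {
            url: "https://example.com".to_string(),
        };
        assert_eq!(
            serde_json::to_string(&opened).unwrap(),
            r#"{"Document":{"url":"https://example.com"}}"#
        );
    }
}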
impl OnOpened {
pub(crate) fn url(&self) -> String {
match self {
Self::Application { app_path } => app_path.clone(),
Self::Document { url } => url.clone(),
Self::Command { action } => {
const WHITESPACE: &str = " ";
let mut ret = action.exec.clone();
ret.push_str(WHITESPACE);
if let Some(ref args) = action.args {
ret.push_str(args.join(WHITESPACE).as_str());
}
ret
}
// Currently, our URL is static and does not support dynamic parameters.
// The URL of a quicklink is nearly useless without such dynamic user
// inputs, so until we have dynamic URL support, we just use "N/A".
Self::Quicklink { .. } => String::from("N/A"),
}
}
}
#[tauri::command]
pub(crate) async fn open(
tauri_app_handle: AppHandle,
on_opened: OnOpened,
extra_args: Option<HashMap<String, String>>,
) -> Result<(), String> {
use crate::util::open as homemade_tauri_shell_open;
use std::process::Command;
match on_opened {
OnOpened::Application { app_path } => {
log::debug!("open application [{}]", app_path);
homemade_tauri_shell_open(tauri_app_handle.clone(), app_path).await?
}
OnOpened::Document { url } => {
log::debug!("open document [{}]", url);
homemade_tauri_shell_open(tauri_app_handle.clone(), url).await?
}
OnOpened::Command { action } => {
log::debug!("open (execute) command [{:?}]", action);
let mut cmd = Command::new(action.exec);
if let Some(args) = action.args {
cmd.args(args);
}
let output = cmd.output().map_err(|e| e.to_string())?;
// Sometimes we want to see the result in the logs even when the command succeeds.
log::debug!(
"executing open(Command) result, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
if !output.status.success() {
log::warn!(
"executing open(Command) failed, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
return Err(format!(
"Command failed, stderr [{}]",
String::from_utf8_lossy(&output.stderr)
));
}
}
OnOpened::Quicklink {
link,
open_with: opt_open_with,
} => {
let url = link.concatenate_url(&extra_args);
log::debug!("open quicklink [{}] with [{:?}]", url, opt_open_with);
cfg_if::cfg_if! {
// The `open_with` functionality is only supported on macOS, provided
// by the `open -a` command.
if #[cfg(target_os = "macos")] {
let mut cmd = Command::new("open");
if let Some(ref open_with) = opt_open_with {
cmd.arg("-a");
cmd.arg(open_with.as_str());
}
cmd.arg(&url);
let output = cmd.output().map_err(|e| format!("failed to spawn [open] due to error [{}]", e))?;
if !output.status.success() {
return Err(format!(
"failed to open with app {:?}: {}",
opt_open_with,
String::from_utf8_lossy(&output.stderr)
));
}
} else {
homemade_tauri_shell_open(tauri_app_handle.clone(), url).await?
}
}
}
}
Ok(())
}
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct Document {
pub id: String,
@@ -48,6 +181,8 @@ pub struct Document {
pub thumbnail: Option<String>,
pub cover: Option<String>,
pub tags: Option<Vec<String>>,
/// What will happen if we open this document.
pub on_opened: Option<OnOpened>,
pub url: Option<String>,
pub size: Option<i64>,
pub metadata: Option<HashMap<String, serde_json::Value>>,

View File

@@ -1,34 +1,67 @@
use serde::{Deserialize, Serialize};
use reqwest::StatusCode;
use serde::{Deserialize, Serialize, Serializer};
use thiserror::Error;
fn serialize_optional_status_code<S>(
status_code: &Option<StatusCode>,
serializer: S,
) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
match status_code {
Some(code) => serializer.serialize_str(&format!("{:?}", code)),
None => serializer.serialize_none(),
}
}
#[allow(unused)]
#[derive(Debug, Deserialize)]
pub struct ErrorCause {
#[serde(default)]
pub r#type: Option<String>,
#[serde(default)]
pub reason: Option<String>,
}
#[derive(Debug, Deserialize)]
#[allow(unused)]
pub struct ErrorDetail {
pub reason: String,
pub status: u16,
#[serde(default)]
pub root_cause: Option<Vec<ErrorCause>>,
#[serde(default)]
pub r#type: Option<String>,
#[serde(default)]
pub reason: Option<String>,
#[serde(default)]
pub caused_by: Option<ErrorCause>,
}
#[derive(Debug, Deserialize)]
pub struct ErrorResponse {
pub error: ErrorDetail,
#[serde(default)]
pub error: Option<ErrorDetail>,
#[serde(default)]
#[allow(unused)]
pub status: Option<u16>,
}
#[derive(Debug, Error, Serialize)]
pub enum SearchError {
#[error("HTTP request failed: {0}")]
HttpError(String),
#[error("HttpError: status code [{status_code:?}], msg [{msg}]")]
HttpError {
#[serde(serialize_with = "serialize_optional_status_code")]
status_code: Option<StatusCode>,
msg: String,
},
#[error("Invalid response format: {0}")]
#[error("ParseError: {0}")]
ParseError(String),
#[error("Timeout occurred")]
Timeout,
#[error("Unknown error: {0}")]
#[allow(dead_code)]
Unknown(String),
#[error("InternalError error: {0}")]
#[allow(dead_code)]
#[error("InternalError: {0}")]
InternalError(String),
}
@@ -39,7 +72,10 @@ impl From<reqwest::Error> for SearchError {
} else if err.is_decode() {
SearchError::ParseError(err.to_string())
} else {
SearchError::HttpError(err.to_string())
SearchError::HttpError {
status_code: err.status(),
msg: err.to_string(),
}
}
}
}

View File

@@ -2,6 +2,8 @@ use crate::common;
use reqwest::Response;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::collections::HashMap;
use tauri_plugin_store::JsonValue;
#[derive(Debug, Serialize, Deserialize)]
pub struct GetResponse {
@@ -40,13 +42,34 @@ pub async fn get_response_body_text(response: Response) -> Result<String, String
Ok(parsed_error) => {
dbg!(&parsed_error);
Err(format!(
"Server error ({}): {}",
parsed_error.error.status, parsed_error.error.reason
"Server error ({}): {:?}",
status, parsed_error.error
))
}
Err(_) => Err(fallback_error),
Err(_) => {
log::warn!("Failed to parse error response: {}", &body);
Err(fallback_error)
}
}
} else {
Ok(body)
}
}
}
pub fn convert_query_params_to_strings(
query_params: Option<HashMap<String, JsonValue>>,
) -> Option<Vec<String>> {
query_params.map(|map| {
map.into_iter()
.filter_map(|(k, v)| match v {
JsonValue::String(s) => Some(format!("{}={}", k, s)),
JsonValue::Number(n) => Some(format!("{}={}", k, n)),
JsonValue::Bool(b) => Some(format!("{}={}", k, b)),
_ => {
eprintln!("Skipping unsupported query value for key '{}': {:?}", k, v);
None
}
})
.collect()
})
}
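// A minimal, hypothetical usage sketch: scalar values become "key=value" pairs,
// while unsupported values (arrays, objects, null) are skipped.
#[cfg(test)]
mod convert_query_params_sketch {
    use super::*;

    #[test]
    fn scalars_become_key_value_pairs() {
        let mut params = HashMap::new();
        params.insert("search".to_string(), JsonValue::Bool(true));
        params.insert("from".to_string(), JsonValue::from(10));
        params.insert("filters".to_string(), JsonValue::Array(Vec::new()));
        let mut pairs = convert_query_params_to_strings(Some(params)).unwrap();
        pairs.sort();
        assert_eq!(pairs, vec!["from=10".to_string(), "search=true".to_string()]);
    }
}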

View File

@@ -1,16 +1,17 @@
pub mod health;
pub mod profile;
pub mod server;
pub mod auth;
pub mod datasource;
pub mod connector;
pub mod search;
pub mod document;
pub mod traits;
pub mod register;
pub mod assistant;
pub mod http;
pub mod auth;
pub mod connector;
pub mod datasource;
pub mod document;
pub mod error;
pub mod health;
pub mod http;
pub mod profile;
pub mod register;
pub mod search;
pub mod server;
pub mod traits;
pub static MAIN_WINDOW_LABEL: &str = "main";
pub static SETTINGS_WINDOW_LABEL: &str = "settings";
pub static CHECK_WINDOW_LABEL: &str = "check";

View File

@@ -13,4 +13,4 @@ pub struct UserProfile {
pub email: String,
pub avatar: Option<String>,
pub preferences: Option<Preferences>,
}

View File

@@ -7,8 +7,8 @@ use std::error::Error;
#[derive(Debug, Serialize, Deserialize)]
pub struct SearchResponse<T> {
pub took: u64,
pub timed_out: bool,
pub took: Option<u64>,
pub timed_out: Option<bool>,
pub _shards: Option<Shards>,
pub hits: Hits<T>,
}
@@ -25,7 +25,7 @@ pub struct Shards {
pub struct Hits<T> {
pub total: Total,
pub max_score: Option<f32>,
pub hits: Vec<SearchHit<T>>,
pub hits: Option<Vec<SearchHit<T>>>,
}
#[derive(Debug, Serialize, Deserialize)]
@@ -36,9 +36,9 @@ pub struct Total {
#[derive(Debug, Serialize, Deserialize)]
pub struct SearchHit<T> {
pub _index: String,
pub _type: String,
pub _id: String,
pub _index: Option<String>,
pub _type: Option<String>,
pub _id: Option<String>,
pub _score: Option<f64>,
pub _source: T, // This will hold the type we pass in (e.g., DataSource)
}
@@ -58,13 +58,18 @@ where
Ok(search_response)
}
use serde::de::DeserializeOwned;
pub async fn parse_search_hits<T>(response: Response) -> Result<Vec<SearchHit<T>>, Box<dyn Error>>
where
T: for<'de> Deserialize<'de> + std::fmt::Debug,
T: DeserializeOwned + std::fmt::Debug,
{
let response = parse_search_response(response).await?;
Ok(response.hits.hits)
match response.hits.hits {
Some(hits) => Ok(hits),
None => Ok(Vec::new()),
}
}
pub async fn parse_search_results<T>(response: Response) -> Result<Vec<T>, Box<dyn Error>>
@@ -78,20 +83,6 @@ where
.collect())
}
#[allow(dead_code)]
pub async fn parse_search_results_with_score<T>(
response: Response,
) -> Result<Vec<(T, Option<f64>)>, Box<dyn Error>>
where
T: for<'de> Deserialize<'de> + std::fmt::Debug,
{
Ok(parse_search_hits(response)
.await?
.into_iter()
.map(|hit| (hit._source, hit._score))
.collect())
}
#[derive(Debug, Clone, Serialize)]
pub struct SearchQuery {
pub from: u64,

View File

@@ -1,6 +1,8 @@
use crate::common::health::Health;
use crate::common::profile::UserProfile;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};
#[derive(Debug, Clone, Serialize, Deserialize)]
@@ -48,9 +50,17 @@ pub struct Server {
pub updated: String,
#[serde(default = "default_enabled_type")]
pub enabled: bool,
/// Public Coco servers can be used without signing in.
#[serde(default = "default_bool_type")]
pub public: bool,
/// A Coco server is available if:
///
/// 1. It is still online, we check this via the `GET /base_url/provider/_info`
/// interface.
/// 2. A user is logged in to this Coco server, i.e., a token is stored in the
/// `SERVER_TOKEN_LIST_CACHE`.
/// For public Coco servers, requirement 2 is not needed.
#[serde(default = "default_available_type")]
pub available: bool,
@@ -60,6 +70,7 @@ pub struct Server {
pub auth_provider: AuthProvider,
#[serde(default = "default_priority_type")]
pub priority: u32,
pub stats: Option<HashMap<String, Value>>,
}
impl PartialEq for Server {
@@ -81,7 +92,10 @@ pub struct ServerAccessToken {
#[serde(default = "default_empty_string")] // Custom default function for empty string
pub id: String,
pub access_token: String,
pub expired_at: u32, //unix timestamp in seconds
/// Unix timestamp in seconds
///
/// Currently, this is UNUSED.
pub expired_at: u32,
}
impl ServerAccessToken {

View File

@@ -1,13 +1,16 @@
use crate::common::error::SearchError;
// use std::{future::Future, pin::Pin};
use crate::common::search::SearchQuery;
use crate::common::search::{QueryResponse, QuerySource};
use async_trait::async_trait;
use tauri::AppHandle;
#[async_trait]
pub trait SearchSource: Send + Sync {
fn get_type(&self) -> QuerySource;
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError>;
async fn search(
&self,
tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError>;
}

View File

@@ -0,0 +1,13 @@
pub(super) const EXTENSION_ID: &str = "AIOverview";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "AIOverview",
"name": "AI Overview",
"description": "...",
"icon": "font_a-AIOverview",
"type": "ai_extension",
"enabled": true
}
"#;

View File

@@ -12,9 +12,10 @@ pub use with_feature::*;
#[cfg(not(feature = "use_pizza_engine"))]
pub use without_feature::*;
#[derive(Debug, Serialize, Clone)]
#[serde(rename_all = "camelCase")]
#[allow(dead_code)]
pub struct AppEntry {
path: String,
name: String,
@@ -24,15 +25,26 @@ pub struct AppEntry {
is_disabled: bool,
}
#[derive(serde::Serialize)]
#[serde(rename_all = "camelCase")]
pub struct AppMetadata {
name: String,
r#where: String,
size: u64,
icon: String,
created: u128,
modified: u128,
last_opened: u128,
}
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "Applications",
"platforms": ["macos", "linux", "windows"],
"name": "Applications",
"description": "Application search",
"icon": "font_Application",
"type": "group",
"enabled": true
}
"#;

View File

@@ -1,18 +1,18 @@
use super::super::Extension;
use super::AppMetadata;
use crate::common::error::SearchError;
use crate::common::search::{QueryResponse, QuerySource, SearchQuery};
use crate::common::traits::SearchSource;
use crate::local::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use async_trait::async_trait;
use tauri::{AppHandle, Runtime};
use super::AppEntry;
use super::AppMetadata;
use tauri::AppHandle;
pub(crate) const QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME: &str = "Applications";
pub struct ApplicationSearchSource;
impl ApplicationSearchSource {
pub async fn init<R: Runtime>(_app_handle: AppHandle<R>) -> Result<(), String> {
pub async fn prepare_index_and_store(_app_handle: AppHandle) -> Result<(), String> {
Ok(())
}
}
@@ -30,7 +30,11 @@ impl SearchSource for ApplicationSearchSource {
}
}
async fn search(&self, _query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
_query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
@@ -39,49 +43,39 @@ impl SearchSource for ApplicationSearchSource {
}
}
#[tauri::command]
pub async fn set_app_alias(_app_path: String, _alias: String) -> Result<(), String> {
pub fn set_app_alias(_tauri_app_handle: &AppHandle, _app_path: &str, _alias: &str) {
unreachable!("app list should be empty, there is no way this can be invoked")
}
#[tauri::command]
pub async fn register_app_hotkey<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
_app_path: String,
_hotkey: String,
pub fn register_app_hotkey(
_tauri_app_handle: &AppHandle,
_app_path: &str,
_hotkey: &str,
) -> Result<(), String> {
unreachable!("app list should be empty, there is no way this can be invoked")
}
#[tauri::command]
pub async fn unregister_app_hotkey<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
_app_path: String,
) -> Result<(), String> {
pub fn unregister_app_hotkey(_tauri_app_handle: &AppHandle, _app_path: &str) -> Result<(), String> {
unreachable!("app list should be empty, there is no way this can be invoked")
}
#[tauri::command]
pub async fn disable_app_search<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
_app_path: String,
) -> Result<(), String> {
pub fn disable_app_search(_tauri_app_handle: &AppHandle, _app_path: &str) -> Result<(), String> {
// no-op
Ok(())
}
#[tauri::command]
pub async fn enable_app_search<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
_app_path: String,
) -> Result<(), String> {
pub fn enable_app_search(_tauri_app_handle: &AppHandle, _app_path: &str) -> Result<(), String> {
// no-op
Ok(())
}
pub fn is_app_search_enabled(_app_path: &str) -> bool {
false
}
#[tauri::command]
pub async fn add_app_search_path<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
pub async fn add_app_search_path(
_tauri_app_handle: AppHandle,
_search_path: String,
) -> Result<(), String> {
// no-op
@@ -89,8 +83,8 @@ pub async fn add_app_search_path<R: Runtime>(
}
#[tauri::command]
pub async fn remove_app_search_path<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
pub async fn remove_app_search_path(
_tauri_app_handle: AppHandle,
_search_path: String,
) -> Result<(), String> {
// no-op
@@ -98,24 +92,37 @@ pub async fn remove_app_search_path<R: Runtime>(
}
#[tauri::command]
pub async fn get_app_search_path<R: Runtime>(_tauri_app_handle: AppHandle<R>) -> Vec<String> {
pub async fn get_app_search_path(_tauri_app_handle: AppHandle) -> Vec<String> {
// Return an empty list
Vec::new()
}
#[tauri::command]
pub async fn get_app_list<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
) -> Result<Vec<AppEntry>, String> {
pub async fn get_app_list(_tauri_app_handle: AppHandle) -> Result<Vec<Extension>, String> {
// Return an empty list
Ok(Vec::new())
}
#[tauri::command]
pub async fn get_app_metadata<R: Runtime>(
_tauri_app_handle: AppHandle<R>,
pub async fn get_app_metadata(
_tauri_app_handle: AppHandle,
_app_path: String,
) -> Result<AppMetadata, String> {
unreachable!("app list should be empty, there is no way this can be invoked")
}
pub(crate) fn set_apps_hotkey(_tauri_app_handle: &AppHandle) -> Result<(), String> {
// no-op
Ok(())
}
pub(crate) fn unset_apps_hotkey(_tauri_app_handle: &AppHandle) -> Result<(), String> {
// no-op
Ok(())
}
#[tauri::command]
pub async fn reindex_applications(_tauri_app_handle: AppHandle) -> Result<(), String> {
// no-op
Ok(())
}

View File

@@ -1,4 +1,4 @@
use super::LOCAL_QUERY_SOURCE_TYPE;
use super::super::LOCAL_QUERY_SOURCE_TYPE;
use crate::common::{
document::{DataSourceReference, Document},
error::SearchError,
@@ -10,9 +10,23 @@ use chinese_number::{ChineseCase, ChineseCountMethod, ChineseVariant, NumberToCh
use num2words::Num2Words;
use serde_json::Value;
use std::collections::HashMap;
use tauri::AppHandle;
pub(crate) const DATA_SOURCE_ID: &str = "Calculator";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "Calculator",
"name": "Calculator",
"platforms": ["macos", "linux", "windows"],
"description": "...",
"icon": "font_Calculator",
"type": "calculator",
"enabled": true
}
"#;
pub struct CalculatorSource {
base_score: f64,
}
@@ -23,7 +37,7 @@ impl CalculatorSource {
}
}
fn parse_query(query: String) -> Value {
fn parse_query(query: &str) -> Value {
let mut query_json = serde_json::Map::new();
let operators = ["+", "-", "*", "/", "%"];
@@ -48,7 +62,7 @@ fn parse_query(query: String) -> Value {
query_json.insert("type".to_string(), Value::String("expression".to_string()));
}
query_json.insert("value".to_string(), Value::String(query));
query_json.insert("value".to_string(), Value::String(query.to_string()));
Value::Object(query_json)
}
@@ -107,12 +121,22 @@ impl SearchSource for CalculatorSource {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
let query_string = query
.query_strings
.get("query")
.unwrap_or(&"".to_string())
.to_string();
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
// Trim the leading and trailing whitespace so that our later if condition
// will only be evaluated against non-whitespace characters.
let query_string = query_string.trim();
if query_string.is_empty() || query_string.len() == 1 {
return Ok(QueryResponse {
@@ -122,42 +146,54 @@ impl SearchSource for CalculatorSource {
});
}
match meval::eval_str(&query_string) {
Ok(num) => {
let mut payload: HashMap<String, Value> = HashMap::new();
let query_string_clone = query_string.to_string();
let query_source = self.get_type();
let base_score = self.base_score;
let closure = move || -> QueryResponse {
let res_num = meval::eval_str(&query_string_clone);
let payload_query = parse_query(query_string);
let payload_result = parse_result(num);
match res_num {
Ok(num) => {
let mut payload: HashMap<String, Value> = HashMap::new();
payload.insert("query".to_string(), payload_query);
payload.insert("result".to_string(), payload_result);
let payload_query = parse_query(&query_string_clone);
let payload_result = parse_result(num);
let doc = Document {
id: DATA_SOURCE_ID.to_string(),
category: Some(DATA_SOURCE_ID.to_string()),
payload: Some(payload),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(DATA_SOURCE_ID.into()),
id: Some(DATA_SOURCE_ID.into()),
icon: None,
}),
..Default::default()
};
payload.insert("query".to_string(), payload_query);
payload.insert("result".to_string(), payload_result);
return Ok(QueryResponse {
source: self.get_type(),
hits: vec![(doc, self.base_score)],
total_hits: 1,
});
}
Err(_) => {
return Ok(QueryResponse {
source: self.get_type(),
let doc = Document {
id: DATA_SOURCE_ID.to_string(),
category: Some(DATA_SOURCE_ID.to_string()),
payload: Some(payload),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(DATA_SOURCE_ID.into()),
id: Some(DATA_SOURCE_ID.into()),
icon: Some(String::from("font_Calculator")),
}),
..Default::default()
};
QueryResponse {
source: query_source,
hits: vec![(doc, base_score)],
total_hits: 1,
}
}
Err(_) => QueryResponse {
source: query_source,
hits: Vec::new(),
total_hits: 0,
});
},
}
};
let spawn_result = tokio::task::spawn_blocking(closure).await;
match spawn_result {
Ok(response) => Ok(response),
Err(e) => std::panic::resume_unwind(e.into_panic()),
}
}
}

View File

@@ -0,0 +1,209 @@
//! File Search configuration entries definition and getter/setter functions.
use serde::Deserialize;
use serde::Serialize;
use serde_json::Value;
use std::sync::LazyLock;
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;
// Tauri store keys for file system configuration
const TAURI_STORE_FILE_SYSTEM_CONFIG: &str = "file_system_config";
const TAURI_STORE_KEY_SEARCH_BY: &str = "search_by";
const TAURI_STORE_KEY_SEARCH_PATHS: &str = "search_paths";
const TAURI_STORE_KEY_EXCLUDE_PATHS: &str = "exclude_paths";
const TAURI_STORE_KEY_FILE_TYPES: &str = "file_types";
static HOME_DIR: LazyLock<String> = LazyLock::new(|| {
let os_string = dirs::home_dir()
.expect("$HOME should be set")
.into_os_string();
os_string
.into_string()
.expect("User home directory should be encoded with UTF-8")
});
#[derive(Debug, Clone, Serialize, Deserialize, Copy)]
pub enum SearchBy {
Name,
NameAndContents,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FileSearchConfig {
pub search_paths: Vec<String>,
pub exclude_paths: Vec<String>,
pub file_types: Vec<String>,
pub search_by: SearchBy,
}
impl Default for FileSearchConfig {
fn default() -> Self {
Self {
search_paths: vec![
format!("{}/Documents", HOME_DIR.as_str()),
format!("{}/Desktop", HOME_DIR.as_str()),
format!("{}/Downloads", HOME_DIR.as_str()),
],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
}
}
}
impl FileSearchConfig {
pub(crate) fn get(tauri_app_handle: &AppHandle) -> Self {
let store = tauri_app_handle
.store(TAURI_STORE_FILE_SYSTEM_CONFIG)
.unwrap_or_else(|e| {
panic!(
"store [{}] not found/loaded, error [{}]",
TAURI_STORE_FILE_SYSTEM_CONFIG, e
)
});
// Default value, will be used when specific config entries are not set
let default_config = FileSearchConfig::default();
let search_paths = {
if let Some(search_paths) = store.get(TAURI_STORE_KEY_SEARCH_PATHS) {
match search_paths {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'search_paths' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'search_paths' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_SEARCH_PATHS,
default_config.search_paths.as_slice(),
);
default_config.search_paths
}
};
let exclude_paths = {
if let Some(exclude_paths) = store.get(TAURI_STORE_KEY_EXCLUDE_PATHS) {
match exclude_paths {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'exclude_paths' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'exclude_paths' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_EXCLUDE_PATHS,
default_config.exclude_paths.as_slice(),
);
default_config.exclude_paths
}
};
let file_types = {
if let Some(file_types) = store.get(TAURI_STORE_KEY_FILE_TYPES) {
match file_types {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'file_types' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'file_types' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_FILE_TYPES,
default_config.file_types.as_slice(),
);
default_config.file_types
}
};
let search_by = {
if let Some(search_by) = store.get(TAURI_STORE_KEY_SEARCH_BY) {
serde_json::from_value(search_by.clone()).unwrap_or_else(|e| {
panic!(
"Failed to deserialize 'search_by' from file system config store. Invalid JSON: {:?}, error: {}",
search_by, e
)
})
} else {
store.set(
TAURI_STORE_KEY_SEARCH_BY,
serde_json::to_value(default_config.search_by).unwrap(),
);
default_config.search_by
}
};
Self {
search_by,
search_paths,
exclude_paths,
file_types,
}
}
}
// Tauri commands for managing file system configuration
#[tauri::command]
pub async fn get_file_system_config(tauri_app_handle: AppHandle) -> FileSearchConfig {
FileSearchConfig::get(&tauri_app_handle)
}
#[tauri::command]
pub async fn set_file_system_config(
tauri_app_handle: AppHandle,
config: FileSearchConfig,
) -> Result<(), String> {
let store = tauri_app_handle
.store(TAURI_STORE_FILE_SYSTEM_CONFIG)
.map_err(|e| e.to_string())?;
store.set(TAURI_STORE_KEY_SEARCH_PATHS, config.search_paths);
store.set(TAURI_STORE_KEY_EXCLUDE_PATHS, config.exclude_paths);
store.set(TAURI_STORE_KEY_FILE_TYPES, config.file_types);
store.set(
TAURI_STORE_KEY_SEARCH_BY,
serde_json::to_value(config.search_by).unwrap(),
);
Ok(())
}

View File

@@ -0,0 +1,186 @@
use super::super::EXTENSION_ID;
use super::super::config::FileSearchConfig;
use super::super::config::SearchBy;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::get_file_icon;
use futures::stream::Stream;
use futures::stream::StreamExt;
use std::os::fd::OwnedFd;
use std::path::Path;
use tokio::io::AsyncBufReadExt;
use tokio::io::BufReader;
use tokio::process::Child;
use tokio::process::Command;
use tokio_stream::wrappers::LinesStream;
/// `mdfind` does not return scores, so we use this score for all documents.
const SCORE: f64 = 1.0;
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let (mut iter, mut mdfind_child_process) =
execute_mdfind_query(&query_string, from, size, &config)?;
// Convert results to documents
let mut hits: Vec<(Document, f64)> = Vec::new();
while let Some(res_file_path) = iter.next().await {
let file_path = res_file_path.map_err(|io_err| io_err.to_string())?;
let icon = get_file_icon(file_path.clone()).await;
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.clone(),
};
let doc = Document {
id: file_path.clone(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path),
icon: Some(icon.to_string()),
..Default::default()
};
hits.push((doc, SCORE));
}
// Kill the mdfind process once we get the needed results to prevent zombie
// processes.
mdfind_child_process
.kill()
.await
.map_err(|e| format!("{:?}", e))?;
Ok(hits)
}
/// Return an array containing the `mdfind` command and its arguments.
fn build_mdfind_query(query_string: &str, config: &FileSearchConfig) -> Vec<String> {
let mut args = vec!["mdfind".to_string()];
match config.search_by {
SearchBy::Name => {
args.push(format!("kMDItemFSName == '*{}*'", query_string));
}
SearchBy::NameAndContents => {
args.push(format!(
"kMDItemFSName == '*{}*' || kMDItemTextContent == '{}'",
query_string, query_string
));
}
}
// Add search paths using -onlyin
for path in &config.search_paths {
if Path::new(path).exists() {
args.extend_from_slice(&["-onlyin".to_string(), path.to_string()]);
}
}
args
}
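// A minimal, hypothetical sketch of the argument vector this produces. "/" is used
// as the search path here only because it is guaranteed to exist; real configs
// would use the user's directories.
#[cfg(test)]
mod build_mdfind_query_sketch {
    use super::*;

    #[test]
    fn name_search_with_one_existing_path() {
        let config = FileSearchConfig {
            search_paths: vec!["/".to_string()],
            exclude_paths: Vec::new(),
            file_types: Vec::new(),
            search_by: SearchBy::Name,
        };
        assert_eq!(
            build_mdfind_query("coco", &config),
            vec![
                "mdfind".to_string(),
                "kMDItemFSName == '*coco*'".to_string(),
                "-onlyin".to_string(),
                "/".to_string(),
            ]
        );
    }
}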
/// Spawn the `mdfind` child process and return an async iterator over its output,
/// allowing us to collect the results asynchronously.
///
/// # Return value:
///
/// * impl Stream: an async iterator that yields the matched files
/// * Child: the handle to the mdfind process; we need to kill it once we
///   have collected all the results to avoid zombie processes.
fn execute_mdfind_query(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<(impl Stream<Item = std::io::Result<String>>, Child), String> {
let args = build_mdfind_query(query_string, &config);
let (rx, tx) = std::io::pipe().unwrap();
let rx_owned = OwnedFd::from(rx);
let async_rx = tokio::net::unix::pipe::Receiver::from_owned_fd(rx_owned).unwrap();
let buffered_rx = BufReader::new(async_rx);
let lines = LinesStream::new(buffered_rx.lines());
let child = Command::new(&args[0])
.args(&args[1..])
.stdout(tx)
.stderr(std::process::Stdio::null())
.spawn()
.map_err(|e| format!("Failed to spawn mdfind: {}", e))?;
let config_clone = config.clone();
let iter = lines
.filter(move |res_path| {
std::future::ready({
match res_path {
Ok(path) => !should_be_filtered_out(&config_clone, path),
Err(_) => {
// Don't filter out Err() values
true
}
}
})
})
.skip(from)
.take(size);
Ok((iter, child))
}
/// Returns whether `file_path` should be removed from the search results, given the
/// filter conditions specified in `config`.
fn should_be_filtered_out(config: &FileSearchConfig, file_path: &str) -> bool {
let is_excluded = config
.exclude_paths
.iter()
.any(|exclude_path| file_path.starts_with(exclude_path));
if is_excluded {
return true;
}
let matches_file_type = if config.file_types.is_empty() {
true
} else {
let path_obj = camino::Utf8Path::new(&file_path);
if let Some(extension) = path_obj.extension() {
config
.file_types
.iter()
.any(|file_type| file_type == extension)
} else {
// `config.file_types` is not empty, so results without an
// extension cannot match.
false
}
};
!matches_file_type
}
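// A minimal, hypothetical sketch of the filter rules: hits under an excluded path
// are dropped, and when `file_types` is non-empty a hit must match one of them.
#[cfg(test)]
mod should_be_filtered_out_sketch {
    use super::*;

    #[test]
    fn exclude_paths_and_file_types_are_honored() {
        let config = FileSearchConfig {
            search_paths: Vec::new(),
            exclude_paths: vec!["/tmp".to_string()],
            file_types: vec!["pdf".to_string()],
            search_by: SearchBy::Name,
        };
        assert!(should_be_filtered_out(&config, "/tmp/report.pdf")); // excluded path
        assert!(should_be_filtered_out(&config, "/home/me/notes.txt")); // extension mismatch
        assert!(!should_be_filtered_out(&config, "/home/me/report.pdf")); // kept
    }
}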

View File

@@ -0,0 +1,10 @@
#[cfg(target_os = "macos")]
mod macos;
#[cfg(target_os = "windows")]
mod windows;
// The `hits()` function is platform-specific; export the corresponding implementation.
#[cfg(target_os = "macos")]
pub(crate) use macos::hits;
#[cfg(target_os = "windows")]
pub(crate) use windows::hits;

View File

@@ -0,0 +1,751 @@
//! # Credits
//!
//! https://github.com/IRONAGE-Park/rag-sample/blob/3f0ad8c8012026cd3a7e453d08f041609426cb91/src/native/windows.rs
//! is the starting point of this implementation.
use super::super::EXTENSION_ID;
use super::super::config::FileSearchConfig;
use super::super::config::SearchBy;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::get_file_icon;
use windows::{
Win32::System::{
Com::{CLSCTX_INPROC_SERVER, CoCreateInstance},
Ole::{OleInitialize, OleUninitialize},
Search::{
DB_NULL_HCHAPTER, DBACCESSOR_ROWDATA, DBBINDING, DBMEMOWNER_CLIENTOWNED,
DBPARAMIO_NOTPARAM, DBPART_VALUE, DBTYPE_WSTR, HACCESSOR, IAccessor, ICommand,
ICommandText, IDBCreateCommand, IDBCreateSession, IDBInitialize, IDataInitialize,
IRowset, MSDAINITIALIZE,
},
},
core::{GUID, IUnknown, Interface, PWSTR, w},
};
/// Owned version of `PWSTR` that holds the heap memory.
///
/// Use `as_pwstr()` to convert it to a raw pointer.
struct PwStrOwned(Vec<u16>);
impl PwStrOwned {
/// # SAFETY
///
/// The returned `PWSTR` is basically a raw pointer, it is only valid within the
/// lifetime of `PwStrOwned`.
unsafe fn as_pwstr(&mut self) -> PWSTR {
let raw_ptr = self.0.as_mut_ptr();
PWSTR::from_raw(raw_ptr)
}
}
/// Construct `PwStrOwned` from any `str`.
impl<S: AsRef<str> + ?Sized> From<&S> for PwStrOwned {
fn from(value: &S) -> Self {
let mut utf16_bytes = value.as_ref().encode_utf16().collect::<Vec<u16>>();
utf16_bytes.push(0); // the trailing NUL terminator
PwStrOwned(utf16_bytes)
}
}
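// A minimal, hypothetical usage sketch: the owned UTF-16 buffer must stay alive
// for as long as the raw `PWSTR` handed to the COM APIs is in use.
#[cfg(test)]
mod pwstr_owned_sketch {
    use super::*;

    #[test]
    fn buffer_outlives_raw_pointer() {
        let mut owned_sql = PwStrOwned::from("SELECT 1");
        // SAFETY: `owned_sql` is still alive here, so the raw pointer is valid.
        let raw_sql = unsafe { owned_sql.as_pwstr() };
        assert!(!raw_sql.0.is_null());
    }
}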
/// Helper function to replace unsupported characters with whitespace.
///
/// Windows search will error out if it encounters these characters.
///
/// The complete list of unsupported characters is unknown, and we do not know how
/// to escape them, so we replace them instead.
fn query_string_cleanup(old: &str) -> String {
const UNSUPPORTED_CHAR: [char; 2] = ['\'', '\n'];
// Using len in bytes is ok
let mut chars = Vec::with_capacity(old.len());
for char in old.chars() {
if UNSUPPORTED_CHAR.contains(&char) {
chars.push(' ');
} else {
chars.push(char);
}
}
chars.into_iter().collect()
}
/// Helper function to construct the Windows Search SQL.
///
/// Paging is not natively supported by windows Search SQL, it only supports `size`
/// via the `TOP` keyword ("SELECT TOP {n} {columns}"). The SQL returned by this
/// function will have `{n}` set to `from + size`, then we will manually implement
/// paging.
fn query_sql(query_string: &str, from: usize, size: usize, config: &FileSearchConfig) -> String {
let top_n = from
.checked_add(size)
.expect("[from + size] cannot fit into an [usize]");
// System.ItemUrl is a column that contains the file path
// example: "file:C:/Users/desktop.ini"
//
// System.Search.Rank is the relevance score
let mut sql = format!(
"SELECT TOP {} System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE",
top_n
);
let query_string = query_string_cleanup(query_string);
let search_by_predicate = match config.search_by {
SearchBy::Name => {
// `contains(System.FileName, '{query_string}')` would be faster
// because it uses the inverted index, but it is not what we want
// due to tokenization limitations. For example, suppose "Coco AI.rs"
// is tokenized to `["Coco", "AI", "rs"]`; if a user then searches
// for `Co`, this file won't be returned because the term `Co` does
// not exist in the index.
//
// So we use a wildcard instead, even though it is slower.
format!("(System.FileName LIKE '%{query_string}%')")
}
SearchBy::NameAndContents => {
// Windows File Search does not support searching by file content.
//
// `CONTAINS('query_string')` searches all columns for `query_string`;
// this is the closest solution we have.
format!("((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))")
}
};
let search_paths_predicate: Option<String> = {
if config.search_paths.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, search_path) in config.search_paths.iter().enumerate() {
if idx != 0 {
output.push_str(" OR ");
}
output.push_str("SCOPE = 'file:");
output.push_str(&search_path);
output.push('\'');
}
output.push(')');
Some(output)
}
};
let exclude_paths_predicate: Option<String> = {
if config.exclude_paths.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, exclude_path) in config.exclude_paths.iter().enumerate() {
if idx != 0 {
output.push_str(" AND ");
}
output.push_str("(NOT SCOPE = 'file:");
output.push_str(&exclude_path);
output.push('\'');
output.push(')');
}
output.push(')');
Some(output)
}
};
let file_types_predicate: Option<String> = {
if config.file_types.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, file_type) in config.file_types.iter().enumerate() {
if idx != 0 {
output.push_str(" OR ");
}
// NOTE that this column contains a starting dot
output.push_str("System.FileExtension = '.");
output.push_str(&file_type);
output.push('\'');
}
output.push(')');
Some(output)
}
};
sql.push(' ');
sql.push_str(search_by_predicate.as_str());
if let Some(search_paths_predicate) = search_paths_predicate {
sql.push_str(" AND ");
sql.push_str(search_paths_predicate.as_str());
}
if let Some(exclude_paths_predicate) = exclude_paths_predicate {
sql.push_str(" AND ");
sql.push_str(exclude_paths_predicate.as_str());
}
if let Some(file_types_predicate) = file_types_predicate {
sql.push_str(" AND ");
sql.push_str(file_types_predicate.as_str());
}
sql
}
/// Default GUID for Search.CollatorDSO.1
const DBGUID_DEFAULT: GUID = GUID {
data1: 0xc8b521fb,
data2: 0x5cf3,
data3: 0x11ce,
data4: [0xad, 0xe5, 0x00, 0xaa, 0x00, 0x44, 0x77, 0x3d],
};
unsafe fn create_accessor_handle(accessor: &IAccessor, index: usize) -> Result<HACCESSOR, String> {
let bindings = DBBINDING {
iOrdinal: index,
obValue: 0,
obStatus: 0,
obLength: 0,
dwPart: DBPART_VALUE.0 as u32,
dwMemOwner: DBMEMOWNER_CLIENTOWNED.0 as u32,
eParamIO: DBPARAMIO_NOTPARAM.0 as u32,
cbMaxLen: 512,
dwFlags: 0,
wType: DBTYPE_WSTR.0 as u16,
bPrecision: 0,
bScale: 0,
..Default::default()
};
let mut status = 0;
let mut accessor_handle = HACCESSOR::default();
unsafe {
accessor
.CreateAccessor(
DBACCESSOR_ROWDATA.0 as u32,
1,
&bindings,
0,
&mut accessor_handle,
Some(&mut status),
)
.map_err(|e| e.to_string())?;
}
Ok(accessor_handle)
}
fn create_db_initialize() -> Result<IDBInitialize, String> {
unsafe {
let data_init: IDataInitialize =
CoCreateInstance(&MSDAINITIALIZE, None, CLSCTX_INPROC_SERVER)
.map_err(|e| e.to_string())?;
let mut unknown: Option<IUnknown> = None;
data_init
.GetDataSource(
None,
CLSCTX_INPROC_SERVER.0,
w!("provider=Search.CollatorDSO.1;EXTENDED PROPERTIES=\"Application=Windows\""),
&IDBInitialize::IID,
&mut unknown as *mut _ as *mut _,
)
.map_err(|e| e.to_string())?;
Ok(unknown.unwrap().cast().map_err(|e| e.to_string())?)
}
}
fn create_command(db_init: IDBInitialize) -> Result<ICommandText, String> {
unsafe {
let db_create_session: IDBCreateSession = db_init.cast().map_err(|e| e.to_string())?;
let session: IUnknown = db_create_session
.CreateSession(None, &IUnknown::IID)
.map_err(|e| e.to_string())?;
let db_create_command: IDBCreateCommand = session.cast().map_err(|e| e.to_string())?;
Ok(db_create_command
.CreateCommand(None, &ICommand::IID)
.map_err(|e| e.to_string())?
.cast()
.map_err(|e| e.to_string())?)
}
}
fn execute_windows_search_sql(sql_query: &str) -> Result<Vec<(String, String)>, String> {
unsafe {
let mut pwstr_owned_sql = PwStrOwned::from(sql_query);
// SAFETY: pwstr_owned_sql will live for the whole lifetime of this function.
let sql_query = pwstr_owned_sql.as_pwstr();
let db_init = create_db_initialize()?;
db_init.Initialize().map_err(|e| e.to_string())?;
let command = create_command(db_init)?;
// Set the command text
command
.SetCommandText(&DBGUID_DEFAULT, sql_query)
.map_err(|e| e.to_string())?;
// Execute the command
let mut rowset: Option<IRowset> = None;
command
.Execute(
None,
&IRowset::IID,
None,
None,
Some(&mut rowset as *mut _ as *mut _),
)
.map_err(|e| e.to_string())?;
let rowset = rowset.ok_or_else(|| {
format!(
"No rowset returned for query: {}",
// SAFETY: the raw pointer is not dangling
sql_query
.to_string()
.expect("the conversion should work as `sql_query` was created from a String",)
)
})?;
let accessor: IAccessor = rowset
.cast()
.map_err(|e| format!("Failed to cast to IAccessor: {}", e.to_string()))?;
let mut output = Vec::new();
let mut count = 0;
loop {
let mut rows_fetched = 0;
let mut row_handles = [std::ptr::null_mut(); 1];
let result = rowset.GetNextRows(
DB_NULL_HCHAPTER as usize,
0,
&mut rows_fetched,
&mut row_handles,
);
if result.is_err() {
break;
}
if rows_fetched == 0 {
break;
}
let mut data = Vec::new();
for i in 0..2 {
let mut item_name = [0u16; 512];
let accessor_handle = create_accessor_handle(&accessor, i + 1)?;
rowset
.GetData(
*row_handles[0],
accessor_handle,
item_name.as_mut_ptr() as *mut _,
)
.map_err(|e| {
format!(
"Failed to get data at count {}, index {}: {}",
count,
i,
e.to_string()
)
})?;
let name = String::from_utf16_lossy(&item_name);
// Remove null characters
data.push(name.trim_end_matches('\u{0000}').to_string());
accessor
.ReleaseAccessor(accessor_handle, None)
.map_err(|e| {
format!(
"Failed to release accessor at count {}, index {}: {}",
count,
i,
e.to_string()
)
})?;
}
output.push((data[0].clone(), data[1].clone()));
count += 1;
rowset
.ReleaseRows(
1,
row_handles[0],
std::ptr::null_mut(),
std::ptr::null_mut(),
std::ptr::null_mut(),
)
.map_err(|e| {
format!(
"Failed to release rows at count {}: {}",
count,
e.to_string()
)
})?;
}
Ok(output)
}
}
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let sql = query_sql(query_string, from, size, config);
unsafe { OleInitialize(None).map_err(|e| e.to_string())? };
let result = execute_windows_search_sql(&sql)?;
unsafe { OleUninitialize() };
// .take(size) is not needed as `result` will contain `from+size` files at most
let result_with_paging = result.into_iter().skip(from);
// result_with_paging won't contain more than `size` entries
let mut hits = Vec::with_capacity(size);
const ITEM_URL_PREFIX: &str = "file:";
const ITEM_URL_PREFIX_LEN: usize = ITEM_URL_PREFIX.len();
for (item_url, score_str) in result_with_paging {
// The path returned from Windows Search contains a prefix that we need to trim.
//
// "file:C:/Users/desktop.ini" => "C:/Users/desktop.ini"
let file_path = &item_url[ITEM_URL_PREFIX_LEN..];
let icon = get_file_icon(file_path.to_string()).await;
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.to_string(),
};
let doc = Document {
id: file_path.to_string(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path.into()),
icon: Some(icon.to_string()),
..Default::default()
};
let score: f64 = score_str.parse().expect(
"System.Search.Rank should be in range [0, 1000], which should be valid for [f64]",
);
hits.push((doc, score));
}
Ok(hits)
}
// Skip these tests in our CI; they fail with the following error:
// "SQL is invalid: "0x80041820""
//
// The underlying root cause is unknown.
#[cfg(all(test, not(ci)))]
mod test_windows_search {
use super::*;
/// Helper function for ensuring `sql` is valid SQL by actually executing it.
fn ensure_it_is_valid_sql(sql: &str) {
unsafe { OleInitialize(None).unwrap() };
execute_windows_search_sql(&sql).expect("SQL is invalid");
unsafe { OleUninitialize() };
}
#[test]
fn test_query_sql_empty_config_search_by_name() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%coco%')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_empty_config_search_by_name_and_content() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::NameAndContents,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE ((System.FileName LIKE '%coco%') OR CONTAINS('coco'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_search_paths() {
let config = FileSearchConfig {
search_paths: vec!["C:/Users/".into()],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%coco%') AND (SCOPE = 'file:C:/Users/')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_search_paths() {
let config = FileSearchConfig {
search_paths: vec![
"C:/Users/".into(),
"D:/Projects/".into(),
"E:/Documents/".into(),
],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("test", 0, 5, &config);
assert_eq!(
sql,
"SELECT TOP 5 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%test%') AND (SCOPE = 'file:C:/Users/' OR SCOPE = 'file:D:/Projects/' OR SCOPE = 'file:E:/Documents/')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_exclude_paths() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: vec!["C:/Windows/".into()],
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("file", 0, 20, &config);
assert_eq!(
sql,
"SELECT TOP 20 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%file%') AND ((NOT SCOPE = 'file:C:/Windows/'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_exclude_paths() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: vec!["C:/Windows/".into(), "C:/System/".into(), "C:/Temp/".into()],
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("data", 5, 15, &config);
assert_eq!(
sql,
"SELECT TOP 20 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%data%') AND ((NOT SCOPE = 'file:C:/Windows/') AND (NOT SCOPE = 'file:C:/System/') AND (NOT SCOPE = 'file:C:/Temp/'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_file_types() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: vec!["txt".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("readme", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%readme%') AND (System.FileExtension = '.txt')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_file_types() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: vec!["rs".into(), "toml".into(), "md".into(), "json".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("config", 0, 50, &config);
assert_eq!(
sql,
"SELECT TOP 50 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%config%') AND (System.FileExtension = '.rs' OR System.FileExtension = '.toml' OR System.FileExtension = '.md' OR System.FileExtension = '.json')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_all_fields_combined() {
let config = FileSearchConfig {
search_paths: vec!["C:/Projects/".into(), "D:/Code/".into()],
exclude_paths: vec!["C:/Projects/temp/".into()],
file_types: vec!["rs".into(), "ts".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("main", 10, 25, &config);
assert_eq!(
sql,
"SELECT TOP 35 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%main%') AND (SCOPE = 'file:C:/Projects/' OR SCOPE = 'file:D:/Code/') AND ((NOT SCOPE = 'file:C:/Projects/temp/')) AND (System.FileExtension = '.rs' OR System.FileExtension = '.ts')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_special_characters() {
let config = FileSearchConfig {
search_paths: vec!["C:/Users/John Doe/".into()],
exclude_paths: Vec::new(),
file_types: vec!["c++".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("hello-world", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%hello-world%') AND (SCOPE = 'file:C:/Users/John Doe/') AND (System.FileExtension = '.c++')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_edge_case_large_offset() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("test", 100, 50, &config);
assert_eq!(
sql,
"SELECT TOP 150 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%test%')"
);
ensure_it_is_valid_sql(&sql);
}
}
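// A minimal sketch, for illustration only, of a builder that would produce the
// strings asserted above for the `SearchBy::Name` cases (`query_sql_sketch` is a
// hypothetical name; the real `query_sql` in this crate may differ in its
// signature and details). The expected strings use `TOP {from + size}` since
// Windows Search SQL has no OFFSET clause, so the caller presumably skips the
// first `from` rows itself.
#[allow(dead_code)]
fn query_sql_sketch(
    query: &str,
    from: usize,
    size: usize,
    search_paths: &[&str],
    exclude_paths: &[&str],
    file_types: &[&str],
) -> String {
    let mut sql = format!(
        "SELECT TOP {} System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%{}%')",
        from + size,
        query
    );
    if !search_paths.is_empty() {
        // Restrict the search to the configured scopes.
        let scopes: Vec<String> = search_paths
            .iter()
            .map(|p| format!("SCOPE = 'file:{}'", p))
            .collect();
        sql.push_str(&format!(" AND ({})", scopes.join(" OR ")));
    }
    if !exclude_paths.is_empty() {
        // Every excluded path becomes a negated SCOPE predicate.
        let excludes: Vec<String> = exclude_paths
            .iter()
            .map(|p| format!("(NOT SCOPE = 'file:{}')", p))
            .collect();
        sql.push_str(&format!(" AND ({})", excludes.join(" AND ")));
    }
    if !file_types.is_empty() {
        // File types are matched by extension; any of them is acceptable.
        let extensions: Vec<String> = file_types
            .iter()
            .map(|t| format!("System.FileExtension = '.{}'", t))
            .collect();
        sql.push_str(&format!(" AND ({})", extensions.join(" OR ")));
    }
    sql
}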
#[cfg(test)]
mod test {
use super::*;
#[test]
fn test_query_string_cleanup_no_unsupported_chars() {
let input = "hello world";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
#[test]
fn test_query_string_cleanup_single_quote() {
let input = "don't worry";
let result = query_string_cleanup(input);
assert_eq!(result, "don t worry");
}
#[test]
fn test_query_string_cleanup_newline() {
let input = "line1\nline2";
let result = query_string_cleanup(input);
assert_eq!(result, "line1 line2");
}
#[test]
fn test_query_string_cleanup_both_unsupported_chars() {
let input = "don't\nworry";
let result = query_string_cleanup(input);
assert_eq!(result, "don t worry");
}
#[test]
fn test_query_string_cleanup_multiple_single_quotes() {
let input = "it's a 'test' string";
let result = query_string_cleanup(input);
assert_eq!(result, "it s a test string");
}
#[test]
fn test_query_string_cleanup_multiple_newlines() {
let input = "line1\n\nline2\nline3";
let result = query_string_cleanup(input);
assert_eq!(result, "line1 line2 line3");
}
#[test]
fn test_query_string_cleanup_empty_string() {
let input = "";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
#[test]
fn test_query_string_cleanup_only_unsupported_chars() {
let input = "'\n'";
let result = query_string_cleanup(input);
assert_eq!(result, " ");
}
#[test]
fn test_query_string_cleanup_unicode_characters() {
let input = "héllo wörld's\nfile";
let result = query_string_cleanup(input);
assert_eq!(result, "héllo wörld s file");
}
#[test]
fn test_query_string_cleanup_special_chars_preserved() {
let input = "test@file#name$with%symbols";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
}
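// A minimal sketch, for illustration only, that is consistent with the
// expectations above (`query_string_cleanup_sketch` is a hypothetical name; the
// real `query_string_cleanup` may be implemented differently): replace the
// characters Windows Search cannot handle in a query ('\'' and '\n') with spaces
// and collapse consecutive spaces into one.
#[allow(dead_code)]
fn query_string_cleanup_sketch(input: &str) -> String {
    let mut out = String::with_capacity(input.len());
    let mut prev_was_space = false;
    for c in input.chars() {
        // Map unsupported characters to a space, keep everything else.
        let c = if c == '\'' || c == '\n' { ' ' } else { c };
        if c == ' ' {
            if !prev_was_space {
                out.push(' ');
            }
            prev_was_space = true;
        } else {
            out.push(c);
            prev_was_space = false;
        }
    }
    out
}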

View File

@@ -0,0 +1,97 @@
pub(crate) mod config;
pub(crate) mod implementation;
use super::super::LOCAL_QUERY_SOURCE_TYPE;
use crate::common::{
error::SearchError,
search::{QueryResponse, QuerySource, SearchQuery},
traits::SearchSource,
};
use async_trait::async_trait;
use config::FileSearchConfig;
use hostname;
use tauri::AppHandle;
pub(crate) const EXTENSION_ID: &str = "File Search";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "File Search",
"name": "File Search",
"platforms": ["macos", "windows"],
"description": "Search files on your system",
"icon": "font_Filesearch",
"type": "extension"
}
"#;
pub struct FileSearchExtensionSearchSource;
#[async_trait]
impl SearchSource for FileSearchExtensionSearchSource {
fn get_type(&self) -> QuerySource {
QuerySource {
r#type: LOCAL_QUERY_SOURCE_TYPE.into(),
name: hostname::get()
.unwrap_or(EXTENSION_ID.into())
.to_string_lossy()
.into(),
id: EXTENSION_ID.into(),
}
}
async fn search(
&self,
tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
let from = usize::try_from(query.from).expect("from too big");
let size = usize::try_from(query.size).expect("size too big");
let query_string = query_string.trim();
if query_string.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
// Get configuration from tauri store
let config = FileSearchConfig::get(&tauri_app_handle);
// If the search paths are empty, then the hits should be empty.
//
// Without this, empty search paths would result in an `mdfind` invocation without
// the `-onlyin` option, which would in turn query the whole disk volume.
if config.search_paths.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
// Execute search in a blocking task
let query_source = self.get_type();
let hits = implementation::hits(&query_string, from, size, &config)
.await
.map_err(SearchError::InternalError)?;
let total_hits = hits.len();
Ok(QueryResponse {
source: query_source,
hits,
total_hits,
})
}
}

View File

@@ -0,0 +1,541 @@
//! Built-in extensions and related stuff.
pub mod ai_overview;
pub mod application;
pub mod calculator;
#[cfg(any(target_os = "macos", target_os = "windows"))]
pub mod file_search;
pub mod pizza_engine_runtime;
pub mod quick_ai_access;
use super::Extension;
use crate::SearchSourceRegistry;
use crate::extension::built_in::application::{set_apps_hotkey, unset_apps_hotkey};
use crate::extension::{
ExtensionBundleIdBorrowed, PLUGIN_JSON_FILE_NAME, alter_extension_json_file,
};
use anyhow::Context;
use std::path::{Path, PathBuf};
use tauri::{AppHandle, Manager};
pub(crate) fn get_built_in_extension_directory(tauri_app_handle: &AppHandle) -> PathBuf {
let mut resource_dir = tauri_app_handle.path().app_data_dir().expect(
"User home directory not found, which should be impossible on desktop environments",
);
resource_dir.push("built_in_extensions");
resource_dir
}
/// Helper function to load the built-in extension specified by `extension_id`, used
/// in `list_built_in_extensions()`.
///
/// For built-in extensions, users are only allowed to edit these fields:
///
/// 1. alias (if this extension supports alias)
/// 2. hotkey (if this extension supports hotkey)
/// 3. enabled
///
/// If
///
/// 1. the above fields have invalid values, or
/// 2. other fields are modified,
///
/// we ignore the changes and reset the fields to their default values.
async fn load_built_in_extension(
built_in_extensions_dir: &Path,
extension_id: &str,
default_plugin_json_file: &str,
) -> Result<Extension, String> {
let mut extension_dir = built_in_extensions_dir.join(extension_id);
let mut default_plugin_json = serde_json::from_str::<Extension>(&default_plugin_json_file).unwrap_or_else( |e| {
panic!("the default extension {} file of built-in extension [{}] cannot be parsed as a valid [struct Extension], error [{}]", PLUGIN_JSON_FILE_NAME, extension_id, e);
});
if !extension_dir.try_exists().map_err(|e| e.to_string())? {
tokio::fs::create_dir_all(extension_dir.as_path())
.await
.map_err(|e| e.to_string())?;
}
let plugin_json_file_path = {
extension_dir.push(PLUGIN_JSON_FILE_NAME);
extension_dir
};
// If the JSON file does not exist, create a file with the default template and return.
if !plugin_json_file_path
.try_exists()
.map_err(|e| e.to_string())?
{
tokio::fs::write(plugin_json_file_path, default_plugin_json_file)
.await
.map_err(|e| e.to_string())?;
return Ok(default_plugin_json);
}
let plugin_json_file_content = tokio::fs::read_to_string(plugin_json_file_path.as_path())
.await
.map_err(|e| e.to_string())?;
let res_plugin_json = serde_json::from_str::<Extension>(&plugin_json_file_content);
let Ok(plugin_json) = res_plugin_json else {
log::warn!(
"user invalidated built-in extension [{}] file, overwriting it with the default template",
extension_id
);
// If the JSON file cannot be parsed as `struct Extension`, overwrite it with the default template and return.
tokio::fs::write(plugin_json_file_path, default_plugin_json_file)
.await
.map_err(|e| e.to_string())?;
return Ok(default_plugin_json);
};
// Users are only allowed to edit the below fields
// 1. alias (if this extension supports alias)
// 2. hotkey (if this extension supports hotkey)
// 3. enabled
// so we ignore all other fields.
let alias = if default_plugin_json.supports_alias_hotkey() {
plugin_json.alias.clone()
} else {
None
};
let hotkey = if default_plugin_json.supports_alias_hotkey() {
plugin_json.hotkey.clone()
} else {
None
};
let enabled = plugin_json.enabled;
default_plugin_json.alias = alias;
default_plugin_json.hotkey = hotkey;
default_plugin_json.enabled = enabled;
let final_plugin_json_file_content = serde_json::to_string_pretty(&default_plugin_json)
.expect("failed to serialize `struct Extension`");
tokio::fs::write(plugin_json_file_path, final_plugin_json_file_content)
.await
.map_err(|e| e.to_string())?;
Ok(default_plugin_json)
}
/// Return the built-in extension list.
///
/// Will create extension files when they are not found.
///
/// Users may put extra extension files in the built-in extension directory, but
/// we do not care about them and will ignore them.
///
/// We only read alias/hotkey/enabled from the JSON file, and we have ensured that
/// if alias/hotkey is not supported, it will be `None`. Besides that, no further
/// validation is needed because nothing else can go wrong.
pub(crate) async fn list_built_in_extensions(
tauri_app_handle: &AppHandle,
) -> Result<Vec<Extension>, String> {
let dir = get_built_in_extension_directory(tauri_app_handle);
let mut built_in_extensions = Vec::new();
built_in_extensions.push(
load_built_in_extension(
&dir,
application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME,
application::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
calculator::DATA_SOURCE_ID,
calculator::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
ai_overview::EXTENSION_ID,
ai_overview::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
quick_ai_access::EXTENSION_ID,
quick_ai_access::PLUGIN_JSON_FILE,
)
.await?,
);
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
built_in_extensions.push(
load_built_in_extension(
&dir,
file_search::EXTENSION_ID,
file_search::PLUGIN_JSON_FILE,
)
.await?,
);
}
}
Ok(built_in_extensions)
}
pub(super) async fn init_built_in_extension(
tauri_app_handle: &AppHandle,
extension: &Extension,
search_source_registry: &SearchSourceRegistry,
) -> Result<(), String> {
log::trace!("initializing built-in extensions [{}]", extension.id);
if extension.id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
search_source_registry
.register_source(application::ApplicationSearchSource)
.await;
set_apps_hotkey(&tauri_app_handle)?;
log::debug!("built-in extension [{}] initialized", extension.id);
}
if extension.id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
search_source_registry
.register_source(calculator_search)
.await;
log::debug!("built-in extension [{}] initialized", extension.id);
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if extension.id == file_search::EXTENSION_ID {
let file_system_search = file_search::FileSearchExtensionSearchSource;
search_source_registry
.register_source(file_system_search)
.await;
log::debug!("built-in extension [{}] initialized", extension.id);
}
}
}
Ok(())
}
pub(crate) fn is_extension_built_in(bundle_id: &ExtensionBundleIdBorrowed<'_>) -> bool {
bundle_id.developer.is_none()
}
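// For illustration (hypothetical IDs): `is_extension_built_in` returns true for a
// bundle like `ExtensionBundleIdBorrowed { extension_id: "Calculator", sub_extension_id: None, developer: None }`,
// while any bundle that carries a developer (e.g. locally imported extensions
// under the "__local__" developer) is considered third-party.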
pub(crate) async fn enable_built_in_extension(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
let update_extension = |extension: &mut Extension| -> Result<(), String> {
extension.enabled = true;
Ok(())
};
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_none()
{
search_source_registry_tauri_state
.register_source(application::ApplicationSearchSource)
.await;
set_apps_hotkey(tauri_app_handle)?;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
// Check if this is an application
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_some()
{
let app_path = bundle_id.sub_extension_id.expect("just checked it is Some");
application::enable_app_search(tauri_app_handle, app_path)?;
return Ok(());
}
if bundle_id.extension_id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
search_source_registry_tauri_state
.register_source(calculator_search)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if bundle_id.extension_id == file_search::EXTENSION_ID {
let file_system_search = file_search::FileSearchExtensionSearchSource;
search_source_registry_tauri_state
.register_source(file_system_search)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
}
}
Ok(())
}
pub(crate) async fn disable_built_in_extension(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
let update_extension = |extension: &mut Extension| -> Result<(), String> {
extension.enabled = false;
Ok(())
};
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_none()
{
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
unset_apps_hotkey(tauri_app_handle)?;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
// Check if this is an application
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_some()
{
let app_path = bundle_id.sub_extension_id.expect("just checked it is Some");
application::disable_app_search(tauri_app_handle, app_path)?;
return Ok(());
}
if bundle_id.extension_id == calculator::DATA_SOURCE_ID {
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if bundle_id.extension_id == file_search::EXTENSION_ID {
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
}
}
Ok(())
}
pub(crate) fn set_built_in_extension_alias(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
alias: &str,
) {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::set_app_alias(tauri_app_handle, app_path, alias);
}
}
}
pub(crate) fn register_built_in_extension_hotkey(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
hotkey: &str,
) -> Result<(), String> {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::register_app_hotkey(&tauri_app_handle, app_path, hotkey)?;
}
}
Ok(())
}
pub(crate) fn unregister_built_in_extension_hotkey(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::unregister_app_hotkey(&tauri_app_handle, app_path)?;
}
}
Ok(())
}
fn split_extension_id(extension_id: &str) -> (&str, Option<&str>) {
match extension_id.find('.') {
Some(idx) => (&extension_id[..idx], Some(&extension_id[idx + 1..])),
None => (extension_id, None),
}
}
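// For example (hypothetical IDs): `split_extension_id("Store")` returns
// `("Store", None)`, while `split_extension_id("Applications.Safari")` returns
// `("Applications", Some("Safari"))`; the split happens at the first '.'.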
fn load_extension_from_json_file(
extension_directory: &Path,
extension_id: &str,
) -> Result<Extension, String> {
let (parent_extension_id, _opt_sub_extension_id) = split_extension_id(extension_id);
let json_file_path = {
let mut extension_directory_path = extension_directory.join(parent_extension_id);
extension_directory_path.push(PLUGIN_JSON_FILE_NAME);
extension_directory_path
};
let mut extension = serde_json::from_reader::<_, Extension>(
std::fs::File::open(&json_file_path)
.with_context(|| {
format!(
"the [{}] file for extension [{}] is missing or broken",
PLUGIN_JSON_FILE_NAME, parent_extension_id
)
})
.map_err(|e| e.to_string())?,
)
.map_err(|e| e.to_string())?;
super::canonicalize_relative_icon_path(extension_directory, &mut extension)?;
Ok(extension)
}
pub(crate) async fn is_built_in_extension_enabled(
tauri_app_handle: &AppHandle,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<bool, String> {
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_none()
{
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
// Check if this is an application
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
return Ok(application::is_app_search_enabled(app_path));
}
}
if bundle_id.extension_id == calculator::DATA_SOURCE_ID {
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
return Ok(extension.enabled);
}
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
return Ok(extension.enabled);
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if bundle_id.extension_id == file_search::EXTENSION_ID
&& bundle_id.sub_extension_id.is_none()
{
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
}
}
unreachable!("extension [{:?}] is not a built-in extension", bundle_id)
}

View File

@@ -0,0 +1,76 @@
//! We use Pizza Engine to index applications and local files. The engine will be
//! run in the thread/runtime defined in this file.
//!
//! # Why such a thread/runtime is needed
//!
//! Generally, the Tokio async runtime requires the async tasks spawned on it to be
//! `Send` (and `'static`), but the async tasks created by Pizza Engine are not,
//! which forces us to create a dedicated thread/runtime to execute them.
use std::any::Any;
use std::collections::HashMap;
use std::collections::hash_map::Entry;
use std::sync::OnceLock;
pub(crate) trait SearchSourceState {
#[cfg_attr(not(feature = "use_pizza_engine"), allow(unused))]
fn as_mut_any(&mut self) -> &mut dyn Any;
}
#[async_trait::async_trait(?Send)]
pub(crate) trait Task: Send + Sync {
fn search_source_id(&self) -> &'static str;
async fn exec(&mut self, state: &mut Option<Box<dyn SearchSourceState>>);
}
pub(crate) static RUNTIME_TX: OnceLock<tokio::sync::mpsc::UnboundedSender<Box<dyn Task>>> =
OnceLock::new();
/// This function blocks until the runtime thread is ready to accept tasks.
pub(crate) async fn start_pizza_engine_runtime() {
const THREAD_NAME: &str = "Pizza engine runtime thread";
log::trace!("starting Pizza engine runtime");
let (engine_start_signal_tx, engine_start_signal_rx) = tokio::sync::oneshot::channel();
std::thread::Builder::new()
.name(THREAD_NAME.into())
.spawn(move || {
let rt = tokio::runtime::Runtime::new().unwrap();
let main = async {
let mut states: HashMap<String, Option<Box<dyn SearchSourceState>>> =
HashMap::new();
let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel();
RUNTIME_TX.set(tx).unwrap();
engine_start_signal_tx
.send(())
.expect("engine_start_signal_rx dropped");
while let Some(mut task) = rx.recv().await {
let opt_search_source_state = match states.entry(task.search_source_id().into())
{
Entry::Occupied(o) => o.into_mut(),
Entry::Vacant(v) => v.insert(None),
};
task.exec(opt_search_source_state).await;
}
};
rt.block_on(main);
})
.unwrap_or_else(|e| {
panic!(
"failed to start thread [{}] due to error [{}]",
THREAD_NAME, e
);
});
engine_start_signal_rx
.await
.expect("engine_start_signal_tx dropped, the runtime thread could be dead");
log::trace!("Pizza engine runtime started");
}
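// A minimal sketch, for illustration only, of how work is handed to this runtime:
// implement `Task`, then push a boxed instance through `RUNTIME_TX` once
// `start_pizza_engine_runtime()` has completed. `DemoTask` and `submit_demo_task`
// are hypothetical names, not part of the actual API.
#[allow(dead_code)]
struct DemoTask;
#[async_trait::async_trait(?Send)]
impl Task for DemoTask {
    fn search_source_id(&self) -> &'static str {
        // The runtime keeps one state slot per search source ID.
        "demo-task"
    }
    async fn exec(&mut self, _state: &mut Option<Box<dyn SearchSourceState>>) {
        log::debug!("running on the Pizza engine runtime thread");
    }
}
#[allow(dead_code)]
fn submit_demo_task() {
    RUNTIME_TX
        .get()
        .expect("start_pizza_engine_runtime() has not been awaited yet")
        .send(Box::new(DemoTask))
        .expect("the Pizza engine runtime thread is gone");
}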

View File

@@ -0,0 +1,12 @@
pub(super) const EXTENSION_ID: &str = "QuickAIAccess";
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "QuickAIAccess",
"name": "Quick AI Access",
"description": "...",
"icon": "font_a-QuickAIAccess",
"type": "ai_extension",
"enabled": true
}
"#;

File diff suppressed because it is too large

View File

@@ -0,0 +1,694 @@
//! Coco has 4 sources of `plugin.json` to check and validate:
//!
//! 1. From the coco-extensions repository
//!
//! Granted, the Coco app won't check these files directly, but the code here
//! will run in that repository's CI to prevent errors in the first place.
//!
//! 2. From the "<data directory>/third_party_extensions" directory
//! 3. Imported via "Import Local Extension"
//! 4. Downloaded from the "store/extension/<extension ID>/_download" API
//!
//! This file contains the checks that are general enough to be applied to all
//! these 4 sources.
use crate::extension::Extension;
use crate::extension::ExtensionType;
use crate::util::platform::Platform;
use std::collections::HashSet;
pub(crate) fn general_check(extension: &Extension) -> Result<(), String> {
// Check main extension
check_main_extension_only(extension)?;
check_main_extension_or_sub_extension(extension, &format!("extension [{}]", extension.id))?;
// `None` if `extension` is compatible with all the platforms. Otherwise `Some(limited_platforms)`
let limited_supported_platforms = match extension.platforms.as_ref() {
Some(platforms) => {
if platforms.len() == Platform::num_of_supported_platforms() {
None
} else {
Some(platforms)
}
}
None => None,
};
// Check sub extensions
let commands = match extension.commands {
Some(ref v) => v.as_slice(),
None => &[],
};
let scripts = match extension.scripts {
Some(ref v) => v.as_slice(),
None => &[],
};
let quicklinks = match extension.quicklinks {
Some(ref v) => v.as_slice(),
None => &[],
};
let sub_extensions = [commands, scripts, quicklinks].concat();
let mut sub_extension_ids = HashSet::new();
for sub_extension in sub_extensions.iter() {
check_sub_extension_only(&extension.id, sub_extension, limited_supported_platforms)?;
check_main_extension_or_sub_extension(
extension,
&format!("sub-extension [{}-{}]", extension.id, sub_extension.id),
)?;
if !sub_extension_ids.insert(sub_extension.id.as_str()) {
// extension ID already exists
return Err(format!(
"sub-extension with ID [{}] already exists",
sub_extension.id
));
}
}
Ok(())
}
/// This checks the main extension only, it won't check sub-extensions.
fn check_main_extension_only(extension: &Extension) -> Result<(), String> {
// Group and Extension cannot have alias
if extension.alias.is_some() {
if extension.r#type == ExtensionType::Group || extension.r#type == ExtensionType::Extension
{
return Err(format!(
"invalid extension [{}], extension of type [{:?}] cannot have alias",
extension.id, extension.r#type
));
}
}
// Group and Extension cannot have hotkey
if extension.hotkey.is_some() {
if extension.r#type == ExtensionType::Group || extension.r#type == ExtensionType::Extension
{
return Err(format!(
"invalid extension [{}], extension of type [{:?}] cannot have hotkey",
extension.id, extension.r#type
));
}
}
if extension.commands.is_some() || extension.scripts.is_some() || extension.quicklinks.is_some()
{
if extension.r#type != ExtensionType::Group && extension.r#type != ExtensionType::Extension
{
return Err(format!(
"invalid extension [{}], only extension of type [Group] and [Extension] can have sub-extensions",
extension.id,
));
}
}
Ok(())
}
fn check_sub_extension_only(
extension_id: &str,
sub_extension: &Extension,
limited_platforms: Option<&HashSet<Platform>>,
) -> Result<(), String> {
if sub_extension.r#type == ExtensionType::Group
|| sub_extension.r#type == ExtensionType::Extension
{
return Err(format!(
"invalid sub-extension [{}-{}]: sub-extensions should not be of type [Group] or [Extension]",
extension_id, sub_extension.id
));
}
if sub_extension.commands.is_some()
|| sub_extension.scripts.is_some()
|| sub_extension.quicklinks.is_some()
{
return Err(format!(
"invalid sub-extension [{}-{}]: fields [commands/scripts/quicklinks] should not be set in sub-extensions",
extension_id, sub_extension.id
));
}
if sub_extension.developer.is_some() {
return Err(format!(
"invalid sub-extension [{}-{}]: field [developer] should not be set in sub-extensions",
extension_id, sub_extension.id
));
}
if let Some(platforms_supported_by_main_extension) = limited_platforms {
match sub_extension.platforms {
Some(ref platforms_supported_by_sub_extension) => {
let diff = platforms_supported_by_sub_extension
.difference(&platforms_supported_by_main_extension)
.into_iter()
.map(|p| p.to_string())
.collect::<Vec<String>>();
if !diff.is_empty() {
return Err(format!(
"invalid sub-extension [{}-{}]: it supports platforms {:?} that are not supported by the main extension",
extension_id, sub_extension.id, diff
));
}
}
None => {
// if `sub_extension.platforms` is None, it means it has the same value
// as the main extension's `platforms` field, so we don't need to check it.
}
}
}
Ok(())
}
fn check_main_extension_or_sub_extension(
extension: &Extension,
identifier: &str,
) -> Result<(), String> {
// If field `action` is Some, then it should be a Command
if extension.action.is_some() && extension.r#type != ExtensionType::Command {
return Err(format!(
"invalid {}, field [action] is set for a non-Command extension",
identifier
));
}
if extension.r#type == ExtensionType::Command && extension.action.is_none() {
return Err(format!(
"invalid {}, field [action] should be set for a Command extension",
identifier
));
}
// If field `quicklink` is Some, then it should be a Quicklink
if extension.quicklink.is_some() && extension.r#type != ExtensionType::Quicklink {
return Err(format!(
"invalid {}, field [quicklink] is set for a non-Quicklink extension",
identifier
));
}
if extension.r#type == ExtensionType::Quicklink && extension.quicklink.is_none() {
return Err(format!(
"invalid {}, field [quicklink] should be set for a Quicklink extension",
identifier
));
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
use crate::extension::{CommandAction, Quicklink, QuicklinkLink, QuicklinkLinkComponent};
/// Helper function to create a basic valid extension
fn create_basic_extension(id: &str, extension_type: ExtensionType) -> Extension {
Extension {
id: id.to_string(),
name: "Test Extension".to_string(),
developer: None,
platforms: None,
description: "Test description".to_string(),
icon: "test-icon.png".to_string(),
r#type: extension_type,
action: None,
quicklink: None,
commands: None,
scripts: None,
quicklinks: None,
alias: None,
hotkey: None,
enabled: true,
settings: None,
screenshots: None,
url: None,
version: None,
}
}
/// Helper function to create a command action
fn create_command_action() -> CommandAction {
CommandAction {
exec: "echo".to_string(),
args: Some(vec!["test".to_string()]),
}
}
/// Helper function to create a quicklink
fn create_quicklink() -> Quicklink {
Quicklink {
link: QuicklinkLink {
components: vec![QuicklinkLinkComponent::StaticStr(
"https://example.com".to_string(),
)],
},
open_with: None,
}
}
/* test_check_main_extension_only */
#[test]
fn test_group_cannot_have_alias() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.alias = Some("group-alias".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have alias"));
}
#[test]
fn test_extension_cannot_have_alias() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
extension.alias = Some("ext-alias".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have alias"));
}
#[test]
fn test_group_cannot_have_hotkey() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.hotkey = Some("cmd+g".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have hotkey"));
}
#[test]
fn test_extension_cannot_have_hotkey() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
extension.hotkey = Some("cmd+e".to_string());
let result = general_check(&extension);
assert!(result.is_err());
assert!(result.unwrap_err().contains("cannot have hotkey"));
}
#[test]
fn test_non_container_types_cannot_have_sub_extensions() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.commands = Some(vec![create_basic_extension(
"sub-cmd",
ExtensionType::Command,
)]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("only extension of type [Group] and [Extension] can have sub-extensions")
);
}
/* test_check_main_extension_only */
/* test check_main_extension_or_sub_extension */
#[test]
fn test_command_must_have_action() {
let extension = create_basic_extension("test-cmd", ExtensionType::Command);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [action] should be set for a Command extension")
);
}
#[test]
fn test_non_command_cannot_have_action() {
let mut extension = create_basic_extension("test-script", ExtensionType::Script);
extension.action = Some(create_command_action());
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [action] is set for a non-Command extension")
);
}
#[test]
fn test_quicklink_must_have_quicklink_field() {
let extension = create_basic_extension("test-quicklink", ExtensionType::Quicklink);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [quicklink] should be set for a Quicklink extension")
);
}
#[test]
fn test_non_quicklink_cannot_have_quicklink_field() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.quicklink = Some(create_quicklink());
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [quicklink] is set for a non-Quicklink extension")
);
}
/* test check_main_extension_or_sub_extension */
/* Test check_sub_extension_only */
#[test]
fn test_sub_extension_cannot_be_group() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let sub_group = create_basic_extension("sub-group", ExtensionType::Group);
extension.commands = Some(vec![sub_group]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extensions should not be of type [Group] or [Extension]")
);
}
#[test]
fn test_sub_extension_cannot_be_extension() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
let sub_ext = create_basic_extension("sub-ext", ExtensionType::Extension);
extension.scripts = Some(vec![sub_ext]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extensions should not be of type [Group] or [Extension]")
);
}
#[test]
fn test_sub_extension_cannot_have_developer() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.developer = Some("test-dev".to_string());
extension.commands = Some(vec![sub_cmd]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("field [developer] should not be set in sub-extensions")
);
}
#[test]
fn test_sub_extension_cannot_have_sub_extensions() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.commands = Some(vec![create_basic_extension(
"nested-cmd",
ExtensionType::Command,
)]);
extension.commands = Some(vec![sub_cmd]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result.unwrap_err().contains(
"fields [commands/scripts/quicklinks] should not be set in sub-extensions"
)
);
}
/* Test check_sub_extension_only */
#[test]
fn test_duplicate_sub_extension_ids() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut cmd1 = create_basic_extension("duplicate-id", ExtensionType::Command);
cmd1.action = Some(create_command_action());
let mut cmd2 = create_basic_extension("duplicate-id", ExtensionType::Command);
cmd2.action = Some(create_command_action());
extension.commands = Some(vec![cmd1, cmd2]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extension with ID [duplicate-id] already exists")
);
}
#[test]
fn test_duplicate_ids_across_different_sub_extension_types() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
let mut cmd = create_basic_extension("same-id", ExtensionType::Command);
cmd.action = Some(create_command_action());
let script = create_basic_extension("same-id", ExtensionType::Script);
extension.commands = Some(vec![cmd]);
extension.scripts = Some(vec![script]);
let result = general_check(&extension);
assert!(result.is_err());
assert!(
result
.unwrap_err()
.contains("sub-extension with ID [same-id] already exists")
);
}
#[test]
fn test_valid_group_extension() {
let mut extension = create_basic_extension("test-group", ExtensionType::Group);
extension.commands = Some(vec![create_basic_extension("cmd1", ExtensionType::Command)]);
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_extension_type() {
let mut extension = create_basic_extension("test-ext", ExtensionType::Extension);
extension.scripts = Some(vec![create_basic_extension(
"script1",
ExtensionType::Script,
)]);
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_command_extension() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_quicklink_extension() {
let mut extension = create_basic_extension("test-quicklink", ExtensionType::Quicklink);
extension.quicklink = Some(create_quicklink());
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_complex_extension() {
let mut extension = create_basic_extension("spotify-controls", ExtensionType::Extension);
// Add valid commands
let mut play_pause = create_basic_extension("play-pause", ExtensionType::Command);
play_pause.action = Some(create_command_action());
let mut next_track = create_basic_extension("next-track", ExtensionType::Command);
next_track.action = Some(create_command_action());
let mut prev_track = create_basic_extension("prev-track", ExtensionType::Command);
prev_track.action = Some(create_command_action());
extension.commands = Some(vec![play_pause, next_track, prev_track]);
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_valid_single_layer_command() {
let mut extension = create_basic_extension("empty-trash", ExtensionType::Command);
extension.action = Some(create_command_action());
assert!(general_check(&extension).is_ok());
}
#[test]
fn test_command_alias_and_hotkey_allowed() {
let mut extension = create_basic_extension("test-cmd", ExtensionType::Command);
extension.action = Some(create_command_action());
extension.alias = Some("cmd-alias".to_string());
extension.hotkey = Some("cmd+t".to_string());
assert!(general_check(&extension).is_ok());
}
/*
* Tests for the check that a sub extension cannot support platforms that are not
* supported by the main extension
*
* Start here
*/
#[test]
fn test_platform_validation_both_none() {
// Case 1: main extension's platforms = None, sub extension's platforms = None
// Should return Ok(())
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = None;
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = None;
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_all_sub_none() {
// Case 2: main extension's platforms = Some(all platforms), sub extension's platforms = None
// Should return Ok(())
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(Platform::all());
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = None;
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_none_sub_some() {
// Case 3: main extension's platforms = None, sub extension's platforms = Some([Platform::Macos])
// Should return Ok(()) because None means supports all platforms
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = None;
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Macos]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_all_sub_subset() {
// Case 4: main extension's platforms = Some(all platforms), sub extension's platforms = Some([Platform::Macos])
// Should return Ok(()) because sub extension supports a subset of main extension's platforms
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(Platform::all());
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Macos]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
#[test]
fn test_platform_validation_main_limited_sub_unsupported() {
// Case 5: main extension's platforms = Some([Platform::Macos]), sub extension's platforms = Some([Platform::Linux])
// Should return Err because sub extension supports a platform not supported by main extension
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(HashSet::from([Platform::Macos]));
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Linux]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_err());
let error_msg = result.unwrap_err();
assert!(error_msg.contains("it supports platforms"));
assert!(error_msg.contains("that are not supported by the main extension"));
assert!(error_msg.contains("Linux")); // Should mention the unsupported platform
}
#[test]
fn test_platform_validation_main_partial_sub_unsupported() {
// Case 6: main extension's platforms = Some([Platform::Macos, Platform::Windows]), sub extension's platforms = Some([Platform::Linux])
// Should return Err because sub extension supports a platform not supported by main extension
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(HashSet::from([Platform::Macos, Platform::Windows]));
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = Some(HashSet::from([Platform::Linux]));
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_err());
let error_msg = result.unwrap_err();
assert!(error_msg.contains("it supports platforms"));
assert!(error_msg.contains("that are not supported by the main extension"));
assert!(error_msg.contains("Linux")); // Should mention the unsupported platform
}
#[test]
fn test_platform_validation_main_limited_sub_none() {
// Case 7: main extension's platforms = Some([Platform::Macos]), sub extension's platforms = None
// Should return Ok(()) because when sub extension's platforms is None, it inherits main extension's platforms
let mut main_extension = create_basic_extension("main-ext", ExtensionType::Group);
main_extension.platforms = Some(HashSet::from([Platform::Macos]));
let mut sub_cmd = create_basic_extension("sub-cmd", ExtensionType::Command);
sub_cmd.action = Some(create_command_action());
sub_cmd.platforms = None;
main_extension.commands = Some(vec![sub_cmd]);
let result = general_check(&main_extension);
assert!(result.is_ok());
}
/*
* Tests for the check that a sub extension cannot support platforms that are not
* supported by the main extension
*
* End here
*/
}

View File

@@ -0,0 +1,249 @@
use crate::extension::PLUGIN_JSON_FILE_NAME;
use crate::extension::third_party::check::general_check;
use crate::extension::third_party::install::{
filter_out_incompatible_sub_extensions, is_extension_installed,
};
use crate::extension::third_party::{
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE, get_third_party_extension_directory,
};
use crate::extension::{Extension, canonicalize_relative_icon_path};
use crate::util::platform::Platform;
use serde_json::Value as Json;
use std::path::Path;
use std::path::PathBuf;
use tauri::AppHandle;
use tokio::fs;
/// All the extensions installed from local files belong to a special developer
/// "__local__".
const DEVELOPER_ID_LOCAL: &str = "__local__";
/// Install the extension specified by `path`.
///
/// `path` should point to a directory with the following structure:
///
/// ```text
/// extension-directory/
/// ├── assets/
/// │ ├── icon.png
/// │ └── other-assets...
/// └── plugin.json
/// ```
#[tauri::command]
pub(crate) async fn install_local_extension(
tauri_app_handle: AppHandle,
path: PathBuf,
) -> Result<(), String> {
let extension_dir_name = path
.file_name()
.ok_or_else(|| "Invalid extension: no directory name".to_string())?
.to_str()
.ok_or_else(|| "Invalid extension: non-UTF8 extension id".to_string())?;
// We use the extension directory name as the extension ID.
let extension_id = extension_dir_name;
if is_extension_installed(DEVELOPER_ID_LOCAL, extension_id).await {
// The frontend code uses this string to distinguish between 3 error cases:
//
// 1. This extension is already imported
// 2. This extension is incompatible with the current platform
// 3. The selected directory does not contain a valid extension
//
// do NOT edit this without updating the frontend code.
//
// ```ts
// if (errorMessage === "already imported") {
// addError(t("settings.extensions.hints.extensionAlreadyImported"));
// } else if (errorMessage === "incompatible") {
// addError(t("settings.extensions.hints.incompatibleExtension"));
// } else {
// addError(t("settings.extensions.hints.importFailed"));
// }
// ```
//
// This is definitely error-prone, but we have to do this until we have a
// structured error type.
return Err("already imported".into());
}
let plugin_json_path = path.join(PLUGIN_JSON_FILE_NAME);
let plugin_json_content = fs::read_to_string(&plugin_json_path)
.await
.map_err(|e| e.to_string())?;
// Parse it as generic JSON first, since it is not yet valid for `struct Extension`; we need
// to correct it (set the `id` and `developer` fields) before converting it to `struct Extension`:
let mut extension_json: Json =
serde_json::from_str(&plugin_json_content).map_err(|e| e.to_string())?;
// Set the main extension ID to the directory name
let extension_obj = extension_json
.as_object_mut()
.expect("extension_json should be an object");
extension_obj.insert("id".to_string(), Json::String(extension_id.to_string()));
extension_obj.insert(
"developer".to_string(),
Json::String(DEVELOPER_ID_LOCAL.to_string()),
);
// Counter for sub-extension IDs
let mut counter = 1u32;
// Set IDs for commands
if let Some(commands) = extension_obj.get_mut("commands") {
if let Some(commands_array) = commands.as_array_mut() {
for command in commands_array {
if let Some(command_obj) = command.as_object_mut() {
command_obj.insert("id".to_string(), Json::String(counter.to_string()));
counter += 1;
}
}
}
}
// Set IDs for quicklinks
if let Some(quicklinks) = extension_obj.get_mut("quicklinks") {
if let Some(quicklinks_array) = quicklinks.as_array_mut() {
for quicklink in quicklinks_array {
if let Some(quicklink_obj) = quicklink.as_object_mut() {
quicklink_obj.insert("id".to_string(), Json::String(counter.to_string()));
counter += 1;
}
}
}
}
// Set IDs for scripts
if let Some(scripts) = extension_obj.get_mut("scripts") {
if let Some(scripts_array) = scripts.as_array_mut() {
for script in scripts_array {
if let Some(script_obj) = script.as_object_mut() {
script_obj.insert("id".to_string(), Json::String(counter.to_string()));
counter += 1;
}
}
}
}
// Now we can convert JSON to `struct Extension`
let mut extension: Extension =
serde_json::from_value(extension_json).map_err(|e| e.to_string())?;
let current_platform = Platform::current();
/* Check begins here */
general_check(&extension)?;
if let Some(ref platforms) = extension.platforms {
if !platforms.contains(&current_platform) {
// The frontend code uses this string to distinguish between 3 error cases:
//
// 1. This extension is already imported
// 2. This extension is incompatible with the current platform
// 3. The selected directory does not contain a valid extension
//
// do NOT edit this without updating the frontend code.
//
// ```ts
// if (errorMessage === "already imported") {
// addError(t("settings.extensions.hints.extensionAlreadyImported"));
// } else if (errorMessage === "incompatible") {
// addError(t("settings.extensions.hints.incompatibleExtension"));
// } else {
// addError(t("settings.extensions.hints.importFailed"));
// }
// ```
//
// This is definitely error-prone, but we have to do this until we have a
// structured error type.
return Err("incompatible".into());
}
}
/* Check ends here */
// The extension is compatible with the current platform, but it could contain
// sub-extensions that are not; filter them out.
filter_out_incompatible_sub_extensions(&mut extension, current_platform);
// Create destination directory
let dest_dir = get_third_party_extension_directory(&tauri_app_handle)
.join(DEVELOPER_ID_LOCAL)
.join(extension_dir_name);
fs::create_dir_all(&dest_dir)
.await
.map_err(|e| e.to_string())?;
// Copy all files except plugin.json
let mut entries = fs::read_dir(&path).await.map_err(|e| e.to_string())?;
while let Some(entry) = entries.next_entry().await.map_err(|e| e.to_string())? {
let file_name = entry.file_name();
let file_name_str = file_name
.to_str()
.ok_or_else(|| "Invalid filename: non-UTF8".to_string())?;
// plugin.json will be handled separately.
if file_name_str == PLUGIN_JSON_FILE_NAME {
continue;
}
let src_path = entry.path();
let dest_path = dest_dir.join(&file_name);
if src_path.is_dir() {
// Recursively copy directory
copy_dir_recursively(&src_path, &dest_path).await?;
} else {
// Copy file
fs::copy(&src_path, &dest_path)
.await
.map_err(|e| e.to_string())?;
}
}
// Write the corrected plugin.json file
let corrected_plugin_json =
serde_json::to_string_pretty(&extension).map_err(|e| e.to_string())?;
let dest_plugin_json_path = dest_dir.join(PLUGIN_JSON_FILE_NAME);
fs::write(&dest_plugin_json_path, corrected_plugin_json)
.await
.map_err(|e| e.to_string())?;
// Canonicalize relative icon paths
canonicalize_relative_icon_path(&dest_dir, &mut extension)?;
// Add extension to the search source
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.add_extension(extension)
.await;
Ok(())
}
/// Helper function to recursively copy directories.
#[async_recursion::async_recursion]
async fn copy_dir_recursively(src: &Path, dest: &Path) -> Result<(), String> {
tokio::fs::create_dir_all(dest)
.await
.map_err(|e| e.to_string())?;
let mut read_dir = tokio::fs::read_dir(src).await.map_err(|e| e.to_string())?;
while let Some(entry) = read_dir.next_entry().await.map_err(|e| e.to_string())? {
let src_path = entry.path();
let dest_path = dest.join(entry.file_name());
if src_path.is_dir() {
copy_dir_recursively(&src_path, &dest_path).await?;
} else {
tokio::fs::copy(&src_path, &dest_path)
.await
.map_err(|e| e.to_string())?;
}
}
Ok(())
}

View File

@@ -0,0 +1,224 @@
//! This module contains the code for extension installation.
//!
//! # How
//!
//! Technically, installing an extension involves the following steps:
//!
//! 1. Correct the `plugin.json` JSON if it does not conform to our `struct Extension`
//! definition.
//!
//! 2. Write the extension files to the corresponding location
//!
//! * developer directory
//! * extension directory
//! * assets directory
//! * various assets files, e.g., "icon.png"
//! * plugin.json file
//!
//! 3. Canonicalize the `Extension.icon` fields if they are relative paths
//! (relative to the `assets` directory)
//!
//! 4. Deserialize the `plugin.json` file to a `struct Extension`, and call
//! `THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.add_extension(extension)` to add it to
//! the in-memory extension list.
pub(crate) mod local_extension;
pub(crate) mod store;
use crate::extension::Extension;
use crate::util::platform::Platform;
use super::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE;
pub(crate) async fn is_extension_installed(developer: &str, extension_id: &str) -> bool {
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.extension_exists(developer, extension_id)
.await
}
/// Filters out sub-extensions that are not compatible with the current platform.
///
/// We make `current_platform` an argument so that this function is testable.
pub(crate) fn filter_out_incompatible_sub_extensions(
extension: &mut Extension,
current_platform: Platform,
) {
// Only process extensions of type Group or Extension that can have sub-extensions
if !extension.r#type.contains_sub_items() {
return;
}
// Filter commands
if let Some(ref mut commands) = extension.commands {
commands.retain(|sub_ext| {
// If platforms is None, the sub-extension is compatible with all platforms
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
true
}
});
}
// Filter scripts
if let Some(ref mut scripts) = extension.scripts {
scripts.retain(|sub_ext| {
// If platforms is None, the sub-extension is compatible with all platforms
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
true
}
});
}
// Filter quicklinks
if let Some(ref mut quicklinks) = extension.quicklinks {
quicklinks.retain(|sub_ext| {
// If platforms is None, the sub-extension is compatible with all platforms
if let Some(ref platforms) = sub_ext.platforms {
platforms.contains(&current_platform)
} else {
true
}
});
}
}
#[cfg(test)]
mod tests {
use super::*;
use crate::extension::ExtensionType;
use std::collections::HashSet;
/// Helper function to create a basic extension for testing
/// `filter_out_incompatible_sub_extensions`
fn create_test_extension(
extension_type: ExtensionType,
platforms: Option<HashSet<Platform>>,
) -> Extension {
Extension {
id: "ID".into(),
name: "name".into(),
developer: None,
platforms,
description: "Test extension".to_string(),
icon: "test-icon".to_string(),
r#type: extension_type,
action: None,
quicklink: None,
commands: None,
scripts: None,
quicklinks: None,
alias: None,
hotkey: None,
enabled: true,
settings: None,
screenshots: None,
url: None,
version: None,
}
}
#[test]
fn test_filter_out_incompatible_sub_extensions_filter_non_group_extension_unchanged() {
// Command
let mut extension = create_test_extension(ExtensionType::Command, None);
let clone = extension.clone();
filter_out_incompatible_sub_extensions(&mut extension, Platform::Linux);
assert_eq!(extension, clone);
// Quicklink
let mut extension = create_test_extension(ExtensionType::Quicklink, None);
let clone = extension.clone();
filter_out_incompatible_sub_extensions(&mut extension, Platform::Linux);
assert_eq!(extension, clone);
}
#[test]
fn test_filter_out_incompatible_sub_extensions() {
let mut main_extension = create_test_extension(ExtensionType::Group, None);
// init sub extensions, which are macOS-only
let commands = vec![create_test_extension(
ExtensionType::Command,
Some(HashSet::from([Platform::Macos])),
)];
let quicklinks = vec![create_test_extension(
ExtensionType::Quicklink,
Some(HashSet::from([Platform::Macos])),
)];
let scripts = vec![create_test_extension(
ExtensionType::Script,
Some(HashSet::from([Platform::Macos])),
)];
// Set sub extensions
main_extension.commands = Some(commands);
main_extension.quicklinks = Some(quicklinks);
main_extension.scripts = Some(scripts);
// Current platform is Linux, all the sub extensions should be filtered out.
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
// assertions
assert!(main_extension.commands.unwrap().is_empty());
assert!(main_extension.quicklinks.unwrap().is_empty());
assert!(main_extension.scripts.unwrap().is_empty());
}
/// Sub extensions are compatible with all the platforms, nothing to filter out.
#[test]
fn test_filter_out_incompatible_sub_extensions_all_compatible() {
{
let mut main_extension = create_test_extension(ExtensionType::Group, None);
// init sub extensions, which are compatible with all the platforms
let commands = vec![create_test_extension(
ExtensionType::Command,
Some(Platform::all()),
)];
let quicklinks = vec![create_test_extension(
ExtensionType::Quicklink,
Some(Platform::all()),
)];
let scripts = vec![create_test_extension(
ExtensionType::Script,
Some(Platform::all()),
)];
// Set sub extensions
main_extension.commands = Some(commands);
main_extension.quicklinks = Some(quicklinks);
main_extension.scripts = Some(scripts);
// The current platform is Linux, but the sub extensions support all platforms, so nothing should be filtered out.
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
// assertions
assert_eq!(main_extension.commands.unwrap().len(), 1);
assert_eq!(main_extension.quicklinks.unwrap().len(), 1);
assert_eq!(main_extension.scripts.unwrap().len(), 1);
}
// `platforms: None` means all platforms as well
{
let mut main_extension = create_test_extension(ExtensionType::Group, None);
// init sub extensions, which are compatible with all the platforms
let commands = vec![create_test_extension(ExtensionType::Command, None)];
let quicklinks = vec![create_test_extension(ExtensionType::Quicklink, None)];
let scripts = vec![create_test_extension(ExtensionType::Script, None)];
// Set sub extensions
main_extension.commands = Some(commands);
main_extension.quicklinks = Some(quicklinks);
main_extension.scripts = Some(scripts);
// The current platform is Linux, but `platforms: None` means all platforms, so nothing should be filtered out.
filter_out_incompatible_sub_extensions(&mut main_extension, Platform::Linux);
// assertions
assert_eq!(main_extension.commands.unwrap().len(), 1);
assert_eq!(main_extension.quicklinks.unwrap().len(), 1);
assert_eq!(main_extension.scripts.unwrap().len(), 1);
}
}
}

View File

@@ -0,0 +1,341 @@
//! Extension store related stuff.
use super::super::LOCAL_QUERY_SOURCE_TYPE;
use super::is_extension_installed;
use crate::common::document::DataSourceReference;
use crate::common::document::Document;
use crate::common::error::SearchError;
use crate::common::search::QueryResponse;
use crate::common::search::QuerySource;
use crate::common::search::SearchQuery;
use crate::common::traits::SearchSource;
use crate::extension::Extension;
use crate::extension::PLUGIN_JSON_FILE_NAME;
use crate::extension::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE;
use crate::extension::canonicalize_relative_icon_path;
use crate::extension::third_party::check::general_check;
use crate::extension::third_party::get_third_party_extension_directory;
use crate::extension::third_party::install::filter_out_incompatible_sub_extensions;
use crate::server::http_client::HttpClient;
use crate::util::platform::Platform;
use async_trait::async_trait;
use reqwest::StatusCode;
use serde_json::Map as JsonObject;
use serde_json::Value as Json;
use std::io::Read;
use tauri::AppHandle;
const DATA_SOURCE_ID: &str = "Extension Store";
pub(crate) struct ExtensionStore;
#[async_trait]
impl SearchSource for ExtensionStore {
fn get_type(&self) -> QuerySource {
QuerySource {
r#type: LOCAL_QUERY_SOURCE_TYPE.into(),
name: hostname::get()
.unwrap_or(DATA_SOURCE_ID.into())
.to_string_lossy()
.into(),
id: DATA_SOURCE_ID.into(),
}
}
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
const SCORE: f64 = 2000.0;
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
let lowercase_query_string = query_string.to_lowercase();
let expected_str = "extension store";
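// Substring match: partial queries such as "ext", "store", or "extension st"
// are enough to surface the Extension Store hit.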
if expected_str.contains(&lowercase_query_string) {
let doc = Document {
id: DATA_SOURCE_ID.to_string(),
category: Some(DATA_SOURCE_ID.to_string()),
title: Some(DATA_SOURCE_ID.to_string()),
icon: Some("font_Store".to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(DATA_SOURCE_ID.into()),
id: Some(DATA_SOURCE_ID.into()),
icon: Some("font_Store".to_string()),
}),
..Default::default()
};
Ok(QueryResponse {
source: self.get_type(),
hits: vec![(doc, SCORE)],
total_hits: 1,
})
} else {
Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
})
}
}
}
#[tauri::command]
pub(crate) async fn search_extension(
query_params: Option<Vec<String>>,
) -> Result<Vec<Json>, String> {
let response = HttpClient::get(
"default_coco_server",
"store/extension/_search",
query_params,
)
.await
.map_err(|e| format!("Failed to send request: {:?}", e))?;
// The response of an ES-style search request
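// Expected shape (illustrative, only the fields this function touches):
//
//   { "hits": { "hits": [ { "_source": { "id": "...", "developer": { "id": "..." }, ... } }, ... ] } }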
let mut response: JsonObject<String, Json> = response
.json()
.await
.map_err(|e| format!("Failed to parse response: {:?}", e))?;
let hits_json = response
.remove("hits")
.expect("the JSON response should contain field [hits]");
let mut hits = match hits_json {
Json::Object(obj) => obj,
_ => panic!(
"field [hits] should be a JSON object, but it is not, value: [{}]",
hits_json
),
};
let Some(hits_hits_json) = hits.remove("hits") else {
return Ok(Vec::new());
};
let hits_hits = match hits_hits_json {
Json::Array(arr) => arr,
_ => panic!(
"field [hits.hits] should be an array, but it is not, value: [{}]",
hits_hits_json
),
};
let mut extensions = Vec::with_capacity(hits_hits.len());
for hit in hits_hits {
let mut hit_obj = match hit {
Json::Object(obj) => obj,
_ => panic!(
"each hit in [hits.hits] should be a JSON object, but it is not, value: [{}]",
hit
),
};
let source = hit_obj
.remove("_source")
.expect("each hit should contain field [_source]");
let mut source_obj = match source {
Json::Object(obj) => obj,
_ => panic!(
"field [_source] should be a JSON object, but it is not, value: [{}]",
source
),
};
let developer_id = source_obj
.get("developer")
.and_then(|dev| dev.get("id"))
.and_then(|id| id.as_str())
.expect("developer.id should exist");
let extension_id = source_obj
.get("id")
.and_then(|id| id.as_str())
.expect("extension id should exist");
let installed = is_extension_installed(developer_id, extension_id).await;
source_obj.insert("installed".to_string(), Json::Bool(installed));
extensions.push(Json::Object(source_obj));
}
Ok(extensions)
}
#[tauri::command]
pub(crate) async fn install_extension_from_store(
tauri_app_handle: AppHandle,
id: String,
) -> Result<(), String> {
let path = format!("store/extension/{}/_download", id);
let response = HttpClient::get("default_coco_server", &path, None)
.await
.map_err(|e| format!("Failed to download extension: {}", e))?;
if response.status() == StatusCode::NOT_FOUND {
return Err(format!("extension [{}] not found", id));
}
let bytes = response
.bytes()
.await
.map_err(|e| format!("Failed to read response bytes: {}", e))?;
let cursor = std::io::Cursor::new(bytes);
let mut archive =
zip::ZipArchive::new(cursor).map_err(|e| format!("Failed to read zip archive: {}", e))?;
// The plugin.json sent from the server does not conform to our `struct Extension` definition:
//
// 1. Its `developer` field is a JSON object, but we need a string
// 2. sub-extensions won't have their `id` fields set
//
// we need to correct it
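//
// Illustration with hypothetical values (not taken from a real store response):
//
//   raw:       "developer": { "id": "acme", ... }   "commands": [ { "name": "Run" } ]
//   corrected: "developer": "acme"                  "commands": [ { "name": "Run", "id": "0" } ]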
let mut plugin_json = archive
.by_name(PLUGIN_JSON_FILE_NAME)
.map_err(|e| e.to_string())?;
let mut plugin_json_content = String::new();
std::io::Read::read_to_string(&mut plugin_json, &mut plugin_json_content)
.map_err(|e| e.to_string())?;
let mut extension: Json = serde_json::from_str(&plugin_json_content)
.map_err(|e| format!("Failed to parse plugin.json: {}", e))?;
let mut_ref_to_developer_object: &mut Json = extension
.as_object_mut()
.expect("plugin.json should be an object")
.get_mut("developer")
.expect("plugin.json should contain field [developer]");
let developer_id = mut_ref_to_developer_object
.get("id")
.expect("plugin.json should contain [developer.id]")
.as_str()
.expect("plugin.json field [developer.id] should be a string");
*mut_ref_to_developer_object = Json::String(developer_id.into());
// Set IDs for sub-extensions (commands, quicklinks, scripts)
let mut counter = 0;
// Helper function to set IDs for array fields
fn set_ids_for_field(extension: &mut Json, field_name: &str, counter: &mut i32) {
if let Some(field) = extension.as_object_mut().unwrap().get_mut(field_name) {
if let Some(array) = field.as_array_mut() {
for item in array {
if let Some(item_obj) = item.as_object_mut() {
if !item_obj.contains_key("id") {
item_obj.insert("id".to_string(), Json::String(counter.to_string()));
*counter += 1;
}
}
}
}
}
}
set_ids_for_field(&mut extension, "commands", &mut counter);
set_ids_for_field(&mut extension, "quicklinks", &mut counter);
set_ids_for_field(&mut extension, "scripts", &mut counter);
// Now the extension JSON is valid
let mut extension: Extension = serde_json::from_value(extension).unwrap_or_else(|e| {
panic!(
"cannot parse plugin.json as struct Extension, error [{:?}]",
e
);
});
drop(plugin_json);
general_check(&extension)?;
// Extension is compatible with the current platform, but it could contain sub
// extensions that are not; filter them out.
filter_out_incompatible_sub_extensions(&mut extension, Platform::current());
// Write extension files to the extension directory
let developer = extension.developer.clone().unwrap_or_default();
let extension_id = extension.id.clone();
let extension_directory = {
let mut path = get_third_party_extension_directory(&tauri_app_handle);
path.push(developer);
path.push(extension_id.as_str());
path
};
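// Resulting layout (illustrative): <third_party_extension_dir>/<developer>/<extension_id>/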
tokio::fs::create_dir_all(extension_directory.as_path())
.await
.map_err(|e| e.to_string())?;
// Extract all files except plugin.json
for i in 0..archive.len() {
let mut zip_file = archive.by_index(i).map_err(|e| e.to_string())?;
// `.name()` is safe to use in our case; the problem cases listed on the page
// below won't happen to us.
//
// https://docs.rs/zip/4.2.0/zip/read/struct.ZipFile.html#method.name
//
// Example names:
//
// * `assets/icon.png`
// * `assets/screenshot.png`
// * `plugin.json`
//
// Yes, the `assets` directory is not a part of it.
let zip_file_name = zip_file.name();
// Skip the plugin.json file as we'll create it from the extension variable
if zip_file_name == PLUGIN_JSON_FILE_NAME {
continue;
}
let dest_file_path = extension_directory.join(zip_file_name);
// For cases like `assets/xxx.png`
if let Some(parent_dir) = dest_file_path.parent()
&& !parent_dir.exists()
{
tokio::fs::create_dir_all(parent_dir)
.await
.map_err(|e| e.to_string())?;
}
let mut dest_file = tokio::fs::File::create(&dest_file_path)
.await
.map_err(|e| e.to_string())?;
let mut src_bytes = Vec::with_capacity(
zip_file
.size()
.try_into()
.expect("we won't have a extension file that is bigger than 4GiB"),
);
zip_file
.read_to_end(&mut src_bytes)
.map_err(|e| e.to_string())?;
tokio::io::copy(&mut src_bytes.as_slice(), &mut dest_file)
.await
.map_err(|e| e.to_string())?;
}
// Create plugin.json from the extension variable
let plugin_json_path = extension_directory.join(PLUGIN_JSON_FILE_NAME);
let extension_json = serde_json::to_string_pretty(&extension).map_err(|e| e.to_string())?;
tokio::fs::write(&plugin_json_path, extension_json)
.await
.map_err(|e| e.to_string())?;
// Turn it into an absolute path if it is a valid relative path because the frontend code needs this.
canonicalize_relative_icon_path(&extension_directory, &mut extension)?;
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.add_extension(extension)
.await;
Ok(())
}

1028
src-tauri/src/extension/third_party/mod.rs vendored Normal file

File diff suppressed because it is too large

View File

@@ -1,7 +1,7 @@
mod assistant;
mod autostart;
mod common;
mod local;
mod extension;
mod search;
mod server;
mod settings;
@@ -11,19 +11,15 @@ mod util;
use crate::common::register::SearchSourceRegistry;
// use crate::common::traits::SearchSource;
use crate::common::{MAIN_WINDOW_LABEL, SETTINGS_WINDOW_LABEL};
use crate::common::{CHECK_WINDOW_LABEL, MAIN_WINDOW_LABEL, SETTINGS_WINDOW_LABEL};
use crate::server::servers::{load_or_insert_default_server, load_servers_token};
use autostart::{change_autostart, enable_autostart};
use autostart::{change_autostart, ensure_autostart_state_consistent};
use lazy_static::lazy_static;
use std::sync::Mutex;
use std::sync::OnceLock;
use tauri::async_runtime::block_on;
use tauri::plugin::TauriPlugin;
#[cfg(target_os = "macos")]
use tauri::ActivationPolicy;
use tauri::{
AppHandle, Emitter, Manager, PhysicalPosition, Runtime, WebviewWindow, Window, WindowEvent,
};
use tauri::{AppHandle, Emitter, Manager, PhysicalPosition, WebviewWindow, WindowEvent};
use tauri_plugin_autostart::MacosLauncher;
/// Tauri store name
@@ -32,9 +28,14 @@ pub(crate) const COCO_TAURI_STORE: &str = "coco_tauri_store";
lazy_static! {
static ref PREVIOUS_MONITOR_NAME: Mutex<Option<String>> = Mutex::new(None);
}
/// To allow us to access tauri's `AppHandle` when its context is inaccessible,
/// store it globally. It will be set in `init()`.
///
/// # WARNING
///
/// You may find this work, but the usage is discouraged and should be generally
/// avoided. If you do need it, always be careful that it may not be set() when
/// you access it.
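///
/// A guarded access sketch (illustrative only):
///
/// ```ignore
/// if let Some(app_handle) = GLOBAL_TAURI_APP_HANDLE.get() {
///     // init() has run, the handle is safe to use
/// } else {
///     // not set yet; fall back or bail out
/// }
/// ```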
pub(crate) static GLOBAL_TAURI_APP_HANDLE: OnceLock<AppHandle> = OnceLock::new();
#[tauri::command]
@@ -64,11 +65,13 @@ pub fn run() {
let ctx = tauri::generate_context!();
let mut app_builder = tauri::Builder::default();
// Set up logger first
app_builder = app_builder.plugin(set_up_tauri_logger());
#[cfg(desktop)]
{
app_builder = app_builder.plugin(tauri_plugin_single_instance::init(|_app, argv, _cwd| {
println!("a new app instance was opened with {argv:?} and the deep link event was already triggered");
log::debug!("a new app instance was opened with {argv:?} and the deep link event was already triggered");
// when defining deep link schemes at runtime, you must also check `argv` here
}));
}
@@ -77,7 +80,7 @@ pub fn run() {
.plugin(tauri_plugin_http::init())
.plugin(tauri_plugin_shell::init())
.plugin(tauri_plugin_autostart::init(
MacosLauncher::AppleScript,
MacosLauncher::LaunchAgent,
None,
))
.plugin(tauri_plugin_deep_link::init())
@@ -87,9 +90,13 @@ pub fn run() {
.plugin(tauri_plugin_macos_permissions::init())
.plugin(tauri_plugin_screenshots::init())
.plugin(tauri_plugin_process::init())
.plugin(tauri_plugin_updater::Builder::new().build())
.plugin(
tauri_plugin_updater::Builder::new()
.default_version_comparator(crate::util::updater::custom_version_comparator)
.build(),
)
.plugin(tauri_plugin_windows_version::init())
.plugin(set_up_tauri_logger());
.plugin(tauri_plugin_opener::init());
// Conditional compilation for macOS
#[cfg(target_os = "macos")]
@@ -107,7 +114,8 @@ pub fn run() {
show_coco,
hide_coco,
show_settings,
server::servers::get_server_token,
show_check,
hide_check,
server::servers::add_coco_server,
server::servers::remove_coco_server,
server::servers::list_coco_servers,
@@ -122,8 +130,8 @@ pub fn run() {
server::connector::get_connectors_by_server,
search::query_coco_fusion,
assistant::chat_history,
assistant::new_chat,
assistant::send_message,
assistant::chat_create,
assistant::chat_chat,
assistant::session_chat_history,
assistant::open_session_chat,
assistant::close_session_chat,
@@ -131,60 +139,97 @@ pub fn run() {
assistant::delete_session_chat,
assistant::update_session_chat,
assistant::assistant_search,
assistant::assistant_get,
assistant::assistant_get_multi,
// server::get_coco_server_datasources,
// server::get_coco_server_connectors,
server::websocket::connect_to_server,
server::websocket::disconnect,
get_app_search_source,
server::attachment::upload_attachment,
server::attachment::get_attachment,
server::attachment::get_attachment_by_ids,
server::attachment::delete_attachment,
server::transcription::transcription,
util::open,
server::system_settings::get_system_settings,
simulate_mouse_click,
local::get_disabled_local_query_sources,
local::enable_local_query_source,
local::disable_local_query_source,
local::application::get_app_list,
local::application::get_app_search_path,
local::application::get_app_metadata,
local::application::set_app_alias,
local::application::register_app_hotkey,
local::application::unregister_app_hotkey,
local::application::disable_app_search,
local::application::enable_app_search,
local::application::add_app_search_path,
local::application::remove_app_search_path,
extension::built_in::application::get_app_list,
extension::built_in::application::get_app_search_path,
extension::built_in::application::get_app_metadata,
extension::built_in::application::add_app_search_path,
extension::built_in::application::remove_app_search_path,
extension::built_in::application::reindex_applications,
extension::quicklink_link_arguments,
extension::list_extensions,
extension::enable_extension,
extension::disable_extension,
extension::set_extension_alias,
extension::register_extension_hotkey,
extension::unregister_extension_hotkey,
extension::is_extension_enabled,
extension::third_party::install::store::search_extension,
extension::third_party::install::store::install_extension_from_store,
extension::third_party::install::local_extension::install_local_extension,
extension::third_party::uninstall_extension,
settings::set_allow_self_signature,
settings::get_allow_self_signature,
assistant::ask_ai,
crate::common::document::open,
#[cfg(any(target_os = "macos", target_os = "windows"))]
extension::built_in::file_search::config::get_file_system_config,
#[cfg(any(target_os = "macos", target_os = "windows"))]
extension::built_in::file_search::config::set_file_system_config,
server::synthesize::synthesize,
util::file::get_file_icon,
util::app_lang::update_app_lang,
#[cfg(target_os = "macos")]
setup::toggle_move_to_active_space_attribute,
])
.setup(|app| {
let app_handle = app.handle().clone();
GLOBAL_TAURI_APP_HANDLE
.set(app_handle.clone())
.expect("variable already initialized");
.expect("global tauri AppHandle already initialized");
log::trace!("global Tauri AppHandle set");
#[cfg(target_os = "macos")]
{
log::trace!("hiding Dock icon on macOS");
app.set_activation_policy(tauri::ActivationPolicy::Accessory);
log::trace!("Dock icon should be hidden now");
}
let registry = SearchSourceRegistry::default();
app.manage(registry); // Store registry in Tauri's app state
app.manage(server::websocket::WebSocketManager::default());
// This has to be called before initializing extensions as doing that
// requires access to the shortcut store, which will be set by this
// function.
shortcut::enable_shortcut(app);
block_on(async {
init(app.handle()).await;
// We want all the extensions here, so no filter condition specified.
match extension::list_extensions(app_handle.clone(), None, None, false).await {
Ok(extensions) => {
// Initializing extension relies on SearchSourceRegistry, so this should
// be executed after `app.manage(registry)`
if let Err(e) =
extension::init_extensions(app_handle.clone(), extensions).await
{
log::error!("initializing extensions failed with error [{}]", e);
}
}
Err(e) => {
log::error!("listing extensions failed with error [{}]", e);
}
}
});
shortcut::enable_shortcut(app);
enable_autostart(app);
#[cfg(target_os = "macos")]
app.set_activation_policy(ActivationPolicy::Accessory);
ensure_autostart_state_consistent(app)?;
// app.listen("theme-changed", move |event| {
// if let Ok(payload) = serde_json::from_str::<ThemeChangedPayload>(event.payload()) {
// // switch_tray_icon(app.app_handle(), payload.is_dark_mode);
// println!("Theme changed: is_dark_mode = {}", payload.is_dark_mode);
// log::debug!("Theme changed: is_dark_mode = {}", payload.is_dark_mode);
// }
// });
@@ -204,13 +249,19 @@ pub fn run() {
let main_window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
let settings_window = app.get_webview_window(SETTINGS_WINDOW_LABEL).unwrap();
setup::default(app, main_window.clone(), settings_window.clone());
let check_window = app.get_webview_window(CHECK_WINDOW_LABEL).unwrap();
setup::default(
app,
main_window.clone(),
settings_window.clone(),
check_window.clone(),
);
Ok(())
})
.on_window_event(|window, event| match event {
WindowEvent::CloseRequested { api, .. } => {
dbg!("Close requested event received");
//dbg!("Close requested event received");
window.hide().unwrap();
api.prevent_close();
}
@@ -225,10 +276,10 @@ pub fn run() {
has_visible_windows,
..
} => {
dbg!(
"Reopen event received: has_visible_windows = {}",
has_visible_windows
);
// dbg!(
// "Reopen event received: has_visible_windows = {}",
// has_visible_windows
// );
if has_visible_windows {
return;
}
@@ -239,17 +290,17 @@ pub fn run() {
});
}
pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
pub async fn init(app_handle: &AppHandle) {
// Await the async functions to load the servers and tokens
if let Err(err) = load_or_insert_default_server(app_handle).await {
eprintln!("Failed to load servers: {}", err);
log::error!("Failed to load servers: {}", err);
}
if let Err(err) = load_servers_token(app_handle).await {
eprintln!("Failed to load server tokens: {}", err);
log::error!("Failed to load server tokens: {}", err);
}
let coco_servers = server::servers::get_all_servers();
let coco_servers = server::servers::get_all_servers().await;
// Get the registry from Tauri's state
// let registry: State<SearchSourceRegistry> = app_handle.state::<SearchSourceRegistry>();
@@ -259,12 +310,12 @@ pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
.await;
}
local::start_pizza_engine_runtime();
extension::built_in::pizza_engine_runtime::start_pizza_engine_runtime().await;
}
#[tauri::command]
async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) {
if let Some(window) = app_handle.get_window(MAIN_WINDOW_LABEL) {
async fn show_coco(app_handle: AppHandle) {
if let Some(window) = app_handle.get_webview_window(MAIN_WINDOW_LABEL) {
move_window_to_active_monitor(&window);
let _ = window.show();
@@ -276,25 +327,25 @@ async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) {
}
#[tauri::command]
async fn hide_coco<R: Runtime>(app: AppHandle<R>) {
if let Some(window) = app.get_window(MAIN_WINDOW_LABEL) {
async fn hide_coco(app: AppHandle) {
if let Some(window) = app.get_webview_window(MAIN_WINDOW_LABEL) {
if let Err(err) = window.hide() {
eprintln!("Failed to hide the window: {}", err);
log::error!("Failed to hide the window: {}", err);
} else {
println!("Window successfully hidden.");
log::debug!("Window successfully hidden.");
}
} else {
eprintln!("Main window not found.");
log::error!("Main window not found.");
}
}
fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
dbg!("Moving window to active monitor");
fn move_window_to_active_monitor(window: &WebviewWindow) {
//dbg!("Moving window to active monitor");
// Try to get the available monitors, handle failure gracefully
let available_monitors = match window.available_monitors() {
Ok(monitors) => monitors,
Err(e) => {
eprintln!("Failed to get monitors: {}", e);
log::error!("Failed to get monitors: {}", e);
return;
}
};
@@ -303,7 +354,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
let cursor_position = match window.cursor_position() {
Ok(pos) => Some(pos),
Err(e) => {
eprintln!("Failed to get cursor position: {}", e);
log::error!("Failed to get cursor position: {}", e);
None
}
};
@@ -332,7 +383,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
let monitor = match target_monitor.or_else(|| window.primary_monitor().ok().flatten()) {
Some(monitor) => monitor,
None => {
eprintln!("No monitor found!");
log::error!("No monitor found!");
return;
}
};
@@ -342,7 +393,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
if let Some(ref prev_name) = *previous_monitor_name {
if name.to_string() == *prev_name {
println!("Currently on the same monitor");
log::debug!("Currently on the same monitor");
return;
}
@@ -356,7 +407,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
let window_size = match window.inner_size() {
Ok(size) => size,
Err(e) => {
eprintln!("Failed to get window size: {}", e);
log::error!("Failed to get window size: {}", e);
return;
}
};
@@ -370,52 +421,19 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
// Move the window to the new position
if let Err(e) = window.set_position(PhysicalPosition::new(window_x, window_y)) {
eprintln!("Failed to move window: {}", e);
log::error!("Failed to move window: {}", e);
}
if let Some(name) = monitor.name() {
println!("Window moved to monitor: {}", name);
log::debug!("Window moved to monitor: {}", name);
let mut previous_monitor = PREVIOUS_MONITOR_NAME.lock().unwrap();
*previous_monitor = Some(name.to_string());
}
}
#[allow(dead_code)]
fn open_settings(app: &tauri::AppHandle) {
use tauri::webview::WebviewBuilder;
println!("settings menu item was clicked");
let window = app.get_webview_window("settings");
if let Some(window) = window {
let _ = window.show();
let _ = window.unminimize();
let _ = window.set_focus();
} else {
let window = tauri::window::WindowBuilder::new(app, "settings")
.title("Settings Window")
.fullscreen(false)
.resizable(false)
.minimizable(false)
.maximizable(false)
.inner_size(800.0, 600.0)
.build()
.unwrap();
let webview_builder =
WebviewBuilder::new("settings", tauri::WebviewUrl::App("/ui/settings".into()));
let _webview = window
.add_child(
webview_builder,
tauri::LogicalPosition::new(0, 0),
window.inner_size().unwrap(),
)
.unwrap();
}
}
#[tauri::command]
async fn get_app_search_source<R: Runtime>(app_handle: AppHandle<R>) -> Result<(), String> {
local::init_local_search_source(&app_handle).await?;
async fn get_app_search_source(app_handle: AppHandle) -> Result<(), String> {
let _ = server::connector::refresh_all_connectors(&app_handle).await;
let _ = server::datasource::refresh_all_datasources(&app_handle).await;
@@ -424,53 +442,36 @@ async fn get_app_search_source<R: Runtime>(app_handle: AppHandle<R>) -> Result<(
#[tauri::command]
async fn show_settings(app_handle: AppHandle) {
open_settings(&app_handle);
log::debug!("settings menu item was clicked");
let window = app_handle
.get_webview_window(SETTINGS_WINDOW_LABEL)
.expect("we have a settings window");
window.show().unwrap();
window.unminimize().unwrap();
window.set_focus().unwrap();
}
#[tauri::command]
async fn simulate_mouse_click<R: Runtime>(window: WebviewWindow<R>, is_chat_mode: bool) {
#[cfg(target_os = "windows")]
{
use enigo::{Button, Coordinate, Direction, Enigo, Mouse, Settings};
use std::{thread, time::Duration};
async fn show_check(app_handle: AppHandle) {
log::debug!("check menu item was clicked");
let window = app_handle
.get_webview_window(CHECK_WINDOW_LABEL)
.expect("we have a check window");
if let Ok(mut enigo) = Enigo::new(&Settings::default()) {
// Save the current mouse position
if let Ok((original_x, original_y)) = enigo.location() {
// Retrieve the window's outer position (top-left corner)
if let Ok(position) = window.outer_position() {
// Retrieve the window's inner size (client area)
if let Ok(size) = window.inner_size() {
// Calculate the center position of the title bar
let x = position.x + (size.width as i32 / 2);
let y = if is_chat_mode {
position.y + size.height as i32 - 50
} else {
position.y + 30
};
window.show().unwrap();
window.unminimize().unwrap();
window.set_focus().unwrap();
}
// Move the mouse cursor to the calculated position
if enigo.move_mouse(x, y, Coordinate::Abs).is_ok() {
// // Simulate a left mouse click
let _ = enigo.button(Button::Left, Direction::Click);
// let _ = enigo.button(Button::Left, Direction::Release);
#[tauri::command]
async fn hide_check(app_handle: AppHandle) {
log::debug!("check window was closed");
let window = &app_handle
.get_webview_window(CHECK_WINDOW_LABEL)
.expect("we have a check window");
thread::sleep(Duration::from_millis(100));
// Move the mouse cursor back to the original position
let _ = enigo.move_mouse(original_x, original_y, Coordinate::Abs);
}
}
}
}
}
}
#[cfg(not(target_os = "windows"))]
{
let _ = window;
let _ = is_chat_mode;
}
window.hide().unwrap();
}
/// Log format:
@@ -487,6 +488,12 @@ async fn simulate_mouse_click<R: Runtime>(window: WebviewWindow<R>, is_chat_mode
/// ```
fn set_up_tauri_logger() -> TauriPlugin<tauri::Wry> {
use log::Level;
use log::LevelFilter;
use tauri_plugin_log::Builder;
/// Coco-AI app's default log level.
const DEFAULT_LOG_LEVEL: LevelFilter = LevelFilter::Info;
const LOG_LEVEL_ENV_VAR: &str = "COCO_LOG";
fn format_log_level(level: Level) -> &'static str {
match level {
@@ -508,16 +515,93 @@ fn set_up_tauri_logger() -> TauriPlugin<tauri::Wry> {
str
}
tauri_plugin_log::Builder::new()
.format(|out, message, record| {
let now = chrono::Local::now().format("%m-%d %H:%M:%S");
let level = format_log_level(record.level());
let target_and_line = format_target_and_line(record);
out.finish(format_args!(
"[{}] [{}] [{}] {}",
now, level, target_and_line, message
));
})
.level(log::LevelFilter::Debug)
.build()
/// Allow us to configure dynamic log levels via environment variable `COCO_LOG`.
///
/// Generally, it mirrors the behavior of `env_logger`. Syntax: `COCO_LOG=[target][=][level][,...]`
///
/// * If this environment variable is not set, use the default log level.
/// * If it is set, respect it:
///
/// * `COCO_LOG=coco_lib` turns on all logging for the `coco_lib` module, which is
/// equivalent to `COCO_LOG=coco_lib=trace`
/// * `COCO_LOG=trace` turns on all logging for the application, regardless of its name
/// * `COCO_LOG=TRACE` turns on all logging for the application, regardless of its name (same as previous)
/// * `COCO_LOG=reqwest=debug` turns on debug logging for `reqwest`
/// * `COCO_LOG=trace,tauri=off` turns on all the logging except for the logs that come from `tauri`
/// * `COCO_LOG=off` turns off all logging for the application
/// * `COCO_LOG=` since the value is empty, this turns off all logging for the application as well
fn dynamic_log_level(mut builder: Builder) -> Builder {
let Some(log_levels) = std::env::var_os(LOG_LEVEL_ENV_VAR) else {
return builder.level(DEFAULT_LOG_LEVEL);
};
builder = builder.level(LevelFilter::Off);
let log_levels = log_levels.into_string().unwrap_or_else(|e| {
panic!(
"The value '{}' set in environment varaible '{}' is not UTF-8 encoded",
// Cannot use `.display()` here becuase that requires MSRV 1.87.0
e.to_string_lossy(),
LOG_LEVEL_ENV_VAR
)
});
// COCO_LOG=[target][=][level][,...]
let target_log_levels = log_levels.split(',');
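// e.g. "coco_lib=debug,tauri=off" splits into ["coco_lib=debug", "tauri=off"];
// the branch below handles "target=level" pairs, the `else` a bare level or target.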
for target_log_level in target_log_levels {
#[allow(clippy::collapsible_else_if)]
if let Some(char_index) = target_log_level.chars().position(|c| c == '=') {
let (target, equal_sign_and_level) = target_log_level.split_at(char_index);
// Remove the equal sign, we know it takes 1 byte
let level = &equal_sign_and_level[1..];
if let Ok(level) = level.parse::<LevelFilter>() {
// Here we have to call `.to_string()` because `Cow<'static, str>` requires `&'static str`
builder = builder.level_for(target.to_string(), level);
} else {
panic!(
"log level '{}' set in '{}={}' is invalid",
level, target, level
);
}
} else {
if let Ok(level) = target_log_level.parse::<LevelFilter>() {
// This is a level
builder = builder.level(level);
} else {
// This is a target, enable all the logging
//
// Here we have to call `.to_string()` because `Cow<'static, str>` requires `&'static str`
builder = builder.level_for(target_log_level.to_string(), LevelFilter::Trace);
}
}
}
builder
}
// When running the built binary, set `COCO_LOG` to `coco_lib=trace` to capture all logs
// that come from Coco in the log file, which helps with debugging.
if !tauri::is_dev() {
// We have absolutely no guarantee that nothing will concurrently read/write
// `envp` (we control the Rust code, but have no idea about the libc C code or
// all the shared objects that we will link), so just use unsafe.
unsafe {
std::env::set_var("COCO_LOG", "coco_lib=trace");
}
}
let mut builder = tauri_plugin_log::Builder::new();
builder = builder.format(|out, message, record| {
let now = chrono::Local::now().format("%m-%d %H:%M:%S");
let level = format_log_level(record.level());
let target_and_line = format_target_and_line(record);
out.finish(format_args!(
"[{}] [{}] [{}] {}",
now, level, target_and_line, message
));
});
builder = dynamic_log_level(builder);
builder.build()
}

View File

@@ -1,164 +0,0 @@
pub mod application;
pub mod calculator;
pub mod file_system;
use std::any::Any;
use std::collections::hash_map::Entry;
use std::collections::HashMap;
use std::sync::OnceLock;
use crate::common::register::SearchSourceRegistry;
use serde_json::Value as Json;
use tauri::{AppHandle, Manager, Runtime};
use tauri_plugin_store::StoreExt;
pub const LOCAL_QUERY_SOURCE_TYPE: &str = "local";
pub const TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE: &str = "local_query_source_enabled_state";
trait SearchSourceState {
#[cfg_attr(not(feature = "use_pizza_engine"), allow(unused))]
fn as_mut_any(&mut self) -> &mut dyn Any;
}
#[async_trait::async_trait(?Send)]
trait Task: Send + Sync {
fn search_source_id(&self) -> &'static str;
async fn exec(&mut self, state: &mut Option<Box<dyn SearchSourceState>>);
}
static RUNTIME_TX: OnceLock<tokio::sync::mpsc::UnboundedSender<Box<dyn Task>>> = OnceLock::new();
pub(crate) fn start_pizza_engine_runtime() {
std::thread::spawn(|| {
let rt = tokio::runtime::Runtime::new().unwrap();
let main = async {
let mut states: HashMap<String, Option<Box<dyn SearchSourceState>>> = HashMap::new();
let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel();
RUNTIME_TX.set(tx).unwrap();
while let Some(mut task) = rx.recv().await {
let opt_search_source_state = match states.entry(task.search_source_id().into()) {
Entry::Occupied(o) => o.into_mut(),
Entry::Vacant(v) => v.insert(None),
};
task.exec(opt_search_source_state).await;
}
};
rt.block_on(main);
});
}
pub(crate) async fn init_local_search_source<R: Runtime>(
app_handle: &AppHandle<R>,
) -> Result<(), String> {
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.map_err(|e| e.to_string())?;
if enabled_status_store.is_empty() {
enabled_status_store.set(
application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME,
Json::Bool(true),
);
enabled_status_store.set(calculator::DATA_SOURCE_ID, Json::Bool(true));
}
let registry = app_handle.state::<SearchSourceRegistry>();
application::ApplicationSearchSource::init(app_handle.clone()).await?;
for (id, enabled) in enabled_status_store.entries() {
let enabled = match enabled {
Json::Bool(b) => b,
_ => unreachable!("enabled state should be stored as a boolean"),
};
if enabled {
if id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
registry
.register_source(application::ApplicationSearchSource)
.await;
}
if id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
registry.register_source(calculator_search).await;
}
}
}
Ok(())
}
#[tauri::command]
pub async fn get_disabled_local_query_sources<R: Runtime>(app_handle: AppHandle<R>) -> Vec<String> {
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.unwrap_or_else(|e| {
panic!(
"tauri store [{}] should exist and be loaded, but that's not true due to error [{}]",
TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE, e
)
});
let mut disabled_local_query_sources = Vec::new();
for (id, enabled) in enabled_status_store.entries() {
let enabled = match enabled {
Json::Bool(b) => b,
_ => unreachable!("enabled state should be stored as a boolean"),
};
if !enabled {
disabled_local_query_sources.push(id);
}
}
disabled_local_query_sources
}
#[tauri::command]
pub async fn enable_local_query_source<R: Runtime>(
app_handle: AppHandle<R>,
query_source_id: String,
) {
let registry = app_handle.state::<SearchSourceRegistry>();
if query_source_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
let application_search = application::ApplicationSearchSource;
registry.register_source(application_search).await;
}
if query_source_id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
registry.register_source(calculator_search).await;
}
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.unwrap_or_else(|e| {
panic!(
"tauri store [{}] should exist and be loaded, but that's not true due to error [{}]",
TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE, e
)
});
enabled_status_store.set(query_source_id, Json::Bool(true));
}
#[tauri::command]
pub async fn disable_local_query_source<R: Runtime>(
app_handle: AppHandle<R>,
query_source_id: String,
) {
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(&query_source_id).await;
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.unwrap_or_else(|e| {
panic!(
"tauri store [{}] should exist and be loaded, but that's not true due to error [{}]",
TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE, e
)
});
enabled_status_store.set(query_source_id, Json::Bool(false));
}

View File

@@ -1,5 +1,112 @@
// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
use std::fs::OpenOptions;
use std::io::Write;
use std::path::PathBuf;
/// Helper function to return the log directory.
///
/// This should return the same value as `tauri_app_handle.path().app_log_dir().unwrap()`.
fn app_log_dir() -> PathBuf {
// This function `app_log_dir()` is for the panic hook, which should be set
// before Tauri performs any initialization. At that point, we do not have
// access to the identifier provided by Tauri, so we need to define our own
// one here.
//
// NOTE: If you update identifier in the following files, update this one
// as well!
//
// src-tauri/tauri.linux.conf.json
// src-tauri/Entitlements.plist
// src-tauri/tauri.conf.json
// src-tauri/Info.plist
const IDENTIFIER: &str = "rs.coco.app";
#[cfg(target_os = "macos")]
let path = dirs::home_dir()
.expect("cannot find the home directory, Coco should never run in such a environment")
.join("Library/Logs")
.join(IDENTIFIER);
#[cfg(not(target_os = "macos"))]
let path = dirs::data_local_dir()
.expect("app local dir is None, we should not encounter this")
.join(IDENTIFIER)
.join("logs");
path
}
/// Set up panic hook to log panic information to a file
fn setup_panic_hook() {
std::panic::set_hook(Box::new(|panic_info| {
let timestamp = chrono::Local::now();
// "%Y-%m-%d %H:%M:%S"
//
// I would like to use the above format, but Windows does not allow that
// and complains with OS error 123.
let datetime_str = timestamp.format("%Y-%m-%d-%H-%M-%S").to_string();
let log_dir = app_log_dir();
// Ensure the log directory exists
if let Err(e) = std::fs::create_dir_all(&log_dir) {
eprintln!("Panic hook error: failed to create log directory: {}", e);
return;
}
let panic_file = log_dir.join(format!("{}_rust_panic.log", datetime_str));
// Prepare panic information
let panic_message = if let Some(s) = panic_info.payload().downcast_ref::<&str>() {
s.to_string()
} else if let Some(s) = panic_info.payload().downcast_ref::<String>() {
s.clone()
} else {
"Unknown panic message".to_string()
};
let location = if let Some(location) = panic_info.location() {
format!(
"{}:{}:{}",
location.file(),
location.line(),
location.column()
)
} else {
"Unknown location".to_string()
};
// Use `force_capture()` instead of `capture()` as we want backtrace
// regardless of whether the corresponding env vars are set or not.
let backtrace = std::backtrace::Backtrace::force_capture();
let panic_log = format!(
"Time: [{}]\nLocation: [{}]\nMessage: [{}]\nBacktrace: \n{}",
datetime_str, location, panic_message, backtrace
);
// Write to panic file
match OpenOptions::new()
.create(true)
.append(true)
.open(&panic_file)
{
Ok(mut file) => {
if let Err(e) = writeln!(file, "{}", panic_log) {
eprintln!("Panic hook error: Failed to write panic to file: {}", e);
}
}
Err(e) => {
eprintln!("Panic hook error: Failed to open panic log file: {}", e);
}
}
}));
}
fn main() {
// Panic hook setup should be the first thing to do, everything could panic!
setup_panic_hook();
coco_lib::run();
}

View File

@@ -3,128 +3,273 @@ use crate::common::register::SearchSourceRegistry;
use crate::common::search::{
FailedRequest, MultiSourceQueryResponse, QueryHits, QuerySource, SearchQuery,
};
use futures::stream::FuturesUnordered;
use crate::common::traits::SearchSource;
use crate::server::servers::logout_coco_server;
use crate::server::servers::mark_server_as_offline;
use function_name::named;
use futures::StreamExt;
use futures::stream::FuturesUnordered;
use reqwest::StatusCode;
use std::cmp::Reverse;
use std::collections::HashMap;
use std::collections::HashSet;
use tauri::{AppHandle, Manager, Runtime};
use tokio::time::{timeout, Duration};
use std::sync::Arc;
use tauri::{AppHandle, Manager};
use tokio::time::{Duration, timeout};
#[named]
#[tauri::command]
pub async fn query_coco_fusion<R: Runtime>(
app_handle: AppHandle<R>,
pub async fn query_coco_fusion(
tauri_app_handle: AppHandle,
from: u64,
size: u64,
query_strings: HashMap<String, String>,
query_timeout: u64,
) -> Result<MultiSourceQueryResponse, SearchError> {
let query_source_to_search = query_strings.get("querysource");
let search_sources = app_handle.state::<SearchSourceRegistry>();
let sources_future = search_sources.get_sources();
let mut futures = FuturesUnordered::new();
let mut sources = HashMap::new();
let sources_list = sources_future.await;
// Time limit for each query
let opt_query_source_id = query_strings.get("querysource");
let search_sources = tauri_app_handle.state::<SearchSourceRegistry>();
let query_source_list = search_sources.get_sources().await;
let timeout_duration = Duration::from_millis(query_timeout);
let search_query = SearchQuery::new(from, size, query_strings.clone());
// Push all queries into futures
for query_source in sources_list {
let query_source_type = query_source.get_type().clone();
log::debug!(
"{}() invoked with parameters: from: [{}], size: [{}], query_strings: [{:?}], timeout: [{:?}]",
function_name!(),
from,
size,
query_strings,
timeout_duration
);
if let Some(query_source_to_search) = query_source_to_search {
// We should not search this data source
if &query_source_type.id != query_source_to_search {
continue;
}
// Dispatch to different `query_coco_fusion_xxx()` functions.
if let Some(query_source_id) = opt_query_source_id {
query_coco_fusion_single_query_source(
tauri_app_handle,
query_source_list,
query_source_id.clone(),
timeout_duration,
search_query,
)
.await
} else {
query_coco_fusion_multi_query_sources(
tauri_app_handle,
query_source_list,
timeout_duration,
search_query,
)
.await
}
}
/// Query only 1 query source.
///
/// The logic here is much simpler than `query_coco_fusion_multi_query_sources()`
/// as we don't need to re-rank due to the fact that this does not involve multiple
/// query sources.
async fn query_coco_fusion_single_query_source(
tauri_app_handle: AppHandle,
mut query_source_list: Vec<Arc<dyn SearchSource>>,
id_of_query_source_to_query: String,
timeout_duration: Duration,
search_query: SearchQuery,
) -> Result<MultiSourceQueryResponse, SearchError> {
// If this query source ID is specified, we only query this query source.
log::debug!(
"parameter [querysource={}] specified, will only query this query source",
id_of_query_source_to_query
);
let opt_query_source_trait_object_index = query_source_list
.iter()
.position(|query_source| query_source.get_type().id == id_of_query_source_to_query);
let Some(query_source_trait_object_index) = opt_query_source_trait_object_index else {
// It is possible (an edge case) that the frontend invokes `query_coco_fusion()`
// with a querysource that does not exist in the source list:
//
// 1. Search applications
// 2. Navigate to the application sub page
// 3. Disable the application extension in settings, which removes this
// query source from the list
// 4. Hide the search window
// 5. Re-open the search window, you will still be in the sub page, type to search
// something
//
// The application query source is not in the source list because the extension
// was disabled and thus removed from the query sources, but the last
// search is indeed invoked with parameter `querysource=application`.
return Ok(MultiSourceQueryResponse {
failed: Vec::new(),
hits: Vec::new(),
total_hits: 0,
});
};
let query_source_trait_object = query_source_list.remove(query_source_trait_object_index);
let query_source = query_source_trait_object.get_type();
let search_fut = query_source_trait_object.search(tauri_app_handle.clone(), search_query);
let timeout_result = timeout(timeout_duration, search_fut).await;
let mut failed_requests: Vec<FailedRequest> = Vec::new();
let mut hits = Vec::new();
let mut total_hits = 0;
match timeout_result {
// Ignore the `_timeout` variable as it won't provide any useful debugging information.
Err(_timeout) => {
log::warn!(
"searching query source [{}] timed out, skip this request",
query_source.id
);
}
Ok(query_result) => match query_result {
Ok(response) => {
total_hits = response.total_hits;
sources.insert(query_source_type.id.clone(), query_source_type);
for (document, score) in response.hits {
log::debug!(
"document from query source [{}]: ID [{}], title [{:?}], score [{}]",
response.source.id,
document.id,
document.title,
score
);
let query = SearchQuery::new(from, size, query_strings.clone());
let query_source_clone = query_source.clone(); // Clone Arc to avoid ownership issues
let query_hit = QueryHits {
source: Some(response.source.clone()),
score,
document,
};
futures.push(tokio::spawn(async move {
// Timeout each query execution
timeout(timeout_duration, async {
query_source_clone.search(query).await
})
.await
}));
hits.push(query_hit);
}
}
Err(search_error) => {
query_coco_fusion_handle_failed_request(
tauri_app_handle.clone(),
&mut failed_requests,
query_source,
search_error,
)
.await;
}
},
}
Ok(MultiSourceQueryResponse {
failed: failed_requests,
hits,
total_hits,
})
}
async fn query_coco_fusion_multi_query_sources(
tauri_app_handle: AppHandle,
query_source_trait_object_list: Vec<Arc<dyn SearchSource>>,
timeout_duration: Duration,
search_query: SearchQuery,
) -> Result<MultiSourceQueryResponse, SearchError> {
log::debug!(
"will query query sources {:?}",
query_source_trait_object_list
.iter()
.map(|search_source| search_source.get_type().id.clone())
.collect::<Vec<String>>()
);
let query_keyword = search_query
.query_strings
.get("query")
.unwrap_or(&"".to_string())
.clone();
let size = search_query.size;
let mut futures = FuturesUnordered::new();
let query_source_list_len = query_source_trait_object_list.len();
for query_source_trait_object in query_source_trait_object_list {
let query_source = query_source_trait_object.get_type().clone();
let tauri_app_handle_clone = tauri_app_handle.clone();
let search_query_clone = search_query.clone();
futures.push(async move {
(
// Store `query_source` as part of future for debugging purposes.
query_source,
timeout(timeout_duration, async {
query_source_trait_object
.search(tauri_app_handle_clone, search_query_clone)
.await
})
.await,
)
});
}
let mut total_hits = 0;
let mut need_rerank = true; //TODO set default to false when boost supported in Pizza
let mut failed_requests = Vec::new();
let mut all_hits: Vec<(String, QueryHits, f64)> = Vec::new();
let mut hits_per_source: HashMap<String, Vec<(QueryHits, f64)>> = HashMap::new();
while let Some(result) = futures.next().await {
match result {
Ok(Ok(Ok(response))) => {
total_hits += response.total_hits;
let source_id = response.source.id.clone();
if query_source_list_len > 1 {
need_rerank = true; // If we have more than one source, we need to rerank the hits
}
for (doc, score) in response.hits {
let query_hit = QueryHits {
source: Some(response.source.clone()),
score,
document: doc,
};
while let Some((query_source, timeout_result)) = futures.next().await {
match timeout_result {
// Ignore the `_timeout` variable as it won't provide any useful debugging information.
Err(_timeout) => {
log::warn!(
"searching query source [{}] timed out, skip this request",
query_source.id
);
}
Ok(query_result) => match query_result {
Ok(response) => {
total_hits += response.total_hits;
let source_id = response.source.id.clone();
all_hits.push((source_id.clone(), query_hit.clone(), score));
for (document, score) in response.hits {
log::debug!(
"document from query source [{}]: ID [{}], title [{:?}], score [{}]",
response.source.id,
document.id,
document.title,
score
);
hits_per_source
.entry(source_id.clone())
.or_insert_with(Vec::new)
.push((query_hit, score));
let query_hit = QueryHits {
source: Some(response.source.clone()),
score,
document,
};
all_hits.push((source_id.clone(), query_hit.clone(), score));
hits_per_source
.entry(source_id.clone())
.or_insert_with(Vec::new)
.push((query_hit, score));
}
}
}
Ok(Ok(Err(err))) => {
failed_requests.push(FailedRequest {
source: QuerySource {
r#type: "N/A".into(),
name: "N/A".into(),
id: "N/A".into(),
},
status: 0,
error: Some(err.to_string()),
reason: None,
});
}
Ok(Err(err)) => {
failed_requests.push(FailedRequest {
source: QuerySource {
r#type: "N/A".into(),
name: "N/A".into(),
id: "N/A".into(),
},
status: 0,
error: Some(err.to_string()),
reason: None,
});
}
// Timeout reached, skip this request
_ => {
failed_requests.push(FailedRequest {
source: QuerySource {
r#type: "N/A".into(),
name: "N/A".into(),
id: "N/A".into(),
},
status: 0,
error: Some(format!("{:?}", &result)),
reason: None,
});
}
Err(search_error) => {
query_coco_fusion_handle_failed_request(
tauri_app_handle.clone(),
&mut failed_requests,
query_source,
search_error,
)
.await;
}
},
}
}
// Sort hits within each source by score (descending)
for hits in hits_per_source.values_mut() {
hits.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal));
hits.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Greater));
}
let total_sources = hits_per_source.len();
@@ -140,16 +285,71 @@ pub async fn query_coco_fusion<R: Runtime>(
// Distribute hits fairly across sources
for (_source_id, hits) in &mut hits_per_source {
let take_count = hits.len().min(max_hits_per_source);
for (doc, _) in hits.drain(0..take_count) {
for (doc, score) in hits.drain(0..take_count) {
if !seen_docs.contains(&doc.document.id) {
seen_docs.insert(doc.document.id.clone());
log::debug!(
"collect doc: {}, {:?}, {}",
doc.document.id,
doc.document.title,
score
);
final_hits.push(doc);
}
}
}
// If we still need more hits, take the highest-scoring remaining ones
if final_hits.len() < size as usize {
log::debug!("final hits: {:?}", final_hits.len());
let mut unique_sources = HashSet::new();
for hit in &final_hits {
if let Some(source) = &hit.source {
if source.id != crate::extension::built_in::calculator::DATA_SOURCE_ID {
unique_sources.insert(&source.id);
}
}
}
log::debug!(
"Multiple sources found: {:?}, no rerank needed",
unique_sources
);
if unique_sources.len() < 1 {
need_rerank = false; // No non-calculator sources contributed hits, so there is nothing to rerank
}
if need_rerank && final_hits.len() > 1 {
// Precollect (index, title)
let titles_to_score: Vec<(usize, &str)> = final_hits
.iter()
.enumerate()
.filter_map(|(idx, hit)| {
let source = hit.source.as_ref()?;
let title = hit.document.title.as_deref()?;
if source.id != crate::extension::built_in::calculator::DATA_SOURCE_ID {
Some((idx, title))
} else {
None
}
})
.collect();
// Score them
let scored_hits = boosted_levenshtein_rerank(query_keyword.as_str(), titles_to_score);
// Sort descending by score
let mut scored_hits = scored_hits;
scored_hits.sort_by_key(|&(_, score)| Reverse((score * 1000.0) as u64));
// Apply new scores to final_hits
for (idx, score) in scored_hits.into_iter().take(size as usize) {
final_hits[idx].score = score;
}
} else if final_hits.len() < size as usize {
// If we still need more hits, take the highest-scoring remaining ones
let remaining_needed = size as usize - final_hits.len();
// Sort all hits by score descending, removing duplicates by document ID
@@ -179,9 +379,96 @@ pub async fn query_coco_fusion<R: Runtime>(
.unwrap_or(std::cmp::Ordering::Equal)
});
if final_hits.len() < 5 {
//TODO: Add a recommendation system to suggest more sources
log::info!(
"Less than 5 hits found, consider using recommendation to find more suggestions."
);
//local: recent history, local extensions
//remote: ai agents, quick links, other tasks, managed by server
}
Ok(MultiSourceQueryResponse {
failed: failed_requests,
hits: final_hits,
total_hits,
})
}
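/// Worked example with hypothetical inputs: query "term", title "Terminal"
///
/// * case-sensitive `contains` fails, lowercased `contains` succeeds -> +0.2
/// * levenshtein("term", "terminal") = 4, max_len = 8                -> +(1.0 - 4.0/8.0) = +0.5
/// * final score: 0.7 (capped at 1.0)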
fn boosted_levenshtein_rerank(query: &str, titles: Vec<(usize, &str)>) -> Vec<(usize, f64)> {
use strsim::levenshtein;
let query_lower = query.to_lowercase();
titles
.into_iter()
.map(|(idx, title)| {
let mut score = 0.0;
if title.contains(query) {
score += 0.4;
} else if title.to_lowercase().contains(&query_lower) {
score += 0.2;
}
let dist = levenshtein(&query_lower, &title.to_lowercase());
let max_len = query_lower.len().max(title.len());
if max_len > 0 {
score += (1.0 - (dist as f64 / max_len as f64)) as f32;
}
(idx, score.min(1.0) as f64)
})
.collect()
}
/// Helper function to handle a failed request.
///
/// Extracted as a function because `query_coco_fusion_single_query_source()` and
/// `query_coco_fusion_multi_query_sources()` share the same error handling logic.
async fn query_coco_fusion_handle_failed_request(
tauri_app_handle: AppHandle,
failed_requests: &mut Vec<FailedRequest>,
query_source: QuerySource,
search_error: SearchError,
) {
log::error!(
"searching query source [{}] failed, error [{}]",
query_source.id,
search_error
);
let mut status_code_num: u16 = 0;
if let SearchError::HttpError {
status_code: opt_status_code,
msg: _,
} = search_error
{
if let Some(status_code) = opt_status_code {
status_code_num = status_code.as_u16();
if status_code != StatusCode::OK {
if status_code == StatusCode::UNAUTHORIZED {
// This Coco server is unavailable. In addition to marking it as
// unavailable, we need to log out because the status code is 401.
logout_coco_server(tauri_app_handle.clone(), query_source.id.to_string()).await.unwrap_or_else(|e| {
panic!(
"the search request to Coco server [id {}, name {}] failed with status code {}, the login token is invalid, we are trying to log out, but failed with error [{}]",
query_source.id, query_source.name, StatusCode::UNAUTHORIZED, e
);
})
} else {
// This Coco server is unavailable
mark_server_as_offline(tauri_app_handle.clone(), &query_source.id).await;
}
}
}
}
failed_requests.push(FailedRequest {
source: query_source,
status: status_code_num,
error: Some(search_error.to_string()),
reason: None,
});
}

View File

@@ -15,42 +15,6 @@ pub struct UploadAttachmentResponse {
pub attachments: Vec<String>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentSource {
pub id: String,
pub created: String,
pub updated: String,
pub session: String,
pub name: String,
pub icon: String,
pub url: String,
pub size: u64,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentHit {
pub _index: String,
pub _type: Option<String>,
pub _id: String,
pub _score: Option<f64>,
pub _source: AttachmentSource,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentHits {
pub total: Value,
pub max_score: Option<f64>,
pub hits: Option<Vec<AttachmentHit>>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct GetAttachmentResponse {
pub took: u32,
pub timed_out: bool,
pub _shards: Option<Value>,
pub hits: AttachmentHits,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct DeleteAttachmentResponse {
pub _id: String,
@@ -60,7 +24,6 @@ pub struct DeleteAttachmentResponse {
#[command]
pub async fn upload_attachment(
server_id: String,
session_id: String,
file_paths: Vec<PathBuf>,
) -> Result<UploadAttachmentResponse, String> {
let mut form = Form::new();
@@ -82,10 +45,12 @@ pub async fn upload_attachment(
form = form.part("files", part);
}
let server = get_server_by_id(&server_id).ok_or("Server not found")?;
let url = HttpClient::join_url(&server.endpoint, &format!("chat/{}/_upload", session_id));
let server = get_server_by_id(&server_id)
.await
.ok_or("Server not found")?;
let url = HttpClient::join_url(&server.endpoint, &format!("attachment/_upload"));
let token = get_server_token(&server_id).await?;
let token = get_server_token(&server_id).await;
let mut headers = HashMap::new();
if let Some(token) = token {
headers.insert("X-API-TOKEN".to_string(), token.access_token);
@@ -107,20 +72,25 @@ pub async fn upload_attachment(
}
#[command]
pub async fn get_attachment(
pub async fn get_attachment_by_ids(
server_id: String,
session_id: String,
) -> Result<GetAttachmentResponse, String> {
let mut query_params = HashMap::new();
query_params.insert("session".to_string(), serde_json::Value::String(session_id));
attachments: Vec<String>,
) -> Result<Value, String> {
println!("get_attachment_by_ids server_id: {}", server_id);
println!("get_attachment_by_ids attachments: {:?}", attachments);
let response = HttpClient::get(&server_id, "/attachment/_search", Some(query_params))
let request_body = serde_json::json!({
"attachments": attachments
});
let body = reqwest::Body::from(serde_json::to_string(&request_body).unwrap());
let response = HttpClient::post(&server_id, "/attachment/_search", None, Some(body))
.await
.map_err(|e| format!("Request error: {}", e))?;
let body = get_response_body_text(response).await?;
serde_json::from_str::<GetAttachmentResponse>(&body)
serde_json::from_str::<Value>(&body)
.map_err(|e| format!("Failed to parse attachment response: {}", e))
}

View File

@@ -4,7 +4,7 @@ use crate::server::servers::{
get_server_by_id, persist_servers, persist_servers_token, save_access_token, save_server,
try_register_server_to_search_source,
};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
#[allow(dead_code)]
fn request_access_token_url(request_id: &str) -> String {
@@ -13,22 +13,22 @@ fn request_access_token_url(request_id: &str) -> String {
}
#[tauri::command]
pub async fn handle_sso_callback<R: Runtime>(
app_handle: AppHandle<R>,
pub async fn handle_sso_callback(
app_handle: AppHandle,
server_id: String,
request_id: String,
code: String,
) -> Result<(), String> {
// Retrieve the server details using the server ID
let server = get_server_by_id(&server_id);
let server = get_server_by_id(&server_id).await;
let expire_in = 3600; // TODO, need to update to actual expire_in value
if let Some(mut server) = server {
// Save the access token for the server
let access_token = ServerAccessToken::new(server_id.clone(), code.clone(), expire_in);
// dbg!(&server_id, &request_id, &code, &token);
save_access_token(server_id.clone(), access_token);
persist_servers_token(&app_handle)?;
save_access_token(server_id.clone(), access_token).await;
persist_servers_token(&app_handle).await?;
// Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await;
@@ -41,7 +41,7 @@ pub async fn handle_sso_callback<R: Runtime>(
Ok(p) => {
server.profile = Some(p);
server.available = true;
save_server(&server);
save_server(&server).await;
persist_servers(&app_handle).await?;
Ok(())
}

View File

@@ -1,11 +1,12 @@
use crate::common::connector::Connector;
use crate::common::search::parse_search_results;
use crate::server::http_client::HttpClient;
use crate::server::http_client::{HttpClient, status_code_check};
use crate::server::servers::get_all_servers;
use http::StatusCode;
use lazy_static::lazy_static;
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
lazy_static! {
static ref CONNECTOR_CACHE: Arc<RwLock<HashMap<String, HashMap<String, Connector>>>> =
@@ -28,8 +29,8 @@ pub fn get_connector_by_id(server_id: &str, connector_id: &str) -> Option<Connec
Some(connector.clone())
}
pub async fn refresh_all_connectors<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let servers = get_all_servers();
pub async fn refresh_all_connectors(app_handle: &AppHandle) -> Result<(), String> {
let servers = get_all_servers().await;
// Collect all the tasks for fetching and refreshing connectors
let mut server_map = HashMap::new();
@@ -107,6 +108,7 @@ pub async fn fetch_connectors_by_server(id: &str) -> Result<Vec<Connector>, Stri
// dbg!("Error fetching connector for id {}: {}", &id, &e);
format!("Error fetching connector: {}", e)
})?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results directly from the response body
let datasource: Vec<Connector> = parse_search_results(resp)
@@ -120,8 +122,8 @@ pub async fn fetch_connectors_by_server(id: &str) -> Result<Vec<Connector>, Stri
}
#[tauri::command]
pub async fn get_connectors_by_server<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn get_connectors_by_server(
_app_handle: AppHandle,
id: String,
) -> Result<Vec<Connector>, String> {
let connectors = fetch_connectors_by_server(&id).await?;

View File

@@ -1,19 +1,13 @@
use crate::common::datasource::DataSource;
use crate::common::search::parse_search_results;
use crate::server::connector::get_connector_by_id;
use crate::server::http_client::HttpClient;
use crate::server::http_client::{HttpClient, status_code_check};
use crate::server::servers::get_all_servers;
use http::StatusCode;
use lazy_static::lazy_static;
use std::collections::HashMap;
use std::sync::{Arc, RwLock};
use tauri::{AppHandle, Runtime};
#[derive(serde::Deserialize, Debug)]
pub struct GetDatasourcesByServerOptions {
pub from: Option<u32>,
pub size: Option<u32>,
pub query: Option<String>,
}
use tauri::AppHandle;
lazy_static! {
static ref DATASOURCE_CACHE: Arc<RwLock<HashMap<String, HashMap<String, DataSource>>>> =
@@ -37,10 +31,10 @@ pub fn get_datasources_from_cache(server_id: &str) -> Option<HashMap<String, Dat
Some(server_cache.clone())
}
pub async fn refresh_all_datasources<R: Runtime>(_app_handle: &AppHandle<R>) -> Result<(), String> {
pub async fn refresh_all_datasources(_app_handle: &AppHandle) -> Result<(), String> {
// dbg!("Attempting to refresh all datasources");
let servers = get_all_servers();
let servers = get_all_servers().await;
let mut server_map = HashMap::new();
@@ -96,50 +90,17 @@ pub async fn refresh_all_datasources<R: Runtime>(_app_handle: &AppHandle<R>) ->
#[tauri::command]
pub async fn datasource_search(
id: &str,
options: Option<GetDatasourcesByServerOptions>,
query_params: Option<Vec<String>>, //["query=abc", "filter=er", "filter=efg", "from=0", "size=5"],
) -> Result<Vec<DataSource>, String> {
let from = options.as_ref().and_then(|opt| opt.from).unwrap_or(0);
let size = options.as_ref().and_then(|opt| opt.size).unwrap_or(10000);
let query = options
.and_then(|opt| opt.query)
.unwrap_or(String::default());
let mut body = serde_json::json!({
"from": from,
"size": size,
});
if !query.is_empty() {
body["query"] = serde_json::json!({
"bool": {
"must": [{
"query_string": {
"fields": ["combined_fulltext"],
"query": query,
"fuzziness": "AUTO",
"fuzzy_prefix_length": 2,
"fuzzy_max_expansions": 10,
"fuzzy_transpositions": true,
"allow_leading_wildcard": false
}
}]
}
});
}
// Perform the async HTTP request outside the cache lock
let resp = HttpClient::post(
id,
"/datasource/_search",
None,
Some(reqwest::Body::from(body.to_string())),
)
let resp = HttpClient::post(id, "/datasource/_search", query_params, None)
.await
.map_err(|e| format!("Error fetching datasource: {}", e))?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results from the response
let datasources: Vec<DataSource> = parse_search_results(resp).await.map_err(|e| {
dbg!("Error parsing search results: {}", &e);
//dbg!("Error parsing search results: {}", &e);
e.to_string()
})?;
@@ -152,50 +113,17 @@ pub async fn datasource_search(
#[tauri::command]
pub async fn mcp_server_search(
id: &str,
options: Option<GetDatasourcesByServerOptions>,
query_params: Option<Vec<String>>,
) -> Result<Vec<DataSource>, String> {
let from = options.as_ref().and_then(|opt| opt.from).unwrap_or(0);
let size = options.as_ref().and_then(|opt| opt.size).unwrap_or(10000);
let query = options
.and_then(|opt| opt.query)
.unwrap_or(String::default());
let mut body = serde_json::json!({
"from": from,
"size": size,
});
if !query.is_empty() {
body["query"] = serde_json::json!({
"bool": {
"must": [{
"query_string": {
"fields": ["combined_fulltext"],
"query": query,
"fuzziness": "AUTO",
"fuzzy_prefix_length": 2,
"fuzzy_max_expansions": 10,
"fuzzy_transpositions": true,
"allow_leading_wildcard": false
}
}]
}
});
}
// Perform the async HTTP request outside the cache lock
let resp = HttpClient::post(
id,
"/mcp_server/_search",
None,
Some(reqwest::Body::from(body.to_string())),
)
let resp = HttpClient::post(id, "/mcp_server/_search", query_params, None)
.await
.map_err(|e| format!("Error fetching datasource: {}", e))?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results from the response
let mcp_server: Vec<DataSource> = parse_search_results(resp).await.map_err(|e| {
dbg!("Error parsing search results: {}", &e);
//dbg!("Error parsing search results: {}", &e);
e.to_string()
})?;
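
Both `datasource_search` and `mcp_server_search` now take `query_params: Option<Vec<String>>`, where each entry is a `"key=value"` string that the HTTP client later splits on the first `=` (see `get_request_builder` further down). A standalone sketch of that convention, with a hypothetical `split_query_params` helper and sample entries:

```rust
// Hypothetical helper mirroring the split that the HTTP client performs.
fn split_query_params(params: &[String]) -> Vec<(&str, &str)> {
    params.iter().filter_map(|s| s.split_once('=')).collect()
}

fn main() {
    let params = vec![
        "query=abc".to_string(),
        "from=0".to_string(),
        "size=5".to_string(),
        "malformed".to_string(), // entries without '=' are silently dropped
    ];
    assert_eq!(
        split_query_params(&params),
        vec![("query", "abc"), ("from", "0"), ("size", "5")]
    );
}
```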

View File

@@ -1,17 +1,19 @@
use crate::server::servers::{get_server_by_id, get_server_token};
use http::{HeaderName, HeaderValue};
use crate::util::app_lang::get_app_lang;
use crate::util::platform::Platform;
use http::{HeaderName, HeaderValue, StatusCode};
use once_cell::sync::Lazy;
use reqwest::{Client, Method, RequestBuilder};
use std::collections::HashMap;
use std::sync::LazyLock;
use std::time::Duration;
use tauri_plugin_store::JsonValue;
use tokio::sync::Mutex;
pub(crate) fn new_reqwest_http_client(accept_invalid_certs: bool) -> Client {
Client::builder()
.read_timeout(Duration::from_secs(3)) // Set a timeout of 3 second
.connect_timeout(Duration::from_secs(3)) // Set a timeout of 3 second
.timeout(Duration::from_secs(10)) // Set a timeout of 10 seconds
.read_timeout(Duration::from_secs(60)) // Set a read timeout of 60 seconds
.connect_timeout(Duration::from_secs(30)) // Set a connect timeout of 30 seconds
.timeout(Duration::from_secs(5 * 60)) // Set an overall timeout of 5 minutes
.danger_accept_invalid_certs(accept_invalid_certs) // allow self-signed certificates
.build()
.expect("Failed to build client")
@@ -27,6 +29,26 @@ pub static HTTP_CLIENT: Lazy<Mutex<Client>> = Lazy::new(|| {
Mutex::new(new_reqwest_http_client(allow_self_signature))
});
/// These header values won't change during a process's lifetime.
static STATIC_HEADERS: LazyLock<HashMap<String, String>> = LazyLock::new(|| {
HashMap::from([
(
"X-OS-NAME".into(),
Platform::current()
.to_os_name_http_header_str()
.into_owned(),
),
(
"X-OS-VER".into(),
sysinfo::System::os_version()
.expect("sysinfo::System::os_version() should be Some on major systems"),
),
("X-OS-ARCH".into(), sysinfo::System::cpu_arch()),
("X-APP-NAME".into(), "coco-app".into()),
("X-APP-VER".into(), env!("CARGO_PKG_VERSION").into()),
])
});
pub struct HttpClient;
impl HttpClient {
@@ -40,7 +62,7 @@ impl HttpClient {
pub async fn send_raw_request(
method: Method,
url: &str,
query_params: Option<HashMap<String, JsonValue>>,
query_params: Option<Vec<String>>,
headers: Option<HashMap<String, String>>,
body: Option<reqwest::Body>,
) -> Result<reqwest::Response, String> {
@@ -56,7 +78,7 @@ impl HttpClient {
Self::get_request_builder(method, url, headers, query_params, body).await;
let response = request_builder.send().await.map_err(|e| {
dbg!("Failed to send request: {}", &e);
//dbg!("Failed to send request: {}", &e);
format!("Failed to send request: {}", e)
})?;
@@ -74,7 +96,7 @@ impl HttpClient {
method: Method,
url: &str,
headers: Option<HashMap<String, String>>,
query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
query_params: Option<Vec<String>>, // Add query parameters
body: Option<reqwest::Body>,
) -> RequestBuilder {
let client = HTTP_CLIENT.lock().await; // Acquire the lock on HTTP_CLIENT
@@ -82,8 +104,32 @@ impl HttpClient {
// Build the request
let mut request_builder = client.request(method.clone(), url);
// Populate the headers defined by us
let mut req_headers = reqwest::header::HeaderMap::new();
for (key, value) in STATIC_HEADERS.iter() {
let key = HeaderName::from_bytes(key.as_bytes())
.expect("headers defined by us should be valid");
let value = HeaderValue::from_str(value.trim()).unwrap_or_else(|e| {
panic!(
"header value [{}] is invalid, error [{}], this should be unreachable",
value, e
);
});
req_headers.insert(key, value);
}
let app_lang = get_app_lang().await.to_string();
req_headers.insert(
"X-APP-LANG",
HeaderValue::from_str(&app_lang).unwrap_or_else(|e| {
panic!(
"header value [{}] is invalid, error [{}], this should be unreachable",
app_lang, e
);
}),
);
// Headers from the function parameter
if let Some(h) = headers {
let mut req_headers = reqwest::header::HeaderMap::new();
for (key, value) in h.into_iter() {
match (
HeaderName::from_bytes(key.as_bytes()),
@@ -106,24 +152,9 @@ impl HttpClient {
request_builder = request_builder.headers(req_headers);
}
if let Some(query) = query_params {
// Convert only supported value types into strings
let query: HashMap<String, String> = query
.into_iter()
.filter_map(|(k, v)| {
match v {
JsonValue::String(s) => Some((k, s)),
JsonValue::Number(n) => Some((k, n.to_string())),
JsonValue::Bool(b) => Some((k, b.to_string())),
_ => {
dbg!(
"Unsupported query parameter type. Only strings, numbers, and booleans are supported.",k,v,
);
None
} // skip arrays, objects, nulls
}
})
.collect();
if let Some(params) = query_params {
let query: Vec<(&str, &str)> =
params.iter().filter_map(|s| s.split_once('=')).collect();
request_builder = request_builder.query(&query);
}
@@ -140,18 +171,18 @@ impl HttpClient {
method: Method,
path: &str,
custom_headers: Option<HashMap<String, String>>,
query_params: Option<HashMap<String, JsonValue>>,
query_params: Option<Vec<String>>,
body: Option<reqwest::Body>,
) -> Result<reqwest::Response, String> {
// Fetch the server using the server_id
let server = get_server_by_id(server_id);
let server = get_server_by_id(server_id).await;
if let Some(s) = server {
// Construct the URL
let url = HttpClient::join_url(&s.endpoint, path);
// Retrieve the token for the server (token is optional)
let token = get_server_token(server_id)
.await?
.await
.map(|t| t.access_token.clone());
let mut headers = if let Some(custom_headers) = custom_headers {
@@ -165,16 +196,16 @@ impl HttpClient {
headers.insert("X-API-TOKEN".to_string(), t);
}
log::debug!(
"Sending request to server: {}, url: {}, headers: {:?}",
&server_id,
&url,
&headers
);
// log::debug!(
// "Sending request to server: {}, url: {}, headers: {:?}",
// &server_id,
// &url,
// &headers
// );
Self::send_raw_request(method, &url, query_params, Some(headers), body).await
} else {
Err("Server not found".to_string())
Err(format!("Server [{}] not found", server_id))
}
}
@@ -182,7 +213,7 @@ impl HttpClient {
pub async fn get(
server_id: &str,
path: &str,
query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
query_params: Option<Vec<String>>,
) -> Result<reqwest::Response, String> {
HttpClient::send_request(server_id, Method::GET, path, None, query_params, None).await
}
@@ -191,7 +222,7 @@ impl HttpClient {
pub async fn post(
server_id: &str,
path: &str,
query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
query_params: Option<Vec<String>>,
body: Option<reqwest::Body>,
) -> Result<reqwest::Response, String> {
HttpClient::send_request(server_id, Method::POST, path, None, query_params, body).await
@@ -201,7 +232,7 @@ impl HttpClient {
server_id: &str,
path: &str,
custom_headers: Option<HashMap<String, String>>,
query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
query_params: Option<Vec<String>>,
body: Option<reqwest::Body>,
) -> Result<reqwest::Response, String> {
HttpClient::send_request(
@@ -221,7 +252,7 @@ impl HttpClient {
server_id: &str,
path: &str,
custom_headers: Option<HashMap<String, String>>,
query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
query_params: Option<Vec<String>>,
body: Option<reqwest::Body>,
) -> Result<reqwest::Response, String> {
HttpClient::send_request(
@@ -241,7 +272,7 @@ impl HttpClient {
server_id: &str,
path: &str,
custom_headers: Option<HashMap<String, String>>,
query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
query_params: Option<Vec<String>>,
) -> Result<reqwest::Response, String> {
HttpClient::send_request(
server_id,
@@ -254,3 +285,30 @@ impl HttpClient {
.await
}
}
/// Helper function to check status code.
///
/// If the status code is not in the `allowed_status_codes` list, return an error.
pub(crate) fn status_code_check(
response: &reqwest::Response,
allowed_status_codes: &[StatusCode],
) -> Result<(), String> {
let status_code = response.status();
if !allowed_status_codes.contains(&status_code) {
let msg = format!(
"Response of request [{}] status code failed: status code [{}], which is not in the 'allow' list {:?}",
response.url(),
status_code,
allowed_status_codes
.iter()
.map(|status| status.to_string())
.collect::<Vec<String>>()
);
log::warn!("{}", msg);
Err(msg)
} else {
Ok(())
}
}
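
For reference, the allow-list semantics of `status_code_check` can be exercised on bare status codes; a minimal sketch assuming only the `http` crate (the hypothetical `check` function drops the logging and URL reporting of the real helper):

```rust
use http::StatusCode;

// Hypothetical, trimmed-down version of the allow-list check.
fn check(status: StatusCode, allowed: &[StatusCode]) -> Result<(), String> {
    if allowed.contains(&status) {
        Ok(())
    } else {
        Err(format!("status [{status}] is not in the allow list {allowed:?}"))
    }
}

fn main() {
    assert!(check(StatusCode::CREATED, &[StatusCode::OK, StatusCode::CREATED]).is_ok());
    assert!(check(StatusCode::TOO_MANY_REQUESTS, &[StatusCode::OK]).is_err());
}
```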

View File

@@ -8,6 +8,6 @@ pub mod http_client;
pub mod profile;
pub mod search;
pub mod servers;
pub mod synthesize;
pub mod system_settings;
pub mod transcription;
pub mod websocket;

View File

@@ -1,11 +1,11 @@
use crate::common::http::get_response_body_text;
use crate::common::profile::UserProfile;
use crate::server::http_client::HttpClient;
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
#[tauri::command]
pub async fn get_user_profiles<R: Runtime>(
_app_handle: AppHandle<R>,
pub async fn get_user_profiles(
_app_handle: AppHandle,
server_id: String,
) -> Result<UserProfile, String> {
// Use the generic GET method from HttpClient

View File

@@ -1,4 +1,4 @@
use crate::common::document::Document;
use crate::common::document::{Document, OnOpened};
use crate::common::error::SearchError;
use crate::common::http::get_response_body_text;
use crate::common::search::{QueryHits, QueryResponse, QuerySource, SearchQuery, SearchResponse};
@@ -6,11 +6,10 @@ use crate::common::server::Server;
use crate::common::traits::SearchSource;
use crate::server::http_client::HttpClient;
use async_trait::async_trait;
// use futures::stream::StreamExt;
use ordered_float::OrderedFloat;
use reqwest::StatusCode;
use std::collections::HashMap;
use tauri_plugin_store::JsonValue;
// use std::hash::Hash;
use tauri::AppHandle;
#[allow(dead_code)]
pub(crate) struct DocumentsSizedCollector {
@@ -45,7 +44,7 @@ impl DocumentsSizedCollector {
}
}
fn documents(self) -> impl ExactSizeIterator<Item=Document> {
fn documents(self) -> impl ExactSizeIterator<Item = Document> {
self.docs.into_iter().map(|(_, doc, _)| doc)
}
@@ -91,41 +90,74 @@ impl SearchSource for CocoSearchSource {
}
}
async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let url = "/query/_search";
let mut total_hits = 0;
let mut hits: Vec<(Document, f64)> = Vec::new();
let mut query_args: HashMap<String, JsonValue> = HashMap::new();
query_args.insert("from".into(), JsonValue::Number(query.from.into()));
query_args.insert("size".into(), JsonValue::Number(query.size.into()));
let mut query_params = Vec::new();
// Add from/size as number values
query_params.push(format!("from={}", query.from));
query_params.push(format!("size={}", query.size));
// Add query strings
for (key, value) in query.query_strings {
query_args.insert(key, JsonValue::String(value));
query_params.push(format!("{}={}", key, value));
}
let response = HttpClient::get(
&self.server.id,
&url,
Some(query_args),
)
let response = HttpClient::get(&self.server.id, &url, Some(query_params))
.await
.map_err(|e| SearchError::HttpError(format!("Error to send search request: {}", e)))?;
.map_err(|e| SearchError::HttpError {
status_code: None,
msg: format!("{}", e),
})?;
let status_code = response.status();
if ![StatusCode::OK, StatusCode::CREATED].contains(&status_code) {
return Err(SearchError::HttpError {
status_code: Some(status_code),
msg: format!("Request failed with status code [{}]", status_code),
});
}
// Use the helper function to parse the response body
let response_body = get_response_body_text(response)
.await
.map_err(|e| SearchError::ParseError(format!("Failed to read response body: {}", e)))?;
.map_err(|e| SearchError::ParseError(e))?;
// Parse the search response from the body text
let parsed: SearchResponse<Document> = serde_json::from_str(&response_body)
.map_err(|e| SearchError::ParseError(format!("Failed to parse search response: {}", e)))?;
// Check if the response body is empty
if !response_body.is_empty() {
// log::info!("Search response body: {}", &response_body);
// Process the parsed response
let total_hits = parsed.hits.total.value as usize;
let hits: Vec<(Document, f64)> = parsed
.hits
.hits
.into_iter()
.map(|hit| (hit._source, hit._score.unwrap_or(0.0))) // Default _score to 0.0 if None
.collect();
// Parse the search response from the body text
let parsed: SearchResponse<Document> = serde_json::from_str(&response_body)
.map_err(|e| SearchError::ParseError(format!("{}", e)))?;
// Process the parsed response
total_hits = parsed.hits.total.value as usize;
if let Some(items) = parsed.hits.hits {
for hit in items {
let mut document = hit._source;
// Default _score to 0.0 if None
let score = hit._score.unwrap_or(0.0);
let on_opened = document
.url
.as_ref()
.map(|url| OnOpened::Document { url: url.clone() });
// Set the `on_opened` field as it won't be returned from Coco server
document.on_opened = on_opened;
hits.push((document, score));
}
}
}
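
The hit-processing loop above defaults a missing `_score` to 0.0 and derives `on_opened` from the document URL before collecting the `(Document, score)` pairs. A hedged sketch of that transformation, using hypothetical stand-ins for `Document`, `OnOpened`, and the hit type (the real definitions live in `crate::common`):

```rust
// Hypothetical stand-ins; field names follow the search-response shape above.
#[derive(Clone, Debug)]
enum OnOpened {
    Document { url: String },
}

#[derive(Clone, Debug)]
struct Document {
    url: Option<String>,
    on_opened: Option<OnOpened>,
}

struct Hit {
    _source: Document,
    _score: Option<f64>,
}

fn collect_hits(items: Vec<Hit>) -> Vec<(Document, f64)> {
    let mut hits = Vec::with_capacity(items.len());
    for hit in items {
        let mut document = hit._source;
        let score = hit._score.unwrap_or(0.0); // default missing scores to 0.0
        document.on_opened = document
            .url
            .as_ref()
            .map(|url| OnOpened::Document { url: url.clone() });
        hits.push((document, score));
    }
    hits
}

fn main() {
    let items = vec![Hit {
        _source: Document { url: Some("https://example.com/doc".into()), on_opened: None },
        _score: None,
    }];
    println!("{:?}", collect_hits(items)[0]);
}
```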
// Return the final result
Ok(QueryResponse {

View File

@@ -1,3 +1,4 @@
use crate::COCO_TAURI_STORE;
use crate::common::http::get_response_body_text;
use crate::common::register::SearchSourceRegistry;
use crate::common::server::{AuthProvider, Provider, Server, ServerAccessToken, Sso, Version};
@@ -5,68 +6,71 @@ use crate::server::connector::fetch_connectors_by_server;
use crate::server::datasource::datasource_search;
use crate::server::http_client::HttpClient;
use crate::server::search::CocoSearchSource;
use crate::COCO_TAURI_STORE;
use lazy_static::lazy_static;
use function_name;
use http::StatusCode;
use reqwest::Method;
use serde_json::from_value;
use serde_json::Value as JsonValue;
use serde_json::from_value;
use std::collections::HashMap;
use std::sync::Arc;
use std::sync::RwLock;
use tauri::Runtime;
use std::sync::LazyLock;
use tauri::{AppHandle, Manager};
use tauri_plugin_store::StoreExt;
// Assuming you're using serde_json
use tokio::sync::RwLock;
lazy_static! {
static ref SERVER_CACHE: Arc<RwLock<HashMap<String, Server>>> =
Arc::new(RwLock::new(HashMap::new()));
static ref SERVER_TOKEN: Arc<RwLock<HashMap<String, ServerAccessToken>>> =
Arc::new(RwLock::new(HashMap::new()));
}
/// Coco server list
static SERVER_LIST_CACHE: LazyLock<RwLock<HashMap<String, Server>>> =
LazyLock::new(|| RwLock::new(HashMap::new()));
#[allow(dead_code)]
fn check_server_exists(id: &str) -> bool {
let cache = SERVER_CACHE.read().unwrap(); // Acquire read lock
cache.contains_key(id)
}
/// If a server has a token stored here that has not expired, it is considered logged in.
///
/// Since the `expire_at` field of `struct ServerAccessToken` is currently unused,
/// all servers stored here are treated as logged in.
static SERVER_TOKEN_LIST_CACHE: LazyLock<RwLock<HashMap<String, ServerAccessToken>>> =
LazyLock::new(|| RwLock::new(HashMap::new()));
pub fn get_server_by_id(id: &str) -> Option<Server> {
let cache = SERVER_CACHE.read().unwrap(); // Acquire read lock
/// `SERVER_LIST_CACHE` will be stored in KV store COCO_TAURI_STORE, under this key.
pub const COCO_SERVERS: &str = "coco_servers";
/// `SERVER_TOKEN_LIST_CACHE` will be stored in KV store COCO_TAURI_STORE, under this key.
const COCO_SERVER_TOKENS: &str = "coco_server_tokens";
pub async fn get_server_by_id(id: &str) -> Option<Server> {
let cache = SERVER_LIST_CACHE.read().await;
cache.get(id).cloned()
}
#[tauri::command]
pub async fn get_server_token(id: &str) -> Result<Option<ServerAccessToken>, String> {
let cache = SERVER_TOKEN.read().map_err(|err| err.to_string())?;
pub async fn get_server_token(id: &str) -> Option<ServerAccessToken> {
let cache = SERVER_TOKEN_LIST_CACHE.read().await;
Ok(cache.get(id).cloned())
cache.get(id).cloned()
}
pub fn save_access_token(server_id: String, token: ServerAccessToken) -> bool {
let mut cache = SERVER_TOKEN.write().unwrap();
pub async fn save_access_token(server_id: String, token: ServerAccessToken) -> bool {
let mut cache = SERVER_TOKEN_LIST_CACHE.write().await;
cache.insert(server_id, token).is_none()
}
fn check_endpoint_exists(endpoint: &str) -> bool {
let cache = SERVER_CACHE.read().unwrap();
async fn check_endpoint_exists(endpoint: &str) -> bool {
let cache = SERVER_LIST_CACHE.read().await;
cache.values().any(|server| server.endpoint == endpoint)
}
pub fn save_server(server: &Server) -> bool {
let mut cache = SERVER_CACHE.write().unwrap();
cache.insert(server.id.clone(), server.clone()).is_none() // If the server id did not exist, `insert` will return `None`
/// Return true if `server` does not exist in the server list, i.e., it is a newly-added
/// server.
pub async fn save_server(server: &Server) -> bool {
let mut cache = SERVER_LIST_CACHE.write().await;
cache.insert(server.id.clone(), server.clone()).is_none()
}
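
The server and token caches are now process-wide maps guarded by `tokio::sync::RwLock` inside `LazyLock`, which is why `get_server_by_id`, `save_server`, and friends became `async`. A minimal standalone sketch of the pattern, with a hypothetical `Server` stand-in and tokio (with the `rt`/`macros`/`sync` features) assumed:

```rust
use std::collections::HashMap;
use std::sync::LazyLock;
use tokio::sync::RwLock;

// Hypothetical stand-in for the real `Server` type.
#[derive(Clone, Debug)]
struct Server {
    id: String,
    endpoint: String,
}

static SERVER_LIST_CACHE: LazyLock<RwLock<HashMap<String, Server>>> =
    LazyLock::new(|| RwLock::new(HashMap::new()));

// Returns true if the server was newly added, mirroring the semantics noted above.
async fn save_server(server: &Server) -> bool {
    let mut cache = SERVER_LIST_CACHE.write().await;
    cache.insert(server.id.clone(), server.clone()).is_none()
}

async fn get_server_by_id(id: &str) -> Option<Server> {
    SERVER_LIST_CACHE.read().await.get(id).cloned()
}

#[tokio::main]
async fn main() {
    let server = Server { id: "s1".into(), endpoint: "https://example.com".into() };
    assert!(save_server(&server).await); // first insert: newly added
    assert!(!save_server(&server).await); // second insert: already known
    assert!(get_server_by_id("s1").await.is_some());
}
```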
fn remove_server_by_id(id: String) -> bool {
dbg!("remove server by id:", &id);
let mut cache = SERVER_CACHE.write().unwrap();
let deleted = cache.remove(id.as_str());
deleted.is_some()
/// Return the removed `Server` if it exists in the server list.
async fn remove_server_by_id(id: &str) -> Option<Server> {
log::debug!("remove server by id: {}", &id);
let mut cache = SERVER_LIST_CACHE.write().await;
cache.remove(id)
}
pub async fn persist_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let cache = SERVER_CACHE.read().unwrap(); // Acquire a read lock, not a write lock, since you're not modifying the cache
pub async fn persist_servers(app_handle: &AppHandle) -> Result<(), String> {
let cache = SERVER_LIST_CACHE.read().await;
// Convert HashMap to Vec for serialization (iterating over values of HashMap)
let servers: Vec<Server> = cache.values().cloned().collect();
@@ -86,14 +90,16 @@ pub async fn persist_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<()
Ok(())
}
pub fn remove_server_token(id: &str) -> bool {
dbg!("remove server token by id:", &id);
let mut cache = SERVER_TOKEN.write().unwrap();
/// Return true if the server token of the server specified by `id` exists in
/// the token list and gets deleted.
pub async fn remove_server_token(id: &str) -> bool {
log::debug!("remove server token by id: {}", &id);
let mut cache = SERVER_TOKEN_LIST_CACHE.write().await;
cache.remove(id).is_some()
}
pub fn persist_servers_token<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let cache = SERVER_TOKEN.read().unwrap(); // Acquire a read lock, not a write lock, since you're not modifying the cache
pub async fn persist_servers_token(app_handle: &AppHandle) -> Result<(), String> {
let cache = SERVER_TOKEN_LIST_CACHE.read().await;
// Convert HashMap to Vec for serialization (iterating over values of HashMap)
let servers: Vec<ServerAccessToken> = cache.values().cloned().collect();
@@ -104,7 +110,7 @@ pub fn persist_servers_token<R: Runtime>(app_handle: &AppHandle<R>) -> Result<()
.map(|server| serde_json::to_value(server).expect("Failed to serialize access_tokens")) // Automatically serialize all fields
.collect();
dbg!(format!("persist servers token: {:?}", &json_servers));
log::debug!("persist servers token: {:?}", &json_servers);
// Save the serialized servers to Tauri's store
app_handle
@@ -143,17 +149,16 @@ fn get_default_server() -> Server {
profile: None,
auth_provider: AuthProvider {
sso: Sso {
url: "https://coco.infini.cloud/sso/login/".to_string(),
url: "https://coco.infini.cloud/sso/login/cloud?provider=coco-cloud&product=coco".to_string(),
},
},
priority: 0,
stats: None,
}
}
pub async fn load_servers_token<R: Runtime>(
app_handle: &AppHandle<R>,
) -> Result<Vec<ServerAccessToken>, String> {
dbg!("Attempting to load servers token");
pub async fn load_servers_token(app_handle: &AppHandle) -> Result<Vec<ServerAccessToken>, String> {
log::debug!("Attempting to load servers token");
let store = app_handle
.store(COCO_TAURI_STORE)
@@ -172,33 +177,46 @@ pub async fn load_servers_token<R: Runtime>(
servers.ok_or_else(|| "Failed to read servers from store: No servers found".to_string())?;
// Convert each item in the JsonValue array to a Server
if let JsonValue::Array(servers_array) = servers {
// Deserialize each JsonValue into Server, filtering out any errors
let deserialized_tokens: Vec<ServerAccessToken> = servers_array
.into_iter()
.filter_map(|server_json| from_value(server_json).ok()) // Only keep valid Server instances
.collect();
match servers {
JsonValue::Array(servers_array) => {
let mut deserialized_tokens: Vec<ServerAccessToken> =
Vec::with_capacity(servers_array.len());
for server_json in servers_array {
match from_value(server_json.clone()) {
Ok(token) => {
deserialized_tokens.push(token);
}
Err(e) => {
panic!(
"failed to deserialize JSON [{}] to [struct ServerAccessToken], error [{}], store [{}] key [{}] is possibly corrupted!",
server_json, e, COCO_TAURI_STORE, COCO_SERVER_TOKENS
);
}
}
}
if deserialized_tokens.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
if deserialized_tokens.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
}
for server in deserialized_tokens.iter() {
save_access_token(server.id.clone(), server.clone()).await;
}
log::debug!("loaded {:?} servers's token", &deserialized_tokens.len());
Ok(deserialized_tokens)
}
for server in deserialized_tokens.iter() {
save_access_token(server.id.clone(), server.clone());
_ => {
unreachable!(
"coco server tokens should be stored in an array under store [{}] key [{}], but it is not",
COCO_TAURI_STORE, COCO_SERVER_TOKENS
);
}
dbg!(format!(
"loaded {:?} servers's token",
&deserialized_tokens.len()
));
Ok(deserialized_tokens)
} else {
Err("Failed to read servers from store: Invalid format".to_string())
}
}
pub async fn load_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<Server>, String> {
pub async fn load_servers(app_handle: &AppHandle) -> Result<Vec<Server>, String> {
let store = app_handle
.store(COCO_TAURI_STORE)
.expect("create or load a store should not fail");
@@ -216,91 +234,89 @@ pub async fn load_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<S
servers.ok_or_else(|| "Failed to read servers from store: No servers found".to_string())?;
// Convert each item in the JsonValue array to a Server
if let JsonValue::Array(servers_array) = servers {
// Deserialize each JsonValue into Server, filtering out any errors
let deserialized_servers: Vec<Server> = servers_array
.into_iter()
.filter_map(|server_json| from_value(server_json).ok()) // Only keep valid Server instances
.collect();
match servers {
JsonValue::Array(servers_array) => {
let mut deserialized_servers = Vec::with_capacity(servers_array.len());
for server_json in servers_array {
match from_value(server_json.clone()) {
Ok(server) => {
deserialized_servers.push(server);
}
Err(e) => {
panic!(
"failed to deserialize JSON [{}] to [struct Server], error [{}], store [{}] key [{}] is possibly corrupted!",
server_json, e, COCO_TAURI_STORE, COCO_SERVERS
);
}
}
}
if deserialized_servers.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
if deserialized_servers.is_empty() {
return Err("Failed to deserialize any servers from the store.".to_string());
}
for server in deserialized_servers.iter() {
save_server(&server).await;
}
log::debug!("load servers: {:?}", &deserialized_servers);
Ok(deserialized_servers)
}
for server in deserialized_servers.iter() {
save_server(&server);
_ => {
unreachable!(
"coco servers should be stored in an array under store [{}] key [{}], but it is not",
COCO_TAURI_STORE, COCO_SERVERS
);
}
// dbg!(format!("load servers: {:?}", &deserialized_servers));
Ok(deserialized_servers)
} else {
Err("Failed to read servers from store: Invalid format".to_string())
}
}
/// Function to load servers or insert a default one if none exist
pub async fn load_or_insert_default_server<R: Runtime>(
app_handle: &AppHandle<R>,
) -> Result<Vec<Server>, String> {
dbg!("Attempting to load or insert default server");
pub async fn load_or_insert_default_server(app_handle: &AppHandle) -> Result<Vec<Server>, String> {
log::debug!("Attempting to load or insert default server");
let exists_servers = load_servers(&app_handle).await;
if exists_servers.is_ok() && !exists_servers.as_ref()?.is_empty() {
dbg!(format!("loaded {} servers", &exists_servers.clone()?.len()));
log::debug!("loaded {} servers", &exists_servers.clone()?.len());
return exists_servers;
}
let default = get_default_server();
save_server(&default);
save_server(&default).await;
dbg!("loaded default servers");
log::debug!("loaded default servers");
Ok(vec![default])
}
#[tauri::command]
pub async fn list_coco_servers<R: Runtime>(
_app_handle: AppHandle<R>,
) -> Result<Vec<Server>, String> {
pub async fn list_coco_servers(app_handle: AppHandle) -> Result<Vec<Server>, String> {
// hard refresh all servers' info to get their actual health
refresh_all_coco_server_info(_app_handle.clone()).await;
refresh_all_coco_server_info(app_handle.clone()).await;
let servers: Vec<Server> = get_all_servers().await;
let servers: Vec<Server> = get_all_servers();
Ok(servers)
}
#[allow(dead_code)]
pub fn get_servers_as_hashmap() -> HashMap<String, Server> {
let cache = SERVER_CACHE.read().unwrap();
cache.clone()
}
pub fn get_all_servers() -> Vec<Server> {
let cache = SERVER_CACHE.read().unwrap();
pub async fn get_all_servers() -> Vec<Server> {
let cache = SERVER_LIST_CACHE.read().await;
cache.values().cloned().collect()
}
/// We store added Coco servers in the Tauri store using this key.
pub const COCO_SERVERS: &str = "coco_servers";
const COCO_SERVER_TOKENS: &str = "coco_server_tokens";
pub async fn refresh_all_coco_server_info<R: Runtime>(app_handle: AppHandle<R>) {
let servers = get_all_servers();
pub async fn refresh_all_coco_server_info(app_handle: AppHandle) {
let servers = get_all_servers().await;
for server in servers {
let _ = refresh_coco_server_info(app_handle.clone(), server.id.clone()).await;
}
}
#[tauri::command]
pub async fn refresh_coco_server_info<R: Runtime>(
app_handle: AppHandle<R>,
id: String,
) -> Result<Server, String> {
pub async fn refresh_coco_server_info(app_handle: AppHandle, id: String) -> Result<Server, String> {
// Retrieve the server from the cache
let cached_server = {
let cache = SERVER_CACHE.read().unwrap();
let cache = SERVER_LIST_CACHE.read().await;
cache.get(&id).cloned()
};
@@ -315,12 +331,16 @@ pub async fn refresh_coco_server_info<R: Runtime>(
let profile = server.profile;
// Send request to fetch updated server info
let response = HttpClient::get(&id, "/provider/_info", None)
.await
.map_err(|e| format!("Failed to contact the server: {}", e))?;
let response = match HttpClient::get(&id, "/provider/_info", None).await {
Ok(response) => response,
Err(e) => {
mark_server_as_offline(app_handle, &id).await;
return Err(e);
}
};
if !response.status().is_success() {
mark_server_as_offline(&id).await;
mark_server_as_offline(app_handle, &id).await;
return Err(format!("Request failed with status: {}", response.status()));
}
@@ -335,12 +355,22 @@ pub async fn refresh_coco_server_info<R: Runtime>(
updated_server.id = id.clone();
updated_server.builtin = is_builtin;
updated_server.enabled = is_enabled;
updated_server.available = true;
updated_server.available = {
if server.public {
// Public Coco servers are available as long as they are online.
true
} else {
// For non-public Coco servers, we still need to check if it is
// logged in, i.e., has a token stored in `SERVER_TOKEN_LIST_CACHE`.
get_server_token(&id).await.is_some()
}
};
updated_server.profile = profile;
trim_endpoint_last_forward_slash(&mut updated_server);
// Save and persist
save_server(&updated_server);
save_server(&updated_server).await;
try_register_server_to_search_source(app_handle.clone(), &updated_server).await;
persist_servers(&app_handle)
.await
.map_err(|e| format!("Failed to persist servers: {}", e))?;
@@ -353,21 +383,18 @@ pub async fn refresh_coco_server_info<R: Runtime>(
}
#[tauri::command]
pub async fn add_coco_server<R: Runtime>(
app_handle: AppHandle<R>,
endpoint: String,
) -> Result<Server, String> {
pub async fn add_coco_server(app_handle: AppHandle, endpoint: String) -> Result<Server, String> {
load_or_insert_default_server(&app_handle)
.await
.map_err(|e| format!("Failed to load default servers: {}", e))?;
let endpoint = endpoint.trim_end_matches('/');
if check_endpoint_exists(endpoint) {
dbg!(format!(
"This Coco server has already been registered: {:?}",
&endpoint
));
if check_endpoint_exists(endpoint).await {
log::debug!(
"trying to register a Coco server [{}] that has already been registered",
endpoint
);
return Err("This Coco server has already been registered.".into());
}
@@ -376,7 +403,16 @@ pub async fn add_coco_server<R: Runtime>(
.await
.map_err(|e| format!("Failed to send request to the server: {}", e))?;
dbg!(format!("Get provider info response: {:?}", &response));
log::debug!("Get provider info response: {:?}", &response);
if response.status() != StatusCode::OK {
log::debug!(
"trying to register a Coco server [{}] that is possibly down",
endpoint
);
return Err("This Coco server is possibly down".into());
}
let body = get_response_body_text(response).await?;
@@ -385,158 +421,255 @@ pub async fn add_coco_server<R: Runtime>(
trim_endpoint_last_forward_slash(&mut server);
// The JSON returned from `provider/_info` won't have this field; serde will set
// it to an empty string during deserialization, so we need to set a valid value here.
if server.id.is_empty() {
server.id = pizza_common::utils::uuid::Uuid::new().to_string();
}
// Use the default name, if it is not set.
if server.name.is_empty() {
server.name = "Coco Server".to_string();
}
save_server(&server);
// Update the `available` field
if server.public {
// Serde already sets this to true, but just to make the code clear, do it again.
server.available = true;
} else {
let opt_token = get_server_token(&server.id).await;
assert!(
opt_token.is_none(),
"this Coco server is newly-added, we should have no token stored for it!"
);
// This is a non-public Coco server, and it is not logged in, so it is unavailable.
server.available = false;
}
save_server(&server).await;
try_register_server_to_search_source(app_handle.clone(), &server).await;
persist_servers(&app_handle)
.await
.map_err(|e| format!("Failed to persist Coco servers: {}", e))?;
dbg!(format!("Successfully registered server: {:?}", &endpoint));
log::debug!("Successfully registered server: {:?}", &endpoint);
Ok(server)
}
#[tauri::command]
pub async fn remove_coco_server<R: Runtime>(
app_handle: AppHandle<R>,
id: String,
) -> Result<(), ()> {
#[function_name::named]
pub async fn remove_coco_server(app_handle: AppHandle, id: String) -> Result<(), ()> {
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
remove_server_token(id.as_str());
remove_server_by_id(id);
let opt_server = remove_server_by_id(id.as_str()).await;
let Some(server) = opt_server else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
persist_servers(&app_handle)
.await
.expect("failed to save servers");
persist_servers_token(&app_handle).expect("failed to save server tokens");
// Only non-public Coco servers require tokens
if !server.public {
// If it is logged in, clear the token as well.
let deleted = remove_server_token(id.as_str()).await;
if deleted {
persist_servers_token(&app_handle)
.await
.expect("failed to save server tokens");
}
}
Ok(())
}
#[tauri::command]
pub async fn enable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
println!("enable_server: {}", id);
#[function_name::named]
pub async fn enable_server(app_handle: AppHandle, id: String) -> Result<(), ()> {
let opt_server = get_server_by_id(id.as_str()).await;
let server = get_server_by_id(id.as_str());
if let Some(mut server) = server {
server.enabled = true;
save_server(&server);
let Some(mut server) = opt_server else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
// Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await;
server.enabled = true;
save_server(&server).await;
persist_servers(&app_handle)
.await
.expect("failed to save servers");
}
// Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await;
persist_servers(&app_handle)
.await
.expect("failed to save servers");
Ok(())
}
pub async fn try_register_server_to_search_source(
app_handle: AppHandle<impl Runtime>,
server: &Server,
) {
#[tauri::command]
#[function_name::named]
pub async fn disable_server(app_handle: AppHandle, id: String) -> Result<(), ()> {
let opt_server = get_server_by_id(id.as_str()).await;
let Some(mut server) = opt_server else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
server.enabled = false;
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
save_server(&server).await;
persist_servers(&app_handle)
.await
.expect("failed to save servers");
Ok(())
}
/// For public Coco servers, we add them to the search source as long as they are
/// enabled.
///
/// For non-public Coco servers, an extra token (i.e., a login) is required.
pub async fn try_register_server_to_search_source(app_handle: AppHandle, server: &Server) {
if server.enabled {
log::trace!(
"Server [name: {}, id: {}] is public: {} and available: {}",
&server.name,
&server.id,
&server.public,
&server.available
);
if !server.public {
let opt_token = get_server_token(&server.id).await;
if opt_token.is_none() {
log::debug!("Server {} is not public and no token was found", &server.id);
return;
}
}
let registry = app_handle.state::<SearchSourceRegistry>();
let source = CocoSearchSource::new(server.clone());
registry.register_source(source).await;
}
}
pub async fn mark_server_as_offline(id: &str) {
// println!("server_is_offline: {}", id);
let server = get_server_by_id(id);
#[function_name::named]
#[allow(unused)]
async fn mark_server_as_online(app_handle: AppHandle, id: &str) {
let server = get_server_by_id(id).await;
if let Some(mut server) = server {
server.available = true;
server.health = None;
save_server(&server).await;
try_register_server_to_search_source(app_handle.clone(), &server).await;
} else {
log::warn!(
"[{}()] invoked with a server [{}] that does not exist!",
function_name!(),
id
);
}
}
#[function_name::named]
pub(crate) async fn mark_server_as_offline(app_handle: AppHandle, id: &str) {
let server = get_server_by_id(id).await;
if let Some(mut server) = server {
server.available = false;
server.health = None;
save_server(&server);
}
}
#[tauri::command]
pub async fn disable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
println!("disable_server: {}", id);
let server = get_server_by_id(id.as_str());
if let Some(mut server) = server {
server.enabled = false;
save_server(&server).await;
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
save_server(&server);
persist_servers(&app_handle)
.await
.expect("failed to save servers");
registry.remove_source(id).await;
} else {
log::warn!(
"[{}()] invoked with a server [{}] that does not exist!",
function_name!(),
id
);
}
Ok(())
}
#[tauri::command]
pub async fn logout_coco_server<R: Runtime>(
app_handle: AppHandle<R>,
id: String,
) -> Result<(), String> {
dbg!("Attempting to log out server by id:", &id);
// Check if server token exists
if let Some(_token) = get_server_token(id.as_str()).await? {
dbg!("Found server token for id:", &id);
// Remove the server token from cache
remove_server_token(id.as_str());
// Persist the updated tokens
if let Err(e) = persist_servers_token(&app_handle) {
dbg!("Failed to save tokens for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save tokens: {}", &e));
}
} else {
// Log the case where server token is not found
dbg!("No server token found for id: {}", &id);
}
#[function_name::named]
pub async fn logout_coco_server(app_handle: AppHandle, id: String) -> Result<(), String> {
log::debug!("Attempting to log out server by id: {}", &id);
// Check if the server exists
if let Some(mut server) = get_server_by_id(id.as_str()) {
dbg!("Found server for id:", &id);
let Some(mut server) = get_server_by_id(id.as_str()).await else {
panic!(
"[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
function_name!(),
id
);
};
// Clear server profile
server.profile = None;
// Save the updated server data
save_server(&server);
// Persist the updated server data
if let Err(e) = persist_servers(&app_handle).await {
dbg!("Failed to save server for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save server: {}", &e));
}
} else {
// Log the case where server is not found
dbg!("No server found for id: {}", &id);
return Err(format!("No server found for id: {}", id));
// Clear server profile
server.profile = None;
// Logging out from a non-public Coco server makes it unavailable
if !server.public {
server.available = false;
}
// Save the updated server data
save_server(&server).await;
// Persist the updated server data
if let Err(e) = persist_servers(&app_handle).await {
log::debug!("Failed to save server for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save server: {}", &e));
}
dbg!("Successfully logged out server with id:", &id);
let has_token = get_server_token(id.as_str()).await.is_some();
if server.public {
if has_token {
panic!("Public Coco server won't have token")
}
} else {
assert!(
has_token,
"This is a non-public Coco server, and it is logged in, we should have a token"
);
// Remove the server token from cache
remove_server_token(id.as_str()).await;
// Persist the updated tokens
if let Err(e) = persist_servers_token(&app_handle).await {
log::debug!("Failed to save tokens for id: {}. Error: {:?}", &id, &e);
return Err(format!("Failed to save tokens: {}", &e));
}
}
// Remove it from the search source if it becomes unavailable
if !server.available {
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(id.as_str()).await;
}
log::debug!("Successfully logged out server with id: {}", &id);
Ok(())
}
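
Taken together, `refresh_coco_server_info`, `add_coco_server`, and `logout_coco_server` now apply one availability rule: a public server is available whenever it is reachable, while a non-public server must additionally be logged in (have a stored token). A tiny sketch of just that predicate; the `is_available` name and its boolean inputs are illustrative only:

```rust
// Illustrative predicate only; the real logic lives across the commands above.
fn is_available(public: bool, online: bool, has_token: bool) -> bool {
    online && (public || has_token)
}

fn main() {
    assert!(is_available(true, true, false)); // public + reachable
    assert!(!is_available(false, true, false)); // non-public, not logged in
    assert!(is_available(false, true, true)); // non-public + logged in
    assert!(!is_available(true, false, false)); // offline
}
```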
/// Removes the trailing slash from the server's endpoint if present.
/// Helper function to remove any trailing slashes from the server's endpoint.
fn trim_endpoint_last_forward_slash(server: &mut Server) {
if server.endpoint.ends_with('/') {
server.endpoint.pop(); // Remove the last character
while server.endpoint.ends_with('/') {
server.endpoint.pop();
}
let endpoint = &mut server.endpoint;
while endpoint.ends_with('/') {
endpoint.pop();
}
}
@@ -545,41 +678,47 @@ fn provider_info_url(endpoint: &str) -> String {
format!("{endpoint}/provider/_info")
}
#[test]
fn test_trim_endpoint_last_forward_slash() {
let mut server = Server {
id: "test".to_string(),
builtin: false,
enabled: true,
name: "".to_string(),
endpoint: "https://example.com///".to_string(),
provider: Provider {
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_trim_endpoint_last_forward_slash() {
let mut server = Server {
id: "test".to_string(),
builtin: false,
enabled: true,
name: "".to_string(),
icon: "".to_string(),
website: "".to_string(),
eula: "".to_string(),
privacy_policy: "".to_string(),
banner: "".to_string(),
description: "".to_string(),
},
version: Version {
number: "".to_string(),
},
minimal_client_version: None,
updated: "".to_string(),
public: false,
available: false,
health: None,
profile: None,
auth_provider: AuthProvider {
sso: Sso {
url: "".to_string(),
endpoint: "https://example.com///".to_string(),
provider: Provider {
name: "".to_string(),
icon: "".to_string(),
website: "".to_string(),
eula: "".to_string(),
privacy_policy: "".to_string(),
banner: "".to_string(),
description: "".to_string(),
},
},
priority: 0,
};
version: Version {
number: "".to_string(),
},
minimal_client_version: None,
updated: "".to_string(),
public: false,
available: false,
health: None,
profile: None,
auth_provider: AuthProvider {
sso: Sso {
url: "".to_string(),
},
},
priority: 0,
stats: None,
};
trim_endpoint_last_forward_slash(&mut server);
trim_endpoint_last_forward_slash(&mut server);
assert_eq!(server.endpoint, "https://example.com");
assert_eq!(server.endpoint, "https://example.com");
}
}

View File

@@ -0,0 +1,57 @@
use crate::server::http_client::HttpClient;
use futures_util::StreamExt;
use http::Method;
use serde_json::json;
use tauri::{AppHandle, Emitter, command};
#[command]
pub async fn synthesize(
app_handle: AppHandle,
client_id: String,
server_id: String,
voice: String,
content: String,
) -> Result<(), String> {
let body = json!({
"voice": voice,
"content": content,
})
.to_string();
let response = HttpClient::send_request(
server_id.as_str(),
Method::POST,
"/services/audio/synthesize",
None,
None,
Some(reqwest::Body::from(body)),
)
.await?;
log::info!("Synthesize response status: {}", response.status());
if response.status() == 429 {
return Ok(());
}
if !response.status().is_success() {
return Err(format!("Request Failed: {}", response.status()));
}
let mut stream = response.bytes_stream();
while let Some(chunk) = stream.next().await {
match chunk {
Ok(bytes) => {
if let Err(err) = app_handle.emit(&client_id, bytes.to_vec()) {
log::error!("Emit error: {:?}", err);
}
}
Err(e) => {
log::error!("Stream error: {:?}", e);
break;
}
}
}
Ok(())
}
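
The new `synthesize` command streams the audio response chunk by chunk and forwards each chunk to the frontend with `app_handle.emit(&client_id, ...)`. A hedged, self-contained sketch of that forwarding loop, with the Tauri emit replaced by a plain callback and the HTTP byte stream replaced by an in-memory stream (futures-util and tokio assumed; `forward_chunks` is a hypothetical name):

```rust
use futures_util::{StreamExt, stream};

// Hypothetical stand-in for the emit-per-chunk loop; `emit` replaces
// `app_handle.emit(&client_id, bytes.to_vec())`.
async fn forward_chunks<F: FnMut(Vec<u8>)>(mut emit: F) {
    // Stand-in for `response.bytes_stream()`; each item mimics an audio chunk.
    let mut chunks = stream::iter(vec![Ok::<_, String>(vec![1u8, 2, 3]), Ok(vec![4, 5])]);
    while let Some(chunk) = chunks.next().await {
        match chunk {
            Ok(bytes) => emit(bytes),
            Err(e) => {
                eprintln!("stream error: {e}");
                break;
            }
        }
    }
}

#[tokio::main]
async fn main() {
    forward_chunks(|bytes| println!("emitting {} bytes", bytes.len())).await;
}
```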

View File

@@ -1,43 +1,96 @@
use crate::common::http::get_response_body_text;
use crate::server::http_client::HttpClient;
use serde::{Deserialize, Serialize};
use serde_json::Value as JsonValue;
use std::collections::HashMap;
use serde_json::{Value, from_str};
use tauri::command;
#[derive(Debug, Serialize, Deserialize)]
pub struct TranscriptionResponse {
pub text: String,
task_id: String,
results: Vec<Value>,
}
#[command]
pub async fn transcription(
server_id: String,
audio_type: String,
audio_content: String,
) -> Result<TranscriptionResponse, String> {
let mut query_params = HashMap::new();
query_params.insert("type".to_string(), JsonValue::String(audio_type));
query_params.insert("content".to_string(), JsonValue::String(audio_content));
// Send the HTTP POST request
let response = HttpClient::post(
// Send request to initiate transcription task
let init_response = HttpClient::post(
&server_id,
"/services/audio/transcription",
Some(query_params),
None,
Some(audio_content.into()),
)
.await
.map_err(|e| format!("Error sending transcription request: {}", e))?;
.await
.map_err(|e| format!("Failed to initiate transcription: {}", e))?;
// Use get_response_body_text to extract the response body as text
let response_body = get_response_body_text(response)
// Extract response body as text
let init_response_text = get_response_body_text(init_response)
.await
.map_err(|e| format!("Failed to read response body: {}", e))?;
.map_err(|e| format!("Failed to read initial response body: {}", e))?;
// Deserialize the response body into TranscriptionResponse
let transcription_response: TranscriptionResponse = serde_json::from_str(&response_body)
.map_err(|e| format!("Failed to parse transcription response: {}", e))?;
// Parse response JSON to extract task ID
let init_response_json: Value = from_str(&init_response_text).map_err(|e| {
format!(
"Failed to parse initial response JSON: {}. Raw response: {}",
e, init_response_text
)
})?;
let transcription_task_id = init_response_json["task_id"]
.as_str()
.ok_or_else(|| {
format!(
"Missing or invalid task_id in initial response: {}",
init_response_text
)
})?
.to_string();
// Set up polling with timeout
let polling_start = std::time::Instant::now();
const POLLING_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(30);
const POLLING_INTERVAL: std::time::Duration = std::time::Duration::from_millis(200);
let mut transcription_response: TranscriptionResponse;
loop {
// Poll for transcription results
let poll_response = HttpClient::get(
&server_id,
&format!("/services/audio/task/{}", transcription_task_id),
None,
)
.await
.map_err(|e| format!("Failed to poll transcription task: {}", e))?;
// Extract poll response body
let poll_response_text = get_response_body_text(poll_response)
.await
.map_err(|e| format!("Failed to read poll response body: {}", e))?;
// Parse poll response JSON
transcription_response = from_str(&poll_response_text).map_err(|e| {
format!(
"Failed to parse poll response JSON: {}. Raw response: {}",
e, poll_response_text
)
})?;
// Check if transcription results are available
if !transcription_response.results.is_empty() {
break;
}
// Check for timeout
if polling_start.elapsed() >= POLLING_TIMEOUT {
return Err("Transcription task timed out after 30 seconds".to_string());
}
// Wait before next poll
tokio::time::sleep(POLLING_INTERVAL).await;
}
Ok(transcription_response)
}
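
The transcription flow is now two-step: submit the audio, then poll `/services/audio/task/{task_id}` until results arrive or a 30-second deadline passes, sleeping 200 ms between polls. A minimal sketch of that polling pattern in isolation (the hypothetical `poll_until` helper and the fake poll closure stand in for the HTTP calls; tokio assumed):

```rust
use std::time::{Duration, Instant};

// Hypothetical polling helper; `poll` stands in for the GET on the task endpoint.
async fn poll_until<T, F>(mut poll: F, timeout: Duration, interval: Duration) -> Result<T, String>
where
    F: FnMut() -> Option<T>,
{
    let start = Instant::now();
    loop {
        if let Some(result) = poll() {
            return Ok(result);
        }
        if start.elapsed() >= timeout {
            return Err("polling timed out".to_string());
        }
        tokio::time::sleep(interval).await;
    }
}

#[tokio::main]
async fn main() {
    let mut calls = 0;
    let result = poll_until(
        || {
            calls += 1;
            // Pretend the third poll returns non-empty transcription results.
            (calls >= 3).then(|| "transcribed text".to_string())
        },
        Duration::from_secs(30),
        Duration::from_millis(200),
    )
    .await;
    println!("{result:?}");
}
```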

View File

@@ -1,168 +0,0 @@
use crate::server::servers::{get_server_by_id, get_server_token};
use futures::StreamExt;
use std::collections::HashMap;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Runtime};
use tokio::net::TcpStream;
use tokio::sync::{mpsc, Mutex};
use tokio_tungstenite::tungstenite::handshake::client::generate_key;
use tokio_tungstenite::tungstenite::Message;
use tokio_tungstenite::MaybeTlsStream;
use tokio_tungstenite::WebSocketStream;
use tokio_tungstenite::{connect_async_tls_with_config, Connector};
#[derive(Default)]
pub struct WebSocketManager {
connections: Arc<Mutex<HashMap<String, Arc<WebSocketInstance>>>>,
}
struct WebSocketInstance {
ws_connection: Mutex<WebSocketStream<MaybeTlsStream<TcpStream>>>, // No need to lock the entire map
cancel_tx: mpsc::Sender<()>,
}
fn convert_to_websocket(endpoint: &str) -> Result<String, String> {
let url = url::Url::parse(endpoint).map_err(|e| format!("Invalid URL: {}", e))?;
let ws_protocol = if url.scheme() == "https" {
"wss://"
} else {
"ws://"
};
let host = url.host_str().ok_or("No host found in URL")?;
let port = url
.port_or_known_default()
.unwrap_or(if url.scheme() == "https" { 443 } else { 80 });
let ws_endpoint = if port == 80 || port == 443 {
format!("{}{}{}", ws_protocol, host, "/ws")
} else {
format!("{}{}:{}/ws", ws_protocol, host, port)
};
Ok(ws_endpoint)
}
#[tauri::command]
pub async fn connect_to_server<R: Runtime>(
tauri_app_handle: AppHandle<R>,
id: String,
client_id: String,
state: tauri::State<'_, WebSocketManager>,
app_handle: AppHandle,
) -> Result<(), String> {
let connections_clone = state.connections.clone();
// Disconnect old connection first
disconnect(client_id.clone(), state.clone()).await.ok();
let server = get_server_by_id(&id).ok_or(format!("Server with ID {} not found", id))?;
let endpoint = convert_to_websocket(&server.endpoint)?;
let token = get_server_token(&id).await?.map(|t| t.access_token.clone());
let mut request =
tokio_tungstenite::tungstenite::client::IntoClientRequest::into_client_request(&endpoint)
.map_err(|e| format!("Failed to create WebSocket request: {}", e))?;
request
.headers_mut()
.insert("Connection", "Upgrade".parse().unwrap());
request
.headers_mut()
.insert("Upgrade", "websocket".parse().unwrap());
request
.headers_mut()
.insert("Sec-WebSocket-Version", "13".parse().unwrap());
request
.headers_mut()
.insert("Sec-WebSocket-Key", generate_key().parse().unwrap());
if let Some(token) = token {
request
.headers_mut()
.insert("X-API-TOKEN", token.parse().unwrap());
}
let allow_self_signature =
crate::settings::get_allow_self_signature(tauri_app_handle.clone()).await;
let tls_connector = tokio_native_tls::native_tls::TlsConnector::builder()
.danger_accept_invalid_certs(allow_self_signature)
.build()
.map_err(|e| format!("TLS build error: {:?}", e))?;
let connector = Connector::NativeTls(tls_connector.into());
let (ws_stream, _) = connect_async_tls_with_config(
request,
None, // WebSocketConfig
true, // disable_nagle
Some(connector), // Connector
)
.await
.map_err(|e| format!("WebSocket TLS error: {:?}", e))?;
let (cancel_tx, mut cancel_rx) = mpsc::channel(1);
let instance = Arc::new(WebSocketInstance {
ws_connection: Mutex::new(ws_stream),
cancel_tx,
});
// Insert connection into the map (lock is held briefly)
{
let mut connections = connections_clone.lock().await;
connections.insert(client_id.clone(), instance.clone());
}
// Spawn WebSocket handler in a separate task
let app_handle_clone = app_handle.clone();
let client_id_clone = client_id.clone();
tokio::spawn(async move {
let ws = &mut *instance.ws_connection.lock().await;
loop {
tokio::select! {
msg = ws.next() => {
match msg {
Some(Ok(Message::Text(text))) => {
let _ = app_handle_clone.emit(&format!("ws-message-{}", client_id_clone), text);
},
Some(Err(_)) | None => {
let _ = app_handle_clone.emit(&format!("ws-error-{}", client_id_clone), id.clone());
break;
}
_ => {}
}
}
_ = cancel_rx.recv() => {
let _ = app_handle_clone.emit(&format!("ws-error-{}", client_id_clone), id.clone());
break;
}
}
}
// Remove connection after it closes
let mut connections = connections_clone.lock().await;
connections.remove(&client_id_clone);
});
Ok(())
}
#[tauri::command]
pub async fn disconnect(
client_id: String,
state: tauri::State<'_, WebSocketManager>,
) -> Result<(), String> {
let instance = {
let mut connections = state.connections.lock().await;
connections.remove(&client_id)
};
if let Some(instance) = instance {
let _ = instance.cancel_tx.send(()).await;
// Close WebSocket (lock only the connection, not the whole map)
let mut ws = instance.ws_connection.lock().await;
let _ = ws.close(None).await;
}
Ok(())
}

View File

@@ -1,12 +1,12 @@
use crate::COCO_TAURI_STORE;
use serde_json::Value as Json;
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;
const SETTINGS_ALLOW_SELF_SIGNATURE: &str = "settings_allow_self_signature";
#[tauri::command]
pub async fn set_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>, value: bool) {
pub async fn set_allow_self_signature(tauri_app_handle: AppHandle, value: bool) {
use crate::server::http_client;
let store = tauri_app_handle
@@ -40,7 +40,7 @@ pub async fn set_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>
}
/// Synchronous version of `async get_allow_self_signature()`.
pub fn _get_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>) -> bool {
pub fn _get_allow_self_signature(tauri_app_handle: AppHandle) -> bool {
let store = tauri_app_handle
.store(COCO_TAURI_STORE)
.unwrap_or_else(|e| {
@@ -67,6 +67,6 @@ pub fn _get_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>) ->
}
#[tauri::command]
pub async fn get_allow_self_signature<R: Runtime>(tauri_app_handle: AppHandle<R>) -> bool {
pub async fn get_allow_self_signature(tauri_app_handle: AppHandle) -> bool {
_get_allow_self_signature(tauri_app_handle)
}

View File

@@ -1,3 +1,9 @@
use tauri::{App, WebviewWindow};
pub fn platform(_app: &mut App, _main_window: WebviewWindow, _settings_window: WebviewWindow) {}
pub fn platform(
_app: &mut App,
_main_window: WebviewWindow,
_settings_window: WebviewWindow,
_check_window: WebviewWindow,
) {
}

View File

@@ -1,6 +1,9 @@
//credits to: https://github.com/ayangweb/ayangweb-EcoPaste/blob/169323dbe6365ffe4abb64d867439ed2ea84c6d1/src-tauri/src/core/setup/mac.rs
use tauri::{ActivationPolicy, App, Emitter, EventTarget, WebviewWindow};
use tauri_nspanel::{cocoa::appkit::NSWindowCollectionBehavior, panel_delegate, WebviewWindowExt};
//! credits to: https://github.com/ayangweb/ayangweb-EcoPaste/blob/169323dbe6365ffe4abb64d867439ed2ea84c6d1/src-tauri/src/core/setup/mac.rs
use cocoa::appkit::NSWindow;
use tauri::Manager;
use tauri::{App, AppHandle, Emitter, EventTarget, WebviewWindow};
use tauri_nspanel::{WebviewWindowExt, cocoa::appkit::NSWindowCollectionBehavior, panel_delegate};
use crate::common::MAIN_WINDOW_LABEL;
@@ -12,9 +15,12 @@ const WINDOW_BLUR_EVENT: &str = "tauri://blur";
const WINDOW_MOVED_EVENT: &str = "tauri://move";
const WINDOW_RESIZED_EVENT: &str = "tauri://resize";
pub fn platform(app: &mut App, main_window: WebviewWindow, _settings_window: WebviewWindow) {
app.set_activation_policy(ActivationPolicy::Accessory);
pub fn platform(
_app: &mut App,
main_window: WebviewWindow,
_settings_window: WebviewWindow,
_check_window: WebviewWindow,
) {
// Convert ns_window to ns_panel
let panel = main_window.to_panel().unwrap();
@@ -26,7 +32,7 @@ pub fn platform(app: &mut App, main_window: WebviewWindow, _settings_window: Web
// Share the window across all desktop spaces and full screen
panel.set_collection_behaviour(
NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces
NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace
| NSWindowCollectionBehavior::NSWindowCollectionBehaviorStationary
| NSWindowCollectionBehavior::NSWindowCollectionBehaviorFullScreenAuxiliary,
);
@@ -75,3 +81,50 @@ pub fn platform(app: &mut App, main_window: WebviewWindow, _settings_window: Web
// Set the delegate object for the window to handle window events
panel.set_delegate(delegate);
}
/// Toggle the NS window collection behavior between `NSWindowCollectionBehaviorCanJoinAllSpaces`
/// and `NSWindowCollectionBehaviorMoveToActiveSpace`.
///
/// NOTE: this tauri command is not async because it must run on the main
/// thread; otherwise `ns_window.setCollectionBehavior_(collection_behavior)` would lead
/// to undefined behavior.
#[tauri::command]
pub(crate) fn toggle_move_to_active_space_attribute(tauri_app_hanlde: AppHandle) {
use cocoa::appkit::NSWindowCollectionBehavior;
use cocoa::base::id;
let main_window = tauri_app_hanlde
.get_webview_window(MAIN_WINDOW_LABEL)
.unwrap();
let ns_window = main_window.ns_window().unwrap() as id;
let mut collection_behavior = unsafe { ns_window.collectionBehavior() };
let join_all_spaces = collection_behavior
.contains(NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces);
let move_to_active_space = collection_behavior
.contains(NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace);
match (join_all_spaces, move_to_active_space) {
(true, false) => {
collection_behavior
.remove(NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces);
collection_behavior
.insert(NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace);
}
(false, true) => {
collection_behavior
.remove(NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace);
collection_behavior
.insert(NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces);
}
_ => {
panic!(
"invalid NS window attribute, NSWindowCollectionBehaviorCanJoinAllSpaces is set [{}], NSWindowCollectionBehaviorMoveToActiveSpace is set [{}]",
join_all_spaces, move_to_active_space
);
}
}
unsafe {
ns_window.setCollectionBehavior_(collection_behavior);
}
}
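
Because the command above is deliberately synchronous so that Tauri runs it on the main thread, backend code that wants to trigger the same toggle would have to hop onto the main thread explicitly. A minimal sketch, assuming `AppHandle::run_on_main_thread` and placing the hypothetical `toggle_from_backend` helper in this same module:

```rust
// Hypothetical helper (illustration only): dispatch the toggle onto the main
// thread instead of calling it from an arbitrary async task.
fn toggle_from_backend(app_handle: tauri::AppHandle) {
    let handle = app_handle.clone();
    app_handle
        .run_on_main_thread(move || toggle_move_to_active_space_attribute(handle))
        .expect("failed to schedule work on the main thread");
}
```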


@@ -18,10 +18,20 @@ pub use windows::*;
#[cfg(target_os = "linux")]
pub use linux::*;
pub fn default(app: &mut App, main_window: WebviewWindow, settings_window: WebviewWindow) {
pub fn default(
app: &mut App,
main_window: WebviewWindow,
settings_window: WebviewWindow,
check_window: WebviewWindow,
) {
// Development mode automatically opens the console: https://tauri.app/develop/debug
#[cfg(all(dev, debug_assertions))]
#[cfg(debug_assertions)]
main_window.open_devtools();
platform(app, main_window.clone(), settings_window.clone());
platform(
app,
main_window.clone(),
settings_window.clone(),
check_window.clone(),
);
}


@@ -1,3 +1,9 @@
use tauri::{App, WebviewWindow};
pub fn platform(_app: &mut App, _main_window: WebviewWindow, _settings_window: WebviewWindow) {}
pub fn platform(
_app: &mut App,
_main_window: WebviewWindow,
_settings_window: WebviewWindow,
_check_window: WebviewWindow,
) {
}


@@ -1,5 +1,5 @@
use crate::{hide_coco, show_coco, COCO_TAURI_STORE};
use tauri::{async_runtime, App, AppHandle, Manager, Runtime};
use crate::{COCO_TAURI_STORE, hide_coco, show_coco};
use tauri::{App, AppHandle, Manager, async_runtime};
use tauri_plugin_global_shortcut::{GlobalShortcutExt, Shortcut, ShortcutState};
use tauri_plugin_store::{JsonValue, StoreExt};
@@ -17,6 +17,7 @@ const DEFAULT_SHORTCUT: &str = "ctrl+shift+space";
/// Set up the shortcut upon app start.
pub fn enable_shortcut(app: &App) {
log::trace!("setting up Coco hotkey");
let store = app
.store(COCO_TAURI_STORE)
.expect("creating a store should not fail");
@@ -43,19 +44,20 @@ pub fn enable_shortcut(app: &App) {
.expect("default shortcut should never be invalid");
_register_shortcut_upon_start(app, default_shortcut);
}
log::trace!("Coco hotkey has been set");
}
/// Get the stored shortcut as a string, same as [`_get_shortcut()`], except that
/// this is a `tauri::command` interface.
#[tauri::command]
pub async fn get_current_shortcut<R: Runtime>(app: AppHandle<R>) -> Result<String, String> {
pub async fn get_current_shortcut(app: AppHandle) -> Result<String, String> {
let shortcut = _get_shortcut(&app);
Ok(shortcut)
}
/// Get the current shortcut and unregister it on the tauri side.
#[tauri::command]
pub async fn unregister_shortcut<R: Runtime>(app: AppHandle<R>) {
pub async fn unregister_shortcut(app: AppHandle) {
let shortcut_str = _get_shortcut(&app);
let shortcut = shortcut_str
.parse::<Shortcut>()
@@ -68,9 +70,9 @@ pub async fn unregister_shortcut<R: Runtime>(app: AppHandle<R>) {
/// Change the global shortcut to `key`.
#[tauri::command]
pub async fn change_shortcut<R: Runtime>(
app: AppHandle<R>,
_window: tauri::Window<R>,
pub async fn change_shortcut(
app: AppHandle,
_window: tauri::Window,
key: String,
) -> Result<(), String> {
println!("key {}:", key);
@@ -92,12 +94,12 @@ pub async fn change_shortcut<R: Runtime>(
}
/// Helper function to register a shortcut, used for shortcut updates.
fn _register_shortcut<R: Runtime>(app: &AppHandle<R>, shortcut: Shortcut) {
fn _register_shortcut(app: &AppHandle, shortcut: Shortcut) {
app.global_shortcut()
.on_shortcut(shortcut, move |app, scut, event| {
if scut == &shortcut {
dbg!("shortcut pressed");
let main_window = app.get_window(MAIN_WINDOW_LABEL).unwrap();
let main_window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
if let ShortcutState::Pressed = event.state() {
let app_handle = app.clone();
if main_window.is_visible().unwrap() {
@@ -126,7 +128,7 @@ fn _register_shortcut_upon_start(app: &App, shortcut: Shortcut) {
tauri_plugin_global_shortcut::Builder::new()
.with_handler(move |app, scut, event| {
if scut == &shortcut {
let window = app.get_window(MAIN_WINDOW_LABEL).unwrap();
let window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
if let ShortcutState::Pressed = event.state() {
let app_handle = app.clone();
@@ -149,7 +151,7 @@ fn _register_shortcut_upon_start(app: &App, shortcut: Shortcut) {
}
/// Helper function to get the stored global shortcut, as a string.
pub fn _get_shortcut<R: Runtime>(app: &AppHandle<R>) -> String {
pub fn _get_shortcut(app: &AppHandle) -> String {
let store = app
.get_store(COCO_TAURI_STORE)
.expect("store should be loaded or created");


@@ -0,0 +1,62 @@
//! The "App language" configuration entry is persisted by the frontend code, but we
//! also need to access it on the backend.
//!
//! So we duplicate it here **in memory** and expose a setter command to the
//! frontend so that the cached value can be updated and stay up-to-date.
use function_name::named;
use tokio::sync::RwLock;
#[derive(Debug, Clone, Copy, PartialEq)]
#[allow(non_camel_case_types)]
pub(crate) enum Lang {
en_US,
zh_CN,
}
impl std::fmt::Display for Lang {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Lang::en_US => write!(f, "en_US"),
Lang::zh_CN => write!(f, "zh_CN"),
}
}
}
impl std::str::FromStr for Lang {
type Err = String;
fn from_str(s: &str) -> Result<Self, Self::Err> {
match s {
"en" => Ok(Lang::en_US),
"zh" => Ok(Lang::zh_CN),
_ => Err(format!("Invalid language: {}", s)),
}
}
}
/// Cache the language config in memory.
static APP_LANG: RwLock<Option<Lang>> = RwLock::const_new(None);
/// Frontend code uses this interface to update the in-memory cached `APP_LANG` config.
#[named]
#[tauri::command]
pub(crate) async fn update_app_lang(lang: String) {
let app_lang = lang.parse::<Lang>().unwrap_or_else(|e| {
panic!(
"frontend code passes an invalid argument [{}] to interface [{}], parsing error [{}]",
lang,
function_name!(),
e
)
});
let mut write_guard = APP_LANG.write().await;
*write_guard = Some(app_lang);
}
/// Helper getter method to handle the `None` case.
pub(crate) async fn get_app_lang() -> Lang {
let opt_lang = *APP_LANG.read().await;
opt_lang.expect("frontend code did not invoke [update_app_lang()] to set the APP_LANG")
}
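
As a usage sketch, backend code could read the cached language to localize backend-generated strings. The `localized_greeting` helper below is hypothetical, assumes the module lives at `crate::util::app_lang` (per the `mod` declarations later in this diff), and relies on the frontend having already called `update_app_lang`, since `get_app_lang()` panics otherwise by design:

```rust
use crate::util::app_lang::{get_app_lang, Lang};

// Hypothetical call site: pick a string based on the cached app language.
async fn localized_greeting() -> &'static str {
    match get_app_lang().await {
        Lang::en_US => "Welcome to Coco",
        Lang::zh_CN => "欢迎使用 Coco",
    }
}
```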

src-tauri/src/util/file.rs (new file, 174 lines)

@@ -0,0 +1,174 @@
#[derive(Debug, Clone, PartialEq, Copy)]
pub(crate) enum FileType {
Folder,
JPEGImage,
PNGImage,
PDFDocument,
PlainTextDocument,
MicrosoftWordDocument,
MicrosoftExcelSpreadsheet,
AudioFile,
VideoFile,
CHeaderFile,
TOMLDocument,
RustScript,
CSourceCode,
MarkdownDocument,
TerminalSettings,
ZipArchive,
Dmg,
Html,
Json,
Xml,
Yaml,
Css,
Vue,
React,
Sql,
Csv,
Javascript,
Lnk,
Typescript,
Python,
Java,
Golang,
Ruby,
Php,
Sass,
Sketch,
AdobeAi,
AdobePsd,
AdobePr,
AdobeAu,
AdobeAe,
AdobeLr,
AdobeXd,
AdobeFl,
AdobeId,
Svg,
Epub,
Unknown,
}
async fn get_file_type(path: &str) -> FileType {
let path = camino::Utf8Path::new(path);
// stat() is more precise than file extension, use it if possible.
if path.is_dir() {
return FileType::Folder;
}
let Some(ext) = path.extension() else {
return FileType::Unknown;
};
let ext = ext.to_lowercase();
match ext.as_str() {
"pdf" => FileType::PDFDocument,
"txt" | "text" => FileType::PlainTextDocument,
"doc" | "docx" => FileType::MicrosoftWordDocument,
"xls" | "xlsx" => FileType::MicrosoftExcelSpreadsheet,
"jpg" | "jpeg" => FileType::JPEGImage,
"png" => FileType::PNGImage,
"mp3" | "wav" | "flac" | "aac" | "ogg" | "m4a" => FileType::AudioFile,
"mp4" | "avi" | "mov" | "mkv" | "wmv" | "flv" | "webm" => FileType::VideoFile,
"h" | "hpp" => FileType::CHeaderFile,
"c" | "cpp" | "cc" | "cxx" => FileType::CSourceCode,
"toml" => FileType::TOMLDocument,
"rs" => FileType::RustScript,
"md" | "markdown" => FileType::MarkdownDocument,
"terminal" => FileType::TerminalSettings,
"zip" | "rar" | "7z" | "tar" | "gz" | "bz2" => FileType::ZipArchive,
"dmg" => FileType::Dmg,
"html" | "htm" => FileType::Html,
"json" => FileType::Json,
"xml" => FileType::Xml,
"yaml" | "yml" => FileType::Yaml,
"css" => FileType::Css,
"vue" => FileType::Vue,
"jsx" | "tsx" => FileType::React,
"sql" => FileType::Sql,
"csv" => FileType::Csv,
"js" | "mjs" => FileType::Javascript,
"ts" => FileType::Typescript,
"py" | "pyw" => FileType::Python,
"java" => FileType::Java,
"go" => FileType::Golang,
"rb" => FileType::Ruby,
"php" => FileType::Php,
"sass" | "scss" => FileType::Sass,
"sketch" => FileType::Sketch,
"ai" => FileType::AdobeAi,
"psd" => FileType::AdobePsd,
"prproj" => FileType::AdobePr,
"aup" | "aup3" => FileType::AdobeAu,
"aep" => FileType::AdobeAe,
"lrcat" => FileType::AdobeLr,
"xd" => FileType::AdobeXd,
"fla" => FileType::AdobeFl,
"indd" => FileType::AdobeId,
"svg" => FileType::Svg,
"epub" => FileType::Epub,
"lnk" => FileType::Lnk,
_ => FileType::Unknown,
}
}
fn type_to_icon(ty: FileType) -> &'static str {
match ty {
FileType::Folder => "font_file_folder",
FileType::JPEGImage => "font_file_image",
FileType::PNGImage => "font_file_image",
FileType::PDFDocument => "font_file_document_pdf",
FileType::PlainTextDocument => "font_file_txt",
FileType::MicrosoftWordDocument => "font_file_document_word",
FileType::MicrosoftExcelSpreadsheet => "font_file_spreadsheet_excel",
FileType::AudioFile => "font_file_audio",
FileType::VideoFile => "font_file_video",
FileType::CHeaderFile => "font_file_csource",
FileType::TOMLDocument => "font_file_toml",
FileType::RustScript => "font_file_rustscript1",
FileType::CSourceCode => "font_file_csource",
FileType::MarkdownDocument => "font_file_markdown",
FileType::TerminalSettings => "font_file_terminal1",
FileType::ZipArchive => "font_file_zip",
FileType::Dmg => "font_file_dmg",
FileType::Html => "font_file_html",
FileType::Json => "font_file_json",
FileType::Xml => "font_file_xml",
FileType::Yaml => "font_file_yaml",
FileType::Css => "font_file_css",
FileType::Vue => "font_file_vue",
FileType::React => "font_file_react",
FileType::Sql => "font_file_sql",
FileType::Csv => "font_file_csv",
FileType::Javascript => "font_file_javascript",
FileType::Lnk => "font_file_lnk",
FileType::Typescript => "font_file_typescript",
FileType::Python => "font_file_python",
FileType::Java => "font_file_java",
FileType::Golang => "font_file_golang",
FileType::Ruby => "font_file_ruby",
FileType::Php => "font_file_php",
FileType::Sass => "font_file_sass",
FileType::Sketch => "font_file_sketch",
FileType::AdobeAi => "font_file_adobe_ai",
FileType::AdobePsd => "font_file_adobe_psd",
FileType::AdobePr => "font_file_adobe_pr",
FileType::AdobeAu => "font_file_adobe_au",
FileType::AdobeAe => "font_file_adobe_ae",
FileType::AdobeLr => "font_file_adobe_lr",
FileType::AdobeXd => "font_file_adobe_xd",
FileType::AdobeFl => "font_file_adobe_fl",
FileType::AdobeId => "font_file_adobe_id",
FileType::Svg => "font_file_svg",
FileType::Epub => "font_file_epub",
FileType::Unknown => "font_file_unknown",
}
}
#[tauri::command]
pub(crate) async fn get_file_icon(path: String) -> &'static str {
let ty = get_file_type(path.as_str()).await;
type_to_icon(ty)
}
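
A sketch of a test module exercising the mapping above, assuming tokio's test macro is available as a dev-dependency; the paths are hypothetical and need not exist on disk, since classification falls back to the file extension when the path is not an existing directory:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[tokio::test]
    async fn icon_mapping_examples() {
        // Extension-based classification (case-insensitive).
        assert_eq!(get_file_icon("notes/todo.md".to_string()).await, "font_file_markdown");
        assert_eq!(get_file_icon("photo.JPG".to_string()).await, "font_file_image");
        // No extension and not an existing directory => Unknown.
        assert_eq!(get_file_icon("no_such_file".to_string()).await, "font_file_unknown");
    }
}
```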


@@ -1,10 +1,20 @@
pub(crate) mod app_lang;
pub(crate) mod file;
pub(crate) mod platform;
pub(crate) mod updater;
use std::{path::Path, process::Command};
use tauri::{AppHandle, Runtime};
use tauri::AppHandle;
use tauri_plugin_shell::ShellExt;
/// We use this env variable to determine the desktop environment (DE) on Linux.
const XDG_CURRENT_DESKTOP: &str = "XDG_CURRENT_DESKTOP";
#[derive(Debug, PartialEq)]
enum LinuxDesktopEnvironment {
Gnome,
Kde,
Unsupported { xdg_current_desktop: String },
}
impl LinuxDesktopEnvironment {
@@ -30,6 +40,14 @@ impl LinuxDesktopEnvironment {
.arg(path)
.output()
.map_err(|e| e.to_string())?,
Self::Unsupported {
xdg_current_desktop,
} => {
return Err(format!(
"Cannot open apps as this Linux desktop environment [{}] is not supported",
xdg_current_desktop
));
}
};
if !cmd_output.status.success() {
@@ -44,20 +62,23 @@ impl LinuxDesktopEnvironment {
}
}
/// `None` means that we most likely do not have a desktop environment.
fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
let de_os_str = std::env::var_os("XDG_CURRENT_DESKTOP")?;
let de_str = de_os_str
.into_string()
.expect("$XDG_CURRENT_DESKTOP should be UTF-8 encoded");
let de_os_str = std::env::var_os(XDG_CURRENT_DESKTOP)?;
let de_str = de_os_str.into_string().unwrap_or_else(|_os_string| {
panic!("${} should be UTF-8 encoded", XDG_CURRENT_DESKTOP);
});
let de = match de_str.as_str() {
"GNOME" => LinuxDesktopEnvironment::Gnome,
// Ubuntu uses "ubuntu:GNOME" instead of just "GNOME", they really love
// their distro name.
"ubuntu:GNOME" => LinuxDesktopEnvironment::Gnome,
"KDE" => LinuxDesktopEnvironment::Kde,
unsupported_de => unimplemented!(
"This desktop environment [{}] has not been supported yet",
unsupported_de
),
_ => LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: de_str,
},
};
Some(de)
@@ -67,13 +88,12 @@ fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
//
// tauri_plugin_shell::open() is deprecated, but we still use it.
#[allow(deprecated)]
#[tauri::command]
pub async fn open<R: Runtime>(app_handle: AppHandle<R>, path: String) -> Result<(), String> {
pub async fn open(app_handle: AppHandle, path: String) -> Result<(), String> {
if cfg!(target_os = "linux") {
let borrowed_path = Path::new(&path);
if let Some(file_extension) = borrowed_path.extension() {
if file_extension == "desktop" {
let desktop_environment = get_linux_desktop_environment().expect("The Linux OS is running without a desktop, Coco could never run in such a environment");
let desktop_environment = get_linux_desktop_environment().expect("The Linux OS is running without a desktop, Coco could never run in such an environment");
return desktop_environment.launch_app_via_desktop_file(path);
}
}
@@ -84,3 +104,55 @@ pub async fn open<R: Runtime>(app_handle: AppHandle<R>, path: String) -> Result<
.open(path, None)
.map_err(|e| e.to_string())
}
#[cfg(test)]
mod tests {
use super::*;
// This test modifies the env var XDG_CURRENT_DESKTOP, which is somewhat unsafe,
// but since this is just a test, that is acceptable.
#[test]
fn test_get_linux_desktop_environment() {
// SAFETY: no other Rust code reads or modifies XDG_CURRENT_DESKTOP concurrently
// in this test; note that we have no such guarantee from the underlying C code.
unsafe {
// Save the original value if it exists
let original_value = std::env::var_os(XDG_CURRENT_DESKTOP);
// Test when XDG_CURRENT_DESKTOP is not set
std::env::remove_var(XDG_CURRENT_DESKTOP);
assert!(get_linux_desktop_environment().is_none());
// Test GNOME
std::env::set_var(XDG_CURRENT_DESKTOP, "GNOME");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Gnome);
// Test ubuntu:GNOME
std::env::set_var(XDG_CURRENT_DESKTOP, "ubuntu:GNOME");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Gnome);
// Test KDE
std::env::set_var(XDG_CURRENT_DESKTOP, "KDE");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Kde);
// Test unsupported desktop environment
std::env::set_var(XDG_CURRENT_DESKTOP, "XFCE");
let result = get_linux_desktop_environment();
assert_eq!(
result.unwrap(),
LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: "XFCE".into()
}
);
// Restore the original value
match original_value {
Some(value) => std::env::set_var(XDG_CURRENT_DESKTOP, value),
None => std::env::remove_var(XDG_CURRENT_DESKTOP),
}
}
}
}


@@ -0,0 +1,61 @@
use derive_more::Display;
use serde::{Deserialize, Serialize};
use std::borrow::Cow;
use strum::EnumCount;
use strum::VariantArray;
#[derive(
Debug,
Deserialize,
Serialize,
Copy,
Clone,
Hash,
PartialEq,
Eq,
Display,
EnumCount,
VariantArray,
)]
#[serde(rename_all(serialize = "lowercase", deserialize = "lowercase"))]
pub(crate) enum Platform {
#[display("macOS")]
Macos,
#[display("Linux")]
Linux,
#[display("windows")]
Windows,
}
impl Platform {
/// Helper function to determine the current platform.
pub(crate) fn current() -> Platform {
let os_str = std::env::consts::OS;
serde_plain::from_str(os_str).unwrap_or_else(|_e| {
panic!("std::env::consts::OS is [{}], which is not a valid value for [enum Platform], valid values: {:?}", os_str, Self::VARIANTS.iter().map(|platform|platform.to_string()).collect::<Vec<String>>());
})
}
/// Return the `X-OS-NAME` HTTP request header.
pub(crate) fn to_os_name_http_header_str(&self) -> Cow<'static, str> {
match self {
Self::Macos => Cow::Borrowed("macos"),
Self::Windows => Cow::Borrowed("windows"),
// For Linux, we need the actual distro `ID`, not just "linux".
Self::Linux => Cow::Owned(sysinfo::System::distribution_id()),
}
}
/// Returns the number of platforms supported by Coco.
//
// a.k.a., the number of this enum's variants.
pub(crate) fn num_of_supported_platforms() -> usize {
Platform::COUNT
}
/// Returns a set that contains all the platforms.
#[cfg(test)] // currently, only used in tests
pub(crate) fn all() -> std::collections::HashSet<Self> {
Platform::VARIANTS.into_iter().copied().collect()
}
}
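
For illustration, pairing the `X-OS-NAME` header name with the value produced by `to_os_name_http_header_str()` could look like the sketch below. How the header is actually attached to outgoing requests depends on the project's HTTP client; the `os_name_header` helper is hypothetical:

```rust
use crate::util::platform::Platform;

// Hypothetical helper: the header name/value pair for the current platform.
fn os_name_header() -> (&'static str, String) {
    let platform = Platform::current();
    ("X-OS-NAME", platform.to_os_name_http_header_str().into_owned())
}
```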


@@ -0,0 +1,87 @@
use semver::Version;
use tauri_plugin_updater::RemoteRelease;
/// Helper function to extract the build number out of `version`.
///
/// If the version string is in the `x.y.z` format and does not include a build
/// number, we assume a build number of 0.
fn extract_build_number(version: &Version) -> u32 {
let pre = &version.pre;
if pre.is_empty() {
// Versions without a pre-release part are assumed to have a build number of 0
0
} else {
let pre_str = pre.as_str();
let build_number_str = {
match pre_str.strip_prefix("SNAPSHOT-") {
Some(str) => str,
None => pre_str,
}
};
let build_number: u32 = build_number_str.parse().unwrap_or_else(|e| {
panic!(
"invalid build number, cannot parse [{}] to a valid build number, error [{}], version [{}]",
build_number_str, e, version
)
});
build_number
}
}
/// # Local version format
///
/// Packages built in our CI use one of the following formats:
///
/// * `x.y.z-SNAPSHOT-<build number>`
/// * `x.y.z-<build number>`
///
/// If you build Coco from source, the version will be in the format `x.y.z`.
///
/// # Remote version format
///
/// `x.y.z-<build number>`
///
/// # How we compare versions
///
/// We compare versions based solely on the build number.
/// If the version string is in the `x.y.z` format and does not include a build number,
/// we assume a build number of 0. As a result, such versions are considered older
/// than any version with an explicit build number.
pub(crate) fn custom_version_comparator(local: Version, remote_release: RemoteRelease) -> bool {
let remote = remote_release.version;
let local_build_number = extract_build_number(&local);
let remote_build_number = extract_build_number(&remote);
let should_update = remote_build_number > local_build_number;
log::debug!(
"custom version comparator invoked, local version [{}], remote version [{}], should update [{}]",
local,
remote,
should_update
);
should_update
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_extract_build_number() {
// 0.6.0 => 0
let version = Version::parse("0.6.0").unwrap();
assert_eq!(extract_build_number(&version), 0);
// 0.6.0-2371 => 2371
let version = Version::parse("0.6.0-2371").unwrap();
assert_eq!(extract_build_number(&version), 2371);
// 0.6.0-SNAPSHOT-2371 => 2371
let version = Version::parse("0.6.0-SNAPSHOT-2371").unwrap();
assert_eq!(extract_build_number(&version), 2371);
}
}
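
A sketch of how the comparator might be plugged into the updater, assuming tauri_plugin_updater's builder exposes a `version_comparator` hook matching the `Fn(Version, RemoteRelease) -> bool` shape above; the `check_for_update` function is hypothetical and would live in this module:

```rust
use tauri_plugin_updater::UpdaterExt;

// Hypothetical wiring: have the updater compare releases by build number
// (via custom_version_comparator) instead of plain semver ordering.
async fn check_for_update(app_handle: tauri::AppHandle) -> Result<(), tauri_plugin_updater::Error> {
    let updater = app_handle
        .updater_builder()
        .version_comparator(custom_version_comparator)
        .build()?;
    if let Some(update) = updater.check().await? {
        log::info!("update available: remote version [{}]", update.version);
    }
    Ok(())
}
```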


@@ -41,7 +41,9 @@
"title": "Coco AI Settings",
"url": "/ui/settings",
"width": 1000,
"minWidth": 1000,
"height": 700,
"minHeight": 700,
"center": true,
"transparent": true,
"maximizable": false,
@@ -53,6 +55,26 @@
"effects": ["sidebar"],
"state": "active"
}
},
{
"label": "check",
"title": "Coco AI Update",
"url": "/ui/check",
"width": 340,
"minWidth": 340,
"height": 260,
"minHeight": 260,
"center": false,
"transparent": true,
"maximizable": false,
"skipTaskbar": false,
"dragDropEnabled": false,
"hiddenTitle": true,
"visible": false,
"windowEffects": {
"effects": ["sidebar"],
"state": "active"
}
}
],
"security": {
@@ -91,21 +113,7 @@
"icons/Square310x310Logo.png",
"icons/StoreLogo.png"
],
"macOS": {
"minimumSystemVersion": "12.0",
"hardenedRuntime": true,
"dmg": {
"appPosition": {
"x": 180,
"y": 180
},
"applicationFolderPosition": {
"x": 480,
"y": 180
}
}
},
"resources": ["assets", "icons"]
"resources": ["assets/**/*", "icons"]
},
"plugins": {
"features": {
@@ -118,7 +126,6 @@
"https://release.infinilabs.com/coco/app/.latest.json?target={{target}}&arch={{arch}}&current_version={{current_version}}"
]
},
"websocket": {},
"shell": {},
"globalShortcut": {},
"deep-link": {


@@ -0,0 +1,15 @@
{
"identifier": "rs.coco.app",
"bundle": {
"linux": {
"deb": {
"depends": ["gstreamer1.0-plugins-good"],
"desktopTemplate": "./Coco.desktop"
},
"rpm": {
"depends": ["gstreamer1-plugins-good"],
"desktopTemplate": "./Coco.desktop"
}
}
}
}


@@ -86,6 +86,12 @@ export const Get = <T>(
} else {
res = result?.data as FcResponse<T>;
}
// web component log
infoLog({
username: "@/api/axiosRequest.ts",
logName: url,
})(res);
resolve([null, res as FcResponse<T>]);
})
.catch((err) => {
@@ -96,14 +102,14 @@ export const Get = <T>(
export const Post = <T>(
url: string,
data: IAnyObj,
data: IAnyObj | undefined,
params: IAnyObj = {},
headers: IAnyObj = {}
): Promise<[any, FcResponse<T> | undefined]> => {
return new Promise((resolve) => {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
let baseURL = appStore.state?.endpoint_http
let baseURL = appStore.state?.endpoint_http;
if (!baseURL || baseURL === "undefined") {
baseURL = "";
}

src/api/streamFetch.ts (new file, 63 lines)

@@ -0,0 +1,63 @@
export async function streamPost({
url,
body,
queryParams,
headers,
onMessage,
onError,
}: {
url: string;
body: any;
queryParams?: Record<string, any>;
headers?: Record<string, string>;
onMessage: (chunk: string) => void;
onError?: (err: any) => void;
}) {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
let baseURL = appStore.state?.endpoint_http;
if (!baseURL || baseURL === "undefined") {
baseURL = "";
}
const headersStr = localStorage.getItem("headers") || "{}";
const headersStorage = JSON.parse(headersStr);
const query = new URLSearchParams(queryParams || {}).toString();
const fullUrl = `${baseURL}${url}?${query}`;
try {
const res = await fetch(fullUrl, {
method: "POST",
headers: {
"Content-Type": "application/json",
...(headersStorage),
...(headers || {}),
},
body: JSON.stringify(body),
});
if (!res.ok || !res.body) throw new Error("Stream failed");
const reader = res.body.getReader();
const decoder = new TextDecoder("utf-8");
let buffer = "";
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
const lines = buffer.split("\n");
for (let i = 0; i < lines.length - 1; i++) {
const line = lines[i].trim();
if (line) onMessage(line);
}
buffer = lines[lines.length - 1];
}
} catch (err) {
console.error("streamPost error:", err);
onError?.(err);
}
}


@@ -1,133 +0,0 @@
import { fetch } from "@tauri-apps/plugin-http";
import { clientEnv } from "@/utils/env";
import { useLogStore } from "@/stores/logStore";
import { get_server_token } from "@/commands";
interface FetchRequestConfig {
url: string;
method?: "GET" | "POST" | "PUT" | "DELETE";
headers?: Record<string, string>;
body?: any;
timeout?: number;
parseAs?: "json" | "text" | "binary";
baseURL?: string;
}
interface FetchResponse<T = any> {
data: T;
status: number;
statusText: string;
headers: Headers;
}
const timeoutPromise = (ms: number) => {
return new Promise<never>((_, reject) =>
setTimeout(() => reject(new Error(`Request timed out after ${ms} ms`)), ms)
);
};
export const tauriFetch = async <T = any>({
url,
method = "GET",
headers = {},
body,
timeout = 30,
parseAs = "json",
baseURL = clientEnv.COCO_SERVER_URL
}: FetchRequestConfig): Promise<FetchResponse<T>> => {
const addLog = useLogStore.getState().addLog;
try {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
const connectStore = JSON.parse(localStorage.getItem("connect-store") || "{}");
console.log("baseURL", appStore.state?.endpoint_http)
baseURL = appStore.state?.endpoint_http || baseURL;
const authStore = JSON.parse(localStorage.getItem("auth-store") || "{}")
const auth = authStore?.state?.auth
console.log("auth", auth)
if (baseURL.endsWith("/")) {
baseURL = baseURL.slice(0, -1);
}
if (!url.startsWith("http://") && !url.startsWith("https://")) {
// If not, prepend the defaultPrefix
url = baseURL + url;
}
if (method !== "GET") {
headers["Content-Type"] = "application/json";
}
const server_id = connectStore.state?.currentService?.id || "default_coco_server"
const res: any = await get_server_token(server_id);
headers["X-API-TOKEN"] = headers["X-API-TOKEN"] || res?.access_token || undefined;
// debug API
const requestInfo = {
url,
method,
headers,
body,
timeout,
parseAs,
};
const fetchPromise = fetch(url, {
method,
headers,
body,
});
const response = await Promise.race([
fetchPromise,
timeoutPromise(timeout * 1000),
]);
const statusText = response.ok ? "OK" : "Error";
let data: any;
if (parseAs === "json") {
data = await response.json();
} else if (parseAs === "text") {
data = await response.text();
} else {
data = await response.arrayBuffer();
}
// debug API
const log = {
request: requestInfo,
response: {
data,
status: response.status,
statusText,
headers: response.headers,
},
};
addLog(log);
return log.response;
} catch (error) {
console.error("Request failed:", error);
// debug API
const log = {
request: {
url,
method,
headers,
body,
timeout,
parseAs,
},
error,
};
addLog(log);
throw error;
}
};

Six binary image assets were added (1.3 KiB, 1.3 KiB, 346 B, 347 B, 1.2 KiB, 1.2 KiB); their contents are not shown. Some files were not shown because too many files have changed in this diff.