306 Commits

Author SHA1 Message Date
ayang
e7f8f7ef6e chore: add macos config for tauri 2025-07-28 16:31:54 +08:00
ayangweb
4709f8c660 feat: enhance ui for skipped version (#834) 2025-07-28 11:43:10 +08:00
SteveLauC
4696aa1759 test: test extract_build_number() (#835)
This commit adds a test for extract_build_number(), which I forgot to do
in commit 067fb7144f6[1].

[1]: 067fb7144f
2025-07-28 11:42:50 +08:00
ayangweb
924fc09516 fix: fix issue with update check failure (#833)
* fix: fix issue with update check failure

* docs: update changelog
2025-07-28 10:06:07 +08:00
SteveLauC
5a700662dd chore: release notes for 0.7.1 (#832) 2025-07-28 10:00:12 +08:00
BiggerRain
8f992bfa92 chore: bump version number to 0.7.1 (#830) 2025-07-27 17:26:08 +08:00
BiggerRain
e7dd27c744 chore: add toggle_move_to_active_space_attribute (#829)
* chore: add toggle_move_to_active_space_attribute

* chore: pin

* chore: add

* update

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-27 16:50:11 +08:00
ayangweb
7914836c3e fix: correct enter key behavior (#828) 2025-07-27 11:52:40 +08:00
BiggerRain
b37bf1f7c7 chore: bump version number to 0.7.0 (#827) 2025-07-25 19:54:33 +08:00
BiggerRain
419d9d55c5 chore: web component remove server name (#826) 2025-07-25 18:16:07 +08:00
BiggerRain
d3ed54c771 chore: web component add notification component (#825)
* chore: web component add notification component

* docs: update notes
2025-07-25 18:15:49 +08:00
ayangweb
8f26dbcbe6 refactor: optimize subpage shortcut context menu (#822)
* refactor: optimize subpage shortcut context menu

* update

* update
2025-07-25 16:43:41 +08:00
ayangweb
663873ae14 refactor: optimize carriage return copying (#823) 2025-07-25 16:43:05 +08:00
SteveLauC
286b1be212 fix: panic on Ubuntu (GNOME) when opening apps (#821)
On Ubuntu (the GNOME version), Coco would panic when users open an app because
Coco thinks it is running in an unsupported desktop environment (DE).

We rely on the environment variable XDG_CURRENT_DESKTOP to detect the DE.
Ubuntu sets this variable to "ubuntu:GNOME" instead of just "GNOME",
which was not handled by the previous implementation.

This commit supports this case. Also, when Coco runs in an unsupported DE,
opening apps should not panic the app; after this commit, we return an
error instead.
2025-07-25 15:32:48 +08:00
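
For context on the fix: XDG_CURRENT_DESKTOP is a colon-separated list, so a robust check has to match individual entries rather than the whole string. A minimal sketch of that idea (hypothetical helper, not Coco's actual code):

```rust
use std::env;

/// Return true if any colon-separated entry of `XDG_CURRENT_DESKTOP`
/// is GNOME, so values like "ubuntu:GNOME" are accepted as well.
fn running_under_gnome() -> bool {
    env::var("XDG_CURRENT_DESKTOP")
        .map(|value| {
            value
                .split(':')
                .any(|entry| entry.eq_ignore_ascii_case("GNOME"))
        })
        .unwrap_or(false)
}
```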
ayangweb
37221782b0 refactor: optimize shortcut key triggering (#820) 2025-07-25 14:54:32 +08:00
ayangweb
644e291105 fix: fix update window config sync (#818)
* fix: fix update window config sync

* docs: update changelog
2025-07-25 14:47:20 +08:00
BiggerRain
aae6984aa7 fix: re-search data initialization (#817) 2025-07-25 14:43:27 +08:00
ayangweb
dbd296d399 fix: fix enter key on subpages (#819)
* fix: fix enter key on subpages

* docs: update changelog
2025-07-25 14:43:16 +08:00
ayangweb
e2ad25967d fix: fix ctrl+k not working (#815) 2025-07-25 14:30:03 +08:00
ayangweb
21b61d80d8 refactor: optimize method calls for checking for updates (#814) 2025-07-25 13:42:12 +08:00
ayangweb
9f4c693ac4 refactor: optimize line breaks in input boxes (#813) 2025-07-25 12:36:07 +08:00
BiggerRain
45c27cac56 chore: cancel interface param (#816) 2025-07-25 12:16:23 +08:00
BiggerRain
e46035afd4 fix: the client id is the same (#812)
* chore: add

* fix: client id
2025-07-25 11:25:22 +08:00
BiggerRain
1004bb73f4 chore: delay the chat monitoring event (#811) 2025-07-24 20:03:30 +08:00
BiggerRain
d664fa7271 chore: handle reply to message (#799)
* chore: add reply to message

* chore: handle rust data

* log

* chore: id

* feat: add

* chore: loading step

* chore: cur id

* feat: add

* accept query parameters

* chore: add message id for cancel

* chore: remove log

* chore: remove log

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-24 18:06:59 +08:00
SteveLauC
067fb7144f refactor: use custom version comparator to determine if we should update (#810) 2025-07-24 16:05:36 +08:00
ayangweb
579f91f3aa refactor: refactor version update check (#809) 2025-07-24 11:56:57 +08:00
ayangweb
abe2aecedf fix: fix multiline input issue (#808) 2025-07-24 10:58:57 +08:00
SteveLauC
e8f9a4e627 chore: log querysources to search only when querysource is not set (#807) 2025-07-24 09:39:29 +08:00
ayangweb
22b1558e8b refactor: optimized data fetching for secondary pages (#803) 2025-07-23 18:56:56 +08:00
SteveLauC
ca3b514a65 fix: panic caused by "state() called before manage()" (#806)
This commit fixes the following panic:

```
Time: [2025-07-23-17-03-23]
Location: [/Users/steve/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/tauri-2.5.1/src/lib.rs:742:7]
Message: [state() called before manage() for tauri_plugin_global_shortcut::GlobalShortcut<tauri_runtime_wry::Wry<tauri::EventLoopMessage>>]
```

The root cause is that, in a Tauri application, before you can access a piece of
managed state with the .state() method, you must first register it with Tauri
using .manage(). When a user registers a hotkey for an extension,
initializing extensions will invoke the .state() method, but at that point
.manage() hasn't been called yet.

The fix is simple: call .manage() earlier (it is invoked by our
`shortcut::enable_shortcut(app)` function).
2025-07-23 18:56:16 +08:00
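
For readers unfamiliar with Tauri's managed state, a generic sketch of the ordering requirement described above (illustrative only, not Coco's actual setup code):

```rust
use tauri::Manager;

struct ExtensionHotkeys(Vec<String>);

fn setup(app: &tauri::App) {
    // Register the state first...
    app.manage(ExtensionHotkeys(Vec::new()));
    // ...only then is it safe to look it up; calling state() before
    // manage() panics with the message shown above.
    let _hotkeys = app.state::<ExtensionHotkeys>();
}
```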
SteveLauC
c694c4eda9 chore: display backtrace in panic log (#805)
Having a backtrace in the panic log helps debugging a lot. In release
builds we strip our binary, so symbol information is unavailable, but the
backtrace is still useful in debug builds.

Panic log in release builds:

```
Time: [YYYY-MM-DD-HH-MM-SS]
Location: [/Users/foo/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/tauri-2.5.1/src/lib.rs:742:7]
Message: [state() called before manage() for tauri_plugin_global_shortcut::GlobalShortcut<tauri_runtime_wry::Wry<tauri::EventLoopMessage>>]
Backtrace:
   0: __mh_execute_header
   1: __mh_execute_header
   2: __mh_execute_header
   3: __mh_execute_header
   4: __mh_execute_header
   5: __mh_execute_header
   6: __mh_execute_header
   7: __mh_execute_header
   8: __mh_execute_header
   9: __mh_execute_header
  10: __mh_execute_header
  11: __mh_execute_header
  12: __mh_execute_header
  13: __mh_execute_header
  14: __mh_execute_header
  15: __mh_execute_header
  16: __mh_execute_header
  17: <unknown>
  18: <unknown>
```
2025-07-23 17:00:48 +08:00
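
A rough sketch of what such a panic hook can look like using the standard library's `std::backtrace` (illustrative only; Coco's actual hook writes to its panic log file rather than stderr):

```rust
use std::backtrace::Backtrace;
use std::panic;

fn install_panic_hook() {
    panic::set_hook(Box::new(|info| {
        let location = info
            .location()
            .map(|loc| loc.to_string())
            .unwrap_or_else(|| "<unknown>".to_string());
        // Force-capture the backtrace regardless of RUST_BACKTRACE;
        // symbols resolve only if the binary is not stripped.
        let backtrace = Backtrace::force_capture();
        eprintln!("Location: [{location}]");
        eprintln!("Message: [{info}]");
        eprintln!("Backtrace:\n{backtrace}");
    }));
}
```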
ayangweb
ac835c76aa fix: fix shortcut issue in windows context menu (#804)
* fix: fix shortcut issue in windows context menu

* docs: update changelog
2025-07-23 16:20:46 +08:00
SteveLauC
25bbab7432 refactor: clean up unsupported characters from query string in Win Search (#802)
We found that Windows Search errors out when it encounters a single quote
character. The natural solution would be to escape it, but I couldn't find
out how. The approach mentioned in most posts:

```
~="<Unsupported Char>"
```

did not work in my tests, so I decided to replace the character with a whitespace.

The single quote is not the only unsupported character I know of; the newline
character is not supported either, so it will be handled in the same way.
2025-07-23 16:13:15 +08:00
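
A minimal sketch of the replacement approach described above (hypothetical helper, not Coco's exact implementation):

```rust
/// Replace characters that Windows Search cannot handle (single quotes
/// and newlines) with a whitespace before building the query.
fn sanitize_query(query: &str) -> String {
    query
        .chars()
        .map(|c| match c {
            '\'' | '\n' | '\r' => ' ',
            other => other,
        })
        .collect()
}

fn main() {
    assert_eq!(sanitize_query("it's\na test"), "it s a test");
}
```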
ayangweb
cca00e944e fix: fix selection issue after renaming (#800) 2025-07-23 13:59:33 +08:00
SteveLauC
e78fe4ac89 fix: broken windows search (#801)
This commit fixes the search issue introduced by [commit](5c0a865822). We have no idea why the tauri command `get_app_search_source` won't be invoked after that commit on Windows.

This commit resolves the issue by moving the extension init logic to the Rust side.

Also, it updates the querysource logs in `quey_coco_fusion()`: the old log said nothing when the querysource list was empty, the new one tells us that.
2025-07-23 12:33:18 +08:00
Medcl
60fd79f1fa fix: increase read_timeout for HTTP streaming stability (#798) 2025-07-22 18:44:27 +08:00
BiggerRain
5c0a865822 chore: not request the interface if not logged in (#795)
* chore: not request the interface if not logged in

* chore: res

* chore: res

* chore: common interface

* chore: no login

* chore: login

* chore: login

* chore: add

* dbg print servers

* chore: id

* docs: update notes

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-07-22 16:15:58 +08:00
SteveLauC
5b50e4b51b ci: add Rust code format check to CI (#797)
This commit adds the Rust code format check to our CI.
2025-07-22 15:11:13 +08:00
SteveLauC
b97386a827 refactor: avoid GLOBAL_TAURI_APP_HANDLE if possible (#796)
This commit fixes the Windows panic issue. 

Coco panicked because it accessed `GLOBAL_TAURI_APP_HANDLE` when this global variable wasn't initialized. I removed all uses of this variable except the one in `src-tauri/src/server/http_client.rs`, which I don't have a good way to refactor.

If you are wondering why this didn't happen in the past: the access was triggered by the frontend code, and something there likely changed. Regardless, this global variable is still dangerous and error-prone, so we should avoid it.

Also, this commit fixes the issue that the panic hook does not work on Windows because the log filename contains ":", which is not allowed by the Windows file system.
2025-07-22 14:43:27 +08:00
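
For illustration, the usual alternative to a global handle is to let Tauri inject an `AppHandle` where it is needed, for example into a command. A generic sketch (hypothetical command, not Coco's code):

```rust
use tauri::{AppHandle, Manager};

// Tauri injects the handle as a command argument, so no global
// variable is required to reach managed state or windows.
#[tauri::command]
fn webview_window_count(app: AppHandle) -> usize {
    app.webview_windows().len()
}
```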
SteveLauC
29aa26af94 chore: add a panic hook to catch panic msg (#793) 2025-07-22 10:34:27 +08:00
BiggerRain
3650d9914c fix: enter key problem (#794)
* fixed: enter key problem

* docs: update notes

* fix: enter key problem
2025-07-22 10:13:08 +08:00
SteveLauC
f26031047c fix: refreshing Coco server should register it to SearchSource (#792) 2025-07-22 08:51:57 +08:00
BiggerRain
c8719926be chore: add 401 unauthorized (#791) 2025-07-21 22:21:07 +08:00
BiggerRain
f1dfc5c730 fixed: chat message confusion (#782)
* fix: chat

* fix: chat

* chore: add session id

* fix: fixed incorrect taskbar icon display on linux (#783)

* fix: fixed incorrect taskbar icon display on linux

* docs: update changelog

* fix: fix data inconsistency issue on secondary pages (#784)

* chore: chat

* chore: chat

* chore: add logging message

* chore: chat

* chore: chat

* chore: add

* feat: add

* chore: chat end

* style: message width

---------

Co-authored-by: ayangweb <75017711+ayangweb@users.noreply.github.com>
Co-authored-by: medcl <m@medcl.net>
2025-07-21 21:17:20 +08:00
SteveLauC
74ed642a42 refactor: tighten up Coco servers state management (#790)
* refactor: tighten up Coco servers state management

* ignore unused warnings

* log out if the failed request has status 401
2025-07-21 20:39:16 +08:00
ayangweb
5a17173620 fix: incorrect status when installing extension (#789)
* fix: incorrect status when installing extension

* docs: update changelog
2025-07-21 18:17:30 +08:00
SteveLauC
29d14ff931 chore: remove unused type ServerTokenResponse (#788)
After this commit [1], the type `ServerTokenResponse` became unused, so
remove it as well.

[1]: 57ab08fb6d
2025-07-21 15:30:26 +08:00
ayangweb
ad01504766 refactor: decouple window switch services to ensure they operate independently (#786) 2025-07-20 17:26:15 +08:00
SteveLauC
57ab08fb6d chore: remove unused tauri cmd get_server_token (#787)
I found this tauri command while reading the code and realized that
token management logic should all be kept in the backend; there is no
need to expose it to the frontend. And indeed, searching for it in the
frontend code shows that it is not actually used:

```sh
$ cd src

$ rg get_server_token
commands/servers.ts
75:export function get_server_token(id: string): Promise<ServerTokenResponse> {
76:  return invokeWithErrorHandler(`get_server_token`, { id });
```

So remove it.
2025-07-20 17:25:32 +08:00
ayangweb
db5c09f80c fix: fix data inconsistency issue on secondary pages (#784) 2025-07-20 10:54:51 +08:00
ayangweb
b1e2c6961d fix: fixed incorrect taskbar icon display on linux (#783)
* fix: fixed incorrect taskbar icon display on linux

* docs: update changelog
2025-07-20 10:08:11 +08:00
BiggerRain
3f4abe51e5 fix: web component server list error (#781)
* chore: update app

* fix: web component server list error

* feat: add

* chore: remove default version
2025-07-19 17:07:11 +08:00
ayangweb
060c09e11c fix: resolved minor issues with voice playback (#780)
* fix: resolved minor issues with voice playback

* docs: update changelog

* update
2025-07-19 14:25:19 +08:00
ayangweb
657df482bf fix: correct incorrect assistant display when quick ai access (#779)
* fix: correct incorrect assistant display when quick ai access

* docs: update changelog
2025-07-19 13:54:39 +08:00
ayangweb
f4f7732927 refactor: show specific values in shortcut key conflict tips (#778)
* refactor: show specific values in shortcut key conflict tips

* update

* update

* update

* update

* update

* update

* update
2025-07-19 11:05:17 +08:00
ayangweb
5e536e1444 refactor: separate user agreement and privacy policy links (#777) 2025-07-19 10:24:29 +08:00
ayangweb
2b48cdf84a refactor: add border-radius to extended categories (#776) 2025-07-19 10:08:04 +08:00
BiggerRain
bc37616506 chore: search-chat add language and formatUrl parameters (#775)
* chore: add language

* build: build web

* docs: update notes
2025-07-19 09:34:38 +08:00
ayangweb
07bcd80776 refactor: invoke language update logic earlier (#774) 2025-07-18 16:44:43 +08:00
SteveLauC
7b8b396368 fix: indexing apps does not respect search scope config (#773)
This commit fixes the issue that indexing applications does not
respect the search scope configuration; it always used the default
values.
2025-07-18 16:26:34 +08:00
ayangweb
823a95d601 fix: restore missing category titles on subpages (#772) 2025-07-18 16:25:44 +08:00
ayangweb
af0b98a41b refactor: rebuild app index with improved suggestions (#771) 2025-07-18 16:15:28 +08:00
SteveLauC
7d0e7cd7dc fix: unregister ext hotkey when it gets deleted (#770)
This commit fixes the bug that when an extension gets uninstalled, its
registered hotkey won't be cleared.
2025-07-18 13:20:41 +08:00
ayangweb
e56d6b1b60 refactor: close the file upload port (#769) 2025-07-18 10:45:05 +08:00
BiggerRain
941cf96a07 style: splash adapts to the width of mobile phones (#768)
* style: splash width style

* docs: update notes
2025-07-17 15:33:24 +08:00
SteveLauC
14fbf2ac5d refactor: do status code check before deserializing response (#767)
* refactor: do status code check before deserializing response

This commit adds a status code check to the following requests; only when
this check passes do we deserialize the JSON response body:

- get_connectors_by_server
- mcp_server_search
- datasource_search

A helper function `status_code_check(response, allowed_status_codes)`
is added to make refactoring easier.

* chore: release notes
2025-07-17 15:08:14 +08:00
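
A plausible shape for such a helper, sketched with `reqwest` for illustration (the actual signature and error type in Coco may differ):

```rust
use reqwest::{Response, StatusCode};

/// Fail early when the status code is not in the allowed list, so we
/// never try to deserialize an error body as the expected JSON type.
fn status_code_check(
    response: &Response,
    allowed_status_codes: &[StatusCode],
) -> Result<(), String> {
    let status = response.status();
    if allowed_status_codes.contains(&status) {
        Ok(())
    } else {
        Err(format!("unexpected HTTP status code: {status}"))
    }
}
```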
SteveLauC
494e2f0d8a chore: Coco app http request headers (#744)
Add the following HTTP headers when making HTTP requests:

- X-OS-NAME
- X-OS-VER
- X-OS-ARCH
- X-APP-NAME
- X-APP-VER
- X-APP-LANG
2025-07-17 11:31:19 +08:00
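
A sketch of how such headers could be assembled with `reqwest` (the values below are placeholders, not what Coco actually sends):

```rust
use reqwest::header::{HeaderMap, HeaderValue};

/// Build the OS/app metadata headers attached to every request.
fn app_headers() -> HeaderMap {
    let mut headers = HeaderMap::new();
    headers.insert("X-OS-NAME", HeaderValue::from_static("macos"));
    headers.insert("X-OS-VER", HeaderValue::from_static("15.5"));
    headers.insert("X-OS-ARCH", HeaderValue::from_static("aarch64"));
    headers.insert("X-APP-NAME", HeaderValue::from_static("coco"));
    headers.insert("X-APP-VER", HeaderValue::from_static("0.7.1"));
    headers.insert("X-APP-LANG", HeaderValue::from_static("en"));
    headers
}
```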
BiggerRain
e3a3849fa4 chore: search-chat components add formatUrl & think data & icons url (#765)
* chore: web components add formatUrl & think data

* chore: add headers

* chore: add

* chore: add server url

* docs: update notes

* chore: url

* docs: search chat docs
2025-07-17 09:22:23 +08:00
SteveLauC
0b5e31a476 chore(deps): bump the windows crate (#766)
This commit bumps the windows crate from "0.60.0" to "0.61.3"; it should
solve the CI issue that happened here[1]:

```text
error[E0277]: `DBOBJECT` doesn't implement `Debug`
     --> C:\Users\runneradmin\.cargo\registry\src\index.crates.io-1949cf8c6b5b557f\windows-0.60.0\src\Windows\Win32\System\Search\mod.rs:21828:5
      |
21826 | #[derive(Clone, Debug, PartialEq)]
      |                 ----- in this derive macro expansion
21827 | pub struct SSVARIANT_0_4 {
21828 |     pub dbobj: DBOBJECT,
      |     ^^^^^^^^^^^^^^^^^^^ the trait `Debug` is not implemented for `DBOBJECT`
      |
      = note: add `#[derive(Debug)]` to `DBOBJECT` or manually `impl Debug for DBOBJECT`
```

[1]: https://github.com/infinilabs/ci/actions/runs/16314479643/job/46076989290
2025-07-16 17:10:32 +08:00
SteveLauC
c8a723ed9d feat: file search for Windows (#762)
This commit implements the file search extension for Windows platforms using the [Windows Search](https://learn.microsoft.com/en-us/windows/win32/search/-search-3x-wds-qryidx-overview) functionality.

Something to note:

1. Searching by file content is not natively supported. Coco would search for all the columns (attributes/fields within the index) with this option:

```rust
        SearchBy::NameAndContents => {
            // Windows File Search does not support searching by file content.
            //
            // `CONTAINS('query_string')` would search all columns for `query_string`,
            // this is the closest solution we have.
            format!("((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))")
        }
```

2. Tests have been added, but they failed in our CI for unknown reasons, so I disabled them:

```rust
// Skip these tests in our CI, they fail with the following error 
// "SQL is invalid: "0x80041820""
// 
// I have no idea about the underlying root cause
#[cfg(all(test, not(ci)))]
mod test {
```

3. The Windows Search index is not real-time and can return obsolete results. Opening the returned documents could fail if the chosen file has been deleted or moved.
2025-07-16 09:11:53 +08:00
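
For orientation, the WHERE clause shown in point 1 is embedded in a Windows Search SQL statement against the SYSTEMINDEX catalog. A rough sketch under that assumption (not Coco's exact query):

```rust
/// Build a Windows Search SQL statement around the name-and-contents
/// filter from the commit message; `query_string` is assumed to have
/// been sanitized already.
fn build_windows_search_sql(query_string: &str) -> String {
    let filter = format!(
        "((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))"
    );
    format!("SELECT TOP 50 System.ItemPathDisplay FROM SYSTEMINDEX WHERE {filter}")
}
```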
ayangweb
aaf4bf2737 refactor: update the font icon link (#763) 2025-07-15 09:10:26 +08:00
BiggerRain
24b0123a61 docs: add deep wiki docs (#761) 2025-07-11 17:22:18 +08:00
ayangweb
e8bd970cdb refactor: updated the upload endpoint for attachments (#759) 2025-07-10 18:20:32 +08:00
ayangweb
dd3be3a819 refactor: refactored file icon retrieval logic (#757)
* refactor: refactored file icon retrieval logic

* update

* update

* update
2025-07-10 18:10:39 +08:00
Medcl
5b034c28ac chore: make optional fields optional (#758)
* chore: make optional fields optional

* chore: update docs
2025-07-10 18:06:05 +08:00
ayangweb
b17949fe29 refactor: enabling the upload file component (#755)
* refactor: enabling the upload file component

* update
2025-07-10 17:26:44 +08:00
SteveLauC
5d37420109 feat: tauri command get_file_icon() (#756) 2025-07-10 16:51:34 +08:00
ayangweb
1d3ceb0c70 refactor: remove speech-to-text shortcuts (#754) 2025-07-10 13:58:37 +08:00
BiggerRain
4d11afe18e chore: assistant params & styles (#753)
* chore: add

* chore: add

* chore: assistant params & styles

* docs: update notes
2025-07-10 11:47:10 +08:00
SteveLauC
0c0291c8c0 chore: rename QuickLink/quick_link to Quicklink/quicklink (#752)
* chore: rename QuickLink/quick_link to Quicklink/quicklink

Standardize variable naming to match the correct term: "Quicklink" and "quicklink".
This updates all incorrect variants such as "QuickLink" and "quick_link".

* chore: release notes
2025-07-10 10:18:57 +08:00
ayangweb
cca672b2cb feat: text to speech now powered by LLM (#750)
* feat: support text to speech

* chore: receive bytes stream

* chore: update testing code

* feat: mp3 play

* update

* docs: update changelog

* update

* update

* update

---------

Co-authored-by: medcl <m@medcl.net>
Co-authored-by: rain9 <15911122312@163.com>
2025-07-10 10:16:51 +08:00
BiggerRain
5b27488402 refactor: adjusted assistant, datasource, mcp_server interface parameters (#746)
* chore: handle mcp interface parameters

* docs: update notes

* chore: remove code

* chore: assistant params

* fix: assistant params

* docs: update notes
2025-07-10 09:48:42 +08:00
SteveLauC
c1c4e0db7b chore: bump dep applications-rs (#751)
* chore: bump dep applications-rs

Currently Coco depends on atty v0.2.14, a crate with a known
[vulnerability](https://github.com/infinilabs/coco-app/security/dependabot/25);
here is the dependency chain:

```
coco -> applications-rs -> freedesktop-file-parser 0.1.0 -> atty 0.2.14
```

I bumped the [`freedesktop-file-parser`](7bdb070e45)
crate in our applications-rs crate, which would eliminate the `atty` crate
from the chain to fix the vulnerability.

This commit bumps the applications-rs crate to include the above change.

* chore: release notes
2025-07-09 18:52:17 +08:00
ayangweb
074a7c8b0a fix: prevent window from hiding when moved on Windows (#748)
* fix: prevent window from hiding when moved on Windows

* docs: update changelog

* update
2025-07-09 16:30:41 +08:00
SteveLauC
bc524e19db refactor: adjust extension code hierarchy (#747)
* refactor: adjust extension code hierarchy

In this commit, I refactored the extension code structure.

* We can only install third-party extensions, so the `store.rs` file should
  belong to the `third_party` directory.

* Move the tauri command `uninstall_extension()` from `third_party.rs` to
  `extension/mod.rs`, since an extension can be uninstalled regardless of
  how it was installed.

* Refactor the `install_extension_from_store()` function, add more
  descriptive code comments.

Also, a trivial change, bump Rust toolchain and edition to use the
[let-chains](https://blog.rust-lang.org/2025/06/26/Rust-1.88.0/#let-chains) syntax.

* chore: release notes
2025-07-09 16:28:59 +08:00
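
A small illustration of the let-chains syntax mentioned above (generic example, unrelated to Coco's code):

```rust
// With let-chains (Rust 1.88, edition 2024), an `if let` pattern and a
// boolean condition can be combined without nesting.
fn first_enabled(extensions: &[(&str, bool)]) -> Option<String> {
    if let Some((id, enabled)) = extensions.first()
        && *enabled
    {
        return Some(id.to_string());
    }
    None
}
```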
SteveLauC
05f70d26d9 chore: replace meval-rs with our fork to clear dep warning (#745)
* chore: replace meval-rs with our fork to clear dep warning

This commit replaces the meval-rs dependency with our
[fork](https://github.com/infinilabs/meval-rs). The original meval-rs
crate has not been maintained for a long time and uses nom 1.0, a crate
that was released 9 years ago and will be rejected by a future Rust
compiler because it contains outdated Rust syntax. This is why we are
seeing the following warning:

```
warning: the following packages contain code that will be rejected by a future version of Rust: nom v1.2.4
note: to see what the problems were, use the option `--future-incompat-report`, or run `cargo report future-incompatibilities --id 1
```

Switching to our fork would solve this warning.

* chore: release notes
2025-07-08 15:39:58 +08:00
SteveLauC
ab26dc7c6a fix(file search): searching by name&content does not search file name (#743)
* fix(file search): searching by name&content does not search file name

* release note
2025-07-08 09:21:43 +08:00
BiggerRain
6ff6b46139 refactor: create chat & send chat api (#739)
* chore: code format

* fix: build error

* refactor: chat create & chat

* chore: aa

* chore: aa

* refactor: send chat messages

* chore: chat

* chore: web

* chore: add

* docs: update notes
2025-07-07 19:41:29 +08:00
SteveLauC
119fd87a25 fix(file search): apply filters before from/size parameters (#741) 2025-07-07 19:40:46 +08:00
SteveLauC
de226a8fa4 ci: compile-check rust code & run rust tests when Rust code changes (#742)
Run some basic Rust checks in our CI iff rust code changes
2025-07-07 18:14:25 +08:00
SteveLauC
6865957725 chore: icon support for more file types (#740)
This PR adds icon support for more types of files; see the code for the full file type list.

Co-authored-by: ayang <473033518@qq.com>
2025-07-02 16:27:44 +08:00
SteveLauC
87818d69ed refactor: change File Search ext type to extension (#738)
* refactor: change File Search ext type to extension

* chore: release notes
2025-07-02 10:45:54 +08:00
SteveLauC
38b67d01b8 refactor: prioritize stat(2) when checking if a file is dir (#737)
* refactor: prioritize stat(2) when checking if a file is dir

* chore: release notes
2025-07-02 10:00:33 +08:00
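
In Rust terms, "prioritize stat(2)" roughly means using `std::fs::metadata` (which follows symlinks) rather than `symlink_metadata` (lstat) for the directory check. A minimal sketch of that idea (not Coco's actual code):

```rust
use std::fs;
use std::path::Path;

/// `fs::metadata` performs a stat(2), so a symlink that points at a
/// directory is reported as a directory; `fs::symlink_metadata` (lstat)
/// would describe the link itself instead.
fn is_dir(path: &Path) -> bool {
    fs::metadata(path).map(|meta| meta.is_dir()).unwrap_or(false)
}
```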
ayangweb
a4f4a24730 feat: voice input support in both search and chat modes (#732)
* feat: voice input support in both search and chat modes

* docs: update changelog

* update

* update

* update

* update
2025-07-02 09:35:16 +08:00
BiggerRain
87bd3d020f fix: build error (#736) 2025-07-02 07:03:09 +08:00
SteveLauC
825ac5d565 feat: file search using spotlight (#705)
Co-authored-by: ayang <473033518@qq.com>
2025-07-01 19:19:16 +08:00
BiggerRain
f21a35e15d fix: update information storage cache and styles (#735) 2025-07-01 15:46:37 +08:00
BiggerRain
6e90b28204 style: extension icon styles (#734) 2025-07-01 13:44:44 +08:00
Hardy
e92e5e5158 chore: typo step name and env (#731)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:40:26 +08:00
Hardy
2ac81566c6 Fix run shell (#730)
* fix: windows platform run with shell

* chore: add rust target

* fix: fix app version and release body

* chore: update step id

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:29:52 +08:00
Hardy
b004670dec fix: windows platform run with shell (#729)
* fix: windows platform run with shell

* chore: add rust target

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 14:11:51 +08:00
Hardy
a426e33e6b fix: feature dependency local path (#728)
* fix: feature dependency local path

* chore: use build args from env

* chore: remove no use step

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 13:16:54 +08:00
Hardy
bb7dd6bf7c fix: build error on windows platform with cargo add git repo (#727)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-30 12:13:46 +08:00
BiggerRain
37c5f2de24 fix: tray not on display (#726) 2025-06-30 10:53:40 +08:00
SteveLauC
ab6c25fe96 chore: release notes for 0.6.0 (#725) 2025-06-29 17:42:32 +08:00
BiggerRain
1fb464df09 fix: open extension store display (#724) 2025-06-29 17:38:14 +08:00
SteveLauC
65aa75043f chore: bump version number to 0.6.0 (#723) 2025-06-29 17:08:06 +08:00
BiggerRain
79dcc7b4ec fix: text display error (#722)
* fix: text display error

* fix: text display error

* fix: select extension display install

* fix: select extension display install
2025-06-29 16:47:02 +08:00
BiggerRain
3d29cfe235 chore: rebuild index position (#721) 2025-06-29 15:53:43 +08:00
BiggerRain
aea3a7ba98 chore: rebuild index (#720)
* chore: rebuild index

* chore: rebuild index
2025-06-29 15:39:01 +08:00
BiggerRain
190dfc6ecd chore: adjust styles and add button reindex (#719)
* chore: adjust styles and add button reindex

* docs: update notes

* style: remove margin bottom
2025-06-29 13:32:07 +08:00
SteveLauC
316a7940d6 chore: log command execution results (#718)
* chore: log command execution results

* release note
2025-06-29 10:46:47 +08:00
SteveLauC
acfc1bb32d feat: interface reindex_applications() (#704)
* feat: impl re-indexing applications

* drop pizza engine
2025-06-29 10:27:02 +08:00
ayangweb
c4d178dc2d feat: support back navigation via delete key (#717)
* feat: support back navigation via delete key

* docs: update changelog
2025-06-27 19:17:27 +08:00
ayangweb
6333c697d5 refactor: support large preview for extensions (#716) 2025-06-27 17:30:49 +08:00
ayangweb
810541494f refactor: update extension detail page ui (#715) 2025-06-27 15:07:34 +08:00
ayangweb
e45dc2acbe fix: context menu search not working (#713) 2025-06-27 14:18:54 +08:00
ayangweb
2d1ccb9744 refactor: improve layout of the extension list (#714) 2025-06-27 14:18:32 +08:00
SteveLauC
406f3b31e9 chore: change extension store request URL to default coco server (#712) 2025-06-27 10:40:12 +08:00
ayangweb
f51dd81014 refactor: optimized some issues with extensions (#711) 2025-06-27 10:22:51 +08:00
SteveLauC
3b38cbfb6c chore: update category name and icon (#710) 2025-06-27 10:16:27 +08:00
ayangweb
a4483ba277 fix: some input fields couldn’t accept spaces (#709)
* fix: some input fields couldn’t accept spaces

* docs: update changelog

* update
2025-06-27 10:16:02 +08:00
ayangweb
bf46979b80 refactor: remove special character filtering and clean up related code (#708) 2025-06-27 10:08:33 +08:00
ayangweb
070f171ad4 refactor: update context menu color for the delete action (#707) 2025-06-27 09:43:35 +08:00
ayangweb
3180704a0d refactor: show all extensions by default in the extension store (#706) 2025-06-27 09:36:20 +08:00
SteveLauC
b3f68697ce feat: impl extension store (#699)
Implements extension store so that users can install extensions from a GUI interface


---------

Co-authored-by: ayang <473033518@qq.com>
2025-06-26 18:40:33 +08:00
BiggerRain
69d2b4b834 chore: add message for latest version check (#703)
* chore: add message for latest version check

* docs: update notes
2025-06-25 10:38:38 +08:00
BiggerRain
6837286061 feat: add manual check for updates (#701)
* feat: add check for update

* feat: add Check for Updates

* docs: update notes

* build: build bundle test

* docs: update notes

* chore: recovering files
2025-06-19 20:58:54 +08:00
ayangweb
a431ead22a feat: support Tab and Enter for delete dialog buttons (#700)
* feat: support `Tab` and `Enter` for delete dialog buttons

* docs: update changelog

* refactor: update
2025-06-19 08:59:01 +08:00
ayangweb
7ec41dfe80 refactor: request data when service is available (#698) 2025-06-18 15:47:49 +08:00
ayangweb
06053e9fd9 refactor: getting service info only when a profile is available (#697)
* refactor: getting service info only when a profile is available

* refactor: update
2025-06-18 14:47:21 +08:00
Medcl
70b048fba3 fix: take coco server back on refresh (#696)
* fix: take coco server back on refresh

* chore: update release notes:
2025-06-18 13:33:59 +08:00
ayangweb
45083f829b refactor: optimized the style of the drop-down selection box (#695)
* refactor: optimized the style of the drop-down selection box

* refactor: update
2025-06-17 18:15:40 +08:00
SteveLauC
e4f6fb8e98 fix: toggle extension should register/unregister hotkey (#691) 2025-06-17 16:56:06 +08:00
BiggerRain
ee182b22da chore: keeping windows and documents safe (#694) 2025-06-17 15:39:18 +08:00
BiggerRain
a37e22c227 fix: quick ai state synchronous (#693)
* fix: quick ai state synchronous

* docs: update notes
2025-06-17 15:38:39 +08:00
BiggerRain
d75ab1018d chore: improve server list selection with enter key (#692)
* chore: server list enter selected

* docs: update notes

* chore: remove log
2025-06-17 09:36:04 +08:00
Medcl
40ad066e69 refactor: refactoring search api (#679)
* refactor: refactoring search api

* chore: interface type

* chore: interface type

* refactor: assistant search

* refactor: arrays into multiple fields

* refactor: update

* feat: search to add fuzziness to 5

* refactor: update

* chore: update release notes

---------

Co-authored-by: rain9 <15911122312@163.com>
Co-authored-by: ayang <473033518@qq.com>
Co-authored-by: ayangweb <75017711+ayangweb@users.noreply.github.com>
2025-06-17 09:31:43 +08:00
BiggerRain
a2a5a9f8fe chore: continue to chat page display (#690)
* chore: Continue to chat page display

* docs: update notes
2025-06-16 18:02:47 +08:00
SteveLauC
5fd9339e56 refactor: use author/ext_id as extension unique identifier (#643)
* refactor: use author/ext_id as extension unique identifier

* refactor: refactoring extended component interfaces and logic

* refactor: update

* style: remove console

* refactor: update

* drop pizza engine

* refactor: restore hotkey upon start no matter if the ext is enabled or not

* chore: release note

---------

Co-authored-by: ayang <473033518@qq.com>
2025-06-16 10:52:01 +08:00
Hardy
a8a9208b1f fix: no make target with project (#689)
* fix: no make with project

* chore: set working directory

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-13 22:17:37 +08:00
medcl
8c9a2ff441 v0.5.0 2025-06-13 19:28:38 +08:00
Medcl
2251b0af95 chore: update release notes (#687) 2025-06-13 18:37:47 +08:00
BiggerRain
560a12ab93 fix: search & chat display (#686) 2025-06-13 18:18:46 +08:00
ayangweb
2ff66c0b91 fix: arrow inserting escape sequences (#683)
* fix: arrow inserting escape sequences

* fix build

* docs: update changelog

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-06-13 18:06:21 +08:00
ayangweb
ef4a184233 refactor: optimize the operation of the small assistant on the secondary page (#685)
* refactor: optimize the operation of the small assistant on the secondary page

* refactor: update
2025-06-13 16:13:31 +08:00
ayangweb
8422bc03e7 refactor: optimize the timing of arrow key triggers on secondary pages (#684) 2025-06-13 15:52:20 +08:00
BiggerRain
370113129c fix: web component start page (#681) 2025-06-13 15:17:52 +08:00
ayangweb
cb758ef452 feat: context menu support for secondary pages (#680)
* feat: context menu support for secondary pages

* docs: update changelog
2025-06-13 15:07:05 +08:00
ayangweb
12b9b4bb81 refactor: blocking the default behavior of the tab key (#678)
* refactor: blocking the default behavior of the tab key

* refactor: update

* refactor: update

* refactor: update
2025-06-13 14:19:27 +08:00
BiggerRain
562db19f16 fix: filter services for unlogged-in users (#677) 2025-06-13 11:04:58 +08:00
ayangweb
dc5cd9aecb fix: fix problem with up and down key indexing (#676)
* fix: fix problem with up and down key indexing

* refactor: update

* docs: update changelog
2025-06-13 10:39:27 +08:00
BiggerRain
0b018cd24f chore: search & deep think & mcp (#675)
* fix: keep line breaks

* chore: search & deep think & mcp
2025-06-12 22:06:48 +08:00
BiggerRain
2ed22d3d7c fix: keep line breaks (#674) 2025-06-12 18:20:44 +08:00
BiggerRain
4ce9561eb7 style: safari styles (#673) 2025-06-12 14:50:02 +08:00
BiggerRain
3aeb39b3af refactor: optimize global state synchronization (#672)
* refactor: optimize global state synchronization

* refactor: reconstruct the language change processing logic

---------

Co-authored-by: ayang <473033518@qq.com>
2025-06-12 14:45:33 +08:00
BiggerRain
27e99d4629 fix: web assistant list (#671) 2025-06-12 11:28:10 +08:00
ayangweb
df70276a54 refactor: ai assistant hides the copy menu (#670)
* refactor: ai assistant hides the copy menu

* style: remove console
2025-06-12 10:39:37 +08:00
BiggerRain
6553a8f5d3 chore: add special character filtering (#668)
* chore: add special character filtering

* docs: update notes
2025-06-12 10:31:15 +08:00
ayangweb
4ebbc9ec6e refactor: improved ai overview and ai quick access blank issue (#669)
* refactor: improved ai overview and ai quick access blank issue

* refactor: update
2025-06-12 10:30:41 +08:00
BiggerRain
4208633556 fix: Fix Special Character input (#667) 2025-06-11 17:50:39 +08:00
ayangweb
fc43fbe798 refactor: improve AI assistant interaction logic and Tab key handling (#666)
* refactor: improve AI assistant interaction logic and Tab key handling

* refactor: update

* style: remove
2025-06-11 17:49:05 +08:00
ayangweb
b5bb9105d4 refactor: re-enable the service to get a list of assistants (#665) 2025-06-11 16:28:53 +08:00
BiggerRain
b6ebd6e5f8 fix: web component display (#663)
* fix: web component display

* fix: web component display

* fix: add showChatHistory & connected

* fix: add isCurrentLogin

* chore: add history
2025-06-11 16:28:43 +08:00
ayangweb
22216491b6 refactor: dynamically generated copy button id (#664) 2025-06-11 15:52:49 +08:00
ayangweb
44ca66259c refactor: don't hide pinned window on search result open (#662)
* refactor: don't hide pinned window on search result open

* refactor: update
2025-06-11 15:26:06 +08:00
ayangweb
be3cae36e2 fix: number keys not following settings (#661)
* fix: number keys not following settings

* refactor: remove unused `modifierKey` dependencies

* docs: update changelog
2025-06-11 14:15:32 +08:00
ayangweb
35ea30626f refactor: improve tooltip display in chinese (#660) 2025-06-11 14:01:08 +08:00
BiggerRain
4bcae5cffb fix: delete history (#659) 2025-06-11 13:36:21 +08:00
BiggerRain
76458db8ab chore: remove enter disabled (#658) 2025-06-11 12:10:37 +08:00
BiggerRain
5b41e190d3 chore: add i18n to services (#657) 2025-06-11 11:03:42 +08:00
ayangweb
43ac9a054c refactor: remove the behavior that organizes event bubbling (#656) 2025-06-11 10:14:05 +08:00
BiggerRain
ac485a32cc style: user message styles (#655) 2025-06-10 19:25:54 +08:00
ayangweb
e10908a095 refactor: optimize the timing of the enter key (#654)
* refactor: optimize the timing of the enter key

* fix: remove input element

---------

Co-authored-by: rain <15911122312@163.com>
2025-06-10 19:01:25 +08:00
BiggerRain
78b8908ac8 fix: stop event bubbling (#653) 2025-06-10 18:22:54 +08:00
ayangweb
3c54cb84a8 refactor: filter unavailable servers (#652) 2025-06-10 17:37:57 +08:00
ayangweb
8ed808c591 fix: fix the problem of local path not opening (#650)
* fix: fix the problem of local path not opening

* docs: update changelog

* chore: remove pizza-engine
2025-06-10 17:26:19 +08:00
ayangweb
7a2dde7448 refactor: check if the message block is purely blank (#651) 2025-06-10 17:22:13 +08:00
BiggerRain
65451fc63e style: user message line break (#648) 2025-06-10 15:41:08 +08:00
BiggerRain
5d108a46d3 style: differentiate between hover and selected styles (#649) 2025-06-10 15:37:17 +08:00
BiggerRain
f9567c2d46 chore: remove default current service (#647) 2025-06-10 14:54:07 +08:00
BiggerRain
da917e6012 fix: web page unmount event (#645)
* fix: web page unmount event

* docs: update notes
2025-06-10 14:28:00 +08:00
ayangweb
335a906674 refactor: refactoring shortcut reset logic and optimizing UI interactions (#646) 2025-06-10 14:27:23 +08:00
ayangweb
a50a636d59 fix: input lost when reopening dialog after search (#644)
* fix: input lost when reopening dialog after search

* docs: update changelog
2025-06-10 11:45:45 +08:00
ayangweb
2dd3f776e6 fix: arrow keys still navigated search when menu opened with Cmd+K (#642)
* fix: arrow keys still navigated search when menu opened with `Cmd+K`

* docs: update changelog
2025-06-10 09:56:27 +08:00
BiggerRain
40f6aa0ccd chore: copy supports http protocol (#639)
* chore: copy supports http protocol

* docs: update notes
2025-06-09 18:12:43 +08:00
ayangweb
4da9e024e0 refactor: update login status when service is not enabled (#638) 2025-06-09 18:11:35 +08:00
ayangweb
c20bba51f5 fix: tab key hides window in chat mode (#641)
* fix: tab key hides window in chat mode

* docs: update changelog
2025-06-09 18:10:56 +08:00
BiggerRain
0a62a2095b fix: add shift line break to chat input (#637) 2025-06-09 15:06:59 +08:00
SteveLauC
5677995185 chore: more logs for the setup process (#634)
* chore: more logs for the setup process

* chore: more logs for the setup process

* chore: more logs for the setup process

* chore: release note
2025-06-09 14:46:06 +08:00
BiggerRain
ec4e5e7d1d fix: remove stopImmediatePropagation event (#636) 2025-06-09 12:05:27 +08:00
BiggerRain
1df5265b1a chore: add onContextMenu event (#629) 2025-06-09 11:57:48 +08:00
ayangweb
fb8a4684dc refactor: improved page content after disabling the service (#635)
* refactor: improved page content after disabling the service

* style: remove useless code

* style: remove useless code
2025-06-09 11:54:44 +08:00
BiggerRain
0b609e570d chore: web component default mode (#627) 2025-06-09 09:54:09 +08:00
BiggerRain
f91f6bdc17 fix: web component set IsDark (#630) 2025-06-07 10:49:16 +08:00
ayangweb
57590f3b57 feat: add internationalized translations of AI-related extensions (#632)
* feat: add internationalized translations of AI-related extensions

* docs: update changelog

* refactor: update
2025-06-07 10:48:55 +08:00
ayangweb
c18f9ea154 refactor: optimized input box logic for transparency (#628) 2025-06-06 17:58:18 +08:00
ayangweb
441875d9b4 refactor: optimize data filtering logic (#626) 2025-06-06 17:20:45 +08:00
ayangweb
eddf9075bb feat: add ai overview minimum number of search results configuration (#625)
* feat: add ai overview minimum number of search results configuration

* docs: update changelog

* style: remove useless code
2025-06-06 17:05:20 +08:00
ayangweb
9eac8f8a8e feat: support right-click actions after text selection (#624)
* feat: support right-click actions after text selection

* docs: update changelog

* feat: support for selecting messages sent by users
2025-06-06 16:43:27 +08:00
ayangweb
515260c43f feat: calculator extension add description (#623)
* feat: calculator extension add description

* docs: update changelog
2025-06-06 15:43:24 +08:00
ayangweb
118de0e80b fix: fix ai overview hidden height before message (#622)
* fix: fix ai overview hidden height before message

* docs: update changelog
2025-06-06 15:30:42 +08:00
SteveLauC
19ce896fdc chore: release note for PR 620 (#621) 2025-06-06 15:17:59 +08:00
SteveLauC
4a41ea5d8b fix: invalid DSL error if input contains multiple lines (#620) 2025-06-06 14:58:45 +08:00
ayangweb
880e1206ce fix: fixed modifier keys not working with continue chat (#619)
* fix: fixed modifier keys not working with continue chat

* docs: update changelog
2025-06-06 14:24:36 +08:00
SteveLauC
1e6d9f9550 fix: do not panic when the datasource specified does not exist (#618)
* fix: do not panic when the datasource specified does not exist

* release note
2025-06-06 14:07:27 +08:00
BiggerRain
ff0faf425f fix: only select history and then set the assistant (#617)
* fix: only select history and then set the assistant

* fix: only select history and then set the assistant
2025-06-06 14:06:49 +08:00
ayangweb
1fbf5d6552 fix: resolved an issue where number keys were not working on the web (#616)
* fix: resolved an issue where number keys were not working on the web

* docs: update changelog
2025-06-06 11:47:38 +08:00
ayangweb
db41e817c3 feat: add key monitoring during reset (#615)
* feat: add key monitoring during reset

* docs: update changelog
2025-06-06 11:23:40 +08:00
BiggerRain
1296755bc5 fix: datasource and mcp data updates (#614) 2025-06-06 11:11:33 +08:00
ayangweb
d410f20864 refactor: remove footer from standalone history window (#613) 2025-06-06 11:11:06 +08:00
ayangweb
61d0a3b79a fix: fix chat log update and sorting issues (#612)
* fix: fix chat log update and sorting issues

* docs: update changelog
2025-06-06 10:52:47 +08:00
BiggerRain
b24319b649 fix: datasource refresh status feedback (#611) 2025-06-06 10:51:31 +08:00
BiggerRain
3c0fb24548 fix: shortcut key prompts cannot be hidden (#610) 2025-06-06 10:51:09 +08:00
BiggerRain
2fcbed0381 fix: i18n is not accurate (#609) 2025-06-06 10:50:36 +08:00
SteveLauC
7444347e0c docs: new doc for macOS (#608) 2025-06-05 19:23:14 +08:00
SteveLauC
725ce042de docs: remove the hyperlink in title (#607) 2025-06-05 18:26:09 +08:00
BiggerRain
3b67de5387 chore: initialize current assistant from history (#606)
* chore: the last assistant in history is set as current

* docs: update notes

* docs: update notes
2025-06-05 08:54:39 +08:00
SteveLauC
9b53a026ff refactor: execute Calculator/Extension search() in spawn_blocking (#601) 2025-06-04 18:45:17 +08:00
ayangweb
9ea7dbf3aa fix: resolve regex error on older macOS versions (#605)
* fix: resolve regex error on older macOS versions

* docs: update changelog

* style: remove useless code

* style: remove useless code
2025-06-04 18:38:34 +08:00
BiggerRain
55622911ac style: Switch selected color in dark mode (#604) 2025-06-04 14:10:17 +08:00
BiggerRain
92f78ad08c fix: new chat assistant id not found (#603)
* fix: new chat assistant id

* docs: update notes
2025-06-04 13:06:30 +08:00
ayangweb
f690dbaab2 refactor: web use the default icon for now (#602) 2025-06-04 11:30:59 +08:00
ayangweb
210efe763d fix: fixed issue with incorrect login status (#600)
* fix: fixed issue with incorrect login status

* style: remove useless code

* fix: user avatar error

* refactor: replace with default svg icon

* style: remove useless code

* docs: update changelog

---------

Co-authored-by: rain <15911122312@163.com>
2025-06-04 10:24:56 +08:00
BiggerRain
f23498afa0 fix: web icon isAbsolute (#599) 2025-06-03 19:28:26 +08:00
BiggerRain
a80a5d928f fix: app icon load console error (#598) 2025-06-03 15:47:58 +08:00
ayangweb
b733bb5516 feat: ai overview support is enabled with shortcut (#597)
* feat: ai overview support is enabled with shortcut

* docs: update changelog
2025-06-03 15:01:29 +08:00
ayangweb
5046754534 refactor: optimized loading of font icons on the web side (#596)
* refactor: optimized loading of font icons on the web side

* refactor: update
2025-06-03 11:22:22 +08:00
SteveLauC
f557f7e780 chore: set log level to coco_lib=trace for built Coco app (#595) 2025-06-03 11:18:28 +08:00
BiggerRain
18feb2d690 fix: set chat message assistant (#594) 2025-06-03 10:53:01 +08:00
BiggerRain
af59f2fe9f fix: web component removes redundant parameters (#593) 2025-06-03 10:35:26 +08:00
BiggerRain
5e1bb54d5e chore: web component adds variable process (#592) 2025-06-03 10:12:22 +08:00
Hardy
33fa516aad fix: rustup for i686 (#590)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-06-01 07:19:28 +08:00
Hardy
d2c1cf513d chore: use version fix (#591)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 20:22:53 +08:00
Hardy
f81bec8403 chore: rollback publish (#589)
* chore: rollback publish

* chore: set toolchain

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 16:29:08 +08:00
medcl
cce956ac15 v0.5.2 2025-05-31 16:06:12 +08:00
Hardy
0d1174c8dd chore: fix ci publish error (#588)
* chore: fix ci publish error

* docs: update release notes

---------

Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 16:05:28 +08:00
ayangweb
e0258dc2fa fix: fixed issue with quick ai access making multiple requests at once (#586) 2025-05-31 15:56:35 +08:00
medcl
310a70838b v0.5.1 2025-05-31 15:55:33 +08:00
Hardy
94d7f809d2 chore: add ssh private key for pizza engine (#587)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-31 15:51:20 +08:00
medcl
e1d1bc2684 v0.5.0 2025-05-31 15:01:02 +08:00
Medcl
a9e3bb3eee chore: ignore throttle message (#585) 2025-05-31 11:07:01 +08:00
Medcl
d184851e3b chore: remove icon field before ask ai (#584) 2025-05-31 10:03:19 +08:00
BiggerRain
c9b785ccf3 fix: sent chat once more (#583) 2025-05-31 08:53:37 +08:00
Medcl
4c5ae8c718 chore: update error handling (#582)
* chore: update error handling

* chore: update min osx version
2025-05-31 08:50:27 +08:00
Hardy
8a7f7bc708 chore: add pizza feature for release (#581)
Co-authored-by: hardy <luohf@infinilabs.com>
2025-05-30 22:28:44 +08:00
ayangweb
3d44d10048 refactor: remove unused disabledExtensions related code (#580) 2025-05-30 19:41:51 +08:00
BiggerRain
97d880ea27 fix: useScript error (#579) 2025-05-30 19:41:29 +08:00
Medcl
6c53056edd chore: update default coco server (#578) 2025-05-30 19:27:41 +08:00
ayangweb
a6fd2ebd16 fix: fix web carriage return not jumping (#577) 2025-05-30 18:41:58 +08:00
SteveLauC
b509176572 fix: make extension search source respect parameter datasource (#576) 2025-05-30 18:39:09 +08:00
ayangweb
17f2bcf7a8 fix: fix the problem that web cannot click on the jump (#575) 2025-05-30 18:22:18 +08:00
ayangweb
c471a83821 feat: support third party extensions (#572)
* refactor: support third party extensions

* fix tests

* fix: assistant_get error

* aaa

* bbb

* ccc

* ddd

* fix: aa

* fix: aa

* sss

* fix:asds

* eee

* refactor: loosen restriction of query string length

* fix: input auto

* feat: add ai overview trigger condition configuration

* refactor: continue chatting to select the corresponding mini-helper

* chore: settings width height

* aaa

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
Co-authored-by: rain <15911122312@163.com>
2025-05-30 17:18:52 +08:00
SteveLauC
51b0a2a545 refactor: remove thread app list synchronizer as it leaks memory on macOS (#573) 2025-05-29 17:55:24 +08:00
BiggerRain
baded2af1e refactor: search result related components (#571)
* refactor: search result related components

* refactor: search result related components

* docs: update notes

* refactor: search result related components

* fix: ArrowLeft error

* chore: remove log

* fix: ask ai
2025-05-29 16:01:52 +08:00
BiggerRain
2b21426355 refactor: input box related components (#568)
* refactor: input box components

* chore: change variable name

* docs: update notes

* fix: shortcut key failure issue
2025-05-28 12:29:28 +08:00
BiggerRain
8edc938426 chore: only show available servers in chat (#570)
* chore: add server available

* docs: update notes

* docs: update notes
2025-05-28 10:51:25 +08:00
Medcl
fa919bee11 chore: mark unavailable server to offline on refresh info (#569)
* chore: mark server offline on refresh info

* chore: update release notes
2025-05-28 10:43:53 +08:00
Medcl
50f1e611c3 refactor: refactoring rerank feature (#567)
* refactor: refactoring rerank feature

* chore: remove unused code

* chore: pull back unrelated changes
2025-05-27 18:27:53 +08:00
BiggerRain
4c3cf28012 chore: assistant chat placeholder & refactor input box components (#566)
* chore: input placeholder

* chore: add assistant

* impl assistant_get_multi()

* chore: add assistant

* refactor: input box components

* chore: ask ai search placeholder

* chore: ask ai search placeholder

* docs: update notes

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2025-05-27 16:29:43 +08:00
BiggerRain
89fcc67222 fix: assistant list (#563)
* fix: assistant list

* fix: assistant list

* fix: assistant list

* fix: assistant list
2025-05-27 09:24:58 +08:00
Medcl
33c9ce67df chore: remove pizza deps (#565) 2025-05-27 09:09:17 +08:00
SteveLauC
c6dadfd83e ci: deny dep pizza-engine (#564)
* ci: deny dep pizza-engine

* ci: set PWD to cargo workspace
2025-05-27 08:59:46 +08:00
Medcl
e707a8b5c7 chore: rerank support ignore case (#562)
* chore: rerank support ignore case

* chore: remove unused deps
2025-05-26 19:24:01 +08:00
BiggerRain
5c5364974a chore: web component start page config (#560)
* chore: web component start page config

* chore: web component start page config

* docs: update notes
2025-05-26 18:54:33 +08:00
Medcl
9d3e3e8dde feat: rerank search results (#561)
* feat: rerank search results

* chore: update release notes
2025-05-26 18:54:06 +08:00
BiggerRain
e065ba749f chore: assistant keyboard events and mouse events (#559)
* chore: assistant keyboard events and mouse events

* docs: update notes
2025-05-26 15:44:05 +08:00
ayangweb
2dd8e3160c fix: resolved navigation error on continue chat action (#558)
* fix: resolved navigation error on continue chat action

* docs: update changelog
2025-05-26 10:56:29 +08:00
ayangweb
6aeecfe3ac feat: add quick AI access to search mode (#556)
* feat: add quick AI access to search mode

* feat: add aI assistant quick access

* refactor: adjusting lodash-es import location to optimize code structure

* docs: update changelog

* fix: fix the logic of assigning serverId in AskAi component

* refactor: optimized layout

* refactor: optimized some issues
2025-05-23 18:14:41 +08:00
SteveLauC
334e29d69b chore: add make cmd dev-build-with-pizza (#555) 2025-05-23 16:43:38 +08:00
BiggerRain
382f89ace0 fix: independent chat app has no datasources (#554)
* fix: independent chat window has no data

* docs: update notes
2025-05-23 16:42:35 +08:00
BiggerRain
32c7cc5060 fix: suggestion list position (#553)
* fix: suggestion List position

* docs: update notes
2025-05-23 15:31:27 +08:00
BiggerRain
c13151d69e fix: the scroll button is not displayed by default (#552)
* fix: the scroll button is not displayed by default

* docs: update notes
2025-05-23 14:53:57 +08:00
BiggerRain
07c4ab03b5 fix: secondary page cannot be searched (#551)
* fix: secondary page cannot be searched

* docs: update notes
2025-05-22 19:45:28 +08:00
BiggerRain
cf3f2affa5 fix: history list height (#550)
* fix: history list height

* docs: update notes
2025-05-22 16:28:11 +08:00
BiggerRain
401832ad43 chore: logout update server profile (#549)
* chore: logout update server profile

* docs: update notes
2025-05-22 11:53:23 +08:00
Medcl
6a6f48d2fc chore: mark server offline on user logout (#546)
* chore: mark server offline on user logout

* update release notes
2025-05-22 11:37:20 +08:00
BiggerRain
8a6c90d124 chore: add global login judgment (#544)
* chore: add global login judgment

* docs: update notes
2025-05-22 10:59:46 +08:00
BiggerRain
34acecbcb0 chore: add assistant count (#542)
* fix: switch server assistant and session session unchanged

* docs: update notes

* fix: add server error

* chore: add assistant count

* docs: update notes
2025-05-21 15:29:04 +08:00
SteveLauC
4474212b7d chore: dead code cleanup (#543) 2025-05-21 14:40:38 +08:00
Medcl
1187b641d4 refactor: refactoring search error (#541)
* refactor: refactoring search error

* chore: update release notes
2025-05-21 14:27:17 +08:00
BiggerRain
ef8cd569e4 fix: switch server assistant and session session unchanged (#540)
* fix: switch server assistant and session session unchanged

* docs: update notes
2025-05-21 11:34:03 +08:00
BiggerRain
5ef06bfc95 fix: service switching error (#539)
* fix: service switching error

* build: build error

* chore: chat content can be copied

* docs: update notes

* fix: service switching error

* chore: change to send cancel event to ws_cancel

* chore: add ws-cancel

---------

Co-authored-by: medcl <m@medcl.net>
2025-05-21 09:04:57 +08:00
SteveLauC
2b59addb08 fix: panic when fetching app metadata on Windows (#538)
* fix: panic when fetching app metadata on Windows

* release note
2025-05-21 09:04:08 +08:00
BiggerRain
ee750620f2 refactor: service info related components (#537)
* refactor: service info related components

* docs: update notes

* refactor: chat header service status
2025-05-20 17:02:10 +08:00
Medcl
acc3b1a0d2 chore: skip register server that not logged in (#536)
* chore: update logging message

* chore: skip register server that not logged in

* chore: update logging message

* chore: update release notes
2025-05-20 15:10:27 +08:00
SteveLauC
4372747014 feat: dynamic log level via env var COCO_LOG (#535) 2025-05-20 12:54:07 +08:00
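
Assuming a `tracing-subscriber`-style setup (Coco's actual logging stack may differ), reading the filter from an environment variable typically looks like this sketch:

```rust
use tracing_subscriber::EnvFilter;

fn init_logging() {
    // e.g. `COCO_LOG=coco_lib=trace coco` to get trace-level logs.
    let filter = EnvFilter::try_from_env("COCO_LOG")
        .unwrap_or_else(|_| EnvFilter::new("coco_lib=info"));
    tracing_subscriber::fmt().with_env_filter(filter).init();
}
```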
BiggerRain
ee531209aa fix: server image loading failure (#534)
* fix: server image loading failure

* docs: update notes
2025-05-20 09:31:54 +08:00
BiggerRain
ee0bbce3e2 style: search error styles (#533)
* style: search error styles

* docs: update notes
2025-05-19 19:54:34 +08:00
SteveLauC
7eccf99f92 fix: do not pass whitespace-only strings to Calculator expr evaluation lib (#532) 2025-05-19 19:24:32 +08:00
SteveLauC
5044a98bb7 fix: app hotkey handler invoked twice (key pressed and released) (#531) 2025-05-19 18:40:44 +08:00
SteveLauC
72165812bf refactor: ignore the error happens while indexing a specific app (#530)
* refactor: ignore the error happens while indexing a specific app

* refactor: ignore the error happens while indexing a specific app
2025-05-19 17:28:13 +08:00
BiggerRain
f9c1be8517 fix: app icon & category icon (#529) 2025-05-19 17:24:51 +08:00
BiggerRain
71ce23ef21 style: history component styles (#528)
* style: history component styles

* docs: update notes

* build: build & publish web component version 1.2.1

* build: build & publish web component version 1.2.2
2025-05-19 16:56:00 +08:00
Medcl
3e6041cbd8 chore: update minimum macOS version to 10 (#527) 2025-05-18 15:06:06 +08:00
SteveLauC
0b9e158b55 fix: panic caused by an unwrap() (#526) 2025-05-17 18:44:17 +08:00
BiggerRain
688ced3fc3 build: build & publish web component (#524) 2025-05-17 16:53:17 +08:00
BiggerRain
16e0382a8b docs: update release notes (#525) 2025-05-17 16:52:26 +08:00
BiggerRain
91c9cd5725 fix: show only enabled datasource & MCP list (#523)
* fix: show only enabled datasource & MCP list

* docs: update notes

* fix: show only enabled datasource & MCP list
2025-05-17 12:01:18 +08:00
ayangweb
7f3e602bb3 feat: add a component for text reading aloud (#522)
* feat: add a component for text reading aloud

* docs: update changelog
2025-05-16 16:21:57 +08:00
BiggerRain
5e9d41ea5c fix: datasource & MCP list synchronization update (#521)
* fix: datasource & MCP list update

* docs: update notes

* docs:update notes
2025-05-16 15:09:51 +08:00
Medcl
8bdb93d813 refactor: refactoring icon component (#514)
* chore: try to fix icon for insecure-tls deployment

* chore: handling icon resource loading errors

* refactor: refactored icon component

* chore: update release notes

---------

Co-authored-by: rain <15911122312@163.com>
2025-05-16 12:03:43 +08:00
ayangweb
690e6a3225 refactor: optimizing list styles in markdown content (#520)
* refactor: optimizing list styles in markdown content

* docs: update changelog

* style: remove useless code
2025-05-16 10:21:41 +08:00
ayangweb
111d9bddca style: remove useless code (#519) 2025-05-16 09:17:41 +08:00
ayangweb
7645b3e736 feat: add AI summary component (#518)
* feat: add AI summary component

* docs: update changelog

* refactor: update
2025-05-15 18:27:17 +08:00
270 changed files with 19247 additions and 7116 deletions

View File

@@ -0,0 +1,18 @@
+name: Enforce no dependency pizza-engine
+on:
+  pull_request:
+jobs:
+  main:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v4
+      - name:
+        working-directory: ./src-tauri
+        run: |
+          # if cargo remove pizza-engine succeeds, then it is in our dependency list, fail the CI pipeline.
+          if cargo remove pizza-engine; then exit 1; fi

View File

@@ -9,10 +9,16 @@ on:
jobs:
  create-release:
    runs-on: ubuntu-latest
+    outputs:
+      APP_VERSION: ${{ steps.get-version.outputs.APP_VERSION }}
+      RELEASE_BODY: ${{ steps.get-changelog.outputs.RELEASE_BODY }}
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Set output
        id: vars
        run: echo "tag=${GITHUB_REF#refs/*/}" >> $GITHUB_OUTPUT
@@ -22,11 +28,28 @@ jobs:
        with:
          node-version: 20
+      - name: Get build version
+        shell: bash
+        id: get-version
+        run: |
+          PACKAGE_VERSION=$(jq -r '.version' package.json)
+          CARGO_VERSION=$(grep -m 1 '^version =' src-tauri/Cargo.toml | sed -E 's/.*"([^"]+)".*/\1/')
+          if [ "$PACKAGE_VERSION" != "$CARGO_VERSION" ]; then
+            echo "::error::Version mismatch!"
+          else
+            echo "Version match: $PACKAGE_VERSION"
+          fi
+          echo "APP_VERSION=$PACKAGE_VERSION" >> $GITHUB_OUTPUT
      - name: Generate changelog
-        id: create_release
-        run: npx changelogithub --draft --name ${{ steps.vars.outputs.tag }}
+        id: get-changelog
+        run: |
+          CHANGELOG_BODY=$(npx changelogithub --draft --name ${{ steps.vars.outputs.tag }})
+          echo "RELEASE_BODY<<EOF" >> $GITHUB_OUTPUT
+          echo "$CHANGELOG_BODY" >> $GITHUB_OUTPUT
+          echo "EOF" >> $GITHUB_OUTPUT
        env:
-          GITHUB_TOKEN: ${{ secrets.RELEASE_TOKEN }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  build-app:
    needs: create-release
@@ -52,11 +75,23 @@ jobs:
            target: "x86_64-unknown-linux-gnu"
          - platform: "ubuntu-22.04-arm"
            target: "aarch64-unknown-linux-gnu"
+    env:
+      APP_VERSION: ${{ needs.create-release.outputs.APP_VERSION }}
    runs-on: ${{ matrix.platform }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
+      - name: Checkout dependency repository
+        uses: actions/checkout@v4
+        with:
+          repository: 'infinilabs/pizza'
+          ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
+          submodules: recursive
+          ref: main
+          path: pizza
      - name: Setup node
        uses: actions/setup-node@v4
        with:
@@ -65,17 +100,31 @@ jobs:
        with:
          version: latest
+      - name: Install rust target
+        run: rustup target add ${{ matrix.target }}
      - name: Install dependencies (ubuntu only)
        if: startsWith(matrix.platform, 'ubuntu-22.04')
        run: |
          sudo apt-get update
          sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
-      - name: Install Rust stable
-        run: rustup toolchain install stable
+      - name: Add Rust build target
+        working-directory: src-tauri
+        shell: bash
+        run: |
+          rustup target add ${{ matrix.target }} || true
+      - name: Add pizza engine as a dependency
+        working-directory: src-tauri
+        shell: bash
+        run: |
+          BUILD_ARGS="--target ${{ matrix.target }}"
+          if [[ "${{matrix.target }}" != "i686-pc-windows-msvc" ]]; then
+            echo "Adding pizza engine as a dependency for ${{matrix.platform }}-${{matrix.target }}"
+            ( cargo add --path ../pizza/lib/engine --features query_string_parser,persistence )
+            BUILD_ARGS+=" --features use_pizza_engine"
+          else
+            echo "Skipping pizza engine dependency for ${{matrix.platform }}-${{matrix.target }}"
+          fi
+          echo "BUILD_ARGS=${BUILD_ARGS}" >> $GITHUB_ENV
      - name: Rust cache
        uses: swatinem/rust-cache@v2
@@ -90,8 +139,8 @@ jobs:
      - name: Install app dependencies and build web
        run: pnpm install --frozen-lockfile
-      - name: Build the app
+      - name: Build the coco at ${{ matrix.platform}} for ${{ matrix.target }} @ ${{ env.APP_VERSION }}
        uses: tauri-apps/tauri-action@v0
        env:
CI: false CI: false
@@ -107,8 +156,8 @@ jobs:
APPLE_TEAM_ID: ${{ secrets.APPLE_TEAM_ID }} APPLE_TEAM_ID: ${{ secrets.APPLE_TEAM_ID }}
with: with:
tagName: ${{ github.ref_name }} tagName: ${{ github.ref_name }}
releaseName: Coco ${{ needs.create-release.outputs.APP_VERSION }} releaseName: Coco ${{ env.APP_VERSION }}
releaseBody: "" releaseBody: "${{ needs.create-release.outputs.RELEASE_BODY }}"
releaseDraft: true releaseDraft: true
prerelease: false prerelease: false
args: --target ${{ matrix.target }} args: ${{ env.BUILD_ARGS }}

61
.github/workflows/rust_code_check.yml vendored Normal file
View File

@@ -0,0 +1,61 @@
name: Rust Code Check
on:
pull_request:
# Only run it when Rust code changes
paths:
- 'src-tauri/**'
jobs:
check:
strategy:
matrix:
platform: [ubuntu-latest, windows-latest, macos-latest]
runs-on: ${{ matrix.platform }}
steps:
- uses: actions/checkout@v4
- name: Checkout dependency (pizza-engine) repository
uses: actions/checkout@v4
with:
repository: 'infinilabs/pizza'
ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
submodules: recursive
ref: main
path: pizza
- name: Install dependencies (ubuntu only)
if: startsWith(matrix.platform, 'ubuntu-latest')
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
- name: Add pizza engine as a dependency
working-directory: src-tauri
shell: bash
run: cargo add --path ../pizza/lib/engine --features query_string_parser,persistence
- name: Format check
working-directory: src-tauri
shell: bash
run: |
rustup component add rustfmt
cargo fmt --all --check
- name: Check compilation (Without Pizza engine enabled)
working-directory: ./src-tauri
run: cargo check
- name: Check compilation (With Pizza engine enabled)
working-directory: ./src-tauri
run: cargo check --features use_pizza_engine
- name: Run tests (Without Pizza engine)
working-directory: ./src-tauri
run: cargo test
- name: Run tests (With Pizza engine)
working-directory: ./src-tauri
run: cargo test --features use_pizza_engine
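Since this workflow compiles and tests the crate both with and without the optional engine, any engine-specific tests have to be gated on the same feature. A minimal, hypothetical example of such a gate (not taken from the repository):

// Only compiled and run by `cargo test --features use_pizza_engine`.
#[cfg(all(test, feature = "use_pizza_engine"))]
mod pizza_engine_tests {
    #[test]
    fn engine_feature_is_enabled() {
        assert!(cfg!(feature = "use_pizza_engine"));
    }
}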

View File

@@ -8,11 +8,14 @@
"clsx", "clsx",
"codegen", "codegen",
"dataurl", "dataurl",
"deeplink",
"deepthink",
"dtolnay", "dtolnay",
"dyld", "dyld",
"elif", "elif",
"errmsg", "errmsg",
"fullscreen", "fullscreen",
"fulltext",
"headlessui", "headlessui",
"Icdbb", "Icdbb",
"icns", "icns",
@@ -29,6 +32,8 @@
"localstorage", "localstorage",
"lucide", "lucide",
"maximizable", "maximizable",
"mdast",
"meval",
"Minimizable", "Minimizable",
"msvc", "msvc",
"nord", "nord",
@@ -38,9 +43,11 @@
"overscan", "overscan",
"partialize", "partialize",
"patchelf", "patchelf",
"Quicklink",
"Raycast", "Raycast",
"rehype", "rehype",
"reqwest", "reqwest",
"rerank",
"rgba", "rgba",
"rustup", "rustup",
"screenshotable", "screenshotable",
@@ -55,6 +62,7 @@
"traptitech", "traptitech",
"unlisten", "unlisten",
"unlistener", "unlistener",
"unlisteners",
"unminimize", "unminimize",
"uuidv", "uuidv",
"VITE", "VITE",

View File

@@ -78,4 +78,8 @@ clean-rebuild:
$(MAKE) dev-build $(MAKE) dev-build
add-dep-pizza-engine: add-dep-pizza-engine:
cd src-tauri && cargo add --git ssh://git@github.com/infinilabs/pizza.git pizza-engine --features query_string_parser,persistence cd src-tauri && cargo add --git ssh://git@github.com/infinilabs/pizza.git pizza-engine --features query_string_parser,persistence
dev-build-with-pizza: add-dep-pizza-engine
@echo "Starting desktop development with Pizza Engine pulled in..."
RUST_BACKTRACE=1 pnpm tauri dev --features use_pizza_engine

View File

@@ -91,6 +91,8 @@ pnpm tauri build
- [Coco App Documentation](https://docs.infinilabs.com/coco-app/main/) - [Coco App Documentation](https://docs.infinilabs.com/coco-app/main/)
- [Coco Server Documentation](https://docs.infinilabs.com/coco-server/main/) - [Coco Server Documentation](https://docs.infinilabs.com/coco-server/main/)
- [DeepWiki Coco App](https://deepwiki.com/infinilabs/coco-app)
- [DeepWiki Coco Server](https://deepwiki.com/infinilabs/coco-server)
- [Tauri Documentation](https://tauri.app/) - [Tauri Documentation](https://tauri.app/)
## Contributors ## Contributors

View File

@@ -1,21 +1,35 @@
--- ---
weight: 10 weight: 10
title: "Mac OS" title: "macOS"
asciinema: true asciinema: true
--- ---
# Mac OS # macOS
## Download Coco AI ## Download Coco AI
Goto [https://coco.rs/](https://coco.rs/) Go to [coco.rs](https://coco.rs/) and download the package for your architecture:
{{% load-img "/img/download-mac-app.png" "" %}} {{% load-img "/img/macos/mac-download-app.png" "" %}}
It should be placed in your "Downloads" folder:
{{% load-img "/img/macos/mac-zip-file.png" "" %}}
## Unzip DMG file ## Unzip DMG file
{{% load-img "/img/unzip-dmg-file.png" "" %}} Unzip the file:
{{% load-img "/img/macos/mac-unzip-zip-file.png" "" %}}
You will get a `dmg` file:
{{% load-img "/img/macos/mac-dmg.png" "" %}}
## Drag to Application Folder ## Drag to Application Folder
{{% load-img "/img/drag-to-application-folder.png" "" %}} Double click the `dmg` file, a window will pop up. Then drag the "Coco-AI" app to
your "Applications" folder:
{{% load-img "/img/macos/drag-to-app-folder.png" "" %}}

View File

@@ -14,7 +14,9 @@ asciinema: true
[if_x11]: https://unix.stackexchange.com/q/202891/498440 [if_x11]: https://unix.stackexchange.com/q/202891/498440
## Goto [https://coco.rs/](https://coco.rs/) ## Go to the download page
Download page: [link](https://coco.rs/#install)
## Download the package ## Download the package

View File

@@ -5,7 +5,7 @@ title: "Release Notes"
# Release Notes # Release Notes
Information about release notes of Coco Server is provided here. Information about release notes of Coco App is provided here.
## Latest (In development) ## Latest (In development)
@@ -13,6 +13,122 @@ Information about release notes of Coco Server is provided here.
### 🚀 Features ### 🚀 Features
- feat: enhance ui for skipped version #834
### 🐛 Bug fix
- fix: fix issue with update check failure #833
### ✈️ Improvements
## 0.7.1 (2025-07-27)
### ❌ Breaking changes
### 🚀 Features
### 🐛 Bug fix
- fix: correct enter key behavior #828
### ✈️ Improvements
- chore: web component add notification component #825
- refactor: collection behavior defaults to `MoveToActiveSpace`, and only use `CanJoinAllSpaces` when window is pinned #829
## 0.7.0 (2025-07-25)
### ❌ Breaking changes
### 🚀 Features
- feat: file search using spotlight #705
- feat: voice input support in both search and chat modes #732
- feat: text to speech now powered by LLM #750
- feat: file search for Windows #762
### 🐛 Bug fix
- fix(file search): apply filters before from/size parameters #741
- fix(file search): searching by name&content does not search file name #743
- fix: prevent window from hiding when moved on Windows #748
- fix: unregister ext hotkey when it gets deleted #770
- fix: indexing apps does not respect search scope config #773
- fix: restore missing category titles on subpages #772
- fix: correct incorrect assistant display when quick ai access #779
- fix: resolved minor issues with voice playback #780
- fix: fixed incorrect taskbar icon display on linux #783
- fix: fix data inconsistency issue on secondary pages #784
- fix: incorrect status when installing extension #789
- fix: increase read_timeout for HTTP streaming stability #798
- fix: enter key problem #794
- fix: fix selection issue after renaming #800
- fix: fix shortcut issue in windows context menu #804
- fix: panic caused by "state() called before manage()" #806
- fix: fix multiline input issue #808
- fix: fix ctrl+k not working #815
- fix: fix update window config sync #818
- fix: fix enter key on subpages #819
- fix: panic on Ubuntu (GNOME) when opening apps #821
### ✈️ Improvements
- refactor: prioritize stat(2) when checking if a file is dir #737
- refactor: change File Search ext type to extension #738
- refactor: create chat & send chat api #739
- chore: icon support for more file types #740
- chore: replace meval-rs with our fork to clear dep warning #745
- refactor: adjusted assistant, datasource, mcp_server interface parameters #746
- refactor: adjust extension code hierarchy #747
- chore: bump dep applications-rs #751
- chore: rename QuickLink/quick_link to Quicklink/quicklink #752
- chore: assistant params & styles #753
- chore: make optional fields optional #758
- chore: search-chat components add formatUrl & think data & icons url #765
- chore: Coco app http request headers #744
- refactor: do status code check before deserializing response #767
- style: splash adapts to the width of mobile phones #768
- chore: search-chat add language and formatUrl parameters #775
- chore: not request the interface if not logged in #795
- refactor: clean up unsupported characters from query string in Win Search #802
- chore: display backtrace in panic log #805
## 0.6.0 (2025-06-29)
### ❌ Breaking changes
### 🚀 Features
- feat: support `Tab` and `Enter` for delete dialog buttons #700
- feat: add check for updates #701
- feat: impl extension store #699
- feat: support back navigation via delete key #717
### 🐛 Bug fix
- fix: quick ai state synchronous #693
- fix: toggle extension should register/unregister hotkey #691
- fix: take coco server back on refresh #696
- fix: some input fields couldn't accept spaces #709
- fix: context menu search not working #713
- fix: open extension store display #724
### ✈️ Improvements
- refactor: use author/ext_id as extension unique identifier #643
- refactor: refactoring search api #679
- chore: continue to chat page display #690
- chore: improve server list selection with enter key #692
- chore: add message for latest version check #703
- chore: log command execution results #718
- chore: adjust styles and add button reindex #719
## 0.5.0 (2025-06-13)
### ❌ Breaking changes
### 🚀 Features
- feat: check or enter to close the list of assistants #469 - feat: check or enter to close the list of assistants #469
- feat: add dimness settings for pinned window #470 - feat: add dimness settings for pinned window #470
- feat: supports Shift + Enter input box line feeds #472 - feat: supports Shift + Enter input box line feeds #472
@@ -24,17 +140,59 @@ Information about release notes of Coco Server is provided here.
- feat: the search input box supports multi-line input #501 - feat: the search input box supports multi-line input #501
- feat: websocket support self-signed TLS #504 - feat: websocket support self-signed TLS #504
- feat: add option to allow self-signed certificates #509 - feat: add option to allow self-signed certificates #509
- feat: add AI summary component #518
- feat: dynamic log level via env var COCO_LOG #535
- feat: add quick AI access to search mode #556
- feat: rerank search results #561
- feat: ai overview support is enabled with shortcut #597
- feat: add key monitoring during reset #615
- feat: calculator extension add description #623
- feat: support right-click actions after text selection #624
- feat: add ai overview minimum number of search results configuration #625
- feat: add internationalized translations of AI-related extensions #632
- feat: context menu support for secondary pages #680
### 🐛 Bug fix ### 🐛 Bug fix
- fix: solve the problem of modifying the assistant in the chat #476
- fix: several issues around search #502 - fix: several issues around search #502
- fix: fixed the newly created session has no title when it is deleted #511 - fix: fixed the newly created session has no title when it is deleted #511
- fix: loading chat history for potential empty attachments - fix: loading chat history for potential empty attachments
- fix: datasource & MCP list synchronization update #521
- fix: app icon & category icon #529
- fix: show only enabled datasource & MCP list
- fix: server image loading failure #534
- fix: panic when fetching app metadata on Windows #538
- fix: service switching error #539
- fix: switch server assistant and session unchanged #540
- fix: history list height #550
- fix: secondary page cannot be searched #551
- fix: the scroll button is not displayed by default #552
- fix: suggestion list position #553
- fix: independent chat window has no data #554
- fix: resolved navigation error on continue chat action #558
- fix: make extension search source respect parameter datasource #576
- fix: fixed issue with incorrect login status #600
- fix: new chat assistant id not found #603
- fix: resolve regex error on older macOS versions #605
- fix: fix chat log update and sorting issues #612
- fix: resolved an issue where number keys were not working on the web #616
- fix: do not panic when the datasource specified does not exist #618
- fix: fixed modifier keys not working with continue chat #619
- fix: invalid DSL error if input contains multiple lines #620
- fix: fix ai overview hidden height before message #622
- fix: tab key hides window in chat mode #641
- fix: arrow keys still navigated search when menu opened with Cmd+K #642
- fix: input lost when reopening dialog after search #644
- fix: web page unmount event #645
- fix: fix the problem of local path not opening #650
- fix: number keys not following settings #661
- fix: fix problem with up and down key indexing #676
- fix: arrow inserting escape sequences #683
### ✈️ Improvements ### ✈️ Improvements
- chore: adjust list error message #475 - chore: adjust list error message #475
- fix: solve the problem of modifying the assistant in the chat #476
- chore: refine wording on search failure - chore: refine wording on search failure
- chore: search and MCP show hidden logic #494 - chore: search and MCP show hidden logic #494
- chore: greetings show hidden logic #496 - chore: greetings show hidden logic #496
@@ -45,6 +203,32 @@ Information about release notes of Coco Server is provided here.
- refactor: optimized the modification operation of the numeric input box #508 - refactor: optimized the modification operation of the numeric input box #508
- style: modify the style of the search input box #513 - style: modify the style of the search input box #513
- style: chat input icons show #515 - style: chat input icons show #515
- refactor: refactoring icon component #514
- refactor: optimizing list styles in markdown content #520
- feat: add a component for text reading aloud #522
- style: history component styles #528
- style: search error styles #533
- chore: skip register server that not logged in #536
- refactor: service info related components #537
- chore: chat content can be copied #539
- refactor: refactoring search error #541
- chore: add assistant count #542
- chore: add global login judgment #544
- chore: mark server offline on user logout #546
- chore: logout update server profile #549
- chore: assistant keyboard events and mouse events #559
- chore: web component start page config #560
- chore: assistant chat placeholder & refactor input box components #566
- refactor: input box related components #568
- chore: mark unavailable server to offline on refresh info #569
- chore: only show available servers in chat #570
- refactor: search result related components #571
- chore: initialize current assistant from history #606
- chore: add onContextMenu event #629
- chore: more logs for the setup process #634
- chore: copy supports http protocol #639
- refactor: use author/ext_id as extension unique identifier #643
- chore: add special character filtering #668
## 0.4.0 (2025-04-27) ## 0.4.0 (2025-04-27)
@@ -74,6 +258,8 @@ Information about release notes of Coco Server is provided here.
- feat: data sources support displaying customized icons #432 - feat: data sources support displaying customized icons #432
- feat: add shortcut key conflict hint and reset function #442 - feat: add shortcut key conflict hint and reset function #442
- feat: updated to include error message #465 - feat: updated to include error message #465
- feat: support third party extensions #572
- feat: support ai overview #572
### Bug fix ### Bug fix

Binary image changes (contents not shown): three existing screenshots were removed (155 KiB, 69 KiB, 121 KiB) and five new macOS installation screenshots were added (239 KiB, 586 KiB, 299 KiB, 650 KiB, 441 KiB), including docs/static/img/macos/mac-dmg.png and docs/static/img/macos/mac-zip-file.png.

View File

@@ -1,7 +1,7 @@
{ {
"name": "coco", "name": "coco",
"private": true, "private": true,
"version": "0.4.0", "version": "0.7.1",
"type": "module", "type": "module",
"scripts": { "scripts": {
"dev": "vite", "dev": "vite",
@@ -18,7 +18,6 @@
"release-beta": "release-it --preRelease=beta --preReleaseBase=1" "release-beta": "release-it --preRelease=beta --preReleaseBase=1"
}, },
"dependencies": { "dependencies": {
"@ant-design/icons": "^6.0.0",
"@headlessui/react": "^2.2.2", "@headlessui/react": "^2.2.2",
"@tauri-apps/api": "^2.5.0", "@tauri-apps/api": "^2.5.0",
"@tauri-apps/plugin-autostart": "~2.2.0", "@tauri-apps/plugin-autostart": "~2.2.0",
@@ -27,6 +26,7 @@
"@tauri-apps/plugin-global-shortcut": "~2.0.0", "@tauri-apps/plugin-global-shortcut": "~2.0.0",
"@tauri-apps/plugin-http": "~2.0.2", "@tauri-apps/plugin-http": "~2.0.2",
"@tauri-apps/plugin-log": "~2.4.0", "@tauri-apps/plugin-log": "~2.4.0",
"@tauri-apps/plugin-opener": "^2.2.7",
"@tauri-apps/plugin-os": "^2.2.1", "@tauri-apps/plugin-os": "^2.2.1",
"@tauri-apps/plugin-process": "^2.2.1", "@tauri-apps/plugin-process": "^2.2.1",
"@tauri-apps/plugin-shell": "^2.2.1", "@tauri-apps/plugin-shell": "^2.2.1",
@@ -44,6 +44,7 @@
"i18next-browser-languagedetector": "^8.1.0", "i18next-browser-languagedetector": "^8.1.0",
"lodash-es": "^4.17.21", "lodash-es": "^4.17.21",
"lucide-react": "^0.461.0", "lucide-react": "^0.461.0",
"mdast-util-gfm-autolink-literal": "2.0.0",
"mermaid": "^11.6.0", "mermaid": "^11.6.0",
"nanoid": "^5.1.5", "nanoid": "^5.1.5",
"react": "^18.3.1", "react": "^18.3.1",
@@ -58,10 +59,12 @@
"remark-breaks": "^4.0.0", "remark-breaks": "^4.0.0",
"remark-gfm": "^4.0.1", "remark-gfm": "^4.0.1",
"remark-math": "^6.0.0", "remark-math": "^6.0.0",
"tailwind-merge": "^3.3.1",
"tauri-plugin-fs-pro-api": "^2.4.0", "tauri-plugin-fs-pro-api": "^2.4.0",
"tauri-plugin-macos-permissions-api": "^2.3.0", "tauri-plugin-macos-permissions-api": "^2.3.0",
"tauri-plugin-screenshots-api": "^2.2.0", "tauri-plugin-screenshots-api": "^2.2.0",
"tauri-plugin-windows-version-api": "^2.0.0", "tauri-plugin-windows-version-api": "^2.0.0",
"type-fest": "^4.41.0",
"use-debounce": "^10.0.4", "use-debounce": "^10.0.4",
"uuid": "^11.1.0", "uuid": "^11.1.0",
"wavesurfer.js": "^7.9.5", "wavesurfer.js": "^7.9.5",
@@ -89,5 +92,6 @@
"tsx": "^4.19.4", "tsx": "^4.19.4",
"typescript": "^5.8.3", "typescript": "^5.8.3",
"vite": "^5.4.19" "vite": "^5.4.19"
} },
} "packageManager": "pnpm@10.11.0+sha512.6540583f41cc5f628eb3d9773ecee802f4f9ef9923cc45b69890fb47991d4b092964694ec3a4f738a420c918a333062c8b925d312f42e4f0c263eb603551f977"
}

145
pnpm-lock.yaml generated
View File

@@ -8,9 +8,6 @@ importers:
.: .:
dependencies: dependencies:
'@ant-design/icons':
specifier: ^6.0.0
version: 6.0.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
'@headlessui/react': '@headlessui/react':
specifier: ^2.2.2 specifier: ^2.2.2
version: 2.2.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1) version: 2.2.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -35,6 +32,9 @@ importers:
'@tauri-apps/plugin-log': '@tauri-apps/plugin-log':
specifier: ~2.4.0 specifier: ~2.4.0
version: 2.4.0 version: 2.4.0
'@tauri-apps/plugin-opener':
specifier: ^2.2.7
version: 2.2.7
'@tauri-apps/plugin-os': '@tauri-apps/plugin-os':
specifier: ^2.2.1 specifier: ^2.2.1
version: 2.2.1 version: 2.2.1
@@ -86,6 +86,9 @@ importers:
lucide-react: lucide-react:
specifier: ^0.461.0 specifier: ^0.461.0
version: 0.461.0(react@18.3.1) version: 0.461.0(react@18.3.1)
mdast-util-gfm-autolink-literal:
specifier: 2.0.0
version: 2.0.0
mermaid: mermaid:
specifier: ^11.6.0 specifier: ^11.6.0
version: 11.6.0 version: 11.6.0
@@ -128,6 +131,9 @@ importers:
remark-math: remark-math:
specifier: ^6.0.0 specifier: ^6.0.0
version: 6.0.0 version: 6.0.0
tailwind-merge:
specifier: ^3.3.1
version: 3.3.1
tauri-plugin-fs-pro-api: tauri-plugin-fs-pro-api:
specifier: ^2.4.0 specifier: ^2.4.0
version: 2.4.0 version: 2.4.0
@@ -140,6 +146,9 @@ importers:
tauri-plugin-windows-version-api: tauri-plugin-windows-version-api:
specifier: ^2.0.0 specifier: ^2.0.0
version: 2.0.0 version: 2.0.0
type-fest:
specifier: ^4.41.0
version: 4.41.0
use-debounce: use-debounce:
specifier: ^10.0.4 specifier: ^10.0.4
version: 10.0.4(react@18.3.1) version: 10.0.4(react@18.3.1)
@@ -182,7 +191,7 @@ importers:
version: 1.8.8 version: 1.8.8
'@vitejs/plugin-react': '@vitejs/plugin-react':
specifier: ^4.4.1 specifier: ^4.4.1
version: 4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)) version: 4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0))
autoprefixer: autoprefixer:
specifier: ^10.4.21 specifier: ^10.4.21
version: 10.4.21(postcss@8.5.3) version: 10.4.21(postcss@8.5.3)
@@ -215,7 +224,7 @@ importers:
version: 5.8.3 version: 5.8.3
vite: vite:
specifier: ^5.4.19 specifier: ^5.4.19
version: 5.4.19(@types/node@22.15.17)(sass@1.87.0) version: 5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0)
packages: packages:
@@ -227,23 +236,6 @@ packages:
resolution: {integrity: sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==} resolution: {integrity: sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==}
engines: {node: '>=6.0.0'} engines: {node: '>=6.0.0'}
'@ant-design/colors@8.0.0':
resolution: {integrity: sha512-6YzkKCw30EI/E9kHOIXsQDHmMvTllT8STzjMb4K2qzit33RW2pqCJP0sk+hidBntXxE+Vz4n1+RvCTfBw6OErw==}
'@ant-design/fast-color@3.0.0':
resolution: {integrity: sha512-eqvpP7xEDm2S7dUzl5srEQCBTXZMmY3ekf97zI+M2DHOYyKdJGH0qua0JACHTqbkRnD/KHFQP9J1uMJ/XWVzzA==}
engines: {node: '>=8.x'}
'@ant-design/icons-svg@4.4.2':
resolution: {integrity: sha512-vHbT+zJEVzllwP+CM+ul7reTEfBR0vgxFe7+lREAsAA7YGsYpboiq2sQNeQeRvh09GfQgs/GyFEvZpJ9cLXpXA==}
'@ant-design/icons@6.0.0':
resolution: {integrity: sha512-o0aCCAlHc1o4CQcapAwWzHeaW2x9F49g7P3IDtvtNXgHowtRWYb7kiubt8sQPFvfVIVU/jLw2hzeSlNt0FU+Uw==}
engines: {node: '>=8'}
peerDependencies:
react: '>=16.0.0'
react-dom: '>=16.0.0'
'@antfu/install-pkg@1.1.0': '@antfu/install-pkg@1.1.0':
resolution: {integrity: sha512-MGQsmw10ZyI+EJo45CdSER4zEb+p31LpDAFp2Z3gkSd1yqVZGi0Ebx++YTEMonJy4oChEMLsxZ64j8FH6sSqtQ==} resolution: {integrity: sha512-MGQsmw10ZyI+EJo45CdSER4zEb+p31LpDAFp2Z3gkSd1yqVZGi0Ebx++YTEMonJy4oChEMLsxZ64j8FH6sSqtQ==}
@@ -813,6 +805,9 @@ packages:
resolution: {integrity: sha512-R8gLRTZeyp03ymzP/6Lil/28tGeGEzhx1q2k703KGWRAI1VdvPIXdG70VJc2pAMw3NA6JKL5hhFu1sJX0Mnn/A==} resolution: {integrity: sha512-R8gLRTZeyp03ymzP/6Lil/28tGeGEzhx1q2k703KGWRAI1VdvPIXdG70VJc2pAMw3NA6JKL5hhFu1sJX0Mnn/A==}
engines: {node: '>=6.0.0'} engines: {node: '>=6.0.0'}
'@jridgewell/source-map@0.3.6':
resolution: {integrity: sha512-1ZJTZebgqllO79ue2bm3rIGud/bOe0pP5BjSRCRxxYkEZS8STV7zN84UBbiYu7jy+eCKSnVIUgoWWE/tt+shMQ==}
'@jridgewell/sourcemap-codec@1.5.0': '@jridgewell/sourcemap-codec@1.5.0':
resolution: {integrity: sha512-gv3ZRaISU3fjPAgNsriBRqGWQL6quFx04YMPW/zD8XMLsU32mhCCbfbO6KZFLjvYpCZ8zyDEgqsgf+PwPaM7GQ==} resolution: {integrity: sha512-gv3ZRaISU3fjPAgNsriBRqGWQL6quFx04YMPW/zD8XMLsU32mhCCbfbO6KZFLjvYpCZ8zyDEgqsgf+PwPaM7GQ==}
@@ -990,12 +985,6 @@ packages:
resolution: {integrity: sha512-c83qWb22rNRuB0UaVCI0uRPNRr8Z0FWnEIvT47jiHAmOIUHbBOg5XvV7pM5x+rKn9HRpjxquDbXYSXr3fAKFcw==} resolution: {integrity: sha512-c83qWb22rNRuB0UaVCI0uRPNRr8Z0FWnEIvT47jiHAmOIUHbBOg5XvV7pM5x+rKn9HRpjxquDbXYSXr3fAKFcw==}
engines: {node: '>=12'} engines: {node: '>=12'}
'@rc-component/util@1.2.1':
resolution: {integrity: sha512-AUVu6jO+lWjQnUOOECwu8iR0EdElQgWW5NBv5vP/Uf9dWbAX3udhMutRlkVXjuac2E40ghkFy+ve00mc/3Fymg==}
peerDependencies:
react: '>=18.0.0'
react-dom: '>=18.0.0'
'@react-aria/focus@3.20.2': '@react-aria/focus@3.20.2':
resolution: {integrity: sha512-Q3rouk/rzoF/3TuH6FzoAIKrl+kzZi9LHmr8S5EqLAOyP9TXIKG34x2j42dZsAhrw7TbF9gA8tBKwnCNH4ZV+Q==} resolution: {integrity: sha512-Q3rouk/rzoF/3TuH6FzoAIKrl+kzZi9LHmr8S5EqLAOyP9TXIKG34x2j42dZsAhrw7TbF9gA8tBKwnCNH4ZV+Q==}
peerDependencies: peerDependencies:
@@ -1256,6 +1245,9 @@ packages:
'@tauri-apps/plugin-log@2.4.0': '@tauri-apps/plugin-log@2.4.0':
resolution: {integrity: sha512-j7yrDtLNmayCBOO2esl3aZv9jSXy2an8MDLry3Ys9ZXerwUg35n1Y2uD8HoCR+8Ng/EUgx215+qOUfJasjYrHw==} resolution: {integrity: sha512-j7yrDtLNmayCBOO2esl3aZv9jSXy2an8MDLry3Ys9ZXerwUg35n1Y2uD8HoCR+8Ng/EUgx215+qOUfJasjYrHw==}
'@tauri-apps/plugin-opener@2.2.7':
resolution: {integrity: sha512-uduEyvOdjpPOEeDRrhwlCspG/f9EQalHumWBtLBnp3fRp++fKGLqDOyUhSIn7PzX45b/rKep//ZQSAQoIxobLA==}
'@tauri-apps/plugin-os@2.2.1': '@tauri-apps/plugin-os@2.2.1':
resolution: {integrity: sha512-cNYpNri2CCc6BaNeB6G/mOtLvg8dFyFQyCUdf2y0K8PIAKGEWdEcu8DECkydU2B+oj4OJihDPD2de5K6cbVl9A==} resolution: {integrity: sha512-cNYpNri2CCc6BaNeB6G/mOtLvg8dFyFQyCUdf2y0K8PIAKGEWdEcu8DECkydU2B+oj4OJihDPD2de5K6cbVl9A==}
@@ -1583,6 +1575,9 @@ packages:
engines: {node: ^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7} engines: {node: ^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7}
hasBin: true hasBin: true
buffer-from@1.1.2:
resolution: {integrity: sha512-E+XQCRwSbaaiChtv6k6Dwgc+bx+Bs6vuKJHHl5kox/BaKbhiXzqQOwK4cO22yElGp2OCmjwVhT3HmxgyPGnJfQ==}
bundle-name@4.1.0: bundle-name@4.1.0:
resolution: {integrity: sha512-tjwM5exMg6BGRI+kNmTntNsvdZS1X8BFYS6tnJ2hdH0kVxM6/eVZ2xy+FqStSWvYmtfFMDLIxurorHwDKfDz5Q==} resolution: {integrity: sha512-tjwM5exMg6BGRI+kNmTntNsvdZS1X8BFYS6tnJ2hdH0kVxM6/eVZ2xy+FqStSWvYmtfFMDLIxurorHwDKfDz5Q==}
engines: {node: '>=18'} engines: {node: '>=18'}
@@ -1658,9 +1653,6 @@ packages:
resolution: {integrity: sha512-cYY9mypksY8NRqgDB1XD1RiJL338v/551niynFTGkZOO2LHuB2OmOYxDIe/ttN9AHwrqdum1360G3ald0W9kCg==} resolution: {integrity: sha512-cYY9mypksY8NRqgDB1XD1RiJL338v/551niynFTGkZOO2LHuB2OmOYxDIe/ttN9AHwrqdum1360G3ald0W9kCg==}
engines: {node: '>=8'} engines: {node: '>=8'}
classnames@2.5.1:
resolution: {integrity: sha512-saHYOzhIQs6wy2sVxTM6bUDsQO4F50V9RQ22qBpEdCW+I+/Wmke2HOl6lS6dTpdxVhb88/I6+Hs+438c3lfUow==}
cli-boxes@3.0.0: cli-boxes@3.0.0:
resolution: {integrity: sha512-/lzGpEWL/8PfI0BmBOPRwp0c/wFNX1RdUML3jK/RcSBA9T8mZDdQpqYBKtCFTOfQbwPqWEOpjqW+Fnayc0969g==} resolution: {integrity: sha512-/lzGpEWL/8PfI0BmBOPRwp0c/wFNX1RdUML3jK/RcSBA9T8mZDdQpqYBKtCFTOfQbwPqWEOpjqW+Fnayc0969g==}
engines: {node: '>=10'} engines: {node: '>=10'}
@@ -1695,6 +1687,9 @@ packages:
comma-separated-tokens@2.0.3: comma-separated-tokens@2.0.3:
resolution: {integrity: sha512-Fu4hJdvzeylCfQPp9SGWidpzrMs7tTrlu6Vb8XGaRGck8QSNZJJp538Wrb60Lax4fPwR64ViY468OIUTbRlGZg==} resolution: {integrity: sha512-Fu4hJdvzeylCfQPp9SGWidpzrMs7tTrlu6Vb8XGaRGck8QSNZJJp538Wrb60Lax4fPwR64ViY468OIUTbRlGZg==}
commander@2.20.3:
resolution: {integrity: sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==}
commander@4.1.1: commander@4.1.1:
resolution: {integrity: sha512-NOKm8xhkzAjzFx8B2v5OAHT+u5pRQc2UCa2Vq9jYL/31o2wi9mxBA7LIFs3sV5VSC49z6pEhfbMULvShKj26WA==} resolution: {integrity: sha512-NOKm8xhkzAjzFx8B2v5OAHT+u5pRQc2UCa2Vq9jYL/31o2wi9mxBA7LIFs3sV5VSC49z6pEhfbMULvShKj26WA==}
engines: {node: '>= 6'} engines: {node: '>= 6'}
@@ -2640,8 +2635,8 @@ packages:
mdast-util-from-markdown@2.0.2: mdast-util-from-markdown@2.0.2:
resolution: {integrity: sha512-uZhTV/8NBuw0WHkPTrCqDOl0zVe1BIng5ZtHoDk49ME1qqcjYmmLmOf0gELgcRMxN4w2iuIeVso5/6QymSrgmA==} resolution: {integrity: sha512-uZhTV/8NBuw0WHkPTrCqDOl0zVe1BIng5ZtHoDk49ME1qqcjYmmLmOf0gELgcRMxN4w2iuIeVso5/6QymSrgmA==}
mdast-util-gfm-autolink-literal@2.0.1: mdast-util-gfm-autolink-literal@2.0.0:
resolution: {integrity: sha512-5HVP2MKaP6L+G6YaxPNjuL0BPrq9orG3TsrZ9YXbA3vDw/ACI4MEsnoDpn6ZNm7GnZgtAcONJyPhOP8tNJQavQ==} resolution: {integrity: sha512-FyzMsduZZHSc3i0Px3PQcBT4WJY/X/RCtEJKuybiC6sjPqLv7h1yqAkmILZtuxMSsUyaLUWNp71+vQH2zqp5cg==}
mdast-util-gfm-footnote@2.1.0: mdast-util-gfm-footnote@2.1.0:
resolution: {integrity: sha512-sqpDWlsHn7Ac9GNZQMeUzPQSMzR6Wv0WKRNvQRg0KqHh02fpTz69Qc1QSseNX29bhz1ROIyNyxExfawVKTm1GQ==} resolution: {integrity: sha512-sqpDWlsHn7Ac9GNZQMeUzPQSMzR6Wv0WKRNvQRg0KqHh02fpTz69Qc1QSseNX29bhz1ROIyNyxExfawVKTm1GQ==}
@@ -3137,9 +3132,6 @@ packages:
typescript: typescript:
optional: true optional: true
react-is@18.3.1:
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
react-markdown@9.1.0: react-markdown@9.1.0:
resolution: {integrity: sha512-xaijuJB0kzGiUdG7nc2MOMDUDBWPyGAjZtUrow9XxUeua8IqeP+VlIfAZ3bphpcLTnSZXz6z9jcVC/TCwbfgdw==} resolution: {integrity: sha512-xaijuJB0kzGiUdG7nc2MOMDUDBWPyGAjZtUrow9XxUeua8IqeP+VlIfAZ3bphpcLTnSZXz6z9jcVC/TCwbfgdw==}
peerDependencies: peerDependencies:
@@ -3346,6 +3338,9 @@ packages:
resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==} resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==}
engines: {node: '>=0.10.0'} engines: {node: '>=0.10.0'}
source-map-support@0.5.21:
resolution: {integrity: sha512-uBHU3L3czsIyYXKX88fdrGovxdSCoTGDRZ6SYXtSRxLZUzHg5P/66Ht6uoUlHu9EZod+inXhKo3qQgwXUT/y1w==}
source-map@0.6.1: source-map@0.6.1:
resolution: {integrity: sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==} resolution: {integrity: sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==}
engines: {node: '>=0.10.0'} engines: {node: '>=0.10.0'}
@@ -3423,6 +3418,9 @@ packages:
tabbable@6.2.0: tabbable@6.2.0:
resolution: {integrity: sha512-Cat63mxsVJlzYvN51JmVXIgNoUokrIaT2zLclCXjRd8boZ0004U4KCs/sToJ75C6sdlByWxpYnb5Boif1VSFew==} resolution: {integrity: sha512-Cat63mxsVJlzYvN51JmVXIgNoUokrIaT2zLclCXjRd8boZ0004U4KCs/sToJ75C6sdlByWxpYnb5Boif1VSFew==}
tailwind-merge@3.3.1:
resolution: {integrity: sha512-gBXpgUm/3rp1lMZZrM/w7D8GKqshif0zAymAhbCyIt8KMe+0v9DQ7cdYLR4FHH/cKpdTXb+A/tKKU3eolfsI+g==}
tailwindcss@3.4.17: tailwindcss@3.4.17:
resolution: {integrity: sha512-w33E2aCvSDP0tW9RZuNXadXlkHXqFzSkQew/aIa2i/Sj8fThxwovwlXHSPXTbAHwEIhBFXAedUhP2tueAKP8Og==} resolution: {integrity: sha512-w33E2aCvSDP0tW9RZuNXadXlkHXqFzSkQew/aIa2i/Sj8fThxwovwlXHSPXTbAHwEIhBFXAedUhP2tueAKP8Og==}
engines: {node: '>=14.0.0'} engines: {node: '>=14.0.0'}
@@ -3440,6 +3438,11 @@ packages:
tauri-plugin-windows-version-api@2.0.0: tauri-plugin-windows-version-api@2.0.0:
resolution: {integrity: sha512-tty5n4ASYbXpnsD5ws2iTcTTpDCrSbzRTVp5Bo3UTpYGqlN1gBn2Zk8s3oO4w7VIM5WtJhDM9Jr/UgoTk7tFJQ==} resolution: {integrity: sha512-tty5n4ASYbXpnsD5ws2iTcTTpDCrSbzRTVp5Bo3UTpYGqlN1gBn2Zk8s3oO4w7VIM5WtJhDM9Jr/UgoTk7tFJQ==}
terser@5.40.0:
resolution: {integrity: sha512-cfeKl/jjwSR5ar7d0FGmave9hFGJT8obyo0z+CrQOylLDbk7X81nPU6vq9VORa5jU30SkDnT2FXjLbR8HLP+xA==}
engines: {node: '>=10'}
hasBin: true
thenify-all@1.6.0: thenify-all@1.6.0:
resolution: {integrity: sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA==} resolution: {integrity: sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA==}
engines: {node: '>=0.8'} engines: {node: '>=0.8'}
@@ -3774,23 +3777,6 @@ snapshots:
'@jridgewell/gen-mapping': 0.3.8 '@jridgewell/gen-mapping': 0.3.8
'@jridgewell/trace-mapping': 0.3.25 '@jridgewell/trace-mapping': 0.3.25
'@ant-design/colors@8.0.0':
dependencies:
'@ant-design/fast-color': 3.0.0
'@ant-design/fast-color@3.0.0': {}
'@ant-design/icons-svg@4.4.2': {}
'@ant-design/icons@6.0.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
'@ant-design/colors': 8.0.0
'@ant-design/icons-svg': 4.4.2
'@rc-component/util': 1.2.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
classnames: 2.5.1
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
'@antfu/install-pkg@1.1.0': '@antfu/install-pkg@1.1.0':
dependencies: dependencies:
package-manager-detector: 1.3.0 package-manager-detector: 1.3.0
@@ -4260,6 +4246,12 @@ snapshots:
'@jridgewell/set-array@1.2.1': {} '@jridgewell/set-array@1.2.1': {}
'@jridgewell/source-map@0.3.6':
dependencies:
'@jridgewell/gen-mapping': 0.3.8
'@jridgewell/trace-mapping': 0.3.25
optional: true
'@jridgewell/sourcemap-codec@1.5.0': {} '@jridgewell/sourcemap-codec@1.5.0': {}
'@jridgewell/trace-mapping@0.3.25': '@jridgewell/trace-mapping@0.3.25':
@@ -4427,12 +4419,6 @@ snapshots:
'@pnpm/network.ca-file': 1.0.2 '@pnpm/network.ca-file': 1.0.2
config-chain: 1.1.13 config-chain: 1.1.13
'@rc-component/util@1.2.1(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies:
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
react-is: 18.3.1
'@react-aria/focus@3.20.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)': '@react-aria/focus@3.20.2(react-dom@18.3.1(react@18.3.1))(react@18.3.1)':
dependencies: dependencies:
'@react-aria/interactions': 3.25.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1) '@react-aria/interactions': 3.25.0(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -4637,6 +4623,10 @@ snapshots:
dependencies: dependencies:
'@tauri-apps/api': 2.5.0 '@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-opener@2.2.7':
dependencies:
'@tauri-apps/api': 2.5.0
'@tauri-apps/plugin-os@2.2.1': '@tauri-apps/plugin-os@2.2.1':
dependencies: dependencies:
'@tauri-apps/api': 2.5.0 '@tauri-apps/api': 2.5.0
@@ -4878,14 +4868,14 @@ snapshots:
'@ungap/structured-clone@1.3.0': {} '@ungap/structured-clone@1.3.0': {}
'@vitejs/plugin-react@4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0))': '@vitejs/plugin-react@4.4.1(vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0))':
dependencies: dependencies:
'@babel/core': 7.27.1 '@babel/core': 7.27.1
'@babel/plugin-transform-react-jsx-self': 7.27.1(@babel/core@7.27.1) '@babel/plugin-transform-react-jsx-self': 7.27.1(@babel/core@7.27.1)
'@babel/plugin-transform-react-jsx-source': 7.27.1(@babel/core@7.27.1) '@babel/plugin-transform-react-jsx-source': 7.27.1(@babel/core@7.27.1)
'@types/babel__core': 7.20.5 '@types/babel__core': 7.20.5
react-refresh: 0.17.0 react-refresh: 0.17.0
vite: 5.4.19(@types/node@22.15.17)(sass@1.87.0) vite: 5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0)
transitivePeerDependencies: transitivePeerDependencies:
- supports-color - supports-color
@@ -5014,6 +5004,9 @@ snapshots:
node-releases: 2.0.19 node-releases: 2.0.19
update-browserslist-db: 1.1.3(browserslist@4.24.5) update-browserslist-db: 1.1.3(browserslist@4.24.5)
buffer-from@1.1.2:
optional: true
bundle-name@4.1.0: bundle-name@4.1.0:
dependencies: dependencies:
run-applescript: 7.0.0 run-applescript: 7.0.0
@@ -5084,8 +5077,6 @@ snapshots:
ci-info@4.2.0: {} ci-info@4.2.0: {}
classnames@2.5.1: {}
cli-boxes@3.0.0: {} cli-boxes@3.0.0: {}
cli-cursor@5.0.0: cli-cursor@5.0.0:
@@ -5110,6 +5101,9 @@ snapshots:
comma-separated-tokens@2.0.3: {} comma-separated-tokens@2.0.3: {}
commander@2.20.3:
optional: true
commander@4.1.1: {} commander@4.1.1: {}
commander@7.2.0: {} commander@7.2.0: {}
@@ -6114,7 +6108,7 @@ snapshots:
transitivePeerDependencies: transitivePeerDependencies:
- supports-color - supports-color
mdast-util-gfm-autolink-literal@2.0.1: mdast-util-gfm-autolink-literal@2.0.0:
dependencies: dependencies:
'@types/mdast': 4.0.4 '@types/mdast': 4.0.4
ccount: 2.0.1 ccount: 2.0.1
@@ -6162,7 +6156,7 @@ snapshots:
mdast-util-gfm@3.1.0: mdast-util-gfm@3.1.0:
dependencies: dependencies:
mdast-util-from-markdown: 2.0.2 mdast-util-from-markdown: 2.0.2
mdast-util-gfm-autolink-literal: 2.0.1 mdast-util-gfm-autolink-literal: 2.0.0
mdast-util-gfm-footnote: 2.1.0 mdast-util-gfm-footnote: 2.1.0
mdast-util-gfm-strikethrough: 2.0.0 mdast-util-gfm-strikethrough: 2.0.0
mdast-util-gfm-table: 2.0.0 mdast-util-gfm-table: 2.0.0
@@ -6830,8 +6824,6 @@ snapshots:
react-dom: 18.3.1(react@18.3.1) react-dom: 18.3.1(react@18.3.1)
typescript: 5.8.3 typescript: 5.8.3
react-is@18.3.1: {}
react-markdown@9.1.0(@types/react@18.3.21)(react@18.3.1): react-markdown@9.1.0(@types/react@18.3.21)(react@18.3.1):
dependencies: dependencies:
'@types/hast': 3.0.4 '@types/hast': 3.0.4
@@ -7121,6 +7113,12 @@ snapshots:
source-map-js@1.2.1: {} source-map-js@1.2.1: {}
source-map-support@0.5.21:
dependencies:
buffer-from: 1.1.2
source-map: 0.6.1
optional: true
source-map@0.6.1: source-map@0.6.1:
optional: true optional: true
@@ -7197,6 +7195,8 @@ snapshots:
tabbable@6.2.0: {} tabbable@6.2.0: {}
tailwind-merge@3.3.1: {}
tailwindcss@3.4.17: tailwindcss@3.4.17:
dependencies: dependencies:
'@alloc/quick-lru': 5.2.0 '@alloc/quick-lru': 5.2.0
@@ -7240,6 +7240,14 @@ snapshots:
dependencies: dependencies:
'@tauri-apps/api': 2.5.0 '@tauri-apps/api': 2.5.0
terser@5.40.0:
dependencies:
'@jridgewell/source-map': 0.3.6
acorn: 8.14.1
commander: 2.20.3
source-map-support: 0.5.21
optional: true
thenify-all@1.6.0: thenify-all@1.6.0:
dependencies: dependencies:
thenify: 3.3.1 thenify: 3.3.1
@@ -7426,7 +7434,7 @@ snapshots:
'@types/unist': 3.0.3 '@types/unist': 3.0.3
vfile-message: 4.0.2 vfile-message: 4.0.2
vite@5.4.19(@types/node@22.15.17)(sass@1.87.0): vite@5.4.19(@types/node@22.15.17)(sass@1.87.0)(terser@5.40.0):
dependencies: dependencies:
esbuild: 0.21.5 esbuild: 0.21.5
postcss: 8.5.3 postcss: 8.5.3
@@ -7435,6 +7443,7 @@ snapshots:
'@types/node': 22.15.17 '@types/node': 22.15.17
fsevents: 2.3.3 fsevents: 2.3.3
sass: 1.87.0 sass: 1.87.0
terser: 5.40.0
void-elements@3.1.0: {} void-elements@3.1.0: {}

File diff suppressed because one or more lines are too long

1
scripts/devWeb.ts Normal file
View File

@@ -0,0 +1 @@
(() => {})();

614
src-tauri/Cargo.lock generated

File diff suppressed because it is too large

View File

@@ -1,9 +1,9 @@
[package] [package]
name = "coco" name = "coco"
version = "0.4.0" version = "0.7.1"
description = "Search, connect, collaborate all in one place." description = "Search, connect, collaborate all in one place."
authors = ["INFINI Labs"] authors = ["INFINI Labs"]
edition = "2021" edition = "2024"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[lib] [lib]
@@ -44,12 +44,12 @@ use_pizza_engine = []
[dependencies] [dependencies]
pizza-common = { git = "https://github.com/infinilabs/pizza-common", branch = "main" } pizza-common = { git = "https://github.com/infinilabs/pizza-common", branch = "main" }
tauri = { version = "2", features = ["protocol-asset", "macos-private-api", "tray-icon", "image-ico", "image-png", "unstable"] } tauri = { version = "2", features = ["protocol-asset", "macos-private-api", "tray-icon", "image-ico", "image-png"] }
tauri-plugin-shell = "2" tauri-plugin-shell = "2"
serde = { version = "1", features = ["derive"] } serde = { version = "1", features = ["derive"] }
# Need `arbitrary_precision` feature to support storing u128 # Need `arbitrary_precision` feature to support storing u128
# see: https://docs.rs/serde_json/latest/serde_json/struct.Number.html#method.from_u128 # see: https://docs.rs/serde_json/latest/serde_json/struct.Number.html#method.from_u128
serde_json = { version = "1", features = ["arbitrary_precision"] } serde_json = { version = "1", features = ["arbitrary_precision", "preserve_order"] }
tauri-plugin-http = "2" tauri-plugin-http = "2"
tauri-plugin-websocket = "2" tauri-plugin-websocket = "2"
tauri-plugin-deep-link = "2.0.0" tauri-plugin-deep-link = "2.0.0"
@@ -62,7 +62,7 @@ tauri-plugin-drag = "2"
tauri-plugin-macos-permissions = "2" tauri-plugin-macos-permissions = "2"
tauri-plugin-fs-pro = "2" tauri-plugin-fs-pro = "2"
tauri-plugin-screenshots = "2" tauri-plugin-screenshots = "2"
applications = { git = "https://github.com/infinilabs/applications-rs", rev = "7bb507e6b12f73c96f3a52f0578d0246a689f381" } applications = { git = "https://github.com/infinilabs/applications-rs", rev = "31b0c030a0f3bc82275fe12debe526153978671d" }
tokio-native-tls = "0.3" # For wss connections tokio-native-tls = "0.3" # For wss connections
tokio = { version = "1", features = ["full"] } tokio = { version = "1", features = ["full"] }
tokio-tungstenite = { version = "0.20", features = ["native-tls"] } tokio-tungstenite = { version = "0.20", features = ["native-tls"] }
@@ -81,21 +81,35 @@ plist = "1.7"
base64 = "0.13" base64 = "0.13"
walkdir = "2" walkdir = "2"
log = "0.4" log = "0.4"
strsim = "0.10"
futures-util = "0.3.31" futures-util = "0.3.31"
url = "2.5.2"
http = "1.1.0" http = "1.1.0"
tungstenite = "0.24.0" tungstenite = "0.24.0"
tokio-util = "0.7.14" tokio-util = "0.7.14"
tauri-plugin-windows-version = "2" tauri-plugin-windows-version = "2"
meval = "0.2" meval = { git = "https://github.com/infinilabs/meval-rs" }
chinese-number = "0.7" chinese-number = "0.7"
num2words = "1" num2words = "1"
tauri-plugin-log = "2" tauri-plugin-log = "2"
chrono = "0.4.41" chrono = "0.4.41"
serde_plain = "1.0.2"
derive_more = { version = "2.0.1", features = ["display"] }
anyhow = "1.0.98"
function_name = "0.3.0"
regex = "1.11.1"
borrowme = "0.0.15"
tauri-plugin-opener = "2"
async-recursion = "1.1.1"
zip = "4.0.0"
url = "2.5.2"
camino = "1.1.10"
tokio-stream = { version = "0.1.17", features = ["io-util"] }
cfg-if = "1.0.1"
sysinfo = "0.35.2"
[target."cfg(target_os = \"macos\")".dependencies] [target."cfg(target_os = \"macos\")".dependencies]
tauri-nspanel = { git = "https://github.com/ahkohd/tauri-nspanel", branch = "v2" } tauri-nspanel = { git = "https://github.com/ahkohd/tauri-nspanel", branch = "v2" }
cocoa = "0.24"
[target."cfg(any(target_os = \"macos\", windows, target_os = \"linux\"))".dependencies] [target."cfg(any(target_os = \"macos\", windows, target_os = \"linux\"))".dependencies]
tauri-plugin-single-instance = { version = "2.0.0", features = ["deep-link"] } tauri-plugin-single-instance = { version = "2.0.0", features = ["deep-link"] }
@@ -114,6 +128,9 @@ strip = true # Ensures debug symbols are removed.
tauri-plugin-autostart = "^2.2" tauri-plugin-autostart = "^2.2"
tauri-plugin-global-shortcut = "2" tauri-plugin-global-shortcut = "2"
tauri-plugin-updater = { git = "https://github.com/infinilabs/plugins-workspace", branch = "v2" } tauri-plugin-updater = { git = "https://github.com/infinilabs/plugins-workspace", branch = "v2" }
# This should be compatible with the semver used by `tauri-plugin-updater`
semver = { version = "1", features = ["serde"] }
[target."cfg(target_os = \"windows\")".dependencies] [target."cfg(target_os = \"windows\")".dependencies]
enigo="0.3" enigo="0.3"
windows = { version = "0.61.3", features = ["Win32_Foundation", "Win32_System_Com", "Win32_System_Ole", "Win32_System_Search", "Win32_UI_Shell_PropertiesSystem", "Win32_Data"] }
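The newly added `semver` dependency (kept loose so it stays compatible with the version used internally by `tauri-plugin-updater`) presumably backs the update-check logic mentioned in the release notes ("add check for updates #701", "fix issue with update check failure #833"). A minimal sketch of that kind of comparison, assuming nothing about the actual update code beyond the crate itself; the helper name is hypothetical:

use semver::Version;

// Hypothetical helper: returns true when the remote version is newer than the
// running one. Both inputs are plain version strings such as "0.7.1".
fn update_available(current: &str, remote: &str) -> Result<bool, semver::Error> {
    Ok(Version::parse(remote)? > Version::parse(current)?)
}

fn main() -> Result<(), semver::Error> {
    assert!(update_available("0.7.1", "0.8.0")?);
    assert!(!update_available("0.7.1", "0.7.1")?);
    Ok(())
}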

View File

@@ -1,3 +1,14 @@
fn main() { fn main() {
tauri_build::build() tauri_build::build();
// If env var `GITHUB_ACTIONS` exists, we are running in CI, set up the `ci`
// attribute
if std::env::var("GITHUB_ACTIONS").is_ok() {
println!("cargo:rustc-cfg=ci");
}
// Notify `rustc` of this `cfg` attribute to suppress unknown attribute warnings.
//
// unexpected condition name: `ci`
println!("cargo::rustc-check-cfg=cfg(ci)");
} }
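The build script now emits a custom `ci` cfg whenever the `GITHUB_ACTIONS` environment variable is present, and registers it via `rustc-check-cfg` so rustc does not warn about an unknown condition name. How the flag is consumed is not part of this diff; a hypothetical sketch of the kind of gating it enables:

// Illustrative only: branching on the `ci` cfg emitted by build.rs.
// These functions do not exist in the repository.
#[cfg(ci)]
fn running_in_ci() -> bool {
    true
}

#[cfg(not(ci))]
fn running_in_ci() -> bool {
    false
}

fn main() {
    println!("running in CI: {}", running_in_ci());
}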

View File

@@ -2,7 +2,7 @@
"$schema": "../gen/schemas/desktop-schema.json", "$schema": "../gen/schemas/desktop-schema.json",
"identifier": "default", "identifier": "default",
"description": "Capability for the main window", "description": "Capability for the main window",
"windows": ["main", "chat", "settings"], "windows": ["main", "chat", "settings", "check"],
"permissions": [ "permissions": [
"core:default", "core:default",
"core:event:allow-emit", "core:event:allow-emit",
@@ -71,6 +71,7 @@
"process:default", "process:default",
"updater:default", "updater:default",
"windows-version:default", "windows-version:default",
"log:default" "log:default",
"opener:default"
] ]
} }

View File

@@ -1,2 +1,2 @@
[toolchain] [toolchain]
channel = "nightly-2024-10-29" channel = "nightly-2025-06-26"

View File

@@ -1,10 +1,16 @@
use crate::common;
use crate::common::assistant::ChatRequestMessage; use crate::common::assistant::ChatRequestMessage;
use crate::common::http::GetResponse; use crate::common::http::{GetResponse, convert_query_params_to_strings};
use crate::common::register::SearchSourceRegistry;
use crate::server::http_client::HttpClient; use crate::server::http_client::HttpClient;
use crate::{common, server::servers::COCO_SERVERS};
use futures::StreamExt;
use futures::stream::FuturesUnordered;
use futures_util::TryStreamExt;
use http::Method;
use serde_json::Value; use serde_json::Value;
use std::collections::HashMap; use std::collections::HashMap;
use tauri::{AppHandle, Runtime}; use tauri::{AppHandle, Emitter, Manager, Runtime};
use tokio::io::AsyncBufReadExt;
#[tauri::command] #[tauri::command]
pub async fn chat_history<R: Runtime>( pub async fn chat_history<R: Runtime>(
@@ -14,17 +20,15 @@ pub async fn chat_history<R: Runtime>(
size: u32, size: u32,
query: Option<String>, query: Option<String>,
) -> Result<String, String> { ) -> Result<String, String> {
let mut query_params: HashMap<String, Value> = HashMap::new(); let mut query_params = Vec::new();
if from > 0 {
query_params.insert("from".to_string(), from.into()); // Add from/size as number values
} query_params.push(format!("from={}", from));
if size > 0 { query_params.push(format!("size={}", size));
query_params.insert("size".to_string(), size.into());
}
if let Some(query) = query { if let Some(query) = query {
if !query.is_empty() { if !query.is_empty() {
query_params.insert("query".to_string(), query.into()); query_params.push(format!("query={}", query.to_string()));
} }
} }
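`chat_history` (and the commands below) now pass query parameters around as plain `key=value` strings, and caller-supplied maps go through `convert_query_params_to_strings` from `common::http`. That helper's implementation is not part of this diff; one plausible shape, stated purely as an assumption, is:

use std::collections::HashMap;
use serde_json::Value;

// Assumed signature and behavior; the real common::http implementation may differ.
fn convert_query_params_to_strings(
    params: Option<HashMap<String, Value>>,
) -> Option<Vec<String>> {
    params.map(|map| {
        map.into_iter()
            .map(|(key, value)| match value {
                // Avoid the quotes serde_json's Display would add around strings.
                Value::String(s) => format!("{key}={s}"),
                other => format!("{key}={other}"),
            })
            .collect()
    })
}

fn main() {
    let mut params = HashMap::new();
    params.insert("search".to_string(), Value::Bool(true));
    params.insert("from".to_string(), Value::from(0));
    println!("{:?}", convert_query_params_to_strings(Some(params)));
}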
@@ -46,13 +50,11 @@ pub async fn session_chat_history<R: Runtime>(
from: u32, from: u32,
size: u32, size: u32,
) -> Result<String, String> { ) -> Result<String, String> {
let mut query_params: HashMap<String, Value> = HashMap::new(); let mut query_params = Vec::new();
if from > 0 {
query_params.insert("from".to_string(), from.into()); // Add from/size as number values
} query_params.push(format!("from={}", from));
if size > 0 { query_params.push(format!("size={}", size));
query_params.insert("size".to_string(), size.into());
}
let path = format!("/chat/{}/_history", session_id); let path = format!("/chat/{}/_history", session_id);
@@ -69,10 +71,9 @@ pub async fn open_session_chat<R: Runtime>(
server_id: String, server_id: String,
session_id: String, session_id: String,
) -> Result<String, String> { ) -> Result<String, String> {
let query_params = HashMap::new();
let path = format!("/chat/{}/_open", session_id); let path = format!("/chat/{}/_open", session_id);
let response = HttpClient::post(&server_id, path.as_str(), Some(query_params), None) let response = HttpClient::post(&server_id, path.as_str(), None, None)
.await .await
.map_err(|e| format!("Error open session: {}", e))?; .map_err(|e| format!("Error open session: {}", e))?;
@@ -85,10 +86,9 @@ pub async fn close_session_chat<R: Runtime>(
server_id: String, server_id: String,
session_id: String, session_id: String,
) -> Result<String, String> { ) -> Result<String, String> {
let query_params = HashMap::new();
let path = format!("/chat/{}/_close", session_id); let path = format!("/chat/{}/_close", session_id);
let response = HttpClient::post(&server_id, path.as_str(), Some(query_params), None) let response = HttpClient::post(&server_id, path.as_str(), None, None)
.await .await
.map_err(|e| format!("Error close session: {}", e))?; .map_err(|e| format!("Error close session: {}", e))?;
@@ -99,11 +99,12 @@ pub async fn cancel_session_chat<R: Runtime>(
_app_handle: AppHandle<R>, _app_handle: AppHandle<R>,
server_id: String, server_id: String,
session_id: String, session_id: String,
query_params: Option<HashMap<String, Value>>,
) -> Result<String, String> { ) -> Result<String, String> {
let query_params = HashMap::new();
let path = format!("/chat/{}/_cancel", session_id); let path = format!("/chat/{}/_cancel", session_id);
let query_params = convert_query_params_to_strings(query_params);
let response = HttpClient::post(&server_id, path.as_str(), Some(query_params), None) let response = HttpClient::post(&server_id, path.as_str(), query_params, None)
.await .await
.map_err(|e| format!("Error cancel session: {}", e))?; .map_err(|e| format!("Error cancel session: {}", e))?;
@@ -134,15 +135,22 @@ pub async fn new_chat<R: Runtime>(
let mut headers = HashMap::new(); let mut headers = HashMap::new();
headers.insert("WEBSOCKET-SESSION-ID".to_string(), websocket_id.into()); headers.insert("WEBSOCKET-SESSION-ID".to_string(), websocket_id.into());
let response = let response = HttpClient::advanced_post(
HttpClient::advanced_post(&server_id, "/chat/_new", Some(headers), query_params, body) &server_id,
.await "/chat/_new",
.map_err(|e| format!("Error sending message: {}", e))?; Some(headers),
convert_query_params_to_strings(query_params),
body,
)
.await
.map_err(|e| format!("Error sending message: {}", e))?;
let body_text = common::http::get_response_body_text(response).await?; let body_text = common::http::get_response_body_text(response).await?;
let chat_response: GetResponse = log::debug!("New chat response: {}", &body_text);
serde_json::from_str(&body_text).map_err(|e| format!("Failed to parse response JSON: {}", e))?;
let chat_response: GetResponse = serde_json::from_str(&body_text)
.map_err(|e| format!("Failed to parse response JSON: {}", e))?;
if chat_response.result != "created" { if chat_response.result != "created" {
return Err(format!("Unexpected result: {}", chat_response.result)); return Err(format!("Unexpected result: {}", chat_response.result));
@@ -151,6 +159,69 @@ pub async fn new_chat<R: Runtime>(
Ok(chat_response) Ok(chat_response)
} }
#[tauri::command]
pub async fn chat_create<R: Runtime>(
app_handle: AppHandle<R>,
server_id: String,
message: String,
query_params: Option<HashMap<String, Value>>,
client_id: String,
) -> Result<(), String> {
let body = if !message.is_empty() {
let message = ChatRequestMessage {
message: Some(message),
};
Some(
serde_json::to_string(&message)
.map_err(|e| format!("Failed to serialize message: {}", e))?
.into(),
)
} else {
None
};
let response = HttpClient::advanced_post(
&server_id,
"/chat/_create",
None,
convert_query_params_to_strings(query_params),
body,
)
.await
.map_err(|e| format!("Error sending message: {}", e))?;
if response.status() == 429 {
log::warn!("Rate limit exceeded for chat create");
return Err("Rate limited".to_string());
}
if !response.status().is_success() {
return Err(format!("Request failed with status: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
log::info!("client_id_create: {}", &client_id);
while let Ok(Some(line)) = lines.next_line().await {
log::info!("Received chat stream line: {}", &line);
if let Err(err) = app_handle.emit(&client_id, line) {
log::error!("Emit failed: {:?}", err);
print!("Error sending message: {:?}", err);
let _ = app_handle.emit("chat-create-error", format!("Emit failed: {:?}", err));
}
}
Ok(())
}
#[tauri::command] #[tauri::command]
pub async fn send_message<R: Runtime>( pub async fn send_message<R: Runtime>(
_app_handle: AppHandle<R>, _app_handle: AppHandle<R>,
@@ -173,15 +244,83 @@ pub async fn send_message<R: Runtime>(
&server_id, &server_id,
path.as_str(), path.as_str(),
Some(headers), Some(headers),
query_params, convert_query_params_to_strings(query_params),
Some(body), Some(body),
) )
.await .await
.map_err(|e| format!("Error cancel session: {}", e))?; .map_err(|e| format!("Error cancel session: {}", e))?;
common::http::get_response_body_text(response).await common::http::get_response_body_text(response).await
} }
#[tauri::command]
pub async fn chat_chat<R: Runtime>(
app_handle: AppHandle<R>,
server_id: String,
session_id: String,
message: String,
query_params: Option<HashMap<String, Value>>, //search,deep_thinking
client_id: String,
) -> Result<(), String> {
let body = if !message.is_empty() {
let message = ChatRequestMessage {
message: Some(message),
};
Some(
serde_json::to_string(&message)
.map_err(|e| format!("Failed to serialize message: {}", e))?
.into(),
)
} else {
None
};
let path = format!("/chat/{}/_chat", session_id);
let response = HttpClient::advanced_post(
&server_id,
path.as_str(),
None,
convert_query_params_to_strings(query_params),
body,
)
.await
.map_err(|e| format!("Error sending message: {}", e))?;
if response.status() == 429 {
log::warn!("Rate limit exceeded for chat create");
return Err("Rate limited".to_string());
}
if !response.status().is_success() {
return Err(format!("Request failed with status: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
let mut first_log = true;
log::info!("client_id: {}", &client_id);
while let Ok(Some(line)) = lines.next_line().await {
log::info!("Received chat stream line: {}", &line);
if first_log {
log::info!("first stream line: {}", &line);
first_log = false;
}
if let Err(err) = app_handle.emit(&client_id, line) {
log::error!("Emit failed: {:?}", err);
let _ = app_handle.emit("chat-create-error", format!("Emit failed: {:?}", err));
}
}
Ok(())
}
#[tauri::command]
pub async fn delete_session_chat(server_id: String, session_id: String) -> Result<bool, String> {
    let response =
@@ -219,8 +358,8 @@ pub async fn update_session_chat(
        None,
        Some(reqwest::Body::from(serde_json::to_string(&body).unwrap())),
    )
    .await
    .map_err(|e| format!("Error updating session: {}", e))?;
    Ok(response.status().is_success())
}
@@ -229,30 +368,184 @@ pub async fn update_session_chat(
pub async fn assistant_search<R: Runtime>(
    _app_handle: AppHandle<R>,
    server_id: String,
-   from: u32,
-   size: u32,
-   query: Option<HashMap<String, Value>>,
+   query_params: Option<Vec<String>>,
) -> Result<Value, String> {
-   let mut body = serde_json::json!({
-       "from": from,
-       "size": size,
-   });
-   if let Some(q) = query {
-       body["query"] = serde_json::to_value(q).map_err(|e| e.to_string())?;
-   }
-   let response = HttpClient::post(
-       &server_id,
-       "/assistant/_search",
-       None,
-       Some(reqwest::Body::from(body.to_string())),
-   )
-   .await
-   .map_err(|e| format!("Error searching assistants: {}", e))?;
+   let response = HttpClient::post(&server_id, "/assistant/_search", query_params, None)
+       .await
+       .map_err(|e| format!("Error searching assistants: {}", e))?;
    response
        .json::<Value>()
        .await
        .map_err(|err| err.to_string())
}
#[tauri::command]
pub async fn assistant_get<R: Runtime>(
_app_handle: AppHandle<R>,
server_id: String,
assistant_id: String,
) -> Result<Value, String> {
let response = HttpClient::get(
&server_id,
&format!("/assistant/{}", assistant_id),
None, // headers
)
.await
.map_err(|e| format!("Error getting assistant: {}", e))?;
response
.json::<Value>()
.await
.map_err(|err| err.to_string())
}
/// Gets the information of the assistant specified by `assistant_id` by querying **all**
/// Coco servers.
///
/// Returns as soon as the assistant is found on any Coco server.
#[tauri::command]
pub async fn assistant_get_multi<R: Runtime>(
app_handle: AppHandle<R>,
assistant_id: String,
) -> Result<Value, String> {
let search_sources = app_handle.state::<SearchSourceRegistry>();
let sources_future = search_sources.get_sources();
let sources_list = sources_future.await;
let mut futures = FuturesUnordered::new();
for query_source in &sources_list {
let query_source_type = query_source.get_type();
if query_source_type.r#type != COCO_SERVERS {
// Assistants only exist on Coco servers.
continue;
}
let coco_server_id = query_source_type.id.clone();
let path = format!("/assistant/{}", assistant_id);
let fut = async move {
let res_response = HttpClient::get(
&coco_server_id,
&path,
None, // headers
)
.await;
match res_response {
Ok(response) => response
.json::<serde_json::Value>()
.await
.map_err(|e| e.to_string()),
Err(e) => Err(e),
}
};
futures.push(fut);
}
while let Some(res_response_json) = futures.next().await {
let response_json = match res_response_json {
Ok(json) => json,
Err(e) => return Err(e),
};
// Example response JSON
//
// When assistant is not found:
// ```json
// {
// "_id": "ID",
// "result": "not_found"
// }
// ```
//
// When assistant is found:
// ```json
// {
// "_id": "ID",
// "_source": {...}
// "found": true
// }
// ```
if let Some(found) = response_json.get("found") {
if found == true {
return Ok(response_json);
}
}
}
Err(format!(
"could not find Assistant [{}] on all the Coco servers",
assistant_id
))
}
use regex::Regex;
/// Remove all `"icon": "..."` fields from a JSON string
pub fn remove_icon_fields(json: &str) -> String {
// Regex to match `"icon": "..."` fields, including base64 or escaped strings
let re = Regex::new(r#""icon"\s*:\s*"[^"]*"(,?)"#).unwrap();
// Replace with empty string, or just remove trailing comma if needed
re.replace_all(json, |caps: &regex::Captures| {
if &caps[1] == "," {
"".to_string() // keep comma removal logic safe
} else {
"".to_string()
}
})
.to_string()
}
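// Illustrative check (not part of this changeset) of what the regex above removes:
// the `"icon"` field and its trailing comma are dropped, and the rest of the JSON
// string is left untouched.
#[cfg(test)]
mod remove_icon_fields_sketch {
    #[test]
    fn drops_icon_field_and_trailing_comma() {
        let raw = r#"{"icon": "data:image/png;base64,AAAA","name": "Coco"}"#;
        assert_eq!(super::remove_icon_fields(raw), r#"{"name": "Coco"}"#);
    }
}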
#[tauri::command]
pub async fn ask_ai<R: Runtime>(
app_handle: AppHandle<R>,
message: String,
server_id: String,
assistant_id: String,
client_id: String,
) -> Result<(), String> {
let cleaned = remove_icon_fields(message.as_str());
let body = serde_json::json!({ "message": cleaned });
let path = format!("/assistant/{}/_ask", assistant_id);
println!("Sending request to {}", &path);
let response = HttpClient::send_request(
server_id.as_str(),
Method::POST,
path.as_str(),
None,
None,
Some(reqwest::Body::from(body.to_string())),
)
.await?;
if response.status() == 429 {
log::warn!("Rate limit exceeded for assistant: {}", &assistant_id);
return Ok(());
}
if !response.status().is_success() {
return Err(format!("Request Failed: {}", response.status()));
}
let stream = response.bytes_stream();
let reader = tokio_util::io::StreamReader::new(
stream.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e)),
);
let mut lines = tokio::io::BufReader::new(reader).lines();
while let Ok(Some(line)) = lines.next_line().await {
dbg!("Received line: {}", &line);
let _ = app_handle.emit(&client_id, line).map_err(|err| {
println!("Failed to emit: {:?}", err);
});
}
Ok(())
}
View File
@@ -3,38 +3,43 @@ use std::{fs::create_dir, io::Read};
use tauri::{Manager, Runtime};
use tauri_plugin_autostart::ManagerExt;
-// Start or stop according to configuration
-pub fn enable_autostart(app: &mut tauri::App) {
-    use tauri_plugin_autostart::MacosLauncher;
-    use tauri_plugin_autostart::ManagerExt;
-    app.handle()
-        .plugin(tauri_plugin_autostart::init(
-            MacosLauncher::AppleScript,
-            None,
-        ))
-        .unwrap();
+/// If the state reported from the OS and the state stored by us differ, our state is
+/// prioritized and seen as the correct one. Update the OS state to make them consistent.
+pub fn ensure_autostart_state_consistent(app: &mut tauri::App) -> Result<(), String> {
    let autostart_manager = app.autolaunch();
-    // close autostart
-    // autostart_manager.disable().unwrap();
-    // return;
-    match (
-        autostart_manager.is_enabled(),
-        current_autostart(app.app_handle()),
-    ) {
-        (Ok(false), Ok(true)) => match autostart_manager.enable() {
-            Ok(_) => println!("Autostart enabled successfully."),
-            Err(err) => eprintln!("Failed to enable autostart: {}", err),
-        },
-        (Ok(true), Ok(false)) => match autostart_manager.disable() {
-            Ok(_) => println!("Autostart disable successfully."),
-            Err(err) => eprintln!("Failed to disable autostart: {}", err),
-        },
-        _ => (),
+    let os_state = autostart_manager.is_enabled().map_err(|e| e.to_string())?;
+    let coco_stored_state = current_autostart(app.app_handle()).map_err(|e| e.to_string())?;
+    if os_state != coco_stored_state {
+        log::warn!(
+            "autostart inconsistent states, OS state [{}], Coco state [{}], config file could be deleted or corrupted",
+            os_state,
+            coco_stored_state
+        );
+        log::info!("trying to correct the inconsistent states");
+        let result = if coco_stored_state {
+            autostart_manager.enable()
+        } else {
+            autostart_manager.disable()
+        };
+        match result {
+            Ok(_) => {
+                log::info!("inconsistent autostart states fixed");
+            }
+            Err(e) => {
+                log::error!(
+                    "failed to fix inconsistent autostart state due to error [{}]",
+                    e
+                );
+                return Err(e.to_string());
+            }
+        }
    }
+    Ok(())
}
fn current_autostart<R: Runtime>(app: &tauri::AppHandle<R>) -> Result<bool, String> {
View File
@@ -9,13 +9,13 @@ pub struct ChatRequestMessage {
#[allow(dead_code)]
pub struct NewChatResponse {
    pub _id: String,
-   pub _source: Source,
+   pub _source: Session,
    pub result: String,
    pub payload: Option<Value>,
}
#[derive(Debug, Serialize, Deserialize)]
-pub struct Source {
+pub struct Session {
    pub id: String,
    pub created: String,
    pub updated: String,
@@ -23,4 +23,11 @@ pub struct Source {
    pub title: Option<String>,
    pub summary: Option<String>,
    pub manually_renamed_title: bool,
+   pub visible: Option<bool>,
+   pub context: Option<SessionContext>,
+}
+#[derive(Debug, Serialize, Deserialize)]
+pub struct SessionContext {
+   pub attachments: Option<Vec<String>>,
}
View File
@@ -1,6 +1,6 @@
use serde::{Deserialize, Serialize};
-#[derive(Debug,Clone, Serialize, Deserialize)]
+#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Connector {
    pub id: String,
    pub created: Option<String>,
@@ -13,7 +13,7 @@ pub struct Connector {
    pub url: Option<String>,
    pub assets: Option<ConnectorAssets>,
}
-#[derive(Debug,Clone, Serialize, Deserialize)]
+#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ConnectorAssets {
    pub icons: Option<std::collections::HashMap<String, String>>,
}
View File
@@ -18,4 +18,4 @@ pub struct DataSource {
pub struct ConnectorConfig {
    pub id: Option<String>,
    pub config: Option<serde_json::Value>, // Using serde_json::Value to handle any type of config
}
View File
@@ -1,5 +1,7 @@
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
+use tauri::AppHandle;
+use tauri::Runtime;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RichLabel {
@@ -29,6 +31,87 @@ pub struct EditorInfo {
    pub timestamp: Option<String>,
}
/// Defines the action that would be performed when a document gets opened.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub(crate) enum OnOpened {
/// Launch the application
Application { app_path: String },
/// Open the URL.
Document { url: String },
/// Spawn a child process to run the `CommandAction`.
Command {
action: crate::extension::CommandAction,
},
}
impl OnOpened {
pub(crate) fn url(&self) -> String {
match self {
Self::Application { app_path } => app_path.clone(),
Self::Document { url } => url.clone(),
Self::Command { action } => {
const WHITESPACE: &str = " ";
let mut ret = action.exec.clone();
ret.push_str(WHITESPACE);
if let Some(ref args) = action.args {
ret.push_str(args.join(WHITESPACE).as_str());
}
ret
}
}
}
}
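// Illustrative sketch (not part of this changeset) of what `url()` yields per variant:
// `Application` returns its `app_path`, `Document` its `url`, and `Command` the
// executable followed by its space-joined arguments.
#[cfg(test)]
mod on_opened_url_sketch {
    use super::OnOpened;
    #[test]
    fn document_variant_returns_its_url() {
        let on_opened = OnOpened::Document {
            url: "https://example.com/doc".to_string(),
        };
        assert_eq!(on_opened.url(), "https://example.com/doc");
    }
}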
#[tauri::command]
pub(crate) async fn open<R: Runtime>(
tauri_app_handle: AppHandle<R>,
on_opened: OnOpened,
) -> Result<(), String> {
log::debug!("open({})", on_opened.url());
use crate::util::open as homemade_tauri_shell_open;
use std::process::Command;
match on_opened {
OnOpened::Application { app_path } => {
homemade_tauri_shell_open(tauri_app_handle.clone(), app_path).await?
}
OnOpened::Document { url } => {
homemade_tauri_shell_open(tauri_app_handle.clone(), url).await?
}
OnOpened::Command { action } => {
let mut cmd = Command::new(action.exec);
if let Some(args) = action.args {
cmd.args(args);
}
let output = cmd.output().map_err(|e| e.to_string())?;
// Sometimes, we wanna see the result in logs even though it doesn't fail.
log::debug!(
"executing open(Command) result, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
if !output.status.success() {
log::warn!(
"executing open(Command) failed, exit code: [{}], stdout: [{}], stderr: [{}]",
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)
);
return Err(format!(
"Command failed, stderr [{}]",
String::from_utf8_lossy(&output.stderr)
));
}
}
}
Ok(())
}
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct Document {
    pub id: String,
@@ -48,6 +131,8 @@ pub struct Document {
    pub thumbnail: Option<String>,
    pub cover: Option<String>,
    pub tags: Option<Vec<String>>,
+   /// What will happen if we open this document.
+   pub on_opened: Option<OnOpened>,
    pub url: Option<String>,
    pub size: Option<i64>,
    pub metadata: Option<HashMap<String, serde_json::Value>>,
View File
@@ -1,34 +1,67 @@
-use serde::{Deserialize, Serialize};
+use reqwest::StatusCode;
+use serde::{Deserialize, Serialize, Serializer};
use thiserror::Error;
+fn serialize_optional_status_code<S>(
+    status_code: &Option<StatusCode>,
+    serializer: S,
+) -> Result<S::Ok, S::Error>
+where
+    S: Serializer,
+{
+    match status_code {
+        Some(code) => serializer.serialize_str(&format!("{:?}", code)),
+        None => serializer.serialize_none(),
+    }
+}
+#[allow(unused)]
#[derive(Debug, Deserialize)]
+pub struct ErrorCause {
+    #[serde(default)]
+    pub r#type: Option<String>,
+    #[serde(default)]
+    pub reason: Option<String>,
+}
+#[derive(Debug, Deserialize)]
+#[allow(unused)]
pub struct ErrorDetail {
-   pub reason: String,
-   pub status: u16,
+   #[serde(default)]
+   pub root_cause: Option<Vec<ErrorCause>>,
+   #[serde(default)]
+   pub r#type: Option<String>,
+   #[serde(default)]
+   pub reason: Option<String>,
+   #[serde(default)]
+   pub caused_by: Option<ErrorCause>,
}
#[derive(Debug, Deserialize)]
pub struct ErrorResponse {
-   pub error: ErrorDetail,
+   #[serde(default)]
+   pub error: Option<ErrorDetail>,
+   #[serde(default)]
+   #[allow(unused)]
+   pub status: Option<u16>,
}
#[derive(Debug, Error, Serialize)]
pub enum SearchError {
-   #[error("HTTP request failed: {0}")]
-   HttpError(String),
+   #[error("HttpError: status code [{status_code:?}], msg [{msg}]")]
+   HttpError {
+       #[serde(serialize_with = "serialize_optional_status_code")]
+       status_code: Option<StatusCode>,
+       msg: String,
+   },
-   #[error("Invalid response format: {0}")]
+   #[error("ParseError: {0}")]
    ParseError(String),
    #[error("Timeout occurred")]
    Timeout,
-   #[error("Unknown error: {0}")]
-   #[allow(dead_code)]
-   Unknown(String),
-   #[error("InternalError error: {0}")]
-   #[allow(dead_code)]
+   #[error("InternalError: {0}")]
    InternalError(String),
}
@@ -39,7 +72,10 @@ impl From<reqwest::Error> for SearchError {
    } else if err.is_decode() {
        SearchError::ParseError(err.to_string())
    } else {
-       SearchError::HttpError(err.to_string())
+       SearchError::HttpError {
+           status_code: err.status(),
+           msg: err.to_string(),
+       }
    }
}
}
View File
@@ -2,6 +2,8 @@ use crate::common;
use reqwest::Response;
use serde::{Deserialize, Serialize};
use serde_json::Value;
+use std::collections::HashMap;
+use tauri_plugin_store::JsonValue;
#[derive(Debug, Serialize, Deserialize)]
pub struct GetResponse {
@@ -40,13 +42,34 @@ pub async fn get_response_body_text(response: Response) -> Result<String, String
            Ok(parsed_error) => {
                dbg!(&parsed_error);
                Err(format!(
-                   "Server error ({}): {}",
-                   parsed_error.error.status, parsed_error.error.reason
+                   "Server error ({}): {:?}",
+                   status, parsed_error.error
                ))
            }
-           Err(_) => Err(fallback_error),
+           Err(_) => {
+               log::warn!("Failed to parse error response: {}", &body);
+               Err(fallback_error)
+           }
        }
    } else {
        Ok(body)
    }
}
pub fn convert_query_params_to_strings(
query_params: Option<HashMap<String, JsonValue>>,
) -> Option<Vec<String>> {
query_params.map(|map| {
map.into_iter()
.filter_map(|(k, v)| match v {
JsonValue::String(s) => Some(format!("{}={}", k, s)),
JsonValue::Number(n) => Some(format!("{}={}", k, n)),
JsonValue::Bool(b) => Some(format!("{}={}", k, b)),
_ => {
eprintln!("Skipping unsupported query value for key '{}': {:?}", k, v);
None
}
})
.collect()
})
}
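// Illustrative sketch (not part of this changeset): how the helper flattens typed
// query parameters into the `key=value` strings that `HttpClient` expects.
#[cfg(test)]
mod convert_query_params_sketch {
    use super::*;
    #[test]
    fn scalar_values_become_key_value_pairs() {
        let mut params = HashMap::new();
        params.insert("search".to_string(), serde_json::json!(true));
        params.insert("from".to_string(), serde_json::json!(5));
        let mut flattened = convert_query_params_to_strings(Some(params)).unwrap();
        flattened.sort();
        assert_eq!(flattened, vec!["from=5".to_string(), "search=true".to_string()]);
    }
}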
View File
@@ -1,16 +1,17 @@
-pub mod health;
-pub mod profile;
-pub mod server;
-pub mod auth;
-pub mod datasource;
-pub mod connector;
-pub mod search;
-pub mod document;
-pub mod traits;
-pub mod register;
pub mod assistant;
-pub mod http;
+pub mod auth;
+pub mod connector;
+pub mod datasource;
+pub mod document;
pub mod error;
+pub mod health;
+pub mod http;
+pub mod profile;
+pub mod register;
+pub mod search;
+pub mod server;
+pub mod traits;
pub static MAIN_WINDOW_LABEL: &str = "main";
pub static SETTINGS_WINDOW_LABEL: &str = "settings";
+pub static CHECK_WINDOW_LABEL: &str = "check";
View File
@@ -13,4 +13,4 @@ pub struct UserProfile {
    pub email: String,
    pub avatar: Option<String>,
    pub preferences: Option<Preferences>,
}
View File
@@ -7,8 +7,8 @@ use std::error::Error;
#[derive(Debug, Serialize, Deserialize)]
pub struct SearchResponse<T> {
-   pub took: u64,
-   pub timed_out: bool,
+   pub took: Option<u64>,
+   pub timed_out: Option<bool>,
    pub _shards: Option<Shards>,
    pub hits: Hits<T>,
}
@@ -25,7 +25,7 @@ pub struct Shards {
pub struct Hits<T> {
    pub total: Total,
    pub max_score: Option<f32>,
-   pub hits: Vec<SearchHit<T>>,
+   pub hits: Option<Vec<SearchHit<T>>>,
}
#[derive(Debug, Serialize, Deserialize)]
@@ -36,9 +36,9 @@ pub struct Total {
#[derive(Debug, Serialize, Deserialize)]
pub struct SearchHit<T> {
-   pub _index: String,
-   pub _type: String,
-   pub _id: String,
+   pub _index: Option<String>,
+   pub _type: Option<String>,
+   pub _id: Option<String>,
    pub _score: Option<f64>,
    pub _source: T, // This will hold the type we pass in (e.g., DataSource)
}
@@ -58,13 +58,18 @@ where
    Ok(search_response)
}
+use serde::de::DeserializeOwned;
pub async fn parse_search_hits<T>(response: Response) -> Result<Vec<SearchHit<T>>, Box<dyn Error>>
where
-   T: for<'de> Deserialize<'de> + std::fmt::Debug,
+   T: DeserializeOwned + std::fmt::Debug,
{
    let response = parse_search_response(response).await?;
-   Ok(response.hits.hits)
+   match response.hits.hits {
+       Some(hits) => Ok(hits),
+       None => Ok(Vec::new()),
+   }
}
pub async fn parse_search_results<T>(response: Response) -> Result<Vec<T>, Box<dyn Error>>
@@ -78,20 +83,6 @@ where
        .collect())
}
-#[allow(dead_code)]
-pub async fn parse_search_results_with_score<T>(
-    response: Response,
-) -> Result<Vec<(T, Option<f64>)>, Box<dyn Error>>
-where
-    T: for<'de> Deserialize<'de> + std::fmt::Debug,
-{
-    Ok(parse_search_hits(response)
-        .await?
-        .into_iter()
-        .map(|hit| (hit._source, hit._score))
-        .collect())
-}
#[derive(Debug, Clone, Serialize)]
pub struct SearchQuery {
    pub from: u64,
View File
@@ -1,6 +1,8 @@
use crate::common::health::Health;
use crate::common::profile::UserProfile;
use serde::{Deserialize, Serialize};
+use serde_json::Value;
+use std::collections::HashMap;
use std::hash::{Hash, Hasher};
#[derive(Debug, Clone, Serialize, Deserialize)]
@@ -48,9 +50,17 @@ pub struct Server {
    pub updated: String,
    #[serde(default = "default_enabled_type")]
    pub enabled: bool,
+   /// Public Coco servers can be used without signing in.
    #[serde(default = "default_bool_type")]
    pub public: bool,
+   /// A Coco server is available if:
+   ///
+   /// 1. It is still online, we check this via the `GET /base_url/provider/_info`
+   ///    interface.
+   /// 2. A user is logged in to this Coco server, i.e., a token is stored in the
+   ///    `SERVER_TOKEN_LIST_CACHE`.
+   /// For public Coco servers, requirement 2 is not needed.
    #[serde(default = "default_available_type")]
    pub available: bool,
@@ -60,6 +70,7 @@ pub struct Server {
    pub auth_provider: AuthProvider,
    #[serde(default = "default_priority_type")]
    pub priority: u32,
+   pub stats: Option<HashMap<String, Value>>,
}
impl PartialEq for Server {
@@ -81,7 +92,10 @@ pub struct ServerAccessToken {
    #[serde(default = "default_empty_string")] // Custom default function for empty string
    pub id: String,
    pub access_token: String,
-   pub expired_at: u32, //unix timestamp in seconds
+   /// Unix timestamp in seconds
+   ///
+   /// Currently, this is UNUSED.
+   pub expired_at: u32,
}
impl ServerAccessToken {
View File
@@ -1,13 +1,16 @@
use crate::common::error::SearchError;
-// use std::{future::Future, pin::Pin};
use crate::common::search::SearchQuery;
use crate::common::search::{QueryResponse, QuerySource};
use async_trait::async_trait;
+use tauri::AppHandle;
#[async_trait]
pub trait SearchSource: Send + Sync {
    fn get_type(&self) -> QuerySource;
-   async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError>;
+   async fn search(
+       &self,
+       tauri_app_handle: AppHandle,
+       query: SearchQuery,
+   ) -> Result<QueryResponse, SearchError>;
}
View File
@@ -0,0 +1,13 @@
pub(super) const EXTENSION_ID: &str = "AIOverview";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "AIOverview",
"name": "AI Overview",
"description": "...",
"icon": "font_a-AIOverview",
"type": "ai_extension",
"enabled": true
}
"#;

View File

@@ -12,9 +12,10 @@ pub use with_feature::*;
#[cfg(not(feature = "use_pizza_engine"))]
pub use without_feature::*;
#[derive(Debug, Serialize, Clone)]
#[serde(rename_all = "camelCase")]
+#[allow(dead_code)]
pub struct AppEntry {
    path: String,
    name: String,
@@ -24,15 +25,26 @@ pub struct AppEntry {
    is_disabled: bool,
}
#[derive(serde::Serialize)]
#[serde(rename_all = "camelCase")]
pub struct AppMetadata {
    name: String,
    r#where: String,
    size: u64,
-   icon: String,
    created: u128,
    modified: u128,
    last_opened: u128,
}
+/// JSON file for this extension.
+pub(crate) const PLUGIN_JSON_FILE: &str = r#"
+{
+  "id": "Applications",
+  "platforms": ["macos", "linux", "windows"],
+  "name": "Applications",
+  "description": "Application search",
+  "icon": "font_Application",
+  "type": "group",
+  "enabled": true
+}
+"#;
View File
@@ -1,18 +1,20 @@
+use super::super::Extension;
+use super::AppMetadata;
use crate::common::error::SearchError;
use crate::common::search::{QueryResponse, QuerySource, SearchQuery};
use crate::common::traits::SearchSource;
-use crate::local::LOCAL_QUERY_SOURCE_TYPE;
+use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use async_trait::async_trait;
use tauri::{AppHandle, Runtime};
-use super::AppEntry;
-use super::AppMetadata;
pub(crate) const QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME: &str = "Applications";
pub struct ApplicationSearchSource;
impl ApplicationSearchSource {
-   pub async fn init<R: Runtime>(_app_handle: AppHandle<R>) -> Result<(), String> {
+   pub async fn prepare_index_and_store<R: Runtime>(
+       _app_handle: AppHandle<R>,
+   ) -> Result<(), String> {
        Ok(())
    }
}
@@ -30,7 +32,11 @@ impl SearchSource for ApplicationSearchSource {
        }
    }
-   async fn search(&self, _query: SearchQuery) -> Result<QueryResponse, SearchError> {
+   async fn search(
+       &self,
+       _tauri_app_handle: AppHandle,
+       _query: SearchQuery,
+   ) -> Result<QueryResponse, SearchError> {
        Ok(QueryResponse {
            source: self.get_type(),
            hits: Vec::new(),
@@ -39,46 +45,45 @@ impl SearchSource for ApplicationSearchSource {
        })
    }
}
-#[tauri::command]
-pub async fn set_app_alias(_app_path: String, _alias: String) -> Result<(), String> {
+pub fn set_app_alias<R: Runtime>(_tauri_app_handle: &AppHandle<R>, _app_path: &str, _alias: &str) {
    unreachable!("app list should be empty, there is no way this can be invoked")
}
-#[tauri::command]
-pub async fn register_app_hotkey<R: Runtime>(
-    _tauri_app_handle: AppHandle<R>,
-    _app_path: String,
-    _hotkey: String,
+pub fn register_app_hotkey<R: Runtime>(
+    _tauri_app_handle: &AppHandle<R>,
+    _app_path: &str,
+    _hotkey: &str,
) -> Result<(), String> {
    unreachable!("app list should be empty, there is no way this can be invoked")
}
-#[tauri::command]
-pub async fn unregister_app_hotkey<R: Runtime>(
-    _tauri_app_handle: AppHandle<R>,
-    _app_path: String,
+pub fn unregister_app_hotkey<R: Runtime>(
-    _tauri_app_handle: &AppHandle<R>,
-    _app_path: &str,
) -> Result<(), String> {
    unreachable!("app list should be empty, there is no way this can be invoked")
}
-#[tauri::command]
-pub async fn disable_app_search<R: Runtime>(
-    _tauri_app_handle: AppHandle<R>,
-    _app_path: String,
+pub fn disable_app_search<R: Runtime>(
+    _tauri_app_handle: &AppHandle<R>,
+    _app_path: &str,
) -> Result<(), String> {
    // no-op
    Ok(())
}
-#[tauri::command]
-pub async fn enable_app_search<R: Runtime>(
-    _tauri_app_handle: AppHandle<R>,
-    _app_path: String,
+pub fn enable_app_search<R: Runtime>(
+    _tauri_app_handle: &AppHandle<R>,
+    _app_path: &str,
) -> Result<(), String> {
    // no-op
    Ok(())
}
+pub fn is_app_search_enabled(_app_path: &str) -> bool {
+    false
+}
#[tauri::command]
pub async fn add_app_search_path<R: Runtime>(
    _tauri_app_handle: AppHandle<R>,
@@ -103,11 +108,10 @@ pub async fn get_app_search_path<R: Runtime>(_tauri_app_handle: AppHandle<R>) ->
    Vec::new()
}
#[tauri::command]
pub async fn get_app_list<R: Runtime>(
    _tauri_app_handle: AppHandle<R>,
-) -> Result<Vec<AppEntry>, String> {
+) -> Result<Vec<Extension>, String> {
    // Return an empty list
    Ok(Vec::new())
}
@@ -119,3 +123,23 @@ pub async fn get_app_metadata<R: Runtime>(
) -> Result<AppMetadata, String> {
    unreachable!("app list should be empty, there is no way this can be invoked")
}
+pub(crate) fn set_apps_hotkey<R: Runtime>(_tauri_app_handle: &AppHandle<R>) -> Result<(), String> {
+    // no-op
+    Ok(())
+}
+pub(crate) fn unset_apps_hotkey<R: Runtime>(
+    _tauri_app_handle: &AppHandle<R>,
+) -> Result<(), String> {
+    // no-op
+    Ok(())
+}
+#[tauri::command]
+pub async fn reindex_applications<R: Runtime>(
+    _tauri_app_handle: AppHandle<R>,
+) -> Result<(), String> {
+    // no-op
+    Ok(())
+}
View File
@@ -1,4 +1,4 @@
-use super::LOCAL_QUERY_SOURCE_TYPE;
+use super::super::LOCAL_QUERY_SOURCE_TYPE;
use crate::common::{
    document::{DataSourceReference, Document},
    error::SearchError,
@@ -10,9 +10,23 @@ use chinese_number::{ChineseCase, ChineseCountMethod, ChineseVariant, NumberToCh
use num2words::Num2Words;
use serde_json::Value;
use std::collections::HashMap;
+use tauri::AppHandle;
pub(crate) const DATA_SOURCE_ID: &str = "Calculator";
+/// JSON file for this extension.
+pub(crate) const PLUGIN_JSON_FILE: &str = r#"
+{
+  "id": "Calculator",
+  "name": "Calculator",
+  "platforms": ["macos", "linux", "windows"],
+  "description": "...",
+  "icon": "font_Calculator",
+  "type": "calculator",
+  "enabled": true
+}
+"#;
pub struct CalculatorSource {
    base_score: f64,
}
@@ -23,7 +37,7 @@ impl CalculatorSource {
    }
}
-fn parse_query(query: String) -> Value {
+fn parse_query(query: &str) -> Value {
    let mut query_json = serde_json::Map::new();
    let operators = ["+", "-", "*", "/", "%"];
@@ -48,7 +62,7 @@ fn parse_query(query: String) -> Value {
        query_json.insert("type".to_string(), Value::String("expression".to_string()));
    }
-   query_json.insert("value".to_string(), Value::String(query));
+   query_json.insert("value".to_string(), Value::String(query.to_string()));
    Value::Object(query_json)
}
@@ -107,12 +121,22 @@ impl SearchSource for CalculatorSource {
        }
    }
-   async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
-       let query_string = query
-           .query_strings
-           .get("query")
-           .unwrap_or(&"".to_string())
-           .to_string();
+   async fn search(
+       &self,
+       _tauri_app_handle: AppHandle,
+       query: SearchQuery,
+   ) -> Result<QueryResponse, SearchError> {
+       let Some(query_string) = query.query_strings.get("query") else {
+           return Ok(QueryResponse {
+               source: self.get_type(),
+               hits: Vec::new(),
+               total_hits: 0,
+           });
+       };
+       // Trim the leading and trailing whitespace so that our later if condition
+       // will only be evaluated against non-whitespace characters.
+       let query_string = query_string.trim();
        if query_string.is_empty() || query_string.len() == 1 {
            return Ok(QueryResponse {
@@ -122,42 +146,54 @@ impl SearchSource for CalculatorSource {
            });
        }
-       match meval::eval_str(&query_string) {
-           Ok(num) => {
-               let mut payload: HashMap<String, Value> = HashMap::new();
-               let payload_query = parse_query(query_string);
-               let payload_result = parse_result(num);
-               payload.insert("query".to_string(), payload_query);
-               payload.insert("result".to_string(), payload_result);
-               let doc = Document {
-                   id: DATA_SOURCE_ID.to_string(),
-                   category: Some(DATA_SOURCE_ID.to_string()),
-                   payload: Some(payload),
-                   source: Some(DataSourceReference {
-                       r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
-                       name: Some(DATA_SOURCE_ID.into()),
-                       id: Some(DATA_SOURCE_ID.into()),
-                       icon: None,
-                   }),
-                   ..Default::default()
-               };
-               return Ok(QueryResponse {
-                   source: self.get_type(),
-                   hits: vec![(doc, self.base_score)],
-                   total_hits: 1,
-               });
-           }
-           Err(_) => {
-               return Ok(QueryResponse {
-                   source: self.get_type(),
-                   hits: Vec::new(),
-                   total_hits: 0,
-               });
-           }
-       };
+       let query_string_clone = query_string.to_string();
+       let query_source = self.get_type();
+       let base_score = self.base_score;
+       let closure = move || -> QueryResponse {
+           let res_num = meval::eval_str(&query_string_clone);
+           match res_num {
+               Ok(num) => {
+                   let mut payload: HashMap<String, Value> = HashMap::new();
+                   let payload_query = parse_query(&query_string_clone);
+                   let payload_result = parse_result(num);
+                   payload.insert("query".to_string(), payload_query);
+                   payload.insert("result".to_string(), payload_result);
+                   let doc = Document {
+                       id: DATA_SOURCE_ID.to_string(),
+                       category: Some(DATA_SOURCE_ID.to_string()),
+                       payload: Some(payload),
+                       source: Some(DataSourceReference {
+                           r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
+                           name: Some(DATA_SOURCE_ID.into()),
+                           id: Some(DATA_SOURCE_ID.into()),
+                           icon: Some(String::from("font_Calculator")),
+                       }),
+                       ..Default::default()
+                   };
+                   QueryResponse {
+                       source: query_source,
+                       hits: vec![(doc, base_score)],
+                       total_hits: 1,
+                   }
+               }
+               Err(_) => QueryResponse {
+                   source: query_source,
+                   hits: Vec::new(),
+                   total_hits: 0,
+               },
+           }
+       };
+       let spawn_result = tokio::task::spawn_blocking(closure).await;
+       match spawn_result {
+           Ok(response) => Ok(response),
+           Err(e) => std::panic::resume_unwind(e.into_panic()),
+       }
    }
}
View File
@@ -0,0 +1,212 @@
//! File Search configuration entries definition and getter/setter functions.
use serde::Deserialize;
use serde::Serialize;
use serde_json::Value;
use std::sync::LazyLock;
use tauri::AppHandle;
use tauri::Runtime;
use tauri_plugin_store::StoreExt;
// Tauri store keys for file system configuration
const TAURI_STORE_FILE_SYSTEM_CONFIG: &str = "file_system_config";
const TAURI_STORE_KEY_SEARCH_BY: &str = "search_by";
const TAURI_STORE_KEY_SEARCH_PATHS: &str = "search_paths";
const TAURI_STORE_KEY_EXCLUDE_PATHS: &str = "exclude_paths";
const TAURI_STORE_KEY_FILE_TYPES: &str = "file_types";
static HOME_DIR: LazyLock<String> = LazyLock::new(|| {
let os_string = dirs::home_dir()
.expect("$HOME should be set")
.into_os_string();
os_string
.into_string()
.expect("User home directory should be encoded with UTF-8")
});
#[derive(Debug, Clone, Serialize, Deserialize, Copy)]
pub enum SearchBy {
Name,
NameAndContents,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FileSearchConfig {
pub search_paths: Vec<String>,
pub exclude_paths: Vec<String>,
pub file_types: Vec<String>,
pub search_by: SearchBy,
}
impl Default for FileSearchConfig {
fn default() -> Self {
Self {
search_paths: vec![
format!("{}/Documents", HOME_DIR.as_str()),
format!("{}/Desktop", HOME_DIR.as_str()),
format!("{}/Downloads", HOME_DIR.as_str()),
],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
}
}
}
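// Illustrative sketch (not part of this changeset): what the default configuration
// amounts to before anything is persisted to the Tauri store.
#[cfg(test)]
mod file_search_config_sketch {
    use super::*;
    #[test]
    fn default_searches_three_home_folders_by_name() {
        let config = FileSearchConfig::default();
        assert!(matches!(config.search_by, SearchBy::Name));
        assert_eq!(config.search_paths.len(), 3);
        assert!(config.exclude_paths.is_empty());
        assert!(config.file_types.is_empty());
    }
}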
impl FileSearchConfig {
pub(crate) fn get<R: Runtime>(tauri_app_handle: &AppHandle<R>) -> Self {
let store = tauri_app_handle
.store(TAURI_STORE_FILE_SYSTEM_CONFIG)
.unwrap_or_else(|e| {
panic!(
"store [{}] not found/loaded, error [{}]",
TAURI_STORE_FILE_SYSTEM_CONFIG, e
)
});
// Default value, will be used when specific config entries are not set
let default_config = FileSearchConfig::default();
let search_paths = {
if let Some(search_paths) = store.get(TAURI_STORE_KEY_SEARCH_PATHS) {
match search_paths {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'search_paths' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'search_paths' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_SEARCH_PATHS,
default_config.search_paths.as_slice(),
);
default_config.search_paths
}
};
let exclude_paths = {
if let Some(exclude_paths) = store.get(TAURI_STORE_KEY_EXCLUDE_PATHS) {
match exclude_paths {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'exclude_paths' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'exclude_paths' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_EXCLUDE_PATHS,
default_config.exclude_paths.as_slice(),
);
default_config.exclude_paths
}
};
let file_types = {
if let Some(file_types) = store.get(TAURI_STORE_KEY_FILE_TYPES) {
match file_types {
Value::Array(arr) => {
let mut vec = Vec::with_capacity(arr.len());
for v in arr {
match v {
Value::String(s) => vec.push(s),
other => panic!(
"Expected all elements of 'file_types' to be strings, but found: {:?}",
other
),
}
}
vec
}
other => panic!(
"Expected 'file_types' to be an array of strings in the file system config store, but got: {:?}",
other
),
}
} else {
store.set(
TAURI_STORE_KEY_FILE_TYPES,
default_config.file_types.as_slice(),
);
default_config.file_types
}
};
let search_by = {
if let Some(search_by) = store.get(TAURI_STORE_KEY_SEARCH_BY) {
serde_json::from_value(search_by.clone()).unwrap_or_else(|e| {
panic!(
"Failed to deserialize 'search_by' from file system config store. Invalid JSON: {:?}, error: {}",
search_by, e
)
})
} else {
store.set(
TAURI_STORE_KEY_SEARCH_BY,
serde_json::to_value(default_config.search_by).unwrap(),
);
default_config.search_by
}
};
Self {
search_by,
search_paths,
exclude_paths,
file_types,
}
}
}
// Tauri commands for managing file system configuration
#[tauri::command]
pub async fn get_file_system_config<R: Runtime>(
tauri_app_handle: AppHandle<R>,
) -> FileSearchConfig {
FileSearchConfig::get(&tauri_app_handle)
}
#[tauri::command]
pub async fn set_file_system_config<R: Runtime>(
tauri_app_handle: AppHandle<R>,
config: FileSearchConfig,
) -> Result<(), String> {
let store = tauri_app_handle
.store(TAURI_STORE_FILE_SYSTEM_CONFIG)
.map_err(|e| e.to_string())?;
store.set(TAURI_STORE_KEY_SEARCH_PATHS, config.search_paths);
store.set(TAURI_STORE_KEY_EXCLUDE_PATHS, config.exclude_paths);
store.set(TAURI_STORE_KEY_FILE_TYPES, config.file_types);
store.set(
TAURI_STORE_KEY_SEARCH_BY,
serde_json::to_value(config.search_by).unwrap(),
);
Ok(())
}

View File

@@ -0,0 +1,186 @@
use super::super::EXTENSION_ID;
use super::super::config::FileSearchConfig;
use super::super::config::SearchBy;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::get_file_icon;
use futures::stream::Stream;
use futures::stream::StreamExt;
use std::os::fd::OwnedFd;
use std::path::Path;
use tokio::io::AsyncBufReadExt;
use tokio::io::BufReader;
use tokio::process::Child;
use tokio::process::Command;
use tokio_stream::wrappers::LinesStream;
/// `mdfind` won't return scores, we use this score for all the documents.
const SCORE: f64 = 1.0;
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let (mut iter, mut mdfind_child_process) =
execute_mdfind_query(&query_string, from, size, &config)?;
// Convert results to documents
let mut hits: Vec<(Document, f64)> = Vec::new();
while let Some(res_file_path) = iter.next().await {
let file_path = res_file_path.map_err(|io_err| io_err.to_string())?;
let icon = get_file_icon(file_path.clone()).await;
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.clone(),
};
let doc = Document {
id: file_path.clone(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path),
icon: Some(icon.to_string()),
..Default::default()
};
hits.push((doc, SCORE));
}
// Kill the mdfind process once we get the needed results to prevent zombie
// processes.
mdfind_child_process
.kill()
.await
.map_err(|e| format!("{:?}", e))?;
Ok(hits)
}
/// Return an array containing the `mdfind` command and its arguments.
fn build_mdfind_query(query_string: &str, config: &FileSearchConfig) -> Vec<String> {
let mut args = vec!["mdfind".to_string()];
match config.search_by {
SearchBy::Name => {
args.push(format!("kMDItemFSName == '*{}*'", query_string));
}
SearchBy::NameAndContents => {
args.push(format!(
"kMDItemFSName == '*{}*' || kMDItemTextContent == '{}'",
query_string, query_string
));
}
}
// Add search paths using -onlyin
for path in &config.search_paths {
if Path::new(path).exists() {
args.extend_from_slice(&["-onlyin".to_string(), path.to_string()]);
}
}
args
}
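// Illustrative sketch (not part of this changeset): the argv produced for a simple
// name-only query; `-onlyin <path>` pairs are only appended for search paths that
// actually exist on disk.
#[cfg(test)]
mod build_mdfind_query_sketch {
    use super::*;
    #[test]
    fn name_query_without_search_paths() {
        let config = FileSearchConfig {
            search_paths: Vec::new(),
            exclude_paths: Vec::new(),
            file_types: Vec::new(),
            search_by: SearchBy::Name,
        };
        assert_eq!(
            build_mdfind_query("report", &config),
            vec!["mdfind".to_string(), "kMDItemFSName == '*report*'".to_string()]
        );
    }
}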
/// Spawn the `mdfind` child process and return an async iterator over its output,
/// allowing us to collect the results asynchronously.
///
/// # Return value:
///
/// * impl Stream: an async iterator that will yield the matched files
/// * Child: The handle to the mdfind process, we need to kill it once we
/// collect all the results to avoid zombie processes.
fn execute_mdfind_query(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<(impl Stream<Item = std::io::Result<String>>, Child), String> {
let args = build_mdfind_query(query_string, &config);
let (rx, tx) = std::io::pipe().unwrap();
let rx_owned = OwnedFd::from(rx);
let async_rx = tokio::net::unix::pipe::Receiver::from_owned_fd(rx_owned).unwrap();
let buffered_rx = BufReader::new(async_rx);
let lines = LinesStream::new(buffered_rx.lines());
let child = Command::new(&args[0])
.args(&args[1..])
.stdout(tx)
.stderr(std::process::Stdio::null())
.spawn()
.map_err(|e| format!("Failed to spawn mdfind: {}", e))?;
let config_clone = config.clone();
let iter = lines
.filter(move |res_path| {
std::future::ready({
match res_path {
Ok(path) => !should_be_filtered_out(&config_clone, path),
Err(_) => {
// Don't filter out Err() values
true
}
}
})
})
.skip(from)
.take(size);
Ok((iter, child))
}
/// If `file_path` should be removed from the search results given the filter
/// conditions specified in `config`.
fn should_be_filtered_out(config: &FileSearchConfig, file_path: &str) -> bool {
let is_excluded = config
.exclude_paths
.iter()
.any(|exclude_path| file_path.starts_with(exclude_path));
if is_excluded {
return true;
}
let matches_file_type = if config.file_types.is_empty() {
true
} else {
let path_obj = camino::Utf8Path::new(&file_path);
if let Some(extension) = path_obj.extension() {
config
.file_types
.iter()
.any(|file_type| file_type == extension)
} else {
// `config.file_types` is not empty, then the search results
// should have extensions.
false
}
};
!matches_file_type
}

View File

@@ -0,0 +1,10 @@
#[cfg(target_os = "macos")]
mod macos;
#[cfg(target_os = "windows")]
mod windows;
// `hits()` function is platform-specific, export the corresponding impl.
#[cfg(target_os = "macos")]
pub(crate) use macos::hits;
#[cfg(target_os = "windows")]
pub(crate) use windows::hits;
View File
@@ -0,0 +1,751 @@
//! # Credits
//!
//! https://github.com/IRONAGE-Park/rag-sample/blob/3f0ad8c8012026cd3a7e453d08f041609426cb91/src/native/windows.rs
//! is the starting point of this implementation.
use super::super::EXTENSION_ID;
use super::super::config::FileSearchConfig;
use super::super::config::SearchBy;
use crate::common::document::{DataSourceReference, Document};
use crate::extension::LOCAL_QUERY_SOURCE_TYPE;
use crate::extension::OnOpened;
use crate::util::file::get_file_icon;
use windows::{
Win32::System::{
Com::{CLSCTX_INPROC_SERVER, CoCreateInstance},
Ole::{OleInitialize, OleUninitialize},
Search::{
DB_NULL_HCHAPTER, DBACCESSOR_ROWDATA, DBBINDING, DBMEMOWNER_CLIENTOWNED,
DBPARAMIO_NOTPARAM, DBPART_VALUE, DBTYPE_WSTR, HACCESSOR, IAccessor, ICommand,
ICommandText, IDBCreateCommand, IDBCreateSession, IDBInitialize, IDataInitialize,
IRowset, MSDAINITIALIZE,
},
},
core::{GUID, IUnknown, Interface, PWSTR, w},
};
/// Owned version of `PWSTR` that holds the heap memory.
///
/// Use `as_pwstr()` to convert it to a raw pointer.
struct PwStrOwned(Vec<u16>);
impl PwStrOwned {
/// # SAFETY
///
/// The returned `PWSTR` is basically a raw pointer, it is only valid within the
/// lifetime of `PwStrOwned`.
unsafe fn as_pwstr(&mut self) -> PWSTR {
let raw_ptr = self.0.as_mut_ptr();
PWSTR::from_raw(raw_ptr)
}
}
/// Construct `PwStrOwned` from any `str`.
impl<S: AsRef<str> + ?Sized> From<&S> for PwStrOwned {
fn from(value: &S) -> Self {
let mut utf16_bytes = value.as_ref().encode_utf16().collect::<Vec<u16>>();
utf16_bytes.push(0); // the tailing NULL
PwStrOwned(utf16_bytes)
}
}
/// Helper function to replace unsupported characters with whitespace.
///
/// Windows search will error out if it encounters these characters.
///
/// The complete list of unsupported characters is unknown and we don't know how
/// to escape them, so let's replace them.
fn query_string_cleanup(old: &str) -> String {
const UNSUPPORTED_CHAR: [char; 2] = ['\'', '\n'];
// Using len in bytes is ok
let mut chars = Vec::with_capacity(old.len());
for char in old.chars() {
if UNSUPPORTED_CHAR.contains(&char) {
chars.push(' ');
} else {
chars.push(char);
}
}
chars.into_iter().collect()
}
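// Illustrative sketch (not part of this changeset): single quotes and newlines are
// replaced with spaces so the generated Windows Search SQL stays well-formed.
#[cfg(test)]
mod query_string_cleanup_sketch {
    use super::query_string_cleanup;
    #[test]
    fn replaces_unsupported_characters_with_spaces() {
        assert_eq!(query_string_cleanup("it's\nfine"), "it s fine");
    }
}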
/// Helper function to construct the Windows Search SQL.
///
/// Paging is not natively supported by windows Search SQL, it only supports `size`
/// via the `TOP` keyword ("SELECT TOP {n} {columns}"). The SQL returned by this
/// function will have `{n}` set to `from + size`, then we will manually implement
/// paging.
fn query_sql(query_string: &str, from: usize, size: usize, config: &FileSearchConfig) -> String {
let top_n = from
.checked_add(size)
.expect("[from + size] cannot fit into an [usize]");
// System.ItemUrl is a column that contains the file path
// example: "file:C:/Users/desktop.ini"
//
// System.Search.Rank is the relevance score
let mut sql = format!(
"SELECT TOP {} System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE",
top_n
);
let query_string = query_string_cleanup(query_string);
let search_by_predicate = match config.search_by {
SearchBy::Name => {
// `contains(System.FileName, '{query_string}')` would be faster
// because it uses inverted index, but that's not what we want
// due to the limitation of tokenization. For example, suppose "Coco AI.rs"
// will be tokenized to `["Coco", "AI", "rs"]`, then if users search
// via `Co`, this file won't be returned because term `Co` does not
// exist in the index.
//
// So we use wildcard instead even though it is slower.
format!("(System.FileName LIKE '%{query_string}%')")
}
SearchBy::NameAndContents => {
// Windows File Search does not support searching by file content.
//
// `CONTAINS('query_string')` would search all columns for `query_string`,
// this is the closest solution we have.
format!("((System.FileName LIKE '%{query_string}%') OR CONTAINS('{query_string}'))")
}
};
let search_paths_predicate: Option<String> = {
if config.search_paths.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, search_path) in config.search_paths.iter().enumerate() {
if idx != 0 {
output.push_str(" OR ");
}
output.push_str("SCOPE = 'file:");
output.push_str(&search_path);
output.push('\'');
}
output.push(')');
Some(output)
}
};
let exclude_paths_predicate: Option<String> = {
if config.exclude_paths.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, exclude_path) in config.exclude_paths.iter().enumerate() {
if idx != 0 {
output.push_str(" AND ");
}
output.push_str("(NOT SCOPE = 'file:");
output.push_str(&exclude_path);
output.push('\'');
output.push(')');
}
output.push(')');
Some(output)
}
};
let file_types_predicate: Option<String> = {
if config.file_types.is_empty() {
None
} else {
let mut output = String::from("(");
for (idx, file_type) in config.file_types.iter().enumerate() {
if idx != 0 {
output.push_str(" OR ");
}
// NOTE that this column contains a starting dot
output.push_str("System.FileExtension = '.");
output.push_str(&file_type);
output.push('\'');
}
output.push(')');
Some(output)
}
};
sql.push(' ');
sql.push_str(search_by_predicate.as_str());
if let Some(search_paths_predicate) = search_paths_predicate {
sql.push_str(" AND ");
sql.push_str(search_paths_predicate.as_str());
}
if let Some(exclude_paths_predicate) = exclude_paths_predicate {
sql.push_str(" AND ");
sql.push_str(exclude_paths_predicate.as_str());
}
if let Some(file_types_predicate) = file_types_predicate {
sql.push_str(" AND ");
sql.push_str(file_types_predicate.as_str());
}
sql
}
/// Default GUID for Search.CollatorDSO.1
const DBGUID_DEFAULT: GUID = GUID {
data1: 0xc8b521fb,
data2: 0x5cf3,
data3: 0x11ce,
data4: [0xad, 0xe5, 0x00, 0xaa, 0x00, 0x44, 0x77, 0x3d],
};
unsafe fn create_accessor_handle(accessor: &IAccessor, index: usize) -> Result<HACCESSOR, String> {
let bindings = DBBINDING {
iOrdinal: index,
obValue: 0,
obStatus: 0,
obLength: 0,
dwPart: DBPART_VALUE.0 as u32,
dwMemOwner: DBMEMOWNER_CLIENTOWNED.0 as u32,
eParamIO: DBPARAMIO_NOTPARAM.0 as u32,
cbMaxLen: 512,
dwFlags: 0,
wType: DBTYPE_WSTR.0 as u16,
bPrecision: 0,
bScale: 0,
..Default::default()
};
let mut status = 0;
let mut accessor_handle = HACCESSOR::default();
unsafe {
accessor
.CreateAccessor(
DBACCESSOR_ROWDATA.0 as u32,
1,
&bindings,
0,
&mut accessor_handle,
Some(&mut status),
)
.map_err(|e| e.to_string())?;
}
Ok(accessor_handle)
}
fn create_db_initialize() -> Result<IDBInitialize, String> {
unsafe {
let data_init: IDataInitialize =
CoCreateInstance(&MSDAINITIALIZE, None, CLSCTX_INPROC_SERVER)
.map_err(|e| e.to_string())?;
let mut unknown: Option<IUnknown> = None;
data_init
.GetDataSource(
None,
CLSCTX_INPROC_SERVER.0,
w!("provider=Search.CollatorDSO.1;EXTENDED PROPERTIES=\"Application=Windows\""),
&IDBInitialize::IID,
&mut unknown as *mut _ as *mut _,
)
.map_err(|e| e.to_string())?;
Ok(unknown.unwrap().cast().map_err(|e| e.to_string())?)
}
}
fn create_command(db_init: IDBInitialize) -> Result<ICommandText, String> {
unsafe {
let db_create_session: IDBCreateSession = db_init.cast().map_err(|e| e.to_string())?;
let session: IUnknown = db_create_session
.CreateSession(None, &IUnknown::IID)
.map_err(|e| e.to_string())?;
let db_create_command: IDBCreateCommand = session.cast().map_err(|e| e.to_string())?;
Ok(db_create_command
.CreateCommand(None, &ICommand::IID)
.map_err(|e| e.to_string())?
.cast()
.map_err(|e| e.to_string())?)
}
}
fn execute_windows_search_sql(sql_query: &str) -> Result<Vec<(String, String)>, String> {
unsafe {
let mut pwstr_owned_sql = PwStrOwned::from(sql_query);
// SAFETY: pwstr_owned_sql will live for the whole lifetime of this function.
let sql_query = pwstr_owned_sql.as_pwstr();
let db_init = create_db_initialize()?;
db_init.Initialize().map_err(|e| e.to_string())?;
let command = create_command(db_init)?;
// Set the command text
command
.SetCommandText(&DBGUID_DEFAULT, sql_query)
.map_err(|e| e.to_string())?;
// Execute the command
let mut rowset: Option<IRowset> = None;
command
.Execute(
None,
&IRowset::IID,
None,
None,
Some(&mut rowset as *mut _ as *mut _),
)
.map_err(|e| e.to_string())?;
let rowset = rowset.ok_or_else(|| {
format!(
"No rowset returned for query: {}",
// SAFETY: the raw pointer is not dangling
sql_query
.to_string()
.expect("the conversion should work as `sql_query` was created from a String",)
)
})?;
let accessor: IAccessor = rowset
.cast()
.map_err(|e| format!("Failed to cast to IAccessor: {}", e.to_string()))?;
let mut output = Vec::new();
let mut count = 0;
loop {
let mut rows_fetched = 0;
let mut row_handles = [std::ptr::null_mut(); 1];
let result = rowset.GetNextRows(
DB_NULL_HCHAPTER as usize,
0,
&mut rows_fetched,
&mut row_handles,
);
if result.is_err() {
break;
}
if rows_fetched == 0 {
break;
}
let mut data = Vec::new();
for i in 0..2 {
let mut item_name = [0u16; 512];
let accessor_handle = create_accessor_handle(&accessor, i + 1)?;
rowset
.GetData(
*row_handles[0],
accessor_handle,
item_name.as_mut_ptr() as *mut _,
)
.map_err(|e| {
format!(
"Failed to get data at count {}, index {}: {}",
count,
i,
e.to_string()
)
})?;
let name = String::from_utf16_lossy(&item_name);
// Remove null characters
data.push(name.trim_end_matches('\u{0000}').to_string());
accessor
.ReleaseAccessor(accessor_handle, None)
.map_err(|e| {
format!(
"Failed to release accessor at count {}, index {}: {}",
count,
i,
e.to_string()
)
})?;
}
output.push((data[0].clone(), data[1].clone()));
count += 1;
rowset
.ReleaseRows(
1,
row_handles[0],
std::ptr::null_mut(),
std::ptr::null_mut(),
std::ptr::null_mut(),
)
.map_err(|e| {
format!(
"Failed to release rows at count {}: {}",
count,
e.to_string()
)
})?;
}
Ok(output)
}
}
pub(crate) async fn hits(
query_string: &str,
from: usize,
size: usize,
config: &FileSearchConfig,
) -> Result<Vec<(Document, f64)>, String> {
let sql = query_sql(query_string, from, size, config);
unsafe { OleInitialize(None).map_err(|e| e.to_string())? };
let result = execute_windows_search_sql(&sql)?;
unsafe { OleUninitialize() };
// .take(size) is not needed as `result` will contain `from+size` files at most
let result_with_paging = result.into_iter().skip(from);
// result_with_paging won't contain more than `size` entries
let mut hits = Vec::with_capacity(size);
const ITEM_URL_PREFIX: &str = "file:";
const ITEM_URL_PREFIX_LEN: usize = ITEM_URL_PREFIX.len();
for (item_url, score_str) in result_with_paging {
// path returned from Windows Search contains a prefix, we need to trim it.
//
// "file:C:/Users/desktop.ini" => "C:/Users/desktop.ini"
let file_path = &item_url[ITEM_URL_PREFIX_LEN..];
let icon = get_file_icon(file_path.to_string()).await;
let file_path_of_type_path = camino::Utf8Path::new(&file_path);
let r#where = file_path_of_type_path
.parent()
.unwrap_or_else(|| {
panic!(
"expect path [{}] to have a parent, but it does not",
file_path
);
})
.to_string();
let file_name = file_path_of_type_path.file_name().unwrap_or_else(|| {
panic!(
"expect path [{}] to have a file name, but it does not",
file_path
);
});
let on_opened = OnOpened::Document {
url: file_path.to_string(),
};
let doc = Document {
id: file_path.to_string(),
title: Some(file_name.to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(EXTENSION_ID.into()),
id: Some(EXTENSION_ID.into()),
icon: Some(String::from("font_Filesearch")),
}),
category: Some(r#where),
on_opened: Some(on_opened),
url: Some(file_path.into()),
icon: Some(icon.to_string()),
..Default::default()
};
let score: f64 = score_str.parse().expect(
"System.Search.Rank should be in range [0, 1000], which should be valid for [f64]",
);
hits.push((doc, score));
}
Ok(hits)
}
// Skip these tests in our CI, they fail with the following error
// "SQL is invalid: "0x80041820""
//
// I have no idea about the underlying root cause
#[cfg(all(test, not(ci)))]
mod test_windows_search {
use super::*;
/// Helper function for ensuring `sql` is valid SQL by actually executing it.
fn ensure_it_is_valid_sql(sql: &str) {
unsafe { OleInitialize(None).unwrap() };
execute_windows_search_sql(&sql).expect("SQL is invalid");
unsafe { OleUninitialize() };
}
#[test]
fn test_query_sql_empty_config_search_by_name() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%coco%')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_empty_config_search_by_name_and_content() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::NameAndContents,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE ((System.FileName LIKE '%coco%') OR CONTAINS('coco'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_search_paths() {
let config = FileSearchConfig {
search_paths: vec!["C:/Users/".into()],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("coco", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%coco%') AND (SCOPE = 'file:C:/Users/')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_search_paths() {
let config = FileSearchConfig {
search_paths: vec![
"C:/Users/".into(),
"D:/Projects/".into(),
"E:/Documents/".into(),
],
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("test", 0, 5, &config);
assert_eq!(
sql,
"SELECT TOP 5 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%test%') AND (SCOPE = 'file:C:/Users/' OR SCOPE = 'file:D:/Projects/' OR SCOPE = 'file:E:/Documents/')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_exclude_paths() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: vec!["C:/Windows/".into()],
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("file", 0, 20, &config);
assert_eq!(
sql,
"SELECT TOP 20 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%file%') AND ((NOT SCOPE = 'file:C:/Windows/'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_exclude_paths() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: vec!["C:/Windows/".into(), "C:/System/".into(), "C:/Temp/".into()],
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("data", 5, 15, &config);
assert_eq!(
sql,
"SELECT TOP 20 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%data%') AND ((NOT SCOPE = 'file:C:/Windows/') AND (NOT SCOPE = 'file:C:/System/') AND (NOT SCOPE = 'file:C:/Temp/'))"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_file_types() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: vec!["txt".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("readme", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%readme%') AND (System.FileExtension = '.txt')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_multiple_file_types() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: vec!["rs".into(), "toml".into(), "md".into(), "json".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("config", 0, 50, &config);
assert_eq!(
sql,
"SELECT TOP 50 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%config%') AND (System.FileExtension = '.rs' OR System.FileExtension = '.toml' OR System.FileExtension = '.md' OR System.FileExtension = '.json')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_all_fields_combined() {
let config = FileSearchConfig {
search_paths: vec!["C:/Projects/".into(), "D:/Code/".into()],
exclude_paths: vec!["C:/Projects/temp/".into()],
file_types: vec!["rs".into(), "ts".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("main", 10, 25, &config);
assert_eq!(
sql,
"SELECT TOP 35 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%main%') AND (SCOPE = 'file:C:/Projects/' OR SCOPE = 'file:D:/Code/') AND ((NOT SCOPE = 'file:C:/Projects/temp/')) AND (System.FileExtension = '.rs' OR System.FileExtension = '.ts')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_with_special_characters() {
let config = FileSearchConfig {
search_paths: vec!["C:/Users/John Doe/".into()],
exclude_paths: Vec::new(),
file_types: vec!["c++".into()],
search_by: SearchBy::Name,
};
let sql = query_sql("hello-world", 0, 10, &config);
assert_eq!(
sql,
"SELECT TOP 10 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%hello-world%') AND (SCOPE = 'file:C:/Users/John Doe/') AND (System.FileExtension = '.c++')"
);
ensure_it_is_valid_sql(&sql);
}
#[test]
fn test_query_sql_edge_case_large_offset() {
let config = FileSearchConfig {
search_paths: Vec::new(),
exclude_paths: Vec::new(),
file_types: Vec::new(),
search_by: SearchBy::Name,
};
let sql = query_sql("test", 100, 50, &config);
assert_eq!(
sql,
"SELECT TOP 150 System.ItemUrl, System.Search.Rank FROM SystemIndex WHERE (System.FileName LIKE '%test%')"
);
ensure_it_is_valid_sql(&sql);
}
}
#[cfg(test)]
mod test {
use super::*;
#[test]
fn test_query_string_cleanup_no_unsupported_chars() {
let input = "hello world";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
#[test]
fn test_query_string_cleanup_single_quote() {
let input = "don't worry";
let result = query_string_cleanup(input);
assert_eq!(result, "don t worry");
}
#[test]
fn test_query_string_cleanup_newline() {
let input = "line1\nline2";
let result = query_string_cleanup(input);
assert_eq!(result, "line1 line2");
}
#[test]
fn test_query_string_cleanup_both_unsupported_chars() {
let input = "don't\nworry";
let result = query_string_cleanup(input);
assert_eq!(result, "don t worry");
}
#[test]
fn test_query_string_cleanup_multiple_single_quotes() {
let input = "it's a 'test' string";
let result = query_string_cleanup(input);
assert_eq!(result, "it s a test string");
}
#[test]
fn test_query_string_cleanup_multiple_newlines() {
let input = "line1\n\nline2\nline3";
let result = query_string_cleanup(input);
assert_eq!(result, "line1 line2 line3");
}
#[test]
fn test_query_string_cleanup_empty_string() {
let input = "";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
#[test]
fn test_query_string_cleanup_only_unsupported_chars() {
let input = "'\n'";
let result = query_string_cleanup(input);
assert_eq!(result, " ");
}
#[test]
fn test_query_string_cleanup_unicode_characters() {
let input = "héllo wörld's\nfile";
let result = query_string_cleanup(input);
assert_eq!(result, "héllo wörld s file");
}
#[test]
fn test_query_string_cleanup_special_chars_preserved() {
let input = "test@file#name$with%symbols";
let result = query_string_cleanup(input);
assert_eq!(result, input);
}
}
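For reference, a hedged usage sketch of the `hits()` helper defined above, tying the paging arithmetic to the SQL that the tests pin down. The wrapper function, its literal paths, and the query string are illustrative assumptions, not part of the file.
async fn example_second_page() -> Result<(), String> {
    // Same config shape as the tests above construct.
    let config = FileSearchConfig {
        search_paths: vec!["C:/Users/".into()],
        exclude_paths: Vec::new(),
        file_types: vec!["txt".into()],
        search_by: SearchBy::Name,
    };
    // `query_sql` emits `TOP (from + size) = TOP 30`; `hits()` then skips the
    // first 20 rows, so at most 10 (Document, score) pairs are returned.
    let page = hits("report", 20, 10, &config).await?;
    for (doc, score) in page {
        println!("{:?} scored {score}", doc.title);
    }
    Ok(())
}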

View File

@@ -0,0 +1,97 @@
pub(crate) mod config;
pub(crate) mod implementation;
use super::super::LOCAL_QUERY_SOURCE_TYPE;
use crate::common::{
error::SearchError,
search::{QueryResponse, QuerySource, SearchQuery},
traits::SearchSource,
};
use async_trait::async_trait;
use config::FileSearchConfig;
use hostname;
use tauri::AppHandle;
pub(crate) const EXTENSION_ID: &str = "File Search";
/// JSON file for this extension.
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "File Search",
"name": "File Search",
"platforms": ["macos", "windows"],
"description": "Search files on your system",
"icon": "font_Filesearch",
"type": "extension"
}
"#;
pub struct FileSearchExtensionSearchSource;
#[async_trait]
impl SearchSource for FileSearchExtensionSearchSource {
fn get_type(&self) -> QuerySource {
QuerySource {
r#type: LOCAL_QUERY_SOURCE_TYPE.into(),
name: hostname::get()
.unwrap_or(EXTENSION_ID.into())
.to_string_lossy()
.into(),
id: EXTENSION_ID.into(),
}
}
async fn search(
&self,
tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
let from = usize::try_from(query.from).expect("from too big");
let size = usize::try_from(query.size).expect("size too big");
let query_string = query_string.trim();
if query_string.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
// Get configuration from tauri store
let config = FileSearchConfig::get(&tauri_app_handle);
// If search paths are empty, then the hits should be empty.
//
// Without this, empty search paths would result in an mdfind invocation that has
// no `-onlyin` option, which would in turn query the whole disk volume.
if config.search_paths.is_empty() {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
}
// Execute the search
let query_source = self.get_type();
let hits = implementation::hits(&query_string, from, size, &config)
.await
.map_err(SearchError::InternalError)?;
let total_hits = hits.len();
Ok(QueryResponse {
source: query_source,
hits,
total_hits,
})
}
}

View File

@@ -0,0 +1,543 @@
//! Built-in extensions and related stuff.
pub mod ai_overview;
pub mod application;
pub mod calculator;
#[cfg(any(target_os = "macos", target_os = "windows"))]
pub mod file_search;
pub mod pizza_engine_runtime;
pub mod quick_ai_access;
use super::Extension;
use crate::SearchSourceRegistry;
use crate::extension::built_in::application::{set_apps_hotkey, unset_apps_hotkey};
use crate::extension::{
ExtensionBundleIdBorrowed, PLUGIN_JSON_FILE_NAME, alter_extension_json_file,
};
use anyhow::Context;
use std::path::{Path, PathBuf};
use tauri::{AppHandle, Manager, Runtime};
pub(crate) fn get_built_in_extension_directory<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
) -> PathBuf {
let mut resource_dir = tauri_app_handle.path().app_data_dir().expect(
"User home directory not found, which should be impossible on desktop environments",
);
resource_dir.push("built_in_extensions");
resource_dir
}
/// Helper function to load the built-in extension specified by `extension_id`, used
/// in `list_built_in_extensions()`.
///
/// For built-in extensions, users are only allowed to edit these fields:
///
/// 1. alias (if this extension supports alias)
/// 2. hotkey (if this extension supports hotkey)
/// 3. enabled
///
/// If
///
/// 1. The above fields have invalid value
/// 2. Other fields are modified
///
/// we ignore and reset them to the default value.
async fn load_built_in_extension(
built_in_extensions_dir: &Path,
extension_id: &str,
default_plugin_json_file: &str,
) -> Result<Extension, String> {
let mut extension_dir = built_in_extensions_dir.join(extension_id);
let mut default_plugin_json = serde_json::from_str::<Extension>(&default_plugin_json_file).unwrap_or_else( |e| {
panic!("the default extension {} file of built-in extension [{}] cannot be parsed as a valid [struct Extension], error [{}]", PLUGIN_JSON_FILE_NAME, extension_id, e);
});
if !extension_dir.try_exists().map_err(|e| e.to_string())? {
tokio::fs::create_dir_all(extension_dir.as_path())
.await
.map_err(|e| e.to_string())?;
}
let plugin_json_file_path = {
extension_dir.push(PLUGIN_JSON_FILE_NAME);
extension_dir
};
// If the JSON file does not exist, create a file with the default template and return.
if !plugin_json_file_path
.try_exists()
.map_err(|e| e.to_string())?
{
tokio::fs::write(plugin_json_file_path, default_plugin_json_file)
.await
.map_err(|e| e.to_string())?;
return Ok(default_plugin_json);
}
let plugin_json_file_content = tokio::fs::read_to_string(plugin_json_file_path.as_path())
.await
.map_err(|e| e.to_string())?;
let res_plugin_json = serde_json::from_str::<Extension>(&plugin_json_file_content);
let Ok(plugin_json) = res_plugin_json else {
log::warn!(
"user invalidated built-in extension [{}] file, overwriting it with the default template",
extension_id
);
// If the JSON file cannot be parsed as `struct Extension`, overwrite it with the default template and return.
tokio::fs::write(plugin_json_file_path, default_plugin_json_file)
.await
.map_err(|e| e.to_string())?;
return Ok(default_plugin_json);
};
// Users are only allowed to edit the below fields
// 1. alias (if this extension supports alias)
// 2. hotkey (if this extension supports hotkey)
// 3. enabled
// so we ignore all other fields.
let alias = if default_plugin_json.supports_alias_hotkey() {
plugin_json.alias.clone()
} else {
None
};
let hotkey = if default_plugin_json.supports_alias_hotkey() {
plugin_json.hotkey.clone()
} else {
None
};
let enabled = plugin_json.enabled;
default_plugin_json.alias = alias;
default_plugin_json.hotkey = hotkey;
default_plugin_json.enabled = enabled;
let final_plugin_json_file_content = serde_json::to_string_pretty(&default_plugin_json)
.expect("failed to serialize `struct Extension`");
tokio::fs::write(plugin_json_file_path, final_plugin_json_file_content)
.await
.map_err(|e| e.to_string())?;
Ok(default_plugin_json)
}
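// Editor's illustration (not part of the file): if a user edits a built-in
// extension's plugin.json and sets, say, `"name": "My renamed search"` and
// `"enabled": false`, the merge above keeps `enabled = false` but resets `name`
// (and every other non-editable field) back to the default template; `alias`
// and `hotkey` are preserved only when `supports_alias_hotkey()` is true for
// that extension, otherwise they are forced back to `None`.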
/// Return the built-in extension list.
///
/// Will create extension files when they are not found.
///
/// Users may put extension files in the built-in extension directory, but
/// we do not care and will ignore them.
///
/// We only read alias/hotkey/enabled from the JSON file; we have ensured that if
/// alias/hotkey is not supported, then it will be `None`. Besides that, no further
/// validation is needed because nothing could go wrong.
pub(crate) async fn list_built_in_extensions<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
) -> Result<Vec<Extension>, String> {
let dir = get_built_in_extension_directory(tauri_app_handle);
let mut built_in_extensions = Vec::new();
built_in_extensions.push(
load_built_in_extension(
&dir,
application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME,
application::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
calculator::DATA_SOURCE_ID,
calculator::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
ai_overview::EXTENSION_ID,
ai_overview::PLUGIN_JSON_FILE,
)
.await?,
);
built_in_extensions.push(
load_built_in_extension(
&dir,
quick_ai_access::EXTENSION_ID,
quick_ai_access::PLUGIN_JSON_FILE,
)
.await?,
);
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
built_in_extensions.push(
load_built_in_extension(
&dir,
file_search::EXTENSION_ID,
file_search::PLUGIN_JSON_FILE,
)
.await?,
);
}
}
Ok(built_in_extensions)
}
pub(super) async fn init_built_in_extension<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
extension: &Extension,
search_source_registry: &SearchSourceRegistry,
) -> Result<(), String> {
log::trace!("initializing built-in extensions [{}]", extension.id);
if extension.id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
search_source_registry
.register_source(application::ApplicationSearchSource)
.await;
set_apps_hotkey(&tauri_app_handle)?;
log::debug!("built-in extension [{}] initialized", extension.id);
}
if extension.id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
search_source_registry
.register_source(calculator_search)
.await;
log::debug!("built-in extension [{}] initialized", extension.id);
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if extension.id == file_search::EXTENSION_ID {
let file_system_search = file_search::FileSearchExtensionSearchSource;
search_source_registry
.register_source(file_system_search)
.await;
log::debug!("built-in extension [{}] initialized", extension.id);
}
}
}
Ok(())
}
pub(crate) fn is_extension_built_in(bundle_id: &ExtensionBundleIdBorrowed<'_>) -> bool {
bundle_id.developer.is_none()
}
pub(crate) async fn enable_built_in_extension<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
let update_extension = |extension: &mut Extension| -> Result<(), String> {
extension.enabled = true;
Ok(())
};
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_none()
{
search_source_registry_tauri_state
.register_source(application::ApplicationSearchSource)
.await;
set_apps_hotkey(tauri_app_handle)?;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
// Check if this is an application
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_some()
{
let app_path = bundle_id.sub_extension_id.expect("just checked it is Some");
application::enable_app_search(tauri_app_handle, app_path)?;
return Ok(());
}
if bundle_id.extension_id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
search_source_registry_tauri_state
.register_source(calculator_search)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if bundle_id.extension_id == file_search::EXTENSION_ID {
let file_system_search = file_search::FileSearchExtensionSearchSource;
search_source_registry_tauri_state
.register_source(file_system_search)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
}
}
Ok(())
}
pub(crate) async fn disable_built_in_extension<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
let update_extension = |extension: &mut Extension| -> Result<(), String> {
extension.enabled = false;
Ok(())
};
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_none()
{
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
unset_apps_hotkey(tauri_app_handle)?;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
// Check if this is an application
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_some()
{
let app_path = bundle_id.sub_extension_id.expect("just checked it is Some");
application::disable_app_search(tauri_app_handle, app_path)?;
return Ok(());
}
if bundle_id.extension_id == calculator::DATA_SOURCE_ID {
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if bundle_id.extension_id == file_search::EXTENSION_ID {
search_source_registry_tauri_state
.remove_source(bundle_id.extension_id)
.await;
alter_extension_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id,
update_extension,
)?;
return Ok(());
}
}
}
Ok(())
}
pub(crate) fn set_built_in_extension_alias<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
alias: &str,
) {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::set_app_alias(tauri_app_handle, app_path, alias);
}
}
}
pub(crate) fn register_built_in_extension_hotkey<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
hotkey: &str,
) -> Result<(), String> {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::register_app_hotkey(&tauri_app_handle, app_path, hotkey)?;
}
}
Ok(())
}
pub(crate) fn unregister_built_in_extension_hotkey<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<(), String> {
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
application::unregister_app_hotkey(&tauri_app_handle, app_path)?;
}
}
Ok(())
}
fn split_extension_id(extension_id: &str) -> (&str, Option<&str>) {
match extension_id.find('.') {
Some(idx) => (&extension_id[..idx], Some(&extension_id[idx + 1..])),
None => (extension_id, None),
}
}
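// Editor's illustration (not part of the file): `split_extension_id` splits at
// the first '.', so
//
//     split_extension_id("File Search")        == ("File Search", None)
//     split_extension_id("my_ext.sub_command") == ("my_ext", Some("sub_command"))
//     split_extension_id("a.b.c")              == ("a", Some("b.c"))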
fn load_extension_from_json_file(
extension_directory: &Path,
extension_id: &str,
) -> Result<Extension, String> {
let (parent_extension_id, _opt_sub_extension_id) = split_extension_id(extension_id);
let json_file_path = {
let mut extension_directory_path = extension_directory.join(parent_extension_id);
extension_directory_path.push(PLUGIN_JSON_FILE_NAME);
extension_directory_path
};
let mut extension = serde_json::from_reader::<_, Extension>(
std::fs::File::open(&json_file_path)
.with_context(|| {
format!(
"the [{}] file for extension [{}] is missing or broken",
PLUGIN_JSON_FILE_NAME, parent_extension_id
)
})
.map_err(|e| e.to_string())?,
)
.map_err(|e| e.to_string())?;
super::canonicalize_relative_icon_path(extension_directory, &mut extension)?;
Ok(extension)
}
pub(crate) async fn is_built_in_extension_enabled<R: Runtime>(
tauri_app_handle: &AppHandle<R>,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
) -> Result<bool, String> {
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
&& bundle_id.sub_extension_id.is_none()
{
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
// Check if this is an application
if bundle_id.extension_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
if let Some(app_path) = bundle_id.sub_extension_id {
return Ok(application::is_app_search_enabled(app_path));
}
}
if bundle_id.extension_id == calculator::DATA_SOURCE_ID {
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
if bundle_id.extension_id == quick_ai_access::EXTENSION_ID {
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
return Ok(extension.enabled);
}
if bundle_id.extension_id == ai_overview::EXTENSION_ID {
let extension = load_extension_from_json_file(
&get_built_in_extension_directory(tauri_app_handle),
bundle_id.extension_id,
)?;
return Ok(extension.enabled);
}
cfg_if::cfg_if! {
if #[cfg(any(target_os = "macos", target_os = "windows"))] {
if bundle_id.extension_id == file_search::EXTENSION_ID
&& bundle_id.sub_extension_id.is_none()
{
return Ok(search_source_registry_tauri_state
.get_source(bundle_id.extension_id)
.await
.is_some());
}
}
}
unreachable!("extension [{:?}] is not a built-in extension", bundle_id)
}

View File

@@ -0,0 +1,76 @@
//! We use Pizza Engine to index applications and local files. The engine will be
//! run in the thread/runtime defined in this file.
//!
//! # Why such a thread/runtime is needed
//!
//! Generally, the multi-threaded Tokio runtime requires all the async tasks spawned
//! onto it to be `Send`, but the async tasks created by Pizza Engine are not,
//! which forces us to create a dedicated thread/runtime to execute them.
use std::any::Any;
use std::collections::HashMap;
use std::collections::hash_map::Entry;
use std::sync::OnceLock;
pub(crate) trait SearchSourceState {
#[cfg_attr(not(feature = "use_pizza_engine"), allow(unused))]
fn as_mut_any(&mut self) -> &mut dyn Any;
}
#[async_trait::async_trait(?Send)]
pub(crate) trait Task: Send + Sync {
fn search_source_id(&self) -> &'static str;
async fn exec(&mut self, state: &mut Option<Box<dyn SearchSourceState>>);
}
pub(crate) static RUNTIME_TX: OnceLock<tokio::sync::mpsc::UnboundedSender<Box<dyn Task>>> =
OnceLock::new();
/// This function blocks until the runtime thread is ready for accepting tasks.
pub(crate) async fn start_pizza_engine_runtime() {
const THREAD_NAME: &str = "Pizza engine runtime thread";
log::trace!("starting Pizza engine runtime");
let (engine_start_signal_tx, engine_start_signal_rx) = tokio::sync::oneshot::channel();
std::thread::Builder::new()
.name(THREAD_NAME.into())
.spawn(move || {
let rt = tokio::runtime::Runtime::new().unwrap();
let main = async {
let mut states: HashMap<String, Option<Box<dyn SearchSourceState>>> =
HashMap::new();
let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel();
RUNTIME_TX.set(tx).unwrap();
engine_start_signal_tx
.send(())
.expect("engine_start_signal_rx dropped");
while let Some(mut task) = rx.recv().await {
let opt_search_source_state = match states.entry(task.search_source_id().into())
{
Entry::Occupied(o) => o.into_mut(),
Entry::Vacant(v) => v.insert(None),
};
task.exec(opt_search_source_state).await;
}
};
rt.block_on(main);
})
.unwrap_or_else(|e| {
panic!(
"failed to start thread [{}] due to error [{}]",
THREAD_NAME, e
);
});
engine_start_signal_rx
.await
.expect("engine_start_signal_tx dropped, the runtime thread could be dead");
log::trace!("Pizza engine runtime started");
}
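A minimal sketch of submitting work to this runtime, assuming only the `Task`/`SearchSourceState` traits and `RUNTIME_TX` declared above; the `NoopTask` type and `submit_noop_task` helper are invented for illustration.
struct NoopTask;

#[async_trait::async_trait(?Send)]
impl Task for NoopTask {
    fn search_source_id(&self) -> &'static str {
        "noop"
    }

    async fn exec(&mut self, _state: &mut Option<Box<dyn SearchSourceState>>) {
        log::trace!("noop task executed on the Pizza engine runtime");
    }
}

fn submit_noop_task() {
    // `RUNTIME_TX` is only populated once `start_pizza_engine_runtime()` has run.
    if let Some(tx) = RUNTIME_TX.get() {
        let _ = tx.send(Box::new(NoopTask));
    }
}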

View File

@@ -0,0 +1,12 @@
pub(super) const EXTENSION_ID: &str = "QuickAIAccess";
pub(crate) const PLUGIN_JSON_FILE: &str = r#"
{
"id": "QuickAIAccess",
"name": "Quick AI Access",
"description": "...",
"icon": "font_a-QuickAIAccess",
"type": "ai_extension",
"enabled": true
}
"#;

View File

@@ -0,0 +1,775 @@
pub(crate) mod built_in;
pub(crate) mod third_party;
use crate::common::document::OnOpened;
use crate::common::register::SearchSourceRegistry;
use crate::util::platform::Platform;
use anyhow::Context;
use borrowme::{Borrow, ToOwned};
use derive_more::Display;
use serde::Deserialize;
use serde::Serialize;
use serde_json::Value as Json;
use std::collections::HashSet;
use std::path::Path;
use tauri::{AppHandle, Manager, Runtime};
use third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE;
pub const LOCAL_QUERY_SOURCE_TYPE: &str = "local";
const PLUGIN_JSON_FILE_NAME: &str = "plugin.json";
const ASSETS_DIRECTORY_FILE_NAME: &str = "assets";
fn default_true() -> bool {
true
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct Extension {
/// Extension ID.
///
/// The ID doesn't uniquely identify an extension; its bundle ID (ID & developer) does.
id: String,
/// Extension name.
name: String,
/// ID of the developer.
///
/// * For built-in extensions, this will always be None.
/// * For third-party first-layer extensions, the on-disk plugin.json file
///   won't contain this field, but we set it for them after reading them into memory.
/// * For third-party sub extensions, this field will be None.
developer: Option<String>,
/// Platforms supported by this extension.
///
/// If `None`, then this extension can be used on all the platforms.
#[serde(skip_serializing_if = "Option::is_none")]
platforms: Option<HashSet<Platform>>,
/// Extension description.
description: String,
/// Specify the icon for this extension.
///
/// For the `plugin.json` file, this field can be specified in multiple ways:
///
/// 1. It can be a path to the icon file, the path can be
///
/// * relative (relative to the "assets" directory)
/// * absolute
/// 2. It can be a font class code, e.g., 'font_coco', if you want to use
/// Coco's built-in icons.
///
/// In cases where your icon file is named similarly to a font class code, Coco
/// will treat it as an icon file if it exists, i.e., if file `<extension>/assets/font_coco`
/// exists, then Coco will use this file rather than the built-in 'font_coco' icon.
///
/// For the `struct Extension` loaded into memory, this field should be:
///
/// 1. An absolute path
/// 2. A font code
icon: String,
r#type: ExtensionType,
/// If this is a Command extension, then `action` defines the operation to execute
/// when it is triggered.
#[serde(skip_serializing_if = "Option::is_none")]
action: Option<CommandAction>,
/// The link to open if this is a Quicklink extension.
#[serde(skip_serializing_if = "Option::is_none")]
quicklink: Option<Quicklink>,
// If this extension is of type Group or Extension, then it behaves like a
// directory, i.e., it could contain sub items.
commands: Option<Vec<Extension>>,
scripts: Option<Vec<Extension>>,
quicklinks: Option<Vec<Extension>>,
/// The alias of the extension.
///
/// Extensions of type Group and Extension cannot have an alias.
#[serde(skip_serializing_if = "Option::is_none")]
alias: Option<String>,
/// The hotkey of the extension.
///
/// Extensions of type Group and Extension cannot have a hotkey.
#[serde(skip_serializing_if = "Option::is_none")]
hotkey: Option<String>,
/// Whether this extension is enabled.
#[serde(default = "default_true")]
enabled: bool,
/// Extension settings
#[serde(skip_serializing_if = "Option::is_none")]
settings: Option<Json>,
// We do not care about these fields; just accept them regardless of what they are.
screenshots: Option<Json>,
url: Option<Json>,
version: Option<Json>,
}
/// Bundle ID uniquely identifies an extension.
#[derive(Debug, Deserialize, Serialize, PartialEq, Clone)]
pub(crate) struct ExtensionBundleId {
developer: Option<String>,
extension_id: String,
sub_extension_id: Option<String>,
}
impl Borrow for ExtensionBundleId {
type Target<'a> = ExtensionBundleIdBorrowed<'a>;
fn borrow(&self) -> Self::Target<'_> {
ExtensionBundleIdBorrowed {
developer: self.developer.as_deref(),
extension_id: &self.extension_id,
sub_extension_id: self.sub_extension_id.as_deref(),
}
}
}
/// Reference version of `ExtensionBundleId`.
#[derive(Debug, Serialize, PartialEq)]
pub(crate) struct ExtensionBundleIdBorrowed<'ext> {
developer: Option<&'ext str>,
extension_id: &'ext str,
sub_extension_id: Option<&'ext str>,
}
impl ToOwned for ExtensionBundleIdBorrowed<'_> {
type Owned = ExtensionBundleId;
fn to_owned(&self) -> Self::Owned {
ExtensionBundleId {
developer: self.developer.map(|s| s.to_string()),
extension_id: self.extension_id.to_string(),
sub_extension_id: self.sub_extension_id.map(|s| s.to_string()),
}
}
}
impl<'ext> PartialEq<ExtensionBundleIdBorrowed<'ext>> for ExtensionBundleId {
fn eq(&self, other: &ExtensionBundleIdBorrowed<'ext>) -> bool {
self.developer.as_deref() == other.developer
&& self.extension_id == other.extension_id
&& self.sub_extension_id.as_deref() == other.sub_extension_id
}
}
impl<'ext> PartialEq<ExtensionBundleId> for ExtensionBundleIdBorrowed<'ext> {
fn eq(&self, other: &ExtensionBundleId) -> bool {
self.developer == other.developer.as_deref()
&& self.extension_id == other.extension_id
&& self.sub_extension_id == other.sub_extension_id.as_deref()
}
}
impl Extension {
/// WARNING: the bundle ID returned from this function always has its `sub_extension_id`
/// set to `None`; this may not be what you want.
pub(crate) fn bundle_id_borrowed(&self) -> ExtensionBundleIdBorrowed<'_> {
ExtensionBundleIdBorrowed {
developer: self.developer.as_deref(),
extension_id: &self.id,
sub_extension_id: None,
}
}
/// Whether this extension could be searched.
pub(crate) fn searchable(&self) -> bool {
self.on_opened().is_some()
}
/// Return what will happen when we open this extension.
///
/// `None` if it cannot be opened.
pub(crate) fn on_opened(&self) -> Option<OnOpened> {
match self.r#type {
ExtensionType::Group => None,
ExtensionType::Extension => None,
ExtensionType::Command => Some(OnOpened::Command {
action: self.action.clone().unwrap_or_else(|| {
panic!(
"Command extension [{}]'s [action] field is not set, something wrong with your extension validity check", self.id
)
}),
}),
ExtensionType::Application => Some(OnOpened::Application {
app_path: self.id.clone(),
}),
ExtensionType::Script => todo!("not supported yet"),
ExtensionType::Quicklink => todo!("not supported yet"),
ExtensionType::Setting => todo!("not supported yet"),
ExtensionType::Calculator => None,
ExtensionType::AiExtension => None,
}
}
pub(crate) fn get_sub_extension(&self, sub_extension_id: &str) -> Option<&Self> {
if !self.r#type.contains_sub_items() {
return None;
}
if let Some(ref commands) = self.commands {
if let Some(sub_ext) = commands.iter().find(|cmd| cmd.id == sub_extension_id) {
return Some(sub_ext);
}
}
if let Some(ref scripts) = self.scripts {
if let Some(sub_ext) = scripts.iter().find(|script| script.id == sub_extension_id) {
return Some(sub_ext);
}
}
if let Some(ref quicklinks) = self.quicklinks {
if let Some(sub_ext) = quicklinks.iter().find(|link| link.id == sub_extension_id) {
return Some(sub_ext);
}
}
None
}
pub(crate) fn get_sub_extension_mut(&mut self, sub_extension_id: &str) -> Option<&mut Self> {
if !self.r#type.contains_sub_items() {
return None;
}
if let Some(ref mut commands) = self.commands {
if let Some(sub_ext) = commands.iter_mut().find(|cmd| cmd.id == sub_extension_id) {
return Some(sub_ext);
}
}
if let Some(ref mut scripts) = self.scripts {
if let Some(sub_ext) = scripts
.iter_mut()
.find(|script| script.id == sub_extension_id)
{
return Some(sub_ext);
}
}
if let Some(ref mut quicklinks) = self.quicklinks {
if let Some(sub_ext) = quicklinks
.iter_mut()
.find(|link| link.id == sub_extension_id)
{
return Some(sub_ext);
}
}
None
}
pub(crate) fn supports_alias_hotkey(&self) -> bool {
let ty = self.r#type;
ty != ExtensionType::Group && ty != ExtensionType::Extension
}
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub(crate) struct CommandAction {
pub(crate) exec: String,
pub(crate) args: Option<Vec<String>>,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct Quicklink {
link: String,
}
#[derive(Debug, PartialEq, Deserialize, Serialize, Clone, Display, Copy)]
#[serde(rename_all(serialize = "snake_case", deserialize = "snake_case"))]
pub enum ExtensionType {
#[display("Group")]
Group,
#[display("Extension")]
Extension,
#[display("Command")]
Command,
#[display("Application")]
Application,
#[display("Script")]
Script,
#[display("Quicklink")]
Quicklink,
#[display("Setting")]
Setting,
#[display("Calculator")]
Calculator,
#[display("AI Extension")]
AiExtension,
}
impl ExtensionType {
pub(crate) fn contains_sub_items(&self) -> bool {
self == &Self::Group || self == &Self::Extension
}
}
/// Helper function to filter out the extensions that do not satisfy the specified conditions.
///
/// used in `list_extensions()`
fn filter_out_extensions(
extensions: &mut Vec<Extension>,
query: Option<&str>,
extension_type: Option<ExtensionType>,
list_enabled: bool,
) {
// apply `list_enabled`
if list_enabled {
extensions.retain(|ext| ext.enabled);
for extension in extensions.iter_mut() {
if extension.r#type.contains_sub_items() {
if let Some(ref mut commands) = extension.commands {
commands.retain(|cmd| cmd.enabled);
}
if let Some(ref mut scripts) = extension.scripts {
scripts.retain(|script| script.enabled);
}
if let Some(ref mut quicklinks) = extension.quicklinks {
quicklinks.retain(|link| link.enabled);
}
}
}
}
// apply extension type filter to non-group/extension extensions
if let Some(extension_type) = extension_type {
assert!(
extension_type != ExtensionType::Group && extension_type != ExtensionType::Extension,
"filtering in folder extensions is pointless"
);
extensions.retain(|ext| {
let ty = ext.r#type;
ty == ExtensionType::Group || ty == ExtensionType::Extension || ty == extension_type
});
// Filter sub-extensions to only include the requested type
for extension in extensions.iter_mut() {
if extension.r#type.contains_sub_items() {
if let Some(ref mut commands) = extension.commands {
commands.retain(|cmd| cmd.r#type == extension_type);
}
if let Some(ref mut scripts) = extension.scripts {
scripts.retain(|script| script.r#type == extension_type);
}
if let Some(ref mut quicklinks) = extension.quicklinks {
quicklinks.retain(|link| link.r#type == extension_type);
}
}
}
// Application is special: technically, it should never be filtered out by
// this condition. But users would be surprised if they chose a
// non-Application type and still saw it in the results, so we remove it
// here to remedy that.
if let Some(idx) = extensions.iter().position(|ext| {
ext.developer.is_none()
&& ext.id == built_in::application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
}) {
if extension_type != ExtensionType::Application {
extensions.remove(idx);
}
}
}
// apply query filter
if let Some(query) = query {
let match_closure = |ext: &Extension| {
let lowercase_title = ext.name.to_lowercase();
let lowercase_alias = ext.alias.as_ref().map(|alias| alias.to_lowercase());
let lowercase_query = query.to_lowercase();
lowercase_title.contains(&lowercase_query)
|| lowercase_alias.map_or(false, |alias| alias.contains(&lowercase_query))
};
extensions.retain(|ext| {
if ext.r#type.contains_sub_items() {
// Keep all group/extension types
true
} else {
// Apply filter to non-group/extension types
match_closure(ext)
}
});
// Filter sub-extensions in groups and extensions
for extension in extensions.iter_mut() {
if extension.r#type.contains_sub_items() {
if let Some(ref mut commands) = extension.commands {
commands.retain(&match_closure);
}
if let Some(ref mut scripts) = extension.scripts {
scripts.retain(&match_closure);
}
if let Some(ref mut quicklinks) = extension.quicklinks {
quicklinks.retain(&match_closure);
}
}
}
}
}
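// Editor's illustration (not part of the file): with `query = Some("co")`,
// a Group extension is kept regardless of its own name, but only the sub-items
// whose lowercased name or alias contains "co" (e.g. "Copy Path") survive
// inside it; a standalone Command whose name and alias both lack "co" is
// dropped entirely.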
/// Return value:
///
/// * boolean: indicates if we found any invalid extensions
/// * Vec<Extension>: loaded extensions
#[tauri::command]
pub(crate) async fn list_extensions<R: Runtime>(
tauri_app_handle: AppHandle<R>,
query: Option<String>,
extension_type: Option<ExtensionType>,
list_enabled: bool,
) -> Result<(bool, Vec<Extension>), String> {
log::trace!("loading extensions");
let third_party_dir = third_party::get_third_party_extension_directory(&tauri_app_handle);
if !third_party_dir.try_exists().map_err(|e| e.to_string())? {
tokio::fs::create_dir_all(&third_party_dir)
.await
.map_err(|e| e.to_string())?;
}
let (third_party_found_invalid_extension, mut third_party_extensions) =
third_party::list_third_party_extensions(&third_party_dir).await?;
let built_in_extensions = built_in::list_built_in_extensions(&tauri_app_handle).await?;
let found_invalid_extension = third_party_found_invalid_extension;
let mut extensions = {
third_party_extensions.extend(built_in_extensions);
third_party_extensions
};
filter_out_extensions(
&mut extensions,
query.as_deref(),
extension_type,
list_enabled,
);
// Clean up after filtering extensions; skip this step if no filter was applied.
//
// Remove parent extensions (Group/Extension types) that have no sub-items after filtering
let filter_performed = query.is_some() || extension_type.is_some() || list_enabled;
if filter_performed {
extensions.retain(|ext| {
if !ext.r#type.contains_sub_items() {
return true;
}
// We don't apply this filter to the Applications extension since its sub-item list is always empty here; apps are loaded at runtime.
if ext.developer.is_none()
&& ext.id == built_in::application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME
{
return true;
}
let has_commands = ext
.commands
.as_ref()
.map_or(false, |commands| !commands.is_empty());
let has_scripts = ext
.scripts
.as_ref()
.map_or(false, |scripts| !scripts.is_empty());
let has_quicklinks = ext
.quicklinks
.as_ref()
.map_or(false, |quicklinks| !quicklinks.is_empty());
has_commands || has_scripts || has_quicklinks
});
}
Ok((found_invalid_extension, extensions))
}
pub(crate) async fn init_extensions(
tauri_app_handle: AppHandle,
mut extensions: Vec<Extension>,
) -> Result<(), String> {
log::trace!("initializing extensions");
let search_source_registry_tauri_state = tauri_app_handle.state::<SearchSourceRegistry>();
built_in::application::ApplicationSearchSource::prepare_index_and_store(
tauri_app_handle.clone(),
)
.await?;
// extension store
search_source_registry_tauri_state
.register_source(third_party::store::ExtensionStore)
.await;
// Init the built-in enabled extensions
for built_in_extension in extensions
.extract_if(.., |ext| {
built_in::is_extension_built_in(&ext.bundle_id_borrowed())
})
.filter(|ext| ext.enabled)
{
built_in::init_built_in_extension(
&tauri_app_handle,
&built_in_extension,
&search_source_registry_tauri_state,
)
.await?;
}
// Now the third-party extensions
let third_party_search_source = third_party::ThirdPartyExtensionsSearchSource::new(extensions);
third_party_search_source.init(&tauri_app_handle).await?;
let third_party_search_source_clone = third_party_search_source.clone();
// Set the global search source so that we can access it in `#[tauri::command]`s.
// Ignore the result because this function will be invoked twice, which means
// this global variable will be set twice.
let _ = THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.set(third_party_search_source_clone);
search_source_registry_tauri_state
.register_source(third_party_search_source)
.await;
Ok(())
}
#[tauri::command]
pub(crate) async fn enable_extension(
tauri_app_handle: AppHandle,
bundle_id: ExtensionBundleId,
) -> Result<(), String> {
let bundle_id_borrowed = bundle_id.borrow();
if built_in::is_extension_built_in(&bundle_id_borrowed) {
built_in::enable_built_in_extension(&tauri_app_handle, &bundle_id_borrowed).await?;
return Ok(());
}
third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.get().expect("global third party search source not set, looks like init_extensions() has not been executed").enable_extension(&tauri_app_handle, &bundle_id_borrowed).await
}
#[tauri::command]
pub(crate) async fn disable_extension(
tauri_app_handle: AppHandle,
bundle_id: ExtensionBundleId,
) -> Result<(), String> {
let bundle_id_borrowed = bundle_id.borrow();
if built_in::is_extension_built_in(&bundle_id_borrowed) {
built_in::disable_built_in_extension(&tauri_app_handle, &bundle_id_borrowed).await?;
return Ok(());
}
third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.get().expect("global third party search source not set, looks like init_extensions() has not been executed").disable_extension(&tauri_app_handle, &bundle_id_borrowed).await
}
#[tauri::command]
pub(crate) async fn set_extension_alias(
tauri_app_handle: AppHandle,
bundle_id: ExtensionBundleId,
alias: String,
) -> Result<(), String> {
let bundle_id_borrowed = bundle_id.borrow();
if built_in::is_extension_built_in(&bundle_id_borrowed) {
built_in::set_built_in_extension_alias(&tauri_app_handle, &bundle_id_borrowed, &alias);
return Ok(());
}
third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.get().expect("global third party search source not set, looks like init_extensions() has not been executed").set_extension_alias(&tauri_app_handle, &bundle_id_borrowed, &alias).await
}
#[tauri::command]
pub(crate) async fn register_extension_hotkey(
tauri_app_handle: AppHandle,
bundle_id: ExtensionBundleId,
hotkey: String,
) -> Result<(), String> {
let bundle_id_borrowed = bundle_id.borrow();
if built_in::is_extension_built_in(&bundle_id_borrowed) {
built_in::register_built_in_extension_hotkey(
&tauri_app_handle,
&bundle_id_borrowed,
&hotkey,
)?;
return Ok(());
}
third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.get().expect("global third party search source not set, looks like init_extensions() has not been executed").register_extension_hotkey(&tauri_app_handle, &bundle_id_borrowed, &hotkey).await
}
/// NOTE: this function won't error out if the extension specified by `extension_id`
/// has no hotkey set, because we rely on that behavior.
#[tauri::command]
pub(crate) async fn unregister_extension_hotkey(
tauri_app_handle: AppHandle,
bundle_id: ExtensionBundleId,
) -> Result<(), String> {
let bundle_id_borrowed = bundle_id.borrow();
if built_in::is_extension_built_in(&bundle_id_borrowed) {
built_in::unregister_built_in_extension_hotkey(&tauri_app_handle, &bundle_id_borrowed)?;
return Ok(());
}
third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.get().expect("global third party search source not set, looks like init_extensions() has not been executed").unregister_extension_hotkey(&tauri_app_handle, &bundle_id_borrowed).await?;
Ok(())
}
#[tauri::command]
pub(crate) async fn is_extension_enabled(
tauri_app_handle: AppHandle,
bundle_id: ExtensionBundleId,
) -> Result<bool, String> {
let bundle_id_borrowed = bundle_id.borrow();
if built_in::is_extension_built_in(&bundle_id_borrowed) {
return built_in::is_built_in_extension_enabled(&tauri_app_handle, &bundle_id_borrowed)
.await;
}
third_party::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE.get().expect("global third party search source not set, looks like init_extensions() has not been executed").is_extension_enabled(&bundle_id_borrowed).await
}
pub(crate) fn canonicalize_relative_icon_path(
extension_dir: &Path,
extension: &mut Extension,
) -> Result<(), String> {
fn _canonicalize_relative_icon_path(
extension_dir: &Path,
extension: &mut Extension,
) -> Result<(), String> {
let icon_str = &extension.icon;
let icon_path = Path::new(icon_str);
if icon_path.is_relative() {
let absolute_icon_path = {
let mut assets_directory = extension_dir.join(ASSETS_DIRECTORY_FILE_NAME);
assets_directory.push(icon_path);
assets_directory
};
if absolute_icon_path.try_exists().map_err(|e| e.to_string())? {
extension.icon = absolute_icon_path
.into_os_string()
.into_string()
.expect("path should be UTF-8 encoded");
}
}
Ok(())
}
_canonicalize_relative_icon_path(extension_dir, extension)?;
if let Some(commands) = &mut extension.commands {
for command in commands {
_canonicalize_relative_icon_path(extension_dir, command)?;
}
}
if let Some(scripts) = &mut extension.scripts {
for script in scripts {
_canonicalize_relative_icon_path(extension_dir, script)?;
}
}
if let Some(quicklinks) = &mut extension.quicklinks {
for quicklink in quicklinks {
_canonicalize_relative_icon_path(extension_dir, quicklink)?;
}
}
Ok(())
}
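// Editor's illustration (not part of the file): for an extension whose
// plugin.json contains `"icon": "icon.png"`, the helper above rewrites the
// field to the absolute `<extension_dir>/assets/icon.png` path when that file
// exists; absolute paths and font codes such as "font_coco" (with no matching
// file under assets/) are left untouched.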
fn alter_extension_json_file(
extension_directory: &Path,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
how: impl Fn(&mut Extension) -> Result<(), String>,
) -> Result<(), String> {
/// Perform `how` against the extension specified by `extension_id`.
///
/// Please note that `bundle` could point to a sub extension if `sub_extension_id` is Some.
pub(crate) fn modify(
root_extension: &mut Extension,
bundle_id: &ExtensionBundleIdBorrowed<'_>,
how: impl FnOnce(&mut Extension) -> Result<(), String>,
) -> Result<(), String> {
let (parent_extension_id, opt_sub_extension_id) =
(bundle_id.extension_id, bundle_id.sub_extension_id);
assert_eq!(
parent_extension_id, root_extension.id,
"modify() should be invoked against a parent extension"
);
let Some(sub_extension_id) = opt_sub_extension_id else {
how(root_extension)?;
return Ok(());
};
// Search in commands
if let Some(ref mut commands) = root_extension.commands {
if let Some(command) = commands.iter_mut().find(|cmd| cmd.id == sub_extension_id) {
how(command)?;
return Ok(());
}
}
// Search in scripts
if let Some(ref mut scripts) = root_extension.scripts {
if let Some(script) = scripts.iter_mut().find(|scr| scr.id == sub_extension_id) {
how(script)?;
return Ok(());
}
}
// Search in quicklinks
if let Some(ref mut quicklinks) = root_extension.quicklinks {
if let Some(link) = quicklinks.iter_mut().find(|lnk| lnk.id == sub_extension_id) {
how(link)?;
return Ok(());
}
}
Err(format!(
"extension [{:?}] not found in {:?}",
bundle_id, root_extension
))
}
log::debug!(
"altering extension JSON file for extension [{:?}]",
bundle_id
);
let json_file_path = {
let mut path = extension_directory.to_path_buf();
if let Some(developer) = bundle_id.developer {
path.push(developer);
}
path.push(bundle_id.extension_id);
path.push(PLUGIN_JSON_FILE_NAME);
path
};
let mut extension = serde_json::from_reader::<_, Extension>(
std::fs::File::open(&json_file_path)
.with_context(|| {
format!(
"the [{}] file for extension [{:?}] is missing or broken",
PLUGIN_JSON_FILE_NAME, bundle_id
)
})
.map_err(|e| e.to_string())?,
)
.map_err(|e| e.to_string())?;
modify(&mut extension, bundle_id, how)?;
std::fs::write(
&json_file_path,
serde_json::to_string_pretty(&extension).map_err(|e| e.to_string())?,
)
.map_err(|e| e.to_string())?;
Ok(())
}
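A hedged usage sketch of `alter_extension_json_file` for a built-in extension; the wrapper function and the concrete bundle id values are illustrative assumptions, not part of the file.
fn example_disable_on_disk(built_in_dir: &Path) -> Result<(), String> {
    let bundle_id = ExtensionBundleIdBorrowed {
        developer: None,
        extension_id: "File Search",
        sub_extension_id: None,
    };
    // Load plugin.json, flip the flag, and write the file back.
    alter_extension_json_file(built_in_dir, &bundle_id, |ext| {
        ext.enabled = false;
        Ok(())
    })
}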

src-tauri/src/extension/third_party/mod.rs (1193 lines, vendored): diff suppressed because it is too large

View File

@@ -0,0 +1,341 @@
//! Extension store related stuff.
use super::LOCAL_QUERY_SOURCE_TYPE;
use crate::common::document::DataSourceReference;
use crate::common::document::Document;
use crate::common::error::SearchError;
use crate::common::search::QueryResponse;
use crate::common::search::QuerySource;
use crate::common::search::SearchQuery;
use crate::common::traits::SearchSource;
use crate::extension::Extension;
use crate::extension::PLUGIN_JSON_FILE_NAME;
use crate::extension::THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE;
use crate::extension::canonicalize_relative_icon_path;
use crate::extension::third_party::get_third_party_extension_directory;
use crate::server::http_client::HttpClient;
use async_trait::async_trait;
use reqwest::StatusCode;
use serde_json::Map as JsonObject;
use serde_json::Value as Json;
use std::io::Read;
use tauri::AppHandle;
const DATA_SOURCE_ID: &str = "Extension Store";
pub(crate) struct ExtensionStore;
#[async_trait]
impl SearchSource for ExtensionStore {
fn get_type(&self) -> QuerySource {
QuerySource {
r#type: LOCAL_QUERY_SOURCE_TYPE.into(),
name: hostname::get()
.unwrap_or(DATA_SOURCE_ID.into())
.to_string_lossy()
.into(),
id: DATA_SOURCE_ID.into(),
}
}
async fn search(
&self,
_tauri_app_handle: AppHandle,
query: SearchQuery,
) -> Result<QueryResponse, SearchError> {
const SCORE: f64 = 2000.0;
let Some(query_string) = query.query_strings.get("query") else {
return Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
});
};
let lowercase_query_string = query_string.to_lowercase();
let expected_str = "extension store";
if expected_str.contains(&lowercase_query_string) {
let doc = Document {
id: DATA_SOURCE_ID.to_string(),
category: Some(DATA_SOURCE_ID.to_string()),
title: Some(DATA_SOURCE_ID.to_string()),
icon: Some("font_Store".to_string()),
source: Some(DataSourceReference {
r#type: Some(LOCAL_QUERY_SOURCE_TYPE.into()),
name: Some(DATA_SOURCE_ID.into()),
id: Some(DATA_SOURCE_ID.into()),
icon: Some("font_Store".to_string()),
}),
..Default::default()
};
Ok(QueryResponse {
source: self.get_type(),
hits: vec![(doc, SCORE)],
total_hits: 1,
})
} else {
Ok(QueryResponse {
source: self.get_type(),
hits: Vec::new(),
total_hits: 0,
})
}
}
}
#[tauri::command]
pub(crate) async fn search_extension(
query_params: Option<Vec<String>>,
) -> Result<Vec<Json>, String> {
let response = HttpClient::get(
"default_coco_server",
"store/extension/_search",
query_params,
)
.await
.map_err(|e| format!("Failed to send request: {:?}", e))?;
// The response of a ES style search request
let mut response: JsonObject<String, Json> = response
.json()
.await
.map_err(|e| format!("Failed to parse response: {:?}", e))?;
let hits_json = response
.remove("hits")
.expect("the JSON response should contain field [hits]");
let mut hits = match hits_json {
Json::Object(obj) => obj,
_ => panic!(
"field [hits] should be a JSON object, but it is not, value: [{}]",
hits_json
),
};
let Some(hits_hits_json) = hits.remove("hits") else {
return Ok(Vec::new());
};
let hits_hits = match hits_hits_json {
Json::Array(arr) => arr,
_ => panic!(
"field [hits.hits] should be an array, but it is not, value: [{}]",
hits_hits_json
),
};
let mut extensions = Vec::with_capacity(hits_hits.len());
for hit in hits_hits {
let mut hit_obj = match hit {
Json::Object(obj) => obj,
_ => panic!(
"each hit in [hits.hits] should be a JSON object, but it is not, value: [{}]",
hit
),
};
let source = hit_obj
.remove("_source")
.expect("each hit should contain field [_source]");
let mut source_obj = match source {
Json::Object(obj) => obj,
_ => panic!(
"field [_source] should be a JSON object, but it is not, value: [{}]",
source
),
};
let developer_id = source_obj
.get("developer")
.and_then(|dev| dev.get("id"))
.and_then(|id| id.as_str())
.expect("developer.id should exist")
.to_string();
let extension_id = source_obj
.get("id")
.and_then(|id| id.as_str())
.expect("extension id should exist")
.to_string();
let installed = is_extension_installed(developer_id, extension_id).await;
source_obj.insert("installed".to_string(), Json::Bool(installed));
extensions.push(Json::Object(source_obj));
}
Ok(extensions)
}
async fn is_extension_installed(developer: String, extension_id: String) -> bool {
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.extension_exists(&developer, &extension_id)
.await
}
#[tauri::command]
pub(crate) async fn install_extension_from_store(
tauri_app_handle: AppHandle,
id: String,
) -> Result<(), String> {
let path = format!("store/extension/{}/_download", id);
let response = HttpClient::get("default_coco_server", &path, None)
.await
.map_err(|e| format!("Failed to download extension: {}", e))?;
if response.status() == StatusCode::NOT_FOUND {
return Err(format!("extension [{}] not found", id));
}
let bytes = response
.bytes()
.await
.map_err(|e| format!("Failed to read response bytes: {}", e))?;
let cursor = std::io::Cursor::new(bytes);
let mut archive =
zip::ZipArchive::new(cursor).map_err(|e| format!("Failed to read zip archive: {}", e))?;
// The plugin.json sent from the server does not conform to our `struct Extension` definition:
//
// 1. Its `developer` field is a JSON object, but we need a string
// 2. sub-extensions won't have their `id` fields set
//
// we need to correct it
let mut plugin_json = archive
.by_name(PLUGIN_JSON_FILE_NAME)
.map_err(|e| e.to_string())?;
let mut plugin_json_content = String::new();
std::io::Read::read_to_string(&mut plugin_json, &mut plugin_json_content)
.map_err(|e| e.to_string())?;
let mut extension: Json = serde_json::from_str(&plugin_json_content)
.map_err(|e| format!("Failed to parse plugin.json: {}", e))?;
let mut_ref_to_developer_object: &mut Json = extension
.as_object_mut()
.expect("plugin.json should be an object")
.get_mut("developer")
.expect("plugin.json should contain field [developer]");
let developer_id = mut_ref_to_developer_object
.get("id")
.expect("plugin.json should contain [developer.id]")
.as_str()
.expect("plugin.json field [developer.id] should be a string");
*mut_ref_to_developer_object = Json::String(developer_id.into());
// Set IDs for sub-extensions (commands, quicklinks, scripts)
let mut counter = 0;
// Helper function to set IDs for array fields
fn set_ids_for_field(extension: &mut Json, field_name: &str, counter: &mut i32) {
if let Some(field) = extension.as_object_mut().unwrap().get_mut(field_name) {
if let Some(array) = field.as_array_mut() {
for item in array {
if let Some(item_obj) = item.as_object_mut() {
if !item_obj.contains_key("id") {
item_obj.insert("id".to_string(), Json::String(counter.to_string()));
*counter += 1;
}
}
}
}
}
}
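    // Editor's illustration (not part of the file): given
    //     "commands": [{ "name": "a" }, { "id": "x" }, { "name": "b" }]
    // the helper above fills in the missing ids sequentially, producing "0" for
    // the first entry and "1" for the third, while the explicit "x" is left as is.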
set_ids_for_field(&mut extension, "commands", &mut counter);
set_ids_for_field(&mut extension, "quicklinks", &mut counter);
set_ids_for_field(&mut extension, "scripts", &mut counter);
// Now the extension JSON is valid
let mut extension: Extension = serde_json::from_value(extension).unwrap_or_else(|e| {
panic!(
"cannot parse plugin.json as struct Extension, error [{:?}]",
e
);
});
drop(plugin_json);
// Write extension files to the extension directory
let developer = extension.developer.clone().unwrap_or_default();
let extension_id = extension.id.clone();
let extension_directory = {
let mut path = get_third_party_extension_directory(&tauri_app_handle);
path.push(developer);
path.push(extension_id.as_str());
path
};
tokio::fs::create_dir_all(extension_directory.as_path())
.await
.map_err(|e| e.to_string())?;
// Extract all files except plugin.json
for i in 0..archive.len() {
let mut zip_file = archive.by_index(i).map_err(|e| e.to_string())?;
// `.name()` is safe to use in our case; the pitfalls listed on the page below
// won't happen to us.
//
// https://docs.rs/zip/4.2.0/zip/read/struct.ZipFile.html#method.name
//
// Example names:
//
// * `assets/icon.png`
// * `assets/screenshot.png`
// * `plugin.json`
//
// Yes, the `assets` directory is not a part of it.
let zip_file_name = zip_file.name();
// Skip the plugin.json file as we'll create it from the extension variable
if zip_file_name == PLUGIN_JSON_FILE_NAME {
continue;
}
let dest_file_path = extension_directory.join(zip_file_name);
// For cases like `assets/xxx.png`
if let Some(parent_dir) = dest_file_path.parent()
&& !parent_dir.exists()
{
tokio::fs::create_dir_all(parent_dir)
.await
.map_err(|e| e.to_string())?;
}
let mut dest_file = tokio::fs::File::create(&dest_file_path)
.await
.map_err(|e| e.to_string())?;
let mut src_bytes = Vec::with_capacity(
zip_file
.size()
.try_into()
.expect("we won't have a extension file that is bigger than 4GiB"),
);
zip_file
.read_to_end(&mut src_bytes)
.map_err(|e| e.to_string())?;
tokio::io::copy(&mut src_bytes.as_slice(), &mut dest_file)
.await
.map_err(|e| e.to_string())?;
}
// Create plugin.json from the extension variable
let plugin_json_path = extension_directory.join(PLUGIN_JSON_FILE_NAME);
let extension_json = serde_json::to_string_pretty(&extension).map_err(|e| e.to_string())?;
tokio::fs::write(&plugin_json_path, extension_json)
.await
.map_err(|e| e.to_string())?;
// Turn it into an absolute path if it is a valid relative path because the frontend code needs this.
canonicalize_relative_icon_path(&extension_directory, &mut extension)?;
THIRD_PARTY_EXTENSIONS_SEARCH_SOURCE
.get()
.unwrap()
.add_extension(extension)
.await;
Ok(())
}
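For illustration, a minimal standalone sketch of the plugin.json normalization performed above, written against serde_json only; the field names follow the code, while the helper name and the example JSON are made up:

use serde_json::Value as Json;

// Hypothetical helper mirroring the two fix-ups above: flatten `developer`
// to its `id` string and assign incremental ids to sub-extensions.
fn normalize_plugin_json(mut plugin: Json) -> Json {
    let developer_id = plugin["developer"]["id"]
        .as_str()
        .expect("developer.id should be a string")
        .to_owned();
    plugin["developer"] = Json::String(developer_id);

    let mut counter = 0;
    for field in ["commands", "quicklinks", "scripts"] {
        if let Some(items) = plugin.get_mut(field).and_then(Json::as_array_mut) {
            for item in items {
                let obj = item
                    .as_object_mut()
                    .expect("sub-extension should be a JSON object");
                // Only assign an id if the sub-extension does not already have one.
                obj.entry("id").or_insert_with(|| {
                    let id = Json::String(counter.to_string());
                    counter += 1;
                    id
                });
            }
        }
    }
    plugin
}

// Example (invented): `{"developer": {"id": "acme"}, "commands": [{"name": "open"}]}`
// becomes `{"developer": "acme", "commands": [{"name": "open", "id": "0"}]}`.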


@@ -1,7 +1,7 @@
mod assistant; mod assistant;
mod autostart; mod autostart;
mod common; mod common;
mod local; mod extension;
mod search; mod search;
mod server; mod server;
mod settings; mod settings;
@@ -11,19 +11,15 @@ mod util;
use crate::common::register::SearchSourceRegistry; use crate::common::register::SearchSourceRegistry;
// use crate::common::traits::SearchSource; // use crate::common::traits::SearchSource;
use crate::common::{MAIN_WINDOW_LABEL, SETTINGS_WINDOW_LABEL}; use crate::common::{CHECK_WINDOW_LABEL, MAIN_WINDOW_LABEL, SETTINGS_WINDOW_LABEL};
use crate::server::servers::{load_or_insert_default_server, load_servers_token}; use crate::server::servers::{load_or_insert_default_server, load_servers_token};
use autostart::{change_autostart, enable_autostart}; use autostart::{change_autostart, ensure_autostart_state_consistent};
use lazy_static::lazy_static; use lazy_static::lazy_static;
use std::sync::Mutex; use std::sync::Mutex;
use std::sync::OnceLock; use std::sync::OnceLock;
use tauri::async_runtime::block_on; use tauri::async_runtime::block_on;
use tauri::plugin::TauriPlugin; use tauri::plugin::TauriPlugin;
#[cfg(target_os = "macos")] use tauri::{AppHandle, Emitter, Manager, PhysicalPosition, Runtime, WebviewWindow, WindowEvent};
use tauri::ActivationPolicy;
use tauri::{
AppHandle, Emitter, Manager, PhysicalPosition, Runtime, WebviewWindow, Window, WindowEvent,
};
use tauri_plugin_autostart::MacosLauncher; use tauri_plugin_autostart::MacosLauncher;
/// Tauri store name /// Tauri store name
@@ -32,9 +28,14 @@ pub(crate) const COCO_TAURI_STORE: &str = "coco_tauri_store";
lazy_static! { lazy_static! {
static ref PREVIOUS_MONITOR_NAME: Mutex<Option<String>> = Mutex::new(None); static ref PREVIOUS_MONITOR_NAME: Mutex<Option<String>> = Mutex::new(None);
} }
/// To allow us to access tauri's `AppHandle` when its context is inaccessible, /// To allow us to access tauri's `AppHandle` when its context is inaccessible,
/// store it globally. It will be set in `init()`. /// store it globally. It will be set in `init()`.
///
/// # WARNING
///
/// You may find that this works, but its use is discouraged and should generally
/// be avoided. If you do need it, always keep in mind that it may not have been
/// set() yet when you access it.
pub(crate) static GLOBAL_TAURI_APP_HANDLE: OnceLock<AppHandle> = OnceLock::new(); pub(crate) static GLOBAL_TAURI_APP_HANDLE: OnceLock<AppHandle> = OnceLock::new();
#[tauri::command] #[tauri::command]
@@ -64,11 +65,13 @@ pub fn run() {
let ctx = tauri::generate_context!(); let ctx = tauri::generate_context!();
let mut app_builder = tauri::Builder::default(); let mut app_builder = tauri::Builder::default();
// Set up logger first
app_builder = app_builder.plugin(set_up_tauri_logger());
#[cfg(desktop)] #[cfg(desktop)]
{ {
app_builder = app_builder.plugin(tauri_plugin_single_instance::init(|_app, argv, _cwd| { app_builder = app_builder.plugin(tauri_plugin_single_instance::init(|_app, argv, _cwd| {
println!("a new app instance was opened with {argv:?} and the deep link event was already triggered"); log::debug!("a new app instance was opened with {argv:?} and the deep link event was already triggered");
// when defining deep link schemes at runtime, you must also check `argv` here // when defining deep link schemes at runtime, you must also check `argv` here
})); }));
} }
@@ -77,7 +80,7 @@ pub fn run() {
.plugin(tauri_plugin_http::init()) .plugin(tauri_plugin_http::init())
.plugin(tauri_plugin_shell::init()) .plugin(tauri_plugin_shell::init())
.plugin(tauri_plugin_autostart::init( .plugin(tauri_plugin_autostart::init(
MacosLauncher::AppleScript, MacosLauncher::LaunchAgent,
None, None,
)) ))
.plugin(tauri_plugin_deep_link::init()) .plugin(tauri_plugin_deep_link::init())
@@ -87,9 +90,13 @@ pub fn run() {
.plugin(tauri_plugin_macos_permissions::init()) .plugin(tauri_plugin_macos_permissions::init())
.plugin(tauri_plugin_screenshots::init()) .plugin(tauri_plugin_screenshots::init())
.plugin(tauri_plugin_process::init()) .plugin(tauri_plugin_process::init())
.plugin(tauri_plugin_updater::Builder::new().build()) .plugin(
tauri_plugin_updater::Builder::new()
.default_version_comparator(crate::util::updater::custom_version_comparator)
.build(),
)
.plugin(tauri_plugin_windows_version::init()) .plugin(tauri_plugin_windows_version::init())
.plugin(set_up_tauri_logger()); .plugin(tauri_plugin_opener::init());
// Conditional compilation for macOS // Conditional compilation for macOS
#[cfg(target_os = "macos")] #[cfg(target_os = "macos")]
@@ -107,7 +114,8 @@ pub fn run() {
show_coco, show_coco,
hide_coco, hide_coco,
show_settings, show_settings,
server::servers::get_server_token, show_check,
hide_check,
server::servers::add_coco_server, server::servers::add_coco_server,
server::servers::remove_coco_server, server::servers::remove_coco_server,
server::servers::list_coco_servers, server::servers::list_coco_servers,
@@ -123,7 +131,9 @@ pub fn run() {
search::query_coco_fusion, search::query_coco_fusion,
assistant::chat_history, assistant::chat_history,
assistant::new_chat, assistant::new_chat,
assistant::chat_create,
assistant::send_message, assistant::send_message,
assistant::chat_chat,
assistant::session_chat_history, assistant::session_chat_history,
assistant::open_session_chat, assistant::open_session_chat,
assistant::close_session_chat, assistant::close_session_chat,
@@ -131,6 +141,8 @@ pub fn run() {
assistant::delete_session_chat, assistant::delete_session_chat,
assistant::update_session_chat, assistant::update_session_chat,
assistant::assistant_search, assistant::assistant_search,
assistant::assistant_get,
assistant::assistant_get_multi,
// server::get_coco_server_datasources, // server::get_coco_server_datasources,
// server::get_coco_server_connectors, // server::get_coco_server_connectors,
server::websocket::connect_to_server, server::websocket::connect_to_server,
@@ -140,51 +152,87 @@ pub fn run() {
server::attachment::get_attachment, server::attachment::get_attachment,
server::attachment::delete_attachment, server::attachment::delete_attachment,
server::transcription::transcription, server::transcription::transcription,
util::open,
server::system_settings::get_system_settings, server::system_settings::get_system_settings,
simulate_mouse_click, extension::built_in::application::get_app_list,
local::get_disabled_local_query_sources, extension::built_in::application::get_app_search_path,
local::enable_local_query_source, extension::built_in::application::get_app_metadata,
local::disable_local_query_source, extension::built_in::application::add_app_search_path,
local::application::get_app_list, extension::built_in::application::remove_app_search_path,
local::application::get_app_search_path, extension::built_in::application::reindex_applications,
local::application::get_app_metadata, extension::list_extensions,
local::application::set_app_alias, extension::enable_extension,
local::application::register_app_hotkey, extension::disable_extension,
local::application::unregister_app_hotkey, extension::set_extension_alias,
local::application::disable_app_search, extension::register_extension_hotkey,
local::application::enable_app_search, extension::unregister_extension_hotkey,
local::application::add_app_search_path, extension::is_extension_enabled,
local::application::remove_app_search_path, extension::third_party::store::search_extension,
extension::third_party::store::install_extension_from_store,
extension::third_party::uninstall_extension,
settings::set_allow_self_signature, settings::set_allow_self_signature,
settings::get_allow_self_signature, settings::get_allow_self_signature,
assistant::ask_ai,
crate::common::document::open,
#[cfg(any(target_os = "macos", target_os = "windows"))]
extension::built_in::file_search::config::get_file_system_config,
#[cfg(any(target_os = "macos", target_os = "windows"))]
extension::built_in::file_search::config::set_file_system_config,
server::synthesize::synthesize,
util::file::get_file_icon,
util::app_lang::update_app_lang,
#[cfg(target_os = "macos")]
setup::toggle_move_to_active_space_attribute,
]) ])
.setup(|app| { .setup(|app| {
let app_handle = app.handle().clone(); let app_handle = app.handle().clone();
GLOBAL_TAURI_APP_HANDLE GLOBAL_TAURI_APP_HANDLE
.set(app_handle.clone()) .set(app_handle.clone())
.expect("variable already initialized"); .expect("global tauri AppHandle already initialized");
log::trace!("global Tauri AppHandle set");
#[cfg(target_os = "macos")]
{
log::trace!("hiding Dock icon on macOS");
app.set_activation_policy(tauri::ActivationPolicy::Accessory);
log::trace!("Dock icon should be hidden now");
}
let registry = SearchSourceRegistry::default(); let registry = SearchSourceRegistry::default();
app.manage(registry); // Store registry in Tauri's app state app.manage(registry); // Store registry in Tauri's app state
app.manage(server::websocket::WebSocketManager::default()); app.manage(server::websocket::WebSocketManager::default());
block_on(async { // This has to be called before initializing extensions as doing that
init(app.handle()).await; // requires access to the shortcut store, which will be set by this
}); // function.
shortcut::enable_shortcut(app); shortcut::enable_shortcut(app);
enable_autostart(app); block_on(async {
init(app.handle()).await;
#[cfg(target_os = "macos")] // We want all the extensions here, so no filter condition specified.
app.set_activation_policy(ActivationPolicy::Accessory); match extension::list_extensions(app_handle.clone(), None, None, false).await {
Ok((_found_invalid_extensions, extensions)) => {
// Initializing extension relies on SearchSourceRegistry, so this should
// be executed after `app.manage(registry)`
if let Err(e) =
extension::init_extensions(app_handle.clone(), extensions).await
{
log::error!("initializing extensions failed with error [{}]", e);
}
}
Err(e) => {
log::error!("listing extensions failed with error [{}]", e);
}
}
});
ensure_autostart_state_consistent(app)?;
// app.listen("theme-changed", move |event| { // app.listen("theme-changed", move |event| {
// if let Ok(payload) = serde_json::from_str::<ThemeChangedPayload>(event.payload()) { // if let Ok(payload) = serde_json::from_str::<ThemeChangedPayload>(event.payload()) {
// // switch_tray_icon(app.app_handle(), payload.is_dark_mode); // // switch_tray_icon(app.app_handle(), payload.is_dark_mode);
// println!("Theme changed: is_dark_mode = {}", payload.is_dark_mode); // log::debug!("Theme changed: is_dark_mode = {}", payload.is_dark_mode);
// } // }
// }); // });
@@ -204,13 +252,19 @@ pub fn run() {
let main_window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap(); let main_window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
let settings_window = app.get_webview_window(SETTINGS_WINDOW_LABEL).unwrap(); let settings_window = app.get_webview_window(SETTINGS_WINDOW_LABEL).unwrap();
setup::default(app, main_window.clone(), settings_window.clone()); let check_window = app.get_webview_window(CHECK_WINDOW_LABEL).unwrap();
setup::default(
app,
main_window.clone(),
settings_window.clone(),
check_window.clone(),
);
Ok(()) Ok(())
}) })
.on_window_event(|window, event| match event { .on_window_event(|window, event| match event {
WindowEvent::CloseRequested { api, .. } => { WindowEvent::CloseRequested { api, .. } => {
dbg!("Close requested event received"); //dbg!("Close requested event received");
window.hide().unwrap(); window.hide().unwrap();
api.prevent_close(); api.prevent_close();
} }
@@ -225,10 +279,10 @@ pub fn run() {
has_visible_windows, has_visible_windows,
.. ..
} => { } => {
dbg!( // dbg!(
"Reopen event received: has_visible_windows = {}", // "Reopen event received: has_visible_windows = {}",
has_visible_windows // has_visible_windows
); // );
if has_visible_windows { if has_visible_windows {
return; return;
} }
@@ -242,14 +296,14 @@ pub fn run() {
pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) { pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
// Await the async functions to load the servers and tokens // Await the async functions to load the servers and tokens
if let Err(err) = load_or_insert_default_server(app_handle).await { if let Err(err) = load_or_insert_default_server(app_handle).await {
eprintln!("Failed to load servers: {}", err); log::error!("Failed to load servers: {}", err);
} }
if let Err(err) = load_servers_token(app_handle).await { if let Err(err) = load_servers_token(app_handle).await {
eprintln!("Failed to load server tokens: {}", err); log::error!("Failed to load server tokens: {}", err);
} }
let coco_servers = server::servers::get_all_servers(); let coco_servers = server::servers::get_all_servers().await;
// Get the registry from Tauri's state // Get the registry from Tauri's state
// let registry: State<SearchSourceRegistry> = app_handle.state::<SearchSourceRegistry>(); // let registry: State<SearchSourceRegistry> = app_handle.state::<SearchSourceRegistry>();
@@ -259,12 +313,12 @@ pub async fn init<R: Runtime>(app_handle: &AppHandle<R>) {
.await; .await;
} }
local::start_pizza_engine_runtime(); extension::built_in::pizza_engine_runtime::start_pizza_engine_runtime().await;
} }
#[tauri::command] #[tauri::command]
async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) { async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) {
if let Some(window) = app_handle.get_window(MAIN_WINDOW_LABEL) { if let Some(window) = app_handle.get_webview_window(MAIN_WINDOW_LABEL) {
move_window_to_active_monitor(&window); move_window_to_active_monitor(&window);
let _ = window.show(); let _ = window.show();
@@ -277,24 +331,24 @@ async fn show_coco<R: Runtime>(app_handle: AppHandle<R>) {
#[tauri::command] #[tauri::command]
async fn hide_coco<R: Runtime>(app: AppHandle<R>) { async fn hide_coco<R: Runtime>(app: AppHandle<R>) {
if let Some(window) = app.get_window(MAIN_WINDOW_LABEL) { if let Some(window) = app.get_webview_window(MAIN_WINDOW_LABEL) {
if let Err(err) = window.hide() { if let Err(err) = window.hide() {
eprintln!("Failed to hide the window: {}", err); log::error!("Failed to hide the window: {}", err);
} else { } else {
println!("Window successfully hidden."); log::debug!("Window successfully hidden.");
} }
} else { } else {
eprintln!("Main window not found."); log::error!("Main window not found.");
} }
} }
fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) { fn move_window_to_active_monitor<R: Runtime>(window: &WebviewWindow<R>) {
dbg!("Moving window to active monitor"); //dbg!("Moving window to active monitor");
// Try to get the available monitors, handle failure gracefully // Try to get the available monitors, handle failure gracefully
let available_monitors = match window.available_monitors() { let available_monitors = match window.available_monitors() {
Ok(monitors) => monitors, Ok(monitors) => monitors,
Err(e) => { Err(e) => {
eprintln!("Failed to get monitors: {}", e); log::error!("Failed to get monitors: {}", e);
return; return;
} }
}; };
@@ -303,7 +357,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
let cursor_position = match window.cursor_position() { let cursor_position = match window.cursor_position() {
Ok(pos) => Some(pos), Ok(pos) => Some(pos),
Err(e) => { Err(e) => {
eprintln!("Failed to get cursor position: {}", e); log::error!("Failed to get cursor position: {}", e);
None None
} }
}; };
@@ -332,7 +386,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
let monitor = match target_monitor.or_else(|| window.primary_monitor().ok().flatten()) { let monitor = match target_monitor.or_else(|| window.primary_monitor().ok().flatten()) {
Some(monitor) => monitor, Some(monitor) => monitor,
None => { None => {
eprintln!("No monitor found!"); log::error!("No monitor found!");
return; return;
} }
}; };
@@ -342,7 +396,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
if let Some(ref prev_name) = *previous_monitor_name { if let Some(ref prev_name) = *previous_monitor_name {
if name.to_string() == *prev_name { if name.to_string() == *prev_name {
println!("Currently on the same monitor"); log::debug!("Currently on the same monitor");
return; return;
} }
@@ -356,7 +410,7 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
let window_size = match window.inner_size() { let window_size = match window.inner_size() {
Ok(size) => size, Ok(size) => size,
Err(e) => { Err(e) => {
eprintln!("Failed to get window size: {}", e); log::error!("Failed to get window size: {}", e);
return; return;
} }
}; };
@@ -370,52 +424,19 @@ fn move_window_to_active_monitor<R: Runtime>(window: &Window<R>) {
// Move the window to the new position // Move the window to the new position
if let Err(e) = window.set_position(PhysicalPosition::new(window_x, window_y)) { if let Err(e) = window.set_position(PhysicalPosition::new(window_x, window_y)) {
eprintln!("Failed to move window: {}", e); log::error!("Failed to move window: {}", e);
} }
if let Some(name) = monitor.name() { if let Some(name) = monitor.name() {
println!("Window moved to monitor: {}", name); log::debug!("Window moved to monitor: {}", name);
let mut previous_monitor = PREVIOUS_MONITOR_NAME.lock().unwrap(); let mut previous_monitor = PREVIOUS_MONITOR_NAME.lock().unwrap();
*previous_monitor = Some(name.to_string()); *previous_monitor = Some(name.to_string());
} }
} }
#[allow(dead_code)]
fn open_settings(app: &tauri::AppHandle) {
use tauri::webview::WebviewBuilder;
println!("settings menu item was clicked");
let window = app.get_webview_window("settings");
if let Some(window) = window {
let _ = window.show();
let _ = window.unminimize();
let _ = window.set_focus();
} else {
let window = tauri::window::WindowBuilder::new(app, "settings")
.title("Settings Window")
.fullscreen(false)
.resizable(false)
.minimizable(false)
.maximizable(false)
.inner_size(800.0, 600.0)
.build()
.unwrap();
let webview_builder =
WebviewBuilder::new("settings", tauri::WebviewUrl::App("/ui/settings".into()));
let _webview = window
.add_child(
webview_builder,
tauri::LogicalPosition::new(0, 0),
window.inner_size().unwrap(),
)
.unwrap();
}
}
#[tauri::command] #[tauri::command]
async fn get_app_search_source<R: Runtime>(app_handle: AppHandle<R>) -> Result<(), String> { async fn get_app_search_source(app_handle: AppHandle) -> Result<(), String> {
local::init_local_search_source(&app_handle).await?;
let _ = server::connector::refresh_all_connectors(&app_handle).await; let _ = server::connector::refresh_all_connectors(&app_handle).await;
let _ = server::datasource::refresh_all_datasources(&app_handle).await; let _ = server::datasource::refresh_all_datasources(&app_handle).await;
@@ -424,53 +445,36 @@ async fn get_app_search_source<R: Runtime>(app_handle: AppHandle<R>) -> Result<(
#[tauri::command] #[tauri::command]
async fn show_settings(app_handle: AppHandle) { async fn show_settings(app_handle: AppHandle) {
open_settings(&app_handle); log::debug!("settings menu item was clicked");
let window = app_handle
.get_webview_window(SETTINGS_WINDOW_LABEL)
.expect("we have a settings window");
window.show().unwrap();
window.unminimize().unwrap();
window.set_focus().unwrap();
} }
#[tauri::command] #[tauri::command]
async fn simulate_mouse_click<R: Runtime>(window: WebviewWindow<R>, is_chat_mode: bool) { async fn show_check(app_handle: AppHandle) {
#[cfg(target_os = "windows")] log::debug!("check menu item was clicked");
{ let window = app_handle
use enigo::{Button, Coordinate, Direction, Enigo, Mouse, Settings}; .get_webview_window(CHECK_WINDOW_LABEL)
use std::{thread, time::Duration}; .expect("we have a check window");
if let Ok(mut enigo) = Enigo::new(&Settings::default()) { window.show().unwrap();
// Save the current mouse position window.unminimize().unwrap();
if let Ok((original_x, original_y)) = enigo.location() { window.set_focus().unwrap();
// Retrieve the window's outer position (top-left corner) }
if let Ok(position) = window.outer_position() {
// Retrieve the window's inner size (client area)
if let Ok(size) = window.inner_size() {
// Calculate the center position of the title bar
let x = position.x + (size.width as i32 / 2);
let y = if is_chat_mode {
position.y + size.height as i32 - 50
} else {
position.y + 30
};
// Move the mouse cursor to the calculated position #[tauri::command]
if enigo.move_mouse(x, y, Coordinate::Abs).is_ok() { async fn hide_check(app_handle: AppHandle) {
// // Simulate a left mouse click log::debug!("check window was closed");
let _ = enigo.button(Button::Left, Direction::Click); let window = &app_handle
// let _ = enigo.button(Button::Left, Direction::Release); .get_webview_window(CHECK_WINDOW_LABEL)
.expect("we have a check window");
thread::sleep(Duration::from_millis(100)); window.hide().unwrap();
// Move the mouse cursor back to the original position
let _ = enigo.move_mouse(original_x, original_y, Coordinate::Abs);
}
}
}
}
}
}
#[cfg(not(target_os = "windows"))]
{
let _ = window;
let _ = is_chat_mode;
}
} }
/// Log format: /// Log format:
@@ -487,6 +491,12 @@ async fn simulate_mouse_click<R: Runtime>(window: WebviewWindow<R>, is_chat_mode
/// ``` /// ```
fn set_up_tauri_logger() -> TauriPlugin<tauri::Wry> { fn set_up_tauri_logger() -> TauriPlugin<tauri::Wry> {
use log::Level; use log::Level;
use log::LevelFilter;
use tauri_plugin_log::Builder;
/// Coco-AI app's default log level.
const DEFAULT_LOG_LEVEL: LevelFilter = LevelFilter::Info;
const LOG_LEVEL_ENV_VAR: &str = "COCO_LOG";
fn format_log_level(level: Level) -> &'static str { fn format_log_level(level: Level) -> &'static str {
match level { match level {
@@ -508,16 +518,93 @@ fn set_up_tauri_logger() -> TauriPlugin<tauri::Wry> {
str str
} }
tauri_plugin_log::Builder::new() /// Allow us to configure dynamic log levels via environment variable `COCO_LOG`.
.format(|out, message, record| { ///
let now = chrono::Local::now().format("%m-%d %H:%M:%S"); /// Generally, it mirrors the behavior of `env_logger`. Syntax: `COCO_LOG=[target][=][level][,...]`
let level = format_log_level(record.level()); ///
let target_and_line = format_target_and_line(record); /// * If this environment variable is not set, use the default log level.
out.finish(format_args!( /// * If it is set, respect it:
"[{}] [{}] [{}] {}", ///
now, level, target_and_line, message /// * `COCO_LOG=coco_lib` turns on all logging for the `coco_lib` module, which is
)); /// equivalent to `COCO_LOG=coco_lib=trace`
}) /// * `COCO_LOG=trace` turns on all logging for the application, regardless of its name
.level(log::LevelFilter::Debug) /// * `COCO_LOG=TRACE` turns on all logging for the application, regardless of its name (same as previous)
.build() /// * `COCO_LOG=reqwest=debug` turns on debug logging for `reqwest`
/// * `COCO_LOG=trace,tauri=off` turns on all logging except for logs that come from `tauri`
/// * `COCO_LOG=off` turns off all logging for the application
/// * `COCO_LOG=` (an empty value) turns off all logging for the application as well
fn dynamic_log_level(mut builder: Builder) -> Builder {
let Some(log_levels) = std::env::var_os(LOG_LEVEL_ENV_VAR) else {
return builder.level(DEFAULT_LOG_LEVEL);
};
builder = builder.level(LevelFilter::Off);
let log_levels = log_levels.into_string().unwrap_or_else(|e| {
panic!(
"The value '{}' set in environment varaible '{}' is not UTF-8 encoded",
// Cannot use `.display()` here becuase that requires MSRV 1.87.0
e.to_string_lossy(),
LOG_LEVEL_ENV_VAR
)
});
// COCO_LOG=[target][=][level][,...]
let target_log_levels = log_levels.split(',');
for target_log_level in target_log_levels {
#[allow(clippy::collapsible_else_if)]
if let Some(char_index) = target_log_level.chars().position(|c| c == '=') {
let (target, equal_sign_and_level) = target_log_level.split_at(char_index);
// Remove the equal sign, we know it takes 1 byte
let level = &equal_sign_and_level[1..];
if let Ok(level) = level.parse::<LevelFilter>() {
// Here we have to call `.to_string()` because `Cow<'static, str>` requires `&'static str`
builder = builder.level_for(target.to_string(), level);
} else {
panic!(
"log level '{}' set in '{}={}' is invalid",
level, target, level
);
}
} else {
if let Ok(level) = target_log_level.parse::<LevelFilter>() {
// This is a level
builder = builder.level(level);
} else {
// This is a target, enable all the logging
//
// Here we have to call `.to_string()` because `Cow<'static, str>` requires `&'static str`
builder = builder.level_for(target_log_level.to_string(), LevelFilter::Trace);
}
}
}
builder
}
// When running the built binary, set `COCO_LOG` to `coco_lib=trace` to capture all logs
// that come from Coco in the log file, which helps with debugging.
if !tauri::is_dev() {
// We have absolutely no guarantee that nothing will concurrently read/write
// `envp` (we control the Rust code, but not the libc C code or the shared
// objects we link against), so this call has to be `unsafe`.
unsafe {
std::env::set_var("COCO_LOG", "coco_lib=trace");
}
}
let mut builder = tauri_plugin_log::Builder::new();
builder = builder.format(|out, message, record| {
let now = chrono::Local::now().format("%m-%d %H:%M:%S");
let level = format_log_level(record.level());
let target_and_line = format_target_and_line(record);
out.finish(format_args!(
"[{}] [{}] [{}] {}",
now, level, target_and_line, message
));
});
builder = dynamic_log_level(builder);
builder.build()
} }
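A minimal, standalone sketch of the `COCO_LOG` parsing behaviour documented above; the `target[=level]` syntax comes from the code, while the function name, return shape, and simplified error handling are illustrative only:

use log::LevelFilter;

// Splits a spec such as "trace,tauri=off" into (optional target, level) pairs,
// loosely mirroring the branches in `dynamic_log_level` above.
fn parse_coco_log(spec: &str) -> Vec<(Option<String>, LevelFilter)> {
    spec.split(',')
        .filter(|directive| !directive.is_empty())
        .map(|directive| match directive.split_once('=') {
            // "reqwest=debug" -> per-target level
            Some((target, level)) => (
                Some(target.to_string()),
                level.parse().unwrap_or(LevelFilter::Off),
            ),
            // "trace" -> global level; "coco_lib" -> target enabled at Trace
            None => match directive.parse::<LevelFilter>() {
                Ok(level) => (None, level),
                Err(_) => (Some(directive.to_string()), LevelFilter::Trace),
            },
        })
        .collect()
}

// parse_coco_log("trace,tauri=off") yields [(None, Trace), (Some("tauri"), Off)].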


@@ -1,164 +0,0 @@
pub mod application;
pub mod calculator;
pub mod file_system;
use std::any::Any;
use std::collections::hash_map::Entry;
use std::collections::HashMap;
use std::sync::OnceLock;
use crate::common::register::SearchSourceRegistry;
use serde_json::Value as Json;
use tauri::{AppHandle, Manager, Runtime};
use tauri_plugin_store::StoreExt;
pub const LOCAL_QUERY_SOURCE_TYPE: &str = "local";
pub const TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE: &str = "local_query_source_enabled_state";
trait SearchSourceState {
#[cfg_attr(not(feature = "use_pizza_engine"), allow(unused))]
fn as_mut_any(&mut self) -> &mut dyn Any;
}
#[async_trait::async_trait(?Send)]
trait Task: Send + Sync {
fn search_source_id(&self) -> &'static str;
async fn exec(&mut self, state: &mut Option<Box<dyn SearchSourceState>>);
}
static RUNTIME_TX: OnceLock<tokio::sync::mpsc::UnboundedSender<Box<dyn Task>>> = OnceLock::new();
pub(crate) fn start_pizza_engine_runtime() {
std::thread::spawn(|| {
let rt = tokio::runtime::Runtime::new().unwrap();
let main = async {
let mut states: HashMap<String, Option<Box<dyn SearchSourceState>>> = HashMap::new();
let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel();
RUNTIME_TX.set(tx).unwrap();
while let Some(mut task) = rx.recv().await {
let opt_search_source_state = match states.entry(task.search_source_id().into()) {
Entry::Occupied(o) => o.into_mut(),
Entry::Vacant(v) => v.insert(None),
};
task.exec(opt_search_source_state).await;
}
};
rt.block_on(main);
});
}
pub(crate) async fn init_local_search_source<R: Runtime>(
app_handle: &AppHandle<R>,
) -> Result<(), String> {
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.map_err(|e| e.to_string())?;
if enabled_status_store.is_empty() {
enabled_status_store.set(
application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME,
Json::Bool(true),
);
enabled_status_store.set(calculator::DATA_SOURCE_ID, Json::Bool(true));
}
let registry = app_handle.state::<SearchSourceRegistry>();
application::ApplicationSearchSource::init(app_handle.clone()).await?;
for (id, enabled) in enabled_status_store.entries() {
let enabled = match enabled {
Json::Bool(b) => b,
_ => unreachable!("enabled state should be stored as a boolean"),
};
if enabled {
if id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
registry
.register_source(application::ApplicationSearchSource)
.await;
}
if id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
registry.register_source(calculator_search).await;
}
}
}
Ok(())
}
#[tauri::command]
pub async fn get_disabled_local_query_sources<R: Runtime>(app_handle: AppHandle<R>) -> Vec<String> {
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.unwrap_or_else(|e| {
panic!(
"tauri store [{}] should exist and be loaded, but that's not true due to error [{}]",
TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE, e
)
});
let mut disabled_local_query_sources = Vec::new();
for (id, enabled) in enabled_status_store.entries() {
let enabled = match enabled {
Json::Bool(b) => b,
_ => unreachable!("enabled state should be stored as a boolean"),
};
if !enabled {
disabled_local_query_sources.push(id);
}
}
disabled_local_query_sources
}
#[tauri::command]
pub async fn enable_local_query_source<R: Runtime>(
app_handle: AppHandle<R>,
query_source_id: String,
) {
let registry = app_handle.state::<SearchSourceRegistry>();
if query_source_id == application::QUERYSOURCE_ID_DATASOURCE_ID_DATASOURCE_NAME {
let application_search = application::ApplicationSearchSource;
registry.register_source(application_search).await;
}
if query_source_id == calculator::DATA_SOURCE_ID {
let calculator_search = calculator::CalculatorSource::new(2000f64);
registry.register_source(calculator_search).await;
}
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.unwrap_or_else(|e| {
panic!(
"tauri store [{}] should exist and be loaded, but that's not true due to error [{}]",
TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE, e
)
});
enabled_status_store.set(query_source_id, Json::Bool(true));
}
#[tauri::command]
pub async fn disable_local_query_source<R: Runtime>(
app_handle: AppHandle<R>,
query_source_id: String,
) {
let registry = app_handle.state::<SearchSourceRegistry>();
registry.remove_source(&query_source_id).await;
let enabled_status_store = app_handle
.store(TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE)
.unwrap_or_else(|e| {
panic!(
"tauri store [{}] should exist and be loaded, but that's not true due to error [{}]",
TAURI_STORE_LOCAL_QUERY_SOURCE_ENABLED_STATE, e
)
});
enabled_status_store.set(query_source_id, Json::Bool(false));
}


@@ -1,5 +1,112 @@
// Prevents additional console window on Windows in release, DO NOT REMOVE!! // Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")] #![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
use std::fs::OpenOptions;
use std::io::Write;
use std::path::PathBuf;
/// Helper function to return the log directory.
///
/// This should return the same value as `tauri_app_handle.path().app_log_dir().unwrap()`.
fn app_log_dir() -> PathBuf {
// This function `app_log_dir()` is for the panic hook, which should be set
// before Tauri performs any initialization. At that point, we do not have
// access to the identifier provided by Tauri, so we need to define our own
// one here.
//
// NOTE: If you update the identifier in the following files, update this one
// as well!
//
// src-tauri/tauri.linux.conf.json
// src-tauri/Entitlements.plist
// src-tauri/tauri.conf.json
// src-tauri/Info.plist
const IDENTIFIER: &str = "rs.coco.app";
#[cfg(target_os = "macos")]
let path = dirs::home_dir()
.expect("cannot find the home directory, Coco should never run in such a environment")
.join("Library/Logs")
.join(IDENTIFIER);
#[cfg(not(target_os = "macos"))]
let path = dirs::data_local_dir()
.expect("app local dir is None, we should not encounter this")
.join(IDENTIFIER)
.join("logs");
path
}
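For reference, the typical locations this resolves to (default behaviour of the `dirs` crate; actual paths depend on the user's environment):

//   macOS   -> ~/Library/Logs/rs.coco.app
//   Linux   -> ~/.local/share/rs.coco.app/logs   (or $XDG_DATA_HOME/rs.coco.app/logs)
//   Windows -> C:\Users\<user>\AppData\Local\rs.coco.app\logs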
/// Set up panic hook to log panic information to a file
fn setup_panic_hook() {
std::panic::set_hook(Box::new(|panic_info| {
let timestamp = chrono::Local::now();
// "%Y-%m-%d %H:%M:%S"
//
// I would like to use the above format, but Windows does not allow that
// and complains with OS error 123.
let datetime_str = timestamp.format("%Y-%m-%d-%H-%M-%S").to_string();
let log_dir = app_log_dir();
// Ensure the log directory exists
if let Err(e) = std::fs::create_dir_all(&log_dir) {
eprintln!("Panic hook error: failed to create log directory: {}", e);
return;
}
let panic_file = log_dir.join(format!("{}_rust_panic.log", datetime_str));
// Prepare panic information
let panic_message = if let Some(s) = panic_info.payload().downcast_ref::<&str>() {
s.to_string()
} else if let Some(s) = panic_info.payload().downcast_ref::<String>() {
s.clone()
} else {
"Unknown panic message".to_string()
};
let location = if let Some(location) = panic_info.location() {
format!(
"{}:{}:{}",
location.file(),
location.line(),
location.column()
)
} else {
"Unknown location".to_string()
};
// Use `force_capture()` instead of `capture()` as we want backtrace
// regardless of whether the corresponding env vars are set or not.
let backtrace = std::backtrace::Backtrace::force_capture();
let panic_log = format!(
"Time: [{}]\nLocation: [{}]\nMessage: [{}]\nBacktrace: \n{}",
datetime_str, location, panic_message, backtrace
);
// Write to panic file
match OpenOptions::new()
.create(true)
.append(true)
.open(&panic_file)
{
Ok(mut file) => {
if let Err(e) = writeln!(file, "{}", panic_log) {
eprintln!("Panic hook error: Failed to write panic to file: {}", e);
}
}
Err(e) => {
eprintln!("Panic hook error: Failed to open panic log file: {}", e);
}
}
}));
}
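For orientation, a panic captured by this hook ends up appended to a file such as the following (directory and timestamp are illustrative):

//   <app log dir>/2025-01-02-03-04-05_rust_panic.log
// with the Time / Location / Message / Backtrace sections laid out by the
// `panic_log` format string above.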
fn main() { fn main() {
// Panic hook setup should be the first thing to do, everything could panic!
setup_panic_hook();
coco_lib::run(); coco_lib::run();
} }


@@ -1,130 +1,256 @@
use crate::common::error::SearchError; use crate::common::error::SearchError;
use crate::common::register::SearchSourceRegistry; use crate::common::register::SearchSourceRegistry;
use crate::common::search::{ use crate::common::search::{
FailedRequest, MultiSourceQueryResponse, QueryHits, QuerySource, SearchQuery, FailedRequest, MultiSourceQueryResponse, QueryHits, QueryResponse, QuerySource, SearchQuery,
}; };
use futures::stream::FuturesUnordered; use crate::common::traits::SearchSource;
use crate::server::servers::logout_coco_server;
use crate::server::servers::mark_server_as_offline;
use function_name::named;
use futures::StreamExt; use futures::StreamExt;
use futures::stream::FuturesUnordered;
use reqwest::StatusCode;
use std::cmp::Reverse;
use std::collections::HashMap; use std::collections::HashMap;
use std::collections::HashSet; use std::collections::HashSet;
use tauri::{AppHandle, Manager, Runtime}; use std::future::Future;
use tokio::time::{timeout, Duration}; use std::sync::Arc;
use tauri::{AppHandle, Manager};
use tokio::time::error::Elapsed;
use tokio::time::{Duration, timeout};
/// Helper function to return the Future used for querying querysources.
///
/// It is a workaround for the limitations:
///
/// 1. Two async blocks have different types in Rust's type system even though
/// they are literally the same
/// 2. `futures::stream::FuturesUnordered` requires all the `Future`s pushed to it
/// to have a single type
///
/// Putting the async block in a function to unify the types.
fn same_type_futures(
query_source: QuerySource,
query_source_trait_object: Arc<dyn SearchSource>,
timeout_duration: Duration,
search_query: SearchQuery,
tauri_app_handle: AppHandle,
) -> impl Future<
Output = (
QuerySource,
Result<Result<QueryResponse, SearchError>, Elapsed>,
),
> + 'static {
async move {
(
// Store `query_source` as part of future for debugging purposes.
query_source,
timeout(timeout_duration, async {
query_source_trait_object
.search(tauri_app_handle.clone(), search_query)
.await
})
.await,
)
}
}
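A small, self-contained illustration (not taken from the codebase) of the limitation the helper above works around: two distinct inline async blocks are different anonymous types, so routing every query through one function gives `FuturesUnordered` a single concrete `Future` type to hold:

use futures::stream::{FuturesUnordered, StreamExt};
use std::future::Future;

// Every call returns the *same* opaque type, so the results can share one collection;
// two textually different inline `async` blocks could not.
fn make_future(id: u32) -> impl Future<Output = u32> {
    async move { id }
}

async fn drain_all() -> Vec<u32> {
    let mut futures = FuturesUnordered::new();
    for id in 0..3 {
        futures.push(make_future(id)); // all pushed futures share the fn's return type
    }
    let mut finished = Vec::new();
    while let Some(id) = futures.next().await {
        finished.push(id);
    }
    finished
}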
#[named]
#[tauri::command] #[tauri::command]
pub async fn query_coco_fusion<R: Runtime>( pub async fn query_coco_fusion(
app_handle: AppHandle<R>, app_handle: AppHandle,
from: u64, from: u64,
size: u64, size: u64,
query_strings: HashMap<String, String>, query_strings: HashMap<String, String>,
query_timeout: u64, query_timeout: u64,
) -> Result<MultiSourceQueryResponse, SearchError> { ) -> Result<MultiSourceQueryResponse, SearchError> {
let query_source_to_search = query_strings.get("querysource"); let query_keyword = query_strings
.get("query")
.unwrap_or(&"".to_string())
.clone();
let opt_query_source_id = query_strings.get("querysource");
let search_sources = app_handle.state::<SearchSourceRegistry>(); let search_sources = app_handle.state::<SearchSourceRegistry>();
let sources_future = search_sources.get_sources(); let sources_future = search_sources.get_sources();
let mut futures = FuturesUnordered::new(); let mut futures = FuturesUnordered::new();
let mut sources = HashMap::new();
let sources_list = sources_future.await; let mut sources_list = sources_future.await;
let sources_list_len = sources_list.len();
// Time limit for each query // Time limit for each query
let timeout_duration = Duration::from_millis(query_timeout); let timeout_duration = Duration::from_millis(query_timeout);
// Push all queries into futures log::debug!(
for query_source in sources_list { "{}() invoked with parameters: from: [{}], size: [{}], query_strings: [{:?}], timeout: [{:?}]",
let query_source_type = query_source.get_type().clone(); function_name!(),
from,
size,
query_strings,
timeout_duration
);
if let Some(query_source_to_search) = query_source_to_search { let search_query = SearchQuery::new(from, size, query_strings.clone());
// We should not search this data source
if &query_source_type.id != query_source_to_search { if let Some(query_source_id) = opt_query_source_id {
continue; // If this query source ID is specified, we only query this query source.
} log::debug!(
"parameter [querysource={}] specified, will only query this querysource",
query_source_id
);
let opt_query_source_trait_object_index = sources_list
.iter()
.position(|query_source| &query_source.get_type().id == query_source_id);
let Some(query_source_trait_object_index) = opt_query_source_trait_object_index else {
// It is possible (an edge case) that the frontend invokes `query_coco_fusion()` with a
// datasource that does not exist in the source list:
//
// 1. Search applications
// 2. Navigate to the application sub page
// 3. Disable the application extension in settings
// 4. hide the search window
// 5. Re-open the search window and search for something
//
// The application search source is not in the source list because the extension
// has been disabled, but the last search is indeed invoked with parameter
// `datasource=application`.
return Ok(MultiSourceQueryResponse {
failed: Vec::new(),
hits: Vec::new(),
total_hits: 0,
});
};
let query_source_trait_object = sources_list.remove(query_source_trait_object_index);
let query_source = query_source_trait_object.get_type();
futures.push(same_type_futures(
query_source,
query_source_trait_object,
timeout_duration,
search_query,
app_handle.clone(),
));
} else {
log::debug!(
"will query querysources {:?}",
sources_list
.iter()
.map(|search_source| search_source.get_type().id.clone())
.collect::<Vec<String>>()
);
for query_source_trait_object in sources_list {
let query_source = query_source_trait_object.get_type().clone();
futures.push(same_type_futures(
query_source,
query_source_trait_object,
timeout_duration,
search_query.clone(),
app_handle.clone(),
));
} }
sources.insert(query_source_type.id.clone(), query_source_type);
let query = SearchQuery::new(from, size, query_strings.clone());
let query_source_clone = query_source.clone(); // Clone Arc to avoid ownership issues
futures.push(tokio::spawn(async move {
// Timeout each query execution
timeout(timeout_duration, async {
query_source_clone.search(query).await
})
.await
}));
} }
let mut total_hits = 0; let mut total_hits = 0;
let mut need_rerank = true; //TODO set default to false when boost supported in Pizza
let mut failed_requests = Vec::new(); let mut failed_requests = Vec::new();
let mut all_hits: Vec<(String, QueryHits, f64)> = Vec::new(); let mut all_hits: Vec<(String, QueryHits, f64)> = Vec::new();
let mut hits_per_source: HashMap<String, Vec<(QueryHits, f64)>> = HashMap::new(); let mut hits_per_source: HashMap<String, Vec<(QueryHits, f64)>> = HashMap::new();
while let Some(result) = futures.next().await { if sources_list_len > 1 {
match result { need_rerank = true; // If we have more than one source, we need to rerank the hits
Ok(Ok(Ok(response))) => { }
total_hits += response.total_hits;
let source_id = response.source.id.clone();
for (doc, score) in response.hits { while let Some((query_source, timeout_result)) = futures.next().await {
let query_hit = QueryHits { match timeout_result {
source: Some(response.source.clone()), // Ignore the `_timeout` variable as it won't provide any useful debugging information.
score, Err(_timeout) => {
document: doc, log::warn!(
}; "searching query source [{}] timed out, skip this request",
query_source.id
);
// failed_requests.push(FailedRequest {
// source: query_source,
// status: 0,
// error: Some("querying timed out".into()),
// reason: None,
// });
}
Ok(query_result) => match query_result {
Ok(response) => {
total_hits += response.total_hits;
let source_id = response.source.id.clone();
all_hits.push((source_id.clone(), query_hit.clone(), score)); for (doc, score) in response.hits {
log::debug!("doc: {}, {:?}, {}", doc.id, doc.title, score);
hits_per_source let query_hit = QueryHits {
.entry(source_id.clone()) source: Some(response.source.clone()),
.or_insert_with(Vec::new) score,
.push((query_hit, score)); document: doc,
};
all_hits.push((source_id.clone(), query_hit.clone(), score));
hits_per_source
.entry(source_id.clone())
.or_insert_with(Vec::new)
.push((query_hit, score));
}
} }
} Err(search_error) => {
Ok(Ok(Err(err))) => { log::error!(
failed_requests.push(FailedRequest { "searching query source [{}] failed, error [{}]",
source: QuerySource { query_source.id,
r#type: "N/A".into(), search_error
name: "N/A".into(), );
id: "N/A".into(),
}, let mut status_code_num: u16 = 0;
status: 0,
error: Some(err.to_string()), if let SearchError::HttpError {
reason: None, status_code: opt_status_code,
}); msg: _,
} } = search_error
Ok(Err(err)) => { {
failed_requests.push(FailedRequest { if let Some(status_code) = opt_status_code {
source: QuerySource { status_code_num = status_code.as_u16();
r#type: "N/A".into(), if status_code != StatusCode::OK {
name: "N/A".into(), if status_code == StatusCode::UNAUTHORIZED {
id: "N/A".into(), // This Coco server is unavailable. In addition to marking it as
}, // unavailable, we need to log out because the status code is 401.
status: 0, logout_coco_server(app_handle.clone(), query_source.id.clone()).await.unwrap_or_else(|e| {
error: Some(err.to_string()), panic!(
reason: None, "the search request to Coco server [id {}, name {}] failed with status code {}, the login token is invalid, we are trying to log out, but failed with error [{}]",
}); query_source.id, query_source.name, StatusCode::UNAUTHORIZED, e
} );
// Timeout reached, skip this request })
_ => { } else {
failed_requests.push(FailedRequest { // This Coco server is unavailable
source: QuerySource { mark_server_as_offline(app_handle.clone(), &query_source.id)
r#type: "N/A".into(), .await;
name: "N/A".into(), }
id: "N/A".into(), }
}, }
status: 0, }
error: Some(format!("{:?}", &result)),
reason: None, failed_requests.push(FailedRequest {
}); source: query_source,
} status: status_code_num,
error: Some(search_error.to_string()),
reason: None,
});
}
},
} }
} }
// Sort hits within each source by score (descending) // Sort hits within each source by score (descending)
for hits in hits_per_source.values_mut() { for hits in hits_per_source.values_mut() {
hits.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal)); hits.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Greater));
} }
let total_sources = hits_per_source.len(); let total_sources = hits_per_source.len();
@@ -140,16 +266,71 @@ pub async fn query_coco_fusion<R: Runtime>(
// Distribute hits fairly across sources // Distribute hits fairly across sources
for (_source_id, hits) in &mut hits_per_source { for (_source_id, hits) in &mut hits_per_source {
let take_count = hits.len().min(max_hits_per_source); let take_count = hits.len().min(max_hits_per_source);
for (doc, _) in hits.drain(0..take_count) { for (doc, score) in hits.drain(0..take_count) {
if !seen_docs.contains(&doc.document.id) { if !seen_docs.contains(&doc.document.id) {
seen_docs.insert(doc.document.id.clone()); seen_docs.insert(doc.document.id.clone());
log::debug!(
"collect doc: {}, {:?}, {}",
doc.document.id,
doc.document.title,
score
);
final_hits.push(doc); final_hits.push(doc);
} }
} }
} }
// If we still need more hits, take the highest-scoring remaining ones log::debug!("final hits: {:?}", final_hits.len());
if final_hits.len() < size as usize {
let mut unique_sources = HashSet::new();
for hit in &final_hits {
if let Some(source) = &hit.source {
if source.id != crate::extension::built_in::calculator::DATA_SOURCE_ID {
unique_sources.insert(&source.id);
}
}
}
log::debug!(
"unique non-calculator sources with hits: {:?}",
unique_sources
);
if unique_sources.is_empty() {
need_rerank = false; // No non-calculator source produced any hits, so reranking is unnecessary
}
if need_rerank && final_hits.len() > 1 {
// Precollect (index, title)
let titles_to_score: Vec<(usize, &str)> = final_hits
.iter()
.enumerate()
.filter_map(|(idx, hit)| {
let source = hit.source.as_ref()?;
let title = hit.document.title.as_deref()?;
if source.id != crate::extension::built_in::calculator::DATA_SOURCE_ID {
Some((idx, title))
} else {
None
}
})
.collect();
// Score them
let scored_hits = boosted_levenshtein_rerank(query_keyword.as_str(), titles_to_score);
// Sort descending by score
let mut scored_hits = scored_hits;
scored_hits.sort_by_key(|&(_, score)| Reverse((score * 1000.0) as u64));
// Apply new scores to final_hits
for (idx, score) in scored_hits.into_iter().take(size as usize) {
final_hits[idx].score = score;
}
} else if final_hits.len() < size as usize {
// If we still need more hits, take the highest-scoring remaining ones
let remaining_needed = size as usize - final_hits.len(); let remaining_needed = size as usize - final_hits.len();
// Sort all hits by score descending, removing duplicates by document ID // Sort all hits by score descending, removing duplicates by document ID
@@ -179,9 +360,45 @@ pub async fn query_coco_fusion<R: Runtime>(
.unwrap_or(std::cmp::Ordering::Equal) .unwrap_or(std::cmp::Ordering::Equal)
}); });
if final_hits.len() < 5 {
//TODO: Add a recommendation system to suggest more sources
log::info!(
"Less than 5 hits found, consider using recommendation to find more suggestions."
);
//local: recent history, local extensions
//remote: ai agents, quick links, other tasks, managed by server
}
Ok(MultiSourceQueryResponse { Ok(MultiSourceQueryResponse {
failed: failed_requests, failed: failed_requests,
hits: final_hits, hits: final_hits,
total_hits, total_hits,
}) })
} }
fn boosted_levenshtein_rerank(query: &str, titles: Vec<(usize, &str)>) -> Vec<(usize, f64)> {
use strsim::levenshtein;
let query_lower = query.to_lowercase();
titles
.into_iter()
.map(|(idx, title)| {
let mut score = 0.0;
if title.contains(query) {
score += 0.4;
} else if title.to_lowercase().contains(&query_lower) {
score += 0.2;
}
let dist = levenshtein(&query_lower, &title.to_lowercase());
let max_len = query_lower.len().max(title.len());
if max_len > 0 {
score += (1.0 - (dist as f64 / max_len as f64)) as f32;
}
(idx, score.min(1.0) as f64)
})
.collect()
}
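A worked example of the scoring above, with an invented query/title pair:

// Query "coco" against title "Coco App":
//   * "Coco App" does not contain "coco" verbatim, but its lowercased form does -> +0.2
//   * levenshtein("coco", "coco app") = 4, max_len = max(4, 8) = 8              -> +(1.0 - 4/8) = +0.5
//   * final score = 0.7 (then capped at 1.0 by `.min(1.0)`).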


@@ -15,42 +15,6 @@ pub struct UploadAttachmentResponse {
pub attachments: Vec<String>, pub attachments: Vec<String>,
} }
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentSource {
pub id: String,
pub created: String,
pub updated: String,
pub session: String,
pub name: String,
pub icon: String,
pub url: String,
pub size: u64,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentHit {
pub _index: String,
pub _type: Option<String>,
pub _id: String,
pub _score: Option<f64>,
pub _source: AttachmentSource,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct AttachmentHits {
pub total: Value,
pub max_score: Option<f64>,
pub hits: Option<Vec<AttachmentHit>>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct GetAttachmentResponse {
pub took: u32,
pub timed_out: bool,
pub _shards: Option<Value>,
pub hits: AttachmentHits,
}
#[derive(Debug, Serialize, Deserialize)] #[derive(Debug, Serialize, Deserialize)]
pub struct DeleteAttachmentResponse { pub struct DeleteAttachmentResponse {
pub _id: String, pub _id: String,
@@ -60,7 +24,6 @@ pub struct DeleteAttachmentResponse {
#[command] #[command]
pub async fn upload_attachment( pub async fn upload_attachment(
server_id: String, server_id: String,
session_id: String,
file_paths: Vec<PathBuf>, file_paths: Vec<PathBuf>,
) -> Result<UploadAttachmentResponse, String> { ) -> Result<UploadAttachmentResponse, String> {
let mut form = Form::new(); let mut form = Form::new();
@@ -82,10 +45,12 @@ pub async fn upload_attachment(
form = form.part("files", part); form = form.part("files", part);
} }
let server = get_server_by_id(&server_id).ok_or("Server not found")?; let server = get_server_by_id(&server_id)
let url = HttpClient::join_url(&server.endpoint, &format!("chat/{}/_upload", session_id)); .await
.ok_or("Server not found")?;
let url = HttpClient::join_url(&server.endpoint, &format!("attachment/_upload"));
let token = get_server_token(&server_id).await?; let token = get_server_token(&server_id).await;
let mut headers = HashMap::new(); let mut headers = HashMap::new();
if let Some(token) = token { if let Some(token) = token {
headers.insert("X-API-TOKEN".to_string(), token.access_token); headers.insert("X-API-TOKEN".to_string(), token.access_token);
@@ -107,20 +72,17 @@ pub async fn upload_attachment(
} }
#[command] #[command]
pub async fn get_attachment( pub async fn get_attachment(server_id: String, session_id: String) -> Result<Value, String> {
server_id: String, let mut query_params = Vec::new();
session_id: String, query_params.push(format!("session={}", session_id));
) -> Result<GetAttachmentResponse, String> {
let mut query_params = HashMap::new();
query_params.insert("session".to_string(), serde_json::Value::String(session_id));
let response = HttpClient::get(&server_id, "/attachment/_search", Some(query_params)) let response = HttpClient::get(&server_id, "/attachment/_search", Some(query_params))
.await .await
.map_err(|e| format!("Request error: {}", e))?; .map_err(|e| format!("Request error: {}", e))?;
let body = get_response_body_text(response).await?; let body = get_response_body_text(response).await?;
serde_json::from_str::<GetAttachmentResponse>(&body) serde_json::from_str::<Value>(&body)
.map_err(|e| format!("Failed to parse attachment response: {}", e)) .map_err(|e| format!("Failed to parse attachment response: {}", e))
} }


@@ -20,15 +20,15 @@ pub async fn handle_sso_callback<R: Runtime>(
code: String, code: String,
) -> Result<(), String> { ) -> Result<(), String> {
// Retrieve the server details using the server ID // Retrieve the server details using the server ID
let server = get_server_by_id(&server_id); let server = get_server_by_id(&server_id).await;
let expire_in = 3600; // TODO, need to update to actual expire_in value let expire_in = 3600; // TODO, need to update to actual expire_in value
if let Some(mut server) = server { if let Some(mut server) = server {
// Save the access token for the server // Save the access token for the server
let access_token = ServerAccessToken::new(server_id.clone(), code.clone(), expire_in); let access_token = ServerAccessToken::new(server_id.clone(), code.clone(), expire_in);
// dbg!(&server_id, &request_id, &code, &token); // dbg!(&server_id, &request_id, &code, &token);
save_access_token(server_id.clone(), access_token); save_access_token(server_id.clone(), access_token).await;
persist_servers_token(&app_handle)?; persist_servers_token(&app_handle).await?;
// Register the server to the search source // Register the server to the search source
try_register_server_to_search_source(app_handle.clone(), &server).await; try_register_server_to_search_source(app_handle.clone(), &server).await;
@@ -41,7 +41,7 @@ pub async fn handle_sso_callback<R: Runtime>(
Ok(p) => { Ok(p) => {
server.profile = Some(p); server.profile = Some(p);
server.available = true; server.available = true;
save_server(&server); save_server(&server).await;
persist_servers(&app_handle).await?; persist_servers(&app_handle).await?;
Ok(()) Ok(())
} }


@@ -1,7 +1,8 @@
use crate::common::connector::Connector; use crate::common::connector::Connector;
use crate::common::search::parse_search_results; use crate::common::search::parse_search_results;
use crate::server::http_client::HttpClient; use crate::server::http_client::{HttpClient, status_code_check};
use crate::server::servers::get_all_servers; use crate::server::servers::get_all_servers;
use http::StatusCode;
use lazy_static::lazy_static; use lazy_static::lazy_static;
use std::collections::HashMap; use std::collections::HashMap;
use std::sync::{Arc, RwLock}; use std::sync::{Arc, RwLock};
@@ -29,7 +30,7 @@ pub fn get_connector_by_id(server_id: &str, connector_id: &str) -> Option<Connec
} }
pub async fn refresh_all_connectors<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> { pub async fn refresh_all_connectors<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
let servers = get_all_servers(); let servers = get_all_servers().await;
// Collect all the tasks for fetching and refreshing connectors // Collect all the tasks for fetching and refreshing connectors
let mut server_map = HashMap::new(); let mut server_map = HashMap::new();
@@ -107,6 +108,7 @@ pub async fn fetch_connectors_by_server(id: &str) -> Result<Vec<Connector>, Stri
// dbg!("Error fetching connector for id {}: {}", &id, &e); // dbg!("Error fetching connector for id {}: {}", &id, &e);
format!("Error fetching connector: {}", e) format!("Error fetching connector: {}", e)
})?; })?;
status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
// Parse the search results directly from the response body // Parse the search results directly from the response body
let datasource: Vec<Connector> = parse_search_results(resp) let datasource: Vec<Connector> = parse_search_results(resp)


@@ -1,20 +1,14 @@
 use crate::common::datasource::DataSource;
 use crate::common::search::parse_search_results;
 use crate::server::connector::get_connector_by_id;
-use crate::server::http_client::HttpClient;
+use crate::server::http_client::{HttpClient, status_code_check};
 use crate::server::servers::get_all_servers;
+use http::StatusCode;
 use lazy_static::lazy_static;
 use std::collections::HashMap;
 use std::sync::{Arc, RwLock};
 use tauri::{AppHandle, Runtime};

-#[derive(serde::Deserialize, Debug)]
-pub struct GetDatasourcesByServerOptions {
-    pub from: Option<u32>,
-    pub size: Option<u32>,
-    pub query: Option<String>,
-}

 lazy_static! {
     static ref DATASOURCE_CACHE: Arc<RwLock<HashMap<String, HashMap<String, DataSource>>>> =
         Arc::new(RwLock::new(HashMap::new()));
@@ -40,7 +34,7 @@ pub fn get_datasources_from_cache(server_id: &str) -> Option<HashMap<String, Dat
 pub async fn refresh_all_datasources<R: Runtime>(_app_handle: &AppHandle<R>) -> Result<(), String> {
     // dbg!("Attempting to refresh all datasources");
-    let servers = get_all_servers();
+    let servers = get_all_servers().await;

     let mut server_map = HashMap::new();
@@ -96,50 +90,17 @@ pub async fn refresh_all_datasources<R: Runtime>(_app_handle: &AppHandle<R>) ->
 #[tauri::command]
 pub async fn datasource_search(
     id: &str,
-    options: Option<GetDatasourcesByServerOptions>,
+    query_params: Option<Vec<String>>, //["query=abc", "filter=er", "filter=efg", "from=0", "size=5"],
 ) -> Result<Vec<DataSource>, String> {
-    let from = options.as_ref().and_then(|opt| opt.from).unwrap_or(0);
-    let size = options.as_ref().and_then(|opt| opt.size).unwrap_or(10000);
-    let query = options
-        .and_then(|opt| opt.query)
-        .unwrap_or(String::default());
-
-    let mut body = serde_json::json!({
-        "from": from,
-        "size": size,
-    });
-
-    if !query.is_empty() {
-        body["query"] = serde_json::json!({
-            "bool": {
-                "must": [{
-                    "query_string": {
-                        "fields": ["combined_fulltext"],
-                        "query": query,
-                        "fuzziness": "AUTO",
-                        "fuzzy_prefix_length": 2,
-                        "fuzzy_max_expansions": 10,
-                        "fuzzy_transpositions": true,
-                        "allow_leading_wildcard": false
-                    }
-                }]
-            }
-        });
-    }
-
     // Perform the async HTTP request outside the cache lock
-    let resp = HttpClient::post(
-        id,
-        "/datasource/_search",
-        None,
-        Some(reqwest::Body::from(body.to_string())),
-    )
+    let resp = HttpClient::post(id, "/datasource/_search", query_params, None)
         .await
         .map_err(|e| format!("Error fetching datasource: {}", e))?;
+    status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;

     // Parse the search results from the response
     let datasources: Vec<DataSource> = parse_search_results(resp).await.map_err(|e| {
-        dbg!("Error parsing search results: {}", &e);
+        //dbg!("Error parsing search results: {}", &e);
         e.to_string()
     })?;
@@ -152,50 +113,17 @@ pub async fn datasource_search(
 #[tauri::command]
 pub async fn mcp_server_search(
     id: &str,
-    options: Option<GetDatasourcesByServerOptions>,
+    query_params: Option<Vec<String>>,
 ) -> Result<Vec<DataSource>, String> {
-    let from = options.as_ref().and_then(|opt| opt.from).unwrap_or(0);
-    let size = options.as_ref().and_then(|opt| opt.size).unwrap_or(10000);
-    let query = options
-        .and_then(|opt| opt.query)
-        .unwrap_or(String::default());
-
-    let mut body = serde_json::json!({
-        "from": from,
-        "size": size,
-    });
-
-    if !query.is_empty() {
-        body["query"] = serde_json::json!({
-            "bool": {
-                "must": [{
-                    "query_string": {
-                        "fields": ["combined_fulltext"],
-                        "query": query,
-                        "fuzziness": "AUTO",
-                        "fuzzy_prefix_length": 2,
-                        "fuzzy_max_expansions": 10,
-                        "fuzzy_transpositions": true,
-                        "allow_leading_wildcard": false
-                    }
-                }]
-            }
-        });
-    }
-
     // Perform the async HTTP request outside the cache lock
-    let resp = HttpClient::post(
-        id,
-        "/mcp_server/_search",
-        None,
-        Some(reqwest::Body::from(body.to_string())),
-    )
+    let resp = HttpClient::post(id, "/mcp_server/_search", query_params, None)
        .await
        .map_err(|e| format!("Error fetching datasource: {}", e))?;
+    status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;

     // Parse the search results from the response
     let mcp_server: Vec<DataSource> = parse_search_results(resp).await.map_err(|e| {
-        dbg!("Error parsing search results: {}", &e);
+        //dbg!("Error parsing search results: {}", &e);
        e.to_string()
    })?;
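For reference, a minimal sketch of the new calling convention, assuming it runs inside this module so datasource_search is in scope and the "key=value" string format documented on the parameter above is used; the query values are illustrative, not taken from the frontend:

    async fn example_datasource_search(server_id: &str) -> Result<(), String> {
        // Pagination and the full-text query are now plain "key=value" strings.
        let query_params = vec![
            "query=report".to_string(),
            "from=0".to_string(),
            "size=5".to_string(),
        ];
        let datasources = datasource_search(server_id, Some(query_params)).await?;
        log::debug!("got {} datasources", datasources.len());
        Ok(())
    }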


@@ -1,17 +1,19 @@
 use crate::server::servers::{get_server_by_id, get_server_token};
-use http::{HeaderName, HeaderValue};
+use crate::util::app_lang::get_app_lang;
+use crate::util::platform::Platform;
+use http::{HeaderName, HeaderValue, StatusCode};
 use once_cell::sync::Lazy;
 use reqwest::{Client, Method, RequestBuilder};
 use std::collections::HashMap;
+use std::sync::LazyLock;
 use std::time::Duration;
-use tauri_plugin_store::JsonValue;
 use tokio::sync::Mutex;

 pub(crate) fn new_reqwest_http_client(accept_invalid_certs: bool) -> Client {
     Client::builder()
-        .read_timeout(Duration::from_secs(3)) // Set a timeout of 3 second
-        .connect_timeout(Duration::from_secs(3)) // Set a timeout of 3 second
-        .timeout(Duration::from_secs(10)) // Set a timeout of 10 seconds
+        .read_timeout(Duration::from_secs(60)) // Set a timeout of 60 second
+        .connect_timeout(Duration::from_secs(30)) // Set a timeout of 30 second
+        .timeout(Duration::from_secs(5 * 60)) // Set a timeout of 5 minute
         .danger_accept_invalid_certs(accept_invalid_certs) // allow self-signed certificates
         .build()
         .expect("Failed to build client")
@@ -27,6 +29,26 @@ pub static HTTP_CLIENT: Lazy<Mutex<Client>> = Lazy::new(|| {
     Mutex::new(new_reqwest_http_client(allow_self_signature))
 });

+/// These header values won't change during a process's lifetime.
+static STATIC_HEADERS: LazyLock<HashMap<String, String>> = LazyLock::new(|| {
+    HashMap::from([
+        (
+            "X-OS-NAME".into(),
+            Platform::current()
+                .to_os_name_http_header_str()
+                .into_owned(),
+        ),
+        (
+            "X-OS-VER".into(),
+            sysinfo::System::os_version()
+                .expect("sysinfo::System::os_version() should be Some on major systems"),
+        ),
+        ("X-OS-ARCH".into(), sysinfo::System::cpu_arch()),
+        ("X-APP-NAME".into(), "coco-app".into()),
+        ("X-APP-VER".into(), env!("CARGO_PKG_VERSION").into()),
+    ])
+});

 pub struct HttpClient;

 impl HttpClient {
@@ -40,7 +62,7 @@ impl HttpClient {
     pub async fn send_raw_request(
         method: Method,
         url: &str,
-        query_params: Option<HashMap<String, JsonValue>>,
+        query_params: Option<Vec<String>>,
         headers: Option<HashMap<String, String>>,
         body: Option<reqwest::Body>,
     ) -> Result<reqwest::Response, String> {
@@ -56,7 +78,7 @@ impl HttpClient {
             Self::get_request_builder(method, url, headers, query_params, body).await;
         let response = request_builder.send().await.map_err(|e| {
-            dbg!("Failed to send request: {}", &e);
+            //dbg!("Failed to send request: {}", &e);
             format!("Failed to send request: {}", e)
         })?;
@@ -74,7 +96,7 @@ impl HttpClient {
         method: Method,
         url: &str,
         headers: Option<HashMap<String, String>>,
-        query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
+        query_params: Option<Vec<String>>, // Add query parameters
         body: Option<reqwest::Body>,
     ) -> RequestBuilder {
         let client = HTTP_CLIENT.lock().await; // Acquire the lock on HTTP_CLIENT
@@ -82,8 +104,32 @@ impl HttpClient {
         // Build the request
         let mut request_builder = client.request(method.clone(), url);

+        // Populate the headers defined by us
+        let mut req_headers = reqwest::header::HeaderMap::new();
+        for (key, value) in STATIC_HEADERS.iter() {
+            let key = HeaderName::from_bytes(key.as_bytes())
+                .expect("headers defined by us should be valid");
+            let value = HeaderValue::from_str(value.trim()).unwrap_or_else(|e| {
+                panic!(
+                    "header value [{}] is invalid, error [{}], this should be unreachable",
+                    value, e
+                );
+            });
+            req_headers.insert(key, value);
+        }
+        let app_lang = get_app_lang().await.to_string();
+        req_headers.insert(
+            "X-APP-LANG",
+            HeaderValue::from_str(&app_lang).unwrap_or_else(|e| {
+                panic!(
+                    "header value [{}] is invalid, error [{}], this should be unreachable",
+                    app_lang, e
+                );
+            }),
+        );
+
+        // Headers from the function parameter
         if let Some(h) = headers {
-            let mut req_headers = reqwest::header::HeaderMap::new();
             for (key, value) in h.into_iter() {
                 match (
                     HeaderName::from_bytes(key.as_bytes()),
@@ -106,24 +152,9 @@ impl HttpClient {
             request_builder = request_builder.headers(req_headers);
         }

-        if let Some(query) = query_params {
-            // Convert only supported value types into strings
-            let query: HashMap<String, String> = query
-                .into_iter()
-                .filter_map(|(k, v)| {
-                    match v {
-                        JsonValue::String(s) => Some((k, s)),
-                        JsonValue::Number(n) => Some((k, n.to_string())),
-                        JsonValue::Bool(b) => Some((k, b.to_string())),
-                        _ => {
-                            dbg!(
-                                "Unsupported query parameter type. Only strings, numbers, and booleans are supported.",k,v,
-                            );
-                            None
-                        } // skip arrays, objects, nulls
-                    }
-                })
-                .collect();
+        if let Some(params) = query_params {
+            let query: Vec<(&str, &str)> =
+                params.iter().filter_map(|s| s.split_once('=')).collect();
             request_builder = request_builder.query(&query);
         }
@@ -140,18 +171,18 @@ impl HttpClient {
         method: Method,
         path: &str,
         custom_headers: Option<HashMap<String, String>>,
-        query_params: Option<HashMap<String, JsonValue>>,
+        query_params: Option<Vec<String>>,
         body: Option<reqwest::Body>,
     ) -> Result<reqwest::Response, String> {
         // Fetch the server using the server_id
-        let server = get_server_by_id(server_id);
+        let server = get_server_by_id(server_id).await;
         if let Some(s) = server {
             // Construct the URL
             let url = HttpClient::join_url(&s.endpoint, path);

             // Retrieve the token for the server (token is optional)
             let token = get_server_token(server_id)
-                .await?
+                .await
                 .map(|t| t.access_token.clone());

             let mut headers = if let Some(custom_headers) = custom_headers {
@@ -165,16 +196,16 @@ impl HttpClient {
                 headers.insert("X-API-TOKEN".to_string(), t);
             }

-            log::debug!(
-                "Sending request to server: {}, url: {}, headers: {:?}",
-                &server_id,
-                &url,
-                &headers
-            );
+            // log::debug!(
+            //     "Sending request to server: {}, url: {}, headers: {:?}",
+            //     &server_id,
+            //     &url,
+            //     &headers
+            // );

             Self::send_raw_request(method, &url, query_params, Some(headers), body).await
         } else {
-            Err("Server not found".to_string())
+            Err(format!("Server [{}] not found", server_id))
         }
     }
@@ -182,7 +213,7 @@ impl HttpClient {
     pub async fn get(
         server_id: &str,
         path: &str,
-        query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
+        query_params: Option<Vec<String>>,
     ) -> Result<reqwest::Response, String> {
         HttpClient::send_request(server_id, Method::GET, path, None, query_params, None).await
     }
@@ -191,7 +222,7 @@ impl HttpClient {
     pub async fn post(
         server_id: &str,
         path: &str,
-        query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
+        query_params: Option<Vec<String>>,
         body: Option<reqwest::Body>,
     ) -> Result<reqwest::Response, String> {
         HttpClient::send_request(server_id, Method::POST, path, None, query_params, body).await
@@ -201,7 +232,7 @@ impl HttpClient {
         server_id: &str,
         path: &str,
         custom_headers: Option<HashMap<String, String>>,
-        query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
+        query_params: Option<Vec<String>>,
         body: Option<reqwest::Body>,
     ) -> Result<reqwest::Response, String> {
         HttpClient::send_request(
@@ -221,7 +252,7 @@ impl HttpClient {
         server_id: &str,
         path: &str,
         custom_headers: Option<HashMap<String, String>>,
-        query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
+        query_params: Option<Vec<String>>,
         body: Option<reqwest::Body>,
     ) -> Result<reqwest::Response, String> {
         HttpClient::send_request(
@@ -241,7 +272,7 @@ impl HttpClient {
         server_id: &str,
         path: &str,
         custom_headers: Option<HashMap<String, String>>,
-        query_params: Option<HashMap<String, JsonValue>>, // Add query parameters
+        query_params: Option<Vec<String>>,
     ) -> Result<reqwest::Response, String> {
         HttpClient::send_request(
             server_id,
@@ -254,3 +285,30 @@ impl HttpClient {
         .await
     }
 }

+/// Helper function to check status code.
+///
+/// If the status code is not in the `allowed_status_codes` list, return an error.
+pub(crate) fn status_code_check(
+    response: &reqwest::Response,
+    allowed_status_codes: &[StatusCode],
+) -> Result<(), String> {
+    let status_code = response.status();
+    if !allowed_status_codes.contains(&status_code) {
+        let msg = format!(
+            "Response of request [{}] status code failed: status code [{}], which is not in the 'allow' list {:?}",
+            response.url(),
+            status_code,
+            allowed_status_codes
+                .iter()
+                .map(|status| status.to_string())
+                .collect::<Vec<String>>()
+        );
+        log::warn!("{}", msg);
+        Err(msg)
+    } else {
+        Ok(())
+    }
+}
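For reference, a minimal sketch of how the Vec<String> query parameters and status_code_check fit together on the calling side. The /widget/_search path is hypothetical and only illustrates the pattern, and the sketch assumes it lives inside this crate so the pub(crate) helper is visible:

    use http::StatusCode;

    async fn search_widgets(server_id: &str) -> Result<reqwest::Response, String> {
        // "key=value" strings; get_request_builder splits them with split_once('=')
        let query_params = vec!["query=rust".to_string(), "size=10".to_string()];
        let resp = HttpClient::get(server_id, "/widget/_search", Some(query_params)).await?;
        // Reject anything other than 200/201 before the body is parsed
        status_code_check(&resp, &[StatusCode::OK, StatusCode::CREATED])?;
        Ok(resp)
    }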


@@ -8,6 +8,7 @@ pub mod http_client;
 pub mod profile;
 pub mod search;
 pub mod servers;
+pub mod synthesize;
 pub mod system_settings;
 pub mod transcription;
 pub mod websocket;


@@ -1,4 +1,4 @@
-use crate::common::document::Document;
+use crate::common::document::{Document, OnOpened};
 use crate::common::error::SearchError;
 use crate::common::http::get_response_body_text;
 use crate::common::search::{QueryHits, QueryResponse, QuerySource, SearchQuery, SearchResponse};
@@ -6,11 +6,10 @@ use crate::common::server::Server;
 use crate::common::traits::SearchSource;
 use crate::server::http_client::HttpClient;
 use async_trait::async_trait;
-// use futures::stream::StreamExt;
 use ordered_float::OrderedFloat;
+use reqwest::StatusCode;
 use std::collections::HashMap;
-use tauri_plugin_store::JsonValue;
-// use std::hash::Hash;
+use tauri::AppHandle;

 #[allow(dead_code)]
 pub(crate) struct DocumentsSizedCollector {
@@ -45,7 +44,7 @@ impl DocumentsSizedCollector {
         }
     }

-    fn documents(self) -> impl ExactSizeIterator<Item=Document> {
+    fn documents(self) -> impl ExactSizeIterator<Item = Document> {
         self.docs.into_iter().map(|(_, doc, _)| doc)
     }
@@ -91,41 +90,74 @@ impl SearchSource for CocoSearchSource {
         }
     }

-    async fn search(&self, query: SearchQuery) -> Result<QueryResponse, SearchError> {
+    async fn search(
+        &self,
+        _tauri_app_handle: AppHandle,
+        query: SearchQuery,
+    ) -> Result<QueryResponse, SearchError> {
         let url = "/query/_search";
+        let mut total_hits = 0;
+        let mut hits: Vec<(Document, f64)> = Vec::new();

-        let mut query_args: HashMap<String, JsonValue> = HashMap::new();
-        query_args.insert("from".into(), JsonValue::Number(query.from.into()));
-        query_args.insert("size".into(), JsonValue::Number(query.size.into()));
+        let mut query_params = Vec::new();
+        // Add from/size as number values
+        query_params.push(format!("from={}", query.from));
+        query_params.push(format!("size={}", query.size));

+        // Add query strings
         for (key, value) in query.query_strings {
-            query_args.insert(key, JsonValue::String(value));
+            query_params.push(format!("{}={}", key, value));
         }

-        let response = HttpClient::get(
-            &self.server.id,
-            &url,
-            Some(query_args),
-        )
+        let response = HttpClient::get(&self.server.id, &url, Some(query_params))
             .await
-            .map_err(|e| SearchError::HttpError(format!("Error to send search request: {}", e)))?;
+            .map_err(|e| SearchError::HttpError {
+                status_code: None,
+                msg: format!("{}", e),
+            })?;

+        let status_code = response.status();
+        if ![StatusCode::OK, StatusCode::CREATED].contains(&status_code) {
+            return Err(SearchError::HttpError {
+                status_code: Some(status_code),
+                msg: format!("Request failed with status code [{}]", status_code),
+            });
+        }

         // Use the helper function to parse the response body
         let response_body = get_response_body_text(response)
             .await
-            .map_err(|e| SearchError::ParseError(format!("Failed to read response body: {}", e)))?;
+            .map_err(|e| SearchError::ParseError(e))?;

-        // Parse the search response from the body text
-        let parsed: SearchResponse<Document> = serde_json::from_str(&response_body)
-            .map_err(|e| SearchError::ParseError(format!("Failed to parse search response: {}", e)))?;
-
-        // Process the parsed response
-        let total_hits = parsed.hits.total.value as usize;
-        let hits: Vec<(Document, f64)> = parsed
-            .hits
-            .hits
-            .into_iter()
-            .map(|hit| (hit._source, hit._score.unwrap_or(0.0))) // Default _score to 0.0 if None
-            .collect();
+        // Check if the response body is empty
+        if !response_body.is_empty() {
+            // log::info!("Search response body: {}", &response_body);
+            // Parse the search response from the body text
+            let parsed: SearchResponse<Document> = serde_json::from_str(&response_body)
+                .map_err(|e| SearchError::ParseError(format!("{}", e)))?;
+
+            // Process the parsed response
+            total_hits = parsed.hits.total.value as usize;
+
+            if let Some(items) = parsed.hits.hits {
+                for hit in items {
+                    let mut document = hit._source;
+                    // Default _score to 0.0 if None
+                    let score = hit._score.unwrap_or(0.0);
+                    let on_opened = document
+                        .url
+                        .as_ref()
+                        .map(|url| OnOpened::Document { url: url.clone() });
+                    // Set the `on_opened` field as it won't be returned from Coco server
+                    document.on_opened = on_opened;
+                    hits.push((document, score));
+                }
+            }
+        }

         // Return the final result
         Ok(QueryResponse {
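The error type used here switched from a tuple-style HttpError(String) to a struct variant that carries an optional status code. A minimal sketch of matching on it, assuming only the two variant shapes visible in this diff (any further variants of SearchError are not shown):

    fn describe_search_error(err: &SearchError) -> String {
        match err {
            SearchError::HttpError { status_code, msg } => match status_code {
                Some(code) => format!("search request failed with HTTP {}: {}", code, msg),
                None => format!("search request failed before a response arrived: {}", msg),
            },
            SearchError::ParseError(msg) => format!("could not parse the search response: {}", msg),
            // Fallback for variants not shown in this diff.
            _ => "search failed".to_string(),
        }
    }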


@@ -1,3 +1,4 @@
+use crate::COCO_TAURI_STORE;
 use crate::common::http::get_response_body_text;
 use crate::common::register::SearchSourceRegistry;
 use crate::common::server::{AuthProvider, Provider, Server, ServerAccessToken, Sso, Version};
@@ -5,68 +6,72 @@ use crate::server::connector::fetch_connectors_by_server;
 use crate::server::datasource::datasource_search;
 use crate::server::http_client::HttpClient;
 use crate::server::search::CocoSearchSource;
-use crate::COCO_TAURI_STORE;
-use lazy_static::lazy_static;
+use function_name;
+use http::StatusCode;
 use reqwest::Method;
-use serde_json::from_value;
 use serde_json::Value as JsonValue;
+use serde_json::from_value;
 use std::collections::HashMap;
-use std::sync::Arc;
-use std::sync::RwLock;
+use std::sync::LazyLock;
 use tauri::Runtime;
 use tauri::{AppHandle, Manager};
 use tauri_plugin_store::StoreExt;
-// Assuming you're using serde_json
+use tokio::sync::RwLock;

-lazy_static! {
-    static ref SERVER_CACHE: Arc<RwLock<HashMap<String, Server>>> =
-        Arc::new(RwLock::new(HashMap::new()));
-    static ref SERVER_TOKEN: Arc<RwLock<HashMap<String, ServerAccessToken>>> =
-        Arc::new(RwLock::new(HashMap::new()));
-}
+/// Coco sever list
+static SERVER_LIST_CACHE: LazyLock<RwLock<HashMap<String, Server>>> =
+    LazyLock::new(|| RwLock::new(HashMap::new()));

-#[allow(dead_code)]
-fn check_server_exists(id: &str) -> bool {
-    let cache = SERVER_CACHE.read().unwrap(); // Acquire read lock
-    cache.contains_key(id)
-}
+/// If a server has a token stored here that has not expired, it is considered logged in.
+///
+/// Since the `expire_at` field of `struct ServerAccessToken` is currently unused,
+/// all servers stored here are treated as logged in.
+static SERVER_TOKEN_LIST_CACHE: LazyLock<RwLock<HashMap<String, ServerAccessToken>>> =
+    LazyLock::new(|| RwLock::new(HashMap::new()));

-pub fn get_server_by_id(id: &str) -> Option<Server> {
-    let cache = SERVER_CACHE.read().unwrap(); // Acquire read lock
+/// `SERVER_LIST_CACHE` will be stored in KV store COCO_TAURI_STORE, under this key.
+pub const COCO_SERVERS: &str = "coco_servers";
+/// `SERVER_TOKEN_LIST_CACHE` will be stored in KV store COCO_TAURI_STORE, under this key.
+const COCO_SERVER_TOKENS: &str = "coco_server_tokens";
+
+pub async fn get_server_by_id(id: &str) -> Option<Server> {
+    let cache = SERVER_LIST_CACHE.read().await;
     cache.get(id).cloned()
 }

-#[tauri::command]
-pub async fn get_server_token(id: &str) -> Result<Option<ServerAccessToken>, String> {
-    let cache = SERVER_TOKEN.read().map_err(|err| err.to_string())?;
-    Ok(cache.get(id).cloned())
+pub async fn get_server_token(id: &str) -> Option<ServerAccessToken> {
+    let cache = SERVER_TOKEN_LIST_CACHE.read().await;
+    cache.get(id).cloned()
 }

-pub fn save_access_token(server_id: String, token: ServerAccessToken) -> bool {
-    let mut cache = SERVER_TOKEN.write().unwrap();
+pub async fn save_access_token(server_id: String, token: ServerAccessToken) -> bool {
+    let mut cache = SERVER_TOKEN_LIST_CACHE.write().await;
     cache.insert(server_id, token).is_none()
 }

-fn check_endpoint_exists(endpoint: &str) -> bool {
-    let cache = SERVER_CACHE.read().unwrap();
+async fn check_endpoint_exists(endpoint: &str) -> bool {
+    let cache = SERVER_LIST_CACHE.read().await;
     cache.values().any(|server| server.endpoint == endpoint)
 }

-pub fn save_server(server: &Server) -> bool {
-    let mut cache = SERVER_CACHE.write().unwrap();
-    cache.insert(server.id.clone(), server.clone()).is_none() // If the server id did not exist, `insert` will return `None`
+/// Return true if `server` does not exists in the server list, i.e., it is a newly-added
+/// server.
+pub async fn save_server(server: &Server) -> bool {
+    let mut cache = SERVER_LIST_CACHE.write().await;
+    cache.insert(server.id.clone(), server.clone()).is_none()
 }

-fn remove_server_by_id(id: String) -> bool {
-    dbg!("remove server by id:", &id);
-    let mut cache = SERVER_CACHE.write().unwrap();
-    let deleted = cache.remove(id.as_str());
-    deleted.is_some()
+/// Return the removed `Server` if it exists in the server list.
+async fn remove_server_by_id(id: &str) -> Option<Server> {
+    log::debug!("remove server by id: {}", &id);
+    let mut cache = SERVER_LIST_CACHE.write().await;
+    cache.remove(id)
 }

 pub async fn persist_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
-    let cache = SERVER_CACHE.read().unwrap(); // Acquire a read lock, not a write lock, since you're not modifying the cache
+    let cache = SERVER_LIST_CACHE.read().await;

     // Convert HashMap to Vec for serialization (iterating over values of HashMap)
     let servers: Vec<Server> = cache.values().cloned().collect();
@@ -86,14 +91,16 @@ pub async fn persist_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<()
     Ok(())
 }

-pub fn remove_server_token(id: &str) -> bool {
-    dbg!("remove server token by id:", &id);
-    let mut cache = SERVER_TOKEN.write().unwrap();
+/// Return true if the server token of the server specified by `id` exists in
+/// the token list and gets deleted.
+pub async fn remove_server_token(id: &str) -> bool {
+    log::debug!("remove server token by id: {}", &id);
+    let mut cache = SERVER_TOKEN_LIST_CACHE.write().await;
     cache.remove(id).is_some()
 }

-pub fn persist_servers_token<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
-    let cache = SERVER_TOKEN.read().unwrap(); // Acquire a read lock, not a write lock, since you're not modifying the cache
+pub async fn persist_servers_token<R: Runtime>(app_handle: &AppHandle<R>) -> Result<(), String> {
+    let cache = SERVER_TOKEN_LIST_CACHE.read().await;

     // Convert HashMap to Vec for serialization (iterating over values of HashMap)
     let servers: Vec<ServerAccessToken> = cache.values().cloned().collect();
@@ -104,7 +111,7 @@ pub fn persist_servers_token<R: Runtime>(app_handle: &AppHandle<R>) -> Result<()
         .map(|server| serde_json::to_value(server).expect("Failed to serialize access_tokens")) // Automatically serialize all fields
         .collect();

-    dbg!(format!("persist servers token: {:?}", &json_servers));
+    log::debug!("persist servers token: {:?}", &json_servers);

     // Save the serialized servers to Tauri's store
     app_handle
@@ -143,17 +150,18 @@ fn get_default_server() -> Server {
         profile: None,
         auth_provider: AuthProvider {
             sso: Sso {
-                url: "https://coco.infini.cloud/sso/login/".to_string(),
+                url: "https://coco.infini.cloud/sso/login/cloud?provider=coco-cloud&product=coco".to_string(),
             },
         },
         priority: 0,
+        stats: None,
     }
 }

 pub async fn load_servers_token<R: Runtime>(
     app_handle: &AppHandle<R>,
 ) -> Result<Vec<ServerAccessToken>, String> {
-    dbg!("Attempting to load servers token");
+    log::debug!("Attempting to load servers token");

     let store = app_handle
         .store(COCO_TAURI_STORE)
@@ -172,29 +180,42 @@ pub async fn load_servers_token<R: Runtime>(
         servers.ok_or_else(|| "Failed to read servers from store: No servers found".to_string())?;

     // Convert each item in the JsonValue array to a Server
-    if let JsonValue::Array(servers_array) = servers {
-        // Deserialize each JsonValue into Server, filtering out any errors
-        let deserialized_tokens: Vec<ServerAccessToken> = servers_array
-            .into_iter()
-            .filter_map(|server_json| from_value(server_json).ok()) // Only keep valid Server instances
-            .collect();
+    match servers {
+        JsonValue::Array(servers_array) => {
+            let mut deserialized_tokens: Vec<ServerAccessToken> =
+                Vec::with_capacity(servers_array.len());
+            for server_json in servers_array {
+                match from_value(server_json.clone()) {
+                    Ok(token) => {
+                        deserialized_tokens.push(token);
+                    }
+                    Err(e) => {
+                        panic!(
+                            "failed to deserialize JSON [{}] to [struct ServerAccessToken], error [{}], store [{}] key [{}] is possibly corrupted!",
+                            server_json, e, COCO_TAURI_STORE, COCO_SERVER_TOKENS
+                        );
+                    }
+                }
+            }

             if deserialized_tokens.is_empty() {
                 return Err("Failed to deserialize any servers from the store.".to_string());
             }

-        for server in deserialized_tokens.iter() {
-            save_access_token(server.id.clone(), server.clone());
-        }
+            for server in deserialized_tokens.iter() {
+                save_access_token(server.id.clone(), server.clone()).await;
+            }

-        dbg!(format!(
-            "loaded {:?} servers's token",
-            &deserialized_tokens.len()
-        ));
-        Ok(deserialized_tokens)
-    } else {
-        Err("Failed to read servers from store: Invalid format".to_string())
+            log::debug!("loaded {:?} servers's token", &deserialized_tokens.len());
+            Ok(deserialized_tokens)
+        }
+        _ => {
+            unreachable!(
+                "coco server tokens should be stored in an array under store [{}] key [{}], but it is not",
+                COCO_TAURI_STORE, COCO_SERVER_TOKENS
+            );
+        }
     }
 }
@@ -216,26 +237,41 @@ pub async fn load_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<S
         servers.ok_or_else(|| "Failed to read servers from store: No servers found".to_string())?;

     // Convert each item in the JsonValue array to a Server
-    if let JsonValue::Array(servers_array) = servers {
-        // Deserialize each JsonValue into Server, filtering out any errors
-        let deserialized_servers: Vec<Server> = servers_array
-            .into_iter()
-            .filter_map(|server_json| from_value(server_json).ok()) // Only keep valid Server instances
-            .collect();
+    match servers {
+        JsonValue::Array(servers_array) => {
+            let mut deserialized_servers = Vec::with_capacity(servers_array.len());
+            for server_json in servers_array {
+                match from_value(server_json.clone()) {
+                    Ok(server) => {
+                        deserialized_servers.push(server);
+                    }
+                    Err(e) => {
+                        panic!(
+                            "failed to deserialize JSON [{}] to [struct Server], error [{}], store [{}] key [{}] is possibly corrupted!",
+                            server_json, e, COCO_TAURI_STORE, COCO_SERVERS
+                        );
+                    }
+                }
+            }

             if deserialized_servers.is_empty() {
                 return Err("Failed to deserialize any servers from the store.".to_string());
             }

-        for server in deserialized_servers.iter() {
-            save_server(&server);
-        }
+            for server in deserialized_servers.iter() {
+                save_server(&server).await;
+            }

-        // dbg!(format!("load servers: {:?}", &deserialized_servers));
-        Ok(deserialized_servers)
-    } else {
-        Err("Failed to read servers from store: Invalid format".to_string())
+            log::debug!("load servers: {:?}", &deserialized_servers);
+            Ok(deserialized_servers)
+        }
+        _ => {
+            unreachable!(
+                "coco servers should be stored in an array under store [{}] key [{}], but it is not",
+                COCO_TAURI_STORE, COCO_SERVERS
+            );
+        }
     }
 }
@@ -243,51 +279,41 @@ pub async fn load_servers<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<S
 pub async fn load_or_insert_default_server<R: Runtime>(
     app_handle: &AppHandle<R>,
 ) -> Result<Vec<Server>, String> {
-    dbg!("Attempting to load or insert default server");
+    log::debug!("Attempting to load or insert default server");

     let exists_servers = load_servers(&app_handle).await;
     if exists_servers.is_ok() && !exists_servers.as_ref()?.is_empty() {
-        dbg!(format!("loaded {} servers", &exists_servers.clone()?.len()));
+        log::debug!("loaded {} servers", &exists_servers.clone()?.len());
         return exists_servers;
     }

     let default = get_default_server();
-    save_server(&default);
-    dbg!("loaded default servers");
+    save_server(&default).await;
+    log::debug!("loaded default servers");
     Ok(vec![default])
 }

 #[tauri::command]
 pub async fn list_coco_servers<R: Runtime>(
-    _app_handle: AppHandle<R>,
+    app_handle: AppHandle<R>,
 ) -> Result<Vec<Server>, String> {
     //hard fresh all server's info, in order to get the actual health
-    refresh_all_coco_server_info(_app_handle.clone()).await;
-
-    let servers: Vec<Server> = get_all_servers();
+    refresh_all_coco_server_info(app_handle.clone()).await;
+    let servers: Vec<Server> = get_all_servers().await;
     Ok(servers)
 }

-#[allow(dead_code)]
-pub fn get_servers_as_hashmap() -> HashMap<String, Server> {
-    let cache = SERVER_CACHE.read().unwrap();
-    cache.clone()
-}
-
-pub fn get_all_servers() -> Vec<Server> {
-    let cache = SERVER_CACHE.read().unwrap();
+pub async fn get_all_servers() -> Vec<Server> {
+    let cache = SERVER_LIST_CACHE.read().await;
     cache.values().cloned().collect()
 }

-/// We store added Coco servers in the Tauri store using this key.
-pub const COCO_SERVERS: &str = "coco_servers";
-const COCO_SERVER_TOKENS: &str = "coco_server_tokens";
-
 pub async fn refresh_all_coco_server_info<R: Runtime>(app_handle: AppHandle<R>) {
-    let servers = get_all_servers();
+    let servers = get_all_servers().await;
     for server in servers {
         let _ = refresh_coco_server_info(app_handle.clone(), server.id.clone()).await;
     }
@@ -300,7 +326,7 @@ pub async fn refresh_coco_server_info<R: Runtime>(
 ) -> Result<Server, String> {
     // Retrieve the server from the cache
     let cached_server = {
-        let cache = SERVER_CACHE.read().unwrap();
+        let cache = SERVER_LIST_CACHE.read().await;
         cache.get(&id).cloned()
     };
@@ -315,12 +341,16 @@ pub async fn refresh_coco_server_info<R: Runtime>(
     let profile = server.profile;

     // Send request to fetch updated server info
-    let response = HttpClient::get(&id, "/provider/_info", None)
-        .await
-        .map_err(|e| format!("Failed to contact the server: {}", e))?;
+    let response = match HttpClient::get(&id, "/provider/_info", None).await {
+        Ok(response) => response,
+        Err(e) => {
+            mark_server_as_offline(app_handle, &id).await;
+            return Err(e);
+        }
+    };

     if !response.status().is_success() {
-        mark_server_as_offline(&id).await;
+        mark_server_as_offline(app_handle, &id).await;
         return Err(format!("Request failed with status: {}", response.status()));
     }
@@ -335,12 +365,22 @@ pub async fn refresh_coco_server_info<R: Runtime>(
     updated_server.id = id.clone();
     updated_server.builtin = is_builtin;
     updated_server.enabled = is_enabled;
-    updated_server.available = true;
+    updated_server.available = {
+        if server.public {
+            // Public Coco servers are available as long as they are online.
+            true
+        } else {
+            // For non-public Coco servers, we still need to check if it is
+            // logged in, i.e., has a token stored in `SERVER_TOKEN_LIST_CACHE`.
+            get_server_token(&id).await.is_some()
+        }
+    };
     updated_server.profile = profile;
     trim_endpoint_last_forward_slash(&mut updated_server);

     // Save and persist
-    save_server(&updated_server);
+    save_server(&updated_server).await;
+    try_register_server_to_search_source(app_handle.clone(), &updated_server).await;
     persist_servers(&app_handle)
         .await
         .map_err(|e| format!("Failed to persist servers: {}", e))?;
@@ -363,11 +403,11 @@ pub async fn add_coco_server<R: Runtime>(
     let endpoint = endpoint.trim_end_matches('/');

-    if check_endpoint_exists(endpoint) {
-        dbg!(format!(
-            "This Coco server has already been registered: {:?}",
-            &endpoint
-        ));
+    if check_endpoint_exists(endpoint).await {
+        log::debug!(
+            "trying to register a Coco server [{}] that has already been registered",
+            endpoint
+        );
         return Err("This Coco server has already been registered.".into());
     }
@@ -376,7 +416,16 @@
         .await
         .map_err(|e| format!("Failed to send request to the server: {}", e))?;

-    dbg!(format!("Get provider info response: {:?}", &response));
+    log::debug!("Get provider info response: {:?}", &response);
+
+    if response.status() != StatusCode::OK {
+        log::debug!(
+            "trying to register a Coco server [{}] that is possibly down",
+            endpoint
+        );
+        return Err("This Coco server is possibly down".into());
+    }

     let body = get_response_body_text(response).await?;
@@ -385,26 +434,44 @@ pub async fn add_coco_server<R: Runtime>(
     trim_endpoint_last_forward_slash(&mut server);

+    // The JSON returned from `provider/_info` won't have this field, serde will set
+    // it to an empty string during deserialization, we need to set a valid value here.
     if server.id.is_empty() {
         server.id = pizza_common::utils::uuid::Uuid::new().to_string();
     }

+    // Use the default name, if it is not set.
     if server.name.is_empty() {
         server.name = "Coco Server".to_string();
     }

-    save_server(&server);
+    // Update the `available` field
+    if server.public {
+        // Serde already sets this to true, but just to make the code clear, do it again.
+        server.available = true;
+    } else {
+        let opt_token = get_server_token(&server.id).await;
+        assert!(
+            opt_token.is_none(),
+            "this Coco server is newly-added, we should have no token stored for it!"
+        );
+        // This is a non-public Coco server, and it is not logged in, so it is unavailable.
+        server.available = false;
+    }
+
+    save_server(&server).await;
     try_register_server_to_search_source(app_handle.clone(), &server).await;

     persist_servers(&app_handle)
         .await
         .map_err(|e| format!("Failed to persist Coco servers: {}", e))?;

-    dbg!(format!("Successfully registered server: {:?}", &endpoint));
+    log::debug!("Successfully registered server: {:?}", &endpoint);

     Ok(server)
 }

 #[tauri::command]
+#[function_name::named]
 pub async fn remove_coco_server<R: Runtime>(
     app_handle: AppHandle<R>,
     id: String,
@@ -412,131 +479,219 @@ pub async fn remove_coco_server<R: Runtime>(
     let registry = app_handle.state::<SearchSourceRegistry>();
     registry.remove_source(id.as_str()).await;

-    remove_server_token(id.as_str());
-    remove_server_by_id(id);
+    let opt_server = remove_server_by_id(id.as_str()).await;
+    let Some(server) = opt_server else {
+        panic!(
+            "[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
+            function_name!(),
+            id
+        );
+    };

     persist_servers(&app_handle)
         .await
         .expect("failed to save servers");

-    persist_servers_token(&app_handle).expect("failed to save server tokens");
+    // Only non-public Coco servers require tokens
+    if !server.public {
+        // If is logged in, clear the token as well.
+        let deleted = remove_server_token(id.as_str()).await;
+        if deleted {
+            persist_servers_token(&app_handle)
+                .await
+                .expect("failed to save server tokens");
+        }
+    }

     Ok(())
 }

 #[tauri::command]
+#[function_name::named]
 pub async fn enable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
-    println!("enable_server: {}", id);
-    let server = get_server_by_id(id.as_str());
-    if let Some(mut server) = server {
-        server.enabled = true;
-        save_server(&server);
-
-        // Register the server to the search source
-        try_register_server_to_search_source(app_handle.clone(), &server).await;
-
-        persist_servers(&app_handle)
-            .await
-            .expect("failed to save servers");
-    }
+    let opt_server = get_server_by_id(id.as_str()).await;
+    let Some(mut server) = opt_server else {
+        panic!(
+            "[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
+            function_name!(),
+            id
+        );
+    };
+
+    server.enabled = true;
+    save_server(&server).await;
+
+    // Register the server to the search source
+    try_register_server_to_search_source(app_handle.clone(), &server).await;
+
+    persist_servers(&app_handle)
+        .await
+        .expect("failed to save servers");

     Ok(())
 }

+#[tauri::command]
+#[function_name::named]
+pub async fn disable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
+    let opt_server = get_server_by_id(id.as_str()).await;
+    let Some(mut server) = opt_server else {
+        panic!(
+            "[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
+            function_name!(),
+            id
+        );
+    };
+
+    server.enabled = false;
+
+    let registry = app_handle.state::<SearchSourceRegistry>();
+    registry.remove_source(id.as_str()).await;
+
+    save_server(&server).await;
+
+    persist_servers(&app_handle)
+        .await
+        .expect("failed to save servers");
+
+    Ok(())
+}
+
+/// For public Coco servers, we add them to the search source as long as they are
+/// enabled.
+///
+/// For non-public Coco servers, an extra token is required.
 pub async fn try_register_server_to_search_source(
     app_handle: AppHandle<impl Runtime>,
     server: &Server,
 ) {
     if server.enabled {
+        log::trace!(
+            "Server [name: {}, id: {}] is public: {} and available: {}",
+            &server.name,
+            &server.id,
+            &server.public,
+            &server.available
+        );
+        if !server.public {
+            let opt_token = get_server_token(&server.id).await;
+            if opt_token.is_none() {
+                log::debug!("Server {} is not public and no token was found", &server.id);
+                return;
+            }
+        }
+
         let registry = app_handle.state::<SearchSourceRegistry>();
         let source = CocoSearchSource::new(server.clone());
         registry.register_source(source).await;
     }
 }

-pub async fn mark_server_as_offline(id: &str) {
-    // println!("server_is_offline: {}", id);
-    let server = get_server_by_id(id);
-    if let Some(mut server) = server {
-        server.available = false;
-        server.health = None;
-        save_server(&server);
-    }
-}
-
-#[tauri::command]
-pub async fn disable_server<R: Runtime>(app_handle: AppHandle<R>, id: String) -> Result<(), ()> {
-    println!("disable_server: {}", id);
-    let server = get_server_by_id(id.as_str());
-    if let Some(mut server) = server {
-        server.enabled = false;
-        let registry = app_handle.state::<SearchSourceRegistry>();
-        registry.remove_source(id.as_str()).await;
-
-        save_server(&server);
-        persist_servers(&app_handle)
-            .await
-            .expect("failed to save servers");
-    }
-    Ok(())
-}
+#[function_name::named]
+#[allow(unused)]
+async fn mark_server_as_online<R: Runtime>(app_handle: AppHandle<R>, id: &str) {
+    let server = get_server_by_id(id).await;
+    if let Some(mut server) = server {
+        server.available = true;
+        server.health = None;
+        save_server(&server).await;
+        try_register_server_to_search_source(app_handle.clone(), &server).await;
+    } else {
+        log::warn!(
+            "[{}()] invoked with a server [{}] that does not exist!",
+            function_name!(),
+            id
+        );
+    }
+}
+
+#[function_name::named]
+pub(crate) async fn mark_server_as_offline<R: Runtime>(app_handle: AppHandle<R>, id: &str) {
+    let server = get_server_by_id(id).await;
+    if let Some(mut server) = server {
+        server.available = false;
+        server.health = None;
+        save_server(&server).await;
+
+        let registry = app_handle.state::<SearchSourceRegistry>();
+        registry.remove_source(id).await;
+    } else {
+        log::warn!(
+            "[{}()] invoked with a server [{}] that does not exist!",
+            function_name!(),
+            id
+        );
+    }
+}

 #[tauri::command]
+#[function_name::named]
 pub async fn logout_coco_server<R: Runtime>(
     app_handle: AppHandle<R>,
     id: String,
 ) -> Result<(), String> {
-    dbg!("Attempting to log out server by id:", &id);
-
-    // Check if server token exists
-    if let Some(_token) = get_server_token(id.as_str()).await? {
-        dbg!("Found server token for id:", &id);
-
-        // Remove the server token from cache
-        remove_server_token(id.as_str());
-
-        // Persist the updated tokens
-        if let Err(e) = persist_servers_token(&app_handle) {
-            dbg!("Failed to save tokens for id: {}. Error: {:?}", &id, &e);
-            return Err(format!("Failed to save tokens: {}", &e));
-        }
-    } else {
-        // Log the case where server token is not found
-        dbg!("No server token found for id: {}", &id);
-    }
+    log::debug!("Attempting to log out server by id: {}", &id);

     // Check if the server exists
-    if let Some(mut server) = get_server_by_id(id.as_str()) {
-        dbg!("Found server for id:", &id);
-
-        // Clear server profile
-        server.profile = None;
-
-        // Save the updated server data
-        save_server(&server);
-
-        // Persist the updated server data
-        if let Err(e) = persist_servers(&app_handle).await {
-            dbg!("Failed to save server for id: {}. Error: {:?}", &id, &e);
-            return Err(format!("Failed to save server: {}", &e));
-        }
-    } else {
-        // Log the case where server is not found
-        dbg!("No server found for id: {}", &id);
-        return Err(format!("No server found for id: {}", id));
-    }
-
-    dbg!("Successfully logged out server with id:", &id);
+    let Some(mut server) = get_server_by_id(id.as_str()).await else {
+        panic!(
+            "[{}()] invoked with a server [{}] that does not exist! Mismatched states between frontend and backend!",
+            function_name!(),
+            id
+        );
+    };
+
+    // Clear server profile
+    server.profile = None;
+    // Logging out from a non-public Coco server makes it unavailable
+    if !server.public {
+        server.available = false;
+    }
+
+    // Save the updated server data
+    save_server(&server).await;
+
+    // Persist the updated server data
+    if let Err(e) = persist_servers(&app_handle).await {
+        log::debug!("Failed to save server for id: {}. Error: {:?}", &id, &e);
+        return Err(format!("Failed to save server: {}", &e));
+    }
+
+    let has_token = get_server_token(id.as_str()).await.is_some();
+    if server.public {
+        if has_token {
+            panic!("Public Coco server won't have token")
+        }
+    } else {
+        assert!(
+            has_token,
+            "This is a non-public Coco server, and it is logged in, we should have a token"
+        );
+        // Remove the server token from cache
+        remove_server_token(id.as_str()).await;
+        // Persist the updated tokens
+        if let Err(e) = persist_servers_token(&app_handle).await {
+            log::debug!("Failed to save tokens for id: {}. Error: {:?}", &id, &e);
+            return Err(format!("Failed to save tokens: {}", &e));
+        }
+    }
+
+    // Remove it from the search source if it becomes unavailable
+    if !server.available {
+        let registry = app_handle.state::<SearchSourceRegistry>();
+        registry.remove_source(id.as_str()).await;
+    }
+
+    log::debug!("Successfully logged out server with id: {}", &id);
     Ok(())
 }

-/// Removes the trailing slash from the server's endpoint if present.
+/// Helper function to remove the trailing slash from the server's endpoint if present.
 fn trim_endpoint_last_forward_slash(server: &mut Server) {
-    if server.endpoint.ends_with('/') {
-        server.endpoint.pop(); // Remove the last character
-        while server.endpoint.ends_with('/') {
-            server.endpoint.pop();
-        }
+    let endpoint = &mut server.endpoint;
+    while endpoint.ends_with('/') {
+        endpoint.pop();
     }
 }
@@ -545,41 +700,47 @@ fn provider_info_url(endpoint: &str) -> String {
     format!("{endpoint}/provider/_info")
 }

-#[test]
-fn test_trim_endpoint_last_forward_slash() {
-    let mut server = Server {
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    #[test]
+    fn test_trim_endpoint_last_forward_slash() {
+        let mut server = Server {
            id: "test".to_string(),
            builtin: false,
            enabled: true,
            name: "".to_string(),
            endpoint: "https://example.com///".to_string(),
            provider: Provider {
                name: "".to_string(),
                icon: "".to_string(),
                website: "".to_string(),
                eula: "".to_string(),
                privacy_policy: "".to_string(),
                banner: "".to_string(),
                description: "".to_string(),
            },
            version: Version {
                number: "".to_string(),
            },
            minimal_client_version: None,
            updated: "".to_string(),
            public: false,
            available: false,
            health: None,
            profile: None,
            auth_provider: AuthProvider {
                sso: Sso {
                    url: "".to_string(),
                },
            },
            priority: 0,
+            stats: None,
        };

        trim_endpoint_last_forward_slash(&mut server);
        assert_eq!(server.endpoint, "https://example.com");
+    }
 }
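The availability rule that refresh_coco_server_info, add_coco_server and logout_coco_server now share can be summed up in a single predicate. A sketch under the assumption, stated in the comments above, that "logged in" simply means a token is present in SERVER_TOKEN_LIST_CACHE; the helper itself is hypothetical and not part of the diff:

    /// A server counts as available when it is public, or when it is
    /// non-public and a login token is cached for it.
    async fn is_server_available(server: &Server) -> bool {
        if server.public {
            // Public Coco servers are available as long as they are online.
            true
        } else {
            // Non-public servers additionally require a login token.
            get_server_token(&server.id).await.is_some()
        }
    }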


@@ -0,0 +1,57 @@
+use crate::server::http_client::HttpClient;
+use futures_util::StreamExt;
+use http::Method;
+use serde_json::json;
+use tauri::{AppHandle, Emitter, Runtime, command};
+
+#[command]
+pub async fn synthesize<R: Runtime>(
+    app_handle: AppHandle<R>,
+    client_id: String,
+    server_id: String,
+    voice: String,
+    content: String,
+) -> Result<(), String> {
+    let body = json!({
+        "voice": voice,
+        "content": content,
+    })
+    .to_string();
+
+    let response = HttpClient::send_request(
+        server_id.as_str(),
+        Method::POST,
+        "/services/audio/synthesize",
+        None,
+        None,
+        Some(reqwest::Body::from(body.to_string())),
+    )
+    .await?;
+
+    log::info!("Synthesize response status: {}", response.status());
+
+    if response.status() == 429 {
+        return Ok(());
+    }
+
+    if !response.status().is_success() {
+        return Err(format!("Request Failed: {}", response.status()));
+    }
+
+    let mut stream = response.bytes_stream();
+
+    while let Some(chunk) = stream.next().await {
+        match chunk {
+            Ok(bytes) => {
+                if let Err(err) = app_handle.emit(&client_id, bytes.to_vec()) {
+                    log::error!("Emit error: {:?}", err);
+                }
+            }
+            Err(e) => {
+                log::error!("Stream error: {:?}", e);
+                break;
+            }
+        }
+    }
+
+    Ok(())
+}
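The synthesized audio is streamed back as raw byte chunks emitted under the caller-supplied client_id. A minimal sketch of consuming those events from Rust, assuming the Tauri 2 Listener trait and that each payload is the JSON-serialized Vec<u8> emitted above; in practice the frontend would subscribe through the JS event API instead:

    use tauri::Listener;

    fn listen_for_audio_chunks<R: tauri::Runtime>(app_handle: &tauri::AppHandle<R>, client_id: &str) {
        app_handle.listen(client_id.to_string(), |event| {
            // Each payload should be one JSON-serialized Vec<u8> chunk of audio.
            match serde_json::from_str::<Vec<u8>>(event.payload()) {
                Ok(bytes) => log::debug!("received {} audio bytes", bytes.len()),
                Err(e) => log::error!("unexpected synthesize payload: {}", e),
            }
        });
    }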


@@ -1,43 +1,96 @@
 use crate::common::http::get_response_body_text;
 use crate::server::http_client::HttpClient;
 use serde::{Deserialize, Serialize};
-use serde_json::Value as JsonValue;
-use std::collections::HashMap;
+use serde_json::{Value, from_str};
 use tauri::command;

 #[derive(Debug, Serialize, Deserialize)]
 pub struct TranscriptionResponse {
-    pub text: String,
+    task_id: String,
+    results: Vec<Value>,
 }

 #[command]
 pub async fn transcription(
     server_id: String,
-    audio_type: String,
     audio_content: String,
 ) -> Result<TranscriptionResponse, String> {
-    let mut query_params = HashMap::new();
-    query_params.insert("type".to_string(), JsonValue::String(audio_type));
-    query_params.insert("content".to_string(), JsonValue::String(audio_content));
-
-    // Send the HTTP POST request
-    let response = HttpClient::post(
+    // Send request to initiate transcription task
+    let init_response = HttpClient::post(
         &server_id,
         "/services/audio/transcription",
-        Some(query_params),
         None,
+        Some(audio_content.into()),
     )
     .await
-    .map_err(|e| format!("Error sending transcription request: {}", e))?;
+    .map_err(|e| format!("Failed to initiate transcription: {}", e))?;

-    // Use get_response_body_text to extract the response body as text
-    let response_body = get_response_body_text(response)
+    // Extract response body as text
+    let init_response_text = get_response_body_text(init_response)
         .await
-        .map_err(|e| format!("Failed to read response body: {}", e))?;
+        .map_err(|e| format!("Failed to read initial response body: {}", e))?;

-    // Deserialize the response body into TranscriptionResponse
-    let transcription_response: TranscriptionResponse = serde_json::from_str(&response_body)
-        .map_err(|e| format!("Failed to parse transcription response: {}", e))?;
+    // Parse response JSON to extract task ID
+    let init_response_json: Value = from_str(&init_response_text).map_err(|e| {
+        format!(
+            "Failed to parse initial response JSON: {}. Raw response: {}",
+            e, init_response_text
+        )
+    })?;
+
+    let transcription_task_id = init_response_json["task_id"]
+        .as_str()
+        .ok_or_else(|| {
+            format!(
+                "Missing or invalid task_id in initial response: {}",
+                init_response_text
+            )
+        })?
+        .to_string();
+
+    // Set up polling with timeout
+    let polling_start = std::time::Instant::now();
+    const POLLING_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(30);
+    const POLLING_INTERVAL: std::time::Duration = std::time::Duration::from_millis(200);
+
+    let mut transcription_response: TranscriptionResponse;
+
+    loop {
+        // Poll for transcription results
+        let poll_response = HttpClient::get(
+            &server_id,
+            &format!("/services/audio/task/{}", transcription_task_id),
+            None,
+        )
+        .await
+        .map_err(|e| format!("Failed to poll transcription task: {}", e))?;
+
+        // Extract poll response body
+        let poll_response_text = get_response_body_text(poll_response)
+            .await
+            .map_err(|e| format!("Failed to read poll response body: {}", e))?;
+
+        // Parse poll response JSON
+        transcription_response = from_str(&poll_response_text).map_err(|e| {
+            format!(
+                "Failed to parse poll response JSON: {}. Raw response: {}",
+                e, poll_response_text
+            )
+        })?;
+
+        // Check if transcription results are available
+        if !transcription_response.results.is_empty() {
+            break;
+        }
+
+        // Check for timeout
+        if polling_start.elapsed() >= POLLING_TIMEOUT {
+            return Err("Transcription task timed out after 30 seconds".to_string());
+        }
+
+        // Wait before next poll
+        tokio::time::sleep(POLLING_INTERVAL).await;
+    }

     Ok(transcription_response)
 }
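The polling flow assumes the server answers the initial POST with a task_id and serves results under /services/audio/task/{task_id}. A small sketch of the payload shape this code expects, written as a test against the struct above; field names come from the struct, the sample values are invented:

    #[cfg(test)]
    mod transcription_shape_tests {
        use super::*;

        #[test]
        fn parses_polled_task_payload() {
            // Invented sample; real payloads come from the Coco server.
            let payload = r#"{"task_id":"task-123","results":[{"text":"hello world"}]}"#;
            let parsed: TranscriptionResponse = from_str(payload).unwrap();
            assert_eq!(parsed.task_id, "task-123");
            assert_eq!(parsed.results.len(), 1);
        }
    }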


@@ -4,12 +4,12 @@ use std::collections::HashMap;
 use std::sync::Arc;
 use tauri::{AppHandle, Emitter, Runtime};
 use tokio::net::TcpStream;
-use tokio::sync::{mpsc, Mutex};
-use tokio_tungstenite::tungstenite::handshake::client::generate_key;
-use tokio_tungstenite::tungstenite::Message;
+use tokio::sync::{Mutex, mpsc};
 use tokio_tungstenite::MaybeTlsStream;
 use tokio_tungstenite::WebSocketStream;
-use tokio_tungstenite::{connect_async_tls_with_config, Connector};
+use tokio_tungstenite::tungstenite::Message;
+use tokio_tungstenite::tungstenite::handshake::client::generate_key;
+use tokio_tungstenite::{Connector, connect_async_tls_with_config};

 #[derive(Default)]
 pub struct WebSocketManager {
     connections: Arc<Mutex<HashMap<String, Arc<WebSocketInstance>>>>,
@@ -53,9 +53,11 @@ pub async fn connect_to_server<R: Runtime>(
     // Disconnect old connection first
     disconnect(client_id.clone(), state.clone()).await.ok();

-    let server = get_server_by_id(&id).ok_or(format!("Server with ID {} not found", id))?;
+    let server = get_server_by_id(&id)
+        .await
+        .ok_or(format!("Server with ID {} not found", id))?;
     let endpoint = convert_to_websocket(&server.endpoint)?;
-    let token = get_server_token(&id).await?.map(|t| t.access_token.clone());
+    let token = get_server_token(&id).await.map(|t| t.access_token.clone());

     let mut request =
         tokio_tungstenite::tungstenite::client::IntoClientRequest::into_client_request(&endpoint)
@@ -125,6 +127,7 @@ pub async fn connect_to_server<R: Runtime>(
                     let _ = app_handle_clone.emit(&format!("ws-message-{}", client_id_clone), text);
                 },
                 Some(Err(_)) | None => {
+                    log::debug!("WebSocket connection closed or error");
                     let _ = app_handle_clone.emit(&format!("ws-error-{}", client_id_clone), id.clone());
                     break;
                 }
@@ -132,7 +135,8 @@ pub async fn connect_to_server<R: Runtime>(
                 }
             }
             _ = cancel_rx.recv() => {
-                let _ = app_handle_clone.emit(&format!("ws-error-{}", client_id_clone), id.clone());
+                log::debug!("WebSocket connection cancelled");
+                let _ = app_handle_clone.emit(&format!("ws-cancel-{}", client_id_clone), id.clone());
                 break;
             }
         }

View File

@@ -1,3 +1,9 @@
use tauri::{App, WebviewWindow};
-pub fn platform(_app: &mut App, _main_window: WebviewWindow, _settings_window: WebviewWindow) {}
+pub fn platform(
+_app: &mut App,
+_main_window: WebviewWindow,
+_settings_window: WebviewWindow,
+_check_window: WebviewWindow,
+) {
+}

View File

@@ -1,6 +1,9 @@
-//credits to: https://github.com/ayangweb/ayangweb-EcoPaste/blob/169323dbe6365ffe4abb64d867439ed2ea84c6d1/src-tauri/src/core/setup/mac.rs
-use tauri::{ActivationPolicy, App, Emitter, EventTarget, WebviewWindow};
-use tauri_nspanel::{cocoa::appkit::NSWindowCollectionBehavior, panel_delegate, WebviewWindowExt};
+//! credits to: https://github.com/ayangweb/ayangweb-EcoPaste/blob/169323dbe6365ffe4abb64d867439ed2ea84c6d1/src-tauri/src/core/setup/mac.rs
+use cocoa::appkit::NSWindow;
+use tauri::Manager;
+use tauri::{App, AppHandle, Emitter, EventTarget, WebviewWindow};
+use tauri_nspanel::{WebviewWindowExt, cocoa::appkit::NSWindowCollectionBehavior, panel_delegate};
use crate::common::MAIN_WINDOW_LABEL;
@@ -12,9 +15,12 @@ const WINDOW_BLUR_EVENT: &str = "tauri://blur";
const WINDOW_MOVED_EVENT: &str = "tauri://move";
const WINDOW_RESIZED_EVENT: &str = "tauri://resize";
-pub fn platform(app: &mut App, main_window: WebviewWindow, _settings_window: WebviewWindow) {
-app.set_activation_policy(ActivationPolicy::Accessory);
+pub fn platform(
+_app: &mut App,
+main_window: WebviewWindow,
+_settings_window: WebviewWindow,
+_check_window: WebviewWindow,
+) {
// Convert ns_window to ns_panel
let panel = main_window.to_panel().unwrap();
@@ -26,7 +32,7 @@ pub fn platform(app: &mut App, main_window: WebviewWindow, _settings_window: Web
// Share the window across all desktop spaces and full screen
panel.set_collection_behaviour(
-NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces
+NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace
| NSWindowCollectionBehavior::NSWindowCollectionBehaviorStationary
| NSWindowCollectionBehavior::NSWindowCollectionBehaviorFullScreenAuxiliary,
);
@@ -75,3 +81,50 @@ pub fn platform(app: &mut App, main_window: WebviewWindow, _settings_window: Web
// Set the delegate object for the window to handle window events
panel.set_delegate(delegate);
}
/// Change the NS window collection behavior between `NSWindowCollectionBehaviorCanJoinAllSpaces`
/// and `NSWindowCollectionBehaviorMoveToActiveSpace` accordingly.
///
/// NOTE: this tauri command is not async because it has to run on the main
/// thread; calling `ns_window.setCollectionBehavior_(collection_behavior)` off the
/// main thread would lead to UB.
#[tauri::command]
pub(crate) fn toggle_move_to_active_space_attribute(tauri_app_handle: AppHandle) {
use cocoa::appkit::NSWindowCollectionBehavior;
use cocoa::base::id;
let main_window = tauri_app_handle
.get_webview_window(MAIN_WINDOW_LABEL)
.unwrap();
let ns_window = main_window.ns_window().unwrap() as id;
let mut collection_behavior = unsafe { ns_window.collectionBehavior() };
let join_all_spaces = collection_behavior
.contains(NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces);
let move_to_active_space = collection_behavior
.contains(NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace);
match (join_all_spaces, move_to_active_space) {
(true, false) => {
collection_behavior
.remove(NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces);
collection_behavior
.insert(NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace);
}
(false, true) => {
collection_behavior
.remove(NSWindowCollectionBehavior::NSWindowCollectionBehaviorMoveToActiveSpace);
collection_behavior
.insert(NSWindowCollectionBehavior::NSWindowCollectionBehaviorCanJoinAllSpaces);
}
_ => {
panic!(
"invalid NS window attribute, NSWindowCollectionBehaviorCanJoinAllSpaces is set [{}], NSWindowCollectionBehaviorMoveToActiveSpace is set [{}]",
join_all_spaces, move_to_active_space
);
}
}
unsafe {
ns_window.setCollectionBehavior_(collection_behavior);
}
}
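
How this command gets registered is not shown in this diff; a hedged sketch of the usual Tauri wiring, assuming it is simply appended to the crate's existing generate_handler! list:

// Hypothetical registration sketch -- the real builder lives elsewhere in the crate.
fn run_app() {
    tauri::Builder::default()
        .invoke_handler(tauri::generate_handler![
            toggle_move_to_active_space_attribute
        ])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}

The frontend would then trigger it by invoking the command name with no arguments; the AppHandle is injected by Tauri.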

View File

@@ -18,10 +18,20 @@ pub use windows::*;
#[cfg(target_os = "linux")]
pub use linux::*;
-pub fn default(app: &mut App, main_window: WebviewWindow, settings_window: WebviewWindow) {
+pub fn default(
+app: &mut App,
+main_window: WebviewWindow,
+settings_window: WebviewWindow,
+check_window: WebviewWindow,
+) {
// Development mode automatically opens the console: https://tauri.app/develop/debug
-#[cfg(all(dev, debug_assertions))]
+#[cfg(debug_assertions)]
main_window.open_devtools();
-platform(app, main_window.clone(), settings_window.clone());
+platform(
+app,
+main_window.clone(),
+settings_window.clone(),
+check_window.clone(),
+);
}

View File

@@ -1,3 +1,9 @@
use tauri::{App, WebviewWindow};
-pub fn platform(_app: &mut App, _main_window: WebviewWindow, _settings_window: WebviewWindow) {}
+pub fn platform(
+_app: &mut App,
+_main_window: WebviewWindow,
+_settings_window: WebviewWindow,
+_check_window: WebviewWindow,
+) {
+}

View File

@@ -1,5 +1,5 @@
-use crate::{hide_coco, show_coco, COCO_TAURI_STORE};
-use tauri::{async_runtime, App, AppHandle, Manager, Runtime};
+use crate::{COCO_TAURI_STORE, hide_coco, show_coco};
+use tauri::{App, AppHandle, Manager, Runtime, async_runtime};
use tauri_plugin_global_shortcut::{GlobalShortcutExt, Shortcut, ShortcutState};
use tauri_plugin_store::{JsonValue, StoreExt};
@@ -17,6 +17,7 @@ const DEFAULT_SHORTCUT: &str = "ctrl+shift+space";
/// Set up the shortcut upon app start.
pub fn enable_shortcut(app: &App) {
+log::trace!("setting up Coco hotkey");
let store = app
.store(COCO_TAURI_STORE)
.expect("creating a store should not fail");
@@ -43,6 +44,7 @@ pub fn enable_shortcut(app: &App) {
.expect("default shortcut should never be invalid");
_register_shortcut_upon_start(app, default_shortcut);
}
+log::trace!("Coco hotkey has been set");
}
/// Get the stored shortcut as a string, same as [`_get_shortcut()`], except that
@@ -97,7 +99,7 @@ fn _register_shortcut<R: Runtime>(app: &AppHandle<R>, shortcut: Shortcut) {
.on_shortcut(shortcut, move |app, scut, event| {
if scut == &shortcut {
dbg!("shortcut pressed");
-let main_window = app.get_window(MAIN_WINDOW_LABEL).unwrap();
+let main_window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
if let ShortcutState::Pressed = event.state() {
let app_handle = app.clone();
if main_window.is_visible().unwrap() {
@@ -126,7 +128,7 @@ fn _register_shortcut_upon_start(app: &App, shortcut: Shortcut) {
tauri_plugin_global_shortcut::Builder::new()
.with_handler(move |app, scut, event| {
if scut == &shortcut {
-let window = app.get_window(MAIN_WINDOW_LABEL).unwrap();
+let window = app.get_webview_window(MAIN_WINDOW_LABEL).unwrap();
if let ShortcutState::Pressed = event.state() {
let app_handle = app.clone();

View File

@@ -0,0 +1,62 @@
//! The app language configuration entry is persisted in the frontend code, but we
//! also need to access it on the backend.
//!
//! So we duplicate it here **in memory** and expose a setter method to the
//! frontend so that the value can be updated and stays up to date.
use function_name::named;
use tokio::sync::RwLock;
#[derive(Debug, Clone, Copy, PartialEq)]
#[allow(non_camel_case_types)]
pub(crate) enum Lang {
en_US,
zh_CN,
}
impl std::fmt::Display for Lang {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Lang::en_US => write!(f, "en_US"),
Lang::zh_CN => write!(f, "zh_CN"),
}
}
}
impl std::str::FromStr for Lang {
type Err = String;
fn from_str(s: &str) -> Result<Self, Self::Err> {
match s {
"en" => Ok(Lang::en_US),
"zh" => Ok(Lang::zh_CN),
_ => Err(format!("Invalid language: {}", s)),
}
}
}
/// Cache the language config in memory.
static APP_LANG: RwLock<Option<Lang>> = RwLock::const_new(None);
/// Frontend code uses this interface to update the in-memory cached `APP_LANG` config.
#[named]
#[tauri::command]
pub(crate) async fn update_app_lang(lang: String) {
let app_lang = lang.parse::<Lang>().unwrap_or_else(|e| {
panic!(
"frontend code passes an invalid argument [{}] to interface [{}], parsing error [{}]",
lang,
function_name!(),
e
)
});
let mut write_guard = APP_LANG.write().await;
*write_guard = Some(app_lang);
}
/// Helper getter method to handle the `None` case.
pub(crate) async fn get_app_lang() -> Lang {
let opt_lang = *APP_LANG.read().await;
opt_lang.expect("frontend code did not invoke [update_app_lang()] to set the APP_LANG")
}

src-tauri/src/util/file.rs (new file, 174 lines)
View File

@@ -0,0 +1,174 @@
#[derive(Debug, Clone, PartialEq, Copy)]
pub(crate) enum FileType {
Folder,
JPEGImage,
PNGImage,
PDFDocument,
PlainTextDocument,
MicrosoftWordDocument,
MicrosoftExcelSpreadsheet,
AudioFile,
VideoFile,
CHeaderFile,
TOMLDocument,
RustScript,
CSourceCode,
MarkdownDocument,
TerminalSettings,
ZipArchive,
Dmg,
Html,
Json,
Xml,
Yaml,
Css,
Vue,
React,
Sql,
Csv,
Javascript,
Lnk,
Typescript,
Python,
Java,
Golang,
Ruby,
Php,
Sass,
Sketch,
AdobeAi,
AdobePsd,
AdobePr,
AdobeAu,
AdobeAe,
AdobeLr,
AdobeXd,
AdobeFl,
AdobeId,
Svg,
Epub,
Unknown,
}
async fn get_file_type(path: &str) -> FileType {
let path = camino::Utf8Path::new(path);
// stat() is more precise than file extension, use it if possible.
if path.is_dir() {
return FileType::Folder;
}
let Some(ext) = path.extension() else {
return FileType::Unknown;
};
let ext = ext.to_lowercase();
match ext.as_str() {
"pdf" => FileType::PDFDocument,
"txt" | "text" => FileType::PlainTextDocument,
"doc" | "docx" => FileType::MicrosoftWordDocument,
"xls" | "xlsx" => FileType::MicrosoftExcelSpreadsheet,
"jpg" | "jpeg" => FileType::JPEGImage,
"png" => FileType::PNGImage,
"mp3" | "wav" | "flac" | "aac" | "ogg" | "m4a" => FileType::AudioFile,
"mp4" | "avi" | "mov" | "mkv" | "wmv" | "flv" | "webm" => FileType::VideoFile,
"h" | "hpp" => FileType::CHeaderFile,
"c" | "cpp" | "cc" | "cxx" => FileType::CSourceCode,
"toml" => FileType::TOMLDocument,
"rs" => FileType::RustScript,
"md" | "markdown" => FileType::MarkdownDocument,
"terminal" => FileType::TerminalSettings,
"zip" | "rar" | "7z" | "tar" | "gz" | "bz2" => FileType::ZipArchive,
"dmg" => FileType::Dmg,
"html" | "htm" => FileType::Html,
"json" => FileType::Json,
"xml" => FileType::Xml,
"yaml" | "yml" => FileType::Yaml,
"css" => FileType::Css,
"vue" => FileType::Vue,
"jsx" | "tsx" => FileType::React,
"sql" => FileType::Sql,
"csv" => FileType::Csv,
"js" | "mjs" => FileType::Javascript,
"ts" => FileType::Typescript,
"py" | "pyw" => FileType::Python,
"java" => FileType::Java,
"go" => FileType::Golang,
"rb" => FileType::Ruby,
"php" => FileType::Php,
"sass" | "scss" => FileType::Sass,
"sketch" => FileType::Sketch,
"ai" => FileType::AdobeAi,
"psd" => FileType::AdobePsd,
"prproj" => FileType::AdobePr,
"aup" | "aup3" => FileType::AdobeAu,
"aep" => FileType::AdobeAe,
"lrcat" => FileType::AdobeLr,
"xd" => FileType::AdobeXd,
"fla" => FileType::AdobeFl,
"indd" => FileType::AdobeId,
"svg" => FileType::Svg,
"epub" => FileType::Epub,
"lnk" => FileType::Lnk,
_ => FileType::Unknown,
}
}
fn type_to_icon(ty: FileType) -> &'static str {
match ty {
FileType::Folder => "font_file_folder",
FileType::JPEGImage => "font_file_image",
FileType::PNGImage => "font_file_image",
FileType::PDFDocument => "font_file_document_pdf",
FileType::PlainTextDocument => "font_file_txt",
FileType::MicrosoftWordDocument => "font_file_document_word",
FileType::MicrosoftExcelSpreadsheet => "font_file_spreadsheet_excel",
FileType::AudioFile => "font_file_audio",
FileType::VideoFile => "font_file_video",
FileType::CHeaderFile => "font_file_csource",
FileType::TOMLDocument => "font_file_toml",
FileType::RustScript => "font_file_rustscript1",
FileType::CSourceCode => "font_file_csource",
FileType::MarkdownDocument => "font_file_markdown",
FileType::TerminalSettings => "font_file_terminal1",
FileType::ZipArchive => "font_file_zip",
FileType::Dmg => "font_file_dmg",
FileType::Html => "font_file_html",
FileType::Json => "font_file_json",
FileType::Xml => "font_file_xml",
FileType::Yaml => "font_file_yaml",
FileType::Css => "font_file_css",
FileType::Vue => "font_file_vue",
FileType::React => "font_file_react",
FileType::Sql => "font_file_sql",
FileType::Csv => "font_file_csv",
FileType::Javascript => "font_file_javascript",
FileType::Lnk => "font_file_lnk",
FileType::Typescript => "font_file_typescript",
FileType::Python => "font_file_python",
FileType::Java => "font_file_java",
FileType::Golang => "font_file_golang",
FileType::Ruby => "font_file_ruby",
FileType::Php => "font_file_php",
FileType::Sass => "font_file_sass",
FileType::Sketch => "font_file_sketch",
FileType::AdobeAi => "font_file_adobe_ai",
FileType::AdobePsd => "font_file_adobe_psd",
FileType::AdobePr => "font_file_adobe_pr",
FileType::AdobeAu => "font_file_adobe_au",
FileType::AdobeAe => "font_file_adobe_ae",
FileType::AdobeLr => "font_file_adobe_lr",
FileType::AdobeXd => "font_file_adobe_xd",
FileType::AdobeFl => "font_file_adobe_fl",
FileType::AdobeId => "font_file_adobe_id",
FileType::Svg => "font_file_svg",
FileType::Epub => "font_file_epub",
FileType::Unknown => "font_file_unknown",
}
}
#[tauri::command]
pub(crate) async fn get_file_icon(path: String) -> &'static str {
let ty = get_file_type(path.as_str()).await;
type_to_icon(ty)
}
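
A short test-style sketch (not part of the diff) of how the mapping resolves extensions to icon names, assuming it lives in this module and tokio's test macro is available:

#[cfg(test)]
mod icon_sketch {
    use super::*;

    #[tokio::test]
    async fn extensions_resolve_to_expected_icons() {
        // Extension matching is case-insensitive and falls back to the generic icon.
        assert_eq!(get_file_icon("notes.md".to_string()).await, "font_file_markdown");
        assert_eq!(get_file_icon("photo.JPG".to_string()).await, "font_file_image");
        assert_eq!(get_file_icon("mystery.bin".to_string()).await, "font_file_unknown");
    }
}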

View File

@@ -1,10 +1,20 @@
pub(crate) mod app_lang;
pub(crate) mod file;
pub(crate) mod platform;
pub(crate) mod updater;
use std::{path::Path, process::Command};
use tauri::{AppHandle, Runtime};
use tauri_plugin_shell::ShellExt;
+/// We use this env variable to determine the DE on Linux.
+const XDG_CURRENT_DESKTOP: &str = "XDG_CURRENT_DESKTOP";
+#[derive(Debug, PartialEq)]
enum LinuxDesktopEnvironment {
Gnome,
Kde,
+Unsupported { xdg_current_desktop: String },
}
impl LinuxDesktopEnvironment {
@@ -30,6 +40,14 @@ impl LinuxDesktopEnvironment {
.arg(path)
.output()
.map_err(|e| e.to_string())?,
+Self::Unsupported {
+xdg_current_desktop,
+} => {
+return Err(format!(
+"Cannot open apps as this Linux desktop environment [{}] is not supported",
+xdg_current_desktop
+));
+}
};
if !cmd_output.status.success() {
@@ -44,20 +62,23 @@ impl LinuxDesktopEnvironment {
}
}
+/// None means that it is likely that we do not have a desktop environment.
fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
-let de_os_str = std::env::var_os("XDG_CURRENT_DESKTOP")?;
-let de_str = de_os_str
-.into_string()
-.expect("$XDG_CURRENT_DESKTOP should be UTF-8 encoded");
+let de_os_str = std::env::var_os(XDG_CURRENT_DESKTOP)?;
+let de_str = de_os_str.into_string().unwrap_or_else(|_os_string| {
+panic!("${} should be UTF-8 encoded", XDG_CURRENT_DESKTOP);
+});
let de = match de_str.as_str() {
"GNOME" => LinuxDesktopEnvironment::Gnome,
+// Ubuntu uses "ubuntu:GNOME" instead of just "GNOME", they really love
+// their distro name.
+"ubuntu:GNOME" => LinuxDesktopEnvironment::Gnome,
"KDE" => LinuxDesktopEnvironment::Kde,
-unsupported_de => unimplemented!(
-"This desktop environment [{}] has not been supported yet",
-unsupported_de
-),
+_ => LinuxDesktopEnvironment::Unsupported {
+xdg_current_desktop: de_str,
+},
};
Some(de)
@@ -67,13 +88,12 @@ fn get_linux_desktop_environment() -> Option<LinuxDesktopEnvironment> {
//
// tauri_plugin_shell::open() is deprecated, but we still use it.
#[allow(deprecated)]
-#[tauri::command]
pub async fn open<R: Runtime>(app_handle: AppHandle<R>, path: String) -> Result<(), String> {
if cfg!(target_os = "linux") {
let borrowed_path = Path::new(&path);
if let Some(file_extension) = borrowed_path.extension() {
if file_extension == "desktop" {
-let desktop_environment = get_linux_desktop_environment().expect("The Linux OS is running without a desktop, Coco could never run in such a environment");
+let desktop_environment = get_linux_desktop_environment().expect("The Linux OS is running without a desktop, Coco could never run in such an environment");
return desktop_environment.launch_app_via_desktop_file(path);
}
}
@@ -84,3 +104,55 @@ pub async fn open<R: Runtime>(app_handle: AppHandle<R>, path: String) -> Result<
.open(path, None)
.map_err(|e| e.to_string())
}
#[cfg(test)]
mod tests {
use super::*;
// This test modifies the env var XDG_CURRENT_DESKTOP, which is kinda unsafe,
// but since this is just a test, it is ok to do so.
#[test]
fn test_get_linux_desktop_environment() {
// SAFETY: Rust code won't modify/read XDG_CURRENT_DESKTOP concurrently, though we
// have no such guarantee from the underlying C code.
unsafe {
// Save the original value if it exists
let original_value = std::env::var_os(XDG_CURRENT_DESKTOP);
// Test when XDG_CURRENT_DESKTOP is not set
std::env::remove_var(XDG_CURRENT_DESKTOP);
assert!(get_linux_desktop_environment().is_none());
// Test GNOME
std::env::set_var(XDG_CURRENT_DESKTOP, "GNOME");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Gnome);
// Test ubuntu:GNOME
std::env::set_var(XDG_CURRENT_DESKTOP, "ubuntu:GNOME");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Gnome);
// Test KDE
std::env::set_var(XDG_CURRENT_DESKTOP, "KDE");
let result = get_linux_desktop_environment();
assert_eq!(result.unwrap(), LinuxDesktopEnvironment::Kde);
// Test unsupported desktop environment
std::env::set_var(XDG_CURRENT_DESKTOP, "XFCE");
let result = get_linux_desktop_environment();
assert_eq!(
result.unwrap(),
LinuxDesktopEnvironment::Unsupported {
xdg_current_desktop: "XFCE".into()
}
);
// Restore the original value
match original_value {
Some(value) => std::env::set_var(XDG_CURRENT_DESKTOP, value),
None => std::env::remove_var(XDG_CURRENT_DESKTOP),
}
}
}
}

View File

@@ -0,0 +1,34 @@
use derive_more::Display;
use serde::{Deserialize, Serialize};
use std::borrow::Cow;
#[derive(Debug, Deserialize, Serialize, Copy, Clone, Hash, PartialEq, Eq, Display)]
#[serde(rename_all(serialize = "lowercase", deserialize = "lowercase"))]
pub(crate) enum Platform {
#[display("macOS")]
Macos,
#[display("Linux")]
Linux,
#[display("windows")]
Windows,
}
impl Platform {
/// Helper function to determine the current platform.
pub(crate) fn current() -> Platform {
let os_str = std::env::consts::OS;
serde_plain::from_str(os_str).unwrap_or_else(|_e| {
panic!("std::env::consts::OS is [{}], which is not a valid value for [enum Platform], valid values: ['macos', 'linux', 'windows']", os_str)
})
}
/// Return the `X-OS-NAME` HTTP request header.
pub(crate) fn to_os_name_http_header_str(&self) -> Cow<'static, str> {
match self {
Self::Macos => Cow::Borrowed("macos"),
Self::Windows => Cow::Borrowed("windows"),
// For Linux, we need the actual distro `ID`, not just a "linux".
Self::Linux => Cow::Owned(sysinfo::System::distribution_id()),
}
}
}
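
A minimal sketch (not part of the diff) of what the X-OS-NAME value looks like per platform, assuming it sits next to the Platform enum:

#[cfg(test)]
mod platform_sketch {
    use super::*;

    #[test]
    fn current_platform_has_a_known_header_value() {
        let platform = Platform::current();
        let header = platform.to_os_name_http_header_str();
        match platform {
            Platform::Macos => assert_eq!(header, "macos"),
            Platform::Windows => assert_eq!(header, "windows"),
            // On Linux the header carries the distro ID, e.g. "ubuntu" or "fedora".
            Platform::Linux => assert!(!header.is_empty()),
        }
    }
}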

View File

@@ -0,0 +1,87 @@
use semver::Version;
use tauri_plugin_updater::RemoteRelease;
/// Helper function to extract the build number out of `version`.
///
/// If the version string is in the `x.y.z` format and does not include a build
/// number, we assume a build number of 0.
fn extract_build_number(version: &Version) -> u32 {
let pre = &version.pre;
if pre.is_empty() {
// A special value for versions that do not have a build number
0
} else {
let pre_str = pre.as_str();
let build_number_str = {
match pre_str.strip_prefix("SNAPSHOT-") {
Some(str) => str,
None => pre_str,
}
};
let build_number: u32 = build_number_str.parse().unwrap_or_else(|e| {
panic!(
"invalid build number, cannot parse [{}] to a valid build number, error [{}], version [{}]",
build_number_str, e, version
)
});
build_number
}
}
/// # Local version format
///
/// Packages built in our CI use the following format:
///
/// * `x.y.z-SNAPSHOT-<build number>`
/// * `x.y.z-<build number>`
///
/// If you build Coco from source, the version will be in the format `x.y.z`.
///
/// # Remote version format
///
/// `x.y.z-<build number>`
///
/// # How we compare versions
///
/// We compare versions based solely on the build number.
/// If the version string is in the `x.y.z` format and does not include a build number,
/// we assume a build number of 0. As a result, such versions are considered older
/// than any version with an explicit build number.
pub(crate) fn custom_version_comparator(local: Version, remote_release: RemoteRelease) -> bool {
let remote = remote_release.version;
let local_build_number = extract_build_number(&local);
let remote_build_number = extract_build_number(&remote);
let should_update = remote_build_number > local_build_number;
log::debug!(
"custom version comparator invoked, local version [{}], remote version [{}], should update [{}]",
local,
remote,
should_update
);
should_update
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_extract_build_number() {
// 0.6.0 => 0
let version = Version::parse("0.6.0").unwrap();
assert_eq!(extract_build_number(&version), 0);
// 0.6.0-2371 => 2371
let version = Version::parse("0.6.0-2371").unwrap();
assert_eq!(extract_build_number(&version), 2371);
// 0.6.0-SNAPSHOT-2371 => 2371
let version = Version::parse("0.6.0-SNAPSHOT-2371").unwrap();
assert_eq!(extract_build_number(&version), 2371);
}
}
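
A worked sketch (not part of the diff) of the consequence spelled out in the doc comment: a version built from source (x.y.z) carries build number 0, so any build with an explicit build number (2400 below is hypothetical) compares as newer:

#[cfg(test)]
mod comparator_sketch {
    use super::*;

    #[test]
    fn source_builds_lose_to_ci_builds() {
        let local = Version::parse("0.7.1").unwrap(); // built from source => build number 0
        let remote = Version::parse("0.7.1-2400").unwrap(); // hypothetical CI build
        assert!(extract_build_number(&remote) > extract_build_number(&local));
    }
}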

View File

@@ -1,4 +1,5 @@
{
+"identifier": "rs.coco.app",
"bundle": {
"macOS": {
"entitlements": "./Entitlements.plist",
@@ -7,4 +8,4 @@
}
}
}
}

View File

@@ -41,7 +41,9 @@
"title": "Coco AI Settings", "title": "Coco AI Settings",
"url": "/ui/settings", "url": "/ui/settings",
"width": 1000, "width": 1000,
"minWidth": 1000,
"height": 700, "height": 700,
"minHeight": 700,
"center": true, "center": true,
"transparent": true, "transparent": true,
"maximizable": false, "maximizable": false,
@@ -53,6 +55,26 @@
"effects": ["sidebar"], "effects": ["sidebar"],
"state": "active" "state": "active"
} }
},
{
"label": "check",
"title": "Coco AI Update",
"url": "/ui/check",
"width": 340,
"minWidth": 340,
"height": 260,
"minHeight": 260,
"center": false,
"transparent": true,
"maximizable": false,
"skipTaskbar": false,
"dragDropEnabled": false,
"hiddenTitle": true,
"visible": false,
"windowEffects": {
"effects": ["sidebar"],
"state": "active"
}
} }
], ],
"security": { "security": {
@@ -91,21 +113,7 @@
"icons/Square310x310Logo.png", "icons/Square310x310Logo.png",
"icons/StoreLogo.png" "icons/StoreLogo.png"
], ],
"macOS": { "resources": ["assets/**/*", "icons"]
"minimumSystemVersion": "12.0",
"hardenedRuntime": true,
"dmg": {
"appPosition": {
"x": 180,
"y": 180
},
"applicationFolderPosition": {
"x": 480,
"y": 180
}
}
},
"resources": ["assets", "icons"]
}, },
"plugins": { "plugins": {
"features": { "features": {

View File

@@ -0,0 +1,15 @@
{
"identifier": "rs.coco.app",
"bundle": {
"linux": {
"deb": {
"depends": ["gstreamer1.0-plugins-good"],
"desktopTemplate": "./Coco.desktop"
},
"rpm": {
"depends": ["gstreamer1-plugins-good"],
"desktopTemplate": "./Coco.desktop"
}
}
}
}

View File

@@ -0,0 +1,8 @@
{
"identifier": "rs.coco.app",
"bundle": {
"macOS": {
"entitlements": "./Entitlements.plist"
}
}
}

View File

@@ -96,7 +96,7 @@ export const Get = <T>(
export const Post = <T>(
url: string,
-data: IAnyObj,
+data: IAnyObj | undefined,
params: IAnyObj = {},
headers: IAnyObj = {}
): Promise<[any, FcResponse<T> | undefined]> => {

src/api/streamFetch.ts (new file, 63 lines)
View File

@@ -0,0 +1,63 @@
export async function streamPost({
url,
body,
queryParams,
headers,
onMessage,
onError,
}: {
url: string;
body: any;
queryParams?: Record<string, any>;
headers?: Record<string, string>;
onMessage: (chunk: string) => void;
onError?: (err: any) => void;
}) {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
let baseURL = appStore.state?.endpoint_http;
if (!baseURL || baseURL === "undefined") {
baseURL = "";
}
const headersStr = localStorage.getItem("headers") || "{}";
const headersStorage = JSON.parse(headersStr);
const query = new URLSearchParams(queryParams || {}).toString();
const fullUrl = `${baseURL}${url}?${query}`;
try {
const res = await fetch(fullUrl, {
method: "POST",
headers: {
"Content-Type": "application/json",
...(headersStorage),
...(headers || {}),
},
body: JSON.stringify(body),
});
if (!res.ok || !res.body) throw new Error("Stream failed");
const reader = res.body.getReader();
const decoder = new TextDecoder("utf-8");
let buffer = "";
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
const lines = buffer.split("\n");
for (let i = 0; i < lines.length - 1; i++) {
const line = lines[i].trim();
if (line) onMessage(line);
}
buffer = lines[lines.length - 1];
}
} catch (err) {
console.error("streamPost error:", err);
onError?.(err);
}
}

View File

@@ -1,133 +0,0 @@
import { fetch } from "@tauri-apps/plugin-http";
import { clientEnv } from "@/utils/env";
import { useLogStore } from "@/stores/logStore";
import { get_server_token } from "@/commands";
interface FetchRequestConfig {
url: string;
method?: "GET" | "POST" | "PUT" | "DELETE";
headers?: Record<string, string>;
body?: any;
timeout?: number;
parseAs?: "json" | "text" | "binary";
baseURL?: string;
}
interface FetchResponse<T = any> {
data: T;
status: number;
statusText: string;
headers: Headers;
}
const timeoutPromise = (ms: number) => {
return new Promise<never>((_, reject) =>
setTimeout(() => reject(new Error(`Request timed out after ${ms} ms`)), ms)
);
};
export const tauriFetch = async <T = any>({
url,
method = "GET",
headers = {},
body,
timeout = 30,
parseAs = "json",
baseURL = clientEnv.COCO_SERVER_URL
}: FetchRequestConfig): Promise<FetchResponse<T>> => {
const addLog = useLogStore.getState().addLog;
try {
const appStore = JSON.parse(localStorage.getItem("app-store") || "{}");
const connectStore = JSON.parse(localStorage.getItem("connect-store") || "{}");
console.log("baseURL", appStore.state?.endpoint_http)
baseURL = appStore.state?.endpoint_http || baseURL;
const authStore = JSON.parse(localStorage.getItem("auth-store") || "{}")
const auth = authStore?.state?.auth
console.log("auth", auth)
if (baseURL.endsWith("/")) {
baseURL = baseURL.slice(0, -1);
}
if (!url.startsWith("http://") && !url.startsWith("https://")) {
// If not, prepend the defaultPrefix
url = baseURL + url;
}
if (method !== "GET") {
headers["Content-Type"] = "application/json";
}
const server_id = connectStore.state?.currentService?.id || "default_coco_server"
const res: any = await get_server_token(server_id);
headers["X-API-TOKEN"] = headers["X-API-TOKEN"] || res?.access_token || undefined;
// debug API
const requestInfo = {
url,
method,
headers,
body,
timeout,
parseAs,
};
const fetchPromise = fetch(url, {
method,
headers,
body,
});
const response = await Promise.race([
fetchPromise,
timeoutPromise(timeout * 1000),
]);
const statusText = response.ok ? "OK" : "Error";
let data: any;
if (parseAs === "json") {
data = await response.json();
} else if (parseAs === "text") {
data = await response.text();
} else {
data = await response.arrayBuffer();
}
// debug API
const log = {
request: requestInfo,
response: {
data,
status: response.status,
statusText,
headers: response.headers,
},
};
addLog(log);
return log.response;
} catch (error) {
console.error("Request failed:", error);
// debug API
const log = {
request: {
url,
method,
headers,
body,
timeout,
parseAs,
},
error,
};
addLog(log);
throw error;
}
};

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

Binary files not shown: 10 icon image assets added (346 B - 1.3 KiB each).

Some files were not shown because too many files have changed in this diff.