Compare commits


43 Commits

Author SHA1 Message Date
DarkSky
86d65b2f64 feat(server): add image resize support (#14588)
fix #13842 


#### PR Dependency Tree


* **PR #14588** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Images are now processed natively and converted to WebP for smaller,
optimized files; Copilot and avatar attachments use the processed WebP
output.
* Avatar uploads accept BMP, GIF, JPEG, PNG, WebP (5MB max) and are
downscaled to a standard edge.

* **Error Messages / i18n**
  * Added localized error "Image format not supported: {format}".

* **Tests**
* Added end-to-end and unit tests for conversion, EXIF preservation, and
upload limits.

* **Chores**
  * Added native image-processing dependencies.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
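The upload policy summarized above (format allow-list, 5MB cap, longest-edge downscale) might be sketched as follows. This is an illustration only: every name, the error wording's placement, and the edge-size parameter are assumptions, not the server's actual API.

```typescript
// Hypothetical sketch of the avatar upload policy described above.
const ALLOWED_FORMATS = new Set(['bmp', 'gif', 'jpeg', 'png', 'webp']);
const MAX_AVATAR_BYTES = 5 * 1024 * 1024; // the 5MB cap from the summary

function validateAvatarUpload(format: string, sizeInBytes: number): void {
  const fmt = format.toLowerCase();
  if (!ALLOWED_FORMATS.has(fmt)) {
    // Mirrors the localized error mentioned in the summary.
    throw new Error(`Image format not supported: ${fmt}`);
  }
  if (sizeInBytes > MAX_AVATAR_BYTES) {
    throw new Error('Avatar exceeds the 5MB upload limit');
  }
}

// Downscale so the longest edge does not exceed a target size while
// preserving aspect ratio (the summary's "standard edge").
function targetDimensions(
  width: number,
  height: number,
  maxEdge: number
): { width: number; height: number } {
  const longest = Math.max(width, height);
  if (longest <= maxEdge) return { width, height };
  const ratio = maxEdge / longest;
  return {
    width: Math.round(width * ratio),
    height: Math.round(height * ratio),
  };
}
```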
2026-03-07 04:42:12 +08:00
DarkSky
f34e25e122 test: migrate test & utils (#14569)
#### PR Dependency Tree


* **PR #14569** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Chores**
* Upgraded development test tooling to Vitest v4 and added Playwright
browser test integration; normalized test configurations and CI shard
matrix.

* **Tests**
* Added a large suite of new integration tests covering editor flows
(edgeless, database, embeds, images, latex, code, clipboard,
multi-editor, presentation, undo/redo, etc.).
* Removed numerous end-to-end Playwright test suites across the same
feature areas.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-07 04:12:27 +08:00
DarkSky
b5d5b71f95 feat(server): improve markdown parse (#14580)
#### PR Dependency Tree


* **PR #14580** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Markdown conversion now reports lists of known-unsupported and unknown
block identifiers encountered during parsing, and separates them from
the main markdown output.

* **Bug Fixes**
  * Improved error handling and logging around markdown parsing.

* **Tests**
* Updated tests and snapshots to reflect the new block-list fields and
the adjusted markdown output.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
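The reported result shape might look like the sketch below: markdown output kept separate from the two block lists. All identifiers here are assumptions based on the summary, not the server's actual types.

```typescript
// Illustrative result shape: markdown output plus separate lists of
// known-unsupported and unknown block identifiers, as the summary describes.
interface MarkdownParseResult {
  markdown: string;
  unsupportedBlocks: string[]; // recognized, but not convertible to markdown
  unknownBlocks: string[]; // identifiers the parser has never seen
}

// Hypothetical flavour sets, purely for illustration.
const KNOWN_UNSUPPORTED = new Set(['affine:surface', 'affine:frame']);
const CONVERTIBLE = new Set(['affine:paragraph', 'affine:list', 'affine:code']);

function collectBlockIssues(
  blockFlavours: string[]
): Omit<MarkdownParseResult, 'markdown'> {
  const unsupportedBlocks: string[] = [];
  const unknownBlocks: string[] = [];
  for (const flavour of blockFlavours) {
    if (CONVERTIBLE.has(flavour)) continue;
    if (KNOWN_UNSUPPORTED.has(flavour)) unsupportedBlocks.push(flavour);
    else unknownBlocks.push(flavour);
  }
  return { unsupportedBlocks, unknownBlocks };
}
```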
2026-03-07 00:14:27 +08:00
renovate[bot]
09fa1a8e4e chore: bump up dompurify version to v3.3.2 [SECURITY] (#14581)
This PR contains the following updates:

| Package | Change |
[Age](https://docs.renovatebot.com/merge-confidence/) |
[Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [dompurify](https://redirect.github.com/cure53/DOMPurify) | [`3.3.0` →
`3.3.2`](https://renovatebot.com/diffs/npm/dompurify/3.3.0/3.3.2) |
![age](https://developer.mend.io/api/mc/badges/age/npm/dompurify/3.3.2?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/dompurify/3.3.0/3.3.2?slim=true)
|

### GitHub Vulnerability Alerts

#### [CVE-2026-0540](https://nvd.nist.gov/vuln/detail/CVE-2026-0540)

DOMPurify 3.1.3 through 3.3.1 and 2.5.3 through 2.5.8, fixed in 2.5.9
and 3.3.2, contain a cross-site scripting vulnerability that allows
attackers to bypass attribute sanitization by exploiting five missing
rawtext elements (noscript, xmp, noembed, noframes, iframe) in the
`SAFE_FOR_XML` regex. Attackers can include payloads like
`</noscript><img src=x onerror=alert(1)>` in attribute values to execute
JavaScript when sanitized output is placed inside these unprotected
rawtext contexts.
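To make the bypass mechanics concrete, the sketch below (plain TypeScript, no DOMPurify involved) shows why a surviving `</noscript>` sequence is dangerous once output is embedded in a rawtext context. It is simplified: in the advisory the payload travels inside an attribute value; here it is embedded directly to show the break-out.

```typescript
// Illustration of the rawtext-context problem described in the advisory.
// No sanitizer is used here; the point is the embedding step.
function embedInNoscript(sanitizedFragment: string): string {
  // An application template that wraps output in a rawtext context --
  // exactly the placement the CVE warns about.
  return `<noscript>${sanitizedFragment}</noscript>`;
}

// Payload from the advisory: if a sanitizer lets this sequence through,
// the closing tag terminates the rawtext context early and the
// <img onerror=...> becomes live markup.
const payload = '</noscript><img src=x onerror=alert(1)>';
const rendered = embedInNoscript(payload);
```

Per the advisory, upgrading to DOMPurify 3.3.2 (or 2.5.9 on the 2.x line) closes the gap.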

---

### Release Notes

<details>
<summary>cure53/DOMPurify (dompurify)</summary>

###
[`v3.3.2`](https://redirect.github.com/cure53/DOMPurify/releases/tag/3.3.2):
DOMPurify 3.3.2

[Compare
Source](https://redirect.github.com/cure53/DOMPurify/compare/3.3.1...3.3.2)

- Fixed a possible bypass caused by jsdom's faulty raw-text tag parsing,
thanks multiple reporters
- Fixed a prototype pollution issue when working with custom elements,
thanks [@&#8203;christos-eth](https://redirect.github.com/christos-eth)
- Fixed a lenient config parsing in `_isValidAttribute`, thanks
[@&#8203;christos-eth](https://redirect.github.com/christos-eth)
- Bumped and removed several dependencies, thanks
[@&#8203;Rotzbua](https://redirect.github.com/Rotzbua)
- Fixed the test suite after bumping dependencies, thanks
[@&#8203;Rotzbua](https://redirect.github.com/Rotzbua)

###
[`v3.3.1`](https://redirect.github.com/cure53/DOMPurify/releases/tag/3.3.1):
DOMPurify 3.3.1

[Compare
Source](https://redirect.github.com/cure53/DOMPurify/compare/3.3.0...3.3.1)

- Updated `ADD_FORBID_CONTENTS` setting to extend default list, thanks
[@&#8203;MariusRumpf](https://redirect.github.com/MariusRumpf)
- Updated the ESM import syntax to be more correct, thanks
[@&#8203;binhpv](https://redirect.github.com/binhpv)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My41Ni4wIiwidXBkYXRlZEluVmVyIjoiNDMuNTYuMCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-06 19:04:08 +08:00
congzhou09
c249011238 fix(editor): note-edgeless-block dimensions mismatch content at non-100% scale (#14577)
### Problem
●In edgeless mode, when the `note-edgeless-block` is scaled below 100%,
its outer dimensions become larger than its content region. This extra
invisible region blocks user interactions such as clicks and hovers on
editing elements underneath.

<img width="1060" height="541" alt="note-elem-block-click"
src="https://github.com/user-attachments/assets/860d7a4f-d159-437b-bbe8-4560e2463e3d"
/>

●The following video demonstrates this issue:


https://github.com/user-attachments/assets/3b719b25-0d7e-496b-9507-6aa65ed0a797


### Solution
●The root cause is that the `transform: scale(...)` CSS property (which
implements the scale) is currently applied to the **inner root element**
instead of the block itself, and the solution is to move this CSS
property to the proper place.
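A schematic model of the mismatch, under hypothetical names (the real fix moves a CSS `transform` between elements; this only models the resulting hit-test areas):

```typescript
// Before the fix: the scale transform sits on an inner element, so the
// host's hit-test area keeps the unscaled size and an invisible band
// around the content still intercepts clicks.
interface Box {
  width: number;
  height: number;
}

function hitAreaWithInnerScale(content: Box, _scale: number): Box {
  return { ...content }; // host geometry unaffected by the inner transform
}

// After the fix: the scale is applied to the host itself, so its
// hit-test area matches the visible content region.
function hitAreaWithHostScale(content: Box, scale: number): Box {
  return { width: content.width * scale, height: content.height * scale };
}
```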

### After
●The video below shows the behavior after this fix.



https://github.com/user-attachments/assets/e2dbd75d-c2ea-460d-90a1-5cc13e12d5b8



<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Refactor**
* Centralized CSS scaling for graphics and edgeless note blocks into
dedicated public methods; rendering now uses these methods instead of
inline transform calculations.
* **Tests**
* Updated end-to-end checks to read scale directly from the edgeless
note element and use a more flexible transform-matching pattern.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-06 19:03:36 +08:00
DarkSky
7f5f7e79df feat(server): refactor mcp (#14579)
#### PR Dependency Tree


* **PR #14579** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Full JSON-RPC MCP endpoint with batch requests, per-message
validation, method dispatch (initialize, ping, tools/list, tools/call)
and request cancellation
* Tool listing and execution with input validation, standardized
results, and improved error responses

* **Chores**
  * Removed an external protocol dependency
  * Bumped MCP server version to 1.0.1
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
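The dispatch flow summarized above (per-message validation, method dispatch, batch support) can be sketched minimally as follows. All shapes and handler bodies are illustrative, not the server's actual MCP code.

```typescript
// Minimal JSON-RPC 2.0 dispatch sketch with batch support.
interface RpcRequest {
  jsonrpc: '2.0';
  id?: number | string | null;
  method: string;
  params?: unknown;
}

interface RpcResponse {
  jsonrpc: '2.0';
  id: number | string | null;
  result?: unknown;
  error?: { code: number; message: string };
}

type Handler = (params: unknown) => unknown;

// Stub handlers for two of the methods the summary lists.
const handlers: Record<string, Handler> = {
  ping: () => ({}),
  'tools/list': () => ({ tools: [] }),
};

function dispatchOne(req: RpcRequest): RpcResponse {
  const handler = handlers[req.method];
  if (!handler) {
    // -32601 is the JSON-RPC 2.0 "method not found" code.
    return {
      jsonrpc: '2.0',
      id: req.id ?? null,
      error: { code: -32601, message: 'Method not found' },
    };
  }
  return { jsonrpc: '2.0', id: req.id ?? null, result: handler(req.params) };
}

// Batch requests: an array in, one response per message out.
function dispatch(body: RpcRequest | RpcRequest[]): RpcResponse | RpcResponse[] {
  return Array.isArray(body) ? body.map(dispatchOne) : dispatchOne(body);
}
```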
2026-03-06 06:35:34 +08:00
DarkSky
fff04395bc chore: update docs 2026-03-06 01:51:09 +08:00
renovate[bot]
bbc01533d7 chore: bump up multer version to v2.1.1 [SECURITY] (#14576)
This PR contains the following updates:

| Package | Change |
[Age](https://docs.renovatebot.com/merge-confidence/) |
[Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [multer](https://redirect.github.com/expressjs/multer) | [`2.1.0` →
`2.1.1`](https://renovatebot.com/diffs/npm/multer/2.1.0/2.1.1) |
![age](https://developer.mend.io/api/mc/badges/age/npm/multer/2.1.1?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/multer/2.1.0/2.1.1?slim=true)
|

### GitHub Vulnerability Alerts

####
[CVE-2026-2359](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-v52c-386h-88mc)

### Impact

A vulnerability in Multer versions < 2.1.0 allows an attacker to trigger
a Denial of Service (DoS) by dropping connection during file upload,
potentially causing resource exhaustion.

### Patches

Users should upgrade to `2.1.0`

### Workarounds

None

####
[CVE-2026-3304](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-xf7r-hgr6-v32p)

### Impact

A vulnerability in Multer versions < 2.1.0 allows an attacker to trigger
a Denial of Service (DoS) by sending malformed requests, potentially
causing resource exhaustion.

### Patches

Users should upgrade to `2.1.0`

### Workarounds

None

####
[CVE-2026-3520](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-5528-5vmv-3xc2)

### Impact

A vulnerability in Multer versions < 2.1.1 allows an attacker to trigger
a Denial of Service (DoS) by sending malformed requests, potentially
causing stack overflow.

### Patches

Users should upgrade to `2.1.1`

### Workarounds

None

### Resources

-
https://github.com/expressjs/multer/security/advisories/GHSA-5528-5vmv-3xc2
- https://www.cve.org/CVERecord?id=CVE-2026-3520
-
7e66481f8b
- https://cna.openjsf.org/security-advisories.html

---

### Release Notes

<details>
<summary>expressjs/multer (multer)</summary>

###
[`v2.1.1`](https://redirect.github.com/expressjs/multer/blob/HEAD/CHANGELOG.md#211)

[Compare
Source](https://redirect.github.com/expressjs/multer/compare/v2.1.0...v2.1.1)

- Fix [CVE-2026-3520](https://www.cve.org/CVERecord?id=CVE-2026-3520)
([GHSA-5528-5vmv-3xc2](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-5528-5vmv-3xc2))
- fix error/abort handling

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My41NS40IiwidXBkYXRlZEluVmVyIjoiNDMuNTUuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-06 01:04:48 +08:00
congzhou09
e31cca3354 fix(editor): non-canvas block size/position in embed-edgeless-doc at non-1 zoom (#14074)
### Problem
●Similar to
[PR#14015](https://github.com/toeverything/AFFiNE/pull/14015), the
container's own scaling factor (`viewScale`) was not taken into account.
This time the issue affects **non-canvas blocks** (e.g. `edgeless-note`,
`edgeless-image`, and any component extending `GfxBlockComponent`).
●The following image and video show the case when zoom is 0.5.

<img width="822" height="414" alt="image"
src="https://github.com/user-attachments/assets/cee1cb88-2764-443c-aa7a-0443308b0e29"
/>


https://github.com/user-attachments/assets/3c744579-16c4-4f10-b421-e0606da1269f

### Solution
●Incorporated `viewScale` into the CSS `translate` calculation for all
`GfxBlockComponent` instances.
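The corrected coordinate math can be sketched as below; the function name and signature are illustrative, not the actual `getCSSTransform()` implementation:

```typescript
// Screen offset = model position * zoom * container view scale.
// Omitting viewScale (the pre-fix behavior) misplaces non-canvas blocks
// whenever the container's own scale is not 1.
function cssTranslate(
  modelX: number,
  modelY: number,
  zoom: number,
  viewScale: number
): string {
  const x = modelX * zoom * viewScale;
  const y = modelY * zoom * viewScale;
  return `translate(${x}px, ${y}px)`;
}
```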

### Additional Improvement
●Minor refactor: the class returned by `toGfxBlockComponent()` now
reuses the original `getCSSTransform()` implementation from
`GfxBlockComponent.prototype` via `.call(this)`, eliminating duplicated
code.

### After
●The refined behavior is as follows.


https://github.com/user-attachments/assets/24de0429-63a3-45a7-9b31-d91a4279e233


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Refactor**
* Improved viewport scaling so visual transforms (translation and zoom)
correctly account for view scale, yielding more consistent rendering
during zoom and pan.
* Centralized transform calculation to a shared implementation, reducing
duplication and ensuring uniform behavior across views.

<sub>✏️ Tip: You can customize this high-level summary in your review
settings.</sub>
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

Co-authored-by: DarkSky <25152247+darkskygit@users.noreply.github.com>
2026-03-04 11:38:09 +00:00
DarkSky
11bc333714 Merge commit '99b07c2ee14dea1383ca6cd63633c3df542f2955' into canary 2026-03-04 01:18:30 +08:00
DarkSky
99b07c2ee1 fix: ios marketing version 2026-03-04 01:17:14 +08:00
renovate[bot]
fc9b99cd17 chore: bump up ava version to v7 (#14563)
This PR contains the following updates:

| Package | Change |
[Age](https://docs.renovatebot.com/merge-confidence/) |
[Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [ava](https://avajs.dev)
([source](https://redirect.github.com/avajs/ava)) | [`^6.4.1` →
`^7.0.0`](https://renovatebot.com/diffs/npm/ava/6.4.1/7.0.0) |
![age](https://developer.mend.io/api/mc/badges/age/npm/ava/7.0.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/ava/6.4.1/7.0.0?slim=true)
|
| [ava](https://avajs.dev)
([source](https://redirect.github.com/avajs/ava)) | [`^6.4.0` →
`^7.0.0`](https://renovatebot.com/diffs/npm/ava/6.4.1/7.0.0) |
![age](https://developer.mend.io/api/mc/badges/age/npm/ava/7.0.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/ava/6.4.1/7.0.0?slim=true)
|

---

### Release Notes

<details>
<summary>avajs/ava (ava)</summary>

###
[`v7.0.0`](https://redirect.github.com/avajs/ava/releases/tag/v7.0.0)

[Compare
Source](https://redirect.github.com/avajs/ava/compare/v6.4.1...v7.0.0)

##### What's Changed

- Replace `strip-ansi` with `node:util.stripVTControlCharacters` by
[@&#8203;fisker](https://redirect.github.com/fisker) in
[#&#8203;3403](https://redirect.github.com/avajs/ava/pull/3403)
- Remove support for Node.js 18 and 23; require 20.19 or newer, 22.20 or
newer, or 24.12 or newer; update dependencies including transitive `glob`
by [@&#8203;novemberborn](https://redirect.github.com/novemberborn) in
[#&#8203;3416](https://redirect.github.com/avajs/ava/pull/3416)

**Full Changelog**:
<https://github.com/avajs/ava/compare/v6.4.1...v7.0.0>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40OC4xIiwidXBkYXRlZEluVmVyIjoiNDMuNDguMSIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-03 17:47:47 +08:00
DarkSky
2137f68871 fix(server): normalize mail server name (#14564)
fix #14562 
fix #14226 
fix #14192


#### PR Dependency Tree


* **PR #14564** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* SMTP and fallback SMTP name now default to empty and will use the
system hostname when not set.
* HELO hostname resolution includes stricter normalization/validation
for more reliable mail handshakes.

* **Documentation**
* Updated admin and config descriptions to explain hostname/HELO
behavior and fallback.

* **Tests**
* Added tests covering hostname normalization and rejection of invalid
HELO values.

* **Chores**
  * Updated example env and ignore rules.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
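A hedged sketch of the fallback-and-validate behavior described above; the regex and all names are assumptions, and the server's actual normalization rules may differ.

```typescript
import { hostname } from 'node:os';

// Rough RFC 5321-style hostname shape: dot-separated labels of letters,
// digits, and interior hyphens.
const HOSTNAME_RE =
  /^[a-z0-9]([a-z0-9-]*[a-z0-9])?(\.[a-z0-9]([a-z0-9-]*[a-z0-9])?)*$/;

function resolveHeloName(configured: string | undefined): string {
  const candidate = (configured ?? '').trim().toLowerCase();
  // Empty config falls back to the system hostname, per the summary.
  const name = candidate || hostname().toLowerCase();
  if (!HOSTNAME_RE.test(name)) {
    throw new Error(`Invalid HELO hostname: ${name}`);
  }
  return name;
}
```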
2026-03-03 15:51:32 +08:00
renovate[bot]
75efa854bf chore: bump up apple-actions/import-codesign-certs action to v6 (#14561)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[apple-actions/import-codesign-certs](https://redirect.github.com/apple-actions/import-codesign-certs)
| action | major | `v5` → `v6` |

---

### Release Notes

<details>
<summary>apple-actions/import-codesign-certs
(apple-actions/import-codesign-certs)</summary>

###
[`v6`](https://redirect.github.com/apple-actions/import-codesign-certs/compare/v5...v6)

[Compare
Source](https://redirect.github.com/apple-actions/import-codesign-certs/compare/v5...v6)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-03 12:14:10 +08:00
renovate[bot]
c0139abf79 chore: bump up actions/setup-java action to v5 (#14554)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/setup-java](https://redirect.github.com/actions/setup-java) |
action | major | `v4` → `v5` |

---

### Release Notes

<details>
<summary>actions/setup-java (actions/setup-java)</summary>

###
[`v5`](https://redirect.github.com/actions/setup-java/compare/v4...v5)

[Compare
Source](https://redirect.github.com/actions/setup-java/compare/v4...v5)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 20:31:47 +08:00
renovate[bot]
5a38e765bd chore: bump up actions/setup-node action to v6 (#14555)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/setup-node](https://redirect.github.com/actions/setup-node) |
action | major | `v4` → `v6` |

---

### Release Notes

<details>
<summary>actions/setup-node (actions/setup-node)</summary>

###
[`v6`](https://redirect.github.com/actions/setup-node/compare/v5...v6)

[Compare
Source](https://redirect.github.com/actions/setup-node/compare/v5...v6)

###
[`v5`](https://redirect.github.com/actions/setup-node/compare/v4...v5)

[Compare
Source](https://redirect.github.com/actions/setup-node/compare/v4...v5)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 20:31:23 +08:00
renovate[bot]
d3dcdd47ee chore: bump up actions/setup-python action to v6 (#14556)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[actions/setup-python](https://redirect.github.com/actions/setup-python)
| action | major | `v5` → `v6` |

---

### Release Notes

<details>
<summary>actions/setup-python (actions/setup-python)</summary>

###
[`v6`](https://redirect.github.com/actions/setup-python/compare/v5...v6)

[Compare
Source](https://redirect.github.com/actions/setup-python/compare/v5...v6)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 20:31:01 +08:00
DarkSky
727c9d6d71 fix: server build env 2026-03-02 20:23:00 +08:00
DarkSky
274f491e49 fix: aarch64 build 2026-03-02 19:49:52 +08:00
congzhou09
478138493a fix(editor): invalid caret in note-edgeless-block on focus (#14229)
### Problem
●In edgeless mode, when starting to edit, `note-block` exhibits two
types of invalid caret behavior:
(1) **Title Region Misalignment**: Clicking on the title region
incorrectly places the caret in the first line of the note content
rather than in the title itself.
(2) **Vanishing Caret at Line End**: When clicking in the empty space
beyond the end of a text section, the caret appears momentarily at the
line's end but disappears immediately.
●The following video demonstrates these issues:


https://github.com/user-attachments/assets/db9c2c50-709f-4d32-912c-0f01841d2024


### Solution
●**Title Click Interception**: Added a check to determine if the click
coordinates fall in the title region. If so, the caret positioning is
now handled by a dedicated logic path. Otherwise, it falls back to the
existing note-content logic as before.
●**Range Normalization**: When the generated `range.startContainer` is
not a `TextNode`, find the most appropriate `TextNode` and update the
`range` accordingly.
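The range-normalization step can be modeled abstractly, using plain objects rather than the real DOM (all names are illustrative):

```typescript
// A minimal stand-in for a DOM subtree.
interface TreeNode {
  text?: string;
  children?: TreeNode[];
}

// Reverse depth-first search for the last non-empty text node, mirroring
// "place the caret at the last meaningful text position".
function findLastTextNode(node: TreeNode): TreeNode | null {
  if (node.children) {
    for (let i = node.children.length - 1; i >= 0; i--) {
      const found = findLastTextNode(node.children[i]);
      if (found) return found;
    }
  }
  return node.text && node.text.length > 0 ? node : null;
}
```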

### After
●The video below shows the behavior after this fix.


https://github.com/user-attachments/assets/b2f70b64-1fc6-4049-8379-8bcf3a488a05



<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Bug Fixes**
* Clicking a page block title no longer creates unwanted paragraphs and
reliably focuses the title.
* Paragraph creation now occurs only when needed and focus is applied
only after successful creation.
* Click coordinates are clamped to container bounds to prevent misplaced
cursors or focus.

* **Improvements**
* Caret normalization: clicks place the caret at the last meaningful
text position for consistent single-cursor behavior.

* **Tests**
  * Added end-to-end coverage for caret placement and focus transitions.
* New ratio-based click/double-click test utilities and a helper for
double-clicking note bodies.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-02 18:51:23 +08:00
renovate[bot]
5464d1a9ce chore: bump up multer version to v2.1.0 [SECURITY] (#14544)
This PR contains the following updates:

| Package | Change |
[Age](https://docs.renovatebot.com/merge-confidence/) |
[Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [multer](https://redirect.github.com/expressjs/multer) | [`2.0.2` →
`2.1.0`](https://renovatebot.com/diffs/npm/multer/2.0.2/2.1.0) |
![age](https://developer.mend.io/api/mc/badges/age/npm/multer/2.1.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/multer/2.0.2/2.1.0?slim=true)
|

### GitHub Vulnerability Alerts

####
[CVE-2026-2359](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-v52c-386h-88mc)

### Impact

A vulnerability in Multer versions < 2.1.0 allows an attacker to trigger
a Denial of Service (DoS) by dropping connection during file upload,
potentially causing resource exhaustion.

### Patches

Users should upgrade to `2.1.0`

### Workarounds

None

####
[CVE-2026-3304](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-xf7r-hgr6-v32p)

### Impact

A vulnerability in Multer versions < 2.1.0 allows an attacker to trigger
a Denial of Service (DoS) by sending malformed requests, potentially
causing resource exhaustion.

### Patches

Users should upgrade to `2.1.0`

### Workarounds

None

---

### Release Notes

<details>
<summary>expressjs/multer (multer)</summary>

###
[`v2.1.0`](https://redirect.github.com/expressjs/multer/blob/HEAD/CHANGELOG.md#210)

[Compare
Source](https://redirect.github.com/expressjs/multer/compare/v2.0.2...v2.1.0)

- Add `defParamCharset` option for UTF-8 filename support
([#&#8203;1210](https://redirect.github.com/expressjs/multer/pull/1210))
- Fix [CVE-2026-2359](https://www.cve.org/CVERecord?id=CVE-2026-2359)
([GHSA-v52c-386h-88mc](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-v52c-386h-88mc))
- Fix [CVE-2026-3304](https://www.cve.org/CVERecord?id=CVE-2026-3304)
([GHSA-xf7r-hgr6-v32p](https://redirect.github.com/expressjs/multer/security/advisories/GHSA-xf7r-hgr6-v32p))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 13:59:38 +08:00
renovate[bot]
4c40dcacd9 chore: bump up actions/setup-go action to v6 (#14553)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/setup-go](https://redirect.github.com/actions/setup-go) |
action | major | `v5` → `v6` |

---

### Release Notes

<details>
<summary>actions/setup-go (actions/setup-go)</summary>

### [`v6`](https://redirect.github.com/actions/setup-go/compare/v5...v6)

[Compare
Source](https://redirect.github.com/actions/setup-go/compare/v5...v6)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 13:59:16 +08:00
renovate[bot]
76d28aaa38 chore: bump up @types/supertest version to v7 (#14546)
This PR contains the following updates:

| Package | Change |
[Age](https://docs.renovatebot.com/merge-confidence/) |
[Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
|
[@types/supertest](https://redirect.github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/supertest)
([source](https://redirect.github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/supertest))
| [`^6.0.2` →
`^7.0.0`](https://renovatebot.com/diffs/npm/@types%2fsupertest/6.0.3/7.2.0)
|
![age](https://developer.mend.io/api/mc/badges/age/npm/@types%2fsupertest/7.2.0?slim=true)
|
![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@types%2fsupertest/6.0.3/7.2.0?slim=true)
|

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My40My4yIiwidXBkYXRlZEluVmVyIjoiNDMuNDMuMiIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 13:58:48 +08:00
renovate[bot]
86f48240ce chore: bump up actions/github-script action to v8 (#14551)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/github-script](https://redirect.github.com/actions/github-script) | action | major | `v7` → `v8` |

---

### Release Notes

<details>
<summary>actions/github-script (actions/github-script)</summary>

### [`v8.0.0`](https://redirect.github.com/actions/github-script/releases/tag/v8)

[Compare
Source](https://redirect.github.com/actions/github-script/compare/v7...v8)

##### What's Changed

- Update Node.js version support to 24.x by
[@&#8203;salmanmkc](https://redirect.github.com/salmanmkc) in
[#&#8203;637](https://redirect.github.com/actions/github-script/pull/637)
- README for updating actions/github-script from v7 to v8 by
[@&#8203;sneha-krip](https://redirect.github.com/sneha-krip) in
[#&#8203;653](https://redirect.github.com/actions/github-script/pull/653)

##### ⚠️ Minimum Compatible Runner Version

**v2.327.1**\
[Release
Notes](https://redirect.github.com/actions/runner/releases/tag/v2.327.1)

Make sure your runner is updated to this version or newer to use this
release.

##### New Contributors

- [@&#8203;salmanmkc](https://redirect.github.com/salmanmkc) made their
first contribution in
[#&#8203;637](https://redirect.github.com/actions/github-script/pull/637)
- [@&#8203;sneha-krip](https://redirect.github.com/sneha-krip) made
their first contribution in
[#&#8203;653](https://redirect.github.com/actions/github-script/pull/653)

**Full Changelog**:
<https://github.com/actions/github-script/compare/v7.1.0...v8.0.0>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 13:58:23 +08:00
DarkSky
c5d622531c feat: refactor copilot module (#14537) 2026-03-02 13:57:55 +08:00
renovate[bot]
60acd81d4b chore: bump up actions/labeler action to v6 (#14552)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/labeler](https://redirect.github.com/actions/labeler) | action | major | `v5` → `v6` |

---

### Release Notes

<details>
<summary>actions/labeler (actions/labeler)</summary>

### [`v6`](https://redirect.github.com/actions/labeler/compare/v5...v6)

[Compare
Source](https://redirect.github.com/actions/labeler/compare/v5...v6)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 11:59:25 +08:00
renovate[bot]
78f567a178 chore: bump up actions/checkout action to v6 (#14550)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/checkout](https://redirect.github.com/actions/checkout) | action | major | `v4` → `v6` |

---

### Release Notes

<details>
<summary>actions/checkout (actions/checkout)</summary>

### [`v6`](https://redirect.github.com/actions/checkout/compare/v5...v6)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v5...v6)

### [`v5`](https://redirect.github.com/actions/checkout/compare/v4...v5)

[Compare
Source](https://redirect.github.com/actions/checkout/compare/v4...v5)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 10:58:53 +08:00
renovate[bot]
784382cfb1 chore: bump up actions/cache action to v5 (#14549)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/cache](https://redirect.github.com/actions/cache) | action | major | `v4` → `v5` |

---

### Release Notes

<details>
<summary>actions/cache (actions/cache)</summary>

### [`v5`](https://redirect.github.com/actions/cache/compare/v4...v5)

[Compare
Source](https://redirect.github.com/actions/cache/compare/v4...v5)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 10:58:27 +08:00
renovate[bot]
342451be1b chore: bump up actions/attest-build-provenance action to v4 (#14547)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/attest-build-provenance](https://redirect.github.com/actions/attest-build-provenance) | action | major | `v2` → `v4` |

---

### Release Notes

<details>
<summary>actions/attest-build-provenance
(actions/attest-build-provenance)</summary>

### [`v4`](https://redirect.github.com/actions/attest-build-provenance/compare/v3...v4)

[Compare
Source](https://redirect.github.com/actions/attest-build-provenance/compare/v3...v4)

### [`v3`](https://redirect.github.com/actions/attest-build-provenance/compare/v2...v3)

[Compare
Source](https://redirect.github.com/actions/attest-build-provenance/compare/v2...v3)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 10:44:37 +08:00
renovate[bot]
2b6146727b chore: bump up RevenueCat/purchases-ios-spm version to from: "5.60.0" (#14545) 2026-03-02 08:48:25 +08:00
renovate[bot]
d5245a3273 chore: bump up Recouse/EventSource version to from: "0.1.7" (#14541)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [Recouse/EventSource](https://redirect.github.com/Recouse/EventSource) | patch | `from: "0.1.5"` → `from: "0.1.7"` |

---

### Release Notes

<details>
<summary>Recouse/EventSource (Recouse/EventSource)</summary>

### [`v0.1.7`](https://redirect.github.com/Recouse/EventSource/releases/tag/0.1.7)

[Compare
Source](https://redirect.github.com/Recouse/EventSource/compare/0.1.6...0.1.7)

#### What's Changed

- Separate timeout interval values for request and resource by
[@&#8203;Recouse](https://redirect.github.com/Recouse) in
[#&#8203;46](https://redirect.github.com/Recouse/EventSource/pull/46)

**Full Changelog**:
<https://github.com/Recouse/EventSource/compare/0.1.6...0.1.7>

### [`v0.1.6`](https://redirect.github.com/Recouse/EventSource/releases/tag/0.1.6)

[Compare
Source](https://redirect.github.com/Recouse/EventSource/compare/0.1.5...0.1.6)

#### What's Changed

- Fix visionOS availability error for split(by:) method by
[@&#8203;danielseidl](https://redirect.github.com/danielseidl) in
[#&#8203;45](https://redirect.github.com/Recouse/EventSource/pull/45)

#### New Contributors

- [@&#8203;danielseidl](https://redirect.github.com/danielseidl) made
their first contribution in
[#&#8203;45](https://redirect.github.com/Recouse/EventSource/pull/45)

**Full Changelog**:
<https://github.com/Recouse/EventSource/compare/0.1.5...0.1.6>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 06:26:20 +08:00
renovate[bot]
fff63562b1 chore: bump up Lakr233/MarkdownView version to from: "3.6.3" (#14540)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [Lakr233/MarkdownView](https://redirect.github.com/Lakr233/MarkdownView) | patch | `from: "3.6.2"` → `from: "3.6.3"` |

---

### Release Notes

<details>
<summary>Lakr233/MarkdownView (Lakr233/MarkdownView)</summary>

### [`v3.6.3`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.6.2...3.6.3)

[Compare
Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.6.2...3.6.3)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 06:26:03 +08:00
renovate[bot]
4136abdd97 chore: bump up rustc version to v1.93.1 (#14542)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [rustc](https://redirect.github.com/rust-lang/rust) | patch | `1.93.0` → `1.93.1` |

---

### Release Notes

<details>
<summary>rust-lang/rust (rustc)</summary>

### [`v1.93.1`](https://redirect.github.com/rust-lang/rust/blob/HEAD/RELEASES.md#Version-1931-2026-02-12)

[Compare
Source](https://redirect.github.com/rust-lang/rust/compare/1.93.0...1.93.1)

- [Don't try to recover keyword as non-keyword
identifier](https://redirect.github.com/rust-lang/rust/pull/150590),
fixing an ICE that especially [affected
rustfmt](https://redirect.github.com/rust-lang/rustfmt/issues/6739).
- [Fix `clippy::panicking_unwrap` false-positive on field access with
implicit
deref](https://redirect.github.com/rust-lang/rust-clippy/pull/16196).
- [Revert "Update wasm-related dependencies in
CI"](https://redirect.github.com/rust-lang/rust/pull/152259), fixing
file descriptor leaks on the `wasm32-wasip2` target.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-02 06:25:51 +08:00
renovate[bot]
e249e2e884 chore: bump up opentelemetry (#14543)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [@opentelemetry/exporter-prometheus](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/experimental/packages/opentelemetry-exporter-prometheus) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`^0.211.0` → `^0.212.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2fexporter-prometheus/0.211.0/0.212.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2fexporter-prometheus/0.212.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2fexporter-prometheus/0.211.0/0.212.0?slim=true) |
| [@opentelemetry/exporter-zipkin](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/packages/opentelemetry-exporter-zipkin) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`2.5.0` → `2.5.1`](https://renovatebot.com/diffs/npm/@opentelemetry%2fexporter-zipkin/2.5.0/2.5.1) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2fexporter-zipkin/2.5.1?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2fexporter-zipkin/2.5.0/2.5.1?slim=true) |
| [@opentelemetry/host-metrics](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/main/packages/host-metrics#readme) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/HEAD/packages/host-metrics)) | [`0.38.2` → `0.38.3`](https://renovatebot.com/diffs/npm/@opentelemetry%2fhost-metrics/0.38.2/0.38.3) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2fhost-metrics/0.38.3?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2fhost-metrics/0.38.2/0.38.3?slim=true) |
| [@opentelemetry/instrumentation](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/experimental/packages/opentelemetry-instrumentation) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`^0.211.0` → `^0.212.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2finstrumentation/0.211.0/0.212.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2finstrumentation/0.212.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2finstrumentation/0.211.0/0.212.0?slim=true) |
| [@opentelemetry/instrumentation-graphql](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/main/packages/instrumentation-graphql#readme) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/HEAD/packages/instrumentation-graphql)) | [`^0.58.0` → `^0.60.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2finstrumentation-graphql/0.58.0/0.60.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2finstrumentation-graphql/0.60.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2finstrumentation-graphql/0.58.0/0.60.0?slim=true) |
| [@opentelemetry/instrumentation-http](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/experimental/packages/opentelemetry-instrumentation-http) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`^0.211.0` → `^0.212.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2finstrumentation-http/0.211.0/0.212.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2finstrumentation-http/0.212.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2finstrumentation-http/0.211.0/0.212.0?slim=true) |
| [@opentelemetry/instrumentation-ioredis](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/main/packages/instrumentation-ioredis#readme) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/HEAD/packages/instrumentation-ioredis)) | [`^0.59.0` → `^0.60.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2finstrumentation-ioredis/0.59.0/0.60.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2finstrumentation-ioredis/0.60.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2finstrumentation-ioredis/0.59.0/0.60.0?slim=true) |
| [@opentelemetry/instrumentation-nestjs-core](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/main/packages/instrumentation-nestjs-core#readme) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/HEAD/packages/instrumentation-nestjs-core)) | [`^0.57.0` → `^0.58.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2finstrumentation-nestjs-core/0.57.0/0.58.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2finstrumentation-nestjs-core/0.58.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2finstrumentation-nestjs-core/0.57.0/0.58.0?slim=true) |
| [@opentelemetry/instrumentation-socket.io](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/main/packages/instrumentation-socket.io#readme) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/tree/HEAD/packages/instrumentation-socket.io)) | [`^0.57.0` → `^0.59.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2finstrumentation-socket.io/0.57.0/0.59.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2finstrumentation-socket.io/0.59.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2finstrumentation-socket.io/0.57.0/0.59.0?slim=true) |
| [@opentelemetry/sdk-metrics](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/packages/sdk-metrics) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`2.5.0` → `2.5.1`](https://renovatebot.com/diffs/npm/@opentelemetry%2fsdk-metrics/2.5.0/2.5.1) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2fsdk-metrics/2.5.1?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2fsdk-metrics/2.5.0/2.5.1?slim=true) |
| [@opentelemetry/sdk-node](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/experimental/packages/opentelemetry-sdk-node) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`^0.211.0` → `^0.212.0`](https://renovatebot.com/diffs/npm/@opentelemetry%2fsdk-node/0.211.0/0.212.0) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2fsdk-node/0.212.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2fsdk-node/0.211.0/0.212.0?slim=true) |
| [@opentelemetry/sdk-trace-node](https://redirect.github.com/open-telemetry/opentelemetry-js/tree/main/packages/opentelemetry-sdk-trace-node) ([source](https://redirect.github.com/open-telemetry/opentelemetry-js)) | [`2.5.0` → `2.5.1`](https://renovatebot.com/diffs/npm/@opentelemetry%2fsdk-trace-node/2.5.0/2.5.1) | ![age](https://developer.mend.io/api/mc/badges/age/npm/@opentelemetry%2fsdk-trace-node/2.5.1?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/@opentelemetry%2fsdk-trace-node/2.5.0/2.5.1?slim=true) |

---

### Release Notes

<details>
<summary>open-telemetry/opentelemetry-js
(@&#8203;opentelemetry/exporter-prometheus)</summary>

### [`v0.212.0`](38924cbff2...ad92be4c2c)

[Compare
Source](38924cbff2...ad92be4c2c)

</details>

<details>
<summary>open-telemetry/opentelemetry-js-contrib
(@&#8203;opentelemetry/host-metrics)</summary>

### [`v0.38.3`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/host-metrics/CHANGELOG.md#0383-2026-02-25)

[Compare
Source](7a5f3c0a09...630937db15)

##### Bug Fixes

- **instrumentation-host-metrics:** unpin and update to
systeminformation@^5.31.1
([#&#8203;3392](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3392))
([e4ffdb4](e4ffdb43d1))

</details>

<details>
<summary>open-telemetry/opentelemetry-js-contrib
(@&#8203;opentelemetry/instrumentation-graphql)</summary>

### [`v0.60.0`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/instrumentation-graphql/CHANGELOG.md#0600-2026-02-25)

[Compare
Source](0b33a118f2...630937db15)

##### Features

- **instrumentation-graphql:** add parent name in attributes of resolver
span
([#&#8203;3287](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3287))
([ea2a90a](ea2a90a87b))

### [`v0.59.0`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/instrumentation-graphql/CHANGELOG.md#0590-2026-02-16)

[Compare
Source](7a5f3c0a09...0b33a118f2)

##### Features

- **deps:** update deps matching "@&#8203;opentelemetry/\*"
([#&#8203;3383](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3383))
([d3ac785](d3ac7851d6))

</details>

<details>
<summary>open-telemetry/opentelemetry-js-contrib
(@&#8203;opentelemetry/instrumentation-ioredis)</summary>

### [`v0.60.0`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/instrumentation-ioredis/CHANGELOG.md#0600-2026-02-16)

[Compare
Source](7a5f3c0a09...0b33a118f2)

##### Features

- **deps:** update deps matching "@&#8203;opentelemetry/\*"
([#&#8203;3383](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3383))
([d3ac785](d3ac7851d6))

##### Dependencies

- The following workspace dependencies were updated
  - devDependencies
    - [@&#8203;opentelemetry/contrib-test-utils](https://redirect.github.com/opentelemetry/contrib-test-utils) bumped from ^0.58.0 to ^0.59.0

</details>

<details>
<summary>open-telemetry/opentelemetry-js-contrib
(@&#8203;opentelemetry/instrumentation-nestjs-core)</summary>

### [`v0.58.0`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/instrumentation-nestjs-core/CHANGELOG.md#0580-2026-02-16)

[Compare
Source](7a5f3c0a09...0b33a118f2)

##### Features

- **deps:** update deps matching "@&#8203;opentelemetry/\*"
([#&#8203;3383](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3383))
([d3ac785](d3ac7851d6))

</details>

<details>
<summary>open-telemetry/opentelemetry-js-contrib
(@&#8203;opentelemetry/instrumentation-socket.io)</summary>

### [`v0.59.0`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/instrumentation-socket.io/CHANGELOG.md#0590-2026-02-25)

[Compare
Source](0b33a118f2...630937db15)

##### Features

- **deps:** lock file maintenance
([#&#8203;3261](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3261))
([540926b](540926bffe))

### [`v0.58.0`](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/blob/HEAD/packages/instrumentation-socket.io/CHANGELOG.md#0580-2026-02-16)

[Compare
Source](7a5f3c0a09...0b33a118f2)

##### Features

- **deps:** update deps matching "@&#8203;opentelemetry/\*"
([#&#8203;3383](https://redirect.github.com/open-telemetry/opentelemetry-js-contrib/issues/3383))
([d3ac785](d3ac7851d6))

##### Dependencies

- The following workspace dependencies were updated
  - devDependencies
    - [@&#8203;opentelemetry/contrib-test-utils](https://redirect.github.com/opentelemetry/contrib-test-utils) bumped from ^0.58.0 to ^0.59.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get
[config
help](https://redirect.github.com/renovatebot/renovate/discussions) if
that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-01 21:19:09 +00:00
GyeongSu Han
2e95d91093 fix: restore Accept header and prevent encoded response in GitHub OAuth token request (regression from #14061) (#14538)
## 📝 Summary

This PR fixes a regression that caused the following error during GitHub
OAuth login:

> Unable to parse JSON response from
[https://github.com/login/oauth/access_token](https://github.com/login/oauth/access_token)

Related issue:
[https://github.com/toeverything/AFFiNE/issues/14334](https://github.com/toeverything/AFFiNE/issues/14334)
Regression introduced in:
[https://github.com/toeverything/AFFiNE/pull/14061](https://github.com/toeverything/AFFiNE/pull/14061)

---

## 🎯 Background

GitHub’s OAuth access token endpoint returns different response formats
depending on the request headers.

To receive a JSON response, the request must include:

```
Accept: application/json
```

If the `Accept` header is missing, GitHub responds with:

```
application/x-www-form-urlencoded
```

The current implementation assumes a JSON response and parses it
directly.
When a non-JSON response is returned, JSON parsing fails,
breaking the OAuth login flow.
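
A parser that tolerates both formats can be sketched as follows (a hypothetical helper, not the actual AFFiNE code; the `access_token` and `token_type` fields follow GitHub's documented token response):

```typescript
// Hypothetical helper: parse GitHub's token response in either format.
// Without `Accept: application/json`, GitHub returns an
// application/x-www-form-urlencoded body instead of JSON.
function parseTokenResponse(
  contentType: string,
  body: string
): Record<string, string> {
  if (contentType.includes('application/json')) {
    return JSON.parse(body);
  }
  // Fallback for bodies like "access_token=...&token_type=bearer"
  return Object.fromEntries(new URLSearchParams(body));
}
```

The fix in this PR avoids needing such a fallback by ensuring the JSON format is always requested.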

---

## 🔍 Traffic Analysis (tcpdump)

Network path:

affine-graphql → (HTTPS) → envoy → (HTTP, tcpdump) → envoy → GitHub

### Observed Request

```
POST /login/oauth/access_token HTTP/1.1
host: github-proxy.com
content-type: application/x-www-form-urlencoded
accept: */*
...
```

### Observed Response

```
HTTP/1.1 200 OK
date: Sat, 28 Feb 2026 14:47:43 GMT
content-type: application/x-www-form-urlencoded; charset=utf-8
...
```

The `Accept` header was `*/*` instead of `application/json`,
causing GitHub to return a form-urlencoded response.

---

## 🐛 Root Cause

PR #14061 introduced a side effect in the request configuration.

Although the `Accept` header was initially defined,
the request options were later overwritten by the `init` parameter.

Because `init.headers` replaced the previously defined headers object,
the required header was lost.

This resulted in:

* Missing `Accept: application/json`
* GitHub returning `application/x-www-form-urlencoded`
* JSON parsing failure
* OAuth login failure

---

## 🔧 Changes

### 1️⃣ Fix header overwrite order

* Process the incoming `init` parameter first
* Explicitly overwrite required headers afterward
* Ensure `Accept: application/json` is always enforced
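
The corrected ordering can be sketched as below (names are illustrative, not the actual server code): merge the caller's `init` first, then set the required headers last so they always win.

```typescript
// Hypothetical sketch of the fix: apply the incoming init first, then
// enforce required headers afterward so init.headers cannot drop them.
function buildTokenRequestInit(
  body: URLSearchParams,
  init: RequestInit = {}
): RequestInit {
  // Start from whatever the caller passed in init.headers...
  const headers = new Headers(init.headers);
  // ...then set required headers last, so they are always enforced.
  headers.set('Accept', 'application/json');
  headers.set('Content-Type', 'application/x-www-form-urlencoded');
  return { ...init, method: 'POST', body, headers };
}
```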

---

## 💥 Breaking Changes

None.

---

## 🧪 How to Test

1. Configure GitHub OAuth.
2. Attempt login via GitHub.
3. Verify that:

   * The request contains `Accept: application/json`
   * The response content-type is `application/json`
   * No JSON parsing error occurs
   * OAuth login completes successfully

---

## 📌 Notes

This change restores correct OAuth behavior and prevents regression
caused by header overwriting introduced in #14061.

The same header overwrite pattern identified in this issue
was also found in the calendar module and has been corrected there as
well.

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Refactor**
* Improved backend HTTP header handling for external integrations to
avoid unintended header overrides, ensuring content-type and encoding
hints are applied consistently and improving reliability of service
requests.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-03-01 14:32:28 +00:00
DarkSky
2cb171f553 feat: cleanup webpack deps (#14530)
#### PR Dependency Tree


* **PR #14530** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Breaking Changes**
  * Webpack bundler support removed from the build system
* Bundler selection parameter removed from build and development
commands

* **Refactor**
  * Build configuration consolidated to a single bundler approach
* Webpack-specific build paths and workflows removed; development server
simplified

* **Chores**
  * Removed webpack-related dev dependencies and tooling
  * Updated package build scripts for a unified bundle command

* **Dependencies**
* Upgraded Sentry packages across frontend packages
(react/electron/esbuild plugin)
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-28 00:24:08 +08:00
DarkSky
a4e2242b8d chore: bump playwright (#13947)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Chores**
* Updated Playwright test tooling to 1.58.2 across the repository and
test packages.

* **Tests**
* Improved end-to-end robustness: replaced fragile timing/coordinate
logic with element-based interactions, added polling/retry checks for
flaky asserts and async state, and simplified input/rename flows to
reduce test flakiness.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-27 22:56:43 +08:00
DarkSky
c90f173821 chore: bump deps (#14526)
#### PR Dependency Tree


* **PR #14526** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Chores**
* Updated Storybook component development tooling to version 10.2.13 for
improved stability and performance
  * Removed Chromatic integration from the component preview system

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-27 20:17:06 +08:00
DarkSky
e1e0ac2345 chore: cleanup deps (#14525)
#### PR Dependency Tree


* **PR #14525** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Chores**
  * Removed an unused development dependency.
* Updated dotLottie/Lottie-related dependency versions across packages
and replaced a removed player dependency with the new package.

* **Refactor**
* AI animated icons now re-export from a shared component and are loaded
only in the browser, reducing upfront bundle weight and centralizing
icon assets.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-27 11:56:54 +08:00
DarkSky
bdccf4e9fd fix: typo 2026-02-27 10:20:35 +08:00
DarkSky
11cf1928b5 fix(server): transaction error (#14518)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Events can be dispatched in a detached context to avoid inheriting the
current transaction.

* **Bug Fixes**
* Improved resilience and error handling for event processing (graceful
handling of deleted workspaces and ignorable DB errors).
  * More reliable owner assignment flow when changing document owners.

* **Tests**
  * Added tests for doc content staleness with deleted workspaces.
  * Added permission event tests for missing workspace/editor scenarios.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-02-26 19:53:22 +08:00
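Dispatching an event "in a detached context", as this fix introduces, generally means escaping whatever transaction handle the current async context carries so the handler's own DB work does not join (and cannot poison) the caller's transaction. A hedged sketch of the idea using Node's `AsyncLocalStorage` (the server's actual mechanism and names may differ):

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

// Hypothetical transaction context carried across awaits.
const txContext = new AsyncLocalStorage<{ txId: string }>();

// Run a handler as if no transaction were active.
function dispatchDetached<T>(handler: () => T): T {
  // exit() invokes the callback outside any store set by run()
  return txContext.exit(handler);
}

let inside: string | undefined;
let detached: string | undefined;
txContext.run({ txId: 'tx-1' }, () => {
  inside = txContext.getStore()?.txId;                            // 'tx-1'
  detached = dispatchDetached(() => txContext.getStore()?.txId);  // undefined
});
```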
donqu1xotevincent
5215c73166 chore(ios): update description (#14522) 2026-02-26 19:49:50 +08:00
DarkSky
895e774569 fix: static file handle & ws connect 2026-02-25 11:41:42 +08:00
262 changed files with 10182 additions and 7827 deletions

View File

@@ -197,8 +197,8 @@
"properties": {
"SMTP.name": {
"type": "string",
"description": "Name of the email server (e.g. your domain name)\n@default \"AFFiNE Server\"\n@environment `MAILER_SERVERNAME`",
"default": "AFFiNE Server"
"description": "Hostname used for SMTP HELO/EHLO (e.g. mail.example.com). Leave empty to use the system hostname.\n@default \"\"\n@environment `MAILER_SERVERNAME`",
"default": ""
},
"SMTP.host": {
"type": "string",
@@ -237,8 +237,8 @@
},
"fallbackSMTP.name": {
"type": "string",
"description": "Name of the fallback email server (e.g. your domain name)\n@default \"AFFiNE Server\"",
"default": "AFFiNE Server"
"description": "Hostname used for fallback SMTP HELO/EHLO (e.g. mail.example.com). Leave empty to use the system hostname.\n@default \"\"",
"default": ""
},
"fallbackSMTP.host": {
"type": "string",
@@ -988,6 +988,16 @@
}
}
},
"providers.profiles": {
"type": "array",
"description": "The profile list for copilot providers.\n@default []",
"default": []
},
"providers.defaults": {
"type": "object",
"description": "The default provider ids for model output types and global fallback.\n@default {}",
"default": {}
},
"providers.openai": {
"type": "object",
"description": "The config for the openai provider.\n@default {\"apiKey\":\"\",\"baseURL\":\"https://api.openai.com/v1\"}\n@link https://github.com/openai/openai-node",

View File

@@ -50,8 +50,14 @@ runs:
# https://github.com/tree-sitter/tree-sitter/issues/4186
# pass -D_BSD_SOURCE to clang to fix the tree-sitter build issue
run: |
echo "CC=clang -D_BSD_SOURCE" >> "$GITHUB_ENV"
echo "TARGET_CC=clang -D_BSD_SOURCE" >> "$GITHUB_ENV"
if [[ "${{ inputs.target }}" == "aarch64-unknown-linux-gnu" ]]; then
# napi cross-toolchain 1.0.3 headers miss AT_HWCAP2 in elf.h
echo "CC=clang -D_BSD_SOURCE -DAT_HWCAP2=26" >> "$GITHUB_ENV"
echo "TARGET_CC=clang -D_BSD_SOURCE -DAT_HWCAP2=26" >> "$GITHUB_ENV"
else
echo "CC=clang -D_BSD_SOURCE" >> "$GITHUB_ENV"
echo "TARGET_CC=clang -D_BSD_SOURCE" >> "$GITHUB_ENV"
fi
- name: Cache cargo
uses: Swatinem/rust-cache@v2

View File

@@ -53,7 +53,7 @@ runs:
fi
- name: Setup Node.js
uses: actions/setup-node@v4
uses: actions/setup-node@v6
with:
node-version-file: '.nvmrc'
registry-url: https://npm.pkg.github.com
@@ -93,7 +93,7 @@ runs:
run: node -e "const p = $(yarn config cacheFolder --json).effective; console.log('yarn_global_cache=' + p)" >> $GITHUB_OUTPUT
- name: Cache non-full yarn cache on Linux
uses: actions/cache@v4
uses: actions/cache@v5
if: ${{ inputs.full-cache != 'true' && runner.os == 'Linux' }}
with:
path: |
@@ -105,7 +105,7 @@ runs:
# and the decompression performance on Windows is very terrible
# so we reduce the number of cached files on non-Linux systems by remove node_modules from cache path.
- name: Cache non-full yarn cache on non-Linux
uses: actions/cache@v4
uses: actions/cache@v5
if: ${{ inputs.full-cache != 'true' && runner.os != 'Linux' }}
with:
path: |
@@ -113,7 +113,7 @@ runs:
key: node_modules-cache-${{ github.job }}-${{ runner.os }}-${{ runner.arch }}-${{ steps.system-info.outputs.name }}-${{ steps.system-info.outputs.release }}-${{ steps.system-info.outputs.version }}
- name: Cache full yarn cache on Linux
uses: actions/cache@v4
uses: actions/cache@v5
if: ${{ inputs.full-cache == 'true' && runner.os == 'Linux' }}
with:
path: |
@@ -122,7 +122,7 @@ runs:
key: node_modules-cache-full-${{ runner.os }}-${{ runner.arch }}-${{ steps.system-info.outputs.name }}-${{ steps.system-info.outputs.release }}-${{ steps.system-info.outputs.version }}
- name: Cache full yarn cache on non-Linux
uses: actions/cache@v4
uses: actions/cache@v5
if: ${{ inputs.full-cache == 'true' && runner.os != 'Linux' }}
with:
path: |
@@ -154,7 +154,7 @@ runs:
# Note: Playwright's cache directory is hard coded because that's what it
# says to do in the docs. There doesn't appear to be a command that prints
# it out for us.
- uses: actions/cache@v4
- uses: actions/cache@v5
id: playwright-cache
if: ${{ inputs.playwright-install == 'true' }}
with:
@@ -189,7 +189,7 @@ runs:
run: |
echo "version=$(yarn why --json electron | grep -h 'workspace:.' | jq --raw-output '.children[].locator' | sed -e 's/@playwright\/test@.*://' | head -n 1)" >> $GITHUB_OUTPUT
- uses: actions/cache@v4
- uses: actions/cache@v5
id: electron-cache
if: ${{ inputs.electron-install == 'true' }}
with:

View File

@@ -13,5 +13,5 @@ jobs:
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/labeler@v5
- uses: actions/checkout@v6
- uses: actions/labeler@v6

View File

@@ -24,7 +24,7 @@ jobs:
runs-on: ubuntu-latest
environment: ${{ inputs.build-type }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -57,7 +57,7 @@ jobs:
runs-on: ubuntu-latest
environment: ${{ inputs.build-type }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -89,7 +89,7 @@ jobs:
runs-on: ubuntu-latest
environment: ${{ inputs.build-type }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -118,7 +118,7 @@ jobs:
build-server-native:
name: Build Server native - ${{ matrix.targets.name }}
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
environment: ${{ inputs.build-type }}
strategy:
fail-fast: false
@@ -132,7 +132,7 @@ jobs:
file: server-native.armv7.node
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -166,7 +166,7 @@ jobs:
needs:
- build-server-native
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -202,7 +202,7 @@ jobs:
- build-mobile
- build-admin
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Download server dist
uses: actions/download-artifact@v4
with:
@@ -222,7 +222,7 @@ jobs:
# setup node without cache configuration
# Prisma cache is not compatible with docker build cache
- name: Setup Node.js
uses: actions/setup-node@v4
uses: actions/setup-node@v6
with:
node-version-file: '.nvmrc'
registry-url: https://npm.pkg.github.com

View File

@@ -46,7 +46,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@v6
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
@@ -67,9 +67,9 @@ jobs:
name: Lint
runs-on: ubuntu-24.04-arm
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Go (for actionlint)
uses: actions/setup-go@v5
uses: actions/setup-go@v6
with:
go-version: 'stable'
- name: Install actionlint
@@ -111,7 +111,7 @@ jobs:
env:
NODE_OPTIONS: --max-old-space-size=14384
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -138,7 +138,7 @@ jobs:
outputs:
run-rust: ${{ steps.rust-filter.outputs.rust }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: dorny/paths-filter@v3
id: rust-filter
@@ -159,7 +159,7 @@ jobs:
needs:
- rust-test-filter
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: ./.github/actions/build-rust
with:
target: x86_64-unknown-linux-gnu
@@ -182,7 +182,7 @@ jobs:
needs:
- build-server-native
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -212,7 +212,7 @@ jobs:
name: Check yarn binary
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Run check
run: |
set -euo pipefail
@@ -226,9 +226,9 @@ jobs:
strategy:
fail-fast: false
matrix:
shard: [1, 2]
shard: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -256,7 +256,7 @@ jobs:
name: E2E BlockSuite Cross Browser Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -282,52 +282,6 @@ jobs:
path: ./test-results
if-no-files-found: ignore
bundler-matrix:
name: Bundler Matrix (${{ matrix.bundler }})
runs-on: ubuntu-24.04-arm
strategy:
fail-fast: false
matrix:
bundler: [webpack, rspack]
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
playwright-install: false
electron-install: false
full-cache: true
- name: Run frontend build matrix
env:
AFFINE_BUNDLER: ${{ matrix.bundler }}
run: |
set -euo pipefail
packages=(
"@affine/web"
"@affine/mobile"
"@affine/ios"
"@affine/android"
"@affine/admin"
"@affine/electron-renderer"
)
summary="test-results-bundler-${AFFINE_BUNDLER}.txt"
: > "$summary"
for pkg in "${packages[@]}"; do
start=$(date +%s)
yarn affine "$pkg" build
end=$(date +%s)
echo "${pkg},$((end-start))" >> "$summary"
done
- name: Upload bundler timing
if: always()
uses: actions/upload-artifact@v4
with:
name: test-results-bundler-${{ matrix.bundler }}
path: ./test-results-bundler-${{ matrix.bundler }}.txt
if-no-files-found: ignore
e2e-test:
name: E2E Test
runs-on: ubuntu-24.04-arm
@@ -340,7 +294,7 @@ jobs:
matrix:
shard: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -372,7 +326,7 @@ jobs:
matrix:
shard: [1, 2]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -402,9 +356,9 @@ jobs:
strategy:
fail-fast: false
matrix:
shard: [1, 2, 3]
shard: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -437,7 +391,7 @@ jobs:
env:
CARGO_PROFILE_RELEASE_DEBUG: '1'
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -476,7 +430,7 @@ jobs:
- { os: macos-latest, target: aarch64-apple-darwin }
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -517,7 +471,7 @@ jobs:
- { os: windows-latest, target: aarch64-pc-windows-msvc }
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: samypr100/setup-dev-drive@v3
with:
workspace-copy: true
@@ -557,7 +511,7 @@ jobs:
env:
CARGO_PROFILE_RELEASE_DEBUG: '1'
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -580,7 +534,7 @@ jobs:
name: Build @affine/electron renderer
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -607,7 +561,7 @@ jobs:
needs:
- build-native-linux
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -661,7 +615,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -742,7 +696,7 @@ jobs:
stack-version: 9.0.1
security-enabled: false
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -805,7 +759,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -846,7 +800,7 @@ jobs:
CARGO_TERM_COLOR: always
MIRIFLAGS: -Zmiri-backtrace=full -Zmiri-tree-borrows
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
@@ -874,7 +828,7 @@ jobs:
RUST_BACKTRACE: full
CARGO_TERM_COLOR: always
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
@@ -898,7 +852,7 @@ jobs:
env:
CARGO_TERM_COLOR: always
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
@@ -937,7 +891,7 @@ jobs:
env:
CARGO_TERM_COLOR: always
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Rust
uses: ./.github/actions/build-rust
with:
@@ -960,7 +914,7 @@ jobs:
run-api: ${{ steps.decision.outputs.run_api }}
run-e2e: ${{ steps.decision.outputs.run_e2e }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: dorny/paths-filter@v3
id: copilot-filter
@@ -1029,7 +983,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -1102,7 +1056,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -1185,7 +1139,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -1266,7 +1220,7 @@ jobs:
test: true,
}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
timeout-minutes: 10

View File

@@ -10,7 +10,7 @@ jobs:
env:
CARGO_PROFILE_RELEASE_DEBUG: '1'
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
with:
@@ -64,7 +64,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -134,7 +134,7 @@ jobs:
ports:
- 9308:9308
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: ./.github/actions/setup-node
@@ -167,7 +167,7 @@ jobs:
runs-on: ubuntu-latest
name: Post test result message
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
with:
fetch-depth: 0
- name: Setup Node.js

View File

@@ -18,9 +18,9 @@ jobs:
runs-on: ubuntu-latest
if: ${{ github.event.action != 'edited' || github.event.changes.title != null }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Node.js
uses: actions/setup-node@v4
uses: actions/setup-node@v6
with:
cache: 'yarn'
node-version-file: '.nvmrc'

View File

@@ -35,7 +35,7 @@ jobs:
- build-images
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Deploy to ${{ inputs.build-type }}
uses: ./.github/actions/deploy
with:

View File

@@ -69,7 +69,7 @@ jobs:
SENTRY_DSN: ${{ secrets.SENTRY_DSN }}
SENTRY_RELEASE: ${{ inputs.app_version }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
@@ -101,7 +101,7 @@ jobs:
- name: Signing By Apple Developer ID
if: ${{ inputs.platform == 'darwin' && inputs.apple_codesign }}
uses: apple-actions/import-codesign-certs@v5
uses: apple-actions/import-codesign-certs@v6
with:
p12-file-base64: ${{ secrets.CERTIFICATES_P12 }}
p12-password: ${{ secrets.CERTIFICATES_P12_PASSWORD }}
@@ -178,14 +178,14 @@ jobs:
mv packages/frontend/apps/electron/out/*/make/deb/${{ inputs.arch }}/*.deb ./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-linux-${{ inputs.arch }}.deb
mv packages/frontend/apps/electron/out/*/make/flatpak/*/*.flatpak ./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-linux-${{ inputs.arch }}.flatpak
- uses: actions/attest-build-provenance@v2
- uses: actions/attest-build-provenance@v4
if: ${{ inputs.platform == 'darwin' }}
with:
subject-path: |
./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-macos-${{ inputs.arch }}.zip
./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-macos-${{ inputs.arch }}.dmg
- uses: actions/attest-build-provenance@v2
- uses: actions/attest-build-provenance@v4
if: ${{ inputs.platform == 'linux' }}
with:
subject-path: |

View File

@@ -48,7 +48,7 @@ jobs:
runs-on: ubuntu-latest
environment: ${{ inputs.build-type }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -187,7 +187,7 @@ jobs:
FILES_TO_BE_SIGNED_x64: ${{ steps.get_files_to_be_signed.outputs.FILES_TO_BE_SIGNED_x64 }}
FILES_TO_BE_SIGNED_arm64: ${{ steps.get_files_to_be_signed.outputs.FILES_TO_BE_SIGNED_arm64 }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -344,7 +344,7 @@ jobs:
mv packages/frontend/apps/electron/out/*/make/squirrel.windows/${{ matrix.spec.arch }}/*.exe ./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-windows-${{ matrix.spec.arch }}.exe
mv packages/frontend/apps/electron/out/*/make/nsis.windows/${{ matrix.spec.arch }}/*.exe ./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-windows-${{ matrix.spec.arch }}.nsis.exe
- uses: actions/attest-build-provenance@v2
- uses: actions/attest-build-provenance@v4
with:
subject-path: |
./builds/affine-${{ env.RELEASE_VERSION }}-${{ env.BUILD_TYPE }}-windows-${{ matrix.spec.arch }}.zip
@@ -369,7 +369,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Download Artifacts (macos-x64)
uses: actions/download-artifact@v4
with:
@@ -395,7 +395,7 @@ jobs:
with:
name: affine-linux-x64-builds
path: ./release
- uses: actions/setup-node@v4
- uses: actions/setup-node@v6
with:
node-version: 20
- name: Copy Selfhost Release Files

View File

@@ -26,7 +26,7 @@ jobs:
runs-on: ubuntu-latest
environment: ${{ inputs.build-type }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -54,7 +54,7 @@ jobs:
build-android-web:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -83,7 +83,7 @@ jobs:
needs:
- build-ios-web
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -114,7 +114,7 @@ jobs:
- name: Cap sync
run: yarn workspace @affine/ios sync
- name: Signing By Apple Developer ID
uses: apple-actions/import-codesign-certs@v5
uses: apple-actions/import-codesign-certs@v6
id: import-codesign-certs
with:
p12-file-base64: ${{ secrets.CERTIFICATES_P12_MOBILE }}
@@ -147,7 +147,7 @@ jobs:
needs:
- build-android-web
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Setup Version
uses: ./.github/actions/setup-version
with:
@@ -180,7 +180,7 @@ jobs:
no-build: 'true'
- name: Cap sync
run: yarn workspace @affine/android cap sync
- uses: actions/setup-python@v5
- uses: actions/setup-python@v6
with:
python-version: '3.13'
- name: Auth gcloud
@@ -192,7 +192,7 @@ jobs:
token_format: 'access_token'
project_id: '${{ secrets.GCP_PROJECT_ID }}'
access_token_scopes: 'https://www.googleapis.com/auth/androidpublisher'
- uses: actions/setup-java@v4
- uses: actions/setup-java@v5
with:
distribution: 'temurin'
java-version: '21'

View File

@@ -55,7 +55,7 @@ jobs:
GIT_SHORT_HASH: ${{ steps.prepare.outputs.GIT_SHORT_HASH }}
BUILD_TYPE: ${{ steps.prepare.outputs.BUILD_TYPE }}
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- name: Prepare Release
id: prepare
uses: ./.github/actions/prepare-release
@@ -72,7 +72,7 @@ jobs:
steps:
- name: Decide whether to release
id: decide
uses: actions/github-script@v7
uses: actions/github-script@v8
with:
script: |
const buildType = '${{ needs.prepare.outputs.BUILD_TYPE }}'

1
.gitignore vendored
View File

@@ -48,6 +48,7 @@ testem.log
/typings
tsconfig.tsbuildinfo
.context
/*.md
# System Files
.DS_Store

706
Cargo.lock generated

File diff suppressed because it is too large

View File

@@ -40,10 +40,20 @@ resolver = "3"
dotenvy = "0.15"
file-format = { version = "0.28", features = ["reader"] }
homedir = "0.3"
image = { version = "0.25.9", default-features = false, features = [
"bmp",
"gif",
"jpeg",
"png",
"webp",
] }
infer = { version = "0.19.0" }
lasso = { version = "0.7", features = ["multi-threaded"] }
lib0 = { version = "0.16", features = ["lib0-serde"] }
libc = "0.2"
libwebp-sys = "0.14.2"
little_exif = "0.6.23"
llm_adapter = "0.1.1"
log = "0.4"
loom = { version = "0.7", features = ["checkpoint"] }
lru = "0.16"

View File

@@ -23,4 +23,6 @@ We welcome you to provide us with bug reports via and email at [security@toevery
Since we are an open source project, we also welcome you to provide corresponding fix PRs, we will determine specific rewards based on the evaluation results.
Due to limited resources, we do not accept and will not review any AI-generated security reports.
If the vulnerability is caused by a library we depend on, we encourage you to submit a security report to the corresponding dependent library at the same time to benefit more users.

View File

@@ -300,6 +300,6 @@
"devDependencies": {
"@vanilla-extract/vite-plugin": "^5.0.0",
"msw": "^2.12.4",
"vitest": "^3.2.4"
"vitest": "^4.0.18"
}
}

View File

@@ -11,7 +11,7 @@ export default defineConfig({
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 1000,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/blocksuite-affine',
},

View File

@@ -31,7 +31,8 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"@vitest/browser-playwright": "^4.0.18",
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -108,7 +108,9 @@ export class BookmarkBlockComponent extends CaptionedBlockComponent<BookmarkBloc
}
open = () => {
window.open(this.link, '_blank');
const link = this.link;
if (!link) return;
window.open(link, '_blank', 'noopener,noreferrer');
};
refreshData = () => {

View File

@@ -1,3 +1,4 @@
import { playwright } from '@vitest/browser-playwright';
import { defineConfig } from 'vitest/config';
export default defineConfig({
@@ -8,10 +9,9 @@ export default defineConfig({
browser: {
enabled: true,
headless: true,
name: 'chromium',
provider: 'playwright',
instances: [{ browser: 'chromium' }],
provider: playwright(),
isolate: false,
providerOptions: {},
},
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,

View File

@@ -45,8 +45,10 @@ export class AffineCodeUnit extends ShadowlessElement {
if (!codeBlock || !vElement) return plainContent;
const tokens = codeBlock.highlightTokens$.value;
if (tokens.length === 0) return plainContent;
const line = tokens[vElement.lineIndex];
if (!line) return plainContent;
// copy the tokens to avoid modifying the original tokens
const lineTokens = structuredClone(tokens[vElement.lineIndex]);
const lineTokens = structuredClone(line);
if (lineTokens.length === 0) return plainContent;
const startOffset = vElement.startOffset;
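The code-unit fix above is a defensive-indexing pattern: look the line up once, bail out when it is missing, and only then clone. In isolation, with a hypothetical token shape:

```typescript
type Token = { content: string };

// Hypothetical: lineIndex may be stale after edits, so tokens[lineIndex]
// can be undefined; downstream code expecting an array would then break.
function cloneLineTokens(tokens: Token[][], lineIndex: number): Token[] {
  const line = tokens[lineIndex];
  if (!line) return [];          // graceful fallback instead of crashing
  return structuredClone(line);  // copy so edits don't mutate the cached tokens
}

const tokens: Token[][] = [[{ content: 'const' }]];
const ok = cloneLineTokens(tokens, 0);      // cloned line tokens
const missing = cloneLineTokens(tokens, 5); // stale index: empty fallback
```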

View File

@@ -35,7 +35,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -35,7 +35,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -31,7 +31,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -221,6 +221,12 @@ export class EdgelessNoteBlockComponent extends toGfxBlockComponent(
}
}
override getCSSScaleVal(): number {
const baseScale = super.getCSSScaleVal();
const extraScale = this.model.props.edgeless?.scale ?? 1;
return baseScale * extraScale;
}
override getRenderingRect() {
const { xywh, edgeless } = this.model.props;
const { collapse, scale = 1 } = edgeless;
@@ -255,7 +261,6 @@ export class EdgelessNoteBlockComponent extends toGfxBlockComponent(
const style = {
borderRadius: borderRadius + 'px',
transform: `scale(${scale})`,
};
const extra = this._editing ? ACTIVE_NOTE_EXTRA_PADDING : 0;
@@ -454,6 +459,28 @@ export const EdgelessNoteInteraction =
return;
}
let isClickOnTitle = false;
const titleRect = view
.querySelector('edgeless-page-block-title')
?.getBoundingClientRect();
if (titleRect) {
const titleBound = new Bound(
titleRect.x,
titleRect.y,
titleRect.width,
titleRect.height
);
if (titleBound.isPointInBound([e.clientX, e.clientY])) {
isClickOnTitle = true;
}
}
if (isClickOnTitle) {
handleNativeRangeAtPoint(e.clientX, e.clientY);
return;
}
if (model.children.length === 0) {
const blockId = std.store.addBlock(
'affine:paragraph',
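The title hit-test added above is an axis-aligned point-in-rectangle check against `getBoundingClientRect()`. Reduced to its core, with a hypothetical stand-in for `Bound.isPointInBound` (the real implementation may treat edges differently):

```typescript
// Hypothetical stand-in: a point hits the rectangle when it lies within
// both the horizontal and vertical spans.
function isPointInBound(
  bound: { x: number; y: number; w: number; h: number },
  [px, py]: [number, number]
): boolean {
  return (
    px >= bound.x && px <= bound.x + bound.w &&
    py >= bound.y && py <= bound.y + bound.h
  );
}

const title = { x: 100, y: 40, w: 200, h: 24 }; // e.g. a title's client rect
const hit = isPointInBound(title, [150, 50]);   // click lands on the title
const miss = isPointInBound(title, [150, 400]); // click lands below it
```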

View File

@@ -33,7 +33,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -19,7 +19,7 @@
"@blocksuite/sync": "workspace:*",
"@floating-ui/dom": "^1.6.13",
"@lit/context": "^1.1.2",
"@lottiefiles/dotlottie-wc": "^0.5.0",
"@lottiefiles/dotlottie-wc": "^0.9.4",
"@preact/signals-core": "^1.8.0",
"@toeverything/theme": "^1.1.23",
"@types/hast": "^3.0.4",

View File

@@ -1,4 +1,8 @@
import { getHostName } from '@blocksuite/affine-shared/utils';
import {
getHostName,
isValidUrl,
normalizeUrl,
} from '@blocksuite/affine-shared/utils';
import { PropTypes, requiredProperties } from '@blocksuite/std';
import { css, LitElement } from 'lit';
import { property } from 'lit/decorators.js';
@@ -44,15 +48,27 @@ export class LinkPreview extends LitElement {
override render() {
const { url } = this;
const normalizedUrl = normalizeUrl(url);
const safeUrl =
normalizedUrl && isValidUrl(normalizedUrl) ? normalizedUrl : null;
const hostName = getHostName(safeUrl ?? url);
if (!safeUrl) {
return html`
<span class="affine-link-preview">
<span>${hostName}</span>
</span>
`;
}
return html`
<a
class="affine-link-preview"
rel="noopener noreferrer"
target="_blank"
href=${url}
href=${safeUrl}
>
<span>${getHostName(url)}</span>
<span>${hostName}</span>
</a>
`;
}
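`isValidUrl` and `normalizeUrl` here come from `@blocksuite/affine-shared/utils`; a minimal stand-in showing the same normalize-then-validate shape (assumption: only http/https should ever render as a clickable `href`):

```typescript
// Hypothetical stand-in: prepend https:// when no scheme is present,
// then accept only http(s) URLs; anything else renders as plain text.
function toSafeHttpUrl(raw: string): string | null {
  let candidate = raw.trim();
  if (!/^[a-z][a-z0-9+.-]*:/i.test(candidate)) candidate = `https://${candidate}`;
  try {
    const url = new URL(candidate);
    return url.protocol === 'http:' || url.protocol === 'https:'
      ? url.href
      : null; // rejects javascript:, data:, etc.
  } catch {
    return null; // unparseable input is not linkified
  }
}

const safe = toSafeHttpUrl('example.com');            // normalized https URL
const blocked = toSafeHttpUrl('javascript:alert(1)'); // rejected scheme
```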

View File

@@ -32,7 +32,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -15,7 +15,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts"

View File

@@ -8,7 +8,7 @@ export default defineConfig({
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/ext-loader',
},

View File

@@ -34,7 +34,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -32,7 +32,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -29,7 +29,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -34,7 +34,8 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"@vitest/browser-playwright": "^4.0.18",
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -4,6 +4,7 @@ import type { FootNote } from '@blocksuite/affine-model';
import { CitationProvider } from '@blocksuite/affine-shared/services';
import { unsafeCSSVarV2 } from '@blocksuite/affine-shared/theme';
import type { AffineTextAttributes } from '@blocksuite/affine-shared/types';
import { isValidUrl, normalizeUrl } from '@blocksuite/affine-shared/utils';
import { WithDisposable } from '@blocksuite/global/lit';
import {
BlockSelection,
@@ -152,7 +153,9 @@ export class AffineFootnoteNode extends WithDisposable(ShadowlessElement) {
};
private readonly _handleUrlReference = (url: string) => {
window.open(url, '_blank');
const normalizedUrl = normalizeUrl(url);
if (!normalizedUrl || !isValidUrl(normalizedUrl)) return;
window.open(normalizedUrl, '_blank', 'noopener,noreferrer');
};
private readonly _updateFootnoteAttributes = (footnote: FootNote) => {

View File

@@ -1,3 +1,4 @@
import { playwright } from '@vitest/browser-playwright';
import { defineConfig } from 'vitest/config';
export default defineConfig({
@@ -8,10 +9,9 @@ export default defineConfig({
browser: {
enabled: true,
headless: true,
name: 'chromium',
provider: 'playwright',
instances: [{ browser: 'chromium' }],
provider: playwright(),
isolate: false,
providerOptions: {},
},
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,

View File

@@ -74,7 +74,7 @@
],
"devDependencies": {
"@types/pdfmake": "^0.2.12",
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"version": "0.26.3"
}

View File

@@ -0,0 +1,108 @@
import {
beforeEach,
describe,
expect,
it,
type MockInstance,
vi,
} from 'vitest';
import * as PointToRangeUtils from '../../utils/dom/point-to-range';
import { handleNativeRangeAtPoint } from '../../utils/dom/point-to-range';
describe('test handleNativeRangeAtPoint', () => {
let caretRangeFromPointSpy: MockInstance<
(clientX: number, clientY: number) => Range | null
>;
let resetNativeSelectionSpy: MockInstance<(range: Range | null) => void>;
beforeEach(() => {
caretRangeFromPointSpy = vi.spyOn(
PointToRangeUtils.api,
'caretRangeFromPoint'
);
resetNativeSelectionSpy = vi.spyOn(
PointToRangeUtils.api,
'resetNativeSelection'
);
});
it('does nothing if caretRangeFromPoint returns null', () => {
caretRangeFromPointSpy.mockReturnValue(null);
handleNativeRangeAtPoint(10, 10);
expect(resetNativeSelectionSpy).not.toHaveBeenCalled();
});
it('keeps range untouched if startContainer is a Text node', () => {
const div = document.createElement('div');
div.textContent = 'hello';
const text = div.firstChild!;
const range = document.createRange();
range.setStart(text, 2);
range.collapse(true);
caretRangeFromPointSpy.mockReturnValue(range);
handleNativeRangeAtPoint(10, 10);
expect(range.startContainer).toBe(text);
expect(range.startOffset).toBe(2);
expect(resetNativeSelectionSpy).toHaveBeenCalled();
});
it('moves caret into direct text child when clicking element', () => {
const div = document.createElement('div');
div.append('hello');
const range = document.createRange();
range.setStart(div, 1);
range.collapse(true);
caretRangeFromPointSpy.mockReturnValue(range);
handleNativeRangeAtPoint(10, 10);
expect(range.startContainer.nodeType).toBe(Node.TEXT_NODE);
expect(range.startContainer.textContent).toBe('hello');
expect(range.startOffset).toBe(5);
expect(resetNativeSelectionSpy).toHaveBeenCalled();
});
it('moves caret to last meaningful text inside nested element', () => {
const div = document.createElement('div');
div.innerHTML = `<span>a</span><span><em>b</em>c</span>`;
const range = document.createRange();
range.setStart(div, 2);
range.collapse(true);
caretRangeFromPointSpy.mockReturnValue(range);
handleNativeRangeAtPoint(10, 10);
expect(range.startContainer.nodeType).toBe(Node.TEXT_NODE);
expect(range.startContainer.textContent).toBe('c');
expect(range.startOffset).toBe(1);
expect(resetNativeSelectionSpy).toHaveBeenCalled();
});
it('falls back to searching startContainer when offset element has no text', () => {
const div = document.createElement('div');
div.innerHTML = `<span></span><span>ok</span>`;
const range = document.createRange();
range.setStart(div, 1);
range.collapse(true);
caretRangeFromPointSpy.mockReturnValue(range);
handleNativeRangeAtPoint(10, 10);
expect(range.startContainer.textContent).toBe('ok');
expect(range.startOffset).toBe(2);
expect(resetNativeSelectionSpy).toHaveBeenCalled();
});
});

View File

@@ -88,11 +88,73 @@ export function getCurrentNativeRange(selection = window.getSelection()) {
return selection.getRangeAt(0);
}
// functions that need to be mocked in unit tests
export const api = {
caretRangeFromPoint,
resetNativeSelection,
};
export function handleNativeRangeAtPoint(x: number, y: number) {
const range = caretRangeFromPoint(x, y);
const range = api.caretRangeFromPoint(x, y);
if (range) {
normalizeCaretRange(range);
}
const startContainer = range?.startContainer;
// click on rich text
if (startContainer instanceof Node) {
resetNativeSelection(range);
api.resetNativeSelection(range);
}
}
function lastMeaningfulTextNode(node: Node) {
const walker = document.createTreeWalker(node, NodeFilter.SHOW_TEXT, {
acceptNode(node) {
return node.textContent && node.textContent?.trim().length > 0
? NodeFilter.FILTER_ACCEPT
: NodeFilter.FILTER_REJECT;
},
});
let last = null;
while (walker.nextNode()) {
last = walker.currentNode;
}
return last;
}
function normalizeCaretRange(range: Range) {
let { startContainer, startOffset } = range;
if (startContainer.nodeType === Node.TEXT_NODE) return;
// Try to find text in the element at `startOffset`
const offsetEl =
startOffset > 0
? startContainer.childNodes[startOffset - 1]
: startContainer.childNodes[0];
if (offsetEl) {
if (offsetEl.nodeType === Node.TEXT_NODE) {
range.setStart(
offsetEl,
startOffset > 0 ? (offsetEl.textContent?.length ?? 0) : 0
);
range.collapse(true);
return;
}
const text = lastMeaningfulTextNode(offsetEl);
if (text) {
range.setStart(text, text.textContent?.length ?? 0);
range.collapse(true);
return;
}
}
// Fallback, try to find text in startContainer
const text = lastMeaningfulTextNode(startContainer);
if (text) {
range.setStart(text, text.textContent?.length ?? 0);
range.collapse(true);
return;
}
}
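The key refactor in this file is routing internal calls through the exported `api` object: `vi.spyOn` can only intercept calls made through an object property, not direct same-module calls. A minimal standalone sketch of the pattern, with illustrative stand-in names:

```typescript
// Minimal sketch of the spy-friendly indirection used above. The module
// calls api.caretRangeFromPoint instead of the function directly, so a
// test can replace the property without touching module internals.
function caretRangeFromPoint(x: number, y: number): string | null {
  return `range@${x},${y}`; // stand-in for the real DOM lookup
}

const api = { caretRangeFromPoint };

function handleAtPoint(x: number, y: number): string | null {
  // indirection point: tests replace api.caretRangeFromPoint
  return api.caretRangeFromPoint(x, y);
}

// a test can swap the implementation and restore it afterwards:
const original = api.caretRangeFromPoint;
api.caretRangeFromPoint = () => null;
const mocked = handleAtPoint(1, 2); // goes through the replacement
api.caretRangeFromPoint = original;
```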

View File

@@ -24,6 +24,11 @@ const toURL = (str: string) => {
}
};
const hasAllowedScheme = (url: URL) => {
const protocol = url.protocol.slice(0, -1).toLowerCase();
return ALLOWED_SCHEMES.has(protocol);
};
function resolveURL(str: string, baseUrl: string, padded = false) {
const url = toURL(str);
if (!url) return null;
@@ -61,6 +66,7 @@ export function normalizeUrl(str: string) {
// Formatted
if (url) {
if (!hasAllowedScheme(url)) return '';
if (!str.endsWith('/') && url.href.endsWith('/')) {
return url.href.substring(0, url.href.length - 1);
}
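The two changes above can be restated standalone: `URL#protocol` keeps the trailing `:`, so it is stripped before the allowlist lookup, and `URL#href` appends `/` to bare origins, which is removed when the original input had none. The `ALLOWED_SCHEMES` set below is an assumed subset; the real one lives elsewhere in the module:

```typescript
// Standalone restatement of the scheme check and trailing-slash fix above.
// ALLOWED_SCHEMES here is an assumed subset of the real module's set.
const ALLOWED_SCHEMES = new Set(['http', 'https', 'mailto']);

const hasAllowedScheme = (url: URL) => {
  // URL#protocol includes the trailing ':', e.g. 'https:' — strip it first
  const protocol = url.protocol.slice(0, -1).toLowerCase();
  return ALLOWED_SCHEMES.has(protocol);
};

// URL#href serializes bare origins with an appended '/'; drop it when the
// original input string did not end with one.
const stripAddedSlash = (input: string, url: URL) =>
  !input.endsWith('/') && url.href.endsWith('/')
    ? url.href.slice(0, -1)
    : url.href;
```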

View File

@@ -9,7 +9,7 @@ export default defineConfig({
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 1000,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/affine-shared',
},

View File

@@ -62,7 +62,7 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"version": "0.26.3"
}

View File

@@ -5,7 +5,7 @@ export default defineConfig({
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/global',
},

View File

@@ -33,7 +33,8 @@
"zod": "^3.25.76"
},
"devDependencies": {
"vitest": "^3.2.4"
"@vitest/browser-playwright": "^4.0.18",
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -105,17 +105,23 @@ export abstract class GfxBlockComponent<
onBoxSelected(_: BoxSelectionContext) {}
getCSSScaleVal(): number {
const viewport = this.gfx.viewport;
const { zoom, viewScale } = viewport;
return zoom / viewScale;
}
getCSSTransform() {
const viewport = this.gfx.viewport;
const { translateX, translateY, zoom } = viewport;
const { translateX, translateY, zoom, viewScale } = viewport;
const bound = Bound.deserialize(this.model.xywh);
const scaledX = bound.x * zoom;
const scaledY = bound.y * zoom;
const scaledX = (bound.x * zoom) / viewScale;
const scaledY = (bound.y * zoom) / viewScale;
const deltaX = scaledX - bound.x;
const deltaY = scaledY - bound.y;
return `translate(${translateX + deltaX}px, ${translateY + deltaY}px) scale(${zoom})`;
return `translate(${translateX / viewScale + deltaX}px, ${translateY / viewScale + deltaY}px) scale(${this.getCSSScaleVal()})`;
}
getRenderingRect() {
@@ -219,18 +225,12 @@ export function toGfxBlockComponent<
handleGfxConnection(this);
}
// eslint-disable-next-line sonarjs/no-identical-functions
getCSSScaleVal(): number {
return GfxBlockComponent.prototype.getCSSScaleVal.call(this);
}
getCSSTransform() {
const viewport = this.gfx.viewport;
const { translateX, translateY, zoom } = viewport;
const bound = Bound.deserialize(this.model.xywh);
const scaledX = bound.x * zoom;
const scaledY = bound.y * zoom;
const deltaX = scaledX - bound.x;
const deltaY = scaledY - bound.y;
return `translate(${translateX + deltaX}px, ${translateY + deltaY}px) scale(${zoom})`;
return GfxBlockComponent.prototype.getCSSTransform.call(this);
}
// eslint-disable-next-line sonarjs/no-identical-functions

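The `getCSSTransform` change divides the translation and scale through `viewScale` so blocks render correctly when the container itself is scaled. The math can be sketched in isolation (viewport numbers below are made-up test values, not from the diff):

```typescript
// Standalone sketch of the viewScale-aware transform math from
// getCSSTransform above.
function cssTransform(
  bound: { x: number; y: number },
  viewport: { translateX: number; translateY: number; zoom: number; viewScale: number }
) {
  const { translateX, translateY, zoom, viewScale } = viewport;
  // screen-space position of the block, corrected for the container scale
  const scaledX = (bound.x * zoom) / viewScale;
  const scaledY = (bound.y * zoom) / viewScale;
  // delta from the block's layout position to its screen position
  const deltaX = scaledX - bound.x;
  const deltaY = scaledY - bound.y;
  const scale = zoom / viewScale; // getCSSScaleVal
  return `translate(${translateX / viewScale + deltaX}px, ${translateY / viewScale + deltaY}px) scale(${scale})`;
}
```

With `viewScale = 1` this degenerates to the old formula, which is why the refactor can also delete the duplicated override and delegate to the prototype.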
View File

@@ -1,3 +1,4 @@
import { playwright } from '@vitest/browser-playwright';
import { defineConfig } from 'vitest/config';
export default defineConfig({
@@ -8,15 +9,14 @@ export default defineConfig({
browser: {
enabled: true,
headless: true,
name: 'chromium',
provider: 'playwright',
instances: [{ browser: 'chromium' }],
provider: playwright(),
isolate: false,
providerOptions: {},
},
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/std',
},

View File

@@ -29,7 +29,7 @@
"devDependencies": {
"@types/lodash.clonedeep": "^4.5.9",
"@types/lodash.merge": "^4.6.9",
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"exports": {
".": "./src/index.ts",

View File

@@ -7,15 +7,11 @@ export * from './transformer';
export { type IdGenerator, nanoid, uuidv4 } from './utils/id-generator';
export * from './yjs';
const env = (
typeof globalThis !== 'undefined'
? globalThis
: typeof window !== 'undefined'
? window
: typeof global !== 'undefined'
? global
: {}
) as Record<string, boolean>;
const env = (typeof globalThis !== 'undefined'
? globalThis
: typeof window !== 'undefined'
? window
: {}) as unknown as Record<string, boolean>;
const importIdentifier = '__ $BLOCKSUITE_STORE$ __';
if (env[importIdentifier] === true) {

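The simplified `env` lookup feeds a duplicate-load guard: a marker flag on the global object reveals when two copies of the store end up in one runtime. A sketch of that guard (the error message below is an assumption):

```typescript
// Sketch of the duplicate-import guard the env lookup above feeds.
const env = globalThis as unknown as Record<string, boolean>;
const importIdentifier = '__ $BLOCKSUITE_STORE$ __';

function assertSingleLoad() {
  if (env[importIdentifier] === true) {
    // a second copy of the module was already evaluated in this runtime
    throw new Error('another copy of @blocksuite/store is already loaded');
  }
  env[importIdentifier] = true;
}
```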
View File

@@ -8,7 +8,7 @@ export default defineConfig({
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/store',
},

View File

@@ -19,7 +19,7 @@
"y-protocols": "^1.0.6"
},
"devDependencies": {
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"peerDependencies": {
"yjs": "*"

View File

@@ -5,7 +5,7 @@ export default defineConfig({
include: ['src/__tests__/**/*.unit.spec.ts'],
testTimeout: 500,
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../../.coverage/sync',
},

View File

@@ -6,7 +6,7 @@
"dev": "vite",
"build": "tsc",
"test:unit": "vitest --browser.headless --run",
"test:debug": "PWDEBUG=1 npx vitest"
"test:debug": "PWDEBUG=1 npx vitest --browser.headless=false"
},
"sideEffects": false,
"keywords": [],
@@ -17,7 +17,7 @@
"@blocksuite/icons": "^2.2.17",
"@floating-ui/dom": "^1.6.13",
"@lit/context": "^1.1.3",
"@lottiefiles/dotlottie-wc": "^0.5.0",
"@lottiefiles/dotlottie-wc": "^0.9.4",
"@preact/signals-core": "^1.8.0",
"@toeverything/theme": "^1.1.23",
"@vanilla-extract/css": "^1.17.0",
@@ -41,10 +41,11 @@
],
"devDependencies": {
"@vanilla-extract/vite-plugin": "^5.0.0",
"@vitest/browser-playwright": "^4.0.18",
"vite": "^7.2.7",
"vite-plugin-istanbul": "^7.2.1",
"vite-plugin-wasm": "^3.5.0",
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"version": "0.26.3"
}

View File

@@ -1,4 +1,5 @@
import { vanillaExtractPlugin } from '@vanilla-extract/vite-plugin';
import { playwright } from '@vitest/browser-playwright';
import { defineConfig } from 'vitest/config';
export default defineConfig(_configEnv =>
@@ -18,13 +19,13 @@ export default defineConfig(_configEnv =>
retry: process.env.CI === 'true' ? 3 : 0,
browser: {
enabled: true,
headless: process.env.CI === 'true',
headless: true,
instances: [
{ browser: 'chromium' },
{ browser: 'firefox' },
{ browser: 'webkit' },
],
provider: 'playwright',
provider: playwright(),
isolate: false,
viewport: {
width: 1024,
@@ -32,16 +33,13 @@ export default defineConfig(_configEnv =>
},
},
coverage: {
provider: 'istanbul', // or 'c8'
provider: 'istanbul',
reporter: ['lcov'],
reportsDirectory: '../../.coverage/integration-test',
},
deps: {
interopDefault: true,
},
testTransformMode: {
web: ['src/__tests__/**/*.spec.ts'],
},
},
})
);

View File

@@ -22,7 +22,7 @@
"af": "r affine.ts",
"dev": "yarn affine dev",
"build": "yarn affine build",
"lint:eslint": "cross-env NODE_OPTIONS=\"--max-old-space-size=8192\" eslint --report-unused-disable-directives-severity=off . --cache",
"lint:eslint": "cross-env NODE_OPTIONS=\"--max-old-space-size=16384\" eslint --report-unused-disable-directives-severity=off . --cache",
"lint:eslint:fix": "yarn lint:eslint --fix --fix-type problem,suggestion,layout",
"lint:prettier": "prettier --ignore-unknown --cache --check .",
"lint:prettier:fix": "prettier --ignore-unknown --cache --write .",
@@ -56,7 +56,7 @@
"@faker-js/faker": "^10.1.0",
"@istanbuljs/schema": "^0.1.3",
"@magic-works/i18n-codegen": "^0.6.1",
"@playwright/test": "=1.52.0",
"@playwright/test": "=1.58.2",
"@smarttools/eslint-plugin-rxjs": "^1.0.8",
"@taplo/cli": "^0.7.0",
"@toeverything/infra": "workspace:*",
@@ -64,9 +64,9 @@
"@types/node": "^22.0.0",
"@typescript-eslint/parser": "^8.55.0",
"@vanilla-extract/vite-plugin": "^5.0.0",
"@vitest/browser": "^3.2.4",
"@vitest/coverage-istanbul": "^3.2.4",
"@vitest/ui": "^3.2.4",
"@vitest/browser": "^4.0.18",
"@vitest/coverage-istanbul": "^4.0.18",
"@vitest/ui": "^4.0.18",
"cross-env": "^10.1.0",
"electron": "^39.0.0",
"eslint": "^9.39.2",
@@ -90,7 +90,7 @@
"typescript-eslint": "^8.55.0",
"unplugin-swc": "^1.5.9",
"vite": "^7.2.7",
"vitest": "^3.2.4"
"vitest": "^4.0.18"
},
"packageManager": "yarn@4.12.0",
"resolutions": {

View File

@@ -14,13 +14,20 @@ affine_common = { workspace = true, features = [
"napi",
"ydoc-loader",
] }
anyhow = { workspace = true }
chrono = { workspace = true }
file-format = { workspace = true }
image = { workspace = true }
infer = { workspace = true }
libwebp-sys = { workspace = true }
little_exif = { workspace = true }
llm_adapter = { workspace = true }
mp4parse = { workspace = true }
napi = { workspace = true, features = ["async"] }
napi-derive = { workspace = true }
rand = { workspace = true }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true }
sha3 = { workspace = true }
tiktoken-rs = { workspace = true }
v_htmlescape = { workspace = true }

View File

@@ -1,5 +1,9 @@
/* auto-generated by NAPI-RS */
/* eslint-disable */
export declare class LlmStreamHandle {
abort(): void
}
export declare class Tokenizer {
count(content: string, allowedSpecial?: Array<string> | undefined | null): number
}
@@ -46,6 +50,10 @@ export declare function getMime(input: Uint8Array): string
export declare function htmlSanitize(input: string): string
export declare function llmDispatch(protocol: string, backendConfigJson: string, requestJson: string): string
export declare function llmDispatchStream(protocol: string, backendConfigJson: string, requestJson: string, callback: ((err: Error | null, arg: string) => void)): LlmStreamHandle
/**
* Merge updates in form like `Y.applyUpdate(doc, update)` way and return the
* result binary.
@@ -75,6 +83,8 @@ export interface NativeCrawlResult {
export interface NativeMarkdownResult {
title: string
markdown: string
knownUnsupportedBlocks: Array<string>
unknownBlocks: Array<string>
}
export interface NativePageDocContent {
@@ -102,6 +112,8 @@ export declare function parsePageDoc(docBin: Buffer, maxSummaryLength?: number |
export declare function parseWorkspaceDoc(docBin: Buffer): NativeWorkspaceDocContent | null
export declare function processImage(input: Buffer, maxEdge: number, keepExif: boolean): Promise<Buffer>
export declare function readAllDocIdsFromRootDoc(docBin: Buffer, includeTrash?: boolean | undefined | null): Array<string>
/**

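The generated declaration above gives `processImage` the shape `(input: Buffer, maxEdge: number, keepExif: boolean) => Promise<Buffer>`, resolving with WebP bytes. A hedged usage sketch — the stub type only mirrors the declared signature so the call shape can be shown without the native addon, and `resizeAvatar` is a hypothetical caller:

```typescript
// Mirrors the declared signature of the native processImage binding above;
// the addon itself is not loaded here.
type ProcessImage = (
  input: Buffer,
  maxEdge: number,
  keepExif: boolean
) => Promise<Buffer>;

// hypothetical caller: downscale an avatar to a 512px longest edge,
// dropping EXIF metadata
async function resizeAvatar(processImage: ProcessImage, raw: Buffer) {
  return processImage(raw, 512, false); // resolves with WebP-encoded bytes
}
```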
View File

@@ -9,6 +9,8 @@ use napi_derive::napi;
pub struct NativeMarkdownResult {
pub title: String,
pub markdown: String,
pub known_unsupported_blocks: Vec<String>,
pub unknown_blocks: Vec<String>,
}
impl From<MarkdownResult> for NativeMarkdownResult {
@@ -16,6 +18,8 @@ impl From<MarkdownResult> for NativeMarkdownResult {
Self {
title: result.title,
markdown: result.markdown,
known_unsupported_blocks: result.known_unsupported_blocks,
unknown_blocks: result.unknown_blocks,
}
}
}

View File

@@ -0,0 +1,353 @@
use std::io::Cursor;
use anyhow::{Context, Result as AnyResult, bail};
use image::{
AnimationDecoder, DynamicImage, ImageDecoder, ImageFormat, ImageReader,
codecs::{gif::GifDecoder, png::PngDecoder, webp::WebPDecoder},
imageops::FilterType,
metadata::Orientation,
};
use libwebp_sys::{
WEBP_MUX_ABI_VERSION, WebPData, WebPDataClear, WebPDataInit, WebPEncodeRGBA, WebPFree, WebPMuxAssemble,
WebPMuxCreateInternal, WebPMuxDelete, WebPMuxError, WebPMuxSetChunk,
};
use little_exif::{exif_tag::ExifTag, filetype::FileExtension, metadata::Metadata};
use napi::{
Env, Error, Result, Status, Task,
bindgen_prelude::{AsyncTask, Buffer},
};
use napi_derive::napi;
const WEBP_QUALITY: f32 = 80.0;
const MAX_IMAGE_DIMENSION: u32 = 16_384;
const MAX_IMAGE_PIXELS: u64 = 40_000_000;
pub struct AsyncProcessImageTask {
input: Vec<u8>,
max_edge: u32,
keep_exif: bool,
}
#[napi]
impl Task for AsyncProcessImageTask {
type Output = Vec<u8>;
type JsValue = Buffer;
fn compute(&mut self) -> Result<Self::Output> {
process_image_inner(&self.input, self.max_edge, self.keep_exif)
.map_err(|error| Error::new(Status::InvalidArg, error.to_string()))
}
fn resolve(&mut self, _: Env, output: Self::Output) -> Result<Self::JsValue> {
Ok(output.into())
}
}
#[napi]
pub fn process_image(input: Buffer, max_edge: u32, keep_exif: bool) -> AsyncTask<AsyncProcessImageTask> {
AsyncTask::new(AsyncProcessImageTask {
input: input.to_vec(),
max_edge,
keep_exif,
})
}
fn process_image_inner(input: &[u8], max_edge: u32, keep_exif: bool) -> AnyResult<Vec<u8>> {
if max_edge == 0 {
bail!("max_edge must be greater than 0");
}
let format = image::guess_format(input).context("unsupported image format")?;
let (width, height) = read_dimensions(input, format)?;
validate_dimensions(width, height)?;
let mut image = decode_image(input, format)?;
let orientation = read_orientation(input, format)?;
image.apply_orientation(orientation);
if image.width().max(image.height()) > max_edge {
image = image.resize(max_edge, max_edge, FilterType::Lanczos3);
}
let mut output = encode_webp_lossy(&image.into_rgba8())?;
if keep_exif {
preserve_exif(input, format, &mut output)?;
}
Ok(output)
}
fn read_dimensions(input: &[u8], format: ImageFormat) -> AnyResult<(u32, u32)> {
ImageReader::with_format(Cursor::new(input), format)
.into_dimensions()
.context("failed to decode image")
}
fn validate_dimensions(width: u32, height: u32) -> AnyResult<()> {
if width == 0 || height == 0 {
bail!("failed to decode image");
}
if width > MAX_IMAGE_DIMENSION || height > MAX_IMAGE_DIMENSION {
bail!("image dimensions exceed limit");
}
if u64::from(width) * u64::from(height) > MAX_IMAGE_PIXELS {
bail!("image pixel count exceeds limit");
}
Ok(())
}
fn decode_image(input: &[u8], format: ImageFormat) -> AnyResult<DynamicImage> {
Ok(match format {
ImageFormat::Gif => {
let decoder = GifDecoder::new(Cursor::new(input)).context("failed to decode image")?;
let frame = decoder
.into_frames()
.next()
.transpose()
.context("failed to decode image")?
.context("image does not contain any frames")?;
DynamicImage::ImageRgba8(frame.into_buffer())
}
ImageFormat::Png => {
let decoder = PngDecoder::new(Cursor::new(input)).context("failed to decode image")?;
if decoder.is_apng().context("failed to decode image")? {
let frame = decoder
.apng()
.context("failed to decode image")?
.into_frames()
.next()
.transpose()
.context("failed to decode image")?
.context("image does not contain any frames")?;
DynamicImage::ImageRgba8(frame.into_buffer())
} else {
DynamicImage::from_decoder(decoder).context("failed to decode image")?
}
}
ImageFormat::WebP => {
let decoder = WebPDecoder::new(Cursor::new(input)).context("failed to decode image")?;
let frame = decoder
.into_frames()
.next()
.transpose()
.context("failed to decode image")?
.context("image does not contain any frames")?;
DynamicImage::ImageRgba8(frame.into_buffer())
}
_ => {
let reader = ImageReader::with_format(Cursor::new(input), format);
let decoder = reader.into_decoder().context("failed to decode image")?;
DynamicImage::from_decoder(decoder).context("failed to decode image")?
}
})
}
fn read_orientation(input: &[u8], format: ImageFormat) -> AnyResult<Orientation> {
Ok(match format {
ImageFormat::Gif => GifDecoder::new(Cursor::new(input))
.context("failed to decode image")?
.orientation()
.context("failed to decode image")?,
ImageFormat::Png => PngDecoder::new(Cursor::new(input))
.context("failed to decode image")?
.orientation()
.context("failed to decode image")?,
ImageFormat::WebP => WebPDecoder::new(Cursor::new(input))
.context("failed to decode image")?
.orientation()
.context("failed to decode image")?,
_ => ImageReader::with_format(Cursor::new(input), format)
.into_decoder()
.context("failed to decode image")?
.orientation()
.context("failed to decode image")?,
})
}
fn encode_webp_lossy(image: &image::RgbaImage) -> AnyResult<Vec<u8>> {
let width = i32::try_from(image.width()).context("image width is too large")?;
let height = i32::try_from(image.height()).context("image height is too large")?;
let stride = width.checked_mul(4).context("image width is too large")?;
let mut output = std::ptr::null_mut();
let encoded_len = unsafe { WebPEncodeRGBA(image.as_ptr(), width, height, stride, WEBP_QUALITY, &mut output) };
if output.is_null() || encoded_len == 0 {
bail!("failed to encode webp");
}
let encoded = unsafe { std::slice::from_raw_parts(output, encoded_len) }.to_vec();
unsafe {
WebPFree(output.cast());
}
Ok(encoded)
}
fn preserve_exif(input: &[u8], format: ImageFormat, output: &mut Vec<u8>) -> AnyResult<()> {
let Some(file_type) = map_exif_file_type(format) else {
return Ok(());
};
let input = input.to_vec();
let Ok(mut metadata) = Metadata::new_from_vec(&input, file_type) else {
return Ok(());
};
metadata.remove_tag(ExifTag::Orientation(vec![1]));
if !metadata.get_ifds().iter().any(|ifd| !ifd.get_tags().is_empty()) {
return Ok(());
}
let encoded_metadata = metadata.encode().context("failed to preserve exif metadata")?;
let source = WebPData {
bytes: output.as_ptr(),
size: output.len(),
};
let exif = WebPData {
bytes: encoded_metadata.as_ptr(),
size: encoded_metadata.len(),
};
let mut assembled = WebPData::default();
let mux = unsafe { WebPMuxCreateInternal(&source, 1, WEBP_MUX_ABI_VERSION as _) };
if mux.is_null() {
bail!("failed to preserve exif metadata");
}
let encoded = (|| -> AnyResult<Vec<u8>> {
if unsafe { WebPMuxSetChunk(mux, c"EXIF".as_ptr(), &exif, 1) } != WebPMuxError::WEBP_MUX_OK {
bail!("failed to preserve exif metadata");
}
WebPDataInit(&mut assembled);
if unsafe { WebPMuxAssemble(mux, &mut assembled) } != WebPMuxError::WEBP_MUX_OK {
bail!("failed to preserve exif metadata");
}
Ok(unsafe { std::slice::from_raw_parts(assembled.bytes, assembled.size) }.to_vec())
})();
unsafe {
WebPDataClear(&mut assembled);
WebPMuxDelete(mux);
}
*output = encoded?;
Ok(())
}
fn map_exif_file_type(format: ImageFormat) -> Option<FileExtension> {
match format {
ImageFormat::Jpeg => Some(FileExtension::JPEG),
ImageFormat::Png => Some(FileExtension::PNG { as_zTXt_chunk: true }),
ImageFormat::Tiff => Some(FileExtension::TIFF),
ImageFormat::WebP => Some(FileExtension::WEBP),
_ => None,
}
}
#[cfg(test)]
mod tests {
use image::{ExtendedColorType, GenericImageView, ImageEncoder, codecs::png::PngEncoder};
use super::*;
fn encode_png(width: u32, height: u32) -> Vec<u8> {
let image = image::RgbaImage::from_pixel(width, height, image::Rgba([255, 0, 0, 255]));
let mut encoded = Vec::new();
PngEncoder::new(&mut encoded)
.write_image(image.as_raw(), width, height, ExtendedColorType::Rgba8)
.unwrap();
encoded
}
fn encode_bmp_header(width: u32, height: u32) -> Vec<u8> {
let mut encoded = Vec::with_capacity(54);
encoded.extend_from_slice(b"BM");
encoded.extend_from_slice(&(54u32).to_le_bytes());
encoded.extend_from_slice(&0u16.to_le_bytes());
encoded.extend_from_slice(&0u16.to_le_bytes());
encoded.extend_from_slice(&(54u32).to_le_bytes());
encoded.extend_from_slice(&(40u32).to_le_bytes());
encoded.extend_from_slice(&(width as i32).to_le_bytes());
encoded.extend_from_slice(&(height as i32).to_le_bytes());
encoded.extend_from_slice(&1u16.to_le_bytes());
encoded.extend_from_slice(&24u16.to_le_bytes());
encoded.extend_from_slice(&0u32.to_le_bytes());
encoded.extend_from_slice(&0u32.to_le_bytes());
encoded.extend_from_slice(&0u32.to_le_bytes());
encoded.extend_from_slice(&0u32.to_le_bytes());
encoded.extend_from_slice(&0u32.to_le_bytes());
encoded.extend_from_slice(&0u32.to_le_bytes());
encoded
}
#[test]
fn process_image_keeps_small_dimensions() {
let png = encode_png(8, 6);
let output = process_image_inner(&png, 512, false).unwrap();
let format = image::guess_format(&output).unwrap();
assert_eq!(format, ImageFormat::WebP);
let decoded = image::load_from_memory(&output).unwrap();
assert_eq!(decoded.dimensions(), (8, 6));
}
#[test]
fn process_image_scales_down_large_dimensions() {
let png = encode_png(1024, 256);
let output = process_image_inner(&png, 512, false).unwrap();
let decoded = image::load_from_memory(&output).unwrap();
assert_eq!(decoded.dimensions(), (512, 128));
}
#[test]
fn process_image_preserves_exif_without_orientation() {
let png = encode_png(8, 8);
let mut png_with_exif = png.clone();
let mut metadata = Metadata::new();
metadata.set_tag(ExifTag::ImageDescription("copilot".to_string()));
metadata.set_tag(ExifTag::Orientation(vec![6]));
metadata
.write_to_vec(&mut png_with_exif, FileExtension::PNG { as_zTXt_chunk: true })
.unwrap();
let output = process_image_inner(&png_with_exif, 512, true).unwrap();
let decoded_metadata = Metadata::new_from_vec(&output, FileExtension::WEBP).unwrap();
assert!(
decoded_metadata
.get_tag(&ExifTag::ImageDescription(String::new()))
.next()
.is_some()
);
assert!(
decoded_metadata
.get_tag(&ExifTag::Orientation(vec![1]))
.next()
.is_none()
);
}
#[test]
fn process_image_rejects_invalid_input() {
let error = process_image_inner(b"not-an-image", 512, false).unwrap_err();
assert_eq!(error.to_string(), "unsupported image format");
}
#[test]
fn process_image_rejects_images_over_dimension_limit_before_decode() {
let bmp = encode_bmp_header(MAX_IMAGE_DIMENSION + 1, 1);
let error = process_image_inner(&bmp, 512, false).unwrap_err();
assert_eq!(error.to_string(), "image dimensions exceed limit");
}
}

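The guard in `validate_dimensions` runs on header-read dimensions, so oversized inputs are rejected before a full decode allocates anything. Restated in TypeScript for clarity, with the limits copied from the constants above (BigInt stands in for the Rust `u64` math):

```typescript
// TypeScript restatement of the pre-decode guards in validate_dimensions
// above; limits mirror MAX_IMAGE_DIMENSION / MAX_IMAGE_PIXELS from the diff.
const MAX_IMAGE_DIMENSION = 16_384;
const MAX_IMAGE_PIXELS = 40_000_000n;

function validateDimensions(width: number, height: number): string | null {
  if (width === 0 || height === 0) return 'failed to decode image';
  if (width > MAX_IMAGE_DIMENSION || height > MAX_IMAGE_DIMENSION)
    return 'image dimensions exceed limit';
  // pixel count uses 64-bit math to avoid overflow, as the Rust code does
  if (BigInt(width) * BigInt(height) > MAX_IMAGE_PIXELS)
    return 'image pixel count exceeds limit';
  return null; // within limits
}
```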
View File

@@ -7,6 +7,8 @@ pub mod doc_loader;
pub mod file_type;
pub mod hashcash;
pub mod html_sanitize;
pub mod image;
pub mod llm;
pub mod tiktoken;
use affine_common::napi_utils::map_napi_err;

View File

@@ -0,0 +1,339 @@
use std::sync::{
Arc,
atomic::{AtomicBool, Ordering},
};
use llm_adapter::{
backend::{
BackendConfig, BackendError, BackendProtocol, ReqwestHttpClient, dispatch_request, dispatch_stream_events_with,
},
core::{CoreRequest, StreamEvent},
middleware::{
MiddlewareConfig, PipelineContext, RequestMiddleware, StreamMiddleware, citation_indexing, clamp_max_tokens,
normalize_messages, run_request_middleware_chain, run_stream_middleware_chain, stream_event_normalize,
tool_schema_rewrite,
},
};
use napi::{
Error, Result, Status,
threadsafe_function::{ThreadsafeFunction, ThreadsafeFunctionCallMode},
};
use serde::Deserialize;
pub const STREAM_END_MARKER: &str = "__AFFINE_LLM_STREAM_END__";
const STREAM_ABORTED_REASON: &str = "__AFFINE_LLM_STREAM_ABORTED__";
const STREAM_CALLBACK_DISPATCH_FAILED_REASON: &str = "__AFFINE_LLM_STREAM_CALLBACK_DISPATCH_FAILED__";
#[derive(Debug, Clone, Default, Deserialize)]
#[serde(default)]
struct LlmMiddlewarePayload {
request: Vec<String>,
stream: Vec<String>,
config: MiddlewareConfig,
}
#[derive(Debug, Clone, Deserialize)]
struct LlmDispatchPayload {
#[serde(flatten)]
request: CoreRequest,
#[serde(default)]
middleware: LlmMiddlewarePayload,
}
#[napi]
pub struct LlmStreamHandle {
aborted: Arc<AtomicBool>,
}
#[napi]
impl LlmStreamHandle {
#[napi]
pub fn abort(&self) {
self.aborted.store(true, Ordering::SeqCst);
}
}
#[napi(catch_unwind)]
pub fn llm_dispatch(protocol: String, backend_config_json: String, request_json: String) -> Result<String> {
let protocol = parse_protocol(&protocol)?;
let config: BackendConfig = serde_json::from_str(&backend_config_json).map_err(map_json_error)?;
let payload: LlmDispatchPayload = serde_json::from_str(&request_json).map_err(map_json_error)?;
let request = apply_request_middlewares(payload.request, &payload.middleware)?;
let response =
dispatch_request(&ReqwestHttpClient::default(), &config, protocol, &request).map_err(map_backend_error)?;
serde_json::to_string(&response).map_err(map_json_error)
}
#[napi(catch_unwind)]
pub fn llm_dispatch_stream(
protocol: String,
backend_config_json: String,
request_json: String,
callback: ThreadsafeFunction<String, ()>,
) -> Result<LlmStreamHandle> {
let protocol = parse_protocol(&protocol)?;
let config: BackendConfig = serde_json::from_str(&backend_config_json).map_err(map_json_error)?;
let payload: LlmDispatchPayload = serde_json::from_str(&request_json).map_err(map_json_error)?;
let request = apply_request_middlewares(payload.request, &payload.middleware)?;
let middleware = payload.middleware.clone();
let aborted = Arc::new(AtomicBool::new(false));
let aborted_in_worker = aborted.clone();
std::thread::spawn(move || {
let chain = match resolve_stream_chain(&middleware.stream) {
Ok(chain) => chain,
Err(error) => {
emit_error_event(&callback, error.reason.clone(), "middleware_error");
let _ = callback.call(
Ok(STREAM_END_MARKER.to_string()),
ThreadsafeFunctionCallMode::NonBlocking,
);
return;
}
};
let mut pipeline = StreamPipeline::new(chain, middleware.config.clone());
let mut aborted_by_user = false;
let mut callback_dispatch_failed = false;
let result = dispatch_stream_events_with(&ReqwestHttpClient::default(), &config, protocol, &request, |event| {
if aborted_in_worker.load(Ordering::Relaxed) {
aborted_by_user = true;
return Err(BackendError::Http(STREAM_ABORTED_REASON.to_string()));
}
for event in pipeline.process(event) {
let status = emit_stream_event(&callback, &event);
if status != Status::Ok {
callback_dispatch_failed = true;
return Err(BackendError::Http(format!(
"{STREAM_CALLBACK_DISPATCH_FAILED_REASON}:{status}"
)));
}
}
Ok(())
});
if !aborted_by_user {
for event in pipeline.finish() {
if aborted_in_worker.load(Ordering::Relaxed) {
aborted_by_user = true;
break;
}
if emit_stream_event(&callback, &event) != Status::Ok {
callback_dispatch_failed = true;
break;
}
}
}
if let Err(error) = result
&& !aborted_by_user
&& !callback_dispatch_failed
&& !is_abort_error(&error)
&& !is_callback_dispatch_failed_error(&error)
{
emit_error_event(&callback, error.to_string(), "dispatch_error");
}
if !callback_dispatch_failed {
let _ = callback.call(
Ok(STREAM_END_MARKER.to_string()),
ThreadsafeFunctionCallMode::NonBlocking,
);
}
});
Ok(LlmStreamHandle { aborted })
}
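On the JS side, the `llmDispatchStream` callback from the `.d.ts` receives one serialized `StreamEvent` per call, with the literal `STREAM_END_MARKER` string signaling completion. A hedged sketch of how a caller might consume that protocol (`collectStream` and its names are assumptions, not part of the binding):

```typescript
// Hypothetical consumer of llmDispatchStream's (err, arg) callback protocol:
// each call carries one JSON-serialized StreamEvent; the sentinel string
// below marks the end of the stream.
const STREAM_END_MARKER = '__AFFINE_LLM_STREAM_END__';

function collectStream(onDone: (events: object[]) => void) {
  const events: object[] = [];
  // matches the ((err: Error | null, arg: string) => void) shape declared above
  return (err: Error | null, arg: string) => {
    if (err) return;
    if (arg === STREAM_END_MARKER) {
      onDone(events);
      return;
    }
    events.push(JSON.parse(arg));
  };
}
```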
fn apply_request_middlewares(request: CoreRequest, middleware: &LlmMiddlewarePayload) -> Result<CoreRequest> {
let chain = resolve_request_chain(&middleware.request)?;
Ok(run_request_middleware_chain(request, &middleware.config, &chain))
}
#[derive(Clone)]
struct StreamPipeline {
chain: Vec<StreamMiddleware>,
config: MiddlewareConfig,
context: PipelineContext,
}
impl StreamPipeline {
fn new(chain: Vec<StreamMiddleware>, config: MiddlewareConfig) -> Self {
Self {
chain,
config,
context: PipelineContext::default(),
}
}
fn process(&mut self, event: StreamEvent) -> Vec<StreamEvent> {
run_stream_middleware_chain(event, &mut self.context, &self.config, &self.chain)
}
fn finish(&mut self) -> Vec<StreamEvent> {
self.context.flush_pending_deltas();
self.context.drain_queued_events()
}
}
fn emit_stream_event(callback: &ThreadsafeFunction<String, ()>, event: &StreamEvent) -> Status {
let value = serde_json::to_string(event).unwrap_or_else(|error| {
serde_json::json!({
"type": "error",
"message": format!("failed to serialize stream event: {error}"),
})
.to_string()
});
callback.call(Ok(value), ThreadsafeFunctionCallMode::NonBlocking)
}
fn emit_error_event(callback: &ThreadsafeFunction<String, ()>, message: String, code: &str) {
let error_event = serde_json::to_string(&StreamEvent::Error {
message: message.clone(),
code: Some(code.to_string()),
})
.unwrap_or_else(|_| {
serde_json::json!({
"type": "error",
"message": message,
"code": code,
})
.to_string()
});
let _ = callback.call(Ok(error_event), ThreadsafeFunctionCallMode::NonBlocking);
}
fn is_abort_error(error: &BackendError) -> bool {
matches!(
error,
BackendError::Http(reason) if reason == STREAM_ABORTED_REASON
)
}
fn is_callback_dispatch_failed_error(error: &BackendError) -> bool {
matches!(
error,
BackendError::Http(reason) if reason.starts_with(STREAM_CALLBACK_DISPATCH_FAILED_REASON)
)
}
fn resolve_request_chain(request: &[String]) -> Result<Vec<RequestMiddleware>> {
if request.is_empty() {
return Ok(vec![normalize_messages, tool_schema_rewrite]);
}
request
.iter()
.map(|name| match name.as_str() {
"normalize_messages" => Ok(normalize_messages as RequestMiddleware),
"clamp_max_tokens" => Ok(clamp_max_tokens as RequestMiddleware),
"tool_schema_rewrite" => Ok(tool_schema_rewrite as RequestMiddleware),
_ => Err(Error::new(
Status::InvalidArg,
format!("Unsupported request middleware: {name}"),
)),
})
.collect()
}
fn resolve_stream_chain(stream: &[String]) -> Result<Vec<StreamMiddleware>> {
if stream.is_empty() {
return Ok(vec![stream_event_normalize, citation_indexing]);
}
stream
.iter()
.map(|name| match name.as_str() {
"stream_event_normalize" => Ok(stream_event_normalize as StreamMiddleware),
"citation_indexing" => Ok(citation_indexing as StreamMiddleware),
_ => Err(Error::new(
Status::InvalidArg,
format!("Unsupported stream middleware: {name}"),
)),
})
.collect()
}
fn parse_protocol(protocol: &str) -> Result<BackendProtocol> {
match protocol {
"openai_chat" | "openai-chat" | "openai_chat_completions" | "chat-completions" | "chat_completions" => {
Ok(BackendProtocol::OpenaiChatCompletions)
}
"openai_responses" | "openai-responses" | "responses" => Ok(BackendProtocol::OpenaiResponses),
"anthropic" | "anthropic_messages" | "anthropic-messages" => Ok(BackendProtocol::AnthropicMessages),
other => Err(Error::new(
Status::InvalidArg,
format!("Unsupported llm backend protocol: {other}"),
)),
}
}
fn map_json_error(error: serde_json::Error) -> Error {
Error::new(Status::InvalidArg, format!("Invalid JSON payload: {error}"))
}
fn map_backend_error(error: BackendError) -> Error {
Error::new(Status::GenericFailure, error.to_string())
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn should_parse_supported_protocol_aliases() {
assert!(parse_protocol("openai_chat").is_ok());
assert!(parse_protocol("chat-completions").is_ok());
assert!(parse_protocol("responses").is_ok());
assert!(parse_protocol("anthropic").is_ok());
}
#[test]
fn should_reject_unsupported_protocol() {
let error = parse_protocol("unknown").unwrap_err();
assert_eq!(error.status, Status::InvalidArg);
assert!(error.reason.contains("Unsupported llm backend protocol"));
}
#[test]
fn llm_dispatch_should_reject_invalid_backend_json() {
let error = llm_dispatch("openai_chat".to_string(), "{".to_string(), "{}".to_string()).unwrap_err();
assert_eq!(error.status, Status::InvalidArg);
assert!(error.reason.contains("Invalid JSON payload"));
}
#[test]
fn map_json_error_should_use_invalid_arg_status() {
let parse_error = serde_json::from_str::<serde_json::Value>("{").unwrap_err();
let error = map_json_error(parse_error);
assert_eq!(error.status, Status::InvalidArg);
assert!(error.reason.contains("Invalid JSON payload"));
}
#[test]
fn resolve_request_chain_should_support_clamp_max_tokens() {
let chain = resolve_request_chain(&["normalize_messages".to_string(), "clamp_max_tokens".to_string()]).unwrap();
assert_eq!(chain.len(), 2);
}
#[test]
fn resolve_request_chain_should_reject_unknown_middleware() {
let error = resolve_request_chain(&["unknown".to_string()]).unwrap_err();
assert_eq!(error.status, Status::InvalidArg);
assert!(error.reason.contains("Unsupported request middleware"));
}
#[test]
fn resolve_stream_chain_should_reject_unknown_middleware() {
let error = resolve_stream_chain(&["unknown".to_string()]).unwrap_err();
assert_eq!(error.status, Status::InvalidArg);
assert!(error.reason.contains("Unsupported stream middleware"));
}
}
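The alias table above maps several spellings of each protocol name onto one backend enum. A hedged TypeScript sketch of the same mapping (illustrative names; the actual Rust code returns napi `InvalidArg` errors rather than throwing):

```typescript
// Illustrative rendering of the parse_protocol alias table from the Rust
// source above. Every accepted spelling resolves to one canonical protocol;
// anything else is rejected.
type BackendProtocol =
  | 'openai_chat_completions'
  | 'openai_responses'
  | 'anthropic_messages';

const PROTOCOL_ALIASES: Record<string, BackendProtocol> = {
  openai_chat: 'openai_chat_completions',
  'openai-chat': 'openai_chat_completions',
  openai_chat_completions: 'openai_chat_completions',
  'chat-completions': 'openai_chat_completions',
  chat_completions: 'openai_chat_completions',
  openai_responses: 'openai_responses',
  'openai-responses': 'openai_responses',
  responses: 'openai_responses',
  anthropic: 'anthropic_messages',
  anthropic_messages: 'anthropic_messages',
  'anthropic-messages': 'anthropic_messages',
};

export function parseProtocol(protocol: string): BackendProtocol {
  const resolved = PROTOCOL_ALIASES[protocol];
  if (!resolved) {
    throw new Error(`Unsupported llm backend protocol: ${protocol}`);
  }
  return resolved;
}
```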

View File

@@ -6,6 +6,7 @@
# MAILER_HOST=127.0.0.1
# MAILER_PORT=1025
# MAILER_SERVERNAME="mail.example.com"
# MAILER_SENDER="noreply@toeverything.info"
# MAILER_USER="noreply@toeverything.info"
# MAILER_PASSWORD="affine"

View File

@@ -4,17 +4,14 @@
"version": "0.26.3",
"description": "Affine Node.js server",
"type": "module",
"bin": {
"run-test": "./scripts/run-test.ts"
},
"scripts": {
"build": "affine bundle -p @affine/server",
"dev": "nodemon ./src/index.ts",
"dev:mail": "email dev -d src/mails",
"test": "ava --concurrency 1 --serial",
"test:copilot": "ava \"src/__tests__/copilot-*.spec.ts\"",
"test:copilot": "ava \"src/__tests__/copilot/copilot-*.spec.ts\"",
"test:coverage": "c8 ava --concurrency 1 --serial",
"test:copilot:coverage": "c8 ava --timeout=5m \"src/__tests__/copilot-*.spec.ts\"",
"test:copilot:coverage": "c8 ava --timeout=5m \"src/__tests__/copilot/copilot-*.spec.ts\"",
"e2e": "cross-env TEST_MODE=e2e ava --serial",
"e2e:coverage": "cross-env TEST_MODE=e2e c8 ava --serial",
"data-migration": "cross-env NODE_ENV=development SERVER_FLAVOR=script r ./src/index.ts",
@@ -28,17 +25,12 @@
"dependencies": {
"@affine/s3-compat": "workspace:*",
"@affine/server-native": "workspace:*",
"@ai-sdk/anthropic": "^2.0.54",
"@ai-sdk/google": "^2.0.45",
"@ai-sdk/google-vertex": "^3.0.88",
"@ai-sdk/openai": "^2.0.80",
"@ai-sdk/openai-compatible": "^1.0.28",
"@ai-sdk/perplexity": "^2.0.21",
"@apollo/server": "^4.13.0",
"@fal-ai/serverless-client": "^0.15.0",
"@google-cloud/opentelemetry-cloud-trace-exporter": "^3.0.0",
"@google-cloud/opentelemetry-resource-util": "^3.0.0",
"@modelcontextprotocol/sdk": "^1.26.0",
"@nestjs-cls/transactional": "^2.7.0",
"@nestjs-cls/transactional-adapter-prisma": "^1.2.24",
"@nestjs/apollo": "^13.0.4",
@@ -55,18 +47,18 @@
"@node-rs/crc32": "^1.10.6",
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/core": "^2.2.0",
"@opentelemetry/exporter-prometheus": "^0.211.0",
"@opentelemetry/exporter-prometheus": "^0.212.0",
"@opentelemetry/exporter-zipkin": "^2.2.0",
"@opentelemetry/host-metrics": "^0.38.0",
"@opentelemetry/instrumentation": "^0.211.0",
"@opentelemetry/instrumentation-graphql": "^0.58.0",
"@opentelemetry/instrumentation-http": "^0.211.0",
"@opentelemetry/instrumentation-ioredis": "^0.59.0",
"@opentelemetry/instrumentation-nestjs-core": "^0.57.0",
"@opentelemetry/instrumentation-socket.io": "^0.57.0",
"@opentelemetry/instrumentation": "^0.212.0",
"@opentelemetry/instrumentation-graphql": "^0.60.0",
"@opentelemetry/instrumentation-http": "^0.212.0",
"@opentelemetry/instrumentation-ioredis": "^0.60.0",
"@opentelemetry/instrumentation-nestjs-core": "^0.58.0",
"@opentelemetry/instrumentation-socket.io": "^0.59.0",
"@opentelemetry/resources": "^2.2.0",
"@opentelemetry/sdk-metrics": "^2.2.0",
"@opentelemetry/sdk-node": "^0.211.0",
"@opentelemetry/sdk-node": "^0.212.0",
"@opentelemetry/sdk-trace-node": "^2.2.0",
"@opentelemetry/semantic-conventions": "^1.38.0",
"@prisma/client": "^6.6.0",
@@ -126,7 +118,6 @@
"@faker-js/faker": "^10.1.0",
"@nestjs/swagger": "^11.2.0",
"@nestjs/testing": "patch:@nestjs/testing@npm%3A10.4.15#~/.yarn/patches/@nestjs-testing-npm-10.4.15-d591a1705a.patch",
"@react-email/preview-server": "^4.3.2",
"@types/cookie-parser": "^1.4.8",
"@types/express": "^5.0.1",
"@types/express-serve-static-core": "^5.0.6",
@@ -142,8 +133,8 @@
"@types/react-dom": "^19.0.2",
"@types/semver": "^7.5.8",
"@types/sinon": "^21.0.0",
"@types/supertest": "^6.0.2",
"ava": "^6.4.0",
"@types/supertest": "^7.0.0",
"ava": "^7.0.0",
"c8": "^10.1.3",
"nodemon": "^3.1.14",
"react-email": "^4.3.2",

View File

@@ -43,7 +43,9 @@ Generated by [AVA](https://avajs.dev).
> Snapshot 5
Buffer @Uint8Array [
66616b65 20696d61 6765
89504e47 0d0a1a0a 0000000d 49484452 00000001 00000001 08040000 00b51c0c
02000000 0b494441 5478da63 fcff1f00 03030200 efa37c9f 00000000 49454e44
ae426082
]
## should preview link

View File

@@ -12,12 +12,12 @@ Generated by [AVA](https://avajs.dev).
{
messages: [
{
content: 'generate text to text',
content: 'generate text to text stream',
role: 'assistant',
},
],
pinned: false,
tokens: 8,
tokens: 10,
},
]
@@ -27,12 +27,12 @@ Generated by [AVA](https://avajs.dev).
{
messages: [
{
content: 'generate text to text',
content: 'generate text to text stream',
role: 'assistant',
},
],
pinned: false,
tokens: 8,
tokens: 10,
},
]

View File

@@ -4,31 +4,31 @@ import type { ExecutionContext, TestFn } from 'ava';
import ava from 'ava';
import { z } from 'zod';
import { ServerFeature, ServerService } from '../core';
import { AuthService } from '../core/auth';
import { QuotaModule } from '../core/quota';
import { Models } from '../models';
import { CopilotModule } from '../plugins/copilot';
import { prompts, PromptService } from '../plugins/copilot/prompt';
import { ServerFeature, ServerService } from '../../core';
import { AuthService } from '../../core/auth';
import { QuotaModule } from '../../core/quota';
import { Models } from '../../models';
import { CopilotModule } from '../../plugins/copilot';
import { prompts, PromptService } from '../../plugins/copilot/prompt';
import {
CopilotProviderFactory,
CopilotProviderType,
StreamObject,
StreamObjectSchema,
} from '../plugins/copilot/providers';
import { TranscriptionResponseSchema } from '../plugins/copilot/transcript/types';
} from '../../plugins/copilot/providers';
import { TranscriptionResponseSchema } from '../../plugins/copilot/transcript/types';
import {
CopilotChatTextExecutor,
CopilotWorkflowService,
GraphExecutorState,
} from '../plugins/copilot/workflow';
} from '../../plugins/copilot/workflow';
import {
CopilotChatImageExecutor,
CopilotCheckHtmlExecutor,
CopilotCheckJsonExecutor,
} from '../plugins/copilot/workflow/executor';
import { createTestingModule, TestingModule } from './utils';
import { TestAssets } from './utils/copilot';
} from '../../plugins/copilot/workflow/executor';
import { createTestingModule, TestingModule } from '../utils';
import { TestAssets } from '../utils/copilot';
type Tester = {
auth: AuthService;

View File

@@ -6,25 +6,26 @@ import type { TestFn } from 'ava';
import ava from 'ava';
import Sinon from 'sinon';
import { AppModule } from '../app.module';
import { JobQueue } from '../base';
import { ConfigModule } from '../base/config';
import { AuthService } from '../core/auth';
import { DocReader } from '../core/doc';
import { CopilotContextService } from '../plugins/copilot/context';
import { AppModule } from '../../app.module';
import { JobQueue } from '../../base';
import { ConfigModule } from '../../base/config';
import { AuthService } from '../../core/auth';
import { DocReader } from '../../core/doc';
import { CopilotContextService } from '../../plugins/copilot/context';
import {
CopilotEmbeddingJob,
MockEmbeddingClient,
} from '../plugins/copilot/embedding';
import { prompts, PromptService } from '../plugins/copilot/prompt';
} from '../../plugins/copilot/embedding';
import { ChatMessageCache } from '../../plugins/copilot/message';
import { prompts, PromptService } from '../../plugins/copilot/prompt';
import {
CopilotProviderFactory,
CopilotProviderType,
GeminiGenerativeProvider,
OpenAIProvider,
} from '../plugins/copilot/providers';
import { CopilotStorage } from '../plugins/copilot/storage';
import { MockCopilotProvider } from './mocks';
} from '../../plugins/copilot/providers';
import { CopilotStorage } from '../../plugins/copilot/storage';
import { MockCopilotProvider } from '../mocks';
import {
acceptInviteById,
createTestingApp,
@@ -33,7 +34,7 @@ import {
smallestPng,
TestingApp,
TestUser,
} from './utils';
} from '../utils';
import {
addContextDoc,
addContextFile,
@@ -67,7 +68,7 @@ import {
textToEventStream,
unsplashSearch,
updateCopilotSession,
} from './utils/copilot';
} from '../utils/copilot';
const test = ava as TestFn<{
auth: AuthService;
@@ -416,6 +417,7 @@ test('should be able to use test provider', async t => {
test('should create message correctly', async t => {
const { app } = t.context;
const messageCache = app.get(ChatMessageCache);
{
const { id } = await createWorkspace(app);
@@ -463,6 +465,19 @@ test('should create message correctly', async t => {
new File([new Uint8Array(pngData)], '1.png', { type: 'image/png' })
);
t.truthy(messageId, 'should be able to create message with blob');
const message = await messageCache.get(messageId);
const attachment = message?.attachments?.[0] as
| { attachment: string; mimeType: string }
| undefined;
const payload = Buffer.from(
attachment?.attachment.split(',').at(1) || '',
'base64'
);
t.is(attachment?.mimeType, 'image/webp');
t.is(payload.subarray(0, 4).toString('ascii'), 'RIFF');
t.is(payload.subarray(8, 12).toString('ascii'), 'WEBP');
}
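The assertions above check the first bytes of the converted attachment. A minimal sketch of that magic-byte check: a WebP file is a RIFF container where bytes 0-3 read `RIFF` and bytes 8-11 read `WEBP` (bytes 4-7 hold the chunk size). The helper name is illustrative:

```typescript
// Hedged helper mirroring the RIFF/WEBP header check in the test above.
export function looksLikeWebP(payload: Uint8Array): boolean {
  if (payload.length < 12) return false;
  const ascii = (start: number, end: number) =>
    String.fromCharCode(...payload.subarray(start, end));
  return ascii(0, 4) === 'RIFF' && ascii(8, 12) === 'WEBP';
}
```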
// with attachments
@@ -513,7 +528,11 @@ test('should be able to chat with api', async t => {
);
const messageId = await createCopilotMessage(app, sessionId);
const ret = await chatWithText(app, sessionId, messageId);
t.is(ret, 'generate text to text', 'should be able to chat with text');
t.is(
ret,
'generate text to text stream',
'should be able to chat with text'
);
const ret2 = await chatWithTextStream(app, sessionId, messageId);
t.is(
@@ -657,7 +676,7 @@ test('should be able to retry with api', async t => {
const histories = await getHistories(app, { workspaceId: id, docId });
t.deepEqual(
histories.map(h => h.messages.map(m => m.content)),
[['generate text to text', 'generate text to text']],
[['generate text to text stream', 'generate text to text stream']],
'should be able to list history'
);
}
@@ -794,7 +813,7 @@ test('should be able to list history', async t => {
const histories = await getHistories(app, { workspaceId, docId });
t.deepEqual(
histories.map(h => h.messages.map(m => m.content)),
[['hello', 'generate text to text']],
[['hello', 'generate text to text stream']],
'should be able to list history'
);
}
@@ -807,7 +826,7 @@ test('should be able to list history', async t => {
});
t.deepEqual(
histories.map(h => h.messages.map(m => m.content)),
[['generate text to text', 'hello']],
[['generate text to text stream', 'hello']],
'should be able to list history'
);
}
@@ -858,7 +877,7 @@ test('should reject request that user have not permission', async t => {
const histories = await getHistories(app, { workspaceId, docId });
t.deepEqual(
histories.map(h => h.messages.map(m => m.content)),
[['generate text to text']],
[['generate text to text stream']],
'should be able to list history'
);

View File

@@ -8,38 +8,38 @@ import ava from 'ava';
import { nanoid } from 'nanoid';
import Sinon from 'sinon';
import { EventBus, JobQueue } from '../base';
import { ConfigModule } from '../base/config';
import { AuthService } from '../core/auth';
import { QuotaModule } from '../core/quota';
import { StorageModule, WorkspaceBlobStorage } from '../core/storage';
import { EventBus, JobQueue } from '../../base';
import { ConfigModule } from '../../base/config';
import { AuthService } from '../../core/auth';
import { QuotaModule } from '../../core/quota';
import { StorageModule, WorkspaceBlobStorage } from '../../core/storage';
import {
ContextCategories,
CopilotSessionModel,
WorkspaceModel,
} from '../models';
import { CopilotModule } from '../plugins/copilot';
import { CopilotContextService } from '../plugins/copilot/context';
import { CopilotCronJobs } from '../plugins/copilot/cron';
} from '../../models';
import { CopilotModule } from '../../plugins/copilot';
import { CopilotContextService } from '../../plugins/copilot/context';
import { CopilotCronJobs } from '../../plugins/copilot/cron';
import {
CopilotEmbeddingJob,
MockEmbeddingClient,
} from '../plugins/copilot/embedding';
import { prompts, PromptService } from '../plugins/copilot/prompt';
} from '../../plugins/copilot/embedding';
import { prompts, PromptService } from '../../plugins/copilot/prompt';
import {
CopilotProviderFactory,
CopilotProviderType,
ModelInputType,
ModelOutputType,
OpenAIProvider,
} from '../plugins/copilot/providers';
} from '../../plugins/copilot/providers';
import {
CitationParser,
TextStreamParser,
} from '../plugins/copilot/providers/utils';
import { ChatSessionService } from '../plugins/copilot/session';
import { CopilotStorage } from '../plugins/copilot/storage';
import { CopilotTranscriptionService } from '../plugins/copilot/transcript';
} from '../../plugins/copilot/providers/utils';
import { ChatSessionService } from '../../plugins/copilot/session';
import { CopilotStorage } from '../../plugins/copilot/storage';
import { CopilotTranscriptionService } from '../../plugins/copilot/transcript';
import {
CopilotChatTextExecutor,
CopilotWorkflowService,
@@ -48,7 +48,7 @@ import {
WorkflowGraphExecutor,
type WorkflowNodeData,
WorkflowNodeType,
} from '../plugins/copilot/workflow';
} from '../../plugins/copilot/workflow';
import {
CopilotChatImageExecutor,
CopilotCheckHtmlExecutor,
@@ -56,16 +56,16 @@ import {
getWorkflowExecutor,
NodeExecuteState,
NodeExecutorType,
} from '../plugins/copilot/workflow/executor';
import { AutoRegisteredWorkflowExecutor } from '../plugins/copilot/workflow/executor/utils';
import { WorkflowGraphList } from '../plugins/copilot/workflow/graph';
import { CopilotWorkspaceService } from '../plugins/copilot/workspace';
import { PaymentModule } from '../plugins/payment';
import { SubscriptionService } from '../plugins/payment/service';
import { SubscriptionStatus } from '../plugins/payment/types';
import { MockCopilotProvider } from './mocks';
import { createTestingModule, TestingModule } from './utils';
import { WorkflowTestCases } from './utils/copilot';
} from '../../plugins/copilot/workflow/executor';
import { AutoRegisteredWorkflowExecutor } from '../../plugins/copilot/workflow/executor/utils';
import { WorkflowGraphList } from '../../plugins/copilot/workflow/graph';
import { CopilotWorkspaceService } from '../../plugins/copilot/workspace';
import { PaymentModule } from '../../plugins/payment';
import { SubscriptionService } from '../../plugins/payment/service';
import { SubscriptionStatus } from '../../plugins/payment/types';
import { MockCopilotProvider } from '../mocks';
import { createTestingModule, TestingModule } from '../utils';
import { WorkflowTestCases } from '../utils/copilot';
type Context = {
auth: AuthService;
@@ -364,6 +364,21 @@ test('should be able to manage chat session', async t => {
});
t.is(newSessionId, sessionId, 'should get same session id');
}
// should create a fresh session when reuseLatestChat is explicitly disabled
{
const newSessionId = await session.create({
userId,
promptName,
...commonParams,
reuseLatestChat: false,
});
t.not(
newSessionId,
sessionId,
'should create new session id when reuseLatestChat is false'
);
}
});
test('should be able to update chat session prompt', async t => {
@@ -881,6 +896,26 @@ test('should be able to get provider', async t => {
}
});
test('should resolve provider by prefixed model id', async t => {
const { factory } = t.context;
const provider = await factory.getProviderByModel('openai-default/test');
t.truthy(provider, 'should resolve prefixed model id');
t.is(provider?.type, CopilotProviderType.OpenAI);
const result = await provider?.text({ modelId: 'openai-default/test' }, [
{ role: 'user', content: 'hello' },
]);
t.is(result, 'generate text to text');
});
test('should fallback to null when prefixed provider id does not exist', async t => {
const { factory } = t.context;
const provider = await factory.getProviderByModel('unknown/test');
t.is(provider, null);
});
// ==================== workflow ====================
// this test used to preview the final result of the workflow
@@ -2063,25 +2098,23 @@ test('should handle copilot cron jobs correctly', async t => {
});
test('should resolve model correctly based on subscription status and prompt config', async t => {
const { db, session, subscription } = t.context;
const { prompt, session, subscription } = t.context;
// 1) Seed a prompt that has optionalModels and proModels in config
const promptName = 'resolve-model-test';
await db.aiPrompt.create({
data: {
name: promptName,
model: 'gemini-2.5-flash',
messages: {
create: [{ idx: 0, role: 'system', content: 'test' }],
},
config: { proModels: ['gemini-2.5-pro', 'claude-sonnet-4-5@20250929'] },
await prompt.set(
promptName,
'gemini-2.5-flash',
[{ role: 'system', content: 'test' }],
{ proModels: ['gemini-2.5-pro', 'claude-sonnet-4-5@20250929'] },
{
optionalModels: [
'gemini-2.5-flash',
'gemini-2.5-pro',
'claude-sonnet-4-5@20250929',
],
},
});
}
);
// 2) Create a chat session with this prompt
const sessionId = await session.create({
@@ -2106,6 +2139,16 @@ test('should resolve model correctly based on subscription status and prompt con
const model1 = await s.resolveModel(false, 'gemini-2.5-pro');
t.snapshot(model1, 'should honor requested pro model');
const model1WithPrefix = await s.resolveModel(
false,
'openai-default/gemini-2.5-pro'
);
t.is(
model1WithPrefix,
'openai-default/gemini-2.5-pro',
'should honor requested prefixed pro model'
);
const model2 = await s.resolveModel(false, 'not-in-optional');
t.snapshot(model2, 'should fallback to default model');
}
@@ -2119,6 +2162,16 @@ test('should resolve model correctly based on subscription status and prompt con
'should fallback to default model when requesting pro model during trialing'
);
const model3WithPrefix = await s.resolveModel(
true,
'openai-default/gemini-2.5-pro'
);
t.is(
model3WithPrefix,
'gemini-2.5-flash',
'should fallback to default model when requesting prefixed pro model during trialing'
);
const model4 = await s.resolveModel(true, 'gemini-2.5-flash');
t.snapshot(model4, 'should honor requested non-pro model during trialing');
@@ -2141,6 +2194,16 @@ test('should resolve model correctly based on subscription status and prompt con
const model7 = await s.resolveModel(true, 'claude-sonnet-4-5@20250929');
t.snapshot(model7, 'should honor requested pro model during active');
const model7WithPrefix = await s.resolveModel(
true,
'openai-default/claude-sonnet-4-5@20250929'
);
t.is(
model7WithPrefix,
'openai-default/claude-sonnet-4-5@20250929',
'should honor requested prefixed pro model during active'
);
const model8 = await s.resolveModel(true, 'not-in-optional');
t.snapshot(
model8,

View File

@@ -0,0 +1,210 @@
import test from 'ava';
import { z } from 'zod';
import type { NativeLlmRequest, NativeLlmStreamEvent } from '../../native';
import {
buildNativeRequest,
NativeProviderAdapter,
} from '../../plugins/copilot/providers/native';
const mockDispatch = () =>
(async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
yield { type: 'text_delta', text: 'Use [^1] now' };
yield { type: 'citation', index: 1, url: 'https://affine.pro' };
yield { type: 'done', finish_reason: 'stop' };
})();
test('NativeProviderAdapter streamText should append citation footnotes', async t => {
const adapter = new NativeProviderAdapter(mockDispatch, {}, 3);
const chunks: string[] = [];
for await (const chunk of adapter.streamText({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
const text = chunks.join('');
t.true(text.includes('Use [^1] now'));
t.true(
text.includes('[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}')
);
});
test('NativeProviderAdapter streamObject should append citation footnotes', async t => {
const adapter = new NativeProviderAdapter(mockDispatch, {}, 3);
const chunks = [];
for await (const chunk of adapter.streamObject({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
t.deepEqual(
chunks.map(chunk => chunk.type),
['text-delta', 'text-delta']
);
const text = chunks
.filter(chunk => chunk.type === 'text-delta')
.map(chunk => chunk.textDelta)
.join('');
t.true(text.includes('Use [^1] now'));
t.true(
text.includes('[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}')
);
});
test('NativeProviderAdapter streamObject should append fallback attachment footnotes', async t => {
const dispatch = () =>
(async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
yield {
type: 'tool_result',
call_id: 'call_1',
name: 'blob_read',
arguments: { blob_id: 'blob_1' },
output: {
blobId: 'blob_1',
fileName: 'a.txt',
fileType: 'text/plain',
content: 'A',
},
};
yield {
type: 'tool_result',
call_id: 'call_2',
name: 'blob_read',
arguments: { blob_id: 'blob_2' },
output: {
blobId: 'blob_2',
fileName: 'b.txt',
fileType: 'text/plain',
content: 'B',
},
};
yield { type: 'text_delta', text: 'Answer from files.' };
yield { type: 'done', finish_reason: 'stop' };
})();
const adapter = new NativeProviderAdapter(dispatch, {}, 3);
const chunks = [];
for await (const chunk of adapter.streamObject({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
const text = chunks
.filter(chunk => chunk.type === 'text-delta')
.map(chunk => chunk.textDelta)
.join('');
t.true(text.includes('Answer from files.'));
t.true(text.includes('[^1][^2]'));
t.true(
text.includes(
'[^1]: {"type":"attachment","blobId":"blob_1","fileName":"a.txt","fileType":"text/plain"}'
)
);
t.true(
text.includes(
'[^2]: {"type":"attachment","blobId":"blob_2","fileName":"b.txt","fileType":"text/plain"}'
)
);
});
test('NativeProviderAdapter streamObject should map tool and text events', async t => {
let round = 0;
const dispatch = (_request: NativeLlmRequest) =>
(async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
round += 1;
if (round === 1) {
yield {
type: 'tool_call',
call_id: 'call_1',
name: 'doc_read',
arguments: { doc_id: 'a1' },
};
yield { type: 'done', finish_reason: 'tool_calls' };
return;
}
yield { type: 'text_delta', text: 'ok' };
yield { type: 'done', finish_reason: 'stop' };
})();
const adapter = new NativeProviderAdapter(
dispatch,
{
doc_read: {
inputSchema: z.object({ doc_id: z.string() }),
execute: async () => ({ markdown: '# a1' }),
},
},
4
);
const events = [];
for await (const event of adapter.streamObject({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'read' }] }],
})) {
events.push(event);
}
t.deepEqual(
events.map(event => event.type),
['tool-call', 'tool-result', 'text-delta']
);
t.deepEqual(events[0], {
type: 'tool-call',
toolCallId: 'call_1',
toolName: 'doc_read',
args: { doc_id: 'a1' },
});
});
test('buildNativeRequest should include rust middleware from profile', async t => {
const { request } = await buildNativeRequest({
model: 'gpt-4.1',
messages: [{ role: 'user', content: 'hello' }],
tools: {},
middleware: {
rust: {
request: ['normalize_messages', 'clamp_max_tokens'],
stream: ['stream_event_normalize', 'citation_indexing'],
},
node: {
text: ['callout'],
},
},
});
t.deepEqual(request.middleware, {
request: ['normalize_messages', 'clamp_max_tokens'],
stream: ['stream_event_normalize', 'citation_indexing'],
});
});
test('NativeProviderAdapter streamText should skip citation footnotes when disabled', async t => {
const adapter = new NativeProviderAdapter(mockDispatch, {}, 3, {
nodeTextMiddleware: ['callout'],
});
const chunks: string[] = [];
for await (const chunk of adapter.streamText({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
const text = chunks.join('');
t.true(text.includes('Use [^1] now'));
t.false(
text.includes('[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}')
);
});
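The footnote strings asserted throughout these tests follow one wire format: each citation becomes a markdown footnote definition whose body is a JSON payload with a percent-encoded URL. A hedged one-liner reproducing that format (the function name is illustrative, not the adapter's API):

```typescript
// Produces the footnote format the adapter tests assert on, e.g.
// [^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}
export function citationFootnote(index: number, url: string): string {
  return `[^${index}]: ${JSON.stringify({ type: 'url', url: encodeURIComponent(url) })}`;
}
```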

View File

@@ -0,0 +1,56 @@
import test from 'ava';
import { resolveProviderMiddleware } from '../../plugins/copilot/providers/provider-middleware';
import { buildProviderRegistry } from '../../plugins/copilot/providers/provider-registry';
import { CopilotProviderType } from '../../plugins/copilot/providers/types';
test('resolveProviderMiddleware should include anthropic defaults', t => {
const middleware = resolveProviderMiddleware(CopilotProviderType.Anthropic);
t.deepEqual(middleware.rust?.request, [
'normalize_messages',
'tool_schema_rewrite',
]);
t.deepEqual(middleware.rust?.stream, [
'stream_event_normalize',
'citation_indexing',
]);
t.deepEqual(middleware.node?.text, ['citation_footnote', 'callout']);
});
test('resolveProviderMiddleware should merge defaults and overrides', t => {
const middleware = resolveProviderMiddleware(CopilotProviderType.OpenAI, {
rust: { request: ['clamp_max_tokens'] },
node: { text: ['thinking_format'] },
});
t.deepEqual(middleware.rust?.request, [
'normalize_messages',
'clamp_max_tokens',
]);
t.deepEqual(middleware.node?.text, [
'citation_footnote',
'callout',
'thinking_format',
]);
});
test('buildProviderRegistry should normalize profile middleware defaults', t => {
const registry = buildProviderRegistry({
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: '1' },
},
],
});
const profile = registry.profiles.get('openai-main');
t.truthy(profile);
t.deepEqual(profile?.middleware.rust?.stream, [
'stream_event_normalize',
'citation_indexing',
]);
t.deepEqual(profile?.middleware.node?.text, ['citation_footnote', 'callout']);
});
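The merge behavior these tests exercise is defaults-first: per-provider default middleware comes first, profile overrides are appended, and duplicates are dropped while preserving order. A hedged sketch of that rule (illustrative name; the real `resolveProviderMiddleware` also handles the nested rust/node structure):

```typescript
// Defaults-then-overrides merge with order-preserving dedupe, as the
// expectations in the tests above imply.
export function mergeMiddleware(
  defaults: string[],
  overrides: string[] = []
): string[] {
  return [...new Set([...defaults, ...overrides])];
}
```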

View File

@@ -0,0 +1,99 @@
import test from 'ava';
import { ProviderMiddlewareConfig } from '../../plugins/copilot/config';
import { CopilotProvider } from '../../plugins/copilot/providers/provider';
import {
CopilotProviderType,
ModelInputType,
ModelOutputType,
} from '../../plugins/copilot/providers/types';
class TestOpenAIProvider extends CopilotProvider<{ apiKey: string }> {
readonly type = CopilotProviderType.OpenAI;
readonly models = [
{
id: 'gpt-4.1',
capabilities: [
{
input: [ModelInputType.Text],
output: [ModelOutputType.Text],
defaultForOutputType: true,
},
],
},
];
configured() {
return true;
}
async text(_cond: any, _messages: any[], _options?: any) {
return '';
}
async *streamText(_cond: any, _messages: any[], _options?: any) {
yield '';
}
exposeMetricLabels() {
return this.metricLabels('gpt-4.1');
}
exposeMiddleware() {
return this.getActiveProviderMiddleware();
}
}
function createProvider(profileMiddleware?: ProviderMiddlewareConfig) {
const provider = new TestOpenAIProvider();
(provider as any).AFFiNEConfig = {
copilot: {
providers: {
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: 'test' },
middleware: profileMiddleware,
},
],
defaults: {},
openai: { apiKey: 'legacy' },
},
},
};
return provider;
}
test('metricLabels should include active provider id', t => {
const provider = createProvider();
const labels = provider.runWithProfile('openai-main', () =>
provider.exposeMetricLabels()
);
t.is(labels.providerId, 'openai-main');
});
test('getActiveProviderMiddleware should merge defaults with profile override', t => {
const provider = createProvider({
rust: { request: ['clamp_max_tokens'] },
node: { text: ['thinking_format'] },
});
const middleware = provider.runWithProfile('openai-main', () =>
provider.exposeMiddleware()
);
t.deepEqual(middleware.rust?.request, [
'normalize_messages',
'clamp_max_tokens',
]);
t.deepEqual(middleware.rust?.stream, [
'stream_event_normalize',
'citation_indexing',
]);
t.deepEqual(middleware.node?.text, [
'citation_footnote',
'callout',
'thinking_format',
]);
});

View File

@@ -0,0 +1,165 @@
import test from 'ava';
import {
buildProviderRegistry,
resolveModel,
stripProviderPrefix,
} from '../../plugins/copilot/providers/provider-registry';
import {
CopilotProviderType,
ModelOutputType,
} from '../../plugins/copilot/providers/types';
test('buildProviderRegistry should keep explicit profile over legacy compatibility profile', t => {
const registry = buildProviderRegistry({
profiles: [
{
id: 'openai-default',
type: CopilotProviderType.OpenAI,
priority: 100,
config: { apiKey: 'new' },
},
],
openai: { apiKey: 'legacy' },
});
const profile = registry.profiles.get('openai-default');
t.truthy(profile);
t.deepEqual(profile?.config, { apiKey: 'new' });
});
test('buildProviderRegistry should reject duplicated profile ids', t => {
const error = t.throws(() =>
buildProviderRegistry({
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: '1' },
},
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: '2' },
},
],
})
) as Error;
t.truthy(error);
t.regex(error.message, /Duplicated copilot provider profile id/);
});
test('buildProviderRegistry should reject defaults that reference unknown providers', t => {
const error = t.throws(() =>
buildProviderRegistry({
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: '1' },
},
],
defaults: {
fallback: 'unknown-provider',
},
})
) as Error;
t.truthy(error);
t.regex(error.message, /defaults references unknown providerId/);
});
test('resolveModel should support explicit provider prefix and keep slash models untouched', t => {
const registry = buildProviderRegistry({
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: '1' },
},
{
id: 'fal-main',
type: CopilotProviderType.FAL,
config: { apiKey: '2' },
},
],
});
const prefixed = resolveModel({
registry,
modelId: 'openai-main/gpt-4.1',
});
t.deepEqual(prefixed, {
rawModelId: 'openai-main/gpt-4.1',
modelId: 'gpt-4.1',
explicitProviderId: 'openai-main',
candidateProviderIds: ['openai-main'],
});
const slashModel = resolveModel({
registry,
modelId: 'lora/image-to-image',
});
t.is(slashModel.modelId, 'lora/image-to-image');
t.false(slashModel.candidateProviderIds.includes('lora'));
});
test('resolveModel should follow defaults -> fallback -> order and apply filters', t => {
const registry = buildProviderRegistry({
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
priority: 10,
config: { apiKey: '1' },
},
{
id: 'anthropic-main',
type: CopilotProviderType.Anthropic,
priority: 5,
config: { apiKey: '2' },
},
{
id: 'fal-main',
type: CopilotProviderType.FAL,
priority: 1,
config: { apiKey: '3' },
},
],
defaults: {
[ModelOutputType.Text]: 'anthropic-main',
fallback: 'openai-main',
},
});
const routed = resolveModel({
registry,
outputType: ModelOutputType.Text,
preferredProviderIds: ['openai-main', 'fal-main'],
});
t.deepEqual(routed.candidateProviderIds, ['openai-main', 'fal-main']);
});
test('stripProviderPrefix should only strip matched provider prefix', t => {
const registry = buildProviderRegistry({
profiles: [
{
id: 'openai-main',
type: CopilotProviderType.OpenAI,
config: { apiKey: '1' },
},
],
});
t.is(
stripProviderPrefix(registry, 'openai-main', 'openai-main/gpt-4.1'),
'gpt-4.1'
);
t.is(
stripProviderPrefix(registry, 'openai-main', 'another-main/gpt-4.1'),
'another-main/gpt-4.1'
);
t.is(stripProviderPrefix(registry, 'openai-main', 'gpt-4.1'), 'gpt-4.1');
});
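The prefix rule these assertions pin down can be sketched independently of the registry (a hedged sketch; `stripPrefix` is a hypothetical stand-in, not the real helper):

```typescript
// Hedged sketch of the rule the tests above assert: a model id is only
// stripped when it begins with "<providerId>/"; any other slash-bearing
// id (e.g. "lora/image-to-image") passes through untouched.
function stripPrefix(providerId: string, modelId: string): string {
  const prefix = `${providerId}/`;
  return modelId.startsWith(prefix) ? modelId.slice(prefix.length) : modelId;
}
```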


@@ -0,0 +1,134 @@
import test from 'ava';
import { z } from 'zod';
import { NativeLlmRequest, NativeLlmStreamEvent } from '../../native';
import {
ToolCallAccumulator,
ToolCallLoop,
ToolSchemaExtractor,
} from '../../plugins/copilot/providers/loop';
test('ToolCallAccumulator should merge deltas and complete tool call', t => {
const accumulator = new ToolCallAccumulator();
accumulator.feedDelta({
type: 'tool_call_delta',
call_id: 'call_1',
name: 'doc_read',
arguments_delta: '{"doc_id":"',
});
accumulator.feedDelta({
type: 'tool_call_delta',
call_id: 'call_1',
arguments_delta: 'a1"}',
});
const completed = accumulator.complete({
type: 'tool_call',
call_id: 'call_1',
name: 'doc_read',
arguments: { doc_id: 'a1' },
});
t.deepEqual(completed, {
id: 'call_1',
name: 'doc_read',
args: { doc_id: 'a1' },
thought: undefined,
});
});
test('ToolSchemaExtractor should convert zod schema to json schema', t => {
const toolSet = {
doc_read: {
description: 'Read doc',
inputSchema: z.object({
doc_id: z.string(),
limit: z.number().optional(),
}),
execute: async () => ({}),
},
};
const extracted = ToolSchemaExtractor.extract(toolSet);
t.deepEqual(extracted, [
{
name: 'doc_read',
description: 'Read doc',
parameters: {
type: 'object',
properties: {
doc_id: { type: 'string' },
limit: { type: 'number' },
},
additionalProperties: false,
required: ['doc_id'],
},
},
]);
});
test('ToolCallLoop should execute tool call and continue to next round', async t => {
const dispatchRequests: NativeLlmRequest[] = [];
const dispatch = (request: NativeLlmRequest) => {
dispatchRequests.push(request);
const round = dispatchRequests.length;
return (async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
if (round === 1) {
yield {
type: 'tool_call_delta',
call_id: 'call_1',
name: 'doc_read',
arguments_delta: '{"doc_id":"a1"}',
};
yield {
type: 'tool_call',
call_id: 'call_1',
name: 'doc_read',
arguments: { doc_id: 'a1' },
};
yield { type: 'done', finish_reason: 'tool_calls' };
return;
}
yield { type: 'text_delta', text: 'done' };
yield { type: 'done', finish_reason: 'stop' };
})();
};
let executedArgs: Record<string, unknown> | null = null;
const loop = new ToolCallLoop(
dispatch,
{
doc_read: {
inputSchema: z.object({ doc_id: z.string() }),
execute: async args => {
executedArgs = args;
return { markdown: '# doc' };
},
},
},
4
);
const events: NativeLlmStreamEvent[] = [];
for await (const event of loop.run({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'read doc' }] }],
})) {
events.push(event);
}
t.deepEqual(executedArgs, { doc_id: 'a1' });
t.true(
dispatchRequests[1]?.messages.some(message => message.role === 'tool')
);
t.deepEqual(
events.map(event => event.type),
['tool_call', 'tool_result', 'text_delta', 'done']
);
});


@@ -0,0 +1,116 @@
import test from 'ava';
import { z } from 'zod';
import {
chatToGPTMessage,
CitationFootnoteFormatter,
CitationParser,
StreamPatternParser,
} from '../../plugins/copilot/providers/utils';
test('CitationFootnoteFormatter should format sorted footnotes from citation events', t => {
const formatter = new CitationFootnoteFormatter();
formatter.consume({
type: 'citation',
index: 2,
url: 'https://example.com/b',
});
formatter.consume({
type: 'citation',
index: 1,
url: 'https://example.com/a',
});
t.is(
formatter.end(),
[
'[^1]: {"type":"url","url":"https%3A%2F%2Fexample.com%2Fa"}',
'[^2]: {"type":"url","url":"https%3A%2F%2Fexample.com%2Fb"}',
].join('\n')
);
});
test('CitationFootnoteFormatter should overwrite duplicated index with latest url', t => {
const formatter = new CitationFootnoteFormatter();
formatter.consume({
type: 'citation',
index: 1,
url: 'https://example.com/old',
});
formatter.consume({
type: 'citation',
index: 1,
url: 'https://example.com/new',
});
t.is(
formatter.end(),
'[^1]: {"type":"url","url":"https%3A%2F%2Fexample.com%2Fnew"}'
);
});
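The footnote format both tests expect (ascending indices, URL percent-encoded inside a JSON payload, latest URL winning per index) can be reproduced with a small sketch; `formatFootnotes` is a hypothetical name, not the formatter's API:

```typescript
// Hedged sketch of the output format asserted above: one footnote line
// per index, sorted ascending, with the URL percent-encoded inside JSON.
// A Map naturally models the "latest url wins" overwrite behavior.
function formatFootnotes(citations: Map<number, string>): string {
  return [...citations.entries()]
    .sort(([a], [b]) => a - b)
    .map(([i, url]) => `[^${i}]: {"type":"url","url":"${encodeURIComponent(url)}"}`)
    .join('\n');
}
```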
test('StreamPatternParser should keep state across chunks', t => {
const parser = new StreamPatternParser(pattern => {
if (pattern.kind === 'wrappedLink') {
return `[^${pattern.url}]`;
}
if (pattern.kind === 'index') {
return `[#${pattern.value}]`;
}
return `[${pattern.text}](${pattern.url})`;
});
const first = parser.write('ref ([AFFiNE](https://affine.pro');
const second = parser.write(')) and [2]');
t.is(first, 'ref ');
t.is(second, '[^https://affine.pro] and [#2]');
t.is(parser.end(), '');
});
test('CitationParser should convert wrapped links to numbered footnotes', t => {
const parser = new CitationParser();
const output = parser.parse('Use ([AFFiNE](https://affine.pro)) now');
t.is(output, 'Use [^1] now');
t.regex(
parser.end(),
/\[\^1\]: \{"type":"url","url":"https%3A%2F%2Faffine.pro"\}/
);
});
test('chatToGPTMessage should not mutate input and should keep system schema', async t => {
const schema = z.object({
query: z.string(),
});
const messages = [
{
role: 'system' as const,
content: 'You are helper',
params: { schema },
},
{
role: 'user' as const,
content: '',
attachments: ['https://example.com/a.png'],
},
];
const firstRef = messages[0];
const secondRef = messages[1];
const [system, normalized, parsedSchema] = await chatToGPTMessage(
messages,
false
);
t.is(system, 'You are helper');
t.is(parsedSchema, schema);
t.is(messages.length, 2);
t.is(messages[0], firstRef);
t.is(messages[1], secondRef);
t.deepEqual(normalized[0], {
role: 'user',
content: [{ type: 'text', text: '[no content]' }],
});
});


@@ -9,6 +9,16 @@ Generated by [AVA](https://avajs.dev).
> Snapshot 1
{
knownUnsupportedBlocks: [
'RX4CG2zsBk:affine:note',
'S1mkc8zUoU:affine:note',
'yGlBdshAqN:affine:note',
'6lDiuDqZGL:affine:note',
'cauvaHOQmh:affine:note',
'2jwCeO8Yot:affine:note',
'c9MF_JiRgx:affine:note',
'6x7ALjUDjj:affine:surface',
],
markdown: `AFFiNE is an open source all in one workspace, an operating system for all the building blocks of your team wiki, knowledge management and digital assets and a better alternative to Notion and Miro.␊
@@ -70,35 +80,9 @@ Generated by [AVA](https://avajs.dev).
[](Bookmark,https://affine.pro/)␊
[](Bookmark,https://www.youtube.com/@affinepro)␊
<img␊
src="blob://BFZk3c2ERp-sliRvA7MQ_p3NdkdCLt2Ze0DQ9i21dpA="␊
alt=""␊
width="1302"␊
height="728"␊
/>␊
<img␊
src="blob://HWvCItS78DzPGbwcuaGcfkpVDUvL98IvH5SIK8-AcL8="␊
alt=""␊
width="1463"␊
height="374"␊
/>␊
<img␊
src="blob://ZRKpsBoC88qEMmeiXKXqywfA1rLvWoLa5rpEh9x9Oj0="␊
alt=""␊
width="862"␊
height="1388"␊
/>␊
`,
title: 'Write, Draw, Plan all at Once.',
unknownBlocks: [],
}
## should get doc markdown return null when doc not exists


@@ -1,5 +1,6 @@
import test from 'ava';
import { normalizeSMTPHeloHostname } from '../core/mail/utils';
import { Renderers } from '../mails';
import { TEST_DOC, TEST_USER } from '../mails/common';
@@ -21,3 +22,22 @@ test('should render mention email with empty doc title', async t => {
});
t.snapshot(content.html, content.subject);
});
test('should normalize valid SMTP HELO hostnames', t => {
t.is(normalizeSMTPHeloHostname('mail.example.com'), 'mail.example.com');
t.is(normalizeSMTPHeloHostname(' localhost '), 'localhost');
t.is(normalizeSMTPHeloHostname('[127.0.0.1]'), '[127.0.0.1]');
t.is(normalizeSMTPHeloHostname('[IPv6:2001:db8::1]'), '[IPv6:2001:db8::1]');
});
test('should reject invalid SMTP HELO hostnames', t => {
t.is(normalizeSMTPHeloHostname(''), undefined);
t.is(normalizeSMTPHeloHostname(' '), undefined);
t.is(normalizeSMTPHeloHostname('AFFiNE Server'), undefined);
t.is(normalizeSMTPHeloHostname('-example.com'), undefined);
t.is(normalizeSMTPHeloHostname('example-.com'), undefined);
t.is(normalizeSMTPHeloHostname('example..com'), undefined);
t.is(normalizeSMTPHeloHostname('[bad host]'), undefined);
t.is(normalizeSMTPHeloHostname('[foo]'), undefined);
t.is(normalizeSMTPHeloHostname('[IPv6:foo]'), undefined);
});
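One way to satisfy every case these two tests assert is a trim-then-validate pass over three shapes: plain hostnames, bracketed IPv4 literals, and `[IPv6:...]` literals. This is only a sketch of behavior consistent with the assertions; the real `normalizeSMTPHeloHostname` may be implemented differently:

```typescript
// Hedged sketch of a HELO hostname normalizer matching the tests above.
function normalizeHelo(input: string): string | undefined {
  const host = input.trim();
  if (!host) return undefined;
  // Bracketed IPv4 address literal, e.g. [127.0.0.1]
  if (/^\[(\d{1,3}\.){3}\d{1,3}\]$/.test(host)) return host;
  // Bracketed IPv6 address literal, e.g. [IPv6:2001:db8::1];
  // the hex/colon character class rejects garbage like [IPv6:foo].
  if (/^\[IPv6:[0-9a-fA-F:]+\]$/.test(host)) return host;
  // Hostname: dot-separated labels, no empty label,
  // no leading or trailing hyphen in any label.
  const label = '[a-zA-Z0-9](?:[a-zA-Z0-9-]*[a-zA-Z0-9])?';
  if (new RegExp(`^${label}(?:\\.${label})*$`).test(host)) return host;
  return undefined;
}
```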


@@ -8,6 +8,7 @@ export class MockEventBus {
emit = this.stub.emitAsync;
emitAsync = this.stub.emitAsync;
emitDetached = this.stub.emitAsync;
broadcast = this.stub.broadcast;
last<Event extends EventName>(


@@ -0,0 +1,82 @@
import test from 'ava';
import { NativeStreamAdapter } from '../native';
test('NativeStreamAdapter should support buffered and awaited consumption', async t => {
const adapter = new NativeStreamAdapter<number>(undefined);
adapter.push(1);
const first = await adapter.next();
t.deepEqual(first, { value: 1, done: false });
const pending = adapter.next();
adapter.push(2);
const second = await pending;
t.deepEqual(second, { value: 2, done: false });
adapter.push(null);
const done = await adapter.next();
t.true(done.done);
});
test('NativeStreamAdapter return should abort handle and end iteration', async t => {
let abortCount = 0;
const adapter = new NativeStreamAdapter<number>({
abort: () => {
abortCount += 1;
},
});
const ended = await adapter.return();
t.is(abortCount, 1);
t.true(ended.done);
const secondReturn = await adapter.return();
t.true(secondReturn.done);
t.is(abortCount, 1);
const next = await adapter.next();
t.true(next.done);
});
test('NativeStreamAdapter should abort when AbortSignal is triggered', async t => {
let abortCount = 0;
const controller = new AbortController();
const adapter = new NativeStreamAdapter<number>(
{
abort: () => {
abortCount += 1;
},
},
controller.signal
);
const pending = adapter.next();
controller.abort();
const done = await pending;
t.true(done.done);
t.is(abortCount, 1);
});
test('NativeStreamAdapter should end immediately for pre-aborted signal', async t => {
let abortCount = 0;
const controller = new AbortController();
controller.abort();
const adapter = new NativeStreamAdapter<number>(
{
abort: () => {
abortCount += 1;
},
},
controller.signal
);
const next = await adapter.next();
t.true(next.done);
t.is(abortCount, 1);
adapter.push(1);
const stillDone = await adapter.next();
t.true(stillDone.done);
});
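The buffered-vs-awaited behavior the first test exercises is the classic push-to-pull bridge. A minimal sketch under assumed semantics (`PushIterator` is hypothetical; abort/AbortSignal handling from the real adapter is omitted):

```typescript
// Minimal sketch of the push-to-pull bridge the tests above exercise:
// push(value) feeds the stream, push(null) ends it, and next() either
// drains the buffer or parks a waiter until the next push arrives.
class PushIterator<T> implements AsyncIterableIterator<T> {
  private buffer: (T | null)[] = [];
  private waiters: ((result: IteratorResult<T>) => void)[] = [];
  private ended = false;

  push(value: T | null) {
    if (this.ended) return;
    const waiter = this.waiters.shift();
    if (!waiter) {
      this.buffer.push(value);
    } else if (value === null) {
      this.ended = true;
      waiter({ value: undefined, done: true });
    } else {
      waiter({ value, done: false });
    }
  }

  async next(): Promise<IteratorResult<T>> {
    if (this.buffer.length > 0) {
      const value = this.buffer.shift()!;
      if (value === null) {
        this.ended = true;
        return { value: undefined, done: true };
      }
      return { value, done: false };
    }
    if (this.ended) return { value: undefined, done: true };
    return new Promise(resolve => this.waiters.push(resolve));
  }

  [Symbol.asyncIterator]() {
    return this;
  }
}
```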


@@ -4,9 +4,9 @@ import type { TestFn } from 'ava';
import ava from 'ava';
import {
createBmp,
createTestingApp,
getPublicUserById,
smallestGif,
smallestPng,
TestingApp,
updateAvatar,
@@ -40,7 +40,10 @@ test('should be able to upload user avatar', async t => {
const avatarRes = await app.GET(new URL(avatarUrl).pathname);
t.deepEqual(avatarRes.body, avatar);
t.true(avatarRes.headers['content-type'].startsWith('image/webp'));
t.notDeepEqual(avatarRes.body, avatar);
t.is(avatarRes.body.subarray(0, 4).toString('ascii'), 'RIFF');
t.is(avatarRes.body.subarray(8, 12).toString('ascii'), 'WEBP');
});
test('should be able to update user avatar, and invalidate old avatar url', async t => {
@@ -54,9 +57,7 @@ test('should be able to update user avatar, and invalidate old avatar url', asyn
const oldAvatarUrl = res.body.data.uploadAvatar.avatarUrl;
const newAvatar = await fetch(smallestGif)
.then(res => res.arrayBuffer())
.then(b => Buffer.from(b));
const newAvatar = createBmp(32, 32);
res = await updateAvatar(app, newAvatar);
const newAvatarUrl = res.body.data.uploadAvatar.avatarUrl;
@@ -66,7 +67,46 @@ test('should be able to update user avatar, and invalidate old avatar url', asyn
t.is(avatarRes.status, 404);
const newAvatarRes = await app.GET(new URL(newAvatarUrl).pathname);
t.deepEqual(newAvatarRes.body, newAvatar);
t.true(newAvatarRes.headers['content-type'].startsWith('image/webp'));
t.notDeepEqual(newAvatarRes.body, newAvatar);
t.is(newAvatarRes.body.subarray(0, 4).toString('ascii'), 'RIFF');
t.is(newAvatarRes.body.subarray(8, 12).toString('ascii'), 'WEBP');
});
test('should accept avatar uploads up to 5MB after conversion', async t => {
const { app } = t.context;
await app.signup();
const avatar = createBmp(1024, 1024);
t.true(avatar.length > 500 * 1024);
t.true(avatar.length < 5 * 1024 * 1024);
const res = await updateAvatar(app, avatar, {
filename: 'large.bmp',
contentType: 'image/bmp',
});
t.is(res.status, 200);
const avatarUrl = res.body.data.uploadAvatar.avatarUrl;
const avatarRes = await app.GET(new URL(avatarUrl).pathname);
t.true(avatarRes.headers['content-type'].startsWith('image/webp'));
});
test('should reject unsupported vector avatars', async t => {
const { app } = t.context;
await app.signup();
const avatar = Buffer.from(
'<svg xmlns="http://www.w3.org/2000/svg" width="10" height="10"></svg>'
);
const res = await updateAvatar(app, avatar, {
filename: 'avatar.svg',
contentType: 'image/svg+xml',
});
t.is(res.status, 200);
t.is(res.body.errors[0].message, 'Image format not supported: image/svg+xml');
});
test('should be able to get public user by id', async t => {


@@ -7,6 +7,35 @@ export const smallestPng =
'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAgAAAAIAQMAAAD+wSzIAAAABlBMVEX///+/v7+jQ3Y5AAAADklEQVQI12P4AIX8EAgALgAD/aNpbtEAAAAASUVORK5CYII';
export const smallestGif = 'data:image/gif;base64,R0lGODlhAQABAAAAACw=';
export function createBmp(width: number, height: number) {
const rowSize = Math.ceil((width * 3) / 4) * 4;
const pixelDataSize = rowSize * height;
const fileSize = 54 + pixelDataSize;
const buffer = Buffer.alloc(fileSize);
buffer.write('BM', 0, 'ascii');
buffer.writeUInt32LE(fileSize, 2);
buffer.writeUInt32LE(54, 10);
buffer.writeUInt32LE(40, 14);
buffer.writeInt32LE(width, 18);
buffer.writeInt32LE(height, 22);
buffer.writeUInt16LE(1, 26);
buffer.writeUInt16LE(24, 28);
buffer.writeUInt32LE(pixelDataSize, 34);
for (let y = 0; y < height; y++) {
const rowOffset = 54 + y * rowSize;
for (let x = 0; x < width; x++) {
const pixelOffset = rowOffset + x * 3;
buffer[pixelOffset] = 0x33;
buffer[pixelOffset + 1] = 0x66;
buffer[pixelOffset + 2] = 0x99;
}
}
return buffer;
}
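The helper's size arithmetic follows the classic 24-bit BMP layout: a 54-byte header (14-byte file header plus 40-byte BITMAPINFOHEADER) and pixel rows padded up to 4-byte boundaries. A standalone sketch of just that arithmetic (hypothetical function names):

```typescript
// Each 24-bit BMP pixel row is padded up to a 4-byte boundary.
function bmpRowSize(width: number): number {
  return Math.ceil((width * 3) / 4) * 4;
}

// Total size = 54-byte header (14 + 40) + padded pixel rows.
function bmpFileSize(width: number, height: number): number {
  return 54 + bmpRowSize(width) * height;
}
```

For the 1024x1024 avatar in the 5MB-limit test this gives 54 + 3072 * 1024 = 3,145,782 bytes, which sits between the 500KB and 5MB bounds that test asserts.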
export async function listBlobs(
app: TestingApp,
workspaceId: string


@@ -629,14 +629,35 @@ export async function chatWithText(
prefix = '',
retry?: boolean
): Promise<string> {
const endpoint = prefix || '/stream';
const query = messageId
? `?messageId=${messageId}` + (retry ? '&retry=true' : '')
: '';
const res = await app
.GET(`/api/copilot/chat/${sessionId}${prefix}${query}`)
.GET(`/api/copilot/chat/${sessionId}${endpoint}${query}`)
.expect(200);
return res.text;
if (prefix) {
return res.text;
}
const events = sse2array(res.text);
const errorEvent = events.find(event => event.event === 'error');
if (errorEvent?.data) {
let message = errorEvent.data;
try {
const parsed = JSON.parse(errorEvent.data);
message = parsed.message || message;
} catch {
// noop: keep raw error data
}
throw new Error(message);
}
return events
.filter(event => event.event === 'message')
.map(event => event.data ?? '')
.join('');
}
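The rewritten helper leans on an `sse2array` utility to split the stream body into `{ event, data }` records. A hedged sketch of what such a splitter does (names and shape assumed from the usage above; the project's actual parser may differ):

```typescript
// Hedged sketch of an SSE body splitter: records are separated by blank
// lines; each record holds "field: value" lines, and repeated data lines
// within one record are concatenated.
function sseToRecords(text: string): { event?: string; data?: string }[] {
  return text
    .split('\n\n')
    .filter(chunk => chunk.trim().length > 0)
    .map(chunk => {
      const record: { event?: string; data?: string } = {};
      for (const line of chunk.split('\n')) {
        const sep = line.indexOf(': ');
        if (sep === -1) continue;
        const field = line.slice(0, sep);
        const value = line.slice(sep + 2);
        if (field === 'event') record.event = value;
        else if (field === 'data') record.data = (record.data ?? '') + value;
      }
      return record;
    });
}
```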
export async function chatWithTextStream(


@@ -121,7 +121,11 @@ export async function deleteAccount(app: TestingApp) {
return res.deleteAccount.success;
}
export async function updateAvatar(app: TestingApp, avatar: Buffer) {
export async function updateAvatar(
app: TestingApp,
avatar: Buffer,
options: { filename?: string; contentType?: string } = {}
) {
return app
.POST('/graphql')
.field(
@@ -138,7 +142,7 @@ export async function updateAvatar(app: TestingApp, avatar: Buffer) {
)
.field('map', JSON.stringify({ '0': ['variables.avatar'] }))
.attach('0', avatar, {
filename: 'test.png',
contentType: 'image/png',
filename: options.filename || 'test.png',
contentType: options.contentType || 'image/png',
});
}


@@ -38,8 +38,11 @@ test.before(async t => {
t.context.app = app;
});
test.after.always(async t => {
test.afterEach.always(() => {
Sinon.restore();
});
test.after.always(async t => {
__resetDnsLookupForTests();
await t.context.app.close();
});
@@ -80,6 +83,7 @@ const assertAndSnapshotRaw = async (
test('should proxy image', async t => {
const assertAndSnapshot = assertAndSnapshotRaw.bind(null, t);
const imageUrl = `http://example.com/image-${Date.now()}.png`;
await assertAndSnapshot(
'/api/worker/image-proxy',
@@ -105,7 +109,7 @@ test('should proxy image', async t => {
{
await assertAndSnapshot(
'/api/worker/image-proxy?url=http://example.com/image.png',
`/api/worker/image-proxy?url=${imageUrl}`,
'should return 400 if origin and referer are missing',
{ status: 400, origin: null, referer: null }
);
@@ -113,14 +117,17 @@ test('should proxy image', async t => {
{
await assertAndSnapshot(
'/api/worker/image-proxy?url=http://example.com/image.png',
`/api/worker/image-proxy?url=${imageUrl}`,
'should return 400 for invalid origin header',
{ status: 400, origin: 'http://invalid.com' }
);
}
{
const fakeBuffer = Buffer.from('fake image');
const fakeBuffer = Buffer.from(
'iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mP8/x8AAwMCAO+jfJ8AAAAASUVORK5CYII=',
'base64'
);
const fakeResponse = new Response(fakeBuffer, {
status: 200,
headers: {
@@ -130,13 +137,14 @@ test('should proxy image', async t => {
});
const fetchSpy = Sinon.stub(global, 'fetch').resolves(fakeResponse);
await assertAndSnapshot(
'/api/worker/image-proxy?url=http://example.com/image.png',
'should return image buffer'
);
fetchSpy.restore();
try {
await assertAndSnapshot(
`/api/worker/image-proxy?url=${imageUrl}`,
'should return image buffer'
);
} finally {
fetchSpy.restore();
}
}
});
@@ -200,18 +208,19 @@ test('should preview link', async t => {
});
const fetchSpy = Sinon.stub(global, 'fetch').resolves(fakeHTML);
await assertAndSnapshot(
'/api/worker/link-preview',
'should process a valid external URL and return link preview data',
{
status: 200,
method: 'POST',
body: { url: 'http://external.com/page' },
}
);
fetchSpy.restore();
try {
await assertAndSnapshot(
'/api/worker/link-preview',
'should process a valid external URL and return link preview data',
{
status: 200,
method: 'POST',
body: { url: 'http://external.com/page' },
}
);
} finally {
fetchSpy.restore();
}
}
{
@@ -251,18 +260,19 @@ test('should preview link', async t => {
});
const fetchSpy = Sinon.stub(global, 'fetch').resolves(fakeHTML);
await assertAndSnapshot(
'/api/worker/link-preview',
'should decode HTML content with charset',
{
status: 200,
method: 'POST',
body: { url: `http://example.com/${charset}` },
}
);
fetchSpy.restore();
try {
await assertAndSnapshot(
'/api/worker/link-preview',
'should decode HTML content with charset',
{
status: 200,
method: 'POST',
body: { url: `http://example.com/${charset}` },
}
);
} finally {
fetchSpy.restore();
}
}
}
});


@@ -301,6 +301,11 @@ export const USER_FRIENDLY_ERRORS = {
},
// Input errors
image_format_not_supported: {
type: 'invalid_input',
args: { format: 'string' },
message: ({ format }) => `Image format not supported: ${format}`,
},
query_too_long: {
type: 'invalid_input',
args: { max: 'number' },


@@ -82,6 +82,16 @@ export class EmailServiceNotConfigured extends UserFriendlyError {
}
}
@ObjectType()
class ImageFormatNotSupportedDataType {
@Field() format!: string
}
export class ImageFormatNotSupported extends UserFriendlyError {
constructor(args: ImageFormatNotSupportedDataType, message?: string | ((args: ImageFormatNotSupportedDataType) => string)) {
super('invalid_input', 'image_format_not_supported', message, args);
}
}
@ObjectType()
class QueryTooLongDataType {
@Field() max!: number
}
@@ -1155,6 +1165,7 @@ export enum ErrorNames {
SSRF_BLOCKED_ERROR,
RESPONSE_TOO_LARGE_ERROR,
EMAIL_SERVICE_NOT_CONFIGURED,
IMAGE_FORMAT_NOT_SUPPORTED,
QUERY_TOO_LONG,
VALIDATION_ERROR,
USER_NOT_FOUND,
@@ -1297,5 +1308,5 @@ registerEnumType(ErrorNames, {
export const ErrorDataUnionType = createUnionType({
name: 'ErrorDataUnion',
types: () =>
[GraphqlBadRequestDataType, HttpRequestErrorDataType, SsrfBlockedErrorDataType, ResponseTooLargeErrorDataType, QueryTooLongDataType, ValidationErrorDataType, WrongSignInCredentialsDataType, UnknownOauthProviderDataType, InvalidOauthCallbackCodeDataType, MissingOauthQueryParameterDataType, InvalidOauthResponseDataType, InvalidEmailDataType, InvalidPasswordLengthDataType, WorkspacePermissionNotFoundDataType, SpaceNotFoundDataType, MemberNotFoundInSpaceDataType, NotInSpaceDataType, AlreadyInSpaceDataType, SpaceAccessDeniedDataType, SpaceOwnerNotFoundDataType, SpaceShouldHaveOnlyOneOwnerDataType, DocNotFoundDataType, DocActionDeniedDataType, DocUpdateBlockedDataType, VersionRejectedDataType, InvalidHistoryTimestampDataType, DocHistoryNotFoundDataType, BlobNotFoundDataType, ExpectToGrantDocUserRolesDataType, ExpectToRevokeDocUserRolesDataType, ExpectToUpdateDocUserRoleDataType, NoMoreSeatDataType, UnsupportedSubscriptionPlanDataType, SubscriptionAlreadyExistsDataType, SubscriptionNotExistsDataType, SameSubscriptionRecurringDataType, SubscriptionPlanNotFoundDataType, CalendarProviderRequestErrorDataType, NoCopilotProviderAvailableDataType, CopilotFailedToGenerateEmbeddingDataType, CopilotDocNotFoundDataType, CopilotMessageNotFoundDataType, CopilotPromptNotFoundDataType, CopilotProviderNotSupportedDataType, CopilotProviderSideErrorDataType, CopilotInvalidContextDataType, CopilotContextFileNotSupportedDataType, CopilotFailedToModifyContextDataType, CopilotFailedToMatchContextDataType, CopilotFailedToMatchGlobalContextDataType, CopilotFailedToAddWorkspaceFileEmbeddingDataType, RuntimeConfigNotFoundDataType, InvalidRuntimeConfigTypeDataType, InvalidLicenseToActivateDataType, InvalidLicenseUpdateParamsDataType, UnsupportedClientVersionDataType, MentionUserDocAccessDeniedDataType, InvalidAppConfigDataType, InvalidAppConfigInputDataType, InvalidSearchProviderRequestDataType, InvalidIndexerInputDataType] as const,
[GraphqlBadRequestDataType, HttpRequestErrorDataType, SsrfBlockedErrorDataType, ResponseTooLargeErrorDataType, ImageFormatNotSupportedDataType, QueryTooLongDataType, ValidationErrorDataType, WrongSignInCredentialsDataType, UnknownOauthProviderDataType, InvalidOauthCallbackCodeDataType, MissingOauthQueryParameterDataType, InvalidOauthResponseDataType, InvalidEmailDataType, InvalidPasswordLengthDataType, WorkspacePermissionNotFoundDataType, SpaceNotFoundDataType, MemberNotFoundInSpaceDataType, NotInSpaceDataType, AlreadyInSpaceDataType, SpaceAccessDeniedDataType, SpaceOwnerNotFoundDataType, SpaceShouldHaveOnlyOneOwnerDataType, DocNotFoundDataType, DocActionDeniedDataType, DocUpdateBlockedDataType, VersionRejectedDataType, InvalidHistoryTimestampDataType, DocHistoryNotFoundDataType, BlobNotFoundDataType, ExpectToGrantDocUserRolesDataType, ExpectToRevokeDocUserRolesDataType, ExpectToUpdateDocUserRoleDataType, NoMoreSeatDataType, UnsupportedSubscriptionPlanDataType, SubscriptionAlreadyExistsDataType, SubscriptionNotExistsDataType, SameSubscriptionRecurringDataType, SubscriptionPlanNotFoundDataType, CalendarProviderRequestErrorDataType, NoCopilotProviderAvailableDataType, CopilotFailedToGenerateEmbeddingDataType, CopilotDocNotFoundDataType, CopilotMessageNotFoundDataType, CopilotPromptNotFoundDataType, CopilotProviderNotSupportedDataType, CopilotProviderSideErrorDataType, CopilotInvalidContextDataType, CopilotContextFileNotSupportedDataType, CopilotFailedToModifyContextDataType, CopilotFailedToMatchContextDataType, CopilotFailedToMatchGlobalContextDataType, CopilotFailedToAddWorkspaceFileEmbeddingDataType, RuntimeConfigNotFoundDataType, InvalidRuntimeConfigTypeDataType, InvalidLicenseToActivateDataType, InvalidLicenseUpdateParamsDataType, UnsupportedClientVersionDataType, MentionUserDocAccessDeniedDataType, InvalidAppConfigDataType, InvalidAppConfigInputDataType, InvalidSearchProviderRequestDataType, InvalidIndexerInputDataType] as const,
});


@@ -88,12 +88,21 @@ export class EventBus
emit<T extends EventName>(event: T, payload: Events[T]) {
this.logger.debug(`Dispatch event: ${event}`);
// NOTE(@forehalo):
// Because all event handlers are wrapped in promisified metrics and cls context, they will always run in standalone tick.
// In which way, if handler throws, an unhandled rejection will be triggered and end up with process exiting.
// So we catch it here with `emitAsync`
this.emitter.emitAsync(event, payload).catch(e => {
this.emitter.emit('error', { event, payload, error: e });
this.dispatchAsync(event, payload);
return true;
}
/**
* Emit event in detached cls context to avoid inheriting current transaction.
*/
emitDetached<T extends EventName>(event: T, payload: Events[T]) {
this.logger.debug(`Dispatch event: ${event} (detached)`);
const requestId = this.cls.getId();
this.cls.run({ ifNested: 'override' }, () => {
this.cls.set(CLS_ID, requestId ?? genRequestId('event'));
this.dispatchAsync(event, payload);
});
return true;
@@ -166,6 +175,16 @@ export class EventBus
return this.emitter.waitFor(name, timeout);
}
private dispatchAsync<T extends EventName>(event: T, payload: Events[T]) {
// NOTE:
// Because all event handlers are wrapped in promisified metrics and cls context, they will always run in standalone tick.
// In which way, if handler throws, an unhandled rejection will be triggered and end up with process exiting.
// So we catch it here with `emitAsync`
this.emitter.emitAsync(event, payload).catch(e => {
this.emitter.emit('error', { event, payload, error: e });
});
}
private readonly bindEventHandlers = once(() => {
this.scanner.scan().forEach(({ event, handler, opts }) => {
this.on(event, handler, opts);


@@ -129,6 +129,8 @@ test('should return markdown content and skip page view when accept is text/mark
const markdown = Sinon.stub(docReader, 'getDocMarkdown').resolves({
title: 'markdown-doc',
markdown: '# markdown-doc',
knownUnsupportedBlocks: [],
unknownBlocks: [],
});
const docContent = Sinon.stub(docReader, 'getDocContent');
const record = Sinon.stub(


@@ -402,6 +402,8 @@ test('should get doc markdown in json format', async t => {
return {
title: 'test title',
markdown: 'test markdown',
knownUnsupportedBlocks: [],
unknownBlocks: [],
};
});
@@ -418,6 +420,8 @@ test('should get doc markdown in json format', async t => {
.expect({
title: 'test title',
markdown: 'test markdown',
knownUnsupportedBlocks: [],
unknownBlocks: [],
});
t.pass();
});

Some files were not shown because too many files have changed in this diff.