Compare commits

..

120 Commits

Author SHA1 Message Date
DarkSky
ae4c201e40 fix: use secure websocket (#5297) 2023-12-14 11:28:25 +08:00
DarkSky
6724e5a537 feat: only follow serverUrlPrefix at redirect to client (#5295) 2023-12-14 11:28:09 +08:00
李华桥
ad50320391 v0.10.3 2023-12-01 12:52:15 +08:00
李华桥
eb21a60dda v0.10.3-beta.7 2023-12-01 12:12:20 +08:00
Joooye_34
c0e3be2d40 fix(core): rerender error boundary when route change and improve sentry report (#5147) 2023-12-01 04:04:44 +00:00
李华桥
09d3b72358 v0.10.3-beta.6 2023-11-30 23:02:26 +08:00
Joooye_34
246e16c6c0 fix(infra): compatibility logic follow blocksuite (#5143) 2023-11-30 23:01:38 +08:00
李华桥
dc279d062b v0.10.3-beta.5 2023-11-30 16:49:55 +08:00
Joooye_34
47d5f9e1c2 fix(infra): use blocksuite api to check compatibility (#5137) 2023-11-30 08:48:13 +00:00
Joooye_34
a226eb8d5f fix(core): expose catched editor load error (#5133) 2023-11-29 20:31:35 +08:00
Joooye_34
908c4e1a6f ci: add sentry env when frontend assets build (#5131) 2023-11-29 10:03:49 +00:00
李华桥
1d0bcc80a0 v0.10.3-beta.4 2023-11-29 16:14:06 +08:00
Joooye_34
50010bd824 fix(core): implement editor timeout and report error from boundary (#5105) 2023-11-29 08:10:38 +00:00
liuyi
c0ede1326d fix(server): wrong OTEL config (#5084) 2023-11-29 11:19:13 +08:00
李华桥
89197bacef Revert "Merge remote-tracking branch 'origin/canary' into stable"
This reverts commit 992ed89a89, reversing
changes made to d272d7922d.
2023-11-29 11:18:45 +08:00
李华桥
f97d323ab5 Revert "Revert "refactor(server): standarderlize metrics and trace with OTEL (#5054)""
This reverts commit c1cd1713b9.
2023-11-29 11:07:28 +08:00
EYHN
2acb219dcc fix(workspace): filter awareness from other workspace (#5093) 2023-11-28 16:47:45 +08:00
LongYinan
992ed89a89 Merge remote-tracking branch 'origin/canary' into stable 2023-11-28 15:12:52 +08:00
liuyi
e73c39fe6b fix(server): wrong OTEL config (#5084) 2023-11-28 05:54:42 +00:00
Peng Xiao
3891f23dfa fix(component): rework tags list collapsing (#5072)
Before:

![CleanShot 2023-11-27 at 16.39.55@2x.png](https://graphite-user-uploaded-assets-prod.s3.amazonaws.com/T2klNLEk0wxLh4NRDzhk/2ac2b8e3-6c30-41f7-a9b2-7a9c81b250fa.png)

After:
![CleanShot 2023-11-27 at 16.38.50@2x.png](https://graphite-user-uploaded-assets-prod.s3.amazonaws.com/T2klNLEk0wxLh4NRDzhk/12eac806-e641-45be-9215-d166f8733db9.png)
2023-11-27 09:56:25 +00:00
Peng Xiao
8841dc3c4e fix(electron): electron dev startup on win (#5031) 2023-11-27 08:45:33 +00:00
EYHN
9cdfeba9b4 docs: issue triaging document (#5071)
I would like to sort out our process for handling GitHub issues. When we receive an issue, we should first triage it.

This PR contains the document about issue triaging.

reference:
[YouTrack issue states used in .NET tools team and their description](https://rider-support.jetbrains.com/hc/en-us/articles/360021572199-YouTrack-issue-states-used-in-NET-tools-team-and-their-description)
[vscode Issues Triaging](https://github.com/microsoft/vscode/wiki/Issues-Triaging)
2023-11-27 08:27:34 +00:00
LongYinan
30ec08cadf chore: bump the all-cargo-dependencies group with 5 updates (#5068)
Bumps the all-cargo-dependencies group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [napi](https://github.com/napi-rs/napi-rs) | `2.14.0` | `2.14.1` |
| [napi-derive](https://github.com/napi-rs/napi-rs) | `2.14.1` | `2.14.2` |
| [serde](https://github.com/serde-rs/serde) | `1.0.192` | `1.0.193` |
| [sqlx](https://github.com/launchbadge/sqlx) | `0.7.2` | `0.7.3` |
| [uuid](https://github.com/uuid-rs/uuid) | `1.6.0` | `1.6.1` |

Updates `napi` from 2.14.0 to 2.14.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/napi-rs/napi-rs/releases">napi's releases</a>.</em></p>
<blockquote>
<h2><code>@napi-rs/cli@2.14.1</code></h2>
<h2>What's Changed</h2>
<ul>
<li>[Fix] Quote toml path by <a href="https://github.com/TheBrenny"><code>@​TheBrenny</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1410">napi-rs/napi-rs#1410</a></li>
<li>chore(cli): update CI template by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1416">napi-rs/napi-rs#1416</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/TheBrenny"><code>@​TheBrenny</code></a> made their first contribution in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1410">napi-rs/napi-rs#1410</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/napi-rs/napi-rs/compare/napi@2.10.4...@napi-rs/cli@2.14.1">https://github.com/napi-rs/napi-rs/compare/napi@2.10.4...<code>@​napi-rs/cli</code><code>@​2.14.1</code></a></p>
<h2>napi-derive@2.14.1</h2>
<h2>What's Changed</h2>
<ul>
<li>fix(napi-derive): async task void output type by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1795">napi-rs/napi-rs#1795</a></li>
<li>fix(napi-derive): async task optional output type by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1796">napi-rs/napi-rs#1796</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/napi-rs/napi-rs/compare/napi-sys@2.3.0...napi-derive@2.14.1">https://github.com/napi-rs/napi-rs/compare/napi-sys@2.3.0...napi-derive@2.14.1</a></p>
<h2>napi@2.14.1</h2>
<h2>What's Changed</h2>
<ul>
<li>style(napi): clippy fix by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1815">napi-rs/napi-rs#1815</a></li>
<li>fix(napi): cargo doc build by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1819">napi-rs/napi-rs#1819</a></li>
<li>fix(napi): compile error for wasm32-unknown-unknown target by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1822">napi-rs/napi-rs#1822</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/napi-rs/napi-rs/compare/napi@2.14.0...napi@2.14.1">https://github.com/napi-rs/napi-rs/compare/napi@2.14.0...napi@2.14.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="6a4f4f173d"><code>6a4f4f1</code></a> chore(release): publish</li>
<li><a href="e4ac44e560"><code>e4ac44e</code></a> Release independent packages</li>
<li><a href="8a9c42a985"><code>8a9c42a</code></a> fix(napi): compile error for wasm32-unknown-unknown target</li>
<li><a href="7dced934a7"><code>7dced93</code></a> fix(napi): cargo doc build</li>
<li><a href="751312cec9"><code>751312c</code></a> test: add test file name into error message (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1821">#1821</a>)</li>
<li><a href="7c3f8b514e"><code>7c3f8b5</code></a> fix(napi-derive): compile warning (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1820">#1820</a>)</li>
<li><a href="8c911b5d34"><code>8c911b5</code></a> chore: upgrade emnapi dependencies (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1817">#1817</a>)</li>
<li><a href="76dcf833da"><code>76dcf83</code></a> chore(deps): update dependency emnapi to v0.44.0 (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1805">#1805</a>)</li>
<li><a href="6df0ca112e"><code>6df0ca1</code></a> chore: 🤖 align wasi template to nodejs demo (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1814">#1814</a>)</li>
<li><a href="c321071c89"><code>c321071</code></a> chore(deps): update dependency <code>@​emnapi/runtime</code> to v0.44.0 (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1804">#1804</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/napi-rs/napi-rs/compare/napi@2.14.0...napi@2.14.1">compare view</a></li>
</ul>
</details>
<br />

Updates `napi-derive` from 2.14.1 to 2.14.2
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/napi-rs/napi-rs/releases">napi-derive's releases</a>.</em></p>
<blockquote>
<h2>napi-derive@2.14.2</h2>
<h2>What's Changed</h2>
<ul>
<li>fix(napi-derive): compile warning by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1820">napi-rs/napi-rs#1820</a></li>
<li>fix(napi): compile error for wasm32-unknown-unknown target by <a href="https://github.com/Brooooooklyn"><code>@​Brooooooklyn</code></a> in <a href="https://redirect.github.com/napi-rs/napi-rs/pull/1822">napi-rs/napi-rs#1822</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/napi-rs/napi-rs/compare/napi-derive@2.14.1...napi-derive@2.14.2">https://github.com/napi-rs/napi-rs/compare/napi-derive@2.14.1...napi-derive@2.14.2</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="6a4f4f173d"><code>6a4f4f1</code></a> chore(release): publish</li>
<li><a href="e4ac44e560"><code>e4ac44e</code></a> Release independent packages</li>
<li><a href="8a9c42a985"><code>8a9c42a</code></a> fix(napi): compile error for wasm32-unknown-unknown target</li>
<li><a href="7dced934a7"><code>7dced93</code></a> fix(napi): cargo doc build</li>
<li><a href="751312cec9"><code>751312c</code></a> test: add test file name into error message (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1821">#1821</a>)</li>
<li><a href="7c3f8b514e"><code>7c3f8b5</code></a> fix(napi-derive): compile warning (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1820">#1820</a>)</li>
<li><a href="8c911b5d34"><code>8c911b5</code></a> chore: upgrade emnapi dependencies (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1817">#1817</a>)</li>
<li><a href="76dcf833da"><code>76dcf83</code></a> chore(deps): update dependency emnapi to v0.44.0 (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1805">#1805</a>)</li>
<li><a href="6df0ca112e"><code>6df0ca1</code></a> chore: 🤖 align wasi template to nodejs demo (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1814">#1814</a>)</li>
<li><a href="c321071c89"><code>c321071</code></a> chore(deps): update dependency <code>@​emnapi/runtime</code> to v0.44.0 (<a href="https://redirect.github.com/napi-rs/napi-rs/issues/1804">#1804</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/napi-rs/napi-rs/compare/napi-derive@2.14.1...napi-derive@2.14.2">compare view</a></li>
</ul>
</details>
<br />

Updates `serde` from 1.0.192 to 1.0.193
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/serde-rs/serde/releases">serde's releases</a>.</em></p>
<blockquote>
<h2>v1.0.193</h2>
<ul>
<li>Fix field names used for the deserialization of <code>RangeFrom</code> and <code>RangeTo</code> (<a href="https://redirect.github.com/serde-rs/serde/issues/2653">#2653</a>, <a href="https://redirect.github.com/serde-rs/serde/issues/2654">#2654</a>, <a href="https://redirect.github.com/serde-rs/serde/issues/2655">#2655</a>, thanks <a href="https://github.com/emilbonnek"><code>@​emilbonnek</code></a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="44613c7d01"><code>44613c7</code></a> Release 1.0.193</li>
<li><a href="c706281df3"><code>c706281</code></a> Merge pull request <a href="https://redirect.github.com/serde-rs/serde/issues/2655">#2655</a> from dtolnay/rangestartend</li>
<li><a href="65d75b8fe3"><code>65d75b8</code></a> Add RangeFrom and RangeTo tests</li>
<li><a href="332b0cba40"><code>332b0cb</code></a> Merge pull request <a href="https://redirect.github.com/serde-rs/serde/issues/2654">#2654</a> from dtolnay/rangestartend</li>
<li><a href="8c4af41296"><code>8c4af41</code></a> Fix more RangeFrom / RangeEnd mixups</li>
<li><a href="24a78f071b"><code>24a78f0</code></a> Merge pull request <a href="https://redirect.github.com/serde-rs/serde/issues/2653">#2653</a> from emilbonnek/fix/range-to-from-de-mixup</li>
<li><a href="c91c33436d"><code>c91c334</code></a> Fix Range{From,To} deserialize mixup</li>
<li><a href="2083f43a28"><code>2083f43</code></a> Update ui test suite to nightly-2023-11-19</li>
<li>See full diff in <a href="https://github.com/serde-rs/serde/compare/v1.0.192...v1.0.193">compare view</a></li>
</ul>
</details>
<br />

Updates `sqlx` from 0.7.2 to 0.7.3
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/launchbadge/sqlx/blob/main/CHANGELOG.md">sqlx's changelog</a>.</em></p>
<blockquote>
<h2>0.7.3 - 2023-11-22</h2>
<p>38 pull requests were merged this release cycle.</p>
<h3>Added</h3>
<ul>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2478">#2478</a>]: feat(citext): support postgres citext [[<a href="https://github.com/hgranthorner"><code>@​hgranthorner</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2545">#2545</a>]: Add <code>fixtures_path</code> in sqlx::test args [[<a href="https://github.com/ripa1995"><code>@​ripa1995</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2665">#2665</a>]: feat(mysql): support packet splitting [[<a href="https://github.com/tk2217"><code>@​tk2217</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2752">#2752</a>]: Enhancement <a href="https://redirect.github.com/launchbadge/sqlx/issues/2747">#2747</a> Provide <code>fn PgConnectOptions::get_host(&amp;self)</code> [[<a href="https://github.com/boris-lok"><code>@​boris-lok</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2769">#2769</a>]: Customize the macro error message based on the metadata [[<a href="https://github.com/Nemo157"><code>@​Nemo157</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2793">#2793</a>]: derived Hash trait for PgInterval [[<a href="https://github.com/yasamoka"><code>@​yasamoka</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2801">#2801</a>]: derive FromRow: sqlx(default) for all fields [[<a href="https://github.com/grgi"><code>@​grgi</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2827">#2827</a>]: Add impl <code>FromRow</code> for the unit type [[<a href="https://github.com/nanoqsh"><code>@​nanoqsh</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2871">#2871</a>]: Add <code>MySqlConnectOptions::get_database()</code>  [[<a href="https://github.com/shiftrightonce"><code>@​shiftrightonce</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2873">#2873</a>]: Sqlx Cli: Added force flag to drop database for postgres [[<a href="https://github.com/Vrajs16"><code>@​Vrajs16</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2894">#2894</a>]: feat: <code>Text</code> adapter [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
</ul>
<h3>Changed</h3>
<ul>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2701">#2701</a>]: Remove documentation on offline feature [[<a href="https://github.com/Baptistemontan"><code>@​Baptistemontan</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2713">#2713</a>]: Add additional info regarding using Transaction and PoolConnection as… [[<a href="https://github.com/satwanjyu"><code>@​satwanjyu</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2770">#2770</a>]: Update README.md [[<a href="https://github.com/snspinn"><code>@​snspinn</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2797">#2797</a>]: doc(mysql): document behavior regarding <code>BOOLEAN</code> and the query macros [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2803">#2803</a>]: Don't use separate temp dir for query jsons (2)  [[<a href="https://github.com/mattfbacon"><code>@​mattfbacon</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2819">#2819</a>]: postgres begin cancel safe [[<a href="https://github.com/conradludgate"><code>@​conradludgate</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2832">#2832</a>]: Update extra_float_digits default to 2 instead of 3 [[<a href="https://github.com/brianheineman"><code>@​brianheineman</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2865">#2865</a>]: Update Faq - Bulk upsert with optional fields  [[<a href="https://github.com/Vrajs16"><code>@​Vrajs16</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2880">#2880</a>]: feat: use specific message for slow query logs [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2882">#2882</a>]: Do not require db url for prepare [[<a href="https://github.com/tamasfe"><code>@​tamasfe</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2890">#2890</a>]: doc(sqlite): cover lack of <code>NUMERIC</code> support [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
<li>[No PR]: Upgraded <code>libsqlite3-sys</code> to 0.27.0
<ul>
<li>Note: linkage to <code>libsqlite3-sys</code> is considered semver-exempt;
see the release notes for 0.7.0 below for details.</li>
</ul>
</li>
</ul>
<h3>Fixed</h3>
<ul>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2640">#2640</a>]: fix: sqlx::macro db cleanup race condition by adding a margin to current timestamp [[<a href="https://github.com/fhsgoncalves"><code>@​fhsgoncalves</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2655">#2655</a>]: [fix] Urlencode when passing filenames to sqlite3 [[<a href="https://github.com/uttarayan21"><code>@​uttarayan21</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2684">#2684</a>]: Make PgListener recover from UnexpectedEof [[<a href="https://github.com/hamiltop"><code>@​hamiltop</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2688">#2688</a>]: fix: Make rust_decimal and bigdecimal decoding more lenient [[<a href="https://github.com/cameronbraid"><code>@​cameronbraid</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2754">#2754</a>]: Is tests/x.py maintained? And I tried fix it. [[<a href="https://github.com/qwerty2501"><code>@​qwerty2501</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2784">#2784</a>]: fix: decode postgres time without subsecond [[<a href="https://github.com/granddaifuku"><code>@​granddaifuku</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2806">#2806</a>]: Depend on version of async-std with non-private spawn-blocking [[<a href="https://github.com/A248"><code>@​A248</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2820">#2820</a>]: fix: correct decoding of <code>rust_decimal::Decimal</code> for high-precision values [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2822">#2822</a>]: issue <a href="https://redirect.github.com/launchbadge/sqlx/issues/2821">#2821</a> Update error handling logic when opening a TCP connection [[<a href="https://github.com/anupj"><code>@​anupj</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2826">#2826</a>]: chore: bump some sqlx-core dependencies [[<a href="https://github.com/djc"><code>@​djc</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2838">#2838</a>]: Fixes rust_decimal scale for Postgres [[<a href="https://github.com/jkleinknox"><code>@​jkleinknox</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2847">#2847</a>]: Fix comment in <code>sqlx migrate add</code> help text [[<a href="https://github.com/cryeprecision"><code>@​cryeprecision</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2850">#2850</a>]: fix(core): avoid unncessary wakeups in <code>try_stream!()</code> [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2856">#2856</a>]: Prevent warnings running <code>cargo build</code> [[<a href="https://github.com/nyurik"><code>@​nyurik</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2864">#2864</a>]: fix(sqlite): use <code>AtomicUsize</code> for thread IDs [[<a href="https://github.com/abonander"><code>@​abonander</code></a>]]</li>
<li>[<a href="https://redirect.github.com/launchbadge/sqlx/issues/2892">#2892</a>]: Fixed force dropping bug [[<a href="https://github.com/Vrajs16"><code>@​Vrajs16</code></a>]]</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a href="https://github.com/launchbadge/sqlx/commits/v0.7.3">compare view</a></li>
</ul>
</details>
<br />

Updates `uuid` from 1.6.0 to 1.6.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/uuid-rs/uuid/releases">uuid's releases</a>.</em></p>
<blockquote>
<h2>1.6.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Fix uuid macro in consts by <a href="https://github.com/KodrAus"><code>@​KodrAus</code></a> in <a href="https://redirect.github.com/uuid-rs/uuid/pull/721">uuid-rs/uuid#721</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/uuid-rs/uuid/compare/1.6.0...1.6.1">https://github.com/uuid-rs/uuid/compare/1.6.0...1.6.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="c889107324"><code>c889107</code></a> Merge pull request <a href="https://redirect.github.com/uuid-rs/uuid/issues/721">#721</a> from uuid-rs/fix/uuid-macro</li>
<li><a href="f3f74961c4"><code>f3f7496</code></a> fix uuid macro in consts</li>
<li>See full diff in <a href="https://github.com/uuid-rs/uuid/compare/1.6.0...1.6.1">compare view</a></li>
</ul>
</details>
<br />

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions

</details>
2023-11-27 07:24:08 +00:00
liuyi
8cc9a0b21b feat(server): add soft deleted flag to optimized blob table (#5058)
requires https://github.com/toeverything/OctoBase/pull/561
2023-11-27 07:06:31 +00:00
Peng Xiao
2deceb6e85 test(core): simple recovery ui e2e (#5059) 2023-11-27 06:39:41 +00:00
Peng Xiao
71d6b730f7 chore: bump blocksuite (#5051)
https://github.com/toeverything/blocksuite/pull/5337
2023-11-27 04:46:23 +00:00
Peng Xiao
34d575078c feat(core): simple recovery history ui poc (#5033)
Simple recovery history UI PoC.
What's missing:
- [x] e2e

All business logic should be done, excluding the final UI details.
- [ ] offline prompt
- [ ] history timeline
- [ ] page ui

https://github.com/toeverything/AFFiNE/assets/584378/fc3f6a48-ff7f-4265-b9f5-9c0087cb2635
2023-11-27 02:41:19 +00:00
李华桥
d272d7922d v0.10.3-beta.2 2023-11-25 23:50:40 +08:00
李华桥
c1cd1713b9 Revert "refactor(server): standarderlize metrics and trace with OTEL (#5054)"
This reverts commit 91efca107a.
2023-11-25 23:50:39 +08:00
DarkSky
f04ec50d12 feat: optional payment for frontend (#5056) 2023-11-25 15:15:44 +00:00
DarkSky
13e712158c feat: optional payment for server (#5055) 2023-11-25 14:59:47 +00:00
李华桥
b20e91bee0 v0.10.3-beta.1 2023-11-25 14:14:40 +08:00
李华桥
9a4e5ec8c3 Merge branch 'canary' into stable 2023-11-25 14:14:14 +08:00
liuyi
9dc2d55a5a fix(server): add guid compatibility of :space:page variant (#5062) 2023-11-24 15:46:09 +00:00
liuyi
91efca107a refactor(server): standarderlize metrics and trace with OTEL (#5054)
You can now export spans to Zipkin and metrics to Prometheus when developing locally.
Follow the OTEL docs: https://opentelemetry.io/docs/instrumentation/js/exporters/

<img width="2357" alt="image" src="https://github.com/toeverything/AFFiNE/assets/8281226/ec615e1f-3e91-43f7-9111-d7d2629e9679">
2023-11-24 15:19:22 +00:00
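For local development, a minimal setup following the linked OTEL JS docs might look like the sketch below. The packages are the standard `@opentelemetry` ones and the service name is an assumption; this is not necessarily the server's actual wiring.

```ts
// Hedged sketch: export spans to Zipkin and metrics to Prometheus locally.
import { NodeSDK } from '@opentelemetry/sdk-node';
import { ZipkinExporter } from '@opentelemetry/exporter-zipkin';
import { PrometheusExporter } from '@opentelemetry/exporter-prometheus';

const sdk = new NodeSDK({
  serviceName: 'affine-server-dev', // assumed name, for illustration only
  // Spans go to Zipkin's default collector (http://localhost:9411/api/v2/spans).
  traceExporter: new ZipkinExporter(),
  // Metrics are exposed for Prometheus to scrape at http://localhost:9464/metrics.
  metricReader: new PrometheusExporter({ port: 9464 }),
});

sdk.start();
```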
liuyi
cf65a5cd93 fix(server): never throw in websocket gateways (#5050) 2023-11-24 07:26:40 +00:00
LongYinan
42f4045ad6 chore: adjust the request memory size and replica count (#5046) 2023-11-24 06:19:38 +00:00
李华桥
2019838ae7 v0.10.3-beta.0 2023-11-24 11:39:23 +08:00
李华桥
30ff25f400 Merge branch 'canary' into stable 2023-11-23 23:40:32 +08:00
Joooye_34
317ca7f4e7 ci: fix storybook publish problem (#5047) 2023-11-23 23:38:09 +08:00
李华桥
e766208c18 chore: reset merge wrong codes 2023-11-23 22:53:06 +08:00
李华桥
8742f28148 Merge branch 'canary' into stable 2023-11-23 21:31:42 +08:00
JimmFly
4168551783 chore: bump icons version (#5042) 2023-11-23 12:00:51 +00:00
LongYinan
55c6477bcc fix(electron): appimage forge builder (#5043) 2023-11-23 11:46:50 +00:00
Peng Xiao
ae8329c590 chore(core): update react-resizable-panels (#5041)
`react-resizable-panels` sometimes throws errors when showing the history modal dialog.
I haven't checked the root cause, but upgrading it to the latest version gets rid of the error.
2023-11-23 09:20:12 +00:00
LongYinan
25eda22af6 v0.10.3-canary.2 2023-11-23 16:47:40 +08:00
EYHN
23e0137ed8 refactor(workspace): blob sync (#5037)
This PR implements a blob engine.
It exposes a single `BlobStorage` to `blocksuite`, and inside it we sync blobs between multiple storages.

The implementation still has a few issues, but we can merge this PR first and fix them later.

* BlobEngine currently **does nothing on delete**, because the synchronization logic conflicts with the deletion logic.
* BlobEngine syncs between storages by querying the blob list at regular intervals. This will **cause many queries**; we can avoid it in the future by subscribing to remote changes.
2023-11-23 07:56:19 +00:00
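A minimal sketch of the interval-based sync described above, assuming a simplified `BlobStorage` shape (the real interface lives in the workspace package and may differ):

```ts
interface BlobStorage {
  list(): Promise<string[]>;
  get(key: string): Promise<Uint8Array | null>;
  set(key: string, value: Uint8Array): Promise<void>;
}

// One sync pass: every storage ends up with the union of all known blobs.
// Listing every storage on each pass is the "many queries" cost noted above.
async function syncOnce(storages: BlobStorage[]): Promise<void> {
  const lists = await Promise.all(storages.map(s => s.list()));
  const allKeys = new Set(lists.flat());
  for (const [i, storage] of storages.entries()) {
    const known = new Set(lists[i]);
    for (const key of allKeys) {
      if (known.has(key)) continue;
      // Pull the missing blob from whichever peer has it.
      for (const peer of storages) {
        if (peer === storage) continue;
        const blob = await peer.get(key);
        if (blob) {
          await storage.set(key, blob);
          break;
        }
      }
    }
  }
}

// Poll at a regular interval; subscribing to remote changes would avoid this.
export function startBlobSync(storages: BlobStorage[], intervalMs = 30_000) {
  return setInterval(() => void syncOnce(storages), intervalMs);
}
```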
liuyi
1740e7efa1 fix(server): check state changes before saving history record (#5038) 2023-11-23 07:39:02 +00:00
Peng Xiao
7463e87742 fix(electron): clone db file when enable cloud for desktop (#5028)
At the moment, a desktop user's local blob data will be lost after enabling cloud.
This is because blob data is only synced from the old idb to the new idb, not into the sqlite db.

This PR simply clones the db file for the desktop app. It should also speed up enabling cloud for a large local workspace.
2023-11-23 07:23:16 +00:00
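The idea reduces to a single file copy. A sketch under assumed names (`getWorkspaceDBPath` is hypothetical, not the repo's API):

```ts
import { copyFile } from 'node:fs/promises';

// Hypothetical helper resolving a workspace id to its SQLite file path.
declare function getWorkspaceDBPath(workspaceId: string): string;

export async function cloneDBFileOnEnableCloud(
  localWorkspaceId: string,
  cloudWorkspaceId: string
): Promise<void> {
  // One file copy carries doc updates and blobs alike, which is also why this
  // is faster than replaying data for a large local workspace.
  await copyFile(
    getWorkspaceDBPath(localWorkspaceId),
    getWorkspaceDBPath(cloudWorkspaceId)
  );
}
```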
李华桥
9ded6afb4b chore: v0.10.3-canary.1 2023-11-23 14:39:55 +08:00
JimmFly
ad2d3b9167 feat(core): add download app button to web (#5023) 2023-11-23 14:33:25 +08:00
LongYinan
3499dbbb7f feat: upgrade dependencies and lockfile (#5016)
- Close https://github.com/toeverything/AFFiNE/security/dependabot/47
2023-11-23 05:18:05 +00:00
Joooye_34
4c8d54b3a7 refactor(core): use manual upgrade to replace auto migration when web setup (#5022)
1. Split the logic in `packages/common/infra/src/blocksuite/index.ts` into multiple single files
2. Move the migration logic from the setup to the upgrade module, to prevent auto-migration and loading problems
2023-11-23 02:26:06 +00:00
liuyi
3710bcdc14 fix(server): use iso date string as history query input (#5035) 2023-11-23 01:59:08 +00:00
Peng Xiao
ca07b143ef fix(core): should not reset page preset on rerender (#5034)
The editor preset should not be reset on re-render.

See ce7ac88fc7/packages/editor/src/components/editor-container.ts (L197). If these props change, they trigger some unexpected side effects.
2023-11-22 18:29:34 +00:00
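The gist is keeping the prop identity stable across renders. A hedged sketch with hypothetical component and prop names:

```tsx
import { useMemo } from 'react';

// Hypothetical editor component, for illustration only.
declare function BlockSuiteEditor(props: { pagePreset: string[] }): JSX.Element;

function EditorWrapper({ pageId }: { pageId: string }) {
  // Memoize per page id so re-renders pass the same array instance and the
  // editor container's prop-change side effects don't fire.
  const pagePreset = useMemo(() => ['page', 'edgeless'], [pageId]);
  return <BlockSuiteEditor pagePreset={pagePreset} />;
}
```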
EYHN
e8616acfe4 fix(workspace): fast check svg buffer (#5032) 2023-11-22 14:53:33 +00:00
EYHN
06203498da fix(core): fix page loading shimmer (#5027) 2023-11-22 12:55:10 +00:00
Flrande
d7d47853fe chore: bump blocksuite (#5030) 2023-11-22 20:51:35 +08:00
Flrande
a3d880daa3 chore: bump blocksuite (#5026) 2023-11-22 20:12:52 +08:00
liuyi
d1476495ae feat(server): impl doc history (#5004) 2023-11-22 07:56:59 +00:00
liuyi
946b7b4004 feat(server): event on snapshot upserted (#5002) 2023-11-22 07:23:44 +00:00
liuyi
525b196cae feat(server): reduce duplidated merge with cache (#4975) 2023-11-22 04:09:07 +00:00
liuyi
c69e542b98 feat(server): add cache module (#4973) 2023-11-22 04:09:00 +00:00
liuyi
85bee72e6b chore(server): remove deprecated redis manager (#4971) 2023-11-22 03:51:18 +00:00
liuyi
b7d6237c20 feat(server): add doc history support (#4970) 2023-11-22 03:31:22 +00:00
LongYinan
5f1a124b53 fix(core): add error boundary for workspace layout (#5014)
https://github.com/toeverything/AFFiNE/assets/3468483/d478bf4f-2be3-4d7d-8d94-aa95c1f74c8e
2023-11-22 09:58:33 +08:00
Peng Xiao
3839a9bd15 build(electron): asar (#4965)
Due to restrictions on how Electron packaging works, `node_modules` should not be hoisted and symlinks/hardlinks must not be used at all. This is why we need two separate installs for electron and non-electron packages in the build.

Tested via the following script

```bash
#!/bin/bash

echo "step 1: clean up"
find . -name "node_modules" -prune -exec rm -rf '{}' +
# git clean -dfX
build_type=canary

echo "step 2: install web dependencies"
# firstly, build web static
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD=1 SENTRYCLI_SKIP_DOWNLOAD=1 HUSKY=1 yarn

echo "step 3: generate assets"
BUILD_TYPE="$build_type" yarn workspace @affine/electron generate-assets

# cleanup node_modules
find . -name "node_modules" -prune -exec rm -rf '{}' +

echo "step 4: install electron dependencies"
# install electron deps
yarn config set nmHoistingLimits workspaces
yarn config set enableScripts false
yarn config set nmMode classic
PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD=1 HUSKY=0 yarn workspaces focus @affine/electron @affine/monorepo

echo "step 5: build native"
# build native
yarn workspace @affine/native build
yarn workspace @affine/storage build

echo "step 6: build electron"
# build electron
yarn workspace @affine/electron build

echo "step 7: package electron"
# package
SKIP_GENERATE_ASSETS=1 BUILD_TYPE="$build_type" HOIST_NODE_MODULES=1 yarn workspace @affine/electron package
```
2023-11-21 17:44:30 +00:00
Peng Xiao
f33c49b27e fix(core): hmr issue on dev (#5006)
I suspect HMR is not working properly in dev because we have multiple entries.
One related issue: https://github.com/webpack/webpack-dev-server/issues/2792/

I think we do not need multiple entries for polyfills & plugins after all. They could live in the same chunk and be optimized later through the splitChunks option.

`ses.ts` is renamed to `ses-lockdown.ts` because `ses.ts` does not pass madge's circular dependency check. I haven't looked into the real root cause though. See https://github.com/pahen/madge/issues/355
2023-11-21 17:27:16 +00:00
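A sketch of the single-entry shape this implies (file names are assumptions, not the repo's actual config):

```ts
import type { Configuration } from 'webpack';

const config: Configuration = {
  entry: {
    // One entry that imports polyfills and plugins first, instead of
    // separate entries per concern.
    app: './src/index.tsx',
  },
  optimization: {
    // Shared code can still be carved out of the single entry later.
    splitChunks: { chunks: 'all' },
  },
};

export default config;
```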
Peng Xiao
615255706d fix: invisible button should not be interactive (#5017) 2023-11-22 01:05:38 +08:00
EYHN
5e8103adbd chore: faster lint-staged (#5013)
Co-authored-by: EYHN <13579374+EYHN@users.noreply.github.com>
2023-11-21 22:24:24 +08:00
JimmFly
f06bdd9a39 fix(core): cmdk crash when entering double quotes (#5008)
Due to a bug in the upstream repository, a temporary fix was implemented until the upstream issue is resolved.
https://github.com/pacocoursey/cmdk/issues/189
2023-11-21 12:51:22 +00:00
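The workaround's shape is to sanitize values before they reach cmdk, since cmdk interpolates item values into a selector (see the linked issue). The helper below is a hypothetical illustration, not the committed code:

```ts
// Replace double quotes so the attribute selector cmdk builds stays valid.
export function toCmdkValue(raw: string): string {
  return raw.replace(/"/g, "'");
}
```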
李华桥
00c11d40cf v0.10.3-canary.0 2023-11-21 10:02:46 +08:00
李华桥
0f6b28fd06 c0.11.0-canary.0 2023-11-20 23:53:50 +08:00
EYHN
90c130cf15 fix(core): merge updates before push to storage (#4986) 2023-11-20 23:26:19 +08:00
EYHN
9370110cdc feat(workspace): more status for SyncEngine (#4984) 2023-11-20 22:51:20 +08:00
EYHN
c9f1fd9649 feat(workspace): more status for SyncPeer (#4983) 2023-11-20 20:37:12 +08:00
EYHN
70e71bd43e fix(core): make e2e more stable (#4987) 2023-11-20 20:17:30 +08:00
EYHN
899e46b1fa fix(core): rerender (#4988) 2023-11-20 17:32:40 +08:00
dependabot[bot]
c127d449a1 chore: bump the all-cargo-dependencies group with 1 update (#4997)
Bumps the all-cargo-dependencies group with 1 update: [uuid](https://github.com/uuid-rs/uuid).

<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/uuid-rs/uuid/releases">uuid's releases</a>.</em></p>
<blockquote>
<h2>1.6.0</h2>
<h2>What's Changed</h2>
<ul>
<li>doc: fix links in v6 module by <a href="https://github.com/metalalive"><code>@​metalalive</code></a> in <a href="https://redirect.github.com/uuid-rs/uuid/pull/714">uuid-rs/uuid#714</a></li>
<li>Stabilize UUIDv6-v8 support by <a href="https://github.com/KodrAus"><code>@​KodrAus</code></a> in <a href="https://redirect.github.com/uuid-rs/uuid/pull/718">uuid-rs/uuid#718</a></li>
<li>Prepare for 1.6.0 release by <a href="https://github.com/KodrAus"><code>@​KodrAus</code></a> in <a href="https://redirect.github.com/uuid-rs/uuid/pull/719">uuid-rs/uuid#719</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/metalalive"><code>@​metalalive</code></a> made their first contribution in <a href="https://redirect.github.com/uuid-rs/uuid/pull/714">uuid-rs/uuid#714</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/uuid-rs/uuid/compare/1.5.0...1.6.0">https://github.com/uuid-rs/uuid/compare/1.5.0...1.6.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="4609e61794"><code>4609e61</code></a> Merge pull request <a href="https://redirect.github.com/uuid-rs/uuid/issues/719">#719</a> from uuid-rs/cargo/1.6.0</li>
<li><a href="24330666ec"><code>2433066</code></a> prepare for 1.6.0 release</li>
<li><a href="9787ea1d0b"><code>9787ea1</code></a> Merge pull request <a href="https://redirect.github.com/uuid-rs/uuid/issues/718">#718</a> from uuid-rs/feat/stabilize-v6-plus</li>
<li><a href="90b0bc0a1c"><code>90b0bc0</code></a> Merge pull request <a href="https://redirect.github.com/uuid-rs/uuid/issues/714">#714</a> from metalalive/doc/fix-v6-links</li>
<li><a href="1eebe7d299"><code>1eebe7d</code></a> bump msrv to 1.60.0</li>
<li><a href="6bade3ae59"><code>6bade3a</code></a> just test lib with miri</li>
<li><a href="3df0aaa80d"><code>3df0aaa</code></a> stabilize UUIDv6-v8 support</li>
<li><a href="003dc57994"><code>003dc57</code></a> doc: fix links to timestamp module</li>
<li>See full diff in <a href="https://github.com/uuid-rs/uuid/compare/1.5.0...1.6.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=uuid&package-manager=cargo&previous-version=1.5.0&new-version=1.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions


</details>
2023-11-20 14:43:11 +08:00
Peng Xiao
add20ec2f8 fix(core): blob key issue for cloud blob provider (#4907)
There are some resources that exist only under `/static`. The current prefix check is incorrect, since the path could start with `/static`.
2023-11-20 14:06:24 +08:00
LongYinan
34c5e7d83d build: remove useless source-map-loader to speedup webpack (#4910) 2023-11-20 11:04:57 +08:00
LongYinan
7f09652cca fix(core): handle the getSession network error properly (#4909)
If the network is offline or an API error happens, the `session` returned by the `useSession` hook will be null, so we can't assume it is non-null.

The following changes are still needed:
1. Create a page in the ErrorBoundary to let the user refetch the session.
2. The `SessionProvider` stops pulling a new session once the session is null; we need to figure out a way to pull a new session when the network is back or the user clicks the refetch button.
2023-11-20 11:04:39 +08:00
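A sketch of treating the session as nullable, using next-auth's `useSession` (which the description references); the retry UI is an assumption:

```tsx
import { useSession } from 'next-auth/react';

function Account() {
  const { data: session, status } = useSession();
  if (status === 'loading') return <p>Loading…</p>;
  if (!session) {
    // Network offline or API error: never assume a session exists.
    return <button onClick={() => location.reload()}>Refetch session</button>;
  }
  return <p>Signed in as {session.user?.email}</p>;
}
```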
LongYinan
cd291bb60e build: remove useless source-map-loader to speedup webpack (#4910) 2023-11-20 10:52:28 +08:00
Cats Juice
57d42bf491 refactor(core): remove all MUI related components and utilities (#4941) 2023-11-20 10:51:28 +08:00
JimmFly
4ef1f4c046 fix(core): escape cmdk value (#4947)
Co-authored-by: LongYinan <lynweklm@gmail.com>
2023-11-20 10:49:32 +08:00
JimmFly
9bab1b5dff feat(core): keep the latest toast showing when multiple call (#4961) 2023-11-20 10:47:09 +08:00
JimmFly
f09c717413 fix(core): adjust cmdk list scroll padding block (#4972) 2023-11-20 10:39:45 +08:00
Cats Juice
134428f38d style(core): update pro plan card style (#4960) 2023-11-18 00:36:10 +08:00
Cats Juice
ce7a691eef fix(component): stack notification cards expand animation (#4962) 2023-11-18 00:32:06 +08:00
EYHN
5fea0102fb chore: add devcontainer config (#4974)
Co-authored-by: Reese <3253971+figadore@users.noreply.github.com>
2023-11-17 17:54:19 +08:00
JimmFly
ce2eeeffbe feat(i18n): update translation (#4923) 2023-11-17 17:39:33 +08:00
LongYinan
62c0efcfd1 fix(core): handle the getSession network error properly (#4909)
If the network is offline or an API error happens, the `session` returned by the `useSession` hook will be null, so we can't assume it is non-null.

The following changes are still needed:
1. Create a page in the ErrorBoundary to let the user refetch the session.
2. The `SessionProvider` stops pulling a new session once the session is null; we need to figure out a way to pull a new session when the network is back or the user clicks the refetch button.
2023-11-17 16:50:48 +08:00
EYHN
aa4c7407de refactor: new provider (#4900) 2023-11-17 15:50:01 +08:00
liuyi
9baad36e41 fix(server): all viewers can share public link (#4968) 2023-11-17 13:48:09 +08:00
liuyi
87248b3337 fix(server): all viewers can share public link (#4968) 2023-11-17 12:34:15 +08:00
Flrande
8b2c3d4c41 chore: bump blocksuite (#4958) 2023-11-16 22:01:03 +08:00
LongYinan
703fad6a0d ci: prevent error if rust build is cached by nx (#4951)
If the Rust build was cached by nx, only the output file will be present. The chmod command will fail in this case, like: https://github.com/toeverything/AFFiNE/actions/runs/6874496337/job/18697360212
2023-11-16 21:57:22 +08:00
Peng Xiao
791eb75ca8 fix(infra): page id compat fix for page ids in workspace.meta (#4950)
Since we strip `page:` from the keys of the workspace doc's `spaces`, we should strip the prefix in `meta.pages` as well.
2023-11-16 21:57:17 +08:00
JimmFly
ddd7cab414 feat(core): support share edgeless mode (#4856)
Close #3287

<!--
copilot:all
-->
### <samp>🤖 Generated by Copilot at d3fdf86</samp>

### Summary
📄🚀🔗

<!--
1.  📄 - This emoji represents the page and edgeless modes of sharing a page, as well as the GraphQL operations and types related to public pages.
2.  🚀 - This emoji represents the functionality of publishing and revoking public pages, as well as the confirmation modal and the notifications for the user.
3.  🔗 - This emoji represents the sharing URL and the query parameter for the share mode, as well as the hooks and functions that generate and use the URL.
-->
This pull request adds a feature to the frontend component of AFFiNE that allows the user to share a page in either `page` or `edgeless` mode, which affects the appearance and functionality of the page. It also adds the necessary GraphQL operations, types, and schema to support this feature in the backend, and updates the tests and the storybook stories accordingly.

*  Modify the `useIsSharedPage` hook to accept an optional `shareMode` argument and use the `getWorkspacePublicPagesQuery`, `publishPageMutation`, and `revokePublicPageMutation` from `@affine/graphql`
2023-11-15 16:02:58 +08:00
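For the share-mode query parameter the summary mentions, a minimal sketch (route and parameter names are assumptions):

```ts
export function buildShareUrl(
  baseUrl: string,
  workspaceId: string,
  pageId: string,
  shareMode: 'page' | 'edgeless'
): string {
  const url = new URL(`/share/${workspaceId}/${pageId}`, baseUrl);
  // The mode decides whether the shared page opens in page or edgeless view.
  url.searchParams.set('mode', shareMode);
  return url.toString();
}
```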
LongYinan
e7e617a791 chore: change default branch to canary (#4948) 2023-11-15 07:46:50 +00:00
LongYinan
cc2ade601c ci: only disable postinstall on macOS in nightly desktop build (#4938) 2023-11-14 23:03:49 +08:00
Joooye_34
ea4f5ffc83 fix(infra): workspace migration without blockVersions (#4936) 2023-11-14 23:03:40 +08:00
Peng Xiao
9ac8a32e00 perf(component): use png instead of svg for rendering noise svg (#4935) 2023-11-14 23:03:19 +08:00
DarkSky
8d55e5cdf9 fix: change password token check (#4934) (#4932) 2023-11-14 23:03:01 +08:00
LongYinan
8bcc886b46 ci: disable postinstall in nightly desktop build (#4930)
Should be part of https://github.com/toeverything/AFFiNE/pull/4885
2023-11-14 14:45:52 +08:00
Peng Xiao
f9971ba922 fix(core): change server url of stable to insider (#4902) (#4926) 2023-11-14 14:40:06 +08:00
LongYinan
5b0b8cf216 test(e2e): add subdoc migration test (#4921)
test(e2e): add subdoc migration test

fix: remove .only
2023-11-14 14:39:59 +08:00
Peng Xiao
16488d594c fix(infra): compatibility fix for space prefix (#4912)
It seems there are some cases where [this upstream PR](https://github.com/toeverything/blocksuite/pull/4747) will cause data loss.

For historical reasons, a page id could be different from its doc id.
It might be caused by the subdoc migration in the following (not 100% sure if every white-screen issue is caused by it) 0714c12703/packages/common/infra/src/blocksuite/index.ts (L538-L540)

In version 0.10, page ids in spaces no longer have the prefix "space:".
The data flow for fetching a doc's updates is:
- page id in `meta.pages` -> find `${page-id}` in `doc.spaces` -> `doc` -> `doc.guid`
  if `doc` is not found in `doc.spaces`, a new doc will be created and its `doc.guid` is the same as its pageId
- because of the guid logic change, a doc previously prefixed with `space:` will not be found in `doc.spaces`
- when fetching the rows of this doc using the doc id === page id,
  it will return EMPTY since there are no updates associated with the page id

The fix provided in this PR patches the `spaces` field of the root doc so that, after 0.10, the page doc can still be found in the `spaces` map. It applies to both the idb & sqlite datasources.

Special thanks to @lawvs 's db file for the investigation!
2023-11-14 14:39:50 +08:00
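A minimal sketch of that patch over an abstracted spaces map; the `Doc` type and map shape are stand-ins, not blocksuite's actual types:

```ts
type Doc = unknown;

// If a page's doc is only reachable under the legacy `space:`-prefixed key,
// alias it under the unprefixed page id so post-0.10 lookups still find it.
export function patchSpaces(spaces: Map<string, Doc>, pageIds: string[]): void {
  for (const pageId of pageIds) {
    const legacyKey = `space:${pageId}`;
    if (!spaces.has(pageId) && spaces.has(legacyKey)) {
      spaces.set(pageId, spaces.get(legacyKey)!);
    }
  }
}
```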
liuyi
c44a9a4903 fix(server): wrap updates applying in a transaction (#4922) 2023-11-14 14:39:39 +08:00
Peng Xiao
76b585d1ef fix(storybook): page tags display (#4924) 2023-11-14 09:45:12 +08:00
dependabot[bot]
993974d20d chore: bump the all-cargo-dependencies group with 5 updates (#4918)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 16:01:07 +08:00
LongYinan
f17c0e1268 Merge pull request #4915 from toeverything/chore/hotfix-back-to-master
fix: cherry pick hotfix back to master
2023-11-13 10:12:06 +08:00
Whitewater
eded501123 fix: get page preview based on block order (#4888)
Co-authored-by: Peng Xiao <pengxiao@outlook.com>
2023-11-12 15:09:57 +00:00
DarkSky
ac3756ea23 chore: cleanup deployment 2023-11-12 11:22:21 +08:00
forehalo
dc8e84df31 fix(server): increase server acceptable websocket payload size 2023-11-12 11:22:00 +08:00
Peng Xiao
a8d89254ce fix(electron): dev reload (#4911) 2023-11-12 03:19:27 +00:00
李华桥
7525126d89 fix(core): change server url of stable to insider 2023-11-10 20:07:18 +08:00
Joooye_34
30bac7dce2 ci(core): eslint errors for core (#4662) 2023-11-10 10:25:59 +00:00
Joooye_34
b98a258083 fix(core): change server url of stable to insider (#4902) 2023-11-10 15:46:31 +08:00
Yifeng Wang
28177657ef chore: bump theme (#4904)
Co-authored-by: 李华桥 <joooye1991@gmail.com>
2023-11-10 15:42:06 +08:00
406 changed files with 17918 additions and 13727 deletions

9
.devcontainer/Dockerfile Normal file

@@ -0,0 +1,9 @@
FROM mcr.microsoft.com/devcontainers/base:bookworm
# Install Homebrew For Linux
RUN /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)" && \
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)" && \
echo "eval \"\$($(brew --prefix)/bin/brew shellenv)\"" >> /home/vscode/.zshrc && \
echo "eval \"\$($(brew --prefix)/bin/brew shellenv)\"" >> /home/vscode/.bashrc && \
# Install Graphite
brew install withgraphite/tap/graphite && gt --version

12
.devcontainer/build.sh Normal file

@@ -0,0 +1,12 @@
#!/bin/bash
# This is a script used by the devcontainer to build the project
#Enable yarn
corepack enable
corepack prepare yarn@stable --activate
# install dependencies
yarn install
# Create database
yarn workspace @affine/server prisma db push

25
.devcontainer/devcontainer.json Normal file

@@ -0,0 +1,25 @@
// For format details, see https://aka.ms/devcontainer.json.
{
"name": "Debian",
"dockerComposeFile": "docker-compose.yml",
"service": "app",
"workspaceFolder": "/workspaces/${localWorkspaceFolderBasename}",
"features": {
"ghcr.io/devcontainers/features/node:1": {
"version": "18"
},
"ghcr.io/devcontainers/features/rust:1": {}
},
// Configure tool-specific properties.
"customizations": {
"vscode": {
"extensions": [
"ms-playwright.playwright",
"esbenp.prettier-vscode",
"streetsidesoftware.code-spell-checker"
]
}
},
"updateContentCommand": "bash ./.devcontainer/build.sh",
"postCreateCommand": "bash ./.devcontainer/setup-user.sh"
}

26
.devcontainer/docker-compose.yml Normal file

@@ -0,0 +1,26 @@
version: '3.8'
services:
app:
build:
context: .
dockerfile: Dockerfile
volumes:
- ../..:/workspaces:cached
command: sleep infinity
network_mode: service:db
environment:
DATABASE_URL: postgresql://affine:affine@db:5432/affine
db:
image: postgres:latest
restart: unless-stopped
volumes:
- postgres-data:/var/lib/postgresql/data
environment:
POSTGRES_PASSWORD: affine
POSTGRES_USER: affine
POSTGRES_DB: affine
volumes:
postgres-data:

7
.devcontainer/setup-user.sh Executable file

@@ -0,0 +1,7 @@
if [ -v GRAPHITE_TOKEN ];then
gt auth --token $GRAPHITE_TOKEN
fi
git fetch
git branch canary -t origin/canary
gt init --trunk canary

View File

@@ -58,7 +58,7 @@ const createPattern = packageName => [
const allPackages = [
'packages/backend/server',
'packages/frontend/component',
- 'packages/frontend/web',
+ 'packages/frontend/core',
'packages/frontend/electron',
'packages/frontend/graphql',
'packages/frontend/hooks',
@@ -255,6 +255,12 @@ const config = {
],
'@typescript-eslint/no-misused-promises': ['error'],
'i/no-extraneous-dependencies': ['error'],
+ 'react-hooks/exhaustive-deps': [
+ 'warn',
+ {
+ additionalHooks: 'useAsyncCallback',
+ },
+ ],
},
})),
{

View File

@@ -58,6 +58,6 @@ body:
label: Are you willing to submit a PR?
description: >
(Optional) We encourage you to submit a [Pull Request](https://github.com/toeverything/affine/pulls) (PR) to help improve AFFiNE for everyone, especially if you have a good understanding of how to implement a fix or feature.
- See the AFFiNE [Contributing Guide](https://github.com/toeverything/affine/blob/master/CONTRIBUTING.md) to get started.
+ See the AFFiNE [Contributing Guide](https://github.com/toeverything/affine/blob/canary/CONTRIBUTING.md) to get started.
options:
- label: Yes I'd like to help by submitting a PR!

View File

@@ -31,6 +31,6 @@ body:
label: Are you willing to submit a PR?
description: >
(Optional) We encourage you to submit a [Pull Request](https://github.com/toeverything/affine/pulls) (PR) to help improve AFFiNE for everyone, especially if you have a good understanding of how to implement a fix or feature.
- See the AFFiNE [Contributing Guide](https://github.com/toeverything/affine/blob/master/CONTRIBUTING.md) to get started.
+ See the AFFiNE [Contributing Guide](https://github.com/toeverything/affine/blob/canary/CONTRIBUTING.md) to get started.
options:
- label: Yes I'd like to help by submitting a PR!

View File

@@ -41,8 +41,8 @@ const createHelmCommand = ({ isDryRun }) => {
const staticIpName = isProduction
? 'affine-cluster-production'
: isBeta
? 'affine-cluster-beta'
: 'affine-cluster-dev';
? 'affine-cluster-beta'
: 'affine-cluster-dev';
const redisAndPostgres =
isProduction || isBeta
? [
@@ -68,8 +68,8 @@ const createHelmCommand = ({ isDryRun }) => {
]
: [];
const webReplicaCount = isProduction ? 3 : isBeta ? 2 : 2;
- const graphqlReplicaCount = isProduction ? 10 : isBeta ? 10 : 2;
- const syncReplicaCount = isProduction ? 10 : isBeta ? 10 : 2;
+ const graphqlReplicaCount = isProduction ? 10 : isBeta ? 5 : 2;
+ const syncReplicaCount = isProduction ? 10 : isBeta ? 5 : 2;
const namespace = isProduction ? 'production' : isBeta ? 'beta' : 'dev';
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
const host = DEPLOY_HOST || CANARY_DEPLOY_HOST;

View File

@@ -49,9 +49,9 @@ runs:
cache: 'yarn'
- name: Set nmMode
- if: ${{ inputs.hard-link-nm == 'true' }}
+ if: ${{ inputs.hard-link-nm == 'false' }}
shell: bash
- run: yarn config set nmMode hardlinks-local
+ run: yarn config set nmMode classic
- name: Set nmHoistingLimits
if: ${{ inputs.nmHoistingLimits }}

View File

@@ -35,6 +35,8 @@ spec:
key: key
- name: NODE_ENV
value: "{{ .Values.env }}"
+ - name: NODE_OPTIONS
+ value: "--max-old-space-size=4096"
- name: NO_COLOR
value: "1"
- name: SERVER_FLAVOR

View File

@@ -1,13 +0,0 @@
{{- if .Values.global.gke.enabled -}}
apiVersion: monitoring.googleapis.com/v1
kind: PodMonitoring
metadata:
name: "{{ .Chart.Name }}-monitoring"
spec:
selector:
matchLabels:
app.kubernetes.io/name: "{{ include "graphql.name" . }}"
endpoints:
- port: {{ .Values.service.port }}
interval: 30s
{{- end }}

View File

@@ -72,11 +72,8 @@ podSecurityContext:
fsGroup: 2000
resources:
- limits:
- cpu: '4'
- memory: 8Gi
requests:
- cpu: '2'
+ cpu: '4'
memory: 4Gi
probe:

View File

@@ -1,13 +0,0 @@
{{- if .Values.global.gke.enabled -}}
apiVersion: monitoring.googleapis.com/v1
kind: PodMonitoring
metadata:
name: "{{ .Chart.Name }}-monitoring"
spec:
selector:
matchLabels:
app.kubernetes.io/name: "{{ include "sync.name" . }}"
endpoints:
- port: {{ .Values.service.port }}
interval: 30s
{{- end }}

View File

@@ -3,7 +3,7 @@ name: Build(Desktop) & Test
on:
push:
branches:
- - master
+ - canary
- v[0-9]+.[0-9]+.x-staging
- v[0-9]+.[0-9]+.x
paths-ignore:
@@ -15,7 +15,7 @@ on:
pull_request:
merge_group:
branches:
- - master
+ - canary
- v[0-9]+.[0-9]+.x-staging
- v[0-9]+.[0-9]+.x
paths-ignore:
@@ -159,7 +159,8 @@ jobs:
env:
SKIP_BUNDLE: true
SKIP_WEB_BUILD: true
- run: yarn workspace @affine/electron make --platform=darwin --arch=arm64
+ HOIST_NODE_MODULES: 1
+ run: yarn workspace @affine/electron package --platform=darwin --arch=arm64
- name: Output check
if: ${{ matrix.spec.os == 'macos-latest' && matrix.spec.arch == 'arm64' }}

View File

@@ -3,7 +3,7 @@ name: Build(Server) & Test
on:
push:
branches:
- - master
+ - canary
- v[0-9]+.[0-9]+.x-staging
- v[0-9]+.[0-9]+.x
paths-ignore:
@@ -15,7 +15,7 @@ on:
pull_request:
merge_group:
branches:
- - master
+ - canary
- v[0-9]+.[0-9]+.x-staging
- v[0-9]+.[0-9]+.x
paths-ignore:

View File

@@ -3,7 +3,7 @@ name: Build & Test
on:
push:
branches:
- - master
+ - canary
- v[0-9]+.[0-9]+.x-staging
- v[0-9]+.[0-9]+.x
paths-ignore:
@@ -15,7 +15,7 @@ on:
pull_request:
merge_group:
branches:
- - master
+ - canary
- v[0-9]+.[0-9]+.x-staging
- v[0-9]+.[0-9]+.x
paths-ignore:

View File

@@ -13,11 +13,11 @@ name: 'CodeQL'
on:
push:
- branches: [master]
+ branches: [canary]
pull_request:
merge_group:
# The branches below must be a subset of the branches above
- branches: [master]
+ branches: [canary]
jobs:
analyze:

View File

@@ -35,7 +35,7 @@ jobs:
build-core:
name: Build @affine/core
runs-on: ubuntu-latest
+ environment: ${{ github.event.inputs.flavor }}
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
@@ -52,6 +52,10 @@ jobs:
SHOULD_REPORT_TRACE: true
TRACE_REPORT_ENDPOINT: ${{ secrets.TRACE_REPORT_ENDPOINT }}
CAPTCHA_SITE_KEY: ${{ secrets.CAPTCHA_SITE_KEY }}
+ SENTRY_ORG: ${{ secrets.SENTRY_ORG }}
+ SENTRY_PROJECT: ${{ secrets.SENTRY_PROJECT }}
+ SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
+ SENTRY_DSN: ${{ secrets.SENTRY_DSN }}
- name: Upload core artifact
uses: actions/upload-artifact@v3
with:

View File

@@ -2,7 +2,7 @@ name: Release Charts
on:
push:
- branches: [master]
+ branches: [canary]
paths:
- '.github/helm/**/Chart.yml'

View File

@@ -6,7 +6,7 @@ on:
- labeled
- unlabeled
branches:
- - master
+ - canary
jobs:
check_labels:

View File

@@ -2,13 +2,13 @@ name: Languages Sync
on:
push:
- branches: ['master']
+ branches: ['canary']
paths:
- 'packages/frontend/i18n/**'
- '.github/workflows/languages-sync.yml'
- '!.github/actions/setup-node/action.yml'
pull_request_target:
- branches: ['master']
+ branches: ['canary']
paths:
- 'packages/frontend/i18n/**'
- '.github/workflows/languages-sync.yml'
@@ -23,13 +23,13 @@ jobs:
- name: Setup Node.js
uses: ./.github/actions/setup-node
- name: Check Language Key
- if: github.ref != 'refs/heads/master'
+ if: github.ref != 'refs/heads/canary'
run: yarn workspace @affine/i18n run sync-languages:check
env:
TOLGEE_API_KEY: ${{ secrets.TOLGEE_API_KEY }}
- name: Sync Languages
- if: github.ref == 'refs/heads/master'
+ if: github.ref == 'refs/heads/canary'
run: yarn workspace @affine/i18n run sync-languages
env:
TOLGEE_API_KEY: ${{ secrets.TOLGEE_API_KEY }}

View File

@@ -70,8 +70,8 @@ jobs:
env:
SENTRY_ORG: ${{ secrets.SENTRY_ORG }}
SENTRY_PROJECT: ${{ secrets.SENTRY_PROJECT }}
NEXT_PUBLIC_SENTRY_DSN: ${{ secrets.NEXT_PUBLIC_SENTRY_DSN }}
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_DSN: ${{ secrets.SENTRY_DSN }}
RELEASE_VERSION: ${{ needs.set-build-version.outputs.version }}
- name: Upload core artifact

View File

@@ -7,7 +7,7 @@ on:
- edited
- synchronize
branches:
- master
- canary
permissions:
contents: read

View File

@@ -7,10 +7,10 @@ on:
workflow_dispatch:
push:
branches:
- master
- canary
pull_request:
branches:
- master
- canary
paths-ignore:
- README.md
- .github/**

View File

@@ -40,6 +40,7 @@ env:
jobs:
before-make:
runs-on: ubuntu-latest
environment: ${{ github.event.inputs.build-type || (github.ref_type == 'tag' && contains(github.ref, 'canary') && 'canary') }}
outputs:
RELEASE_VERSION: ${{ steps.get-canary-version.outputs.RELEASE_VERSION }}
steps:
@@ -65,6 +66,7 @@ jobs:
SENTRY_ORG: ${{ secrets.SENTRY_ORG }}
SENTRY_PROJECT: ${{ secrets.SENTRY_PROJECT }}
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_DSN: ${{ secrets.SENTRY_DSN }}
RELEASE_VERSION: ${{ github.event.inputs.version || steps.get-canary-version.outputs.RELEASE_VERSION }}
SKIP_PLUGIN_BUILD: 'true'

View File

@@ -3,7 +3,7 @@ name: Release
on:
push:
branches:
- master
- canary
env:
BUILD_TYPE: stable
@@ -89,7 +89,7 @@ jobs:
if-no-files-found: error
build-docker:
if: github.ref == 'refs/heads/master'
if: github.ref == 'refs/heads/canary'
name: Build Docker
runs-on: ubuntu-latest
needs:

View File

@@ -3,7 +3,7 @@ name: Deploy Cloudflare Worker
on:
push:
branches:
- master
- canary
paths:
- tools/workers/**

1
.gitignore vendored
View File

@@ -78,3 +78,4 @@ tsconfig.node.tsbuildinfo
lib
affine.db
apps/web/next-routes.conf
.nx

View File

@@ -1,13 +0,0 @@
diff --git a/dist/util/forge-config.js b/dist/util/forge-config.js
index 3466ac1a340c8dfe5ea8997178961e8328457d68..ceb33770db48df80e4355e6bac12e8c99162d7bc 100644
--- a/dist/util/forge-config.js
+++ b/dist/util/forge-config.js
@@ -130,7 +130,7 @@ exports.default = async (dir) => {
try {
// The loaded "config" could potentially be a static forge config, ESM module or async function
// eslint-disable-next-line @typescript-eslint/no-var-requires
- const loaded = require(path_1.default.resolve(dir, forgeConfig));
+ const loaded = await import(require('node:url').pathToFileURL(path_1.default.join(dir, forgeConfig)))
const maybeForgeConfig = 'default' in loaded ? loaded.default : loaded;
forgeConfig = typeof maybeForgeConfig === 'function' ? await maybeForgeConfig() : maybeForgeConfig;
}
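
For context, the patched loader swaps a CommonJS `require` for a dynamic `import()` so that an ESM forge config can be loaded from CommonJS output; `pathToFileURL` is what lets `import()` accept an absolute path on Windows. A minimal sketch of the same technique, with hypothetical names:

```ts
import path from 'node:path';
import { pathToFileURL } from 'node:url';

// Hypothetical helper mirroring the patched loader; not the real
// electron-forge API. Works for CJS, ESM, and async-function configs.
async function loadForgeConfig(dir: string, file: string): Promise<unknown> {
  // import() needs a file:// URL to accept absolute Windows paths
  const loaded = await import(pathToFileURL(path.join(dir, file)).href);
  // ESM modules expose exports under `default`; plain CJS may not
  const maybeConfig = 'default' in loaded ? loaded.default : loaded;
  return typeof maybeConfig === 'function' ? await maybeConfig() : maybeConfig;
}
```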

View File

@@ -1,15 +1,15 @@
diff --git a/package.json b/package.json
index 26dcf8217f3e221e4c53722f14d29bb788332772..57a66dcb0943b9dd5cdaac2eaffccd9225a6b735 100644
index ca30bca63196b923fa5a27eb85ce2ee890222d36..39e9d08dea40f25568a39bfbc0154458d32c8a66 100644
--- a/package.json
+++ b/package.json
@@ -34,6 +34,10 @@
"./adapters": {
"types": "./adapters.d.ts"
@@ -31,6 +31,10 @@
"types": "./index.d.ts",
"default": "./index.js"
},
+ "./core": {
+ "types": "./core/index.d.ts",
+ "default": "./core/index.js"
+ },
"./jwt": {
"types": "./jwt/index.d.ts",
"default": "./jwt/index.js"
"./adapters": {
"types": "./adapters.d.ts"
},

50
Cargo.lock generated
View File

@@ -1241,12 +1241,12 @@ checksum = "a08173bc88b7955d1b3145aa561539096c421ac8debde8cbc3612ec635fee29b"
[[package]]
name = "libloading"
version = "0.7.4"
version = "0.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b67380fd3b2fbe7527a606e18729d21c6f3951633d0500574c4dc22d2d638b9f"
checksum = "c571b676ddfc9a8c12f1f3d3085a7b163966a8fd8098a90640953ce5f6170161"
dependencies = [
"cfg-if",
"winapi",
"windows-sys",
]
[[package]]
@@ -1354,9 +1354,9 @@ dependencies = [
[[package]]
name = "mio"
version = "0.8.8"
version = "0.8.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "927a765cd3fc26206e66b296465fa9d3e5ab003e651c1b3c060e7956d96b19d2"
checksum = "3dce281c5e46beae905d4de1870d8b1509a9142b62eedf18b443b011ca8343d0"
dependencies = [
"libc",
"log",
@@ -1375,9 +1375,9 @@ dependencies = [
[[package]]
name = "napi"
version = "2.13.3"
version = "2.14.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd063c93b900149304e3ba96ce5bf210cd4f81ef5eb80ded0d100df3e85a3ac0"
checksum = "f9d90182620f32fe34b6ac9b52cba898af26e94c7f5abc01eb4094c417ae2e6c"
dependencies = [
"anyhow",
"bitflags 2.4.1",
@@ -1393,15 +1393,15 @@ dependencies = [
[[package]]
name = "napi-build"
version = "2.0.1"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "882a73d9ef23e8dc2ebbffb6a6ae2ef467c0f18ac10711e4cc59c5485d41df0e"
checksum = "d4b4532cf86bfef556348ac65e561e3123879f0e7566cca6d43a6ff5326f13df"
[[package]]
name = "napi-derive"
version = "2.13.0"
version = "2.14.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "da1c6a8fa84d549aa8708fcd062372bf8ec6e849de39016ab921067d21bde367"
checksum = "3619fa472d23cd5af94d63a2bae454a77a8863251f40230fbf59ce20eafa8a86"
dependencies = [
"cfg-if",
"convert_case",
@@ -1413,9 +1413,9 @@ dependencies = [
[[package]]
name = "napi-derive-backend"
version = "1.0.52"
version = "1.0.54"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "20bbc7c69168d06a848f925ec5f0e0997f98e8c8d4f2cc30157f0da51c009e17"
checksum = "ecd3ea4b54020c73d591a49cd192f6334c5f37f71a63ead54dbc851fa991ef00"
dependencies = [
"convert_case",
"once_cell",
@@ -1428,9 +1428,9 @@ dependencies = [
[[package]]
name = "napi-sys"
version = "2.2.3"
version = "2.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "166b5ef52a3ab5575047a9fe8d4a030cdd0f63c96f071cd6907674453b07bae3"
checksum = "2503fa6af34dc83fb74888df8b22afe933b58d37daf7d80424b1c60c68196b8b"
dependencies = [
"libloading",
]
@@ -2292,18 +2292,18 @@ checksum = "836fa6a3e1e547f9a2c4040802ec865b5d85f4014efe00555d7090a3dcaa1090"
[[package]]
name = "serde"
version = "1.0.190"
version = "1.0.192"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "91d3c334ca1ee894a2c6f6ad698fe8c435b76d504b13d436f0685d648d6d96f7"
checksum = "bca2a08484b285dcb282d0f67b26cadc0df8b19f8c12502c13d966bf9482f001"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.190"
version = "1.0.192"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "67c5609f394e5c2bd7fc51efda478004ea80ef42fee983d5c67a65e34f32c0e3"
checksum = "d6c7207fbec9faa48073f3e3074cbe553af6ea512d7c21ba46e434e70ea9fbc1"
dependencies = [
"proc-macro2",
"quote",
@@ -2817,9 +2817,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "tokio"
version = "1.33.0"
version = "1.34.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f38200e3ef7995e5ef13baec2f432a6da0aa9ac495b2c0e8f3b7eec2c92d653"
checksum = "d0c014766411e834f7af5b8f4cf46257aab4036ca95e9d2c144a10f59ad6f5b9"
dependencies = [
"backtrace",
"bytes",
@@ -2836,9 +2836,9 @@ dependencies = [
[[package]]
name = "tokio-macros"
version = "2.1.0"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "630bdcf245f78637c13ec01ffae6187cca34625e8c63150d424b59e55af2675e"
checksum = "5b8a1e28f2deaa14e508979454cb3a223b10b938b45af148bc0986de36f1923b"
dependencies = [
"proc-macro2",
"quote",
@@ -3032,9 +3032,9 @@ checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "uuid"
version = "1.5.0"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "88ad59a7560b41a70d191093a945f0b87bc1deeda46fb237479708a1d6b6cdfc"
checksum = "c58fe91d841bc04822c9801002db4ea904b9e4b8e6bbad25127b46eff8dc516b"
dependencies = [
"getrandom",
"rand",

View File

@@ -195,6 +195,13 @@ For feature request, please see [community.affine.pro](https://community.affine.
## Building
### Codespaces
From the GitHub repo main page, click the green "Code" button and select "Create codespace on master". This will open a new Codespace with the (supposedly auto-forked) AFFiNE repo cloned, built, and ready to go.
### Local
See [BUILDING.md] for instructions on how to build AFFiNE from source code.
## Contributing
@@ -220,10 +227,10 @@ See [LICENSE] for details.
[update page]: https://affine.pro/blog?tag=Release%20Note
[jobs available]: ./docs/jobs.md
[latest packages]: https://github.com/toeverything/AFFiNE/pkgs/container/affine-self-hosted
[contributor license agreement]: https://github.com/toeverything/affine/edit/master/.github/CLA.md
[contributor license agreement]: https://github.com/toeverything/affine/edit/canary/.github/CLA.md
[rust-version-icon]: https://img.shields.io/badge/Rust-1.71.0-dea584
[stars-icon]: https://img.shields.io/github/stars/toeverything/AFFiNE.svg?style=flat&logo=github&colorB=red&label=stars
[codecov]: https://codecov.io/gh/toeverything/affine/branch/master/graphs/badge.svg?branch=master
[codecov]: https://codecov.io/gh/toeverything/affine/branch/canary/graphs/badge.svg?branch=canary
[node-version-icon]: https://img.shields.io/badge/node-%3E=18.16.1-success
[typescript-version-icon]: https://img.shields.io/github/package-json/dependency-version/toeverything/affine/dev/typescript
[react-version-icon]: https://img.shields.io/github/package-json/dependency-version/toeverything/AFFiNE/react?filename=packages%2Ffrontend%2Fcore%2Fpackage.json&color=rgb(97%2C228%2C251)

View File

@@ -13,7 +13,7 @@ Use the table of contents icon on the top left corner of this document to get to
Currently we have two versions of AFFiNE:
- [AFFiNE Pre-Alpha](https://livedemo.affine.pro/). This version uses the `Pre-Alpha` branch; it is no longer actively developed but contains some different functions and features.
- [AFFiNE Alpha](https://pathfinder.affine.pro/). This version uses the `master` branch; this is the latest version under active development.
- [AFFiNE Alpha](https://pathfinder.affine.pro/). This version uses the `canary` branch; this is the latest version under active development.
To get an overview of the project, read the [README](../README.md). Here are some resources to help you get started with open source contributions:

View File

@@ -11,7 +11,7 @@ The AFFiNE core team gives release authorization. And also have the following re
## How to make a release?
Before releasing, ensure you have the latest version of the `master` branch.
Before releasing, ensure you have the latest version of the `canary` branch.
And read the semver specification to understand how to version your release: https://semver.org
@@ -21,13 +21,13 @@ And Read the semver specification to understand how to version your release. htt
./scripts/set-version.sh 0.5.4-canary.5
```
### 2. Commit changes and push to `master`
### 2. Commit changes and push to `canary`
```shell
git add .
# vx.y.z-canary.n
git commit -m "v0.5.4-canary.5"
git push origin master
git push origin canary
```
### 3. Create a release action

View File

@@ -11,7 +11,7 @@
}
},
"affected": {
"defaultBase": "master"
"defaultBase": "canary"
},
"namedInputs": {
"default": ["{projectRoot}/**/*", "sharedGlobals"],
@@ -56,7 +56,7 @@
"env": "SENTRY_AUTH_TOKEN"
},
{
"env": "NEXT_PUBLIC_SENTRY_DSN"
"env": "SENTRY_DSN"
},
{
"env": "DISTRIBUTION"

View File

@@ -1,6 +1,6 @@
{
"name": "@affine/monorepo",
"version": "0.10.2",
"version": "0.10.3",
"private": true,
"author": "toeverything",
"license": "MIT",
@@ -38,7 +38,6 @@
"test": "vitest --run",
"test:ui": "vitest --ui",
"test:coverage": "vitest run --coverage",
"notify": "node scripts/notify.mjs",
"circular": "madge --circular --ts-config ./tsconfig.json ./packages/frontend/core/src/pages/**/*.tsx ./packages/frontend/core/src/index.tsx ./packages/frontend/electron/src/*/index.ts",
"typecheck": "tsc -b tsconfig.json --diagnostics",
"postinstall": "node ./scripts/check-version.mjs && yarn i18n-codegen gen && yarn husky install"
@@ -46,11 +45,11 @@
"lint-staged": {
"*": "prettier --write --ignore-unknown --cache",
"*.{ts,tsx,mjs,js,jsx}": [
"prettier . --ignore-unknown --write",
"prettier --ignore-unknown --write",
"eslint --cache --fix"
],
"*.toml": [
"prettier . --ignore-unknown --write",
"prettier --ignore-unknown --write",
"taplo format"
]
},
@@ -58,58 +57,58 @@
"@affine-test/kit": "workspace:*",
"@affine/cli": "workspace:*",
"@affine/plugin-cli": "workspace:*",
"@commitlint/cli": "^17.8.0",
"@commitlint/config-conventional": "^17.8.0",
"@faker-js/faker": "^8.2.0",
"@commitlint/cli": "^18.4.3",
"@commitlint/config-conventional": "^18.4.3",
"@faker-js/faker": "^8.3.1",
"@istanbuljs/schema": "^0.1.3",
"@magic-works/i18n-codegen": "^0.5.0",
"@nx/vite": "16.10.0",
"@nx/vite": "17.1.3",
"@perfsee/sdk": "^1.9.0",
"@playwright/test": "^1.39.0",
"@playwright/test": "^1.40.0",
"@taplo/cli": "^0.5.2",
"@testing-library/react": "^14.0.0",
"@testing-library/react": "^14.1.2",
"@toeverything/infra": "workspace:*",
"@types/affine__env": "workspace:*",
"@types/eslint": "^8.44.4",
"@types/node": "^18.18.5",
"@typescript-eslint/eslint-plugin": "^6.7.5",
"@typescript-eslint/parser": "^6.7.5",
"@vanilla-extract/vite-plugin": "^3.9.0",
"@types/eslint": "^8.44.7",
"@types/node": "^20.9.3",
"@typescript-eslint/eslint-plugin": "^6.12.0",
"@typescript-eslint/parser": "^6.12.0",
"@vanilla-extract/vite-plugin": "^3.9.2",
"@vanilla-extract/webpack-plugin": "^2.3.1",
"@vitejs/plugin-react-swc": "^3.4.0",
"@vitejs/plugin-react-swc": "^3.5.0",
"@vitest/coverage-istanbul": "0.34.6",
"@vitest/ui": "0.34.6",
"electron": "^27.0.0",
"eslint": "^8.51.0",
"electron": "^27.1.0",
"eslint": "^8.54.0",
"eslint-config-prettier": "^9.0.0",
"eslint-plugin-i": "^2.28.1",
"eslint-plugin-i": "^2.29.0",
"eslint-plugin-prettier": "^5.0.1",
"eslint-plugin-react": "^7.33.2",
"eslint-plugin-react-hooks": "^4.6.0",
"eslint-plugin-simple-import-sort": "^10.0.0",
"eslint-plugin-sonarjs": "^0.21.0",
"eslint-plugin-unicorn": "^48.0.1",
"eslint-plugin-sonarjs": "^0.23.0",
"eslint-plugin-unicorn": "^49.0.0",
"eslint-plugin-unused-imports": "^3.0.0",
"eslint-plugin-vue": "^9.17.0",
"fake-indexeddb": "5.0.0",
"happy-dom": "^12.9.1",
"eslint-plugin-vue": "^9.18.1",
"fake-indexeddb": "5.0.1",
"happy-dom": "^12.10.3",
"husky": "^8.0.3",
"lint-staged": "^15.0.0",
"lint-staged": "^15.1.0",
"madge": "^6.1.0",
"msw": "^1.3.2",
"nanoid": "^5.0.1",
"nx": "^16.10.0",
"msw": "^2.0.8",
"nanoid": "^5.0.3",
"nx": "^17.1.3",
"nx-cloud": "^16.5.2",
"nyc": "^15.1.0",
"prettier": "^3.0.3",
"prettier": "^3.1.0",
"semver": "^7.5.4",
"serve": "^14.2.1",
"string-width": "^6.1.0",
"string-width": "^7.0.0",
"ts-node": "^10.9.1",
"typescript": "^5.2.2",
"vite": "^4.4.11",
"typescript": "^5.3.2",
"vite": "^5.0.1",
"vite-plugin-istanbul": "^5.0.0",
"vite-plugin-static-copy": "^0.17.0",
"vite-plugin-static-copy": "^0.17.1",
"vite-tsconfig-paths": "^4.2.1",
"vitest": "0.34.6",
"vitest-fetch-mock": "^0.2.2",
@@ -173,9 +172,8 @@
"unbox-primitive": "npm:@nolyfill/unbox-primitive@latest",
"which-boxed-primitive": "npm:@nolyfill/which-boxed-primitive@latest",
"which-typed-array": "npm:@nolyfill/which-typed-array@latest",
"next-auth@^4.23.2": "patch:next-auth@npm%3A4.23.2#./.yarn/patches/next-auth-npm-4.23.2-5f0e551bc7.patch",
"@electron-forge/core@^6.4.2": "patch:@electron-forge/core@npm%3A6.4.2#./.yarn/patches/@electron-forge-core-npm-6.4.2-ab60c87e75.patch",
"@electron-forge/core@6.4.2": "patch:@electron-forge/core@npm%3A6.4.2#./.yarn/patches/@electron-forge-core-npm-6.4.2-ab60c87e75.patch",
"next-auth@^4.24.5": "patch:next-auth@npm%3A4.24.5#~/.yarn/patches/next-auth-npm-4.24.5-8428e11927.patch",
"@reforged/maker-appimage/@electron-forge/maker-base": "7.1.0",
"macos-alias": "npm:macos-alias-building@latest",
"fs-xattr": "npm:@napi-rs/xattr@latest"
}

View File

@@ -0,0 +1,14 @@
-- AlterTable
ALTER TABLE "blobs" ADD COLUMN "deleted_at" TIMESTAMPTZ(6);
-- CreateTable
CREATE TABLE "snapshot_histories" (
"workspace_id" VARCHAR(36) NOT NULL,
"guid" VARCHAR(36) NOT NULL,
"timestamp" TIMESTAMPTZ(6) NOT NULL,
"blob" BYTEA NOT NULL,
"state" BYTEA,
"expired_at" TIMESTAMPTZ(6) NOT NULL,
CONSTRAINT "snapshot_histories_pkey" PRIMARY KEY ("workspace_id","guid","timestamp")
);

View File

@@ -1,3 +1,3 @@
# Please do not edit this file manually
# It should be added in your version-control system (i.e. Git)
provider = "postgresql"
provider = "postgresql"

View File

@@ -1,7 +1,7 @@
{
"name": "@affine/server",
"private": true,
"version": "0.10.2",
"version": "0.10.3",
"description": "Affine Node.js server",
"type": "module",
"bin": {
@@ -18,42 +18,46 @@
"predeploy": "yarn prisma migrate deploy && node --es-module-specifier-resolution node ./dist/data/app.js run"
},
"dependencies": {
"@apollo/server": "^4.9.4",
"@auth/prisma-adapter": "^1.0.3",
"@aws-sdk/client-s3": "^3.433.0",
"@apollo/server": "^4.9.5",
"@auth/prisma-adapter": "^1.0.7",
"@aws-sdk/client-s3": "^3.454.0",
"@google-cloud/opentelemetry-cloud-monitoring-exporter": "^0.17.0",
"@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0",
"@keyv/redis": "^2.8.0",
"@nestjs/apollo": "^12.0.9",
"@nestjs/common": "^10.2.7",
"@nestjs/core": "^10.2.7",
"@nestjs/event-emitter": "^2.0.2",
"@nestjs/graphql": "^12.0.9",
"@nestjs/platform-express": "^10.2.7",
"@nestjs/platform-socket.io": "^10.2.7",
"@nestjs/throttler": "^5.0.0",
"@nestjs/websockets": "^10.2.7",
"@nestjs/apollo": "^12.0.11",
"@nestjs/common": "^10.2.10",
"@nestjs/core": "^10.2.10",
"@nestjs/event-emitter": "^2.0.3",
"@nestjs/graphql": "^12.0.11",
"@nestjs/platform-express": "^10.2.10",
"@nestjs/platform-socket.io": "^10.2.10",
"@nestjs/schedule": "^4.0.0",
"@nestjs/throttler": "^5.0.1",
"@nestjs/websockets": "^10.2.10",
"@node-rs/argon2": "^1.5.2",
"@node-rs/crc32": "^1.7.2",
"@node-rs/jsonwebtoken": "^0.2.3",
"@opentelemetry/api": "^1.6.0",
"@opentelemetry/core": "^1.17.1",
"@opentelemetry/instrumentation": "^0.44.0",
"@opentelemetry/instrumentation-graphql": "^0.35.2",
"@opentelemetry/instrumentation-http": "^0.44.0",
"@opentelemetry/instrumentation-ioredis": "^0.35.2",
"@opentelemetry/instrumentation-nestjs-core": "^0.33.2",
"@opentelemetry/instrumentation-socket.io": "^0.34.2",
"@opentelemetry/sdk-metrics": "^1.17.1",
"@opentelemetry/sdk-node": "^0.44.0",
"@opentelemetry/sdk-trace-node": "^1.17.1",
"@prisma/client": "^5.4.2",
"@prisma/instrumentation": "^5.4.2",
"@opentelemetry/api": "^1.7.0",
"@opentelemetry/core": "^1.18.1",
"@opentelemetry/exporter-prometheus": "^0.45.1",
"@opentelemetry/exporter-zipkin": "^1.18.1",
"@opentelemetry/host-metrics": "^0.33.2",
"@opentelemetry/instrumentation": "^0.45.1",
"@opentelemetry/instrumentation-graphql": "^0.36.0",
"@opentelemetry/instrumentation-http": "^0.45.1",
"@opentelemetry/instrumentation-ioredis": "^0.35.3",
"@opentelemetry/instrumentation-nestjs-core": "^0.33.3",
"@opentelemetry/instrumentation-socket.io": "^0.34.3",
"@opentelemetry/sdk-metrics": "^1.18.1",
"@opentelemetry/sdk-node": "^0.45.1",
"@opentelemetry/sdk-trace-node": "^1.18.1",
"@prisma/client": "^5.6.0",
"@prisma/instrumentation": "^5.6.0",
"@socket.io/redis-adapter": "^8.2.1",
"cookie-parser": "^1.4.6",
"dotenv": "^16.3.1",
"express": "^4.18.2",
"file-type": "^18.5.0",
"file-type": "^18.7.0",
"get-stream": "^8.0.1",
"graphql": "^16.8.1",
"graphql-type-json": "^0.3.2",
@@ -61,49 +65,49 @@
"ioredis": "^5.3.2",
"keyv": "^4.5.4",
"lodash-es": "^4.17.21",
"nanoid": "^5.0.1",
"nest-commander": "^3.12.0",
"nanoid": "^5.0.3",
"nest-commander": "^3.12.2",
"nestjs-throttler-storage-redis": "^0.4.1",
"next-auth": "^4.23.2",
"nodemailer": "^6.9.6",
"next-auth": "^4.24.5",
"nodemailer": "^6.9.7",
"on-headers": "^1.0.2",
"parse-duration": "^1.1.0",
"pretty-time": "^1.1.0",
"prisma": "^5.4.2",
"prisma": "^5.6.0",
"prom-client": "^15.0.0",
"reflect-metadata": "^0.1.13",
"rxjs": "^7.8.1",
"semver": "^7.5.4",
"socket.io": "^4.7.2",
"stripe": "^14.1.0",
"stripe": "^14.5.0",
"ws": "^8.14.2",
"yjs": "^13.6.8"
"yjs": "^13.6.10"
},
"devDependencies": {
"@affine-test/kit": "workspace:*",
"@affine/storage": "workspace:*",
"@napi-rs/image": "^1.7.0",
"@nestjs/testing": "^10.2.7",
"@types/cookie-parser": "^1.4.4",
"@types/engine.io": "^3.1.8",
"@types/express": "^4.17.19",
"@types/graphql-upload": "^16.0.3",
"@nestjs/testing": "^10.2.10",
"@types/cookie-parser": "^1.4.6",
"@types/engine.io": "^3.1.10",
"@types/express": "^4.17.21",
"@types/graphql-upload": "^16.0.5",
"@types/keyv": "^4.2.0",
"@types/lodash-es": "^4.17.9",
"@types/node": "^18.18.5",
"@types/nodemailer": "^6.4.11",
"@types/on-headers": "^1.0.1",
"@types/pretty-time": "^1.1.3",
"@types/sinon": "^10.0.19",
"@types/supertest": "^2.0.14",
"@types/ws": "^8.5.7",
"@types/lodash-es": "^4.17.11",
"@types/node": "^20.9.3",
"@types/nodemailer": "^6.4.14",
"@types/on-headers": "^1.0.3",
"@types/pretty-time": "^1.1.5",
"@types/sinon": "^17.0.2",
"@types/supertest": "^2.0.16",
"@types/ws": "^8.5.10",
"ava": "^5.3.1",
"c8": "^8.0.1",
"nodemon": "^3.0.1",
"sinon": "^16.1.0",
"sinon": "^17.0.1",
"supertest": "^6.3.3",
"ts-node": "^10.9.1",
"typescript": "^5.2.2"
"typescript": "^5.3.2"
},
"ava": {
"extensions": {

View File

@@ -164,12 +164,14 @@ model VerificationToken {
}
model Blob {
id Int @id @default(autoincrement()) @db.Integer
hash String @db.VarChar
workspaceId String @map("workspace_id") @db.VarChar
blob Bytes @db.ByteA
id Int @id @default(autoincrement()) @db.Integer
hash String @db.VarChar
workspaceId String @map("workspace_id") @db.VarChar
blob Bytes @db.ByteA
length BigInt
createdAt DateTime @default(now()) @map("created_at") @db.Timestamptz(6)
createdAt DateTime @default(now()) @map("created_at") @db.Timestamptz(6)
// not for keeping, but for snapshot history
deletedAt DateTime? @map("deleted_at") @db.Timestamptz(6)
@@unique([workspaceId, hash])
@@map("blobs")
@@ -191,8 +193,8 @@ model OptimizedBlob {
// the latest snapshot of each doc that we've seen
// Snapshot + Updates are the latest state of the doc
model Snapshot {
id String @default(uuid()) @map("guid") @db.VarChar
workspaceId String @map("workspace_id") @db.VarChar
id String @default(uuid()) @map("guid") @db.VarChar
blob Bytes @db.ByteA
seq Int @default(0) @db.Integer
state Bytes? @db.ByteA
@@ -214,6 +216,18 @@ model Update {
@@map("updates")
}
model SnapshotHistory {
workspaceId String @map("workspace_id") @db.VarChar(36)
id String @map("guid") @db.VarChar(36)
timestamp DateTime @db.Timestamptz(6)
blob Bytes @db.ByteA
state Bytes? @db.ByteA
expiredAt DateTime @map("expired_at") @db.Timestamptz(6)
@@id([workspaceId, id, timestamp])
@@map("snapshot_histories")
}
model NewFeaturesWaitingList {
id String @id @default(uuid()) @db.VarChar
email String @unique

View File

@@ -1,8 +1,8 @@
import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { CacheModule } from './cache';
import { ConfigModule } from './config';
import { MetricsModule } from './metrics';
import { BusinessModules } from './modules';
import { AuthModule } from './modules/auth';
import { PrismaModule } from './prisma';
@@ -10,17 +10,18 @@ import { SessionModule } from './session';
import { StorageModule } from './storage';
import { RateLimiterModule } from './throttler';
const BasicModules = [
PrismaModule,
ConfigModule.forRoot(),
CacheModule,
StorageModule.forRoot(),
SessionModule,
RateLimiterModule,
AuthModule,
];
@Module({
imports: [
PrismaModule,
ConfigModule.forRoot(),
StorageModule.forRoot(),
MetricsModule,
SessionModule,
RateLimiterModule,
AuthModule,
...BusinessModules,
],
imports: [...BasicModules, ...BusinessModules],
controllers: [AppController],
})
export class AppModule {}

View File

@@ -0,0 +1,330 @@
import Keyv from 'keyv';
export interface CacheSetOptions {
// in milliseconds
ttl?: number;
}
// extend if needed
export interface Cache {
// standard operation
get<T = unknown>(key: string): Promise<T | undefined>;
set<T = unknown>(
key: string,
value: T,
opts?: CacheSetOptions
): Promise<boolean>;
setnx<T = unknown>(
key: string,
value: T,
opts?: CacheSetOptions
): Promise<boolean>;
increase(key: string, count?: number): Promise<number>;
decrease(key: string, count?: number): Promise<number>;
delete(key: string): Promise<boolean>;
has(key: string): Promise<boolean>;
ttl(key: string): Promise<number>;
expire(key: string, ttl: number): Promise<boolean>;
// list operations
pushBack<T = unknown>(key: string, ...values: T[]): Promise<number>;
pushFront<T = unknown>(key: string, ...values: T[]): Promise<number>;
len(key: string): Promise<number>;
list<T = unknown>(key: string, start: number, end: number): Promise<T[]>;
popFront<T = unknown>(key: string, count?: number): Promise<T[]>;
popBack<T = unknown>(key: string, count?: number): Promise<T[]>;
// map operations
mapSet<T = unknown>(
map: string,
key: string,
value: T,
opts: CacheSetOptions
): Promise<boolean>;
mapIncrease(map: string, key: string, count?: number): Promise<number>;
mapDecrease(map: string, key: string, count?: number): Promise<number>;
mapGet<T = unknown>(map: string, key: string): Promise<T | undefined>;
mapDelete(map: string, key: string): Promise<boolean>;
mapKeys(map: string): Promise<string[]>;
mapRandomKey(map: string): Promise<string | undefined>;
mapLen(map: string): Promise<number>;
}
export class LocalCache implements Cache {
private readonly kv: Keyv;
constructor() {
this.kv = new Keyv();
}
// standard operation
async get<T = unknown>(key: string): Promise<T | undefined> {
return this.kv.get(key).catch(() => undefined);
}
async set<T = unknown>(
key: string,
value: T,
opts: CacheSetOptions = {}
): Promise<boolean> {
return this.kv
.set(key, value, opts.ttl)
.then(() => true)
.catch(() => false);
}
async setnx<T = unknown>(
key: string,
value: T,
opts?: CacheSetOptions | undefined
): Promise<boolean> {
if (!(await this.has(key))) {
return this.set(key, value, opts);
}
return false;
}
async increase(key: string, count: number = 1): Promise<number> {
const prev = (await this.get(key)) ?? 0;
if (typeof prev !== 'number') {
throw new Error(
`Expect a Number keyed by ${key}, but found ${typeof prev}`
);
}
const curr = prev + count;
return (await this.set(key, curr)) ? curr : prev;
}
async decrease(key: string, count: number = 1): Promise<number> {
return this.increase(key, -count);
}
async delete(key: string): Promise<boolean> {
return this.kv.delete(key).catch(() => false);
}
async has(key: string): Promise<boolean> {
return this.kv.has(key).catch(() => false);
}
async ttl(key: string): Promise<number> {
return this.kv
.get(key, { raw: true })
.then(raw => (raw?.expires ? raw.expires - Date.now() : Infinity))
.catch(() => 0);
}
async expire(key: string, ttl: number): Promise<boolean> {
const value = await this.kv.get(key);
return this.set(key, value, { ttl });
}
// list operations
private async getArray<T = unknown>(key: string) {
const raw = await this.kv.get(key, { raw: true });
if (raw && !Array.isArray(raw.value)) {
throw new Error(
`Expect an Array keyed by ${key}, but found ${raw.value}`
);
}
return raw as Keyv.DeserializedData<T[]>;
}
private async setArray<T = unknown>(
key: string,
value: T[],
opts: CacheSetOptions = {}
) {
return this.set(key, value, opts).then(() => value.length);
}
async pushBack<T = unknown>(key: string, ...values: T[]): Promise<number> {
let list: any[] = [];
let ttl: number | undefined = undefined;
const raw = await this.getArray(key);
if (raw) {
list = raw.value;
if (raw.expires) {
ttl = raw.expires - Date.now();
}
}
list = list.concat(values);
return this.setArray(key, list, { ttl });
}
async pushFront<T = unknown>(key: string, ...values: T[]): Promise<number> {
let list: any[] = [];
let ttl: number | undefined = undefined;
const raw = await this.getArray(key);
if (raw) {
list = raw.value;
if (raw.expires) {
ttl = raw.expires - Date.now();
}
}
list = values.concat(list);
return this.setArray(key, list, { ttl });
}
async len(key: string): Promise<number> {
return this.getArray(key).then(v => v?.value.length ?? 0);
}
/**
* list array elements with `[start, end]`
* the end index is inclusive
*/
async list<T = unknown>(
key: string,
start: number,
end: number
): Promise<T[]> {
const raw = await this.getArray<T>(key);
if (raw?.value) {
start = (raw.value.length + start) % raw.value.length;
end = ((raw.value.length + end) % raw.value.length) + 1;
return raw.value.slice(start, end);
} else {
return [];
}
}
private async trim<T = unknown>(key: string, start: number, end: number) {
const raw = await this.getArray<T>(key);
if (raw) {
start = (raw.value.length + start) % raw.value.length;
// make negative end index work; the end index is inclusive
end = ((raw.value.length + end) % raw.value.length) + 1;
const result = raw.value.splice(start, end);
await this.set(key, raw.value, {
ttl: raw.expires ? raw.expires - Date.now() : undefined,
});
return result;
}
return [];
}
async popFront<T = unknown>(key: string, count: number = 1) {
return this.trim<T>(key, 0, count - 1);
}
async popBack<T = unknown>(key: string, count: number = 1) {
return this.trim<T>(key, -count, count - 1);
}
// map operations
private async getMap<T = unknown>(map: string) {
const raw = await this.kv.get(map, { raw: true });
if (raw) {
if (typeof raw.value !== 'object') {
throw new Error(
`Expect an Object keyed by ${map}, but found ${typeof raw.value}`
);
}
if (Array.isArray(raw.value)) {
throw new Error(`Expect an Object keyed by ${map}, but found an Array`);
}
}
return raw as Keyv.DeserializedData<Record<string, T>>;
}
private async setMap<T = unknown>(
map: string,
value: Record<string, T>,
opts: CacheSetOptions = {}
) {
return this.kv.set(map, value, opts.ttl).then(() => true);
}
async mapGet<T = unknown>(map: string, key: string): Promise<T | undefined> {
const raw = await this.getMap<T>(map);
if (raw?.value) {
return raw.value[key];
}
return undefined;
}
async mapSet<T = unknown>(
map: string,
key: string,
value: T
): Promise<boolean> {
const raw = await this.getMap(map);
const data = raw?.value ?? {};
data[key] = value;
return this.setMap(map, data, {
ttl: raw?.expires ? raw.expires - Date.now() : undefined,
});
}
async mapDelete(map: string, key: string): Promise<boolean> {
const raw = await this.getMap(map);
if (raw?.value) {
delete raw.value[key];
return this.setMap(map, raw.value, {
ttl: raw.expires ? raw.expires - Date.now() : undefined,
});
}
return false;
}
async mapIncrease(
map: string,
key: string,
count: number = 1
): Promise<number> {
const prev = (await this.mapGet(map, key)) ?? 0;
if (typeof prev !== 'number') {
throw new Error(
`Expect a Number keyed by ${key}, but found ${typeof prev}`
);
}
const curr = prev + count;
return (await this.mapSet(map, key, curr)) ? curr : prev;
}
async mapDecrease(
map: string,
key: string,
count: number = 1
): Promise<number> {
return this.mapIncrease(map, key, -count);
}
async mapKeys(map: string): Promise<string[]> {
const raw = await this.getMap(map);
if (raw) {
return Object.keys(raw.value);
}
return [];
}
async mapRandomKey(map: string): Promise<string | undefined> {
const keys = await this.mapKeys(map);
return keys[Math.floor(Math.random() * keys.length)];
}
async mapLen(map: string): Promise<number> {
const raw = await this.getMap(map);
return raw ? Object.keys(raw.value).length : 0;
}
}
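
A quick usage sketch of the `Cache` API defined above, assuming the in-process `LocalCache` (the list `end` argument is an inclusive index, per the doc comment):

```ts
const cache = new LocalCache();

// plain key/value with a 60-second TTL
await cache.set('user:1', { name: 'alice' }, { ttl: 60_000 });
const user = await cache.get<{ name: string }>('user:1');

// list helpers: push, then read the whole list back
await cache.pushBack('queue', 'a', 'b', 'c');
const all = await cache.list<string>('queue', 0, -1); // ['a', 'b', 'c']

// map helpers: hash-like access under one key
await cache.mapSet('counters', 'hits', 0);
await cache.mapIncrease('counters', 'hits'); // 1
```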

View File

@@ -0,0 +1,24 @@
import { FactoryProvider, Global, Module } from '@nestjs/common';
import { Redis } from 'ioredis';
import { Config } from '../config';
import { LocalCache } from './cache';
import { RedisCache } from './redis';
const CacheProvider: FactoryProvider = {
provide: LocalCache,
useFactory: (config: Config) => {
return config.redis.enabled
? new RedisCache(new Redis(config.redis))
: new LocalCache();
},
inject: [Config],
};
@Global()
@Module({
providers: [CacheProvider],
exports: [CacheProvider],
})
export class CacheModule {}
export { LocalCache as Cache };
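
Because the provider is bound under the `LocalCache` token (re-exported as `Cache`), consumers just inject that class and transparently receive the Redis-backed implementation whenever `config.redis.enabled` is set. A hypothetical consumer, for illustration:

```ts
import { Injectable } from '@nestjs/common';
import { Cache } from '../cache';

// Hypothetical service; the injection pattern is the point here.
@Injectable()
export class SessionStore {
  // resolves to RedisCache or LocalCache depending on config.redis.enabled
  constructor(private readonly cache: Cache) {}

  async touch(sessionId: string): Promise<boolean> {
    return this.cache.set(`session:${sessionId}`, Date.now(), {
      ttl: 30 * 60 * 1000, // 30 minutes
    });
  }
}
```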

View File

@@ -0,0 +1,194 @@
import { Redis } from 'ioredis';
import { Cache, CacheSetOptions } from './cache';
export class RedisCache implements Cache {
constructor(private readonly redis: Redis) {}
// standard operation
async get<T = unknown>(key: string): Promise<T | undefined> {
return this.redis
.get(key)
.then(v => {
if (v) {
return JSON.parse(v);
}
return undefined;
})
.catch(() => undefined);
}
async set<T = unknown>(
key: string,
value: T,
opts: CacheSetOptions = {}
): Promise<boolean> {
if (opts.ttl) {
return this.redis
.set(key, JSON.stringify(value), 'PX', opts.ttl)
.then(() => true)
.catch(() => false);
}
return this.redis
.set(key, JSON.stringify(value))
.then(() => true)
.catch(() => false);
}
async increase(key: string, count: number = 1): Promise<number> {
return this.redis.incrby(key, count).catch(() => 0);
}
async decrease(key: string, count: number = 1): Promise<number> {
return this.redis.decrby(key, count).catch(() => 0);
}
async setnx<T = unknown>(
key: string,
value: T,
opts: CacheSetOptions = {}
): Promise<boolean> {
if (opts.ttl) {
return this.redis
.set(key, JSON.stringify(value), 'PX', opts.ttl, 'NX')
.then(v => !!v)
.catch(() => false);
}
return this.redis
.set(key, JSON.stringify(value), 'NX')
.then(v => !!v)
.catch(() => false);
}
async delete(key: string): Promise<boolean> {
return this.redis
.del(key)
.then(v => v > 0)
.catch(() => false);
}
async has(key: string): Promise<boolean> {
return this.redis
.exists(key)
.then(v => v > 0)
.catch(() => false);
}
async ttl(key: string): Promise<number> {
return this.redis.ttl(key).catch(() => 0);
}
async expire(key: string, ttl: number): Promise<boolean> {
return this.redis
.pexpire(key, ttl)
.then(v => v > 0)
.catch(() => false);
}
// list operations
async pushBack<T = unknown>(key: string, ...values: T[]): Promise<number> {
return this.redis
.rpush(key, ...values.map(v => JSON.stringify(v)))
.catch(() => 0);
}
async pushFront<T = unknown>(key: string, ...values: T[]): Promise<number> {
return this.redis
.lpush(key, ...values.map(v => JSON.stringify(v)))
.catch(() => 0);
}
async len(key: string): Promise<number> {
return this.redis.llen(key).catch(() => 0);
}
async list<T = unknown>(
key: string,
start: number,
end: number
): Promise<T[]> {
return this.redis
.lrange(key, start, end)
.then(data => data.map(v => JSON.parse(v)))
.catch(() => []);
}
async popFront<T = unknown>(key: string, count: number = 1): Promise<T[]> {
return this.redis
.lpop(key, count)
.then(data => (data ?? []).map(v => JSON.parse(v)))
.catch(() => []);
}
async popBack<T = unknown>(key: string, count: number = 1): Promise<T[]> {
return this.redis
.rpop(key, count)
.then(data => (data ?? []).map(v => JSON.parse(v)))
.catch(() => []);
}
// map operations
async mapSet<T = unknown>(
map: string,
key: string,
value: T
): Promise<boolean> {
return this.redis
.hset(map, key, JSON.stringify(value))
.then(v => v > 0)
.catch(() => false);
}
async mapIncrease(
map: string,
key: string,
count: number = 1
): Promise<number> {
return this.redis.hincrby(map, key, count);
}
async mapDecrease(
map: string,
key: string,
count: number = 1
): Promise<number> {
return this.redis.hincrby(map, key, -count);
}
async mapGet<T = unknown>(map: string, key: string): Promise<T | undefined> {
return this.redis
.hget(map, key)
.then(v => (v ? JSON.parse(v) : undefined))
.catch(() => undefined);
}
async mapDelete(map: string, key: string): Promise<boolean> {
return this.redis
.hdel(map, key)
.then(v => v > 0)
.catch(() => false);
}
async mapKeys(map: string): Promise<string[]> {
return this.redis.hkeys(map).catch(() => []);
}
async mapRandomKey(map: string): Promise<string | undefined> {
return this.redis
.hrandfield(map, 1)
.then(v =>
typeof v === 'string'
? v
: Array.isArray(v)
? (v[0] as string)
: undefined
)
.catch(() => undefined);
}
async mapLen(map: string): Promise<number> {
return this.redis.hlen(map).catch(() => 0);
}
}
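
Two design choices worth noting: every value is round-tripped through `JSON.stringify`/`JSON.parse` so arbitrary objects survive Redis, and every Redis failure is swallowed into a neutral default (`false`, `0`, `[]`), so callers never see transport errors. The atomic `setnx` (`SET ... NX PX`) also makes a best-effort lock possible; a hypothetical sketch, not part of this codebase:

```ts
import { Cache } from './cache';

// Hypothetical best-effort lock on top of Cache.setnx.
// The TTL bounds how long a crashed holder can block others.
async function withLock(cache: Cache, key: string, fn: () => Promise<void>) {
  const acquired = await cache.setnx(key, 1, { ttl: 10_000 });
  if (!acquired) return false; // someone else holds the lock
  try {
    await fn();
  } finally {
    await cache.delete(key);
  }
  return true;
}
```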

View File

@@ -57,10 +57,10 @@ export function parseEnvValue(value: string | undefined, type?: EnvConfigType) {
return type === 'int'
? int(value)
: type === 'float'
? float(value)
: type === 'boolean'
? boolean(value)
: value;
? float(value)
: type === 'boolean'
? boolean(value)
: value;
}
/**
@@ -362,6 +362,14 @@ export interface AFFiNEConfig {
*/
experimentalMergeWithJwstCodec: boolean;
};
history: {
/**
* The buffer time to wait before creating a new history snapshot when a doc gets updated.
*
* in milliseconds
*/
interval: number;
};
};
payment: {

View File

@@ -209,6 +209,9 @@ export const getDefaultAFFiNEConfig: () => AFFiNEConfig = () => {
updatePollInterval: 3000,
experimentalMergeWithJwstCodec: false,
},
history: {
interval: 1000 * 60 * 10 /* 10 mins */,
},
},
payment: {
stripe: {

View File

@@ -8,14 +8,13 @@ import { fileURLToPath } from 'url';
import { Config } from './config';
import { GQLLoggerPlugin } from './graphql/logger-plugin';
import { Metrics } from './metrics/metrics';
@Global()
@Module({
imports: [
GraphQLModule.forRootAsync<ApolloDriverConfig>({
driver: ApolloDriver,
useFactory: (config: Config, metrics: Metrics) => {
useFactory: (config: Config) => {
return {
...config.graphql,
path: `${config.path}/graphql`,
@@ -31,10 +30,10 @@ import { Metrics } from './metrics/metrics';
req,
res,
}),
plugins: [new GQLLoggerPlugin(metrics)],
plugins: [new GQLLoggerPlugin()],
};
},
inject: [Config, Metrics],
inject: [Config],
}),
],
})

View File

@@ -7,40 +7,39 @@ import { Plugin } from '@nestjs/apollo';
import { Logger } from '@nestjs/common';
import { Response } from 'express';
import { Metrics } from '../metrics/metrics';
import { metrics } from '../metrics/metrics';
import { ReqContext } from '../types';
@Plugin()
export class GQLLoggerPlugin implements ApolloServerPlugin {
protected logger = new Logger(GQLLoggerPlugin.name);
constructor(private readonly metrics: Metrics) {}
requestDidStart(
reqContext: GraphQLRequestContext<ReqContext>
): Promise<GraphQLRequestListener<GraphQLRequestContext<ReqContext>>> {
const res = reqContext.contextValue.req.res as Response;
const operation = reqContext.request.operationName;
this.metrics.gqlRequest(1, { operation });
const timer = this.metrics.gqlTimer({ operation });
metrics().gqlRequest.add(1, { operation });
const start = Date.now();
return Promise.resolve({
willSendResponse: () => {
const costInMilliseconds = timer() * 1000;
const costInMilliseconds = Date.now() - start;
res.setHeader(
'Server-Timing',
`gql;dur=${costInMilliseconds};desc="GraphQL"`
);
metrics().gqlTimer.record(costInMilliseconds, { operation });
return Promise.resolve();
},
didEncounterErrors: () => {
this.metrics.gqlError(1, { operation });
const costInMilliseconds = timer() * 1000;
const costInMilliseconds = Date.now() - start;
res.setHeader(
'Server-Timing',
`gql;dur=${costInMilliseconds};desc="GraphQL ${operation}"`
);
metrics().gqlTimer.record(costInMilliseconds, { operation });
return Promise.resolve();
},
});

View File

@@ -1,22 +1,9 @@
/// <reference types="./global.d.ts" />
import { MetricExporter } from '@google-cloud/opentelemetry-cloud-monitoring-exporter';
import { TraceExporter } from '@google-cloud/opentelemetry-cloud-trace-exporter';
import { start as startAutoMetrics } from './metrics';
startAutoMetrics();
import { NestFactory } from '@nestjs/core';
import type { NestExpressApplication } from '@nestjs/platform-express';
import {
CompositePropagator,
W3CBaggagePropagator,
W3CTraceContextPropagator,
} from '@opentelemetry/core';
import gql from '@opentelemetry/instrumentation-graphql';
import { HttpInstrumentation } from '@opentelemetry/instrumentation-http';
import ioredis from '@opentelemetry/instrumentation-ioredis';
import { NestInstrumentation } from '@opentelemetry/instrumentation-nestjs-core';
import socketIO from '@opentelemetry/instrumentation-socket.io';
import { PeriodicExportingMetricReader } from '@opentelemetry/sdk-metrics';
import { NodeSDK } from '@opentelemetry/sdk-node';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-node';
import { PrismaInstrumentation } from '@prisma/instrumentation';
import cookieParser from 'cookie-parser';
import { static as staticMiddleware } from 'express';
import graphqlUploadExpress from 'graphql-upload/graphqlUploadExpress.mjs';
@@ -28,35 +15,6 @@ import { serverTimingAndCache } from './middleware/timing';
import { RedisIoAdapter } from './modules/sync/redis-adapter';
const { NODE_ENV, AFFINE_ENV } = process.env;
if (NODE_ENV === 'production') {
const traceExporter = new TraceExporter();
const tracing = new NodeSDK({
traceExporter,
metricReader: new PeriodicExportingMetricReader({
exporter: new MetricExporter(),
}),
spanProcessor: new BatchSpanProcessor(traceExporter),
textMapPropagator: new CompositePropagator({
propagators: [
new W3CBaggagePropagator(),
new W3CTraceContextPropagator(),
],
}),
instrumentations: [
new NestInstrumentation(),
new ioredis.IORedisInstrumentation(),
new socketIO.SocketIoInstrumentation({ traceReserved: true }),
new gql.GraphQLInstrumentation({ mergeItems: true }),
new HttpInstrumentation(),
new PrismaInstrumentation(),
],
serviceName: 'affine-cloud',
});
tracing.start();
}
const app = await NestFactory.create<NestExpressApplication>(AppModule, {
cors: true,
rawBody: true,

View File

@@ -1,18 +0,0 @@
import { Controller, Get, Res } from '@nestjs/common';
import type { Response } from 'express';
import { register } from 'prom-client';
import { PrismaService } from '../prisma';
@Controller()
export class MetricsController {
constructor(private readonly prisma: PrismaService) {}
@Get('/metrics')
async index(@Res() res: Response): Promise<void> {
res.header('Content-Type', register.contentType);
const prismaMetrics = await this.prisma.$metrics.prometheus();
const appMetrics = await register.metrics();
res.send(appMetrics + prismaMetrics);
}
}

View File

@@ -1,12 +1,3 @@
import { Global, Module } from '@nestjs/common';
import { MetricsController } from '../metrics/controller';
import { Metrics } from './metrics';
@Global()
@Module({
providers: [Metrics],
exports: [Metrics],
controllers: [MetricsController],
})
export class MetricsModule {}
export * from './metrics';
export { start } from './opentelemetry';
export * from './utils';

View File

@@ -1,28 +1,76 @@
import { Injectable, OnModuleDestroy } from '@nestjs/common';
import { register } from 'prom-client';
import opentelemetry, { Attributes, Observable } from '@opentelemetry/api';
import { metricsCreator } from './utils';
interface AsyncMetric {
ob: Observable;
get value(): any;
get attrs(): Attributes | undefined;
}
@Injectable()
export class Metrics implements OnModuleDestroy {
onModuleDestroy(): void {
register.clear();
let _metrics: ReturnType<typeof createBusinessMetrics> | undefined = undefined;
export function getMeter(name = 'business') {
return opentelemetry.metrics.getMeter(name);
}
function createBusinessMetrics() {
const meter = getMeter();
const asyncMetrics: AsyncMetric[] = [];
function createGauge(name: string) {
let value: any;
let attrs: Attributes | undefined;
const ob = meter.createObservableGauge(name);
asyncMetrics.push({
ob,
get value() {
return value;
},
get attrs() {
return attrs;
},
});
return (newValue: any, newAttrs?: Attributes) => {
value = newValue;
attrs = newAttrs;
};
}
socketIOEventCounter = metricsCreator.counter('socket_io_counter', ['event']);
socketIOEventTimer = metricsCreator.timer('socket_io_timer', ['event']);
socketIOConnectionGauge = metricsCreator.gauge(
'socket_io_connection_counter'
const metrics = {
socketIOConnectionGauge: createGauge('socket_io_connection'),
gqlRequest: meter.createCounter('gql_request'),
gqlError: meter.createCounter('gql_error'),
gqlTimer: meter.createHistogram('gql_timer'),
jwstCodecMerge: meter.createCounter('jwst_codec_merge'),
jwstCodecDidnotMatch: meter.createCounter('jwst_codec_didnot_match'),
jwstCodecFail: meter.createCounter('jwst_codec_fail'),
authCounter: meter.createCounter('auth'),
authFailCounter: meter.createCounter('auth_fail'),
docHistoryCounter: meter.createCounter('doc_history_created'),
docRecoverCounter: meter.createCounter('doc_history_recovered'),
};
meter.addBatchObservableCallback(
result => {
asyncMetrics.forEach(metric => {
result.observe(metric.ob, metric.value, metric.attrs);
});
},
asyncMetrics.map(({ ob }) => ob)
);
gqlRequest = metricsCreator.counter('gql_request', ['operation']);
gqlError = metricsCreator.counter('gql_error', ['operation']);
gqlTimer = metricsCreator.timer('gql_timer', ['operation']);
jwstCodecMerge = metricsCreator.counter('jwst_codec_merge');
jwstCodecDidnotMatch = metricsCreator.counter('jwst_codec_didnot_match');
jwstCodecFail = metricsCreator.counter('jwst_codec_fail');
authCounter = metricsCreator.counter('auth');
authFailCounter = metricsCreator.counter('auth_fail', ['reason']);
return metrics;
}
export function registerBusinessMetrics() {
if (!_metrics) {
_metrics = createBusinessMetrics();
}
return _metrics;
}
export const metrics = registerBusinessMetrics;
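
After this refactor, call sites stop injecting a `Metrics` class and instead call the lazy `metrics()` accessor, using plain OpenTelemetry instruments, as the GraphQL plugin earlier in this diff and the auth controller later do. Illustrative calls (the attribute values here are made up):

```ts
import { metrics } from './metrics';

// counters take a delta plus optional attributes
metrics().gqlRequest.add(1, { operation: 'getWorkspace' });

// histograms record raw measurements (milliseconds, by convention here)
metrics().gqlTimer.record(42, { operation: 'getWorkspace' });

// gauges built by createGauge are setters, read on the next collection cycle
metrics().socketIOConnectionGauge(17, { flavor: 'sync' });
```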

View File

@@ -0,0 +1,127 @@
import { MetricExporter } from '@google-cloud/opentelemetry-cloud-monitoring-exporter';
import { TraceExporter } from '@google-cloud/opentelemetry-cloud-trace-exporter';
import {
CompositePropagator,
W3CBaggagePropagator,
W3CTraceContextPropagator,
} from '@opentelemetry/core';
import { PrometheusExporter } from '@opentelemetry/exporter-prometheus';
import { ZipkinExporter } from '@opentelemetry/exporter-zipkin';
import { HostMetrics } from '@opentelemetry/host-metrics';
import { Instrumentation } from '@opentelemetry/instrumentation';
import { GraphQLInstrumentation } from '@opentelemetry/instrumentation-graphql';
import { HttpInstrumentation } from '@opentelemetry/instrumentation-http';
import { IORedisInstrumentation } from '@opentelemetry/instrumentation-ioredis';
import { NestInstrumentation } from '@opentelemetry/instrumentation-nestjs-core';
import { SocketIoInstrumentation } from '@opentelemetry/instrumentation-socket.io';
import {
ConsoleMetricExporter,
MetricReader,
PeriodicExportingMetricReader,
} from '@opentelemetry/sdk-metrics';
import { NodeSDK } from '@opentelemetry/sdk-node';
import {
BatchSpanProcessor,
ConsoleSpanExporter,
SpanExporter,
} from '@opentelemetry/sdk-trace-node';
import { PrismaInstrumentation } from '@prisma/instrumentation';
import { registerBusinessMetrics } from './metrics';
abstract class OpentelemetryFactor {
abstract getMetricReader(): MetricReader;
abstract getSpanExporter(): SpanExporter;
getInstractions(): Instrumentation[] {
return [
new NestInstrumentation(),
new IORedisInstrumentation(),
new SocketIoInstrumentation({ traceReserved: true }),
new GraphQLInstrumentation({ mergeItems: true }),
new HttpInstrumentation(),
new PrismaInstrumentation(),
];
}
create() {
const traceExporter = this.getSpanExporter();
return new NodeSDK({
traceExporter,
metricReader: this.getMetricReader(),
spanProcessor: new BatchSpanProcessor(traceExporter),
textMapPropagator: new CompositePropagator({
propagators: [
new W3CBaggagePropagator(),
new W3CTraceContextPropagator(),
],
}),
instrumentations: this.getInstractions(),
serviceName: 'affine-cloud',
});
}
}
class GCloudOpentelemetryFactor extends OpentelemetryFactor {
override getMetricReader(): MetricReader {
return new PeriodicExportingMetricReader({
exportIntervalMillis: 30000,
exportTimeoutMillis: 10000,
exporter: new MetricExporter(),
});
}
override getSpanExporter(): SpanExporter {
return new TraceExporter();
}
}
class LocalOpentelemetryFactor extends OpentelemetryFactor {
override getMetricReader(): MetricReader {
return new PrometheusExporter();
}
override getSpanExporter(): SpanExporter {
return new ZipkinExporter();
}
}
class DebugOpentelemetryFactor extends OpentelemetryFactor {
override getMetricReader(): MetricReader {
return new PeriodicExportingMetricReader({
exporter: new ConsoleMetricExporter(),
});
}
override getSpanExporter(): SpanExporter {
return new ConsoleSpanExporter();
}
}
function createSDK() {
let factor: OpentelemetryFactor | null = null;
if (process.env.NODE_ENV === 'production') {
factor = new GCloudOpentelemetryFactor();
} else if (process.env.DEBUG_METRICS) {
factor = new DebugOpentelemetryFactor();
} else {
factor = new LocalOpentelemetryFactor();
}
return factor?.create();
}
function registerCustomMetrics() {
const host = new HostMetrics({ name: 'instance-host-metrics' });
host.start();
}
export function start() {
const sdk = createSDK();
if (sdk) {
sdk.start();
registerCustomMetrics();
registerBusinessMetrics();
}
}

View File

@@ -1,99 +1,11 @@
import { Counter, Gauge, register, Summary } from 'prom-client';
import { Attributes } from '@opentelemetry/api';
function getOr<T>(name: string, or: () => T): T {
return (register.getSingleMetric(name) as T) || or();
}
type LabelValues<T extends string> = Partial<Record<T, string | number>>;
type MetricsCreator<T extends string> = (
value: number,
labels: LabelValues<T>
) => void;
type TimerMetricsCreator<T extends string> = (
labels: LabelValues<T>
) => () => number;
export const metricsCreatorGenerator = () => {
const counterCreator = <T extends string>(
name: string,
labelNames?: T[]
): MetricsCreator<T> => {
const counter = getOr(
name,
() =>
new Counter({
name,
help: name,
...(labelNames ? { labelNames } : {}),
})
);
return (value: number, labels: LabelValues<T>) => {
counter.inc(labels, value);
};
};
const gaugeCreator = <T extends string>(
name: string,
labelNames?: T[]
): MetricsCreator<T> => {
const gauge = getOr(
name,
() =>
new Gauge({
name,
help: name,
...(labelNames ? { labelNames } : {}),
})
);
return (value: number, labels: LabelValues<T>) => {
gauge.set(labels, value);
};
};
const timerCreator = <T extends string>(
name: string,
labelNames?: T[]
): TimerMetricsCreator<T> => {
const summary = getOr(
name,
() =>
new Summary({
name,
help: name,
...(labelNames ? { labelNames } : {}),
})
);
return (labels: LabelValues<T>) => {
const now = process.hrtime();
return () => {
const delta = process.hrtime(now);
const value = delta[0] + delta[1] / 1e9;
summary.observe(labels, value);
return value;
};
};
};
return {
counter: counterCreator,
gauge: gaugeCreator,
timer: timerCreator,
};
};
export const metricsCreator = metricsCreatorGenerator();
import { getMeter } from './metrics';
export const CallTimer = (
name: string,
labels: Record<string, any> = {}
attrs?: Attributes
): MethodDecorator => {
const timer = metricsCreator.timer(name, Object.keys(labels));
// @ts-expect-error allow
return (
_target,
@@ -106,19 +18,27 @@ export const CallTimer = (
}
desc.value = function (...args: any[]) {
const endTimer = timer(labels);
const timer = getMeter().createHistogram(name, {
description: `function call time costs of ${name}`,
});
const start = Date.now();
const end = () => {
timer.record(Date.now() - start, attrs);
};
let result: any;
try {
result = originalMethod.apply(this, args);
} catch (e) {
endTimer();
end();
throw e;
}
if (result instanceof Promise) {
return result.finally(endTimer);
return result.finally(end);
} else {
endTimer();
end();
return result;
}
};
@@ -129,10 +49,8 @@ export const CallTimer = (
export const CallCounter = (
name: string,
labels: Record<string, any> = {}
attrs?: Attributes
): MethodDecorator => {
const count = metricsCreator.counter(name, Object.keys(labels));
// @ts-expect-error allow
return (
_target,
@@ -145,7 +63,11 @@ export const CallCounter = (
}
desc.value = function (...args: any[]) {
count(1, labels);
const count = getMeter().createCounter(name, {
description: `function call counter of ${name}`,
});
count.add(1, attrs);
return originalMethod.apply(this, args);
};
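
The reworked decorators create OpenTelemetry instruments on the fly and take optional `Attributes` in place of prom-client label maps; `CallTimer` handles both sync returns and promises. A hypothetical usage sketch:

```ts
import { CallCounter, CallTimer } from '../metrics';

// Hypothetical class, for illustration only.
class DocService {
  @CallCounter('doc_merge', { source: 'sync' })
  @CallTimer('doc_merge_timer', { source: 'sync' })
  merge(updates: Buffer[]): Buffer {
    // timing covers the synchronous body; a returned Promise
    // would be timed through .finally() instead
    return Buffer.concat(updates);
  }
}
```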

View File

@@ -23,7 +23,7 @@ import type { AuthAction, CookieOption, NextAuthOptions } from 'next-auth';
import { AuthHandler } from 'next-auth/core';
import { Config } from '../../config';
import { Metrics } from '../../metrics/metrics';
import { metrics } from '../../metrics';
import { PrismaService } from '../../prisma/service';
import { SessionService } from '../../session';
import { AuthThrottlerGuard, Throttle } from '../../throttler';
@@ -46,7 +46,6 @@ export class NextAuthController {
private readonly authService: AuthService,
@Inject(NextAuthOptionsProvide)
private readonly nextAuthOptions: NextAuthOptions,
private readonly metrics: Metrics,
private readonly session: SessionService
) {
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
@@ -90,7 +89,7 @@ export class NextAuthController {
res.redirect(`/signin${query}`);
return;
}
this.metrics.authCounter(1, {});
metrics().authCounter.add(1);
const [action, providerId] = req.url // start with request url
.slice(BASE_URL.length) // make relative to baseUrl
.replace(/\?.*/, '') // remove query part, use only path part
@@ -127,7 +126,7 @@ export class NextAuthController {
const options = this.nextAuthOptions;
if (req.method === 'POST' && action === 'session') {
if (typeof req.body !== 'object' || typeof req.body.data !== 'object') {
this.metrics.authFailCounter(1, { reason: 'invalid_session_data' });
metrics().authFailCounter.add(1, { reason: 'invalid_session_data' });
throw new BadRequestException(`Invalid new session data`);
}
const user = await this.updateSession(req, req.body.data);
@@ -210,7 +209,7 @@ export class NextAuthController {
if (redirect?.endsWith('api/auth/error?error=AccessDenied')) {
this.logger.log(`Early access redirect headers: ${req.headers}`);
this.metrics.authFailCounter(1, {
metrics().authFailCounter.add(1, {
reason: 'no_early_access_permission',
});
if (

View File

@@ -0,0 +1,240 @@
import { isDeepStrictEqual } from 'node:util';
import { Injectable, Logger } from '@nestjs/common';
import { OnEvent } from '@nestjs/event-emitter';
import { Cron, CronExpression } from '@nestjs/schedule';
import type { Snapshot } from '@prisma/client';
import { Config } from '../../config';
import { metrics } from '../../metrics';
import { PrismaService } from '../../prisma';
import { SubscriptionStatus } from '../payment/service';
import { Permission } from '../workspaces/types';
@Injectable()
export class DocHistoryManager {
private readonly logger = new Logger(DocHistoryManager.name);
constructor(
private readonly config: Config,
private readonly db: PrismaService
) {}
@OnEvent('doc:manager:snapshot:beforeUpdate')
async onDocUpdated(snapshot: Snapshot, forceCreate = false) {
const last = await this.last(snapshot.workspaceId, snapshot.id);
let shouldCreateHistory = false;
if (!last) {
// never created
shouldCreateHistory = true;
} else if (last.timestamp.getTime() === snapshot.updatedAt.getTime()) {
// no change
shouldCreateHistory = false;
} else if (
// force
forceCreate ||
// last history was created more than the configured interval ago
last.timestamp.getTime() <
snapshot.updatedAt.getTime() - this.config.doc.history.interval
) {
shouldCreateHistory = true;
}
if (shouldCreateHistory) {
// skip the history recording when no actual update on the snapshot happened
if (last && isDeepStrictEqual(last.state, snapshot.state)) {
this.logger.debug(
`State matches, skip creating history record for ${snapshot.id} in workspace ${snapshot.workspaceId}`
);
return;
}
await this.db.snapshotHistory
.create({
select: {
timestamp: true,
},
data: {
workspaceId: snapshot.workspaceId,
id: snapshot.id,
timestamp: snapshot.updatedAt,
blob: snapshot.blob,
state: snapshot.state,
expiredAt: await this.getExpiredDateFromNow(snapshot.workspaceId),
},
})
.catch(() => {
// safe to ignore
// only happens when duplicated history record created in multi processes
});
metrics().docHistoryCounter.add(1, {});
this.logger.log(
`History created for ${snapshot.id} in workspace ${snapshot.workspaceId}.`
);
}
}
async list(
workspaceId: string,
id: string,
before: Date = new Date(),
take: number = 10
) {
return this.db.snapshotHistory.findMany({
select: {
timestamp: true,
},
where: {
workspaceId,
id,
timestamp: {
lte: before,
},
// only include the ones that have not expired
expiredAt: {
gt: new Date(),
},
},
orderBy: {
timestamp: 'desc',
},
take,
});
}
async count(workspaceId: string, id: string) {
return this.db.snapshotHistory.count({
where: {
workspaceId,
id,
expiredAt: {
gt: new Date(),
},
},
});
}
async get(workspaceId: string, id: string, timestamp: Date) {
return this.db.snapshotHistory.findUnique({
where: {
workspaceId_id_timestamp: {
workspaceId,
id,
timestamp,
},
expiredAt: {
gt: new Date(),
},
},
});
}
async last(workspaceId: string, id: string) {
return this.db.snapshotHistory.findFirst({
where: {
workspaceId,
id,
},
select: {
timestamp: true,
state: true,
},
orderBy: {
timestamp: 'desc',
},
});
}
async recover(workspaceId: string, id: string, timestamp: Date) {
const history = await this.db.snapshotHistory.findUnique({
where: {
workspaceId_id_timestamp: {
workspaceId,
id,
timestamp,
},
},
});
if (!history) {
throw new Error('Given history not found');
}
const oldSnapshot = await this.db.snapshot.findUnique({
where: {
id_workspaceId: {
id,
workspaceId,
},
},
});
if (!oldSnapshot) {
// unreachable actually
throw new Error('Given Doc not found');
}
// save old snapshot as one history record
await this.onDocUpdated(oldSnapshot, true);
// WARN:
// never update the snapshot itself while recovering;
// that is not how recovery should work in a CRDT system.
// Let the user revert on the client and push the change through the sync system.
// `await this.db.snapshot.update();`
metrics().docRecoverCounter.add(1, {});
return history.timestamp;
}
/**
* @todo(@darkskygit) refactor with [Usage Control] system
*/
async getExpiredDateFromNow(workspaceId: string) {
const permission = await this.db.workspaceUserPermission.findFirst({
select: {
userId: true,
},
where: {
workspaceId,
type: Permission.Owner,
},
});
if (!permission) {
// unreachable actually
throw new Error('Workspace owner not found');
}
const sub = await this.db.userSubscription.findFirst({
select: {
id: true,
},
where: {
userId: permission.userId,
status: SubscriptionStatus.Active,
},
});
return new Date(
Date.now() +
1000 *
60 *
60 *
24 *
// 30 days for subscription user, 7 days for free user
(sub ? 30 : 7)
);
}
@Cron(CronExpression.EVERY_DAY_AT_MIDNIGHT /* every day at 12 AM */)
async cleanupExpiredHistory() {
await this.db.snapshotHistory.deleteMany({
where: {
expiredAt: {
lte: new Date(),
},
},
});
}
}
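The branching in `onDocUpdated` above reduces to one predicate: record history when none exists yet, when the caller forces it, or when the last record is older than the configured interval. A minimal sketch of that rule (comparing timestamps by value, where the original compares the `Date` objects directly; `intervalMs` stands in for `config.doc.history.interval`):

```ts
// Hypothetical condensation of the decision made in onDocUpdated.
function shouldCreateHistory(
  last: { timestamp: Date } | null,
  updatedAt: Date,
  intervalMs: number,
  forceCreate = false
): boolean {
  if (!last) return true; // never recorded before
  if (last.timestamp.getTime() === updatedAt.getTime()) return false; // no change
  return forceCreate || last.timestamp.getTime() < updatedAt.getTime() - intervalMs;
}
```

The deep-equality check on `last.state` still short-circuits afterwards, so an unchanged state is treated as no actual update even when the timestamps differ.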

View File

@@ -1,7 +1,7 @@
import { DynamicModule } from '@nestjs/common';
import { DocHistoryManager } from './history';
import { DocManager } from './manager';
import { RedisDocManager } from './redis-manager';
export class DocModule {
/**
@@ -15,14 +15,10 @@ export class DocModule {
provide: 'DOC_MANAGER_AUTOMATION',
useValue: automation,
},
{
provide: DocManager,
useClass: globalThis.AFFiNE.redis.enabled
? RedisDocManager
: DocManager,
},
DocManager,
DocHistoryManager,
],
exports: [DocManager],
exports: [DocManager, DocHistoryManager],
};
}
@@ -39,4 +35,4 @@ export class DocModule {
}
}
export { DocManager };
export { DocHistoryManager, DocManager };

View File

@@ -5,6 +5,7 @@ import {
OnModuleDestroy,
OnModuleInit,
} from '@nestjs/common';
import { EventEmitter2 } from '@nestjs/event-emitter';
import { Snapshot, Update } from '@prisma/client';
import { chunk } from 'lodash-es';
import { defer, retry } from 'rxjs';
@@ -16,8 +17,9 @@ import {
transact,
} from 'yjs';
import { Cache } from '../../cache';
import { Config } from '../../config';
import { Metrics } from '../../metrics/metrics';
import { metrics } from '../../metrics/metrics';
import { PrismaService } from '../../prisma';
import { mergeUpdatesInApplyWay as jwstMergeUpdates } from '../../storage';
@@ -58,17 +60,18 @@ const MAX_SEQ_NUM = 0x3fffffff; // u31
*/
@Injectable()
export class DocManager implements OnModuleInit, OnModuleDestroy {
protected logger = new Logger(DocManager.name);
private logger = new Logger(DocManager.name);
private job: NodeJS.Timeout | null = null;
private seqMap = new Map<string, number>();
private busy = false;
constructor(
protected readonly db: PrismaService,
@Inject('DOC_MANAGER_AUTOMATION')
protected readonly automation: boolean,
protected readonly config: Config,
protected readonly metrics: Metrics
private readonly automation: boolean,
private readonly db: PrismaService,
private readonly config: Config,
private readonly cache: Cache,
private readonly event: EventEmitter2
) {}
onModuleInit() {
@@ -82,7 +85,7 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
this.destroy();
}
protected recoverDoc(...updates: Buffer[]): Promise<Doc> {
private recoverDoc(...updates: Buffer[]): Promise<Doc> {
const doc = new Doc();
const chunks = chunk(updates, 10);
@@ -95,11 +98,7 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
try {
applyUpdate(doc, u);
} catch (e) {
this.logger.error(
`Failed to apply update: ${updates
.map(u => u.toString('hex'))
.join('\n')}`
);
this.logger.error('Failed to apply update', e);
}
});
});
@@ -117,24 +116,22 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
});
}
protected async applyUpdates(
guid: string,
...updates: Buffer[]
): Promise<Doc> {
private async applyUpdates(guid: string, ...updates: Buffer[]): Promise<Doc> {
const doc = await this.recoverDoc(...updates);
// test jwst codec
if (
this.config.affine.canary &&
this.config.doc.manager.experimentalMergeWithJwstCodec &&
updates.length < 100 /* avoid overloading */
) {
this.metrics.jwstCodecMerge(1, {});
metrics().jwstCodecMerge.add(1);
const yjsResult = Buffer.from(encodeStateAsUpdate(doc));
let log = false;
try {
const jwstResult = jwstMergeUpdates(updates);
if (!compare(yjsResult, jwstResult)) {
this.metrics.jwstCodecDidnotMatch(1, {});
metrics().jwstCodecDidnotMatch.add(1);
this.logger.warn(
`jwst codec result doesn't match yjs codec result for: ${guid}`
);
@@ -145,11 +142,11 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
}
}
} catch (e) {
this.metrics.jwstCodecFail(1, {});
metrics().jwstCodecFail.add(1);
this.logger.warn(`jwst apply update failed for ${guid}: ${e}`);
log = true;
} finally {
if (log) {
if (log && this.config.node.dev) {
this.logger.warn(
`Updates: ${updates.map(u => u.toString('hex')).join('\n')}`
);
@@ -223,8 +220,8 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
.pipe(retry(retryTimes)) // retry until seq num not conflict
.subscribe({
next: () => {
this.logger.verbose(
`pushed update for workspace: ${workspaceId}, guid: ${guid}`
this.logger.debug(
`pushed 1 update for ${guid} in workspace ${workspaceId}`
);
resolve();
},
@@ -233,6 +230,8 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
reject(new Error('Failed to push update'));
},
});
}).then(() => {
return this.updateCachedUpdatesCount(workspaceId, guid, 1);
});
}
@@ -267,8 +266,8 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
.pipe(retry(retryTimes)) // retry until seq num not conflict
.subscribe({
next: () => {
this.logger.verbose(
`pushed updates for workspace: ${workspaceId}, guid: ${guid}`
this.logger.debug(
`pushed ${updates.length} updates for ${guid} in workspace ${workspaceId}`
);
resolve();
},
@@ -277,6 +276,8 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
reject(new Error('Failed to push update'));
},
});
}).then(() => {
return this.updateCachedUpdatesCount(workspaceId, guid, updates.length);
});
}
@@ -363,21 +364,22 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
/**
* apply pending updates to snapshot
*/
protected async autoSquash() {
private async autoSquash() {
// find the first update and batch process updates with the same id
const first = await this.db.update.findFirst({
select: {
id: true,
workspaceId: true,
},
});
const candidate = await this.getAutoSquashCandidate();
// no pending updates
if (!first) {
if (!candidate) {
return;
}
const { id, workspaceId } = first;
const { id, workspaceId } = candidate;
// acquire lock
const ok = await this.lockUpdatesForAutoSquash(workspaceId, id);
if (!ok) {
return;
}
try {
await this._get(workspaceId, id);
@@ -386,14 +388,31 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
`Failed to apply updates for workspace: ${workspaceId}, guid: ${id}`
);
this.logger.error(e);
} finally {
await this.unlockUpdatesForAutoSquash(workspaceId, id);
}
}
protected async upsert(
private async getAutoSquashCandidate() {
const cache = await this.getAutoSquashCandidateFromCache();
if (cache) {
return cache;
}
return this.db.update.findFirst({
select: {
id: true,
workspaceId: true,
},
});
}
private async upsert(
workspaceId: string,
guid: string,
doc: Doc,
seq?: number
initialSeq?: number
) {
const blob = Buffer.from(encodeStateAsUpdate(doc));
const state = Buffer.from(encodeStateVector(doc));
@@ -417,7 +436,7 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
workspaceId,
blob,
state,
seq,
seq: initialSeq,
},
update: {
blob,
@@ -426,7 +445,7 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
});
}
protected async _get(
private async _get(
workspaceId: string,
guid: string
): Promise<{ doc: Doc } | { snapshot: Buffer } | null> {
@@ -446,22 +465,29 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
* Squash updates into a single update and save it as snapshot,
* and delete the updates records at the same time.
*/
protected async squash(updates: Update[], snapshot: Snapshot | null) {
private async squash(updates: Update[], snapshot: Snapshot | null) {
if (!updates.length) {
throw new Error('No updates to squash');
}
const first = updates[0];
const last = updates[updates.length - 1];
const { id, workspaceId } = first;
const doc = await this.applyUpdates(
first.id,
snapshot ? snapshot.blob : Buffer.from([0, 0]),
...updates.map(u => u.blob)
);
const { id, workspaceId } = first;
if (snapshot) {
this.event.emit('doc:manager:snapshot:beforeUpdate', snapshot);
}
await this.upsert(workspaceId, id, doc, last.seq);
this.logger.debug(
`Squashed ${updates.length} updates for ${id} in workspace ${workspaceId}`
);
await this.db.update.deleteMany({
where: {
id,
@@ -471,6 +497,8 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
},
},
});
await this.updateCachedUpdatesCount(workspaceId, id, -updates.length);
return doc;
}
@@ -496,6 +524,9 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
// reset
if (seq >= MAX_SEQ_NUM) {
await this.db.snapshot.update({
select: {
seq: true,
},
where: {
id_workspaceId: {
workspaceId,
@@ -516,4 +547,56 @@ export class DocManager implements OnModuleInit, OnModuleDestroy {
return last + batch;
}
}
private async updateCachedUpdatesCount(
workspaceId: string,
guid: string,
count: number
) {
const result = await this.cache.mapIncrease(
`doc:manager:updates`,
`${workspaceId}::${guid}`,
count
);
if (result <= 0) {
await this.cache.mapDelete(
`doc:manager:updates`,
`${workspaceId}::${guid}`
);
}
}
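// Picking a random key from the updates map (below) spreads squash work across
// all docs with pending updates instead of repeatedly draining the same one;
// updateCachedUpdatesCount deletes entries whose count drops to zero, so only
// docs with real pending work remain candidates.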
private async getAutoSquashCandidateFromCache() {
const key = await this.cache.mapRandomKey('doc:manager:updates');
if (key) {
const count = await this.cache.mapGet<number>('doc:manager:updates', key);
if (typeof count === 'number' && count > 0) {
const [workspaceId, id] = key.split('::');
return { id, workspaceId };
}
}
return null;
}
private async lockUpdatesForAutoSquash(workspaceId: string, guid: string) {
return this.cache.setnx(
`doc:manager:updates-lock:${workspaceId}::${guid}`,
1,
{
ttl: 60 * 1000,
}
);
}
private async unlockUpdatesForAutoSquash(workspaceId: string, guid: string) {
return this.cache
.delete(`doc:manager:updates-lock:${workspaceId}::${guid}`)
.catch(e => {
// safe, the lock will be expired when ttl ends
this.logger.error('Failed to release updates lock', e);
});
}
}
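The `setnx` + TTL pair above forms a small distributed lock: the first process to set the key wins, the TTL bounds how long a crashed holder can block others, and the explicit delete releases early. A standalone sketch of the same pattern, assuming the `Cache` interface exercised in the tests further below:

```ts
// Hypothetical standalone use of the setnx/delete lock pattern.
interface LockCache {
  setnx(key: string, value: number, opts: { ttl: number }): Promise<boolean>;
  delete(key: string): Promise<boolean>;
}

async function withSquashLock(
  cache: LockCache,
  workspaceId: string,
  guid: string,
  work: () => Promise<void>
): Promise<void> {
  const key = `doc:manager:updates-lock:${workspaceId}::${guid}`;
  // only one process can set the key; everyone else skips the work
  if (!(await cache.setnx(key, 1, { ttl: 60 * 1000 }))) return;
  try {
    await work();
  } finally {
    // best-effort release; the TTL still reclaims the lock if this fails
    await cache.delete(key);
  }
}
```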

View File

@@ -1,129 +0,0 @@
import { Inject, Injectable } from '@nestjs/common';
import Redis from 'ioredis';
import { Config } from '../../config';
import { Metrics } from '../../metrics/metrics';
import { PrismaService } from '../../prisma';
import { DocID } from '../../utils/doc';
import { DocManager } from './manager';
function makeKey(prefix: string) {
return (parts: TemplateStringsArray, ...args: any[]) => {
return parts.reduce((prev, curr, i) => {
return prev + curr + (args[i] || '');
}, prefix);
};
}
const pending = 'um_pending:';
const updates = makeKey('um_u:');
const lock = makeKey('um_l:');
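// e.g. updates`${docId}` -> 'um_u:<docId>', lock`${docId}` -> 'um_l:<docId>'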
const pushUpdateLua = `
redis.call('sadd', KEYS[1], ARGV[1])
redis.call('rpush', KEYS[2], ARGV[2])
`;
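// The script runs atomically in Redis: the doc id joins the pending set and
// the update payload is appended to its list in one step, so a concurrent
// consumer never sees one without the other.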
/**
* @deprecated unstable
*/
@Injectable()
export class RedisDocManager extends DocManager {
private readonly redis: Redis;
constructor(
protected override readonly db: PrismaService,
@Inject('DOC_MANAGER_AUTOMATION')
protected override readonly automation: boolean,
protected override readonly config: Config,
protected override readonly metrics: Metrics
) {
super(db, automation, config, metrics);
this.redis = new Redis(config.redis);
this.redis.defineCommand('pushDocUpdate', {
numberOfKeys: 2,
lua: pushUpdateLua,
});
}
override onModuleInit(): void {
if (this.automation) {
this.setup();
}
}
override async autoSquash(): Promise<void> {
// in case some updates fell back to the db
await super.autoSquash();
// consume rest updates in redis queue
const pendingDoc = await this.redis.spop(pending).catch(() => null); // safe
if (!pendingDoc) {
return;
}
const docId = new DocID(pendingDoc);
const updateKey = updates`${pendingDoc}`;
const lockKey = lock`${pendingDoc}`;
// acquire the lock
const lockResult = await this.redis
.set(
lockKey,
'1',
'EX',
// 10 mins, in case the process exits between lock acquire & release, which is rare.
// if the lock is really held for more than 10 mins, we should check the merge logic for correctness
600,
'NX'
)
.catch(() => null); // safe
if (!lockResult) {
// we failed to acquire the lock, put the pending doc back to the queue.
await this.redis.sadd(pending, pendingDoc).catch(() => null); // safe
return;
}
try {
// fetch pending updates
const updates = await this.redis
.lrangeBuffer(updateKey, 0, -1)
.catch(() => []); // safe
if (!updates.length) {
return;
}
this.logger.verbose(
`applying ${updates.length} updates for doc: ${docId}`
);
const snapshot = await this.getSnapshot(docId.workspace, docId.guid);
// merge
const doc = await (snapshot
? this.applyUpdates(docId.full, snapshot.blob, ...updates)
: this.applyUpdates(docId.full, ...updates));
// update snapshot
await this.upsert(docId.workspace, docId.guid, doc, snapshot?.seq);
// delete merged updates
await this.redis
.ltrim(updateKey, updates.length, -1)
// safe, fallback to mergeUpdates
.catch(e => {
this.logger.error(`Failed to remove merged updates from Redis: ${e}`);
});
} catch (e) {
this.logger.error(
`Failed to merge updates with snapshot for ${docId}: ${e}`
);
await this.redis.sadd(pending, docId.toString()).catch(() => null); // safe
} finally {
await this.redis.del(lockKey);
}
}
}

View File

@@ -1,5 +1,6 @@
import { DynamicModule, Type } from '@nestjs/common';
import { EventEmitterModule } from '@nestjs/event-emitter';
import { ScheduleModule } from '@nestjs/schedule';
import { GqlModule } from '../graphql.module';
import { AuthModule } from './auth';
@@ -11,7 +12,11 @@ import { WorkspaceModule } from './workspaces';
const { SERVER_FLAVOR } = process.env;
const BusinessModules: (Type | DynamicModule)[] = [];
const BusinessModules: (Type | DynamicModule)[] = [
EventEmitterModule.forRoot({
global: true,
}),
];
switch (SERVER_FLAVOR) {
case 'sync':
@@ -19,9 +24,7 @@ switch (SERVER_FLAVOR) {
break;
case 'graphql':
BusinessModules.push(
EventEmitterModule.forRoot({
global: true,
}),
ScheduleModule.forRoot(),
GqlModule,
WorkspaceModule,
UsersModule,
@@ -33,9 +36,7 @@ switch (SERVER_FLAVOR) {
case 'allinone':
default:
BusinessModules.push(
EventEmitterModule.forRoot({
global: true,
}),
ScheduleModule.forRoot(),
GqlModule,
WorkspaceModule,
UsersModule,

View File

@@ -11,8 +11,8 @@ import {
import { Server, Socket } from 'socket.io';
import { encodeStateAsUpdate, encodeStateVector } from 'yjs';
import { Metrics } from '../../../metrics/metrics';
import { CallCounter, CallTimer } from '../../../metrics/utils';
import { metrics } from '../../../metrics';
import { CallTimer } from '../../../metrics/utils';
import { DocID } from '../../../utils/doc';
import { Auth, CurrentUser } from '../../auth';
import { DocManager } from '../../doc';
@@ -28,10 +28,47 @@ import {
WorkspaceNotFoundError,
} from './error';
export const GatewayErrorWrapper = (): MethodDecorator => {
// @ts-expect-error allow
return (
_target,
_key,
desc: TypedPropertyDescriptor<(...args: any[]) => any>
) => {
const originalMethod = desc.value;
if (!originalMethod) {
return desc;
}
desc.value = function (...args: any[]) {
let result: any;
try {
result = originalMethod.apply(this, args);
} catch (e) {
return {
error: new InternalError(e as Error),
};
}
if (result instanceof Promise) {
return result.catch(e => {
return {
error: new InternalError(e),
};
});
} else {
return result;
}
};
return desc;
};
};
const SubscribeMessage = (event: string) =>
applyDecorators(
CallCounter('socket_io_counter', { event }),
CallTimer('socket_io_timer', { event }),
GatewayErrorWrapper(),
CallTimer('socket_io_event_duration', { event }),
RawSubscribeMessage(event)
);
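// Every handler registered through this wrapper is both timed
// (`socket_io_event_duration`) and error-wrapped: thrown errors and rejected
// promises alike reach the client as `{ error: InternalError }` instead of
// escaping the gateway.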
@@ -59,7 +96,6 @@ export class EventsGateway implements OnGatewayConnection, OnGatewayDisconnect {
constructor(
private readonly docManager: DocManager,
private readonly metric: Metrics,
private readonly permissions: PermissionService
) {}
@@ -68,12 +104,12 @@ export class EventsGateway implements OnGatewayConnection, OnGatewayDisconnect {
handleConnection() {
this.connectionCount++;
this.metric.socketIOConnectionGauge(this.connectionCount, {});
metrics().socketIOConnectionGauge(this.connectionCount);
}
handleDisconnect() {
this.connectionCount--;
this.metric.socketIOConnectionGauge(this.connectionCount, {});
metrics().socketIOConnectionGauge(this.connectionCount);
}
@Auth()
@@ -233,25 +269,19 @@ export class EventsGateway implements OnGatewayConnection, OnGatewayDisconnect {
};
}
try {
const docId = new DocID(guid, workspaceId);
client
.to(docId.workspace)
.emit('server-updates', { workspaceId, guid, updates });
const docId = new DocID(guid, workspaceId);
client
.to(docId.workspace)
.emit('server-updates', { workspaceId, guid, updates });
const buffers = updates.map(update => Buffer.from(update, 'base64'));
const buffers = updates.map(update => Buffer.from(update, 'base64'));
await this.docManager.batchPush(docId.workspace, docId.guid, buffers);
return {
data: {
accepted: true,
},
};
} catch (e) {
return {
error: new InternalError(e as Error),
};
}
await this.docManager.batchPush(docId.workspace, docId.guid, buffers);
return {
data: {
accepted: true,
},
};
}
@Auth()

View File

@@ -4,35 +4,37 @@ import {
ForbiddenException,
Get,
Inject,
Logger,
NotFoundException,
Param,
Res,
} from '@nestjs/common';
import type { Response } from 'express';
import format from 'pretty-time';
import { CallTimer } from '../../metrics';
import { PrismaService } from '../../prisma';
import { StorageProvide } from '../../storage';
import { DocID } from '../../utils/doc';
import { Auth, CurrentUser, Publicable } from '../auth';
import { DocManager } from '../doc';
import { DocHistoryManager, DocManager } from '../doc';
import { UserType } from '../users';
import { PermissionService } from './permission';
import { PermissionService, PublicPageMode } from './permission';
import { Permission } from './types';
@Controller('/api/workspaces')
export class WorkspacesController {
private readonly logger = new Logger('WorkspacesController');
constructor(
@Inject(StorageProvide) private readonly storage: Storage,
private readonly permission: PermissionService,
private readonly docManager: DocManager
private readonly docManager: DocManager,
private readonly historyManager: DocHistoryManager,
private readonly prisma: PrismaService
) {}
// get workspace blob
//
// NOTE: because GraphQL can't represent a File, we have to use the REST API to get blobs
@Get('/:id/blobs/:name')
@CallTimer('doc_controller', { method: 'get_blob' })
async blob(
@Param('id') workspaceId: string,
@Param('name') name: string,
@@ -57,13 +59,13 @@ export class WorkspacesController {
@Get('/:id/docs/:guid')
@Auth()
@Publicable()
@CallTimer('doc_controller', { method: 'get_doc' })
async doc(
@CurrentUser() user: UserType | undefined,
@Param('id') ws: string,
@Param('guid') guid: string,
@Res() res: Response
) {
const start = process.hrtime();
const docId = new DocID(guid, ws);
if (
// if a user has the permission
@@ -82,8 +84,62 @@ export class WorkspacesController {
throw new NotFoundException('Doc not found');
}
if (!docId.isWorkspace) {
// fetch the publish mode for the published page
const publishPage = await this.prisma.workspacePage.findUnique({
where: {
workspaceId_pageId: {
workspaceId: docId.workspace,
pageId: docId.guid,
},
},
});
const publishPageMode =
publishPage?.mode === PublicPageMode.Edgeless ? 'edgeless' : 'page';
res.setHeader('publish-mode', publishPageMode);
}
res.setHeader('content-type', 'application/octet-stream');
res.send(update);
this.logger.debug(`workspaces doc api: ${format(process.hrtime(start))}`);
}
@Get('/:id/docs/:guid/histories/:timestamp')
@Auth()
@CallTimer('doc_controller', { method: 'get_history' })
async history(
@CurrentUser() user: UserType,
@Param('id') ws: string,
@Param('guid') guid: string,
@Param('timestamp') timestamp: string,
@Res() res: Response
) {
const docId = new DocID(guid, ws);
let ts;
try {
ts = new Date(timestamp);
} catch (e) {
throw new Error('Invalid timestamp');
}
await this.permission.checkPagePermission(
docId.workspace,
docId.guid,
user.id,
Permission.Write
);
const history = await this.historyManager.get(
docId.workspace,
docId.guid,
ts
);
if (history) {
res.setHeader('content-type', 'application/octet-stream');
res.send(history.blob);
} else {
throw new NotFoundException('Doc history not found');
}
}
}

View File

@@ -0,0 +1,92 @@
import {
Args,
Field,
GraphQLISODateTime,
Int,
Mutation,
ObjectType,
Parent,
ResolveField,
Resolver,
} from '@nestjs/graphql';
import type { SnapshotHistory } from '@prisma/client';
import { DocID } from '../../utils/doc';
import { Auth, CurrentUser } from '../auth';
import { DocHistoryManager } from '../doc/history';
import { UserType } from '../users';
import { PermissionService } from './permission';
import { WorkspaceType } from './resolver';
import { Permission } from './types';
@ObjectType()
class DocHistoryType implements Partial<SnapshotHistory> {
@Field()
workspaceId!: string;
@Field()
id!: string;
@Field(() => GraphQLISODateTime)
timestamp!: Date;
}
@Resolver(() => WorkspaceType)
export class DocHistoryResolver {
constructor(
private readonly historyManager: DocHistoryManager,
private readonly permission: PermissionService
) {}
@ResolveField(() => [DocHistoryType])
async histories(
@Parent() workspace: WorkspaceType,
@Args('guid') guid: string,
@Args({ name: 'before', type: () => GraphQLISODateTime, nullable: true })
timestamp: Date = new Date(),
@Args({ name: 'take', type: () => Int, nullable: true })
take?: number
): Promise<DocHistoryType[]> {
const docId = new DocID(guid, workspace.id);
if (docId.isWorkspace) {
throw new Error('Invalid guid for listing doc histories.');
}
return this.historyManager
.list(workspace.id, docId.guid, timestamp, take)
.then(rows =>
rows.map(({ timestamp }) => {
return {
workspaceId: workspace.id,
id: docId.guid,
timestamp,
};
})
);
}
@Auth()
@Mutation(() => Date)
async recoverDoc(
@CurrentUser() user: UserType,
@Args('workspaceId') workspaceId: string,
@Args('guid') guid: string,
@Args({ name: 'timestamp', type: () => GraphQLISODateTime }) timestamp: Date
): Promise<Date> {
const docId = new DocID(guid, workspaceId);
if (docId.isWorkspace) {
throw new Error('Invalid guid for recovering doc from history.');
}
await this.permission.checkPagePermission(
docId.workspace,
docId.guid,
user.id,
Permission.Write
);
return this.historyManager.recover(docId.workspace, docId.guid, timestamp);
}
}
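For reference, client-side documents shaped after the schema additions shown further below; the `workspace(id:)` root query is an assumption about the surrounding API, not part of this change:

```ts
// Hypothetical GraphQL documents exercising the new history API.
const listHistoriesQuery = `
  query ListHistories($workspaceId: String!, $guid: String!) {
    workspace(id: $workspaceId) {
      histories(guid: $guid, take: 10) {
        workspaceId
        id
        timestamp
      }
    }
  }
`;

const recoverDocMutation = `
  mutation RecoverDoc($workspaceId: String!, $guid: String!, $timestamp: DateTime!) {
    recoverDoc(workspaceId: $workspaceId, guid: $guid, timestamp: $timestamp)
  }
`;
```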

View File

@@ -3,6 +3,7 @@ import { Module } from '@nestjs/common';
import { DocModule } from '../doc';
import { UsersService } from '../users';
import { WorkspacesController } from './controller';
import { DocHistoryResolver } from './history.resolver';
import { PermissionService } from './permission';
import { PagePermissionResolver, WorkspaceResolver } from './resolver';
@@ -14,6 +15,7 @@ import { PagePermissionResolver, WorkspaceResolver } from './resolver';
PermissionService,
UsersService,
PagePermissionResolver,
DocHistoryResolver,
],
exports: [PermissionService],
})

View File

@@ -244,18 +244,20 @@ export class PermissionService {
permission = Permission.Read
) {
// check whether page is public
const count = await this.prisma.workspacePage.count({
where: {
workspaceId: ws,
pageId: page,
public: true,
},
});
if (permission === Permission.Read) {
const count = await this.prisma.workspacePage.count({
where: {
workspaceId: ws,
pageId: page,
public: true,
},
});
// page is public
// accessible
if (count > 0) {
return true;
// page is public
// accessible
if (count > 0) {
return true;
}
}
if (user) {

View File

@@ -885,9 +885,9 @@ export class PagePermissionResolver {
}
await this.permission.checkWorkspace(
workspaceId,
docId.workspace,
user.id,
Permission.Admin
Permission.Read
);
return this.permission.publishPage(docId.workspace, docId.guid, mode);
@@ -924,7 +924,7 @@ export class PagePermissionResolver {
await this.permission.checkWorkspace(
docId.workspace,
user.id,
Permission.Admin
Permission.Read
);
return this.permission.revokePublicPage(docId.workspace, docId.guid);

View File

@@ -192,6 +192,7 @@ type WorkspaceType {
"""Public pages of a workspace"""
publicPages: [WorkspacePage!]!
histories(guid: String!, before: DateTime, take: Int): [DocHistoryType!]!
}
type InvitationWorkspaceType {
@@ -232,6 +233,12 @@ enum PublicPageMode {
Edgeless
}
type DocHistoryType {
workspaceId: String!
id: String!
timestamp: DateTime!
}
type Query {
"""Get is owner of workspace"""
isOwner(workspaceId: String!): Boolean!
@@ -288,6 +295,7 @@ type Mutation {
publishPage(workspaceId: String!, pageId: String!, mode: PublicPageMode = Page): WorkspacePage!
revokePage(workspaceId: String!, pageId: String!): Boolean! @deprecated(reason: "use revokePublicPage")
revokePublicPage(workspaceId: String!, pageId: String!): WorkspacePage!
recoverDoc(workspaceId: String!, guid: String!, timestamp: DateTime!): DateTime!
"""Upload user avatar"""
uploadAvatar(avatar: Upload!): UserType!

View File

@@ -70,3 +70,13 @@ test('fix', t => {
t.is(id.workspace, 'ws');
t.is(id.toString(), 'ws:space:sub');
});
test('special case: `wsId:space:page:pageId`', t => {
const id = new DocID('ws:space:page:page');
t.is(id.workspace, 'ws');
t.is(id.guid, 'page');
t.throws(() => new DocID('ws:s:p:page'));
t.throws(() => new DocID('ws:space:b:page'));
t.throws(() => new DocID('ws:s:page:page'));
});

View File

@@ -55,7 +55,12 @@ export class DocID {
let parts = raw.split(':');
if (parts.length > 3) {
throw new Error(`Invalid format of Doc ID: ${raw}`);
// special-case adaptation for `wsId:space:page:pageId`
if (parts[1] === DocVariant.Space && parts[2] === DocVariant.Page) {
parts = [workspaceId ?? parts[0], DocVariant.Space, parts[3]];
} else {
throw new Error(`Invalid format of Doc ID: ${raw}`);
}
} else if (parts.length === 2) {
// `${variant}:${guid}`
if (!workspaceId) {

View File

@@ -1,12 +1,12 @@
export type DeepPartial<T> = T extends Array<infer U>
? DeepPartial<U>[]
: T extends ReadonlyArray<infer U>
? ReadonlyArray<DeepPartial<U>>
: T extends object
? {
[K in keyof T]?: DeepPartial<T[K]>;
}
: T;
? ReadonlyArray<DeepPartial<U>>
: T extends object
? {
[K in keyof T]?: DeepPartial<T[K]>;
}
: T;
type Join<Prefix, Suffixes> = Prefix extends string | number
? Suffixes extends string | number
@@ -32,11 +32,11 @@ export type LeafPaths<
> = Depth extends MaxDepth
? never
: T extends Record<string | number, any>
? {
[K in keyof T]-?: K extends string | number
? T[K] extends PrimitiveType
? K
: Join<K, LeafPaths<T[K], Path, MaxDepth, `${Depth}.`>>
: never;
}[keyof T]
: never;
? {
[K in keyof T]-?: K extends string | number
? T[K] extends PrimitiveType
? K
: Join<K, LeafPaths<T[K], Path, MaxDepth, `${Depth}.`>>
: never;
}[keyof T]
: never;
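A quick illustration of these two utilities, using a hypothetical `AppConfig` shape and assuming the default type parameters of `LeafPaths`:

```ts
// Hypothetical usage of DeepPartial and LeafPaths.
type AppConfig = {
  doc: { history: { interval: number } };
  redis: { enabled: boolean };
};

// every field becomes optional, recursively
const override: DeepPartial<AppConfig> = { doc: { history: {} } };

// union of dotted leaf paths: 'doc.history.interval' | 'redis.enabled'
type ConfigKey = LeafPaths<AppConfig>;
```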

View File

@@ -5,7 +5,6 @@ import test from 'ava';
import { ConfigModule } from '../src/config';
import { GqlModule } from '../src/graphql.module';
import { MetricsModule } from '../src/metrics';
import { AuthModule } from '../src/modules/auth';
import { AuthResolver } from '../src/modules/auth/resolver';
import { AuthService } from '../src/modules/auth/service';
@@ -40,7 +39,6 @@ test.beforeEach(async () => {
PrismaModule,
GqlModule,
AuthModule,
MetricsModule,
RateLimiterModule,
],
}).compile();

View File

@@ -0,0 +1,108 @@
import { Test, TestingModule } from '@nestjs/testing';
import test from 'ava';
import { Cache, CacheModule } from '../src/cache';
import { ConfigModule } from '../src/config';
let cache: Cache;
let module: TestingModule;
test.beforeEach(async () => {
module = await Test.createTestingModule({
imports: [ConfigModule.forRoot(), CacheModule],
}).compile();
const prefix = Math.random().toString(36).slice(2, 7);
cache = new Proxy(module.get(Cache), {
get(target, prop) {
// @ts-expect-error safe
const fn = target[prop];
if (typeof fn === 'function') {
// replace the first parameter of fn with the prefix
return (...args: any[]) =>
fn.call(target, `${prefix}:${args[0]}`, ...args.slice(1));
}
return fn;
},
});
});
test.afterEach(async () => {
await module.close();
});
test('should be able to set normal cache', async t => {
t.true(await cache.set('test', 1));
t.is(await cache.get<number>('test'), 1);
t.true(await cache.has('test'));
t.true(await cache.delete('test'));
t.is(await cache.get('test'), undefined);
t.true(await cache.set('test', { a: 1 }));
t.deepEqual(await cache.get('test'), { a: 1 });
});
test('should be able to set cache with non-existing flag', async t => {
t.true(await cache.setnx('test', 1));
t.false(await cache.setnx('test', 2));
t.is(await cache.get('test'), 1);
});
test('should be able to set cache with ttl', async t => {
t.true(await cache.set('test', 1));
t.is(await cache.get('test'), 1);
t.true(await cache.expire('test', 1 * 1000));
const ttl = await cache.ttl('test');
t.true(ttl <= 1 * 1000);
t.true(ttl > 0);
});
test('should be able to incr/decr number cache', async t => {
t.true(await cache.set('test', 1));
t.is(await cache.increase('test'), 2);
t.is(await cache.increase('test'), 3);
t.is(await cache.decrease('test'), 2);
t.is(await cache.decrease('test'), 1);
// increase a nonexistent number
t.is(await cache.increase('test2'), 1);
t.is(await cache.increase('test2'), 2);
});
test('should be able to manipulate list cache', async t => {
t.is(await cache.pushBack('test', 1), 1);
t.is(await cache.pushBack('test', 2, 3, 4), 4);
t.is(await cache.len('test'), 4);
t.deepEqual(await cache.list('test', 1, -1), [2, 3, 4]);
t.deepEqual(await cache.popFront('test', 2), [1, 2]);
t.deepEqual(await cache.popBack('test', 1), [4]);
t.is(await cache.pushBack('test2', { a: 1 }), 1);
t.deepEqual(await cache.popFront('test2', 1), [{ a: 1 }]);
});
test('should be able to manipulate map cache', async t => {
t.is(await cache.mapSet('test', 'a', 1), true);
t.is(await cache.mapSet('test', 'b', 2), true);
t.is(await cache.mapLen('test'), 2);
t.is(await cache.mapGet('test', 'a'), 1);
t.is(await cache.mapGet('test', 'b'), 2);
t.is(await cache.mapIncrease('test', 'a'), 2);
t.is(await cache.mapIncrease('test', 'a'), 3);
t.is(await cache.mapDecrease('test', 'b', 3), -1);
const keys = await cache.mapKeys('test');
t.deepEqual(keys, ['a', 'b']);
const randomKey = await cache.mapRandomKey('test');
t.truthy(randomKey);
t.true(keys.includes(randomKey!));
t.is(await cache.mapDelete('test', 'a'), true);
t.is(await cache.mapGet('test', 'a'), undefined);
});

View File

@@ -1,14 +1,15 @@
import { mock } from 'node:test';
import type { INestApplication } from '@nestjs/common';
import { EventEmitterModule } from '@nestjs/event-emitter';
import { Test, TestingModule } from '@nestjs/testing';
import test from 'ava';
import { register } from 'prom-client';
import * as Sinon from 'sinon';
import { Doc as YDoc, encodeStateAsUpdate } from 'yjs';
import { CacheModule } from '../src/cache';
import { Config, ConfigModule } from '../src/config';
import { MetricsModule } from '../src/metrics';
import { DocManager, DocModule } from '../src/modules/doc';
import { PrismaModule, PrismaService } from '../src/prisma';
import { flushDB } from './utils';
@@ -17,7 +18,8 @@ const createModule = () => {
return Test.createTestingModule({
imports: [
PrismaModule,
MetricsModule,
CacheModule,
EventEmitterModule.forRoot(),
ConfigModule.forRoot(),
DocModule.forRoot(),
],

View File

@@ -0,0 +1,335 @@
import { INestApplication } from '@nestjs/common';
import { ScheduleModule } from '@nestjs/schedule';
import { Test, TestingModule } from '@nestjs/testing';
import type { Snapshot } from '@prisma/client';
import test from 'ava';
import * as Sinon from 'sinon';
import { ConfigModule } from '../src/config';
import { DocHistoryManager } from '../src/modules/doc';
import { PrismaModule, PrismaService } from '../src/prisma';
import { flushDB } from './utils';
let app: INestApplication;
let m: TestingModule;
let manager: DocHistoryManager;
let db: PrismaService;
// cleanup database before each test
test.beforeEach(async () => {
await flushDB();
m = await Test.createTestingModule({
imports: [PrismaModule, ScheduleModule.forRoot(), ConfigModule.forRoot()],
providers: [DocHistoryManager],
}).compile();
app = m.createNestApplication();
await app.init();
manager = m.get(DocHistoryManager);
Sinon.stub(manager, 'getExpiredDateFromNow').resolves(
new Date(Date.now() + 1000)
);
db = m.get(PrismaService);
});
test.afterEach(async () => {
await app.close();
await m.close();
Sinon.restore();
});
const snapshot: Snapshot = {
workspaceId: '1',
id: 'doc1',
blob: Buffer.from([0, 0]),
state: Buffer.from([0, 0]),
seq: 0,
updatedAt: new Date(),
createdAt: new Date(),
};
test('should create doc history if never created before', async t => {
Sinon.stub(manager, 'last').resolves(null);
const timestamp = new Date();
await manager.onDocUpdated({
...snapshot,
updatedAt: timestamp,
});
const history = await db.snapshotHistory.findFirst({
where: {
workspaceId: '1',
id: 'doc1',
},
});
t.truthy(history);
t.is(history?.timestamp.getTime(), timestamp.getTime());
});
test('should not create history if timestamp equals the last record', async t => {
const timestamp = new Date();
Sinon.stub(manager, 'last').resolves({ timestamp, state: null });
await manager.onDocUpdated({
...snapshot,
updatedAt: timestamp,
});
const history = await db.snapshotHistory.findFirst({
where: {
workspaceId: '1',
id: 'doc1',
},
});
t.falsy(history);
});
test('should not create history if state equals the last record', async t => {
const timestamp = new Date();
Sinon.stub(manager, 'last').resolves({
timestamp: new Date(timestamp.getTime() - 1),
state: snapshot.state,
});
await manager.onDocUpdated({
...snapshot,
updatedAt: timestamp,
});
const history = await db.snapshotHistory.findFirst({
where: {
workspaceId: '1',
id: 'doc1',
},
});
t.falsy(history);
});
test('should not create history if time diff is less than interval config', async t => {
const timestamp = new Date();
Sinon.stub(manager, 'last').resolves({
timestamp: new Date(timestamp.getTime() - 1000),
state: Buffer.from([0, 1]),
});
await manager.onDocUpdated({
...snapshot,
updatedAt: timestamp,
});
const history = await db.snapshotHistory.findFirst({
where: {
workspaceId: '1',
id: 'doc1',
},
});
t.falsy(history);
});
test('should create history if time diff is larger than interval config and state diff', async t => {
const timestamp = new Date();
Sinon.stub(manager, 'last').resolves({
timestamp: new Date(timestamp.getTime() - 1000 * 60 * 20),
state: Buffer.from([0, 1]),
});
await manager.onDocUpdated({
...snapshot,
updatedAt: timestamp,
});
const history = await db.snapshotHistory.findFirst({
where: {
workspaceId: '1',
id: 'doc1',
},
});
t.truthy(history);
});
test('should create history with force flag even if time diff is small', async t => {
const timestamp = new Date();
Sinon.stub(manager, 'last').resolves({
timestamp: new Date(timestamp.getTime() - 1),
state: Buffer.from([0, 1]),
});
await manager.onDocUpdated(
{
...snapshot,
updatedAt: timestamp,
},
true
);
const history = await db.snapshotHistory.findFirst({
where: {
workspaceId: '1',
id: 'doc1',
},
});
t.truthy(history);
});
test('should correctly list all history records', async t => {
const timestamp = Date.now();
// insert expired data
await db.snapshotHistory.createMany({
data: new Array(10).fill(0).map((_, i) => ({
workspaceId: snapshot.workspaceId,
id: snapshot.id,
blob: snapshot.blob,
state: snapshot.state,
timestamp: new Date(timestamp - 10 - i),
expiredAt: new Date(timestamp - 1),
})),
});
// insert available data
await db.snapshotHistory.createMany({
data: new Array(10).fill(0).map((_, i) => ({
workspaceId: snapshot.workspaceId,
id: snapshot.id,
blob: snapshot.blob,
state: snapshot.state,
timestamp: new Date(timestamp + i),
expiredAt: new Date(timestamp + 1000),
})),
});
const list = await manager.list(
snapshot.workspaceId,
snapshot.id,
new Date(timestamp + 20),
8
);
const count = await manager.count(snapshot.workspaceId, snapshot.id);
t.is(list.length, 8);
t.is(count, 10);
});
test('should be able to get history data', async t => {
const timestamp = new Date();
await manager.onDocUpdated(
{
...snapshot,
updatedAt: timestamp,
},
true
);
const history = await manager.get(
snapshot.workspaceId,
snapshot.id,
timestamp
);
t.truthy(history);
t.deepEqual(history?.blob, snapshot.blob);
});
test('should be able to get last history record', async t => {
const timestamp = Date.now();
// insert available data
await db.snapshotHistory.createMany({
data: new Array(10).fill(0).map((_, i) => ({
workspaceId: snapshot.workspaceId,
id: snapshot.id,
blob: snapshot.blob,
state: snapshot.state,
timestamp: new Date(timestamp + i),
expiredAt: new Date(timestamp + 1000),
})),
});
const history = await manager.last(snapshot.workspaceId, snapshot.id);
t.truthy(history);
t.is(history?.timestamp.getTime(), timestamp + 9);
});
test('should be able to recover from history', async t => {
await db.snapshot.create({
data: {
...snapshot,
blob: Buffer.from([1, 1]),
state: Buffer.from([1, 1]),
},
});
const history1Timestamp = snapshot.updatedAt.getTime() - 10;
await manager.onDocUpdated({
...snapshot,
updatedAt: new Date(history1Timestamp),
});
await manager.recover(
snapshot.workspaceId,
snapshot.id,
new Date(history1Timestamp)
);
const [history1, history2] = await db.snapshotHistory.findMany({
where: {
workspaceId: snapshot.workspaceId,
id: snapshot.id,
},
});
t.is(history1.timestamp.getTime(), history1Timestamp);
t.is(history2.timestamp.getTime(), snapshot.updatedAt.getTime());
// new history record force-created with the snapshot state from before recovery
t.deepEqual(history2?.blob, Buffer.from([1, 1]));
t.deepEqual(history2?.state, Buffer.from([1, 1]));
});
test('should be able to cleanup expired history', async t => {
const timestamp = Date.now();
// insert expired data
await db.snapshotHistory.createMany({
data: new Array(10).fill(0).map((_, i) => ({
workspaceId: snapshot.workspaceId,
id: snapshot.id,
blob: snapshot.blob,
state: snapshot.state,
timestamp: new Date(timestamp - 10 - i),
expiredAt: new Date(timestamp - 1),
})),
});
// insert available data
await db.snapshotHistory.createMany({
data: new Array(10).fill(0).map((_, i) => ({
workspaceId: snapshot.workspaceId,
id: snapshot.id,
blob: snapshot.blob,
state: snapshot.state,
timestamp: new Date(timestamp + i),
expiredAt: new Date(timestamp + 1000),
})),
});
let count = await db.snapshotHistory.count();
t.is(count, 20);
await manager.cleanupExpiredHistory();
count = await db.snapshotHistory.count();
t.is(count, 10);
const example = await db.snapshotHistory.findFirst();
t.truthy(example);
t.true(example!.expiredAt > new Date());
});

View File

@@ -12,7 +12,6 @@ import ava, { type TestFn } from 'ava';
import { ConfigModule } from '../src/config';
import { GqlModule } from '../src/graphql.module';
import { MetricsModule } from '../src/metrics';
import { AuthModule } from '../src/modules/auth';
import { AuthService } from '../src/modules/auth/service';
import { PrismaModule } from '../src/prisma';
@@ -44,7 +43,6 @@ test.beforeEach(async t => {
PrismaModule,
GqlModule,
AuthModule,
MetricsModule,
RateLimiterModule,
],
}).compile();

View File

@@ -1,78 +0,0 @@
import { Test, TestingModule } from '@nestjs/testing';
import test from 'ava';
import { register } from 'prom-client';
import { MetricsModule } from '../src/metrics';
import { Metrics } from '../src/metrics/metrics';
import { PrismaModule } from '../src/prisma';
let metrics: Metrics;
let module: TestingModule;
test.beforeEach(async () => {
module = await Test.createTestingModule({
imports: [MetricsModule, PrismaModule],
}).compile();
metrics = module.get(Metrics);
});
test.afterEach.always(async () => {
await module.close();
});
test('should be able to increment counter', async t => {
metrics.socketIOEventCounter(1, { event: 'client-handshake' });
const socketIOCounterMetric = register.getSingleMetric('socket_io_counter');
t.truthy(socketIOCounterMetric);
t.truthy(
JSON.stringify((await socketIOCounterMetric!.get()).values) ===
'[{"value":1,"labels":{"event":"client-handshake"}}]'
);
t.pass();
});
test('should be able to timer', async t => {
let minimum: number;
{
const endTimer = metrics.socketIOEventTimer({ event: 'client-handshake' });
const a = performance.now();
await new Promise(resolve => setTimeout(resolve, 50));
const b = performance.now();
minimum = b - a;
endTimer();
}
let maximum: number;
{
const a = performance.now();
const endTimer = metrics.socketIOEventTimer({ event: 'client-handshake' });
await new Promise(resolve => setTimeout(resolve, 100));
endTimer();
const b = performance.now();
maximum = b - a;
}
const socketIOTimerMetric = register.getSingleMetric('socket_io_timer');
t.truthy(socketIOTimerMetric);
const observations = (await socketIOTimerMetric!.get()).values;
for (const observation of observations) {
if (
observation.labels.event === 'client-handshake' &&
'quantile' in observation.labels
) {
t.truthy(
observation.value >= minimum / 1000,
'observation.value should be greater than minimum'
);
t.truthy(
observation.value <= maximum / 1000,
'observation.value should be less than maximum'
);
}
}
t.pass();
});

View File

@@ -1,6 +1,6 @@
{
"name": "@affine/storage",
"version": "0.10.2",
"version": "0.10.3",
"engines": {
"node": ">= 10.16.0 < 11 || >= 11.8.0"
},
@@ -36,10 +36,10 @@
"version": "napi version"
},
"devDependencies": {
"@napi-rs/cli": "^2.16.3",
"@napi-rs/cli": "^2.16.5",
"lib0": "^0.2.87",
"nx": "^16.10.0",
"nx": "^17.1.3",
"nx-cloud": "^16.5.2",
"yjs": "^13.6.8"
"yjs": "^13.6.10"
}
}

View File

@@ -8,5 +8,5 @@
"react": "18.2.0",
"react-dom": "18.2.0"
},
"version": "0.10.2"
"version": "0.10.3"
}

View File

@@ -496,8 +496,8 @@ const Command = React.forwardRef<HTMLDivElement, CommandProps>(
index + change < 0
? items[items.length - 1]
: index + change === items.length
? items[0]
: items[index + change];
? items[0]
: items[index + change];
}
if (newSelected)
@@ -666,10 +666,10 @@ const Item = React.forwardRef<HTMLDivElement, ItemProps>(
forceMount
? true
: context.filter() === false
? true
: !state.search
? true
: state.filtered.items.get(id) > 0
? true
: !state.search
? true
: state.filtered.items.get(id) > 0
);
React.useEffect(() => {
@@ -728,10 +728,10 @@ const Group = React.forwardRef<HTMLDivElement, GroupProps>(
forceMount
? true
: context.filter() === false
? true
: !state.search
? true
: state.filtered.groups.has(id)
? true
: !state.search
? true
: state.filtered.groups.has(id)
);
useLayoutEffect(() => {

View File

@@ -9,5 +9,5 @@
"@types/debug": "^4.1.9",
"vitest": "0.34.6"
},
"version": "0.10.2"
"version": "0.10.3"
}

View File

@@ -3,8 +3,8 @@
"private": true,
"type": "module",
"devDependencies": {
"@blocksuite/global": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/store": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/global": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/store": "0.0.0-20231122113751-6bf81eb3-nightly",
"react": "18.2.0",
"react-dom": "18.2.0",
"vitest": "0.34.6",
@@ -27,5 +27,5 @@
"dependencies": {
"lit": "^3.0.2"
},
"version": "0.10.2"
"version": "0.10.3"
}

View File

@@ -55,34 +55,33 @@
},
"dependencies": {
"@affine/sdk": "workspace:*",
"@blocksuite/blocks": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/global": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/store": "0.0.0-20231116023037-31273bb7-nightly",
"jotai": "^2.4.3",
"jotai-effect": "^0.2.2",
"@blocksuite/blocks": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/global": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/store": "0.0.0-20231122113751-6bf81eb3-nightly",
"jotai": "^2.5.1",
"jotai-effect": "^0.2.3",
"tinykeys": "^2.1.0",
"zod": "^3.22.4"
},
"devDependencies": {
"@affine-test/fixtures": "workspace:*",
"@affine/templates": "workspace:*",
"@blocksuite/editor": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/lit": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/editor": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/lit": "0.0.0-20231122113751-6bf81eb3-nightly",
"@testing-library/react": "^14.0.0",
"async-call-rpc": "^6.3.1",
"electron": "link:../../frontend/electron/node_modules/electron",
"nanoid": "^5.0.1",
"nanoid": "^5.0.3",
"react": "^18.2.0",
"rxjs": "^7.8.1",
"vite": "^4.4.11",
"vite-plugin-dts": "3.6.0",
"vitest": "0.34.6",
"yjs": "^13.6.8"
"yjs": "^13.6.10"
},
"peerDependencies": {
"@affine/templates": "*",
"@blocksuite/editor": "*",
"@blocksuite/lit": "*",
"async-call-rpc": "*",
"electron": "*",
"react": "*",
@@ -95,9 +94,6 @@
"@blocksuite/editor": {
"optional": true
},
"@blocksuite/lit": {
"optional": true
},
"async-call-rpc": {
"optional": true
},
@@ -111,5 +107,5 @@
"optional": true
}
},
"version": "0.10.2"
"version": "0.10.3"
}

View File

@@ -1,750 +1,17 @@
import type { Page, PageMeta, Workspace } from '@blocksuite/store';
import { createIndexeddbStorage } from '@blocksuite/store';
import type { createStore, WritableAtom } from 'jotai/vanilla';
import type { Doc } from 'yjs';
import { Array as YArray, Doc as YDoc, Map as YMap, transact } from 'yjs';
export async function initEmptyPage(page: Page, title?: string) {
await page.waitForLoaded();
const pageBlockId = page.addBlock('affine:page', {
title: new page.Text(title ?? ''),
});
page.addBlock('affine:surface', {}, pageBlockId);
const noteBlockId = page.addBlock('affine:note', {}, pageBlockId);
page.addBlock('affine:paragraph', {}, noteBlockId);
}
export async function buildEmptyBlockSuite(workspace: Workspace) {
const page = workspace.createPage();
await initEmptyPage(page);
workspace.setPageMeta(page.id, {
jumpOnce: true,
});
}
export async function buildShowcaseWorkspace(
workspace: Workspace,
options: {
schema: Schema;
atoms: {
pageMode: WritableAtom<
undefined,
[pageId: string, mode: 'page' | 'edgeless'],
void
>;
};
store: ReturnType<typeof createStore>;
}
) {
const prototypes = {
tags: {
options: [
{
id: 'icg1n5UdkP',
value: 'Travel',
color: 'var(--affine-tag-gray)',
},
{
id: 'Oe5dSe1DDJ',
value: 'Quick summary',
color: 'var(--affine-tag-green)',
},
{
id: 'g1L5dXKctL',
value: 'OKR',
color: 'var(--affine-tag-purple)',
},
{
id: 'q3mceOl_zi',
value: 'Streamline your workflow',
color: 'var(--affine-tag-teal)',
},
{
id: 'ze07JVwBu4',
value: 'Plan',
color: 'var(--affine-tag-teal)',
},
{
id: '8qcYPCTK0h',
value: 'Review',
color: 'var(--affine-tag-orange)',
},
{
id: 'wg-fBtd2eI',
value: 'Engage',
color: 'var(--affine-tag-pink)',
},
{
id: 'QYFD_HeQc-',
value: 'Create',
color: 'var(--affine-tag-blue)',
},
{
id: 'ZHBa2NtdSo',
value: 'Learn',
color: 'var(--affine-tag-yellow)',
},
],
},
};
workspace.meta.setProperties(prototypes);
const edgelessPage1 = nanoid();
const edgelessPage2 = nanoid();
const edgelessPage3 = nanoid();
const { store, atoms } = options;
[edgelessPage1, edgelessPage2, edgelessPage3].forEach(pageId => {
store.set(atoms.pageMode, pageId, 'edgeless');
});
const pageMetas = {
'9f6f3c04-cf32-470c-9648-479dc838f10e': {
createDate: 1691548231530,
tags: ['ZHBa2NtdSo', 'QYFD_HeQc-', 'wg-fBtd2eI'],
updatedDate: 1691676331623,
favorite: true,
jumpOnce: true,
},
'0773e198-5de0-45d4-a35e-de22ea72b96b': {
createDate: 1691548220794,
tags: [],
updatedDate: 1691676775642,
favorite: false,
},
'59b140eb-4449-488f-9eeb-42412dcc044e': {
createDate: 1691551731225,
tags: [],
updatedDate: 1691654611175,
favorite: false,
},
'7217fbe2-61db-4a91-93c6-ad5c800e5a43': {
createDate: 1691552082822,
tags: [],
updatedDate: 1691654606912,
favorite: false,
},
'6eb43ea8-8c11-456d-bb1d-5193937961ab': {
createDate: 1691552090989,
tags: [],
updatedDate: 1691646748171,
favorite: false,
},
'3ddc8a4f-62c7-4fd4-8064-9ed9f61e437a': {
createDate: 1691564303138,
tags: [],
updatedDate: 1691646845195,
},
'512b1cb3-d22d-4b20-a7aa-58e2afcb1238': {
createDate: 1691574743531,
tags: ['icg1n5UdkP'],
updatedDate: 1691647117761,
},
'22163830-8252-43fe-b62d-fd9bbeaa4caa': {
createDate: 1691574859042,
tags: [],
updatedDate: 1691648159371,
},
'b7a9e1bc-e205-44aa-8dad-7e328269d00b': {
createDate: 1691575011078,
tags: ['8qcYPCTK0h'],
updatedDate: 1691645074511,
favorite: false,
},
'646305d9-93e0-48df-bb92-d82944ceb5a3': {
createDate: 1691634722239,
tags: ['ze07JVwBu4'],
updatedDate: 1691647069662,
favorite: false,
},
'0350509d-8702-4797-b4d7-168f5e9359c7': {
createDate: 1691635388447,
tags: ['Oe5dSe1DDJ'],
updatedDate: 1691645873930,
},
'aa02af3c-5c5c-4856-b7ce-947ad17331f3': {
createDate: 1691636192263,
tags: ['q3mceOl_zi', 'g1L5dXKctL'],
updatedDate: 1691645102104,
},
'9d6e716e-a071-45a2-88ac-2f2f6eec0109': {
createDate: 1691574743531,
tags: ['icg1n5UdkP'],
updatedDate: 1691574743531,
},
} satisfies Record<string, Partial<PageMeta>>;
const data = [
[
'9f6f3c04-cf32-470c-9648-479dc838f10e',
import('@affine/templates/v1/getting-started.json'),
nanoid(),
],
[
'0773e198-5de0-45d4-a35e-de22ea72b96b',
import('@affine/templates/v1/preloading.json'),
edgelessPage1,
],
[
'59b140eb-4449-488f-9eeb-42412dcc044e',
import('@affine/templates/v1/template-galleries.json'),
nanoid(),
],
[
'7217fbe2-61db-4a91-93c6-ad5c800e5a43',
import('@affine/templates/v1/personal-home.json'),
nanoid(),
],
[
'6eb43ea8-8c11-456d-bb1d-5193937961ab',
import('@affine/templates/v1/working-home.json'),
nanoid(),
],
[
'3ddc8a4f-62c7-4fd4-8064-9ed9f61e437a',
import('@affine/templates/v1/personal-project-management.json'),
nanoid(),
],
[
'512b1cb3-d22d-4b20-a7aa-58e2afcb1238',
import('@affine/templates/v1/travel-plan.json'),
edgelessPage2,
],
[
'22163830-8252-43fe-b62d-fd9bbeaa4caa',
import('@affine/templates/v1/personal-knowledge-management.json'),
nanoid(),
],
[
'b7a9e1bc-e205-44aa-8dad-7e328269d00b',
import('@affine/templates/v1/annual-performance-review.json'),
nanoid(),
],
[
'646305d9-93e0-48df-bb92-d82944ceb5a3',
import('@affine/templates/v1/brief-event-planning.json'),
nanoid(),
],
[
'0350509d-8702-4797-b4d7-168f5e9359c7',
import('@affine/templates/v1/meeting-summary.json'),
nanoid(),
],
[
'aa02af3c-5c5c-4856-b7ce-947ad17331f3',
import('@affine/templates/v1/okr-template.json'),
nanoid(),
],
[
'9d6e716e-a071-45a2-88ac-2f2f6eec0109',
import('@affine/templates/v1/travel-note.json'),
edgelessPage3,
],
] as const;
const idMap = await Promise.all(data).then(async data => {
return data.reduce<Record<string, string>>(
(record, currentValue) => {
const [oldId, _, newId] = currentValue;
record[oldId] = newId;
return record;
},
{} as Record<string, string>
);
});
await Promise.all(
data.map(async ([id, promise, newId]) => {
const { default: template } = await promise;
let json = JSON.stringify(template);
Object.entries(idMap).forEach(([oldId, newId]) => {
json = json.replaceAll(oldId, newId);
});
json = JSON.parse(json);
await workspace
.importPageSnapshot(structuredClone(json), newId)
.catch(error => {
console.error('error importing page', id, error);
});
const page = workspace.getPage(newId);
assertExists(page);
await page.waitForLoaded();
workspace.schema.upgradePage(
0,
{
'affine:note': 1,
'affine:bookmark': 1,
'affine:database': 2,
'affine:divider': 1,
'affine:image': 1,
'affine:list': 1,
'affine:code': 1,
'affine:page': 2,
'affine:paragraph': 1,
'affine:surface': 3,
},
page.spaceDoc
);
})
);
Object.entries(pageMetas).forEach(([oldId, meta]) => {
const newId = idMap[oldId];
workspace.setPageMeta(newId, meta);
});
}
import { applyUpdate, encodeStateAsUpdate } from 'yjs';
const migrationOrigin = 'affine-migration';
import { assertExists } from '@blocksuite/global/utils';
import type { Schema } from '@blocksuite/store';
import { nanoid } from 'nanoid';
type XYWH = [number, number, number, number];
function deserializeXYWH(xywh: string): XYWH {
return JSON.parse(xywh) as XYWH;
}
const getLatestVersions = (schema: Schema): Record<string, number> => {
return [...schema.flavourSchemaMap.entries()].reduce(
(record, [flavour, schema]) => {
record[flavour] = schema.version;
return record;
},
{} as Record<string, number>
);
};
function migrateDatabase(data: YMap<unknown>) {
data.delete('prop:mode');
data.set('prop:views', new YArray());
const columns = (data.get('prop:columns') as YArray<unknown>).toJSON() as {
id: string;
name: string;
hide: boolean;
type: string;
width: number;
selection?: unknown[];
}[];
const views = [
{
id: 'default',
name: 'Table',
columns: columns.map(col => ({
id: col.id,
width: col.width,
hide: col.hide,
})),
filter: { type: 'group', op: 'and', conditions: [] },
mode: 'table',
},
];
const cells = (data.get('prop:cells') as YMap<unknown>).toJSON() as Record<
string,
Record<
string,
{
id: string;
value: unknown;
}
>
>;
const convertColumn = (
id: string,
update: (cell: { id: string; value: unknown }) => void
) => {
Object.values(cells).forEach(row => {
if (row[id] != null) {
update(row[id]);
}
});
};
const newColumns = columns.map(v => {
let data: Record<string, unknown> = {};
if (v.type === 'select' || v.type === 'multi-select') {
data = { options: v.selection };
if (v.type === 'select') {
convertColumn(v.id, cell => {
if (Array.isArray(cell.value)) {
cell.value = cell.value[0]?.id;
}
});
} else {
convertColumn(v.id, cell => {
if (Array.isArray(cell.value)) {
cell.value = cell.value.map(v => v.id);
}
});
}
}
if (v.type === 'number') {
convertColumn(v.id, cell => {
if (typeof cell.value === 'string') {
cell.value = Number.parseFloat(cell.value.toString());
}
});
}
return {
id: v.id,
type: v.type,
name: v.name,
data,
};
});
data.set('prop:columns', newColumns);
data.set('prop:views', views);
data.set('prop:cells', cells);
}
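// runBlockMigration below maps legacy flavours onto current ones:
// affine:frame -> affine:note, affine:embed -> affine:image, surface elements
// move under prop:elements (version <= 3), and databases below version 2 are
// rebuilt by migrateDatabase above.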
function runBlockMigration(
flavour: string,
data: YMap<unknown>,
version: number
) {
if (flavour === 'affine:frame') {
data.set('sys:flavour', 'affine:note');
return;
}
if (flavour === 'affine:surface' && version <= 3) {
if (data.has('elements')) {
const elements = data.get('elements') as YMap<unknown>;
migrateSurface(elements);
data.set('prop:elements', elements.clone());
data.delete('elements');
} else {
data.set('prop:elements', new YMap());
}
}
if (flavour === 'affine:embed') {
data.set('sys:flavour', 'affine:image');
data.delete('prop:type');
}
if (flavour === 'affine:database' && version < 2) {
migrateDatabase(data);
}
}
function migrateSurface(data: YMap<unknown>) {
for (const [, value] of <IterableIterator<[string, YMap<unknown>]>>(
data.entries()
)) {
if (value.get('type') === 'connector') {
migrateSurfaceConnector(value);
}
}
}
function migrateSurfaceConnector(data: YMap<any>) {
let id = data.get('startElement')?.id;
const controllers = data.get('controllers');
const length = controllers.length;
const xywh = deserializeXYWH(data.get('xywh'));
if (id) {
data.set('source', { id });
} else {
data.set('source', {
position: [controllers[0].x + xywh[0], controllers[0].y + xywh[1]],
});
}
id = data.get('endElement')?.id;
if (id) {
data.set('target', { id });
} else {
data.set('target', {
position: [
controllers[length - 1].x + xywh[0],
controllers[length - 1].y + xywh[1],
],
});
}
const width = data.get('lineWidth') ?? 4;
data.set('strokeWidth', width);
const color = data.get('color');
data.set('stroke', color);
data.delete('startElement');
data.delete('endElement');
data.delete('controllers');
data.delete('lineWidth');
data.delete('color');
data.delete('xywh');
}
function updateBlockVersions(versions: YMap<number>) {
const frameVersion = versions.get('affine:frame');
if (frameVersion !== undefined) {
versions.set('affine:note', frameVersion);
versions.delete('affine:frame');
}
const embedVersion = versions.get('affine:embed');
if (embedVersion !== undefined) {
versions.set('affine:image', embedVersion);
versions.delete('affine:embed');
}
const databaseVersion = versions.get('affine:database');
if (databaseVersion !== undefined && databaseVersion < 2) {
versions.set('affine:database', 2);
}
}
function migrateMeta(
oldDoc: YDoc,
newDoc: YDoc,
idMap: Record<string, string>
) {
const originalMeta = oldDoc.getMap('space:meta');
const originalVersions = originalMeta.get('versions') as YMap<number>;
const originalPages = originalMeta.get('pages') as YArray<YMap<unknown>>;
const meta = newDoc.getMap('meta');
const pages = new YArray();
const blockVersions = originalVersions.clone();
meta.set('workspaceVersion', 1);
meta.set('blockVersions', blockVersions);
meta.set('pages', pages);
meta.set('name', originalMeta.get('name') as string);
updateBlockVersions(blockVersions);
const mapList = originalPages.map(page => {
const map = new YMap();
Array.from(page.entries())
.filter(([key]) => key !== 'subpageIds')
.forEach(([key, value]) => {
if (key === 'id') {
idMap[value] = nanoid();
map.set(key, idMap[value]);
} else {
map.set(key, value);
}
});
return map;
});
pages.push(mapList);
}
function migrateBlocks(
oldDoc: YDoc,
newDoc: YDoc,
idMap: Record<string, string>
) {
const spaces = newDoc.getMap('spaces');
const originalMeta = oldDoc.getMap('space:meta');
const originalVersions = originalMeta.get('versions') as YMap<number>;
const originalPages = originalMeta.get('pages') as YArray<YMap<unknown>>;
originalPages.forEach(page => {
const id = page.get('id') as string;
const newId = idMap[id];
const spaceId = id.startsWith('space:') ? id : `space:${id}`;
const originalBlocks = oldDoc.getMap(spaceId) as YMap<unknown>;
const subdoc = new YDoc();
spaces.set(newId, subdoc);
subdoc.guid = id;
const blocks = subdoc.getMap('blocks');
Array.from(originalBlocks.entries()).forEach(([key, value]) => {
// @ts-expect-error clone method exists
const blockData = value.clone();
blocks.set(key, blockData);
const flavour = blockData.get('sys:flavour') as string;
const version = originalVersions.get(flavour);
if (version !== undefined) {
runBlockMigration(flavour, blockData, version);
}
});
});
}
export function migrateToSubdoc(oldDoc: YDoc): YDoc {
const needMigration =
Array.from(oldDoc.getMap('space:meta').keys()).length > 0;
if (!needMigration) {
return oldDoc;
}
const newDoc = new YDoc();
const idMap = {} as Record<string, string>;
migrateMeta(oldDoc, newDoc, idMap);
migrateBlocks(oldDoc, newDoc, idMap);
return newDoc;
}
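// A minimal usage sketch, assuming `legacyDoc` (hypothetical) is a pre-subdoc
// root doc:
//
//   const migrated = migrateToSubdoc(legacyDoc);
//   if (migrated !== legacyDoc) {
//     // a new root doc was produced; every page now lives in its own subdoc
//     console.log(migrated.getMap('spaces').size);
//   }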
export type UpgradeOptions = {
getCurrentRootDoc: () => Promise<YDoc>;
createWorkspace: () => Promise<Workspace>;
getSchema: () => Schema;
};
const upgradeV1ToV2 = async (options: UpgradeOptions) => {
const oldDoc = await options.getCurrentRootDoc();
const newDoc = migrateToSubdoc(oldDoc);
const newWorkspace = await options.createWorkspace();
applyUpdate(newWorkspace.doc, encodeStateAsUpdate(newDoc), migrationOrigin);
newDoc.getSubdocs().forEach(subdoc => {
newWorkspace.doc.getSubdocs().forEach(newDoc => {
if (subdoc.guid === newDoc.guid) {
applyUpdate(newDoc, encodeStateAsUpdate(subdoc), migrationOrigin);
}
});
});
return newWorkspace;
};
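// A usage sketch for the v1 -> v2 upgrade; the three callbacks are supplied by
// the caller (helper names hypothetical):
//
//   const workspace = await upgradeV1ToV2({
//     getCurrentRootDoc: () => loadRootDoc(),
//     createWorkspace: () => createEmptyWorkspace(),
//     getSchema: () => blocksuiteSchema,
//   });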
export * from './initialization';
export * from './migration/blob';
export { migratePages as forceUpgradePages } from './migration/blocksuite'; // compatible with electron
export * from './migration/fixing';
export { migrateToSubdoc } from './migration/subdoc';
export * from './migration/workspace';
/**
* Force upgrade the block schema to the latest version.
* Do not force-upgrade pages without running this check.
*
* Please note that this function will not upgrade the workspace version.
*
* @returns true if any schema was upgraded.
* @returns false if no schema was upgraded.
* @deprecated
* Use the workspace metadata to determine the workspace version.
*/
export async function forceUpgradePages(
options: Omit<UpgradeOptions, 'createWorkspace'>
): Promise<boolean> {
const rootDoc = await options.getCurrentRootDoc();
guidCompatibilityFix(rootDoc);
const spaces = rootDoc.getMap('spaces') as YMap<any>;
const meta = rootDoc.getMap('meta') as YMap<unknown>;
const versions = meta.get('blockVersions') as YMap<number>;
const schema = options.getSchema();
const oldVersions = versions?.toJSON() ?? {};
spaces.forEach((space: Doc) => {
try {
schema.upgradePage(0, oldVersions, space);
} catch (e) {
console.error(`page ${space.guid} upgrade failed`, e);
}
});
const newVersions = getLatestVersions(schema);
meta.set('blockVersions', new YMap(Object.entries(newVersions)));
return Object.entries(oldVersions).some(
([flavour, version]) => newVersions[flavour] !== version
);
}
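// A usage sketch (callback names hypothetical):
//
//   const upgraded = await forceUpgradePages({
//     getCurrentRootDoc: () => loadRootDoc(),
//     getSchema: () => blocksuiteSchema,
//   });
//   console.info(upgraded ? 'block schema upgraded' : 'already up to date');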
// database from 2 to 3
async function upgradeV2ToV3(options: UpgradeOptions): Promise<boolean> {
const rootDoc = await options.getCurrentRootDoc();
const spaces = rootDoc.getMap('spaces') as YMap<any>;
const meta = rootDoc.getMap('meta') as YMap<unknown>;
const versions = meta.get('blockVersions') as YMap<number>;
const schema = options.getSchema();
guidCompatibilityFix(rootDoc);
spaces.forEach((space: Doc) => {
schema.upgradePage(
0,
{
'affine:note': 1,
'affine:bookmark': 1,
'affine:database': 2,
'affine:divider': 1,
'affine:image': 1,
'affine:list': 1,
'affine:code': 1,
'affine:page': 2,
'affine:paragraph': 1,
'affine:surface': 3,
},
space
);
});
// `in` does not work on a YMap; use `has` to check for the key
if (versions.has('affine:database')) {
meta.set(
'blockVersions',
new YMap(Object.entries(getLatestVersions(schema)))
);
} else {
Object.entries(getLatestVersions(schema)).forEach(([flavour, version]) =>
versions.set(flavour, version)
);
}
return true;
}
// patch the root doc's space guid compatibility issue
//
// Since version 0.10, page ids in `spaces` no longer carry the "space:" prefix.
// The data flow for fetching a doc's updates is:
// - page id in `meta.pages` -> find `${page-id}` in `doc.spaces` -> `doc` -> `doc.guid`
// - if the `doc` is not found in `doc.spaces`, a new doc is created and its `doc.guid` is the same as its page id
// - because of the guid logic change, a doc previously prefixed with "space:" will not be found in `doc.spaces`
// - when fetching the rows of such a doc by doc id === page id,
//   the result is empty since there are no updates associated with the page id
export function guidCompatibilityFix(rootDoc: YDoc) {
let changed = false;
transact(rootDoc, () => {
const meta = rootDoc.getMap('meta') as YMap<unknown>;
const pages = meta.get('pages') as YArray<YMap<unknown>>;
pages?.forEach(page => {
const pageId = page.get('id') as string | undefined;
if (pageId?.includes(':')) {
// remove the prefix "space:" from page id
page.set('id', pageId.split(':').at(-1));
}
});
const spaces = rootDoc.getMap('spaces') as YMap<YDoc>;
spaces?.forEach((doc: YDoc, pageId: string) => {
if (pageId.includes(':')) {
const newPageId = pageId.split(':').at(-1) ?? pageId;
const newDoc = new YDoc();
// clone the original doc; yjs does not allow reusing the same doc instance
applyUpdate(newDoc, encodeStateAsUpdate(doc));
newDoc.guid = doc.guid;
spaces.set(newPageId, newDoc);
// remove the old doc, otherwise this fix would run again on the next pass
spaces.delete(pageId);
changed = true;
console.debug(
`fixed space id ${pageId} -> ${newPageId}, doc id: ${doc.guid}`
);
}
});
});
return changed;
}
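// A usage sketch (the `workspace` handle is hypothetical):
//
//   const changed = guidCompatibilityFix(workspace.doc);
//   if (changed) console.debug('legacy "space:"-prefixed ids were rewritten');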
export enum WorkspaceVersion {
// v1 is treated as undefined
SubDoc = 2,
DatabaseV3 = 3,
Surface = 4,
}
/**
* Returns false if no migration is needed.
* Returns true if the migration was performed in place.
* Returns a Workspace if a new workspace was created;
* the old workspace should then be deleted.
*/
export async function migrateWorkspace(
currentVersion: WorkspaceVersion | undefined,
options: UpgradeOptions
): Promise<Workspace | boolean> {
if (currentVersion === undefined) {
const workspace = await upgradeV1ToV2(options);
await upgradeV2ToV3({
...options,
getCurrentRootDoc: () => Promise.resolve(workspace.doc),
});
return workspace;
}
if (currentVersion === WorkspaceVersion.SubDoc) {
return upgradeV2ToV3(options);
} else if (currentVersion === WorkspaceVersion.DatabaseV3) {
// surface from 3 to 5
return forceUpgradePages(options);
} else {
return false;
}
}
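// A dispatch sketch: read the stored version, run the matching upgrade, and
// detect whether a fresh Workspace was returned (`readStoredVersion` is
// hypothetical):
//
//   const version = await readStoredVersion(); // WorkspaceVersion | undefined
//   const result = await migrateWorkspace(version, options);
//   if (typeof result !== 'boolean') {
//     // a new Workspace was created; the old one should be deleted
//   }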
export async function migrateLocalBlobStorage(from: string, to: string) {
const fromStorage = createIndexeddbStorage(from);
const toStorage = createIndexeddbStorage(to);
const keys = await fromStorage.crud.list();
for (const key of keys) {
const value = await fromStorage.crud.get(key);
if (!value) {
console.warn('cannot find blob:', key);
continue;
}
await toStorage.crud.set(key, value);
}
}
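// A usage sketch: copy the blobs over when a workspace changes id, e.g. when a
// local workspace is upgraded to a cloud one (ids hypothetical):
//
//   await migrateLocalBlobStorage('old-workspace-id', 'new-workspace-id');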

View File

@@ -0,0 +1,294 @@
import { assertExists } from '@blocksuite/global/utils';
import type { Page, PageMeta, Workspace } from '@blocksuite/store';
import type { createStore, WritableAtom } from 'jotai/vanilla';
import { nanoid } from 'nanoid';
import { checkWorkspaceCompatibility, MigrationPoint } from '..';
import { migratePages } from '../migration/blocksuite';
export async function initEmptyPage(page: Page, title?: string) {
await page.load(() => {
const pageBlockId = page.addBlock('affine:page', {
title: new page.Text(title ?? ''),
});
page.addBlock('affine:surface', {}, pageBlockId);
const noteBlockId = page.addBlock('affine:note', {}, pageBlockId);
page.addBlock('affine:paragraph', {}, noteBlockId);
});
}
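// A usage sketch, assuming the workspace exposes `createPage` and using a
// hypothetical id:
//
//   const page = workspace.createPage({ id: nanoid() });
//   await initEmptyPage(page, 'Untitled');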
/**
* FIXME: Use exported JSON data instead of building the data by hand.
*/
export async function buildShowcaseWorkspace(
workspace: Workspace,
options: {
atoms: {
pageMode: WritableAtom<
undefined,
[pageId: string, mode: 'page' | 'edgeless'],
void
>;
};
store: ReturnType<typeof createStore>;
}
) {
const prototypes = {
tags: {
options: [
{
id: 'icg1n5UdkP',
value: 'Travel',
color: 'var(--affine-tag-gray)',
},
{
id: 'Oe5dSe1DDJ',
value: 'Quick summary',
color: 'var(--affine-tag-green)',
},
{
id: 'g1L5dXKctL',
value: 'OKR',
color: 'var(--affine-tag-purple)',
},
{
id: 'q3mceOl_zi',
value: 'Streamline your workflow',
color: 'var(--affine-tag-teal)',
},
{
id: 'ze07JVwBu4',
value: 'Plan',
color: 'var(--affine-tag-teal)',
},
{
id: '8qcYPCTK0h',
value: 'Review',
color: 'var(--affine-tag-orange)',
},
{
id: 'wg-fBtd2eI',
value: 'Engage',
color: 'var(--affine-tag-pink)',
},
{
id: 'QYFD_HeQc-',
value: 'Create',
color: 'var(--affine-tag-blue)',
},
{
id: 'ZHBa2NtdSo',
value: 'Learn',
color: 'var(--affine-tag-yellow)',
},
],
},
};
workspace.meta.setProperties(prototypes);
const edgelessPage1 = nanoid();
const edgelessPage2 = nanoid();
const edgelessPage3 = nanoid();
const { store, atoms } = options;
[edgelessPage1, edgelessPage2, edgelessPage3].forEach(pageId => {
store.set(atoms.pageMode, pageId, 'edgeless');
});
const pageMetas = {
'9f6f3c04-cf32-470c-9648-479dc838f10e': {
createDate: 1691548231530,
tags: ['ZHBa2NtdSo', 'QYFD_HeQc-', 'wg-fBtd2eI'],
updatedDate: 1691676331623,
favorite: true,
jumpOnce: true,
},
'0773e198-5de0-45d4-a35e-de22ea72b96b': {
createDate: 1691548220794,
tags: [],
updatedDate: 1691676775642,
favorite: false,
},
'59b140eb-4449-488f-9eeb-42412dcc044e': {
createDate: 1691551731225,
tags: [],
updatedDate: 1691654611175,
favorite: false,
},
'7217fbe2-61db-4a91-93c6-ad5c800e5a43': {
createDate: 1691552082822,
tags: [],
updatedDate: 1691654606912,
favorite: false,
},
'6eb43ea8-8c11-456d-bb1d-5193937961ab': {
createDate: 1691552090989,
tags: [],
updatedDate: 1691646748171,
favorite: false,
},
'3ddc8a4f-62c7-4fd4-8064-9ed9f61e437a': {
createDate: 1691564303138,
tags: [],
updatedDate: 1691646845195,
},
'512b1cb3-d22d-4b20-a7aa-58e2afcb1238': {
createDate: 1691574743531,
tags: ['icg1n5UdkP'],
updatedDate: 1691647117761,
},
'22163830-8252-43fe-b62d-fd9bbeaa4caa': {
createDate: 1691574859042,
tags: [],
updatedDate: 1691648159371,
},
'b7a9e1bc-e205-44aa-8dad-7e328269d00b': {
createDate: 1691575011078,
tags: ['8qcYPCTK0h'],
updatedDate: 1691645074511,
favorite: false,
},
'646305d9-93e0-48df-bb92-d82944ceb5a3': {
createDate: 1691634722239,
tags: ['ze07JVwBu4'],
updatedDate: 1691647069662,
favorite: false,
},
'0350509d-8702-4797-b4d7-168f5e9359c7': {
createDate: 1691635388447,
tags: ['Oe5dSe1DDJ'],
updatedDate: 1691645873930,
},
'aa02af3c-5c5c-4856-b7ce-947ad17331f3': {
createDate: 1691636192263,
tags: ['q3mceOl_zi', 'g1L5dXKctL'],
updatedDate: 1691645102104,
},
'9d6e716e-a071-45a2-88ac-2f2f6eec0109': {
createDate: 1691574743531,
tags: ['icg1n5UdkP'],
updatedDate: 1691574743531,
},
} satisfies Record<string, Partial<PageMeta>>;
const data = [
[
'9f6f3c04-cf32-470c-9648-479dc838f10e',
import('@affine/templates/v1/getting-started.json'),
nanoid(),
],
[
'0773e198-5de0-45d4-a35e-de22ea72b96b',
import('@affine/templates/v1/preloading.json'),
edgelessPage1,
],
[
'59b140eb-4449-488f-9eeb-42412dcc044e',
import('@affine/templates/v1/template-galleries.json'),
nanoid(),
],
[
'7217fbe2-61db-4a91-93c6-ad5c800e5a43',
import('@affine/templates/v1/personal-home.json'),
nanoid(),
],
[
'6eb43ea8-8c11-456d-bb1d-5193937961ab',
import('@affine/templates/v1/working-home.json'),
nanoid(),
],
[
'3ddc8a4f-62c7-4fd4-8064-9ed9f61e437a',
import('@affine/templates/v1/personal-project-management.json'),
nanoid(),
],
[
'512b1cb3-d22d-4b20-a7aa-58e2afcb1238',
import('@affine/templates/v1/travel-plan.json'),
edgelessPage2,
],
[
'22163830-8252-43fe-b62d-fd9bbeaa4caa',
import('@affine/templates/v1/personal-knowledge-management.json'),
nanoid(),
],
[
'b7a9e1bc-e205-44aa-8dad-7e328269d00b',
import('@affine/templates/v1/annual-performance-review.json'),
nanoid(),
],
[
'646305d9-93e0-48df-bb92-d82944ceb5a3',
import('@affine/templates/v1/brief-event-planning.json'),
nanoid(),
],
[
'0350509d-8702-4797-b4d7-168f5e9359c7',
import('@affine/templates/v1/meeting-summary.json'),
nanoid(),
],
[
'aa02af3c-5c5c-4856-b7ce-947ad17331f3',
import('@affine/templates/v1/okr-template.json'),
nanoid(),
],
[
'9d6e716e-a071-45a2-88ac-2f2f6eec0109',
import('@affine/templates/v1/travel-note.json'),
edgelessPage3,
],
] as const;
// `data` holds plain tuples, so nothing needs to be awaited to build the map
const idMap = data.reduce<Record<string, string>>((record, [oldId, , newId]) => {
record[oldId] = newId;
return record;
}, {});
// Import pages one by one to avoid a race condition on the workspace meta.
for (const [id, promise, newId] of data) {
const { default: template } = await promise;
let json = JSON.stringify(template);
// `mappedId` avoids shadowing the `newId` of the enclosing loop
Object.entries(idMap).forEach(([oldId, mappedId]) => {
json = json.replaceAll(oldId, mappedId);
});
const snapshot = JSON.parse(json);
await workspace
.importPageSnapshot(structuredClone(snapshot), newId)
.catch(error => {
console.error('error importing page', id, error);
});
const page = workspace.getPage(newId);
assertExists(page);
await page.load();
workspace.schema.upgradePage(
0,
{
'affine:note': 1,
'affine:bookmark': 1,
'affine:database': 2,
'affine:divider': 1,
'affine:image': 1,
'affine:list': 1,
'affine:code': 1,
'affine:page': 2,
'affine:paragraph': 1,
'affine:surface': 3,
},
page.spaceDoc
);
}
// Building the showcase creates multiple pages at once, which may skip the version writing.
// https://github.com/toeverything/blocksuite/blob/master/packages/store/src/workspace/page.ts#L662
const compatibilityResult = checkWorkspaceCompatibility(workspace);
if (compatibilityResult === MigrationPoint.BlockVersion) {
await migratePages(workspace.doc, workspace.schema);
}
Object.entries(pageMetas).forEach(([oldId, meta]) => {
const newId = idMap[oldId];
workspace.setPageMeta(newId, meta);
});
}

View File

@@ -0,0 +1,15 @@
import { createIndexeddbStorage } from '@blocksuite/store';
export async function migrateLocalBlobStorage(from: string, to: string) {
const fromStorage = createIndexeddbStorage(from);
const toStorage = createIndexeddbStorage(to);
const keys = await fromStorage.crud.list();
for (const key of keys) {
const value = await fromStorage.crud.get(key);
if (!value) {
console.warn('cannot find blob:', key);
continue;
}
await toStorage.crud.set(key, value);
}
}

View File

@@ -0,0 +1,41 @@
import type { Schema } from '@blocksuite/store';
import type { Doc as YDoc } from 'yjs';
import { Map as YMap } from 'yjs';
const getLatestVersions = (schema: Schema): Record<string, number> => {
return [...schema.flavourSchemaMap.entries()].reduce(
(record, [flavour, schema]) => {
record[flavour] = schema.version;
return record;
},
{} as Record<string, number>
);
};
export async function migratePages(
rootDoc: YDoc,
schema: Schema
): Promise<boolean> {
const spaces = rootDoc.getMap('spaces') as YMap<any>;
const meta = rootDoc.getMap('meta') as YMap<unknown>;
const versions = meta.get('blockVersions') as YMap<number>;
const oldVersions = versions?.toJSON() ?? {};
spaces.forEach((space: YDoc) => {
schema.upgradePage(0, oldVersions, space);
});
schema.upgradeWorkspace(rootDoc);
// Hard-code the page version upgrade to 2.
// Let the e2e tests ensure the data version is correct.
const pageVersion = meta.get('pageVersion');
if (typeof pageVersion !== 'number' || pageVersion < 2) {
meta.set('pageVersion', 2);
}
const newVersions = getLatestVersions(schema);
meta.set('blockVersions', new YMap(Object.entries(newVersions)));
return Object.entries(oldVersions).some(
([flavour, version]) => newVersions[flavour] !== version
);
}
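// A usage sketch (the `rootDoc` and `schema` handles are hypothetical):
//
//   const changed = await migratePages(rootDoc, schema);
//   if (changed) console.info('some block versions were upgraded');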

View File

@@ -0,0 +1,67 @@
import type { Array as YArray, Map as YMap } from 'yjs';
import { Doc as YDoc, transact } from 'yjs';
import { applyUpdate, encodeStateAsUpdate } from 'yjs';
// patch the root doc's space guid compatibility issue
//
// Since version 0.10, page ids in `spaces` no longer carry the "space:" prefix.
// The data flow for fetching a doc's updates is:
// - page id in `meta.pages` -> find `${page-id}` in `doc.spaces` -> `doc` -> `doc.guid`
// - if the `doc` is not found in `doc.spaces`, a new doc is created and its `doc.guid` is the same as its page id
// - because of the guid logic change, a doc previously prefixed with "space:" will not be found in `doc.spaces`
// - when fetching the rows of such a doc by doc id === page id,
//   the result is empty since there are no updates associated with the page id
export function guidCompatibilityFix(rootDoc: YDoc) {
let changed = false;
transact(rootDoc, () => {
const meta = rootDoc.getMap('meta') as YMap<unknown>;
const pages = meta.get('pages') as YArray<YMap<unknown>>;
pages?.forEach(page => {
const pageId = page.get('id') as string | undefined;
if (pageId?.includes(':')) {
// remove the prefix "space:" from page id
page.set('id', pageId.split(':').at(-1));
}
});
const spaces = rootDoc.getMap('spaces') as YMap<YDoc>;
spaces?.forEach((doc: YDoc, pageId: string) => {
if (pageId.includes(':')) {
const newPageId = pageId.split(':').at(-1) ?? pageId;
const newDoc = new YDoc();
// clone the original doc; yjs does not allow reusing the same doc instance
applyUpdate(newDoc, encodeStateAsUpdate(doc));
newDoc.guid = doc.guid;
spaces.set(newPageId, newDoc);
// remove the old doc, otherwise this fix would run again on the next pass
spaces.delete(pageId);
changed = true;
console.debug(
`fixed space id ${pageId} -> ${newPageId}, doc id: ${doc.guid}`
);
}
});
});
return changed;
}
/**
* Hard-code the workspace version fix to stay compatible with legacy data.
* Let the e2e tests ensure the data version is correct.
*/
export function fixWorkspaceVersion(rootDoc: YDoc) {
const meta = rootDoc.getMap('meta') as YMap<unknown>;
/**
* Upgrading the workspace version from 1 (or undefined) to 2 is harmless:
* Blocksuite just sets the value and does nothing else.
*/
const workspaceVersion = meta.get('workspaceVersion');
if (typeof workspaceVersion !== 'number' || workspaceVersion < 2) {
meta.set('workspaceVersion', 2);
const pageVersion = meta.get('pageVersion');
if (typeof pageVersion !== 'number') {
meta.set('pageVersion', 1);
}
}
}
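// A usage sketch: run the fix before opening a possibly-legacy workspace
// (the `rootDoc` handle is hypothetical):
//
//   fixWorkspaceVersion(rootDoc);
//   // meta.workspaceVersion is now >= 2 and meta.pageVersion is set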

View File

@@ -0,0 +1,282 @@
import type { Workspace } from '@blocksuite/store';
import { nanoid } from 'nanoid';
import { Array as YArray, Doc as YDoc, Map as YMap } from 'yjs';
import { applyUpdate, encodeStateAsUpdate } from 'yjs';
const migrationOrigin = 'affine-migration';
type XYWH = [number, number, number, number];
function deserializeXYWH(xywh: string): XYWH {
return JSON.parse(xywh) as XYWH;
}
function migrateDatabase(data: YMap<unknown>) {
data.delete('prop:mode');
data.set('prop:views', new YArray());
const columns = (data.get('prop:columns') as YArray<unknown>).toJSON() as {
id: string;
name: string;
hide: boolean;
type: string;
width: number;
selection?: unknown[];
}[];
const views = [
{
id: 'default',
name: 'Table',
columns: columns.map(col => ({
id: col.id,
width: col.width,
hide: col.hide,
})),
filter: { type: 'group', op: 'and', conditions: [] },
mode: 'table',
},
];
const cells = (data.get('prop:cells') as YMap<unknown>).toJSON() as Record<
string,
Record<
string,
{
id: string;
value: unknown;
}
>
>;
const convertColumn = (
id: string,
update: (cell: { id: string; value: unknown }) => void
) => {
Object.values(cells).forEach(row => {
if (row[id] != null) {
update(row[id]);
}
});
};
const newColumns = columns.map(v => {
let data: Record<string, unknown> = {};
if (v.type === 'select' || v.type === 'multi-select') {
data = { options: v.selection };
if (v.type === 'select') {
convertColumn(v.id, cell => {
if (Array.isArray(cell.value)) {
cell.value = cell.value[0]?.id;
}
});
} else {
convertColumn(v.id, cell => {
if (Array.isArray(cell.value)) {
cell.value = cell.value.map(v => v.id);
}
});
}
}
if (v.type === 'number') {
convertColumn(v.id, cell => {
if (typeof cell.value === 'string') {
cell.value = Number.parseFloat(cell.value.toString());
}
});
}
return {
id: v.id,
type: v.type,
name: v.name,
data,
};
});
data.set('prop:columns', newColumns);
data.set('prop:views', views);
data.set('prop:cells', cells);
}
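// A worked sketch of the database migration above, with a hypothetical select
// column: a legacy `prop:mode`/`prop:columns` pair
//
//   { 'prop:mode': 'table',
//     'prop:columns': [{ id: 'c1', name: 'Status', type: 'select',
//                        width: 120, hide: false, selection: [...] }] }
//
// becomes a views + columns pair
//
//   { 'prop:views': [{ id: 'default', name: 'Table', mode: 'table', ... }],
//     'prop:columns': [{ id: 'c1', type: 'select', name: 'Status',
//                        data: { options: [...] } }] }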
function runBlockMigration(
flavour: string,
data: YMap<unknown>,
version: number
) {
if (flavour === 'affine:frame') {
data.set('sys:flavour', 'affine:note');
return;
}
if (flavour === 'affine:surface' && version <= 3) {
if (data.has('elements')) {
const elements = data.get('elements') as YMap<unknown>;
migrateSurface(elements);
data.set('prop:elements', elements.clone());
data.delete('elements');
} else {
data.set('prop:elements', new YMap());
}
}
if (flavour === 'affine:embed') {
data.set('sys:flavour', 'affine:image');
data.delete('prop:type');
}
if (flavour === 'affine:database' && version < 2) {
migrateDatabase(data);
}
}
function migrateSurface(data: YMap<unknown>) {
for (const [, value] of <IterableIterator<[string, YMap<unknown>]>>(
data.entries()
)) {
if (value.get('type') === 'connector') {
migrateSurfaceConnector(value);
}
}
}
function migrateSurfaceConnector(data: YMap<any>) {
let id = data.get('startElement')?.id;
const controllers = data.get('controllers');
const length = controllers.length;
const xywh = deserializeXYWH(data.get('xywh'));
if (id) {
data.set('source', { id });
} else {
data.set('source', {
position: [controllers[0].x + xywh[0], controllers[0].y + xywh[1]],
});
}
id = data.get('endElement')?.id;
if (id) {
data.set('target', { id });
} else {
data.set('target', {
position: [
controllers[length - 1].x + xywh[0],
controllers[length - 1].y + xywh[1],
],
});
}
const width = data.get('lineWidth') ?? 4;
data.set('strokeWidth', width);
const color = data.get('color');
data.set('stroke', color);
data.delete('startElement');
data.delete('endElement');
data.delete('controllers');
data.delete('lineWidth');
data.delete('color');
data.delete('xywh');
}
function updateBlockVersions(versions: YMap<number>) {
const frameVersion = versions.get('affine:frame');
if (frameVersion !== undefined) {
versions.set('affine:note', frameVersion);
versions.delete('affine:frame');
}
const embedVersion = versions.get('affine:embed');
if (embedVersion !== undefined) {
versions.set('affine:image', embedVersion);
versions.delete('affine:embed');
}
const databaseVersion = versions.get('affine:database');
if (databaseVersion !== undefined && databaseVersion < 2) {
versions.set('affine:database', 2);
}
}
function migrateMeta(
oldDoc: YDoc,
newDoc: YDoc,
idMap: Record<string, string>
) {
const originalMeta = oldDoc.getMap('space:meta');
const originalVersions = originalMeta.get('versions') as YMap<number>;
const originalPages = originalMeta.get('pages') as YArray<YMap<string>>;
const meta = newDoc.getMap('meta');
const pages = new YArray();
const blockVersions = originalVersions.clone();
meta.set('workspaceVersion', 1);
meta.set('blockVersions', blockVersions);
meta.set('pages', pages);
meta.set('name', originalMeta.get('name') as string);
updateBlockVersions(blockVersions);
const mapList = originalPages.map(page => {
const map = new YMap();
Array.from(page.entries())
.filter(([key]) => key !== 'subpageIds')
.forEach(([key, value]) => {
if (key === 'id') {
idMap[value] = nanoid();
map.set(key, idMap[value]);
} else {
map.set(key, value);
}
});
return map;
});
pages.push(mapList);
}
function migrateBlocks(
oldDoc: YDoc,
newDoc: YDoc,
idMap: Record<string, string>
) {
const spaces = newDoc.getMap('spaces');
const originalMeta = oldDoc.getMap('space:meta');
const originalVersions = originalMeta.get('versions') as YMap<number>;
const originalPages = originalMeta.get('pages') as YArray<YMap<unknown>>;
originalPages.forEach(page => {
const id = page.get('id') as string;
const newId = idMap[id];
const spaceId = id.startsWith('space:') ? id : `space:${id}`;
const originalBlocks = oldDoc.getMap(spaceId) as YMap<unknown>;
const subdoc = new YDoc();
spaces.set(newId, subdoc);
subdoc.guid = id;
const blocks = subdoc.getMap('blocks');
Array.from(originalBlocks.entries()).forEach(([key, value]) => {
// @ts-expect-error clone method exists
const blockData = value.clone();
blocks.set(key, blockData);
const flavour = blockData.get('sys:flavour') as string;
const version = originalVersions.get(flavour);
if (version !== undefined) {
runBlockMigration(flavour, blockData, version);
}
});
});
}
export function migrateToSubdoc(oldDoc: YDoc): YDoc {
const needMigration =
Array.from(oldDoc.getMap('space:meta').keys()).length > 0;
if (!needMigration) {
return oldDoc;
}
const newDoc = new YDoc();
const idMap = {} as Record<string, string>;
migrateMeta(oldDoc, newDoc, idMap);
migrateBlocks(oldDoc, newDoc, idMap);
return newDoc;
}
export const upgradeV1ToV2 = async (
oldDoc: YDoc,
createWorkspace: () => Promise<Workspace>
) => {
const newDoc = migrateToSubdoc(oldDoc);
const newWorkspace = await createWorkspace();
applyUpdate(newWorkspace.doc, encodeStateAsUpdate(newDoc), migrationOrigin);
newDoc.getSubdocs().forEach(subdoc => {
newWorkspace.doc.getSubdocs().forEach(newDoc => {
if (subdoc.guid === newDoc.guid) {
applyUpdate(newDoc, encodeStateAsUpdate(subdoc), migrationOrigin);
}
});
});
return newWorkspace;
};

View File

@@ -0,0 +1,87 @@
import type { Workspace } from '@blocksuite/store';
import type { Schema } from '@blocksuite/store';
import type { Doc as YDoc } from 'yjs';
import { migratePages } from './blocksuite';
import { upgradeV1ToV2 } from './subdoc';
interface MigrationOptions {
doc: YDoc;
schema: Schema;
createWorkspace: () => Promise<Workspace>;
}
function createMigrationQueue(options: MigrationOptions) {
return [
async (doc: YDoc) => {
const newWorkspace = await upgradeV1ToV2(doc, options.createWorkspace);
return newWorkspace.doc;
},
async (doc: YDoc) => {
await migratePages(doc, options.schema);
return doc;
},
];
}
/**
* Entry points for splitting the migration queue: migration starts from the
* given point and runs the remaining steps.
*/
export enum MigrationPoint {
SubDoc = 1,
BlockVersion = 2,
}
export async function migrateWorkspace(
point: MigrationPoint,
options: MigrationOptions
) {
const migrationQueue = createMigrationQueue(options);
const migrationFns = migrationQueue.slice(point - 1);
let doc = options.doc;
for (const migrate of migrationFns) {
doc = await migrate(doc);
}
return doc;
}
export function checkWorkspaceCompatibility(
workspace: Workspace
): MigrationPoint | null {
const workspaceDocJSON = workspace.doc.toJSON();
const spaceMetaObj = workspaceDocJSON['space:meta'];
const docKeys = Object.keys(workspaceDocJSON);
const haveSpaceMeta = !!spaceMetaObj && Object.keys(spaceMetaObj).length > 0;
const haveLegacySpace = docKeys.some(key => key.startsWith('space:'));
if (haveSpaceMeta || haveLegacySpace) {
return MigrationPoint.SubDoc;
}
const hasVersion = workspace.meta.hasVersion;
if (!hasVersion) {
return MigrationPoint.BlockVersion;
}
// TODO: Catch compatibility errors from blocksuite to show the upgrade page.
// Temporarily follow blocksuite's check logic.
if ((workspace.meta.pages?.length ?? 0) <= 1) {
try {
workspace.meta.validateVersion(workspace);
} catch (e) {
console.info('validateVersion error', e);
return MigrationPoint.BlockVersion;
}
}
// From v2 on, we depend on blocksuite to check and migrate the data.
const blockVersions = workspace.meta.blockVersions;
for (const [flavour, version] of Object.entries(blockVersions ?? {})) {
const schema = workspace.schema.flavourSchemaMap.get(flavour);
if (schema?.version !== version) {
return MigrationPoint.BlockVersion;
}
}
return null;
}
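// A sketch of the full check-then-migrate flow (the `workspace` handle and
// `createEmptyWorkspace` helper are hypothetical):
//
//   const point = checkWorkspaceCompatibility(workspace);
//   if (point !== null) {
//     await migrateWorkspace(point, {
//       doc: workspace.doc,
//       schema: workspace.schema,
//       createWorkspace: () => createEmptyWorkspace(),
//     });
//   }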

View File

@@ -200,6 +200,7 @@ export type WorkspaceHandlers = {
list: () => Promise<[workspaceId: string, meta: WorkspaceMeta][]>;
delete: (id: string) => Promise<void>;
getMeta: (id: string) => Promise<WorkspaceMeta>;
clone: (id: string, newId: string) => Promise<void>;
};
export type UnwrapManagerHandlerToServerSide<

View File

@@ -1,6 +1,6 @@
{
"name": "@affine/sdk",
"version": "0.10.2",
"version": "0.10.3",
"type": "module",
"scripts": {
"build": "vite build",
@@ -22,12 +22,12 @@
"dist"
],
"dependencies": {
"@blocksuite/block-std": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/blocks": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/editor": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/global": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/store": "0.0.0-20231116023037-31273bb7-nightly",
"jotai": "^2.4.3",
"@blocksuite/block-std": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/blocks": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/editor": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/global": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/store": "0.0.0-20231122113751-6bf81eb3-nightly",
"jotai": "^2.5.1",
"zod": "^3.22.4"
},
"devDependencies": {

View File

@@ -35,4 +35,4 @@ downloadBinary(yDoc.guid).then(blob => {
## LICENSE
[MIT](https://github.com/toeverything/AFFiNE/blob/master/LICENSE-MIT)
[MIT](https://github.com/toeverything/AFFiNE/blob/canary/LICENSE-MIT)

View File

@@ -1,7 +1,7 @@
{
"name": "@toeverything/y-indexeddb",
"type": "module",
"version": "0.10.2",
"version": "0.10.3",
"description": "IndexedDB database adapter for Yjs",
"repository": "toeverything/AFFiNE",
"author": "toeverything",
@@ -33,18 +33,18 @@
},
"dependencies": {
"idb": "^7.1.1",
"nanoid": "^5.0.1",
"nanoid": "^5.0.3",
"y-provider": "workspace:*"
},
"devDependencies": {
"@blocksuite/blocks": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/store": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/blocks": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/store": "0.0.0-20231122113751-6bf81eb3-nightly",
"fake-indexeddb": "^5.0.0",
"vite": "^4.4.11",
"vite-plugin-dts": "3.6.0",
"vitest": "0.34.6",
"y-indexeddb": "^9.0.11",
"yjs": "^13.6.8"
"yjs": "^13.6.10"
},
"peerDependencies": {
"yjs": "^13"

View File

@@ -1,7 +1,7 @@
{
"name": "y-provider",
"type": "module",
"version": "0.10.2",
"version": "0.10.3",
"description": "Yjs provider protocol for multi document support",
"exports": {
".": "./src/index.ts"
@@ -24,11 +24,11 @@
"build": "vite build"
},
"devDependencies": {
"@blocksuite/store": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/store": "0.0.0-20231122113751-6bf81eb3-nightly",
"vite": "^4.4.11",
"vite-plugin-dts": "3.6.0",
"vitest": "0.34.6",
"yjs": "^13.6.8"
"yjs": "^13.6.10"
},
"peerDependencies": {
"yjs": "^13"

View File

@@ -12,7 +12,6 @@
"@blocksuite/editor": "*",
"@blocksuite/global": "*",
"@blocksuite/icons": "2.1.34",
"@blocksuite/lit": "*",
"@blocksuite/store": "*"
},
"dependencies": {
@@ -22,14 +21,11 @@
"@affine/workspace": "workspace:*",
"@dnd-kit/core": "^6.0.8",
"@dnd-kit/modifiers": "^6.0.1",
"@dnd-kit/sortable": "^7.0.2",
"@dnd-kit/sortable": "^8.0.0",
"@emotion/cache": "^11.11.0",
"@emotion/react": "^11.11.1",
"@emotion/server": "^11.11.0",
"@emotion/styled": "^11.11.0",
"@mui/base": "5.0.0-beta.19",
"@mui/icons-material": "^5.14.14",
"@mui/material": "^5.14.14",
"@popperjs/core": "^2.11.8",
"@radix-ui/react-avatar": "^1.0.4",
"@radix-ui/react-collapsible": "^1.0.3",
@@ -47,15 +43,14 @@
"clsx": "^2.0.0",
"dayjs": "^1.11.10",
"foxact": "^0.2.20",
"jotai": "^2.4.3",
"jotai-effect": "^0.2.2",
"jotai-scope": "^0.4.0",
"jotai": "^2.5.1",
"jotai-effect": "^0.2.3",
"jotai-scope": "^0.4.1",
"lit": "^3.0.2",
"lodash": "^4.17.21",
"lodash-es": "^4.17.21",
"lottie-react": "^2.4.0",
"lottie-web": "^5.12.2",
"nanoid": "^5.0.1",
"nanoid": "^5.0.3",
"next-themes": "^0.2.1",
"react": "18.2.0",
"react-datepicker": "^4.20.0",
@@ -69,12 +64,12 @@
"uuid": "^9.0.1"
},
"devDependencies": {
"@blocksuite/blocks": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/editor": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/global": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/icons": "2.1.35",
"@blocksuite/lit": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/store": "0.0.0-20231116023037-31273bb7-nightly",
"@blocksuite/blocks": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/editor": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/global": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/icons": "2.1.36",
"@blocksuite/lit": "0.0.0-20231122113751-6bf81eb3-nightly",
"@blocksuite/store": "0.0.0-20231122113751-6bf81eb3-nightly",
"@storybook/jest": "^0.2.3",
"@storybook/testing-library": "^0.2.2",
"@testing-library/react": "^14.0.0",
@@ -85,10 +80,10 @@
"@types/react-dom": "^18.2.13",
"@vanilla-extract/css": "^1.13.0",
"fake-indexeddb": "^5.0.0",
"typescript": "^5.2.2",
"typescript": "^5.3.2",
"vite": "^4.4.11",
"vitest": "0.34.6",
"yjs": "^13.6.8"
"yjs": "^13.6.10"
},
"version": "0.10.2"
"version": "0.10.3"
}

View File

@@ -1,47 +0,0 @@
import { Trans } from '@affine/i18n';
import { CloseIcon, Logo1Icon } from '@blocksuite/icons';
import {
downloadCloseButtonStyle,
downloadMessageStyle,
downloadTipContainerStyle,
downloadTipIconStyle,
downloadTipStyle,
linkStyle,
} from './index.css';
export const DownloadTips = ({ onClose }: { onClose: () => void }) => {
return (
<div
className={downloadTipContainerStyle}
data-testid="download-client-tip"
>
<div className={downloadTipStyle}>
<Logo1Icon className={downloadTipIconStyle} />
<div className={downloadMessageStyle}>
<Trans i18nKey="com.affine.banner.content">
This demo is limited.
<a
className={linkStyle}
href="https://affine.pro/download"
target="_blank"
rel="noreferrer"
>
Download the AFFiNE Client
</a>
for the latest features and Performance.
</Trans>
</div>
</div>
<div
className={downloadCloseButtonStyle}
onClick={onClose}
data-testid="download-client-tip-close-button"
>
<CloseIcon className={downloadTipIconStyle} />
</div>
</div>
);
};
export default DownloadTips;

View File

@@ -1,13 +1,4 @@
import { keyframes, style } from '@vanilla-extract/css';
const slideDown = keyframes({
'0%': {
height: '0px',
},
'100%': {
height: '44px',
},
});
import { style } from '@vanilla-extract/css';
export const browserWarningStyle = style({
backgroundColor: 'var(--affine-background-warning-color)',
@@ -36,52 +27,31 @@ export const closeIconStyle = style({
position: 'relative',
zIndex: 1,
});
export const downloadTipContainerStyle = style({
backgroundColor: 'var(--affine-primary-color)',
color: 'var(--affine-white)',
export const tipsContainer = style({
backgroundColor: 'var(--affine-background-error-color)',
color: 'var(--affine-error-color)',
width: '100%',
height: '44px',
fontSize: 'var(--affine-font-base)',
fontSize: 'var(--affine-font-sm)',
fontWeight: '700',
display: 'flex',
justifyContent: 'center',
justifyContent: 'space-between',
alignItems: 'center',
position: 'relative',
animation: `${slideDown} .3s ease-in-out forwards`,
padding: '12px 16px',
position: 'sticky',
gap: '16px',
containerType: 'inline-size',
});
export const downloadTipStyle = style({
export const tipsMessage = style({
color: 'var(--affine-error-color)',
flexGrow: 1,
flexShrink: 1,
});
export const tipsRightItem = style({
display: 'flex',
justifyContent: 'center',
flexShrink: 0,
justifyContent: 'space-between',
alignItems: 'center',
});
export const downloadTipIconStyle = style({
color: 'var(--affine-white)',
width: '24px',
height: '24px',
fontSize: '24px',
position: 'relative',
zIndex: 1,
});
export const downloadCloseButtonStyle = style({
color: 'var(--affine-white)',
cursor: 'pointer',
display: 'flex',
justifyContent: 'center',
alignItems: 'center',
position: 'absolute',
right: '24px',
});
export const downloadMessageStyle = style({
color: 'var(--affine-white)',
marginLeft: '8px',
});
export const linkStyle = style({
color: 'var(--affine-white)',
textDecoration: 'underline',
':hover': {
textDecoration: 'underline',
},
':visited': {
color: 'var(--affine-white)',
textDecoration: 'underline',
},
gap: '16px',
});

View File

@@ -1,2 +1,2 @@
export * from './browser-warning';
export * from './download-client';
export * from './local-demo-tips';

View File

@@ -0,0 +1,54 @@
import { CloseIcon } from '@blocksuite/icons';
import { Button, IconButton } from '@toeverything/components/button';
import { useCallback } from 'react';
import * as styles from './index.css';
type LocalDemoTipsProps = {
isLoggedIn: boolean;
onLogin: () => void;
onEnableCloud: () => void;
onClose: () => void;
};
export const LocalDemoTips = ({
onClose,
isLoggedIn,
onLogin,
onEnableCloud,
}: LocalDemoTipsProps) => {
const content = isLoggedIn
? 'This is a local demo workspace, and the data is stored locally. We recommend enabling AFFiNE Cloud.'
: 'This is a local demo workspace, and the data is stored locally in the browser. We recommend enabling AFFiNE Cloud or downloading the client for a better experience.';
const buttonLabel = isLoggedIn
? 'Enable AFFiNE Cloud'
: 'Sign in with AFFiNE Cloud';
const handleClick = useCallback(() => {
if (isLoggedIn) {
return onEnableCloud();
}
return onLogin();
}, [isLoggedIn, onEnableCloud, onLogin]);
return (
<div className={styles.tipsContainer} data-testid="local-demo-tips">
<div className={styles.tipsMessage}>{content}</div>
<div className={styles.tipsRightItem}>
<div>
<Button onClick={handleClick}>{buttonLabel}</Button>
</div>
<IconButton
onClick={onClose}
data-testid="local-demo-tips-close-button"
>
<CloseIcon />
</IconButton>
</div>
</div>
);
};
export default LocalDemoTips;

Some files were not shown because too many files have changed in this diff.