Compare commits

..

10 Commits

Author SHA1 Message Date
renovate[bot]
d7adbb99c9 chore: bump up nestjs 2026-03-21 14:53:50 +00:00
DarkSky
6a93566422 chore: bump deps (#14690)
#### PR Dependency Tree


* **PR #14690** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Chores**
  * Updated package manager and development tooling to latest compatible versions.
  * Updated backend framework and monitoring dependencies to latest minor/patch releases.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 05:23:03 +08:00
DarkSky
7ac8b14b65 feat(editor): migrate typst mermaid to native (#14499)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
  * Native/WASM Mermaid and Typst SVG preview rendering on desktop and mobile, plus cross-platform Preview plugin integrations.

* **Improvements**
  * Centralized, sanitized rendering bridge with automatic Typst font-directory handling and configurable native renderer selection.
  * More consistent and robust error serialization and worker-backed preview flows for improved stability and performance.

* **Tests**
  * Extensive unit and integration tests for preview rendering, font discovery, sanitization, and error serialization.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 04:04:40 +08:00
DarkSky
16a8f17717 feat(server): improve oidc compatibility (#14686)
fix #13938 
fix #14683 
fix #14532

#### PR Dependency Tree


* **PR #14686** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
  * Flexible OIDC claim mapping for email/name, automatic OIDC discovery retry with exponential backoff, and explicit OAuth flow modes (popup vs redirect) propagated through the auth flow.

* **Bug Fixes**
  * Stricter OIDC email validation, clearer error messages listing attempted claim candidates, and improved callback redirect handling for various flow scenarios.

* **Tests**
  * Added unit tests covering OIDC behaviors, backoff scheduler/promise utilities, and frontend OAuth flow parsing/redirect logic.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 04:02:37 +08:00
DarkSky
1ffb8c922c fix(native): cleanup deleted docs and blobs (#14689) 2026-03-20 04:00:25 +08:00
DarkSky
daf536f77a fix(native): misalignment between index clock and snapshot clock (#14688)
fix #14191

#### PR Dependency Tree


* **PR #14688** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
  * Improved indexer synchronization timing for clock persistence to prevent premature completion signals
  * Enhanced document-level indexing status tracking accuracy
  * Optimized refresh behavior for better state consistency

* **Chores**
  * Updated indexer versioning system

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 02:09:11 +08:00
congzhou09
0d2d4bb6a1 fix(editor): note-edgeless-block loses edit state during shift-click range selection (#14675)
### Problem
In edgeless mode, when using Shift + click to perform a range selection inside an editing `note-edgeless-block` (click at the starting point, then hold Shift and click at the end point), the block unexpectedly loses its editing and selection state. As a result, subsequent operations on the selection, such as deleting and moving, no longer work.

The following video demonstrates this issue:


https://github.com/user-attachments/assets/82c68683-e002-4a58-b011-fe59f7fc9f02

### Solution
The root cause is that this Shift + click gesture is handled by the default multi-selection logic, which toggles the selection mode and exits the editing state. I therefore added an `else-if` branch to match this case.

### After
The video below shows the behavior after this fix.


https://github.com/user-attachments/assets/18d61108-2089-4def-b2dc-ae13fc5ac333

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
  * Improved selection behavior during note editing in multi-select mode to provide more intuitive interaction when using range selection during active editing.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-19 22:22:22 +08:00
Mohad
cb9897d493 fix(i18n): support Arabic comma separator in date-picker weekDays and monthNames (#14663)
## Problem

The Arabic locale strings in `ar.json` use the Arabic comma `،` (U+060C)
as separator:

```json
"com.affine.calendar-date-picker.week-days": "أ،إث،ث،أر،خ،ج،س"
```

But `day-picker.tsx` splits on the ASCII comma only, so all weekday and month names render as a single unsplit string in the Arabic locale.

## Fix

Change `.split(',')` to `.split(/[,،]/)` at both call sites; the character class matches both the ASCII and the Arabic comma.

## Impact

One-line fix per call site. No other functionality affected. All
non-Arabic locales unchanged.
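The fix can be sketched as follows. This is an illustrative snippet, not the actual `day-picker.tsx` source; the sample string is the `week-days` value quoted above:

```typescript
// The Arabic locale string uses the Arabic comma "،" (U+060C) as separator.
const weekDaysString = 'أ،إث،ث،أر،خ،ج،س';

// Before: splitting on the ASCII comma only leaves the string unsplit.
const broken = weekDaysString.split(',');
// broken.length === 1

// After: the character class matches both the ASCII and the Arabic comma.
const fixed = weekDaysString.split(/[,،]/);
// fixed.length === 7
```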

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Bug Fixes**
  * Date picker rendering updated to correctly handle both ASCII and Arabic/Persian comma formats when determining month and weekday labels. This fixes inconsistent header and month-name displays in locales using different comma characters while preserving existing interactions and behavior.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-19 22:21:51 +08:00
Ishan Goswami
8ca8333cd6 chore(server): update exa search tool description (#14682)
Updated the Exa search tool description to better reflect what Exa does.

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Chores**
  * Clarified the web search tool description to state that it uses Exa, a web search API optimized for AI applications, improving labeling and user understanding.
  * No functional or behavioral changes to the tool; this update affects only the description shown to users.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: ishan <ishan@exa.ai>
2026-03-19 05:42:04 +08:00
George Kapetanakis
3bf2503f55 fix(tools): improve sed error handling in set-version script (#14684)
## Summary
Replace post-command status checks with inline failure handling around
`sed` calls.
In the stream update path, ensure the two `sed` operations are treated
as one success/failure unit.
Keep behavior and file outputs the same on success, while making failure
handling explicit.

## Why
When `set -e` is enabled (which the script itself enables), a failing command exits the script immediately, so error handling that checks `$?` on the next line never runs.
## Files affected
- `set-version.sh`
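The pattern can be sketched as follows; this is a minimal illustration of inline failure handling under `set -e`, not the actual contents of `set-version.sh`:

```shell
#!/usr/bin/env bash
set -e

# Sample input standing in for a real version file.
printf 'version=1.0.0\n' > input.txt

# Under `set -e`, a failing command aborts the script at once, so a
# post-command `$?` check on the next line never runs on failure.
# Inline handling is part of the same command list, so it still fires:
sed 's/1\.0\.0/1.1.0/' input.txt > output.txt || {
  echo "error: sed failed while updating input.txt" >&2
  exit 1
}

# Treating two sed operations as one success/failure unit:
{ sed 's/version=/version = /' output.txt > staged.txt &&
  sed 's/ = /=/' staged.txt > final.txt; } || {
  echo "error: stream update failed" >&2
  exit 1
}
```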

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Refactor**
  * Enhanced error handling in the version management script with improved failure reporting and context information.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-19 05:36:41 +08:00
116 changed files with 8364 additions and 2405 deletions

File diff suppressed because one or more lines are too long

940
.yarn/releases/yarn-4.13.0.cjs vendored Executable file

File diff suppressed because one or more lines are too long


@@ -12,4 +12,4 @@ npmPublishAccess: public
npmRegistryServer: "https://registry.npmjs.org"
yarnPath: .yarn/releases/yarn-4.12.0.cjs
yarnPath: .yarn/releases/yarn-4.13.0.cjs

2703
Cargo.lock generated

File diff suppressed because it is too large


@@ -36,7 +36,7 @@ resolver = "3"
criterion2 = { version = "3", default-features = false }
crossbeam-channel = "0.5"
dispatch2 = "0.3"
docx-parser = { git = "https://github.com/toeverything/docx-parser" }
docx-parser = { git = "https://github.com/toeverything/docx-parser", rev = "380beea" }
dotenvy = "0.15"
file-format = { version = "0.28", features = ["reader"] }
homedir = "0.3"
@@ -59,6 +59,7 @@ resolver = "3"
lru = "0.16"
matroska = "0.30"
memory-indexer = "0.3.0"
mermaid-rs-renderer = { git = "https://github.com/toeverything/mermaid-rs-renderer", rev = "fba9097", default-features = false }
mimalloc = "0.1"
mp4parse = "0.17"
nanoid = "0.4"
@@ -122,6 +123,14 @@ resolver = "3"
tree-sitter-rust = { version = "0.24" }
tree-sitter-scala = { version = "0.24" }
tree-sitter-typescript = { version = "0.23" }
typst = "0.14.2"
typst-as-lib = { version = "0.15.4", default-features = false, features = [
"packages",
"typst-kit-embed-fonts",
"typst-kit-fonts",
"ureq",
] }
typst-svg = "0.14.2"
uniffi = "0.29"
url = { version = "2.5" }
uuid = "1.8"


@@ -32,6 +32,7 @@
},
"devDependencies": {
"@vitest/browser-playwright": "^4.0.18",
"playwright": "=1.58.2",
"vitest": "^4.0.18"
},
"exports": {


@@ -516,6 +516,9 @@ export const EdgelessNoteInteraction =
}
})
.catch(console.error);
} else if (multiSelect && alreadySelected && editing) {
// range selection using Shift-click when editing
return;
} else {
context.default(context);
}


@@ -35,6 +35,7 @@
},
"devDependencies": {
"@vitest/browser-playwright": "^4.0.18",
"playwright": "=1.58.2",
"vitest": "^4.0.18"
},
"exports": {


@@ -34,6 +34,7 @@
},
"devDependencies": {
"@vitest/browser-playwright": "^4.0.18",
"playwright": "=1.58.2",
"vitest": "^4.0.18"
},
"exports": {


@@ -42,6 +42,7 @@
"devDependencies": {
"@vanilla-extract/vite-plugin": "^5.0.0",
"@vitest/browser-playwright": "^4.0.18",
"playwright": "=1.58.2",
"vite": "^7.2.7",
"vite-plugin-istanbul": "^7.2.1",
"vite-plugin-wasm": "^3.5.0",

48
deny.toml Normal file

@@ -0,0 +1,48 @@
[graph]
all-features = true
exclude-dev = true
targets = [
"x86_64-unknown-linux-gnu",
"aarch64-apple-darwin",
"x86_64-apple-darwin",
"x86_64-pc-windows-msvc",
"aarch64-linux-android",
"aarch64-apple-ios",
"aarch64-apple-ios-sim",
]
[licenses]
allow = [
"0BSD",
"Apache-2.0",
"Apache-2.0 WITH LLVM-exception",
"BSD-2-Clause",
"BSD-3-Clause",
"BSL-1.0",
"CC0-1.0",
"CDLA-Permissive-2.0",
"ISC",
"MIT",
"MPL-2.0",
"Unicode-3.0",
"Unlicense",
"Zlib",
]
confidence-threshold = 0.93
unused-allowed-license = "allow"
version = 2
[[licenses.exceptions]]
allow = ["AGPL-3.0-only"]
crate = "llm_adapter"
[[licenses.exceptions]]
allow = ["AGPL-3.0-or-later"]
crate = "memory-indexer"
[[licenses.exceptions]]
allow = ["AGPL-3.0-or-later"]
crate = "path-ext"
[licenses.private]
ignore = true


@@ -92,7 +92,7 @@
"vite": "^7.2.7",
"vitest": "^4.0.18"
},
"packageManager": "yarn@4.12.0",
"packageManager": "yarn@4.13.0",
"resolutions": {
"array-buffer-byte-length": "npm:@nolyfill/array-buffer-byte-length@^1",
"array-includes": "npm:@nolyfill/array-includes@^1",


@@ -2,6 +2,7 @@
edition = "2024"
license-file = "LICENSE"
name = "affine_server_native"
publish = false
version = "1.0.0"
[lib]


@@ -33,30 +33,30 @@
"@nestjs-cls/transactional-adapter-prisma": "^1.3.4",
"@nestjs/apollo": "^13.0.4",
"@nestjs/bullmq": "^11.0.4",
"@nestjs/common": "^11.0.21",
"@nestjs/core": "^11.1.14",
"@nestjs/common": "^11.1.17",
"@nestjs/core": "^11.1.17",
"@nestjs/graphql": "^13.0.4",
"@nestjs/platform-express": "^11.1.14",
"@nestjs/platform-socket.io": "^11.1.14",
"@nestjs/platform-express": "^11.1.17",
"@nestjs/platform-socket.io": "^11.1.17",
"@nestjs/schedule": "^6.1.1",
"@nestjs/throttler": "^6.5.0",
"@nestjs/websockets": "^11.1.14",
"@nestjs/websockets": "^11.1.17",
"@node-rs/argon2": "^2.0.2",
"@node-rs/crc32": "^1.10.6",
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/core": "^2.2.0",
"@opentelemetry/exporter-prometheus": "^0.212.0",
"@opentelemetry/exporter-zipkin": "^2.2.0",
"@opentelemetry/host-metrics": "^0.38.0",
"@opentelemetry/instrumentation": "^0.212.0",
"@opentelemetry/instrumentation-graphql": "^0.60.0",
"@opentelemetry/instrumentation-http": "^0.212.0",
"@opentelemetry/instrumentation-ioredis": "^0.60.0",
"@opentelemetry/instrumentation-nestjs-core": "^0.58.0",
"@opentelemetry/instrumentation-socket.io": "^0.59.0",
"@opentelemetry/exporter-prometheus": "^0.213.0",
"@opentelemetry/exporter-zipkin": "^2.6.0",
"@opentelemetry/host-metrics": "^0.38.3",
"@opentelemetry/instrumentation": "^0.213.0",
"@opentelemetry/instrumentation-graphql": "^0.61.0",
"@opentelemetry/instrumentation-http": "^0.213.0",
"@opentelemetry/instrumentation-ioredis": "^0.61.0",
"@opentelemetry/instrumentation-nestjs-core": "^0.59.0",
"@opentelemetry/instrumentation-socket.io": "^0.60.0",
"@opentelemetry/resources": "^2.2.0",
"@opentelemetry/sdk-metrics": "^2.2.0",
"@opentelemetry/sdk-node": "^0.212.0",
"@opentelemetry/sdk-node": "^0.213.0",
"@opentelemetry/sdk-trace-node": "^2.2.0",
"@opentelemetry/semantic-conventions": "^1.38.0",
"@prisma/client": "^6.6.0",
@@ -72,7 +72,7 @@
"eventemitter2": "^6.4.9",
"exa-js": "^2.4.0",
"express": "^5.0.1",
"fast-xml-parser": "^5.3.4",
"fast-xml-parser": "^5.5.7",
"get-stream": "^9.0.1",
"google-auth-library": "^10.2.0",
"graphql": "^16.9.0",


@@ -6,13 +6,16 @@ import ava, { TestFn } from 'ava';
import Sinon from 'sinon';
import { AppModule } from '../../app.module';
import { ConfigFactory, URLHelper } from '../../base';
import { ConfigFactory, InvalidOauthResponse, URLHelper } from '../../base';
import { ConfigModule } from '../../base/config';
import { CurrentUser } from '../../core/auth';
import { AuthService } from '../../core/auth/service';
import { ServerFeature } from '../../core/config/types';
import { Models } from '../../models';
import { OAuthProviderName } from '../../plugins/oauth/config';
import { OAuthProviderFactory } from '../../plugins/oauth/factory';
import { GoogleOAuthProvider } from '../../plugins/oauth/providers/google';
import { OIDCProvider } from '../../plugins/oauth/providers/oidc';
import { OAuthService } from '../../plugins/oauth/service';
import { createTestingApp, currentUser, TestingApp } from '../utils';
@@ -35,6 +38,12 @@ test.before(async t => {
clientId: 'google-client-id',
clientSecret: 'google-client-secret',
},
oidc: {
clientId: '',
clientSecret: '',
issuer: '',
args: {},
},
},
},
server: {
@@ -432,6 +441,87 @@ function mockOAuthProvider(
return clientNonce;
}
function mockOidcProvider(
provider: OIDCProvider,
{
args = {},
idTokenClaims,
userinfo,
}: {
args?: Record<string, string>;
idTokenClaims: Record<string, unknown>;
userinfo: Record<string, unknown>;
}
) {
Sinon.stub(provider, 'config').get(() => ({
clientId: '',
clientSecret: '',
issuer: '',
args,
}));
Sinon.stub(
provider as unknown as { endpoints: { userinfo_endpoint: string } },
'endpoints'
).get(() => ({
userinfo_endpoint: 'https://oidc.affine.dev/userinfo',
}));
Sinon.stub(
provider as unknown as { verifyIdToken: () => unknown },
'verifyIdToken'
).resolves(idTokenClaims);
Sinon.stub(
provider as unknown as { fetchJson: () => unknown },
'fetchJson'
).resolves(userinfo);
}
function createOidcRegistrationHarness(config?: {
clientId?: string;
clientSecret?: string;
issuer?: string;
}) {
const server = {
enableFeature: Sinon.spy(),
disableFeature: Sinon.spy(),
};
const factory = new OAuthProviderFactory(server as any);
const affineConfig = {
server: {
externalUrl: 'https://affine.example',
host: 'localhost',
path: '',
https: true,
hosts: [],
},
oauth: {
providers: {
oidc: {
clientId: config?.clientId ?? 'oidc-client-id',
clientSecret: config?.clientSecret ?? 'oidc-client-secret',
issuer: config?.issuer ?? 'https://issuer.affine.dev',
args: {},
},
},
},
};
const provider = new OIDCProvider(new URLHelper(affineConfig as any));
(provider as any).factory = factory;
(provider as any).AFFiNEConfig = affineConfig;
return {
provider,
factory,
server,
};
}
async function flushAsyncWork(iterations = 5) {
for (let i = 0; i < iterations; i++) {
await new Promise(resolve => setImmediate(resolve));
}
}
test('should be able to sign up with oauth', async t => {
const { app, db } = t.context;
@@ -554,3 +644,209 @@ test('should be able to fullfil user with oauth sign in', async t => {
t.truthy(account);
t.is(account!.user.id, u3.id);
});
test('oidc should accept email from id token when userinfo email is missing', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
idTokenClaims: {
sub: 'oidc-user',
email: 'oidc-id-token@affine.pro',
name: 'OIDC User',
},
userinfo: {
sub: 'oidc-user',
name: 'OIDC User',
},
});
const user = await provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
);
t.is(user.id, 'oidc-user');
t.is(user.email, 'oidc-id-token@affine.pro');
t.is(user.name, 'OIDC User');
});
test('oidc should resolve custom email claim from userinfo', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail', claim_name: 'display_name' },
idTokenClaims: {
sub: 'oidc-user',
},
userinfo: {
sub: 'oidc-user',
mail: 'oidc-userinfo@affine.pro',
display_name: 'OIDC Custom',
},
});
const user = await provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
);
t.is(user.id, 'oidc-user');
t.is(user.email, 'oidc-userinfo@affine.pro');
t.is(user.name, 'OIDC Custom');
});
test('oidc should resolve custom email claim from id token', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail', claim_email_verified: 'mail_verified' },
idTokenClaims: {
sub: 'oidc-user',
mail: 'oidc-custom-id-token@affine.pro',
mail_verified: 'true',
},
userinfo: {
sub: 'oidc-user',
},
});
const user = await provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
);
t.is(user.id, 'oidc-user');
t.is(user.email, 'oidc-custom-id-token@affine.pro');
});
test('oidc should reject responses without a usable email claim', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail' },
idTokenClaims: {
sub: 'oidc-user',
mail: 'not-an-email',
},
userinfo: {
sub: 'oidc-user',
mail: 'still-not-an-email',
},
});
const error = await t.throwsAsync(
provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
)
);
t.true(error instanceof InvalidOauthResponse);
t.true(
error.message.includes(
'Missing valid email claim in OIDC response. Tried userinfo and ID token claims: "mail"'
)
);
});
test('oidc should not fall back to default email claim when custom claim is configured', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail' },
idTokenClaims: {
sub: 'oidc-user',
email: 'fallback@affine.pro',
},
userinfo: {
sub: 'oidc-user',
email: 'userinfo-fallback@affine.pro',
},
});
const error = await t.throwsAsync(
provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
)
);
t.true(error instanceof InvalidOauthResponse);
t.true(
error.message.includes(
'Missing valid email claim in OIDC response. Tried userinfo and ID token claims: "mail"'
)
);
});
test('oidc discovery should remove oauth feature on failure and restore it after backoff retry succeeds', async t => {
const { provider, factory, server } = createOidcRegistrationHarness();
const fetchStub = Sinon.stub(globalThis, 'fetch');
const scheduledRetries: Array<() => void> = [];
const retryDelays: number[] = [];
const setTimeoutStub = Sinon.stub(globalThis, 'setTimeout').callsFake(((
callback: Parameters<typeof setTimeout>[0],
delay?: number
) => {
retryDelays.push(Number(delay));
scheduledRetries.push(callback as () => void);
return Symbol('timeout') as unknown as ReturnType<typeof setTimeout>;
}) as typeof setTimeout);
t.teardown(() => {
provider.onModuleDestroy();
fetchStub.restore();
setTimeoutStub.restore();
});
fetchStub
.onFirstCall()
.rejects(new Error('temporary discovery failure'))
.onSecondCall()
.rejects(new Error('temporary discovery failure'))
.onThirdCall()
.resolves(
new Response(
JSON.stringify({
authorization_endpoint: 'https://issuer.affine.dev/auth',
token_endpoint: 'https://issuer.affine.dev/token',
userinfo_endpoint: 'https://issuer.affine.dev/userinfo',
issuer: 'https://issuer.affine.dev',
jwks_uri: 'https://issuer.affine.dev/jwks',
}),
{
status: 200,
headers: { 'Content-Type': 'application/json' },
}
)
);
(provider as any).setup();
await flushAsyncWork();
t.deepEqual(factory.providers, []);
t.true(server.disableFeature.calledWith(ServerFeature.OAuth));
t.is(fetchStub.callCount, 1);
t.deepEqual(retryDelays, [1000]);
const firstRetry = scheduledRetries.shift();
t.truthy(firstRetry);
firstRetry!();
await flushAsyncWork();
t.is(fetchStub.callCount, 2);
t.deepEqual(factory.providers, []);
t.deepEqual(retryDelays, [1000, 2000]);
const secondRetry = scheduledRetries.shift();
t.truthy(secondRetry);
secondRetry!();
await flushAsyncWork();
t.is(fetchStub.callCount, 3);
t.deepEqual(factory.providers, [OAuthProviderName.OIDC]);
t.true(server.enableFeature.calledWith(ServerFeature.OAuth));
t.is(scheduledRetries.length, 0);
});


@@ -0,0 +1,75 @@
import test from 'ava';
import Sinon from 'sinon';
import {
exponentialBackoffDelay,
ExponentialBackoffScheduler,
} from '../promise';
test('exponentialBackoffDelay should cap exponential growth at maxDelayMs', t => {
t.is(exponentialBackoffDelay(0, { baseDelayMs: 100, maxDelayMs: 500 }), 100);
t.is(exponentialBackoffDelay(1, { baseDelayMs: 100, maxDelayMs: 500 }), 200);
t.is(exponentialBackoffDelay(3, { baseDelayMs: 100, maxDelayMs: 500 }), 500);
});
test('ExponentialBackoffScheduler should track pending callback and increase delay per attempt', async t => {
const clock = Sinon.useFakeTimers();
t.teardown(() => {
clock.restore();
});
const calls: number[] = [];
const scheduler = new ExponentialBackoffScheduler({
baseDelayMs: 100,
maxDelayMs: 500,
});
t.is(
scheduler.schedule(() => {
calls.push(1);
}),
100
);
t.true(scheduler.pending);
t.is(
scheduler.schedule(() => {
calls.push(2);
}),
null
);
await clock.tickAsync(100);
t.deepEqual(calls, [1]);
t.false(scheduler.pending);
t.is(
scheduler.schedule(() => {
calls.push(3);
}),
200
);
await clock.tickAsync(200);
t.deepEqual(calls, [1, 3]);
});
test('ExponentialBackoffScheduler reset should clear pending work and restart from the base delay', t => {
const scheduler = new ExponentialBackoffScheduler({
baseDelayMs: 100,
maxDelayMs: 500,
});
t.is(
scheduler.schedule(() => {}),
100
);
t.true(scheduler.pending);
scheduler.reset();
t.false(scheduler.pending);
t.is(
scheduler.schedule(() => {}),
100
);
scheduler.clear();
});


@@ -1,4 +1,4 @@
import { setTimeout } from 'node:timers/promises';
import { setTimeout as delay } from 'node:timers/promises';
import { defer as rxjsDefer, retry } from 'rxjs';
@@ -52,5 +52,61 @@ export function defer(dispose: () => Promise<void>) {
}
export function sleep(ms: number): Promise<void> {
return setTimeout(ms);
return delay(ms);
}
export function exponentialBackoffDelay(
attempt: number,
{
baseDelayMs,
maxDelayMs,
factor = 2,
}: { baseDelayMs: number; maxDelayMs: number; factor?: number }
): number {
return Math.min(
baseDelayMs * Math.pow(factor, Math.max(0, attempt)),
maxDelayMs
);
}
export class ExponentialBackoffScheduler {
#attempt = 0;
#timer: ReturnType<typeof globalThis.setTimeout> | null = null;
constructor(
private readonly options: {
baseDelayMs: number;
maxDelayMs: number;
factor?: number;
}
) {}
get pending() {
return this.#timer !== null;
}
clear() {
if (this.#timer) {
clearTimeout(this.#timer);
this.#timer = null;
}
}
reset() {
this.#attempt = 0;
this.clear();
}
schedule(callback: () => void) {
if (this.#timer) return null;
const timeout = exponentialBackoffDelay(this.#attempt, this.options);
this.#timer = globalThis.setTimeout(() => {
this.#timer = null;
callback();
}, timeout);
this.#attempt += 1;
return timeout;
}
}


@@ -7,7 +7,8 @@ import { defineTool } from './tool';
export const createExaSearchTool = (config: Config) => {
return defineTool({
description: 'Search the web for information',
description:
'Search the web using Exa, one of the best web search APIs for AI',
inputSchema: z.object({
query: z.string().describe('The query to search the web for.'),
mode: z


@@ -1,9 +1,10 @@
import { Injectable } from '@nestjs/common';
import { Injectable, OnModuleDestroy } from '@nestjs/common';
import { createRemoteJWKSet, type JWTPayload, jwtVerify } from 'jose';
import { omit } from 'lodash-es';
import { z } from 'zod';
import {
ExponentialBackoffScheduler,
InvalidAuthState,
InvalidOauthResponse,
URLHelper,
@@ -35,7 +36,7 @@ const OIDCUserInfoSchema = z
.object({
sub: z.string(),
preferred_username: z.string().optional(),
email: z.string().email(),
email: z.string().optional(),
name: z.string().optional(),
email_verified: z
.union([z.boolean(), z.enum(['true', 'false', '1', '0', 'yes', 'no'])])
@@ -44,6 +45,8 @@ const OIDCUserInfoSchema = z
})
.passthrough();
const OIDCEmailSchema = z.string().email();
const OIDCConfigurationSchema = z.object({
authorization_endpoint: z.string().url(),
token_endpoint: z.string().url(),
@@ -54,16 +57,28 @@ const OIDCConfigurationSchema = z.object({
type OIDCConfiguration = z.infer<typeof OIDCConfigurationSchema>;
const OIDC_DISCOVERY_INITIAL_RETRY_DELAY = 1000;
const OIDC_DISCOVERY_MAX_RETRY_DELAY = 60_000;
@Injectable()
export class OIDCProvider extends OAuthProvider {
export class OIDCProvider extends OAuthProvider implements OnModuleDestroy {
override provider = OAuthProviderName.OIDC;
#endpoints: OIDCConfiguration | null = null;
#jwks: ReturnType<typeof createRemoteJWKSet> | null = null;
readonly #retryScheduler = new ExponentialBackoffScheduler({
baseDelayMs: OIDC_DISCOVERY_INITIAL_RETRY_DELAY,
maxDelayMs: OIDC_DISCOVERY_MAX_RETRY_DELAY,
});
#validationGeneration = 0;
constructor(private readonly url: URLHelper) {
super();
}
onModuleDestroy() {
this.#retryScheduler.clear();
}
override get requiresPkce() {
return true;
}
@@ -87,58 +102,109 @@ export class OIDCProvider extends OAuthProvider {
}
protected override setup() {
const validate = async () => {
this.#endpoints = null;
this.#jwks = null;
const generation = ++this.#validationGeneration;
this.#retryScheduler.clear();
if (super.configured) {
const config = this.config as OAuthOIDCProviderConfig;
if (!config.issuer) {
this.logger.error('Missing OIDC issuer configuration');
super.setup();
return;
}
try {
const res = await fetch(
`${config.issuer}/.well-known/openid-configuration`,
{
method: 'GET',
headers: { Accept: 'application/json' },
}
);
if (res.ok) {
const configuration = OIDCConfigurationSchema.parse(
await res.json()
);
if (
this.normalizeIssuer(config.issuer) !==
this.normalizeIssuer(configuration.issuer)
) {
this.logger.error(
`OIDC issuer mismatch, expected ${config.issuer}, got ${configuration.issuer}`
);
} else {
this.#endpoints = configuration;
this.#jwks = createRemoteJWKSet(new URL(configuration.jwks_uri));
}
} else {
this.logger.error(`Invalid OIDC issuer ${config.issuer}`);
}
} catch (e) {
this.logger.error('Failed to validate OIDC configuration', e);
}
}
super.setup();
};
validate().catch(() => {
this.validateAndSync(generation).catch(() => {
/* noop */
});
}
private async validateAndSync(generation: number) {
if (generation !== this.#validationGeneration) {
return;
}
if (!super.configured) {
this.resetState();
this.#retryScheduler.reset();
super.setup();
return;
}
const config = this.config as OAuthOIDCProviderConfig;
if (!config.issuer) {
this.logger.error('Missing OIDC issuer configuration');
this.resetState();
this.#retryScheduler.reset();
super.setup();
return;
}
try {
const res = await fetch(
`${config.issuer}/.well-known/openid-configuration`,
{
method: 'GET',
headers: { Accept: 'application/json' },
}
);
if (generation !== this.#validationGeneration) {
return;
}
if (!res.ok) {
this.logger.error(`Invalid OIDC issuer ${config.issuer}`);
this.onValidationFailure(generation);
return;
}
const configuration = OIDCConfigurationSchema.parse(await res.json());
if (
this.normalizeIssuer(config.issuer) !==
this.normalizeIssuer(configuration.issuer)
) {
this.logger.error(
`OIDC issuer mismatch, expected ${config.issuer}, got ${configuration.issuer}`
);
this.onValidationFailure(generation);
return;
}
this.#endpoints = configuration;
this.#jwks = createRemoteJWKSet(new URL(configuration.jwks_uri));
this.#retryScheduler.reset();
super.setup();
} catch (e) {
if (generation !== this.#validationGeneration) {
return;
}
this.logger.error('Failed to validate OIDC configuration', e);
this.onValidationFailure(generation);
}
}
private onValidationFailure(generation: number) {
this.resetState();
super.setup();
this.scheduleRetry(generation);
}
private scheduleRetry(generation: number) {
if (generation !== this.#validationGeneration) {
return;
}
const delay = this.#retryScheduler.schedule(() => {
this.validateAndSync(generation).catch(() => {
/* noop */
});
});
if (delay === null) {
return;
}
this.logger.warn(
`OIDC discovery validation failed, retrying in ${delay}ms`
);
}
private resetState() {
this.#endpoints = null;
this.#jwks = null;
}
getAuthUrl(state: string): string {
const parsedState = this.parseStatePayload(state);
const nonce = parsedState?.state ?? state;
@@ -291,6 +357,68 @@ export class OIDCProvider extends OAuthProvider {
return undefined;
}
private claimCandidates(
configuredClaim: string | undefined,
defaultClaim: string
) {
if (typeof configuredClaim === 'string' && configuredClaim.length > 0) {
return [configuredClaim];
}
return [defaultClaim];
}
private formatClaimCandidates(claims: string[]) {
return claims.map(claim => `"${claim}"`).join(', ');
}
private resolveStringClaim(
claims: string[],
...sources: Array<Record<string, unknown>>
) {
for (const claim of claims) {
for (const source of sources) {
const value = this.extractString(source[claim]);
if (value) {
return value;
}
}
}
return undefined;
}
private resolveBooleanClaim(
claims: string[],
...sources: Array<Record<string, unknown>>
) {
for (const claim of claims) {
for (const source of sources) {
const value = this.extractBoolean(source[claim]);
if (value !== undefined) {
return value;
}
}
}
return undefined;
}
private resolveEmailClaim(
claims: string[],
...sources: Array<Record<string, unknown>>
) {
for (const claim of claims) {
for (const source of sources) {
const value = this.extractString(source[claim]);
if (value && OIDCEmailSchema.safeParse(value).success) {
return value;
}
}
}
return undefined;
}
async getUser(tokens: Tokens, state: OAuthState): Promise<OAuthAccount> {
if (!tokens.idToken) {
throw new InvalidOauthResponse({
@@ -315,6 +443,8 @@ export class OIDCProvider extends OAuthProvider {
{ treatServerErrorAsInvalid: true }
);
const user = OIDCUserInfoSchema.parse(rawUser);
const userClaims = user as Record<string, unknown>;
const idTokenClaimsRecord = idTokenClaims as Record<string, unknown>;
if (!user.sub || !idTokenClaims.sub) {
throw new InvalidOauthResponse({
@@ -327,22 +457,29 @@ export class OIDCProvider extends OAuthProvider {
}
const args = this.config.args ?? {};
const idClaims = this.claimCandidates(args.claim_id, 'sub');
const emailClaims = this.claimCandidates(args.claim_email, 'email');
const nameClaims = this.claimCandidates(args.claim_name, 'name');
const emailVerifiedClaims = this.claimCandidates(
args.claim_email_verified,
'email_verified'
);
const claimsMap = {
id: args.claim_id || 'sub',
email: args.claim_email || 'email',
name: args.claim_name || 'name',
emailVerified: args.claim_email_verified || 'email_verified',
};
const accountId =
this.extractString(user[claimsMap.id]) ?? idTokenClaims.sub;
const email =
this.extractString(user[claimsMap.email]) ||
this.extractString(idTokenClaims.email);
const emailVerified =
this.extractBoolean(user[claimsMap.emailVerified]) ??
this.extractBoolean(idTokenClaims.email_verified);
const accountId = this.resolveStringClaim(
idClaims,
userClaims,
idTokenClaimsRecord
);
const email = this.resolveEmailClaim(
emailClaims,
userClaims,
idTokenClaimsRecord
);
const emailVerified = this.resolveBooleanClaim(
emailVerifiedClaims,
userClaims,
idTokenClaimsRecord
);
if (!accountId) {
throw new InvalidOauthResponse({
@@ -352,7 +489,7 @@ export class OIDCProvider extends OAuthProvider {
if (!email) {
throw new InvalidOauthResponse({
reason: 'Missing required claim for email',
reason: `Missing valid email claim in OIDC response. Tried userinfo and ID token claims: ${this.formatClaimCandidates(emailClaims)}`,
});
}
@@ -367,9 +504,11 @@ export class OIDCProvider extends OAuthProvider {
email,
};
const name =
this.extractString(user[claimsMap.name]) ||
this.extractString(idTokenClaims.name);
const name = this.resolveStringClaim(
nameClaims,
userClaims,
idTokenClaimsRecord
);
if (name) {
account.name = name;
}
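The OIDC change above replaces fixed claim names with an ordered candidate list tried across both the userinfo response and the ID-token claims. A minimal sketch of that fallback lookup (claim names and sources here are illustrative; the real provider also validates emails and booleans with dedicated extractors):

```typescript
// Ordered claim-fallback lookup: try each candidate claim against each
// source in order, returning the first non-empty string found.
type Claims = Record<string, unknown>;

function claimCandidates(
  configured: string | undefined,
  fallback: string
): string[] {
  // A configured claim is tried first, then the OIDC-standard fallback.
  return configured ? [configured, fallback] : [fallback];
}

function resolveStringClaim(
  claims: string[],
  ...sources: Claims[]
): string | undefined {
  for (const claim of claims) {
    for (const source of sources) {
      const value = source[claim];
      if (typeof value === 'string' && value.length > 0) {
        return value;
      }
    }
  }
  return undefined;
}

const userinfo = { preferred_username: 'jane' };
const idToken = { name: 'Jane Doe' };
const resolvedName = resolveStringClaim(
  claimCandidates('preferred_username', 'name'),
  userinfo,
  idToken
);
// The configured claim wins over the fallback, and userinfo is
// consulted before the ID token.
```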


@@ -10,6 +10,7 @@ interface TestOps extends OpSchema {
add: [{ a: number; b: number }, number];
bin: [Uint8Array, Uint8Array];
sub: [Uint8Array, number];
init: [{ fastText?: boolean } | undefined, { ok: true }];
}
declare module 'vitest' {
@@ -84,6 +85,55 @@ describe('op client', () => {
expect(data.byteLength).toBe(0);
});
it('should send optional payload call with abort signal', async ctx => {
const abortController = new AbortController();
const result = ctx.producer.call(
'init',
{ fastText: true },
abortController.signal
);
expect(ctx.postMessage.mock.calls[0][0]).toMatchInlineSnapshot(`
{
"id": "init:1",
"name": "init",
"payload": {
"fastText": true,
},
"type": "call",
}
`);
ctx.handlers.return({
type: 'return',
id: 'init:1',
data: { ok: true },
});
await expect(result).resolves.toEqual({ ok: true });
});
it('should send undefined payload for optional input call', async ctx => {
const result = ctx.producer.call('init', undefined);
expect(ctx.postMessage.mock.calls[0][0]).toMatchInlineSnapshot(`
{
"id": "init:1",
"name": "init",
"payload": undefined,
"type": "call",
}
`);
ctx.handlers.return({
type: 'return',
id: 'init:1',
data: { ok: true },
});
await expect(result).resolves.toEqual({ ok: true });
});
it('should cancel call', async ctx => {
const promise = ctx.producer.call('add', { a: 1, b: 2 });


@@ -40,18 +40,14 @@ describe('op consumer', () => {
it('should throw if no handler registered', async ctx => {
ctx.handlers.call({ type: 'call', id: 'add:1', name: 'add', payload: {} });
await vi.advanceTimersToNextTimerAsync();
expect(ctx.postMessage.mock.lastCall).toMatchInlineSnapshot(`
[
{
"error": {
"message": "Handler for operation [add] is not registered.",
"name": "Error",
},
"id": "add:1",
"type": "return",
},
]
`);
expect(ctx.postMessage.mock.lastCall?.[0]).toMatchObject({
type: 'return',
id: 'add:1',
error: {
message: 'Handler for operation [add] is not registered.',
name: 'Error',
},
});
});
it('should handle call message', async ctx => {
@@ -73,6 +69,38 @@ describe('op consumer', () => {
`);
});
it('should serialize string errors with message', async ctx => {
ctx.consumer.register('any', () => {
throw 'worker panic';
});
ctx.handlers.call({ type: 'call', id: 'any:1', name: 'any', payload: {} });
await vi.advanceTimersToNextTimerAsync();
expect(ctx.postMessage.mock.calls[0][0]).toMatchObject({
type: 'return',
id: 'any:1',
error: {
name: 'Error',
message: 'worker panic',
},
});
});
it('should serialize plain object errors with fallback message', async ctx => {
ctx.consumer.register('any', () => {
throw { reason: 'panic', code: 'E_PANIC' };
});
ctx.handlers.call({ type: 'call', id: 'any:1', name: 'any', payload: {} });
await vi.advanceTimersToNextTimerAsync();
const message = ctx.postMessage.mock.calls[0][0]?.error?.message;
expect(typeof message).toBe('string');
expect(message).toContain('"reason":"panic"');
expect(message).toContain('"code":"E_PANIC"');
});
it('should handle cancel message', async ctx => {
ctx.consumer.register('add', ({ a, b }, { signal }) => {
const { reject, resolve, promise } = Promise.withResolvers<number>();


@@ -16,6 +16,96 @@ import {
} from './message';
import type { OpInput, OpNames, OpOutput, OpSchema } from './types';
const SERIALIZABLE_ERROR_FIELDS = [
'name',
'message',
'code',
'type',
'status',
'data',
'stacktrace',
] as const;
type SerializableErrorShape = Partial<
Record<(typeof SERIALIZABLE_ERROR_FIELDS)[number], unknown>
> & {
name?: string;
message?: string;
};
function getFallbackErrorMessage(error: unknown): string {
if (typeof error === 'string') {
return error;
}
if (error instanceof Error && error.message) {
return error.message;
}
if (
typeof error === 'number' ||
typeof error === 'boolean' ||
typeof error === 'bigint' ||
typeof error === 'symbol'
) {
return String(error);
}
if (error === null || error === undefined) {
return 'Unknown error';
}
try {
const jsonMessage = JSON.stringify(error);
if (jsonMessage && jsonMessage !== '{}') {
return jsonMessage;
}
} catch {
return 'Unknown error';
}
return 'Unknown error';
}
function serializeError(error: unknown): Error {
const valueToPick =
error && typeof error === 'object'
? error
: ({} as Record<string, unknown>);
const serialized = pick(
valueToPick,
SERIALIZABLE_ERROR_FIELDS
) as SerializableErrorShape;
if (!serialized.message || typeof serialized.message !== 'string') {
serialized.message = getFallbackErrorMessage(error);
}
if (!serialized.name || typeof serialized.name !== 'string') {
if (error instanceof Error && error.name) {
serialized.name = error.name;
} else if (error && typeof error === 'object') {
const constructorName = error.constructor?.name;
serialized.name =
typeof constructorName === 'string' && constructorName.length > 0
? constructorName
: 'Error';
} else {
serialized.name = 'Error';
}
}
if (
!serialized.stacktrace &&
error instanceof Error &&
typeof error.stack === 'string'
) {
serialized.stacktrace = error.stack;
}
return serialized as Error;
}
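The key behavior `serializeError` adds is a usable message for non-`Error` throwables (strings, plain objects) crossing the worker `postMessage` boundary. A simplified sketch of just that fallback path (the real implementation also copies `code`/`status`/`data` and the stack trace):

```typescript
// Fallback message derivation for arbitrary thrown values, mirroring
// getFallbackErrorMessage in the diff above (simplified).
function fallbackMessage(error: unknown): string {
  if (typeof error === 'string') return error;
  if (error instanceof Error && error.message) return error.message;
  if (error === null || error === undefined) return 'Unknown error';
  try {
    const json = JSON.stringify(error);
    // An empty object serializes to '{}', which carries no information.
    if (json && json !== '{}') return json;
  } catch {
    // Circular structures fall through to the generic message.
  }
  return 'Unknown error';
}
```

This is why the new consumer tests can assert that `throw 'worker panic'` surfaces as `message: 'worker panic'` and a thrown plain object surfaces as its JSON form.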
interface OpCallContext {
signal: AbortSignal;
}
@@ -71,15 +161,7 @@ export class OpConsumer<Ops extends OpSchema> extends AutoMessageHandler {
this.port.postMessage({
type: 'return',
id: msg.id,
error: pick(error, [
'name',
'message',
'code',
'type',
'status',
'data',
'stacktrace',
]),
error: serializeError(error),
} satisfies ReturnMessage);
},
complete: () => {
@@ -109,15 +191,7 @@ export class OpConsumer<Ops extends OpSchema> extends AutoMessageHandler {
this.port.postMessage({
type: 'error',
id: msg.id,
error: pick(error, [
'name',
'message',
'code',
'type',
'status',
'data',
'stacktrace',
]),
error: serializeError(error),
} satisfies SubscriptionErrorMessage);
},
complete: () => {


@@ -12,7 +12,16 @@ export interface OpSchema {
[key: string]: [any, any?];
}
type RequiredInput<In> = In extends void ? [] : In extends never ? [] : [In];
type IsAny<T> = 0 extends 1 & T ? true : false;
type RequiredInput<In> =
IsAny<In> extends true
? [In]
: [In] extends [never]
? []
: [In] extends [void]
? []
: [In];
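Two type-level tricks carry this fix: `0 extends 1 & T` is the standard detector for `any` (only `1 & any` absorbs `0`), and wrapping both sides in one-element tuples (`[In] extends [never]`) prevents the conditional from distributing over unions and from vacuously matching `never`. A sketch of how the tuple feeds a variadic call signature (the `call` helper here is hypothetical, not the real producer API):

```typescript
type IsAny<T> = 0 extends 1 & T ? true : false;
type RequiredInput<In> =
  IsAny<In> extends true
    ? [In]
    : [In] extends [never]
      ? []
      : [In] extends [void]
        ? []
        : [In];

// Hypothetical call signature: void inputs need no payload argument,
// everything else requires exactly one.
function call<In>(name: string, ...payload: RequiredInput<In>): unknown {
  return (payload as unknown[])[0];
}

const withPayload = call<{ n: number }>('add', { n: 1 }); // payload required
const withoutPayload = call<void>('ping'); // no payload argument allowed
```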
export type OpNames<T extends OpSchema> = ValuesOf<KeyToKey<T>>;
export type OpInput<


@@ -2,6 +2,7 @@
edition = "2024"
license-file = "LICENSE"
name = "affine_common"
publish = false
version = "0.1.0"
[features]


@@ -1,18 +1,235 @@
import 'fake-indexeddb/auto';
import { expect, test } from 'vitest';
import * as reader from '@affine/reader';
import { NEVER } from 'rxjs';
import { afterEach, expect, test, vi } from 'vitest';
import { Doc as YDoc, encodeStateAsUpdate } from 'yjs';
import { DummyConnection } from '../connection';
import {
IndexedDBBlobStorage,
IndexedDBBlobSyncStorage,
IndexedDBDocStorage,
IndexedDBDocSyncStorage,
} from '../impls/idb';
import { SpaceStorage } from '../storage';
import {
type AggregateOptions,
type AggregateResult,
type CrawlResult,
type DocClock,
type DocClocks,
type DocDiff,
type DocIndexedClock,
type DocRecord,
type DocStorage,
type DocUpdate,
type IndexerDocument,
type IndexerSchema,
IndexerStorageBase,
IndexerSyncStorageBase,
type Query,
type SearchOptions,
type SearchResult,
SpaceStorage,
} from '../storage';
import { Sync } from '../sync';
import { IndexerSyncImpl } from '../sync/indexer';
import { expectYjsEqual } from './utils';
afterEach(() => {
vi.restoreAllMocks();
});
function deferred<T = void>() {
let resolve!: (value: T | PromiseLike<T>) => void;
let reject!: (reason?: unknown) => void;
const promise = new Promise<T>((res, rej) => {
resolve = res;
reject = rej;
});
return { promise, resolve, reject };
}
class TestDocStorage implements DocStorage {
readonly storageType = 'doc' as const;
readonly connection = new DummyConnection();
readonly isReadonly = false;
private readonly subscribers = new Set<
(update: DocRecord, origin?: string) => void
>();
constructor(
readonly spaceId: string,
private readonly timestamps: Map<string, Date>,
private readonly crawlDocDataImpl: (
docId: string
) => Promise<CrawlResult | null>
) {}
async getDoc(_docId: string): Promise<DocRecord | null> {
return null;
}
async getDocDiff(
_docId: string,
_state?: Uint8Array
): Promise<DocDiff | null> {
return null;
}
async pushDocUpdate(update: DocUpdate, origin?: string): Promise<DocClock> {
const timestamp = this.timestamps.get(update.docId) ?? new Date();
const record = { ...update, timestamp };
this.timestamps.set(update.docId, timestamp);
for (const subscriber of this.subscribers) {
subscriber(record, origin);
}
return { docId: update.docId, timestamp };
}
async getDocTimestamp(docId: string): Promise<DocClock | null> {
const timestamp = this.timestamps.get(docId);
return timestamp ? { docId, timestamp } : null;
}
async getDocTimestamps(): Promise<DocClocks> {
return Object.fromEntries(this.timestamps);
}
async deleteDoc(docId: string): Promise<void> {
this.timestamps.delete(docId);
}
subscribeDocUpdate(callback: (update: DocRecord, origin?: string) => void) {
this.subscribers.add(callback);
return () => {
this.subscribers.delete(callback);
};
}
async crawlDocData(docId: string): Promise<CrawlResult | null> {
return this.crawlDocDataImpl(docId);
}
}
class TrackingIndexerStorage extends IndexerStorageBase {
override readonly connection = new DummyConnection();
override readonly isReadonly = false;
constructor(
private readonly calls: string[],
override readonly recommendRefreshInterval: number
) {
super();
}
override async search<
T extends keyof IndexerSchema,
const O extends SearchOptions<T>,
>(_table: T, _query: Query<T>, _options?: O): Promise<SearchResult<T, O>> {
return {
pagination: { count: 0, limit: 0, skip: 0, hasMore: false },
nodes: [],
} as SearchResult<T, O>;
}
override async aggregate<
T extends keyof IndexerSchema,
const O extends AggregateOptions<T>,
>(
_table: T,
_query: Query<T>,
_field: keyof IndexerSchema[T],
_options?: O
): Promise<AggregateResult<T, O>> {
return {
pagination: { count: 0, limit: 0, skip: 0, hasMore: false },
buckets: [],
} as AggregateResult<T, O>;
}
override search$<
T extends keyof IndexerSchema,
const O extends SearchOptions<T>,
>(_table: T, _query: Query<T>, _options?: O) {
return NEVER;
}
override aggregate$<
T extends keyof IndexerSchema,
const O extends AggregateOptions<T>,
>(_table: T, _query: Query<T>, _field: keyof IndexerSchema[T], _options?: O) {
return NEVER;
}
override async deleteByQuery<T extends keyof IndexerSchema>(
table: T,
_query: Query<T>
): Promise<void> {
this.calls.push(`deleteByQuery:${String(table)}`);
}
override async insert<T extends keyof IndexerSchema>(
table: T,
document: IndexerDocument<T>
): Promise<void> {
this.calls.push(`insert:${String(table)}:${document.id}`);
}
override async delete<T extends keyof IndexerSchema>(
table: T,
id: string
): Promise<void> {
this.calls.push(`delete:${String(table)}:${id}`);
}
override async update<T extends keyof IndexerSchema>(
table: T,
document: IndexerDocument<T>
): Promise<void> {
this.calls.push(`update:${String(table)}:${document.id}`);
}
override async refresh<T extends keyof IndexerSchema>(
_table: T
): Promise<void> {
return;
}
override async refreshIfNeed(): Promise<void> {
this.calls.push('refresh');
}
override async indexVersion(): Promise<number> {
return 1;
}
}
class TrackingIndexerSyncStorage extends IndexerSyncStorageBase {
override readonly connection = new DummyConnection();
private readonly clocks = new Map<string, DocIndexedClock>();
constructor(private readonly calls: string[]) {
super();
}
override async getDocIndexedClock(
docId: string
): Promise<DocIndexedClock | null> {
return this.clocks.get(docId) ?? null;
}
override async setDocIndexedClock(clock: DocIndexedClock): Promise<void> {
this.calls.push(`setClock:${clock.docId}`);
this.clocks.set(clock.docId, clock);
}
override async clearDocIndexedClock(docId: string): Promise<void> {
this.calls.push(`clearClock:${docId}`);
this.clocks.delete(docId);
}
}
test('doc', async () => {
const doc = new YDoc();
doc.getMap('test').set('hello', 'world');
@@ -207,3 +424,114 @@ test('blob', async () => {
expect(c?.data).toEqual(new Uint8Array([4, 3, 2, 1]));
}
});
test('indexer defers indexed clock persistence until a refresh happens on delayed refresh storages', async () => {
const calls: string[] = [];
const docsInRootDoc = new Map([['doc1', { title: 'Doc 1' }]]);
const docStorage = new TestDocStorage(
'workspace-id',
new Map([['doc1', new Date('2026-01-01T00:00:00.000Z')]]),
async () => ({
title: 'Doc 1',
summary: 'summary',
blocks: [
{ blockId: 'block-1', flavour: 'affine:image', blob: ['blob-1'] },
],
})
);
const indexer = new TrackingIndexerStorage(calls, 30_000);
const indexerSyncStorage = new TrackingIndexerSyncStorage(calls);
const sync = new IndexerSyncImpl(
docStorage,
{
local: indexer,
remotes: {},
},
indexerSyncStorage
);
vi.spyOn(reader, 'readAllDocsFromRootDoc').mockImplementation(
() => new Map(docsInRootDoc)
);
try {
sync.start();
await sync.waitForCompleted();
expect(calls).not.toContain('setClock:doc1');
sync.stop();
await vi.waitFor(() => {
expect(calls).toContain('setClock:doc1');
});
const lastRefreshIndex = calls.lastIndexOf('refresh');
const setClockIndex = calls.indexOf('setClock:doc1');
expect(lastRefreshIndex).toBeGreaterThanOrEqual(0);
expect(setClockIndex).toBeGreaterThan(lastRefreshIndex);
} finally {
sync.stop();
}
});
test('indexer completion waits for the current job to finish', async () => {
const docsInRootDoc = new Map([['doc1', { title: 'Doc 1' }]]);
const crawlStarted = deferred<void>();
const releaseCrawl = deferred<void>();
const docStorage = new TestDocStorage(
'workspace-id',
new Map([['doc1', new Date('2026-01-01T00:00:00.000Z')]]),
async () => {
crawlStarted.resolve();
await releaseCrawl.promise;
return {
title: 'Doc 1',
summary: 'summary',
blocks: [
{ blockId: 'block-1', flavour: 'affine:image', blob: ['blob-1'] },
],
};
}
);
const sync = new IndexerSyncImpl(
docStorage,
{
local: new TrackingIndexerStorage([], 30_000),
remotes: {},
},
new TrackingIndexerSyncStorage([])
);
vi.spyOn(reader, 'readAllDocsFromRootDoc').mockImplementation(
() => new Map(docsInRootDoc)
);
try {
sync.start();
await crawlStarted.promise;
let completed = false;
let docCompleted = false;
const waitForCompleted = sync.waitForCompleted().then(() => {
completed = true;
});
const waitForDocCompleted = sync.waitForDocCompleted('doc1').then(() => {
docCompleted = true;
});
await new Promise(resolve => setTimeout(resolve, 20));
expect(completed).toBe(false);
expect(docCompleted).toBe(false);
releaseCrawl.resolve();
await waitForCompleted;
await waitForDocCompleted;
} finally {
sync.stop();
}
});


@@ -112,6 +112,10 @@ export class IndexerSyncImpl implements IndexerSync {
private readonly indexer: IndexerStorage;
private readonly remote?: IndexerStorage;
private readonly pendingIndexedClocks = new Map<
string,
{ docId: string; timestamp: Date; indexerVersion: number }
>();
private lastRefreshed = Date.now();
@@ -372,12 +376,13 @@ export class IndexerSyncImpl implements IndexerSync {
field: 'docId',
match: docId,
});
this.pendingIndexedClocks.delete(docId);
await this.indexerSync.clearDocIndexedClock(docId);
this.status.docsInIndexer.delete(docId);
this.status.statusUpdatedSubject$.next(docId);
}
}
await this.refreshIfNeed();
await this.refreshIfNeed(true);
// #endregion
} else {
// #region crawl doc
@@ -394,7 +399,8 @@ export class IndexerSyncImpl implements IndexerSync {
}
const docIndexedClock =
await this.indexerSync.getDocIndexedClock(docId);
this.pendingIndexedClocks.get(docId) ??
(await this.indexerSync.getDocIndexedClock(docId));
if (
docIndexedClock &&
docIndexedClock.timestamp.getTime() ===
@@ -460,13 +466,12 @@ export class IndexerSyncImpl implements IndexerSync {
);
}
await this.refreshIfNeed();
await this.indexerSync.setDocIndexedClock({
this.pendingIndexedClocks.set(docId, {
docId,
timestamp: docClock.timestamp,
indexerVersion: indexVersion,
});
await this.refreshIfNeed();
// #endregion
}
@@ -476,7 +481,7 @@ export class IndexerSyncImpl implements IndexerSync {
this.status.completeJob();
}
} finally {
await this.refreshIfNeed();
await this.refreshIfNeed(true);
unsubscribe();
}
}
@@ -484,18 +489,27 @@ export class IndexerSyncImpl implements IndexerSync {
// ensure the indexer is refreshed according to recommendRefreshInterval
// recommendRefreshInterval <= 0 means force refresh on each operation
// recommendRefreshInterval > 0 means refresh if the last refresh is older than recommendRefreshInterval
private async refreshIfNeed(): Promise<void> {
private async refreshIfNeed(force = false): Promise<void> {
const recommendRefreshInterval = this.indexer.recommendRefreshInterval ?? 0;
const needRefresh =
recommendRefreshInterval > 0 &&
this.lastRefreshed + recommendRefreshInterval < Date.now();
const forceRefresh = recommendRefreshInterval <= 0;
if (needRefresh || forceRefresh) {
if (force || needRefresh || forceRefresh) {
await this.indexer.refreshIfNeed();
await this.flushPendingIndexedClocks();
this.lastRefreshed = Date.now();
}
}
private async flushPendingIndexedClocks() {
if (this.pendingIndexedClocks.size === 0) return;
for (const [docId, clock] of this.pendingIndexedClocks) {
await this.indexerSync.setDocIndexedClock(clock);
this.pendingIndexedClocks.delete(docId);
}
}
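The net effect of `pendingIndexedClocks` is that a doc's indexed clock is only persisted after the index data it describes has actually been refreshed; a crash before the refresh re-crawls the doc instead of silently skipping it. A compressed sketch of the throttle-plus-deferred-flush interaction (`now` is injected here for testability; the real class uses `Date.now()` and writes clocks to storage instead of clearing a map):

```typescript
class RefreshThrottle {
  private lastRefreshed: number;
  readonly pending = new Map<string, number>(); // staged indexed clocks
  readonly refreshes: number[] = []; // timestamps of completed refreshes

  constructor(
    private readonly intervalMs: number,
    private readonly now: () => number
  ) {
    this.lastRefreshed = now();
  }

  markIndexed(docId: string, timestamp: number) {
    // Clocks are staged, not persisted, until a refresh lands.
    this.pending.set(docId, timestamp);
  }

  refreshIfNeed(force = false) {
    const due =
      this.intervalMs <= 0 ||
      this.lastRefreshed + this.intervalMs < this.now();
    if (force || due) {
      this.refreshes.push(this.now());
      this.pending.clear(); // stand-in for flushPendingIndexedClocks()
      this.lastRefreshed = this.now();
    }
  }
}
```

This also explains the `refreshIfNeed(true)` calls added after deletions and in the `finally` block: job completion forces a flush even when the interval has not elapsed.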
/**
* Get all docs from the root doc, without deleted docs
*/
@@ -706,7 +720,10 @@ class IndexerSyncStatus {
indexing: this.jobs.length() + (this.currentJob ? 1 : 0),
total: this.docsInRootDoc.size + 1,
errorMessage: this.errorMessage,
completed: this.rootDocReady && this.jobs.length() === 0,
completed:
this.rootDocReady &&
this.jobs.length() === 0 &&
this.currentJob === null,
batterySaveMode: this.batterySaveMode,
paused: this.paused !== null,
});
@@ -734,9 +751,10 @@ class IndexerSyncStatus {
completed: true,
});
} else {
const indexing = this.jobs.has(docId) || this.currentJob === docId;
subscribe.next({
indexing: this.jobs.has(docId),
completed: this.docsInIndexer.has(docId) && !this.jobs.has(docId),
indexing,
completed: this.docsInIndexer.has(docId) && !indexing,
});
}
};


@@ -7,7 +7,7 @@
},
"dependencies": {
"aws4": "^1.13.2",
"fast-xml-parser": "^5.3.4",
"fast-xml-parser": "^5.5.7",
"s3mini": "^0.9.1"
},
"devDependencies": {


@@ -19,6 +19,7 @@ import app.affine.pro.plugin.AFFiNEThemePlugin
import app.affine.pro.plugin.AuthPlugin
import app.affine.pro.plugin.HashCashPlugin
import app.affine.pro.plugin.NbStorePlugin
import app.affine.pro.plugin.PreviewPlugin
import app.affine.pro.service.GraphQLService
import app.affine.pro.service.SSEService
import app.affine.pro.service.WebService
@@ -52,6 +53,7 @@ class MainActivity : BridgeActivity(), AIButtonPlugin.Callback, AFFiNEThemePlugi
AuthPlugin::class.java,
HashCashPlugin::class.java,
NbStorePlugin::class.java,
PreviewPlugin::class.java,
)
)
}


@@ -1,8 +1,6 @@
package app.affine.pro.ai.chat
import com.affine.pro.graphql.GetCopilotHistoriesQuery
import com.affine.pro.graphql.fragment.CopilotChatHistory
import com.affine.pro.graphql.fragment.CopilotChatMessage
import kotlinx.datetime.Clock
import kotlinx.datetime.Instant
@@ -53,7 +51,7 @@ data class ChatMessage(
createAt = Clock.System.now(),
)
fun from(message: CopilotChatMessage) = ChatMessage(
fun from(message: CopilotChatHistory.Message) = ChatMessage(
id = message.id,
role = Role.fromValue(message.role),
content = message.content,


@@ -0,0 +1,106 @@
package app.affine.pro.plugin
import android.net.Uri
import com.getcapacitor.JSObject
import com.getcapacitor.Plugin
import com.getcapacitor.PluginCall
import com.getcapacitor.PluginMethod
import com.getcapacitor.annotation.CapacitorPlugin
import kotlinx.coroutines.Dispatchers
import timber.log.Timber
import uniffi.affine_mobile_native.renderMermaidPreviewSvg
import uniffi.affine_mobile_native.renderTypstPreviewSvg
import java.io.File
private fun JSObject.getOptionalString(key: String): String? {
return if (has(key) && !isNull(key)) getString(key) else null
}
private fun JSObject.getOptionalDouble(key: String): Double? {
return if (has(key) && !isNull(key)) getDouble(key) else null
}
private fun resolveLocalFontDir(fontUrl: String): String? {
val uri = Uri.parse(fontUrl)
val path = when {
uri.scheme == null -> {
val file = File(fontUrl)
if (!file.isAbsolute) {
return null
}
file.path
}
uri.scheme == "file" -> uri.path
else -> null
} ?: return null
val file = File(path)
val directory = if (file.isDirectory) file else file.parentFile ?: return null
return directory.absolutePath
}
private fun JSObject.resolveTypstFontDirs(): List<String>? {
if (!has("fontUrls") || isNull("fontUrls")) {
return null
}
val fontUrls = optJSONArray("fontUrls")
?: throw IllegalArgumentException("Typst preview fontUrls must be an array of strings.")
val fontDirs = buildList(fontUrls.length()) {
repeat(fontUrls.length()) { index ->
val fontUrl = fontUrls.optString(index, null)
?: throw IllegalArgumentException("Typst preview fontUrls must be strings.")
val fontDir = resolveLocalFontDir(fontUrl)
?: throw IllegalArgumentException("Typst preview on mobile only supports local font file URLs or absolute font directories.")
add(fontDir)
}
}
return fontDirs.distinct()
}
@CapacitorPlugin(name = "Preview")
class PreviewPlugin : Plugin() {
@PluginMethod
fun renderMermaidSvg(call: PluginCall) {
launch(Dispatchers.IO) {
try {
val code = call.getStringEnsure("code")
val options = call.getObject("options")
val svg = renderMermaidPreviewSvg(
code = code,
theme = options?.getOptionalString("theme"),
fontFamily = options?.getOptionalString("fontFamily"),
fontSize = options?.getOptionalDouble("fontSize"),
)
call.resolve(JSObject().apply {
put("svg", svg)
})
} catch (e: Exception) {
Timber.e(e, "Failed to render Mermaid preview.")
call.reject("Failed to render Mermaid preview.", null, e)
}
}
}
@PluginMethod
fun renderTypstSvg(call: PluginCall) {
launch(Dispatchers.IO) {
try {
val code = call.getStringEnsure("code")
val options = call.getObject("options")
val svg = renderTypstPreviewSvg(
code = code,
fontDirs = options?.resolveTypstFontDirs(),
cacheDir = context.cacheDir.absolutePath,
)
call.resolve(JSObject().apply {
put("svg", svg)
})
} catch (e: Exception) {
Timber.e(e, "Failed to render Typst preview.")
call.reject("Failed to render Typst preview.", null, e)
}
}
}
}


@@ -72,7 +72,7 @@ class GraphQLService @Inject constructor() {
).mapCatching { data ->
data.currentUser?.copilot?.chats?.paginatedCopilotChats?.edges?.map { item -> item.node.copilotChatHistory }?.firstOrNull { history ->
history.sessionId == sessionId
}?.messages?.map { msg -> msg.copilotChatMessage } ?: emptyList()
}?.messages ?: emptyList()
}
suspend fun getCopilotHistoryIds(


@@ -792,6 +792,10 @@ internal interface UniffiForeignFutureCompleteVoid : com.sun.jna.Callback {
@@ -816,6 +820,10 @@ internal interface IntegrityCheckingUniffiLib : Library {
): Short
fun uniffi_affine_mobile_native_checksum_func_new_doc_storage_pool(
): Short
fun uniffi_affine_mobile_native_checksum_func_render_mermaid_preview_svg(
): Short
fun uniffi_affine_mobile_native_checksum_func_render_typst_preview_svg(
): Short
fun uniffi_affine_mobile_native_checksum_method_docstoragepool_clear_clocks(
): Short
fun uniffi_affine_mobile_native_checksum_method_docstoragepool_connect(
@@ -1017,6 +1025,10 @@ fun uniffi_affine_mobile_native_fn_func_hashcash_mint(`resource`: RustBuffer.ByV
): RustBuffer.ByValue
fun uniffi_affine_mobile_native_fn_func_new_doc_storage_pool(uniffi_out_err: UniffiRustCallStatus,
): Pointer
fun uniffi_affine_mobile_native_fn_func_render_mermaid_preview_svg(`code`: RustBuffer.ByValue,`theme`: RustBuffer.ByValue,`fontFamily`: RustBuffer.ByValue,`fontSize`: RustBuffer.ByValue,uniffi_out_err: UniffiRustCallStatus,
): RustBuffer.ByValue
fun uniffi_affine_mobile_native_fn_func_render_typst_preview_svg(`code`: RustBuffer.ByValue,`fontDirs`: RustBuffer.ByValue,`cacheDir`: RustBuffer.ByValue,uniffi_out_err: UniffiRustCallStatus,
): RustBuffer.ByValue
fun ffi_affine_mobile_native_rustbuffer_alloc(`size`: Long,uniffi_out_err: UniffiRustCallStatus,
): RustBuffer.ByValue
fun ffi_affine_mobile_native_rustbuffer_from_bytes(`bytes`: ForeignBytes.ByValue,uniffi_out_err: UniffiRustCallStatus,
@@ -1149,6 +1161,12 @@ private fun uniffiCheckApiChecksums(lib: IntegrityCheckingUniffiLib) {
if (lib.uniffi_affine_mobile_native_checksum_func_new_doc_storage_pool() != 32882.toShort()) {
throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
}
if (lib.uniffi_affine_mobile_native_checksum_func_render_mermaid_preview_svg() != 54334.toShort()) {
throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
}
if (lib.uniffi_affine_mobile_native_checksum_func_render_typst_preview_svg() != 42796.toShort()) {
throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
}
if (lib.uniffi_affine_mobile_native_checksum_method_docstoragepool_clear_clocks() != 51151.toShort()) {
throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
}
@@ -3178,6 +3196,38 @@ public object FfiConverterOptionalLong: FfiConverterRustBuffer<kotlin.Long?> {
/**
* @suppress
*/
public object FfiConverterOptionalDouble: FfiConverterRustBuffer<kotlin.Double?> {
override fun read(buf: ByteBuffer): kotlin.Double? {
if (buf.get().toInt() == 0) {
return null
}
return FfiConverterDouble.read(buf)
}
override fun allocationSize(value: kotlin.Double?): ULong {
if (value == null) {
return 1UL
} else {
return 1UL + FfiConverterDouble.allocationSize(value)
}
}
override fun write(value: kotlin.Double?, buf: ByteBuffer) {
if (value == null) {
buf.put(0)
} else {
buf.put(1)
FfiConverterDouble.write(value, buf)
}
}
}
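The generated `FfiConverterOptionalDouble` follows UniFFI's standard optional wire format: one presence byte (0 = null, 1 = present) followed by the value, here an 8-byte big-endian IEEE-754 double (JVM `ByteBuffer` defaults to big-endian). A TypeScript sketch of the same layout, purely illustrative of the format rather than part of the generated bindings:

```typescript
// UniFFI-style optional<double> encoding: [tag byte][8-byte BE float64].
function writeOptionalDouble(value: number | null): Uint8Array {
  if (value === null) return Uint8Array.of(0); // absent: tag only
  const buf = new Uint8Array(9);
  buf[0] = 1; // present
  new DataView(buf.buffer).setFloat64(1, value, false); // big-endian
  return buf;
}

function readOptionalDouble(buf: Uint8Array): number | null {
  if (buf[0] === 0) return null;
  return new DataView(buf.buffer, buf.byteOffset).getFloat64(1, false);
}
```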
/**
* @suppress
*/
@@ -3584,4 +3634,24 @@ public object FfiConverterSequenceTypeSearchHit: FfiConverterRustBuffer<List<Sea
}
@Throws(UniffiException::class) fun `renderMermaidPreviewSvg`(`code`: kotlin.String, `theme`: kotlin.String?, `fontFamily`: kotlin.String?, `fontSize`: kotlin.Double?): kotlin.String {
return FfiConverterString.lift(
uniffiRustCallWithError(UniffiException) { _status ->
UniffiLib.INSTANCE.uniffi_affine_mobile_native_fn_func_render_mermaid_preview_svg(
FfiConverterString.lower(`code`),FfiConverterOptionalString.lower(`theme`),FfiConverterOptionalString.lower(`fontFamily`),FfiConverterOptionalDouble.lower(`fontSize`),_status)
}
)
}
@Throws(UniffiException::class) fun `renderTypstPreviewSvg`(`code`: kotlin.String, `fontDirs`: List<kotlin.String>?, `cacheDir`: kotlin.String?): kotlin.String {
return FfiConverterString.lift(
uniffiRustCallWithError(UniffiException) { _status ->
UniffiLib.INSTANCE.uniffi_affine_mobile_native_fn_func_render_typst_preview_svg(
FfiConverterString.lower(`code`),FfiConverterOptionalSequenceString.lower(`fontDirs`),FfiConverterOptionalString.lower(`cacheDir`),_status)
}
)
}


@@ -15,6 +15,7 @@ import {
ServersService,
ValidatorProvider,
} from '@affine/core/modules/cloud';
import { registerNativePreviewHandlers } from '@affine/core/modules/code-block-preview-renderer';
import { DocsService } from '@affine/core/modules/doc';
import { GlobalContextService } from '@affine/core/modules/global-context';
import { I18nProvider } from '@affine/core/modules/i18n';
@@ -54,6 +55,7 @@ import { AIButton } from './plugins/ai-button';
import { Auth } from './plugins/auth';
import { HashCash } from './plugins/hashcash';
import { NbStoreNativeDBApis } from './plugins/nbstore';
import { Preview } from './plugins/preview';
import { writeEndpointToken } from './proxy';
const storeManagerClient = createStoreManagerClient();
@@ -85,6 +87,11 @@ framework.impl(NbstoreProvider, {
});
const frameworkProvider = framework.provider();
registerNativePreviewHandlers({
renderMermaidSvg: request => Preview.renderMermaidSvg(request),
renderTypstSvg: request => Preview.renderTypstSvg(request),
});
framework.impl(PopupWindowProvider, {
open: (url: string) => {
InAppBrowser.open({


@@ -0,0 +1,16 @@
export interface PreviewPlugin {
renderMermaidSvg(options: {
code: string;
options?: {
theme?: string;
fontFamily?: string;
fontSize?: number;
};
}): Promise<{ svg: string }>;
renderTypstSvg(options: {
code: string;
options?: {
fontUrls?: string[];
};
}): Promise<{ svg: string }>;
}


@@ -0,0 +1,8 @@
import { registerPlugin } from '@capacitor/core';
import type { PreviewPlugin } from './definitions';
const Preview = registerPlugin<PreviewPlugin>('Preview');
export * from './definitions';
export { Preview };


@@ -1,4 +1,4 @@
import { parse } from 'node:path';
import { parse, resolve } from 'node:path';
import { DocStorage, ValidationResult } from '@affine/native';
import { parseUniversalId } from '@affine/nbstore';
@@ -71,10 +71,34 @@ function getDefaultDBFileName(name: string, id: string) {
return fileName.replace(/[/\\?%*:|"<>]/g, '-');
}
async function resolveExistingPath(path: string) {
if (!(await fs.pathExists(path))) {
return null;
}
try {
return await fs.realpath(path);
} catch {
return resolve(path);
}
}
async function isSameFilePath(sourcePath: string, targetPath: string) {
if (resolve(sourcePath) === resolve(targetPath)) {
return true;
}
const [sourceRealPath, targetRealPath] = await Promise.all([
resolveExistingPath(sourcePath),
resolveExistingPath(targetPath),
]);
return !!sourceRealPath && sourceRealPath === targetRealPath;
}
/**
* This function is called when the user clicks the "Save" button in the "Save Workspace" dialog.
*
* It will just copy the file to the given path
* It will export a compacted database file to the given path
*/
export async function saveDBFileAs(
universalId: string,
@@ -115,12 +139,26 @@ export async function saveDBFileAs(
const filePath = ret.filePath;
if (ret.canceled || !filePath) {
return {
canceled: true,
};
return { canceled: true };
}
await fs.copyFile(dbPath, filePath);
if (await isSameFilePath(dbPath, filePath)) {
return { error: 'DB_FILE_PATH_INVALID' };
}
const tempFilePath = `${filePath}.${nanoid(6)}.tmp`;
if (await fs.pathExists(tempFilePath)) {
await fs.remove(tempFilePath);
}
try {
await pool.vacuumInto(universalId, tempFilePath);
await fs.move(tempFilePath, filePath, { overwrite: true });
} finally {
if (await fs.pathExists(tempFilePath)) {
await fs.remove(tempFilePath);
}
}
logger.log('saved', filePath);
if (!fakedResult) {
mainRPC.showItemInFolder(filePath).catch(err => {
@@ -183,11 +221,7 @@ export async function loadDBFile(
const provided =
getFakedResult() ??
(dbFilePath
? {
filePath: dbFilePath,
filePaths: [dbFilePath],
canceled: false,
}
? { filePath: dbFilePath, filePaths: [dbFilePath], canceled: false }
: undefined);
const ret =
provided ??
@@ -224,6 +258,10 @@ export async function loadDBFile(
return await cpV1DBFile(originalPath, workspaceId);
}
if (!(await storage.validateImportSchema())) {
return { error: 'DB_FILE_INVALID' };
}
// v2 import logic
const internalFilePath = await getSpaceDBPath(
'local',
@@ -231,8 +269,8 @@ export async function loadDBFile(
workspaceId
);
await fs.ensureDir(parse(internalFilePath).dir);
await storage.vacuumInto(internalFilePath);
logger.info(`loadDBFile, vacuum: ${originalPath} -> ${internalFilePath}`);
storage = new DocStorage(internalFilePath);
await storage.setSpaceId(workspaceId);
@@ -260,17 +298,16 @@ async function cpV1DBFile(
return { error: 'DB_FILE_INVALID' }; // invalid db file
}
const connection = new SqliteConnection(originalPath);
if (!(await connection.validateImportSchema())) {
return { error: 'DB_FILE_INVALID' };
}
const internalFilePath = await getWorkspaceDBPath('workspace', workspaceId);
await fs.ensureDir(parse(internalFilePath).dir);
await connection.vacuumInto(internalFilePath);
logger.info(`loadDBFile, vacuum: ${originalPath} -> ${internalFilePath}`);
await storeWorkspaceMeta(workspaceId, {
id: workspaceId,

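The export flow above avoids writing directly to the destination: it vacuums into a `.tmp` sibling and then moves the result over the target, cleaning the temp file up on failure. The same write-then-rename pattern in isolation, sketched with plain `node:fs/promises` (hypothetical helper name):

```typescript
import { randomBytes } from 'node:crypto';
import fs from 'node:fs/promises';

// Produce the output into a temp sibling, then rename it over the target so a
// crash mid-write never leaves a truncated destination file. rename() is
// atomic when source and target live on the same filesystem.
async function writeFileAtomic(target: string, data: string): Promise<void> {
  const tempPath = `${target}.${randomBytes(3).toString('hex')}.tmp`;
  try {
    await fs.writeFile(tempPath, data);
    await fs.rename(tempPath, target);
  } finally {
    // No-op after a successful rename; removes the partial file on failure.
    await fs.rm(tempPath, { force: true });
  }
}
```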
View File

@@ -1,5 +1,6 @@
import { dialogHandlers } from './dialog';
import { dbEventsV1, dbHandlersV1, nbstoreHandlers } from './nbstore';
import { previewHandlers } from './preview';
import { provideExposed } from './provide';
import { workspaceEvents, workspaceHandlers } from './workspace';
@@ -8,6 +9,7 @@ export const handlers = {
nbstore: nbstoreHandlers,
workspace: workspaceHandlers,
dialog: dialogHandlers,
preview: previewHandlers,
};
export const events = {

View File

@@ -0,0 +1,69 @@
import fs from 'node:fs';
import path from 'node:path';
import {
type MermaidRenderRequest,
type MermaidRenderResult,
renderMermaidSvg,
renderTypstSvg,
type TypstRenderRequest,
type TypstRenderResult,
} from '@affine/native';
const TYPST_FONT_DIRS_ENV = 'AFFINE_TYPST_FONT_DIRS';
function parseTypstFontDirsFromEnv() {
const value = process.env[TYPST_FONT_DIRS_ENV];
if (!value) {
return [];
}
return value
.split(path.delimiter)
.map(dir => dir.trim())
.filter(Boolean);
}
function getTypstFontDirCandidates() {
const resourcesPath = process.resourcesPath ?? '';
return [
...parseTypstFontDirsFromEnv(),
path.join(resourcesPath, 'fonts'),
path.join(resourcesPath, 'js', 'fonts'),
path.join(resourcesPath, 'app.asar.unpacked', 'fonts'),
path.join(resourcesPath, 'app.asar.unpacked', 'js', 'fonts'),
];
}
function resolveTypstFontDirs() {
return Array.from(
new Set(getTypstFontDirCandidates().map(dir => path.resolve(dir)))
).filter(dir => fs.statSync(dir, { throwIfNoEntry: false })?.isDirectory());
}
function withTypstFontDirs(
request: TypstRenderRequest,
fontDirs: string[]
): TypstRenderRequest {
const nextOptions = request.options ? { ...request.options } : {};
if (!nextOptions.fontDirs?.length) {
nextOptions.fontDirs = fontDirs;
}
return { ...request, options: nextOptions };
}
const typstFontDirs = resolveTypstFontDirs();
export const previewHandlers = {
renderMermaidSvg: async (
request: MermaidRenderRequest
): Promise<MermaidRenderResult> => {
return renderMermaidSvg(request);
},
renderTypstSvg: async (
request: TypstRenderRequest
): Promise<TypstRenderResult> => {
return renderTypstSvg(withTypstFontDirs(request, typstFontDirs));
},
};
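The `AFFINE_TYPST_FONT_DIRS` handling above splits on the platform path delimiter, trims entries, drops empties, and de-duplicates after `path.resolve`. A condensed standalone sketch of that parsing (without the directory-existence filter the helper also applies):

```typescript
import path from 'node:path';

// Parse a PATH-style list ("a:b" on POSIX, "a;b" on Windows) into trimmed,
// resolved, de-duplicated entries, preserving first-seen order.
function parseDirList(value: string | undefined): string[] {
  if (!value) {
    return [];
  }
  const resolved = value
    .split(path.delimiter)
    .map(dir => dir.trim())
    .filter(Boolean)
    .map(dir => path.resolve(dir));
  return Array.from(new Set(resolved));
}
```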

View File

@@ -0,0 +1,268 @@
import { afterEach, describe, expect, test, vi } from 'vitest';
const connect = vi.fn();
const checkpoint = vi.fn();
const poolVacuumInto = vi.fn();
const pathExists = vi.fn();
const remove = vi.fn();
const move = vi.fn();
const realpath = vi.fn();
const copyFile = vi.fn();
const ensureDir = vi.fn();
const copy = vi.fn();
const storeWorkspaceMeta = vi.fn();
const getSpaceDBPath = vi.fn();
const getWorkspaceDBPath = vi.fn();
const getWorkspacesBasePath = vi.fn();
const docValidate = vi.fn();
const docValidateImportSchema = vi.fn();
const docVacuumInto = vi.fn();
const docSetSpaceId = vi.fn();
const sqliteValidate = vi.fn();
const sqliteValidateImportSchema = vi.fn();
const sqliteVacuumInto = vi.fn();
vi.doMock('nanoid', () => ({
nanoid: () => 'workspace-1',
}));
vi.doMock('@affine/native', () => {
const ValidationResult = {
MissingTables: 'MissingTables',
MissingDocIdColumn: 'MissingDocIdColumn',
MissingVersionColumn: 'MissingVersionColumn',
GeneralError: 'GeneralError',
Valid: 'Valid',
};
return {
ValidationResult,
DocStorage: class {
constructor(private readonly path: string) {}
validate() {
return docValidate(this.path);
}
validateImportSchema() {
return docValidateImportSchema(this.path);
}
vacuumInto(path: string) {
return docVacuumInto(this.path, path);
}
setSpaceId(spaceId: string) {
return docSetSpaceId(this.path, spaceId);
}
},
SqliteConnection: class {
static validate(path: string) {
return sqliteValidate(path);
}
constructor(private readonly path: string) {}
validateImportSchema() {
return sqliteValidateImportSchema(this.path);
}
vacuumInto(path: string) {
return sqliteVacuumInto(this.path, path);
}
},
};
});
vi.doMock('@affine/electron/helper/nbstore', () => ({
getDocStoragePool: () => ({
connect,
checkpoint,
vacuumInto: poolVacuumInto,
}),
}));
vi.doMock('@affine/electron/helper/main-rpc', () => ({
mainRPC: {
showItemInFolder: vi.fn(),
},
}));
vi.doMock('@affine/electron/helper/workspace/meta', () => ({
getSpaceDBPath,
getWorkspaceDBPath,
getWorkspacesBasePath,
}));
vi.doMock('@affine/electron/helper/workspace', () => ({
storeWorkspaceMeta,
}));
vi.doMock('fs-extra', () => ({
default: {
pathExists,
remove,
move,
realpath,
copyFile,
ensureDir,
copy,
},
}));
afterEach(() => {
vi.clearAllMocks();
vi.resetModules();
});
describe('dialog export', () => {
test('saveDBFileAs exports a vacuumed backup instead of copying the live db', async () => {
const dbPath = '/tmp/workspace/storage.db';
const exportPath = '/tmp/export.affine';
const tempExportPath = '/tmp/export.affine.workspace-1.tmp';
const id = '@peer(local);@type(workspace);@id(workspace-1);';
pathExists.mockImplementation(async path => path === dbPath);
realpath.mockImplementation(async path => path);
getSpaceDBPath.mockResolvedValue(dbPath);
move.mockResolvedValue(undefined);
const { saveDBFileAs, setFakeDialogResult } =
await import('@affine/electron/helper/dialog/dialog');
setFakeDialogResult({ filePath: exportPath });
const result = await saveDBFileAs(id, 'My Space');
expect(result).toEqual({ filePath: exportPath });
expect(connect).toHaveBeenCalledWith(id, dbPath);
expect(checkpoint).toHaveBeenCalledWith(id);
expect(poolVacuumInto).toHaveBeenCalledWith(id, tempExportPath);
expect(move).toHaveBeenCalledWith(tempExportPath, exportPath, {
overwrite: true,
});
expect(remove).not.toHaveBeenCalledWith(exportPath);
expect(copyFile).not.toHaveBeenCalled();
});
test('saveDBFileAs rejects exporting over the live database path', async () => {
const dbPath = '/tmp/workspace/storage.db';
const id = '@peer(local);@type(workspace);@id(workspace-1);';
pathExists.mockResolvedValue(false);
getSpaceDBPath.mockResolvedValue(dbPath);
const { saveDBFileAs, setFakeDialogResult } =
await import('@affine/electron/helper/dialog/dialog');
setFakeDialogResult({ filePath: dbPath });
const result = await saveDBFileAs(id, 'My Space');
expect(result).toEqual({ error: 'DB_FILE_PATH_INVALID' });
expect(poolVacuumInto).not.toHaveBeenCalled();
expect(copyFile).not.toHaveBeenCalled();
});
test('saveDBFileAs rejects exporting to a symlink alias of the live database', async () => {
const dbPath = '/tmp/workspace/storage.db';
const exportPath = '/tmp/alias.affine';
const id = '@peer(local);@type(workspace);@id(workspace-1);';
pathExists.mockResolvedValue(true);
realpath.mockImplementation(async path =>
path === exportPath ? dbPath : path
);
getSpaceDBPath.mockResolvedValue(dbPath);
const { saveDBFileAs, setFakeDialogResult } =
await import('@affine/electron/helper/dialog/dialog');
setFakeDialogResult({ filePath: exportPath });
const result = await saveDBFileAs(id, 'My Space');
expect(result).toEqual({ error: 'DB_FILE_PATH_INVALID' });
expect(poolVacuumInto).not.toHaveBeenCalled();
expect(move).not.toHaveBeenCalled();
});
});
describe('dialog import', () => {
test('loadDBFile validates schema and vacuums v2 imports into internal storage', async () => {
const originalPath = '/tmp/import.affine';
const internalPath = '/app/workspaces/local/workspace-1/storage.db';
getWorkspacesBasePath.mockResolvedValue('/app/workspaces');
getSpaceDBPath.mockResolvedValue(internalPath);
docValidate.mockResolvedValue(true);
docValidateImportSchema.mockResolvedValue(true);
docVacuumInto.mockResolvedValue(undefined);
docSetSpaceId.mockResolvedValue(undefined);
ensureDir.mockResolvedValue(undefined);
const { loadDBFile, setFakeDialogResult } =
await import('@affine/electron/helper/dialog/dialog');
setFakeDialogResult({ filePath: originalPath });
const result = await loadDBFile();
expect(result).toEqual({ workspaceId: 'workspace-1' });
expect(docValidate).toHaveBeenCalledWith(originalPath);
expect(docValidateImportSchema).toHaveBeenCalledWith(originalPath);
expect(docVacuumInto).toHaveBeenCalledWith(originalPath, internalPath);
expect(docSetSpaceId).toHaveBeenCalledWith(internalPath, 'workspace-1');
expect(copy).not.toHaveBeenCalled();
});
test('loadDBFile rejects v2 imports with unexpected schema objects', async () => {
const originalPath = '/tmp/import.affine';
getWorkspacesBasePath.mockResolvedValue('/app/workspaces');
docValidate.mockResolvedValue(true);
docValidateImportSchema.mockResolvedValue(false);
const { loadDBFile, setFakeDialogResult } =
await import('@affine/electron/helper/dialog/dialog');
setFakeDialogResult({ filePath: originalPath });
const result = await loadDBFile();
expect(result).toEqual({ error: 'DB_FILE_INVALID' });
expect(docVacuumInto).not.toHaveBeenCalled();
expect(copy).not.toHaveBeenCalled();
});
test('loadDBFile validates schema and vacuums v1 imports into internal storage', async () => {
const originalPath = '/tmp/import-v1.affine';
const internalPath = '/app/workspaces/workspace-1/storage.db';
getWorkspacesBasePath.mockResolvedValue('/app/workspaces');
getWorkspaceDBPath.mockResolvedValue(internalPath);
docValidate.mockResolvedValue(false);
sqliteValidate.mockResolvedValue('Valid');
sqliteValidateImportSchema.mockResolvedValue(true);
sqliteVacuumInto.mockResolvedValue(undefined);
ensureDir.mockResolvedValue(undefined);
const { loadDBFile, setFakeDialogResult } =
await import('@affine/electron/helper/dialog/dialog');
setFakeDialogResult({ filePath: originalPath });
const result = await loadDBFile();
expect(result).toEqual({ workspaceId: 'workspace-1' });
expect(sqliteValidate).toHaveBeenCalledWith(originalPath);
expect(sqliteValidateImportSchema).toHaveBeenCalledWith(originalPath);
expect(ensureDir).toHaveBeenCalledWith('/app/workspaces/workspace-1');
expect(sqliteVacuumInto).toHaveBeenCalledWith(originalPath, internalPath);
expect(storeWorkspaceMeta).toHaveBeenCalledWith('workspace-1', {
id: 'workspace-1',
mainDBPath: internalPath,
});
expect(copy).not.toHaveBeenCalled();
});
});

View File

@@ -0,0 +1,85 @@
import path from 'node:path';
import fs from 'fs-extra';
import { afterEach, beforeEach, describe, expect, test, vi } from 'vitest';
const { native } = vi.hoisted(() => ({
native: {
renderMermaidSvg: vi.fn(),
renderTypstSvg: vi.fn(),
},
}));
vi.mock('@affine/native', () => native);
const tmpDir = path.join(__dirname, 'tmp');
const typstFontDirA = path.join(tmpDir, 'fonts-a');
const typstFontDirB = path.join(tmpDir, 'fonts-b');
async function loadPreviewHandlers() {
vi.resetModules();
const module = await import('../../src/helper/preview');
return module.previewHandlers;
}
describe('helper preview handlers', () => {
beforeEach(async () => {
await fs.ensureDir(typstFontDirA);
await fs.ensureDir(typstFontDirB);
process.env.AFFINE_TYPST_FONT_DIRS = [
typstFontDirA,
typstFontDirB,
path.join(tmpDir, 'missing'),
].join(path.delimiter);
native.renderMermaidSvg.mockReset();
native.renderTypstSvg.mockReset();
native.renderMermaidSvg.mockReturnValue({
svg: '<svg><text>mermaid</text></svg>',
});
native.renderTypstSvg.mockReturnValue({
svg: '<svg><text>typst</text></svg>',
});
});
afterEach(async () => {
delete process.env.AFFINE_TYPST_FONT_DIRS;
await fs.remove(tmpDir);
});
test('passes mermaid request to native renderer', async () => {
const previewHandlers = await loadPreviewHandlers();
const request = { code: 'flowchart TD; A-->B' };
await previewHandlers.renderMermaidSvg(request);
expect(native.renderMermaidSvg).toHaveBeenCalledWith(request);
});
test('injects resolved fontDirs into typst requests', async () => {
const previewHandlers = await loadPreviewHandlers();
await previewHandlers.renderTypstSvg({ code: '= hello' });
const [request] = native.renderTypstSvg.mock.calls[0];
expect(request.options?.fontDirs).toEqual(
expect.arrayContaining([
path.resolve(typstFontDirA),
path.resolve(typstFontDirB),
])
);
});
test('keeps explicit typst fontDirs', async () => {
const previewHandlers = await loadPreviewHandlers();
const request = {
code: '= hello',
options: {
fontDirs: ['/tmp/custom-fonts'],
},
};
await previewHandlers.renderTypstSvg(request);
expect(native.renderTypstSvg).toHaveBeenCalledWith(request);
});
});

View File

@@ -14,8 +14,8 @@
"kind" : "remoteSourceControl",
"location" : "https://github.com/Recouse/EventSource",
"state" : {
"revision" : "7b2f4f585d3927876bd76eaede9fdff779eff102",
"version" : "0.1.5"
"revision" : "713f8c0a0270a80a968c007ddc0d6067e80a5393",
"version" : "0.1.7"
}
},
{
@@ -41,8 +41,8 @@
"kind" : "remoteSourceControl",
"location" : "https://github.com/Lakr233/Litext",
"state" : {
"revision" : "c7e83f2f580ce34a102ca9ba9d2bb24e507dccd9",
"version" : "0.5.6"
"revision" : "a2ed9b63ae623a20591effc72f9db7d04e41a64c",
"version" : "1.2.1"
}
},
{
@@ -77,8 +77,8 @@
"kind" : "remoteSourceControl",
"location" : "https://github.com/RevenueCat/purchases-ios-spm.git",
"state" : {
"revision" : "8f5df97653eb361a2097119479332afccf0aa816",
"version" : "5.58.0"
"revision" : "2913a336eb37dc06795cdbaa5b5de330b6707669",
"version" : "5.65.0"
}
},
{

View File

@@ -34,6 +34,7 @@ class AFFiNEViewController: CAPBridgeViewController {
NavigationGesturePlugin(),
NbStorePlugin(),
PayWallPlugin(associatedController: self),
PreviewPlugin(),
]
plugins.forEach { bridge?.registerPluginInstance($0) }
}

View File

@@ -0,0 +1,119 @@
import Foundation
import Capacitor
private func resolveLocalFontDir(from fontURL: String) -> String? {
let path: String
if fontURL.hasPrefix("file://") {
guard let url = URL(string: fontURL), url.isFileURL else {
return nil
}
path = url.path
} else {
let candidate = (fontURL as NSString).standardizingPath
guard candidate.hasPrefix("/") else {
return nil
}
path = candidate
}
var isDirectory: ObjCBool = false
if FileManager.default.fileExists(atPath: path, isDirectory: &isDirectory),
isDirectory.boolValue
{
return path
}
let directory = (path as NSString).deletingLastPathComponent
return directory.isEmpty ? nil : directory
}
private func resolveTypstFontDirs(from options: [AnyHashable: Any]?) throws -> [String]? {
guard let rawFontUrls = options?["fontUrls"] else {
return nil
}
guard let fontUrls = rawFontUrls as? [Any] else {
throw NSError(
domain: "PreviewPlugin",
code: 1,
userInfo: [
NSLocalizedDescriptionKey: "Typst preview fontUrls must be an array of strings."
]
)
}
var seenFontDirs = Set<String>()
var orderedFontDirs = [String]()
orderedFontDirs.reserveCapacity(fontUrls.count)
for fontUrl in fontUrls {
guard let fontURL = fontUrl as? String else {
throw NSError(
domain: "PreviewPlugin",
code: 1,
userInfo: [
NSLocalizedDescriptionKey: "Typst preview fontUrls must be strings."
]
)
}
guard let fontDir = resolveLocalFontDir(from: fontURL) else {
throw NSError(
domain: "PreviewPlugin",
code: 1,
userInfo: [
NSLocalizedDescriptionKey: "Typst preview on mobile only supports local font file URLs or absolute font directories."
]
)
}
if seenFontDirs.insert(fontDir).inserted {
orderedFontDirs.append(fontDir)
}
}
return orderedFontDirs
}
@objc(PreviewPlugin)
public class PreviewPlugin: CAPPlugin, CAPBridgedPlugin {
public let identifier = "PreviewPlugin"
public let jsName = "Preview"
public let pluginMethods: [CAPPluginMethod] = [
CAPPluginMethod(name: "renderMermaidSvg", returnType: CAPPluginReturnPromise),
CAPPluginMethod(name: "renderTypstSvg", returnType: CAPPluginReturnPromise),
]
@objc func renderMermaidSvg(_ call: CAPPluginCall) {
DispatchQueue.global(qos: .userInitiated).async {
do {
let code = try call.getStringEnsure("code")
let options = call.getObject("options")
let svg = try renderMermaidPreviewSvg(
code: code,
theme: options?["theme"] as? String,
fontFamily: options?["fontFamily"] as? String,
fontSize: (options?["fontSize"] as? NSNumber)?.doubleValue
)
call.resolve(["svg": svg])
} catch {
call.reject("Failed to render Mermaid preview, \(error)", nil, error)
}
}
}
@objc func renderTypstSvg(_ call: CAPPluginCall) {
DispatchQueue.global(qos: .userInitiated).async {
do {
let code = try call.getStringEnsure("code")
let options = call.getObject("options")
let cacheDir = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask).first?.path
let fontDirs = try resolveTypstFontDirs(from: options)
let svg = try renderTypstPreviewSvg(code: code, fontDirs: fontDirs, cacheDir: cacheDir)
call.resolve(["svg": svg])
} catch {
call.reject("Failed to render Typst preview, \(error)", nil, error)
}
}
}
}
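The Swift `resolveLocalFontDir` above maps a `file://` URL or an absolute path to the containing font directory. A TypeScript mirror of the same logic, for comparison (it omits the `FileManager` is-directory check the Swift version performs, so a directory passed directly resolves to its parent):

```typescript
import path from 'node:path';
import { fileURLToPath } from 'node:url';

// Accept a file:// URL or an absolute path; reject anything else.
// Returns the directory containing the font file path.
function resolveLocalFontDir(fontUrl: string): string | null {
  let filePath: string;
  if (fontUrl.startsWith('file://')) {
    try {
      filePath = fileURLToPath(fontUrl);
    } catch {
      return null; // malformed or non-local URL
    }
  } else if (path.isAbsolute(fontUrl)) {
    filePath = path.normalize(fontUrl);
  } else {
    return null; // relative paths are not accepted
  }
  return path.dirname(filePath);
}
```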

View File

@@ -2265,6 +2265,30 @@ fileprivate struct FfiConverterOptionInt64: FfiConverterRustBuffer {
}
}
#if swift(>=5.8)
@_documentation(visibility: private)
#endif
fileprivate struct FfiConverterOptionDouble: FfiConverterRustBuffer {
typealias SwiftType = Double?
public static func write(_ value: SwiftType, into buf: inout [UInt8]) {
guard let value = value else {
writeInt(&buf, Int8(0))
return
}
writeInt(&buf, Int8(1))
FfiConverterDouble.write(value, into: &buf)
}
public static func read(from buf: inout (data: Data, offset: Data.Index)) throws -> SwiftType {
switch try readInt(&buf) as Int8 {
case 0: return nil
case 1: return try FfiConverterDouble.read(from: &buf)
default: throw UniffiInternalError.unexpectedOptionalTag
}
}
}
#if swift(>=5.8)
@_documentation(visibility: private)
#endif
@@ -2644,6 +2668,25 @@ public func newDocStoragePool() -> DocStoragePool {
)
})
}
public func renderMermaidPreviewSvg(code: String, theme: String?, fontFamily: String?, fontSize: Double?)throws -> String {
return try FfiConverterString.lift(try rustCallWithError(FfiConverterTypeUniffiError_lift) {
uniffi_affine_mobile_native_fn_func_render_mermaid_preview_svg(
FfiConverterString.lower(code),
FfiConverterOptionString.lower(theme),
FfiConverterOptionString.lower(fontFamily),
FfiConverterOptionDouble.lower(fontSize),$0
)
})
}
public func renderTypstPreviewSvg(code: String, fontDirs: [String]?, cacheDir: String?)throws -> String {
return try FfiConverterString.lift(try rustCallWithError(FfiConverterTypeUniffiError_lift) {
uniffi_affine_mobile_native_fn_func_render_typst_preview_svg(
FfiConverterString.lower(code),
FfiConverterOptionSequenceString.lower(fontDirs),
FfiConverterOptionString.lower(cacheDir),$0
)
})
}
private enum InitializationResult {
case ok
@@ -2666,6 +2709,12 @@ private let initializationResult: InitializationResult = {
if (uniffi_affine_mobile_native_checksum_func_new_doc_storage_pool() != 32882) {
return InitializationResult.apiChecksumMismatch
}
if (uniffi_affine_mobile_native_checksum_func_render_mermaid_preview_svg() != 54334) {
return InitializationResult.apiChecksumMismatch
}
if (uniffi_affine_mobile_native_checksum_func_render_typst_preview_svg() != 42796) {
return InitializationResult.apiChecksumMismatch
}
if (uniffi_affine_mobile_native_checksum_method_docstoragepool_clear_clocks() != 51151) {
return InitializationResult.apiChecksumMismatch
}

View File

@@ -450,6 +450,16 @@ RustBuffer uniffi_affine_mobile_native_fn_func_hashcash_mint(RustBuffer resource
#define UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_FN_FUNC_NEW_DOC_STORAGE_POOL
void*_Nonnull uniffi_affine_mobile_native_fn_func_new_doc_storage_pool(RustCallStatus *_Nonnull out_status
);
#endif
#ifndef UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_FN_FUNC_RENDER_MERMAID_PREVIEW_SVG
#define UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_FN_FUNC_RENDER_MERMAID_PREVIEW_SVG
RustBuffer uniffi_affine_mobile_native_fn_func_render_mermaid_preview_svg(RustBuffer code, RustBuffer theme, RustBuffer font_family, RustBuffer font_size, RustCallStatus *_Nonnull out_status
);
#endif
#ifndef UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_FN_FUNC_RENDER_TYPST_PREVIEW_SVG
#define UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_FN_FUNC_RENDER_TYPST_PREVIEW_SVG
RustBuffer uniffi_affine_mobile_native_fn_func_render_typst_preview_svg(RustBuffer code, RustBuffer font_dirs, RustBuffer cache_dir, RustCallStatus *_Nonnull out_status
);
#endif
#ifndef UNIFFI_FFIDEF_FFI_AFFINE_MOBILE_NATIVE_RUSTBUFFER_ALLOC
@@ -742,6 +752,18 @@ uint16_t uniffi_affine_mobile_native_checksum_func_hashcash_mint(void
#define UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_CHECKSUM_FUNC_NEW_DOC_STORAGE_POOL
uint16_t uniffi_affine_mobile_native_checksum_func_new_doc_storage_pool(void
);
#endif
#ifndef UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_CHECKSUM_FUNC_RENDER_MERMAID_PREVIEW_SVG
#define UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_CHECKSUM_FUNC_RENDER_MERMAID_PREVIEW_SVG
uint16_t uniffi_affine_mobile_native_checksum_func_render_mermaid_preview_svg(void
);
#endif
#ifndef UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_CHECKSUM_FUNC_RENDER_TYPST_PREVIEW_SVG
#define UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_CHECKSUM_FUNC_RENDER_TYPST_PREVIEW_SVG
uint16_t uniffi_affine_mobile_native_checksum_func_render_typst_preview_svg(void
);
#endif
#ifndef UNIFFI_FFIDEF_UNIFFI_AFFINE_MOBILE_NATIVE_CHECKSUM_METHOD_DOCSTORAGEPOOL_CLEAR_CLOCKS

View File

@@ -8,4 +8,18 @@
import ApolloAPI
/// The `JSON` scalar type represents JSON values as specified by [ECMA-404](http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf).
public struct JSON: CustomScalarType, Hashable, ExpressibleByDictionaryLiteral {
public let value: JSONValue
public init(_jsonValue value: JSONValue) throws {
self.value = value
}
public init(dictionaryLiteral elements: (String, JSONValue)...) {
value = ApolloAPI.JSONObject(uniqueKeysWithValues: elements) as JSONValue
}
public var _jsonValue: JSONValue {
value
}
}

View File

@@ -8,4 +8,22 @@
import ApolloAPI
/// The `JSONObject` scalar type represents JSON objects as specified by [ECMA-404](http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf).
public struct JSONObject: CustomScalarType, Hashable, ExpressibleByDictionaryLiteral {
public let object: ApolloAPI.JSONObject
public init(_jsonValue value: JSONValue) throws {
object = try ApolloAPI.JSONObject(_jsonValue: value)
}
public init(_ object: ApolloAPI.JSONObject) {
self.object = object
}
public init(dictionaryLiteral elements: (String, JSONValue)...) {
object = ApolloAPI.JSONObject(uniqueKeysWithValues: elements)
}
public var _jsonValue: JSONValue {
object
}
}

View File

@@ -26,6 +26,7 @@ private extension InputBoxData {
}
public extension ChatManager {
@MainActor
func startUserRequest(editorData: InputBoxData, sessionId: String) {
append(sessionId: sessionId, UserMessageCellViewModel(
id: .init(),
@@ -163,7 +164,7 @@ private extension ChatManager {
assert(!Thread.isMainThread)
print("[+] starting copilot response for session: \(sessionId)")
let messageParameters: AffineGraphQL.JSON = [
// packages/frontend/core/src/blocksuite/ai/provider/setup-provider.tsx
"docs": editorData.documentAttachments.map(\.documentID), // affine doc
"files": [String](), // attachment in context, keep nil for now
@@ -193,18 +194,14 @@ private extension ChatManager {
},
].flatMap(\.self)
assert(uploadableAttachments.allSatisfy { !($0.data?.isEmpty ?? true) })
let input = CreateChatMessageInput(
attachments: [],
blob: attachmentCount == 1 ? "" : .none,
blobs: attachmentCount > 1 && attachmentCount != 0 ? .some([]) : .none,
content: .some(contextSnippet.isEmpty ? editorData.text : "\(contextSnippet)\n\(editorData.text)"),
params: .some(messageParameters),
sessionId: sessionId
)
let mutation = CreateCopilotMessageMutation(options: input)
QLService.shared.client.upload(operation: mutation, files: uploadableAttachments) { result in
print("[*] createCopilotMessage result: \(result)")
@@ -277,7 +274,7 @@ private extension ChatManager {
let eventSource = EventSource()
let dataTask = eventSource.dataTask(for: request)
var document = ""
await self.writeMarkdownContent(document + loadingIndicator, sessionId: sessionId, vmId: vmId)
for await event in dataTask.events() {
switch event {
case .open:
@@ -287,7 +284,7 @@ private extension ChatManager {
case let .event(event):
guard let data = event.data else { continue }
document += data
await self.writeMarkdownContent(
document + loadingIndicator,
sessionId: sessionId,
vmId: vmId
@@ -297,13 +294,13 @@ private extension ChatManager {
print("[*] connection closed")
}
}
await self.writeMarkdownContent(document, sessionId: sessionId, vmId: vmId)
self.closeAll()
}))
self.closable.append(closable)
}
@MainActor private func writeMarkdownContent(
_ document: String,
sessionId: SessionID,
vmId: UUID

View File

@@ -9,7 +9,7 @@ import Foundation
import MarkdownView
extension IntelligentContext {
@MainActor func prepareMarkdownViewThemes() {
MarkdownTheme.default.colors.body = .affineTextPrimary
MarkdownTheme.default.colors.highlight = .affineTextLink
}

View File

@@ -40,7 +40,7 @@ struct AssistantMessageCellViewModel: ChatCellViewModel {
var preprocessedContent: MarkdownTextView.PreprocessedContent
@MainActor init(
id: UUID,
content: String,
timestamp: Date,

View File

@@ -45,13 +45,13 @@ EXTERNAL SOURCES:
:path: "../../../../../node_modules/capacitor-plugin-app-tracking-transparency"
SPEC CHECKSUMS:
Capacitor: a5bf59e09f9dd82694fdcca4d107b4d215ac470f
CapacitorApp: 3ddbd30ac18c321531c3da5e707b60873d89dd60
CapacitorBrowser: 66aa8ff09cdca2a327ce464b113b470e6f667753
CapacitorCordova: 31bbe4466000c6b86d9b7f1181ee286cff0205aa
CapacitorHaptics: d17da7dd984cae34111b3f097ccd3e21f9feec62
CapacitorKeyboard: 45cae3956a6f4fb1753f9a4df3e884aeaed8fe82
CapacitorPluginAppTrackingTransparency: 2a2792623a5a72795f2e8f9ab3f1147573732fd8
CryptoSwift: 967f37cea5a3294d9cce358f78861652155be483
PODFILE CHECKSUM: 2c1e4be82121f2d9724ecf7e31dd14e165aeb082

View File

@@ -17,6 +17,7 @@ import {
SubscriptionService,
ValidatorProvider,
} from '@affine/core/modules/cloud';
import { registerNativePreviewHandlers } from '@affine/core/modules/code-block-preview-renderer';
import { DocsService } from '@affine/core/modules/doc';
import { FeatureFlagService } from '@affine/core/modules/feature-flag';
import { GlobalContextService } from '@affine/core/modules/global-context';
@@ -71,6 +72,7 @@ import { Auth } from './plugins/auth';
import { Hashcash } from './plugins/hashcash';
import { NbStoreNativeDBApis } from './plugins/nbstore';
import { PayWall } from './plugins/paywall';
import { Preview } from './plugins/preview';
import { writeEndpointToken } from './proxy';
import { enableNavigationGesture$ } from './web-navigation-control';
@@ -215,6 +217,11 @@ framework.impl(NativePaywallProvider, {
const frameworkProvider = framework.provider();
registerNativePreviewHandlers({
renderMermaidSvg: request => Preview.renderMermaidSvg(request),
renderTypstSvg: request => Preview.renderTypstSvg(request),
});
// ------ some apis for native ------
(window as any).getCurrentServerBaseUrl = () => {
const globalContextService = frameworkProvider.get(GlobalContextService);

View File

@@ -0,0 +1,16 @@
export interface PreviewPlugin {
renderMermaidSvg(options: {
code: string;
options?: {
theme?: string;
fontFamily?: string;
fontSize?: number;
};
}): Promise<{ svg: string }>;
renderTypstSvg(options: {
code: string;
options?: {
fontUrls?: string[];
};
}): Promise<{ svg: string }>;
}

View File

@@ -0,0 +1,8 @@
import { registerPlugin } from '@capacitor/core';
import type { PreviewPlugin } from './definitions';
const Preview = registerPlugin<PreviewPlugin>('Preview');
export * from './definitions';
export { Preview };

View File

@@ -126,7 +126,7 @@ export const DayPicker = memo(function DayPicker(
data-month={cursor.month()}
data-year={cursor.year()}
>
{monthNames.split(/[,،]/)[cursor.month()]}
</button>
<button
className={styles.calendarHeaderTriggerButton}
@@ -172,7 +172,7 @@ export const DayPicker = memo(function DayPicker(
<main className={styles.monthViewBody}>
{/* weekDays */}
<div className={styles.monthViewRow}>
{weekDays.split(/[,،]/).map(day => (
<div
key={day}
className={clsx(

View File

@@ -145,7 +145,7 @@ export const MonthPicker = memo(function MonthPicker(
tabIndex={month.isSame(monthCursor, 'month') ? 0 : -1}
aria-label={monthValue}
>
{monthNames.split(/[,،]/)[month.month()]}
</button>
</div>
);
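The change in both pickers swaps `split(',')` for `split(/[,،]/)` so month and weekday lists joined with the Arabic comma (U+060C) split correctly as well as ASCII-comma lists. A minimal illustration:

```typescript
// Locale lists may be joined with "," or the Arabic comma "،";
// splitting on the character class handles both separators.
function splitLocaleList(value: string): string[] {
  return value.split(/[,،]/);
}
```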

View File

@@ -47,6 +47,7 @@
"@radix-ui/react-toolbar": "^1.1.1",
"@sentry/react": "^10.40.0",
"@toeverything/infra": "workspace:*",
"@toeverything/mermaid-wasm": "^0.1.0",
"@toeverything/pdf-viewer": "^0.1.1",
"@toeverything/theme": "^1.1.23",
"@vanilla-extract/dynamic": "^2.1.2",
@@ -57,6 +58,7 @@
"cmdk": "^1.0.4",
"core-js": "^3.39.0",
"dayjs": "^1.11.13",
"dompurify": "^3.3.0",
"eventemitter2": "^6.4.9",
"file-type": "^21.0.0",
"filesize": "^10.1.6",
@@ -76,7 +78,7 @@
"lit": "^3.2.1",
"lodash-es": "^4.17.23",
"lottie-react": "^2.4.0",
"mermaid": "^11.12.2",
"mermaid": "^11.13.0",
"mp4-muxer": "^5.2.2",
"nanoid": "^5.1.6",
"next-themes": "^0.4.4",

View File

@@ -0,0 +1,63 @@
import {
attachOAuthFlowToAuthUrl,
parseOAuthCallbackState,
resolveOAuthFlowMode,
resolveOAuthRedirect,
} from '@affine/core/desktop/pages/auth/oauth-flow';
import { describe, expect, test } from 'vitest';
describe('oauth flow mode', () => {
test('defaults to redirect for missing or unknown values', () => {
expect(resolveOAuthFlowMode()).toBe('redirect');
expect(resolveOAuthFlowMode(null)).toBe('redirect');
expect(resolveOAuthFlowMode('unknown')).toBe('redirect');
});
test('persists flow in oauth state instead of web storage', () => {
const url = attachOAuthFlowToAuthUrl(
'https://example.com/auth?state=%7B%22state%22%3A%22nonce%22%2C%22provider%22%3A%22Google%22%2C%22client%22%3A%22web%22%7D',
'redirect'
);
expect(
parseOAuthCallbackState(new URL(url).searchParams.get('state')!)
).toEqual({
client: 'web',
flow: 'redirect',
provider: 'Google',
state: 'nonce',
});
});
test('falls back to popup when callback state has no flow', () => {
expect(
parseOAuthCallbackState(
JSON.stringify({ client: 'web', provider: 'Google', state: 'nonce' })
).flow
).toBe('popup');
});
test('keeps same-origin redirects direct', () => {
expect(resolveOAuthRedirect('/workspace', 'https://app.affine.pro')).toBe(
'/workspace'
);
expect(
resolveOAuthRedirect(
'https://app.affine.pro/workspace?from=oauth',
'https://app.affine.pro'
)
).toBe('https://app.affine.pro/workspace?from=oauth');
});
test('wraps external redirects with redirect-proxy', () => {
expect(
resolveOAuthRedirect(
'https://github.com/toeverything/AFFiNE',
'https://app.affine.pro'
)
).toBe(
'https://app.affine.pro/redirect-proxy?redirect_uri=https%3A%2F%2Fgithub.com%2Ftoeverything%2FAFFiNE'
);
});
});

View File

@@ -1,3 +1,4 @@
import { renderMermaidSvg } from '@affine/core/modules/code-block-preview-renderer/bridge';
import { CodeBlockPreviewExtension } from '@blocksuite/affine/blocks/code';
import { SignalWatcher, WithDisposable } from '@blocksuite/affine/global/lit';
import type { CodeBlockModel } from '@blocksuite/affine/model';
@@ -7,7 +8,6 @@ import { css, html, nothing, type PropertyValues } from 'lit';
import { property, query, state } from 'lit/decorators.js';
import { choose } from 'lit/directives/choose.js';
import { styleMap } from 'lit/directives/style-map.js';
import type { Mermaid } from 'mermaid';
export const CodeBlockMermaidPreview = CodeBlockPreviewExtension(
'mermaid',
@@ -154,7 +154,6 @@ export class MermaidPreview extends SignalWatcher(
@query('.mermaid-preview-container')
accessor container!: HTMLDivElement;
private mermaid: Mermaid | null = null;
private retryCount = 0;
private readonly maxRetries = 3;
private renderTimeout: ReturnType<typeof setTimeout> | null = null;
@@ -169,9 +168,6 @@ export class MermaidPreview extends SignalWatcher(
private lastMouseY = 0;
override firstUpdated(_changedProperties: PropertyValues): void {
this._loadMermaid().catch(error => {
console.error('Failed to load mermaid in firstUpdated:', error);
});
this._scheduleRender();
this._setupEventListeners();
@@ -271,7 +267,8 @@ export class MermaidPreview extends SignalWatcher(
event.preventDefault();
const delta = event.deltaY > 0 ? 0.9 : 1.1;
const newScale = Math.max(0.1, Math.min(5, this.scale * delta));
const previousScale = this.scale;
const newScale = Math.max(0.1, Math.min(5, previousScale * delta));
// calculate mouse position relative to container
const rect = this.container.getBoundingClientRect();
@@ -284,8 +281,8 @@ export class MermaidPreview extends SignalWatcher(
// update transform
this.scale = newScale;
this.translateX = mouseX - scaleCenterX * (newScale / this.scale);
this.translateY = mouseY - scaleCenterY * (newScale / this.scale);
this.translateX = mouseX - scaleCenterX * (newScale / previousScale);
this.translateY = mouseY - scaleCenterY * (newScale / previousScale);
this._updateTransform();
};
@@ -309,44 +306,6 @@ export class MermaidPreview extends SignalWatcher(
);
}
private async _loadMermaid() {
try {
// dynamic load mermaid
const mermaidModule = await import('mermaid');
this.mermaid = mermaidModule.default;
// initialize mermaid
this.mermaid.initialize({
startOnLoad: false,
theme: 'default',
securityLevel: 'strict',
fontFamily: 'IBM Plex Mono',
flowchart: {
useMaxWidth: true,
htmlLabels: true,
},
sequence: {
useMaxWidth: true,
},
gantt: {
useMaxWidth: true,
},
pie: {
useMaxWidth: true,
},
journey: {
useMaxWidth: true,
},
gitGraph: {
useMaxWidth: true,
},
});
} catch (error) {
console.error('Failed to load mermaid:', error);
this.state = 'error';
}
}
private async _render() {
// prevent duplicate rendering
if (this.isRendering) {
@@ -356,28 +315,25 @@ export class MermaidPreview extends SignalWatcher(
this.isRendering = true;
this.state = 'loading';
if (!this.normalizedMermaidCode) {
const code = this.normalizedMermaidCode?.trim();
if (!code) {
this.svgContent = '';
this.state = 'fallback';
this.isRendering = false;
return;
}
if (!this.mermaid) {
await this._loadMermaid();
}
if (!this.mermaid) {
return;
}
try {
// generate unique ID
const diagramId = `mermaid-diagram-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
// generate SVG
const { svg } = await this.mermaid.render(
diagramId,
this.normalizedMermaidCode
);
const { svg } = await renderMermaidSvg({
code,
options: {
fastText: true,
svgOnly: true,
theme: 'default',
fontFamily: 'IBM Plex Mono',
},
});
// update SVG content
this.svgContent = svg;

View File

@@ -1,3 +1,4 @@
import { renderTypstSvg } from '@affine/core/modules/code-block-preview-renderer/bridge';
import { CodeBlockPreviewExtension } from '@blocksuite/affine/blocks/code';
import { SignalWatcher, WithDisposable } from '@blocksuite/affine/global/lit';
import type { CodeBlockModel } from '@blocksuite/affine/model';
@@ -8,8 +9,6 @@ import { property, query, state } from 'lit/decorators.js';
import { choose } from 'lit/directives/choose.js';
import { styleMap } from 'lit/directives/style-map.js';
import { ensureTypstReady, getTypst } from './typst';
const RENDER_DEBOUNCE_MS = 200;
export const CodeBlockTypstPreview = CodeBlockPreviewExtension(
@@ -378,9 +377,7 @@ ${this.errorMessage}</pre
}
try {
await ensureTypstReady();
const typst = await getTypst();
const svg = await typst.svg({ mainContent: code });
const { svg } = await renderTypstSvg({ code });
this.svgContent = svg;
this.state = 'finish';
this._resetView();

View File

@@ -1,57 +0,0 @@
import { $typst, type BeforeBuildFn, loadFonts } from '@myriaddreamin/typst.ts';
const FONT_CDN_URLS = [
'https://cdn.affine.pro/fonts/Inter-Regular.woff',
'https://cdn.affine.pro/fonts/Inter-SemiBold.woff',
'https://cdn.affine.pro/fonts/Inter-Italic.woff',
'https://cdn.affine.pro/fonts/Inter-SemiBoldItalic.woff',
'https://cdn.affine.pro/fonts/SarasaGothicCL-Regular.ttf',
] as const;
const getBeforeBuildHooks = (): BeforeBuildFn[] => [
loadFonts([...FONT_CDN_URLS]),
];
const compilerWasmUrl = new URL(
'@myriaddreamin/typst-ts-web-compiler/pkg/typst_ts_web_compiler_bg.wasm',
import.meta.url
).toString();
const rendererWasmUrl = new URL(
'@myriaddreamin/typst-ts-renderer/pkg/typst_ts_renderer_bg.wasm',
import.meta.url
).toString();
let typstInitPromise: Promise<void> | null = null;
export async function ensureTypstReady() {
if (typstInitPromise) {
return typstInitPromise;
}
typstInitPromise = Promise.resolve()
.then(() => {
$typst.setCompilerInitOptions({
beforeBuild: getBeforeBuildHooks(),
getModule: () => compilerWasmUrl,
});
$typst.setRendererInitOptions({
beforeBuild: getBeforeBuildHooks(),
getModule: () => rendererWasmUrl,
});
})
.catch(error => {
typstInitPromise = null;
throw error;
});
return typstInitPromise;
}
export async function getTypst() {
await ensureTypstReady();
return $typst;
}
export const TYPST_FONT_URLS = FONT_CDN_URLS;

View File

@@ -73,11 +73,13 @@ export function OAuth({ redirectUrl }: { redirectUrl?: string }) {
params.set('redirect_uri', redirectUrl);
}
params.set('flow', 'redirect');
const oauthUrl =
serverService.server.baseUrl +
`/oauth/login?${params.toString()}`;
urlService.openPopupWindow(oauthUrl);
urlService.openExternal(oauthUrl);
};
const ret = open();

View File

@@ -204,6 +204,7 @@ type ImportResult = {
entryId?: string;
isWorkspaceFile?: boolean;
rootFolderId?: string;
importedWorkspace?: WorkspaceMetadata;
};
type ImportedWorkspacePayload = {
@@ -554,11 +555,12 @@ const importConfigs: Record<ImportType, ImportConfig> = {
_organizeService,
_explorerIconService
) => {
await handleImportAffineFile();
const workspace = await handleImportAffineFile();
return {
docIds: [],
entryId: undefined,
isWorkspaceFile: true,
importedWorkspace: workspace,
};
},
},
@@ -773,7 +775,6 @@ export const ImportDialog = ({
undefined,
(payload?: ImportedWorkspacePayload) => {
if (payload) {
handleCreatedWorkspace({ metadata: payload.workspace });
resolve(payload.workspace);
} else {
reject(new Error('No workspace imported'));
@@ -782,7 +783,7 @@ export const ImportDialog = ({
);
});
};
}, [globalDialogService, handleCreatedWorkspace]);
}, [globalDialogService]);
const handleImport = useAsyncCallback(
async (type: ImportType) => {
@@ -812,16 +813,27 @@ export const ImportDialog = ({
});
}
const { docIds, entryId, isWorkspaceFile, rootFolderId } =
await importConfig.importFunction(
docCollection,
files,
handleImportAffineFile,
organizeService,
explorerIconService
);
const {
docIds,
entryId,
isWorkspaceFile,
rootFolderId,
importedWorkspace,
} = await importConfig.importFunction(
docCollection,
files,
handleImportAffineFile,
organizeService,
explorerIconService
);
setImportResult({ docIds, entryId, isWorkspaceFile, rootFolderId });
setImportResult({
docIds,
entryId,
isWorkspaceFile,
rootFolderId,
importedWorkspace,
});
setStatus('success');
track.$.importModal.$.import({
type,
@@ -855,9 +867,21 @@ export const ImportDialog = ({
]
);
const finishImport = useCallback(() => {
if (importResult?.importedWorkspace) {
handleCreatedWorkspace({ metadata: importResult.importedWorkspace });
}
if (!importResult) {
close();
return;
}
const { importedWorkspace: _workspace, ...result } = importResult;
close(result);
}, [close, handleCreatedWorkspace, importResult]);
const handleComplete = useCallback(() => {
close(importResult || undefined);
}, [importResult, close]);
finishImport();
}, [finishImport]);
const handleRetry = () => {
setStatus('idle');
@@ -875,7 +899,7 @@ export const ImportDialog = ({
open
onOpenChange={(open: boolean) => {
if (!open) {
close(importResult || undefined);
finishImport();
}
}}
width={480}

View File

@@ -13,10 +13,16 @@ import {
buildOpenAppUrlRoute,
} from '../../../modules/open-in-app';
import { supportedClient } from './common';
import {
type OAuthFlowMode,
parseOAuthCallbackState,
resolveOAuthRedirect,
} from './oauth-flow';
interface LoaderData {
state: string;
code: string;
flow: OAuthFlowMode;
provider: string;
}
@@ -31,12 +37,18 @@ export const loader: LoaderFunction = async ({ request }) => {
}
try {
const { state, client, provider } = JSON.parse(stateStr);
const { state, client, flow, provider } = parseOAuthCallbackState(stateStr);
if (!state || !provider) {
return redirect('/sign-in?error=Invalid oauth callback parameters');
}
stateStr = state;
const payload: LoaderData = {
state,
code,
flow,
provider,
};
@@ -79,8 +91,13 @@ export const Component = () => {
triggeredRef.current = true;
auth
.signInOauth(data.code, data.state, data.provider)
.then(() => {
window.close();
.then(({ redirectUri }) => {
if (data.flow === 'popup') {
window.close();
return;
}
location.replace(resolveOAuthRedirect(redirectUri, location.origin));
})
.catch(e => {
nav(`/sign-in?error=${encodeURIComponent(e.message)}`);

View File

@@ -0,0 +1,73 @@
export const oauthFlowModes = ['popup', 'redirect'] as const;
export type OAuthFlowMode = (typeof oauthFlowModes)[number];
export function resolveOAuthFlowMode(
mode?: string | null,
fallback: OAuthFlowMode = 'redirect'
): OAuthFlowMode {
return mode === 'popup' || mode === 'redirect' ? mode : fallback;
}
export function attachOAuthFlowToAuthUrl(url: string, flow: OAuthFlowMode) {
const authUrl = new URL(url);
const state = authUrl.searchParams.get('state');
if (!state) return url;
try {
const payload = JSON.parse(state) as Record<string, unknown>;
authUrl.searchParams.set('state', JSON.stringify({ ...payload, flow }));
return authUrl.toString();
} catch {
return url;
}
}
export function readOAuthFlowModeFromCallbackState(state: string | null) {
if (!state) return 'popup';
try {
const payload = JSON.parse(state) as { flow?: string };
return resolveOAuthFlowMode(payload.flow, 'popup');
} catch {
return 'popup';
}
}
export function parseOAuthCallbackState(state: string) {
const parsed = JSON.parse(state) as {
client?: string;
provider?: string;
state?: string;
};
return {
client: parsed.client,
flow: readOAuthFlowModeFromCallbackState(state),
provider: parsed.provider,
state: parsed.state,
};
}
export function resolveOAuthRedirect(
redirectUri: string | null | undefined,
currentOrigin: string
) {
if (!redirectUri) return '/';
if (redirectUri.startsWith('/') && !redirectUri.startsWith('//')) {
return redirectUri;
}
let target: URL;
try {
target = new URL(redirectUri);
} catch {
return '/';
}
if (target.origin === currentOrigin) return target.toString();
const redirectProxy = new URL('/redirect-proxy', currentOrigin);
redirectProxy.searchParams.set('redirect_uri', target.toString());
return redirectProxy.toString();
}
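As an aside, the open-redirect guard in `resolveOAuthRedirect` above can be illustrated with a standalone, Node-runnable sketch. The function name `resolveRedirect` and the origin used below are illustrative, not part of the module; the sketch mirrors the same rule: relative paths pass through, same-origin absolute URLs pass through, and everything else is wrapped by the `/redirect-proxy` route.

```typescript
// Sketch of the redirect-resolution rule (names here are illustrative):
// - falsy or unparseable input falls back to '/'
// - '/path' (but not protocol-relative '//host') is returned as-is
// - same-origin absolute URLs are returned as-is
// - cross-origin URLs are wrapped by the redirect-proxy route
function resolveRedirect(
  redirectUri: string | null | undefined,
  currentOrigin: string
): string {
  if (!redirectUri) return '/';
  if (redirectUri.startsWith('/') && !redirectUri.startsWith('//')) {
    return redirectUri;
  }
  let target: URL;
  try {
    target = new URL(redirectUri);
  } catch {
    // relative fragments like '//evil.com' have no scheme and throw here
    return '/';
  }
  if (target.origin === currentOrigin) return target.toString();
  const proxy = new URL('/redirect-proxy', currentOrigin);
  proxy.searchParams.set('redirect_uri', target.toString());
  return proxy.toString();
}
```

Note that `URLSearchParams` percent-encodes `:` and `/`, which is why the unit test above expects `redirect_uri=https%3A%2F%2Fgithub.com%2F...`.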

View File

@@ -12,6 +12,7 @@ import {
import { z } from 'zod';
import { supportedClient } from './common';
import { attachOAuthFlowToAuthUrl, resolveOAuthFlowMode } from './oauth-flow';
const supportedProvider = z.nativeEnum(OAuthProviderType);
const CSRF_COOKIE_NAME = 'affine_csrf_token';
@@ -36,12 +37,14 @@ const oauthParameters = z.object({
provider: supportedProvider,
client: supportedClient,
redirectUri: z.string().optional().nullable(),
flow: z.string().optional().nullable(),
});
interface LoaderData {
provider: OAuthProviderType;
client: string;
redirectUri?: string;
flow: string;
}
export const loader: LoaderFunction = async ({ request }) => {
@@ -50,6 +53,7 @@ export const loader: LoaderFunction = async ({ request }) => {
const provider = searchParams.get('provider');
const client = searchParams.get('client') ?? 'web';
const redirectUri = searchParams.get('redirect_uri');
const flow = searchParams.get('flow');
// sign out first, web only
if (client === 'web') {
@@ -64,6 +68,7 @@ export const loader: LoaderFunction = async ({ request }) => {
provider,
client,
redirectUri,
flow,
});
if (paramsParseResult.success) {
@@ -71,6 +76,7 @@ export const loader: LoaderFunction = async ({ request }) => {
provider,
client,
redirectUri,
flow: resolveOAuthFlowMode(flow),
};
}
@@ -90,7 +96,10 @@ export const Component = () => {
.oauthPreflight(data.provider, data.client, data.redirectUri)
.then(({ url }) => {
// this is the url of oauth provider auth page, can't navigate with react-router
location.href = url;
location.href = attachOAuthFlowToAuthUrl(
url,
resolveOAuthFlowMode(data.flow)
);
})
.catch(e => {
nav(`/sign-in?error=${encodeURIComponent(e.message)}`);

View File

@@ -0,0 +1,74 @@
import { beforeEach, describe, expect, test, vi } from 'vitest';
const { mermaidRender, typstRender } = vi.hoisted(() => ({
mermaidRender: vi.fn(),
typstRender: vi.fn(),
}));
const { domPurifySanitize } = vi.hoisted(() => ({
domPurifySanitize: vi.fn((value: unknown) => {
if (typeof value !== 'string') {
return '';
}
return value.replace(/<script[\s\S]*?<\/script>/gi, '');
}),
}));
vi.mock(
'@affine/core/modules/code-block-preview-renderer/platform-backend',
() => ({
renderMermaidSvgBackend: mermaidRender,
renderTypstSvgBackend: typstRender,
})
);
vi.mock('dompurify', () => ({
default: {
sanitize: domPurifySanitize,
},
}));
import { renderMermaidSvg, renderTypstSvg } from './bridge';
describe('preview render bridge', () => {
beforeEach(() => {
vi.clearAllMocks();
domPurifySanitize.mockImplementation((value: unknown) => {
if (typeof value !== 'string') {
return '';
}
return value.replace(/<script[\s\S]*?<\/script>/gi, '');
});
});
test('uses worker renderers and only sanitizes mermaid output', async () => {
mermaidRender.mockResolvedValue({
svg: '<svg><script>alert(1)</script><text>mermaid</text></svg>',
});
typstRender.mockResolvedValue({
svg: '<div><script>window.__xss__=1</script><svg><text>typst</text></svg></div>',
});
const mermaid = await renderMermaidSvg({ code: 'flowchart TD;A-->B' });
const typst = await renderTypstSvg({ code: '= Title' });
expect(mermaidRender).toHaveBeenCalledTimes(1);
expect(typstRender).toHaveBeenCalledTimes(1);
expect(mermaid.svg).toContain('<svg');
expect(mermaid.svg).toContain('mermaid');
expect(mermaid.svg).not.toContain('<script');
expect(typst.svg).toBe(
'<div><script>window.__xss__=1</script><svg><text>typst</text></svg></div>'
);
});
test('throws when sanitized svg is empty', async () => {
mermaidRender.mockResolvedValue({
svg: '<div><text>invalid</text></div>',
});
await expect(
renderMermaidSvg({ code: 'flowchart TD;A-->B' })
).rejects.toThrow('Preview renderer returned invalid SVG.');
});
});

View File

@@ -0,0 +1,68 @@
import {
renderMermaidSvgBackend,
renderTypstSvgBackend,
} from '@affine/core/modules/code-block-preview-renderer/platform-backend';
import type {
MermaidRenderRequest,
MermaidRenderResult,
} from '@affine/core/modules/mermaid/renderer';
import type {
TypstRenderRequest,
TypstRenderResult,
} from '@affine/core/modules/typst/renderer';
import DOMPurify from 'dompurify';
function removeForeignObject(root: ParentNode) {
root
.querySelectorAll('foreignObject, foreignobject')
.forEach(element => element.remove());
}
export function sanitizeSvg(svg: string): string {
if (
typeof DOMParser === 'undefined' ||
typeof XMLSerializer === 'undefined'
) {
const sanitized = DOMPurify.sanitize(svg, { USE_PROFILES: { svg: true } });
if (typeof sanitized !== 'string' || !/^\s*<svg[\s>]/i.test(sanitized)) {
return '';
}
return sanitized.trim();
}
const parser = new DOMParser();
const parsed = parser.parseFromString(svg, 'image/svg+xml');
const root = parsed.documentElement;
if (!root || root.tagName.toLowerCase() !== 'svg') return '';
const sanitized = DOMPurify.sanitize(root, { USE_PROFILES: { svg: true } });
if (typeof sanitized !== 'string') return '';
const sanitizedDoc = parser.parseFromString(sanitized, 'image/svg+xml');
const sanitizedRoot = sanitizedDoc.documentElement;
if (!sanitizedRoot || sanitizedRoot.tagName.toLowerCase() !== 'svg')
return '';
removeForeignObject(sanitizedRoot);
return new XMLSerializer().serializeToString(sanitizedRoot).trim();
}
export async function renderMermaidSvg(
request: MermaidRenderRequest
): Promise<MermaidRenderResult> {
const rendered = await renderMermaidSvgBackend(request);
const sanitizedSvg = sanitizeSvg(rendered.svg);
if (!sanitizedSvg) {
throw new Error('Preview renderer returned invalid SVG.');
}
return { svg: sanitizedSvg };
}
export async function renderTypstSvg(
request: TypstRenderRequest
): Promise<TypstRenderResult> {
const rendered = await renderTypstSvgBackend(request);
return { svg: rendered.svg };
}

View File

@@ -0,0 +1,68 @@
import { beforeEach, describe, expect, test, vi } from 'vitest';
const { initialize, render } = vi.hoisted(() => ({
initialize: vi.fn(),
render: vi.fn(),
}));
vi.mock('mermaid', () => ({
default: {
initialize,
render,
},
}));
import { renderClassicMermaidSvg } from './classic-mermaid';
describe('renderClassicMermaidSvg', () => {
beforeEach(() => {
vi.clearAllMocks();
});
test('serializes initialize and render across concurrent calls', async () => {
const events: string[] = [];
let releaseFirstRender!: () => void;
initialize.mockImplementation(config => {
events.push(`init:${config.theme}`);
});
render
.mockImplementationOnce(async () => {
events.push('render:first:start');
await new Promise<void>(resolve => {
releaseFirstRender = resolve;
});
events.push('render:first:end');
return { svg: '<svg>first</svg>' };
})
.mockImplementationOnce(async () => {
events.push('render:second:start');
return { svg: '<svg>second</svg>' };
});
const first = renderClassicMermaidSvg({
code: 'flowchart TD;A-->B',
options: { theme: 'default' },
});
const second = renderClassicMermaidSvg({
code: 'flowchart TD;B-->C',
options: { theme: 'modern' },
});
await vi.waitFor(() => {
expect(events).toEqual(['init:default', 'render:first:start']);
});
releaseFirstRender();
await expect(first).resolves.toEqual({ svg: '<svg>first</svg>' });
await expect(second).resolves.toEqual({ svg: '<svg>second</svg>' });
expect(events).toEqual([
'init:default',
'render:first:start',
'render:first:end',
'init:base',
'render:second:start',
]);
});
});

View File

@@ -0,0 +1,62 @@
import type { Mermaid } from 'mermaid';
import type {
MermaidRenderOptions,
MermaidRenderRequest,
MermaidRenderResult,
MermaidRenderTheme,
} from '../mermaid/renderer';
let mermaidPromise: Promise<Mermaid> | null = null;
let mermaidRenderQueue: Promise<void> = Promise.resolve();
function toTheme(theme: MermaidRenderTheme | undefined) {
return theme === 'modern' ? ('base' as const) : ('default' as const);
}
function createClassicMermaidConfig(options?: MermaidRenderOptions) {
return {
startOnLoad: false,
theme: toTheme(options?.theme),
securityLevel: 'strict' as const,
fontFamily: options?.fontFamily ?? 'IBM Plex Mono',
flowchart: { useMaxWidth: true, htmlLabels: true },
sequence: { useMaxWidth: true },
gantt: { useMaxWidth: true },
pie: { useMaxWidth: true },
journey: { useMaxWidth: true },
gitGraph: { useMaxWidth: true },
};
}
async function loadMermaid() {
if (!mermaidPromise) {
mermaidPromise = import('mermaid').then(module => module.default);
}
return mermaidPromise;
}
function createDiagramId() {
return `mermaid-diagram-${Date.now()}-${Math.random().toString(36).slice(2)}`;
}
function enqueueClassicMermaidRender<T>(task: () => Promise<T>): Promise<T> {
const run = mermaidRenderQueue.then(task, task);
mermaidRenderQueue = run.then(
() => undefined,
() => undefined
);
return run;
}
export async function renderClassicMermaidSvg(
request: MermaidRenderRequest
): Promise<MermaidRenderResult> {
return enqueueClassicMermaidRender(async () => {
const mermaid = await loadMermaid();
mermaid.initialize(createClassicMermaidConfig(request.options));
const { svg } = await mermaid.render(createDiagramId(), request.code);
return { svg };
});
}
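The `enqueueClassicMermaidRender` helper above exists because classic Mermaid keeps global state (`initialize` reconfigures the singleton), so concurrent renders with different options must be serialized. A minimal, self-contained sketch of that promise-queue pattern (generic names, not the module's exports):

```typescript
// Promise-queue serialization: each task starts only after the previous
// one settles, and a rejected task never stalls the queue for later tasks.
let queue: Promise<void> = Promise.resolve();

function enqueue<T>(task: () => Promise<T>): Promise<T> {
  // Run the task whether the previous entry resolved or rejected.
  const run = queue.then(task, task);
  // Advance the tail, swallowing both outcomes so the chain stays alive;
  // callers still observe the real result/error via `run`.
  queue = run.then(
    () => undefined,
    () => undefined
  );
  return run;
}
```

This matches the ordering the `renderClassicMermaidSvg` unit test asserts: the second `initialize` only fires after the first render finishes.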

View File

@@ -0,0 +1,14 @@
import type { Framework } from '@toeverything/infra';
import { FeatureFlagService } from '../feature-flag';
import { PreviewRendererFeatureSyncService } from './services/preview-renderer-feature-sync';
export { renderMermaidSvg, renderTypstSvg, sanitizeSvg } from './bridge';
export {
registerNativePreviewHandlers,
setMermaidWasmNativeRendererEnabled,
} from './runtime-config';
export function configureCodeBlockPreviewRendererModule(framework: Framework) {
framework.service(PreviewRendererFeatureSyncService, [FeatureFlagService]);
}

View File

@@ -0,0 +1,52 @@
import { apis } from '@affine/electron-api';
import { renderClassicMermaidSvg } from './classic-mermaid';
import { isMermaidWasmNativeRendererEnabled } from './runtime-config';
import type { PreviewRenderRequestMap, PreviewRenderResultMap } from './types';
type DesktopPreviewHandlers = {
renderMermaidSvg?: (
request: PreviewRenderRequestMap['mermaid']
) => Promise<PreviewRenderResultMap['mermaid']>;
renderTypstSvg?: (
request: PreviewRenderRequestMap['typst']
) => Promise<PreviewRenderResultMap['typst']>;
};
type DesktopPreviewApis = {
preview?: DesktopPreviewHandlers;
};
function getDesktopPreviewHandlers() {
const previewApis = apis as unknown as DesktopPreviewApis;
return previewApis.preview ?? null;
}
function getRequiredDesktopHandler<Name extends keyof DesktopPreviewHandlers>(
name: Name
): NonNullable<DesktopPreviewHandlers[Name]> {
const handlers = getDesktopPreviewHandlers();
const handler = handlers?.[name];
if (!handler) {
throw new Error(
`Electron preview handler "${String(name)}" is unavailable.`
);
}
return handler as NonNullable<DesktopPreviewHandlers[Name]>;
}
export async function renderMermaidSvgBackend(
request: PreviewRenderRequestMap['mermaid']
): Promise<PreviewRenderResultMap['mermaid']> {
if (!isMermaidWasmNativeRendererEnabled()) {
return renderClassicMermaidSvg(request);
}
return getRequiredDesktopHandler('renderMermaidSvg')(request);
}
export async function renderTypstSvgBackend(
request: PreviewRenderRequestMap['typst']
): Promise<PreviewRenderResultMap['typst']> {
return getRequiredDesktopHandler('renderTypstSvg')(request);
}

View File

@@ -0,0 +1,24 @@
import { getNativePreviewHandlers } from './runtime-config';
import type { PreviewRenderRequestMap, PreviewRenderResultMap } from './types';
function getRequiredNativeHandler<
Name extends keyof NonNullable<ReturnType<typeof getNativePreviewHandlers>>,
>(name: Name) {
const handler = getNativePreviewHandlers()?.[name];
if (!handler) {
throw new Error(`Mobile preview handler "${String(name)}" is unavailable.`);
}
return handler;
}
export async function renderMermaidSvgBackend(
request: PreviewRenderRequestMap['mermaid']
): Promise<PreviewRenderResultMap['mermaid']> {
return getRequiredNativeHandler('renderMermaidSvg')(request);
}
export async function renderTypstSvgBackend(
request: PreviewRenderRequestMap['typst']
): Promise<PreviewRenderResultMap['typst']> {
return getRequiredNativeHandler('renderTypstSvg')(request);
}

View File

@@ -0,0 +1,22 @@
import { getMermaidRenderer } from '@affine/core/modules/mermaid/renderer';
import { getTypstRenderer } from '@affine/core/modules/typst/renderer';
import { renderClassicMermaidSvg } from './classic-mermaid';
import { isMermaidWasmNativeRendererEnabled } from './runtime-config';
import type { PreviewRenderRequestMap, PreviewRenderResultMap } from './types';
export async function renderMermaidSvgBackend(
request: PreviewRenderRequestMap['mermaid']
): Promise<PreviewRenderResultMap['mermaid']> {
if (!isMermaidWasmNativeRendererEnabled()) {
return renderClassicMermaidSvg(request);
}
return getMermaidRenderer().render(request);
}
export async function renderTypstSvgBackend(
request: PreviewRenderRequestMap['typst']
): Promise<PreviewRenderResultMap['typst']> {
return getTypstRenderer().render(request);
}

View File

@@ -0,0 +1,39 @@
import type {
MermaidRenderRequest,
MermaidRenderResult,
} from '@affine/core/modules/mermaid/renderer';
import type {
TypstRenderRequest,
TypstRenderResult,
} from '@affine/core/modules/typst/renderer';
type NativePreviewHandlers = {
renderMermaidSvg?: (
request: MermaidRenderRequest
) => Promise<MermaidRenderResult>;
renderTypstSvg?: (request: TypstRenderRequest) => Promise<TypstRenderResult>;
};
let enableMermaidWasmNativeRenderer =
BUILD_CONFIG.isIOS || BUILD_CONFIG.isAndroid;
let nativePreviewHandlers: NativePreviewHandlers | null = null;
export function setMermaidWasmNativeRendererEnabled(enabled: boolean) {
enableMermaidWasmNativeRenderer = enabled;
}
export function isMermaidWasmNativeRendererEnabled() {
return enableMermaidWasmNativeRenderer;
}
export function registerNativePreviewHandlers(
handlers: NativePreviewHandlers | null
) {
nativePreviewHandlers = handlers;
}
export function getNativePreviewHandlers() {
return nativePreviewHandlers;
}
export type { NativePreviewHandlers };

View File

@@ -0,0 +1,26 @@
import { OnEvent, Service } from '@toeverything/infra';
import { distinctUntilChanged } from 'rxjs';
import type { FeatureFlagService } from '../../feature-flag';
import { ApplicationStarted } from '../../lifecycle';
import { setMermaidWasmNativeRendererEnabled } from '../runtime-config';
@OnEvent(ApplicationStarted, e => e.syncFlag)
export class PreviewRendererFeatureSyncService extends Service {
constructor(private readonly featureFlagService: FeatureFlagService) {
super();
}
syncFlag() {
const mermaidFlag =
this.featureFlagService.flags.enable_mermaid_wasm_native_renderer;
setMermaidWasmNativeRendererEnabled(!!mermaidFlag.value);
const subscription = mermaidFlag.$.pipe(distinctUntilChanged()).subscribe(
enabled => {
setMermaidWasmNativeRendererEnabled(!!enabled);
}
);
this.disposables.push(() => subscription.unsubscribe());
}
}

View File

@@ -0,0 +1,18 @@
import type {
MermaidRenderRequest,
MermaidRenderResult,
} from '@affine/core/modules/mermaid/renderer';
import type {
TypstRenderRequest,
TypstRenderResult,
} from '@affine/core/modules/typst/renderer';
export type PreviewRenderRequestMap = {
mermaid: MermaidRenderRequest;
typst: TypstRenderRequest;
};
export type PreviewRenderResultMap = {
mermaid: MermaidRenderResult;
typst: TypstRenderResult;
};

View File

@@ -4,6 +4,7 @@ import type { FlagInfo } from './types';
const isCanaryBuild = BUILD_CONFIG.appBuildType === 'canary';
const isMobile = BUILD_CONFIG.isMobileEdition;
const isIOS = BUILD_CONFIG.isIOS;
const isAndroid = BUILD_CONFIG.isAndroid;
export const AFFINE_FLAGS = {
enable_ai: {
@@ -203,6 +204,14 @@ export const AFFINE_FLAGS = {
configurable: isMobile && isIOS,
defaultState: isMobile && isIOS,
},
enable_mermaid_wasm_native_renderer: {
category: 'affine',
displayName: 'Enable Native Mermaid Renderer',
description:
'Use the new Mermaid renderer backend. Web uses WASM, desktop uses native, and mobile always uses native. The native renderer is more than 10x faster, but it supports fewer diagram types and its styling is less polished than the JS version.',
configurable: !isIOS && !isAndroid,
defaultState: isIOS || isAndroid,
},
enable_turbo_renderer: {
category: 'blocksuite',
bsFlag: 'enable_turbo_renderer',

View File

@@ -13,6 +13,7 @@ import { configureAppSidebarModule } from './app-sidebar';
import { configAtMenuConfigModule } from './at-menu-config';
import { configureBlobManagementModule } from './blob-management';
import { configureCloudModule } from './cloud';
import { configureCodeBlockPreviewRendererModule } from './code-block-preview-renderer';
import { configureCollectionModule } from './collection';
import { configureCollectionRulesModule } from './collection-rules';
import { configureCommentModule } from './comment';
@@ -77,6 +78,7 @@ export function configureCommonModules(framework: Framework) {
configureGlobalContextModule(framework);
configureLifecycleModule(framework);
configureFeatureFlagModule(framework);
configureCodeBlockPreviewRendererModule(framework);
configureCollectionModule(framework);
configureNavigationModule(framework);
configureTagModule(framework);

View File

@@ -0,0 +1,39 @@
import { WorkerOpRenderer } from '../../shared/worker-op-renderer';
import type {
MermaidOps,
MermaidRenderOptions,
MermaidRenderRequest,
} from './types';
class MermaidRenderer extends WorkerOpRenderer<MermaidOps> {
constructor() {
super('mermaid');
}
init(options?: MermaidRenderOptions) {
return this.ensureInitialized(() => this.call('init', options));
}
async render(request: MermaidRenderRequest) {
await this.init();
return this.call('render', request);
}
}
let sharedMermaidRenderer: MermaidRenderer | null = null;
export function getMermaidRenderer() {
if (!sharedMermaidRenderer) {
sharedMermaidRenderer = new MermaidRenderer();
}
return sharedMermaidRenderer;
}
export type {
MermaidOps,
MermaidRenderOptions,
MermaidRenderRequest,
MermaidRenderResult,
MermaidRenderTheme,
MermaidTextMetrics,
} from './types';

View File

@@ -0,0 +1,63 @@
import type { MessageCommunicapable } from '@toeverything/infra/op';
import { OpConsumer } from '@toeverything/infra/op';
import initMmdr, { render_mermaid_svg } from '@toeverything/mermaid-wasm';
import type {
MermaidOps,
MermaidRenderOptions,
MermaidRenderRequest,
} from './types';
const DEFAULT_RENDER_OPTIONS: MermaidRenderOptions = {
fastText: true,
svgOnly: true,
theme: 'modern',
fontFamily: 'IBM Plex Mono',
};
function mergeOptions(
base: MermaidRenderOptions,
override: MermaidRenderOptions | undefined
): MermaidRenderOptions {
if (!override) {
return base;
}
return {
...base,
...override,
textMetrics: override.textMetrics ?? base.textMetrics,
};
}
class MermaidRendererBackend extends OpConsumer<MermaidOps> {
private initPromise: Promise<void> | null = null;
private options: MermaidRenderOptions = DEFAULT_RENDER_OPTIONS;
constructor(port: MessageCommunicapable) {
super(port);
this.register('init', this.init.bind(this));
this.register('render', this.render.bind(this));
}
private ensureReady() {
if (!this.initPromise) {
this.initPromise = initMmdr().then(() => undefined);
}
return this.initPromise;
}
async init(options?: MermaidRenderOptions) {
this.options = mergeOptions(DEFAULT_RENDER_OPTIONS, options);
await this.ensureReady();
return { ok: true } as const;
}
async render({ code, options }: MermaidRenderRequest) {
await this.ensureReady();
const mergedOptions = mergeOptions(this.options, options);
const svg = render_mermaid_svg(code, JSON.stringify(mergedOptions));
return { svg };
}
}
new MermaidRendererBackend(self as MessageCommunicapable);

View File

@@ -0,0 +1,32 @@
import type { OpSchema } from '@toeverything/infra/op';
export type MermaidTextMetrics = {
ascii: number;
cjk: number;
space: number;
};
export type MermaidRenderTheme = 'modern' | 'default';
export type MermaidRenderOptions = {
fastText?: boolean;
svgOnly?: boolean;
textMetrics?: MermaidTextMetrics;
theme?: MermaidRenderTheme;
fontFamily?: string;
fontSize?: number;
};
export type MermaidRenderRequest = {
code: string;
options?: MermaidRenderOptions;
};
export type MermaidRenderResult = {
svg: string;
};
export interface MermaidOps extends OpSchema {
init: [MermaidRenderOptions | undefined, { ok: true }];
render: [MermaidRenderRequest, MermaidRenderResult];
}

View File

@@ -1,2 +1,10 @@

import { WorkerOpRenderer } from '../../shared/worker-op-renderer';
import type { PDFOps } from './types';
export class PDFRenderer extends WorkerOpRenderer<PDFOps> {
constructor() {
super('pdf');
}
}
export type { PDFMeta, PDFOps, RenderedPage, RenderPageOpts } from './types';

View File

@@ -1,8 +0,0 @@
import type { OpSchema } from '@toeverything/infra/op';
import type { PDFMeta, RenderedPage, RenderPageOpts } from './types';
export interface ClientOps extends OpSchema {
open: [{ data: ArrayBuffer }, PDFMeta];
render: [RenderPageOpts, RenderedPage];
}

View File

@@ -23,10 +23,9 @@ import {
switchMap,
} from 'rxjs';
import type { ClientOps } from './ops';
import type { PDFMeta, PDFOps, RenderPageOpts } from './types';
class PDFRendererBackend extends OpConsumer<PDFOps> {
constructor(port: MessageCommunicapable) {
super(port);
this.register('open', this.open.bind(this));

View File

@@ -1,24 +0,0 @@
import { getWorkerUrl } from '@affine/env/worker';
import { OpClient } from '@toeverything/infra/op';
import type { ClientOps } from './ops';
export class PDFRenderer extends OpClient<ClientOps> {
private readonly worker: Worker;
constructor() {
const worker = new Worker(getWorkerUrl('pdf'));
super(worker);
this.worker = worker;
}
override destroy() {
super.destroy();
this.worker.terminate();
}
[Symbol.dispose]() {
this.destroy();
}
}

View File

@@ -1,3 +1,5 @@
import type { OpSchema } from '@toeverything/infra/op';
export type PageSize = {
width: number;
height: number;
@@ -21,3 +23,8 @@ export type RenderPageOpts = {
export type RenderedPage = {
bitmap: ImageBitmap;
};
export interface PDFOps extends OpSchema {
open: [{ data: ArrayBuffer }, PDFMeta];
render: [RenderPageOpts, RenderedPage];
}

View File

@@ -0,0 +1,42 @@
import { beforeEach, describe, expect, test, vi } from 'vitest';
import { WorkerOpRenderer } from './worker-op-renderer';
vi.mock('@affine/env/worker', () => ({
getWorkerUrl: vi.fn(() => '/worker.js'),
}));
class MockWorker {
addEventListener = vi.fn();
postMessage = vi.fn();
removeEventListener = vi.fn();
terminate = vi.fn();
}
class TestRenderer extends WorkerOpRenderer<{
init: [undefined, { ok: true }];
}> {
constructor() {
super('test');
}
init() {
return this.ensureInitialized(async () => {
return { ok: true } as const;
});
}
}
describe('WorkerOpRenderer', () => {
beforeEach(() => {
vi.stubGlobal('Worker', MockWorker);
});
test('rejects initialization after destroy', async () => {
const renderer = new TestRenderer();
renderer.destroy();
await expect(renderer.init()).rejects.toThrow('renderer destroyed');
});
});

View File

@@ -0,0 +1,47 @@
import { getWorkerUrl } from '@affine/env/worker';
import { OpClient, type OpSchema } from '@toeverything/infra/op';
type InitTask = () => Promise<unknown>;
export abstract class WorkerOpRenderer<
Ops extends OpSchema,
> extends OpClient<Ops> {
private readonly worker: Worker;
private destroyed = false;
private initPromise: Promise<void> | null = null;
protected constructor(workerName: string) {
const worker = new Worker(getWorkerUrl(workerName));
super(worker);
this.worker = worker;
}
protected ensureInitialized(task: InitTask) {
if (this.destroyed) return Promise.reject(new Error('renderer destroyed'));
if (!this.initPromise) {
this.initPromise = task()
.then(() => undefined)
.catch(error => {
this.initPromise = null;
throw error;
});
}
return this.initPromise;
}
protected resetInitialization() {
this.initPromise = null;
}
override destroy() {
if (this.destroyed) return;
this.destroyed = true;
super.destroy();
this.worker.terminate();
this.resetInitialization();
}
[Symbol.dispose]() {
this.destroy();
}
}
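`ensureInitialized` above clears the cached promise when the init task fails, so a later call can retry instead of being stuck on a rejected promise forever. A minimal standalone version of that retry-after-failure behavior (no worker involved; names are illustrative):

```typescript
type InitTask = () => Promise<unknown>;

class Initializer {
  private initPromise: Promise<void> | null = null;

  ensureInitialized(task: InitTask): Promise<void> {
    if (!this.initPromise) {
      this.initPromise = task()
        .then(() => undefined)
        .catch(error => {
          // A failed init must not poison future attempts.
          this.initPromise = null;
          throw error;
        });
    }
    return this.initPromise;
  }
}

async function demo(): Promise<string> {
  const init = new Initializer();
  let attempts = 0;
  const task = async () => {
    attempts += 1;
    if (attempts === 1) throw new Error('boom');
  };
  const first = await init
    .ensureInitialized(task)
    .then(() => 'ok', () => 'failed');
  const second = await init
    .ensureInitialized(task)
    .then(() => 'ok', () => 'failed');
  return `${first},${second},${attempts}`;
}
```

The first call fails and resets the cache; the second call re-runs the task and succeeds.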

View File

@@ -0,0 +1,33 @@
import { WorkerOpRenderer } from '../../shared/worker-op-renderer';
import type { TypstOps, TypstRenderOptions, TypstRenderRequest } from './types';
class TypstRenderer extends WorkerOpRenderer<TypstOps> {
constructor() {
super('typst');
}
init(options?: TypstRenderOptions) {
return this.ensureInitialized(() => this.call('init', options));
}
async render(request: TypstRenderRequest) {
await this.init();
return this.call('render', request);
}
}
let sharedTypstRenderer: TypstRenderer | null = null;
export function getTypstRenderer() {
if (!sharedTypstRenderer) {
sharedTypstRenderer = new TypstRenderer();
}
return sharedTypstRenderer;
}
export type {
TypstOps,
TypstRenderOptions,
TypstRenderRequest,
TypstRenderResult,
} from './types';
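`getTypstRenderer` lazily creates one shared renderer per context so every caller talks to the same worker. The same module-level singleton pattern in isolation, with a hypothetical `Renderer` standing in for the worker-backed `TypstRenderer`:

```typescript
class Renderer {
  // Hypothetical stand-in for the worker-backed TypstRenderer above.
  readonly createdAt = Date.now();
}

let shared: Renderer | null = null;

// First call constructs the instance; later calls return the same object.
function getRenderer(): Renderer {
  if (!shared) {
    shared = new Renderer();
  }
  return shared;
}
```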

View File

@@ -0,0 +1,84 @@
import { beforeEach, describe, expect, test, vi } from 'vitest';
const { loadFonts, setCompilerInitOptions, setRendererInitOptions, svg } =
vi.hoisted(() => ({
loadFonts: vi.fn((fontUrls: string[]) => ({ fontUrls })),
setCompilerInitOptions: vi.fn(),
setRendererInitOptions: vi.fn(),
svg: vi.fn(),
}));
vi.mock('@myriaddreamin/typst.ts', () => ({
$typst: {
setCompilerInitOptions,
setRendererInitOptions,
svg,
},
loadFonts,
}));
import { ensureTypstReady, renderTypstSvgWithOptions } from './runtime';
describe('typst runtime', () => {
beforeEach(() => {
vi.clearAllMocks();
svg.mockResolvedValue('<svg />');
});
test('reconfigures typst when fontUrls change', async () => {
await ensureTypstReady(['font-a']);
await ensureTypstReady(['font-b']);
expect(loadFonts).toHaveBeenNthCalledWith(
1,
['font-a'],
expect.any(Object)
);
expect(loadFonts).toHaveBeenNthCalledWith(
2,
['font-b'],
expect.any(Object)
);
expect(setCompilerInitOptions).toHaveBeenCalledTimes(2);
expect(setRendererInitOptions).toHaveBeenCalledTimes(2);
});
test('serializes typst renders that need different configuration', async () => {
const events: string[] = [];
let releaseFirstRender!: () => void;
svg.mockImplementationOnce(async () => {
events.push('svg:first:start');
await new Promise<void>(resolve => {
releaseFirstRender = resolve;
});
events.push('svg:first:end');
return '<svg>first</svg>';
});
svg.mockImplementationOnce(async () => {
events.push('svg:second:start');
return '<svg>second</svg>';
});
const first = renderTypstSvgWithOptions('= First', {
fontUrls: ['font-a'],
});
const second = renderTypstSvgWithOptions('= Second', {
fontUrls: ['font-b'],
});
await vi.waitFor(() => {
expect(events).toEqual(['svg:first:start']);
});
releaseFirstRender();
await expect(first).resolves.toEqual({ svg: '<svg>first</svg>' });
await expect(second).resolves.toEqual({ svg: '<svg>second</svg>' });
expect(events).toEqual([
'svg:first:start',
'svg:first:end',
'svg:second:start',
]);
});
});

View File

@@ -0,0 +1,209 @@
import { $typst, type BeforeBuildFn, loadFonts } from '@myriaddreamin/typst.ts';
import type { TypstRenderOptions } from './types';
export const DEFAULT_TYPST_FONT_URLS = [
'https://cdn.affine.pro/fonts/Inter-Regular.woff',
'https://cdn.affine.pro/fonts/Inter-SemiBold.woff',
'https://cdn.affine.pro/fonts/Inter-Italic.woff',
'https://cdn.affine.pro/fonts/Inter-SemiBoldItalic.woff',
'https://cdn.affine.pro/fonts/SarasaGothicCL-Regular.ttf',
] as const;
export const DEFAULT_TYPST_RENDER_OPTIONS: TypstRenderOptions = {
fontUrls: [...DEFAULT_TYPST_FONT_URLS],
};
const DEFAULT_FONT_FALLBACKS: Record<string, string> = {
'Inter-Regular.woff': 'Inter-Regular.woff2',
'Inter-SemiBold.woff': 'Inter-SemiBold.woff2',
'Inter-Italic.woff': 'Inter-Italic.woff2',
'Inter-SemiBoldItalic.woff': 'Inter-SemiBoldItalic.woff2',
'SarasaGothicCL-Regular.ttf': 'Inter-Regular.woff2',
'Inter-Regular.woff2': 'Inter-Regular.woff2',
'Inter-SemiBold.woff2': 'Inter-SemiBold.woff2',
'Inter-Italic.woff2': 'Inter-Italic.woff2',
'Inter-SemiBoldItalic.woff2': 'Inter-SemiBoldItalic.woff2',
};
const compilerWasmUrl = new URL(
'@myriaddreamin/typst-ts-web-compiler/pkg/typst_ts_web_compiler_bg.wasm',
import.meta.url
).toString();
const rendererWasmUrl = new URL(
'@myriaddreamin/typst-ts-renderer/pkg/typst_ts_renderer_bg.wasm',
import.meta.url
).toString();
type TypstWasmModuleUrls = {
compilerWasmUrl?: string;
rendererWasmUrl?: string;
};
type TypstInitState = {
key: string;
promise: Promise<void>;
};
let typstInitState: TypstInitState | null = null;
let typstRenderQueue: Promise<void> = Promise.resolve();
function extractInputUrl(input: RequestInfo | URL): string | null {
if (input instanceof URL) {
return input.toString();
}
if (typeof input === 'string') {
return input;
}
if (typeof Request !== 'undefined' && input instanceof Request) {
return input.url;
}
return null;
}
function resolveLocalFallbackFontUrl(sourceUrl: string): string | null {
if (typeof location === 'undefined') {
return null;
}
const source = new URL(sourceUrl, location.href);
const fileName = source.pathname.split('/').at(-1);
if (!fileName) {
return null;
}
const fallbackFileName = DEFAULT_FONT_FALLBACKS[fileName];
if (!fallbackFileName) {
return null;
}
const workerUrl = new URL(location.href);
const jsPathMarker = '/js/';
const markerIndex = workerUrl.pathname.lastIndexOf(jsPathMarker);
const basePath =
markerIndex >= 0 ? workerUrl.pathname.slice(0, markerIndex + 1) : '/';
return new URL(
`${basePath}fonts/${fallbackFileName}`,
workerUrl.origin
).toString();
}
export function createTypstFontFetcher(baseFetcher: typeof fetch = fetch) {
return async (input: RequestInfo | URL, init?: RequestInit) => {
const sourceUrl = extractInputUrl(input);
const fallbackUrl = sourceUrl
? resolveLocalFallbackFontUrl(sourceUrl)
: null;
try {
const response = await baseFetcher(input, init);
if (!fallbackUrl || response.ok || fallbackUrl === sourceUrl) {
return response;
}
const fallbackResponse = await baseFetcher(fallbackUrl, init);
return fallbackResponse.ok ? fallbackResponse : response;
} catch (error) {
if (!fallbackUrl || fallbackUrl === sourceUrl) {
throw error;
}
return baseFetcher(fallbackUrl, init);
}
};
}
export function mergeTypstRenderOptions(
base: TypstRenderOptions,
override: TypstRenderOptions | undefined
): TypstRenderOptions {
return {
...base,
...override,
fontUrls: override?.fontUrls ?? base.fontUrls,
};
}
function getBeforeBuildHooks(fontUrls: string[]): BeforeBuildFn[] {
return [
loadFonts([...fontUrls], {
assets: ['text'],
fetcher: createTypstFontFetcher(),
}),
];
}
function createTypstInitKey(
fontUrls: string[],
wasmModuleUrls: TypstWasmModuleUrls
) {
return JSON.stringify({
fontUrls,
compilerWasmUrl: wasmModuleUrls.compilerWasmUrl ?? compilerWasmUrl,
rendererWasmUrl: wasmModuleUrls.rendererWasmUrl ?? rendererWasmUrl,
});
}
function enqueueTypstRender<T>(task: () => Promise<T>): Promise<T> {
const run = typstRenderQueue.then(task, task);
typstRenderQueue = run.then(
() => undefined,
() => undefined
);
return run;
}
export async function ensureTypstReady(
fontUrls: string[],
wasmModuleUrls: TypstWasmModuleUrls = {}
) {
const key = createTypstInitKey(fontUrls, wasmModuleUrls);
if (typstInitState?.key === key) {
return typstInitState.promise;
}
const promise = Promise.resolve()
.then(() => {
const compilerBeforeBuild = getBeforeBuildHooks(fontUrls);
$typst.setCompilerInitOptions({
beforeBuild: compilerBeforeBuild,
getModule: () => wasmModuleUrls.compilerWasmUrl ?? compilerWasmUrl,
});
$typst.setRendererInitOptions({
getModule: () => wasmModuleUrls.rendererWasmUrl ?? rendererWasmUrl,
});
})
.catch(error => {
if (typstInitState?.key === key) {
typstInitState = null;
}
throw error;
});
typstInitState = { key, promise };
return promise;
}
export async function renderTypstSvgWithOptions(
code: string,
options: TypstRenderOptions | undefined,
wasmModuleUrls?: TypstWasmModuleUrls
) {
const resolvedOptions = mergeTypstRenderOptions(
DEFAULT_TYPST_RENDER_OPTIONS,
options
);
return enqueueTypstRender(async () => {
await ensureTypstReady(
resolvedOptions.fontUrls ?? [...DEFAULT_TYPST_FONT_URLS],
wasmModuleUrls
);
const svg = await $typst.svg({
mainContent: code,
});
return { svg };
});
}
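`enqueueTypstRender` above chains every task onto a shared promise tail, so renders that may reconfigure the global `$typst` instance never interleave, while each caller still sees its own task's result or error. A standalone sketch of that queue:

```typescript
let queue: Promise<void> = Promise.resolve();

function enqueue<T>(task: () => Promise<T>): Promise<T> {
  // Start the task whether the previous one resolved or rejected.
  const run = queue.then(task, task);
  // The shared tail swallows errors so one failure never stalls the queue.
  queue = run.then(
    () => undefined,
    () => undefined
  );
  return run;
}

async function demo(): Promise<string[]> {
  const events: string[] = [];
  const first = enqueue(async () => {
    events.push('first:start');
    await new Promise(resolve => setTimeout(resolve, 10));
    events.push('first:end');
  });
  const second = enqueue(async () => {
    events.push('second:start');
  });
  await Promise.all([first, second]);
  return events;
}
```

Even though the second task is enqueued while the first is still pending, it only starts after the first settles.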

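`createTypstFontFetcher` above retries a local fallback URL when the CDN font request fails or returns a non-OK response. The same fallback logic as a self-contained sketch, with a simplified fetcher type and hypothetical URLs in place of the real `fetch` and font paths:

```typescript
type Response = { ok: boolean; body: string };
type Fetcher = (url: string) => Promise<Response>;

// Wrap a fetcher so a failed or non-OK response retries a fallback URL,
// mirroring the CDN-to-local font fallback in createTypstFontFetcher.
function withFallback(
  base: Fetcher,
  resolveFallback: (url: string) => string | null
): Fetcher {
  return async url => {
    const fallbackUrl = resolveFallback(url);
    try {
      const response = await base(url);
      if (!fallbackUrl || response.ok || fallbackUrl === url) {
        return response;
      }
      const fallbackResponse = await base(fallbackUrl);
      return fallbackResponse.ok ? fallbackResponse : response;
    } catch (error) {
      if (!fallbackUrl || fallbackUrl === url) throw error;
      return base(fallbackUrl);
    }
  };
}

async function demo(): Promise<string> {
  // Hypothetical URLs: the CDN copy is unavailable, the local copy works.
  const responses: Record<string, Response> = {
    'https://cdn.example/font.woff': { ok: false, body: '' },
    '/fonts/font.woff2': { ok: true, body: 'local-font' },
  };
  const base: Fetcher = async url => {
    const r = responses[url];
    if (!r) throw new Error(`unreachable: ${url}`);
    return r;
  };
  const fetcher = withFallback(base, url =>
    url.endsWith('.woff') ? '/fonts/font.woff2' : null
  );
  const result = await fetcher('https://cdn.example/font.woff');
  return result.body;
}
```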
View File

@@ -0,0 +1,19 @@
import type { OpSchema } from '@toeverything/infra/op';
export type TypstRenderOptions = {
fontUrls?: string[];
};
export type TypstRenderRequest = {
code: string;
options?: TypstRenderOptions;
};
export type TypstRenderResult = {
svg: string;
};
export interface TypstOps extends OpSchema {
init: [TypstRenderOptions | undefined, { ok: true }];
render: [TypstRenderRequest, TypstRenderResult];
}

View File

@@ -0,0 +1,40 @@
import type { MessageCommunicapable } from '@toeverything/infra/op';
import { OpConsumer } from '@toeverything/infra/op';
import {
DEFAULT_TYPST_RENDER_OPTIONS,
ensureTypstReady,
mergeTypstRenderOptions,
renderTypstSvgWithOptions,
} from './runtime';
import type { TypstOps, TypstRenderOptions, TypstRenderRequest } from './types';
class TypstRendererBackend extends OpConsumer<TypstOps> {
private options: TypstRenderOptions = DEFAULT_TYPST_RENDER_OPTIONS;
constructor(port: MessageCommunicapable) {
super(port);
this.register('init', this.init.bind(this));
this.register('render', this.render.bind(this));
}
async init(options?: TypstRenderOptions) {
this.options = mergeTypstRenderOptions(
DEFAULT_TYPST_RENDER_OPTIONS,
options
);
await ensureTypstReady(
this.options.fontUrls ?? [
...(DEFAULT_TYPST_RENDER_OPTIONS.fontUrls ?? []),
]
);
return { ok: true } as const;
}
async render({ code, options }: TypstRenderRequest) {
const mergedOptions = mergeTypstRenderOptions(this.options, options);
return renderTypstSvgWithOptions(code, mergedOptions);
}
}
new TypstRendererBackend(self as MessageCommunicapable);

View File

@@ -1,6 +1,7 @@
[package]
edition = "2024"
name = "affine_mobile_native"
publish = false
version = "0.0.0"
[lib]
@@ -40,7 +41,11 @@ objc2-foundation = { workspace = true, features = [
homedir = { workspace = true }
[target.'cfg(any(target_os = "android", target_os = "ios"))'.dependencies]
lru = { workspace = true }
mermaid-rs-renderer = { workspace = true }
typst = { workspace = true }
typst-as-lib = { workspace = true }
typst-svg = { workspace = true }
[build-dependencies]
uniffi = { workspace = true, features = ["build"] }

View File

@@ -1,6 +1,8 @@
mod error;
mod ffi_types;
mod payload_codec;
#[cfg(any(target_os = "android", target_os = "ios"))]
mod preview;
mod storage;
#[cfg(test)]
mod tests;
@@ -14,6 +16,8 @@ pub use error::UniffiError;
pub use ffi_types::{
Blob, BlockInfo, CrawlResult, DocClock, DocRecord, DocUpdate, ListedBlob, MatchRange, SearchHit, SetBlob,
};
#[cfg(any(target_os = "android", target_os = "ios"))]
pub use preview::{render_mermaid_preview_svg, render_typst_preview_svg};
pub use storage::{DocStoragePool, new_doc_storage_pool};
uniffi::setup_scaffolding!("affine_mobile_native");

View File

@@ -0,0 +1,155 @@
use std::{borrow::Cow, path::PathBuf};
use mermaid_rs_renderer::RenderOptions;
use typst::{
diag::FileResult,
foundations::Bytes,
layout::{Abs, PagedDocument},
syntax::{FileId, Source},
};
use typst_as_lib::{
TypstEngine,
cached_file_resolver::{CachedFileResolver, IntoCachedFileResolver},
file_resolver::FileResolver,
package_resolver::{FileSystemCache, PackageResolver},
typst_kit_options::TypstKitFontOptions,
};
use crate::{Result, UniffiError};
const TYPST_PACKAGE_CACHE_DIR: &str = "typst-package-cache";
enum MobileTypstPackageResolver {
FileSystem(CachedFileResolver<PackageResolver<FileSystemCache>>),
InMemory(CachedFileResolver<PackageResolver<typst_as_lib::package_resolver::InMemoryCache>>),
}
impl FileResolver for MobileTypstPackageResolver {
fn resolve_binary(&self, id: FileId) -> FileResult<Cow<'_, Bytes>> {
match self {
Self::FileSystem(resolver) => resolver.resolve_binary(id),
Self::InMemory(resolver) => resolver.resolve_binary(id),
}
}
fn resolve_source(&self, id: FileId) -> FileResult<Cow<'_, Source>> {
match self {
Self::FileSystem(resolver) => resolver.resolve_source(id),
Self::InMemory(resolver) => resolver.resolve_source(id),
}
}
}
fn resolve_mermaid_render_options(
theme: Option<String>,
font_family: Option<String>,
font_size: Option<f64>,
) -> RenderOptions {
let mut render_options = match theme.as_deref() {
Some("default") => RenderOptions::mermaid_default(),
_ => RenderOptions::modern(),
};
if let Some(font_family) = font_family {
render_options.theme.font_family = font_family;
}
if let Some(font_size) = font_size {
render_options.theme.font_size = font_size as f32;
}
render_options
}
#[uniffi::export]
pub fn render_mermaid_preview_svg(
code: String,
theme: Option<String>,
font_family: Option<String>,
font_size: Option<f64>,
) -> Result<String> {
let render_options = resolve_mermaid_render_options(theme, font_family, font_size);
mermaid_rs_renderer::render_with_options(&code, render_options).map_err(|error| UniffiError::Err(error.to_string()))
}
fn normalize_typst_svg(svg: String) -> String {
let mut svg = svg;
let page_background_marker = r##"<path class="typst-shape""##;
let mut cursor = 0;
while let Some(relative_idx) = svg[cursor..].find(page_background_marker) {
let idx = cursor + relative_idx;
let rest = &svg[idx..];
let Some(relative_end) = rest.find("/>") else {
break;
};
let end = idx + relative_end + 2;
let path_fragment = &svg[idx..end];
let is_page_background_path =
path_fragment.contains(r#"d="M 0 0v "#) && path_fragment.contains(r#" h "#) && path_fragment.contains(r#" v -"#);
if is_page_background_path {
svg.replace_range(idx..end, "");
cursor = idx;
continue;
}
cursor = end;
}
svg
}
fn resolve_typst_font_dirs(font_dirs: Option<Vec<String>>) -> Vec<PathBuf> {
font_dirs
.map(|dirs| dirs.into_iter().map(PathBuf::from).collect())
.unwrap_or_default()
}
fn resolve_typst_package_resolver(cache_dir: Option<String>) -> Result<MobileTypstPackageResolver> {
let resolver = match cache_dir {
Some(cache_dir) => {
let cache_dir = PathBuf::from(cache_dir).join(TYPST_PACKAGE_CACHE_DIR);
std::fs::create_dir_all(&cache_dir).map_err(|error| UniffiError::Err(error.to_string()))?;
MobileTypstPackageResolver::FileSystem(
PackageResolver::builder()
.cache(FileSystemCache(cache_dir))
.build()
.into_cached(),
)
}
None => {
MobileTypstPackageResolver::InMemory(PackageResolver::builder().with_in_memory_cache().build().into_cached())
}
};
Ok(resolver)
}
#[uniffi::export]
pub fn render_typst_preview_svg(
code: String,
font_dirs: Option<Vec<String>>,
cache_dir: Option<String>,
) -> Result<String> {
let search_options = TypstKitFontOptions::new()
.include_system_fonts(false)
.include_embedded_fonts(true)
.include_dirs(resolve_typst_font_dirs(font_dirs));
let package_resolver = resolve_typst_package_resolver(cache_dir)?;
let engine = TypstEngine::builder()
.main_file(code)
.search_fonts_with(search_options)
.add_file_resolver(package_resolver)
.build();
let document = engine
.compile::<PagedDocument>()
.output
.map_err(|error| UniffiError::Err(error.to_string()))?;
Ok(normalize_typst_svg(typst_svg::svg_merged(&document, Abs::pt(0.0))))
}
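`normalize_typst_svg` above strips the full-page background `<path class="typst-shape">` shapes from the merged SVG so the preview inherits the host's background color. A TypeScript sketch of the same scan-and-splice logic (an illustrative port, not the shipped code):

```typescript
// Remove page-background path elements from a Typst SVG string,
// mirroring the Rust normalize_typst_svg above.
function normalizeTypstSvg(svg: string): string {
  const marker = '<path class="typst-shape"';
  let out = svg;
  let cursor = 0;
  for (;;) {
    const idx = out.indexOf(marker, cursor);
    if (idx < 0) break;
    const end = out.indexOf('/>', idx);
    if (end < 0) break;
    const fragment = out.slice(idx, end + 2);
    // Page backgrounds are rectangles drawn from the origin:
    // "M 0 0v <h> h <w> v -<h> ...".
    const isPageBackground =
      fragment.includes('d="M 0 0v ') &&
      fragment.includes(' h ') &&
      fragment.includes(' v -');
    if (isPageBackground) {
      out = out.slice(0, idx) + out.slice(end + 2);
      cursor = idx; // re-scan from the splice point
    } else {
      cursor = end + 2;
    }
  }
  return out;
}
```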

View File

@@ -1,6 +1,7 @@
[package]
edition = "2024"
name = "affine_native"
publish = false
version = "0.0.0"
[lib]
@@ -25,6 +26,12 @@ sqlx = { workspace = true, default-features = false, features = [
thiserror = { workspace = true }
tokio = { workspace = true, features = ["full"] }
[target.'cfg(not(any(target_os = "android", target_os = "ios")))'.dependencies]
mermaid-rs-renderer = { workspace = true }
typst = { workspace = true }
typst-as-lib = { workspace = true }
typst-svg = { workspace = true }
[target.'cfg(not(target_os = "linux"))'.dependencies]
mimalloc = { workspace = true }

View File

@@ -40,12 +40,47 @@ export declare function decodeAudio(buf: Uint8Array, destSampleRate?: number | u
/** Decode audio file into a Float32Array */
export declare function decodeAudioSync(buf: Uint8Array, destSampleRate?: number | undefined | null, filename?: string | undefined | null): Float32Array
export interface MermaidRenderOptions {
theme?: string
fontFamily?: string
fontSize?: number
}
export interface MermaidRenderRequest {
code: string
options?: MermaidRenderOptions
}
export interface MermaidRenderResult {
svg: string
}
export declare function mintChallengeResponse(resource: string, bits?: number | undefined | null): Promise<string>
export declare function renderMermaidSvg(request: MermaidRenderRequest): MermaidRenderResult
export declare function renderTypstSvg(request: TypstRenderRequest): TypstRenderResult
export interface TypstRenderOptions {
fontUrls?: Array<string>
fontDirs?: Array<string>
}
export interface TypstRenderRequest {
code: string
options?: TypstRenderOptions
}
export interface TypstRenderResult {
svg: string
}
export declare function verifyChallengeResponse(response: string, bits: number, resource: string): Promise<boolean>
export declare class DocStorage {
constructor(path: string)
validate(): Promise<boolean>
validateImportSchema(): Promise<boolean>
vacuumInto(path: string): Promise<void>
setSpaceId(spaceId: string): Promise<void>
}
@@ -55,6 +90,7 @@ export declare class DocStoragePool {
connect(universalId: string, path: string): Promise<void>
disconnect(universalId: string): Promise<void>
checkpoint(universalId: string): Promise<void>
vacuumInto(universalId: string, path: string): Promise<void>
crawlDocData(universalId: string, docId: string): Promise<NativeCrawlResult>
setSpaceId(universalId: string, spaceId: string): Promise<void>
pushUpdate(universalId: string, docId: string, update: Uint8Array): Promise<Date>
@@ -196,11 +232,13 @@ export declare class SqliteConnection {
close(): Promise<void>
get isClose(): boolean
static validate(path: string): Promise<ValidationResult>
validateImportSchema(): Promise<boolean>
migrateAddDocId(): Promise<void>
/**
 * Flush the WAL file to the database file.
 * See https://www.sqlite.org/pragma.html#pragma_wal_checkpoint:~:text=PRAGMA%20schema.wal_checkpoint%3B
 */
checkpoint(): Promise<void>
vacuumInto(path: string): Promise<void>
}
export interface BlobRow {

View File

@@ -580,6 +580,8 @@ module.exports.ShareableContent = nativeBinding.ShareableContent
module.exports.decodeAudio = nativeBinding.decodeAudio
module.exports.decodeAudioSync = nativeBinding.decodeAudioSync
module.exports.mintChallengeResponse = nativeBinding.mintChallengeResponse
module.exports.renderMermaidSvg = nativeBinding.renderMermaidSvg
module.exports.renderTypstSvg = nativeBinding.renderTypstSvg
module.exports.verifyChallengeResponse = nativeBinding.verifyChallengeResponse
module.exports.DocStorage = nativeBinding.DocStorage
module.exports.DocStoragePool = nativeBinding.DocStoragePool

Some files were not shown because too many files have changed in this diff.