Compare commits


18 Commits

Author SHA1 Message Date
DarkSky
59fd942f40 fix(editor): database detail style (#14680)
fix #13923


#### PR Dependency Tree


* **PR #14680** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Style**
* Refined styling and alignment for number field displays in the
database view component.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-18 14:58:53 +08:00
DarkSky
d6d5ae6182 fix(electron): create doc shortcut should follow default type in settings (#14678) 2026-03-18 14:58:22 +08:00
renovate[bot]
c1a09b951f chore: bump up fast-xml-parser version to v5.5.6 [SECURITY] (#14676)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [fast-xml-parser](https://redirect.github.com/NaturalIntelligence/fast-xml-parser) | [`5.4.1` → `5.5.6`](https://renovatebot.com/diffs/npm/fast-xml-parser/5.4.1/5.5.6) | ![age](https://developer.mend.io/api/mc/badges/age/npm/fast-xml-parser/5.5.6?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/fast-xml-parser/5.4.1/5.5.6?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2026-33036](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/security/advisories/GHSA-8gc5-j5rx-235r)

## Summary

The fix for CVE-2026-26278 added entity expansion limits
(`maxTotalExpansions`, `maxExpandedLength`, `maxEntityCount`,
`maxEntitySize`) to prevent XML entity expansion Denial of Service.
However, these limits are only enforced for DOCTYPE-defined entities.
**Numeric character references** (`&#NNN;` and `&#xHH;`) and standard
XML entities (`&lt;`, `&gt;`, etc.) are processed through a separate
code path that does NOT enforce any expansion limits.

An attacker can use massive numbers of numeric entity references to
completely bypass all configured limits, causing excessive memory
allocation and CPU consumption.

## Affected Versions

fast-xml-parser v5.x through v5.5.3 (and likely v5.5.5 on npm)

## Root Cause

In `src/xmlparser/OrderedObjParser.js`, the `replaceEntitiesValue()`
function has two separate entity replacement loops:

1. **Lines 638-670**: DOCTYPE entities — expansion counting with
`entityExpansionCount` and `currentExpandedLength` tracking. This was
the CVE-2026-26278 fix.
2. **Lines 674-677**: `lastEntities` loop — replaces standard entities
including `num_dec` (`/&#([0-9]{1,7});/g`) and `num_hex`
(`/&#x([0-9a-fA-F]{1,6});/g`). **This loop has NO expansion counting at
all.**

The numeric entity regex replacements at lines 97-98 are part of
`lastEntities` and go through the uncounted loop, completely bypassing
the CVE-2026-26278 fix.

## Proof of Concept

```javascript
const { XMLParser } = require('fast-xml-parser');

// Even with strict explicit limits, numeric entities bypass them
const parser = new XMLParser({
  processEntities: {
    enabled: true,
    maxTotalExpansions: 10,
    maxExpandedLength: 100,
    maxEntityCount: 1,
    maxEntitySize: 10
  }
});

// 100K numeric entity references — should be blocked by maxTotalExpansions=10
const xml = `<root>${'&#65;'.repeat(100000)}</root>`;
const result = parser.parse(xml);

// Output: 500,000 chars — bypasses maxExpandedLength=100 completely
console.log('Output length:', result.root.length);  // 500000
console.log('Expected max:', 100);  // limit was 100
```

**Results:**
- 100K `&#65;` references → 500,000 char output (5x default maxExpandedLength of 100,000)
- 1M references → 5,000,000 char output, ~147MB memory consumed
- Even with `maxTotalExpansions=10` and `maxExpandedLength=100`, 10K
references produce 50,000 chars
- Hex entities (`&#x41;`) exhibit the same bypass

## Impact

**Denial of Service** — An attacker who can provide XML input to
applications using fast-xml-parser can cause:
- Excessive memory allocation (147MB+ for 1M entity references)
- CPU consumption during regex replacement
- Potential process crash via OOM

This is particularly dangerous because the application developer may
have explicitly configured strict entity expansion limits believing they
are protected, while numeric entities silently bypass all of them.

## Suggested Fix

Apply the same `entityExpansionCount` and `currentExpandedLength`
tracking to the `lastEntities` loop (lines 674-677) and the HTML
entities loop (lines 680-686), similar to how DOCTYPE entities are
tracked at lines 638-670.
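
A minimal sketch of that counting approach, using the option names from the advisory above (the loop shape and error behavior are illustrative, not the library's actual code):

```javascript
// Illustrative sketch: enforce expansion limits while replacing numeric
// character references, mirroring the DOCTYPE-entity counting the
// CVE-2026-26278 fix already applies on the other code path.
function replaceNumericEntitiesWithLimits(
  value,
  { maxTotalExpansions = 10, maxExpandedLength = 100 } = {}
) {
  let expansions = 0;
  const out = value.replace(/&#(x[0-9a-fA-F]{1,6}|[0-9]{1,7});/g, (match, code) => {
    expansions += 1;
    if (expansions > maxTotalExpansions) {
      // Refuse to expand further instead of silently bypassing the limit.
      throw new Error('Entity expansion limit exceeded');
    }
    const cp = code[0] === 'x' ? parseInt(code.slice(1), 16) : parseInt(code, 10);
    return String.fromCodePoint(cp);
  });
  if (out.length > maxExpandedLength) {
    throw new Error('Expanded length limit exceeded');
  }
  return out;
}
```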

## Workaround

Set `htmlEntities:false`

---

### Release Notes

<details>
<summary>NaturalIntelligence/fast-xml-parser (fast-xml-parser)</summary>

### [`v5.5.6`](e54155f530...870043e75e)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.5...v5.5.6)

### [`v5.5.5`](ea07bb2e84...e54155f530)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.4...v5.5.5)

### [`v5.5.4`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.3...ea07bb2e8435a88136c0e46d7ee8a345107b7582)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.3...v5.5.4)

### [`v5.5.3`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.2...v5.5.3)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.2...v5.5.3)

### [`v5.5.2`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.1...e0a14f7d15a293732e630ce1b7faa39924de2359)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.1...v5.5.2)

### [`v5.5.1`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/releases/tag/v5.5.1): integrate path-expression-matcher

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.0...v5.5.1)

- support path-expression-matcher
- fix: stopNode should not be parsed
- performance improvement for stopNode checking

### [`v5.5.0`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.2...ce017923460f92861e8fc94c91e52f9f5bd6a1b0)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.2...v5.5.0)

### [`v5.4.2`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.1...v5.4.2)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.1...v5.4.2)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-18 13:28:53 +08:00
DarkSky
4ce68d74f1 fix(editor): chat cannot scroll on windows (#14677)
fix #14529 
fix #14612 
replace #14614 #14657


#### PR Dependency Tree


* **PR #14677** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Tests**
* Added test coverage for scroll position tracking and pinned scroll
behavior in AI chat
* Added test suite verifying scroll-to-end and scroll-to-position
functionality

* **New Features**
* Introduced configurable scrollable option for text rendering in AI
chat components, allowing control over scroll behavior

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-18 13:28:05 +08:00
chauhan_s
fbfcc01d14 fix(core): reserve space for auth input error to avoid layout shift (#14670)
Prevents layout shift when showing auth input errors by reserving space
for the error message. Improves visual stability and avoids UI jumps
when validation errors appear.

### Before 


https://github.com/user-attachments/assets/7439aa5e-069d-42ac-8963-e5cdee341ad9



### After

https://github.com/user-attachments/assets/8e758452-5323-4807-8a0d-38913303020d


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Refactor**
* Improved error message display mechanism in authentication components
for more consistent rendering.

* **Style**
* Enhanced vertical spacing for error messages in form inputs to ensure
better visual consistency and readability.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-18 10:48:50 +08:00
DarkSky
1112a06623 fix: ci 2026-03-17 23:32:57 +08:00
chauhan_s
bbcb7e69fe fix: correct "has accept" to "has accepted" (#14669)
fixes #14407

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Corrected grammar in the notification message displayed when an
invitation is accepted.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-17 23:29:28 +08:00
steffenrapp
cc2f23339e feat(i18n): update German translation (#14674)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Documentation**
* Enhanced German language support with new translations for Obsidian
import, MCP server integration, and Copilot features. Improved error
message translations for better clarity and consistency.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-17 23:28:36 +08:00
chauhan_s
31101a69e7 fix: Refine verify email dialog for verify and change email flows (#14671)
### Summary
This PR improves the verify email dialog by giving the verify-email and
change-email flows distinct messaging instead of reusing the same
generic copy.

### What changed
* Use flow-specific body copy in the verify email dialog
* Keep the existing action-specific subtitle behavior for:
  * Verify email
  * Change email
* Update the English i18n strings so each flow explains the correct
intent:
  * Verify email focuses on confirming email ownership
  * Change email focuses on securely starting the email-change process

### Why
The previous dialog message was shared across both flows, which made the
change-email experience feel ambiguous. This update makes the intent
clearer for users and better matches the action they are taking.



https://www.loom.com/share/c64c20570a8242358bd178a2ac50e413


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Improved clarity in email verification and email change dialog
messages to better explain the confirmation process and link purpose.
* Enhanced distinction between email verification and email change
workflows with context-specific messaging.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-17 23:28:16 +08:00
Francisco Jiménez
0b1a44863f feat(editor): add obsidian vault import support (#14593)
fix #14592 

### Description
> 🤖 **Note:** The code in this Pull Request was developed with the
assistance of AI, but has been thoroughly reviewed and manually tested.

> I noticed there's a check when opening an issue that asks _"Is your
content generated by AI?"_, so I mention it here in case it's a deal
breaker. If so I understand, you can close the PR, just wanted to share
this in case it's useful anyways.

This PR introduces **Obsidian Vault Import Support** to AFFiNE. 

Previously, users migrating from Obsidian had to rely on the generic
Markdown importer, which often resulted in broken cross-links, missing
directory structures, and metadata conflicts because Obsidian relies
heavily on proprietary structures not supported by standard Markdown.

This completely new feature makes migrating to AFFiNE easy.

**Key Features & Implementations:**

1. **Vault (Directory) Selection**
- Utilizes the `openDirectory` blocksuite utility in the import modal to
allow users to select an entire folder directly from their filesystem,
maintaining file context rather than forcing `.zip` uploads.

2. **Wikilink Resolution (Two-Pass Import)**
- Restructured the `importObsidianVault` process into a two-pass
architecture.
- **Pass 1:** Discovers all files, assigns new AFFiNE document IDs, and
maps them efficiently (by title, alias, and filename) into a
high-performance hash map.
- **Pass 2:** Processes the generic markdown AST and correctly maps
custom `[[wikilinks]]` to the actual pre-registered AFFiNE blocksuite
document IDs via `obsidianWikilinkToDeltaMatcher`.
- Safely strips leading emojis from wikilink aliases to prevent
duplicated page icons rendering mid-sentence.

3. **Emoji Metadata & State Fixes**
- Implemented an aggressive, single-pass RegExp to extract multiple
leading/combining emojis (`Emoji_Presentation` / `\ufe0f`) from H1
headers and Frontmatter. Emojis are assigned specifically to the page
icon metadata property and cleanly stripped from the visual document
title.
- Fixed a core mutation bug where the loop iterating over existing
`docMetas` was aggressively overwriting newly minted IDs for the current
import batch. This fully resolves the issue where imported pages
(especially re-imports) were incorrectly flagged as `trashed`.
   - Enforces explicit `trash: false` patch instructions.

4. **Syntax Conversion**
- Implemented conversion of Obsidian-style Callouts (`> [!NOTE] Title`)
into native AFFiNE block formats (`> 💡 **Title**`).
- Hardened the `blockquote` parser so that nested structures (like `> -
list items`) are fully preserved instead of discarded.
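
The callout conversion in point 4 can be sketched as a line-level rewrite (the fixed 💡 emoji and the regex here are illustrative assumptions, not the importer's exact matcher, which lives in the blockquote parser):

```javascript
// Illustrative sketch: rewrite an Obsidian callout header line
// (`> [!NOTE] Title`) into the AFFiNE-style blockquote described above.
// The 💡 emoji and regex are assumptions for illustration only.
function convertCalloutHeader(line) {
  const match = /^>\s*\[!(\w+)\]\s*(.*)$/.exec(line);
  if (!match) return line; // not a callout header; leave untouched
  const title = match[2].trim();
  return `> 💡 **${title}**`;
}
```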

### UI Changes
- Updated the Import Modal to include the "Import Obsidian Vault" flow
utilizing the native filesystem directory picker.
- Regenerated and synced `i18n-completenesses.json` correctly up to 100%
across all supported locales for the new modal string additions.

### Testing Instructions
1. Navigate to the Workspace sidebar and click "Import".
2. Select "Obsidian" and use the directory picker to define a
comprehensive Vault folder.
3. Validate that cross-links between documents automatically resolve to
their specific AFFiNE instances.
4. Validate documents containing leading Emojis display exactly one
Emoji (in the page icon area), and none duplicated in the actual title
header.
5. Validate Callouts are rendered cleanly and correctly, and no
documents are incorrectly marked as "Trash".


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Import Obsidian vaults with wikilink resolution, emoji/title
preservation, asset handling, and automatic document creation.
* Folder-based imports via a Directory Picker (with hidden-input
fallback) integrated into the import dialog.

* **Localization**
  * Added Obsidian import label and tooltip translations.

* **Tests**
* Added end-to-end tests validating Obsidian vault import and asset
handling.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: DarkSky <25152247+darkskygit@users.noreply.github.com>
Co-authored-by: DarkSky <darksky2048@gmail.com>
2026-03-17 00:49:17 +08:00
DarkSky
8406f9656e perf(editor): improve bounding box calc caching (#14668)
#### PR Dependency Tree


* **PR #14668** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)
2026-03-16 23:35:38 +08:00
DarkSky
121c0d172d feat(server): improve doc tools error handle (#14662)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Centralized sync/status messages for cloud document sync and explicit
user-facing error types.
* Frontend helpers to detect and display tool errors with friendly
names.

* **Bug Fixes**
* Consistent, actionable error reporting for document and attachment
reads instead of silent failures.
* Search and semantic tools now validate workspace sync and permissions
and return clear responses.

* **Tests**
* Added comprehensive tests covering document/blob reads, search tools,
and sync/error paths.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-16 02:20:35 +08:00
renovate[bot]
8f03090780 chore: bump up Lakr233/MarkdownView version to from: "3.8.2" (#14658)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [Lakr233/MarkdownView](https://redirect.github.com/Lakr233/MarkdownView) | minor | `from: "3.6.3"` → `from: "3.8.2"` |

---

### Release Notes

<details>
<summary>Lakr233/MarkdownView (Lakr233/MarkdownView)</summary>

### [`v3.8.2`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.1...3.8.2)

[Compare Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.1...3.8.2)

### [`v3.8.1`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.0...3.8.1)

[Compare Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.0...3.8.1)

### [`v3.8.0`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.7.0...3.8.0)

[Compare Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.7.0...3.8.0)

### [`v3.7.0`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.6.3...3.7.0)

[Compare Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.6.3...3.7.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-16 00:57:48 +08:00
renovate[bot]
8125cc0e75 chore: bump up Lakr233/ListViewKit version to from: "1.2.0" (#14617)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [Lakr233/ListViewKit](https://redirect.github.com/Lakr233/ListViewKit) | minor | `from: "1.1.8"` → `from: "1.2.0"` |

---

### Release Notes

<details>
<summary>Lakr233/ListViewKit (Lakr233/ListViewKit)</summary>

### [`v1.2.0`](https://redirect.github.com/Lakr233/ListViewKit/compare/1.1.8...1.2.0)

[Compare Source](https://redirect.github.com/Lakr233/ListViewKit/compare/1.1.8...1.2.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My41OS4wIiwidXBkYXRlZEluVmVyIjoiNDMuNTkuMCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-14 23:45:32 +08:00
renovate[bot]
f537a75f01 chore: bump up file-type version to v21.3.2 [SECURITY] (#14655)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [file-type](https://redirect.github.com/sindresorhus/file-type) | [`21.3.1` → `21.3.2`](https://renovatebot.com/diffs/npm/file-type/21.3.1/21.3.2) | ![age](https://developer.mend.io/api/mc/badges/age/npm/file-type/21.3.2?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/file-type/21.3.1/21.3.2?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2026-31808](https://redirect.github.com/sindresorhus/file-type/security/advisories/GHSA-5v7r-6r5c-r473)

### Impact
A denial of service vulnerability exists in the ASF (WMV/WMA) file type
detection parser. When parsing a crafted input where an ASF sub-header
has a `size` field of zero, the parser enters an infinite loop. The
`payload` value becomes negative (-24), causing
`tokenizer.ignore(payload)` to move the read position backwards, so the
same sub-header is read repeatedly forever.

Any application that uses `file-type` to detect the type of
untrusted/attacker-controlled input is affected. An attacker can stall
the Node.js event loop with a 55-byte payload.
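
The loop shape described above can be sketched with a simple guard, assuming the structure in the advisory (names like `HEADER_BYTES` are illustrative, not file-type's actual code):

```javascript
// Illustrative sketch: each ASF sub-header declares its own size, and the
// parser skips `size - HEADER_BYTES` payload bytes. A zero size makes the
// skip negative (-24), so the read position never advances. The patched
// behavior amounts to refusing the backwards seek instead of looping.
const HEADER_BYTES = 24; // ASF sub-header size field includes its own 24 bytes

function walkSubHeaders(sizes) {
  let position = 0;
  for (const size of sizes) {
    const payload = size - HEADER_BYTES;
    if (payload < 0) {
      // Guard: a malformed size would move the tokenizer backwards forever.
      throw new Error('Malformed ASF sub-header: negative payload');
    }
    position += HEADER_BYTES + payload;
  }
  return position;
}
```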

### Patches
Fixed in version 21.3.1. Users should upgrade to >= 21.3.1.

### Workarounds
Validate or limit the size of input buffers before passing them to
`file-type`, or run file type detection in a worker thread with a
timeout.
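
The timeout part of that workaround can be sketched with a plain `Promise.race` wrapper (the worker-thread plumbing is omitted; any file-type detection call could be passed in as the promise):

```javascript
// Illustrative sketch: bound any detection call with a timeout so a
// pathological input cannot stall the caller indefinitely. The full
// workaround would additionally run detection in a worker thread so the
// main event loop stays responsive even while the worker spins.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`Detection timed out after ${ms}ms`)),
      ms
    );
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```

Usage would look like `await withTimeout(fileTypeFromBuffer(buffer), 1000)`, rejecting if detection hangs.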

### References
- Fix commit: 319abf871b50ba2fa221b4a7050059f1ae096f4f

### Reporter

crnkovic@lokvica.com

#### [CVE-2026-32630](https://redirect.github.com/sindresorhus/file-type/security/advisories/GHSA-j47w-4g3g-c36v)

## Summary

A crafted ZIP file can trigger excessive memory growth during type
detection in `file-type` when using `fileTypeFromBuffer()`,
`fileTypeFromBlob()`, or `fileTypeFromFile()`.

In affected versions, the ZIP inflate output limit is enforced for
stream-based detection, but not for known-size inputs. As a result, a
small compressed ZIP can cause `file-type` to inflate and process a much
larger payload while probing ZIP-based formats such as OOXML. In testing
on `file-type` `21.3.1`, a ZIP of about `255 KB` caused about `257 MB`
of RSS growth during `fileTypeFromBuffer()`.

This is an availability issue. Applications that use these APIs on
untrusted uploads can be forced to consume large amounts of memory and
may become slow or crash.

## Root Cause

The ZIP detection logic applied different limits depending on whether
the tokenizer had a known file size.

For stream inputs, ZIP probing was bounded by
`maximumZipEntrySizeInBytes` (`1 MiB`). For known-size inputs such as
buffers, blobs, and files, the code instead used
`Number.MAX_SAFE_INTEGER` in two relevant places:

```js
const maximumContentTypesEntrySize = hasUnknownFileSize(tokenizer)
	? maximumZipEntrySizeInBytes
	: Number.MAX_SAFE_INTEGER;
```

and:

```js
const maximumLength = hasUnknownFileSize(this.tokenizer)
	? maximumZipEntrySizeInBytes
	: Number.MAX_SAFE_INTEGER;
```

Together, these checks allowed a crafted ZIP to bypass the intended
inflate limit for known-size APIs and force large decompression during
detection of entries such as `[Content_Types].xml`.

## Proof of Concept

```js
import {fileTypeFromBuffer} from 'file-type';
import archiver from 'archiver';
import {Writable} from 'node:stream';

async function createZipBomb(sizeInMegabytes) {
	return new Promise((resolve, reject) => {
		const chunks = [];
		const writable = new Writable({
			write(chunk, encoding, callback) {
				chunks.push(chunk);
				callback();
			},
		});

		const archive = archiver('zip', {zlib: {level: 9}});
		archive.pipe(writable);
		writable.on('finish', () => {
			resolve(Buffer.concat(chunks));
		});
		archive.on('error', reject);

		const xmlPrefix = '<?xml version="1.0"?><Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">';
		const padding = Buffer.alloc(sizeInMegabytes * 1024 * 1024 - xmlPrefix.length, 0x20);
		archive.append(Buffer.concat([Buffer.from(xmlPrefix), padding]), {name: '[Content_Types].xml'});
		archive.finalize();
	});
}

const zip = await createZipBomb(256);
console.log('ZIP size (KB):', (zip.length / 1024).toFixed(0));

const before = process.memoryUsage().rss;
await fileTypeFromBuffer(zip);
const after = process.memoryUsage().rss;

console.log('RSS growth (MB):', ((after - before) / 1024 / 1024).toFixed(0));
```

Observed on `file-type` `21.3.1`:
- ZIP size: about `255 KB`
- RSS growth during detection: about `257 MB`

## Affected APIs

Affected:
- `fileTypeFromBuffer()`
- `fileTypeFromBlob()`
- `fileTypeFromFile()`

Not affected:
- `fileTypeFromStream()`, which already enforced the ZIP inflate limit
for unknown-size inputs

## Impact

Applications that inspect untrusted uploads with `fileTypeFromBuffer()`,
`fileTypeFromBlob()`, or `fileTypeFromFile()` can be forced to consume
excessive memory during ZIP-based type detection. This can degrade
service or lead to process termination in memory-constrained
environments.

## Cause

The issue was introduced in 399b0f1

---

### Release Notes

<details>
<summary>sindresorhus/file-type (file-type)</summary>

### [`v21.3.2`](https://redirect.github.com/sindresorhus/file-type/releases/tag/v21.3.2)

[Compare Source](https://redirect.github.com/sindresorhus/file-type/compare/v21.3.1...v21.3.2)

- Fix ZIP bomb in known-size ZIP probing (GHSA-j47w-4g3g-c36v) [`a155cd7`](https://redirect.github.com/sindresorhus/file-type/commit/a155cd7)
- Fix bound recursive BOM and ID3 detection [`370ed91`](https://redirect.github.com/sindresorhus/file-type/commit/370ed91)

***

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-14 23:44:06 +08:00
renovate[bot]
9456a07889 chore: migrate Renovate config (#14656)
The Renovate config in this repository needs migrating. Typically this
is because one or more configuration options you are using have been
renamed.

You don't need to merge this PR right away, because Renovate will
continue to migrate these fields internally each time it runs. But later
some of these fields may be fully deprecated and the migrations removed.
So it's a good idea to merge this migration PR soon.





🔕 **Ignore**: Close this PR and you won't be reminded about config
migration again, but one day your current config may no longer be valid.

 Got questions? Does something look wrong to you? Please don't hesitate
to [request help
here](https://redirect.github.com/renovatebot/renovate/discussions).


---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-14 23:43:39 +08:00
sahilkhan09k
8f571ddc30 fix: ensure images load correctly when printing to PDF (#14618)
Fixes #14304

## Summary
This PR resolves an issue where images sometimes fail to appear when
exporting or printing AFFiNE pages to PDF. The issue occurs because
images may not finish loading inside the hidden print iframe before
`window.print()` is triggered.

## Changes
- Avoid using `display: none` for the print iframe and instead keep it
hidden while remaining in the rendering tree to ensure resources load
correctly.
- Remove `loading="lazy"` from all images before printing to prevent
viewport-based lazy loading from blocking image fetches.
- Force image reload by reassigning the `src` attribute after removing
lazy loading.
- Add a `waitForImages` helper to ensure all images (including those
inside Shadow DOM) finish loading before calling `window.print()`.
- Improve reliability by checking both `img.complete` and
`img.naturalWidth` to confirm successful image loading.
- Wait for fonts using `document.fonts.ready` before triggering the
print dialog.
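
The `waitForImages` idea can be sketched roughly like this (a simplified shape, not the PR's exact code; the real helper also walks shadow roots to collect the images):

```javascript
// Illustrative sketch of the waitForImages approach described above:
// treat an image as loaded only when `complete` is true AND it actually
// decoded to a non-zero width; otherwise wait for its load/error events.
function waitForImages(images) {
  return Promise.all(
    images.map(img => {
      if (img.complete && img.naturalWidth > 0) {
        return Promise.resolve(); // already fully loaded and decoded
      }
      return new Promise(resolve => {
        // Resolve on error too: a broken image should not block printing.
        img.addEventListener('load', resolve, { once: true });
        img.addEventListener('error', resolve, { once: true });
      });
    })
  );
}
```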

## Verification
1. Run AFFiNE in development mode: `npm run dev`
2. Open a page containing multiple images.
3. Click **Print** and select **Save as PDF** (or any PDF printer).
4. Verify that all images appear correctly in the generated PDF.

## Notes
This change focuses only on improving the reliability of the existing
print-to-PDF workflow without altering any feature flags or export
behavior.

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Bug Fixes**
* Improved PDF export reliability by waiting for all images (including
inside shadow content) and fonts to load before printing.
* Removed lazy-loading interference so images reliably appear in
exports.
* Ensured styles and light-theme attributes are consistently applied to
the print document.

* **Improvements**
* More robust print preparation using a hidden-but-rendering iframe
document, deep-cloning content (flattening shadow DOM), and preserved
canvas mapping for accurate renders.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-13 09:57:07 +08:00
Mohad
13ad1beb10 feat(i18n): automatic RTL layout for Arabic, Persian, and Urdu + complete Arabic translations (#14624)
## Changes

### RTL Support (automatic, locale-driven)
- Add `rtl?: boolean` metadata to locale definitions in
`SUPPORTED_LANGUAGES`
- Set `rtl: true` for Arabic (`ar`), Persian (`fa`), and Urdu (`ur`)
- Automatically set `document.documentElement.dir` based on locale RTL
metadata on language change
- Remove hardcoded `lang="en"` from HTML template — JS now controls both
`lang` and `dir`
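
The locale-to-direction mapping can be sketched as below (the language list is a stand-in for `SUPPORTED_LANGUAGES`, with only the `rtl` flag shape taken from the description above):

```javascript
// Illustrative sketch: derive document direction from locale metadata.
// The entries here are a stand-in for the real SUPPORTED_LANGUAGES array.
const SUPPORTED_LANGUAGES = [
  { code: 'en' },
  { code: 'ar', rtl: true },
  { code: 'fa', rtl: true },
  { code: 'ur', rtl: true },
];

function directionFor(code) {
  const locale = SUPPORTED_LANGUAGES.find(l => l.code === code);
  return locale?.rtl ? 'rtl' : 'ltr';
}

// On language change, the app would then set both attributes:
//   document.documentElement.lang = code;
//   document.documentElement.dir = directionFor(code);
```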

### Arabic Translations
- Add 100 missing keys to `ar.json` (Calendar integration, Doc
Analytics, MCP Server, AI Chat, and more)
- Arabic locale now has 2,313/2,313 keys (100% coverage, matches
`en.json` exactly)

## Testing
Switching to Arabic/Persian/Urdu now automatically flips the entire UI
layout to RTL without any manual feature flag.

Fixes #7099

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Added Right-to-Left (RTL) support for Arabic, Persian, and Urdu with
automatic document direction and language attributes when a language is
selected.

* **Refactor**
* Centralized and reordered internal language handling so document
language and direction are applied earlier and consistently.

* **Chore**
  * Set a default text direction attribute on the base HTML template.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-12 23:14:19 +08:00
76 changed files with 3570 additions and 1363 deletions


@@ -63,7 +63,7 @@
"groupName": "opentelemetry",
"matchPackageNames": [
"/^@opentelemetry/",
"/^@google-cloud\/opentelemetry-/"
"/^@google-cloud/opentelemetry-/"
]
}
],
@@ -79,7 +79,7 @@
"customManagers": [
{
"customType": "regex",
"fileMatch": ["^rust-toolchain\\.toml?$"],
"managerFilePatterns": ["/^rust-toolchain\\.toml?$/"],
"matchStrings": [
"channel\\s*=\\s*\"(?<currentValue>\\d+\\.\\d+(\\.\\d+)?)\""
],

View File

@@ -0,0 +1,94 @@
// Vitest Snapshot v1, https://vitest.dev/guide/snapshot.html
exports[`snapshot to markdown > imports obsidian vault fixtures 1`] = `
{
"entry": {
"children": [
{
"children": [
{
"children": [
{
"delta": [
{
"insert": "Panel
Body line",
},
],
"flavour": "affine:paragraph",
"type": "text",
},
],
"emoji": "💡",
"flavour": "affine:callout",
},
{
"flavour": "affine:attachment",
"name": "archive.zip",
"style": "horizontalThin",
},
{
"delta": [
{
"footnote": {
"label": "1",
"reference": {
"title": "reference body",
"type": "url",
},
},
"insert": " ",
},
],
"flavour": "affine:paragraph",
"type": "text",
},
{
"flavour": "affine:divider",
},
{
"delta": [
{
"insert": "after note",
},
],
"flavour": "affine:paragraph",
"type": "text",
},
{
"delta": [
{
"insert": " ",
"reference": {
"page": "linked",
"type": "LinkedPage",
},
},
],
"flavour": "affine:paragraph",
"type": "text",
},
{
"delta": [
{
"insert": "Sources",
},
],
"flavour": "affine:paragraph",
"type": "h6",
},
{
"flavour": "affine:bookmark",
},
],
"flavour": "affine:note",
},
],
"flavour": "affine:page",
},
"titles": [
"entry",
"linked",
],
}
`;

View File

@@ -0,0 +1,14 @@
> [!custom] Panel
> Body line
![[archive.zip]]
[^1]
---
after note
[[linked]]
[^1]: reference body

View File

@@ -0,0 +1 @@
plain linked page

View File

@@ -1,4 +1,10 @@
import { MarkdownTransformer } from '@blocksuite/affine/widgets/linked-doc';
import { readFileSync } from 'node:fs';
import { basename, resolve } from 'node:path';
import {
MarkdownTransformer,
ObsidianTransformer,
} from '@blocksuite/affine/widgets/linked-doc';
import {
DefaultTheme,
NoteDisplayMode,
@@ -8,13 +14,18 @@ import {
CalloutAdmonitionType,
CalloutExportStyle,
calloutMarkdownExportMiddleware,
docLinkBaseURLMiddleware,
embedSyncedDocMiddleware,
MarkdownAdapter,
titleMiddleware,
} from '@blocksuite/affine-shared/adapters';
import type { AffineTextAttributes } from '@blocksuite/affine-shared/types';
import type {
BlockSnapshot,
DeltaInsert,
DocSnapshot,
SliceSnapshot,
Store,
TransformerMiddleware,
} from '@blocksuite/store';
import { AssetsManager, MemoryBlobCRUD, Schema } from '@blocksuite/store';
@@ -29,6 +40,138 @@ import { testStoreExtensions } from '../utils/store.js';
const provider = getProvider();
function withRelativePath(file: File, relativePath: string): File {
Object.defineProperty(file, 'webkitRelativePath', {
value: relativePath,
writable: false,
});
return file;
}
function markdownFixture(relativePath: string): File {
return withRelativePath(
new File(
[
readFileSync(
resolve(import.meta.dirname, 'fixtures/obsidian', relativePath),
'utf8'
),
],
basename(relativePath),
{ type: 'text/markdown' }
),
`vault/${relativePath}`
);
}
function exportSnapshot(doc: Store): DocSnapshot {
const job = doc.getTransformer([
docLinkBaseURLMiddleware(doc.workspace.id),
titleMiddleware(doc.workspace.meta.docMetas),
]);
const snapshot = job.docToSnapshot(doc);
expect(snapshot).toBeTruthy();
return snapshot!;
}
function normalizeDeltaForSnapshot(
delta: DeltaInsert<AffineTextAttributes>[],
titleById: ReadonlyMap<string, string>
) {
return delta.map(item => {
const normalized: Record<string, unknown> = {
insert: item.insert,
};
if (item.attributes?.link) {
normalized.link = item.attributes.link;
}
if (item.attributes?.reference?.type === 'LinkedPage') {
normalized.reference = {
type: 'LinkedPage',
page: titleById.get(item.attributes.reference.pageId) ?? '<missing>',
...(item.attributes.reference.title
? { title: item.attributes.reference.title }
: {}),
};
}
if (item.attributes?.footnote) {
const reference = item.attributes.footnote.reference;
normalized.footnote = {
label: item.attributes.footnote.label,
reference:
reference.type === 'doc'
? {
type: 'doc',
page: reference.docId
? (titleById.get(reference.docId) ?? '<missing>')
: '<missing>',
}
: {
type: reference.type,
...(reference.title ? { title: reference.title } : {}),
...(reference.fileName ? { fileName: reference.fileName } : {}),
},
};
}
return normalized;
});
}
function simplifyBlockForSnapshot(
block: BlockSnapshot,
titleById: ReadonlyMap<string, string>
): Record<string, unknown> {
const simplified: Record<string, unknown> = {
flavour: block.flavour,
};
if (block.flavour === 'affine:paragraph' || block.flavour === 'affine:list') {
simplified.type = block.props.type;
const text = block.props.text as
| { delta?: DeltaInsert<AffineTextAttributes>[] }
| undefined;
simplified.delta = normalizeDeltaForSnapshot(text?.delta ?? [], titleById);
}
if (block.flavour === 'affine:callout') {
simplified.emoji = block.props.emoji;
}
if (block.flavour === 'affine:attachment') {
simplified.name = block.props.name;
simplified.style = block.props.style;
}
if (block.flavour === 'affine:image') {
simplified.sourceId = '<asset>';
}
const children = (block.children ?? [])
.filter(child => child.flavour !== 'affine:surface')
.map(child => simplifyBlockForSnapshot(child, titleById));
if (children.length) {
simplified.children = children;
}
return simplified;
}
function snapshotDocByTitle(
collection: TestWorkspace,
title: string,
titleById: ReadonlyMap<string, string>
) {
const meta = collection.meta.docMetas.find(meta => meta.title === title);
expect(meta).toBeTruthy();
const doc = collection.getDoc(meta!.id)?.getStore({ id: meta!.id });
expect(doc).toBeTruthy();
return simplifyBlockForSnapshot(exportSnapshot(doc!).blocks, titleById);
}
describe('snapshot to markdown', () => {
test('code', async () => {
const blockSnapshot: BlockSnapshot = {
@@ -127,6 +270,46 @@ Hello world
expect(meta?.tags).toEqual(['a', 'b']);
});
test('imports obsidian vault fixtures', async () => {
const schema = new Schema().register(AffineSchemas);
const collection = new TestWorkspace();
collection.storeExtensions = testStoreExtensions;
collection.meta.initialize();
const attachment = withRelativePath(
new File([new Uint8Array([80, 75, 3, 4])], 'archive.zip', {
type: 'application/zip',
}),
'vault/archive.zip'
);
const { docIds } = await ObsidianTransformer.importObsidianVault({
collection,
schema,
importedFiles: [
markdownFixture('entry.md'),
markdownFixture('linked.md'),
attachment,
],
extensions: testStoreExtensions,
});
expect(docIds).toHaveLength(2);
const titleById = new Map(
collection.meta.docMetas.map(meta => [
meta.id,
meta.title ?? '<untitled>',
])
);
expect({
titles: collection.meta.docMetas
.map(meta => meta.title)
.sort((a, b) => (a ?? '').localeCompare(b ?? '')),
entry: snapshotDocByTitle(collection, 'entry', titleById),
}).toMatchSnapshot();
});
test('paragraph', async () => {
const blockSnapshot: BlockSnapshot = {
type: 'block',

View File

@@ -5,6 +5,7 @@ import {
import {
BlockMarkdownAdapterExtension,
type BlockMarkdownAdapterMatcher,
createAttachmentBlockSnapshot,
FOOTNOTE_DEFINITION_PREFIX,
getFootnoteDefinitionText,
isFootnoteDefinitionNode,
@@ -56,18 +57,15 @@ export const attachmentBlockMarkdownAdapterMatcher: BlockMarkdownAdapterMatcher
}
walkerContext
.openNode(
{
type: 'block',
createAttachmentBlockSnapshot({
id: nanoid(),
flavour: AttachmentBlockSchema.model.flavour,
props: {
name: fileName,
sourceId: blobId,
footnoteIdentifier,
style: 'citation',
},
children: [],
},
}),
'children'
)
.closeNode();

View File

@@ -83,9 +83,9 @@ export class RecordField extends SignalWatcher(
border: 1px solid transparent;
}
.field-content .affine-database-number {
.field-content affine-database-number-cell .number {
text-align: left;
justify-content: start;
justify-content: flex-start;
}
.field-content:hover {

View File

@@ -3,8 +3,11 @@ import {
EdgelessCRUDIdentifier,
TextUtils,
} from '@blocksuite/affine-block-surface';
import type { ShapeElementModel } from '@blocksuite/affine-model';
import { MindmapElementModel, TextResizing } from '@blocksuite/affine-model';
import {
MindmapElementModel,
ShapeElementModel,
TextResizing,
} from '@blocksuite/affine-model';
import type { RichText } from '@blocksuite/affine-rich-text';
import { ThemeProvider } from '@blocksuite/affine-shared/services';
import { getSelectedRect } from '@blocksuite/affine-shared/utils';
@@ -26,7 +29,7 @@ import { styleMap } from 'lit/directives/style-map.js';
import * as Y from 'yjs';
export function mountShapeTextEditor(
shapeElement: { id: string; text?: Y.Text } | null | undefined,
shapeElement: ShapeElementModel,
edgeless: BlockComponent
) {
const mountElm = edgeless.querySelector('.edgeless-mount-point');
@@ -40,27 +43,24 @@ export function mountShapeTextEditor(
const gfx = edgeless.std.get(GfxControllerIdentifier);
const crud = edgeless.std.get(EdgelessCRUDIdentifier);
if (!shapeElement?.id) {
console.error('Cannot mount text editor on an invalid shape element');
return;
}
const updatedElement = crud.getElementById(shapeElement.id);
if (!updatedElement || !('id' in updatedElement)) {
if (!(updatedElement instanceof ShapeElementModel)) {
console.error('Cannot mount text editor on a non-shape element');
return;
}
gfx.tool.setTool(DefaultTool);
gfx.selection.set({
elements: [updatedElement.id],
elements: [shapeElement.id],
editing: true,
});
if (!updatedElement.text) {
if (!shapeElement.text) {
const text = new Y.Text();
crud.updateElement(updatedElement.id, { text });
edgeless.std
.get(EdgelessCRUDIdentifier)
.updateElement(shapeElement.id, { text });
}
const shapeEditor = new EdgelessShapeTextEditor();
@@ -280,21 +280,6 @@ export class EdgelessShapeTextEditor extends WithDisposable(ShadowlessElement) {
this._unmount();
}
);
this.disposables.addFromEvent(
this.inlineEditorContainer,
'compositionupdate',
() => {
this._updateElementWH();
}
);
this.disposables.addFromEvent(
this.inlineEditorContainer,
'compositionend',
() => {
this._updateElementWH();
}
);
})
.catch(console.error);

View File

@@ -1,4 +1,7 @@
import { AttachmentBlockSchema } from '@blocksuite/affine-model';
import {
type AttachmentBlockProps,
AttachmentBlockSchema,
} from '@blocksuite/affine-model';
import { BlockSuiteError, ErrorCode } from '@blocksuite/global/exceptions';
import {
type AssetsManager,
@@ -23,6 +26,24 @@ import { AdapterFactoryIdentifier } from './types/adapter';
export type Attachment = File[];
type CreateAttachmentBlockSnapshotOptions = {
id?: string;
props: Partial<AttachmentBlockProps> & Pick<AttachmentBlockProps, 'name'>;
};
export function createAttachmentBlockSnapshot({
id = nanoid(),
props,
}: CreateAttachmentBlockSnapshotOptions): BlockSnapshot {
return {
type: 'block',
id,
flavour: AttachmentBlockSchema.model.flavour,
props,
children: [],
};
}
type AttachmentToSliceSnapshotPayload = {
file: Attachment;
assets?: AssetsManager;
@@ -97,8 +118,6 @@ export class AttachmentAdapter extends BaseAdapter<Attachment> {
if (files.length === 0) return null;
const content: SliceSnapshot['content'] = [];
const flavour = AttachmentBlockSchema.model.flavour;
for (const blob of files) {
const id = nanoid();
const { name, size, type } = blob;
@@ -108,22 +127,21 @@ export class AttachmentAdapter extends BaseAdapter<Attachment> {
mapInto: sourceId => ({ sourceId }),
});
content.push({
type: 'block',
flavour,
id,
props: {
name,
size,
type,
embed: false,
style: 'horizontalThin',
index: 'a0',
xywh: '[0,0,0,0]',
rotate: 0,
},
children: [],
});
content.push(
createAttachmentBlockSnapshot({
id,
props: {
name,
size,
type,
embed: false,
style: 'horizontalThin',
index: 'a0',
xywh: '[0,0,0,0]',
rotate: 0,
},
})
);
}
return {

View File

@@ -1,3 +1,20 @@
function safeDecodePathReference(path: string): string {
try {
return decodeURIComponent(path);
} catch {
return path;
}
}
export function normalizeFilePathReference(path: string): string {
return safeDecodePathReference(path)
.trim()
.replace(/\\/g, '/')
.replace(/^\.\/+/, '')
.replace(/^\/+/, '')
.replace(/\/+/g, '/');
}
/**
* Normalizes a relative path by resolving all relative path segments
* @param basePath The base path (markdown file's directory)
@@ -40,7 +57,7 @@ export function getImageFullPath(
imageReference: string
): string {
// Decode the image reference in case it contains URL-encoded characters
const decodedReference = decodeURIComponent(imageReference);
const decodedReference = safeDecodePathReference(imageReference);
// Get the directory of the file path
const markdownDir = filePath.substring(0, filePath.lastIndexOf('/'));

View File

@@ -20,9 +20,30 @@ declare global {
showOpenFilePicker?: (
options?: OpenFilePickerOptions
) => Promise<FileSystemFileHandle[]>;
// Window API: showDirectoryPicker
showDirectoryPicker?: (options?: {
id?: string;
mode?: 'read' | 'readwrite';
startIn?: FileSystemHandle | string;
}) => Promise<FileSystemDirectoryHandle>;
}
}
// Minimal polyfill for FileSystemDirectoryHandle to iterate over files
interface FileSystemDirectoryHandle {
kind: 'directory';
name: string;
values(): AsyncIterableIterator<
FileSystemFileHandle | FileSystemDirectoryHandle
>;
}
interface FileSystemFileHandle {
kind: 'file';
name: string;
getFile(): Promise<File>;
}
// See [Common MIME types](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Common_types)
const FileTypes: NonNullable<OpenFilePickerOptions['types']> = [
{
@@ -121,21 +142,27 @@ type AcceptTypes =
| 'Docx'
| 'MindMap';
export async function openFilesWith(
acceptType: AcceptTypes = 'Any',
multiple: boolean = true
): Promise<File[] | null> {
// Feature detection. The API needs to be supported
// and the app not run in an iframe.
const supportsFileSystemAccess =
'showOpenFilePicker' in window &&
function canUseFileSystemAccessAPI(
api: 'showOpenFilePicker' | 'showDirectoryPicker'
) {
return (
api in window &&
(() => {
try {
return window.self === window.top;
} catch {
return false;
}
})();
})()
);
}
export async function openFilesWith(
acceptType: AcceptTypes = 'Any',
multiple: boolean = true
): Promise<File[] | null> {
const supportsFileSystemAccess =
canUseFileSystemAccessAPI('showOpenFilePicker');
// If the File System Access API is supported…
if (supportsFileSystemAccess && window.showOpenFilePicker) {
@@ -194,6 +221,75 @@ export async function openFilesWith(
});
}
export async function openDirectory(): Promise<File[] | null> {
const supportsFileSystemAccess = canUseFileSystemAccessAPI(
'showDirectoryPicker'
);
if (supportsFileSystemAccess && window.showDirectoryPicker) {
try {
const dirHandle = await window.showDirectoryPicker();
const files: File[] = [];
const readDirectory = async (
directoryHandle: FileSystemDirectoryHandle,
path: string
) => {
for await (const handle of directoryHandle.values()) {
const relativePath = path ? `${path}/${handle.name}` : handle.name;
if (handle.kind === 'file') {
const fileHandle = handle as FileSystemFileHandle;
if (fileHandle.getFile) {
const file = await fileHandle.getFile();
Object.defineProperty(file, 'webkitRelativePath', {
value: relativePath,
writable: false,
});
files.push(file);
}
} else if (handle.kind === 'directory') {
await readDirectory(
handle as FileSystemDirectoryHandle,
relativePath
);
}
}
};
await readDirectory(dirHandle, '');
return files;
} catch (err) {
console.error(err);
return null;
}
}
return new Promise(resolve => {
const input = document.createElement('input');
input.classList.add('affine-upload-input');
input.style.display = 'none';
input.type = 'file';
input.setAttribute('webkitdirectory', '');
input.setAttribute('directory', '');
document.body.append(input);
input.addEventListener('change', () => {
input.remove();
resolve(input.files ? Array.from(input.files) : null);
});
input.addEventListener('cancel', () => resolve(null));
if ('showPicker' in HTMLInputElement.prototype) {
input.showPicker();
} else {
input.click();
}
});
}
export async function openSingleFileWith(
acceptType?: AcceptTypes
): Promise<File | null> {

View File

@@ -17,7 +17,14 @@ export async function printToPdf(
return new Promise<void>((resolve, reject) => {
const iframe = document.createElement('iframe');
document.body.append(iframe);
iframe.style.display = 'none';
// Use a hidden but rendering-enabled state instead of display: none
Object.assign(iframe.style, {
visibility: 'hidden',
position: 'absolute',
width: '0',
height: '0',
border: 'none',
});
iframe.srcdoc = '<!DOCTYPE html>';
iframe.onload = async () => {
if (!iframe.contentWindow) {
@@ -28,6 +35,44 @@ export async function printToPdf(
reject(new Error('Root element not defined, unable to print pdf'));
return;
}
const doc = iframe.contentWindow.document;
doc.write(`<!DOCTYPE html><html><head><style>@media print {
html, body {
height: initial !important;
overflow: initial !important;
print-color-adjust: exact;
-webkit-print-color-adjust: exact;
color: #000 !important;
background: #fff !important;
color-scheme: light !important;
}
::-webkit-scrollbar {
display: none;
}
:root, body {
--affine-text-primary: #000 !important;
--affine-text-secondary: #111 !important;
--affine-text-tertiary: #333 !important;
--affine-background-primary: #fff !important;
--affine-background-secondary: #fff !important;
--affine-background-tertiary: #fff !important;
}
body, [data-theme='dark'] {
color: #000 !important;
background: #fff !important;
}
body * {
color: #000 !important;
-webkit-text-fill-color: #000 !important;
}
:root {
--affine-note-shadow-box: none !important;
--affine-note-shadow-sticker: none !important;
}
}</style></head><body></body></html>`);
doc.close();
iframe.contentWindow.document
.write(`<!DOCTYPE html><html><head><style>@media print {
html, body {
@@ -71,7 +116,7 @@ export async function printToPdf(
for (const element of document.styleSheets) {
try {
for (const cssRule of element.cssRules) {
const target = iframe.contentWindow.document.styleSheets[0];
const target = doc.styleSheets[0];
target.insertRule(cssRule.cssText, target.cssRules.length);
}
} catch (e) {
@@ -86,12 +131,33 @@ export async function printToPdf(
}
}
// Recursive function to find all canvases, including those in shadow roots
const findAllCanvases = (root: Node): HTMLCanvasElement[] => {
const canvases: HTMLCanvasElement[] = [];
const traverse = (node: Node) => {
if (node instanceof HTMLCanvasElement) {
canvases.push(node);
}
if (node instanceof HTMLElement || node instanceof ShadowRoot) {
node.childNodes.forEach(traverse);
}
if (node instanceof HTMLElement && node.shadowRoot) {
traverse(node.shadowRoot);
}
};
traverse(root);
return canvases;
};
// convert all canvas to image
const canvasImgObjectUrlMap = new Map<string, string>();
const allCanvas = rootElement.getElementsByTagName('canvas');
const allCanvas = findAllCanvases(rootElement);
let canvasKey = 1;
const canvasToKeyMap = new Map<HTMLCanvasElement, string>();
for (const canvas of allCanvas) {
canvas.dataset['printToPdfCanvasKey'] = canvasKey.toString();
const key = canvasKey.toString();
canvasToKeyMap.set(canvas, key);
canvasKey++;
const canvasImgObjectUrl = await new Promise<Blob | null>(resolve => {
try {
@@ -106,20 +172,42 @@ export async function printToPdf(
);
continue;
}
canvasImgObjectUrlMap.set(
canvas.dataset['printToPdfCanvasKey'],
URL.createObjectURL(canvasImgObjectUrl)
);
canvasImgObjectUrlMap.set(key, URL.createObjectURL(canvasImgObjectUrl));
}
const importedRoot = iframe.contentWindow.document.importNode(
rootElement,
true
) as HTMLDivElement;
// Recursive deep clone that flattens Shadow DOM into Light DOM
const deepCloneWithShadows = (node: Node): Node => {
const clone = doc.importNode(node, false);
if (
clone instanceof HTMLCanvasElement &&
node instanceof HTMLCanvasElement
) {
const key = canvasToKeyMap.get(node);
if (key) {
clone.dataset['printToPdfCanvasKey'] = key;
}
}
const appendChildren = (source: Node) => {
source.childNodes.forEach(child => {
(clone as Element).append(deepCloneWithShadows(child));
});
};
if (node instanceof HTMLElement && node.shadowRoot) {
appendChildren(node.shadowRoot);
}
appendChildren(node);
return clone;
};
const importedRoot = deepCloneWithShadows(rootElement) as HTMLDivElement;
// force light theme in print iframe
iframe.contentWindow.document.documentElement.dataset.theme = 'light';
iframe.contentWindow.document.body.dataset.theme = 'light';
doc.documentElement.dataset.theme = 'light';
doc.body.dataset.theme = 'light';
importedRoot.dataset.theme = 'light';
// draw saved canvas image to canvas
@@ -138,17 +226,67 @@ export async function printToPdf(
}
}
// append to iframe and print
iframe.contentWindow.document.body.append(importedRoot);
// Remove lazy loading from all images and force reload
const allImages = importedRoot.querySelectorAll('img');
allImages.forEach(img => {
img.removeAttribute('loading');
const src = img.getAttribute('src');
if (src) img.setAttribute('src', src);
});
// append to iframe
doc.body.append(importedRoot);
await options.beforeprint?.(iframe);
// browser may take some time to load font
await new Promise<void>(resolve => {
setTimeout(() => {
resolve();
}, 1000);
});
// Robust image waiting logic
const waitForImages = async (container: HTMLElement) => {
const images: HTMLImageElement[] = [];
const view = container.ownerDocument.defaultView;
if (!view) return;
const findImages = (root: Node) => {
if (root instanceof view.HTMLImageElement) {
images.push(root);
}
if (
root instanceof view.HTMLElement ||
root instanceof view.ShadowRoot
) {
root.childNodes.forEach(findImages);
}
if (root instanceof view.HTMLElement && root.shadowRoot) {
findImages(root.shadowRoot);
}
};
findImages(container);
await Promise.all(
images.map(img => {
if (img.complete) {
if (img.naturalWidth === 0) {
console.warn('Image failed to load:', img.src);
}
return Promise.resolve();
}
return new Promise(resolve => {
img.onload = resolve;
img.onerror = resolve;
});
})
);
};
await waitForImages(importedRoot);
// browser may take some time to load font or other resources
await (doc.fonts?.ready ??
new Promise<void>(resolve => {
setTimeout(() => {
resolve();
}, 1000);
}));
iframe.contentWindow.onafterprint = async () => {
iframe.remove();

View File

@@ -7,6 +7,7 @@ import {
NotionIcon,
} from '@blocksuite/affine-components/icons';
import {
openDirectory,
openFilesWith,
openSingleFileWith,
} from '@blocksuite/affine-shared/utils';
@@ -18,11 +19,16 @@ import { query, state } from 'lit/decorators.js';
import { HtmlTransformer } from '../transformers/html.js';
import { MarkdownTransformer } from '../transformers/markdown.js';
import { NotionHtmlTransformer } from '../transformers/notion-html.js';
import { ObsidianTransformer } from '../transformers/obsidian.js';
import { styles } from './styles.js';
export type OnSuccessHandler = (
pageIds: string[],
options: { isWorkspaceFile: boolean; importedCount: number }
options: {
isWorkspaceFile: boolean;
importedCount: number;
docEmojis?: Map<string, string>;
}
) => void;
export type OnFailHandler = (message: string) => void;
@@ -140,6 +146,29 @@ export class ImportDoc extends WithDisposable(LitElement) {
});
}
private async _importObsidian() {
const files = await openDirectory();
if (!files || files.length === 0) return;
const needLoading =
files.reduce((acc, f) => acc + f.size, 0) > SHOW_LOADING_SIZE;
if (needLoading) {
this.hidden = false;
this._loading = true;
} else {
this.abortController.abort();
}
const { docIds, docEmojis } = await ObsidianTransformer.importObsidianVault(
{
collection: this.collection,
schema: this.schema,
importedFiles: files,
extensions: this.extensions,
}
);
needLoading && this.abortController.abort();
this._onImportSuccess(docIds, { docEmojis });
}
private _onCloseClick(event: MouseEvent) {
event.stopPropagation();
this.abortController.abort();
@@ -151,15 +180,21 @@ export class ImportDoc extends WithDisposable(LitElement) {
private _onImportSuccess(
pageIds: string[],
options: { isWorkspaceFile?: boolean; importedCount?: number } = {}
options: {
isWorkspaceFile?: boolean;
importedCount?: number;
docEmojis?: Map<string, string>;
} = {}
) {
const {
isWorkspaceFile = false,
importedCount: pagesImportedCount = pageIds.length,
docEmojis,
} = options;
this.onSuccess?.(pageIds, {
isWorkspaceFile,
importedCount: pagesImportedCount,
docEmojis,
});
}
@@ -258,6 +293,13 @@ export class ImportDoc extends WithDisposable(LitElement) {
</affine-tooltip>
</div>
</icon-button>
<icon-button
class="button-item"
text="Obsidian"
@click="${this._importObsidian}"
>
${ExportToMarkdownIcon}
</icon-button>
<icon-button class="button-item" text="Coming soon..." disabled>
${NewIcon}
</icon-button>

View File

@@ -2,6 +2,7 @@ export { DocxTransformer } from './docx.js';
export { HtmlTransformer } from './html.js';
export { MarkdownTransformer } from './markdown.js';
export { NotionHtmlTransformer } from './notion-html.js';
export { ObsidianTransformer } from './obsidian.js';
export { PdfTransformer } from './pdf.js';
export { createAssetsArchive, download } from './utils.js';
export { ZipTransformer } from './zip.js';

View File

@@ -21,8 +21,11 @@ import { extMimeMap, Transformer } from '@blocksuite/store';
import type { AssetMap, ImportedFileEntry, PathBlobIdMap } from './type.js';
import { createAssetsArchive, download, parseMatter, Unzip } from './utils.js';
type ParsedFrontmatterMeta = Partial<
Pick<DocMeta, 'title' | 'createDate' | 'updatedDate' | 'tags' | 'favorite'>
export type ParsedFrontmatterMeta = Partial<
Pick<
DocMeta,
'title' | 'createDate' | 'updatedDate' | 'tags' | 'favorite' | 'trash'
>
>;
const FRONTMATTER_KEYS = {
@@ -150,11 +153,18 @@ function buildMetaFromFrontmatter(
}
continue;
}
if (FRONTMATTER_KEYS.trash.includes(key)) {
const trash = parseBoolean(value);
if (trash !== undefined) {
meta.trash = trash;
}
continue;
}
}
return meta;
}
function parseFrontmatter(markdown: string): {
export function parseFrontmatter(markdown: string): {
content: string;
meta: ParsedFrontmatterMeta;
} {
@@ -176,7 +186,7 @@ function parseFrontmatter(markdown: string): {
}
}
function applyMetaPatch(
export function applyMetaPatch(
collection: Workspace,
docId: string,
meta: ParsedFrontmatterMeta
@@ -187,13 +197,14 @@ function applyMetaPatch(
if (meta.updatedDate !== undefined) metaPatch.updatedDate = meta.updatedDate;
if (meta.tags) metaPatch.tags = meta.tags;
if (meta.favorite !== undefined) metaPatch.favorite = meta.favorite;
if (meta.trash !== undefined) metaPatch.trash = meta.trash;
if (Object.keys(metaPatch).length) {
collection.meta.setDocMeta(docId, metaPatch);
}
}
function getProvider(extensions: ExtensionType[]) {
export function getProvider(extensions: ExtensionType[]) {
const container = new Container();
extensions.forEach(ext => {
ext.setup(container);
@@ -223,6 +234,103 @@ type ImportMarkdownZipOptions = {
extensions: ExtensionType[];
};
/**
* Filters hidden/system entries that should never participate in imports.
*/
export function isSystemImportPath(path: string) {
return path.includes('__MACOSX') || path.includes('.DS_Store');
}
/**
* Creates the doc CRUD bridge used by importer transformers.
*/
export function createCollectionDocCRUD(collection: Workspace) {
return {
create: (id: string) => collection.createDoc(id).getStore({ id }),
get: (id: string) => collection.getDoc(id)?.getStore({ id }) ?? null,
delete: (id: string) => collection.removeDoc(id),
};
}
type CreateMarkdownImportJobOptions = {
collection: Workspace;
schema: Schema;
preferredTitle?: string;
fullPath?: string;
};
/**
* Creates a markdown import job with the standard collection middlewares.
*/
export function createMarkdownImportJob({
collection,
schema,
preferredTitle,
fullPath,
}: CreateMarkdownImportJobOptions) {
return new Transformer({
schema,
blobCRUD: collection.blobSync,
docCRUD: createCollectionDocCRUD(collection),
middlewares: [
defaultImageProxyMiddleware,
fileNameMiddleware(preferredTitle),
docLinkBaseURLMiddleware(collection.id),
...(fullPath ? [filePathMiddleware(fullPath)] : []),
],
});
}
type StageImportedAssetOptions = {
pendingAssets: AssetMap;
pendingPathBlobIdMap: PathBlobIdMap;
path: string;
content: Blob;
fileName: string;
};
/**
* Hashes a non-markdown import file and stages it into the shared asset maps.
*/
export async function stageImportedAsset({
pendingAssets,
pendingPathBlobIdMap,
path,
content,
fileName,
}: StageImportedAssetOptions) {
const ext = path.split('.').at(-1) ?? '';
const mime = extMimeMap.get(ext.toLowerCase()) ?? '';
const key = await sha(await content.arrayBuffer());
pendingPathBlobIdMap.set(path, key);
pendingAssets.set(key, new File([content], fileName, { type: mime }));
}
/**
* Binds previously staged asset files into a transformer job before import.
*/
export function bindImportedAssetsToJob(
job: Transformer,
pendingAssets: AssetMap,
pendingPathBlobIdMap: PathBlobIdMap
) {
const pathBlobIdMap = job.assetsManager.getPathBlobIdMap();
// Iterate over all assets to be imported
for (const [assetPath, key] of pendingPathBlobIdMap.entries()) {
// Get the relative path of the asset to the markdown file
// Store the path to blobId map
pathBlobIdMap.set(assetPath, key);
// Store the asset to assets, the key is the blobId, the value is the file object
// In block adapter, it will use the blobId to get the file object
const assetFile = pendingAssets.get(key);
if (assetFile) {
job.assets.set(key, assetFile);
}
}
return pathBlobIdMap;
}
/**
* Exports a doc to a Markdown file or a zip archive containing Markdown and assets.
* @param doc The doc to export
@@ -329,19 +437,10 @@ async function importMarkdownToDoc({
const { content, meta } = parseFrontmatter(markdown);
const preferredTitle = meta.title ?? fileName;
const provider = getProvider(extensions);
const job = new Transformer({
const job = createMarkdownImportJob({
collection,
schema,
blobCRUD: collection.blobSync,
docCRUD: {
create: (id: string) => collection.createDoc(id).getStore({ id }),
get: (id: string) => collection.getDoc(id)?.getStore({ id }) ?? null,
delete: (id: string) => collection.removeDoc(id),
},
middlewares: [
defaultImageProxyMiddleware,
fileNameMiddleware(preferredTitle),
docLinkBaseURLMiddleware(collection.id),
],
preferredTitle,
});
const mdAdapter = new MarkdownAdapter(job, provider);
const page = await mdAdapter.toDoc({
@@ -381,7 +480,7 @@ async function importMarkdownZip({
// Iterate over all files in the zip
for (const { path, content: blob } of unzip) {
// Skip the files that are not markdown files
if (path.includes('__MACOSX') || path.includes('.DS_Store')) {
if (isSystemImportPath(path)) {
continue;
}
@@ -395,12 +494,13 @@ async function importMarkdownZip({
fullPath: path,
});
} else {
// If the file is not a markdown file, store it to pendingAssets
const ext = path.split('.').at(-1) ?? '';
const mime = extMimeMap.get(ext) ?? '';
const key = await sha(await blob.arrayBuffer());
pendingPathBlobIdMap.set(path, key);
pendingAssets.set(key, new File([blob], fileName, { type: mime }));
await stageImportedAsset({
pendingAssets,
pendingPathBlobIdMap,
path,
content: blob,
fileName,
});
}
}
@@ -411,34 +511,13 @@ async function importMarkdownZip({
const markdown = await contentBlob.text();
const { content, meta } = parseFrontmatter(markdown);
const preferredTitle = meta.title ?? fileNameWithoutExt;
const job = new Transformer({
const job = createMarkdownImportJob({
collection,
schema,
blobCRUD: collection.blobSync,
docCRUD: {
create: (id: string) => collection.createDoc(id).getStore({ id }),
get: (id: string) => collection.getDoc(id)?.getStore({ id }) ?? null,
delete: (id: string) => collection.removeDoc(id),
},
middlewares: [
defaultImageProxyMiddleware,
fileNameMiddleware(preferredTitle),
docLinkBaseURLMiddleware(collection.id),
filePathMiddleware(fullPath),
],
preferredTitle,
fullPath,
});
const assets = job.assets;
const pathBlobIdMap = job.assetsManager.getPathBlobIdMap();
// Iterate over all assets to be imported
for (const [assetPath, key] of pendingPathBlobIdMap.entries()) {
// Get the relative path of the asset to the markdown file
// Store the path to blobId map
pathBlobIdMap.set(assetPath, key);
// Store the asset to assets, the key is the blobId, the value is the file object
// In block adapter, it will use the blobId to get the file object
if (pendingAssets.get(key)) {
assets.set(key, pendingAssets.get(key)!);
}
}
bindImportedAssetsToJob(job, pendingAssets, pendingPathBlobIdMap);
const mdAdapter = new MarkdownAdapter(job, provider);
const doc = await mdAdapter.toDoc({

View File

@@ -0,0 +1,732 @@
import { FootNoteReferenceParamsSchema } from '@blocksuite/affine-model';
import {
BlockMarkdownAdapterExtension,
createAttachmentBlockSnapshot,
FULL_FILE_PATH_KEY,
getImageFullPath,
MarkdownAdapter,
type MarkdownAST,
MarkdownASTToDeltaExtension,
normalizeFilePathReference,
} from '@blocksuite/affine-shared/adapters';
import type { AffineTextAttributes } from '@blocksuite/affine-shared/types';
import type {
DeltaInsert,
ExtensionType,
Schema,
Workspace,
} from '@blocksuite/store';
import { extMimeMap, nanoid } from '@blocksuite/store';
import type { Html, Text } from 'mdast';
import {
applyMetaPatch,
bindImportedAssetsToJob,
createMarkdownImportJob,
getProvider,
isSystemImportPath,
parseFrontmatter,
stageImportedAsset,
} from './markdown.js';
import type {
AssetMap,
MarkdownFileImportEntry,
PathBlobIdMap,
} from './type.js';
const CALLOUT_TYPE_MAP: Record<string, string> = {
note: '💡',
info: 'ℹ️',
tip: '🔥',
hint: '✅',
important: '‼️',
warning: '⚠️',
caution: '⚠️',
attention: '⚠️',
danger: '⚠️',
error: '🚨',
bug: '🐛',
example: '📌',
quote: '💬',
cite: '💬',
abstract: '📋',
summary: '📋',
todo: '☑️',
success: '✅',
check: '✅',
done: '✅',
failure: '❌',
fail: '❌',
missing: '❌',
question: '❓',
help: '❓',
faq: '❓',
};
const AMBIGUOUS_PAGE_LOOKUP = '__ambiguous__';
const DEFAULT_CALLOUT_EMOJI = '💡';
const OBSIDIAN_TEXT_FOOTNOTE_URL_PREFIX = 'data:text/plain;charset=utf-8,';
const OBSIDIAN_ATTACHMENT_EMBED_TAG = 'obsidian-attachment';
function normalizeLookupKey(value: string): string {
return normalizeFilePathReference(value).toLowerCase();
}
function stripMarkdownExtension(value: string): string {
return value.replace(/\.md$/i, '');
}
function basename(value: string): string {
return normalizeFilePathReference(value).split('/').pop() ?? value;
}
function parseObsidianTarget(rawTarget: string): {
path: string;
fragment: string | null;
} {
const normalizedTarget = normalizeFilePathReference(rawTarget);
const match = normalizedTarget.match(/^([^#^]+)([#^].*)?$/);
return {
path: match?.[1]?.trim() ?? normalizedTarget,
fragment: match?.[2] ?? null,
};
}
function extractTitleAndEmoji(rawTitle: string): {
title: string;
emoji: string | null;
} {
const SINGLE_LEADING_EMOJI_RE =
/^[\s\u200b]*((?:[\p{Emoji_Presentation}\p{Extended_Pictographic}\u200b]|\u200d|\ufe0f)+)/u;
let currentTitle = rawTitle;
let extractedEmojiClusters = '';
let emojiMatch;
while ((emojiMatch = currentTitle.match(SINGLE_LEADING_EMOJI_RE))) {
const matchedCluster = emojiMatch[1].trim();
extractedEmojiClusters +=
(extractedEmojiClusters ? ' ' : '') + matchedCluster;
currentTitle = currentTitle.slice(emojiMatch[0].length);
}
return {
title: currentTitle.trim(),
emoji: extractedEmojiClusters || null,
};
}
function preprocessTitleHeader(markdown: string): string {
return markdown.replace(
/^(\s*#\s+)(.*)$/m,
(_, headerPrefix, titleContent) => {
const { title: cleanTitle } = extractTitleAndEmoji(titleContent);
return `${headerPrefix}${cleanTitle}`;
}
);
}
function preprocessObsidianCallouts(markdown: string): string {
return markdown.replace(
/^(> *)\[!([^\]\n]+)\]([+-]?)([^\n]*)/gm,
(_, prefix, type, _fold, rest) => {
const calloutToken =
CALLOUT_TYPE_MAP[type.trim().toLowerCase()] ?? DEFAULT_CALLOUT_EMOJI;
const title = rest.trim();
return title
? `${prefix}[!${calloutToken}] ${title}`
: `${prefix}[!${calloutToken}]`;
}
);
}
function isStructuredFootnoteDefinition(content: string): boolean {
try {
return FootNoteReferenceParamsSchema.safeParse(JSON.parse(content.trim()))
.success;
} catch {
return false;
}
}
function splitFootnoteTextContent(content: string): {
title: string;
description?: string;
} {
const lines = content
.split('\n')
.map(line => line.trim())
.filter(Boolean);
const title = lines[0] ?? content.trim();
const description = lines.slice(1).join('\n').trim();
return {
title,
...(description ? { description } : {}),
};
}
function createTextFootnoteDefinition(content: string): string {
const normalizedContent = content.trim();
const { title, description } = splitFootnoteTextContent(normalizedContent);
return JSON.stringify({
type: 'url',
url: encodeURIComponent(
`${OBSIDIAN_TEXT_FOOTNOTE_URL_PREFIX}${encodeURIComponent(
normalizedContent
)}`
),
title,
...(description ? { description } : {}),
});
}
function extractObsidianFootnotes(markdown: string): {
content: string;
footnotes: string[];
} {
const lines = markdown.split('\n');
const output: string[] = [];
const footnotes: string[] = [];
for (let index = 0; index < lines.length; index += 1) {
const line = lines[index];
const match = line.match(/^\[\^([^\]]+)\]:\s*(.*)$/);
if (!match) {
output.push(line);
continue;
}
const identifier = match[1];
const contentLines = [match[2]];
while (index + 1 < lines.length) {
const nextLine = lines[index + 1];
if (/^(?: {1,4}|\t)/.test(nextLine)) {
contentLines.push(nextLine.replace(/^(?: {1,4}|\t)/, ''));
index += 1;
continue;
}
if (
nextLine.trim() === '' &&
index + 2 < lines.length &&
/^(?: {1,4}|\t)/.test(lines[index + 2])
) {
contentLines.push('');
index += 1;
continue;
}
break;
}
const content = contentLines.join('\n').trim();
footnotes.push(
`[^${identifier}]: ${
!content || isStructuredFootnoteDefinition(content)
? content
: createTextFootnoteDefinition(content)
}`
);
}
return { content: output.join('\n'), footnotes };
}
function buildLookupKeys(
targetPath: string,
currentFilePath?: string
): string[] {
const parsedTargetPath = normalizeFilePathReference(targetPath);
if (!parsedTargetPath) {
return [];
}
const keys = new Set<string>();
const addPathVariants = (value: string) => {
const normalizedValue = normalizeFilePathReference(value);
if (!normalizedValue) {
return;
}
keys.add(normalizedValue);
keys.add(stripMarkdownExtension(normalizedValue));
const fileName = basename(normalizedValue);
keys.add(fileName);
keys.add(stripMarkdownExtension(fileName));
const cleanTitle = extractTitleAndEmoji(
stripMarkdownExtension(fileName)
).title;
if (cleanTitle) {
keys.add(cleanTitle);
}
};
addPathVariants(parsedTargetPath);
if (currentFilePath) {
addPathVariants(getImageFullPath(currentFilePath, parsedTargetPath));
}
return Array.from(keys).map(normalizeLookupKey);
}
function registerPageLookup(
pageLookupMap: Map<string, string>,
key: string,
pageId: string
) {
const normalizedKey = normalizeLookupKey(key);
if (!normalizedKey) {
return;
}
const existing = pageLookupMap.get(normalizedKey);
if (existing && existing !== pageId) {
pageLookupMap.set(normalizedKey, AMBIGUOUS_PAGE_LOOKUP);
return;
}
pageLookupMap.set(normalizedKey, pageId);
}
function resolvePageIdFromLookup(
pageLookupMap: Pick<ReadonlyMap<string, string>, 'get'>,
rawTarget: string,
currentFilePath?: string
): string | null {
const { path } = parseObsidianTarget(rawTarget);
for (const key of buildLookupKeys(path, currentFilePath)) {
const targetPageId = pageLookupMap.get(key);
if (!targetPageId || targetPageId === AMBIGUOUS_PAGE_LOOKUP) {
continue;
}
return targetPageId;
}
return null;
}
function resolveWikilinkDisplayTitle(
rawAlias: string | undefined,
pageEmoji: string | undefined
): string | undefined {
if (!rawAlias) {
return undefined;
}
const { title: aliasTitle, emoji: aliasEmoji } =
extractTitleAndEmoji(rawAlias);
if (aliasEmoji && aliasEmoji === pageEmoji) {
return aliasTitle;
}
return rawAlias;
}
function isImageAssetPath(path: string): boolean {
const extension = path.split('.').at(-1)?.toLowerCase() ?? '';
return extMimeMap.get(extension)?.startsWith('image/') ?? false;
}
function encodeMarkdownPath(path: string): string {
return encodeURI(path).replaceAll('(', '%28').replaceAll(')', '%29');
}
function escapeMarkdownLabel(label: string): string {
return label.replace(/[[\]\\]/g, '\\$&');
}
function isObsidianSizeAlias(alias: string | undefined): boolean {
return !!alias && /^\d+(?:x\d+)?$/i.test(alias.trim());
}
function getEmbedLabel(
rawAlias: string | undefined,
targetPath: string,
fallbackToFileName: boolean
): string {
if (!rawAlias || isObsidianSizeAlias(rawAlias)) {
return fallbackToFileName
? stripMarkdownExtension(basename(targetPath))
: '';
}
return rawAlias.trim();
}
type ObsidianAttachmentEmbed = {
blobId: string;
fileName: string;
fileType: string;
};
function createObsidianAttach(embed: ObsidianAttachmentEmbed): string {
return `<!-- ${OBSIDIAN_ATTACHMENT_EMBED_TAG} ${encodeURIComponent(
JSON.stringify(embed)
)} -->`;
}
function parseObsidianAttach(value: string): ObsidianAttachmentEmbed | null {
const match = value.match(
new RegExp(`^<!-- ${OBSIDIAN_ATTACHMENT_EMBED_TAG} ([^ ]+) -->$`)
);
if (!match?.[1]) return null;
try {
const parsed = JSON.parse(
decodeURIComponent(match[1])
) as ObsidianAttachmentEmbed;
if (!parsed.blobId || !parsed.fileName) {
return null;
}
return parsed;
} catch {
return null;
}
}
function preprocessObsidianEmbeds(
markdown: string,
filePath: string,
pageLookupMap: ReadonlyMap<string, string>,
pathBlobIdMap: ReadonlyMap<string, string>
): string {
return markdown.replace(
/!\[\[([^\]|]+)(?:\|([^\]]+))?\]\]/g,
(match, rawTarget: string, rawAlias?: string) => {
const targetPageId = resolvePageIdFromLookup(
pageLookupMap,
rawTarget,
filePath
);
if (targetPageId) {
return `[[${rawTarget}${rawAlias ? `|${rawAlias}` : ''}]]`;
}
const { path } = parseObsidianTarget(rawTarget);
if (!path) {
return match;
}
const assetPath = getImageFullPath(filePath, path);
const encodedPath = encodeMarkdownPath(assetPath);
if (isImageAssetPath(path)) {
const alt = getEmbedLabel(rawAlias, path, false);
return `![${escapeMarkdownLabel(alt)}](${encodedPath})`;
}
const label = getEmbedLabel(rawAlias, path, true);
const blobId = pathBlobIdMap.get(assetPath);
if (!blobId) return `[${escapeMarkdownLabel(label)}](${encodedPath})`;
const extension = path.split('.').at(-1)?.toLowerCase() ?? '';
return createObsidianAttach({
blobId,
fileName: basename(path),
fileType: extMimeMap.get(extension) ?? '',
});
}
);
}
function preprocessObsidianMarkdown(
markdown: string,
filePath: string,
pageLookupMap: ReadonlyMap<string, string>,
pathBlobIdMap: ReadonlyMap<string, string>
): string {
const { content: contentWithoutFootnotes, footnotes: extractedFootnotes } =
extractObsidianFootnotes(markdown);
const content = preprocessObsidianEmbeds(
contentWithoutFootnotes,
filePath,
pageLookupMap,
pathBlobIdMap
);
const normalizedMarkdown = preprocessTitleHeader(
preprocessObsidianCallouts(content)
);
if (extractedFootnotes.length === 0) {
return normalizedMarkdown;
}
const trimmedMarkdown = normalizedMarkdown.replace(/\s+$/, '');
return `${trimmedMarkdown}\n\n${extractedFootnotes.join('\n\n')}\n`;
}
function isObsidianAttachmentEmbedNode(node: MarkdownAST): node is Html {
return node.type === 'html' && !!parseObsidianAttach(node.value);
}
export const obsidianAttachmentEmbedMarkdownAdapterMatcher =
BlockMarkdownAdapterExtension({
flavour: 'obsidian:attachment-embed',
toMatch: o => isObsidianAttachmentEmbedNode(o.node),
fromMatch: () => false,
toBlockSnapshot: {
enter: (o, context) => {
if (!isObsidianAttachmentEmbedNode(o.node)) {
return;
}
const attachment = parseObsidianAttach(o.node.value);
if (!attachment) {
return;
}
const assetFile = context.assets?.getAssets().get(attachment.blobId);
context.walkerContext
.openNode(
createAttachmentBlockSnapshot({
id: nanoid(),
props: {
name: attachment.fileName,
size: assetFile?.size ?? 0,
type:
attachment.fileType ||
assetFile?.type ||
'application/octet-stream',
sourceId: attachment.blobId,
embed: false,
style: 'horizontalThin',
footnoteIdentifier: null,
},
}),
'children'
)
.closeNode();
(o.node as unknown as { type: string }).type =
'obsidianAttachmentEmbed';
},
},
fromBlockSnapshot: {},
});
export const obsidianWikilinkToDeltaMatcher = MarkdownASTToDeltaExtension({
name: 'obsidian-wikilink',
match: ast => ast.type === 'text',
toDelta: (ast, context) => {
const textNode = ast as Text;
if (!textNode.value) {
return [];
}
const nodeContent = textNode.value;
const wikilinkRegex = /\[\[([^\]|]+)(?:\|([^\]]+))?\]\]/g;
const deltas: DeltaInsert<AffineTextAttributes>[] = [];
let lastProcessedIndex = 0;
let linkMatch;
while ((linkMatch = wikilinkRegex.exec(nodeContent)) !== null) {
if (linkMatch.index > lastProcessedIndex) {
deltas.push({
insert: nodeContent.substring(lastProcessedIndex, linkMatch.index),
});
}
const targetPageName = linkMatch[1].trim();
const alias = linkMatch[2]?.trim();
const currentFilePath = context.configs.get(FULL_FILE_PATH_KEY);
const targetPageId = resolvePageIdFromLookup(
{ get: key => context.configs.get(`obsidian:pageId:${key}`) },
targetPageName,
typeof currentFilePath === 'string' ? currentFilePath : undefined
);
if (targetPageId) {
const pageEmoji = context.configs.get(
'obsidian:pageEmoji:' + targetPageId
);
const displayTitle = resolveWikilinkDisplayTitle(alias, pageEmoji);
deltas.push({
insert: ' ',
attributes: {
reference: {
type: 'LinkedPage',
pageId: targetPageId,
...(displayTitle ? { title: displayTitle } : {}),
},
},
});
} else {
deltas.push({ insert: linkMatch[0] });
}
lastProcessedIndex = wikilinkRegex.lastIndex;
}
if (lastProcessedIndex < nodeContent.length) {
deltas.push({ insert: nodeContent.substring(lastProcessedIndex) });
}
return deltas;
},
});
export type ImportObsidianVaultOptions = {
collection: Workspace;
schema: Schema;
importedFiles: File[];
extensions: ExtensionType[];
};
export type ImportObsidianVaultResult = {
docIds: string[];
docEmojis: Map<string, string>;
};
export async function importObsidianVault({
collection,
schema,
importedFiles,
extensions,
}: ImportObsidianVaultOptions): Promise<ImportObsidianVaultResult> {
const provider = getProvider([
obsidianWikilinkToDeltaMatcher,
obsidianAttachmentEmbedMarkdownAdapterMatcher,
...extensions,
]);
const docIds: string[] = [];
const docEmojis = new Map<string, string>();
const pendingAssets: AssetMap = new Map();
const pendingPathBlobIdMap: PathBlobIdMap = new Map();
const markdownBlobs: MarkdownFileImportEntry[] = [];
const pageLookupMap = new Map<string, string>();
for (const file of importedFiles) {
const filePath = file.webkitRelativePath || file.name;
if (isSystemImportPath(filePath)) continue;
if (file.name.endsWith('.md')) {
const fileNameWithoutExt = file.name.replace(/\.[^/.]+$/, '');
const markdown = await file.text();
const { content, meta } = parseFrontmatter(markdown);
const documentTitleCandidate = meta.title ?? fileNameWithoutExt;
const { title: preferredTitle, emoji: leadingEmoji } =
extractTitleAndEmoji(documentTitleCandidate);
const newPageId = collection.idGenerator();
registerPageLookup(pageLookupMap, filePath, newPageId);
registerPageLookup(
pageLookupMap,
stripMarkdownExtension(filePath),
newPageId
);
registerPageLookup(pageLookupMap, file.name, newPageId);
registerPageLookup(pageLookupMap, fileNameWithoutExt, newPageId);
registerPageLookup(pageLookupMap, documentTitleCandidate, newPageId);
registerPageLookup(pageLookupMap, preferredTitle, newPageId);
if (leadingEmoji) {
docEmojis.set(newPageId, leadingEmoji);
}
markdownBlobs.push({
filename: file.name,
contentBlob: file,
fullPath: filePath,
pageId: newPageId,
preferredTitle,
content,
meta,
});
} else {
await stageImportedAsset({
pendingAssets,
pendingPathBlobIdMap,
path: filePath,
content: file,
fileName: file.name,
});
}
}
for (const existingDocMeta of collection.meta.docMetas) {
if (existingDocMeta.title) {
registerPageLookup(
pageLookupMap,
existingDocMeta.title,
existingDocMeta.id
);
}
}
await Promise.all(
markdownBlobs.map(async markdownFile => {
const {
fullPath,
pageId: predefinedId,
preferredTitle,
content,
meta,
} = markdownFile;
const job = createMarkdownImportJob({
collection,
schema,
preferredTitle,
fullPath,
});
for (const [lookupKey, id] of pageLookupMap.entries()) {
if (id === AMBIGUOUS_PAGE_LOOKUP) {
continue;
}
job.adapterConfigs.set(`obsidian:pageId:${lookupKey}`, id);
}
for (const [id, emoji] of docEmojis.entries()) {
job.adapterConfigs.set('obsidian:pageEmoji:' + id, emoji);
}
const pathBlobIdMap = bindImportedAssetsToJob(
job,
pendingAssets,
pendingPathBlobIdMap
);
const preprocessedMarkdown = preprocessObsidianMarkdown(
content,
fullPath,
pageLookupMap,
pathBlobIdMap
);
const mdAdapter = new MarkdownAdapter(job, provider);
const snapshot = await mdAdapter.toDocSnapshot({
file: preprocessedMarkdown,
assets: job.assetsManager,
});
if (snapshot) {
snapshot.meta.id = predefinedId;
const doc = await job.snapshotToDoc(snapshot);
if (doc) {
applyMetaPatch(collection, doc.id, {
...meta,
title: preferredTitle,
trash: false,
});
docIds.push(doc.id);
}
}
})
);
return { docIds, docEmojis };
}
export const ObsidianTransformer = {
importObsidianVault,
};
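The callout preprocessing in this file is a pure string transform, so its behavior is easy to check in isolation. A minimal standalone sketch follows, with `CALLOUT_TYPE_MAP` trimmed to three entries for illustration; the regex and the replacement logic are copied verbatim from `preprocessObsidianCallouts` above:

```typescript
// Standalone sketch of the Obsidian callout normalization shown in this diff.
// The map here is a trimmed illustration; the full map lives in the import code.
const CALLOUT_TYPE_MAP: Record<string, string> = {
  note: '💡',
  warning: '⚠️',
  error: '🚨',
};
const DEFAULT_CALLOUT_EMOJI = '💡';

function preprocessObsidianCallouts(markdown: string): string {
  return markdown.replace(
    /^(> *)\[!([^\]\n]+)\]([+-]?)([^\n]*)/gm,
    (_: string, prefix: string, type: string, _fold: string, rest: string) => {
      // Unknown callout types fall back to the default emoji; the optional
      // +/- fold marker is captured but intentionally dropped.
      const calloutToken =
        CALLOUT_TYPE_MAP[type.trim().toLowerCase()] ?? DEFAULT_CALLOUT_EMOJI;
      const title = rest.trim();
      return title
        ? `${prefix}[!${calloutToken}] ${title}`
        : `${prefix}[!${calloutToken}]`;
    }
  );
}

console.log(preprocessObsidianCallouts('> [!warning] Be careful'));
console.log(preprocessObsidianCallouts('> [!unknown]- Folded'));
```

`> [!warning] Be careful` becomes `> [!⚠️] Be careful`, and an unknown type such as `> [!unknown]- Folded` collapses to the default emoji with the fold marker removed.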

View File

@@ -1,3 +1,5 @@
import type { ParsedFrontmatterMeta } from './markdown.js';
/**
* Represents an imported file entry in the zip archive
*/
@@ -10,6 +12,13 @@ export type ImportedFileEntry = {
fullPath: string;
};
export type MarkdownFileImportEntry = ImportedFileEntry & {
pageId: string;
preferredTitle: string;
content: string;
meta: ParsedFrontmatterMeta;
};
/**
* Map of asset hash to File object for all media files in the zip
* Key: SHA hash of the file content (blobId)

View File

@@ -162,10 +162,11 @@ export class AffineToolbarWidget extends WidgetComponent {
}
setReferenceElementWithElements(gfx: GfxController, elements: GfxModel[]) {
+const surfaceBounds = getCommonBoundWithRotation(elements);

const getBoundingClientRect = () => {
-const bounds = getCommonBoundWithRotation(elements);
const { x: offsetX, y: offsetY } = this.getBoundingClientRect();
-const [x, y, w, h] = gfx.viewport.toViewBound(bounds).toXYWH();
+const [x, y, w, h] = gfx.viewport.toViewBound(surfaceBounds).toXYWH();
const rect = new DOMRect(x + offsetX, y + offsetY, w, h);
return rect;
};

View File

@@ -103,8 +103,9 @@ export abstract class GfxPrimitiveElementModel<
}
get deserializedXYWH() {
-if (!this._lastXYWH || this.xywh !== this._lastXYWH) {
-const xywh = this.xywh;
+const xywh = this.xywh;
+if (!this._lastXYWH || xywh !== this._lastXYWH) {
this._local.set('deserializedXYWH', deserializeXYWH(xywh));
this._lastXYWH = xywh;
}
@@ -386,6 +387,8 @@ export abstract class GfxGroupLikeElementModel<
{
private _childIds: string[] = [];
private _xywhDirty = true;
private readonly _mutex = createMutex();
abstract children: Y.Map<any>;
@@ -420,24 +423,9 @@ export abstract class GfxGroupLikeElementModel<
get xywh() {
this._mutex(() => {
-const curXYWH =
-(this._local.get('xywh') as SerializedXYWH) ?? '[0,0,0,0]';
-const newXYWH = this._getXYWH().serialize();
-if (curXYWH !== newXYWH || !this._local.has('xywh')) {
-this._local.set('xywh', newXYWH);
-if (curXYWH !== newXYWH) {
-this._onChange({
-props: {
-xywh: newXYWH,
-},
-oldValues: {
-xywh: curXYWH,
-},
-local: true,
-});
-}
+if (this._xywhDirty || !this._local.has('xywh')) {
+this._local.set('xywh', this._getXYWH().serialize());
+this._xywhDirty = false;
}
});
@@ -457,15 +445,41 @@ export abstract class GfxGroupLikeElementModel<
bound = bound ? bound.unite(child.elementBound) : child.elementBound;
});
if (bound) {
this._local.set('xywh', bound.serialize());
} else {
this._local.delete('xywh');
}
return bound ?? new Bound(0, 0, 0, 0);
}
invalidateXYWH() {
this._xywhDirty = true;
this._local.delete('deserializedXYWH');
}
refreshXYWH(local: boolean) {
this._mutex(() => {
const oldXYWH =
(this._local.get('xywh') as SerializedXYWH) ?? '[0,0,0,0]';
const nextXYWH = this._getXYWH().serialize();
this._xywhDirty = false;
if (oldXYWH === nextXYWH && this._local.has('xywh')) {
return;
}
this._local.set('xywh', nextXYWH);
this._local.delete('deserializedXYWH');
this._onChange({
props: {
xywh: nextXYWH,
},
oldValues: {
xywh: oldXYWH,
},
local,
});
});
}
abstract addChild(element: GfxModel): void;
/**
@@ -496,6 +510,7 @@ export abstract class GfxGroupLikeElementModel<
setChildIds(value: string[], fromLocal: boolean) {
const oldChildIds = this.childIds;
this._childIds = value;
this.invalidateXYWH();
this._onChange({
props: {

View File

@@ -52,6 +52,12 @@ export type MiddlewareCtx = {
export type SurfaceMiddleware = (ctx: MiddlewareCtx) => void;
export class SurfaceBlockModel extends BlockModel<SurfaceBlockProps> {
private static readonly _groupBoundImpactKeys = new Set([
'xywh',
'rotate',
'hidden',
]);
protected _decoratorState = createDecoratorState();
protected _elementCtorMap: Record<
@@ -308,6 +314,42 @@ export class SurfaceBlockModel extends BlockModel<SurfaceBlockProps> {
Object.keys(payload.props).forEach(key => {
model.propsUpdated.next({ key });
});
this._refreshParentGroupBoundsForElement(model, payload);
}
private _refreshParentGroupBounds(id: string, local: boolean) {
const group = this.getGroup(id);
if (group instanceof GfxGroupLikeElementModel) {
group.refreshXYWH(local);
}
}
private _refreshParentGroupBoundsForElement(
model: GfxPrimitiveElementModel,
payload: ElementUpdatedData
) {
if (
model instanceof GfxGroupLikeElementModel &&
('childIds' in payload.props || 'childIds' in payload.oldValues)
) {
model.refreshXYWH(payload.local);
return;
}
const affectedKeys = new Set([
...Object.keys(payload.props),
...Object.keys(payload.oldValues),
]);
if (
Array.from(affectedKeys).some(key =>
SurfaceBlockModel._groupBoundImpactKeys.has(key)
)
) {
this._refreshParentGroupBounds(model.id, payload.local);
}
}
private _initElementModels() {
@@ -458,6 +500,10 @@ export class SurfaceBlockModel extends BlockModel<SurfaceBlockProps> {
);
}
if (payload.model instanceof BlockModel) {
this._refreshParentGroupBounds(payload.id, payload.isLocal);
}
break;
case 'delete':
if (isGfxGroupCompatibleModel(payload.model)) {
@@ -482,6 +528,13 @@ export class SurfaceBlockModel extends BlockModel<SurfaceBlockProps> {
}
}
if (
payload.props.key &&
SurfaceBlockModel._groupBoundImpactKeys.has(payload.props.key)
) {
this._refreshParentGroupBounds(payload.id, payload.isLocal);
}
break;
}
});

View File

@@ -1,9 +1,8 @@
import type { MindMapView } from '@blocksuite/affine/gfx/mindmap';
import { mountShapeTextEditor } from '@blocksuite/affine/gfx/shape';
import { LayoutType, type MindmapElementModel } from '@blocksuite/affine-model';
import { Bound } from '@blocksuite/global/gfx';
import type { GfxController } from '@blocksuite/std/gfx';
-import { beforeEach, describe, expect, test, vi } from 'vitest';
+import { beforeEach, describe, expect, test } from 'vitest';
import { click, pointermove, wait } from '../utils/common.js';
import { getDocRootBlock } from '../utils/edgeless.js';
@@ -37,39 +36,6 @@ describe('mindmap', () => {
return cleanup;
});
test('should update mindmap node editor size on compositionupdate', async () => {
const mindmapId = gfx.surface!.addElement({
type: 'mindmap',
children: {
text: 'root',
},
});
const mindmap = () => gfx.getElementById(mindmapId) as MindmapElementModel;
const root = getDocRootBlock(window.doc, window.editor, 'edgeless');
const rootNode = mindmap().tree.element;
mountShapeTextEditor(rootNode, root);
await wait();
const shapeEditor = root.querySelector('edgeless-shape-text-editor') as
| (HTMLElement & { inlineEditorContainer?: HTMLElement })
| null;
expect(shapeEditor).not.toBeNull();
const updateSpy = vi.spyOn(shapeEditor as any, '_updateElementWH');
const compositionUpdate = new CompositionEvent('compositionupdate', {
data: '拼',
bubbles: true,
});
shapeEditor!.inlineEditorContainer?.dispatchEvent(compositionUpdate);
expect(updateSpy).toHaveBeenCalled();
});
test('delete the root node should remove all children', async () => {
const tree = {
text: 'root',

View File

@@ -4,6 +4,7 @@ import type {
ConnectorElementModel,
GroupElementModel,
} from '@blocksuite/affine/model';
import { serializeXYWH } from '@blocksuite/global/gfx';
import { beforeEach, describe, expect, test } from 'vitest';
import { wait } from '../utils/common.js';
@@ -138,6 +139,29 @@ describe('group', () => {
expect(group.childIds).toEqual([id]);
});
test('group xywh should update when child xywh changes', () => {
const shapeId = model.addElement({
type: 'shape',
xywh: serializeXYWH(0, 0, 100, 100),
});
const groupId = model.addElement({
type: 'group',
children: {
[shapeId]: true,
},
});
const group = model.getElementById(groupId) as GroupElementModel;
expect(group.xywh).toBe(serializeXYWH(0, 0, 100, 100));
model.updateElement(shapeId, {
xywh: serializeXYWH(50, 60, 100, 100),
});
expect(group.xywh).toBe(serializeXYWH(50, 60, 100, 100));
});
});
describe('connector', () => {

Binary file not shown. (Before: 24 KiB, After: 25 KiB)

Binary file not shown. (Before: 24 KiB, After: 25 KiB)

View File

@@ -1,12 +1,35 @@
import test from 'ava';
import { z } from 'zod';
import type { DocReader } from '../../core/doc';
import type { AccessController } from '../../core/permission';
import type { Models } from '../../models';
import { NativeLlmRequest, NativeLlmStreamEvent } from '../../native';
import {
ToolCallAccumulator,
ToolCallLoop,
ToolSchemaExtractor,
} from '../../plugins/copilot/providers/loop';
import {
buildBlobContentGetter,
createBlobReadTool,
} from '../../plugins/copilot/tools/blob-read';
import {
buildDocKeywordSearchGetter,
createDocKeywordSearchTool,
} from '../../plugins/copilot/tools/doc-keyword-search';
import {
buildDocContentGetter,
createDocReadTool,
} from '../../plugins/copilot/tools/doc-read';
import {
buildDocSearchGetter,
createDocSemanticSearchTool,
} from '../../plugins/copilot/tools/doc-semantic-search';
import {
DOCUMENT_SYNC_PENDING_MESSAGE,
LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
} from '../../plugins/copilot/tools/doc-sync';
test('ToolCallAccumulator should merge deltas and complete tool call', t => {
const accumulator = new ToolCallAccumulator();
@@ -286,3 +309,210 @@ test('ToolCallLoop should surface invalid JSON as tool error without executing',
is_error: true,
});
});
test('doc_read should return specific sync errors for unavailable docs', async t => {
const cases = [
{
name: 'local workspace without cloud sync',
workspace: null,
authors: null,
markdown: null,
expected: {
type: 'error',
name: 'Workspace Sync Required',
message: LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
},
docReaderCalled: false,
},
{
name: 'cloud workspace document not synced to server yet',
workspace: { id: 'ws-1' },
authors: null,
markdown: null,
expected: {
type: 'error',
name: 'Document Sync Pending',
message: DOCUMENT_SYNC_PENDING_MESSAGE('doc-1'),
},
docReaderCalled: false,
},
{
name: 'cloud workspace document markdown not ready yet',
workspace: { id: 'ws-1' },
authors: {
createdAt: new Date('2026-01-01T00:00:00.000Z'),
updatedAt: new Date('2026-01-01T00:00:00.000Z'),
createdByUser: null,
updatedByUser: null,
},
markdown: null,
expected: {
type: 'error',
name: 'Document Sync Pending',
message: DOCUMENT_SYNC_PENDING_MESSAGE('doc-1'),
},
docReaderCalled: true,
},
] as const;
const ac = {
user: () => ({
workspace: () => ({ doc: () => ({ can: async () => true }) }),
}),
} as unknown as AccessController;
for (const testCase of cases) {
let docReaderCalled = false;
const docReader = {
getDocMarkdown: async () => {
docReaderCalled = true;
return testCase.markdown;
},
} as unknown as DocReader;
const models = {
workspace: {
get: async () => testCase.workspace,
},
doc: {
getAuthors: async () => testCase.authors,
},
} as unknown as Models;
const getDoc = buildDocContentGetter(ac, docReader, models);
const tool = createDocReadTool(
getDoc.bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const result = await tool.execute?.({ doc_id: 'doc-1' }, {});
t.is(docReaderCalled, testCase.docReaderCalled, testCase.name);
t.deepEqual(result, testCase.expected, testCase.name);
}
});
test('document search tools should return sync error for local workspace', async t => {
const ac = {
user: () => ({
workspace: () => ({
can: async () => true,
docs: async () => [],
}),
}),
} as unknown as AccessController;
const models = {
workspace: {
get: async () => null,
},
} as unknown as Models;
let keywordSearchCalled = false;
const indexerService = {
searchDocsByKeyword: async () => {
keywordSearchCalled = true;
return [];
},
} as unknown as Parameters<typeof buildDocKeywordSearchGetter>[1];
let semanticSearchCalled = false;
const contextService = {
matchWorkspaceAll: async () => {
semanticSearchCalled = true;
return [];
},
} as unknown as Parameters<typeof buildDocSearchGetter>[1];
const keywordTool = createDocKeywordSearchTool(
buildDocKeywordSearchGetter(ac, indexerService, models).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const semanticTool = createDocSemanticSearchTool(
buildDocSearchGetter(ac, contextService, null, models).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const keywordResult = await keywordTool.execute?.({ query: 'hello' }, {});
const semanticResult = await semanticTool.execute?.({ query: 'hello' }, {});
t.false(keywordSearchCalled);
t.false(semanticSearchCalled);
t.deepEqual(keywordResult, {
type: 'error',
name: 'Workspace Sync Required',
message: LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
});
t.deepEqual(semanticResult, {
type: 'error',
name: 'Workspace Sync Required',
message: LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
});
});
test('doc_semantic_search should return empty array when nothing matches', async t => {
const ac = {
user: () => ({
workspace: () => ({
can: async () => true,
docs: async () => [],
}),
}),
} as unknown as AccessController;
const models = {
workspace: {
get: async () => ({ id: 'workspace-1' }),
},
} as unknown as Models;
const contextService = {
matchWorkspaceAll: async () => [],
} as unknown as Parameters<typeof buildDocSearchGetter>[1];
const semanticTool = createDocSemanticSearchTool(
buildDocSearchGetter(ac, contextService, null, models).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const result = await semanticTool.execute?.({ query: 'hello' }, {});
t.deepEqual(result, []);
});
test('blob_read should return explicit error when attachment context is missing', async t => {
const ac = {
user: () => ({
workspace: () => ({
allowLocal: () => ({
can: async () => true,
}),
}),
}),
} as unknown as AccessController;
const blobTool = createBlobReadTool(
buildBlobContentGetter(ac, null).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const result = await blobTool.execute?.({ blob_id: 'blob-1' }, {});
t.deepEqual(result, {
type: 'error',
name: 'Blob Read Failed',
message:
'Missing workspace, user, blob id, or copilot context for blob_read.',
});
});

View File

@@ -258,7 +258,7 @@ export class FalProvider extends CopilotProvider<FalConfig> {
const model = this.selectModel(cond);
try {
-metrics.ai.counter('chat_text_calls').add(1, { model: model.id });
+metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
// by default, image prompt assumes there is only one message
const prompt = this.extractPrompt(messages[messages.length - 1]);
@@ -283,7 +283,9 @@ export class FalProvider extends CopilotProvider<FalConfig> {
}
return data.output;
} catch (e: any) {
-metrics.ai.counter('chat_text_errors').add(1, { model: model.id });
+metrics.ai
+.counter('chat_text_errors')
+.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -296,12 +298,16 @@ export class FalProvider extends CopilotProvider<FalConfig> {
const model = this.selectModel(cond);
try {
-metrics.ai.counter('chat_text_stream_calls').add(1, { model: model.id });
+metrics.ai
+.counter('chat_text_stream_calls')
+.add(1, this.metricLabels(model.id));
const result = await this.text(cond, messages, options);
yield result;
} catch (e) {
-metrics.ai.counter('chat_text_stream_errors').add(1, { model: model.id });
+metrics.ai
+.counter('chat_text_stream_errors')
+.add(1, this.metricLabels(model.id));
throw e;
}
}
@@ -319,7 +325,7 @@ export class FalProvider extends CopilotProvider<FalConfig> {
try {
metrics.ai
.counter('generate_images_stream_calls')
-.add(1, { model: model.id });
+.add(1, this.metricLabels(model.id));
// by default, image prompt assumes there is only one message
const prompt = this.extractPrompt(
@@ -376,7 +382,7 @@ export class FalProvider extends CopilotProvider<FalConfig> {
} catch (e) {
metrics.ai
.counter('generate_images_stream_errors')
.add(1, { model: model.id });
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}

View File

@@ -664,7 +664,7 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, { model: model.id });
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
const backendConfig = this.createNativeConfig();
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Structured);
@@ -687,7 +687,9 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
const validated = schema.parse(parsed);
return JSON.stringify(validated);
} catch (e: any) {
metrics.ai.counter('chat_text_errors').add(1, { model: model.id });
metrics.ai
.counter('chat_text_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -983,7 +985,7 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
metrics.ai
.counter('generate_images_stream_calls')
.add(1, { model: model.id });
.add(1, this.metricLabels(model.id));
const { content: prompt, attachments } = [...messages].pop() || {};
if (!prompt) throw new CopilotPromptInvalid('Prompt is required');
@@ -1021,7 +1023,9 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
}
return;
} catch (e: any) {
metrics.ai.counter('generate_images_errors').add(1, { model: model.id });
metrics.ai
.counter('generate_images_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}

View File

@@ -470,7 +470,8 @@ export abstract class CopilotProvider<C = any> {
});
const searchDocs = buildDocKeywordSearchGetter(
ac,
indexerService
indexerService,
models
);
tools.doc_keyword_search = createDocKeywordSearchTool(
searchDocs.bind(null, options)

View File

@@ -18,7 +18,10 @@ export const buildBlobContentGetter = (
chunk?: number
) => {
if (!options?.user || !options?.workspace || !blobId || !context) {
return;
return toolError(
'Blob Read Failed',
'Missing workspace, user, blob id, or copilot context for blob_read.'
);
}
const canAccess = await ac
.user(options.user)
@@ -29,7 +32,10 @@ export const buildBlobContentGetter = (
logger.warn(
`User ${options.user} does not have access to workspace ${options.workspace}`
);
return;
return toolError(
'Blob Read Failed',
'You do not have permission to access this workspace attachment.'
);
}
const contextFile = context.files.find(
@@ -42,7 +48,12 @@ export const buildBlobContentGetter = (
context.getBlobContent(canonicalBlobId, chunk),
]);
const content = file?.trim() || blob?.trim();
if (!content) return;
if (!content) {
return toolError(
'Blob Read Failed',
`Attachment ${canonicalBlobId} is not available for reading in the current copilot context.`
);
}
const info = contextFile
? { fileName: contextFile.name, fileType: contextFile.mimeType }
: {};
@@ -53,10 +64,7 @@ export const buildBlobContentGetter = (
};
export const createBlobReadTool = (
getBlobContent: (
targetId?: string,
chunk?: number
) => Promise<object | undefined>
getBlobContent: (targetId?: string, chunk?: number) => Promise<object>
) => {
return defineTool({
description:
@@ -73,13 +81,10 @@ export const createBlobReadTool = (
execute: async ({ blob_id, chunk }) => {
try {
const blob = await getBlobContent(blob_id, chunk);
if (!blob) {
return;
}
return { ...blob };
} catch (err: any) {
logger.error(`Failed to read the blob ${blob_id} in context`, err);
return toolError('Blob Read Failed', err.message);
return toolError('Blob Read Failed', err.message ?? String(err));
}
},
});
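The `err.message ?? String(err)` change above guards against non-`Error` throwables. A minimal sketch of the pattern (the `describeError` helper name is illustrative, not from the diff):

```typescript
// JavaScript permits throwing any value (strings, plain objects), in which
// case `.message` is undefined. Falling back to String(err) always yields a
// readable message for toolError instead of the literal text "undefined".
const describeError = (err: unknown): string =>
  (err as { message?: string }).message ?? String(err);
```

With an `Error` this returns its message; with a thrown string it returns the string itself.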

View File

@@ -1,27 +1,43 @@
import { z } from 'zod';
import type { AccessController } from '../../../core/permission';
import type { Models } from '../../../models';
import type { IndexerService, SearchDoc } from '../../indexer';
import { workspaceSyncRequiredError } from './doc-sync';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
export const buildDocKeywordSearchGetter = (
ac: AccessController,
indexerService: IndexerService
indexerService: IndexerService,
models: Models
) => {
const searchDocs = async (options: CopilotChatOptions, query?: string) => {
if (!options || !query?.trim() || !options.user || !options.workspace) {
return undefined;
const queryTrimmed = query?.trim();
if (!options || !queryTrimmed || !options.user || !options.workspace) {
return toolError(
'Doc Keyword Search Failed',
'Missing workspace, user, or query for doc_keyword_search.'
);
}
const workspace = await models.workspace.get(options.workspace);
if (!workspace) {
return workspaceSyncRequiredError();
}
const canAccess = await ac
.user(options.user)
.workspace(options.workspace)
.can('Workspace.Read');
if (!canAccess) return undefined;
if (!canAccess) {
return toolError(
'Doc Keyword Search Failed',
'You do not have permission to access this workspace.'
);
}
const docs = await indexerService.searchDocsByKeyword(
options.workspace,
query
queryTrimmed
);
// filter current user readable docs
@@ -29,13 +45,15 @@ export const buildDocKeywordSearchGetter = (
.user(options.user)
.workspace(options.workspace)
.docs(docs, 'Doc.Read');
return readableDocs;
return readableDocs ?? [];
};
return searchDocs;
};
export const createDocKeywordSearchTool = (
searchDocs: (query: string) => Promise<SearchDoc[] | undefined>
searchDocs: (
query: string
) => Promise<SearchDoc[] | ReturnType<typeof toolError>>
) => {
return defineTool({
description:
@@ -50,8 +68,8 @@ export const createDocKeywordSearchTool = (
execute: async ({ query }) => {
try {
const docs = await searchDocs(query);
if (!docs) {
return;
if (!Array.isArray(docs)) {
return docs;
}
return docs.map(doc => ({
docId: doc.docId,

View File

@@ -3,13 +3,20 @@ import { z } from 'zod';
import { DocReader } from '../../../core/doc';
import { AccessController } from '../../../core/permission';
import { Models, publicUserSelect } from '../../../models';
import { toolError } from './error';
import { Models } from '../../../models';
import {
documentSyncPendingError,
workspaceSyncRequiredError,
} from './doc-sync';
import { type ToolError, toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
const logger = new Logger('DocReadTool');
const isToolError = (result: ToolError | object): result is ToolError =>
'type' in result && result.type === 'error';
export const buildDocContentGetter = (
ac: AccessController,
docReader: DocReader,
@@ -17,8 +24,17 @@ export const buildDocContentGetter = (
) => {
const getDoc = async (options: CopilotChatOptions, docId?: string) => {
if (!options?.user || !options?.workspace || !docId) {
return;
return toolError(
'Doc Read Failed',
'Missing workspace, user, or document id for doc_read.'
);
}
const workspace = await models.workspace.get(options.workspace);
if (!workspace) {
return workspaceSyncRequiredError();
}
const canAccess = await ac
.user(options.user)
.workspace(options.workspace)
@@ -28,23 +44,15 @@ export const buildDocContentGetter = (
logger.warn(
`User ${options.user} does not have access to doc ${docId} in workspace ${options.workspace}`
);
return;
return toolError(
'Doc Read Failed',
`You do not have permission to read document ${docId} in this workspace.`
);
}
const docMeta = await models.doc.getSnapshot(options.workspace, docId, {
select: {
createdAt: true,
updatedAt: true,
createdByUser: {
select: publicUserSelect,
},
updatedByUser: {
select: publicUserSelect,
},
},
});
const docMeta = await models.doc.getAuthors(options.workspace, docId);
if (!docMeta) {
return;
return documentSyncPendingError(docId);
}
const content = await docReader.getDocMarkdown(
@@ -53,7 +61,7 @@ export const buildDocContentGetter = (
true
);
if (!content) {
return;
return documentSyncPendingError(docId);
}
return {
@@ -69,8 +77,12 @@ export const buildDocContentGetter = (
return getDoc;
};
type DocReadToolResult = Awaited<
ReturnType<ReturnType<typeof buildDocContentGetter>>
>;
export const createDocReadTool = (
getDoc: (targetId?: string) => Promise<object | undefined>
getDoc: (targetId?: string) => Promise<DocReadToolResult>
) => {
return defineTool({
description:
@@ -81,13 +93,10 @@ export const createDocReadTool = (
execute: async ({ doc_id }) => {
try {
const doc = await getDoc(doc_id);
if (!doc) {
return;
}
return { ...doc };
return isToolError(doc) ? doc : { ...doc };
} catch (err: any) {
logger.error(`Failed to read the doc ${doc_id}`, err);
return toolError('Doc Read Failed', err.message);
return toolError('Doc Read Failed', err.message ?? String(err));
}
},
});
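The `isToolError` guard introduced in this file lets `execute` pass error results through unchanged while spreading success payloads. A sketch of the discriminant check under the error shape shown earlier in this diff (`{ type: 'error', name, message }`):

```typescript
// Results are either the success payload or a ToolError; a discriminated
// type check replaces the old `if (!doc) return;` undefined handling.
type ToolError = { type: 'error'; name: string; message: string };

const isToolError = (result: ToolError | object): result is ToolError =>
  'type' in result && (result as ToolError).type === 'error';
```

After the guard, TypeScript narrows `result` to `ToolError` in the true branch and to the payload type in the false branch.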

View File

@@ -7,6 +7,7 @@ import {
clearEmbeddingChunk,
type Models,
} from '../../../models';
import { workspaceSyncRequiredError } from './doc-sync';
import { toolError } from './error';
import { defineTool } from './tool';
import type {
@@ -27,14 +28,24 @@ export const buildDocSearchGetter = (
signal?: AbortSignal
) => {
if (!options || !query?.trim() || !options.user || !options.workspace) {
return `Invalid search parameters.`;
return toolError(
'Doc Semantic Search Failed',
'Missing workspace, user, or query for doc_semantic_search.'
);
}
const workspace = await models.workspace.get(options.workspace);
if (!workspace) {
return workspaceSyncRequiredError();
}
const canAccess = await ac
.user(options.user)
.workspace(options.workspace)
.can('Workspace.Read');
if (!canAccess)
return 'You do not have permission to access this workspace.';
return toolError(
'Doc Semantic Search Failed',
'You do not have permission to access this workspace.'
);
const [chunks, contextChunks] = await Promise.all([
context.matchWorkspaceAll(options.workspace, query, 10, signal),
docContext?.matchFiles(query, 10, signal) ?? [],
@@ -53,7 +64,7 @@ export const buildDocSearchGetter = (
fileChunks.push(...contextChunks);
}
if (!blobChunks.length && !docChunks.length && !fileChunks.length) {
return `No results found for "${query}".`;
return [];
}
const docIds = docChunks.map(c => ({
@@ -101,7 +112,7 @@ export const createDocSemanticSearchTool = (
searchDocs: (
query: string,
signal?: AbortSignal
) => Promise<ChunkSimilarity[] | string | undefined>
) => Promise<ChunkSimilarity[] | ReturnType<typeof toolError>>
) => {
return defineTool({
description:

View File

@@ -0,0 +1,13 @@
import { toolError } from './error';
export const LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE =
'This workspace is local-only and does not have AFFiNE Cloud sync enabled yet. Ask the user to enable workspace sync, then try again.';
export const DOCUMENT_SYNC_PENDING_MESSAGE = (docId: string) =>
`Document ${docId} is not available on AFFiNE Cloud yet. Ask the user to wait for workspace sync to finish, then try again.`;
export const workspaceSyncRequiredError = () =>
toolError('Workspace Sync Required', LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE);
export const documentSyncPendingError = (docId: string) =>
toolError('Document Sync Pending', DOCUMENT_SYNC_PENDING_MESSAGE(docId));
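The helpers above wrap `toolError` from `./error`. Its return shape can be inferred from the test expectation at the top of this diff (`{ type: 'error', name, message }`); a self-contained sketch, assuming no extra fields:

```typescript
// Sketch of the tool error shape, inferred from the assertion
// t.deepEqual(result, { type: 'error', name: ..., message: ... }).
// The real implementation lives in './error' and may differ in detail.
const toolError = (name: string, message: string) =>
  ({ type: 'error', name, message }) as const;

const syncError = toolError(
  'Workspace Sync Required',
  'This workspace is local-only and does not have AFFiNE Cloud sync enabled yet.'
);
```

Centralizing the two sync messages in `doc-sync.ts` keeps the wording identical across the doc read, keyword search, and semantic search tools.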

View File

@@ -1,5 +1,5 @@
export const encodeLink = (link: string) =>
encodeURI(link)
.replace(/\(/g, '%28')
.replace(/\)/g, '%29')
.replaceAll('(', '%28')
.replaceAll(')', '%29')
.replace(/(\?|&)response-content-disposition=attachment.*$/, '');

View File

@@ -46,7 +46,10 @@ export function setupEvents(frameworkProvider: FrameworkProvider) {
const { workspace } = currentWorkspace;
const docsService = workspace.scope.get(DocsService);
const page = docsService.createDoc({ primaryMode: type });
const page =
type === 'default'
? docsService.createDoc()
: docsService.createDoc({ primaryMode: type });
workspace.scope.get(WorkbenchService).workbench.openDoc(page.id);
})
.catch(err => {

View File

@@ -67,7 +67,7 @@ export function createApplicationMenu() {
click: async () => {
await initAndShowMainWindow();
// fixme: if the window is just created, the new page action will not be triggered
applicationMenuSubjects.newPageAction$.next('page');
applicationMenuSubjects.newPageAction$.next('default');
},
},
],

View File

@@ -1,5 +1,5 @@
import type { MainEventRegister } from '../type';
import { applicationMenuSubjects } from './subject';
import { applicationMenuSubjects, type NewPageAction } from './subject';
export * from './create';
export * from './subject';
@@ -11,7 +11,7 @@ export const applicationMenuEvents = {
/**
* File -> New Doc
*/
onNewPageAction: (fn: (type: 'page' | 'edgeless') => void) => {
onNewPageAction: (fn: (type: NewPageAction) => void) => {
const sub = applicationMenuSubjects.newPageAction$.subscribe(fn);
return () => {
sub.unsubscribe();

View File

@@ -1,7 +1,9 @@
import { Subject } from 'rxjs';
export type NewPageAction = 'page' | 'edgeless' | 'default';
export const applicationMenuSubjects = {
newPageAction$: new Subject<'page' | 'edgeless'>(),
newPageAction$: new Subject<NewPageAction>(),
openJournal$: new Subject<void>(),
openInSettingModal$: new Subject<{
activeTab: string;

View File

@@ -9,6 +9,7 @@ import { beforeAppQuit } from './cleanup';
import { logger } from './logger';
import { powerEvents } from './power';
import { recordingEvents } from './recording';
import { checkSource } from './security-restrictions';
import { sharedStorageEvents } from './shared-storage';
import { uiEvents } from './ui/events';
import { updaterEvents } from './updater/event';
@@ -70,7 +71,7 @@ export function registerEvents() {
action: 'subscribe' | 'unsubscribe',
channel: string
) => {
if (typeof channel !== 'string') return;
if (!checkSource(event) || typeof channel !== 'string') return;
if (action === 'subscribe') {
addSubscription(event.sender, channel);
if (channel === 'power:power-source') {

View File

@@ -7,6 +7,7 @@ import { configStorageHandlers } from './config-storage';
import { findInPageHandlers } from './find-in-page';
import { getLogFilePath, logger, revealLogFile } from './logger';
import { recordingHandlers } from './recording';
import { checkSource } from './security-restrictions';
import { sharedStorageHandlers } from './shared-storage';
import { uiHandlers } from './ui/handlers';
import { updaterHandlers } from './updater';
@@ -49,7 +50,7 @@ export const registerHandlers = () => {
...args: any[]
) => {
// args[0] is the `{namespace:key}`
if (typeof args[0] !== 'string') {
if (!checkSource(e) || typeof args[0] !== 'string') {
logger.error('invalid ipc message', args);
return;
}
@@ -97,6 +98,8 @@ export const registerHandlers = () => {
});
ipcMain.on(AFFINE_API_CHANNEL_NAME, (e, ...args: any[]) => {
if (!checkSource(e)) return;
handleIpcMessage(e, ...args)
.then(ret => {
e.returnValue = ret;

View File

@@ -1,5 +1,3 @@
import './security-restrictions';
import path from 'node:path';
import * as Sentry from '@sentry/electron/main';
@@ -15,6 +13,7 @@ import { registerHandlers } from './handlers';
import { logger } from './logger';
import { registerProtocol } from './protocol';
import { setupRecordingFeature } from './recording/feature';
import { registerSecurityRestrictions } from './security-restrictions';
import { setupTrayState } from './tray';
import { registerUpdater } from './updater';
import { launch } from './windows-manager/launcher';
@@ -105,6 +104,7 @@ app.on('activate', () => {
});
setupDeepLink(app);
registerSecurityRestrictions();
/**
* Create app window when background process will be ready

View File

@@ -4,9 +4,9 @@ import { pathToFileURL } from 'node:url';
import { app, net, protocol, session } from 'electron';
import cookieParser from 'set-cookie-parser';
import { anotherHost, mainHost } from '../shared/internal-origin';
import { isWindows, resourcesPath } from '../shared/utils';
import { buildType, isDev } from './config';
import { anotherHost, mainHost } from './constants';
import { logger } from './logger';
const webStaticDir = join(resourcesPath, 'web-static');

View File

@@ -1,71 +1,71 @@
import { app } from 'electron';
import { anotherHost, mainHost } from './constants';
import { isInternalUrl } from '../shared/internal-origin';
import { logger } from './logger';
import { openExternalSafely } from './security/open-external';
import { validateRedirectProxyUrl } from './security/redirect-proxy';
app.on('web-contents-created', (_, contents) => {
const isInternalUrl = (url: string) => {
try {
const parsed = new URL(url);
if (
parsed.protocol === 'assets:' &&
(parsed.hostname === mainHost || parsed.hostname === anotherHost)
) {
return true;
}
} catch {}
return false;
};
/**
* Block navigation to origins not on the allowlist.
*
* Navigation is a common attack vector. If an attacker can convince the app to navigate away
* from its current page, they can possibly force the app to open web sites on the Internet.
*
* @see https://www.electronjs.org/docs/latest/tutorial/security#13-disable-or-limit-navigation
*/
contents.on('will-navigate', (event, url) => {
if (isInternalUrl(url)) {
return;
}
// Prevent navigation
event.preventDefault();
openExternalSafely(url).catch(error => {
console.error('[security] Failed to open external URL:', error);
});
});
export const checkSource = (
e: Electron.IpcMainInvokeEvent | Electron.IpcMainEvent
) => {
const url = e.senderFrame?.url || e.sender.getURL();
const result = isInternalUrl(url);
if (!result) logger.error('invalid source', url);
return result;
};
/**
* Hyperlinks to allowed sites open in the default browser.
*
* The creation of new `webContents` is a common attack vector. Attackers attempt to convince the app to create new windows,
* frames, or other renderer processes with more privileges than they had before; or with pages opened that they couldn't open before.
* You should deny any unexpected window creation.
*
* @see https://www.electronjs.org/docs/latest/tutorial/security#14-disable-or-limit-creation-of-new-windows
* @see https://www.electronjs.org/docs/latest/tutorial/security#15-do-not-use-openexternal-with-untrusted-content
*/
contents.setWindowOpenHandler(({ url }) => {
if (!isInternalUrl(url)) {
export const registerSecurityRestrictions = () => {
app.on('web-contents-created', (_, contents) => {
/**
* Block navigation to origins not on the allowlist.
*
* Navigation is a common attack vector. If an attacker can convince the app to navigate away
* from its current page, they can possibly force the app to open web sites on the Internet.
*
* @see https://www.electronjs.org/docs/latest/tutorial/security#13-disable-or-limit-navigation
*/
contents.on('will-navigate', (event, url) => {
if (isInternalUrl(url)) {
return;
}
// Prevent navigation
event.preventDefault();
openExternalSafely(url).catch(error => {
console.error('[security] Failed to open external URL:', error);
});
} else if (url.includes('/redirect-proxy')) {
const result = validateRedirectProxyUrl(url);
if (!result.allow) {
console.warn(
`[security] Blocked redirect proxy: ${result.reason}`,
result.redirectTarget ?? url
);
return { action: 'deny' };
}
});
openExternalSafely(result.redirectTarget).catch(error => {
console.error('[security] Failed to open external URL:', error);
});
}
// Prevent creating new window in application
return { action: 'deny' };
/**
* Hyperlinks to allowed sites open in the default browser.
*
* The creation of new `webContents` is a common attack vector. Attackers attempt to convince the app to create new windows,
* frames, or other renderer processes with more privileges than they had before; or with pages opened that they couldn't open before.
* You should deny any unexpected window creation.
*
* @see https://www.electronjs.org/docs/latest/tutorial/security#14-disable-or-limit-creation-of-new-windows
* @see https://www.electronjs.org/docs/latest/tutorial/security#15-do-not-use-openexternal-with-untrusted-content
*/
contents.setWindowOpenHandler(({ url }) => {
if (!isInternalUrl(url)) {
openExternalSafely(url).catch(error => {
console.error('[security] Failed to open external URL:', error);
});
} else if (url.includes('/redirect-proxy')) {
const result = validateRedirectProxyUrl(url);
if (!result.allow) {
console.warn(
`[security] Blocked redirect proxy: ${result.reason}`,
result.redirectTarget ?? url
);
return { action: 'deny' };
}
openExternalSafely(result.redirectTarget).catch(error => {
console.error('[security] Failed to open external URL:', error);
});
}
// Prevent creating new window in application
return { action: 'deny' };
});
});
});
};

View File

@@ -2,8 +2,8 @@ import { join } from 'node:path';
import { BrowserWindow, type Display, screen } from 'electron';
import { customThemeViewUrl } from '../../shared/internal-origin';
import { isMacOS } from '../../shared/utils';
import { customThemeViewUrl } from '../constants';
import { logger } from '../logger';
import { buildWebPreferences } from '../web-preferences';

View File

@@ -4,10 +4,10 @@ import { BrowserWindow, nativeTheme } from 'electron';
import electronWindowState from 'electron-window-state';
import { BehaviorSubject, map, shareReplay } from 'rxjs';
import { mainWindowOrigin } from '../../shared/internal-origin';
import { isLinux, isMacOS, isWindows, resourcesPath } from '../../shared/utils';
import { beforeAppQuit } from '../cleanup';
import { buildType } from '../config';
import { mainWindowOrigin } from '../constants';
import { ensureHelperProcess } from '../helper-process';
import { logger } from '../logger';
import { MenubarStateKey, MenubarStateSchema } from '../shared-state-schema';

View File

@@ -2,8 +2,8 @@ import { join } from 'node:path';
import { BrowserWindow, screen } from 'electron';
import { onboardingViewUrl } from '../../shared/internal-origin';
import { isDev } from '../config';
import { onboardingViewUrl } from '../constants';
// import { getExposedMeta } from './exposed';
import { logger } from '../logger';
import { buildWebPreferences } from '../web-preferences';

View File

@@ -8,7 +8,7 @@ import {
} from 'electron';
import { BehaviorSubject } from 'rxjs';
import { popupViewUrl } from '../constants';
import { popupViewUrl } from '../../shared/internal-origin';
import { logger } from '../logger';
import type { MainEventRegister, NamespaceHandlers } from '../type';
import { buildWebPreferences } from '../web-preferences';

View File

@@ -24,9 +24,9 @@ import {
type Unsubscribable,
} from 'rxjs';
import { mainWindowOrigin, shellViewUrl } from '../../shared/internal-origin';
import { isMacOS } from '../../shared/utils';
import { beforeAppQuit, onTabClose } from '../cleanup';
import { mainWindowOrigin, shellViewUrl } from '../constants';
import { ensureHelperProcess } from '../helper-process';
import { logger } from '../logger';
import {

View File

@@ -2,7 +2,7 @@ import { join } from 'node:path';
import { BrowserWindow, MessageChannelMain, type WebContents } from 'electron';
import { backgroundWorkerViewUrl } from '../constants';
import { backgroundWorkerViewUrl } from '../../shared/internal-origin';
import { ensureHelperProcess } from '../helper-process';
import { logger } from '../logger';
import { buildWebPreferences } from '../web-preferences';

View File

@@ -2,13 +2,21 @@ import '@sentry/electron/preload';
import { contextBridge } from 'electron';
import { isInternalUrl } from '../shared/internal-origin';
import { apis, appInfo, events } from './electron-api';
import { sharedStorage } from './shared-storage';
import { listenWorkerApis } from './worker';
contextBridge.exposeInMainWorld('__appInfo', appInfo);
contextBridge.exposeInMainWorld('__apis', apis);
contextBridge.exposeInMainWorld('__events', events);
contextBridge.exposeInMainWorld('__sharedStorage', sharedStorage);
const locationLike = (globalThis as { location?: { href?: unknown } }).location;
listenWorkerApis();
const currentUrl =
typeof locationLike?.href === 'string' ? locationLike.href : null;
if (currentUrl && isInternalUrl(currentUrl)) {
contextBridge.exposeInMainWorld('__appInfo', appInfo);
contextBridge.exposeInMainWorld('__apis', apis);
contextBridge.exposeInMainWorld('__events', events);
contextBridge.exposeInMainWorld('__sharedStorage', sharedStorage);
listenWorkerApis();
}

View File

@@ -1,5 +1,6 @@
export const mainHost = '.';
export const anotherHost = 'another-host';
export const internalHosts = new Set([mainHost, anotherHost]);
export const mainWindowOrigin = `assets://${mainHost}`;
export const anotherOrigin = `assets://${anotherHost}`;
@@ -13,3 +14,12 @@ export const customThemeViewUrl = `${mainWindowOrigin}/theme-editor`;
// Notes from electron official docs:
// "The zoom policy at the Chromium level is same-origin, meaning that the zoom level for a specific domain propagates across all instances of windows with the same domain. Differentiating the window URLs will make zoom work per-window."
export const popupViewUrl = `${anotherOrigin}/popup.html`;
export const isInternalUrl = (url: string) => {
try {
const parsed = new URL(url);
return parsed.protocol === 'assets:' && internalHosts.has(parsed.hostname);
} catch {
return false;
}
};
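The allowlist check above can be exercised in isolation. A self-contained sketch reproducing the logic of this diff (hosts taken from the constants earlier in the file):

```typescript
// Only the assets: protocol with a known internal hostname counts as
// internal; anything else — other schemes, unknown hosts, or unparseable
// strings — is treated as external and later routed to the default browser.
const internalHosts = new Set(['.', 'another-host']);

const isInternalUrl = (url: string) => {
  try {
    const parsed = new URL(url);
    return parsed.protocol === 'assets:' && internalHosts.has(parsed.hostname);
  } catch {
    return false; // malformed URLs are never internal
  }
};
```

Returning `false` on parse failure means the security checks in `checkSource` and `will-navigate` fail closed.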

View File

@@ -32,8 +32,8 @@
"kind" : "remoteSourceControl",
"location" : "https://github.com/Lakr233/ListViewKit",
"state" : {
"revision" : "5dea05a52a6c2c7bb013a5925c517d6e32940605",
"version" : "1.1.8"
"revision" : "07f7adfa0629f8647991e3c148b7d3e060fe2917",
"version" : "1.2.0"
}
},
{
@@ -59,8 +59,8 @@
"kind" : "remoteSourceControl",
"location" : "https://github.com/Lakr233/MarkdownView",
"state" : {
"revision" : "8b8c1eecd251051c5ec2bdd5f31a2243efd9be6c",
"version" : "3.6.2"
"revision" : "37f97345a108e95f66b6671c317b43063c7f2de1",
"version" : "3.8.2"
}
},
{

View File

@@ -21,8 +21,8 @@ let package = Package(
.package(url: "https://github.com/SnapKit/SnapKit.git", from: "5.7.1"),
.package(url: "https://github.com/SwifterSwift/SwifterSwift.git", from: "6.2.0"),
.package(url: "https://github.com/Recouse/EventSource.git", from: "0.1.7"),
.package(url: "https://github.com/Lakr233/ListViewKit.git", from: "1.1.8"),
.package(url: "https://github.com/Lakr233/MarkdownView.git", from: "3.6.3"),
.package(url: "https://github.com/Lakr233/ListViewKit.git", from: "1.2.0"),
.package(url: "https://github.com/Lakr233/MarkdownView.git", from: "3.8.2"),
],
targets: [
.target(name: "Intelligents", dependencies: [

View File

@@ -32,9 +32,12 @@ export const AuthInput = ({
onEnter={onEnter}
{...inputProps}
/>
{error && errorHint ? (
<div className={styles.authInputError}>{errorHint}</div>
) : null}
<div
className={styles.authInputError}
style={{ visibility: error ? 'visible' : 'hidden' }}
>
{errorHint}
</div>
</div>
);
};

View File

@@ -71,6 +71,7 @@ export const authInputError = style({
color: cssVar('errorColor'),
fontSize: cssVar('fontXs'),
lineHeight: '20px',
minHeight: '20px',
});
globalStyle(`${authContent} a`, {

View File

@@ -0,0 +1,41 @@
/**
* @vitest-environment happy-dom
*/
import { describe, expect, test, vi } from 'vitest';
import { AIChatContent } from './ai-chat-content';
describe('AIChatContent pinned scroll tracking', () => {
test('records scroll position from the chat messages host', async () => {
let scrollEndHandler: (() => void) | undefined;
const chatMessages = {
scrollTop: 256,
updateComplete: Promise.resolve(),
addEventListener: vi.fn((event: string, handler: EventListener) => {
if (event === 'scrollend') {
scrollEndHandler = handler as () => void;
}
}),
};
const content = {
chatMessagesRef: { value: chatMessages },
_scrollListenersInitialized: false,
lastScrollTop: undefined,
} as unknown as AIChatContent;
(AIChatContent.prototype as any)._initializeScrollListeners.call(content);
await chatMessages.updateComplete;
await Promise.resolve();
expect(chatMessages.addEventListener).toHaveBeenCalledWith(
'scrollend',
expect.any(Function)
);
scrollEndHandler?.();
expect((content as any).lastScrollTop).toBe(256);
});
});

View File

@@ -0,0 +1,50 @@
/**
* @vitest-environment happy-dom
*/
import { afterEach, beforeEach, describe, expect, test, vi } from 'vitest';
import { AIChatMessages } from './ai-chat-messages';
describe('AIChatMessages scrolling', () => {
beforeEach(() => {
vi.stubGlobal('requestAnimationFrame', (cb: FrameRequestCallback) => {
cb(0);
return 1;
});
});
afterEach(() => {
vi.unstubAllGlobals();
vi.restoreAllMocks();
});
test('scrollToEnd scrolls the host element', () => {
const scrollTo = vi.fn();
const element = {
scrollTo,
} as unknown as AIChatMessages;
Object.defineProperty(element, 'scrollHeight', {
configurable: true,
value: 480,
});
AIChatMessages.prototype.scrollToEnd.call(element);
expect(scrollTo).toHaveBeenCalledWith({
top: 480,
behavior: 'smooth',
});
});
test('scrollToPos scrolls the host element', () => {
const scrollTo = vi.fn();
const element = {
scrollTo,
} as unknown as AIChatMessages;
AIChatMessages.prototype.scrollToPos.call(element, 128);
expect(scrollTo).toHaveBeenCalledWith({ top: 128 });
});
});

View File

@@ -32,6 +32,7 @@ export class ChatContentRichText extends WithDisposable(ShadowlessElement) {
extensions: this.extensions,
affineFeatureFlagService: this.affineFeatureFlagService,
theme: this.theme,
scrollable: false,
})(text, this.state)}`;
}
}

View File

@@ -7,6 +7,8 @@ import { html, nothing } from 'lit';
import { property } from 'lit/decorators.js';
import type { ToolResult } from './tool-result-card';
import { getToolErrorDisplayName, isToolError } from './tool-result-utils';
import type { ToolError } from './type';
interface DocKeywordSearchToolCall {
type: 'tool-call';
@@ -20,10 +22,7 @@ interface DocKeywordSearchToolResult {
toolCallId: string;
toolName: string;
args: { query: string };
result: Array<{
title: string;
docId: string;
}>;
result: Array<{ title: string; docId: string }> | ToolError | null;
}
export class DocKeywordSearchResult extends WithDisposable(ShadowlessElement) {
@@ -51,9 +50,23 @@ export class DocKeywordSearchResult extends WithDisposable(ShadowlessElement) {
if (this.data.type !== 'tool-result') {
return nothing;
}
const result = this.data.result;
if (!result || isToolError(result)) {
return html`<tool-call-failed
.name=${getToolErrorDisplayName(
isToolError(result) ? result : null,
'Document search failed',
{
'Workspace Sync Required':
'Enable workspace sync to search documents',
}
)}
.icon=${SearchIcon()}
></tool-call-failed>`;
}
let results: ToolResult[] = [];
try {
results = this.data.result.map(item => ({
results = result.map(item => ({
title: item.title,
icon: PageIcon(),
onClick: () => {
@@ -69,7 +82,7 @@ export class DocKeywordSearchResult extends WithDisposable(ShadowlessElement) {
console.error('Failed to parse result', err);
}
return html`<tool-result-card
.name=${`Found ${this.data.result.length} pages for "${this.data.args.query}"`}
.name=${`Found ${result.length} pages for "${this.data.args.query}"`}
.icon=${SearchIcon()}
.width=${this.width}
.results=${results}

View File

@@ -6,6 +6,9 @@ import type { Signal } from '@preact/signals-core';
import { html, nothing } from 'lit';
import { property } from 'lit/decorators.js';
import { getToolErrorDisplayName, isToolError } from './tool-result-utils';
import type { ToolError } from './type';
interface DocReadToolCall {
type: 'tool-call';
toolCallId: string;
@@ -18,14 +21,24 @@ interface DocReadToolResult {
toolCallId: string;
toolName: string;
args: { doc_id: string };
result: {
/** Old result may not have docId */
docId?: string;
title: string;
markdown: string;
};
result:
| {
/** Old result may not have docId */
docId?: string;
title: string;
markdown: string;
}
| ToolError
| null;
}
const getFailedName = (result: ToolError | null) => {
return getToolErrorDisplayName(result, 'Document read failed', {
'Workspace Sync Required': 'Enable workspace sync to read this document',
'Document Sync Pending': 'Wait for document sync to finish',
});
};
export class DocReadResult extends WithDisposable(ShadowlessElement) {
@property({ attribute: false })
accessor data!: DocReadToolCall | DocReadToolResult;
@@ -49,18 +62,25 @@ export class DocReadResult extends WithDisposable(ShadowlessElement) {
if (this.data.type !== 'tool-result') {
return nothing;
}
const result = this.data.result;
if (!result || isToolError(result)) {
return html`<tool-call-failed
.name=${getFailedName(isToolError(result) ? result : null)}
.icon=${ViewIcon()}
></tool-call-failed>`;
}
// TODO: better markdown rendering
return html`<tool-result-card
.name=${`Read "${this.data.result.title}"`}
.name=${`Read "${result.title}"`}
.icon=${ViewIcon()}
.width=${this.width}
.results=${[
{
title: this.data.result.title,
title: result.title,
icon: PageIcon(),
content: this.data.result.markdown,
content: result.markdown,
onClick: () => {
const docId = (this.data as DocReadToolResult).result.docId;
const docId = result.docId;
if (!docId) {
return;
}

View File

@@ -7,6 +7,8 @@ import { html, nothing } from 'lit';
import { property } from 'lit/decorators.js';
import type { DocDisplayConfig } from '../ai-chat-chips';
import { getToolErrorDisplayName, isToolError } from './tool-result-utils';
import type { ToolError } from './type';
interface DocSemanticSearchToolCall {
type: 'tool-call';
@@ -20,10 +22,7 @@ interface DocSemanticSearchToolResult {
toolCallId: string;
toolName: string;
args: { query: string };
result: Array<{
content: string;
docId: string;
}>;
result: Array<{ content: string; docId: string }> | ToolError | null;
}
function parseResultContent(content: string) {
@@ -82,11 +81,25 @@ export class DocSemanticSearchResult extends WithDisposable(ShadowlessElement) {
if (this.data.type !== 'tool-result') {
return nothing;
}
const result = this.data.result;
if (!result || isToolError(result)) {
return html`<tool-call-failed
.name=${getToolErrorDisplayName(
isToolError(result) ? result : null,
'Semantic search failed',
{
'Workspace Sync Required':
'Enable workspace sync to search documents',
}
)}
.icon=${AiEmbeddingIcon()}
></tool-call-failed>`;
}
return html`<tool-result-card
.name=${`Found semantically related pages for "${this.data.args.query}"`}
.icon=${AiEmbeddingIcon()}
.width=${this.width}
.results=${this.data.result
.results=${result
.map(result => ({
...parseResultContent(result.content),
title: this.docDisplayService.getTitle(result.docId),

View File

@@ -0,0 +1,16 @@
import type { ToolError } from './type';
export const isToolError = (result: unknown): result is ToolError =>
!!result &&
typeof result === 'object' &&
'type' in result &&
(result as ToolError).type === 'error';
export const getToolErrorDisplayName = (
result: ToolError | null,
fallback: string,
overrides: Record<string, string> = {}
) => {
if (!result) return fallback;
return overrides[result.name] ?? result.name;
};
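Taken together, the two helpers in this new file form a small pattern for narrowing untyped tool results before rendering. A minimal standalone sketch, assuming only the `ToolError` shape of `{ type: 'error'; name: string }` that the guard itself implies:

```typescript
// Hypothetical ToolError shape, inferred from the guard in tool-result-utils.
interface ToolError {
  type: 'error';
  name: string;
}

// User-defined type guard: narrows `unknown` to ToolError.
const isToolError = (result: unknown): result is ToolError =>
  !!result &&
  typeof result === 'object' &&
  'type' in result &&
  (result as ToolError).type === 'error';

// Maps a server-side error name to a friendlier display string.
const getToolErrorDisplayName = (
  result: ToolError | null,
  fallback: string,
  overrides: Record<string, string> = {}
) => {
  if (!result) return fallback;
  return overrides[result.name] ?? result.name;
};

// Successful results, null results, and errors all flow through one union.
const ok = { title: 'Doc', markdown: '# hi' };
const err: ToolError = { type: 'error', name: 'Workspace Sync Required' };

console.log(isToolError(ok)); // false
console.log(isToolError(err)); // true
console.log(
  getToolErrorDisplayName(err, 'Document read failed', {
    'Workspace Sync Required': 'Enable workspace sync to read this document',
  })
); // "Enable workspace sync to read this document"
console.log(getToolErrorDisplayName(null, 'Document read failed')); // "Document read failed"
```

This is why the render paths above can do `if (!result || isToolError(result))` and then safely pass `isToolError(result) ? result : null` to the display-name helper.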

View File

@@ -85,6 +85,7 @@ export type TextRendererOptions = {
testId?: string;
affineFeatureFlagService?: FeatureFlagService;
theme?: Signal<ColorScheme>;
scrollable?: boolean;
};
// todo: refactor it for more general purpose usage instead of AI only?
@@ -140,9 +141,12 @@ export class TextRenderer extends SignalWatcher(
}
.text-renderer-container {
padding: 0;
}
.text-renderer-container.scrollable {
overflow-y: auto;
overflow-x: hidden;
padding: 0;
overscroll-behavior-y: none;
}
.text-renderer-container.show-scrollbar::-webkit-scrollbar {
@@ -325,6 +329,7 @@ export class TextRenderer extends SignalWatcher(
const classes = classMap({
'text-renderer-container': true,
'custom-heading': !!customHeading,
scrollable: this.options.scrollable !== false,
});
const theme = this.options.theme?.value;
return html`

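The `scrollable: this.options.scrollable !== false` entry in the `classMap` above makes scrolling opt-out rather than opt-in: an unset option keeps the class, and only an explicit `false` removes it. A minimal sketch of that default, assuming nothing beyond the option shape shown in the diff:

```typescript
// Only the option relevant to the scrollable class is modeled here.
interface TextRendererOptions {
  scrollable?: boolean;
}

// Mirrors the classMap entry: scrollable unless explicitly disabled.
const isScrollable = (options: TextRendererOptions): boolean =>
  options.scrollable !== false;

console.log(isScrollable({})); // true (undefined defaults to scrollable)
console.log(isScrollable({ scrollable: true })); // true
console.log(isScrollable({ scrollable: false })); // false
```

Using `!== false` instead of a plain truthiness check is what keeps existing callers (who pass no `scrollable` at all) on the new scrolling behavior.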
View File

@@ -1,4 +1,10 @@
import { Button, IconButton, IconType, Modal } from '@affine/component';
import {
Button,
IconButton,
type IconData,
IconType,
Modal,
} from '@affine/component';
import { getStoreManager } from '@affine/core/blocksuite/manager/store';
import { useAsyncCallback } from '@affine/core/components/hooks/affine-async-hooks';
import { useNavigateHelper } from '@affine/core/components/hooks/use-navigate-helper';
@@ -18,13 +24,14 @@ import {
import { DebugLogger } from '@affine/debug';
import { useI18n } from '@affine/i18n';
import track from '@affine/track';
import { openFilesWith } from '@blocksuite/affine/shared/utils';
import { openDirectory, openFilesWith } from '@blocksuite/affine/shared/utils';
import type { Workspace } from '@blocksuite/affine/store';
import {
DocxTransformer,
HtmlTransformer,
MarkdownTransformer,
NotionHtmlTransformer,
ObsidianTransformer,
ZipTransformer,
} from '@blocksuite/affine/widgets/linked-doc';
import {
@@ -112,10 +119,10 @@ function createFolderStructure(
logger.debug('Icon data:', child.icon);
try {
let iconData;
let iconData: IconData | undefined;
if (child.icon.type === 'emoji') {
iconData = {
type: IconType.Emoji as const,
type: IconType.Emoji,
unicode: child.icon.content,
};
logger.debug('Created emoji icon data:', iconData);
@@ -185,11 +192,12 @@ type ImportType =
| 'markdown'
| 'markdownZip'
| 'notion'
| 'obsidian'
| 'snapshot'
| 'html'
| 'docx'
| 'dotaffinefile';
type AcceptType = 'Markdown' | 'Zip' | 'Html' | 'Docx' | 'Skip'; // Skip is used for dotaffinefile
type AcceptType = 'Markdown' | 'Zip' | 'Html' | 'Docx' | 'Directory' | 'Skip'; // Skip is used for dotaffinefile
type Status = 'idle' | 'importing' | 'success' | 'error';
type ImportResult = {
docIds: string[];
@@ -198,6 +206,10 @@ type ImportResult = {
rootFolderId?: string;
};
type ImportedWorkspacePayload = {
workspace: WorkspaceMetadata;
};
type ImportConfig = {
fileOptions: { acceptType: AcceptType; multiple: boolean };
importFunction: (
@@ -264,6 +276,19 @@ const importOptions = [
testId: 'editor-option-menu-import-notion',
type: 'notion' as ImportType,
},
{
key: 'obsidian',
label: 'com.affine.import.obsidian',
prefixIcon: (
<ExportToMarkdownIcon color={cssVar('black')} width={20} height={20} />
),
suffixIcon: (
<HelpIcon color={cssVarV2('icon/primary')} width={20} height={20} />
),
suffixTooltip: 'com.affine.import.obsidian.tooltip',
testId: 'editor-option-menu-import-obsidian',
type: 'obsidian' as ImportType,
},
{
key: 'docx',
label: 'com.affine.import.docx',
@@ -445,6 +470,36 @@ const importConfigs: Record<ImportType, ImportConfig> = {
};
},
},
obsidian: {
fileOptions: { acceptType: 'Directory', multiple: false },
importFunction: async (
docCollection,
files,
_handleImportAffineFile,
_organizeService,
explorerIconService
) => {
const { docIds, docEmojis } =
await ObsidianTransformer.importObsidianVault({
collection: docCollection,
schema: getAFFiNEWorkspaceSchema(),
importedFiles: files,
extensions: getStoreManager().config.init().value.get('store'),
});
if (explorerIconService) {
for (const [id, emoji] of docEmojis.entries()) {
explorerIconService.setIcon({
where: 'doc',
id,
icon: { type: IconType.Emoji, unicode: emoji },
});
}
}
return { docIds };
},
},
docx: {
fileOptions: { acceptType: 'Docx', multiple: false },
importFunction: async (docCollection, file) => {
@@ -482,7 +537,7 @@ const importConfigs: Record<ImportType, ImportConfig> = {
file
)
)
.filter(doc => doc !== undefined)
.filter((doc): doc is NonNullable<typeof doc> => doc !== undefined)
.map(doc => doc.id);
return {
@@ -713,14 +768,18 @@ export const ImportDialog = ({
});
return new Promise<WorkspaceMetadata | undefined>((resolve, reject) => {
globalDialogService.open('import-workspace', undefined, payload => {
if (payload) {
handleCreatedWorkspace({ metadata: payload.workspace });
resolve(payload.workspace);
} else {
reject(new Error('No workspace imported'));
globalDialogService.open(
'import-workspace',
undefined,
(payload?: ImportedWorkspacePayload) => {
if (payload) {
handleCreatedWorkspace({ metadata: payload.workspace });
resolve(payload.workspace);
} else {
reject(new Error('No workspace imported'));
}
}
});
);
});
};
}, [globalDialogService, handleCreatedWorkspace]);
@@ -735,7 +794,9 @@ export const ImportDialog = ({
const files =
acceptType === 'Skip'
? []
: await openFilesWith(acceptType, multiple);
: acceptType === 'Directory'
? await openDirectory()
: await openFilesWith(acceptType, multiple);
if (!files || (files.length === 0 && acceptType !== 'Skip')) {
throw new Error(

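One small change in this file swaps `.filter(doc => doc !== undefined)` for `.filter((doc): doc is NonNullable<typeof doc> => doc !== undefined)`. The predicate form tells the type checker that surviving elements are non-nullable, so the following `.map(doc => doc.id)` compiles without a non-null assertion. A standalone sketch with a hypothetical `Doc` type:

```typescript
// Hypothetical element type standing in for the imported docs.
interface Doc {
  id: string;
}

const docs: Array<Doc | undefined> = [{ id: 'a' }, undefined, { id: 'b' }];

// Without the type predicate, TypeScript would still infer
// (Doc | undefined)[] here; with `doc is NonNullable<typeof doc>`
// the element type narrows to Doc, so .map can access .id directly.
const docIds = docs
  .filter((doc): doc is NonNullable<typeof doc> => doc !== undefined)
  .map(doc => doc.id);

console.log(docIds); // ["a", "b"]
```

The runtime behavior is identical to the plain filter; only the inferred type changes.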
View File

@@ -108,10 +108,18 @@ export const VerifyEmailDialog = ({
>
<AuthHeader
title={serverName}
subTitle={t['com.affine.settings.email.action.change']()}
subTitle={
changeEmail
? t['com.affine.settings.email.action.change']()
: t['com.affine.settings.email.action.verify']()
}
/>
<AuthContent>
<p>{t['com.affine.auth.verify.email.message']({ email })}</p>
<p>
{changeEmail
? t['com.affine.auth.change.email.message']({ email })
: t['com.affine.auth.verify.email.message']({ email })}
</p>
<AuthInput
label={t['com.affine.settings.email']()}
disabled={true}

View File

@@ -54,13 +54,22 @@ export class I18n extends Entity {
constructor(private readonly cache: GlobalCache) {
super();
this.i18n.on('languageChanged', (language: Language) => {
document.documentElement.lang = language;
this.applyDocumentLanguage(language);
this.cache.set('i18n_lng', language);
});
}
init() {
this.changeLanguage(this.currentLanguageKey$.value ?? 'en');
const language = this.currentLanguageKey$.value ?? 'en';
this.applyDocumentLanguage(language);
this.changeLanguage(language);
}
private applyDocumentLanguage(language: Language) {
document.documentElement.lang = language;
document.documentElement.dir = SUPPORTED_LANGUAGES[language]?.rtl
? 'rtl'
: 'ltr';
}
changeLanguage = effect(

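The new `applyDocumentLanguage` above derives the document's text direction from the per-language `rtl` flag added to `SUPPORTED_LANGUAGES`. A minimal sketch of that lookup, using a hypothetical subset of the language table (only the `rtl` field matters here):

```typescript
// Hypothetical subset of SUPPORTED_LANGUAGES; real entries also carry
// name, originalName, flagEmoji, and a resource loader.
const SUPPORTED_LANGUAGES: Record<string, { rtl?: boolean }> = {
  en: {},
  ar: { rtl: true },
  fa: { rtl: true },
  ur: { rtl: true },
};

// In the entity, this result is assigned to document.documentElement.dir
// alongside document.documentElement.lang.
const directionFor = (language: string): 'rtl' | 'ltr' =>
  SUPPORTED_LANGUAGES[language]?.rtl ? 'rtl' : 'ltr';

console.log(directionFor('ar')); // "rtl"
console.log(directionFor('en')); // "ltr"
console.log(directionFor('xx')); // "ltr" (unknown languages fall back to ltr)
```

Calling the helper from both `init()` and the `languageChanged` listener ensures the direction is correct on first paint, not just after a language switch.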
View File

@@ -1,6 +1,6 @@
{
"ar": 96,
"ca": 98,
"ar": 100,
"ca": 97,
"da": 4,
"de": 100,
"el-GR": 96,
@@ -11,16 +11,16 @@
"fa": 96,
"fr": 100,
"hi": 1,
"it": 98,
"it": 97,
"ja": 96,
"ko": 97,
"nb-NO": 47,
"pl": 98,
"pt-BR": 96,
"ru": 98,
"ru": 97,
"sv-SE": 96,
"uk": 96,
"ur": 2,
"zh-Hans": 98,
"zh-Hant": 97
"zh-Hant": 96
}

View File

@@ -1095,7 +1095,7 @@ export function useAFFiNEI18N(): {
*/
["com.affine.appearanceSettings.showLinkedDocInSidebar.description"](): string;
/**
* `Your current email is {{email}}. We'll send a temporary verification link to this email.`
* `Your current email is {{email}}. We'll send a confirmation link there first so you can securely switch to a new email address.`
*/
["com.affine.auth.change.email.message"](options: {
readonly email: string;
@@ -1427,7 +1427,7 @@ export function useAFFiNEI18N(): {
*/
["com.affine.auth.toast.title.signed-in"](): string;
/**
* `Your current email is {{email}}. We'll send a temporary verification link to this email.`
* `Your current email is {{email}}. We'll send a verification link to this email so you can confirm it belongs to you.`
*/
["com.affine.auth.verify.email.message"](options: {
readonly email: string;
@@ -2494,6 +2494,14 @@ export function useAFFiNEI18N(): {
* `Import your Notion data. Supported import formats: HTML with subpages.`
*/
["com.affine.import.notion.tooltip"](): string;
/**
* `Obsidian Vault`
*/
["com.affine.import.obsidian"](): string;
/**
* `Import an Obsidian vault. Select a folder to import all notes, images, and assets with wikilinks resolved.`
*/
["com.affine.import.obsidian.tooltip"](): string;
/**
* `Snapshot`
*/
@@ -9889,7 +9897,7 @@ export const TypedTrans: {
["2"]: JSX.Element;
}>>;
/**
* `<1>{{username}}</1> has accept your invitation`
* `<1>{{username}}</1> has accepted your invitation`
*/
["com.affine.notification.invitation-accepted"]: ComponentType<TypedTransProps<{
readonly username: string;

File diff suppressed because it is too large

View File

@@ -622,6 +622,8 @@
"com.affine.import.modal.tip": "Wenn du Unterstützung für zusätzliche Dateitypen anfordern möchtest, lass es uns gerne wissen auf",
"com.affine.import.notion": "Notion",
"com.affine.import.notion.tooltip": "Importiere deine Notion-Daten. Unterstützte Importformate: HTML mit Unterseiten.",
"com.affine.import.obsidian": "Obsidian-Vault",
"com.affine.import.obsidian.tooltip": "Importiere einen Obsidian-Vault. Wähle einen Ordner aus, um alle Notizen, Bilder und Assets mit aufgelösten Wiki-Links zu importieren.",
"com.affine.import.snapshot": "Snapshot",
"com.affine.import.snapshot.tooltip": "Importiere deine AFFiNE-Workspace- und Seiten-Snapshot-Datei.",
"com.affine.import.dotaffinefile": ".affine-Datei",
@@ -2129,6 +2131,7 @@
"com.affine.integration.calendar.no-calendar": "Noch keine abonnierten Kalender.",
"com.affine.integration.mcp-server.name": "MCP-Server",
"com.affine.integration.mcp-server.desc": "Anderen MCP-Clients ermöglichen, die Seite von AFFiNE zu suchen und zu lesen.",
"com.affine.integration.mcp-server.copy-json.disabled-hint": "Der MCP-Token wird nur einmal angezeigt. Lösche ihn und erstelle ihn neu, um die JSON-Konfiguration zu kopieren.",
"com.affine.audio.notes": "Notizen",
"com.affine.audio.transcribing": "Transkription läuft",
"com.affine.audio.transcribe.non-owner.confirm.title": "KI-Ergebnisse für andere können nicht abgerufen werden",
@@ -2176,6 +2179,7 @@
"error.SSRF_BLOCKED_ERROR": "Ungültige URL",
"error.RESPONSE_TOO_LARGE_ERROR": "Antwort zu groß ({{receivedBytes}} Bytes), Limit beträgt {{limitBytes}} Bytes",
"error.EMAIL_SERVICE_NOT_CONFIGURED": "E-Mail-Dienst ist nicht konfiguriert.",
"error.IMAGE_FORMAT_NOT_SUPPORTED": "Bildformat nicht unterstützt: {{format}}",
"error.QUERY_TOO_LONG": "Abfrage ist zu lang, die maximale Länge beträgt {{max}}.",
"error.VALIDATION_ERROR": "Validierungsfehler, Fehler: {{errors}}",
"error.USER_NOT_FOUND": "Benutzer nicht gefunden.",
@@ -2291,7 +2295,7 @@
"error.CANNOT_DELETE_ACCOUNT_WITH_OWNED_TEAM_WORKSPACE": "Konto kann nicht gelöscht werden. Du bist Besitzer eines oder mehrerer Team-Workspaces. Bitte übertrage den Besitz oder lösche die Workspaces zuerst.",
"error.CAPTCHA_VERIFICATION_FAILED": "Captcha-Überprüfung fehlgeschlagen.",
"error.INVALID_LICENSE_SESSION_ID": "Ungültige Sitzungs-ID zur Erstellung des Lizenzschlüssels.",
"error.LICENSE_REVEALED": "Der Lizenzschlüssel wurde aufgedeckt. Bitte überprüfe dein Postfach, das du beim Bezahlvorgang angegeben hast.",
"error.LICENSE_REVEALED": "Der Lizenzschlüssel wurde angezeigt. Bitte überprüfe dein Postfach, das du beim Bezahlvorgang angegeben hast.",
"error.WORKSPACE_LICENSE_ALREADY_EXISTS": "Workspace hat bereits eine Lizenz angewendet.",
"error.LICENSE_NOT_FOUND": "Lizenz nicht gefunden.",
"error.INVALID_LICENSE_TO_ACTIVATE": "Ungültige Lizenz zum Aktivieren. {{reason}}",

View File

@@ -262,7 +262,7 @@
"com.affine.appearanceSettings.translucentUI.title": "Translucent UI on the sidebar",
"com.affine.appearanceSettings.showLinkedDocInSidebar.title": "Show linked doc in sidebar",
"com.affine.appearanceSettings.showLinkedDocInSidebar.description": "Control whether to show the structure of linked docs in the sidebar.",
"com.affine.auth.change.email.message": "Your current email is {{email}}. We'll send a temporary verification link to this email.",
"com.affine.auth.change.email.message": "Your current email is {{email}}. We'll send a confirmation link there first so you can securely switch to a new email address.",
"com.affine.auth.change.email.page.subtitle": "Please enter your new email address below. We will send a verification link to this email address to complete the process.",
"com.affine.auth.change.email.page.success.subtitle": "Congratulations! You have successfully updated the email address associated with your AFFiNE Cloud account.",
"com.affine.auth.change.email.page.success.title": "Email address updated!",
@@ -347,7 +347,7 @@
"com.affine.auth.toast.message.signed-in": "You have been signed in, start to sync your data with AFFiNE Cloud!",
"com.affine.auth.toast.title.failed": "Unable to sign in",
"com.affine.auth.toast.title.signed-in": "Signed in",
"com.affine.auth.verify.email.message": "Your current email is {{email}}. We'll send a temporary verification link to this email.",
"com.affine.auth.verify.email.message": "Your current email is {{email}}. We'll send a verification link to this email so you can confirm it belongs to you.",
"com.affine.backButton": "Back",
"com.affine.banner.content": "This demo is limited. <1>Download the AFFiNE Client</1> for the latest features and Performance.",
"com.affine.banner.local-warning": "Your local data is stored in the browser and may be lost. Don't risk it - enable cloud now!",
@@ -622,6 +622,8 @@
"com.affine.import.modal.tip": "If you'd like to request support for additional file types, feel free to let us know on",
"com.affine.import.notion": "Notion",
"com.affine.import.notion.tooltip": "Import your Notion data. Supported import formats: HTML with subpages.",
"com.affine.import.obsidian": "Obsidian Vault",
"com.affine.import.obsidian.tooltip": "Import an Obsidian vault. Select a folder to import all notes, images, and assets with wikilinks resolved.",
"com.affine.import.snapshot": "Snapshot",
"com.affine.import.snapshot.tooltip": "Import your AFFiNE workspace and page snapshot file.",
"com.affine.import.dotaffinefile": ".affine file",
@@ -1992,7 +1994,7 @@
"com.affine.notification.empty": "No new notifications",
"com.affine.notification.loading-more": "Loading more...",
"com.affine.notification.empty.description": "You'll be notified here for @mentions and workspace invites.",
"com.affine.notification.invitation-accepted": "<1>{{username}}</1> has accept your invitation",
"com.affine.notification.invitation-accepted": "<1>{{username}}</1> has accepted your invitation",
"com.affine.notification.invitation-review-request": "<1>{{username}}</1> has requested to join <2>{{workspaceName}}</2>",
"com.affine.notification.invitation-review-declined": "<1>{{username}}</1> has declined your request to join <2>{{workspaceName}}</2>",
"com.affine.notification.invitation-review-approved": "<1>{{username}}</1> has approved your request to join <2>{{workspaceName}}</2>",

View File

@@ -32,6 +32,7 @@ export const SUPPORTED_LANGUAGES: Record<
name: string;
originalName: string;
flagEmoji: string;
rtl?: boolean;
resource:
| LanguageResource
| (() => Promise<{ default: Partial<LanguageResource> }>);
@@ -149,18 +150,21 @@ export const SUPPORTED_LANGUAGES: Record<
name: 'Urdu',
originalName: 'اردو',
flagEmoji: '🇵🇰',
rtl: true,
resource: () => import('./ur.json'),
},
ar: {
name: 'Arabic',
originalName: 'العربية',
flagEmoji: '🇸🇦',
rtl: true,
resource: () => import('./ar.json'),
},
fa: {
name: 'Persian',
originalName: 'فارسی',
flagEmoji: '🇮🇷',
rtl: true,
resource: () => import('./fa.json'),
},
uk: {

View File

@@ -77,8 +77,8 @@ function requireNative() {
try {
const binding = require('@affine/native-android-arm64')
const bindingPackageVersion = require('@affine/native-android-arm64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -93,8 +93,8 @@ function requireNative() {
try {
const binding = require('@affine/native-android-arm-eabi')
const bindingPackageVersion = require('@affine/native-android-arm-eabi/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -114,8 +114,8 @@ function requireNative() {
try {
const binding = require('@affine/native-win32-x64-gnu')
const bindingPackageVersion = require('@affine/native-win32-x64-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -130,8 +130,8 @@ function requireNative() {
try {
const binding = require('@affine/native-win32-x64-msvc')
const bindingPackageVersion = require('@affine/native-win32-x64-msvc/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -147,8 +147,8 @@ function requireNative() {
try {
const binding = require('@affine/native-win32-ia32-msvc')
const bindingPackageVersion = require('@affine/native-win32-ia32-msvc/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -163,8 +163,8 @@ function requireNative() {
try {
const binding = require('@affine/native-win32-arm64-msvc')
const bindingPackageVersion = require('@affine/native-win32-arm64-msvc/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -182,8 +182,8 @@ function requireNative() {
try {
const binding = require('@affine/native-darwin-universal')
const bindingPackageVersion = require('@affine/native-darwin-universal/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -198,8 +198,8 @@ function requireNative() {
try {
const binding = require('@affine/native-darwin-x64')
const bindingPackageVersion = require('@affine/native-darwin-x64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -214,8 +214,8 @@ function requireNative() {
try {
const binding = require('@affine/native-darwin-arm64')
const bindingPackageVersion = require('@affine/native-darwin-arm64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -234,8 +234,8 @@ function requireNative() {
try {
const binding = require('@affine/native-freebsd-x64')
const bindingPackageVersion = require('@affine/native-freebsd-x64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -250,8 +250,8 @@ function requireNative() {
try {
const binding = require('@affine/native-freebsd-arm64')
const bindingPackageVersion = require('@affine/native-freebsd-arm64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -271,8 +271,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-x64-musl')
const bindingPackageVersion = require('@affine/native-linux-x64-musl/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -287,8 +287,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-x64-gnu')
const bindingPackageVersion = require('@affine/native-linux-x64-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -305,8 +305,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-arm64-musl')
const bindingPackageVersion = require('@affine/native-linux-arm64-musl/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -321,8 +321,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-arm64-gnu')
const bindingPackageVersion = require('@affine/native-linux-arm64-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -339,8 +339,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-arm-musleabihf')
const bindingPackageVersion = require('@affine/native-linux-arm-musleabihf/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -355,8 +355,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-arm-gnueabihf')
const bindingPackageVersion = require('@affine/native-linux-arm-gnueabihf/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -373,8 +373,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-loong64-musl')
const bindingPackageVersion = require('@affine/native-linux-loong64-musl/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -389,8 +389,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-loong64-gnu')
const bindingPackageVersion = require('@affine/native-linux-loong64-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -407,8 +407,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-riscv64-musl')
const bindingPackageVersion = require('@affine/native-linux-riscv64-musl/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -423,8 +423,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-riscv64-gnu')
const bindingPackageVersion = require('@affine/native-linux-riscv64-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -440,8 +440,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-ppc64-gnu')
const bindingPackageVersion = require('@affine/native-linux-ppc64-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -456,8 +456,8 @@ function requireNative() {
try {
const binding = require('@affine/native-linux-s390x-gnu')
const bindingPackageVersion = require('@affine/native-linux-s390x-gnu/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -476,8 +476,8 @@ function requireNative() {
try {
const binding = require('@affine/native-openharmony-arm64')
const bindingPackageVersion = require('@affine/native-openharmony-arm64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -492,8 +492,8 @@ function requireNative() {
try {
const binding = require('@affine/native-openharmony-x64')
const bindingPackageVersion = require('@affine/native-openharmony-x64/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
@@ -508,8 +508,8 @@ function requireNative() {
try {
const binding = require('@affine/native-openharmony-arm')
const bindingPackageVersion = require('@affine/native-openharmony-arm/package.json').version
if (bindingPackageVersion !== '0.26.0' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.0 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
if (bindingPackageVersion !== '0.26.3' && process.env.NAPI_RS_ENFORCE_VERSION_CHECK && process.env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0') {
throw new Error(`Native binding package version mismatch, expected 0.26.3 but got ${bindingPackageVersion}. You can reinstall dependencies to fix this issue.`)
}
return binding
} catch (e) {
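The repeated hunks above all apply the same mechanical change: the napi-rs generated loader bumps its expected binding version from 0.26.0 to 0.26.3 in every per-platform `require` branch. The shared check can be sketched as a standalone helper (a minimal sketch; the constant and function name are illustrative and not part of the generated loader):

```javascript
// Expected version of the platform binding package, as pinned by the
// generated loader (0.26.3 after this change).
const EXPECTED_VERSION = '0.26.3';

// Mirrors the guard repeated in each hunk: the mismatch only becomes an
// error when NAPI_RS_ENFORCE_VERSION_CHECK is set to something other
// than '0'; otherwise a mismatched binding is loaded silently.
function checkBindingVersion(actualVersion, env = process.env) {
  const enforce =
    env.NAPI_RS_ENFORCE_VERSION_CHECK &&
    env.NAPI_RS_ENFORCE_VERSION_CHECK !== '0';
  if (actualVersion !== EXPECTED_VERSION && enforce) {
    throw new Error(
      `Native binding package version mismatch, expected ${EXPECTED_VERSION} but got ${actualVersion}. You can reinstall dependencies to fix this issue.`
    );
  }
}
```

Note that the check is opt-in: without `NAPI_RS_ENFORCE_VERSION_CHECK` in the environment, a stale binding package is still returned, which is why reinstalling dependencies is the suggested fix rather than a hard requirement.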

View File

@@ -1,7 +1,12 @@
import { test } from '@affine-test/kit/electron';
import {
ensureInEdgelessMode,
ensureInPageMode,
} from '@affine-test/kit/utils/editor';
import {
clickNewPageButton,
getBlockSuiteEditorTitle,
waitForEditorLoad,
} from '@affine-test/kit/utils/page-logic';
import { clickSideBarSettingButton } from '@affine-test/kit/utils/sidebar';
import { createLocalWorkspace } from '@affine-test/kit/utils/workspace';
@@ -14,12 +19,55 @@ const historyShortcut = async (page: Page, command: 'goBack' | 'goForward') => {
);
};
const setNewDocDefaultMode = async (
page: Page,
mode: 'page' | 'edgeless' | 'ask'
) => {
const modeTriggerByValue = {
page: 'page-mode-trigger',
edgeless: 'edgeless-mode-trigger',
ask: 'ask-every-time-trigger',
} as const;
await clickSideBarSettingButton(page);
await page.getByTestId('editor-panel-trigger').click();
await page.getByTestId('new-doc-default-mode-trigger').click();
await page.getByTestId(modeTriggerByValue[mode]).click();
await page.getByTestId('modal-close-button').click();
};
test('new page', async ({ page, workspace }) => {
await clickNewPageButton(page);
const flavour = (await workspace.current()).meta.flavour;
expect(flavour).toBe('local');
});
test('application menu respects default new doc mode', async ({
electronApp,
page,
}) => {
await waitForEditorLoad(page);
await ensureInPageMode(page);
await setNewDocDefaultMode(page, 'edgeless');
await electronApp.evaluate(({ BrowserWindow, Menu }) => {
const menuItem =
Menu.getApplicationMenu()?.getMenuItemById('affine:new-page');
const focusedWindow = BrowserWindow.getFocusedWindow();
if (!menuItem) {
throw new Error('Missing application menu item: affine:new-page');
}
if (!focusedWindow) {
throw new Error('Missing focused window for application menu dispatch');
}
menuItem.click(undefined, focusedWindow, focusedWindow.webContents);
});
await ensureInEdgelessMode(page);
});
test('app sidebar router forward/back', async ({ page }) => {
// create pages
await page.waitForTimeout(500);

View File

@@ -1,5 +1,5 @@
<!doctype html>
<html lang="en">
<html lang="en" dir="ltr">
<head>
<meta charset="utf-8" />
<meta

View File

@@ -23442,22 +23442,25 @@ __metadata:
languageName: node
linkType: hard
"fast-xml-builder@npm:^1.0.0":
version: 1.0.0
resolution: "fast-xml-builder@npm:1.0.0"
checksum: 10/06c04d80545e5c9f4d1d6cca00567b5cc09953a92c6328fa48cfb4d7f42630313b8c2bb62e9cb81accee7bb5e1c5312fcae06c3d20dbe52d969a5938233316da
"fast-xml-builder@npm:^1.1.4":
version: 1.1.4
resolution: "fast-xml-builder@npm:1.1.4"
dependencies:
path-expression-matcher: "npm:^1.1.3"
checksum: 10/32937866aaf5a90e69d1f4ee6e15e875248d5b5d2afd70277e9e8323074de4980cef24575a591b8e43c29f405d5f12377b3bad3842dc412b0c5c17a3eaee4b6b
languageName: node
linkType: hard
"fast-xml-parser@npm:^5.3.4":
version: 5.4.1
resolution: "fast-xml-parser@npm:5.4.1"
version: 5.5.6
resolution: "fast-xml-parser@npm:5.5.6"
dependencies:
fast-xml-builder: "npm:^1.0.0"
fast-xml-builder: "npm:^1.1.4"
path-expression-matcher: "npm:^1.1.3"
strnum: "npm:^2.1.2"
bin:
fxparser: src/cli/cli.js
checksum: 10/2b40067c3ad3542ca197d1353bcb0416cd5db20d5c66d74ac176b99af6ff9bd55a6182d36856a2fd477c95b8fc1f07405475f1662a31185480130ba7076c702a
checksum: 10/91a42a0cf99c83b0e721ceef9c189509e96c91c1875901c6ce6017f78ad25284f646a77a541e96ee45a15c2f13b7780d090c906c3ec3f262db03e7feb1e62315
languageName: node
linkType: hard
@@ -23595,14 +23598,14 @@ __metadata:
linkType: hard
"file-type@npm:^21.0.0":
version: 21.3.1
resolution: "file-type@npm:21.3.1"
version: 21.3.2
resolution: "file-type@npm:21.3.2"
dependencies:
"@tokenizer/inflate": "npm:^0.4.1"
strtok3: "npm:^10.3.4"
token-types: "npm:^6.1.1"
uint8array-extras: "npm:^1.4.0"
checksum: 10/0f99d4fa85184ea635cdccdfa677c7838bff790cdffde7fa9ec9f52e94fa8c0e7b6c2eeeb3f6a3d6dcc0a036192c13a8ec7008bdcef374e745ae0d64a162ad33
checksum: 10/3912271811e0c745d43ff1f6c97e66d4b0d890c68d1041de4ef0c8068ede46f725ef3ed0f92c97d0cd2a261f84c3b51881d60ab797e47fa9a15e7ed227f04c85
languageName: node
linkType: hard
@@ -30280,6 +30283,13 @@ __metadata:
languageName: node
linkType: hard
"path-expression-matcher@npm:^1.1.3":
version: 1.1.3
resolution: "path-expression-matcher@npm:1.1.3"
checksum: 10/9a607d0bf9807cf86b0a29fb4263f0c00285c13bedafb6ad3efc8bc87ae878da2faf657a9138ac918726cb19f147235a0ca695aec3e4ea1ee04641b6520e6c9e
languageName: node
linkType: hard
"path-is-absolute@npm:^1.0.0":
version: 1.0.1
resolution: "path-is-absolute@npm:1.0.1"