Compare commits

...

29 Commits

Author SHA1 Message Date
renovate[bot]
311ed7f6e1 chore: bump up Apollo GraphQL packages 2026-03-21 19:07:09 +00:00
renovate[bot]
ffa3ff9d7f chore: bump up apple/swift-collections version to from: "1.4.1" (#14697)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [apple/swift-collections](https://redirect.github.com/apple/swift-collections) | patch | `from: "1.4.0"` → `from: "1.4.1"` |

---

### Release Notes

<details>
<summary>apple/swift-collections (apple/swift-collections)</summary>

### [`v1.4.1`](https://redirect.github.com/apple/swift-collections/releases/tag/1.4.1): Swift Collections 1.4.1

[Compare Source](https://redirect.github.com/apple/swift-collections/compare/1.4.0...1.4.1)

This patch release focuses mostly on evolving the package traits
`UnstableContainersPreview` and `UnstableHashedContainers`, with the
following notable fixes and improvements to the stable parts of the
package:

- Make the package documentation build successfully with the DocC that
ships in Swift 6.2.
- Avoid using floating-point arithmetic to size collection storage in
the `DequeModule` and `OrderedCollections` modules.

#### Changes to experimental package traits

Several correctness issues in the insertion implementations of the new
set and dictionary types enabled by the `UnstableHashedContainers` trait
have been resolved, and the types have gained some low-hanging
performance optimizations. Like before, these types are in the "working
prototype" phase: their basic primitive operations work, but we haven't
done much work validating their performance yet. Feedback from intrepid
early adopters would be very welcome.

The `UnstableContainersPreview` trait has gained several new protocols
and algorithm implementations, working towards one possible design for a
coherent, ownership-aware container/iteration model:

- [`BidirectionalContainer`][BidirectionalContainer] defines a container
that allows iterating over spans backwards, and provides decrement
operations on indices -- an analogue of the classic
`BidirectionalCollection` protocol.
- [`RandomAccessContainer`][RandomAccessContainer] models containers
that allow constant-time repositioning of their indices, like
`RandomAccessCollection`.
- [`MutableContainer`][MutableContainer] is the ownership-aware analogue
of `MutableCollection` -- it models a container type that allows its
elements to be arbitrarily reordered and mutated/reassigned without
changing the shape of the data structure (that is to say, without
invalidating any indices).
- [`PermutableContainer`][PermutableContainer] is an experimental new
spinoff of `MutableContainer`, focusing on reordering items without
allowing arbitrary mutations.
- [`RangeReplaceableContainer`][RangeReplaceableContainer] is a partial,
ownership-aware analogue of `RangeReplaceableCollection`, providing a
full set of insertion/append/removal/consumption operations, with
support for fixed-capacity conforming types.
- [`DynamicContainer`][DynamicContainer] rounds out the
range-replacement operations with initializer and capacity reservation
requirements that can only be implemented by dynamically sized
containers.

[BidirectionalContainer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container/BidirectionalContainer.swift

[RandomAccessContainer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container/RandomAccessContainer.swift

[MutableContainer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container/MutableContainer.swift

[PermutableContainer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container/PermutableContainer.swift

[RangeReplaceableContainer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container/RangeReplaceableContainer.swift

[DynamicContainer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container/DynamicContainer.swift

- We now have [working reference
implementations](https://redirect.github.com/apple/swift-collections/tree/main/Sources/ContainersPreview/Protocols)
of lazy `map`, `reduce` and `filter` operations on borrowing iterators,
producers and drains, as well as a `collect(into:)` family of methods
that supplies "greedy" variants, generating items into a container of
the user's choice. Importantly, the algorithms tend to be defined on the
iterator types, rather than directly on some sequence/container. Going
this way has some interesting benefits (explicitness, no confusion with
the various flavors of the existing `Sequence` algorithms), but it also
has notable drawbacks (minor design issues with the borrowing iterator
protocol, unknowns about how the pattern would apply to container
algorithms, etc.).

```swift
    let items: RigidArray<Int> = ...
    let transformed = 
      items.makeBorrowingIterator() // obviously we'd want a better name here, like `borrow()`
      .map { 2 * $0 }
      .collect(into: UniqueArray.self)
    // `transformed` is a UniqueArray instance holding all values in `items`, doubled up 
```

```swift
    let items: RigidArray = ...
    let transformed = 
       items.makeBorrowingIterator()
      .filter { !$0.isMultiple(of: 7) }
      .copy()
      .collect(into: UniqueArray.self)
    // `transformed` holds a copy of all values in `items` that aren't a multiple of 7
```

```swift
    let items: RigidArray = ...
    let transformed = 
       items.consumeAll()
      .filter { !$0.isMultiple(of: 7) }
      .collect(into: UniqueArray.self)
    // `transformed` holds all values that were previously in `items` that aren't a multiple of 7. `items` is now empty.
```

Like before, these are highly experimental, and they will definitely
change in dramatic/radical ways on the way to stabilization. Note that
there is no project- or team-wide consensus on any of these constructs.
I'm publishing them primarily as a crucial reference point, and to gain
a level of shared understanding of the actual problems that need to be
resolved, and the consequences of the design path we are on.

#### What's Changed

- Add some decorative badges in the README by [@lorentey](https://redirect.github.com/lorentey) in [#591](https://redirect.github.com/apple/swift-collections/pull/591)
- \[DequeModule, OrderedCollections] Avoid using floating point arithmetic by [@lorentey](https://redirect.github.com/lorentey) in [#592](https://redirect.github.com/apple/swift-collections/pull/592)
- Enforce dress code for license headers by [@lorentey](https://redirect.github.com/lorentey) in [#593](https://redirect.github.com/apple/swift-collections/pull/593)
- Bump swiftlang/github-workflows/.github/workflows/soundness.yml from 0.0.7 to 0.0.8 by [@dependabot](https://redirect.github.com/dependabot)\[bot] in [#595](https://redirect.github.com/apple/swift-collections/pull/595)
- Documentation updates for latest DocC by [@lorentey](https://redirect.github.com/lorentey) in [#596](https://redirect.github.com/apple/swift-collections/pull/596)
- \[BasicContainers] Allow standalone use of the UnstableHashedContainers trait by [@lorentey](https://redirect.github.com/lorentey) in [#597](https://redirect.github.com/apple/swift-collections/pull/597)
- Bump swiftlang/github-workflows/.github/workflows/swift\_package\_test.yml from 0.0.7 to 0.0.8 by [@dependabot](https://redirect.github.com/dependabot)\[bot] in [#594](https://redirect.github.com/apple/swift-collections/pull/594)
- \[ContainersPreview] Rename Producer.generateNext() to next() by [@lorentey](https://redirect.github.com/lorentey) in [#599](https://redirect.github.com/apple/swift-collections/pull/599)
- \[ContainersPreview] Remove BorrowingSequence.first by [@lorentey](https://redirect.github.com/lorentey) in [#598](https://redirect.github.com/apple/swift-collections/pull/598)
- \[CI] Enable Android testing by [@marcprux](https://redirect.github.com/marcprux) in [#558](https://redirect.github.com/apple/swift-collections/pull/558)
- \[BasicContainers] Assorted hashed container fixes and improvements by [@lorentey](https://redirect.github.com/lorentey) in [#601](https://redirect.github.com/apple/swift-collections/pull/601)
- Flesh out BorrowingSequence/Container/Producer model a little more by [@lorentey](https://redirect.github.com/lorentey) in [#603](https://redirect.github.com/apple/swift-collections/pull/603)
- More exploration of ownership-aware container/iterator algorithms by [@lorentey](https://redirect.github.com/lorentey) in [#605](https://redirect.github.com/apple/swift-collections/pull/605)

#### New Contributors

- [@marcprux](https://redirect.github.com/marcprux) made their first contribution in [#558](https://redirect.github.com/apple/swift-collections/pull/558)

**Full Changelog**:
<https://github.com/apple/swift-collections/compare/1.4.0...1.4.1>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-22 02:52:07 +08:00
DarkSky
f47ee2bc8a feat(server): improve indexer (#14698)
fix #13862 


#### PR Dependency Tree


* **PR #14698** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **New Features**
* Enhanced search support for Chinese, Japanese, and Korean languages
with improved text segmentation and character matching.
* Added index management capabilities with table recreation
functionality.

* **Bug Fixes**
* Improved search accuracy for non-Latin scripts through updated
morphology and n-gram configuration.

* **Chores**
  * Added database migration for search index optimization.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-22 02:50:59 +08:00
DarkSky
bcf2a51d41 feat(native): record encoding (#14188)
fix #13784 

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Start/stop system or meeting recordings with Ogg/Opus artifacts and
native start/stop APIs; workspace backup recovery.

* **Refactor**
* Simplified recording lifecycle and UI flows; native runtime now
orchestrates recording/processing and reporting.

* **Bug Fixes**
* Stronger path validation, safer import/export dialogs, consistent
error handling/logging, and retry-safe recording processing.

* **Chores**
* Added cross-platform native audio capture and Ogg/Opus encoding
support.

* **Tests**
* New unit, integration, and e2e tests for recording, path guards,
dialogs, and workspace recovery.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-22 02:50:14 +08:00
DarkSky
6a93566422 chore: bump deps (#14690)
#### PR Dependency Tree


* **PR #14690** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Chores**
* Updated package manager and development tooling to latest compatible
versions.
* Updated backend framework and monitoring dependencies to latest
minor/patch releases.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 05:23:03 +08:00
DarkSky
7ac8b14b65 feat(editor): migrate typst mermaid to native (#14499)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Native/WASM Mermaid and Typst SVG preview rendering on desktop and
mobile, plus cross-platform Preview plugin integrations.

* **Improvements**
* Centralized, sanitized rendering bridge with automatic Typst
font-directory handling and configurable native renderer selection.
* More consistent and robust error serialization and worker-backed
preview flows for improved stability and performance.

* **Tests**
* Extensive unit and integration tests for preview rendering, font
discovery, sanitization, and error serialization.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 04:04:40 +08:00
DarkSky
16a8f17717 feat(server): improve oidc compatibility (#14686)
fix #13938 
fix #14683 
fix #14532

#### PR Dependency Tree


* **PR #14686** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Flexible OIDC claim mapping for email/name, automatic OIDC discovery
retry with exponential backoff, and explicit OAuth flow modes (popup vs
redirect) propagated through the auth flow.

* **Bug Fixes**
* Stricter OIDC email validation, clearer error messages listing
attempted claim candidates, and improved callback redirect handling for
various flow scenarios.

* **Tests**
* Added unit tests covering OIDC behaviors, backoff scheduler/promise
utilities, and frontend OAuth flow parsing/redirect logic.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 04:02:37 +08:00
DarkSky
1ffb8c922c fix(native): cleanup deleted docs and blobs (#14689) 2026-03-20 04:00:25 +08:00
DarkSky
daf536f77a fix(native): misalignment between index clock and snapshot clock (#14688)
fix #14191

#### PR Dependency Tree


* **PR #14688** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Improved indexer synchronization timing for clock persistence to
prevent premature completion signals
  * Enhanced document-level indexing status tracking accuracy
  * Optimized refresh behavior for better state consistency

* **Chores**
  * Updated indexer versioning system

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-20 02:09:11 +08:00
congzhou09
0d2d4bb6a1 fix(editor): note-edgeless-block loses edit state during shift-click range selection (#14675)
### Problem
●In edgeless mode, when using Shift + click to perform range selection
inside an editing `note-edgeless-block` (click at the starting point,
then hold Shift and click at the end point), the block will unexpectedly
lose its editing and selection state. As a result, subsequent operations
on the selection - such as deleting and moving - no longer work.

●The following video demonstrates this issue:


https://github.com/user-attachments/assets/82c68683-e002-4a58-b011-fe59f7fc9f02

### Solution
●The reason is that this "Shift + click" behavior is being handled by
the default multi-selection logic, which toggles selection mode and
exits the editing state. So I added an `else-if` branch to match this
case.
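The branching described above can be modeled with a minimal sketch (the state shape and function names here are hypothetical, not the actual AFFiNE handler):

```javascript
// Illustrative model of the selection logic; names are hypothetical.
function handleClick(state, { shiftKey }) {
  if (!shiftKey) {
    // Plain click inside the note: enter/stay in the editing state.
    state.mode = 'editing';
  } else if (state.mode === 'editing') {
    // New else-if branch: Shift+click while editing extends the text
    // range instead of falling through to block multi-selection, so
    // the note-edgeless-block keeps its editing state.
    state.rangeExtended = true;
  } else {
    // Default multi-selection: toggles selection mode and exits
    // editing, which is what previously clobbered the editing state.
    state.mode = 'block-selected';
  }
  return state;
}

let state = handleClick({ mode: 'idle' }, { shiftKey: false });
state = handleClick(state, { shiftKey: true });
console.log(state); // { mode: 'editing', rangeExtended: true }
```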

### After
●The video below shows the behavior after this fix.


https://github.com/user-attachments/assets/18d61108-2089-4def-b2dc-ae13fc5ac333

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Improved selection behavior during note editing in multi-select mode
to provide more intuitive interaction when using range selection during
active editing.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-19 22:22:22 +08:00
Mohad
cb9897d493 fix(i18n): support Arabic comma separator in date-picker weekDays and monthNames (#14663)
## Problem

The Arabic locale strings in `ar.json` use the Arabic comma `،` (U+060C)
as separator:

```json
"com.affine.calendar-date-picker.week-days": "أ،إث،ث،أر،خ،ج،س"
```

But `day-picker.tsx` splits on ASCII comma only — causing all
weekday/month names to render as a single unsplit string in Arabic
locale.

## Fix

Change `.split(',')` to `.split(/[,،]/)` in two call sites — matches
both ASCII and Arabic comma.

## Impact

One-line fix per call site. No other functionality affected. All
non-Arabic locales unchanged.
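A minimal sketch of the behavior difference, using the weekday string from `ar.json` quoted above (the helper name is illustrative):

```javascript
// Weekday string from ar.json uses the Arabic comma U+060C as separator.
const arWeekDays = 'أ،إث،ث،أر،خ،ج،س';

// Old behavior: splitting on the ASCII comma leaves the string unsplit.
const before = arWeekDays.split(',');

// Fix: a character class matching both ASCII and Arabic commas.
const splitLabels = (s) => s.split(/[,،]/);

console.log(before.length);                        // 1, one unsplit string
console.log(splitLabels(arWeekDays).length);       // 7, one label per weekday
console.log(splitLabels('Su,Mo,Tu,We,Th,Fr,Sa'));  // non-Arabic locales unchanged
```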

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Bug Fixes**
* Date picker rendering updated to correctly handle both ASCII and
Arabic/Persian comma formats when determining month and weekday labels.
This fixes inconsistent header and month-name displays in locales using
different comma characters while preserving existing interactions and
behavior.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-19 22:21:51 +08:00
Ishan Goswami
8ca8333cd6 chore(server): update exa search tool description (#14682)
Updated the Exa search tool description to better reflect what Exa does.

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **Chores**
* Clarified the web search tool description to state it uses Exa, a web
search API optimized for AI applications to improve labeling and user
understanding.
* No functional or behavioral changes to the tool; this update affects
only the displayed description users see.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: ishan <ishan@exa.ai>
2026-03-19 05:42:04 +08:00
George Kapetanakis
3bf2503f55 fix(tools): improve sed error handling in set-version script (#14684)
## Summary
Replace post-command status checks with inline failure handling around
`sed` calls.
In the stream update path, ensure the two `sed` operations are treated
as one success/failure unit.
Keep behavior and file outputs the same on success, while making failure
handling explicit.

## Why
When `set -e` is enabled (which the script itself enables) command
failures cause the script to exit, making error handling by checking
`$?` not work.

## Files affected
- `set-version.sh`

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Refactor**
* Enhanced error handling in version management script with improved
failure reporting and context information.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-19 05:36:41 +08:00
DarkSky
59fd942f40 fix(editor): database detail style (#14680)
fix #13923


#### PR Dependency Tree


* **PR #14680** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Style**
* Refined styling and alignment for number field displays in the
database view component.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-18 14:58:53 +08:00
DarkSky
d6d5ae6182 fix(electron): create doc shortcut should follow default type in settings (#14678) 2026-03-18 14:58:22 +08:00
renovate[bot]
c1a09b951f chore: bump up fast-xml-parser version to v5.5.6 [SECURITY] (#14676)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [fast-xml-parser](https://redirect.github.com/NaturalIntelligence/fast-xml-parser) | [`5.4.1` → `5.5.6`](https://renovatebot.com/diffs/npm/fast-xml-parser/5.4.1/5.5.6) | ![age](https://developer.mend.io/api/mc/badges/age/npm/fast-xml-parser/5.5.6?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/fast-xml-parser/5.4.1/5.5.6?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2026-33036](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/security/advisories/GHSA-8gc5-j5rx-235r)

## Summary

The fix for CVE-2026-26278 added entity expansion limits
(`maxTotalExpansions`, `maxExpandedLength`, `maxEntityCount`,
`maxEntitySize`) to prevent XML entity expansion Denial of Service.
However, these limits are only enforced for DOCTYPE-defined entities.
**Numeric character references** (`&#NNN;` and `&#xHH;`) and standard
XML entities (`&lt;`, `&gt;`, etc.) are processed through a separate
code path that does NOT enforce any expansion limits.

An attacker can use massive numbers of numeric entity references to
completely bypass all configured limits, causing excessive memory
allocation and CPU consumption.

## Affected Versions

fast-xml-parser v5.x through v5.5.3 (and likely v5.5.5 on npm)

## Root Cause

In `src/xmlparser/OrderedObjParser.js`, the `replaceEntitiesValue()`
function has two separate entity replacement loops:

1. **Lines 638-670**: DOCTYPE entities — expansion counting with
`entityExpansionCount` and `currentExpandedLength` tracking. This was
the CVE-2026-26278 fix.
2. **Lines 674-677**: `lastEntities` loop — replaces standard entities
including `num_dec` (`/&#([0-9]{1,7});/g`) and `num_hex`
(`/&#x([0-9a-fA-F]{1,6});/g`). **This loop has NO expansion counting at
all.**

The numeric entity regex replacements at lines 97-98 are part of
`lastEntities` and go through the uncounted loop, completely bypassing
the CVE-2026-26278 fix.

## Proof of Concept

```javascript
const { XMLParser } = require('fast-xml-parser');

// Even with strict explicit limits, numeric entities bypass them
const parser = new XMLParser({
  processEntities: {
    enabled: true,
    maxTotalExpansions: 10,
    maxExpandedLength: 100,
    maxEntityCount: 1,
    maxEntitySize: 10
  }
});

// 100K numeric entity references — should be blocked by maxTotalExpansions=10
const xml = `<root>${'&#65;'.repeat(100000)}</root>`;
const result = parser.parse(xml);

// Output: 500,000 chars — bypasses maxExpandedLength=100 completely
console.log('Output length:', result.root.length);  // 500000
console.log('Expected max:', 100);  // limit was 100
```

**Results:**
- 100K `&#65;` references → 500,000 char output (5x default
maxExpandedLength of 100,000)
- 1M references → 5,000,000 char output, ~147MB memory consumed
- Even with `maxTotalExpansions=10` and `maxExpandedLength=100`, 10K
references produce 50,000 chars
- Hex entities (`&#x41;`) exhibit the same bypass

## Impact

**Denial of Service** — An attacker who can provide XML input to
applications using fast-xml-parser can cause:
- Excessive memory allocation (147MB+ for 1M entity references)
- CPU consumption during regex replacement
- Potential process crash via OOM

This is particularly dangerous because the application developer may
have explicitly configured strict entity expansion limits believing they
are protected, while numeric entities silently bypass all of them.

## Suggested Fix

Apply the same `entityExpansionCount` and `currentExpandedLength`
tracking to the `lastEntities` loop (lines 674-677) and the HTML
entities loop (lines 680-686), similar to how DOCTYPE entities are
tracked at lines 638-670.
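A hedged sketch of that accounting applied to a numeric-entity loop (illustrative only; this is not the actual fast-xml-parser source, and the option names mirror the advisory's terminology, not necessarily the library's API):

```javascript
// Sketch: numeric-entity replacement with the expansion accounting the
// advisory suggests. Option names follow the advisory, not the library.
function replaceNumericEntities(val, { maxTotalExpansions = 10, maxExpandedLength = 100 } = {}) {
  let expansions = 0;
  const out = val.replace(/&#(x[0-9a-fA-F]{1,6}|[0-9]{1,7});/g, (match, code) => {
    // Count every replacement, exactly like the DOCTYPE-entity path does.
    if (++expansions > maxTotalExpansions) {
      throw new Error('Entity expansion limit exceeded');
    }
    return String.fromCodePoint(
      code[0] === 'x' ? parseInt(code.slice(1), 16) : parseInt(code, 10)
    );
  });
  // Enforce the expanded-length cap on the result as well.
  if (out.length > maxExpandedLength) {
    throw new Error('Expanded length limit exceeded');
  }
  return out;
}

console.log(replaceNumericEntities('<b>&#65;&#x42;</b>')); // "<b>AB</b>"
// replaceNumericEntities('&#65;'.repeat(100)) now throws instead of expanding
```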

## Workaround

Set `htmlEntities:false`

---

### Release Notes

<details>
<summary>NaturalIntelligence/fast-xml-parser (fast-xml-parser)</summary>

### [`v5.5.6`](e54155f530...870043e75e)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.5...v5.5.6)

### [`v5.5.5`](ea07bb2e84...e54155f530)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.4...v5.5.5)

### [`v5.5.4`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.3...ea07bb2e8435a88136c0e46d7ee8a345107b7582)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.3...v5.5.4)

### [`v5.5.3`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.2...v5.5.3)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.2...v5.5.3)

### [`v5.5.2`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.1...e0a14f7d15a293732e630ce1b7faa39924de2359)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.1...v5.5.2)

### [`v5.5.1`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/releases/tag/v5.5.1): integrate path-expression-matcher

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.0...v5.5.1)

- support path-expression-matcher
- fix: stopNode should not be parsed
- performance improvement for stopNode checking

### [`v5.5.0`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.2...ce017923460f92861e8fc94c91e52f9f5bd6a1b0)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.2...v5.5.0)

### [`v5.4.2`](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.1...v5.4.2)

[Compare Source](https://redirect.github.com/NaturalIntelligence/fast-xml-parser/compare/v5.4.1...v5.4.2)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-18 13:28:53 +08:00
DarkSky
4ce68d74f1 fix(editor): chat cannot scroll on windows (#14677)
fix #14529 
fix #14612 
replace #14614 #14657


#### PR Dependency Tree


* **PR #14677** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Tests**
* Added test coverage for scroll position tracking and pinned scroll
behavior in AI chat
* Added test suite verifying scroll-to-end and scroll-to-position
functionality

* **New Features**
* Introduced configurable scrollable option for text rendering in AI
chat components, allowing control over scroll behavior

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-18 13:28:05 +08:00
chauhan_s
fbfcc01d14 fix(core): reserve space for auth input error to avoid layout shift (#14670)
Prevents layout shift when showing auth input errors by reserving space
for the error message. Improves visual stability and avoids UI jumps
when validation errors appear.

### Before 


https://github.com/user-attachments/assets/7439aa5e-069d-42ac-8963-e5cdee341ad9



### After

https://github.com/user-attachments/assets/8e758452-5323-4807-8a0d-38913303020d


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Refactor**
* Improved error message display mechanism in authentication components
for more consistent rendering.

* **Style**
* Enhanced vertical spacing for error messages in form inputs to ensure
better visual consistency and readability.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-18 10:48:50 +08:00
DarkSky
1112a06623 fix: ci 2026-03-17 23:32:57 +08:00
chauhan_s
bbcb7e69fe fix: correct "has accept" to "has accepted" (#14669)
fixes #14407

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Corrected grammar in the notification message displayed when an
invitation is accepted.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-17 23:29:28 +08:00
steffenrapp
cc2f23339e feat(i18n): update German translation (#14674)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Documentation**
* Enhanced German language support with new translations for Obsidian
import, MCP server integration, and Copilot features. Improved error
message translations for better clarity and consistency.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-17 23:28:36 +08:00
chauhan_s
31101a69e7 fix: Refine verify email dialog for verify and change email flows (#14671)
### Summary
This PR improves the verify email dialog by giving the verify-email and
change-email flows distinct messaging instead of reusing the same
generic copy.

### What changed
* Use flow-specific body copy in the verify email dialog
* Keep the existing action-specific subtitle behavior for:
  * Verify email
  * Change email
* Update the English i18n strings so each flow explains the correct
intent:
  * Verify email focuses on confirming email ownership
  * Change email focuses on securely starting the email-change process
### Why
The previous dialog message was shared across both flows, which made the
change-email experience feel ambiguous. This update makes the intent
clearer for users and better matches the action they are taking.



https://www.loom.com/share/c64c20570a8242358bd178a2ac50e413


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

* **Bug Fixes**
* Improved clarity in email verification and email change dialog
messages to better explain the confirmation process and link purpose.
* Enhanced distinction between email verification and email change
workflows with context-specific messaging.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-17 23:28:16 +08:00
Francisco Jiménez
0b1a44863f feat(editor): add obsidian vault import support (#14593)
fix #14592 

### Description
> 🤖 **Note:** The code in this Pull Request was developed with the
assistance of AI, but has been thoroughly reviewed and manually tested.

> I noticed there's a check when opening an issue that asks _"Is your
content generated by AI?"_, so I mention it here in case it's a deal
breaker. If so, I understand; you can close the PR. I just wanted to
share this in case it's useful anyway.

This PR introduces **Obsidian Vault Import Support** to AFFiNE. 

Previously, users migrating from Obsidian had to rely on the generic
Markdown importer, which often resulted in broken cross-links, missing
directory structures, and metadata conflicts because Obsidian relies
heavily on proprietary structures not supported by standard Markdown.

This completely new feature makes migrating to AFFiNE easy.

**Key Features & Implementations:**

1. **Vault (Directory) Selection**
- Utilizes the `openDirectory` blocksuite utility in the import modal to
allow users to select an entire folder directly from their filesystem,
maintaining file context rather than forcing `.zip` uploads.

2. **Wikilink Resolution (Two-Pass Import)**
- Restructured the `importObsidianVault` process into a two-pass
architecture.
- **Pass 1:** Discovers all files, assigns new AFFiNE document IDs, and
maps them efficiently (by title, alias, and filename) into a
high-performance hash map.
- **Pass 2:** Processes the generic markdown AST and correctly maps
custom `[[wikilinks]]` to the actual pre-registered AFFiNE blocksuite
document IDs via `obsidianWikilinkToDeltaMatcher`.
- Safely strips leading emojis from wikilink aliases to prevent
duplicated page icons rendering mid-sentence.

3. **Emoji Metadata & State Fixes**
- Implemented an aggressive, single-pass RegExp to extract multiple
leading/combining emojis (`Emoji_Presentation` / `\ufe0f`) from H1
headers and Frontmatter. Emojis are assigned specifically to the page
icon metadata property and cleanly stripped from the visual document
title.
- Fixed a core mutation bug where the loop iterating over existing
`docMetas` was aggressively overwriting newly minted IDs for the current
import batch. This fully resolves the issue where imported pages
(especially re-imports) were incorrectly flagged as `trashed`.
   - Enforces explicit `trash: false` patch instructions.

4. **Syntax Conversion**
- Implemented conversion of Obsidian-style Callouts (`> [!NOTE] Title`)
into native AFFiNE block formats (`> 💡 **Title**`).
- Hardened the `blockquote` parser so that nested structures (like `> -
list items`) are fully preserved instead of discarded.
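
The two-pass shape described in point 2 can be sketched in isolation (a minimal illustration; `importVault`, the `doc-N` IDs, and the `@doc:` placeholder are hypothetical stand-ins, not AFFiNE's actual `importObsidianVault` internals):

```typescript
type VaultFile = { name: string; aliases: string[]; content: string };

function importVault(files: VaultFile[]): Map<string, string> {
  // Pass 1: assign an ID to every note and index it by filename and alias.
  const idByKey = new Map<string, string>();
  const rawById = new Map<string, string>();
  files.forEach((file, index) => {
    const docId = `doc-${index}`; // stand-in for a freshly minted document ID
    rawById.set(docId, file.content);
    idByKey.set(file.name.replace(/\.md$/, ''), docId);
    for (const alias of file.aliases) idByKey.set(alias, docId);
  });

  // Pass 2: rewrite [[wikilinks]] now that every target already has an ID,
  // leaving unresolvable links untouched.
  const resolvedById = new Map<string, string>();
  for (const [docId, content] of rawById) {
    resolvedById.set(
      docId,
      content.replace(/\[\[([^\]|]+)(?:\|[^\]]*)?\]\]/g, (match, target) =>
        idByKey.has(target) ? `@doc:${idByKey.get(target)}` : match
      )
    );
  }
  return resolvedById;
}
```

Because every target is registered before any content is rewritten, forward references resolve just as reliably as backward ones, regardless of file processing order.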

### UI Changes
- Updated the Import Modal to include the "Import Obsidian Vault" flow
utilizing the native filesystem directory picker.
- Regenerated `i18n-completenesses.json` and synced it to 100%
completeness across all supported locales for the new modal strings.

### Testing Instructions
1. Navigate to the Workspace sidebar and click "Import".
2. Select "Obsidian" and use the directory picker to choose a Vault
folder.
3. Validate that cross-links between documents automatically resolve to
the corresponding AFFiNE documents.
4. Validate that documents with leading emojis display exactly one emoji
(in the page icon area) and none duplicated in the actual title header.
5. Validate Callouts are rendered cleanly and correctly, and no
documents are incorrectly marked as "Trash".


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Import Obsidian vaults with wikilink resolution, emoji/title
preservation, asset handling, and automatic document creation.
* Folder-based imports via a Directory Picker (with hidden-input
fallback) integrated into the import dialog.

* **Localization**
  * Added Obsidian import label and tooltip translations.

* **Tests**
* Added end-to-end tests validating Obsidian vault import and asset
handling.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: DarkSky <25152247+darkskygit@users.noreply.github.com>
Co-authored-by: DarkSky <darksky2048@gmail.com>
2026-03-17 00:49:17 +08:00
DarkSky
8406f9656e perf(editor): improve bounding box calc caching (#14668)
#### PR Dependency Tree


* **PR #14668** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)
2026-03-16 23:35:38 +08:00
DarkSky
121c0d172d feat(server): improve doc tools error handle (#14662)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Centralized sync/status messages for cloud document sync and explicit
user-facing error types.
* Frontend helpers to detect and display tool errors with friendly
names.

* **Bug Fixes**
* Consistent, actionable error reporting for document and attachment
reads instead of silent failures.
* Search and semantic tools now validate workspace sync and permissions
and return clear responses.

* **Tests**
* Added comprehensive tests covering document/blob reads, search tools,
and sync/error paths.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-16 02:20:35 +08:00
renovate[bot]
8f03090780 chore: bump up Lakr233/MarkdownView version to from: "3.8.2" (#14658)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [Lakr233/MarkdownView](https://redirect.github.com/Lakr233/MarkdownView) | minor | `from: "3.6.3"` → `from: "3.8.2"` |

---

### Release Notes

<details>
<summary>Lakr233/MarkdownView (Lakr233/MarkdownView)</summary>

###
[`v3.8.2`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.1...3.8.2)

[Compare
Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.1...3.8.2)

###
[`v3.8.1`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.0...3.8.1)

[Compare
Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.8.0...3.8.1)

###
[`v3.8.0`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.7.0...3.8.0)

[Compare
Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.7.0...3.8.0)

###
[`v3.7.0`](https://redirect.github.com/Lakr233/MarkdownView/compare/3.6.3...3.7.0)

[Compare
Source](https://redirect.github.com/Lakr233/MarkdownView/compare/3.6.3...3.7.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-16 00:57:48 +08:00
renovate[bot]
8125cc0e75 chore: bump up Lakr233/ListViewKit version to from: "1.2.0" (#14617)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [Lakr233/ListViewKit](https://redirect.github.com/Lakr233/ListViewKit) | minor | `from: "1.1.8"` → `from: "1.2.0"` |

---

### Release Notes

<details>
<summary>Lakr233/ListViewKit (Lakr233/ListViewKit)</summary>

###
[`v1.2.0`](https://redirect.github.com/Lakr233/ListViewKit/compare/1.1.8...1.2.0)

[Compare
Source](https://redirect.github.com/Lakr233/ListViewKit/compare/1.1.8...1.2.0)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My41OS4wIiwidXBkYXRlZEluVmVyIjoiNDMuNTkuMCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-14 23:45:32 +08:00
renovate[bot]
f537a75f01 chore: bump up file-type version to v21.3.2 [SECURITY] (#14655)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [file-type](https://redirect.github.com/sindresorhus/file-type) | [`21.3.1` → `21.3.2`](https://renovatebot.com/diffs/npm/file-type/21.3.1/21.3.2) | ![age](https://developer.mend.io/api/mc/badges/age/npm/file-type/21.3.2?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/file-type/21.3.1/21.3.2?slim=true) |

### GitHub Vulnerability Alerts

####
[CVE-2026-31808](https://redirect.github.com/sindresorhus/file-type/security/advisories/GHSA-5v7r-6r5c-r473)

### Impact
A denial of service vulnerability exists in the ASF (WMV/WMA) file type
detection parser. When parsing a crafted input where an ASF sub-header
has a `size` field of zero, the parser enters an infinite loop. The
`payload` value becomes negative (-24), causing
`tokenizer.ignore(payload)` to move the read position backwards, so the
same sub-header is read repeatedly forever.

Any application that uses `file-type` to detect the type of
untrusted/attacker-controlled input is affected. An attacker can stall
the Node.js event loop with a 55-byte payload.
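
The stuck read position can be shown with a few lines of arithmetic (a hedged sketch of the loop shape, not file-type's actual parser code):

```typescript
// Each iteration reads a 24-byte ASF sub-header, then skips `payload` bytes
// via tokenizer.ignore(payload). With size === 0, payload = -24, so the net
// movement per iteration is zero and the same sub-header is read forever.
const ASF_SUB_HEADER_BYTES = 24;

function nextReadPosition(position: number, declaredSize: number): number {
  const payload = declaredSize - ASF_SUB_HEADER_BYTES; // size 0 → -24
  return position + ASF_SUB_HEADER_BYTES + payload;
}
```

Rejecting sub-headers with `declaredSize < 24`, so `payload` can never go negative, is one way to guarantee forward progress.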

### Patches
Fixed in version 21.3.1. Users should upgrade to >= 21.3.1.

### Workarounds
Validate or limit the size of input buffers before passing them to
`file-type`, or run file type detection in a worker thread with a
timeout.
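
The timeout half of that workaround can be a generic wrapper (a sketch; `withTimeout` is not part of file-type's API):

```typescript
async function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  try {
    // A promise race only helps when the work yields to the event loop; the
    // synchronous ASF loop above never yields, which is why the advisory also
    // suggests a worker thread that can be terminated from outside.
    return await Promise.race([work, deadline]);
  } finally {
    clearTimeout(timer);
  }
}
```

Wrap the detection call, e.g. `withTimeout(fileTypeFromBuffer(buf), 2000)`.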

### References
- Fix commit: 319abf871b50ba2fa221b4a7050059f1ae096f4f

### Reporter

crnkovic@lokvica.com

####
[CVE-2026-32630](https://redirect.github.com/sindresorhus/file-type/security/advisories/GHSA-j47w-4g3g-c36v)

## Summary

A crafted ZIP file can trigger excessive memory growth during type
detection in `file-type` when using `fileTypeFromBuffer()`,
`fileTypeFromBlob()`, or `fileTypeFromFile()`.

In affected versions, the ZIP inflate output limit is enforced for
stream-based detection, but not for known-size inputs. As a result, a
small compressed ZIP can cause `file-type` to inflate and process a much
larger payload while probing ZIP-based formats such as OOXML. In testing
on `file-type` `21.3.1`, a ZIP of about `255 KB` caused about `257 MB`
of RSS growth during `fileTypeFromBuffer()`.

This is an availability issue. Applications that use these APIs on
untrusted uploads can be forced to consume large amounts of memory and
may become slow or crash.

## Root Cause

The ZIP detection logic applied different limits depending on whether
the tokenizer had a known file size.

For stream inputs, ZIP probing was bounded by
`maximumZipEntrySizeInBytes` (`1 MiB`). For known-size inputs such as
buffers, blobs, and files, the code instead used
`Number.MAX_SAFE_INTEGER` in two relevant places:

```js
const maximumContentTypesEntrySize = hasUnknownFileSize(tokenizer)
	? maximumZipEntrySizeInBytes
	: Number.MAX_SAFE_INTEGER;
```

and:

```js
const maximumLength = hasUnknownFileSize(this.tokenizer)
	? maximumZipEntrySizeInBytes
	: Number.MAX_SAFE_INTEGER;
```

Together, these checks allowed a crafted ZIP to bypass the intended
inflate limit for known-size APIs and force large decompression during
detection of entries such as `[Content_Types].xml`.
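
The general defence is to enforce the same output cap regardless of whether the input size is known. Node's own zlib convenience methods expose this principle directly via `maxOutputLength` (a sketch of the idea, not file-type's actual patch):

```typescript
import { deflateSync, inflateSync } from 'node:zlib';

// A few KB of compressed zeros would inflate to 8 MiB; `maxOutputLength`
// makes inflateSync throw instead of allocating the full payload.
const bomb = deflateSync(Buffer.alloc(8 * 1024 * 1024));

let capped = false;
try {
  inflateSync(bomb, { maxOutputLength: 1024 * 1024 }); // 1 MiB cap
} catch {
  capped = true; // decompression aborted once the cap was exceeded
}
```

The 21.3.2 fix follows the same direction, presumably applying the `maximumZipEntrySizeInBytes` cap to known-size inputs as well.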

## Proof of Concept

```js
import {fileTypeFromBuffer} from 'file-type';
import archiver from 'archiver';
import {Writable} from 'node:stream';

async function createZipBomb(sizeInMegabytes) {
	return new Promise((resolve, reject) => {
		const chunks = [];
		const writable = new Writable({
			write(chunk, encoding, callback) {
				chunks.push(chunk);
				callback();
			},
		});

		const archive = archiver('zip', {zlib: {level: 9}});
		archive.pipe(writable);
		writable.on('finish', () => {
			resolve(Buffer.concat(chunks));
		});
		archive.on('error', reject);

		const xmlPrefix = '<?xml version="1.0"?><Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">';
		const padding = Buffer.alloc(sizeInMegabytes * 1024 * 1024 - xmlPrefix.length, 0x20);
		archive.append(Buffer.concat([Buffer.from(xmlPrefix), padding]), {name: '[Content_Types].xml'});
		archive.finalize();
	});
}

const zip = await createZipBomb(256);
console.log('ZIP size (KB):', (zip.length / 1024).toFixed(0));

const before = process.memoryUsage().rss;
await fileTypeFromBuffer(zip);
const after = process.memoryUsage().rss;

console.log('RSS growth (MB):', ((after - before) / 1024 / 1024).toFixed(0));
```

Observed on `file-type` `21.3.1`:
- ZIP size: about `255 KB`
- RSS growth during detection: about `257 MB`

## Affected APIs

Affected:
- `fileTypeFromBuffer()`
- `fileTypeFromBlob()`
- `fileTypeFromFile()`

Not affected:
- `fileTypeFromStream()`, which already enforced the ZIP inflate limit
for unknown-size inputs

## Impact

Applications that inspect untrusted uploads with `fileTypeFromBuffer()`,
`fileTypeFromBlob()`, or `fileTypeFromFile()` can be forced to consume
excessive memory during ZIP-based type detection. This can degrade
service or lead to process termination in memory-constrained
environments.

## Cause

The issue was introduced in 399b0f1

---

### Release Notes

<details>
<summary>sindresorhus/file-type (file-type)</summary>

###
[`v21.3.2`](https://redirect.github.com/sindresorhus/file-type/releases/tag/v21.3.2)

[Compare
Source](https://redirect.github.com/sindresorhus/file-type/compare/v21.3.1...v21.3.2)

- Fix ZIP bomb in known-size ZIP probing (GHSA-j47w-4g3g-c36v)
[`a155cd7`](https://redirect.github.com/sindresorhus/file-type/commit/a155cd7)
- Fix bound recursive BOM and ID3 detection
[`370ed91`](https://redirect.github.com/sindresorhus/file-type/commit/370ed91)

***

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0My42Ni40IiwidXBkYXRlZEluVmVyIjoiNDMuNjYuNCIsInRhcmdldEJyYW5jaCI6ImNhbmFyeSIsImxhYmVscyI6WyJkZXBlbmRlbmNpZXMiXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-14 23:44:06 +08:00
renovate[bot]
9456a07889 chore: migrate Renovate config (#14656)
The Renovate config in this repository needs migrating. Typically this
is because one or more configuration options you are using have been
renamed.

You don't need to merge this PR right away, because Renovate will
continue to migrate these fields internally each time it runs. But later
some of these fields may be fully deprecated and the migrations removed.
So it's a good idea to merge this migration PR soon.





🔕 **Ignore**: Close this PR and you won't be reminded about config
migration again, but one day your current config may no longer be valid.

 Got questions? Does something look wrong to you? Please don't hesitate
to [request help
here](https://redirect.github.com/renovatebot/renovate/discussions).


---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-14 23:43:39 +08:00
225 changed files with 13973 additions and 3914 deletions


@@ -63,7 +63,7 @@
       "groupName": "opentelemetry",
       "matchPackageNames": [
         "/^@opentelemetry/",
-        "/^@google-cloud\/opentelemetry-/"
+        "/^@google-cloud/opentelemetry-/"
       ]
     }
   ],
@@ -79,7 +79,7 @@
   "customManagers": [
     {
       "customType": "regex",
-      "fileMatch": ["^rust-toolchain\\.toml?$"],
+      "managerFilePatterns": ["/^rust-toolchain\\.toml?$/"],
       "matchStrings": [
         "channel\\s*=\\s*\"(?<currentValue>\\d+\\.\\d+(\\.\\d+)?)\""
       ],


@@ -269,10 +269,13 @@ jobs:
       - name: Run playground build
         run: yarn workspace @blocksuite/playground build
-      - name: Run playwright tests
-        run: |
-          yarn workspace @blocksuite/integration-test test:unit
-          yarn workspace @affine-test/blocksuite test "cross-platform/" --forbid-only
+      - name: Run integration browser tests
+        timeout-minutes: 10
+        run: yarn workspace @blocksuite/integration-test test:unit
+      - name: Run cross-platform playwright tests
+        timeout-minutes: 10
+        run: yarn workspace @affine-test/blocksuite test "cross-platform/" --forbid-only
       - name: Upload test results
         if: always()

File diff suppressed because one or more lines are too long

.yarn/releases/yarn-4.13.0.cjs (vendored executable file, 940 lines)

File diff suppressed because one or more lines are too long


@@ -12,4 +12,4 @@ npmPublishAccess: public
 npmRegistryServer: "https://registry.npmjs.org"
-yarnPath: .yarn/releases/yarn-4.12.0.cjs
+yarnPath: .yarn/releases/yarn-4.13.0.cjs

Cargo.lock (generated, 2738 lines)

File diff suppressed because it is too large


@@ -36,7 +36,7 @@ resolver = "3"
 criterion2 = { version = "3", default-features = false }
 crossbeam-channel = "0.5"
 dispatch2 = "0.3"
-docx-parser = { git = "https://github.com/toeverything/docx-parser" }
+docx-parser = { git = "https://github.com/toeverything/docx-parser", rev = "380beea" }
 dotenvy = "0.15"
 file-format = { version = "0.28", features = ["reader"] }
 homedir = "0.3"
@@ -59,6 +59,7 @@ resolver = "3"
 lru = "0.16"
 matroska = "0.30"
 memory-indexer = "0.3.0"
+mermaid-rs-renderer = { git = "https://github.com/toeverything/mermaid-rs-renderer", rev = "fba9097", default-features = false }
 mimalloc = "0.1"
 mp4parse = "0.17"
 nanoid = "0.4"
@@ -75,6 +76,7 @@ resolver = "3"
 notify = { version = "8", features = ["serde"] }
 objc2 = "0.6"
 objc2-foundation = "0.3"
+ogg = "0.9"
 once_cell = "1"
 ordered-float = "5"
 parking_lot = "0.12"
@@ -122,6 +124,14 @@ resolver = "3"
 tree-sitter-rust = { version = "0.24" }
 tree-sitter-scala = { version = "0.24" }
 tree-sitter-typescript = { version = "0.23" }
+typst = "0.14.2"
+typst-as-lib = { version = "0.15.4", default-features = false, features = [
+  "packages",
+  "typst-kit-embed-fonts",
+  "typst-kit-fonts",
+  "ureq",
+] }
+typst-svg = "0.14.2"
 uniffi = "0.29"
 url = { version = "2.5" }
 uuid = "1.8"


@@ -0,0 +1,94 @@
// Vitest Snapshot v1, https://vitest.dev/guide/snapshot.html

exports[`snapshot to markdown > imports obsidian vault fixtures 1`] = `
{
  "entry": {
    "children": [
      {
        "children": [
          {
            "children": [
              {
                "delta": [
                  {
                    "insert": "Panel
Body line",
                  },
                ],
                "flavour": "affine:paragraph",
                "type": "text",
              },
            ],
            "emoji": "💡",
            "flavour": "affine:callout",
          },
          {
            "flavour": "affine:attachment",
            "name": "archive.zip",
            "style": "horizontalThin",
          },
          {
            "delta": [
              {
                "footnote": {
                  "label": "1",
                  "reference": {
                    "title": "reference body",
                    "type": "url",
                  },
                },
                "insert": " ",
              },
            ],
            "flavour": "affine:paragraph",
            "type": "text",
          },
          {
            "flavour": "affine:divider",
          },
          {
            "delta": [
              {
                "insert": "after note",
              },
            ],
            "flavour": "affine:paragraph",
            "type": "text",
          },
          {
            "delta": [
              {
                "insert": " ",
                "reference": {
                  "page": "linked",
                  "type": "LinkedPage",
                },
              },
            ],
            "flavour": "affine:paragraph",
            "type": "text",
          },
          {
            "delta": [
              {
                "insert": "Sources",
              },
            ],
            "flavour": "affine:paragraph",
            "type": "h6",
          },
          {
            "flavour": "affine:bookmark",
          },
        ],
        "flavour": "affine:note",
      },
    ],
    "flavour": "affine:page",
  },
  "titles": [
    "entry",
    "linked",
  ],
}
`;


@@ -0,0 +1,14 @@
> [!custom] Panel
> Body line
![[archive.zip]]
[^1]
---
after note
[[linked]]
[^1]: reference body


@@ -0,0 +1 @@
plain linked page


@@ -1,4 +1,10 @@
-import { MarkdownTransformer } from '@blocksuite/affine/widgets/linked-doc';
+import { readFileSync } from 'node:fs';
+import { basename, resolve } from 'node:path';
+import {
+  MarkdownTransformer,
+  ObsidianTransformer,
+} from '@blocksuite/affine/widgets/linked-doc';
 import {
   DefaultTheme,
   NoteDisplayMode,
@@ -8,13 +14,18 @@ import {
   CalloutAdmonitionType,
   CalloutExportStyle,
   calloutMarkdownExportMiddleware,
+  docLinkBaseURLMiddleware,
   embedSyncedDocMiddleware,
   MarkdownAdapter,
+  titleMiddleware,
 } from '@blocksuite/affine-shared/adapters';
+import type { AffineTextAttributes } from '@blocksuite/affine-shared/types';
 import type {
   BlockSnapshot,
+  DeltaInsert,
   DocSnapshot,
   SliceSnapshot,
+  Store,
   TransformerMiddleware,
 } from '@blocksuite/store';
 import { AssetsManager, MemoryBlobCRUD, Schema } from '@blocksuite/store';
@@ -29,6 +40,138 @@ import { testStoreExtensions } from '../utils/store.js';

 const provider = getProvider();

+function withRelativePath(file: File, relativePath: string): File {
+  Object.defineProperty(file, 'webkitRelativePath', {
+    value: relativePath,
+    writable: false,
+  });
+  return file;
+}
+
+function markdownFixture(relativePath: string): File {
+  return withRelativePath(
+    new File(
+      [
+        readFileSync(
+          resolve(import.meta.dirname, 'fixtures/obsidian', relativePath),
+          'utf8'
+        ),
+      ],
+      basename(relativePath),
+      { type: 'text/markdown' }
+    ),
+    `vault/${relativePath}`
+  );
+}
+
+function exportSnapshot(doc: Store): DocSnapshot {
+  const job = doc.getTransformer([
+    docLinkBaseURLMiddleware(doc.workspace.id),
+    titleMiddleware(doc.workspace.meta.docMetas),
+  ]);
+  const snapshot = job.docToSnapshot(doc);
+  expect(snapshot).toBeTruthy();
+  return snapshot!;
+}
+
+function normalizeDeltaForSnapshot(
+  delta: DeltaInsert<AffineTextAttributes>[],
+  titleById: ReadonlyMap<string, string>
+) {
+  return delta.map(item => {
+    const normalized: Record<string, unknown> = {
+      insert: item.insert,
+    };
+    if (item.attributes?.link) {
+      normalized.link = item.attributes.link;
+    }
+    if (item.attributes?.reference?.type === 'LinkedPage') {
+      normalized.reference = {
+        type: 'LinkedPage',
+        page: titleById.get(item.attributes.reference.pageId) ?? '<missing>',
+        ...(item.attributes.reference.title
+          ? { title: item.attributes.reference.title }
+          : {}),
+      };
+    }
+    if (item.attributes?.footnote) {
+      const reference = item.attributes.footnote.reference;
+      normalized.footnote = {
+        label: item.attributes.footnote.label,
+        reference:
+          reference.type === 'doc'
+            ? {
+                type: 'doc',
+                page: reference.docId
+                  ? (titleById.get(reference.docId) ?? '<missing>')
+                  : '<missing>',
+              }
+            : {
+                type: reference.type,
+                ...(reference.title ? { title: reference.title } : {}),
+                ...(reference.fileName ? { fileName: reference.fileName } : {}),
+              },
+      };
+    }
+    return normalized;
+  });
+}
+
+function simplifyBlockForSnapshot(
+  block: BlockSnapshot,
+  titleById: ReadonlyMap<string, string>
+): Record<string, unknown> {
+  const simplified: Record<string, unknown> = {
+    flavour: block.flavour,
+  };
+  if (block.flavour === 'affine:paragraph' || block.flavour === 'affine:list') {
+    simplified.type = block.props.type;
+    const text = block.props.text as
+      | { delta?: DeltaInsert<AffineTextAttributes>[] }
+      | undefined;
+    simplified.delta = normalizeDeltaForSnapshot(text?.delta ?? [], titleById);
+  }
+  if (block.flavour === 'affine:callout') {
+    simplified.emoji = block.props.emoji;
+  }
+  if (block.flavour === 'affine:attachment') {
+    simplified.name = block.props.name;
+    simplified.style = block.props.style;
+  }
+  if (block.flavour === 'affine:image') {
+    simplified.sourceId = '<asset>';
+  }
+  const children = (block.children ?? [])
+    .filter(child => child.flavour !== 'affine:surface')
+    .map(child => simplifyBlockForSnapshot(child, titleById));
+  if (children.length) {
+    simplified.children = children;
+  }
+  return simplified;
+}
+
+function snapshotDocByTitle(
+  collection: TestWorkspace,
+  title: string,
+  titleById: ReadonlyMap<string, string>
+) {
+  const meta = collection.meta.docMetas.find(meta => meta.title === title);
+  expect(meta).toBeTruthy();
+  const doc = collection.getDoc(meta!.id)?.getStore({ id: meta!.id });
+  expect(doc).toBeTruthy();
+  return simplifyBlockForSnapshot(exportSnapshot(doc!).blocks, titleById);
+}
+
 describe('snapshot to markdown', () => {
   test('code', async () => {
     const blockSnapshot: BlockSnapshot = {
@@ -127,6 +270,46 @@ Hello world
     expect(meta?.tags).toEqual(['a', 'b']);
   });

+  test('imports obsidian vault fixtures', async () => {
+    const schema = new Schema().register(AffineSchemas);
+    const collection = new TestWorkspace();
+    collection.storeExtensions = testStoreExtensions;
+    collection.meta.initialize();
+
+    const attachment = withRelativePath(
+      new File([new Uint8Array([80, 75, 3, 4])], 'archive.zip', {
+        type: 'application/zip',
+      }),
+      'vault/archive.zip'
+    );
+
+    const { docIds } = await ObsidianTransformer.importObsidianVault({
+      collection,
+      schema,
+      importedFiles: [
+        markdownFixture('entry.md'),
+        markdownFixture('linked.md'),
+        attachment,
+      ],
+      extensions: testStoreExtensions,
+    });
+
+    expect(docIds).toHaveLength(2);
+
+    const titleById = new Map(
+      collection.meta.docMetas.map(meta => [
+        meta.id,
+        meta.title ?? '<untitled>',
+      ])
+    );
+
+    expect({
+      titles: collection.meta.docMetas
+        .map(meta => meta.title)
+        .sort((a, b) => (a ?? '').localeCompare(b ?? '')),
+      entry: snapshotDocByTitle(collection, 'entry', titleById),
+    }).toMatchSnapshot();
+  });
+
   test('paragraph', async () => {
     const blockSnapshot: BlockSnapshot = {
       type: 'block',


@@ -5,6 +5,7 @@ import {
 import {
   BlockMarkdownAdapterExtension,
   type BlockMarkdownAdapterMatcher,
+  createAttachmentBlockSnapshot,
   FOOTNOTE_DEFINITION_PREFIX,
   getFootnoteDefinitionText,
   isFootnoteDefinitionNode,
@@ -56,18 +57,15 @@
       }
       walkerContext
         .openNode(
-          {
-            type: 'block',
+          createAttachmentBlockSnapshot({
             id: nanoid(),
-            flavour: AttachmentBlockSchema.model.flavour,
             props: {
               name: fileName,
               sourceId: blobId,
               footnoteIdentifier,
               style: 'citation',
             },
-            children: [],
-          },
+          }),
           'children'
         )
         .closeNode();


@@ -32,6 +32,7 @@
   },
   "devDependencies": {
     "@vitest/browser-playwright": "^4.0.18",
+    "playwright": "=1.58.2",
     "vitest": "^4.0.18"
   },
   "exports": {


@@ -516,6 +516,9 @@ export const EdgelessNoteInteraction =
           }
         })
         .catch(console.error);
+    } else if (multiSelect && alreadySelected && editing) {
+      // range selection using Shift-click when editing
+      return;
     } else {
       context.default(context);
     }


@@ -83,9 +83,9 @@ export class RecordField extends SignalWatcher(
       border: 1px solid transparent;
     }
-    .field-content .affine-database-number {
+    .field-content affine-database-number-cell .number {
       text-align: left;
-      justify-content: start;
+      justify-content: flex-start;
     }
     .field-content:hover {


@@ -35,6 +35,7 @@
   },
   "devDependencies": {
     "@vitest/browser-playwright": "^4.0.18",
+    "playwright": "=1.58.2",
     "vitest": "^4.0.18"
   },
   "exports": {


@@ -1,4 +1,7 @@
-import { AttachmentBlockSchema } from '@blocksuite/affine-model';
+import {
+  type AttachmentBlockProps,
+  AttachmentBlockSchema,
+} from '@blocksuite/affine-model';
 import { BlockSuiteError, ErrorCode } from '@blocksuite/global/exceptions';
 import {
   type AssetsManager,
@@ -23,6 +26,24 @@ import { AdapterFactoryIdentifier } from './types/adapter';

 export type Attachment = File[];

+type CreateAttachmentBlockSnapshotOptions = {
+  id?: string;
+  props: Partial<AttachmentBlockProps> & Pick<AttachmentBlockProps, 'name'>;
+};
+
+export function createAttachmentBlockSnapshot({
+  id = nanoid(),
+  props,
+}: CreateAttachmentBlockSnapshotOptions): BlockSnapshot {
+  return {
+    type: 'block',
+    id,
+    flavour: AttachmentBlockSchema.model.flavour,
+    props,
+    children: [],
+  };
+}
+
 type AttachmentToSliceSnapshotPayload = {
   file: Attachment;
   assets?: AssetsManager;
@@ -97,8 +118,6 @@ export class AttachmentAdapter extends BaseAdapter<Attachment> {
     if (files.length === 0) return null;

     const content: SliceSnapshot['content'] = [];
-    const flavour = AttachmentBlockSchema.model.flavour;
-
     for (const blob of files) {
       const id = nanoid();
       const { name, size, type } = blob;
@@ -108,22 +127,21 @@ export class AttachmentAdapter extends BaseAdapter<Attachment> {
         mapInto: sourceId => ({ sourceId }),
       });

-      content.push({
-        type: 'block',
-        flavour,
-        id,
-        props: {
-          name,
-          size,
-          type,
-          embed: false,
-          style: 'horizontalThin',
-          index: 'a0',
-          xywh: '[0,0,0,0]',
-          rotate: 0,
-        },
-        children: [],
-      });
+      content.push(
+        createAttachmentBlockSnapshot({
+          id,
+          props: {
+            name,
+            size,
+            type,
+            embed: false,
+            style: 'horizontalThin',
+            index: 'a0',
+            xywh: '[0,0,0,0]',
+            rotate: 0,
+          },
+        })
+      );
     }

     return {


@@ -1,3 +1,20 @@
+function safeDecodePathReference(path: string): string {
+  try {
+    return decodeURIComponent(path);
+  } catch {
+    return path;
+  }
+}
+
+export function normalizeFilePathReference(path: string): string {
+  return safeDecodePathReference(path)
+    .trim()
+    .replace(/\\/g, '/')
+    .replace(/^\.\/+/, '')
+    .replace(/^\/+/, '')
+    .replace(/\/+/g, '/');
+}
+
 /**
  * Normalizes a relative path by resolving all relative path segments
  * @param basePath The base path (markdown file's directory)
@@ -40,7 +57,7 @@ export function getImageFullPath(
   imageReference: string
 ): string {
   // Decode the image reference in case it contains URL-encoded characters
-  const decodedReference = decodeURIComponent(imageReference);
+  const decodedReference = safeDecodePathReference(imageReference);

   // Get the directory of the file path
   const markdownDir = filePath.substring(0, filePath.lastIndexOf('/'));


@@ -20,9 +20,30 @@ declare global {
showOpenFilePicker?: ( showOpenFilePicker?: (
options?: OpenFilePickerOptions options?: OpenFilePickerOptions
) => Promise<FileSystemFileHandle[]>; ) => Promise<FileSystemFileHandle[]>;
// Window API: showDirectoryPicker
showDirectoryPicker?: (options?: {
id?: string;
mode?: 'read' | 'readwrite';
startIn?: FileSystemHandle | string;
}) => Promise<FileSystemDirectoryHandle>;
}
}
// Minimal polyfill for FileSystemDirectoryHandle to iterate over files
interface FileSystemDirectoryHandle {
kind: 'directory';
name: string;
values(): AsyncIterableIterator<
FileSystemFileHandle | FileSystemDirectoryHandle
>;
}
interface FileSystemFileHandle {
kind: 'file';
name: string;
getFile(): Promise<File>;
}
// See [Common MIME types](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Common_types)
const FileTypes: NonNullable<OpenFilePickerOptions['types']> = [
{
@@ -121,21 +142,27 @@ type AcceptTypes =
| 'Docx'
| 'MindMap';
-export async function openFilesWith(
-acceptType: AcceptTypes = 'Any',
-multiple: boolean = true
-): Promise<File[] | null> {
-// Feature detection. The API needs to be supported
-// and the app not run in an iframe.
-const supportsFileSystemAccess =
-'showOpenFilePicker' in window &&
function canUseFileSystemAccessAPI(
api: 'showOpenFilePicker' | 'showDirectoryPicker'
) {
return (
api in window &&
(() => {
try {
return window.self === window.top;
} catch {
return false;
}
})()
-})();
);
}
export async function openFilesWith(
acceptType: AcceptTypes = 'Any',
multiple: boolean = true
): Promise<File[] | null> {
const supportsFileSystemAccess =
canUseFileSystemAccessAPI('showOpenFilePicker');
// If the File System Access API is supported…
if (supportsFileSystemAccess && window.showOpenFilePicker) {
@@ -194,6 +221,75 @@ export async function openFilesWith(
});
}
export async function openDirectory(): Promise<File[] | null> {
const supportsFileSystemAccess = canUseFileSystemAccessAPI(
'showDirectoryPicker'
);
if (supportsFileSystemAccess && window.showDirectoryPicker) {
try {
const dirHandle = await window.showDirectoryPicker();
const files: File[] = [];
const readDirectory = async (
directoryHandle: FileSystemDirectoryHandle,
path: string
) => {
for await (const handle of directoryHandle.values()) {
const relativePath = path ? `${path}/${handle.name}` : handle.name;
if (handle.kind === 'file') {
const fileHandle = handle as FileSystemFileHandle;
if (fileHandle.getFile) {
const file = await fileHandle.getFile();
Object.defineProperty(file, 'webkitRelativePath', {
value: relativePath,
writable: false,
});
files.push(file);
}
} else if (handle.kind === 'directory') {
await readDirectory(
handle as FileSystemDirectoryHandle,
relativePath
);
}
}
};
await readDirectory(dirHandle, '');
return files;
} catch (err) {
console.error(err);
return null;
}
}
return new Promise(resolve => {
const input = document.createElement('input');
input.classList.add('affine-upload-input');
input.style.display = 'none';
input.type = 'file';
input.setAttribute('webkitdirectory', '');
input.setAttribute('directory', '');
document.body.append(input);
input.addEventListener('change', () => {
input.remove();
resolve(input.files ? Array.from(input.files) : null);
});
input.addEventListener('cancel', () => resolve(null));
if ('showPicker' in HTMLInputElement.prototype) {
input.showPicker();
} else {
input.click();
}
});
}
export async function openSingleFileWith(
acceptType?: AcceptTypes
): Promise<File | null> {


@@ -7,6 +7,7 @@ import {
NotionIcon,
} from '@blocksuite/affine-components/icons';
import {
openDirectory,
openFilesWith,
openSingleFileWith,
} from '@blocksuite/affine-shared/utils';
@@ -18,11 +19,16 @@ import { query, state } from 'lit/decorators.js';
import { HtmlTransformer } from '../transformers/html.js';
import { MarkdownTransformer } from '../transformers/markdown.js';
import { NotionHtmlTransformer } from '../transformers/notion-html.js';
import { ObsidianTransformer } from '../transformers/obsidian.js';
import { styles } from './styles.js';
export type OnSuccessHandler = (
pageIds: string[],
-options: { isWorkspaceFile: boolean; importedCount: number }
options: {
isWorkspaceFile: boolean;
importedCount: number;
docEmojis?: Map<string, string>;
}
) => void;
export type OnFailHandler = (message: string) => void;
@@ -140,6 +146,29 @@ export class ImportDoc extends WithDisposable(LitElement) {
});
}
private async _importObsidian() {
const files = await openDirectory();
if (!files || files.length === 0) return;
const needLoading =
files.reduce((acc, f) => acc + f.size, 0) > SHOW_LOADING_SIZE;
if (needLoading) {
this.hidden = false;
this._loading = true;
} else {
this.abortController.abort();
}
const { docIds, docEmojis } = await ObsidianTransformer.importObsidianVault(
{
collection: this.collection,
schema: this.schema,
importedFiles: files,
extensions: this.extensions,
}
);
needLoading && this.abortController.abort();
this._onImportSuccess(docIds, { docEmojis });
}
private _onCloseClick(event: MouseEvent) {
event.stopPropagation();
this.abortController.abort();
@@ -151,15 +180,21 @@ export class ImportDoc extends WithDisposable(LitElement) {
private _onImportSuccess(
pageIds: string[],
-options: { isWorkspaceFile?: boolean; importedCount?: number } = {}
options: {
isWorkspaceFile?: boolean;
importedCount?: number;
docEmojis?: Map<string, string>;
} = {}
) {
const {
isWorkspaceFile = false,
importedCount: pagesImportedCount = pageIds.length,
docEmojis,
} = options;
this.onSuccess?.(pageIds, {
isWorkspaceFile,
importedCount: pagesImportedCount,
docEmojis,
});
}
@@ -258,6 +293,13 @@ export class ImportDoc extends WithDisposable(LitElement) {
</affine-tooltip>
</div>
</icon-button>
<icon-button
class="button-item"
text="Obsidian"
@click="${this._importObsidian}"
>
${ExportToMarkdownIcon}
</icon-button>
<icon-button class="button-item" text="Coming soon..." disabled>
${NewIcon}
</icon-button>


@@ -2,6 +2,7 @@ export { DocxTransformer } from './docx.js';
export { HtmlTransformer } from './html.js';
export { MarkdownTransformer } from './markdown.js';
export { NotionHtmlTransformer } from './notion-html.js';
export { ObsidianTransformer } from './obsidian.js';
export { PdfTransformer } from './pdf.js';
export { createAssetsArchive, download } from './utils.js';
export { ZipTransformer } from './zip.js';


@@ -21,8 +21,11 @@ import { extMimeMap, Transformer } from '@blocksuite/store';
import type { AssetMap, ImportedFileEntry, PathBlobIdMap } from './type.js';
import { createAssetsArchive, download, parseMatter, Unzip } from './utils.js';
-type ParsedFrontmatterMeta = Partial<
-Pick<DocMeta, 'title' | 'createDate' | 'updatedDate' | 'tags' | 'favorite'>
export type ParsedFrontmatterMeta = Partial<
Pick<
DocMeta,
'title' | 'createDate' | 'updatedDate' | 'tags' | 'favorite' | 'trash'
>
>;
const FRONTMATTER_KEYS = {
@@ -150,11 +153,18 @@ function buildMetaFromFrontmatter(
}
continue;
}
if (FRONTMATTER_KEYS.trash.includes(key)) {
const trash = parseBoolean(value);
if (trash !== undefined) {
meta.trash = trash;
}
continue;
}
}
return meta;
}
-function parseFrontmatter(markdown: string): {
export function parseFrontmatter(markdown: string): {
content: string;
meta: ParsedFrontmatterMeta;
} {
@@ -176,7 +186,7 @@ function parseFrontmatter(markdown: string): {
}
}
-function applyMetaPatch(
export function applyMetaPatch(
collection: Workspace,
docId: string,
meta: ParsedFrontmatterMeta
@@ -187,13 +197,14 @@ function applyMetaPatch(
if (meta.updatedDate !== undefined) metaPatch.updatedDate = meta.updatedDate;
if (meta.tags) metaPatch.tags = meta.tags;
if (meta.favorite !== undefined) metaPatch.favorite = meta.favorite;
if (meta.trash !== undefined) metaPatch.trash = meta.trash;
if (Object.keys(metaPatch).length) {
collection.meta.setDocMeta(docId, metaPatch);
}
}
-function getProvider(extensions: ExtensionType[]) {
export function getProvider(extensions: ExtensionType[]) {
const container = new Container();
extensions.forEach(ext => {
ext.setup(container);
@@ -223,6 +234,103 @@ type ImportMarkdownZipOptions = {
extensions: ExtensionType[];
};
/**
* Filters hidden/system entries that should never participate in imports.
*/
export function isSystemImportPath(path: string) {
return path.includes('__MACOSX') || path.includes('.DS_Store');
}
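As the doc comment notes, this is a plain substring test, so these macOS archive artifacts are filtered at any depth. An illustrative copy (mirroring the diff, not the shipped module):

```typescript
// Illustrative copy: filters hidden/system entries out of imports.
function isSystemImportPath(path: string) {
  return path.includes('__MACOSX') || path.includes('.DS_Store');
}
```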
/**
* Creates the doc CRUD bridge used by importer transformers.
*/
export function createCollectionDocCRUD(collection: Workspace) {
return {
create: (id: string) => collection.createDoc(id).getStore({ id }),
get: (id: string) => collection.getDoc(id)?.getStore({ id }) ?? null,
delete: (id: string) => collection.removeDoc(id),
};
}
type CreateMarkdownImportJobOptions = {
collection: Workspace;
schema: Schema;
preferredTitle?: string;
fullPath?: string;
};
/**
* Creates a markdown import job with the standard collection middlewares.
*/
export function createMarkdownImportJob({
collection,
schema,
preferredTitle,
fullPath,
}: CreateMarkdownImportJobOptions) {
return new Transformer({
schema,
blobCRUD: collection.blobSync,
docCRUD: createCollectionDocCRUD(collection),
middlewares: [
defaultImageProxyMiddleware,
fileNameMiddleware(preferredTitle),
docLinkBaseURLMiddleware(collection.id),
...(fullPath ? [filePathMiddleware(fullPath)] : []),
],
});
}
type StageImportedAssetOptions = {
pendingAssets: AssetMap;
pendingPathBlobIdMap: PathBlobIdMap;
path: string;
content: Blob;
fileName: string;
};
/**
* Hashes a non-markdown import file and stages it into the shared asset maps.
*/
export async function stageImportedAsset({
pendingAssets,
pendingPathBlobIdMap,
path,
content,
fileName,
}: StageImportedAssetOptions) {
const ext = path.split('.').at(-1) ?? '';
const mime = extMimeMap.get(ext.toLowerCase()) ?? '';
const key = await sha(await content.arrayBuffer());
pendingPathBlobIdMap.set(path, key);
pendingAssets.set(key, new File([content], fileName, { type: mime }));
}
/**
* Binds previously staged asset files into a transformer job before import.
*/
export function bindImportedAssetsToJob(
job: Transformer,
pendingAssets: AssetMap,
pendingPathBlobIdMap: PathBlobIdMap
) {
const pathBlobIdMap = job.assetsManager.getPathBlobIdMap();
// Iterate over all assets to be imported
for (const [assetPath, key] of pendingPathBlobIdMap.entries()) {
// Get the relative path of the asset to the markdown file
// Store the path to blobId map
pathBlobIdMap.set(assetPath, key);
// Store the asset to assets, the key is the blobId, the value is the file object
// In block adapter, it will use the blobId to get the file object
const assetFile = pendingAssets.get(key);
if (assetFile) {
job.assets.set(key, assetFile);
}
}
return pathBlobIdMap;
}
/**
* Exports a doc to a Markdown file or a zip archive containing Markdown and assets.
* @param doc The doc to export
@@ -329,19 +437,10 @@ async function importMarkdownToDoc({
const { content, meta } = parseFrontmatter(markdown);
const preferredTitle = meta.title ?? fileName;
const provider = getProvider(extensions);
-const job = new Transformer({
-schema,
-blobCRUD: collection.blobSync,
-docCRUD: {
-create: (id: string) => collection.createDoc(id).getStore({ id }),
-get: (id: string) => collection.getDoc(id)?.getStore({ id }) ?? null,
-delete: (id: string) => collection.removeDoc(id),
-},
-middlewares: [
-defaultImageProxyMiddleware,
-fileNameMiddleware(preferredTitle),
-docLinkBaseURLMiddleware(collection.id),
-],
const job = createMarkdownImportJob({
collection,
schema,
preferredTitle,
});
const mdAdapter = new MarkdownAdapter(job, provider);
const page = await mdAdapter.toDoc({
@@ -381,7 +480,7 @@ async function importMarkdownZip({
// Iterate over all files in the zip
for (const { path, content: blob } of unzip) {
// Skip the files that are not markdown files
-if (path.includes('__MACOSX') || path.includes('.DS_Store')) {
if (isSystemImportPath(path)) {
continue;
}
@@ -395,12 +494,13 @@ async function importMarkdownZip({
fullPath: path,
});
} else {
-// If the file is not a markdown file, store it to pendingAssets
-const ext = path.split('.').at(-1) ?? '';
-const mime = extMimeMap.get(ext) ?? '';
-const key = await sha(await blob.arrayBuffer());
-pendingPathBlobIdMap.set(path, key);
-pendingAssets.set(key, new File([blob], fileName, { type: mime }));
await stageImportedAsset({
pendingAssets,
pendingPathBlobIdMap,
path,
content: blob,
fileName,
});
}
}
@@ -411,34 +511,13 @@ async function importMarkdownZip({
const markdown = await contentBlob.text();
const { content, meta } = parseFrontmatter(markdown);
const preferredTitle = meta.title ?? fileNameWithoutExt;
-const job = new Transformer({
-schema,
-blobCRUD: collection.blobSync,
-docCRUD: {
-create: (id: string) => collection.createDoc(id).getStore({ id }),
-get: (id: string) => collection.getDoc(id)?.getStore({ id }) ?? null,
-delete: (id: string) => collection.removeDoc(id),
-},
-middlewares: [
-defaultImageProxyMiddleware,
-fileNameMiddleware(preferredTitle),
-docLinkBaseURLMiddleware(collection.id),
-filePathMiddleware(fullPath),
-],
const job = createMarkdownImportJob({
collection,
schema,
preferredTitle,
fullPath,
});
-const assets = job.assets;
-const pathBlobIdMap = job.assetsManager.getPathBlobIdMap();
-// Iterate over all assets to be imported
-for (const [assetPath, key] of pendingPathBlobIdMap.entries()) {
-// Get the relative path of the asset to the markdown file
-// Store the path to blobId map
-pathBlobIdMap.set(assetPath, key);
-// Store the asset to assets, the key is the blobId, the value is the file object
-// In block adapter, it will use the blobId to get the file object
-if (pendingAssets.get(key)) {
-assets.set(key, pendingAssets.get(key)!);
-}
-}
bindImportedAssetsToJob(job, pendingAssets, pendingPathBlobIdMap);
const mdAdapter = new MarkdownAdapter(job, provider);
const doc = await mdAdapter.toDoc({


@@ -0,0 +1,834 @@
import { FootNoteReferenceParamsSchema } from '@blocksuite/affine-model';
import {
BlockMarkdownAdapterExtension,
createAttachmentBlockSnapshot,
FULL_FILE_PATH_KEY,
getImageFullPath,
MarkdownAdapter,
type MarkdownAST,
MarkdownASTToDeltaExtension,
normalizeFilePathReference,
} from '@blocksuite/affine-shared/adapters';
import type { AffineTextAttributes } from '@blocksuite/affine-shared/types';
import type {
DeltaInsert,
ExtensionType,
Schema,
Workspace,
} from '@blocksuite/store';
import { extMimeMap, nanoid } from '@blocksuite/store';
import type { Html, Text } from 'mdast';
import {
applyMetaPatch,
bindImportedAssetsToJob,
createMarkdownImportJob,
getProvider,
isSystemImportPath,
parseFrontmatter,
stageImportedAsset,
} from './markdown.js';
import type {
AssetMap,
MarkdownFileImportEntry,
PathBlobIdMap,
} from './type.js';
const CALLOUT_TYPE_MAP: Record<string, string> = {
note: '💡',
info: '',
tip: '🔥',
hint: '✅',
important: '‼️',
warning: '⚠️',
caution: '⚠️',
attention: '⚠️',
danger: '⚠️',
error: '🚨',
bug: '🐛',
example: '📌',
quote: '💬',
cite: '💬',
abstract: '📋',
summary: '📋',
todo: '☑️',
success: '✅',
check: '✅',
done: '✅',
failure: '❌',
fail: '❌',
missing: '❌',
question: '❓',
help: '❓',
faq: '❓',
};
const AMBIGUOUS_PAGE_LOOKUP = '__ambiguous__';
const DEFAULT_CALLOUT_EMOJI = '💡';
const OBSIDIAN_TEXT_FOOTNOTE_URL_PREFIX = 'data:text/plain;charset=utf-8,';
const OBSIDIAN_ATTACHMENT_EMBED_TAG = 'obsidian-attachment';
function normalizeLookupKey(value: string): string {
return normalizeFilePathReference(value).toLowerCase();
}
function stripMarkdownExtension(value: string): string {
return value.replace(/\.md$/i, '');
}
function basename(value: string): string {
return normalizeFilePathReference(value).split('/').pop() ?? value;
}
function parseObsidianTarget(rawTarget: string): {
path: string;
fragment: string | null;
} {
const normalizedTarget = normalizeFilePathReference(rawTarget);
const match = normalizedTarget.match(/^([^#^]+)([#^].*)?$/);
return {
path: match?.[1]?.trim() ?? normalizedTarget,
fragment: match?.[2] ?? null,
};
}
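A simplified sketch of the target splitting above: everything before the first `#` (heading link) or `^` (block link) is the path, the rest is the fragment. Normalization is reduced here to slash cleanup for self-containment, so this mirrors but does not reproduce the full helper:

```typescript
// Simplified path normalization (the real helper also URL-decodes and trims prefixes).
function normalize(path: string): string {
  return path.trim().replace(/\\/g, '/').replace(/\/+/g, '/');
}

function parseObsidianTarget(rawTarget: string): {
  path: string;
  fragment: string | null;
} {
  const normalizedTarget = normalize(rawTarget);
  // Group 1: the path portion; group 2: an optional '#…' or '^…' fragment.
  const match = normalizedTarget.match(/^([^#^]+)([#^].*)?$/);
  return {
    path: match?.[1]?.trim() ?? normalizedTarget,
    fragment: match?.[2] ?? null,
  };
}
```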
function extractTitleAndEmoji(rawTitle: string): {
title: string;
emoji: string | null;
} {
const SINGLE_LEADING_EMOJI_RE =
/^[\s\u200b]*((?:[\p{Emoji_Presentation}\p{Extended_Pictographic}\u200b]|\u200d|\ufe0f)+)/u;
let currentTitle = rawTitle;
let extractedEmojiClusters = '';
let emojiMatch;
while ((emojiMatch = currentTitle.match(SINGLE_LEADING_EMOJI_RE))) {
const matchedCluster = emojiMatch[1].trim();
extractedEmojiClusters +=
(extractedEmojiClusters ? ' ' : '') + matchedCluster;
currentTitle = currentTitle.slice(emojiMatch[0].length);
}
return {
title: currentTitle.trim(),
emoji: extractedEmojiClusters || null,
};
}
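The loop above peels leading emoji clusters off a title one match at a time, so a title like `📚 Reading List` yields the emoji separately from the clean title. An illustrative copy (same regex and loop as the diff):

```typescript
const SINGLE_LEADING_EMOJI_RE =
  /^[\s\u200b]*((?:[\p{Emoji_Presentation}\p{Extended_Pictographic}\u200b]|\u200d|\ufe0f)+)/u;

function extractTitleAndEmoji(rawTitle: string): {
  title: string;
  emoji: string | null;
} {
  let currentTitle = rawTitle;
  let extractedEmojiClusters = '';
  let emojiMatch;
  // Loop so several consecutive leading clusters are all collected.
  while ((emojiMatch = currentTitle.match(SINGLE_LEADING_EMOJI_RE))) {
    const matchedCluster = emojiMatch[1].trim();
    extractedEmojiClusters +=
      (extractedEmojiClusters ? ' ' : '') + matchedCluster;
    currentTitle = currentTitle.slice(emojiMatch[0].length);
  }
  return { title: currentTitle.trim(), emoji: extractedEmojiClusters || null };
}
```

The capture group requires at least one emoji character, so the loop terminates as soon as the remaining title starts with ordinary text.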
function preprocessTitleHeader(markdown: string): string {
return markdown.replace(
/^(\s*#\s+)(.*)$/m,
(_, headerPrefix, titleContent) => {
const { title: cleanTitle } = extractTitleAndEmoji(titleContent);
return `${headerPrefix}${cleanTitle}`;
}
);
}
function preprocessObsidianCallouts(markdown: string): string {
return markdown.replace(
/^(> *)\[!([^\]\n]+)\]([+-]?)([^\n]*)/gm,
(_, prefix, type, _fold, rest) => {
const calloutToken =
CALLOUT_TYPE_MAP[type.trim().toLowerCase()] ?? DEFAULT_CALLOUT_EMOJI;
const title = rest.trim();
return title
? `${prefix}[!${calloutToken}] ${title}`
: `${prefix}[!${calloutToken}]`;
}
);
}
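The rewrite above maps Obsidian's `[!type]` callout markers to emoji tokens and drops the `+`/`-` fold marker. A sketch with a trimmed-down map (three entries here instead of the full table, for illustration only):

```typescript
// Trimmed-down callout map for illustration; the real map covers many more types.
const CALLOUT_TYPE_MAP: Record<string, string> = {
  note: '💡',
  warning: '⚠️',
  error: '🚨',
};
const DEFAULT_CALLOUT_EMOJI = '💡';

function preprocessObsidianCallouts(markdown: string): string {
  // '> [!warning]- Title' -> '> [!⚠️] Title' (the fold marker is dropped).
  return markdown.replace(
    /^(> *)\[!([^\]\n]+)\]([+-]?)([^\n]*)/gm,
    (_, prefix, type, _fold, rest) => {
      const token =
        CALLOUT_TYPE_MAP[type.trim().toLowerCase()] ?? DEFAULT_CALLOUT_EMOJI;
      const title = rest.trim();
      return title ? `${prefix}[!${token}] ${title}` : `${prefix}[!${token}]`;
    }
  );
}
```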
function isStructuredFootnoteDefinition(content: string): boolean {
try {
return FootNoteReferenceParamsSchema.safeParse(JSON.parse(content.trim()))
.success;
} catch {
return false;
}
}
function splitFootnoteTextContent(content: string): {
title: string;
description?: string;
} {
const lines = content
.split('\n')
.map(line => line.trim())
.filter(Boolean);
const title = lines[0] ?? content.trim();
const description = lines.slice(1).join('\n').trim();
return {
title,
...(description ? { description } : {}),
};
}
function createTextFootnoteDefinition(content: string): string {
const normalizedContent = content.trim();
const { title, description } = splitFootnoteTextContent(normalizedContent);
return JSON.stringify({
type: 'url',
url: encodeURIComponent(
`${OBSIDIAN_TEXT_FOOTNOTE_URL_PREFIX}${encodeURIComponent(
normalizedContent
)}`
),
title,
...(description ? { description } : {}),
});
}
function parseFootnoteDefLine(line: string): {
identifier: string;
content: string;
} | null {
if (!line.startsWith('[^')) return null;
const closeBracketIndex = line.indexOf(']:', 2);
if (closeBracketIndex <= 2) return null;
const identifier = line.slice(2, closeBracketIndex);
if (!identifier || identifier.includes(']')) return null;
let contentStart = closeBracketIndex + 2;
while (
contentStart < line.length &&
(line[contentStart] === ' ' || line[contentStart] === '\t')
) {
contentStart += 1;
}
return {
identifier,
content: line.slice(contentStart),
};
}
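The parser above splits a `[^id]: content` footnote definition line by hand rather than with a regex. An illustrative copy showing the behavior:

```typescript
// Illustrative copy: parse one '[^id]: content' footnote definition line.
function parseFootnoteDefLine(line: string): {
  identifier: string;
  content: string;
} | null {
  if (!line.startsWith('[^')) return null;
  const closeBracketIndex = line.indexOf(']:', 2);
  // '<= 2' also rejects an empty identifier ('[^]:').
  if (closeBracketIndex <= 2) return null;
  const identifier = line.slice(2, closeBracketIndex);
  if (!identifier || identifier.includes(']')) return null;
  // Skip the spaces/tabs between ']:' and the definition body.
  let contentStart = closeBracketIndex + 2;
  while (
    contentStart < line.length &&
    (line[contentStart] === ' ' || line[contentStart] === '\t')
  ) {
    contentStart += 1;
  }
  return { identifier, content: line.slice(contentStart) };
}
```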
function extractObsidianFootnotes(markdown: string): {
content: string;
footnotes: string[];
} {
const lines = markdown.split('\n');
const output: string[] = [];
const footnotes: string[] = [];
for (let index = 0; index < lines.length; index += 1) {
const line = lines[index];
const definition = parseFootnoteDefLine(line);
if (!definition) {
output.push(line);
continue;
}
const { identifier } = definition;
const contentLines = [definition.content];
while (index + 1 < lines.length) {
const nextLine = lines[index + 1];
if (/^(?: {1,4}|\t)/.test(nextLine)) {
contentLines.push(nextLine.replace(/^(?: {1,4}|\t)/, ''));
index += 1;
continue;
}
if (
nextLine.trim() === '' &&
index + 2 < lines.length &&
/^(?: {1,4}|\t)/.test(lines[index + 2])
) {
contentLines.push('');
index += 1;
continue;
}
break;
}
const content = contentLines.join('\n').trim();
footnotes.push(
`[^${identifier}]: ${
!content || isStructuredFootnoteDefinition(content)
? content
: createTextFootnoteDefinition(content)
}`
);
}
return { content: output.join('\n'), footnotes };
}
function buildLookupKeys(
targetPath: string,
currentFilePath?: string
): string[] {
const parsedTargetPath = normalizeFilePathReference(targetPath);
if (!parsedTargetPath) {
return [];
}
const keys = new Set<string>();
const addPathVariants = (value: string) => {
const normalizedValue = normalizeFilePathReference(value);
if (!normalizedValue) {
return;
}
keys.add(normalizedValue);
keys.add(stripMarkdownExtension(normalizedValue));
const fileName = basename(normalizedValue);
keys.add(fileName);
keys.add(stripMarkdownExtension(fileName));
const cleanTitle = extractTitleAndEmoji(
stripMarkdownExtension(fileName)
).title;
if (cleanTitle) {
keys.add(cleanTitle);
}
};
addPathVariants(parsedTargetPath);
if (currentFilePath) {
addPathVariants(getImageFullPath(currentFilePath, parsedTargetPath));
}
return Array.from(keys).map(normalizeLookupKey);
}
function registerPageLookup(
pageLookupMap: Map<string, string>,
key: string,
pageId: string
) {
const normalizedKey = normalizeLookupKey(key);
if (!normalizedKey) {
return;
}
const existing = pageLookupMap.get(normalizedKey);
if (existing && existing !== pageId) {
pageLookupMap.set(normalizedKey, AMBIGUOUS_PAGE_LOOKUP);
return;
}
pageLookupMap.set(normalizedKey, pageId);
}
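The registration above collapses a key claimed by two different pages to a sentinel, so later lookups leave ambiguous wikilinks as plain text rather than linking an arbitrary page. A sketch with normalization reduced to lowercasing (the real helper normalizes the full path first):

```typescript
const AMBIGUOUS_PAGE_LOOKUP = '__ambiguous__';

function registerPageLookup(
  pageLookupMap: Map<string, string>,
  key: string,
  pageId: string
) {
  // Simplified normalization for illustration: trim + lowercase only.
  const normalizedKey = key.trim().toLowerCase();
  if (!normalizedKey) return;
  const existing = pageLookupMap.get(normalizedKey);
  if (existing && existing !== pageId) {
    // Two different pages claimed the same key: mark it ambiguous.
    pageLookupMap.set(normalizedKey, AMBIGUOUS_PAGE_LOOKUP);
    return;
  }
  pageLookupMap.set(normalizedKey, pageId);
}
```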
function resolvePageIdFromLookup(
pageLookupMap: Pick<ReadonlyMap<string, string>, 'get'>,
rawTarget: string,
currentFilePath?: string
): string | null {
const { path } = parseObsidianTarget(rawTarget);
for (const key of buildLookupKeys(path, currentFilePath)) {
const targetPageId = pageLookupMap.get(key);
if (!targetPageId || targetPageId === AMBIGUOUS_PAGE_LOOKUP) {
continue;
}
return targetPageId;
}
return null;
}
function resolveWikilinkDisplayTitle(
rawAlias: string | undefined,
pageEmoji: string | undefined
): string | undefined {
if (!rawAlias) {
return undefined;
}
const { title: aliasTitle, emoji: aliasEmoji } =
extractTitleAndEmoji(rawAlias);
if (aliasEmoji && aliasEmoji === pageEmoji) {
return aliasTitle;
}
return rawAlias;
}
function isImageAssetPath(path: string): boolean {
const extension = path.split('.').at(-1)?.toLowerCase() ?? '';
return extMimeMap.get(extension)?.startsWith('image/') ?? false;
}
function encodeMarkdownPath(path: string): string {
return encodeURI(path).replaceAll('(', '%28').replaceAll(')', '%29');
}
function escapeMarkdownLabel(label: string): string {
return label.replace(/[[\]\\]/g, '\\$&');
}
function isObsidianSizeAlias(alias: string | undefined): boolean {
return !!alias && /^\d+(?:x\d+)?$/i.test(alias.trim());
}
function getEmbedLabel(
rawAlias: string | undefined,
targetPath: string,
fallbackToFileName: boolean
): string {
if (!rawAlias || isObsidianSizeAlias(rawAlias)) {
return fallbackToFileName
? stripMarkdownExtension(basename(targetPath))
: '';
}
return rawAlias.trim();
}
type ObsidianAttachmentEmbed = {
blobId: string;
fileName: string;
fileType: string;
};
function createObsidianAttach(embed: ObsidianAttachmentEmbed): string {
return `<!-- ${OBSIDIAN_ATTACHMENT_EMBED_TAG} ${encodeURIComponent(
JSON.stringify(embed)
)} -->`;
}
function parseObsidianAttach(value: string): ObsidianAttachmentEmbed | null {
const match = value.match(
new RegExp(`^<!-- ${OBSIDIAN_ATTACHMENT_EMBED_TAG} ([^ ]+) -->$`)
);
if (!match?.[1]) return null;
try {
const parsed = JSON.parse(
decodeURIComponent(match[1])
) as ObsidianAttachmentEmbed;
if (!parsed.blobId || !parsed.fileName) {
return null;
}
return parsed;
} catch {
return null;
}
}
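The pair above smuggles an attachment through the markdown pipeline as an HTML comment whose payload is URL-encoded JSON; the block matcher later decodes it back. An illustrative copy showing the round trip:

```typescript
const OBSIDIAN_ATTACHMENT_EMBED_TAG = 'obsidian-attachment';

type ObsidianAttachmentEmbed = {
  blobId: string;
  fileName: string;
  fileType: string;
};

function createObsidianAttach(embed: ObsidianAttachmentEmbed): string {
  // encodeURIComponent guarantees a space-free payload, so the regex below
  // can capture it with '[^ ]+'.
  return `<!-- ${OBSIDIAN_ATTACHMENT_EMBED_TAG} ${encodeURIComponent(
    JSON.stringify(embed)
  )} -->`;
}

function parseObsidianAttach(value: string): ObsidianAttachmentEmbed | null {
  const match = value.match(
    new RegExp(`^<!-- ${OBSIDIAN_ATTACHMENT_EMBED_TAG} ([^ ]+) -->$`)
  );
  if (!match?.[1]) return null;
  try {
    const parsed = JSON.parse(
      decodeURIComponent(match[1])
    ) as ObsidianAttachmentEmbed;
    // Reject payloads missing the required identifiers.
    if (!parsed.blobId || !parsed.fileName) return null;
    return parsed;
  } catch {
    return null;
  }
}
```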
function parseWikiLinkAt(
source: string,
startIdx: number,
embedded: boolean
): {
raw: string;
rawTarget: string;
rawAlias?: string;
endIdx: number;
} | null {
const opener = embedded ? '![[' : '[[';
if (!source.startsWith(opener, startIdx)) return null;
const contentStart = startIdx + opener.length;
const closeIndex = source.indexOf(']]', contentStart);
if (closeIndex === -1) return null;
const inner = source.slice(contentStart, closeIndex);
const separatorIdx = inner.indexOf('|');
const rawTarget = separatorIdx === -1 ? inner : inner.slice(0, separatorIdx);
const rawAlias =
separatorIdx === -1 ? undefined : inner.slice(separatorIdx + 1);
if (
rawTarget.length === 0 ||
rawTarget.includes(']') ||
rawTarget.includes('|') ||
rawAlias?.includes(']')
) {
return null;
}
return {
raw: source.slice(startIdx, closeIndex + 2),
rawTarget,
rawAlias,
endIdx: closeIndex + 2,
};
}
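The scanner above parses a single `[[target|alias]]` (or `![[…]]`) occurrence at a known index by hand rather than with a regex, which makes the rejection rules explicit. An illustrative copy:

```typescript
// Illustrative copy: parse one wikilink starting at startIdx, or return null.
function parseWikiLinkAt(
  source: string,
  startIdx: number,
  embedded: boolean
): { raw: string; rawTarget: string; rawAlias?: string; endIdx: number } | null {
  const opener = embedded ? '![[' : '[[';
  if (!source.startsWith(opener, startIdx)) return null;
  const contentStart = startIdx + opener.length;
  const closeIndex = source.indexOf(']]', contentStart);
  if (closeIndex === -1) return null;
  const inner = source.slice(contentStart, closeIndex);
  const separatorIdx = inner.indexOf('|');
  const rawTarget = separatorIdx === -1 ? inner : inner.slice(0, separatorIdx);
  const rawAlias =
    separatorIdx === -1 ? undefined : inner.slice(separatorIdx + 1);
  // Reject malformed bodies: empty target, stray brackets, or a second pipe.
  if (
    rawTarget.length === 0 ||
    rawTarget.includes(']') ||
    rawTarget.includes('|') ||
    rawAlias?.includes(']')
  ) {
    return null;
  }
  return {
    raw: source.slice(startIdx, closeIndex + 2),
    rawTarget,
    rawAlias,
    endIdx: closeIndex + 2,
  };
}
```

`endIdx` points just past the closing `]]`, which is what lets `replaceWikiLinks` resume scanning without re-reading the match.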
function replaceWikiLinks(
source: string,
embedded: boolean,
replacer: (match: {
raw: string;
rawTarget: string;
rawAlias?: string;
}) => string
): string {
const opener = embedded ? '![[' : '[[';
let cursor = 0;
let output = '';
while (cursor < source.length) {
const matchStart = source.indexOf(opener, cursor);
if (matchStart === -1) {
output += source.slice(cursor);
break;
}
output += source.slice(cursor, matchStart);
const match = parseWikiLinkAt(source, matchStart, embedded);
if (!match) {
output += source.slice(matchStart, matchStart + opener.length);
cursor = matchStart + opener.length;
continue;
}
output += replacer(match);
cursor = match.endIdx;
}
return output;
}
function preprocessObsidianEmbeds(
markdown: string,
filePath: string,
pageLookupMap: ReadonlyMap<string, string>,
pathBlobIdMap: ReadonlyMap<string, string>
): string {
return replaceWikiLinks(markdown, true, ({ raw, rawTarget, rawAlias }) => {
const targetPageId = resolvePageIdFromLookup(
pageLookupMap,
rawTarget,
filePath
);
if (targetPageId) {
return `[[${rawTarget}${rawAlias ? `|${rawAlias}` : ''}]]`;
}
const { path } = parseObsidianTarget(rawTarget);
if (!path) return raw;
const assetPath = getImageFullPath(filePath, path);
const encodedPath = encodeMarkdownPath(assetPath);
if (isImageAssetPath(path)) {
const alt = getEmbedLabel(rawAlias, path, false);
return `![${escapeMarkdownLabel(alt)}](${encodedPath})`;
}
const label = getEmbedLabel(rawAlias, path, true);
const blobId = pathBlobIdMap.get(assetPath);
if (!blobId) return `[${escapeMarkdownLabel(label)}](${encodedPath})`;
const extension = path.split('.').at(-1)?.toLowerCase() ?? '';
return createObsidianAttach({
blobId,
fileName: basename(path),
fileType: extMimeMap.get(extension) ?? '',
});
});
}
function preprocessObsidianMarkdown(
markdown: string,
filePath: string,
pageLookupMap: ReadonlyMap<string, string>,
pathBlobIdMap: ReadonlyMap<string, string>
): string {
const { content: contentWithoutFootnotes, footnotes: extractedFootnotes } =
extractObsidianFootnotes(markdown);
const content = preprocessObsidianEmbeds(
contentWithoutFootnotes,
filePath,
pageLookupMap,
pathBlobIdMap
);
const normalizedMarkdown = preprocessTitleHeader(
preprocessObsidianCallouts(content)
);
if (extractedFootnotes.length === 0) {
return normalizedMarkdown;
}
const trimmedMarkdown = normalizedMarkdown.replace(/\s+$/, '');
return `${trimmedMarkdown}\n\n${extractedFootnotes.join('\n\n')}\n`;
}
function isObsidianAttachmentEmbedNode(node: MarkdownAST): node is Html {
return node.type === 'html' && !!parseObsidianAttach(node.value);
}
export const obsidianAttachmentEmbedMarkdownAdapterMatcher =
BlockMarkdownAdapterExtension({
flavour: 'obsidian:attachment-embed',
toMatch: o => isObsidianAttachmentEmbedNode(o.node),
fromMatch: () => false,
toBlockSnapshot: {
enter: (o, context) => {
if (!isObsidianAttachmentEmbedNode(o.node)) {
return;
}
const attachment = parseObsidianAttach(o.node.value);
if (!attachment) {
return;
}
const assetFile = context.assets?.getAssets().get(attachment.blobId);
context.walkerContext
.openNode(
createAttachmentBlockSnapshot({
id: nanoid(),
props: {
name: attachment.fileName,
size: assetFile?.size ?? 0,
type:
attachment.fileType ||
assetFile?.type ||
'application/octet-stream',
sourceId: attachment.blobId,
embed: false,
style: 'horizontalThin',
footnoteIdentifier: null,
},
}),
'children'
)
.closeNode();
(o.node as unknown as { type: string }).type =
'obsidianAttachmentEmbed';
},
},
fromBlockSnapshot: {},
});
export const obsidianWikilinkToDeltaMatcher = MarkdownASTToDeltaExtension({
name: 'obsidian-wikilink',
match: ast => ast.type === 'text',
toDelta: (ast, context) => {
const textNode = ast as Text;
if (!textNode.value) {
return [];
}
const nodeContent = textNode.value;
const deltas: DeltaInsert<AffineTextAttributes>[] = [];
let cursor = 0;
while (cursor < nodeContent.length) {
const matchStart = nodeContent.indexOf('[[', cursor);
if (matchStart === -1) {
deltas.push({ insert: nodeContent.substring(cursor) });
break;
}
if (matchStart > cursor) {
deltas.push({
insert: nodeContent.substring(cursor, matchStart),
});
}
const linkMatch = parseWikiLinkAt(nodeContent, matchStart, false);
if (!linkMatch) {
deltas.push({ insert: '[[' });
cursor = matchStart + 2;
continue;
}
const targetPageName = linkMatch.rawTarget.trim();
const alias = linkMatch.rawAlias?.trim();
const currentFilePath = context.configs.get(FULL_FILE_PATH_KEY);
const targetPageId = resolvePageIdFromLookup(
{ get: key => context.configs.get(`obsidian:pageId:${key}`) },
targetPageName,
typeof currentFilePath === 'string' ? currentFilePath : undefined
);
if (targetPageId) {
const pageEmoji = context.configs.get(
'obsidian:pageEmoji:' + targetPageId
);
const displayTitle = resolveWikilinkDisplayTitle(alias, pageEmoji);
deltas.push({
insert: ' ',
attributes: {
reference: {
type: 'LinkedPage',
pageId: targetPageId,
...(displayTitle ? { title: displayTitle } : {}),
},
},
});
} else {
deltas.push({ insert: linkMatch.raw });
}
cursor = linkMatch.endIdx;
}
return deltas;
},
});
export type ImportObsidianVaultOptions = {
collection: Workspace;
schema: Schema;
importedFiles: File[];
extensions: ExtensionType[];
};
export type ImportObsidianVaultResult = {
docIds: string[];
docEmojis: Map<string, string>;
};
export async function importObsidianVault({
collection,
schema,
importedFiles,
extensions,
}: ImportObsidianVaultOptions): Promise<ImportObsidianVaultResult> {
const provider = getProvider([
obsidianWikilinkToDeltaMatcher,
obsidianAttachmentEmbedMarkdownAdapterMatcher,
...extensions,
]);
const docIds: string[] = [];
const docEmojis = new Map<string, string>();
const pendingAssets: AssetMap = new Map();
const pendingPathBlobIdMap: PathBlobIdMap = new Map();
const markdownBlobs: MarkdownFileImportEntry[] = [];
const pageLookupMap = new Map<string, string>();
for (const file of importedFiles) {
const filePath = file.webkitRelativePath || file.name;
if (isSystemImportPath(filePath)) continue;
if (file.name.endsWith('.md')) {
const fileNameWithoutExt = file.name.replace(/\.[^/.]+$/, '');
const markdown = await file.text();
const { content, meta } = parseFrontmatter(markdown);
const documentTitleCandidate = meta.title ?? fileNameWithoutExt;
const { title: preferredTitle, emoji: leadingEmoji } =
extractTitleAndEmoji(documentTitleCandidate);
const newPageId = collection.idGenerator();
registerPageLookup(pageLookupMap, filePath, newPageId);
registerPageLookup(
pageLookupMap,
stripMarkdownExtension(filePath),
newPageId
);
registerPageLookup(pageLookupMap, file.name, newPageId);
registerPageLookup(pageLookupMap, fileNameWithoutExt, newPageId);
registerPageLookup(pageLookupMap, documentTitleCandidate, newPageId);
registerPageLookup(pageLookupMap, preferredTitle, newPageId);
if (leadingEmoji) {
docEmojis.set(newPageId, leadingEmoji);
}
markdownBlobs.push({
filename: file.name,
contentBlob: file,
fullPath: filePath,
pageId: newPageId,
preferredTitle,
content,
meta,
});
} else {
await stageImportedAsset({
pendingAssets,
pendingPathBlobIdMap,
path: filePath,
content: file,
fileName: file.name,
});
}
}
for (const existingDocMeta of collection.meta.docMetas) {
if (existingDocMeta.title) {
registerPageLookup(
pageLookupMap,
existingDocMeta.title,
existingDocMeta.id
);
}
}
await Promise.all(
markdownBlobs.map(async markdownFile => {
const {
fullPath,
pageId: predefinedId,
preferredTitle,
content,
meta,
} = markdownFile;
const job = createMarkdownImportJob({
collection,
schema,
preferredTitle,
fullPath,
});
for (const [lookupKey, id] of pageLookupMap.entries()) {
if (id === AMBIGUOUS_PAGE_LOOKUP) {
continue;
}
job.adapterConfigs.set(`obsidian:pageId:${lookupKey}`, id);
}
for (const [id, emoji] of docEmojis.entries()) {
job.adapterConfigs.set('obsidian:pageEmoji:' + id, emoji);
}
const pathBlobIdMap = bindImportedAssetsToJob(
job,
pendingAssets,
pendingPathBlobIdMap
);
const preprocessedMarkdown = preprocessObsidianMarkdown(
content,
fullPath,
pageLookupMap,
pathBlobIdMap
);
const mdAdapter = new MarkdownAdapter(job, provider);
const snapshot = await mdAdapter.toDocSnapshot({
file: preprocessedMarkdown,
assets: job.assetsManager,
});
if (snapshot) {
snapshot.meta.id = predefinedId;
const doc = await job.snapshotToDoc(snapshot);
if (doc) {
applyMetaPatch(collection, doc.id, {
...meta,
title: preferredTitle,
trash: false,
});
docIds.push(doc.id);
}
}
})
);
return { docIds, docEmojis };
}
export const ObsidianTransformer = {
importObsidianVault,
};
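The import loop above registers many lookup keys per page (full path, path without extension, file name, title variants), and the config-population step later skips entries equal to `AMBIGUOUS_PAGE_LOOKUP`. A hypothetical sketch of the collision rule this implies (`registerPageLookup` itself is defined elsewhere; the names below are illustrative only):

```typescript
// Hypothetical sketch of the page-lookup table semantics assumed above:
// a key claimed by two different pages becomes ambiguous and is skipped
// when adapter configs are populated.
const AMBIGUOUS = '__ambiguous__'; // stand-in for AMBIGUOUS_PAGE_LOOKUP

function register(map: Map<string, string>, key: string, pageId: string) {
  const existing = map.get(key);
  if (existing !== undefined && existing !== pageId) {
    map.set(key, AMBIGUOUS); // two pages claim the same key
    return;
  }
  map.set(key, pageId);
}

function usableEntries(map: Map<string, string>): Array<[string, string]> {
  return [...map.entries()].filter(([, id]) => id !== AMBIGUOUS);
}
```

Skipping ambiguous keys means a wikilink whose target matches two imported notes stays as plain text rather than linking to the wrong page.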


@@ -1,3 +1,5 @@
+import type { ParsedFrontmatterMeta } from './markdown.js';
+
 /**
  * Represents an imported file entry in the zip archive
  */
@@ -10,6 +12,13 @@ export type ImportedFileEntry = {
   fullPath: string;
 };
+export type MarkdownFileImportEntry = ImportedFileEntry & {
+  pageId: string;
+  preferredTitle: string;
+  content: string;
+  meta: ParsedFrontmatterMeta;
+};
+
 /**
  * Map of asset hash to File object for all media files in the zip
  * Key: SHA hash of the file content (blobId)


@@ -162,10 +162,11 @@ export class AffineToolbarWidget extends WidgetComponent {
   }
   setReferenceElementWithElements(gfx: GfxController, elements: GfxModel[]) {
+    const surfaceBounds = getCommonBoundWithRotation(elements);
     const getBoundingClientRect = () => {
-      const bounds = getCommonBoundWithRotation(elements);
       const { x: offsetX, y: offsetY } = this.getBoundingClientRect();
-      const [x, y, w, h] = gfx.viewport.toViewBound(bounds).toXYWH();
+      const [x, y, w, h] = gfx.viewport.toViewBound(surfaceBounds).toXYWH();
       const rect = new DOMRect(x + offsetX, y + offsetY, w, h);
       return rect;
     };


@@ -34,6 +34,7 @@
   },
   "devDependencies": {
     "@vitest/browser-playwright": "^4.0.18",
+    "playwright": "=1.58.2",
     "vitest": "^4.0.18"
   },
   "exports": {


@@ -103,8 +103,9 @@ export abstract class GfxPrimitiveElementModel<
   }
   get deserializedXYWH() {
-    if (!this._lastXYWH || this.xywh !== this._lastXYWH) {
-      const xywh = this.xywh;
+    const xywh = this.xywh;
+    if (!this._lastXYWH || xywh !== this._lastXYWH) {
       this._local.set('deserializedXYWH', deserializeXYWH(xywh));
       this._lastXYWH = xywh;
     }
@@ -386,6 +387,8 @@ export abstract class GfxGroupLikeElementModel<
 {
   private _childIds: string[] = [];
+  private _xywhDirty = true;
+
   private readonly _mutex = createMutex();
   abstract children: Y.Map<any>;
@@ -420,24 +423,9 @@
   get xywh() {
     this._mutex(() => {
-      const curXYWH =
-        (this._local.get('xywh') as SerializedXYWH) ?? '[0,0,0,0]';
-      const newXYWH = this._getXYWH().serialize();
-      if (curXYWH !== newXYWH || !this._local.has('xywh')) {
-        this._local.set('xywh', newXYWH);
-        if (curXYWH !== newXYWH) {
-          this._onChange({
-            props: {
-              xywh: newXYWH,
-            },
-            oldValues: {
-              xywh: curXYWH,
-            },
-            local: true,
-          });
-        }
+      if (this._xywhDirty || !this._local.has('xywh')) {
+        this._local.set('xywh', this._getXYWH().serialize());
+        this._xywhDirty = false;
       }
     });
@@ -457,15 +445,41 @@
       bound = bound ? bound.unite(child.elementBound) : child.elementBound;
     });
-    if (bound) {
-      this._local.set('xywh', bound.serialize());
-    } else {
-      this._local.delete('xywh');
-    }
     return bound ?? new Bound(0, 0, 0, 0);
   }
+  invalidateXYWH() {
+    this._xywhDirty = true;
+    this._local.delete('deserializedXYWH');
+  }
+  refreshXYWH(local: boolean) {
+    this._mutex(() => {
+      const oldXYWH =
+        (this._local.get('xywh') as SerializedXYWH) ?? '[0,0,0,0]';
+      const nextXYWH = this._getXYWH().serialize();
+      this._xywhDirty = false;
+      if (oldXYWH === nextXYWH && this._local.has('xywh')) {
+        return;
+      }
+      this._local.set('xywh', nextXYWH);
+      this._local.delete('deserializedXYWH');
+      this._onChange({
+        props: {
+          xywh: nextXYWH,
+        },
+        oldValues: {
+          xywh: oldXYWH,
+        },
+        local,
+      });
+    });
+  }
   abstract addChild(element: GfxModel): void;
   /**
@@ -496,6 +510,7 @@
   setChildIds(value: string[], fromLocal: boolean) {
     const oldChildIds = this.childIds;
     this._childIds = value;
+    this.invalidateXYWH();
     this._onChange({
       props: {


@@ -52,6 +52,12 @@ export type MiddlewareCtx = {
 export type SurfaceMiddleware = (ctx: MiddlewareCtx) => void;
 export class SurfaceBlockModel extends BlockModel<SurfaceBlockProps> {
+  private static readonly _groupBoundImpactKeys = new Set([
+    'xywh',
+    'rotate',
+    'hidden',
+  ]);
+
   protected _decoratorState = createDecoratorState();
   protected _elementCtorMap: Record<
@@ -308,6 +314,42 @@
     Object.keys(payload.props).forEach(key => {
       model.propsUpdated.next({ key });
     });
+    this._refreshParentGroupBoundsForElement(model, payload);
+  }
+
+  private _refreshParentGroupBounds(id: string, local: boolean) {
+    const group = this.getGroup(id);
+    if (group instanceof GfxGroupLikeElementModel) {
+      group.refreshXYWH(local);
+    }
+  }
+
+  private _refreshParentGroupBoundsForElement(
+    model: GfxPrimitiveElementModel,
+    payload: ElementUpdatedData
+  ) {
+    if (
+      model instanceof GfxGroupLikeElementModel &&
+      ('childIds' in payload.props || 'childIds' in payload.oldValues)
+    ) {
+      model.refreshXYWH(payload.local);
+      return;
+    }
+    const affectedKeys = new Set([
+      ...Object.keys(payload.props),
+      ...Object.keys(payload.oldValues),
+    ]);
+    if (
+      Array.from(affectedKeys).some(key =>
+        SurfaceBlockModel._groupBoundImpactKeys.has(key)
+      )
+    ) {
+      this._refreshParentGroupBounds(model.id, payload.local);
+    }
   }
   private _initElementModels() {
@@ -458,6 +500,10 @@
           );
         }
+        if (payload.model instanceof BlockModel) {
+          this._refreshParentGroupBounds(payload.id, payload.isLocal);
+        }
         break;
       case 'delete':
         if (isGfxGroupCompatibleModel(payload.model)) {
@@ -482,6 +528,13 @@
           }
         }
+        if (
+          payload.props.key &&
+          SurfaceBlockModel._groupBoundImpactKeys.has(payload.props.key)
+        ) {
+          this._refreshParentGroupBounds(payload.id, payload.isLocal);
+        }
         break;
     }
   });


@@ -42,6 +42,7 @@
   "devDependencies": {
     "@vanilla-extract/vite-plugin": "^5.0.0",
     "@vitest/browser-playwright": "^4.0.18",
+    "playwright": "=1.58.2",
     "vite": "^7.2.7",
     "vite-plugin-istanbul": "^7.2.1",
     "vite-plugin-wasm": "^3.5.0",


@@ -4,6 +4,7 @@ import type {
   ConnectorElementModel,
   GroupElementModel,
 } from '@blocksuite/affine/model';
+import { serializeXYWH } from '@blocksuite/global/gfx';
 import { beforeEach, describe, expect, test } from 'vitest';
 import { wait } from '../utils/common.js';
@@ -138,6 +139,29 @@ describe('group', () => {
     expect(group.childIds).toEqual([id]);
   });
+  test('group xywh should update when child xywh changes', () => {
+    const shapeId = model.addElement({
+      type: 'shape',
+      xywh: serializeXYWH(0, 0, 100, 100),
+    });
+    const groupId = model.addElement({
+      type: 'group',
+      children: {
+        [shapeId]: true,
+      },
+    });
+    const group = model.getElementById(groupId) as GroupElementModel;
+    expect(group.xywh).toBe(serializeXYWH(0, 0, 100, 100));
+    model.updateElement(shapeId, {
+      xywh: serializeXYWH(50, 60, 100, 100),
+    });
+    expect(group.xywh).toBe(serializeXYWH(50, 60, 100, 100));
+  });
 });
 describe('connector', () => {

Binary file not shown. (Before: 24 KiB | After: 25 KiB)

Binary file not shown. (Before: 24 KiB | After: 25 KiB)

deny.toml Normal file

@@ -0,0 +1,48 @@
[graph]
all-features = true
exclude-dev = true
targets = [
"x86_64-unknown-linux-gnu",
"aarch64-apple-darwin",
"x86_64-apple-darwin",
"x86_64-pc-windows-msvc",
"aarch64-linux-android",
"aarch64-apple-ios",
"aarch64-apple-ios-sim",
]
[licenses]
allow = [
"0BSD",
"Apache-2.0",
"Apache-2.0 WITH LLVM-exception",
"BSD-2-Clause",
"BSD-3-Clause",
"BSL-1.0",
"CC0-1.0",
"CDLA-Permissive-2.0",
"ISC",
"MIT",
"MPL-2.0",
"Unicode-3.0",
"Unlicense",
"Zlib",
]
confidence-threshold = 0.93
unused-allowed-license = "allow"
version = 2
[[licenses.exceptions]]
allow = ["AGPL-3.0-only"]
crate = "llm_adapter"
[[licenses.exceptions]]
allow = ["AGPL-3.0-or-later"]
crate = "memory-indexer"
[[licenses.exceptions]]
allow = ["AGPL-3.0-or-later"]
crate = "path-ext"
[licenses.private]
ignore = true


@@ -92,7 +92,7 @@
     "vite": "^7.2.7",
     "vitest": "^4.0.18"
   },
-  "packageManager": "yarn@4.12.0",
+  "packageManager": "yarn@4.13.0",
   "resolutions": {
     "array-buffer-byte-length": "npm:@nolyfill/array-buffer-byte-length@^1",
     "array-includes": "npm:@nolyfill/array-includes@^1",


@@ -2,6 +2,7 @@
 edition = "2024"
 license-file = "LICENSE"
 name = "affine_server_native"
+publish = false
 version = "1.0.0"

 [lib]

@@ -25,7 +25,7 @@
   "dependencies": {
     "@affine/s3-compat": "workspace:*",
     "@affine/server-native": "workspace:*",
-    "@apollo/server": "^4.13.0",
+    "@apollo/server": "^5.0.0",
     "@fal-ai/serverless-client": "^0.15.0",
     "@google-cloud/opentelemetry-cloud-trace-exporter": "^3.0.0",
     "@google-cloud/opentelemetry-resource-util": "^3.0.0",
@@ -33,30 +33,30 @@
     "@nestjs-cls/transactional-adapter-prisma": "^1.3.4",
     "@nestjs/apollo": "^13.0.4",
     "@nestjs/bullmq": "^11.0.4",
-    "@nestjs/common": "^11.0.21",
-    "@nestjs/core": "^11.1.14",
+    "@nestjs/common": "^11.1.17",
+    "@nestjs/core": "^11.1.17",
     "@nestjs/graphql": "^13.0.4",
-    "@nestjs/platform-express": "^11.1.14",
-    "@nestjs/platform-socket.io": "^11.1.14",
+    "@nestjs/platform-express": "^11.1.17",
+    "@nestjs/platform-socket.io": "^11.1.17",
     "@nestjs/schedule": "^6.1.1",
     "@nestjs/throttler": "^6.5.0",
-    "@nestjs/websockets": "^11.1.14",
+    "@nestjs/websockets": "^11.1.17",
     "@node-rs/argon2": "^2.0.2",
     "@node-rs/crc32": "^1.10.6",
     "@opentelemetry/api": "^1.9.0",
     "@opentelemetry/core": "^2.2.0",
-    "@opentelemetry/exporter-prometheus": "^0.212.0",
-    "@opentelemetry/exporter-zipkin": "^2.2.0",
-    "@opentelemetry/host-metrics": "^0.38.0",
-    "@opentelemetry/instrumentation": "^0.212.0",
-    "@opentelemetry/instrumentation-graphql": "^0.60.0",
-    "@opentelemetry/instrumentation-http": "^0.212.0",
-    "@opentelemetry/instrumentation-ioredis": "^0.60.0",
-    "@opentelemetry/instrumentation-nestjs-core": "^0.58.0",
-    "@opentelemetry/instrumentation-socket.io": "^0.59.0",
+    "@opentelemetry/exporter-prometheus": "^0.213.0",
+    "@opentelemetry/exporter-zipkin": "^2.6.0",
+    "@opentelemetry/host-metrics": "^0.38.3",
+    "@opentelemetry/instrumentation": "^0.213.0",
+    "@opentelemetry/instrumentation-graphql": "^0.61.0",
+    "@opentelemetry/instrumentation-http": "^0.213.0",
+    "@opentelemetry/instrumentation-ioredis": "^0.61.0",
+    "@opentelemetry/instrumentation-nestjs-core": "^0.59.0",
+    "@opentelemetry/instrumentation-socket.io": "^0.60.0",
     "@opentelemetry/resources": "^2.2.0",
     "@opentelemetry/sdk-metrics": "^2.2.0",
-    "@opentelemetry/sdk-node": "^0.212.0",
+    "@opentelemetry/sdk-node": "^0.213.0",
     "@opentelemetry/sdk-trace-node": "^2.2.0",
     "@opentelemetry/semantic-conventions": "^1.38.0",
     "@prisma/client": "^6.6.0",
@@ -72,7 +72,7 @@
     "eventemitter2": "^6.4.9",
     "exa-js": "^2.4.0",
     "express": "^5.0.1",
-    "fast-xml-parser": "^5.3.4",
+    "fast-xml-parser": "^5.5.7",
     "get-stream": "^9.0.1",
     "google-auth-library": "^10.2.0",
     "graphql": "^16.9.0",


@@ -1,12 +1,35 @@
 import test from 'ava';
 import { z } from 'zod';
+import type { DocReader } from '../../core/doc';
+import type { AccessController } from '../../core/permission';
+import type { Models } from '../../models';
 import { NativeLlmRequest, NativeLlmStreamEvent } from '../../native';
 import {
   ToolCallAccumulator,
   ToolCallLoop,
   ToolSchemaExtractor,
 } from '../../plugins/copilot/providers/loop';
+import {
+  buildBlobContentGetter,
+  createBlobReadTool,
+} from '../../plugins/copilot/tools/blob-read';
+import {
+  buildDocKeywordSearchGetter,
+  createDocKeywordSearchTool,
+} from '../../plugins/copilot/tools/doc-keyword-search';
+import {
+  buildDocContentGetter,
+  createDocReadTool,
+} from '../../plugins/copilot/tools/doc-read';
+import {
+  buildDocSearchGetter,
+  createDocSemanticSearchTool,
+} from '../../plugins/copilot/tools/doc-semantic-search';
+import {
+  DOCUMENT_SYNC_PENDING_MESSAGE,
+  LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
+} from '../../plugins/copilot/tools/doc-sync';
 test('ToolCallAccumulator should merge deltas and complete tool call', t => {
   const accumulator = new ToolCallAccumulator();
@@ -286,3 +309,210 @@ test('ToolCallLoop should surface invalid JSON as tool error without executing',
     is_error: true,
   });
 });
test('doc_read should return specific sync errors for unavailable docs', async t => {
const cases = [
{
name: 'local workspace without cloud sync',
workspace: null,
authors: null,
markdown: null,
expected: {
type: 'error',
name: 'Workspace Sync Required',
message: LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
},
docReaderCalled: false,
},
{
name: 'cloud workspace document not synced to server yet',
workspace: { id: 'ws-1' },
authors: null,
markdown: null,
expected: {
type: 'error',
name: 'Document Sync Pending',
message: DOCUMENT_SYNC_PENDING_MESSAGE('doc-1'),
},
docReaderCalled: false,
},
{
name: 'cloud workspace document markdown not ready yet',
workspace: { id: 'ws-1' },
authors: {
createdAt: new Date('2026-01-01T00:00:00.000Z'),
updatedAt: new Date('2026-01-01T00:00:00.000Z'),
createdByUser: null,
updatedByUser: null,
},
markdown: null,
expected: {
type: 'error',
name: 'Document Sync Pending',
message: DOCUMENT_SYNC_PENDING_MESSAGE('doc-1'),
},
docReaderCalled: true,
},
] as const;
const ac = {
user: () => ({
workspace: () => ({ doc: () => ({ can: async () => true }) }),
}),
} as unknown as AccessController;
for (const testCase of cases) {
let docReaderCalled = false;
const docReader = {
getDocMarkdown: async () => {
docReaderCalled = true;
return testCase.markdown;
},
} as unknown as DocReader;
const models = {
workspace: {
get: async () => testCase.workspace,
},
doc: {
getAuthors: async () => testCase.authors,
},
} as unknown as Models;
const getDoc = buildDocContentGetter(ac, docReader, models);
const tool = createDocReadTool(
getDoc.bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const result = await tool.execute?.({ doc_id: 'doc-1' }, {});
t.is(docReaderCalled, testCase.docReaderCalled, testCase.name);
t.deepEqual(result, testCase.expected, testCase.name);
}
});
test('document search tools should return sync error for local workspace', async t => {
const ac = {
user: () => ({
workspace: () => ({
can: async () => true,
docs: async () => [],
}),
}),
} as unknown as AccessController;
const models = {
workspace: {
get: async () => null,
},
} as unknown as Models;
let keywordSearchCalled = false;
const indexerService = {
searchDocsByKeyword: async () => {
keywordSearchCalled = true;
return [];
},
} as unknown as Parameters<typeof buildDocKeywordSearchGetter>[1];
let semanticSearchCalled = false;
const contextService = {
matchWorkspaceAll: async () => {
semanticSearchCalled = true;
return [];
},
} as unknown as Parameters<typeof buildDocSearchGetter>[1];
const keywordTool = createDocKeywordSearchTool(
buildDocKeywordSearchGetter(ac, indexerService, models).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const semanticTool = createDocSemanticSearchTool(
buildDocSearchGetter(ac, contextService, null, models).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const keywordResult = await keywordTool.execute?.({ query: 'hello' }, {});
const semanticResult = await semanticTool.execute?.({ query: 'hello' }, {});
t.false(keywordSearchCalled);
t.false(semanticSearchCalled);
t.deepEqual(keywordResult, {
type: 'error',
name: 'Workspace Sync Required',
message: LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
});
t.deepEqual(semanticResult, {
type: 'error',
name: 'Workspace Sync Required',
message: LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE,
});
});
test('doc_semantic_search should return empty array when nothing matches', async t => {
const ac = {
user: () => ({
workspace: () => ({
can: async () => true,
docs: async () => [],
}),
}),
} as unknown as AccessController;
const models = {
workspace: {
get: async () => ({ id: 'workspace-1' }),
},
} as unknown as Models;
const contextService = {
matchWorkspaceAll: async () => [],
} as unknown as Parameters<typeof buildDocSearchGetter>[1];
const semanticTool = createDocSemanticSearchTool(
buildDocSearchGetter(ac, contextService, null, models).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const result = await semanticTool.execute?.({ query: 'hello' }, {});
t.deepEqual(result, []);
});
test('blob_read should return explicit error when attachment context is missing', async t => {
const ac = {
user: () => ({
workspace: () => ({
allowLocal: () => ({
can: async () => true,
}),
}),
}),
} as unknown as AccessController;
const blobTool = createBlobReadTool(
buildBlobContentGetter(ac, null).bind(null, {
user: 'user-1',
workspace: 'workspace-1',
})
);
const result = await blobTool.execute?.({ blob_id: 'blob-1' }, {});
t.deepEqual(result, {
type: 'error',
name: 'Blob Read Failed',
message:
'Missing workspace, user, blob id, or copilot context for blob_read.',
});
});


@@ -6,13 +6,16 @@ import ava, { TestFn } from 'ava';
 import Sinon from 'sinon';
 import { AppModule } from '../../app.module';
-import { ConfigFactory, URLHelper } from '../../base';
+import { ConfigFactory, InvalidOauthResponse, URLHelper } from '../../base';
 import { ConfigModule } from '../../base/config';
 import { CurrentUser } from '../../core/auth';
 import { AuthService } from '../../core/auth/service';
+import { ServerFeature } from '../../core/config/types';
 import { Models } from '../../models';
 import { OAuthProviderName } from '../../plugins/oauth/config';
+import { OAuthProviderFactory } from '../../plugins/oauth/factory';
 import { GoogleOAuthProvider } from '../../plugins/oauth/providers/google';
+import { OIDCProvider } from '../../plugins/oauth/providers/oidc';
 import { OAuthService } from '../../plugins/oauth/service';
 import { createTestingApp, currentUser, TestingApp } from '../utils';
@@ -35,6 +38,12 @@ test.before(async t => {
         clientId: 'google-client-id',
         clientSecret: 'google-client-secret',
       },
+      oidc: {
+        clientId: '',
+        clientSecret: '',
+        issuer: '',
+        args: {},
+      },
     },
   },
   server: {
@@ -432,6 +441,87 @@ function mockOAuthProvider(
   return clientNonce;
 }
function mockOidcProvider(
provider: OIDCProvider,
{
args = {},
idTokenClaims,
userinfo,
}: {
args?: Record<string, string>;
idTokenClaims: Record<string, unknown>;
userinfo: Record<string, unknown>;
}
) {
Sinon.stub(provider, 'config').get(() => ({
clientId: '',
clientSecret: '',
issuer: '',
args,
}));
Sinon.stub(
provider as unknown as { endpoints: { userinfo_endpoint: string } },
'endpoints'
).get(() => ({
userinfo_endpoint: 'https://oidc.affine.dev/userinfo',
}));
Sinon.stub(
provider as unknown as { verifyIdToken: () => unknown },
'verifyIdToken'
).resolves(idTokenClaims);
Sinon.stub(
provider as unknown as { fetchJson: () => unknown },
'fetchJson'
).resolves(userinfo);
}
function createOidcRegistrationHarness(config?: {
clientId?: string;
clientSecret?: string;
issuer?: string;
}) {
const server = {
enableFeature: Sinon.spy(),
disableFeature: Sinon.spy(),
};
const factory = new OAuthProviderFactory(server as any);
const affineConfig = {
server: {
externalUrl: 'https://affine.example',
host: 'localhost',
path: '',
https: true,
hosts: [],
},
oauth: {
providers: {
oidc: {
clientId: config?.clientId ?? 'oidc-client-id',
clientSecret: config?.clientSecret ?? 'oidc-client-secret',
issuer: config?.issuer ?? 'https://issuer.affine.dev',
args: {},
},
},
},
};
const provider = new OIDCProvider(new URLHelper(affineConfig as any));
(provider as any).factory = factory;
(provider as any).AFFiNEConfig = affineConfig;
return {
provider,
factory,
server,
};
}
async function flushAsyncWork(iterations = 5) {
for (let i = 0; i < iterations; i++) {
await new Promise(resolve => setImmediate(resolve));
}
}
test('should be able to sign up with oauth', async t => { test('should be able to sign up with oauth', async t => {
const { app, db } = t.context; const { app, db } = t.context;
@@ -554,3 +644,209 @@ test('should be able to fullfil user with oauth sign in', async t => {
t.truthy(account); t.truthy(account);
t.is(account!.user.id, u3.id); t.is(account!.user.id, u3.id);
}); });
test('oidc should accept email from id token when userinfo email is missing', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
idTokenClaims: {
sub: 'oidc-user',
email: 'oidc-id-token@affine.pro',
name: 'OIDC User',
},
userinfo: {
sub: 'oidc-user',
name: 'OIDC User',
},
});
const user = await provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
);
t.is(user.id, 'oidc-user');
t.is(user.email, 'oidc-id-token@affine.pro');
t.is(user.name, 'OIDC User');
});
test('oidc should resolve custom email claim from userinfo', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail', claim_name: 'display_name' },
idTokenClaims: {
sub: 'oidc-user',
},
userinfo: {
sub: 'oidc-user',
mail: 'oidc-userinfo@affine.pro',
display_name: 'OIDC Custom',
},
});
const user = await provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
);
t.is(user.id, 'oidc-user');
t.is(user.email, 'oidc-userinfo@affine.pro');
t.is(user.name, 'OIDC Custom');
});
test('oidc should resolve custom email claim from id token', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail', claim_email_verified: 'mail_verified' },
idTokenClaims: {
sub: 'oidc-user',
mail: 'oidc-custom-id-token@affine.pro',
mail_verified: 'true',
},
userinfo: {
sub: 'oidc-user',
},
});
const user = await provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
);
t.is(user.id, 'oidc-user');
t.is(user.email, 'oidc-custom-id-token@affine.pro');
});
test('oidc should reject responses without a usable email claim', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail' },
idTokenClaims: {
sub: 'oidc-user',
mail: 'not-an-email',
},
userinfo: {
sub: 'oidc-user',
mail: 'still-not-an-email',
},
});
const error = await t.throwsAsync(
provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
)
);
t.true(error instanceof InvalidOauthResponse);
t.true(
error.message.includes(
'Missing valid email claim in OIDC response. Tried userinfo and ID token claims: "mail"'
)
);
});
test('oidc should not fall back to default email claim when custom claim is configured', async t => {
const { app } = t.context;
const provider = app.get(OIDCProvider);
mockOidcProvider(provider, {
args: { claim_email: 'mail' },
idTokenClaims: {
sub: 'oidc-user',
email: 'fallback@affine.pro',
},
userinfo: {
sub: 'oidc-user',
email: 'userinfo-fallback@affine.pro',
},
});
const error = await t.throwsAsync(
provider.getUser(
{ accessToken: 'token', idToken: 'id-token' },
{ token: 'nonce', provider: OAuthProviderName.OIDC }
)
);
t.true(error instanceof InvalidOauthResponse);
t.true(
error.message.includes(
'Missing valid email claim in OIDC response. Tried userinfo and ID token claims: "mail"'
)
);
});
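The claim-resolution rule these tests pin down can be sketched standalone: only the configured claim name is consulted, userinfo first and then the ID token, with no fallback to the default `email` claim. The helper below is hypothetical and not the provider's actual code:

```typescript
// Hypothetical sketch of the email-claim resolution the OIDC tests assert:
// consult only the configured claim, in userinfo first, then the ID token.
function resolveEmail(
  claimName: string,
  userinfo: Record<string, unknown>,
  idToken: Record<string, unknown>
): string | null {
  const isEmail = (v: unknown): v is string =>
    typeof v === 'string' && /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(v);
  for (const source of [userinfo, idToken]) {
    const value = source[claimName];
    if (isEmail(value)) return value;
  }
  // No usable value under the configured claim: the caller raises
  // InvalidOauthResponse rather than silently falling back to 'email'.
  return null;
}
```

Failing loudly here is the safer choice: a silent fallback to `email` could link an account to an identity the deployment's IdP never vouched for under the configured claim.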
test('oidc discovery should remove oauth feature on failure and restore it after backoff retry succeeds', async t => {
const { provider, factory, server } = createOidcRegistrationHarness();
const fetchStub = Sinon.stub(globalThis, 'fetch');
const scheduledRetries: Array<() => void> = [];
const retryDelays: number[] = [];
const setTimeoutStub = Sinon.stub(globalThis, 'setTimeout').callsFake(((
callback: Parameters<typeof setTimeout>[0],
delay?: number
) => {
retryDelays.push(Number(delay));
scheduledRetries.push(callback as () => void);
return Symbol('timeout') as unknown as ReturnType<typeof setTimeout>;
}) as typeof setTimeout);
t.teardown(() => {
provider.onModuleDestroy();
fetchStub.restore();
setTimeoutStub.restore();
});
fetchStub
.onFirstCall()
.rejects(new Error('temporary discovery failure'))
.onSecondCall()
.rejects(new Error('temporary discovery failure'))
.onThirdCall()
.resolves(
new Response(
JSON.stringify({
authorization_endpoint: 'https://issuer.affine.dev/auth',
token_endpoint: 'https://issuer.affine.dev/token',
userinfo_endpoint: 'https://issuer.affine.dev/userinfo',
issuer: 'https://issuer.affine.dev',
jwks_uri: 'https://issuer.affine.dev/jwks',
}),
{
status: 200,
headers: { 'Content-Type': 'application/json' },
}
)
);
(provider as any).setup();
await flushAsyncWork();
t.deepEqual(factory.providers, []);
t.true(server.disableFeature.calledWith(ServerFeature.OAuth));
t.is(fetchStub.callCount, 1);
t.deepEqual(retryDelays, [1000]);
const firstRetry = scheduledRetries.shift();
t.truthy(firstRetry);
firstRetry!();
await flushAsyncWork();
t.is(fetchStub.callCount, 2);
t.deepEqual(factory.providers, []);
t.deepEqual(retryDelays, [1000, 2000]);
const secondRetry = scheduledRetries.shift();
t.truthy(secondRetry);
secondRetry!();
await flushAsyncWork();
t.is(fetchStub.callCount, 3);
t.deepEqual(factory.providers, [OAuthProviderName.OIDC]);
t.true(server.enableFeature.calledWith(ServerFeature.OAuth));
t.is(scheduledRetries.length, 0);
});


@@ -111,3 +111,20 @@ test('delete', async t => {
await t.throwsAsync(() => fs.access(join(config.path, provider.bucket, key)));
});
test('rejects unsafe object keys', async t => {
const provider = createProvider();
await t.throwsAsync(() => provider.put('../escape', Buffer.from('nope')));
await t.throwsAsync(() => provider.get('nested/../escape'));
await t.throwsAsync(() => provider.head('./escape'));
t.throws(() => provider.delete('nested//escape'));
});
test('rejects unsafe list prefixes', async t => {
const provider = createProvider();
await t.throwsAsync(() => provider.list('../escape'));
await t.throwsAsync(() => provider.list('nested/../../escape'));
await t.throwsAsync(() => provider.list('/absolute'));
});
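The tests above exercise the new segment-based validation: traversal sequences are rejected outright instead of being rewritten. A standalone sketch of that check (`isSafeKey` is a hypothetical helper mirroring the segment rules `normalizeStorageKey` enforces in this PR):

```typescript
// Hypothetical helper: a key is safe only when every path segment is a
// real name — no empty segments, no '.' or '..', no absolute paths.
function isSafeKey(key: string): boolean {
  const normalized = key.split('\\').join('/'); // same as replaceAll('\\', '/')
  if (!normalized || normalized.startsWith('/')) return false;
  return normalized
    .split('/')
    .every(segment => segment !== '' && segment !== '.' && segment !== '..');
}

// The traversal attempts from the tests are all rejected:
console.log(isSafeKey('../escape')); // false
console.log(isSafeKey('nested/../escape')); // false
console.log(isSafeKey('nested//escape')); // false
console.log(isSafeKey('docs/readme.md')); // true
```

Validating and failing loudly avoids the pitfall of the old substitution approach, where a rewritten key could still silently collide with another object.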


@@ -25,9 +25,47 @@ import {
} from './provider';
import { autoMetadata, toBuffer } from './utils';
function normalizeStorageKey(key: string): string {
const normalized = key.replaceAll('\\', '/');
const segments = normalized.split('/');
if (
!normalized ||
normalized.startsWith('/') ||
segments.some(segment => !segment || segment === '.' || segment === '..')
) {
throw new Error(`Invalid storage key: ${key}`);
}
return segments.join('/');
}
function normalizeStoragePrefix(prefix: string): string {
const normalized = prefix.replaceAll('\\', '/');
if (!normalized) {
return normalized;
}
if (normalized.startsWith('/')) {
throw new Error(`Invalid storage prefix: ${prefix}`);
}
const segments = normalized.split('/');
const lastSegment = segments.pop();
if (
lastSegment === undefined ||
segments.some(segment => !segment || segment === '.' || segment === '..') ||
lastSegment === '.' ||
lastSegment === '..'
) {
throw new Error(`Invalid storage prefix: ${prefix}`);
}
if (lastSegment === '') {
return `${segments.join('/')}/`;
}
return [...segments, lastSegment].join('/');
}
export interface FsStorageConfig {
@@ -57,7 +95,7 @@ export class FsStorageProvider implements StorageProvider {
body: BlobInputType,
metadata: PutObjectMetadata = {}
): Promise<void> {
key = normalizeStorageKey(key);
const blob = await toBuffer(body);
// write object
@@ -68,6 +106,7 @@ export class FsStorageProvider implements StorageProvider {
}
async head(key: string) {
key = normalizeStorageKey(key);
const metadata = this.readMetadata(key);
if (!metadata) {
this.logger.verbose(`Object \`${key}\` not found`);
@@ -80,7 +119,7 @@ export class FsStorageProvider implements StorageProvider {
body?: Readable;
metadata?: GetObjectMetadata;
}> {
key = normalizeStorageKey(key);
try {
const metadata = this.readMetadata(key);
@@ -105,7 +144,7 @@ export class FsStorageProvider implements StorageProvider {
// read dir recursively and filter out '.metadata.json' files
let dir = this.path;
if (prefix) {
prefix = normalizeStoragePrefix(prefix);
const parts = prefix.split(/[/\\]/);
// for prefix `a/b/c`, move `a/b` to dir and `c` to key prefix
if (parts.length > 1) {
@@ -152,7 +191,7 @@ export class FsStorageProvider implements StorageProvider {
}
delete(key: string): Promise<void> {
key = normalizeStorageKey(key);
try {
rmSync(this.join(key), { force: true });


@@ -0,0 +1,75 @@
import test from 'ava';
import Sinon from 'sinon';
import {
exponentialBackoffDelay,
ExponentialBackoffScheduler,
} from '../promise';
test('exponentialBackoffDelay should cap exponential growth at maxDelayMs', t => {
t.is(exponentialBackoffDelay(0, { baseDelayMs: 100, maxDelayMs: 500 }), 100);
t.is(exponentialBackoffDelay(1, { baseDelayMs: 100, maxDelayMs: 500 }), 200);
t.is(exponentialBackoffDelay(3, { baseDelayMs: 100, maxDelayMs: 500 }), 500);
});
test('ExponentialBackoffScheduler should track pending callback and increase delay per attempt', async t => {
const clock = Sinon.useFakeTimers();
t.teardown(() => {
clock.restore();
});
const calls: number[] = [];
const scheduler = new ExponentialBackoffScheduler({
baseDelayMs: 100,
maxDelayMs: 500,
});
t.is(
scheduler.schedule(() => {
calls.push(1);
}),
100
);
t.true(scheduler.pending);
t.is(
scheduler.schedule(() => {
calls.push(2);
}),
null
);
await clock.tickAsync(100);
t.deepEqual(calls, [1]);
t.false(scheduler.pending);
t.is(
scheduler.schedule(() => {
calls.push(3);
}),
200
);
await clock.tickAsync(200);
t.deepEqual(calls, [1, 3]);
});
test('ExponentialBackoffScheduler reset should clear pending work and restart from the base delay', t => {
const scheduler = new ExponentialBackoffScheduler({
baseDelayMs: 100,
maxDelayMs: 500,
});
t.is(
scheduler.schedule(() => {}),
100
);
t.true(scheduler.pending);
scheduler.reset();
t.false(scheduler.pending);
t.is(
scheduler.schedule(() => {}),
100
);
scheduler.clear();
});


@@ -1,4 +1,4 @@
import { setTimeout as delay } from 'node:timers/promises';
import { defer as rxjsDefer, retry } from 'rxjs';
@@ -52,5 +52,61 @@ export function defer(dispose: () => Promise<void>) {
}
export function sleep(ms: number): Promise<void> {
return delay(ms);
}
export function exponentialBackoffDelay(
attempt: number,
{
baseDelayMs,
maxDelayMs,
factor = 2,
}: { baseDelayMs: number; maxDelayMs: number; factor?: number }
): number {
return Math.min(
baseDelayMs * Math.pow(factor, Math.max(0, attempt)),
maxDelayMs
);
}
export class ExponentialBackoffScheduler {
#attempt = 0;
#timer: ReturnType<typeof globalThis.setTimeout> | null = null;
constructor(
private readonly options: {
baseDelayMs: number;
maxDelayMs: number;
factor?: number;
}
) {}
get pending() {
return this.#timer !== null;
}
clear() {
if (this.#timer) {
clearTimeout(this.#timer);
this.#timer = null;
}
}
reset() {
this.#attempt = 0;
this.clear();
}
schedule(callback: () => void) {
if (this.#timer) return null;
const timeout = exponentialBackoffDelay(this.#attempt, this.options);
this.#timer = globalThis.setTimeout(() => {
this.#timer = null;
callback();
}, timeout);
this.#attempt += 1;
return timeout;
}
}
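The OIDC discovery test earlier asserts retry delays of `[1000, 2000]`; those come from this capped exponential curve. A standalone sketch of the delay formula (a hypothetical `backoffDelay`, same math as `exponentialBackoffDelay` above):

```typescript
// Delay for the nth retry: base * factor^attempt, clamped to maxDelayMs.
function backoffDelay(
  attempt: number,
  baseDelayMs: number,
  maxDelayMs: number,
  factor = 2
): number {
  return Math.min(
    baseDelayMs * Math.pow(factor, Math.max(0, attempt)),
    maxDelayMs
  );
}

// With baseDelayMs=1000 and maxDelayMs=8000, successive retries wait:
const delays = [0, 1, 2, 3, 4].map(n => backoffDelay(n, 1000, 8000));
console.log(delays); // [1000, 2000, 4000, 8000, 8000]
```

Clamping the attempt to zero guards against a negative counter, and the cap keeps a long outage from growing the wait without bound.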


@@ -0,0 +1,12 @@
import { ModuleRef } from '@nestjs/core';
import { PrismaClient } from '@prisma/client';
import { IndexerService } from '../../plugins/indexer';
export class RebuildManticoreMixedScriptIndexes1763800000000 {
static async up(_db: PrismaClient, ref: ModuleRef) {
await ref.get(IndexerService, { strict: false }).rebuildManticoreIndexes();
}
static async down(_db: PrismaClient) {}
}


@@ -3,3 +3,4 @@ export * from './1703756315970-unamed-account';
export * from './1721299086340-refresh-unnamed-user';
export * from './1745211351719-create-indexer-tables';
export * from './1751966744168-correct-session-update-time';
export * from './1763800000000-rebuild-manticore-mixed-script-indexes';


@@ -258,7 +258,7 @@ export class FalProvider extends CopilotProvider<FalConfig> {
const model = this.selectModel(cond);
try {
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
// by default, image prompt assumes there is only one message
const prompt = this.extractPrompt(messages[messages.length - 1]);
@@ -283,7 +285,9 @@ export class FalProvider extends CopilotProvider<FalConfig> {
}
return data.output;
} catch (e: any) {
metrics.ai
.counter('chat_text_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -296,12 +298,16 @@ export class FalProvider extends CopilotProvider<FalConfig> {
const model = this.selectModel(cond);
try {
metrics.ai
.counter('chat_text_stream_calls')
.add(1, this.metricLabels(model.id));
const result = await this.text(cond, messages, options);
yield result;
} catch (e) {
metrics.ai
.counter('chat_text_stream_errors')
.add(1, this.metricLabels(model.id));
throw e;
}
}
@@ -319,7 +325,7 @@ export class FalProvider extends CopilotProvider<FalConfig> {
try {
metrics.ai
.counter('generate_images_stream_calls')
.add(1, this.metricLabels(model.id));
// by default, image prompt assumes there is only one message
const prompt = this.extractPrompt(
@@ -376,7 +382,7 @@ export class FalProvider extends CopilotProvider<FalConfig> {
} catch (e) {
metrics.ai
.counter('generate_images_stream_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}


@@ -664,7 +664,7 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
const backendConfig = this.createNativeConfig();
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Structured);
@@ -687,7 +687,9 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
const validated = schema.parse(parsed);
return JSON.stringify(validated);
} catch (e: any) {
metrics.ai
.counter('chat_text_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -983,7 +985,7 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
metrics.ai
.counter('generate_images_stream_calls')
.add(1, this.metricLabels(model.id));
const { content: prompt, attachments } = [...messages].pop() || {};
if (!prompt) throw new CopilotPromptInvalid('Prompt is required');
@@ -1021,7 +1023,9 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
}
return;
} catch (e: any) {
metrics.ai
.counter('generate_images_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}


@@ -470,7 +470,8 @@ export abstract class CopilotProvider<C = any> {
});
const searchDocs = buildDocKeywordSearchGetter(
ac,
indexerService,
models
);
tools.doc_keyword_search = createDocKeywordSearchTool(
searchDocs.bind(null, options)


@@ -18,7 +18,10 @@ export const buildBlobContentGetter = (
chunk?: number
) => {
if (!options?.user || !options?.workspace || !blobId || !context) {
return toolError(
'Blob Read Failed',
'Missing workspace, user, blob id, or copilot context for blob_read.'
);
}
const canAccess = await ac
.user(options.user)
@@ -29,7 +32,10 @@ export const buildBlobContentGetter = (
logger.warn(
`User ${options.user} does not have access workspace ${options.workspace}`
);
return toolError(
'Blob Read Failed',
'You do not have permission to access this workspace attachment.'
);
}
const contextFile = context.files.find(
@@ -42,7 +48,12 @@ export const buildBlobContentGetter = (
context.getBlobContent(canonicalBlobId, chunk),
]);
const content = file?.trim() || blob?.trim();
if (!content) {
return toolError(
'Blob Read Failed',
`Attachment ${canonicalBlobId} is not available for reading in the current copilot context.`
);
}
const info = contextFile
? { fileName: contextFile.name, fileType: contextFile.mimeType }
: {};
@@ -53,10 +64,7 @@ export const buildBlobContentGetter = (
};
export const createBlobReadTool = (
getBlobContent: (targetId?: string, chunk?: number) => Promise<object>
) => {
return defineTool({
description:
@@ -73,13 +81,10 @@ export const createBlobReadTool = (
execute: async ({ blob_id, chunk }) => {
try {
const blob = await getBlobContent(blob_id, chunk);
return { ...blob };
} catch (err: any) {
logger.error(`Failed to read the blob ${blob_id} in context`, err);
return toolError('Blob Read Failed', err.message ?? String(err));
}
},
});

View File

@@ -1,27 +1,43 @@
import { z } from 'zod';
import type { AccessController } from '../../../core/permission';
import type { Models } from '../../../models';
import type { IndexerService, SearchDoc } from '../../indexer';
import { workspaceSyncRequiredError } from './doc-sync';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
export const buildDocKeywordSearchGetter = (
ac: AccessController,
indexerService: IndexerService,
models: Models
) => {
const searchDocs = async (options: CopilotChatOptions, query?: string) => {
const queryTrimmed = query?.trim();
if (!options || !queryTrimmed || !options.user || !options.workspace) {
return toolError(
'Doc Keyword Search Failed',
'Missing workspace, user, or query for doc_keyword_search.'
);
}
const workspace = await models.workspace.get(options.workspace);
if (!workspace) {
return workspaceSyncRequiredError();
}
const canAccess = await ac
.user(options.user)
.workspace(options.workspace)
.can('Workspace.Read');
if (!canAccess) {
return toolError(
'Doc Keyword Search Failed',
'You do not have permission to access this workspace.'
);
}
const docs = await indexerService.searchDocsByKeyword(
options.workspace,
queryTrimmed
);
// filter current user readable docs
@@ -29,13 +45,15 @@ export const buildDocKeywordSearchGetter = (
.user(options.user)
.workspace(options.workspace)
.docs(docs, 'Doc.Read');
return readableDocs ?? [];
};
return searchDocs;
};
export const createDocKeywordSearchTool = (
searchDocs: (
query: string
) => Promise<SearchDoc[] | ReturnType<typeof toolError>>
) => {
return defineTool({
description:
@@ -50,8 +68,8 @@ export const createDocKeywordSearchTool = (
execute: async ({ query }) => {
try {
const docs = await searchDocs(query);
if (!Array.isArray(docs)) {
return docs;
}
return docs.map(doc => ({
docId: doc.docId,

View File

@@ -3,13 +3,20 @@ import { z } from 'zod';
import { DocReader } from '../../../core/doc';
import { AccessController } from '../../../core/permission';
import { Models } from '../../../models';
import {
documentSyncPendingError,
workspaceSyncRequiredError,
} from './doc-sync';
import { type ToolError, toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
const logger = new Logger('DocReadTool');
const isToolError = (result: ToolError | object): result is ToolError =>
'type' in result && result.type === 'error';
export const buildDocContentGetter = (
ac: AccessController,
docReader: DocReader,
@@ -17,8 +24,17 @@ export const buildDocContentGetter = (
) => {
const getDoc = async (options: CopilotChatOptions, docId?: string) => {
if (!options?.user || !options?.workspace || !docId) {
return toolError(
'Doc Read Failed',
'Missing workspace, user, or document id for doc_read.'
);
}
const workspace = await models.workspace.get(options.workspace);
if (!workspace) {
return workspaceSyncRequiredError();
}
const canAccess = await ac
.user(options.user)
.workspace(options.workspace)
@@ -28,23 +44,15 @@ export const buildDocContentGetter = (
logger.warn(
`User ${options.user} does not have access to doc ${docId} in workspace ${options.workspace}`
);
return toolError(
'Doc Read Failed',
`You do not have permission to read document ${docId} in this workspace.`
);
}
const docMeta = await models.doc.getAuthors(options.workspace, docId);
if (!docMeta) {
return documentSyncPendingError(docId);
}
const content = await docReader.getDocMarkdown(
@@ -53,7 +61,7 @@ export const buildDocContentGetter = (
true
);
if (!content) {
return documentSyncPendingError(docId);
}
return {
@@ -69,8 +77,12 @@ export const buildDocContentGetter = (
return getDoc;
};
type DocReadToolResult = Awaited<
ReturnType<ReturnType<typeof buildDocContentGetter>>
>;
export const createDocReadTool = (
getDoc: (targetId?: string) => Promise<DocReadToolResult>
) => {
return defineTool({
description:
@@ -81,13 +93,10 @@ export const createDocReadTool = (
execute: async ({ doc_id }) => {
try {
const doc = await getDoc(doc_id);
return isToolError(doc) ? doc : { ...doc };
} catch (err: any) {
logger.error(`Failed to read the doc ${doc_id}`, err);
return toolError('Doc Read Failed', err.message ?? String(err));
}
},
});

View File

@@ -7,6 +7,7 @@ import {
clearEmbeddingChunk,
type Models,
} from '../../../models';
import { workspaceSyncRequiredError } from './doc-sync';
import { toolError } from './error';
import { defineTool } from './tool';
import type {
@@ -27,14 +28,24 @@ export const buildDocSearchGetter = (
signal?: AbortSignal
) => {
if (!options || !query?.trim() || !options.user || !options.workspace) {
return toolError(
'Doc Semantic Search Failed',
'Missing workspace, user, or query for doc_semantic_search.'
);
}
const workspace = await models.workspace.get(options.workspace);
if (!workspace) {
return workspaceSyncRequiredError();
}
const canAccess = await ac
.user(options.user)
.workspace(options.workspace)
.can('Workspace.Read');
if (!canAccess)
return toolError(
'Doc Semantic Search Failed',
'You do not have permission to access this workspace.'
);
const [chunks, contextChunks] = await Promise.all([
context.matchWorkspaceAll(options.workspace, query, 10, signal),
docContext?.matchFiles(query, 10, signal) ?? [],
@@ -53,7 +64,7 @@ export const buildDocSearchGetter = (
fileChunks.push(...contextChunks);
}
if (!blobChunks.length && !docChunks.length && !fileChunks.length) {
return [];
}
const docIds = docChunks.map(c => ({
@@ -101,7 +112,7 @@ export const createDocSemanticSearchTool = (
searchDocs: (
query: string,
signal?: AbortSignal
) => Promise<ChunkSimilarity[] | ReturnType<typeof toolError>>
) => {
return defineTool({
description:

View File

@@ -0,0 +1,13 @@
import { toolError } from './error';
export const LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE =
'This workspace is local-only and does not have AFFiNE Cloud sync enabled yet. Ask the user to enable workspace sync, then try again.';
export const DOCUMENT_SYNC_PENDING_MESSAGE = (docId: string) =>
`Document ${docId} is not available on AFFiNE Cloud yet. Ask the user to wait for workspace sync to finish, then try again.`;
export const workspaceSyncRequiredError = () =>
toolError('Workspace Sync Required', LOCAL_WORKSPACE_SYNC_REQUIRED_MESSAGE);
export const documentSyncPendingError = (docId: string) =>
toolError('Document Sync Pending', DOCUMENT_SYNC_PENDING_MESSAGE(docId));

View File

@@ -7,7 +7,8 @@ import { defineTool } from './tool';
export const createExaSearchTool = (config: Config) => {
return defineTool({
description:
'Search the web using Exa, one of the best web search APIs for AI',
inputSchema: z.object({
query: z.string().describe('The query to search the web for.'),
mode: z


@@ -4,6 +4,75 @@ The actual snapshot is saved in `manticoresearch.spec.ts.snap`.
Generated by [AVA](https://avajs.dev).
## should search doc title match chinese word segmentation
> Snapshot 1
[
{
_id: '5373363211628325828',
_source: {
doc_id: 'doc-chinese',
workspace_id: 'workspace-test-doc-title-chinese',
},
fields: {
doc_id: [
'doc-chinese',
],
title: [
'AFFiNE 是一个基于云端的笔记应用',
],
},
highlights: undefined,
},
]
## should search block content match korean ngram
> Snapshot 1
[
{
_id: '1227635764506850985',
_source: {
doc_id: 'doc-korean',
workspace_id: 'workspace-test-block-content-korean',
},
fields: {
block_id: [
'block-korean',
],
content: [
'다람쥐 헌 쳇바퀴에 타고파',
],
},
highlights: undefined,
},
]
## should search block content match japanese kana ngram
> Snapshot 1
[
{
_id: '381498385699454292',
_source: {
doc_id: 'doc-japanese',
workspace_id: 'workspace-test-block-content-japanese',
},
fields: {
block_id: [
'block-japanese',
],
content: [
'いろはにほへと ちりぬるを',
],
},
highlights: undefined,
},
]
## should write document work
> Snapshot 1
@@ -889,7 +958,7 @@ Generated by [AVA](https://avajs.dev).
> Snapshot 1
{
equals: {
workspace_id: 'workspaceId1',
},
}
@@ -897,7 +966,7 @@ Generated by [AVA](https://avajs.dev).
> Snapshot 2
{
equals: {
workspace_id: 'workspaceId1',
},
}

View File

@@ -33,8 +33,8 @@ const user = await module.create(Mockers.User);
const workspace = await module.create(Mockers.Workspace);
test.before(async () => {
await searchProvider.recreateTable(SearchTable.block, blockSQL);
await searchProvider.recreateTable(SearchTable.doc, docSQL);
await searchProvider.write(
SearchTable.block,
@@ -163,6 +163,135 @@ test('should provider is manticoresearch', t => {
t.is(searchProvider.type, SearchProviderType.Manticoresearch);
});
test('should search doc title match chinese word segmentation', async t => {
const workspaceId = 'workspace-test-doc-title-chinese';
const docId = 'doc-chinese';
const title = 'AFFiNE 是一个基于云端的笔记应用';
await searchProvider.write(
SearchTable.doc,
[
{
workspace_id: workspaceId,
doc_id: docId,
title,
},
],
{
refresh: true,
}
);
const result = await searchProvider.search(SearchTable.doc, {
_source: ['workspace_id', 'doc_id'],
query: {
bool: {
must: [
{ term: { workspace_id: { value: workspaceId } } },
{ match: { title: '笔记' } },
],
},
},
fields: ['doc_id', 'title'],
sort: ['_score'],
});
t.true(result.total >= 1);
t.snapshot(
result.nodes
.filter(node => node._source.doc_id === docId)
.map(node => omit(node, ['_score']))
);
});
test('should search block content match korean ngram', async t => {
const workspaceId = 'workspace-test-block-content-korean';
const docId = 'doc-korean';
const blockId = 'block-korean';
const content = '다람쥐 헌 쳇바퀴에 타고파';
await searchProvider.write(
SearchTable.block,
[
{
workspace_id: workspaceId,
doc_id: docId,
block_id: blockId,
content,
flavour: 'affine:paragraph',
},
],
{
refresh: true,
}
);
const result = await searchProvider.search(SearchTable.block, {
_source: ['workspace_id', 'doc_id'],
query: {
bool: {
must: [
{ term: { workspace_id: { value: workspaceId } } },
{ match: { content: '쥐' } },
],
},
},
fields: ['block_id', 'content'],
sort: ['_score'],
});
t.true(result.total >= 1);
t.snapshot(
result.nodes
.filter(node => node.fields.block_id?.[0] === blockId)
.map(node => omit(node, ['_score']))
);
});
test('should search block content match japanese kana ngram', async t => {
const workspaceId = 'workspace-test-block-content-japanese';
const docId = 'doc-japanese';
const blockId = 'block-japanese';
const content = 'いろはにほへと ちりぬるを';
await searchProvider.write(
SearchTable.block,
[
{
workspace_id: workspaceId,
doc_id: docId,
block_id: blockId,
content,
flavour: 'affine:paragraph',
},
],
{
refresh: true,
}
);
const result = await searchProvider.search(SearchTable.block, {
_source: ['workspace_id', 'doc_id'],
query: {
bool: {
must: [
{ term: { workspace_id: { value: workspaceId } } },
{ match: { content: 'へ' } },
],
},
},
fields: ['block_id', 'content'],
sort: ['_score'],
});
t.true(result.total >= 1);
t.snapshot(
result.nodes
.filter(node => node.fields.block_id?.[0] === blockId)
.map(node => omit(node, ['_score']))
);
});
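The tests above filter by ids with exact-value `term` clauses while using full-text `match` only on analyzed content fields. A data-only sketch of that query shape (plain object literal, no search client involved):

```typescript
// Bool query combining an exact id filter with a full-text ngram match,
// mirroring the structure used in the Korean ngram test above.
const query = {
  bool: {
    must: [
      // term: exact value, not analyzed — right for id-like fields.
      { term: { workspace_id: { value: 'workspace-test-block-content-korean' } } },
      // match: analyzed full-text search — right for content fields.
      { match: { content: '쥐' } },
    ],
  },
};
console.log(Object.keys(query.bool.must[0])); // ['term']
```

Using `term` for ids avoids false positives from the analyzer splitting id strings into tokens, which is why these tests (and the snapshots above) moved away from `match` on `workspace_id`/`doc_id`.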
// #region write
test('should write document work', async t => {
@@ -189,7 +318,7 @@ test('should write document work', async t => {
  let result = await searchProvider.search(SearchTable.block, {
    _source: ['workspace_id', 'doc_id'],
-    query: { match: { doc_id: docId } },
+    query: { term: { doc_id: { value: docId } } },
    fields: [
      'flavour',
      'flavour_indexed',
@@ -232,7 +361,7 @@ test('should write document work', async t => {
  result = await searchProvider.search(SearchTable.block, {
    _source: ['workspace_id', 'doc_id'],
-    query: { match: { doc_id: docId } },
+    query: { term: { doc_id: { value: docId } } },
    fields: ['flavour', 'block_id', 'content', 'ref_doc_id'],
    sort: ['_score'],
  });
@@ -263,7 +392,7 @@ test('should write document work', async t => {
  result = await searchProvider.search(SearchTable.block, {
    _source: ['workspace_id', 'doc_id'],
-    query: { match: { doc_id: docId } },
+    query: { term: { doc_id: { value: docId } } },
    fields: ['flavour', 'block_id', 'content', 'ref_doc_id'],
    sort: ['_score'],
  });
@@ -319,8 +448,8 @@ test('should handle ref_doc_id as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -371,8 +500,8 @@ test('should handle ref_doc_id as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -416,8 +545,8 @@ test('should handle content as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -455,8 +584,8 @@ test('should handle content as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -497,8 +626,8 @@ test('should handle blob as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -534,8 +663,8 @@ test('should handle blob as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -571,8 +700,8 @@ test('should handle blob as string[]', async t => {
    query: {
      bool: {
        must: [
-          { match: { workspace_id: workspaceId } },
-          { match: { doc_id: docId } },
+          { term: { workspace_id: { value: workspaceId } } },
+          { term: { doc_id: { value: docId } } },
        ],
      },
    },
@@ -682,8 +811,10 @@ test('should search query all and get next cursor work', async t => {
      'id',
    ],
    query: {
-      match: {
-        workspace_id: workspaceId,
+      term: {
+        workspace_id: {
+          value: workspaceId,
+        },
      },
    },
    fields: ['flavour', 'workspace_id', 'doc_id', 'block_id'],
@@ -708,8 +839,10 @@ test('should search query all and get next cursor work', async t => {
      'id',
    ],
    query: {
-      match: {
-        workspace_id: workspaceId,
+      term: {
+        workspace_id: {
+          value: workspaceId,
+        },
      },
    },
    fields: ['flavour', 'workspace_id', 'doc_id', 'block_id'],
@@ -734,8 +867,10 @@ test('should search query all and get next cursor work', async t => {
      'id',
    ],
    query: {
-      match: {
-        workspace_id: workspaceId,
+      term: {
+        workspace_id: {
+          value: workspaceId,
+        },
      },
    },
    fields: ['flavour', 'workspace_id', 'doc_id', 'block_id'],
@@ -780,16 +915,20 @@ test('should filter by workspace_id work', async t => {
      bool: {
        must: [
          {
-            match: {
-              workspace_id: workspaceId,
+            term: {
+              workspace_id: {
+                value: workspaceId,
+              },
            },
          },
          {
            bool: {
              must: [
                {
-                  match: {
-                    doc_id: docId,
+                  term: {
+                    doc_id: {
+                      value: docId,
+                    },
                  },
                },
              ],


@@ -8,11 +8,12 @@ import { createModule } from '../../../__tests__/create-module';
import { Mockers } from '../../../__tests__/mocks';
import { ConfigModule } from '../../../base/config';
import { ServerConfigModule } from '../../../core/config';
+import { Models } from '../../../models';
import { SearchProviderFactory } from '../factory';
import { IndexerModule, IndexerService } from '../index';
import { ManticoresearchProvider } from '../providers';
import { UpsertDoc } from '../service';
-import { SearchTable } from '../tables';
+import { blockSQL, docSQL, SearchTable } from '../tables';
import {
  AggregateInput,
  SearchInput,
@@ -35,6 +36,7 @@ const module = await createModule({
const indexerService = module.get(IndexerService);
const searchProviderFactory = module.get(SearchProviderFactory);
const manticoresearch = module.get(ManticoresearchProvider);
+const models = module.get(Models);
const user = await module.create(Mockers.User);
const workspace = await module.create(Mockers.Workspace, {
  snapshot: true,
@@ -50,7 +52,8 @@ test.after.always(async () => {
});

test.before(async () => {
-  await indexerService.createTables();
+  await manticoresearch.recreateTable(SearchTable.block, blockSQL);
+  await manticoresearch.recreateTable(SearchTable.doc, docSQL);
});

test.afterEach.always(async () => {
@@ -2311,3 +2314,29 @@ test('should search docs by keyword work', async t => {
});

// #endregion
test('should rebuild manticore indexes and requeue workspaces', async t => {
const workspace1 = await module.create(Mockers.Workspace, {
indexed: true,
});
const workspace2 = await module.create(Mockers.Workspace, {
indexed: true,
});
const queueCount = module.queue.count('indexer.indexWorkspace');
await indexerService.rebuildManticoreIndexes();
const queuedWorkspaceIds = new Set(
module.queue.add
.getCalls()
.filter(call => call.args[0] === 'indexer.indexWorkspace')
.slice(queueCount)
.map(call => call.args[1].workspaceId)
);
t.true(queuedWorkspaceIds.has(workspace1.id));
t.true(queuedWorkspaceIds.has(workspace2.id));
t.is((await models.workspace.get(workspace1.id))?.indexed, false);
t.is((await models.workspace.get(workspace2.id))?.indexed, false);
});


@@ -38,6 +38,17 @@ const SupportIndexedAttributes = [
  'parent_block_id',
];
const SupportExactTermFields = new Set([
'workspace_id',
'doc_id',
'block_id',
'flavour',
'parent_flavour',
'parent_block_id',
'created_by_user_id',
'updated_by_user_id',
]);
const ConvertEmptyStringToNullValueFields = new Set([
  'ref_doc_id',
  'ref',
@@ -55,23 +66,20 @@ export class ManticoresearchProvider extends ElasticsearchProvider {
    table: SearchTable,
    mapping: string
  ): Promise<void> {
-    const url = `${this.config.provider.endpoint}/cli`;
-    const response = await fetch(url, {
-      method: 'POST',
-      body: mapping,
-      headers: {
-        'Content-Type': 'text/plain',
-      },
-    });
-    // manticoresearch cli response is not json, so we need to handle it manually
-    const text = (await response.text()).trim();
-    if (!response.ok) {
-      this.logger.error(`failed to create table ${table}, response: ${text}`);
-      throw new InternalServerError();
-    }
+    const text = await this.#executeSQL(mapping);
    this.logger.log(`created table ${table}, response: ${text}`);
  }
async dropTable(table: SearchTable): Promise<void> {
const text = await this.#executeSQL(`DROP TABLE IF EXISTS ${table}`);
this.logger.log(`dropped table ${table}, response: ${text}`);
}
async recreateTable(table: SearchTable, mapping: string): Promise<void> {
await this.dropTable(table);
await this.createTable(table, mapping);
}
  override async write(
    table: SearchTable,
    documents: Record<string, unknown>[],
@@ -252,6 +260,12 @@ export class ManticoresearchProvider extends ElasticsearchProvider {
    // 1750389254 => new Date(1750389254 * 1000)
    return new Date(value * 1000);
  }
if (value && typeof value === 'string') {
const timestamp = Date.parse(value);
if (!Number.isNaN(timestamp)) {
return new Date(timestamp);
}
}
    return value;
  }
@@ -302,8 +316,10 @@ export class ManticoresearchProvider extends ElasticsearchProvider {
    //     workspace_id: 'workspaceId1'
    //   }
    // }
-    let termField = options?.termMappingField ?? 'term';
    let field = Object.keys(query.term)[0];
+    let termField =
+      options?.termMappingField ??
+      (SupportExactTermFields.has(field) ? 'equals' : 'term');
    let value = query.term[field];
    if (typeof value === 'object' && 'value' in value) {
      if ('boost' in value) {
@@ -432,4 +448,28 @@ export class ManticoresearchProvider extends ElasticsearchProvider {
    }
    return value;
  }
async #executeSQL(sql: string) {
const url = `${this.config.provider.endpoint}/cli`;
const headers: Record<string, string> = {
'Content-Type': 'text/plain',
};
if (this.config.provider.apiKey) {
headers.Authorization = `ApiKey ${this.config.provider.apiKey}`;
} else if (this.config.provider.password) {
headers.Authorization = `Basic ${Buffer.from(`${this.config.provider.username}:${this.config.provider.password}`).toString('base64')}`;
}
const response = await fetch(url, {
method: 'POST',
body: sql,
headers,
});
const text = (await response.text()).trim();
if (!response.ok) {
this.logger.error(`failed to execute SQL "${sql}", response: ${text}`);
throw new InternalServerError();
}
return text;
}
}
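The new `#executeSQL` helper funnels every statement through Manticore's plain-text `/cli` endpoint and chooses an auth header from config. A standalone sketch of the same header-selection logic (the `ProviderConfig` shape here is assumed from the diff, not the real config type):

```typescript
// Sketch of the auth-header selection used by #executeSQL:
// prefer an ApiKey header, fall back to HTTP Basic credentials.
interface ProviderConfig {
  endpoint: string;
  apiKey?: string;
  username?: string;
  password?: string;
}

function buildCliRequest(config: ProviderConfig, sql: string) {
  const headers: Record<string, string> = {
    // Manticore's /cli endpoint takes raw SQL text, not JSON
    'Content-Type': 'text/plain',
  };
  if (config.apiKey) {
    headers.Authorization = `ApiKey ${config.apiKey}`;
  } else if (config.password) {
    const basic = Buffer.from(
      `${config.username}:${config.password}`
    ).toString('base64');
    headers.Authorization = `Basic ${basic}`;
  }
  return { url: `${config.endpoint}/cli`, method: 'POST', body: sql, headers };
}
```

Centralizing this in one helper is what makes `dropTable` and `recreateTable` one-liners in the diff: both just hand a SQL string to the same transport.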


@@ -14,6 +14,7 @@ import {
  AggregateQueryDSL,
  BaseQueryDSL,
  HighlightDSL,
+  ManticoresearchProvider,
  OperationOptions,
  SearchNode,
  SearchProvider,
@@ -130,6 +131,63 @@ export class IndexerService {
    }
  }
async rebuildManticoreIndexes() {
let searchProvider: SearchProvider | undefined;
try {
searchProvider = this.factory.get();
} catch (err) {
if (err instanceof SearchProviderNotFound) {
this.logger.debug('No search provider found, skip rebuilding tables');
return;
}
throw err;
}
if (!(searchProvider instanceof ManticoresearchProvider)) {
this.logger.debug(
`Search provider ${searchProvider.type} does not need manticore rebuild`
);
return;
}
const mappings = SearchTableMappingStrings[searchProvider.type];
for (const table of Object.keys(mappings) as SearchTable[]) {
await searchProvider.recreateTable(table, mappings[table]);
}
let lastWorkspaceSid = 0;
while (true) {
const workspaces = await this.models.workspace.list(
{ sid: { gt: lastWorkspaceSid } },
{ id: true, sid: true },
100
);
if (!workspaces.length) {
break;
}
for (const workspace of workspaces) {
await this.models.workspace.update(
workspace.id,
{ indexed: false },
false
);
await this.queue.add(
'indexer.indexWorkspace',
{
workspaceId: workspace.id,
},
{
jobId: `indexWorkspace/${workspace.id}`,
priority: 100,
}
);
}
lastWorkspaceSid = workspaces[workspaces.length - 1].sid;
}
}
  async write<T extends SearchTable>(
    table: T,
    documents: UpsertTypeByTable<T>[],
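`rebuildManticoreIndexes` walks workspaces in batches keyed on the monotonically increasing `sid` rather than an offset, so rows modified mid-scan cannot shift the window. The same keyset-pagination pattern in isolation (names are illustrative; an in-memory stand-in replaces the workspace model):

```typescript
// Keyset (cursor) pagination over a monotonically increasing `sid`,
// mirroring the batch loop in rebuildManticoreIndexes.
interface Workspace {
  id: string;
  sid: number;
}

async function forEachWorkspaceBatch(
  list: (afterSid: number, limit: number) => Promise<Workspace[]>,
  handle: (ws: Workspace) => Promise<void>,
  batchSize = 100
) {
  let lastSid = 0;
  while (true) {
    const batch = await list(lastSid, batchSize);
    if (!batch.length) break; // no rows past the cursor: done
    for (const ws of batch) {
      await handle(ws);
    }
    // advance the cursor past the last row we saw
    lastSid = batch[batch.length - 1].sid;
  }
}
```

The diff also guards against duplicate queue entries by deriving `jobId` from the workspace id, so requeuing the same workspace twice collapses into one job.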


@@ -150,6 +150,8 @@ CREATE TABLE IF NOT EXISTS block (
  updated_at timestamp
)
morphology = 'jieba_chinese, lemmatize_en_all, lemmatize_de_all, lemmatize_ru_all, libstemmer_ar, libstemmer_ca, stem_cz, libstemmer_da, libstemmer_nl, libstemmer_fi, libstemmer_fr, libstemmer_el, libstemmer_hi, libstemmer_hu, libstemmer_id, libstemmer_ga, libstemmer_it, libstemmer_lt, libstemmer_ne, libstemmer_no, libstemmer_pt, libstemmer_ro, libstemmer_es, libstemmer_sv, libstemmer_ta, libstemmer_tr'
-charset_table = 'non_cjk, cjk'
+charset_table = 'non_cjk, chinese'
+ngram_len = '1'
+ngram_chars = 'U+1100..U+11FF, U+3130..U+318F, U+A960..U+A97F, U+AC00..U+D7AF, U+D7B0..U+D7FF, U+3040..U+30FF, U+0E00..U+0E7F'
index_field_lengths = '1'
`;
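With `ngram_len = '1'`, Manticore indexes each codepoint inside the `ngram_chars` ranges as its own token, which is what lets the earlier test match a single kana. A rough illustration of 1-gram tokenization over those ranges (heavily simplified; the real tokenizer also applies `charset_table`, morphology, and blending rules):

```typescript
// Simplified 1-gram tokenizer: characters inside the configured
// ranges become single-char tokens; everything else is split on
// whitespace as ordinary words.
const NGRAM_RANGES: Array<[number, number]> = [
  [0x1100, 0x11ff], // Hangul Jamo
  [0x3130, 0x318f], // Hangul Compatibility Jamo
  [0xa960, 0xa97f], // Hangul Jamo Extended-A
  [0xac00, 0xd7af], // Hangul Syllables
  [0xd7b0, 0xd7ff], // Hangul Jamo Extended-B
  [0x3040, 0x30ff], // Hiragana + Katakana
  [0x0e00, 0x0e7f], // Thai
];

function isNgramChar(ch: string): boolean {
  const cp = ch.codePointAt(0) ?? 0;
  return NGRAM_RANGES.some(([lo, hi]) => cp >= lo && cp <= hi);
}

function tokenize(text: string): string[] {
  const tokens: string[] = [];
  let word = '';
  for (const ch of text) {
    if (isNgramChar(ch)) {
      if (word) tokens.push(word);
      word = '';
      tokens.push(ch); // each in-range char is its own token
    } else if (/\s/.test(ch)) {
      if (word) tokens.push(word);
      word = '';
    } else {
      word += ch;
    }
  }
  if (word) tokens.push(word);
  return tokens;
}
```

Note the ranges cover Korean, Japanese kana, and Thai; Chinese stays on `jieba_chinese` morphology plus the `chinese` charset entry instead of ngrams.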


@@ -109,6 +109,8 @@ CREATE TABLE IF NOT EXISTS doc (
  updated_at timestamp
)
morphology = 'jieba_chinese, lemmatize_en_all, lemmatize_de_all, lemmatize_ru_all, libstemmer_ar, libstemmer_ca, stem_cz, libstemmer_da, libstemmer_nl, libstemmer_fi, libstemmer_fr, libstemmer_el, libstemmer_hi, libstemmer_hu, libstemmer_id, libstemmer_ga, libstemmer_it, libstemmer_lt, libstemmer_ne, libstemmer_no, libstemmer_pt, libstemmer_ro, libstemmer_es, libstemmer_sv, libstemmer_ta, libstemmer_tr'
-charset_table = 'non_cjk, cjk'
+charset_table = 'non_cjk, chinese'
+ngram_len = '1'
+ngram_chars = 'U+1100..U+11FF, U+3130..U+318F, U+A960..U+A97F, U+AC00..U+D7AF, U+D7B0..U+D7FF, U+3040..U+30FF, U+0E00..U+0E7F'
index_field_lengths = '1'
`;


@@ -1,9 +1,10 @@
-import { Injectable } from '@nestjs/common';
+import { Injectable, OnModuleDestroy } from '@nestjs/common';
import { createRemoteJWKSet, type JWTPayload, jwtVerify } from 'jose';
import { omit } from 'lodash-es';
import { z } from 'zod';

import {
+  ExponentialBackoffScheduler,
  InvalidAuthState,
  InvalidOauthResponse,
  URLHelper,
@@ -35,7 +36,7 @@ const OIDCUserInfoSchema = z
  .object({
    sub: z.string(),
    preferred_username: z.string().optional(),
-    email: z.string().email(),
+    email: z.string().optional(),
    name: z.string().optional(),
    email_verified: z
      .union([z.boolean(), z.enum(['true', 'false', '1', '0', 'yes', 'no'])])
@@ -44,6 +45,8 @@ const OIDCUserInfoSchema = z
  })
  .passthrough();

+const OIDCEmailSchema = z.string().email();
+
const OIDCConfigurationSchema = z.object({
  authorization_endpoint: z.string().url(),
  token_endpoint: z.string().url(),
@@ -54,16 +57,28 @@ const OIDCConfigurationSchema = z.object({
type OIDCConfiguration = z.infer<typeof OIDCConfigurationSchema>;
const OIDC_DISCOVERY_INITIAL_RETRY_DELAY = 1000;
const OIDC_DISCOVERY_MAX_RETRY_DELAY = 60_000;
@Injectable()
-export class OIDCProvider extends OAuthProvider {
+export class OIDCProvider extends OAuthProvider implements OnModuleDestroy {
  override provider = OAuthProviderName.OIDC;

  #endpoints: OIDCConfiguration | null = null;
  #jwks: ReturnType<typeof createRemoteJWKSet> | null = null;
readonly #retryScheduler = new ExponentialBackoffScheduler({
baseDelayMs: OIDC_DISCOVERY_INITIAL_RETRY_DELAY,
maxDelayMs: OIDC_DISCOVERY_MAX_RETRY_DELAY,
});
#validationGeneration = 0;
  constructor(private readonly url: URLHelper) {
    super();
  }
onModuleDestroy() {
this.#retryScheduler.clear();
}
  override get requiresPkce() {
    return true;
  }
@@ -87,58 +102,109 @@ export class OIDCProvider extends OAuthProvider {
  }

  protected override setup() {
-    const validate = async () => {
-      this.#endpoints = null;
-      this.#jwks = null;
-      if (super.configured) {
-        const config = this.config as OAuthOIDCProviderConfig;
-        if (!config.issuer) {
-          this.logger.error('Missing OIDC issuer configuration');
-          super.setup();
-          return;
-        }
-        try {
-          const res = await fetch(
-            `${config.issuer}/.well-known/openid-configuration`,
-            {
-              method: 'GET',
-              headers: { Accept: 'application/json' },
-            }
-          );
-          if (res.ok) {
-            const configuration = OIDCConfigurationSchema.parse(
-              await res.json()
-            );
-            if (
-              this.normalizeIssuer(config.issuer) !==
-              this.normalizeIssuer(configuration.issuer)
-            ) {
-              this.logger.error(
-                `OIDC issuer mismatch, expected ${config.issuer}, got ${configuration.issuer}`
-              );
-            } else {
-              this.#endpoints = configuration;
-              this.#jwks = createRemoteJWKSet(new URL(configuration.jwks_uri));
-            }
-          } else {
-            this.logger.error(`Invalid OIDC issuer ${config.issuer}`);
-          }
-        } catch (e) {
-          this.logger.error('Failed to validate OIDC configuration', e);
-        }
-      }
-      super.setup();
-    };
-    validate().catch(() => {
+    const generation = ++this.#validationGeneration;
+    this.#retryScheduler.clear();
+
+    this.validateAndSync(generation).catch(() => {
      /* noop */
    });
  }
private async validateAndSync(generation: number) {
if (generation !== this.#validationGeneration) {
return;
}
if (!super.configured) {
this.resetState();
this.#retryScheduler.reset();
super.setup();
return;
}
const config = this.config as OAuthOIDCProviderConfig;
if (!config.issuer) {
this.logger.error('Missing OIDC issuer configuration');
this.resetState();
this.#retryScheduler.reset();
super.setup();
return;
}
try {
const res = await fetch(
`${config.issuer}/.well-known/openid-configuration`,
{
method: 'GET',
headers: { Accept: 'application/json' },
}
);
if (generation !== this.#validationGeneration) {
return;
}
if (!res.ok) {
this.logger.error(`Invalid OIDC issuer ${config.issuer}`);
this.onValidationFailure(generation);
return;
}
const configuration = OIDCConfigurationSchema.parse(await res.json());
if (
this.normalizeIssuer(config.issuer) !==
this.normalizeIssuer(configuration.issuer)
) {
this.logger.error(
`OIDC issuer mismatch, expected ${config.issuer}, got ${configuration.issuer}`
);
this.onValidationFailure(generation);
return;
}
this.#endpoints = configuration;
this.#jwks = createRemoteJWKSet(new URL(configuration.jwks_uri));
this.#retryScheduler.reset();
super.setup();
} catch (e) {
if (generation !== this.#validationGeneration) {
return;
}
this.logger.error('Failed to validate OIDC configuration', e);
this.onValidationFailure(generation);
}
}
private onValidationFailure(generation: number) {
this.resetState();
super.setup();
this.scheduleRetry(generation);
}
private scheduleRetry(generation: number) {
if (generation !== this.#validationGeneration) {
return;
}
const delay = this.#retryScheduler.schedule(() => {
this.validateAndSync(generation).catch(() => {
/* noop */
});
});
if (delay === null) {
return;
}
this.logger.warn(
`OIDC discovery validation failed, retrying in ${delay}ms`
);
}
private resetState() {
this.#endpoints = null;
this.#jwks = null;
}
  getAuthUrl(state: string): string {
    const parsedState = this.parseStatePayload(state);
    const nonce = parsedState?.state ?? state;
@@ -291,6 +357,68 @@ export class OIDCProvider extends OAuthProvider {
    return undefined;
  }
private claimCandidates(
configuredClaim: string | undefined,
defaultClaim: string
) {
if (typeof configuredClaim === 'string' && configuredClaim.length > 0) {
return [configuredClaim];
}
return [defaultClaim];
}
private formatClaimCandidates(claims: string[]) {
return claims.map(claim => `"${claim}"`).join(', ');
}
private resolveStringClaim(
claims: string[],
...sources: Array<Record<string, unknown>>
) {
for (const claim of claims) {
for (const source of sources) {
const value = this.extractString(source[claim]);
if (value) {
return value;
}
}
}
return undefined;
}
private resolveBooleanClaim(
claims: string[],
...sources: Array<Record<string, unknown>>
) {
for (const claim of claims) {
for (const source of sources) {
const value = this.extractBoolean(source[claim]);
if (value !== undefined) {
return value;
}
}
}
return undefined;
}
private resolveEmailClaim(
claims: string[],
...sources: Array<Record<string, unknown>>
) {
for (const claim of claims) {
for (const source of sources) {
const value = this.extractString(source[claim]);
if (value && OIDCEmailSchema.safeParse(value).success) {
return value;
}
}
}
return undefined;
}
  async getUser(tokens: Tokens, state: OAuthState): Promise<OAuthAccount> {
    if (!tokens.idToken) {
      throw new InvalidOauthResponse({
@@ -315,6 +443,8 @@ export class OIDCProvider extends OAuthProvider {
      { treatServerErrorAsInvalid: true }
    );

    const user = OIDCUserInfoSchema.parse(rawUser);
+    const userClaims = user as Record<string, unknown>;
+    const idTokenClaimsRecord = idTokenClaims as Record<string, unknown>;

    if (!user.sub || !idTokenClaims.sub) {
      throw new InvalidOauthResponse({
@@ -327,22 +457,29 @@ export class OIDCProvider extends OAuthProvider {
    }

    const args = this.config.args ?? {};
const idClaims = this.claimCandidates(args.claim_id, 'sub');
const emailClaims = this.claimCandidates(args.claim_email, 'email');
const nameClaims = this.claimCandidates(args.claim_name, 'name');
const emailVerifiedClaims = this.claimCandidates(
args.claim_email_verified,
'email_verified'
);
-    const claimsMap = {
-      id: args.claim_id || 'sub',
-      email: args.claim_email || 'email',
-      name: args.claim_name || 'name',
-      emailVerified: args.claim_email_verified || 'email_verified',
-    };
-
-    const accountId =
-      this.extractString(user[claimsMap.id]) ?? idTokenClaims.sub;
-    const email =
-      this.extractString(user[claimsMap.email]) ||
-      this.extractString(idTokenClaims.email);
-    const emailVerified =
-      this.extractBoolean(user[claimsMap.emailVerified]) ??
-      this.extractBoolean(idTokenClaims.email_verified);
+    const accountId = this.resolveStringClaim(
+      idClaims,
+      userClaims,
+      idTokenClaimsRecord
+    );
+    const email = this.resolveEmailClaim(
+      emailClaims,
+      userClaims,
+      idTokenClaimsRecord
+    );
+    const emailVerified = this.resolveBooleanClaim(
+      emailVerifiedClaims,
+      userClaims,
+      idTokenClaimsRecord
+    );

    if (!accountId) {
      throw new InvalidOauthResponse({
@@ -352,7 +489,7 @@ export class OIDCProvider extends OAuthProvider {
    if (!email) {
      throw new InvalidOauthResponse({
-        reason: 'Missing required claim for email',
+        reason: `Missing valid email claim in OIDC response. Tried userinfo and ID token claims: ${this.formatClaimCandidates(emailClaims)}`,
      });
    }
@@ -367,9 +504,11 @@ export class OIDCProvider extends OAuthProvider {
      email,
    };

-    const name =
-      this.extractString(user[claimsMap.name]) ||
-      this.extractString(idTokenClaims.name);
+    const name = this.resolveStringClaim(
+      nameClaims,
+      userClaims,
+      idTokenClaimsRecord
+    );

    if (name) {
      account.name = name;
    }
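Two patterns carry the OIDC rework: exponential backoff for discovery retries and a generation counter that invalidates stale in-flight validations. A sketch of the backoff half (this `BackoffSketch` is a stand-in with assumed semantics, not the `ExponentialBackoffScheduler` imported in the diff):

```typescript
// Stand-in backoff scheduler: each attempt doubles the delay up to
// a cap; reset() returns to the base delay and cancels any timer.
class BackoffSketch {
  #attempt = 0;
  #timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private readonly baseDelayMs: number,
    private readonly maxDelayMs: number
  ) {}

  // delay = min(base * 2^attempt, max)
  nextDelay(): number {
    const delay = Math.min(
      this.baseDelayMs * 2 ** this.#attempt,
      this.maxDelayMs
    );
    this.#attempt += 1;
    return delay;
  }

  schedule(fn: () => void): number {
    this.clear();
    const delay = this.nextDelay();
    this.#timer = setTimeout(fn, delay);
    return delay;
  }

  reset() {
    this.#attempt = 0;
    this.clear();
  }

  clear() {
    if (this.#timer) {
      clearTimeout(this.#timer);
      this.#timer = null;
    }
  }
}
```

The generation counter complements this: `setup()` bumps `#validationGeneration`, so any retry scheduled under an older generation bails out early instead of clobbering freshly validated endpoints.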


@@ -10,6 +10,7 @@ interface TestOps extends OpSchema {
  add: [{ a: number; b: number }, number];
  bin: [Uint8Array, Uint8Array];
  sub: [Uint8Array, number];
+  init: [{ fastText?: boolean } | undefined, { ok: true }];
}

declare module 'vitest' {
@@ -84,6 +85,55 @@ describe('op client', () => {
    expect(data.byteLength).toBe(0);
  });
it('should send optional payload call with abort signal', async ctx => {
const abortController = new AbortController();
const result = ctx.producer.call(
'init',
{ fastText: true },
abortController.signal
);
expect(ctx.postMessage.mock.calls[0][0]).toMatchInlineSnapshot(`
{
"id": "init:1",
"name": "init",
"payload": {
"fastText": true,
},
"type": "call",
}
`);
ctx.handlers.return({
type: 'return',
id: 'init:1',
data: { ok: true },
});
await expect(result).resolves.toEqual({ ok: true });
});
it('should send undefined payload for optional input call', async ctx => {
const result = ctx.producer.call('init', undefined);
expect(ctx.postMessage.mock.calls[0][0]).toMatchInlineSnapshot(`
{
"id": "init:1",
"name": "init",
"payload": undefined,
"type": "call",
}
`);
ctx.handlers.return({
type: 'return',
id: 'init:1',
data: { ok: true },
});
await expect(result).resolves.toEqual({ ok: true });
});
  it('should cancel call', async ctx => {
    const promise = ctx.producer.call('add', { a: 1, b: 2 });


@@ -40,18 +40,14 @@ describe('op consumer', () => {
  it('should throw if no handler registered', async ctx => {
    ctx.handlers.call({ type: 'call', id: 'add:1', name: 'add', payload: {} });
    await vi.advanceTimersToNextTimerAsync();
-    expect(ctx.postMessage.mock.lastCall).toMatchInlineSnapshot(`
-      [
-        {
-          "error": {
-            "message": "Handler for operation [add] is not registered.",
-            "name": "Error",
-          },
-          "id": "add:1",
-          "type": "return",
-        },
-      ]
-    `);
+    expect(ctx.postMessage.mock.lastCall?.[0]).toMatchObject({
+      type: 'return',
+      id: 'add:1',
+      error: {
+        message: 'Handler for operation [add] is not registered.',
+        name: 'Error',
+      },
+    });
  });

  it('should handle call message', async ctx => {
@@ -73,6 +69,38 @@
    `);
  });
it('should serialize string errors with message', async ctx => {
ctx.consumer.register('any', () => {
throw 'worker panic';
});
ctx.handlers.call({ type: 'call', id: 'any:1', name: 'any', payload: {} });
await vi.advanceTimersToNextTimerAsync();
expect(ctx.postMessage.mock.calls[0][0]).toMatchObject({
type: 'return',
id: 'any:1',
error: {
name: 'Error',
message: 'worker panic',
},
});
});
it('should serialize plain object errors with fallback message', async ctx => {
ctx.consumer.register('any', () => {
throw { reason: 'panic', code: 'E_PANIC' };
});
ctx.handlers.call({ type: 'call', id: 'any:1', name: 'any', payload: {} });
await vi.advanceTimersToNextTimerAsync();
const message = ctx.postMessage.mock.calls[0][0]?.error?.message;
expect(typeof message).toBe('string');
expect(message).toContain('"reason":"panic"');
expect(message).toContain('"code":"E_PANIC"');
});
  it('should handle cancel message', async ctx => {
    ctx.consumer.register('add', ({ a, b }, { signal }) => {
      const { reject, resolve, promise } = Promise.withResolvers<number>();


@@ -16,6 +16,96 @@ import {
} from './message';
import type { OpInput, OpNames, OpOutput, OpSchema } from './types';
const SERIALIZABLE_ERROR_FIELDS = [
'name',
'message',
'code',
'type',
'status',
'data',
'stacktrace',
] as const;
type SerializableErrorShape = Partial<
Record<(typeof SERIALIZABLE_ERROR_FIELDS)[number], unknown>
> & {
name?: string;
message?: string;
};
function getFallbackErrorMessage(error: unknown): string {
if (typeof error === 'string') {
return error;
}
if (error instanceof Error && error.message) {
return error.message;
}
if (
typeof error === 'number' ||
typeof error === 'boolean' ||
typeof error === 'bigint' ||
typeof error === 'symbol'
) {
return String(error);
}
if (error === null || error === undefined) {
return 'Unknown error';
}
try {
const jsonMessage = JSON.stringify(error);
if (jsonMessage && jsonMessage !== '{}') {
return jsonMessage;
}
} catch {
return 'Unknown error';
}
return 'Unknown error';
}
function serializeError(error: unknown): Error {
const valueToPick =
error && typeof error === 'object'
? error
: ({} as Record<string, unknown>);
const serialized = pick(
valueToPick,
SERIALIZABLE_ERROR_FIELDS
) as SerializableErrorShape;
if (!serialized.message || typeof serialized.message !== 'string') {
serialized.message = getFallbackErrorMessage(error);
}
if (!serialized.name || typeof serialized.name !== 'string') {
if (error instanceof Error && error.name) {
serialized.name = error.name;
} else if (error && typeof error === 'object') {
const constructorName = error.constructor?.name;
serialized.name =
typeof constructorName === 'string' && constructorName.length > 0
? constructorName
: 'Error';
} else {
serialized.name = 'Error';
}
}
if (
!serialized.stacktrace &&
error instanceof Error &&
typeof error.stack === 'string'
) {
serialized.stacktrace = error.stack;
}
return serialized as Error;
}
interface OpCallContext {
  signal: AbortSignal;
}
@@ -71,15 +161,7 @@ export class OpConsumer<Ops extends OpSchema> extends AutoMessageHandler {
        this.port.postMessage({
          type: 'return',
          id: msg.id,
-          error: pick(error, [
-            'name',
-            'message',
-            'code',
-            'type',
-            'status',
-            'data',
-            'stacktrace',
-          ]),
+          error: serializeError(error),
        } satisfies ReturnMessage);
      },
      complete: () => {
@@ -109,15 +191,7 @@ export class OpConsumer<Ops extends OpSchema> extends AutoMessageHandler {
        this.port.postMessage({
          type: 'error',
          id: msg.id,
-          error: pick(error, [
-            'name',
-            'message',
-            'code',
-            'type',
-            'status',
-            'data',
-            'stacktrace',
-          ]),
+          error: serializeError(error),
        } satisfies SubscriptionErrorMessage);
      },
      complete: () => {
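Values thrown across a `postMessage` boundary must be structured-clone-safe, which is why the consumer now normalizes anything thrown into a plain object before replying. A condensed sketch of that normalization (field list mirrors the diff; lodash `pick` is inlined, and some edge cases like the `'{}'` fallback are simplified):

```typescript
// Normalize an arbitrary thrown value into a plain, clonable
// { name, message, ... } object, as the op consumer does before
// posting a 'return' or 'error' message.
const FIELDS = [
  'name',
  'message',
  'code',
  'type',
  'status',
  'data',
  'stacktrace',
] as const;

function toSerializableError(error: unknown): { name: string; message: string } {
  const out: Record<string, unknown> = {};
  if (error && typeof error === 'object') {
    // copy only the whitelisted, clone-safe fields
    for (const field of FIELDS) {
      const value = (error as Record<string, unknown>)[field];
      if (value !== undefined) out[field] = value;
    }
  }
  if (typeof out.message !== 'string' || !out.message) {
    // fall back: strings pass through, objects get JSON-stringified
    if (typeof error === 'string') {
      out.message = error;
    } else {
      try {
        out.message = JSON.stringify(error) ?? 'Unknown error';
      } catch {
        out.message = 'Unknown error';
      }
    }
  }
  if (typeof out.name !== 'string' || !out.name) {
    out.name = error instanceof Error ? error.name : 'Error';
  }
  return out as { name: string; message: string };
}
```

This is exactly the behavior the new consumer tests pin down: a thrown string becomes `{ name: 'Error', message: 'worker panic' }`, and a thrown plain object gets a JSON message so no diagnostic detail is silently dropped.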


@@ -12,7 +12,16 @@ export interface OpSchema {
  [key: string]: [any, any?];
}

-type RequiredInput<In> = In extends void ? [] : In extends never ? [] : [In];
+type IsAny<T> = 0 extends 1 & T ? true : false;
+
+type RequiredInput<In> =
+  IsAny<In> extends true
+    ? [In]
+    : [In] extends [never]
+      ? []
+      : [In] extends [void]
+        ? []
+        : [In];

export type OpNames<T extends OpSchema> = ValuesOf<KeyToKey<T>>;
export type OpInput<
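The `IsAny` trick works because `any` absorbs intersections: `1 & any` is `any`, and `0 extends any` holds, while for every other `T`, `0` cannot extend `1 & T`. The checks below only compile if the conditional types resolve as described:

```typescript
// `0 extends 1 & T` is only true when T is `any`, because the
// intersection with `any` collapses instead of narrowing to 1.
type IsAny<T> = 0 extends 1 & T ? true : false;

// Tuple-ized input: `any` stays a required argument, while
// `void` and `never` make the argument list empty. Wrapping in
// tuples ([In] extends [never]) disables distribution over unions.
type RequiredInput<In> =
  IsAny<In> extends true
    ? [In]
    : [In] extends [never]
      ? []
      : [In] extends [void]
        ? []
        : [In];

// Compile-time assertions: these annotations type-check only if
// the conditionals resolve as claimed.
const anyIsAny: IsAny<any> = true;
const numberIsNotAny: IsAny<number> = false;

type VoidArgs = RequiredInput<void>; // resolves to []
type NumArgs = RequiredInput<number>; // resolves to [number]
const voidLen: VoidArgs['length'] = 0;
const numLen: NumArgs['length'] = 1;

console.log(anyIsAny, numberIsNotAny, voidLen, numLen);
```

Before this fix, `In extends void` distributed naively and mishandled `any`, which is why optional-payload ops like `init` needed the dedicated `IsAny` branch.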


@@ -2,6 +2,7 @@
edition = "2024"
license-file = "LICENSE"
name = "affine_common"
+publish = false
version = "0.1.0"

[features]


@@ -1,18 +1,235 @@
import 'fake-indexeddb/auto';

-import { expect, test } from 'vitest';
+import * as reader from '@affine/reader';
+import { NEVER } from 'rxjs';
+import { afterEach, expect, test, vi } from 'vitest';
import { Doc as YDoc, encodeStateAsUpdate } from 'yjs';

+import { DummyConnection } from '../connection';
import {
  IndexedDBBlobStorage,
  IndexedDBBlobSyncStorage,
  IndexedDBDocStorage,
  IndexedDBDocSyncStorage,
} from '../impls/idb';
-import { SpaceStorage } from '../storage';
+import {
+  type AggregateOptions,
+  type AggregateResult,
+  type CrawlResult,
+  type DocClock,
+  type DocClocks,
+  type DocDiff,
+  type DocIndexedClock,
+  type DocRecord,
+  type DocStorage,
+  type DocUpdate,
+  type IndexerDocument,
+  type IndexerSchema,
+  IndexerStorageBase,
+  IndexerSyncStorageBase,
+  type Query,
+  type SearchOptions,
+  type SearchResult,
+  SpaceStorage,
+} from '../storage';
import { Sync } from '../sync';
+import { IndexerSyncImpl } from '../sync/indexer';
import { expectYjsEqual } from './utils';
afterEach(() => {
vi.restoreAllMocks();
});
function deferred<T = void>() {
let resolve!: (value: T | PromiseLike<T>) => void;
let reject!: (reason?: unknown) => void;
const promise = new Promise<T>((res, rej) => {
resolve = res;
reject = rej;
});
return { promise, resolve, reject };
}
class TestDocStorage implements DocStorage {
readonly storageType = 'doc' as const;
readonly connection = new DummyConnection();
readonly isReadonly = false;
private readonly subscribers = new Set<
(update: DocRecord, origin?: string) => void
>();
constructor(
readonly spaceId: string,
private readonly timestamps: Map<string, Date>,
private readonly crawlDocDataImpl: (
docId: string
) => Promise<CrawlResult | null>
) {}
async getDoc(_docId: string): Promise<DocRecord | null> {
return null;
}
async getDocDiff(
_docId: string,
_state?: Uint8Array
): Promise<DocDiff | null> {
return null;
}
async pushDocUpdate(update: DocUpdate, origin?: string): Promise<DocClock> {
const timestamp = this.timestamps.get(update.docId) ?? new Date();
const record = { ...update, timestamp };
this.timestamps.set(update.docId, timestamp);
for (const subscriber of this.subscribers) {
subscriber(record, origin);
}
return { docId: update.docId, timestamp };
}
async getDocTimestamp(docId: string): Promise<DocClock | null> {
const timestamp = this.timestamps.get(docId);
return timestamp ? { docId, timestamp } : null;
}
async getDocTimestamps(): Promise<DocClocks> {
return Object.fromEntries(this.timestamps);
}
async deleteDoc(docId: string): Promise<void> {
this.timestamps.delete(docId);
}
subscribeDocUpdate(callback: (update: DocRecord, origin?: string) => void) {
this.subscribers.add(callback);
return () => {
this.subscribers.delete(callback);
};
}
async crawlDocData(docId: string): Promise<CrawlResult | null> {
return this.crawlDocDataImpl(docId);
}
}
class TrackingIndexerStorage extends IndexerStorageBase {
override readonly connection = new DummyConnection();
override readonly isReadonly = false;
constructor(
private readonly calls: string[],
override readonly recommendRefreshInterval: number
) {
super();
}
override async search<
T extends keyof IndexerSchema,
const O extends SearchOptions<T>,
>(_table: T, _query: Query<T>, _options?: O): Promise<SearchResult<T, O>> {
return {
pagination: { count: 0, limit: 0, skip: 0, hasMore: false },
nodes: [],
} as SearchResult<T, O>;
}
override async aggregate<
T extends keyof IndexerSchema,
const O extends AggregateOptions<T>,
>(
_table: T,
_query: Query<T>,
_field: keyof IndexerSchema[T],
_options?: O
): Promise<AggregateResult<T, O>> {
return {
pagination: { count: 0, limit: 0, skip: 0, hasMore: false },
buckets: [],
} as AggregateResult<T, O>;
}
override search$<
T extends keyof IndexerSchema,
const O extends SearchOptions<T>,
>(_table: T, _query: Query<T>, _options?: O) {
return NEVER;
}
override aggregate$<
T extends keyof IndexerSchema,
const O extends AggregateOptions<T>,
>(_table: T, _query: Query<T>, _field: keyof IndexerSchema[T], _options?: O) {
return NEVER;
}
override async deleteByQuery<T extends keyof IndexerSchema>(
table: T,
_query: Query<T>
): Promise<void> {
this.calls.push(`deleteByQuery:${String(table)}`);
}
override async insert<T extends keyof IndexerSchema>(
table: T,
document: IndexerDocument<T>
): Promise<void> {
this.calls.push(`insert:${String(table)}:${document.id}`);
}
override async delete<T extends keyof IndexerSchema>(
table: T,
id: string
): Promise<void> {
this.calls.push(`delete:${String(table)}:${id}`);
}
override async update<T extends keyof IndexerSchema>(
table: T,
document: IndexerDocument<T>
): Promise<void> {
this.calls.push(`update:${String(table)}:${document.id}`);
}
override async refresh<T extends keyof IndexerSchema>(
_table: T
): Promise<void> {
return;
}
override async refreshIfNeed(): Promise<void> {
this.calls.push('refresh');
}
override async indexVersion(): Promise<number> {
return 1;
}
}
class TrackingIndexerSyncStorage extends IndexerSyncStorageBase {
override readonly connection = new DummyConnection();
private readonly clocks = new Map<string, DocIndexedClock>();
constructor(private readonly calls: string[]) {
super();
}
override async getDocIndexedClock(
docId: string
): Promise<DocIndexedClock | null> {
return this.clocks.get(docId) ?? null;
}
override async setDocIndexedClock(clock: DocIndexedClock): Promise<void> {
this.calls.push(`setClock:${clock.docId}`);
this.clocks.set(clock.docId, clock);
}
override async clearDocIndexedClock(docId: string): Promise<void> {
this.calls.push(`clearClock:${docId}`);
this.clocks.delete(docId);
}
}
test('doc', async () => {
  const doc = new YDoc();
  doc.getMap('test').set('hello', 'world');
@@ -207,3 +424,114 @@ test('blob', async () => {
    expect(c?.data).toEqual(new Uint8Array([4, 3, 2, 1]));
  }
});
test('indexer defers indexed clock persistence until a refresh happens on delayed refresh storages', async () => {
const calls: string[] = [];
const docsInRootDoc = new Map([['doc1', { title: 'Doc 1' }]]);
const docStorage = new TestDocStorage(
'workspace-id',
new Map([['doc1', new Date('2026-01-01T00:00:00.000Z')]]),
async () => ({
title: 'Doc 1',
summary: 'summary',
blocks: [
{ blockId: 'block-1', flavour: 'affine:image', blob: ['blob-1'] },
],
})
);
const indexer = new TrackingIndexerStorage(calls, 30_000);
const indexerSyncStorage = new TrackingIndexerSyncStorage(calls);
const sync = new IndexerSyncImpl(
docStorage,
{
local: indexer,
remotes: {},
},
indexerSyncStorage
);
vi.spyOn(reader, 'readAllDocsFromRootDoc').mockImplementation(
() => new Map(docsInRootDoc)
);
try {
sync.start();
await sync.waitForCompleted();
expect(calls).not.toContain('setClock:doc1');
sync.stop();
await vi.waitFor(() => {
expect(calls).toContain('setClock:doc1');
});
const lastRefreshIndex = calls.lastIndexOf('refresh');
const setClockIndex = calls.indexOf('setClock:doc1');
expect(lastRefreshIndex).toBeGreaterThanOrEqual(0);
expect(setClockIndex).toBeGreaterThan(lastRefreshIndex);
} finally {
sync.stop();
}
});
test('indexer completion waits for the current job to finish', async () => {
const docsInRootDoc = new Map([['doc1', { title: 'Doc 1' }]]);
const crawlStarted = deferred<void>();
const releaseCrawl = deferred<void>();
const docStorage = new TestDocStorage(
'workspace-id',
new Map([['doc1', new Date('2026-01-01T00:00:00.000Z')]]),
async () => {
crawlStarted.resolve();
await releaseCrawl.promise;
return {
title: 'Doc 1',
summary: 'summary',
blocks: [
{ blockId: 'block-1', flavour: 'affine:image', blob: ['blob-1'] },
],
};
}
);
const sync = new IndexerSyncImpl(
docStorage,
{
local: new TrackingIndexerStorage([], 30_000),
remotes: {},
},
new TrackingIndexerSyncStorage([])
);
vi.spyOn(reader, 'readAllDocsFromRootDoc').mockImplementation(
() => new Map(docsInRootDoc)
);
try {
sync.start();
await crawlStarted.promise;
let completed = false;
let docCompleted = false;
const waitForCompleted = sync.waitForCompleted().then(() => {
completed = true;
});
const waitForDocCompleted = sync.waitForDocCompleted('doc1').then(() => {
docCompleted = true;
});
await new Promise(resolve => setTimeout(resolve, 20));
expect(completed).toBe(false);
expect(docCompleted).toBe(false);
releaseCrawl.resolve();
await waitForCompleted;
await waitForDocCompleted;
} finally {
sync.stop();
}
});
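The `deferred` helper these tests rely on can be shown standalone; it exposes a promise together with its `resolve`/`reject` so a test decides exactly when async work may proceed. The `worker` function below is a hypothetical stand-in for the gated crawl job:

```typescript
// Expose a promise plus its settle functions, for manual control in tests.
function deferred<T = void>() {
  let resolve!: (value: T | PromiseLike<T>) => void;
  let reject!: (reason?: unknown) => void;
  const promise = new Promise<T>((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}

// Hypothetical async job that blocks until the test releases the gate,
// mirroring how the tests above hold crawlDocData open with releaseCrawl.
async function worker(gate: Promise<void>): Promise<string> {
  await gate;
  return 'done';
}
```

This is how the second test proves `waitForCompleted` does not resolve while a crawl is in flight: the gate stays unresolved, the completion flags stay `false`, and only `releaseCrawl.resolve()` lets everything settle.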


@@ -112,6 +112,10 @@ export class IndexerSyncImpl implements IndexerSync {
  private readonly indexer: IndexerStorage;
  private readonly remote?: IndexerStorage;
+  private readonly pendingIndexedClocks = new Map<
+    string,
+    { docId: string; timestamp: Date; indexerVersion: number }
+  >();
  private lastRefreshed = Date.now();
@@ -372,12 +376,13 @@ export class IndexerSyncImpl implements IndexerSync {
          field: 'docId',
          match: docId,
        });
+        this.pendingIndexedClocks.delete(docId);
        await this.indexerSync.clearDocIndexedClock(docId);
        this.status.docsInIndexer.delete(docId);
        this.status.statusUpdatedSubject$.next(docId);
      }
    }
-    await this.refreshIfNeed();
+    await this.refreshIfNeed(true);
    // #endregion
  } else {
    // #region crawl doc
@@ -394,7 +399,8 @@ export class IndexerSyncImpl implements IndexerSync {
      }
      const docIndexedClock =
-        await this.indexerSync.getDocIndexedClock(docId);
+        this.pendingIndexedClocks.get(docId) ??
+        (await this.indexerSync.getDocIndexedClock(docId));
      if (
        docIndexedClock &&
        docIndexedClock.timestamp.getTime() ===
@@ -460,13 +466,12 @@ export class IndexerSyncImpl implements IndexerSync {
        );
      }
-      await this.refreshIfNeed();
-      await this.indexerSync.setDocIndexedClock({
+      this.pendingIndexedClocks.set(docId, {
        docId,
        timestamp: docClock.timestamp,
        indexerVersion: indexVersion,
      });
+      await this.refreshIfNeed();
      // #endregion
    }
@@ -476,7 +481,7 @@ export class IndexerSyncImpl implements IndexerSync {
        this.status.completeJob();
      }
    } finally {
-      await this.refreshIfNeed();
+      await this.refreshIfNeed(true);
      unsubscribe();
    }
  }
@@ -484,18 +489,27 @@ export class IndexerSyncImpl implements IndexerSync {
  // ensure the indexer is refreshed according to recommendRefreshInterval
  // recommendRefreshInterval <= 0 means force refresh on each operation
  // recommendRefreshInterval > 0 means refresh if the last refresh is older than recommendRefreshInterval
-  private async refreshIfNeed(): Promise<void> {
+  private async refreshIfNeed(force = false): Promise<void> {
    const recommendRefreshInterval = this.indexer.recommendRefreshInterval ?? 0;
    const needRefresh =
      recommendRefreshInterval > 0 &&
      this.lastRefreshed + recommendRefreshInterval < Date.now();
    const forceRefresh = recommendRefreshInterval <= 0;
-    if (needRefresh || forceRefresh) {
+    if (force || needRefresh || forceRefresh) {
      await this.indexer.refreshIfNeed();
+      await this.flushPendingIndexedClocks();
      this.lastRefreshed = Date.now();
    }
  }
+
+  private async flushPendingIndexedClocks() {
+    if (this.pendingIndexedClocks.size === 0) return;
+    for (const [docId, clock] of this.pendingIndexedClocks) {
+      await this.indexerSync.setDocIndexedClock(clock);
+      this.pendingIndexedClocks.delete(docId);
+    }
+  }
/**
 * Get all docs from the root doc, without deleted docs
 */
@@ -706,7 +720,10 @@ class IndexerSyncStatus {
      indexing: this.jobs.length() + (this.currentJob ? 1 : 0),
      total: this.docsInRootDoc.size + 1,
      errorMessage: this.errorMessage,
-      completed: this.rootDocReady && this.jobs.length() === 0,
+      completed:
+        this.rootDocReady &&
+        this.jobs.length() === 0 &&
+        this.currentJob === null,
      batterySaveMode: this.batterySaveMode,
      paused: this.paused !== null,
    });
@@ -734,9 +751,10 @@ class IndexerSyncStatus {
        completed: true,
      });
    } else {
+      const indexing = this.jobs.has(docId) || this.currentJob === docId;
      subscribe.next({
-        indexing: this.jobs.has(docId),
-        completed: this.docsInIndexer.has(docId) && !this.jobs.has(docId),
+        indexing,
+        completed: this.docsInIndexer.has(docId) && !indexing,
      });
    }
  };
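The `pendingIndexedClocks` change in this file is a write-behind buffer: indexed clocks are held in memory and only persisted once the index refresh has actually gone through, so a crash before refresh causes the doc to be re-crawled rather than silently skipped. A generic sketch of the pattern — `ClockBuffer` is an illustrative name, not from the codebase:

```typescript
// Buffer per-key values in memory; persist them only on flush(), mirroring
// how pendingIndexedClocks defers setDocIndexedClock to refresh time.
class ClockBuffer<K, V> {
  private readonly pending = new Map<K, V>();

  constructor(private readonly persist: (key: K, value: V) => Promise<void>) {}

  set(key: K, value: V): void {
    this.pending.set(key, value);
  }

  // Pending values shadow persisted ones, like the
  // `pendingIndexedClocks.get(docId) ?? await getDocIndexedClock(docId)` read path.
  get(key: K): V | undefined {
    return this.pending.get(key);
  }

  async flush(): Promise<void> {
    // deleting the current entry during Map iteration is safe in JS
    for (const [key, value] of this.pending) {
      await this.persist(key, value);
      this.pending.delete(key);
    }
  }
}
```

The ordering the new test asserts (`setClock:doc1` only after the last `refresh`) falls out of this: `flush` runs inside `refreshIfNeed`, after `indexer.refreshIfNeed()` succeeds.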


@@ -1,5 +1,5 @@
export const encodeLink = (link: string) =>
  encodeURI(link)
-    .replace(/\(/g, '%28')
-    .replace(/\)/g, '%29')
+    .replaceAll('(', '%28')
+    .replaceAll(')', '%29')
    .replace(/(\?|&)response-content-disposition=attachment.*$/, '');


@@ -7,7 +7,7 @@
  },
  "dependencies": {
    "aws4": "^1.13.2",
-    "fast-xml-parser": "^5.3.4",
+    "fast-xml-parser": "^5.5.7",
    "s3mini": "^0.9.1"
  },
  "devDependencies": {


@@ -19,6 +19,7 @@ import app.affine.pro.plugin.AFFiNEThemePlugin
import app.affine.pro.plugin.AuthPlugin
import app.affine.pro.plugin.HashCashPlugin
import app.affine.pro.plugin.NbStorePlugin
+import app.affine.pro.plugin.PreviewPlugin
import app.affine.pro.service.GraphQLService
import app.affine.pro.service.SSEService
import app.affine.pro.service.WebService
@@ -52,6 +53,7 @@ class MainActivity : BridgeActivity(), AIButtonPlugin.Callback, AFFiNEThemePlugi
        AuthPlugin::class.java,
        HashCashPlugin::class.java,
        NbStorePlugin::class.java,
+        PreviewPlugin::class.java,
      )
    )
  }


@@ -1,8 +1,6 @@
package app.affine.pro.ai.chat

-import com.affine.pro.graphql.GetCopilotHistoriesQuery
import com.affine.pro.graphql.fragment.CopilotChatHistory
-import com.affine.pro.graphql.fragment.CopilotChatMessage
import kotlinx.datetime.Clock
import kotlinx.datetime.Instant
@@ -53,7 +51,7 @@ data class ChatMessage(
    createAt = Clock.System.now(),
  )

-  fun from(message: CopilotChatMessage) = ChatMessage(
+  fun from(message: CopilotChatHistory.Message) = ChatMessage(
    id = message.id,
    role = Role.fromValue(message.role),
    content = message.content,


@@ -0,0 +1,106 @@
package app.affine.pro.plugin
import android.net.Uri
import com.getcapacitor.JSObject
import com.getcapacitor.Plugin
import com.getcapacitor.PluginCall
import com.getcapacitor.PluginMethod
import com.getcapacitor.annotation.CapacitorPlugin
import kotlinx.coroutines.Dispatchers
import timber.log.Timber
import uniffi.affine_mobile_native.renderMermaidPreviewSvg
import uniffi.affine_mobile_native.renderTypstPreviewSvg
import java.io.File
private fun JSObject.getOptionalString(key: String): String? {
return if (has(key) && !isNull(key)) getString(key) else null
}
private fun JSObject.getOptionalDouble(key: String): Double? {
return if (has(key) && !isNull(key)) getDouble(key) else null
}
private fun resolveLocalFontDir(fontUrl: String): String? {
val uri = Uri.parse(fontUrl)
val path = when {
uri.scheme == null -> {
val file = File(fontUrl)
if (!file.isAbsolute) {
return null
}
file.path
}
uri.scheme == "file" -> uri.path
else -> null
} ?: return null
val file = File(path)
val directory = if (file.isDirectory) file else file.parentFile ?: return null
return directory.absolutePath
}
private fun JSObject.resolveTypstFontDirs(): List<String>? {
if (!has("fontUrls") || isNull("fontUrls")) {
return null
}
val fontUrls = optJSONArray("fontUrls")
?: throw IllegalArgumentException("Typst preview fontUrls must be an array of strings.")
val fontDirs = buildList(fontUrls.length()) {
repeat(fontUrls.length()) { index ->
val fontUrl = fontUrls.optString(index, null)
?: throw IllegalArgumentException("Typst preview fontUrls must be strings.")
val fontDir = resolveLocalFontDir(fontUrl)
?: throw IllegalArgumentException("Typst preview on mobile only supports local font file URLs or absolute font directories.")
add(fontDir)
}
}
return fontDirs.distinct()
}
@CapacitorPlugin(name = "Preview")
class PreviewPlugin : Plugin() {
@PluginMethod
fun renderMermaidSvg(call: PluginCall) {
launch(Dispatchers.IO) {
try {
val code = call.getStringEnsure("code")
val options = call.getObject("options")
val svg = renderMermaidPreviewSvg(
code = code,
theme = options?.getOptionalString("theme"),
fontFamily = options?.getOptionalString("fontFamily"),
fontSize = options?.getOptionalDouble("fontSize"),
)
call.resolve(JSObject().apply {
put("svg", svg)
})
} catch (e: Exception) {
Timber.e(e, "Failed to render Mermaid preview.")
call.reject("Failed to render Mermaid preview.", null, e)
}
}
}
@PluginMethod
fun renderTypstSvg(call: PluginCall) {
launch(Dispatchers.IO) {
try {
val code = call.getStringEnsure("code")
val options = call.getObject("options")
val svg = renderTypstPreviewSvg(
code = code,
fontDirs = options?.resolveTypstFontDirs(),
cacheDir = context.cacheDir.absolutePath,
)
call.resolve(JSObject().apply {
put("svg", svg)
})
} catch (e: Exception) {
Timber.e(e, "Failed to render Typst preview.")
call.reject("Failed to render Typst preview.", null, e)
}
}
}
}


@@ -72,7 +72,7 @@ class GraphQLService @Inject constructor() {
    ).mapCatching { data ->
      data.currentUser?.copilot?.chats?.paginatedCopilotChats?.edges?.map { item -> item.node.copilotChatHistory }?.firstOrNull { history ->
        history.sessionId == sessionId
-      }?.messages?.map { msg -> msg.copilotChatMessage } ?: emptyList()
+      }?.messages ?: emptyList()
    }

  suspend fun getCopilotHistoryIds(


@@ -792,6 +792,10 @@ internal interface UniffiForeignFutureCompleteVoid : com.sun.jna.Callback {
@@ -816,6 +820,10 @@ internal interface IntegrityCheckingUniffiLib : Library {
  ): Short
  fun uniffi_affine_mobile_native_checksum_func_new_doc_storage_pool(
  ): Short
+  fun uniffi_affine_mobile_native_checksum_func_render_mermaid_preview_svg(
+  ): Short
+  fun uniffi_affine_mobile_native_checksum_func_render_typst_preview_svg(
+  ): Short
  fun uniffi_affine_mobile_native_checksum_method_docstoragepool_clear_clocks(
  ): Short
  fun uniffi_affine_mobile_native_checksum_method_docstoragepool_connect(
@@ -1017,6 +1025,10 @@ fun uniffi_affine_mobile_native_fn_func_hashcash_mint(`resource`: RustBuffer.ByValue,
  ): RustBuffer.ByValue
  fun uniffi_affine_mobile_native_fn_func_new_doc_storage_pool(uniffi_out_err: UniffiRustCallStatus,
  ): Pointer
+  fun uniffi_affine_mobile_native_fn_func_render_mermaid_preview_svg(`code`: RustBuffer.ByValue,`theme`: RustBuffer.ByValue,`fontFamily`: RustBuffer.ByValue,`fontSize`: RustBuffer.ByValue,uniffi_out_err: UniffiRustCallStatus,
+  ): RustBuffer.ByValue
+  fun uniffi_affine_mobile_native_fn_func_render_typst_preview_svg(`code`: RustBuffer.ByValue,`fontDirs`: RustBuffer.ByValue,`cacheDir`: RustBuffer.ByValue,uniffi_out_err: UniffiRustCallStatus,
+  ): RustBuffer.ByValue
  fun ffi_affine_mobile_native_rustbuffer_alloc(`size`: Long,uniffi_out_err: UniffiRustCallStatus,
  ): RustBuffer.ByValue
  fun ffi_affine_mobile_native_rustbuffer_from_bytes(`bytes`: ForeignBytes.ByValue,uniffi_out_err: UniffiRustCallStatus,
@@ -1149,6 +1161,12 @@ private fun uniffiCheckApiChecksums(lib: IntegrityCheckingUniffiLib) {
  if (lib.uniffi_affine_mobile_native_checksum_func_new_doc_storage_pool() != 32882.toShort()) {
    throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
  }
+  if (lib.uniffi_affine_mobile_native_checksum_func_render_mermaid_preview_svg() != 54334.toShort()) {
+    throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
+  }
+  if (lib.uniffi_affine_mobile_native_checksum_func_render_typst_preview_svg() != 42796.toShort()) {
+    throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
+  }
  if (lib.uniffi_affine_mobile_native_checksum_method_docstoragepool_clear_clocks() != 51151.toShort()) {
    throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
  }
@@ -3178,6 +3196,38 @@ public object FfiConverterOptionalLong: FfiConverterRustBuffer<kotlin.Long?> {
/**
* @suppress
*/
public object FfiConverterOptionalDouble: FfiConverterRustBuffer<kotlin.Double?> {
override fun read(buf: ByteBuffer): kotlin.Double? {
if (buf.get().toInt() == 0) {
return null
}
return FfiConverterDouble.read(buf)
}
override fun allocationSize(value: kotlin.Double?): ULong {
if (value == null) {
return 1UL
} else {
return 1UL + FfiConverterDouble.allocationSize(value)
}
}
override fun write(value: kotlin.Double?, buf: ByteBuffer) {
if (value == null) {
buf.put(0)
} else {
buf.put(1)
FfiConverterDouble.write(value, buf)
}
}
}
/**
 * @suppress
 */
@@ -3584,4 +3634,24 @@ public object FfiConverterSequenceTypeSearchHit: FfiConverterRustBuffer<List<Sea
}
@Throws(UniffiException::class) fun `renderMermaidPreviewSvg`(`code`: kotlin.String, `theme`: kotlin.String?, `fontFamily`: kotlin.String?, `fontSize`: kotlin.Double?): kotlin.String {
return FfiConverterString.lift(
uniffiRustCallWithError(UniffiException) { _status ->
UniffiLib.INSTANCE.uniffi_affine_mobile_native_fn_func_render_mermaid_preview_svg(
FfiConverterString.lower(`code`),FfiConverterOptionalString.lower(`theme`),FfiConverterOptionalString.lower(`fontFamily`),FfiConverterOptionalDouble.lower(`fontSize`),_status)
}
)
}
@Throws(UniffiException::class) fun `renderTypstPreviewSvg`(`code`: kotlin.String, `fontDirs`: List<kotlin.String>?, `cacheDir`: kotlin.String?): kotlin.String {
return FfiConverterString.lift(
uniffiRustCallWithError(UniffiException) { _status ->
UniffiLib.INSTANCE.uniffi_affine_mobile_native_fn_func_render_typst_preview_svg(
FfiConverterString.lower(`code`),FfiConverterOptionalSequenceString.lower(`fontDirs`),FfiConverterOptionalString.lower(`cacheDir`),_status)
}
)
}


@@ -15,6 +15,7 @@ import {
  ServersService,
  ValidatorProvider,
} from '@affine/core/modules/cloud';
+import { registerNativePreviewHandlers } from '@affine/core/modules/code-block-preview-renderer';
import { DocsService } from '@affine/core/modules/doc';
import { GlobalContextService } from '@affine/core/modules/global-context';
import { I18nProvider } from '@affine/core/modules/i18n';
@@ -54,6 +55,7 @@ import { AIButton } from './plugins/ai-button';
import { Auth } from './plugins/auth';
import { HashCash } from './plugins/hashcash';
import { NbStoreNativeDBApis } from './plugins/nbstore';
+import { Preview } from './plugins/preview';
import { writeEndpointToken } from './proxy';

const storeManagerClient = createStoreManagerClient();
@@ -85,6 +87,11 @@ framework.impl(NbstoreProvider, {
});

const frameworkProvider = framework.provider();

+registerNativePreviewHandlers({
+  renderMermaidSvg: request => Preview.renderMermaidSvg(request),
+  renderTypstSvg: request => Preview.renderTypstSvg(request),
+});
framework.impl(PopupWindowProvider, {
  open: (url: string) => {
    InAppBrowser.open({


@@ -433,7 +433,9 @@ export const NbStoreNativeDBApis: NativeDBApis = {
    id: string,
    docId: string
  ): Promise<DocIndexedClock | null> {
-    return NbStore.getDocIndexedClock({ id, docId });
+    return NbStore.getDocIndexedClock({ id, docId }).then(clock =>
+      clock ? { ...clock, timestamp: new Date(clock.timestamp) } : null
+    );
  },
  setDocIndexedClock: function (
    id: string,
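The change above works around the fact that `Date` objects do not survive the native bridge: the timestamp arrives as a number (or string) and must be revived with `new Date(...)` before callers compare it via `getTime()`. A standalone sketch of the pattern — the type names here are illustrative:

```typescript
// What the wire gives us vs. what callers expect.
type WireClock = { docId: string; timestamp: number };
type DocIndexedClock = { docId: string; timestamp: Date };

// Revive the serialized timestamp into a real Date, preserving null.
function reviveClock(clock: WireClock | null): DocIndexedClock | null {
  return clock ? { ...clock, timestamp: new Date(clock.timestamp) } : null;
}
```

Without the revival, `docIndexedClock.timestamp.getTime()` in the indexer sync loop would throw on the raw number, or the equality check against the doc clock would never match.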


@@ -0,0 +1,16 @@
export interface PreviewPlugin {
renderMermaidSvg(options: {
code: string;
options?: {
theme?: string;
fontFamily?: string;
fontSize?: number;
};
}): Promise<{ svg: string }>;
renderTypstSvg(options: {
code: string;
options?: {
fontUrls?: string[];
};
}): Promise<{ svg: string }>;
}
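The `PreviewPlugin` contract above can be exercised without the native layer by supplying a fake implementation; everything below is illustrative (the real plugin proxies to native Mermaid/Typst renderers via Capacitor):

```typescript
// Subset of the PreviewPlugin interface from this diff.
interface PreviewPluginLike {
  renderMermaidSvg(options: {
    code: string;
    options?: { theme?: string; fontFamily?: string; fontSize?: number };
  }): Promise<{ svg: string }>;
}

// Fake implementation for illustration: echoes the diagram source instead of
// rendering it, which is enough to exercise callers of the interface.
const fakePreview: PreviewPluginLike = {
  async renderMermaidSvg({ code }) {
    return { svg: `<svg><!-- ${code} --></svg>` };
  },
};
```

This mirrors how `registerNativePreviewHandlers` in the iOS/Android entry points wires `Preview.renderMermaidSvg` / `Preview.renderTypstSvg` behind the same request/response shape.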


@@ -0,0 +1,8 @@
import { registerPlugin } from '@capacitor/core';
import type { PreviewPlugin } from './definitions';
const Preview = registerPlugin<PreviewPlugin>('Preview');
export * from './definitions';
export { Preview };


@@ -46,7 +46,10 @@ export function setupEvents(frameworkProvider: FrameworkProvider) {
      const { workspace } = currentWorkspace;
      const docsService = workspace.scope.get(DocsService);
-      const page = docsService.createDoc({ primaryMode: type });
+      const page =
+        type === 'default'
+          ? docsService.createDoc()
+          : docsService.createDoc({ primaryMode: type });
      workspace.scope.get(WorkbenchService).workbench.openDoc(page.id);
    })
    .catch(err => {


@@ -13,6 +13,19 @@ import type { FrameworkProvider } from '@toeverything/infra';
import { getCurrentWorkspace, isAiEnabled } from './utils';

const logger = new DebugLogger('electron-renderer:recording');
+const RECORDING_PROCESS_RETRY_MS = 1000;
+const NATIVE_RECORDING_MIME_TYPE = 'audio/ogg';
+
+type ProcessingRecordingStatus = {
+  id: number;
+  status: 'processing';
+  appName?: string;
+  blockCreationStatus?: undefined;
+  filepath: string;
+  startTime: number;
+};
+
+type WorkspaceHandle = NonNullable<ReturnType<typeof getCurrentWorkspace>>;

async function readRecordingFile(filepath: string) {
  if (apis?.recording?.readRecordingFile) {
@@ -45,118 +58,217 @@ async function saveRecordingBlob(blobEngine: BlobEngine, filepath: string) {
  logger.debug('Saving recording', filepath);
  const opusBuffer = await readRecordingFile(filepath);
  const blob = new Blob([opusBuffer], {
-    type: 'audio/mp4',
+    type: NATIVE_RECORDING_MIME_TYPE,
  });
  const blobId = await blobEngine.set(blob);
  logger.debug('Recording saved', blobId);
  return { blob, blobId };
}
-export function setupRecordingEvents(frameworkProvider: FrameworkProvider) {
-  events?.recording.onRecordingStatusChanged(status => {
-    (async () => {
-      if ((await apis?.ui.isActiveTab()) && status?.status === 'ready') {
-        using currentWorkspace = getCurrentWorkspace(frameworkProvider);
-        if (!currentWorkspace) {
-          // maybe the workspace is not ready yet, eg. for shared workspace view
-          await apis?.recording.handleBlockCreationFailed(status.id);
-          return;
-        }
-        const { workspace } = currentWorkspace;
-        const docsService = workspace.scope.get(DocsService);
-        const aiEnabled = isAiEnabled(frameworkProvider);
-        const timestamp = i18nTime(status.startTime, {
-          absolute: {
-            accuracy: 'minute',
-            noYear: true,
-          },
-        });
-        const docProps: DocProps = {
-          onStoreLoad: (doc, { noteId }) => {
-            (async () => {
-              if (status.filepath) {
-                // it takes a while to save the blob, so we show the attachment first
-                const { blobId, blob } = await saveRecordingBlob(
-                  doc.workspace.blobSync,
-                  status.filepath
-                );
-                // name + timestamp(readable) + extension
-                const attachmentName =
-                  (status.appName ?? 'System Audio') +
-                  ' ' +
-                  timestamp +
-                  '.opus';
-                // add size and sourceId to the attachment later
-                const attachmentId = doc.addBlock(
-                  'affine:attachment',
-                  {
-                    name: attachmentName,
-                    type: 'audio/opus',
-                    size: blob.size,
-                    sourceId: blobId,
-                    embed: true,
-                  },
-                  noteId
-                );
-                const model = doc.getBlock(attachmentId)
-                  ?.model as AttachmentBlockModel;
-                if (!aiEnabled) {
-                  return;
-                }
-                using currentWorkspace = getCurrentWorkspace(frameworkProvider);
-                if (!currentWorkspace) {
-                  return;
-                }
-                const { workspace } = currentWorkspace;
-                using audioAttachment = workspace.scope
-                  .get(AudioAttachmentService)
-                  .get(model);
-                audioAttachment?.obj
-                  .transcribe()
-                  .then(() => {
-                    track.doc.editor.audioBlock.transcribeRecording({
-                      type: 'Meeting record',
-                      method: 'success',
-                      option: 'Auto transcribing',
-                    });
-                  })
-                  .catch(err => {
-                    logger.error('Failed to transcribe recording', err);
-                  });
-              } else {
-                throw new Error('No attachment model found');
-              }
-            })()
-              .then(async () => {
-                await apis?.recording.handleBlockCreationSuccess(status.id);
-              })
-              .catch(error => {
-                logger.error('Failed to transcribe recording', error);
-                return apis?.recording.handleBlockCreationFailed(
-                  status.id,
-                  error
-                );
-              })
-              .catch(error => {
-                console.error('unknown error', error);
-              });
-          },
-        };
-        const page = docsService.createDoc({
-          docProps,
-          title:
-            'Recording ' + (status.appName ?? 'System Audio') + ' ' + timestamp,
-          primaryMode: 'page',
-        });
-        workspace.scope.get(WorkbenchService).workbench.openDoc(page.id);
-      }
-    })().catch(console.error);
+function shouldProcessRecording(
+  status: unknown
+): status is ProcessingRecordingStatus {
+  return (
+    !!status &&
+    typeof status === 'object' &&
+    'status' in status &&
+    status.status === 'processing' &&
+    'filepath' in status &&
+    typeof status.filepath === 'string' &&
+    !('blockCreationStatus' in status && status.blockCreationStatus)
+  );
+}
+
+async function createRecordingDoc(
+  frameworkProvider: FrameworkProvider,
+  workspace: WorkspaceHandle['workspace'],
+  status: ProcessingRecordingStatus
+) {
+  const docsService = workspace.scope.get(DocsService);
+  const aiEnabled = isAiEnabled(frameworkProvider);
+  const recordingFilepath = status.filepath;
+  const timestamp = i18nTime(status.startTime, {
+    absolute: {
+      accuracy: 'minute',
+      noYear: true,
+    },
+  });
+
+  await new Promise<void>((resolve, reject) => {
+    const docProps: DocProps = {
+      onStoreLoad: (doc, { noteId }) => {
+        void (async () => {
+          // it takes a while to save the blob, so we show the attachment first
+          const { blobId, blob } = await saveRecordingBlob(
+            doc.workspace.blobSync,
+            recordingFilepath
+          );
+          // name + timestamp(readable) + extension
+          const attachmentName =
+            (status.appName ?? 'System Audio') + ' ' + timestamp + '.opus';
+          const attachmentId = doc.addBlock(
+            'affine:attachment',
+            {
+              name: attachmentName,
+              type: NATIVE_RECORDING_MIME_TYPE,
+              size: blob.size,
+              sourceId: blobId,
+              embed: true,
+            },
+            noteId
+          );
+          const model = doc.getBlock(attachmentId)
+            ?.model as AttachmentBlockModel;
+          if (!aiEnabled) {
+            return;
+          }
+          using currentWorkspace = getCurrentWorkspace(frameworkProvider);
+          if (!currentWorkspace) {
+            return;
+          }
+          const { workspace } = currentWorkspace;
+          using audioAttachment = workspace.scope
+            .get(AudioAttachmentService)
+            .get(model);
+          audioAttachment?.obj
+            .transcribe()
+            .then(() => {
+              track.doc.editor.audioBlock.transcribeRecording({
+                type: 'Meeting record',
+                method: 'success',
+                option: 'Auto transcribing',
+              });
+            })
+            .catch(err => {
+              logger.error('Failed to transcribe recording', err);
+            });
+        })().then(resolve, reject);
+      },
+    };
+
+    const page = docsService.createDoc({
+      docProps,
+      title:
'Recording ' + (status.appName ?? 'System Audio') + ' ' + timestamp,
primaryMode: 'page',
});
workspace.scope.get(WorkbenchService).workbench.openDoc(page.id);
});
}
export function setupRecordingEvents(frameworkProvider: FrameworkProvider) {
let pendingStatus: ProcessingRecordingStatus | null = null;
let retryTimer: ReturnType<typeof setTimeout> | null = null;
let processingStatusId: number | null = null;
const clearRetry = () => {
if (retryTimer !== null) {
clearTimeout(retryTimer);
retryTimer = null;
}
};
const clearPending = (id?: number) => {
if (id === undefined || pendingStatus?.id === id) {
pendingStatus = null;
clearRetry();
}
if (id === undefined || processingStatusId === id) {
processingStatusId = null;
}
};
const scheduleRetry = () => {
if (!pendingStatus || retryTimer !== null) {
return;
}
retryTimer = setTimeout(() => {
retryTimer = null;
void processPendingStatus().catch(console.error);
}, RECORDING_PROCESS_RETRY_MS);
};
const processPendingStatus = async () => {
const status = pendingStatus;
if (!status || processingStatusId === status.id) {
return;
}
let isActiveTab = false;
try {
isActiveTab = !!(await apis?.ui.isActiveTab());
} catch (error) {
logger.error('Failed to probe active recording tab', error);
scheduleRetry();
return;
}
if (!isActiveTab) {
scheduleRetry();
return;
}
using currentWorkspace = getCurrentWorkspace(frameworkProvider);
if (!currentWorkspace) {
// Workspace can lag behind the post-recording status update for a short
// time; keep retrying instead of permanently failing the import.
scheduleRetry();
return;
}
processingStatusId = status.id;
try {
await createRecordingDoc(
frameworkProvider,
currentWorkspace.workspace,
status
);
await apis?.recording.setRecordingBlockCreationStatus(
status.id,
'success'
);
clearPending(status.id);
} catch (error) {
logger.error('Failed to create recording block', error);
try {
await apis?.recording.setRecordingBlockCreationStatus(
status.id,
'failed',
error instanceof Error ? error.message : undefined
);
} finally {
clearPending(status.id);
}
} finally {
if (pendingStatus?.id === status.id) {
processingStatusId = null;
scheduleRetry();
}
}
};
events?.recording.onRecordingStatusChanged(status => {
if (shouldProcessRecording(status)) {
pendingStatus = status;
clearRetry();
void processPendingStatus().catch(console.error);
return;
}
if (!status) {
clearPending();
return;
}
if (pendingStatus?.id === status.id) {
clearPending(status.id);
}
}); });
} }
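The `setupRecordingEvents` changes above keep only the latest recording status pending, schedule at most one retry at a time, and stop retrying once processing succeeds. A minimal, standalone sketch of that pending-status/retry shape (names simplified, no real AFFiNE APIs; the scheduler is injected so the loop can be driven deterministically):

```typescript
type Status = { id: number };

// Minimal sketch of the pending-status loop: keep only the latest status,
// schedule exactly one retry at a time, and stop once processing succeeds.
function createPendingProcessor(
  process: (s: Status) => boolean, // returns true when processing succeeded
  schedule: (fn: () => void) => void // injected timer, e.g. setTimeout wrapper
) {
  let pending: Status | null = null;
  let scheduled = false;

  const run = () => {
    scheduled = false;
    const status = pending;
    if (!status) return;
    if (process(status)) {
      pending = null; // done, drop the pending status
    } else if (!scheduled) {
      scheduled = true;
      schedule(run); // not ready yet, try again later
    }
  };

  return {
    push(status: Status) {
      pending = status; // a newer status replaces any older pending one
      run();
    },
    hasPending: () => pending !== null,
  };
}
```

In production the `schedule` argument would be a `setTimeout` with a retry delay (the diff uses `RECORDING_PROCESS_RETRY_MS`); injecting it keeps the logic testable.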

View File

@@ -1,28 +1,17 @@
 import { Button } from '@affine/component';
 import { useAsyncCallback } from '@affine/core/components/hooks/affine-async-hooks';
 import { appIconMap } from '@affine/core/utils';
-import {
-  createStreamEncoder,
-  encodeRawBufferToOpus,
-  type OpusStreamEncoder,
-} from '@affine/core/utils/opus-encoding';
 import { apis, events } from '@affine/electron-api';
 import { useI18n } from '@affine/i18n';
 import track from '@affine/track';
-import { useEffect, useMemo, useState } from 'react';
+import { useEffect, useMemo, useRef, useState } from 'react';
 import * as styles from './styles.css';

 type Status = {
   id: number;
-  status:
-    | 'new'
-    | 'recording'
-    | 'paused'
-    | 'stopped'
-    | 'ready'
-    | 'create-block-success'
-    | 'create-block-failed';
+  status: 'new' | 'recording' | 'processing' | 'ready';
+  blockCreationStatus?: 'success' | 'failed';
   appName?: string;
   appGroupId?: number;
   icon?: Buffer;
@@ -58,6 +47,7 @@ const appIcon = appIconMap[BUILD_CONFIG.appBuildType];

 export function Recording() {
   const status = useRecordingStatus();
+  const trackedNewRecordingIdsRef = useRef<Set<number>>(new Set());
   const t = useI18n();

   const textElement = useMemo(() => {
@@ -66,14 +56,19 @@ export function Recording() {
     }
     if (status.status === 'new') {
       return t['com.affine.recording.new']();
-    } else if (status.status === 'create-block-success') {
+    } else if (
+      status.status === 'ready' &&
+      status.blockCreationStatus === 'success'
+    ) {
       return t['com.affine.recording.success.prompt']();
-    } else if (status.status === 'create-block-failed') {
+    } else if (
+      status.status === 'ready' &&
+      status.blockCreationStatus === 'failed'
+    ) {
       return t['com.affine.recording.failed.prompt']();
     } else if (
       status.status === 'recording' ||
-      status.status === 'ready' ||
-      status.status === 'stopped'
+      status.status === 'processing'
     ) {
       if (status.appName) {
         return t['com.affine.recording.recording']({
@@ -105,106 +100,16 @@ export function Recording() {
     await apis?.recording?.stopRecording(status.id);
   }, [status]);

-  const handleProcessStoppedRecording = useAsyncCallback(
-    async (currentStreamEncoder?: OpusStreamEncoder) => {
-      let id: number | undefined;
-      try {
-        const result = await apis?.recording?.getCurrentRecording();
-        if (!result) {
-          return;
-        }
-        id = result.id;
-        const { filepath, sampleRate, numberOfChannels } = result;
-        if (!filepath || !sampleRate || !numberOfChannels) {
-          return;
-        }
-        const [buffer] = await Promise.all([
-          currentStreamEncoder
-            ? currentStreamEncoder.finish()
-            : encodeRawBufferToOpus({
-                filepath,
-                sampleRate,
-                numberOfChannels,
-              }),
-          new Promise<void>(resolve => {
-            setTimeout(() => {
-              resolve();
-            }, 500); // wait at least 500ms for better user experience
-          }),
-        ]);
-        await apis?.recording.readyRecording(result.id, buffer);
-      } catch (error) {
-        console.error('Failed to stop recording', error);
-        await apis?.popup?.dismissCurrentRecording();
-        if (id) {
-          await apis?.recording.removeRecording(id);
-        }
-      }
-    },
-    []
-  );
-
   useEffect(() => {
-    let removed = false;
-    let currentStreamEncoder: OpusStreamEncoder | undefined;
-    apis?.recording
-      .getCurrentRecording()
-      .then(status => {
-        if (status) {
-          return handleRecordingStatusChanged(status);
-        }
-        return;
-      })
-      .catch(console.error);
-    const handleRecordingStatusChanged = async (status: Status) => {
-      if (removed) {
-        return;
-      }
-      if (status?.status === 'new') {
-        track.popup.$.recordingBar.toggleRecordingBar({
-          type: 'Meeting record',
-          appName: status.appName || 'System Audio',
-        });
-      }
-      if (
-        status?.status === 'recording' &&
-        status.sampleRate &&
-        status.numberOfChannels &&
-        (!currentStreamEncoder || currentStreamEncoder.id !== status.id)
-      ) {
-        currentStreamEncoder?.close();
-        currentStreamEncoder = createStreamEncoder(status.id, {
-          sampleRate: status.sampleRate,
-          numberOfChannels: status.numberOfChannels,
-        });
-        currentStreamEncoder.poll().catch(console.error);
-      }
-      if (status?.status === 'stopped') {
-        handleProcessStoppedRecording(currentStreamEncoder);
-        currentStreamEncoder = undefined;
-      }
-    };
-    // allow processing stopped event in tray menu as well:
-    const unsubscribe = events?.recording.onRecordingStatusChanged(status => {
-      if (status) {
-        handleRecordingStatusChanged(status).catch(console.error);
-      }
-    });
-    return () => {
-      removed = true;
-      unsubscribe?.();
-      currentStreamEncoder?.close();
-    };
-  }, [handleProcessStoppedRecording]);
+    if (!status || status.status !== 'new') return;
+    if (trackedNewRecordingIdsRef.current.has(status.id)) return;
+    trackedNewRecordingIdsRef.current.add(status.id);
+    track.popup.$.recordingBar.toggleRecordingBar({
+      type: 'Meeting record',
+      appName: status.appName || 'System Audio',
+    });
+  }, [status]);

   const handleStartRecording = useAsyncCallback(async () => {
     if (!status) {
@@ -249,7 +154,10 @@ export function Recording() {
         {t['com.affine.recording.stop']()}
       </Button>
     );
-  } else if (status.status === 'stopped' || status.status === 'ready') {
+  } else if (
+    status.status === 'processing' ||
+    (status.status === 'ready' && !status.blockCreationStatus)
+  ) {
     return (
       <Button
         variant="error"
@@ -258,13 +166,19 @@ export function Recording() {
         disabled
       />
     );
-  } else if (status.status === 'create-block-success') {
+  } else if (
+    status.status === 'ready' &&
+    status.blockCreationStatus === 'success'
+  ) {
    return (
      <Button variant="primary" onClick={handleDismiss}>
        {t['com.affine.recording.success.button']()}
      </Button>
    );
-  } else if (status.status === 'create-block-failed') {
+  } else if (
+    status.status === 'ready' &&
+    status.blockCreationStatus === 'failed'
+  ) {
     return (
       <>
         <Button variant="plain" onClick={handleDismiss}>
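The popup component above now derives its UI from a two-field state, `status` plus an optional `blockCreationStatus`, instead of seven flat status strings. A reduced sketch of that mapping (the labels are illustrative placeholders, not the real i18n keys):

```typescript
type Status = {
  status: 'new' | 'recording' | 'processing' | 'ready';
  blockCreationStatus?: 'success' | 'failed';
};

// Collapse the two-field state back into the label the popup should show.
// Labels are placeholders standing in for the real i18n lookups.
function popupLabel(s: Status): string {
  if (s.status === 'new') return 'start?';
  if (s.status === 'ready' && s.blockCreationStatus === 'success') {
    return 'saved';
  }
  if (s.status === 'ready' && s.blockCreationStatus === 'failed') {
    return 'failed';
  }
  // 'recording', 'processing', and 'ready' without a block-creation result
  return 'in progress';
}
```

Splitting the state this way means "ready but not yet imported" and "ready and imported" no longer need separate top-level statuses.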

View File

@@ -1,10 +1,11 @@
-import { parse } from 'node:path';
+import { parse, resolve } from 'node:path';

 import { DocStorage, ValidationResult } from '@affine/native';
 import { parseUniversalId } from '@affine/nbstore';
 import fs from 'fs-extra';
 import { nanoid } from 'nanoid';

+import { isPathInsideBase } from '../../shared/utils';
 import { logger } from '../logger';
 import { mainRPC } from '../main-rpc';
 import { getDocStoragePool } from '../nbstore';
@@ -38,31 +39,6 @@ export interface SelectDBFileLocationResult {
   canceled?: boolean;
 }

-// provide a backdoor to set dialog path for testing in playwright
-export interface FakeDialogResult {
-  canceled?: boolean;
-  filePath?: string;
-  filePaths?: string[];
-}
-
-// result will be used in the next call to showOpenDialog
-// if it is being read once, it will be reset to undefined
-let fakeDialogResult: FakeDialogResult | undefined = undefined;
-
-function getFakedResult() {
-  const result = fakeDialogResult;
-  fakeDialogResult = undefined;
-  return result;
-}
-
-export function setFakeDialogResult(result: FakeDialogResult | undefined) {
-  fakeDialogResult = result;
-  // for convenience, we will fill filePaths with filePath if it is not set
-  if (result?.filePaths === undefined && result?.filePath !== undefined) {
-    result.filePaths = [result.filePath];
-  }
-}
-
 const extension = 'affine';

 function getDefaultDBFileName(name: string, id: string) {
@@ -71,10 +47,55 @@ function getDefaultDBFileName(name: string, id: string) {
   return fileName.replace(/[/\\?%*:|"<>]/g, '-');
 }

+async function resolveExistingPath(path: string) {
+  if (!(await fs.pathExists(path))) {
+    return null;
+  }
+  try {
+    return await fs.realpath(path);
+  } catch {
+    return resolve(path);
+  }
+}
+
+async function isSameFilePath(sourcePath: string, targetPath: string) {
+  if (resolve(sourcePath) === resolve(targetPath)) {
+    return true;
+  }
+  const [resolvedSourcePath, resolvedTargetPath] = await Promise.all([
+    resolveExistingPath(sourcePath),
+    resolveExistingPath(targetPath),
+  ]);
+  return !!resolvedSourcePath && resolvedSourcePath === resolvedTargetPath;
+}
+
+async function normalizeImportDBPath(selectedPath: string) {
+  if (!(await fs.pathExists(selectedPath))) {
+    return null;
+  }
+  const [normalizedPath, workspacesBasePath] = await Promise.all([
+    resolveExistingPath(selectedPath),
+    resolveExistingPath(await getWorkspacesBasePath()),
+  ]);
+  const resolvedSelectedPath = normalizedPath ?? resolve(selectedPath);
+  const resolvedWorkspacesBasePath =
+    workspacesBasePath ?? resolve(await getWorkspacesBasePath());
+  if (isPathInsideBase(resolvedWorkspacesBasePath, resolvedSelectedPath)) {
+    logger.warn('loadDBFile: db file in app data dir');
+    return null;
+  }
+  return resolvedSelectedPath;
+}
+
 /**
  * This function is called when the user clicks the "Save" button in the "Save Workspace" dialog.
  *
- * It will just copy the file to the given path
+ * It will export a compacted database file to the given path
  */
 export async function saveDBFileAs(
   universalId: string,
@@ -89,44 +110,53 @@ export async function saveDBFileAs(
     await pool.connect(universalId, dbPath);
     await pool.checkpoint(universalId); // make sure all changes (WAL) are written to db

-    const fakedResult = getFakedResult();
     if (!dbPath) {
       return {
         error: 'DB_FILE_PATH_INVALID',
       };
     }

-    const ret =
-      fakedResult ??
-      (await mainRPC.showSaveDialog({
-        properties: ['showOverwriteConfirmation'],
-        title: 'Save Workspace',
-        showsTagField: false,
-        buttonLabel: 'Save',
-        filters: [
-          {
-            extensions: [extension],
-            name: '',
-          },
-        ],
-        defaultPath: getDefaultDBFileName(name, id),
-        message: 'Save Workspace as a SQLite Database file',
-      }));
+    const ret = await mainRPC.showSaveDialog({
+      properties: ['showOverwriteConfirmation'],
+      title: 'Save Workspace',
+      showsTagField: false,
+      buttonLabel: 'Save',
+      filters: [
+        {
+          extensions: [extension],
+          name: '',
+        },
+      ],
+      defaultPath: getDefaultDBFileName(name, id),
+      message: 'Save Workspace as a SQLite Database file',
+    });

     const filePath = ret.filePath;
     if (ret.canceled || !filePath) {
-      return {
-        canceled: true,
-      };
+      return { canceled: true };
     }

-    await fs.copyFile(dbPath, filePath);
-    logger.log('saved', filePath);
-    if (!fakedResult) {
-      mainRPC.showItemInFolder(filePath).catch(err => {
-        console.error(err);
-      });
-    }
+    if (await isSameFilePath(dbPath, filePath)) {
+      return { error: 'DB_FILE_PATH_INVALID' };
+    }
+
+    const tempFilePath = `${filePath}.${nanoid(6)}.tmp`;
+    if (await fs.pathExists(tempFilePath)) {
+      await fs.remove(tempFilePath);
+    }
+    try {
+      await pool.vacuumInto(universalId, tempFilePath);
+      await fs.move(tempFilePath, filePath, { overwrite: true });
+    } finally {
+      if (await fs.pathExists(tempFilePath)) {
+        await fs.remove(tempFilePath);
+      }
+    }
+    logger.log('saved', filePath);
+    mainRPC.showItemInFolder(filePath).catch(err => {
+      console.error(err);
+    });

     return { filePath };
   } catch (err) {
     logger.error('saveDBFileAs', err);
@@ -138,15 +168,13 @@ export async function saveDBFileAs(

 export async function selectDBFileLocation(): Promise<SelectDBFileLocationResult> {
   try {
-    const ret =
-      getFakedResult() ??
-      (await mainRPC.showOpenDialog({
-        properties: ['openDirectory'],
-        title: 'Set Workspace Storage Location',
-        buttonLabel: 'Select',
-        defaultPath: await mainRPC.getPath('documents'),
-        message: "Select a location to store the workspace's database file",
-      }));
+    const ret = await mainRPC.showOpenDialog({
+      properties: ['openDirectory'],
+      title: 'Set Workspace Storage Location',
+      buttonLabel: 'Select',
+      defaultPath: await mainRPC.getPath('documents'),
+      message: "Select a location to store the workspace's database file",
+    });

     const dir = ret.filePaths?.[0];
     if (ret.canceled || !dir) {
       return {
@@ -176,43 +204,29 @@ export async function selectDBFileLocation(): Promise<SelectDBFileLocationResult
  * update the local workspace id list and then connect to it.
  *
  */
-export async function loadDBFile(
-  dbFilePath?: string
-): Promise<LoadDBFileResult> {
+export async function loadDBFile(): Promise<LoadDBFileResult> {
   try {
-    const provided =
-      getFakedResult() ??
-      (dbFilePath
-        ? {
-            filePath: dbFilePath,
-            filePaths: [dbFilePath],
-            canceled: false,
-          }
-        : undefined);
-    const ret =
-      provided ??
-      (await mainRPC.showOpenDialog({
-        properties: ['openFile'],
-        title: 'Load Workspace',
-        buttonLabel: 'Load',
-        filters: [
-          {
-            name: 'SQLite Database',
-            // do we want to support other file format?
-            extensions: ['db', 'affine'],
-          },
-        ],
-        message: 'Load Workspace from a AFFiNE file',
-      }));
-    const originalPath = ret.filePaths?.[0];
-    if (ret.canceled || !originalPath) {
+    const ret = await mainRPC.showOpenDialog({
+      properties: ['openFile'],
+      title: 'Load Workspace',
+      buttonLabel: 'Load',
+      filters: [
+        {
+          name: 'SQLite Database',
+          // do we want to support other file format?
+          extensions: ['db', 'affine'],
+        },
+      ],
+      message: 'Load Workspace from a AFFiNE file',
+    });
+    const selectedPath = ret.filePaths?.[0];
+    if (ret.canceled || !selectedPath) {
       logger.info('loadDBFile canceled');
       return { canceled: true };
     }

-    // the imported file should not be in app data dir
-    if (originalPath.startsWith(await getWorkspacesBasePath())) {
-      logger.warn('loadDBFile: db file in app data dir');
+    const originalPath = await normalizeImportDBPath(selectedPath);
+    if (!originalPath) {
       return { error: 'DB_FILE_PATH_INVALID' };
     }
@@ -224,6 +238,10 @@ export async function loadDBFile(
       return await cpV1DBFile(originalPath, workspaceId);
     }

+    if (!(await storage.validateImportSchema())) {
+      return { error: 'DB_FILE_INVALID' };
+    }
+
     // v2 import logic
     const internalFilePath = await getSpaceDBPath(
       'local',
@@ -231,8 +249,8 @@ export async function loadDBFile(
       workspaceId
     );
     await fs.ensureDir(parse(internalFilePath).dir);
-    await fs.copy(originalPath, internalFilePath);
-    logger.info(`loadDBFile, copy: ${originalPath} -> ${internalFilePath}`);
+    await storage.vacuumInto(internalFilePath);
+    logger.info(`loadDBFile, vacuum: ${originalPath} -> ${internalFilePath}`);

     storage = new DocStorage(internalFilePath);
     await storage.setSpaceId(workspaceId);
@@ -260,24 +278,27 @@ async function cpV1DBFile(
     return { error: 'DB_FILE_INVALID' }; // invalid db file
   }

-  // checkout to make sure wal is flushed
   const connection = new SqliteConnection(originalPath);
-  await connection.connect();
-  await connection.checkpoint();
-  await connection.close();
+  try {
+    if (!(await connection.validateImportSchema())) {
+      return { error: 'DB_FILE_INVALID' };
+    }

-  const internalFilePath = await getWorkspaceDBPath('workspace', workspaceId);
-  await fs.ensureDir(await getWorkspacesBasePath());
-  await fs.copy(originalPath, internalFilePath);
-  logger.info(`loadDBFile, copy: ${originalPath} -> ${internalFilePath}`);
+    const internalFilePath = await getWorkspaceDBPath('workspace', workspaceId);
+    await fs.ensureDir(parse(internalFilePath).dir);
+    await connection.vacuumInto(internalFilePath);
+    logger.info(`loadDBFile, vacuum: ${originalPath} -> ${internalFilePath}`);

-  await storeWorkspaceMeta(workspaceId, {
-    id: workspaceId,
-    mainDBPath: internalFilePath,
-  });
+    await storeWorkspaceMeta(workspaceId, {
+      id: workspaceId,
+      mainDBPath: internalFilePath,
+    });

-  return {
-    workspaceId,
-  };
+    return {
+      workspaceId,
+    };
+  } finally {
+    await connection.close();
+  }
 }
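`saveDBFileAs` now exports through a temp file and only then moves it over the destination, so a failed vacuum can never clobber an existing export. A standalone sketch of that write-then-rename shape, with an in-memory `Map` standing in for the real filesystem (the `produce` callback stands in for the SQLite `VACUUM INTO` step):

```typescript
// Write-then-rename sketch: produce the export under a temp name, then move
// it over the destination in one step, so a failed export never clobbers the
// previous file. `files` stands in for a real filesystem (fs.renameSync etc.).
function exportAtomically(
  files: Map<string, string>,
  destPath: string,
  produce: (tmpPath: string) => void
) {
  const tmpPath = `${destPath}.tmp`;
  try {
    produce(tmpPath);
    const data = files.get(tmpPath);
    if (data === undefined) throw new Error('export produced no file');
    files.set(destPath, data); // the "rename": destination replaced in one step
  } finally {
    files.delete(tmpPath); // always clean up the temp file
  }
}
```

On a real filesystem the move is only atomic when source and destination are on the same volume, which is why the diff writes the temp file next to the chosen destination rather than in a system temp directory.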

View File

@@ -1,13 +1,8 @@
-import {
-  loadDBFile,
-  saveDBFileAs,
-  selectDBFileLocation,
-  setFakeDialogResult,
-} from './dialog';
+import { loadDBFile, saveDBFileAs, selectDBFileLocation } from './dialog';

 export const dialogHandlers = {
-  loadDBFile: async (dbFilePath?: string) => {
-    return loadDBFile(dbFilePath);
+  loadDBFile: async () => {
+    return loadDBFile();
   },
   saveDBFileAs: async (universalId: string, name: string) => {
     return saveDBFileAs(universalId, name);
@@ -15,9 +10,4 @@ export const dialogHandlers = {
   selectDBFileLocation: async () => {
     return selectDBFileLocation();
   },
-  setFakeDialogResult: async (
-    result: Parameters<typeof setFakeDialogResult>[0]
-  ) => {
-    return setFakeDialogResult(result);
-  },
 };

View File

@@ -1,5 +1,6 @@
 import { dialogHandlers } from './dialog';
 import { dbEventsV1, dbHandlersV1, nbstoreHandlers } from './nbstore';
+import { previewHandlers } from './preview';
 import { provideExposed } from './provide';
 import { workspaceEvents, workspaceHandlers } from './workspace';
@@ -8,6 +9,7 @@ export const handlers = {
   nbstore: nbstoreHandlers,
   workspace: workspaceHandlers,
   dialog: dialogHandlers,
+  preview: previewHandlers,
 };

 export const events = {

View File

@@ -0,0 +1,69 @@
import fs from 'node:fs';
import path from 'node:path';
import {
type MermaidRenderRequest,
type MermaidRenderResult,
renderMermaidSvg,
renderTypstSvg,
type TypstRenderRequest,
type TypstRenderResult,
} from '@affine/native';
const TYPST_FONT_DIRS_ENV = 'AFFINE_TYPST_FONT_DIRS';
function parseTypstFontDirsFromEnv() {
const value = process.env[TYPST_FONT_DIRS_ENV];
if (!value) {
return [];
}
return value
.split(path.delimiter)
.map(dir => dir.trim())
.filter(Boolean);
}
function getTypstFontDirCandidates() {
const resourcesPath = process.resourcesPath ?? '';
return [
...parseTypstFontDirsFromEnv(),
path.join(resourcesPath, 'fonts'),
path.join(resourcesPath, 'js', 'fonts'),
path.join(resourcesPath, 'app.asar.unpacked', 'fonts'),
path.join(resourcesPath, 'app.asar.unpacked', 'js', 'fonts'),
];
}
function resolveTypstFontDirs() {
return Array.from(
new Set(getTypstFontDirCandidates().map(dir => path.resolve(dir)))
).filter(dir => fs.statSync(dir, { throwIfNoEntry: false })?.isDirectory());
}
function withTypstFontDirs(
request: TypstRenderRequest,
fontDirs: string[]
): TypstRenderRequest {
const nextOptions = request.options ? { ...request.options } : {};
if (!nextOptions.fontDirs?.length) {
nextOptions.fontDirs = fontDirs;
}
return { ...request, options: nextOptions };
}
const typstFontDirs = resolveTypstFontDirs();
export const previewHandlers = {
renderMermaidSvg: async (
request: MermaidRenderRequest
): Promise<MermaidRenderResult> => {
return renderMermaidSvg(request);
},
renderTypstSvg: async (
request: TypstRenderRequest
): Promise<TypstRenderResult> => {
return renderTypstSvg(withTypstFontDirs(request, typstFontDirs));
},
};
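The new preview handler above builds its Typst font search path from an env var plus packaged locations. The core of `parseTypstFontDirsFromEnv` is splitting a PATH-style value on `path.delimiter` (`:` on POSIX, `;` on Windows) while trimming blanks; a standalone sketch that also de-duplicates, as `resolveTypstFontDirs` does afterwards:

```typescript
// Split a PATH-style env value into candidate directories, trimming blanks
// and dropping duplicates while keeping first-seen order.
// `delimiter` mirrors node's path.delimiter (':' on POSIX, ';' on Windows).
function parseDirList(value: string | undefined, delimiter: string): string[] {
  if (!value) return [];
  const seen = new Set<string>();
  for (const dir of value.split(delimiter)) {
    const trimmed = dir.trim();
    if (trimmed) seen.add(trimmed); // Set insertion preserves first-seen order
  }
  return [...seen];
}
```

The real handler goes further and keeps only entries that exist and are directories (`fs.statSync(dir, { throwIfNoEntry: false })`), so a stale env var cannot break rendering.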

View File

@@ -1,13 +1,18 @@
 import path from 'node:path';

-import { DocStorage } from '@affine/native';
+import { DocStorage, ValidationResult } from '@affine/native';
 import {
   parseUniversalId,
   universalId as generateUniversalId,
 } from '@affine/nbstore';
 import fs from 'fs-extra';
+import { nanoid } from 'nanoid';
 import { applyUpdate, Doc as YDoc } from 'yjs';

+import {
+  normalizeWorkspaceIdForPath,
+  resolveExistingPathInBase,
+} from '../../shared/utils';
 import { logger } from '../logger';
 import { getDocStoragePool } from '../nbstore';
 import { ensureSQLiteDisconnected } from '../nbstore/v1/ensure-db';
@@ -18,6 +23,7 @@ import {
   getSpaceBasePath,
   getSpaceDBPath,
   getWorkspaceBasePathV1,
+  getWorkspaceDBPath,
   getWorkspaceMeta,
 } from './meta';
@@ -58,7 +64,7 @@ export async function trashWorkspace(universalId: string) {
   const dbPath = await getSpaceDBPath(peer, type, id);
   const basePath = await getDeletedWorkspacesBasePath();
-  const movedPath = path.join(basePath, `${id}`);
+  const movedPath = path.join(basePath, normalizeWorkspaceIdForPath(id));
   try {
     const storage = new DocStorage(dbPath);
     if (await storage.validate()) {
@@ -258,12 +264,88 @@ export async function getDeletedWorkspaces() {
   };
 }

+async function importLegacyWorkspaceDb(
+  originalPath: string,
+  workspaceId: string
+) {
+  const { SqliteConnection } = await import('@affine/native');
+  const validationResult = await SqliteConnection.validate(originalPath);
+  if (validationResult !== ValidationResult.Valid) {
+    return {};
+  }
+  const connection = new SqliteConnection(originalPath);
+  if (!(await connection.validateImportSchema())) {
+    return {};
+  }
+  const internalFilePath = await getWorkspaceDBPath('workspace', workspaceId);
+  await fs.ensureDir(path.parse(internalFilePath).dir);
+  await connection.vacuumInto(internalFilePath);
+  logger.info(
+    `recoverBackupWorkspace, vacuum: ${originalPath} -> ${internalFilePath}`
+  );
+  await storeWorkspaceMeta(workspaceId, {
+    id: workspaceId,
+    mainDBPath: internalFilePath,
+  });
+  return {
+    workspaceId,
+  };
+}
+
+async function importWorkspaceDb(originalPath: string) {
+  const workspaceId = nanoid(10);
+  let storage = new DocStorage(originalPath);
+  if (!(await storage.validate())) {
+    return await importLegacyWorkspaceDb(originalPath, workspaceId);
+  }
+  if (!(await storage.validateImportSchema())) {
+    return {};
+  }
+  const internalFilePath = await getSpaceDBPath(
+    'local',
+    'workspace',
+    workspaceId
+  );
+  await fs.ensureDir(path.parse(internalFilePath).dir);
+  await storage.vacuumInto(internalFilePath);
+  logger.info(
+    `recoverBackupWorkspace, vacuum: ${originalPath} -> ${internalFilePath}`
+  );
+  storage = new DocStorage(internalFilePath);
+  await storage.setSpaceId(workspaceId);
+  return {
+    workspaceId,
+  };
+}
+
 export async function deleteBackupWorkspace(id: string) {
   const basePath = await getDeletedWorkspacesBasePath();
-  const workspacePath = path.join(basePath, id);
+  const workspacePath = path.join(basePath, normalizeWorkspaceIdForPath(id));
   await fs.rmdir(workspacePath, { recursive: true });
   logger.info(
     'deleteBackupWorkspace',
     `Deleted backup workspace: ${workspacePath}`
   );
 }
+
+export async function recoverBackupWorkspace(id: string) {
+  const basePath = await getDeletedWorkspacesBasePath();
+  const workspacePath = path.join(basePath, normalizeWorkspaceIdForPath(id));
+  const dbPath = await resolveExistingPathInBase(
+    basePath,
+    path.join(workspacePath, 'storage.db'),
+    { label: 'backup workspace filepath' }
+  );
+  return await importWorkspaceDb(dbPath);
+}

View File

@@ -4,6 +4,7 @@ import {
   deleteWorkspace,
   getDeletedWorkspaces,
   listLocalWorkspaceIds,
+  recoverBackupWorkspace,
   trashWorkspace,
 } from './handlers';
@@ -19,5 +20,6 @@ export const workspaceHandlers = {
     return getDeletedWorkspaces();
   },
   deleteBackupWorkspace: async (id: string) => deleteBackupWorkspace(id),
+  recoverBackupWorkspace: async (id: string) => recoverBackupWorkspace(id),
   listLocalWorkspaceIds: async () => listLocalWorkspaceIds(),
 };

View File

@@ -2,7 +2,7 @@ import path from 'node:path';

 import { type SpaceType } from '@affine/nbstore';

-import { isWindows } from '../../shared/utils';
+import { normalizeWorkspaceIdForPath } from '../../shared/utils';
 import { mainRPC } from '../main-rpc';
 import type { WorkspaceMeta } from '../type';
@@ -24,10 +24,11 @@ export async function getWorkspaceBasePathV1(
   spaceType: SpaceType,
   workspaceId: string
 ) {
+  const safeWorkspaceId = normalizeWorkspaceIdForPath(workspaceId);
   return path.join(
     await getAppDataPath(),
     spaceType === 'userspace' ? 'userspaces' : 'workspaces',
-    isWindows() ? workspaceId.replace(':', '_') : workspaceId
+    safeWorkspaceId
   );
 }
@@ -52,10 +53,11 @@ export async function getSpaceDBPath(
   spaceType: SpaceType,
   id: string
 ) {
+  const safeId = normalizeWorkspaceIdForPath(id);
   return path.join(
     await getSpaceBasePath(spaceType),
     escapeFilename(peer),
-    id,
+    safeId,
     'storage.db'
   );
 }
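The diff above routes every workspace id through `normalizeWorkspaceIdForPath` before it is used as a path segment. Its implementation lives in `shared/utils` and is not shown in this diff, but the intent is that an id can never smuggle path separators or traversal segments into the storage layout. One plausible sketch of such a helper (hypothetical, not the real implementation):

```typescript
// Hypothetical sketch: the real normalizeWorkspaceIdForPath is in
// shared/utils and is not shown in this diff. The goal is that an id can
// never introduce path separators or traversal segments.
function normalizeWorkspaceIdForPathSketch(id: string): string {
  const safe = id.replace(/[/\\:]/g, '_'); // neutralize separators and ':'
  if (safe === '.' || safe === '..' || safe.length === 0) {
    throw new Error(`invalid workspace id: ${id}`);
  }
  return safe;
}
```

Replacing `:` everywhere (rather than only on Windows, as the old `isWindows()` branch did) also keeps the on-disk layout identical across platforms.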

View File

@@ -67,7 +67,7 @@ export function createApplicationMenu() {
         click: async () => {
           await initAndShowMainWindow();
           // fixme: if the window is just created, the new page action will not be triggered
-          applicationMenuSubjects.newPageAction$.next('page');
+          applicationMenuSubjects.newPageAction$.next('default');
         },
       },
     ],

View File

@@ -1,5 +1,5 @@
 import type { MainEventRegister } from '../type';
-import { applicationMenuSubjects } from './subject';
+import { applicationMenuSubjects, type NewPageAction } from './subject';

 export * from './create';
 export * from './subject';
@@ -11,7 +11,7 @@ export const applicationMenuEvents = {
   /**
    * File -> New Doc
    */
-  onNewPageAction: (fn: (type: 'page' | 'edgeless') => void) => {
+  onNewPageAction: (fn: (type: NewPageAction) => void) => {
     const sub = applicationMenuSubjects.newPageAction$.subscribe(fn);
     return () => {
       sub.unsubscribe();

View File

@@ -1,7 +1,9 @@
 import { Subject } from 'rxjs';
+export type NewPageAction = 'page' | 'edgeless' | 'default';
 export const applicationMenuSubjects = {
-  newPageAction$: new Subject<'page' | 'edgeless'>(),
+  newPageAction$: new Subject<NewPageAction>(),
   openJournal$: new Subject<void>(),
   openInSettingModal$: new Subject<{
     activeTab: string;

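A subscriber of the widened union now has a third case to handle. A dependency-free sketch of how a renderer-side consumer might resolve the new `'default'` variant (the "use the user's preferred mode" semantics assumed here are not spelled out in the diff):

```typescript
// Sketch of a consumer of the widened NewPageAction union. The mapping of
// 'default' to a user preference is an assumption for illustration.
export type NewPageAction = 'page' | 'edgeless' | 'default';

export function resolveEditorMode(
  action: NewPageAction,
  preferredMode: 'page' | 'edgeless' = 'page'
): 'page' | 'edgeless' {
  switch (action) {
    case 'page':
    case 'edgeless':
      return action;
    case 'default':
      // fall back to the user's preferred editor mode
      return preferredMode;
  }
}
```

Keeping the switch exhaustive means the compiler flags every consumer that has not yet handled `'default'`.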
View File

@@ -9,6 +9,7 @@ import { beforeAppQuit } from './cleanup';
 import { logger } from './logger';
 import { powerEvents } from './power';
 import { recordingEvents } from './recording';
+import { checkSource } from './security-restrictions';
 import { sharedStorageEvents } from './shared-storage';
 import { uiEvents } from './ui/events';
 import { updaterEvents } from './updater/event';
@@ -70,7 +71,7 @@ export function registerEvents() {
     action: 'subscribe' | 'unsubscribe',
     channel: string
   ) => {
-    if (typeof channel !== 'string') return;
+    if (!checkSource(event) || typeof channel !== 'string') return;
     if (action === 'subscribe') {
       addSubscription(event.sender, channel);
       if (channel === 'power:power-source') {

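Both the event and handler registries now gate every IPC message on `checkSource` before acting on it. The function's body is defined in `security-restrictions` and is not shown in this diff; a hedged sketch of the usual Electron pattern it likely follows (validate the sender frame's URL against a trusted allow-list; the prefix list below is illustrative, not AFFiNE's actual configuration):

```typescript
// Hypothetical sketch of checkSource. Assumed behavior: accept an IPC event
// only when its sender frame belongs to a trusted origin, so an untrusted
// iframe or compromised webview cannot invoke privileged main-process APIs.
type IpcEventLike = { senderFrame?: { url: string } | null };

// Illustrative allow-list; the real values are an assumption.
const trustedPrefixes = ['assets://app', 'file://'];

export function checkSource(event: IpcEventLike): boolean {
  const url = event.senderFrame?.url;
  // senderFrame can be null (e.g. the frame was destroyed); reject then.
  if (!url) return false;
  return trustedPrefixes.some(prefix => url.startsWith(prefix));
}
```

Returning `false` on a missing sender frame fails closed, which is the safer default for a security check.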
View File

@@ -7,6 +7,7 @@ import { configStorageHandlers } from './config-storage';
 import { findInPageHandlers } from './find-in-page';
 import { getLogFilePath, logger, revealLogFile } from './logger';
 import { recordingHandlers } from './recording';
+import { checkSource } from './security-restrictions';
 import { sharedStorageHandlers } from './shared-storage';
 import { uiHandlers } from './ui/handlers';
 import { updaterHandlers } from './updater';
@@ -49,7 +50,7 @@ export const registerHandlers = () => {
     ...args: any[]
   ) => {
     // args[0] is the `{namespace:key}`
-    if (typeof args[0] !== 'string') {
+    if (!checkSource(e) || typeof args[0] !== 'string') {
       logger.error('invalid ipc message', args);
       return;
     }
@@ -97,6 +98,8 @@ export const registerHandlers = () => {
   });
   ipcMain.on(AFFINE_API_CHANNEL_NAME, (e, ...args: any[]) => {
+    if (!checkSource(e)) return;
     handleIpcMessage(e, ...args)
       .then(ret => {
         e.returnValue = ret;

View File

@@ -1,5 +1,3 @@
-import './security-restrictions';
-
 import path from 'node:path';
 import * as Sentry from '@sentry/electron/main';
@@ -15,6 +13,7 @@ import { registerHandlers } from './handlers';
 import { logger } from './logger';
 import { registerProtocol } from './protocol';
 import { setupRecordingFeature } from './recording/feature';
+import { registerSecurityRestrictions } from './security-restrictions';
 import { setupTrayState } from './tray';
 import { registerUpdater } from './updater';
 import { launch } from './windows-manager/launcher';
@@ -105,6 +104,7 @@ app.on('activate', () => {
 });
 setupDeepLink(app);
+registerSecurityRestrictions();
 /**
  * Create app window when background process will be ready

View File

@@ -4,25 +4,47 @@ import { pathToFileURL } from 'node:url';
 import { app, net, protocol, session } from 'electron';
 import cookieParser from 'set-cookie-parser';
-import { isWindows, resourcesPath } from '../shared/utils';
+import { anotherHost, mainHost } from '../shared/internal-origin';
+import {
+  isPathInsideBase,
+  isWindows,
+  resolveExistingPathInBase,
+  resolvePathInBase,
+  resourcesPath,
+} from '../shared/utils';
 import { buildType, isDev } from './config';
-import { anotherHost, mainHost } from './constants';
 import { logger } from './logger';
 const webStaticDir = join(resourcesPath, 'web-static');
 const devServerBase = process.env.DEV_SERVER_URL;
 const localWhiteListDirs = [
-  path.resolve(app.getPath('sessionData')).toLowerCase(),
-  path.resolve(app.getPath('temp')).toLowerCase(),
+  path.resolve(app.getPath('sessionData')),
+  path.resolve(app.getPath('temp')),
 ];
 function isPathInWhiteList(filepath: string) {
-  const lowerFilePath = filepath.toLowerCase();
   return localWhiteListDirs.some(whitelistDir =>
-    lowerFilePath.startsWith(whitelistDir)
+    isPathInsideBase(whitelistDir, filepath, {
+      caseInsensitive: isWindows(),
+    })
   );
 }
+async function resolveWhitelistedLocalPath(filepath: string) {
+  for (const whitelistDir of localWhiteListDirs) {
+    try {
+      return await resolveExistingPathInBase(whitelistDir, filepath, {
+        caseInsensitive: isWindows(),
+        label: 'filepath',
+      });
+    } catch {
+      continue;
+    }
+  }
+  throw new Error('Invalid filepath');
+}
 const apiBaseByBuildType: Record<typeof buildType, string> = {
   stable: 'https://app.affine.pro',
   beta: 'https://insider.affine.pro',
@@ -94,15 +116,14 @@ async function handleFileRequest(request: Request) {
   // for relative path, load the file in resources
   if (!isAbsolutePath) {
     if (urlObject.pathname.split('/').at(-1)?.includes('.')) {
-      // Sanitize pathname to prevent path traversal attacks
-      const decodedPath = decodeURIComponent(urlObject.pathname);
-      const normalizedPath = join(webStaticDir, decodedPath).normalize();
-      if (!normalizedPath.startsWith(webStaticDir)) {
-        // Attempted path traversal - reject by using empty path
-        filepath = join(webStaticDir, '');
-      } else {
-        filepath = normalizedPath;
-      }
+      const decodedPath = decodeURIComponent(urlObject.pathname).replace(
+        /^\/+/,
+        ''
+      );
+      filepath = resolvePathInBase(webStaticDir, decodedPath, {
+        caseInsensitive: isWindows(),
+        label: 'filepath',
+      });
     } else {
       // else, fallback to load the index.html instead
       filepath = join(webStaticDir, 'index.html');
@@ -113,10 +134,10 @@ async function handleFileRequest(request: Request) {
     if (isWindows()) {
       filepath = path.resolve(filepath.replace(/^\//, ''));
     }
-    // security check if the filepath is within app.getPath('sessionData')
     if (urlObject.host !== 'local-file' || !isPathInWhiteList(filepath)) {
       throw new Error('Invalid filepath');
     }
+    filepath = await resolveWhitelistedLocalPath(filepath);
   }
   return net.fetch(pathToFileURL(filepath).toString(), clonedRequest);
 }

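The containment checks above move from `startsWith` to `isPathInsideBase`, whose body is in `shared/utils` and outside this diff. A plain `startsWith` check has a prefix-collision bug: `/data/app-evil` starts with `/data/app`. A sketch of the assumed replacement, using `path.relative` to avoid that and lower-casing both sides only when the platform's file system is case-insensitive:

```typescript
import * as path from 'node:path';

// Hypothetical sketch of isPathInsideBase; the real helper is imported from
// shared/utils and is not shown in the diff. Returns true only when
// `candidate` resolves to a location at or below `base`.
export function isPathInsideBase(
  base: string,
  candidate: string,
  opts: { caseInsensitive?: boolean } = {}
): boolean {
  let resolvedBase = path.resolve(base);
  let resolvedCandidate = path.resolve(candidate);
  if (opts.caseInsensitive) {
    resolvedBase = resolvedBase.toLowerCase();
    resolvedCandidate = resolvedCandidate.toLowerCase();
  }
  // path.relative yields a '..'-prefixed path when candidate escapes base
  const rel = path.relative(resolvedBase, resolvedCandidate);
  return rel === '' || (!rel.startsWith('..') && !path.isAbsolute(rel));
}
```

This is also why the whitelist dirs no longer need to be pre-lowercased: the comparison itself decides, per platform, whether case matters.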
View File

@@ -1,11 +1,10 @@
 /* oxlint-disable no-var-requires */
 import { execSync } from 'node:child_process';
-import { createHash } from 'node:crypto';
 import fsp from 'node:fs/promises';
 import path from 'node:path';
 // Should not load @affine/native for unsupported platforms
-import type { ShareableContent as ShareableContentType } from '@affine/native';
+import type * as NativeModuleType from '@affine/native';
 import { app, systemPreferences } from 'electron';
 import fs from 'fs-extra';
 import { debounce } from 'lodash-es';
@@ -20,7 +19,12 @@ import {
 } from 'rxjs';
 import { filter, map, shareReplay } from 'rxjs/operators';
-import { isMacOS, isWindows, shallowEqual } from '../../shared/utils';
+import {
+  isMacOS,
+  isWindows,
+  resolveExistingPathInBase,
+  shallowEqual,
+} from '../../shared/utils';
 import { beforeAppQuit } from '../cleanup';
 import { logger } from '../logger';
 import {
@@ -32,12 +36,7 @@ import { getMainWindow } from '../windows-manager';
 import { popupManager } from '../windows-manager/popup';
 import { isAppNameAllowed } from './allow-list';
 import { recordingStateMachine } from './state-machine';
-import type {
-  AppGroupInfo,
-  Recording,
-  RecordingStatus,
-  TappableAppInfo,
-} from './types';
+import type { AppGroupInfo, RecordingStatus, TappableAppInfo } from './types';
 export const MeetingsSettingsState = {
   $: globalStateStorage.watch<MeetingSettingsSchema>(MeetingSettingsKey).pipe(
@@ -56,7 +55,12 @@
   },
 };
+type Subscriber = {
+  unsubscribe: () => void;
+};
 const subscribers: Subscriber[] = [];
+let appStateSubscribers: Subscriber[] = [];
 // recordings are saved in the app data directory
 // may need a way to clean up old recordings
@@ -65,10 +69,29 @@ export const SAVED_RECORDINGS_DIR = path.join(
   'recordings'
 );
+type NativeModule = typeof NativeModuleType;
+type ShareableContentType = InstanceType<NativeModule['ShareableContent']>;
+type ShareableContentStatic = NativeModule['ShareableContent'];
 let shareableContent: ShareableContentType | null = null;
+function getNativeModule(): NativeModule {
+  return require('@affine/native') as NativeModule;
+}
 function cleanup() {
+  const nativeId = recordingStateMachine.status?.nativeId;
+  if (nativeId) cleanupAbandonedNativeRecording(nativeId);
+  recordingStatus$.next(null);
   shareableContent = null;
+  appStateSubscribers.forEach(subscriber => {
+    try {
+      subscriber.unsubscribe();
+    } catch {
+      // ignore unsubscribe error
+    }
+  });
+  appStateSubscribers = [];
   subscribers.forEach(subscriber => {
     try {
       subscriber.unsubscribe();
@@ -76,6 +99,9 @@ function cleanup() {
     } catch {
       // ignore unsubscribe error
     }
   });
+  subscribers.length = 0;
+  applications$.next([]);
+  appGroups$.next([]);
 }
 beforeAppQuit(() => {
@@ -87,18 +113,21 @@ export const appGroups$ = new BehaviorSubject<AppGroupInfo[]>([]);
 export const updateApplicationsPing$ = new Subject<number>();
-// recording id -> recording
-// recordings will be saved in memory before consumed and created as an audio block to user's doc
-const recordings = new Map<number, Recording>();
-// there should be only one active recording at a time
-// We'll now use recordingStateMachine.status$ instead of our own BehaviorSubject
+// There should be only one active recording at a time; state is managed by the state machine
 export const recordingStatus$ = recordingStateMachine.status$;
+function isRecordingSettled(
+  status: RecordingStatus | null | undefined
+): status is RecordingStatus & {
+  status: 'ready';
+  blockCreationStatus: 'success' | 'failed';
+} {
+  return status?.status === 'ready' && status.blockCreationStatus !== undefined;
+}
 function createAppGroup(processGroupId: number): AppGroupInfo | undefined {
   // MUST require dynamically to avoid loading @affine/native for unsupported platforms
-  const SC: typeof ShareableContentType =
-    require('@affine/native').ShareableContent;
+  const SC: ShareableContentStatic = getNativeModule().ShareableContent;
   const groupProcess = SC?.applicationWithProcessId(processGroupId);
   if (!groupProcess) {
     return;
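The `isRecordingSettled` helper introduced above is a user-defined TypeScript type guard: after it returns `true`, the compiler narrows `status` to the settled shape, so `blockCreationStatus` can be read without optional chaining. A self-contained version of the same pattern (only the fields relevant here are modeled; the full `RecordingStatus` shape is assumed):

```typescript
// Standalone sketch of the type-guard pattern used by isRecordingSettled.
// The RecordingStatus shape below is a simplified assumption.
type RecordingStatus = {
  id: number;
  status: 'new' | 'recording' | 'processing' | 'ready';
  blockCreationStatus?: 'success' | 'failed';
};

function isRecordingSettled(
  status: RecordingStatus | null | undefined
): status is RecordingStatus & {
  status: 'ready';
  blockCreationStatus: 'success' | 'failed';
} {
  // settled = native recording finished AND block creation has an outcome
  return status?.status === 'ready' && status.blockCreationStatus !== undefined;
}
```

Collapsing the old `'create-block-success' | 'create-block-failed'` statuses into `'ready'` plus a `blockCreationStatus` field means one guard replaces the repeated two-way comparisons seen in the old code.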
@@ -174,9 +203,13 @@ function setupNewRunningAppGroup() {
   });
   const debounceStartRecording = debounce((appGroup: AppGroupInfo) => {
-    // check if the app is running again
-    if (appGroup.isRunning) {
-      startRecording(appGroup);
+    const currentGroup = appGroups$.value.find(
+      group => group.processGroupId === appGroup.processGroupId
+    );
+    if (currentGroup?.isRunning) {
+      startRecording(currentGroup).catch(err => {
+        logger.error('failed to start recording', err);
+      });
     }
   }, 1000);
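The hunk above fixes a stale-closure bug: the debounced callback fires a second later, so the `appGroup` snapshot it captured may no longer reflect reality (the app may have quit in the meantime). The fix is to re-resolve current state by id at fire time. A dependency-free sketch of the two variants (names are illustrative):

```typescript
// Sketch of the stale-closure pattern fixed above: re-read current state
// when a deferred callback finally fires, instead of trusting the snapshot
// captured at scheduling time.
type Group = { id: number; isRunning: boolean };

const groups: Group[] = [{ id: 1, isRunning: true }];
let started: number[] = [];

function startRecording(group: Group) {
  started.push(group.id);
}

// BAD: trusts the captured snapshot; the app may have quit meanwhile.
function fireStale(snapshot: Group) {
  if (snapshot.isRunning) startRecording(snapshot);
}

// GOOD: re-resolve the group by id against current state at fire time.
function fireFresh(snapshot: Group) {
  const current = groups.find(g => g.id === snapshot.id);
  if (current?.isRunning) startRecording(current);
}
```

The same change also stops swallowing the promise from the now-async `startRecording` by attaching a `.catch`.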
@@ -200,8 +233,7 @@
     if (
       !recordingStatus ||
       recordingStatus.status === 'new' ||
-      recordingStatus.status === 'create-block-success' ||
-      recordingStatus.status === 'create-block-failed'
+      isRecordingSettled(recordingStatus)
     ) {
       if (MeetingsSettingsState.value.recordingMode === 'prompt') {
         newRecording(currentGroup);
@@ -226,7 +258,7 @@
         removeRecording(recordingStatus.id);
       }
-      // if the recording is stopped and we are recording it,
+      // if the watched app stops while we are recording it,
       // we should stop the recording
       if (
         recordingStatus?.status === 'recording' &&
@@ -242,100 +274,28 @@
   );
 }
-function getSanitizedAppId(bundleIdentifier?: string) {
-  if (!bundleIdentifier) {
-    return 'unknown';
-  }
-  return isWindows()
-    ? createHash('sha256')
-        .update(bundleIdentifier)
-        .digest('hex')
-        .substring(0, 8)
-    : bundleIdentifier;
-}
-export function createRecording(status: RecordingStatus) {
-  let recording = recordings.get(status.id);
-  if (recording) {
-    return recording;
-  }
-  const appId = getSanitizedAppId(status.appGroup?.bundleIdentifier);
-  const bufferedFilePath = path.join(
-    SAVED_RECORDINGS_DIR,
-    `${appId}-${status.id}-${status.startTime}.raw`
-  );
-  fs.ensureDirSync(SAVED_RECORDINGS_DIR);
-  const file = fs.createWriteStream(bufferedFilePath);
-  function tapAudioSamples(err: Error | null, samples: Float32Array) {
-    const recordingStatus = recordingStatus$.getValue();
-    if (
-      !recordingStatus ||
-      recordingStatus.id !== status.id ||
-      recordingStatus.status === 'paused'
-    ) {
-      return;
-    }
-    if (err) {
-      logger.error('failed to get audio samples', err);
-    } else {
-      // Writing raw Float32Array samples directly to file
-      // For stereo audio, samples are interleaved [L,R,L,R,...]
-      file.write(Buffer.from(samples.buffer));
-    }
-  }
-  // MUST require dynamically to avoid loading @affine/native for unsupported platforms
-  const SC: typeof ShareableContentType =
-    require('@affine/native').ShareableContent;
-  const stream = status.app
-    ? SC.tapAudio(status.app.processId, tapAudioSamples)
-    : SC.tapGlobalAudio(null, tapAudioSamples);
-  recording = {
-    id: status.id,
-    startTime: status.startTime,
-    app: status.app,
-    appGroup: status.appGroup,
-    file,
-    session: stream,
-  };
-  recordings.set(status.id, recording);
-  return recording;
-}
 export async function getRecording(id: number) {
-  const recording = recordings.get(id);
-  if (!recording) {
+  const recording = recordingStateMachine.status;
+  if (!recording || recording.id !== id) {
     logger.error(`Recording ${id} not found`);
     return;
   }
-  const rawFilePath = String(recording.file.path);
   return {
     id,
     appGroup: recording.appGroup,
     app: recording.app,
     startTime: recording.startTime,
-    filepath: rawFilePath,
-    sampleRate: recording.session.sampleRate,
-    numberOfChannels: recording.session.channels,
+    filepath: recording.filepath,
+    sampleRate: recording.sampleRate,
+    numberOfChannels: recording.numberOfChannels,
   };
 }
 // recording popup status
-// new: recording is started, popup is shown
-// recording: recording is started, popup is shown
-// stopped: recording is stopped, popup showing processing status
-// create-block-success: recording is ready, show "open app" button
-// create-block-failed: recording is failed, show "failed to save" button
+// new: waiting for user confirmation
+// recording: native recording is ongoing
+// processing: native stop or renderer import/transcription is ongoing
+// ready + blockCreationStatus: post-processing finished
 // null: hide popup
 function setupRecordingListeners() {
   subscribers.push(
@@ -350,36 +310,21 @@
       });
     }
-    if (status?.status === 'recording') {
-      let recording = recordings.get(status.id);
-      // create a recording if not exists
-      if (!recording) {
-        recording = createRecording(status);
-      }
-    } else if (status?.status === 'stopped') {
-      const recording = recordings.get(status.id);
-      if (recording) {
-        recording.session.stop();
-      }
-    } else if (
-      status?.status === 'create-block-success' ||
-      status?.status === 'create-block-failed'
-    ) {
+    if (isRecordingSettled(status)) {
       // show the popup for 10s
       setTimeout(
         () => {
-          // check again if current status is still ready
+          const currentStatus = recordingStatus$.value;
           if (
-            (recordingStatus$.value?.status === 'create-block-success' ||
-              recordingStatus$.value?.status === 'create-block-failed') &&
-            recordingStatus$.value.id === status.id
+            isRecordingSettled(currentStatus) &&
+            currentStatus.id === status.id
           ) {
             popup.hide().catch(err => {
               logger.error('failed to hide recording popup', err);
             });
           }
         },
-        status?.status === 'create-block-failed' ? 30_000 : 10_000
+        status.blockCreationStatus === 'failed' ? 30_000 : 10_000
       );
     } else if (!status) {
       // status is removed, we should hide the popup
@@ -400,9 +345,7 @@ function getAllApps(): TappableAppInfo[] {
   }
   // MUST require dynamically to avoid loading @affine/native for unsupported platforms
-  const { ShareableContent } = require('@affine/native') as {
-    ShareableContent: typeof ShareableContentType;
-  };
+  const { ShareableContent } = getNativeModule();
   const apps = ShareableContent.applications().map(app => {
     try {
@@ -433,12 +376,8 @@
   return filteredApps;
 }
-type Subscriber = {
-  unsubscribe: () => void;
-};
 function setupMediaListeners() {
-  const ShareableContent = require('@affine/native').ShareableContent;
+  const ShareableContent = getNativeModule().ShareableContent;
   applications$.next(getAllApps());
   subscribers.push(
     interval(3000).subscribe(() => {
@@ -454,8 +393,6 @@
     })
   );
-  let appStateSubscribers: Subscriber[] = [];
   subscribers.push(
     applications$.subscribe(apps => {
       appStateSubscribers.forEach(subscriber => {
@@ -484,15 +421,6 @@
       });
       appStateSubscribers = _appStateSubscribers;
-      return () => {
-        _appStateSubscribers.forEach(subscriber => {
-          try {
-            subscriber.unsubscribe();
-          } catch {
-            // ignore unsubscribe error
-          }
-        });
-      };
     })
   );
 }
@@ -502,7 +430,7 @@ function askForScreenRecordingPermission() {
     return false;
   }
   try {
-    const ShareableContent = require('@affine/native').ShareableContent;
+    const ShareableContent = getNativeModule().ShareableContent;
     // this will trigger the permission prompt
     new ShareableContent();
     return true;
@@ -519,7 +447,7 @@ export function setupRecordingFeature() {
   }
   try {
-    const ShareableContent = require('@affine/native').ShareableContent;
+    const ShareableContent = getNativeModule().ShareableContent;
     if (!shareableContent) {
       shareableContent = new ShareableContent();
       setupMediaListeners();
@@ -537,7 +465,6 @@
 }
 export function disableRecordingFeature() {
-  recordingStatus$.next(null);
   cleanup();
 }
@@ -558,222 +485,175 @@ export function newRecording(
   });
 }
-export function startRecording(
+export async function startRecording(
   appGroup?: AppGroupInfo | number
-): RecordingStatus | null {
-  const state = recordingStateMachine.dispatch(
-    {
-      type: 'START_RECORDING',
-      appGroup: normalizeAppGroupInfo(appGroup),
-    },
-    false
-  );
-  if (state?.status === 'recording') {
-    createRecording(state);
-  }
-  recordingStateMachine.status$.next(state);
-  return state;
-}
-export function pauseRecording(id: number) {
-  return recordingStateMachine.dispatch({ type: 'PAUSE_RECORDING', id });
-}
-export function resumeRecording(id: number) {
-  return recordingStateMachine.dispatch({ type: 'RESUME_RECORDING', id });
-}
+): Promise<RecordingStatus | null> {
+  const previousState = recordingStateMachine.status;
+  const state = recordingStateMachine.dispatch({
+    type: 'START_RECORDING',
+    appGroup: normalizeAppGroupInfo(appGroup),
+  });
+  if (!state || state.status !== 'recording' || state === previousState) {
+    return state;
+  }
+  let nativeId: string | undefined;
+  try {
+    fs.ensureDirSync(SAVED_RECORDINGS_DIR);
+    const meta = getNativeModule().startRecording({
+      appProcessId: state.app?.processId,
+      outputDir: SAVED_RECORDINGS_DIR,
+      format: 'opus',
+      id: String(state.id),
+    });
+    nativeId = meta.id;
+    const filepath = await assertRecordingFilepath(meta.filepath);
+    const nextState = recordingStateMachine.dispatch({
+      type: 'ATTACH_NATIVE_RECORDING',
+      id: state.id,
+      nativeId: meta.id,
+      startTime: meta.startedAt ?? state.startTime,
+      filepath,
+      sampleRate: meta.sampleRate,
+      numberOfChannels: meta.channels,
+    });
+    if (!nextState || nextState.nativeId !== meta.id) {
+      throw new Error('Failed to attach native recording metadata');
+    }
+    return nextState;
+  } catch (error) {
+    if (nativeId) {
+      cleanupAbandonedNativeRecording(nativeId);
+    }
+    logger.error('failed to start recording', error);
+    return setRecordingBlockCreationStatus(
+      state.id,
+      'failed',
+      error instanceof Error ? error.message : undefined
+    );
+  }
+}
 export async function stopRecording(id: number) {
-  const recording = recordings.get(id);
-  if (!recording) {
+  const recording = recordingStateMachine.status;
+  if (!recording || recording.id !== id) {
     logger.error(`stopRecording: Recording ${id} not found`);
     return;
   }
-  if (!recording.file.path) {
-    logger.error(`Recording ${id} has no file path`);
+  if (!recording.nativeId) {
+    logger.error(`stopRecording: Recording ${id} missing native id`);
     return;
   }
-  const { file, session: stream } = recording;
-  // First stop the audio stream to prevent more data coming in
-  try {
-    stream.stop();
-  } catch (err) {
-    logger.error('Failed to stop audio stream', err);
-  }
-  // End the file with a timeout
-  file.end();
+  const processingState = recordingStateMachine.dispatch({
+    type: 'STOP_RECORDING',
+    id,
+  });
+  if (
+    !processingState ||
+    processingState.id !== id ||
+    processingState.status !== 'processing'
+  ) {
+    return serializeRecordingStatus(processingState ?? recording);
+  }
   try {
-    await Promise.race([
-      new Promise<void>((resolve, reject) => {
-        file.on('finish', () => {
-          // check if the file is empty
-          const stats = fs.statSync(file.path);
-          if (stats.size === 0) {
-            reject(new Error('Recording is empty'));
-            return;
-          }
-          resolve();
-        });
-        file.on('error', err => {
-          reject(err);
-        });
-      }),
-      new Promise<never>((_, reject) =>
-        setTimeout(() => reject(new Error('File writing timeout')), 10000)
-      ),
-    ]);
-    const recordingStatus = recordingStateMachine.dispatch({
-      type: 'STOP_RECORDING',
+    const artifact = getNativeModule().stopRecording(recording.nativeId);
+    const filepath = await assertRecordingFilepath(artifact.filepath);
+    const readyStatus = recordingStateMachine.dispatch({
+      type: 'ATTACH_RECORDING_ARTIFACT',
       id,
+      filepath,
+      sampleRate: artifact.sampleRate,
+      numberOfChannels: artifact.channels,
     });
-    if (!recordingStatus) {
-      logger.error('No recording status to stop');
+    if (!readyStatus) {
+      logger.error('No recording status to save');
       return;
     }
-    return serializeRecordingStatus(recordingStatus);
+    getMainWindow()
+      .then(mainWindow => {
+        if (mainWindow) {
+          mainWindow.show();
+        }
+      })
+      .catch(err => {
+        logger.error('failed to bring up the window', err);
+      });
+    return serializeRecordingStatus(readyStatus);
   } catch (error: unknown) {
     logger.error('Failed to stop recording', error);
-    const recordingStatus = recordingStateMachine.dispatch({
-      type: 'CREATE_BLOCK_FAILED',
+    const recordingStatus = await setRecordingBlockCreationStatus(
       id,
-      error: error instanceof Error ? error : undefined,
-    });
+      'failed',
+      error instanceof Error ? error.message : undefined
+    );
     if (!recordingStatus) {
       logger.error('No recording status to stop');
       return;
     }
     return serializeRecordingStatus(recordingStatus);
-  } finally {
-    // Clean up the file stream if it's still open
-    if (!file.closed) {
-      file.destroy();
-    }
   }
 }
-export async function getRawAudioBuffers(
-  id: number,
-  cursor?: number
-): Promise<{
-  buffer: Buffer;
-  nextCursor: number;
-}> {
-  const recording = recordings.get(id);
-  if (!recording) {
-    throw new Error(`getRawAudioBuffers: Recording ${id} not found`);
-  }
-  const start = cursor ?? 0;
-  const file = await fsp.open(recording.file.path, 'r');
-  const stats = await file.stat();
-  const buffer = Buffer.alloc(stats.size - start);
-  const result = await file.read(buffer, 0, buffer.length, start);
-  await file.close();
-  return {
-    buffer,
-    nextCursor: start + result.bytesRead,
-  };
-}
-function assertRecordingFilepath(filepath: string) {
-  const normalizedPath = path.normalize(filepath);
-  const normalizedBase = path.normalize(SAVED_RECORDINGS_DIR + path.sep);
-  if (!normalizedPath.toLowerCase().startsWith(normalizedBase.toLowerCase())) {
-    throw new Error('Invalid recording filepath');
-  }
-  return normalizedPath;
+async function assertRecordingFilepath(filepath: string) {
+  return await resolveExistingPathInBase(SAVED_RECORDINGS_DIR, filepath, {
+    caseInsensitive: isWindows(),
+    label: 'recording filepath',
+  });
 }
 export async function readRecordingFile(filepath: string) {
-  const normalizedPath = assertRecordingFilepath(filepath);
+  const normalizedPath = await assertRecordingFilepath(filepath);
   return fsp.readFile(normalizedPath);
 }
-export async function readyRecording(id: number, buffer: Buffer) {
-  logger.info('readyRecording', id);
-  const recordingStatus = recordingStatus$.value;
-  const recording = recordings.get(id);
-  if (!recordingStatus || recordingStatus.id !== id || !recording) {
-    logger.error(`readyRecording: Recording ${id} not found`);
-    return;
-  }
-  const rawFilePath = String(recording.file.path);
-  const filepath = rawFilePath.replace('.raw', '.opus');
-  if (!filepath) {
-    logger.error(`readyRecording: Recording ${id} has no filepath`);
-    return;
-  }
-  await fs.writeFile(filepath, buffer);
-  // can safely remove the raw file now
-  logger.info('remove raw file', rawFilePath);
-  if (rawFilePath) {
-    try {
-      await fs.unlink(rawFilePath);
-    } catch (err) {
-      logger.error('failed to remove raw file', err);
-    }
-  }
-  // Update the status through the state machine
-  recordingStateMachine.dispatch({
-    type: 'SAVE_RECORDING',
-    id,
-    filepath,
-  });
-  // bring up the window
-  getMainWindow()
-    .then(mainWindow => {
-      if (mainWindow) {
-        mainWindow.show();
-      }
-    })
-    .catch(err => {
-      logger.error('failed to bring up the window', err);
-    });
+function cleanupAbandonedNativeRecording(nativeId: string) {
+  try {
+    const artifact = getNativeModule().stopRecording(nativeId);
+    void assertRecordingFilepath(artifact.filepath)
+      .then(filepath => {
+        fs.removeSync(filepath);
+      })
+      .catch(error => {
+        logger.error('failed to validate abandoned recording filepath', error);
+      });
+  } catch (error) {
+    logger.error('failed to cleanup abandoned native recording', error);
+  }
 }
-export async function handleBlockCreationSuccess(id: number) {
-  recordingStateMachine.dispatch({
-    type: 'CREATE_BLOCK_SUCCESS',
-    id,
-  });
-}
-export async function handleBlockCreationFailed(id: number, error?: Error) {
-  recordingStateMachine.dispatch({
-    type: 'CREATE_BLOCK_FAILED',
-    id,
-    error,
-  });
-}
+export async function setRecordingBlockCreationStatus(
+  id: number,
+  status: 'success' | 'failed',
+  errorMessage?: string
+) {
+  return recordingStateMachine.dispatch({
+    type: 'SET_BLOCK_CREATION_STATUS',
+    id,
+    status,
+    errorMessage,
+  });
+}
 export function removeRecording(id: number) {
-  recordings.delete(id);
   recordingStateMachine.dispatch({ type: 'REMOVE_RECORDING', id });
 }
 export interface SerializedRecordingStatus {
   id: number;
   status: RecordingStatus['status'];
+  blockCreationStatus?: RecordingStatus['blockCreationStatus'];
   appName?: string;
   // if there is no app group, it means the recording is for system audio
   appGroupId?: number;
@@ -787,18 +667,17 @@ export interface SerializedRecordingStatus {
 export function serializeRecordingStatus(
   status: RecordingStatus
 ): SerializedRecordingStatus | null {
-  const recording = recordings.get(status.id);
   return {
     id: status.id,
     status: status.status,
+    blockCreationStatus: status.blockCreationStatus,
     appName: status.appGroup?.name,
     appGroupId: status.appGroup?.processGroupId,
     icon: status.appGroup?.icon,
     startTime: status.startTime,
-    filepath:
-      status.filepath ?? (recording ? String(recording.file.path) : undefined),
-    sampleRate: recording?.session.sampleRate,
-    numberOfChannels: recording?.session.channels,
+    filepath: status.filepath,
+    sampleRate: status.sampleRate,
+    numberOfChannels: status.numberOfChannels,
   };
 }


@@ -2,11 +2,9 @@
 // Should not load @affine/native for unsupported platforms
-import path from 'node:path';
 import { shell } from 'electron';
-import { isMacOS } from '../../shared/utils';
+import { isMacOS, resolvePathInBase } from '../../shared/utils';
 import { openExternalSafely } from '../security/open-external';
 import type { NamespaceHandlers } from '../type';
 import {
@@ -14,18 +12,14 @@ import {
   checkMeetingPermissions,
   checkRecordingAvailable,
   disableRecordingFeature,
-  getRawAudioBuffers,
   getRecording,
-  handleBlockCreationFailed,
-  handleBlockCreationSuccess,
-  pauseRecording,
   readRecordingFile,
-  readyRecording,
   recordingStatus$,
   removeRecording,
   SAVED_RECORDINGS_DIR,
   type SerializedRecordingStatus,
   serializeRecordingStatus,
+  setRecordingBlockCreationStatus,
   setupRecordingFeature,
   startRecording,
   stopRecording,
@@ -45,27 +39,19 @@ export const recordingHandlers = {
   startRecording: async (_, appGroup?: AppGroupInfo | number) => {
     return startRecording(appGroup);
   },
-  pauseRecording: async (_, id: number) => {
-    return pauseRecording(id);
-  },
   stopRecording: async (_, id: number) => {
     return stopRecording(id);
   },
-  getRawAudioBuffers: async (_, id: number, cursor?: number) => {
-    return getRawAudioBuffers(id, cursor);
-  },
   readRecordingFile: async (_, filepath: string) => {
     return readRecordingFile(filepath);
   },
-  // save the encoded recording buffer to the file system
-  readyRecording: async (_, id: number, buffer: Uint8Array) => {
-    return readyRecording(id, Buffer.from(buffer));
-  },
-  handleBlockCreationSuccess: async (_, id: number) => {
-    return handleBlockCreationSuccess(id);
-  },
-  handleBlockCreationFailed: async (_, id: number, error?: Error) => {
-    return handleBlockCreationFailed(id, error);
-  },
+  setRecordingBlockCreationStatus: async (
+    _,
+    id: number,
+    status: 'success' | 'failed',
+    errorMessage?: string
+  ) => {
+    return setRecordingBlockCreationStatus(id, status, errorMessage);
+  },
   removeRecording: async (_, id: number) => {
     return removeRecording(id);
@@ -100,15 +86,10 @@ export const recordingHandlers = {
     return false;
   },
   showSavedRecordings: async (_, subpath?: string) => {
-    const normalizedDir = path.normalize(
-      path.join(SAVED_RECORDINGS_DIR, subpath ?? '')
-    );
-    const normalizedBase = path.normalize(SAVED_RECORDINGS_DIR);
-    if (!normalizedDir.startsWith(normalizedBase)) {
-      throw new Error('Invalid directory');
-    }
-    return shell.showItemInFolder(normalizedDir);
+    const directory = resolvePathInBase(SAVED_RECORDINGS_DIR, subpath ?? '', {
+      label: 'directory',
+    });
+    return shell.showItemInFolder(directory);
   },
 } satisfies NamespaceHandlers;
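The `showSavedRecordings` change above replaces an inline `path.normalize`/`startsWith` containment check with a shared `resolvePathInBase` helper. The helper's implementation is not shown in this diff, so the following is only a sketch of what such a guard typically looks like, inferred from the removed code and the call site (the function body here is an assumption, not AFFiNE's actual implementation):

```typescript
import path from 'node:path';

// Hypothetical sketch of a path-containment helper matching the
// `resolvePathInBase(base, subpath, { label })` call site above.
// Resolves `subpath` against `base` and throws if the result
// escapes `base` (e.g. via `..` segments).
function resolvePathInBase(
  base: string,
  subpath: string,
  options: { label: string }
): string {
  const resolvedBase = path.resolve(base);
  const resolved = path.resolve(resolvedBase, subpath);
  // path.relative returns a path starting with '..' when
  // `resolved` lies outside `resolvedBase`.
  const relative = path.relative(resolvedBase, resolved);
  if (relative.startsWith('..') || path.isAbsolute(relative)) {
    throw new Error(`Invalid ${options.label}`);
  }
  return resolved;
}
```

Unlike the removed `startsWith` check, `path.relative` is not fooled by a sibling directory that merely shares a string prefix with the base (e.g. `/data` vs `/data-evil`), which is a common reason to centralize this logic in one helper.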


@@ -13,25 +13,31 @@ export type RecordingEvent =
     type: 'START_RECORDING';
     appGroup?: AppGroupInfo;
   }
-  | { type: 'PAUSE_RECORDING'; id: number }
-  | { type: 'RESUME_RECORDING'; id: number }
+  | {
+      type: 'ATTACH_NATIVE_RECORDING';
+      id: number;
+      nativeId: string;
+      startTime: number;
+      filepath: string;
+      sampleRate: number;
+      numberOfChannels: number;
+    }
   | {
       type: 'STOP_RECORDING';
       id: number;
     }
   | {
-      type: 'SAVE_RECORDING';
+      type: 'ATTACH_RECORDING_ARTIFACT';
       id: number;
       filepath: string;
+      sampleRate?: number;
+      numberOfChannels?: number;
     }
   | {
-      type: 'CREATE_BLOCK_FAILED';
-      id: number;
-      error?: Error;
-    }
-  | {
-      type: 'CREATE_BLOCK_SUCCESS';
+      type: 'SET_BLOCK_CREATION_STATUS';
       id: number;
+      status: 'success' | 'failed';
+      errorMessage?: string;
     }
   | { type: 'REMOVE_RECORDING'; id: number };
@@ -74,23 +80,26 @@ export class RecordingStateMachine {
       case 'START_RECORDING':
         newStatus = this.handleStartRecording(event.appGroup);
         break;
-      case 'PAUSE_RECORDING':
-        newStatus = this.handlePauseRecording();
-        break;
-      case 'RESUME_RECORDING':
-        newStatus = this.handleResumeRecording();
+      case 'ATTACH_NATIVE_RECORDING':
+        newStatus = this.handleAttachNativeRecording(event);
         break;
       case 'STOP_RECORDING':
         newStatus = this.handleStopRecording(event.id);
         break;
-      case 'SAVE_RECORDING':
-        newStatus = this.handleSaveRecording(event.id, event.filepath);
+      case 'ATTACH_RECORDING_ARTIFACT':
+        newStatus = this.handleAttachRecordingArtifact(
+          event.id,
+          event.filepath,
+          event.sampleRate,
+          event.numberOfChannels
+        );
         break;
-      case 'CREATE_BLOCK_SUCCESS':
-        newStatus = this.handleCreateBlockSuccess(event.id);
-        break;
-      case 'CREATE_BLOCK_FAILED':
-        newStatus = this.handleCreateBlockFailed(event.id, event.error);
+      case 'SET_BLOCK_CREATION_STATUS':
+        newStatus = this.handleSetBlockCreationStatus(
+          event.id,
+          event.status,
+          event.errorMessage
+        );
         break;
       case 'REMOVE_RECORDING':
         this.handleRemoveRecording(event.id);
@@ -133,7 +142,7 @@ export class RecordingStateMachine {
     const currentStatus = this.recordingStatus$.value;
     if (
       currentStatus?.status === 'recording' ||
-      currentStatus?.status === 'stopped'
+      currentStatus?.status === 'processing'
     ) {
       logger.error(
         'Cannot start a new recording if there is already a recording'
@@ -160,46 +169,31 @@ export class RecordingStateMachine {
   }
   /**
-   * Handle the PAUSE_RECORDING event
+   * Attach native recording metadata to the current recording
    */
-  private handlePauseRecording(): RecordingStatus | null {
+  private handleAttachNativeRecording(
+    event: Extract<RecordingEvent, { type: 'ATTACH_NATIVE_RECORDING' }>
+  ): RecordingStatus | null {
     const currentStatus = this.recordingStatus$.value;
-    if (!currentStatus) {
-      logger.error('No active recording to pause');
-      return null;
+    if (!currentStatus || currentStatus.id !== event.id) {
+      logger.error(`Recording ${event.id} not found for native attachment`);
+      return currentStatus;
     }
     if (currentStatus.status !== 'recording') {
-      logger.error(`Cannot pause recording in ${currentStatus.status} state`);
+      logger.error(
+        `Cannot attach native metadata when recording is in ${currentStatus.status} state`
+      );
       return currentStatus;
     }
     return {
       ...currentStatus,
-      status: 'paused',
-    };
-  }
-
-  /**
-   * Handle the RESUME_RECORDING event
-   */
-  private handleResumeRecording(): RecordingStatus | null {
-    const currentStatus = this.recordingStatus$.value;
-    if (!currentStatus) {
-      logger.error('No active recording to resume');
-      return null;
-    }
-    if (currentStatus.status !== 'paused') {
-      logger.error(`Cannot resume recording in ${currentStatus.status} state`);
-      return currentStatus;
-    }
-    return {
-      ...currentStatus,
-      status: 'recording',
+      nativeId: event.nativeId,
+      startTime: event.startTime,
+      filepath: event.filepath,
+      sampleRate: event.sampleRate,
+      numberOfChannels: event.numberOfChannels,
     };
   }
@@ -214,26 +208,25 @@ export class RecordingStateMachine {
       return currentStatus;
     }
-    if (
-      currentStatus.status !== 'recording' &&
-      currentStatus.status !== 'paused'
-    ) {
+    if (currentStatus.status !== 'recording') {
       logger.error(`Cannot stop recording in ${currentStatus.status} state`);
       return currentStatus;
     }
     return {
       ...currentStatus,
-      status: 'stopped',
+      status: 'processing',
     };
   }
   /**
-   * Handle the SAVE_RECORDING event
+   * Attach the encoded artifact once native stop completes
    */
-  private handleSaveRecording(
+  private handleAttachRecordingArtifact(
     id: number,
-    filepath: string
+    filepath: string,
+    sampleRate?: number,
+    numberOfChannels?: number
   ): RecordingStatus | null {
     const currentStatus = this.recordingStatus$.value;
@@ -242,51 +235,54 @@ export class RecordingStateMachine {
       return currentStatus;
     }
-    return {
-      ...currentStatus,
-      status: 'ready',
-      filepath,
-    };
-  }
-
-  /**
-   * Handle the CREATE_BLOCK_SUCCESS event
-   */
-  private handleCreateBlockSuccess(id: number): RecordingStatus | null {
-    const currentStatus = this.recordingStatus$.value;
-    if (!currentStatus || currentStatus.id !== id) {
-      logger.error(`Recording ${id} not found for create-block-success`);
+    if (currentStatus.status !== 'processing') {
+      logger.error(`Cannot attach artifact in ${currentStatus.status} state`);
       return currentStatus;
     }
     return {
       ...currentStatus,
-      status: 'create-block-success',
+      filepath,
+      sampleRate: sampleRate ?? currentStatus.sampleRate,
+      numberOfChannels: numberOfChannels ?? currentStatus.numberOfChannels,
     };
   }
   /**
-   * Handle the CREATE_BLOCK_FAILED event
+   * Set the renderer-side block creation result
    */
-  private handleCreateBlockFailed(
+  private handleSetBlockCreationStatus(
     id: number,
-    error?: Error
+    status: 'success' | 'failed',
+    errorMessage?: string
   ): RecordingStatus | null {
     const currentStatus = this.recordingStatus$.value;
     if (!currentStatus || currentStatus.id !== id) {
-      logger.error(`Recording ${id} not found for create-block-failed`);
+      logger.error(`Recording ${id} not found for block creation status`);
       return currentStatus;
     }
-    if (error) {
-      logger.error(`Recording ${id} create block failed:`, error);
+    if (currentStatus.status === 'new') {
+      logger.error(`Cannot settle recording ${id} before it starts`);
+      return currentStatus;
+    }
+    if (
+      currentStatus.status === 'ready' &&
+      currentStatus.blockCreationStatus !== undefined
+    ) {
+      return currentStatus;
+    }
+    if (errorMessage) {
+      logger.error(`Recording ${id} create block failed: ${errorMessage}`);
     }
     return {
       ...currentStatus,
-      status: 'create-block-failed',
+      status: 'ready',
+      blockCreationStatus: status,
     };
   }
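Taken together, the state-machine changes above replace the pause/resume and terminal `create-block-*` states with an attach-based lifecycle: `recording` → `processing` on stop, artifact metadata attached while still `processing`, and a single `ready` state whose `blockCreationStatus` field records the renderer outcome. A minimal standalone sketch of these transitions (status shape simplified to the fields visible in this diff; these free functions are illustrative stand-ins for the class methods, not the actual implementation):

```typescript
// Simplified status shape; the real RecordingStatus carries more fields
// (appGroup, startTime, nativeId, sampleRate, ...).
type Status = {
  id: number;
  status: 'new' | 'recording' | 'processing' | 'ready';
  filepath?: string;
  blockCreationStatus?: 'success' | 'failed';
};

// STOP_RECORDING: only a live recording can move to processing.
function stop(s: Status): Status {
  return s.status === 'recording' ? { ...s, status: 'processing' } : s;
}

// ATTACH_RECORDING_ARTIFACT: attach the encoded file while processing.
function attachArtifact(s: Status, filepath: string): Status {
  return s.status === 'processing' ? { ...s, filepath } : s;
}

// SET_BLOCK_CREATION_STATUS: settle the recording as ready, once,
// recording the renderer-side block creation result.
function setBlockCreation(s: Status, result: 'success' | 'failed'): Status {
  if (s.status === 'new') return s; // cannot settle before it starts
  if (s.status === 'ready' && s.blockCreationStatus !== undefined) {
    return s; // already settled; ignore duplicates
  }
  return { ...s, status: 'ready', blockCreationStatus: result };
}
```

One consequence of this design is that a duplicate `SET_BLOCK_CREATION_STATUS` dispatch is a no-op once the recording is settled, which the old `create-block-success`/`create-block-failed` terminal states did not guard against.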

Some files were not shown because too many files have changed in this diff.